Tag: Data

  • California discipline data show widespread disparities despite reforms

    Dive Brief:

    • California’s Black, foster and homeless student populations are experiencing persistent and widespread discipline disparities despite state reforms to reduce inequities, a new report from the National Center for Youth Law said.
    • The report found that students in the foster system lost 76.6 days of instruction per 100 students enrolled in 2023-24 due to out-of-school suspensions — seven times the statewide average of 10.7 days lost per 100 students. And in many districts, the suspension gap between Black and White students has increased significantly over the past seven years.
    • NCYL warns that discipline disparities could widen even more as the Trump administration seeks to eliminate school discipline practices meant to address racial inequity for historically marginalized student populations. 

    Dive Insight:

    NCYL’s analysis of discipline data in California shows that while some districts have made progress in reducing disparities, many continue to suspend and expel students at disproportionately high rates.

    For example, students experiencing homelessness lost 29.1 days of instruction per 100 students enrolled in 2023-24 because of out-of-school suspensions. Students with disabilities lost 23.4 days per 100 students enrolled the same school year, nearly three times the rate for students without disabilities, according to the report.

    Black foster youth had the most disproportionate discipline rate, losing 121.8 days per 100 students enrolled due to out-of-school suspensions. That’s 15 times the rate of lost instruction for all enrolled White students, which was 7.9 lost days per 100 students.

    The report’s analysis pulls from discipline data between the 2017-18 and 2023-24 school years. California doesn’t publicly report on the number of school days lost by offense category. Rather, NCYL developed the metric to compare rates across districts, over time and between student groups, the report said.
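
    The metric itself is simple arithmetic: days lost divided by enrollment, scaled to 100 students. A minimal sketch, using an illustrative district rather than NCYL's actual code or data:

    ```python
    def days_lost_per_100(total_days_lost: float, enrollment: int) -> float:
        """Instructional days lost to out-of-school suspensions per 100 students enrolled."""
        return total_days_lost / enrollment * 100

    # Hypothetical district: 12,000 students losing 1,284 instructional days
    # reproduces the 10.7-day statewide average cited in the report.
    print(days_lost_per_100(1284, 12000))  # 10.7
    ```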

    Additionally, NCYL’s data analysis shows that most suspensions are for minor misconduct that did not involve injury, such as the use of profanity or vulgarity. The 2024-25 school year was the first in which no suspensions were allowed for willful defiance in grades K-12 in California, although the policy had been phased in for younger grades in the years before. 

    The report recommends that the state disaggregate discipline data for the offenses with the highest rates so the public can see which are for violent and nonviolent behaviors. Currently, most suspensions in California schools, even for profanity and vulgarity offenses, can be reported under a category titled “violent incident, no injury,” which can be misleading, NCYL said.

    When most suspensions are reported under the category of ‘violent incident, no injury’ or ‘violent incident, injury,’ people will assume the offenses were violent, but they could be mostly profanity and vulgarity, said Dan Losen, co-author of the report and senior director for education at NCYL.

    “Don’t call obscenity violence. It’s not violent,” Losen said. “These very subjective determinations about what’s profanity, what’s vulgarity, what’s obscene, what’s not obscene is fertile ground for implicit racial bias.”

    The report highlights several California districts making improvements in reducing discipline disparities. Merced Union High School District, for instance, has reduced its rate of lost instruction from 58.3 days per 100 Black students in 2017-18 to 8.8 days per 100 Black students in 2023-24. Lost instruction days for students with disabilities fell from 32 per 100 students with disabilities in 2017-18 to 6.1 in 2023-24.

    The report credited the reductions in lost instruction to the district’s efforts at problem-solving rather than punitive measures and for providing student supports like individualized interventions and behavioral services.

    NCYL recommends several statewide initiatives to reduce discipline disparities, including strengthening state civil rights enforcement and oversight of district discipline practices, as well as expanding support for students in the foster system, students experiencing homelessness, and students with disabilities.

    However, statewide reforms in California could be in jeopardy under the Trump administration’s efforts to stamp out diversity, equity and inclusion programs nationally, the report said. Such state reforms have included a ban on suspensions for willful defiance in grades K-12 and the explicit inclusion of school discipline in the California Department of Education’s statewide accountability system.

    Specifically, the report points to a White House executive order issued in April that calls for a stop to “unlawful ‘equity’ ideology” in school discipline. The order requires the U.S. Department of Education to issue guidance on states’ and districts’ obligations “not to engage in racial discrimination under Title VI in all contexts, including school discipline.”

    Critics of equity-based discipline policies say they hamper school safety. 

    Title VI of the Civil Rights Act prohibits discrimination based on race, color or national origin in federally funded programs.

    The federal discipline guidance required by Trump’s executive order has not yet been issued, and the Education Department did not respond to inquiries about its status. While discipline policies are typically set at the school, district or state levels, the federal government can issue guidance and investigate schools for discriminatory practices under Title VI.

    The civil rights law has historically been invoked to protect the rights of historically marginalized students, including when they are overrepresented in school discipline — and especially exclusionary discipline — data. However, the current administration has used the law to protect White and Asian students, sometimes at the expense of DEI efforts meant to level the playing field for those historically marginalized groups.

    “One should expect that, soon, all student groups that have experienced unjustifiably high rates of removal will be excluded from educational opportunities on disciplinary grounds even more often,” the NCYL report said.


  • Transfer Data Shows Little Progress for First-Time Students

    The new “Tracking Transfer” report from the National Student Clearinghouse Research Center shows little improvement in transfer rates for first-time college students. But it also sheds light on factors that could contribute to better outcomes.

    The latest report, part of a series, examined transfer data for students who entered community college in 2017 and for former community college students enrolled at four-year institutions that academic year.

    It found that only 31.6 percent of first-time students who started community college in 2017 transferred within six years. And slightly fewer than half of those who transferred, 49.7 percent, earned a bachelor’s degree, consistent with outcomes for the previous cohort.
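
    Those two rates compound. A quick back-of-the-envelope calculation (assuming both rates are measured over the same six-year window) shows how narrow the pipeline is for the entering cohort as a whole:

    ```python
    transfer_rate = 0.316              # first-time students transferring within six years
    completion_given_transfer = 0.497  # of those transfers, share earning a bachelor's

    # Share of the entire 2017 entering cohort earning a bachelor's via transfer
    overall = transfer_rate * completion_given_transfer
    print(f"{overall:.1%}")  # 15.7%
    ```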

    But some types of students had better outcomes than others. For example, students who came to community college with some dual-enrollment credits had higher transfer and bachelor’s degree completion rates, 46.9 percent and 60.1 percent, respectively.

    Bachelor’s degree completion rates were also highest for transfer students at public four-year institutions compared to other types of institutions. Nearly three-quarters of students who transferred from community colleges to public four-year institutions in the 2017–18 academic year earned a bachelor’s degree within six years. The report also found that most transfer students from community colleges, 75.2 percent, attend public four-year colleges and universities.

    Retention rates among these students were also fairly high. Among students who transferred, 82 percent returned to their four-year institutions the following year. The retention rate was even higher for students who earned a certificate or an associate degree before they transferred, 86.8 percent, nearly 10 percentage points higher than the rate for those who didn’t earn a credential before transferring.


  • In training educators to use AI, we must not outsource the foundational work of teaching

    This story was originally published by Chalkbeat. Sign up for their newsletters at ckbe.at/newsletters.

    I was conferencing with a group of students when I heard the excitement building across my third grade classroom. A boy at the back table had been working on his catapult project for over an hour through our science lesson, into recess, and now during personalized learning time. I watched him adjust the wooden arm for what felt like the 20th time, measure another launch distance, and scribble numbers on his increasingly messy data sheet.

    “The longer arm launches farther!” he announced to no one in particular, his voice carrying the matter-of-fact tone of someone who had just uncovered a truth about the universe. I felt that familiar teacher thrill, not because I had successfully delivered a physics lesson, but because I hadn’t taught him anything at all.

    Last year, all of my students chose a topic they wanted to explore and pursued a personal learning project about it. This particular student had discovered the relationship between lever arm length and projectile distance entirely through his own experiments, which involved mathematics, physics, history, and data visualization.

    Other students drifted over to try his longer-armed design, and soon, a cluster of 8-year-olds were debating trajectory angles and comparing medieval siege engines to ancient Chinese catapults.

    They were doing exactly what I dream of as an educator: learning because they wanted to know, not because they had to perform.

    Then, just recently, I read about the American Federation of Teachers’ new $23 million partnership with Microsoft, OpenAI, and Anthropic to train educators how to use AI “wisely, safely and ethically.” The training sessions would teach them how to generate lesson plans and “microwave” routine communications with artificial intelligence.

    My heart sank.

    As an elementary teacher who also conducts independent research on the intersection of AI and education, and writes the ‘Algorithmic Mind’ column about it for Psychology Today, I live in the uncomfortable space between what technology promises and what children actually need. Yes, I use AI, but only for administrative work like drafting parent newsletters, organizing student data, and filling out required curriculum planning documents. It saves me hours on repetitive tasks that have nothing to do with teaching.

    I’m all for showing educators how to use AI to cut down on rote work. But I fear the AFT’s $23 million initiative isn’t about administrative efficiency. According to their press release, they’re training teachers to use AI for “instructional planning” and as a “thought partner” for teaching decisions. One featured teacher describes using AI tools to help her communicate “in the right voice” when she’s burned out. Another says AI can assist with “late-night lesson planning.”

    That sounds more like outsourcing the foundational work of teaching.

    Watching my student discover physics principles through intrinsic curiosity reminded me why this matters so much. When we start relying on AI to plan our lessons and find our teaching voice, we’re replacing human judgment with algorithmic thinking at the very moment students need us most. We’re prioritizing the product of teaching over the process of learning.

    Most teachers I talk to share similar concerns about AI. They focus on cheating and plagiarism. They worry about students outsourcing their thinking and how to assess learning when they can’t tell if students actually understand anything. The uncomfortable truth is that students have always found ways to avoid genuine thinking when we value products over process. I used SparkNotes. Others used Google. Now, students use ChatGPT.

    The problem is not technology; it’s that we continue prioritizing finished products over messy learning processes. And as long as education rewards predetermined answers over curiosity, students will find shortcuts.

    That’s why teachers need professional development that moves in the opposite direction. They need PD that helps them facilitate genuine inquiry and human connection; foster classrooms where confusion is valued as a precursor to understanding; and nurture students’ intrinsic motivation.

    When I think about that boy measuring launch distances with handmade tools, I realize he was demonstrating the distinctly human capacity to ask questions that only he wanted to address. He didn’t need me to structure his investigation or discovery. He needed the freedom to explore, materials to experiment with, and time to pursue his curiosity wherever it led.

    The learning happened not because I efficiently delivered content, but because I stepped back and trusted his natural drive to understand.

    Children don’t need teachers who can generate lesson plans faster or give AI-generated feedback; they need educators who can inspire questions, model intellectual courage, and create communities where wonder thrives and real-world problems are solved.

    The future belongs to those who can combine computational tools with human wisdom, ethics, and creativity. But this requires us to maintain the cognitive independence to guide AI systems rather than becoming dependent on them.

    Every time I watch my students make unexpected connections, I’m reminded that the most important learning happens in the spaces between subjects, in the questions that emerge from genuine curiosity, in the collaborative thinking that builds knowledge through relationships. We can’t microwave that. And we shouldn’t try.

    Chalkbeat is a nonprofit news site covering educational change in public schools.


  • International Student Mobility Data Sources: A Primer

    Part 1: Understanding the Types of Sources and Their Differences

    There has perhaps never been more of a need for data on globally mobile students than now. In 2024, there were about 6.9 million international students studying outside their home countries, a record high, and the number is projected to grow to more than 10 million by 2030.

    Nations all around the world count on global student mobility for a number of reasons. Sending nations benefit by sending some of their young people abroad for education, particularly when there is less capacity at home to absorb all demand. Many of those young people return to the benefit of the local job market with new skills and knowledge and with global experience, while others remain abroad and are able to contribute in other ways, including sending remittances. Host nations benefit in numerous ways, from the economic contributions of international students (in everything from tuition payments to spending in the local economy) to social and political benefits, including building soft power.

    At the same time, economic, political, and social trends worldwide challenge the current ecosystem of global educational mobility. Many top destinations of international students, including Canada and the United States, have developed heavily restrictive policies toward such students and toward migrants overall. The COVID-19 pandemic demonstrated that one global challenge can upend international education, even if temporarily.

    Data plays a key role in helping those who work in or touch upon international education. All players in the space—from institutional officials and service providers to policymakers and researchers—can use global and national data sources to see trends in student flows, as well as potential changes and disruptions.

    This article is the first in a two-part series exploring global student mobility data. In this first article, I will delve into considerations that apply in examining any international student data source. In the second, forthcoming article, we will examine some of the major data sources in global student mobility, both global and national, with the latter focused on the “Big Four” host countries: the United States, Canada, the United Kingdom, and Australia.

    In utilizing any global student mobility data source, it is crucial to understand some basics about each source. Here are some key questions to ask about any source and how to understand what each provides.

    Who collects the data?

    [Table: Major international student mobility data sources for trends around the world and in the "Big Four" countries.]

    There are three main types of entities that collect student mobility data at a national level:

    • Government ministries or agencies: These entities are generally mandated by law or statute to collect international student data for specific purposes. Depending on the entity’s purview, such data could include student visa or permit applications and issuances, students arriving at ports of entry (such as an airport or border crossing), enrollment in an educational institution, or students registered as working during or after completing coursework.
    • Non-governmental organizations (NGOs): Non-profit entities focused on international education or related fields such as higher education or immigration may collect international student data, sometimes with funding or support from relevant government ministries. One good example is the Institute of International Education (IIE) in the U.S., which has collected data on international students and scholars since 1948, much of that time with funding and support from the U.S. Department of State.
    • Individual institutions: Of course, individual universities and colleges collect data on all their students, usually with specific information on international students, sometimes by government mandate. In countries such as the U.S. and Canada, these institutions must report such data to government ministries. They may also choose to report to non-governmental agencies, such as IIE. Such data may or may not otherwise be publicly available.

    At the international level, the main data sources are generally aggregations of data from national sources. There are three main efforts, which will be examined in detail in Part 2 of this series.

    How are the data collected?

    The method by which mobility data are collected affects the accuracy of those data. The sources that collect data internationally or on multiple countries, such as UNESCO Institute for Statistics (UIS) and IIE’s Project Atlas, are primarily aggregators. They collect the data from national sources, either government ministries or international education organizations, such as the British Council or the Canadian Bureau for International Education (CBIE).

    For primary data collection, there are three main methods:

    • Mandatory reporting: Certain government entities collect data by law or regulation. Data are naturally collected as part of processing and granting student visas or permits, as the U.S. State Department and Immigration, Refugees and Citizenship Canada (IRCC) do. In other cases, postsecondary institutions are required to track and report on their international students — from application to graduation and sometimes on to post-graduation work programs. This is the case in the U.S. through SEVIS (the Student and Exchange Visitor Information System), overseen by the U.S. Department of Homeland Security (DHS), through which deputized institutional officials track all international students. The data from this system are reported regularly by DHS. In other cases, data are collected annually, often through a survey form, as Statistics Canada does through its Postsecondary Student Information System (PSIS).
    • Census: Some non-profit organizations attempt to have all postsecondary institutions report their data, often through an online questionnaire. This is the method by which IIE obtains data for its annual Open Doors Report, which tracks both international students in the U.S. and students enrolled in U.S. institutions studying abroad short-term in other countries.
    • Survey: A survey gathers data from a sample, preferably representative, of the overall population—in this case, higher education institutions—to form inferences about the international student population. (This should not be confused with the “surveys” issued by government agencies, usually referring to a questionnaire form, typically online nowadays, through which institutions are required to report data.) This method is used in IIE’s snapshot surveys in the fall and spring of each year, intended to provide an up-to-date picture of international student enrollment as a complement to Open Doors, which reflects information on international students from the previous academic year.

    When are the data collected and reported?

    [Chart: Data collection and reporting practices of major global, U.S., and Canadian international student datasets.]

    In considering data sources, it is important to know when the data were collected and what time periods they reflect. Government data sources are typically the most up-to-date due to their mandatory nature. Data are often collected continuously in real time, such as when a student visa is approved or when an international student officially starts a course of study. However, each ministry releases data at differing intervals. Australia’s Department of Education, for example, is well known for releasing new data almost every month. USCIS and IRCC tend to release data roughly quarterly, though both provide monthly breakdowns of their data in some cases.

    Non-governmental entities generally do not collect data continuously. Instead, they may collect data annually, semiannually, or even less frequently. IIE’s Open Doors collects data annually for the previous academic year on international students and two years prior on U.S. study abroad students. The results for both are released every November.

    The international aggregated sources receive data from national sources at widely varying times. As a result, there can be gaps in data, making comparison between or among countries challenging. Some countries don’t send data at all, often due to lack of resources for doing so. Even major host countries, notably China, send little if any data to UNESCO.

    What type of student mobility data are included in the source?

    Sources collect different types of student mobility data. One such breakdown is between inbound and outbound students — that is, those whom a country hosts versus those who leave the country to study in other countries. Most government sources, such as IRCC, focus solely on inbound students — the international students hosted within the country — due to the organizations’ mandate and ability to collect data. Non-governmental organizations, such as IIE, often attempt to capture information on outbound (or “study abroad”) students. Many international sources, such as UNESCO UIS, capture both.

    Another important breakdown addresses whether the data include degree-seekers, students studying abroad for credit back home, or those going abroad not explicitly for study but for a related purpose, such as research or internships (a small schema sketch follows this list):

    • Degree mobility: captures data on students coming into a country or going abroad for pursuit of a full degree.
    • Credit mobility: captures information on those abroad studying short-term for academic credit with their home institution, an arrangement often called “study abroad” (particularly in the U.S. and Canada) or “educational exchange.” Study abroad opportunities typically last anywhere from as little as one week to a full year. Short-duration programs, such as faculty-led study tours, have become an increasingly popular option among students looking for an international experience. In most cases, the home institution is in the student’s country of origin, but that is not always the case. For example, a Vietnamese international student might be studying for a full degree in the U.S. but, as part of the coursework, studies in Costa Rica for one semester.
    • Non-credit mobility: captures information on those who go abroad not for credit-earning coursework but for something highly related to a degree program, such as research, fieldwork, non-credit language study, an internship, or a volunteer opportunity. This may or may not be organized through the student’s education institution, and the parameters around this type of mobility can be blurry.
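
    To make the taxonomy concrete, here is how the two dimensions could be encoded in a dataset's schema. This is a hypothetical sketch, not any source's actual data model:

    ```python
    from enum import Enum

    class Direction(Enum):
        INBOUND = "inbound"    # students a country hosts
        OUTBOUND = "outbound"  # students who leave to study elsewhere

    class MobilityType(Enum):
        DEGREE = "degree"          # pursuing a full degree abroad
        CREDIT = "credit"          # short-term study for credit at the home institution
        NON_CREDIT = "non_credit"  # research, internships, non-credit language study

    # A well-documented mobility record should declare both dimensions explicitly.
    record = {"host_country": "US", "direction": Direction.INBOUND,
              "mobility": MobilityType.DEGREE}
    ```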

    It’s important to know what each data source includes. Most governmental data sources will include both degree and credit mobility — students coming to study for a full degree or only as part of a short-term educational exchange. The dataset may or may not distinguish between the two, which is important to know if that distinction matters for the data user’s purposes.

    For outbound (“study abroad”) mobility, it’s easier for organizations to track credit mobility rather than degree mobility. IIE’s Open Doors, for example, examines only credit mobility for outbound students because it collects data through U.S. institutions, which track their outbound study abroad students and help them receive appropriate credits for their work abroad once they return. There is not a similar mechanism for U.S. degree-seekers going to other countries. That said, organizations such as IIE have attempted such research in the past, even if it is not an ongoing effort. Typically, the best way to find numbers on students from a particular country seeking full degrees abroad is to use UNESCO and sort the full global data by country of origin. UNESCO can also be used to find the numbers in a specific host country, or, in some cases, it may be better to go directly to the country’s national data source if available.
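
    Once a UIS extract is downloaded, that country-of-origin lookup is a simple filter and sort. A sketch with pandas, where the file and column names are hypothetical stand-ins for whatever the actual UIS export uses:

    ```python
    import pandas as pd

    # Hypothetical extract of UIS outbound degree-mobility data; the column
    # names are assumptions, not the real UIS schema.
    df = pd.read_csv("uis_outbound_mobility.csv")

    # All degree-seekers from one origin country, largest host countries first
    vn = df[df["origin_country"] == "Vietnam"].sort_values("students", ascending=False)
    print(vn[["host_country", "students"]].head())
    ```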

    Non-credit mobility has been the least studied form of student mobility, largely because it is difficult to capture due to its amorphous nature. Nevertheless, some organizations, like IIE, have made one-off or periodic attempts to capture it.

    Who is captured in the data source? How is “international student” defined?

    Each data source may define the type of globally mobile student within the dataset differently. Chiefly, it’s important to recognize whether the source captures only data on international students in the strictest sense (based on that specific legal status) or on others who are not citizens of the host country. The latter could include permanent immigrants (such as permanent residents), temporary workers, and refugees or asylum seekers. The terms used can vary, from “foreign student” to a “nonresident” (sometimes “nonresident alien”), as some U.S. government sources use. It’s important to check the specific definition of the students for whom information is captured.

    Most of the major student mobility data sources capture only data on international students as strictly defined by the host country. Here are the definitions of “international student” for the Big Four:

    • United States: A non-immigrant holding an F-1, M-1, or certain types of J-1 visa (the J-1 is an exchange visa that includes but is not limited to students; it can also cover individuals working in youth summer programs or as au pairs, for example)
    • Canada: A temporary resident holding a study permit from a designated learning institution (DLI)
    • United Kingdom: An individual on a Student visa
    • Australia: An individual who is not an Australian citizen or permanent resident or who is not a citizen of New Zealand, studying in Australia on a temporary visa

    Some countries make a distinction between international students enrolled in academic programs, such as at a university, versus those studying a trade or in a vocational school; there might also be distinct categorization for those attending language training. For example, in the U.S., M-1 visas are for international students studying in vocational education programs and may not be captured in some data sources, notably Open Doors.

    Understanding the terminology used for international students helps in obtaining the right type of data. For example, one of the primary methods of obtaining data on international students in Canada is through IRCC data held on the Government of Canada’s Open Government Portal. But you won’t find any such dataset on “international students.” Instead, you need to search for “study permit holders.”
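
    Canada's Open Government Portal is built on the standard CKAN data platform, so that search can also be scripted. A sketch using CKAN's documented package_search action (verify the endpoint against the portal's API documentation before relying on it):

    ```python
    import requests

    # package_search is CKAN's standard dataset-search action.
    resp = requests.get(
        "https://open.canada.ca/data/api/action/package_search",
        params={"q": "study permit holders"},  # not "international students"
        timeout=30,
    )
    for dataset in resp.json()["result"]["results"][:5]:
        print(dataset["title"])
    ```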

    Does the data source include students studying online or at a branch campus abroad, or who are otherwise physically residing outside the host country?

    Some universities and colleges have robust online programs that include significant numbers of students studying physically in other countries. (This was also true for many institutions during the pandemic. As a result, in the U.S., IIE temporarily included non-U.S. students studying at a U.S. institution online from elsewhere.) Other institutions have branch campuses or other such transnational programs that blur the line between international and domestic students. So, it’s important to ask: Does the data source include those not physically present in the institution’s country? The terminology for each country can vary. For example, in Australia, where such practices are very prominent, the term usually used to refer to students studying in Australian institutions but not physically in Australia is “offshore students.”

    What levels of study are included in the dataset?

    The focus of this article is postsecondary education, but some data sources do include primary and secondary students (“K-12 students” in the U.S. and Canada). IRCC’s study permit holder data includes students at all levels, including K-12 students. The ministry does provide some data broken down by level of study and other variables, such as country of citizenship and province or territory.

    What about data on international students who are working?

    Many host countries collect data and report on international students who are employed or participating in paid or unpaid internships during or immediately after their coursework. The specifics vary from country to country depending on how such opportunities for international students are structured and which government agencies are charged with overseeing such efforts. For example, in the U.S., the main work opportunities for most international students both during study (under Curricular Practical Training, or CPT) and after study (usually under Optional Practical Training, or OPT) are overseen by the student’s institution and reported via SEVIS. IIE’s Open Doors tracks students specifically for OPT but not CPT. By contrast, the main opportunity for international students to work in Canada after graduating from a Canadian institution is through the post-graduation work permit (PGWP). Students transfer to a new legal status in Canada, in contrast with U.S.-based international students under OPT, who remain on their student visa until their work opportunity ends. As a result, IRCC reports separate data on graduate students working under the PGWP, though data are relatively scant.

    At some point, students who are able to stay and choose to work beyond such opportunities in their new country transition to new legal statuses, such as the H-1B visa (a specialty-occupation temporary work visa) in the U.S., or directly to permanent residency in many countries. The data required to examine these individuals vary accordingly.

    What about data beyond demographics?

    While most international student datasets focus on numbers and demographic breakdowns, some datasets and other related research focus on such topics as the contributions of international students to national and local economies. For example, NAFSA: Association of International Educators, the main professional association for international educators in the U.S., maintains the International Student Economic Value Tool, which quantifies the dollar amounts that international students contribute to the U.S. at large, individual states, and congressional districts. Part of the intention behind this is to provide a tool for policy advocacy in Washington, D.C., and in state and local governments.

    How can I contextualize international student numbers within the broader higher education context of a country?

    Many countries collect and publish higher education data and other research. Each country assigns this function to different ministries or agencies. For example, in Canada, most such data are collected and published by Statistics Canada (StatCan), which is charged with data collection and research broadly for the country. In the U.S., this function falls under the Department of Education’s National Center for Education Statistics (NCES), which runs a major higher education data bank known as IPEDS, the Integrated Postsecondary Education Data System. StatCan does provide some data on international students, while IPEDS in the U.S. reports numbers of “nonresident” students, defined as “a person who is not a citizen or national of the United States and who is in this country on a visa or temporary basis and does not have the right to remain indefinitely.” This term likely encompasses mostly those on international student visas.

    I will discuss some of these higher education data sources in Part 2 of this series.

    How do I learn what I need to know about each individual dataset?

    Each major data source typically provides a glossary, methodology section, and/or appendix that helps users understand the dataset. In Part 2 of this series, we will examine some of the major international and national data sources, including where to locate further such information for each.

    It’s critical for users of student mobility data sources to understand these nuances in order to accurately and appropriately utilize the data. In the second part of this series, we will examine several prominent data sources.


  • Proposal would remove federal data collection for special education racial disparities

    The U.S. Department of Education is proposing to remove a requirement for states to collect and report on racial disparities in special education, according to a notice being published in the Federal Register on Friday.  

    The data collection is part of the annual state application under Part B of the Individuals with Disabilities Education Act. The application provides assurances that the state and its districts will comply with IDEA rules as a condition for receiving federal IDEA funding. 

    The data collection for racial overrepresentation or underrepresentation in special education — known as significant disproportionality — helps identify states and districts that have racial disparities among student special education identifications, placements and discipline. About 5% of school districts nationwide were identified with significant disproportionality in the 2020-21 school year, according to federal data.
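
    Significant disproportionality determinations under the Equity in IDEA rule rest on risk ratios: a racial group's risk of an outcome (identification, placement, or discipline) divided by the risk for all other students, flagged when it exceeds a state-set threshold. A minimal sketch; the counts and threshold below are illustrative, not federal data:

    ```python
    def risk_ratio(group_count: int, group_enrolled: int,
                   others_count: int, others_enrolled: int) -> float:
        """Group's risk of an outcome divided by the risk for all other students."""
        return (group_count / group_enrolled) / (others_count / others_enrolled)

    # Illustration: 60 of 400 students in one group vs. 90 of 1,600 others suspended
    rr = risk_ratio(60, 400, 90, 1600)
    print(round(rr, 2))  # 2.67 -- flagged if above the state's threshold (e.g., 2.0)
    ```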

    The Education Department said it wants to remove the data collection because the agency anticipates it will reduce paperwork burdens for the states. According to several state Part B applications filed earlier this year, the significant disproportionality data collection adds more hours in paperwork duties. 

    For example, Florida’s application said it records an average of 25 additional hours for responses reporting data related to significant disproportionality in any given year, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Alabama’s and Oregon’s applications also cite an additional 25 hours each for the collections. 

    The department has not said it wants to rescind or pause the significant disproportionality regulation, a rule known as Equity in IDEA, which was last updated in 2016. 

    However, under the first Trump administration, the rule became a hot button issue when then-U.S. Education Secretary Betsy DeVos said its implementation would be delayed. 

    The Council of Parent Attorneys and Advocates, a nonprofit supportive of education rights for students with disabilities, sued the Education Department and won, and by April 2019, the rule was back in full effect. 

    Denise Marshall, CEO of COPAA, said in a Thursday email to K-12 Dive that the proposal to remove the Equity in IDEA federal data collection was “yet another unlawful attempt by the Administration to shirk its obligations under the law to students of color.”

    Marshall added that the data collection fulfills a critical role in enforcing the significant disproportionality requirement in IDEA. The collection allows states and districts to examine the data, determine if there is racial disproportionality, and develop measures to address the problem. Marshall points out that IDEA does not declare significant disproportionality unlawful. Rather, the law and regulations provide a method for states and districts to address systemic racial disproportionality in special education.  

    Robyn Linscott, director of education and family policy at The Arc, an organization that advocates for people with intellectual and developmental disabilities, said that even if in the future there is no longer a data collection for significant disproportionality at the federal level, the information would still need to be collected by states and districts as required by IDEA.

    But the loss of the central repository of information on significant disproportionality in schools will make it more difficult for advocacy groups and technical assistance centers to support school and district efforts to reduce racial disparities in special education.

    In the absence of the data being available at the federal level, it will be “much more difficult” for people not within a state education agency to be able to access the data, Linscott said.

    Correction: A previous version of this article erred in spelling out the IDEA acronym. It stands for the Individuals with Disabilities Education Act. We have updated our story.


  • Humanizing Higher Ed Data: The Strategic Power of Student Digital Twins

    Higher education institutions are overflowing with data, yet many still struggle to turn that information into actionable insight. With systems siloed across admissions, academics, student support, and alumni relations, it’s hard to get a clear picture of the student journey — let alone use that data to enhance engagement or predict outcomes.

    Enter the “digital twin”: a transformative framework that helps institutions centralize, contextualize, and humanize student data. More than a dashboard or data warehouse, a student digital twin creates a living, dynamic model that reflects how students interact with your institution in real time. It’s the difference between looking at data and understanding a student.

    The data disconnect holding higher ed back

    Disconnected data is one of the most persistent obstacles facing colleges and universities. Key information is often trapped in different systems — student information systems (SIS), learning management systems (LMS), customer relationship management (CRM) tools, financial aid platforms, and more.

    This fragmentation makes it difficult to:

    • Personalize student communications
    • Identify at-risk students in time to intervene
    • Support seamless transfers or cross-departmental collaboration
    • Harness emerging technologies like generative AI

    The result? Missed opportunities, inefficient outreach, and limited visibility into student experiences.

    Demystifying the student digital twin

    A digital twin is a virtual representation of a physical entity. In higher education, that entity is the student. The student digital twin brings together behavioral, academic, and operational data to create a comprehensive, contextual profile of each learner.

    Unlike a static dashboard or data warehouse, a digital twin captures relationships, sequences, and interactions. It enables institutions to:

    • Visualize student journeys across systems
    • Model future scenarios
    • Generate predictive insights
    • Power real-time personalization

    Most importantly, a digital twin humanizes data by shifting the focus from systems to students.
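
    Stripped of vendor specifics, the underlying data structure is a profile keyed on the student rather than on any one system. A generic sketch with hypothetical field names, not Collegis's actual Connected Core model:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class StudentTwin:
        """One student-centric profile stitched together from siloed systems."""
        student_id: str
        sis: dict = field(default_factory=dict)     # enrollment, grades, program
        lms: dict = field(default_factory=dict)     # logins, assignment activity
        crm: dict = field(default_factory=dict)     # outreach and advising touchpoints
        events: list = field(default_factory=list)  # time-ordered interactions

    twin = StudentTwin("S123")
    twin.events.append({"ts": "2025-09-01", "source": "lms", "type": "login"})
    ```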

    What makes it work: The Connected Core® architecture

    At Collegis, the digital twin is powered by Connected Core — a composable, cloud-native platform built specifically for higher education. The architecture includes:

    • Integrated data fabric: A higher ed-specific data layer that unifies SIS, LMS, CRM, and more.
    • Packaged business capabilities: Modular features like lead scoring, advising nudges, and financial aid workflows.
    • Composable platform: A low-code development environment that allows institutions to customize workflows and experiences.

    Together, these elements create an agile foundation for digital transformation and continuous improvement.

    Use cases that drive institutional impact

    Digital twins aren’t theoretical. They’re already delivering measurable value across the student lifecycle. With real implementations across enrollment, student success, and digital engagement, Collegis partners are proving just how powerful a connected data foundation can be.

    These examples show how the digital twin moves from concept to impact:

    • AI lead prioritization: By integrating digital journey signals with CRM intelligence, one partner increased inquiry-to-appointment conversion by 38%.
    • Transfer credit evaluation: AI-driven transcript assessments delivered >85% accuracy in early evaluations, reducing friction for prospective students.
    • AI-powered website search: Semantic search functionality improved engagement by 250% during pilot testing, enhancing conversion potential.

    These outcomes demonstrate how digital twins don’t just aggregate data — they activate it.

    Implementation, integration, and ROI

    One common question we encounter about this concept is, “Can’t we do this with our own data warehouse?” The short answer: not really.

    Data warehouses are optimized for reporting, not real-time personalization. The digital twin’s networked model is designed for operational use, enabling advisors, marketers, and faculty to act in the moment.

    Collegis typically helps institutions realize value within three to six months. Whether starting with a marketing use case or building a full student model, we work with partners to:

    • Identify quick wins
    • Integrate priority data sources
    • Build a data model tailored to their institution

    Why Collegis — and why now?

    Unlike generic analytics platforms, Connected Core is purpose-built for higher education. It’s not a retrofitted enterprise tool. The following features set it apart from other offerings:

    • AI-native and human-centered: It’s designed to deliver explainable, actionable insights.
    • Composed, not constrained: It’s flexible enough to integrate with legacy systems and custom-built tools.
    • A strategic partnership: Collegis provides not just the technology, but the advisory services and data talent to ensure sustained success.

    Start humanizing your student data

    The digital twin helps institutions shift from reactive reporting to proactive engagement. It empowers colleges and universities to not only understand their students better, but to serve them more effectively.

    Ready to explore how a student digital twin could transform your data strategy? Contact us to request a demo!


  • The silent hero of modern learning

    Education is undergoing a profound digital transformation. From immersive AR/VR learning in science labs to hybrid classrooms, real-time collaboration platforms, and remote learning at scale, how students learn and educators teach is changing rapidly. These modern, data-intensive applications require far more than basic connectivity. They demand high bandwidth, ultra-low latency, and rock-solid reliability across every corner of the campus.

    In other words, the minimum requirement today is maximal connectivity. And this is where Optical LAN (OLAN) becomes a game changer.

    The challenge with traditional LANs

    Most schools and universities still rely on traditional copper-based local area networks (LANs). But these aging systems are increasingly unable to meet the demands of today’s digital education environments. Copper cabling comes with inherent speed and distance limitations, requiring rip-and-replace upgrades every 5 to 7 years to keep up with evolving needs.

    To increase network capacity, institutions must replace in-wall cables, switches, and other infrastructure — an expensive, time-consuming, and highly disruptive process. Traditional LANs also come with large physical footprints, high maintenance requirements, and significant energy consumption, all of which add to their total cost of ownership (TCO).

    In a world that’s demanding smarter, faster, and greener networks, it’s clear that copper no longer makes the grade.

    Built for the campus of the future

    Optical LAN is a purpose-built solution for both in-campus and in-building connectivity, leveraging the superior performance of fiber optic infrastructure. It addresses the limitations of copper LANs head-on and offers significant improvements in scalability, energy efficiency and cost-effectiveness.

    Here’s why it’s such a compelling option for education networks:

    1. Massive capacity and seamless scalability

    Fiber offers virtually unlimited bandwidth. Today’s OLAN systems can easily support speeds of 10G and 25G, with future-readiness for 50G and even 100G. And unlike copper networks, education IT managers and operators don’t need to replace the cabling to upgrade; they simply add new wavelengths (light signals) to increase speed or capacity. This means educational institutions can scale up without disruptive overhauls.

    Better yet, fiber allows for differentiated quality of service on a single line. For example, a school can use a 1G wavelength to connect classrooms and dormitories, while allocating 10G bandwidth to high-performance labs. This flexibility is ideal for delivering customized connectivity across complex campus environments.

    2. Extended reach across the entire campus

    One of the standout features of OLAN is its extended reach. Fiber can deliver high-speed connections over distances up to 20–30 km without needing signal boosters or additional switches. This makes it perfect for large campuses where buildings like lecture halls, research centers, dorms, and libraries are spread out over wide areas. In contrast, copper Ethernet runs typically max out at 100 meters, requiring more switches, patch panels, and costly infrastructure.

    With OLAN, a single centralized network can serve the entire campus, reducing complexity and improving performance.

    3. Energy efficiency and sustainability

    Sustainability is top-of-mind for many educational institutions, and OLAN is a clear winner here. Fiber technology is up to 8 times more energy-efficient than other wired or wireless options. It requires fewer active components, generates less heat and significantly reduces the need for cooling.

    Studies show that OLAN uses up to 40 percent less power than traditional LAN systems. This translates into lower electricity bills and a reduced carbon footprint — important factors for schools pursuing green building certifications.

    In fact, a BREEAM (Building Research Establishment Environmental Assessment Method) assessment conducted by ENCON found that deploying OLAN improved BREEAM scores by 7.7 percent, particularly in categories like management, energy, health and materials. For perspective, adding solar panels typically improves BREEAM scores by 5-8 percent.

    4. Simpler, smarter architecture

    Optical LAN significantly simplifies the network design. Instead of multiple layers of LAN switches and complex cabling, OLAN relies on a single centralized switch and slim, passive optical network terminals (ONTs). A single fiber cable can serve up to 128 endpoints, using a fraction of the physical space required by copper bundles.
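
    The 1:128 figure drives the hardware math. A rough comparison, assuming a 2,000-drop campus and 48-port copper access switches (both figures are illustrative assumptions, not from the text):

    ```python
    import math

    endpoints = 2000   # campus network drops to serve (illustrative)
    pon_split = 128    # endpoints per fiber via a 1:128 passive split (per the text)
    copper_ports = 48  # assumed port count of a typical copper access switch

    print(math.ceil(endpoints / pon_split))    # 16 PON ports on one central switch
    print(math.ceil(endpoints / copper_ports)) # 42 distributed copper access switches
    ```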

    This lean architecture means:

    • Smaller cable trays and no heavy-duty racks
    • Faster installation and easier maintenance
    • Fewer points of failure and lower IT footprint

    The result? A network that’s easier to manage, more reliable, and built to grow with an education institution’s needs.

    5. Unmatched cost efficiency

    While fiber was once seen as expensive, the economics have shifted. The Association for Passive Optical LAN (APOLAN) found in 2022 that a passive optical LAN (POL) saved 40 percent of the cost for a four-story building. What’s more, Optical LAN now delivers up to 50 percent lower TCO over a five-year period compared to traditional LAN systems, according to multiple industry studies.

    Cost savings are achieved through:

    • Up to 70 percent less cabling
    • Fewer switches and active components
    • Reduced energy and cooling costs
    • Longer lifecycle as fiber lasts more than 50 years

    In essence, OLAN delivers more value for less money, which is a compelling equation for budget-conscious education institutions.

    The future is fiber

    With the rise of Wi-Fi 7 and ever-increasing demands on network infrastructure, even wireless connectivity depends on robust wired backhaul. Optical LAN ensures that Wi-Fi access points have the bandwidth they need to deliver high-speed, uninterrupted service.

    And as educational institutions continue to adopt smart building technologies, video surveillance, IoT devices, and remote learning platforms, only fiber can keep up with the pace of change.

    Optical LAN empowers educational institutions to build networks that are faster, greener, simpler, and future-proof. With growing expectations from students, faculty, and administrators, now is the perfect time to leave legacy limitations behind and invest in a fiber-powered future.

    After all, why keep replacing copper every few years when operators can build it right once?


  • New Data Highlights Demographic Shifts in College Admissions Prior to Enrollment

    Title: College Enrollment Patterns Are Changing. New Data Show Applicant and Admit Pools Are Too.

    Authors: Jason Cohn, Bryan J. Cook, Victoria Nelson

    Source: Urban Institute

    Since 2020, the world of higher education has changed drastically. It has seen the effects of COVID-19, the end of race-conscious admissions, significant delays in student aid awards from the new FAFSA, and shifting federal and state policies toward DEI.

    The Urban Institute, in collaboration with the Association of Undergraduate Education at Research Universities and the University of Southern California’s Center for Enrollment Research, Policy, and Practice, and in partnership with 18 institutions of higher education, aimed to fill data gaps concerning potential shifts in the racial demographic profiles of students who applied to, were admitted to, and enrolled in four-year IHEs between 2018 and 2024.

    The data analysis found that trends in applicant, admit, and enrollee profiles varied greatly by race and ethnicity. Despite those differences, all participating IHEs recorded an increase in the number of students who chose not to disclose their race or ethnicity in 2024.

    The analysis found substantial changes in Black applicant, admit, and enrollee data. Among Black students at selective institutions (defined as those with acceptance rates below 50 percent), the share of applicants rose from 8.3 percent in 2023 to 8.7 percent in 2024, while the share of admits fell from 6.6 percent to 5.9 percent. The contrast is all the starker because the shares of Black applicants and admits had stayed relatively consistent from 2021 to 2023.

    The analysis also took note of a change in trends for White students. White students were the only group that consistently made up a larger share of admits than of applicants (by six to nine percentage points), even though their share of applicants, admits, and enrollees has steadily declined since 2018.

    The analysis concludes that more data is ultimately needed at every point in the college admissions process. Enrollment data offers insight only into the very end of that process; gathering data throughout a student’s journey to college would give a better grasp of how different types of students interact with higher education.

    Read the full report here.

    —Harper Davis



  • Experts knock new Trump plan to collect college admissions data

    President Donald Trump wants to collect more admissions data from colleges and universities to make sure they’re complying with a 2023 Supreme Court decision that ended race-conscious affirmative action. And he wants that data now. 

    But data experts and higher education scholars warn that any new admissions data is likely to be inaccurate, impossible to interpret and ultimately misused by policymakers. That’s because Trump’s own policies have left the statistics agency inside the Education Department with a skeleton staff and not enough money, expertise or time to create this new dataset. 

    The department already collects data on enrollment from every institution of higher education that participates in the federal student loan program. The results are reported through the Integrated Postsecondary Education Data System (IPEDS). But in an Aug. 7 memorandum, Trump directed the Education Department, which he sought to close in March, to expand that task and provide “transparency” into how some 1,700 colleges that do not admit everyone are making their admissions decisions. And he gave Education Secretary Linda McMahon just 120 days to get it done. 

    Expanding data collection on applicants is not a new idea. The Biden administration had already ordered colleges to start reporting race and ethnicity data to the department this fall in order to track changes in diversity in postsecondary education. But in a separate memorandum to the head of the National Center for Education Statistics (NCES), McMahon asked for even more information, including high school grades and college entrance exam scores, all broken down by race and gender.  

    Bryan Cook, director of higher education policy at the Urban Institute, a think tank in Washington, D.C., called the 120-day timeline “preposterous” because of the enormous technical challenges. For example, IPEDS has never collected high school GPAs. Some schools use a weighted 5.0 scale, giving extra points for advanced classes, and others use an unweighted 4.0 scale, which makes comparisons messy. Other issues are equally thorny. Many schools no longer require applicants to report standardized test scores, and some no longer ask them about race, so the data Trump wants doesn’t exist for those colleges.
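
    The GPA-scale problem is easy to see in miniature. A naive rescaling (an illustration of the pitfall, not a recommended method) flattens exactly the information a weighted scale carries:

    ```python
    def naive_rescale(gpa: float, scale_max: float) -> float:
        """Rescale a GPA onto a 4.0 scale -- discards the weighting information."""
        return gpa * 4.0 / scale_max

    # A 4.6 on a weighted 5.0 scale vs. a 3.7 on an unweighted 4.0 scale
    print(round(naive_rescale(4.6, 5.0), 2))  # 3.68 -- now looks lower than the 3.7,
    print(round(naive_rescale(3.7, 4.0), 2))  # 3.70    despite the harder course load
    ```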

    “You’ve got this effort to add these elements without a mechanism with which to vet the new variables, as well as a system for ensuring their proper implementation,” said Cook. “You would almost think that whoever implemented this didn’t know what they were doing.” 

    Cook has helped advise the Education Department on the IPEDS data collection for 20 years and served on technical review panels, which are normally convened first to recommend changes to the data collection. Those panels were disbanded earlier this year, and there isn’t one set up to vet Trump’s new admissions data proposal.

    Cook and other data experts can’t figure out how a decimated education statistics agency could take on this task. All six NCES employees who were involved in IPEDS data collection were fired in March, and there are only three employees left out of 100 at NCES, which is run by an acting commissioner who also has several other jobs. 

    An Education Department official, who did not want to be named, disputed the claim that no one with IPEDS experience remains inside the department. The official said that staff inside the office of the chief data officer, which is separate from the statistics agency, have a “deep familiarity with IPEDS data, its collection and use.” Former Education Department employees told me that some of these employees have experience in analyzing the data, but not in collecting it.

    In the past, there were as many as a dozen employees who worked closely with RTI International, a scientific research institute, which handles most of the IPEDS data collection work. 

    Technical review eliminated

    Of particular concern is that RTI’s $10 million annual contract to conduct the data collection had been slashed approximately in half by the Department of Government Efficiency, also known as DOGE, according to two former employees, who asked to remain anonymous out of fear of retaliation. Those severe budget cuts eliminated the technical review panels that vet proposed changes to IPEDS, and ended training for colleges and universities to submit data properly, which helped with data quality. RTI did not respond to my request to confirm the cuts or answer questions about the challenges it will face in expanding its work on a reduced budget and staffing.

    The Education Department did not deny that the IPEDS budget had been cut in half. “The RTI contract is focused on the most mission-critical IPEDS activities,” the Education Department official said. “The contract continues to include at least one task under which a technical review panel can be convened.”  

    Additional elements of the IPEDS data collection have also been reduced, including a contract to check data quality.

    Last week, the scope of the new task became more apparent. On Aug. 13, the administration released more details about the new admissions data it wants, describing how the Education Department is attempting to add a whole new survey to IPEDS, called the Admissions and Consumer Transparency Supplement (ACTS), which will disaggregate all admissions data and most student outcome and financial aid data by race and gender. Colleges will have to report on both undergraduate and graduate school admissions. The public has 60 days to comment, and the administration wants colleges to start reporting this data this fall.

    Complex collection

    Christine Keller, executive director of the Association for Institutional Research, a trade group of higher education officials who collect and analyze data, called the new survey “one of the most complex IPEDS collections ever attempted.” 

    Traditionally, it has taken years to make much smaller changes to IPEDS, and universities are given a year to start collecting the new data before they are required to submit it. (Roughly 6,000 colleges, universities and vocational schools are required to submit data to IPEDS as a condition for their students to take out federal student loans or receive federal Pell Grants. Failure to comply results in fines and the threat of losing access to federal student aid.)

    Normally, the Education Department would reveal screenshots of data fields, showing what colleges would need to enter into the IPEDS computer system. But the department has not done that, and several of the data descriptions are ambiguous. For example, colleges will have to report test scores and GPA by quintile, broken down by race and ethnicity and gender. One interpretation is that a college would have to say how many Black male applicants, for example, scored above the 80th percentile on the SAT or the ACT. Another interpretation is that colleges would need to report the average SAT or ACT score of the top 20 percent of Black male applicants. 
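    The two readings yield entirely different kinds of numbers. A minimal sketch of both, using invented scores for a single hypothetical subgroup (the reference cutoff of 1350 is assumed purely for illustration; nothing released so far specifies it):

    ```python
    import statistics

    # Invented SAT scores for one subgroup's applicants at one college.
    scores = sorted([1050, 1120, 1180, 1230, 1290, 1340, 1400, 1460, 1520, 1580])

    # Interpretation 1: a count -- how many applicants scored above the
    # 80th percentile of some reference distribution (assumed cutoff: 1350)?
    REFERENCE_80TH_PERCENTILE = 1350
    count_above = sum(1 for s in scores if s > REFERENCE_80TH_PERCENTILE)

    # Interpretation 2: an average -- the mean score of this subgroup's
    # own top 20 percent of applicants.
    top_fifth = scores[int(len(scores) * 0.8):]
    avg_of_top_fifth = statistics.mean(top_fifth)

    print(count_above)       # 4 (a head count)
    print(avg_of_top_fifth)  # 1550.0 (a score)
    ```

    Without screenshots of the actual data fields, colleges have no way to know which of these two quantities, if either, the department expects.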

    The Association for Institutional Research used to train college administrators on how to collect and submit data correctly and sort through confusing details — until DOGE eliminated that training. “The absence of comprehensive, federally funded training will only increase institutional burden and risk to data quality,” Keller said. Keller’s organization is now dipping into its own budget to offer a small amount of free IPEDS training to universities.

    The Education Department is also requiring colleges to report five years of historical admissions data, broken down into numerous subcategories. Institutions have never been asked to keep data on applicants who didn’t enroll. 

    “It’s incredible they’re asking for five years of prior data,” said Jordan Matsudaira, an economist at American University who worked on education policy in the Biden and Obama administrations. “That will be square in the pandemic years when no one was reporting test scores.”

    ‘Misleading results’

    Matsudaira explained that IPEDS had considered asking colleges for more academic data by race and ethnicity in the past and the Education Department ultimately rejected the proposal. One concern is that slicing and dicing the data into smaller and smaller buckets would mean that there would be too few students and the data would have to be suppressed to protect student privacy. For example, if there were two Native American men in the top 20 percent of SAT scores at one college, many people might be able to guess who they were. And a large amount of suppressed data would make the whole collection less useful.
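    A minimal sketch of how such small-cell suppression typically works; the threshold of 10 is an assumption for illustration, since agencies set their own minimum cell sizes:

    ```python
    # Invented counts of high-scoring applicants, by subgroup, at one college.
    cell_counts = {
        "White women": 412,
        "Asian men": 263,
        "Hispanic men": 38,
        "Native American men": 2,  # small enough to risk re-identification
    }

    SUPPRESSION_THRESHOLD = 10  # assumed minimum publishable cell size

    def suppress_small_cells(counts, threshold=SUPPRESSION_THRESHOLD):
        """Mask any cell below the threshold so individuals can't be identified."""
        return {group: (n if n >= threshold else "<suppressed>")
                for group, n in counts.items()}

    print(suppress_small_cells(cell_counts))
    # {'White women': 412, 'Asian men': 263, 'Hispanic men': 38,
    #  'Native American men': '<suppressed>'}
    ```

    The finer the race-by-gender-by-quintile slicing, the more cells fall below any such threshold, and the more of the published table ends up blank.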

    Also, small numbers can lead to wacky results. For example, a small college could have only two Hispanic male applicants with very high SAT scores. If both were accepted, that’s a 100 percent admittance rate. If only 200 white women out of 400 with the same test scores were accepted, that would be only a 50 percent admittance rate. On the surface, that can look like both racial and gender discrimination. But it could have been a fluke. Perhaps both of those Hispanic men were athletes and musicians. The following year, the school might reject two different Hispanic male applicants with high test scores but without such impressive extracurriculars. The admissions rate for Hispanic males with high test scores would drop to zero. “You end up with misleading results,” said Matsudaira. 

    Reporting average test scores by race is another big worry. “It feels like a trap to me,” said Matsudaira. “That is mechanically going to give the administration the pretense of claiming that there’s lower standards of admission for Black students relative to white students when you know that’s not at all a correct inference.”

    The statistical issue is that there are more Asian and white students at the very high end of the SAT score distribution, and all those perfect 1600s will pull the average up for these racial groups. (Just like a very tall person will skew the average height of a group.) Even if a college has a high test score threshold that it applies to all racial groups and no one below a 1400 is admitted, the average SAT score for Black students will still be lower than that of white students. (See graphic below.) The only way to avoid this is to purely admit by test score and take only the students with the highest scores. At some highly selective universities, there are enough applicants with a 1600 SAT to fill the entire class. But no institution fills its student body by test scores alone. That could mean overlooking applicants with the potential to be concert pianists, star soccer players or great writers.

    The Average Score Trap

    This graphic by Kirabo Jackson, an economist at Northwestern University, depicts the problem of measuring racial discrimination through average test scores. Even for a university that admits all students above a certain cut score, the average score of one racial group (red) will be higher than the average score of the other group (blue). Source: graphic posted on Bluesky Social by Josh Goodman
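    The truncation effect in the graphic can be reproduced with a short simulation. This is a minimal sketch, assuming normally distributed applicant pools; the pool means, the standard deviation, and the 1400 cutoff are invented parameters, not figures from Jackson’s graphic:

    ```python
    import random

    random.seed(0)
    CUTOFF = 1400  # the same admission bar applied to every applicant

    def admitted_mean(pool_mean, n=100_000, sd=150):
        """Mean score of admitted students drawn from one applicant pool.
        Scores are sampled from a normal distribution; the 1600 ceiling
        is ignored for simplicity."""
        draws = (random.gauss(pool_mean, sd) for _ in range(n))
        admitted = [s for s in draws if s >= CUTOFF]
        return sum(admitted) / len(admitted)

    # Identical cutoff, different underlying score distributions.
    print(round(admitted_mean(1200)))  # ~1470: most admits barely clear the bar
    print(round(admitted_mean(1350)))  # ~1503: more applicants sit far above it
    ```

    Both pools face exactly the same 1400 bar, yet the admitted averages differ by roughly 30 points, which is the misleading inference Matsudaira warns about.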

    Admissions data is a highly charged political issue. The Biden administration originally spearheaded the collection of college admissions data by race and ethnicity. Democrats wanted to collect this data to show how the nation’s colleges and universities were becoming less diverse with the end of affirmative action. That collection is slated to start this fall, following a full technical and procedural review.

    Now the Trump administration is demanding what was already in the works, and adding a host of new data requirements — without following normal processes. And instead of tracking the declining diversity in higher education, Trump wants to use admissions data to threaten colleges and universities. If the new directive produces bad data that is easy to misinterpret, he may get his wish.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about college admissions data was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

    Source link

  • Week in review: Details emerge on plans to collect new admissions data

    Week in review: Details emerge on plans to collect new admissions data

    Most clicked story of the week:

    Nearly three dozen selective colleges are facing an antitrust lawsuit alleging they used the early decision admissions process to reduce competition and inflate prices. Also named as defendants are application platforms Common App and Scoir, as well as the Consortium on Financing Higher Education, an information-sharing coalition of selective liberal arts colleges.

    By the numbers


    740,000

    That’s the estimated number of work hours the higher education sector can expect to add as a result of the U.S. Department of Education’s plan to collect new data from colleges on their applicants’ race and sex. Behind the push is the Trump administration’s hostility toward diversity initiatives and its aggressive approach to enforcing the U.S. Supreme Court’s ban on race-based admissions.

    Anti-DEI push in courts, board rooms and classrooms:

    • A federal judge declined to block Alabama’s governor from enforcing a new law that eliminates diversity, equity and inclusion offices and forbids colleges from requiring students to adopt a long list of “divisive concepts.” The professors and students who sued over the law expressed concerns that it is overly vague and restricts their free speech rights. 
    • The Iowa Board of Regents adopted a new policy requiring public university faculty to present controversial subjects “in a way that reflects the range of scholarly views and ongoing debate in the field.” Before last week’s vote, the board stripped the proposal’s original language around DEI and critical race theory after public pushback. But one regent noted the policy does not define “controversial” and raised questions about who would. 
    • Students for Fair Admissions dropped its lawsuits against the U.S. Military Academy at West Point and the U.S. Air Force Academy over race-conscious admissions. Both academies dropped their diversity efforts in admissions earlier this year under a directive from the Trump administration. 

    Quote of the week:


    “Our actions clearly demonstrate our commitment to addressing antisemitic actions and promoting an inclusive campus environment by upholding a safe, respectful, and accountable environment.”

    George Washington University


    The private institution became one of the latest targets of the Trump administration, which claimed the university was indifferent to harassment of Jewish and Israeli students on its Washington, D.C., campus. As with its accusations against a handful of other colleges, the administration cited a pro-Palestinian protest encampment at GWU in spring 2024. The university asked the local police to clear the encampment shortly after it was formed.

    Source link