Tag: Data

  • Data Shows Uptake of Statewide Digital Mental Health Support

    In 2023, New Jersey’s Office of the Secretary of Higher Education signed a first-of-its-kind agreement with a digital mental health provider, Uwill, to provide free access to virtual mental health services to college students across the state.

    Over the past two years, 18,000-plus students across 45 participating colleges and universities have registered with the service, representing about 6 percent of the eligible postsecondary population. The state considers the partnership a success and hopes to codify the offering to ensure its sustainability beyond the current governor’s term.

    The details: New Jersey’s partnership with Uwill was spurred by a 2021 survey of 15,500 undergraduate and graduate students from 60 institutions in the state, which found that 70 percent of respondents rated their stress and anxiety as higher in fall 2021 than in fall 2020. Forty percent indicated they were concerned about their mental health in light of the pandemic.

    Under the agreement, students can use Uwill’s teletherapy, crisis connection and wellness programming at any time. Like others in the teletherapy space, Uwill offers an array of diverse licensed mental health providers, giving students access to therapists who share their backgrounds or language, or who reside in their state. Over half (55 percent) of the counselors Uwill hires in New Jersey are Black, Indigenous or people of color; among them, they speak 11 languages.

    What makes Uwill distinct from its competitors is that therapy services are on-demand, meaning students are matched with a counselor within minutes of logging on to the platform. Students can request to see the same counselor in the future, but the nearly immediate access ensures they are not caught in long wait or intake times, especially compared to in-person counseling services.

    Under New Jersey’s agreement, colleges and students do not pay for Uwill services, but colleges must receive state aid to be eligible.

    The research: The need for additional counseling capacity on college campuses has grown over the past decade, as an increasing number of students enter higher education with pre-existing mental health conditions. The most recent survey of counseling center staff by the Association for University and College Counseling Center Directors (AUCCCD) found that while demand for services is on the decline compared to recent years, a larger number of students have more serious conditions.

    Over half of four-year institutions and about one-third of community colleges nationwide provide teletherapy to students via third-party vendors, according to AUCCCD data. The average number of students who engaged with those services in 2024 was 453, regardless of institution size.

    Online therapy providers tout the benefits of having a service that supplements on-campus, in-person therapists’ services to provide more comprehensive care, including racially and ethnically diverse staff, after-hours support and on-demand resources for students.

    Eric Wood, director of counseling and mental health at Texas Christian University, told Inside Higher Ed that an ideal teletherapy vendor is one that increases capacity for on-campus services, expanding availability for on-campus staff and ensuring that students do not fall through the cracks.

    A 2024 analysis of digital mental health tools from the Hope Center at Temple University—which did not include Uwill—found they can improve student mental health, but there is little direct evidence regarding marginalized student populations’ use of or benefits from them. Instead, the greatest benefit appears to be for students who would not otherwise engage in traditional counseling or who simply seek preventative resources.

    One study featured in the Hope Center’s report noted the average student only used their campus’s wellness app or teletherapy service once; the report calls for more transparency around usage data prior to institutional investment.

    The data: Uwill reported that from April 2023 to May 2025, 18,207 New Jersey students engaged with its services at the 45 participating institutions, which include Princeton, Rutgers, Montclair State and Seton Hall Universities, as well as the New Jersey Institute of Technology and Stevens Institute of Technology. Engaged students were defined as any students who logged in to the app and created an account.

    New Jersey’s total college enrollment in 2022 was 378,819, according to state data. An Inside Higher Ed analysis of publicly available data found total enrollment (including undergraduate and graduate students) among the 45 participating colleges to be 327,353. Uwill participants in New Jersey, therefore, totaled around 4 percent of the state’s postsecondary students or 6 percent of eligible students.

    The state paid $4 million for the first year of the Uwill contract, as reported by Higher Ed Dive, pulling dollars from a $10 million federal grant to support pandemic relief and a $16 million budget allocation for higher education partnerships. That totals about $89,000 per institution for the first year alone, or $12 per eligible student, according to an Inside Higher Ed estimate.
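    The per-institution and per-student figures above are simple ratios of numbers cited in this article; as a quick back-of-the-envelope sketch (Python used here only as a calculator, not as part of the reporting):

```python
# Figures cited in the article above.
registered = 18_207                 # NJ students with Uwill accounts, April 2023 to May 2025
statewide_enrollment = 378_819      # New Jersey total college enrollment, 2022
participating_enrollment = 327_353  # total enrollment at the 45 participating institutions
contract_total = 4_000_000          # first-year contract cost, in dollars
institutions = 45

share_statewide = registered / statewide_enrollment        # ~0.048 of all NJ college students
share_eligible = registered / participating_enrollment     # ~0.056, rounded to 6 percent in the text
cost_per_institution = contract_total / institutions       # ~$88,889, i.e., "about $89,000"
cost_per_student = contract_total / participating_enrollment  # ~$12.22, i.e., "$12 per eligible student"
```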

    In a 2020 interview with Inside Higher Ed, Uwill CEO Michael London said the minimum cost to a college for one year of services is about $25,000, or $10 to $20 per student per year.

    New Jersey students met with counselors in more than 78,000 therapy sessions, or about six sessions per student between 2023 and 2025, according to Uwill data. Students also engaged in 548 chat sessions with therapists, sent 6,593 messages and requested 1,216 crisis connections during the first two years of service.

    User engagement has slowly ticked up since the partnership launched. In January 2024, the state said more than 7,600 students registered on the platform, scheduling nearly 20,000 sessions. By September 2024, Uwill reported more than 13,000 registered students on the platform, scheduling more than 49,000 sessions. The most recent data, published June 6, identified 18,000 students engaging in 78,000 sessions.

    Over 1,200 of Montclair State’s 22,000 students have registered with Uwill since June 2023, Jaclyn Friedman-Lombardo, Montclair State’s director of counseling and psychological services, said at a press conference, or approximately 6 percent of the total campus population.

    The state does not require institutions to track student usage data or to compare that usage to campus counseling center services, but some institutions choose to, according to a spokesperson for both the secretary’s office and Uwill. The secretary’s office can view de-identified campus-level data, and institutions can access more detailed data as well.

    Creating access: One goal of implementing digital mental health interventions is to expand access beyond what traditional counseling centers can offer, such as support after hours, on weekends or over academic breaks.

    Roughly 30 percent of participants in the Uwill partnership completed a session between 5 p.m. and 9 a.m. on weeknights or on weekends. Over the 2024–25 winter break, students engaged in 3,073 therapy sessions; more than 90 of those took place outside New Jersey. Students also used Uwill services over summer vacation this past year (9,235 sessions from May 20 to Aug. 26, of which 10 percent took place outside New Jersey).

    A majority of users were traditional-aged college students (17 to 24 years old), and 32 percent were white, 25 percent Hispanic and 17 percent Black. The report did not compare participating students’ race to those using on-campus services or general campus populations.

    About 85 percent of New Jersey users were looking for a BIPOC therapist, and 9 percent requested therapists who speak languages other than English, including Hindi and Mandarin.

    Postsession assessments completed by students who schedule appointments have returned positive responses, with a feedback score of 9.5 out of 10 in New Jersey, compared to Uwill’s 9.2 rating nationally.

    Unanswered questions: Wood said the data leaves some questions unanswered, such as whether students were also clients at the on-campus counseling center, or whether the service improved students’ mental health over time from a clinical perspective.

    “Just because a student had four sessions with a telehealth provider, if they came right back to the counseling center, did it really make an impact on the center’s capacity to see students?” Wood said.

    The high cost of the service should also give counseling center directors pause, Wood said, because those dollars could be used for a variety of other interventions to create capacity.

    The data indicated some benefits to counseling center capacity, including diverse staff and after-hours support. But to create a true return on investment, counseling centers should calculate how much capacity the tele–mental health service created and its direct impact on student wellness, not just participation in services.

    “It would be ideal to compare the number of students receiving services (not just creating an account) through the platform to the number of students who would likely benefit from receiving treatment, as identified by clinically validated mental health screens on population surveys,” said Sara Abelson, assistant professor at the Hope Center and the report’s lead author.

    What’s next: New Jersey renewed its contract with Uwill first in January 2024 and then again in May, extending through spring 2026. State leaders said the ongoing services are still supported by pandemic relief funds.

    On May 2, New Jersey assemblywoman Andrea Katz from the Eighth District introduced a bill, the Mental Health Early Access on Campus Act, which would require colleges to implement mental health first aid training among campus stakeholders, peer support programs, mental health orientation education and teletherapy services to ensure counseling ratios are one to every 1,250 students per campus. The International Accreditation of Counseling Services recommends universities maintain a ratio of at least one full-time equivalency for every 1,000 to 1,500 students.

    “We know that mental health services that our kids need are not going to end when we change governors,” Katz said at a press conference. “We need to make sure that all of this is codified into law.”

  • How to Build a High-Impact Data Team Without the Full-Time Headcount [Webinar]

    You’re under increased pressure to make better, data-informed decisions. However, most colleges and universities don’t have the budget to build the kind of data team that drives strategic progress. And even if you can hire, you’re competing with other industries that pay top dollar, making it hard, if not impossible, to find the right data resource with all the skills to move your operation forward. Don’t let hiring roadblocks make you settle for siloed insights and stagnant dashboards.

    How to Build a High-Impact Data Team
    Without the Full-Time Headcount
    Thursday, June 26
    2:00 pm ET / 1:00 pm CT 

    In this webinar, Jeff Certain, VP of Solution Development and Go-to-Market, and Dan Antonson, AVP of Data and Analytics, break down how a managed services model can help you create a high-impact data team at a fraction of the cost and give you access to a robust bench of highly specialized data talent. They will also share some real-world examples of nimble, high-impact data teams in action. 

    You’ll walk away knowing: 

    • Which data roles are needed for success and scale in higher ed 
    • How to rapidly scale data operations without adding FTEs 
    • Why managed services are a smarter investment than full-time hires 
    • Ways to tap into cross-functional expertise on demand 
    • How to build a future-ready data infrastructure without ballooning your org chart 

    Whether you’re starting from scratch or trying to scale a lean team, this session will offer practical, flexible strategies to get there faster — and more cost-effectively.  

    Who Should Attend:

    If you are a data-minded decision-maker in higher ed or a cabinet-level leader being asked to do more with less, this webinar is for you. 

    • Presidents and provosts 
    • CFOs and COOs 
    • Enrollment and marketing leaders  

    Expert Speakers

    Jeff Certain

    VP of Solution Development and Go-to-Market

    Collegis Education

    Dan Antonson

    AVP of Data and Analytics

    Collegis Education

    It’s time to move past the piecemeal approach and start driving real outcomes with your data. Complete the form to reserve your spot! We look forward to seeing you on Thursday, June 26. 

  • Education Department reinstates some research and data activities

    Education Secretary Linda McMahon has repeatedly said that the February and March cancellations and firings at her department cut not only the “fat” but also into some of the “muscle” of the federal role in education. So, even as she promises to dismantle her department, she is also bringing back some people and restarting some activities. Court filings and her own congressional testimony illuminate what this means for the agency as a whole, and for education research in particular. 

    McMahon told a U.S. House committee last month she rehired 74 employees out of the roughly 2,000 who were laid off or agreed to separation packages. A court filing earlier this month says the agency will revive about a fifth of research and statistics contracts killed earlier this year, at least for now, though that doesn’t mean the work will look exactly as it did before.  

    The Trump administration disclosed in a June 5 federal court filing in Maryland that it either has or is planning to reinstate 20 of 101 terminated contracts to comply with congressional statutes. More than half of the reversals will restart 10 regional education laboratories that the Trump administration had said were engaged in “wasteful and ideologically driven spending,” but had been very popular with state education leaders. The reinstatements also include an international assessment, a study of how to help struggling readers, and Datalab, a web-based data analysis tool for the public. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Even some of the promised reinstatements are uncertain because the Education Department plans to put some of them up for new bids (see table below). That process could take months and potentially result in smaller contracts with fewer studies or hours of technical assistance. 

    These research activities were terminated by Elon Musk’s Department of Government Efficiency (DOGE) before McMahon was confirmed by the Senate. The Education Department’s disclosure of the reinstatements occurred a week after President Donald Trump bid farewell to Musk in the Oval Office and on the same day that the Trump-Musk feud exploded on social media. 

    See which IES contracts have been or are slated to be restarted, or are under consideration for reinstatement:

    Description | Status
    1. Regional Education Laboratory – Mid Atlantic | Intends to seek new bids and restart contract
    2. Regional Education Laboratory – Southwest | Intends to seek new bids and restart contract
    3. Regional Education Laboratory – Northwest | Intends to seek new bids and restart contract
    4. Regional Education Laboratory – West | Intends to seek new bids and restart contract
    5. Regional Education Laboratory – Appalachia | Intends to seek new bids and restart contract
    6. Regional Education Laboratory – Pacific | Intends to seek new bids and restart contract
    7. Regional Education Laboratory – Central | Intends to seek new bids and restart contract
    8. Regional Education Laboratory – Midwest | Intends to seek new bids and restart contract
    9. Regional Education Laboratory – Southeast | Intends to seek new bids and restart contract
    10. Regional Education Laboratory – Northeast and Islands | Intends to seek new bids and restart contract
    11. Regional Education Laboratory – umbrella support contract | Intends to seek new bids and restart contract
    12. What Works Clearinghouse (website and training reviewers, but no reviewing of education research) | Approved for reinstatement
    13. Statistical standards and data confidentiality technical assistance for the National Center for Education Statistics | Reinstated
    14. Statistical and confidentiality review of electronic data files and technical reports | Approved for reinstatement
    15. Datalab, a web-based data analysis tool for the public | Approved for reinstatement
    16. U.S. participation in the Program for International Student Assessment (PISA), an international test overseen by the Organization for Economic Cooperation and Development (OECD) | Reinstated
    17. Data quality and statistical methodology assistance | Reinstated
    18. EDFacts, a collection of administrative data from school districts around the country | Reinstated
    19. Demographic and geospatial estimates (e.g., school poverty and school locations) used for academic research and federal program administration | Approved for reinstatement
    20. Evaluation of the Multi-tiered System of Supports in reading, an approach to help struggling students | Approved for reinstatement
    21. Implementation of the Striving Readers Comprehensive Literacy Program and feasibility of conducting an impact evaluation of it | Evaluating whether to restart
    22. Policy-relevant findings for the National Evaluation of Career and Technical Education | Evaluating whether to restart
    23. The National Postsecondary Student Aid Study (how students finance college, college graduation rates and workforce outcomes) | Evaluating whether to restart
    24. Additional higher ed studies | Evaluating whether to restart
    25. Publication assistance on educational topics and the annual report | Evaluating whether to restart
    26. Conducting peer review of applications, manuscripts and grant competitions at the Institute of Education Sciences | Evaluating whether to restart
    The Education Department press office said it had no comment beyond what was disclosed in the legal brief. 

    Education researchers, who are suing the Trump administration to restore all of its previous research and statistical activities, were not satisfied.

    Elizabeth Tipton, president of the Society for Research on Educational Effectiveness (SREE), said the limited reinstatement is “upsetting.” “They’re trying to make IES as small as they possibly can,” she said, referring to the Institute of Education Sciences, the department’s research and data arm.

    SREE and the American Educational Research Association (AERA) are suing McMahon and the Education Department in the Maryland case. The suit asks for a temporary reinstatement of all the contracts and the rehiring of IES employees while the courts adjudicate the broader constitutional issue of whether the Trump administration violated congressional statutes and exceeded its executive authority.

    The 20 reinstatements were not ordered by the court, and in some instances, the Education Department is voluntarily restarting only a small slice of a research activity, making it impossible to produce anything meaningful for the public. For example, the department said it is reinstating a contract for operating the What Works Clearinghouse, a website that informs schools about evidence-based teaching practices. But, in the legal brief, the department disclosed that it is not planning to reinstate any of the contracts to produce new content for the site. 

    Related: Education researchers sue Trump administration, testing executive power

    In the brief, the administration admitted that congressional statutes mention a range of research and data collection activities. But the lawyers argued that the legislative language often uses the word may instead of must, or notes that evaluations of education programs should be done “as time and resources allow.”

    “Read together, the Department has wide discretion in whether and which evaluations to undertake,” the administration lawyers wrote. 

    The Trump administration argued that as long as it has at least one contract in place, it is technically fulfilling a congressional mandate. For example, Congress requires that the Education Department participate in international assessments. That is why it is now restarting the contract to administer the Program for International Student Assessment (PISA), but not other international assessments that the country has participated in, such as the Trends in International Mathematics and Science Study (TIMSS).

    The administration argued that researchers didn’t make a compelling case that they would be irreparably harmed if many contracts were not restarted. “There is no harm alleged from not having access to as-yet uncreated data,” the lawyers wrote.

    One of the terminated contracts was supposed to help state education agencies create longitudinal data systems for tracking students from pre-K to the workforce. The department’s brief says that states, not professional associations of researchers, should sue to restore those contracts. 

    Related: DOGE’s death blow to education studies

    In six instances, the administration said it was evaluating whether to restart a study. For example, the legal brief says that because Congress requires the evaluation of literacy programs, the department is considering a reinstatement of a study of the Striving Readers Comprehensive Literacy Program. But lawyers said there was no urgency to restart it because there is no deadline for evaluations in the legislative language.

    In four other instances, the Trump administration said it wasn’t feasible to restart a study, despite congressional requirements. For example, Congress mandates that the Education Department identify and evaluate promising adult education strategies. But after terminating such a study in February, the Education Department admitted that it is now too difficult to restart it. The department also said it could not easily restart two studies of math curricula in low-performing schools. One of the studies called for the math program to be implemented in the first year and studied in the second year, which made it especially difficult to restart. A fourth study the department said it could not restart would have evaluated the effectiveness of extra services to help teens with disabilities transition from high school to college or work. When DOGE pulled the plug on that study, those teens lost those services too. 

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about the reinstatement of education statistics and research was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

  • Data Shows Attendance Improves Student Success

    Prior research shows attendance is one of the best predictors of class grades and student outcomes, creating a strong argument for faculty to incentivize or require attendance.

    Attaching grades to attendance, however, can create its own challenges, because many students generally want more flexibility in their schedules and think they should be assessed on what they learn—not how often they show up. A student columnist at the University of Washington expressed frustration at receiving a 20 percent weighted participation grade, which the professor graded based on exit tickets students submitted at the end of class.

    “Our grades should be based on our understanding of the material, not whether or not we were in the room,” Sophie Sanjani wrote in The Daily, UW’s student paper.

    Keenan Hartert, a biology professor at Minnesota State University, Mankato, set out to understand the factors affecting students’ performance in his own course and found that attendance was one of the strongest predictors of their success.

    His finding wasn’t an aha moment, but reaffirmed his position that attendance is an early indicator of GPA and class community building. The challenge, he said, is how to apply such principles to an increasingly diverse student body, many of whom juggle work, caregiving responsibilities and their own personal struggles.

    “We definitely have different students than the ones I went to school with,” Hartert said. “We do try to be the most flexible, because we have a lot of students that have a lot of other things going on that they can’t tell us. We want to be there for them.”

    Who’s missing class? It’s not uncommon for a student to miss class for illness or an outside conflict, but higher rates of absence among college students in recent years are giving professors pause.

    An analysis of 1.1 million students across 22 major research institutions found that the number of hours students have spent attending class, discussion sections and labs declined dramatically from the 2018–19 academic year to 2022–23, according to the Student Experience in the Research University (SERU) Consortium.

    More than 30 percent of students who attended community college in person skipped class sometimes in the past year, a 2023 study found; 4 percent said they skipped class often or very often.

    Students say they opt out of class for a variety of reasons, including lack of motivation, competing priorities and external challenges. A professor at Colorado State University surveyed 175 of his students in 2023 and found that 37 percent said they regularly did not attend class because of physical illness, mental health concerns, a lack of interest or engagement, or simply because it wasn’t a requirement.

    A 2024 survey from Trellis Strategies found that 15 percent of students missed class sometimes due to a lack of reliable transportation. Among working students, one in four said they regularly missed class due to conflicts with their work schedule.

    High rates of anxiety and depression among college students may also affect their attendance. More than half of 817 students surveyed by Harmony Healthcare IT in 2024 said they’d skipped class due to mental health struggles; one-third of respondents said they’d failed a test because of poor mental health.

    A case study: MSU Mankato’s Hartert collected data on about 250 students who enrolled in his 200-level genetics course over several semesters.

    Using an end-of-term survey, class activities and his own grade book information, Hartert collected data measuring student stress, hours slept, hours worked, number of office hours attended, class attendance and quiz grades, among other metrics.

    Mapping out the various factors, Hartert’s case study modeled other findings in student success literature: a high number of hours worked correlated negatively with the student’s course grade, while attendance in class and at review sessions correlated positively with academic outcomes.

    Data analysis by Keenan Hartert, a biology professor at Minnesota State University, Mankato, found student employment negatively correlated with students’ overall class grades. (Chart: Keenan Hartert)

    The data also revealed to Hartert some of the challenges students face while enrolled. “It was brutal to see how many students [were working full-time]. Just seeing how many were [working] over 20 [hours] and how many were over 30 or 40, it was different.”

    Nationally, two-thirds of college students work for pay while enrolled, and 43 percent of employed students work full-time, according to fall 2024 data from Trellis Strategies.

    Hartert also asked students if they had any financial resources to support them in case of emergency; 28 percent said they had no fallback. Of those students, 90 percent were working more than 20 hours per week.

    Four pie charts from Hartert’s analysis of student surveys show that working students often lack financial resources to support them in an emergency, and that working more hours is connected to passing or failing a course.

    The findings illustrated to him the challenges many students face in managing their job shifts while trying to meet attendance requirements.

    A Faculty Aside

    While some faculty may be less interested in using predictive analytics for their own classes, Hartert found that tracking factors such as how often a student attends office hours also benefited his own career, because he could include those measurements in his tenure review.

    An interpersonal dynamic: A less measured factor in the attendance debate is not a student’s own learning, but the classroom environment they contribute to. Hartert framed it as students motivating their peers unknowingly. “The people that you may not know that sit around you and see you, if you’re gone, they may think, ‘Well, they gave up, why should I keep trying?’ Even if they’ve never spoken to you.”

    One professor at the University of Oregon found that peer engagement positively correlated with academic outcomes. Raghuveer Parthasarathy restructured his general education physics course to promote engagement by creating an “active zone,” or a designated seating area in the classroom where students sat if they wanted to participate in class discussions and other active learning conversations.

    Compared to other sections of the course, the class was more engaged across the board, even among those who didn’t opt to sit in the participation zone. Additionally, students who sat in the active zone were more likely to earn higher grades on exams and in the course over all.

    Attending class can also create connections between students and professors, something students say they want and expect.

    A May 2024 student survey by Inside Higher Ed and Generation Lab found that 35 percent of respondents think their academic success would be most improved by professors getting to know them better. In a separate question, 55 percent of respondents said they think professors are at least partly responsible for becoming a mentor.

    The SERU Consortium found student respondents in 2023 were less likely to say a professor knew or had learned their name compared to their peers in 2013. Students were also less confident that they knew a professor well enough to ask for a letter of recommendation for a job or graduate school.

    “You have to show up to class then, so I know who you are,” Hartert said.

    Meeting in the middle: To encourage attendance, Hartert employs active learning methods such as creative writing or case studies, which help demonstrate the value of class participation. His favorite is a jury scenario, in which students put their medical expertise into practice with criminal cases. “I really try and get them in some gray-area stuff and remind them, just because it’s a big textbook doesn’t mean that you can’t have some creative, fun ideas,” Hartert said.

    For those who can’t make it, all of Hartert’s lectures are recorded and available online to watch later. Recording lectures, he said, “was a really hard bridge to cross, post-COVID. I was like, ‘Nobody’s going to show up.’ But every time I looked at the data [for] who was looking at the recording, it’s all my top students.” That was reason enough for him to leave the recordings available as additional practice and resources.

    Students who can’t make an in-person class session can receive attendance credit by sending Hartert their notes and answers to any questions asked live during the class, proving they watched the recording.

    Hartert has also made adjustments to how he uses class time to create more avenues for working students to engage. His genetics course includes a three-hour lab section, which rarely lasts the full time, Hartert said. Now, the final hour of the lab is a dedicated review session facilitated by peer leaders, who use practice questions Hartert designed. Initial data shows that working students who stayed for the review portion of the lab performed better on their exams.

    “The good news is when it works out, like when we can make some adjustments, then we can figure our way through,” Hartert said. “But the reality of life is that time marches on and things happen, and you gotta choose a couple priorities.”

    Do you have an academic intervention that might help others improve student success? Tell us about it.

  • Preserving the Federal Data Trump Is Trying to Purge

    Within days of taking office, the Trump administration began purging federal demographic data—on a wide range of topics, including public health, education and climate—from government websites to comply with the president’s bans on “gender ideology” and diversity, equity and inclusion initiatives.

    Over the past five months, more than 3,000 taxpayer-funded data sets—many congressionally mandated—collected by federal agencies including the Centers for Disease Control and Prevention, the National Center for Education Statistics, and the Census Bureau have been caught in the crossfire.

    One of the first data sets to disappear was the White House Council on Environmental Quality’s Climate and Economic Justice Screening Tool, an interactive map of U.S. Census tracts “marginalized by underinvestment and overburdened by pollution,” according to a description written under a previous administration.

    It’s the type of detailed, comprehensive data academics rely on to write theses, dissertations, articles and books that often help to inform public policy. And without access to it and reams of other data sets, researchers in the United States and beyond won’t have the information they need to identify social, economic and technological trends and forge potential solutions.

    “Removing this data is removing a big piece of knowledge from humanity,” said Cathy Richards, a civic science fellow and data inclusion specialist at the Open Environmental Data Project, which aims to strengthen the role of data in environmental and climate governance. “A lot of science is about innovating on what people did before. New scientists work with data they may have never seen before, but they’re using the knowledge that came before them to create something better. I don’t think we fully understand the impact [that] deleting 50 years of knowledge will have on science in the future.”

    That’s why she and scores of other concerned academic librarians, researchers and data whizzes are collaborating—many of them as unpaid volunteers—to preserve as much of that data as they can on nongovernment websites. Some of the groups involved include OEDP, the Data Rescue Project, Safeguarding Research and Culture, the Internet Archive, the End of Term Archive, and the Data.gov Archive, which is run by the Harvard Law School Library.

    For Richards at OEDP, data-preservation efforts started right after Trump won the election in November.

    She and her colleagues remembered how Trump, a climate change denier, had removed some—mostly environmental—data in 2017, and they wanted to get a head start on preserving any data that could become a target during his second term. OEDP, which launched in 2020 in response to the first Trump administration’s environmental policies, which prioritized fossil fuel extraction, compiled a list of about 200 potentially vulnerable federal data sets researchers said would be critical to continuing their work. They spent the last two months of 2024 and the first weeks of 2025 collecting and downloading as many data sets as they could ahead of Trump’s Jan. 20 inauguration, which they then transferred to stable, independent and publicly accessible webpages.

    “That took time,” Richards said, noting that not every data set and its accompanying metadata was easy to replicate. “Each varied significantly. Some required scraping. In one case I had to manually download 400 files, clicking each one every few minutes.”
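    The grunt work Richards describes – pulling files down one at a time and pausing between requests – is the kind of task that gets scripted wherever a site allows it. A minimal, hypothetical sketch of that pattern (the URLs, folder name and delay are illustrative assumptions, not OEDP's actual tooling):

```python
import time
import urllib.request
from pathlib import Path

# Hypothetical list of data-set files to mirror; real rescue efforts
# work from curated inventories of vulnerable federal data sets.
URLS = [
    "https://example.gov/data/file1.csv",
    "https://example.gov/data/file2.csv",
]

def mirror(urls, dest="archive", delay=2.0):
    """Download each URL into dest, pausing politely between requests."""
    out = Path(dest)
    out.mkdir(exist_ok=True)
    for url in urls:
        target = out / url.rsplit("/", 1)[-1]
        if target.exists():       # skip files already archived
            continue
        urllib.request.urlretrieve(url, target)
        time.sleep(delay)         # avoid hammering the server
```

    In practice, as Richards notes, not every data set can be captured this way – some portals require scraping or manual clicking, and the accompanying metadata often has to be reconstructed separately.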

    While they made a lot of headway, OEDP’s small team wasn’t able to preserve all of the data sets on their list by late January. And once Trump took office, the research community’s fears that the president would start scrubbing federal data were quickly realized.

    “Data started to go down very quickly,” at a much larger scale compared to 2017, Richards said, with anything that mentioned race, gender or the LGBTQ+ community, among other keywords, becoming a target. “We started getting emails from people saying these websites were no longer working, panicking because they needed it to finish their thesis.”

    As of this month, OEDP has completed archiving about 100 data sets, including the CDC’s Pregnancy Mortality Surveillance System, the Census Bureau’s American Community Survey, and the White House’s Climate and Economic Justice Screening Tool. As it works to complete dozens more, it’s also in communication with the other data-preservation efforts to make sure the work isn’t duplicated and that researchers and the general public can maintain access to as much data as possible.

    ‘Disrupted Trust’

    Prior to Trump’s inauguration, 307,851 data sets were available on Data.gov. One month later, the number had dipped to 304,621. In addition to data-rescue efforts, the winnowing prompted outcry from the research community.

    “As scientists who rely on these data to understand the causes and consequences of population change for individuals and communities, but also as taxpayers who have supported the collection, dissemination, and storage of these data, we are deeply concerned,” read a joint statement that the Population Association of America and the Association of Population Centers published in early February. “Removing data indiscriminately, even temporarily, from secure portals maintained by federal agencies undermines trust in the nation’s statistical and scientific research agencies and puts the integrity of these data at risk.”

    Federal judges have since ordered the government to restore many of the deleted data sets—as of Sunday, Data.gov said there are 311,609 data sets available—and the Trump administration has complied, albeit reluctantly. For instance, the CDC’s Social Vulnerability Index, which since 2007 has tracked communities that may need support before, during or after natural disasters, came back online in February. But it now has a warning label from the Trump administration, which claims that the information does “not reflect biological reality” and the government therefore “rejects it.”

    Richards, of OEDP, remains skeptical about the return of some of the data, speculating that the government may alter it to better fit its ideological narratives before restoring it. Thus, capturing the data before it gets taken down in the first place is “important for us to have that baseline proof that this is how things were on Jan. 18 and 19,” she said.

    Lynda Kellum, a longtime academic data librarian who is helping to run the Data Rescue Project—which has already finished archiving some 1,000 federal data sets with the help of hundreds of volunteers—said she’s also “a little bit pessimistic” about the future of data collection. That’s not only because the Trump administration has fired thousands of federal workers who carry out that data collection, canceled billions in research contracts and removed reams of public data; it’s also because the Department of Government Efficiency has accessed protected personal data contained within some of those data sets.

    “How do we actually talk to people about what’s protected and what those protections are for the data the government is collecting? DOGE has disrupted that trust,” she said. “For example, someone sent us a message asking us why they should participate in the American Community Survey when they weren’t sure what was going to happen with their (confidential, legally protected) data … There are still those protections in place, but there’s skepticism about whether those protections will hold because of what has happened in the past five months.”

    Some legal protections are already eroding. On Friday, the U.S. Supreme Court sided with the Trump administration in determining that DOGE should have—for now—access to information collected by the Social Security Administration, including Social Security numbers, medical and mental health records, and family court information. (The case is now headed to a federal appeals court in Virginia that will decide on its merits.)

    Henrik Schönemann, a digital history and humanities expert at Humboldt University of Berlin, who helps run the Safeguarding Research and Culture initiative, which has also archived high volumes of federal data since January, said efforts to rescue federal data collections are vital to the global research community. “Even if the United States falls out of it, we are still here and we still need this data,” he said. And if and when this political moment passes, “hopefully having this data can help [the United States] rebuild.”

    While Schönemann thinks it’s an “illusion” that independent federal data-preservation efforts can effectively counter the United States’ slide into autocracy, he believes it’s better than nothing.

    “It’s building communities and showing people they can do something about it,” he said. “And maybe this empowerment could lead them to feeling empowered in other areas and give people hope.”

  • Data lag and ambition aggregation mean APPs are fundamentally flawed

    In her letter to the sector last November, Secretary of State Bridget Phillipson said that she expects universities to play a stronger role in expanding access and improving outcomes for disadvantaged students.

    Her letter noted that the gap in outcomes from higher education between disadvantaged students and others is unacceptably large and is widening, with participation from disadvantaged students in decline for the first time in two decades.

    She’s referring to the Free School Meals (FSM) eligible HE progression rate – 29 percent in 2022–23, down for the first time in the series.

    Of course in 2023–24, or this year, the numbers for FSM and any number of other factors could be much worse – but on the current schedule, we won’t be seeing an update to OfS’ access and participation data dashboard until “summer or autumn 2025”, and even then only for 2023–24.

    If you’re prepared to brave the long loading times – which for me generate a similar level of frustration to that I used to experience watching Eurovision national finals 20 years ago – you can drill down into that dashboard by provider.

    It’s a mixed picture, with a lot of splits to choose from. But what the data doesn’t tell us is how providers are doing when compared to their signed off targets in their (mainly 2020–21 to 2024–25) access and participation plans.

    The last time OfS published any monitoring data was for the 2020–21 academic year – almost three years ago, in September 2022.

    That means we can’t see how well providers are doing against their targets, nor do we have any sense of any action that OfS may (or may not) have taken to tackle underperformance.

    So I decided to have a go. I restricted my analysis to the Russell Group, and extracted all of the targets from the 2020–21 to 2024–25 plans that were measurable via the dashboard.

    I then compared the 2022–23 performance with the relevant milestone, and with the original baseline. Where the target was unclear on what type of student was in scope, I assumed FT, first degree students.
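    That comparison reduces to a sign check, with the direction depending on the target type: for gap or ratio targets, an actual figure above the milestone means a provider is behind, while for “increase the proportion” targets the direction flips. A rough sketch of the logic (function and argument names are mine, not OfS’s):

```python
def behind(baseline, milestone, actual, reduce_gap=True):
    """Flag whether the 2022-23 actual is behind the milestone and the baseline.

    reduce_gap=True: the target is to shrink a gap or ratio, so a
    larger actual is worse. reduce_gap=False: the target is to raise
    a proportion, so a smaller actual is worse.
    """
    if reduce_gap:
        return actual > milestone, actual > baseline
    return actual < milestone, actual < baseline

# A non-continuation gap target (baseline 3.8, milestone 3.0, actual 6.4):
behind(3.8, 3.0, 6.4)                      # -> (True, True): behind both
# An access target to raise Black entrants (baseline 2.4, milestone 2.8, actual 2.1):
behind(2.4, 2.8, 2.1, reduce_gap=False)    # -> (True, True)
```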

    The results are pretty worrying.

    Stage Characteristic Target description Baseline 2022-23 Milestone 2022-23 Actual Behind milestone? Behind baseline?
    PROG Disabled Percentage difference in progression to employment and further study between disabled and non-disabled. 3.00 2.00 0.10 N N
    PROG Ethnicity Percentage difference in graduate employability between white and black students 7.9 4.70 -2.50 N N
    CONT Disabled Percentage difference in non-continuation rates non-disabled and students with mental health conditions 7.00 5.50 1.80 N N
    CONT Disabled Percentage difference in continuation rates between disabled students and non-disabled students. 6.4 3 1.3 N N
    CONT Low Participation Neighbourhood (LPN) Percentage difference in non-continuation rates between POLAR4 quintile 5 and quintile 1 students. 5 3.5 2.3 N N
    CONT Low Participation Neighbourhood (LPN) Percentage difference in non-continuation rates between POLAR4 quintile 5 and quintile 1 students. 4 2.5 3.40 Y N
    CONT Low Participation Neighbourhood (LPN) Close the gap in non-continuation between POLAR 4 Q1 and Q5 undergraduate students from 3.8% in 2016/2017 to 1.5% in 2024/25 3.8 3 6.4 Y Y
    CONT Low Participation Neighbourhood (LPN) POLAR4 Q1 non-continuation gap v Q5 (relates to KPM3) 4 3.25 6.1 Y Y
    CONT Low Participation Neighbourhood (LPN) Percentage difference in non-continuation rates between POLAR4 quintile 5 and quintile 1 students 2.40 1.00 6.90 Y Y
    CONT Low Participation Neighbourhood (LPN) Percentage difference in continuation rates between the most (POLAR Q5) and least (POLAR Q1) representative groups. 2.4 1.5 3.1 Y Y
    CONT Mature Percentage point difference in non-continuation rates between young (under 21) and mature (21 and over) students. 10 9 6.8 N N
    CONT Mature Percentage difference in continuation rates of mature first degree entrants when compared to young students. 10.2 7 -0.4 N N
    CONT Mature Significantly raise the percentage of our intake from mature students 5.90 7.00 4.10 N Y
    CONT Mature Percentage difference in non-continuation rates mature and non-mature students 9.00 6.00 7.40 Y N
    CONT Mature Percentage difference in non-continuation rates between mature (aged 21+) and young (aged 8.00 5.00 5.10 Y N
    CONT MATURE Close the gap in non-continuation between young and mature full-time, first degree students from 7.8% in 2016/2017 to 4.4% in 2024/2025. 7.8 6.8 10.2 Y Y
    CONT Mature Mature v Young non-continuation gap 9 8.5 10.1 Y Y
    CONT Mature Close the gap in continuation rates between young and mature students (by 1pp each year) by 2024/25. 5 3 6.1 Y Y
    CONT Mature Percentage difference in non-continuation rates between mature and young students 5.30 3.80 5.80 Y Y
    ATTAIN Disabled Percentage difference in degree attainment (1st and 2:1) between disabled students and other students 2.60 1.72 0.9 N N
    ATTAIN Disabled Disabled students attainment gap v non-disabled 3 1.5 1.2 N N
    ATTAIN Disabled To significantly reduce the difference in degree attainment (1st and 2:1) between disabled students and students with no known disability 4.4 2 0.30 N N
    ATTAIN Disabled Percentage point difference in good degree attainment (1st and 2:1) between disabled and not known to be disabled students. 6 5 -2.2 N N
    ATTAIN Disabled To remove the absolute gap in degree outcomes for students with a disability (OfS KPM5). 4.0 2.0 -0.60 N N
    ATTAIN Disabled Percentage difference in degree attainment (1st and 2:1) between disabled and non-disabled students 3.90 2.00 3.60 Y N
    ATTAIN Disabled Percentage difference in degree attainment (1st and 2:1) between students with registered mental health disabilities and non-disabled students 5.80 3.00 4.7 Y N
    ATTAIN Disabled Percentage difference in degree attainment (1st and 2:1) between disabled students and non-disabled students 4.2 2.3 3.6 Y N
    ATTAIN Ethnicity Black students attainment gap v White (relates to KPM4) 20 15.5 11.2 N N
    ATTAIN Ethnicity By 2025, reduce the attainment gap between Asian and white students 8.4 5.2 4.80 N N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between black and white students (5 year rolling average). 12 8.6 4.60 N N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and asian students. 19 17 14.4 N N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and black students. 14.00 11.00 9.90 N N
    ATTAIN Ethnicity To close the gap between Black and White student continuation rates (reducing the gap by 4 percentage points, from 8% to 4%, by 2024/2025). 8 5.6 5.5 N N
    ATTAIN Ethnicity To close the gap between BME and White student attainment (reducing the gap by 3 percentage points from 11% to 8% by 2024/25). 17 13.1 11.6 N N
    ATTAIN Ethnicity Close the unexplained gap between proportion of BAME and white full-time, first degree students attaining a 2:1 or above from 12.7% in 2017/2018 to 5.5% in 2024/2025. 12.7 10.3 10.8 Y N
    ATTAIN Ethnicity Significantly increase the percentage of our intake from Black students 2.30 3.80 2.90 Y N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and black students 15.70 9.815 11.6 Y N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and Asian students 12.5 8.375 11.4 Y N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between black and white students. 20 15 19.00 Y N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and BME students. 5.20 2.00 4.60 Y N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and black students. 13.8 6 12.9 Y N
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between BAME and White students. 7.00 4.00 7.50 Y Y
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and black students 4.50 3.00 31.00 Y Y
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and BAME students 9.50 6.00 11.60 Y Y
    ATTAIN Ethnicity By 2025, reduce the attainment gap between black and white students 8.7 5.9 10.70 Y Y
    ATTAIN Ethnicity To significantly reduce the difference in degree attainment (1st and 2:1) between white and black students 11.6 10 20.00 Y Y
    ATTAIN Ethnicity To significantly reduce the difference in degree attainment (1st and 2:1) between white and Asian students 10.6 10 14.50 Y Y
    ATTAIN Ethnicity Percentage point difference in good degree attainment (1st and 2:1) between white and black students. 18 14 22.1 Y Y
    ATTAIN Ethnicity Percentage difference in degree attainment (1st and 2:1) between white and black students. 17 15 22.9 Y Y
    ATTAIN Ethnicity Halve the gap in attainment that are visible between black and white students (OfS KPM4). 10.0 7.0 15.80 Y Y
    ATTAIN Ethnicity To close the gap between Black and White student attainment (by raising the attainment of Black students) reducing the gap by 8.5 percentage points from 17% to 8.5% by 2024/25 11 9.5 24 Y Y
    ATTAIN Low Participation Neighbourhood (LPN) Percentage difference in degree attainment (1st and 2:1) between POLAR4 quintile 5 and quintile 1 students 9.10 4.645 8.7 Y N
    ATTAIN MATURE Close the unexplained gap between proportion of mature and young full-time, first degree students attaining a 2:1 or above from 12.1% in 2017/2018 to 6.8% in 2024/2025. 12.1 8.8 12.6 Y Y
    ATTAIN Socio-economic Percentage difference in degree attainment (1st and 2:1) between students from most and least deprived areas (based on IMD) 10.20 6.00 12.30 Y Y
    ATTAIN Socio-economic To significantly reduce the difference in degree attainment (1st and 2:1) between the most and least advantaged as measured by IMD. 10.4 8.8 15.60 Y Y
    ATTAIN Socio-economic Reduce the gaps in attainment that are visible between IMD Q1 and Q5 (OfS KPM3). 10.0 7.0 13.70 Y Y
    ACCESS Disabled By 2025, increase the proportion of students with a declared disability enrolling from the baseline of 9% to 13% 9 11 15.70 N N
    ACCESS Ethnicity Significantly increase the percentage of our intake from Asian students 6.90 8.50 9.70 N N
    ACCESS Ethnicity Percentage of BAME entrants 10.10 12.50 12.70 N N
    ACCESS Ethnicity Increase percentage proportion of students identifying as black entering to at least match or exceed sector average (11%). 9.5 10.5 11.7 N N
    ACCESS Ethnicity To increase the proportion of Black, young, full-time undergraduate entrants by 1.2 percentage points, from 2.4% to 3.6% by 2024/25. 2.4 2.8 2.1 Y Y
    ACCESS Low Participation Neighbourhood (LPN) Ratio in entry rates for POLAR4 quintile 5: quintile 1 students 7.4:1 6:1 4.5 N N
    ACCESS Low Participation Neighbourhood (LPN) Reduce the ratio in entry rates for POLAR4 quintile 5: quintile 1 students 3.9:1 3.4:1 3.4:1 N N
    ACCESS Low Participation Neighbourhood (LPN) By 2025, reduce the gap in access between those from the highest and lowest POLAR4 quintiles enrolling from the baseline of 49% to 41% 49 45 41.00 N N
    ACCESS Low Participation Neighbourhood (LPN) Ratio of students from POLAR Q1 compared to POLAR Q5. 1:14 1:11 8.5 N N
    ACCESS Low Participation Neighbourhood (LPN) Close the gap in access between Q1 and Q5 students from a ratio of 5.5 in 2017/2018 to 3.5 by 2024/2025. 5.5 3.64 4.2 Y N
    ACCESS Low Participation Neighbourhood (LPN) Reduce ratio in entry rates for POLAR4 quintile 5: quintile 1 students 12:1 8:1 8.5 Y N
    ACCESS Low Participation Neighbourhood (LPN) To reduce the gap in participation and ratio in entry rates for POLAR 4 Quintile 5: Quintile 1 students Ratio Q5:Q1 of 5.2:1 500 students from POLAR 4 Q1 4.5 or 500 Y N
    ACCESS Low Participation Neighbourhood (LPN) LPN determined by POLAR 4 data. Looking specifically at increasing the intake for LPN Quintile 1 students, and thereby reduce the ratio of Q5 to Q1. (Target articulated as both a percentage and number). 8.0%, 391 10%, 490 8.6, 400 Y N
    ACCESS Low Participation Neighbourhood (LPN) Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. 7.4:1 5.5:1 6.9 Y N
    ACCESS Low Participation Neighbourhood (LPN) Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. All undergraduates. 6.2:1 5.1:1 6.3 Y Y
    ACCESS Low Participation Neighbourhood (LPN) Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. 4.2:1 3.5:1 4.3 Y Y
    ACCESS Low Participation Neighbourhood (LPN) Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. Reduce gap to 3.0 to 1.0 by 2024-25 (OfS KPM2). 5:2 to 1 4 5.2 Y Y
    ACCESS Low Participation Neighbourhood (LPN) To increase the proportion of young, full-time undergraduate entrants from POLAR4 Q1 by 2.5 percentage points, from 7.8% to 10.3%, by 2024/25. 7.8 8.9 10.3 Y Y
    ACCESS Low Participation Neighbourhood (LPN) To increase the proportion of young, full-time undergraduate entrants from POLAR4 Q2 by 2.5 percentage points, from 12.4% to 14.9%, by 2024/25. 12.4 13.9 15.4 Y Y
    ACCESS Mature Percentage of mature entrants 5.80 7.20 3.70 Y Y
    ACCESS Mature Percentage of mature students as part of the overall cohort. 9.2 11.0 6.70 Y Y
    ACCESS Multiple Increase the proportion of BME students from Q1 and Q2 backgrounds 5.2 8 7.6 N Y
    ACCESS Socio-economic Eliminate the IMD Q5:Q1 access gap by 2024/25. 5 2 -4.5 N N
    ACCESS Socio-economic By 2025, reduce the gap in access between those from the highest and lowest IMD quintiles from the baseline of 16.4% to 10.4% 16.4 13.5 7.00 N N
    ACCESS Socio-economic Percentage point difference in access rates between IMD quintile 1 and 2 and quintile 3, 4 and 5 students. 51.8 43.8 53.4 Y Y

    Milestones and baselines

    If we start with access, of the 25 targets that can be analysed, 14 are behind milestone – and 10 show a worse performance than the baseline.

    On continuation, 11 of the 17 are behind milestone, and 9 are behind the baseline. And on attainment, 25 of the 38 are behind milestone, and 14 behind baseline.

    Notwithstanding that some of the other targets might have been smashed, and that in all cases the performance may well have improved since then, that looks like pretty poor performance to me.

    It’s the sort of thing that we might have expected to result in fines, or at least specific conditions of registration being imposed.

    But as far as we know, nothing beyond enhanced monitoring has been applied – and even then, we don’t know who has been under enhanced monitoring.

    And the results are a problem. When OfS launched this batch of plans, it noted that young people from the most advantaged areas of England were over six times as likely to attend one of the most selective universities – including Oxford, Cambridge and other members of the Russell Group – as those from the most disadvantaged areas, and that that gap had hardly changed despite a significant expansion in the number of university places available.

    At the rates of progress forecast under those plans, the ratio was supposed to be less than 4:1 by 2025. It was still at 5.44 in the Russell Group in 2022–23.

    It was supposed to mean around 6,500 extra students from the most disadvantaged areas attending those universities each year from 2024–25 onwards. The Russell Group isn’t the whole of “high tariff” – but it had only increased its total of POLAR4 Q1 students by 1,350 by 2022–23.

    OfS also said that nationally, the gap between the proportion of white and black students who are awarded a 1st or 2:1 degree would drop from 22 to 11.2 percentage points by this year. As we’ve noted before on the site, the apparent narrowing during Covid was more of a statistical trick than anything else. It was up at 22.4 in 2022–23.

    And the gap in dropout rates between students from the most and least represented groups was supposed to fall from 4.6 to 2.9 percentage points – it was up at 5.3pp in 2022–23.

    The aggregation of ambition into press-releasable targets appears to have suffered from a similar fate to the equivalent exercise over financial sustainability.

    What a wonderful thing

    Of course, much has happened since January 2020. To the extent to which there were challenges over the student life cycle, they were likely exacerbated by the pandemic and a subsequent cost of living crisis.

    But when OfS is approving four-year plans, changes in the external risk environment ought to mean that it revises what it now calls an Equality of Opportunity Risk Register to reflect them – and either allows providers to revise targets down, or requires more action or investment to meet the targets agreed.

    Neither of those things seems to have happened.

    It’s also the case that OfS has radically changed how it regulates in this area. Back then, the director for fair access and participation was Chris Millward. It’s now John Blake. And the guidance, nature of the plans expected and monitoring regimes have all been revamped.

    But when we’re dealing with long-term plans, a changing of the guard does run the risk that the expectations and targets agreed under any old regime get sidelined and forgotten about – letting poor performers off the hook.

    It certainly feels like that’s the case. And while John Blake is widely respected, it’s hard to believe that he’ll still be the director for fair access and participation by the end of the latest round of plans – 2029.

    Hindsight is a wonderful thing, of course, but notwithstanding the external environment changes, few anticipated that any of the gaps, percentages or ratios would worsen for any of the targets set back in 2019.

    That matters because of that OfS aggregation issue. It’s not just that some providers can drag down the performance of the sector as a whole. It’s that no provider was set the target of not getting any worse on the myriad of measures that it didn’t pick for its plan.

    For all we know, while a certain number of providers might have set and agreed a target, say, on POLAR1 access or IMD attainment, performance could have worsened in all of those that didn’t – and that poses a major problem for the regulator and the design of the thing.

    It remains the case that we’re lacking clarity on the way in which the explosion of franchised, urban area business provision has impacted the stats of both the providers that have lit that blue touch paper, and the sector’s scores overall. For me, improvements in access via that method look like cheating – and declines in continuation, completion or progression ought to mean serious questions over funding policy within the Department for Education.

    We don’t really know – but need to know – the impact of other providers’ behaviour on an individual provider’s external environment. If, for example, high tariff universities scoop up more disadvantaged students (without necessarily actually narrowing the gap), that could end up widening the gap elsewhere too. There are only so many moles to whack when you’re looking at access.

    We still can’t see A&P performance by subject area – which has always been an issue when we think about access to the professions, but is an even bigger issue now that whole subject areas are being culled in the face of financial problems.

    And the size and shape question lingers too. UCAS figures at the close of clearing suggested that high tariff providers were set to balance the books by expanding in ways they claimed were impossible when the “mutant algorithm” hit in 2020.

    Much of continuation, completion and progression appears to be about the overall mix of students at a provider – something that’s made much more challenging in medium and lower tariff providers if high-tariff ones lower theirs.

    In the forthcoming skills white paper, we should expect exhortations from ministers that the sector improve its performance on access and participation. Government will have choices to make – on provider type, subject area, the types of disadvantage to focus on, and the balance between measures of things in the external environment within its own control and things within providers’ control (or at least influence) that OfS should expect.

    Whatever it chooses, on the evidence available, it will have real problems judging its own performance, its regulator’s, or that of groups of providers or even individual institutions. If you think the sector still has some distance to go on fairness, that just won’t do.

  • Teacher stress levels have surpassed pandemic-era highs

    Key points:

    America’s K-12 educators are more stressed than ever, with many considering leaving the profession altogether, according to new survey data from Prodigy Education.

    The Teacher Stress Survey, which polled more than 800 K-12 educators across the U.S., found that nearly half of teachers (45 percent) view the 2024-25 school year as the most stressful of their careers. The surveyed educators were also three times more likely to say that the 2024-25 school year has been the hardest compared to 2020, when they had to teach during the height of the COVID-19 pandemic.

    Student behavior challenges (58 percent), low compensation (44 percent), and administrative demands (28 percent) are driving teacher burnout and turnover at alarming rates. Public school teachers were more likely to report stress from unrealistic workloads, large class sizes, school safety concerns, and student behavior issues than their private school counterparts.

    “The fact that stress levels for so many teachers have exceeded those of the pandemic era should be a wake-up call,” said Dr. Josh Prieur, director of education enablement at Prodigy Education and former assistant principal in the U.S. public school system. “Teachers need tangible, meaningful, and sustained support … every week of the year.”

    Additional key findings include:

    • The vast majority of teachers (95 percent) are experiencing some level of stress, with more than two-thirds (68 percent) reporting moderate to very high stress. K-5 teachers were the most likely to feel extremely/very stressed (33 percent). Sixty-three percent of teachers report that their current stress levels are higher than when they first started teaching. 
    • Nearly one in 10 teachers surveyed (9 percent) are planning to leave the profession this year, while nearly one in four (23 percent) are actively thinking about it. One-third of teachers do not expect to be teaching three years from now, likely because nearly half (48 percent) of teachers don’t feel appreciated for the work that they do.
    • Teachers are finding ways to prioritize their well-being, but time limits and job pressures often get in the way. Seventy-eight percent of teachers say they actively make time for self-care, yet 43 percent feel guilty for spending time on self-care and 78 percent have skipped self-care due to work demands. School-provided self-care perks and mandatory self-care breaks would appeal to teachers: 85 percent and 76 percent, respectively, say they would take advantage of each benefit.
    • Top solutions that would reduce teachers’ stress include a higher salary (59 percent), a four-day school week (33 percent), stronger classroom discipline policies (32 percent), and smaller class sizes (25 percent). Public school teachers were more likely to prefer a shorter week, while private school educators opted for higher pay. 

    “Teacher Appreciation Week should serve as the starting point for building systems that show we value teachers’ time, talent, and well-being,” said Dr. Prieur. “Districts can do this by investing in tools that reduce the burden on teachers, prioritizing time for self-care and implementing policies that reinforce teachers’ value as an ongoing commitment to bettering the profession.”

    This press release originally appeared online.

    Laura Ascione

    Source link

  • Trump cuts could expose student data to cyber threats


    When hackers hit a school district, they can expose Social Security numbers, home addresses, and even disability and disciplinary records. Now, cybersecurity advocates warn that the Trump administration’s budget and personnel cuts, along with rule changes, are stripping away key defenses that schools need.

    “Cyberattacks on schools are escalating and just when we need federal support the most, it’s being pulled away,” said Keith Krueger, chief executive officer of the Consortium for School Networking, an association of technology officials in K-12 schools. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    The stakes are high. Schools are a top target in ransomware attacks, and cyber criminals have sometimes succeeded in shutting down whole school districts. The largest such incident occurred in December, when hackers stole personal student and teacher data from PowerSchool, a company that runs student information systems and stores report cards. The theft included data from more than 60 million students and almost 10 million teachers. PowerSchool paid an undisclosed ransom, but the criminals didn’t stop. Now, in a second round of extortion, the same cyber criminals are demanding ransoms from school districts.  

    The federal government has been stepping up efforts to help schools, particularly since a 2022 cyberattack on the Los Angeles Unified School District, the nation’s second-largest. Now this urgently needed assistance is under threat. 

    Warning service

    Of chief concern is a cybersecurity service known as MS-ISAC, which stands for Multi-State Information Sharing and Analysis Center. It warns more than 5,700 schools around the country that have signed up for the service about malware and other threats and recommends security patches. This technical service is free to schools, but is funded by an annual congressional appropriation of $27 million through the Cybersecurity and Infrastructure Security Agency (CISA), an agency within the Department of Homeland Security.

On March 6, the Trump administration announced a $10 million funding cut as part of broader budget and staffing cuts throughout CISA. That was ultimately negotiated down to $8.3 million, but the service still lost more than half of its remaining $15.7 million budget for the year. The non-profit organization that runs it, the Center for Internet Security, is digging into its reserves to keep it operating. But those funds are expected to run out in the coming weeks, and it is unclear how the service will continue operating without charging user fees to schools. 

    “Many districts don’t have the budget or resources to do this themselves, so not having access to the no-cost services we offer is a big issue,” said Kelly Lynch Wyland, a spokeswoman for the Center for Internet Security.  

    Sharing threat information

    Another concern is the effective disbanding of the Government Coordinating Council, which helps schools address ransomware attacks and other threats through policy advice, including how to respond to ransom requests, whom to inform when an attack happens and good practices for preventing attacks. This coordinating council was formed only a year ago by the Department of Education and CISA. It brings together 13 nonprofit school organizations representing superintendents, state education leaders, technology officers and others. The council met frequently after the PowerSchool data breach to share information. 

    Now, amid the second round of extortions, school leaders have not been able to meet because of a change in rules governing open meetings. The group was originally exempt from meeting publicly because it was discussing critical infrastructure threats. But the Department of Homeland Security, under the Trump administration, reinstated open meeting rules for certain advisory committees, including this one. That makes it difficult to speak frankly about efforts to thwart criminal activity.

    Non-governmental organizations are working to resurrect the council, but it would be in a diminished form without government participation.

    “The FBI really comes in when there’s been an incident to find out who did it, and they have advice on whether you should pay or not pay your ransom,” said Krueger of the school network consortium. 

    A federal role

A third concern is the elimination in March of the Education Department’s Office of Educational Technology. This seven-person office dealt with education technology policies, including cybersecurity. It issued cybersecurity guidance to schools and held webinars and meetings to explain how schools could shore up their defenses. It also ran a biweekly meeting to talk about K-12 cybersecurity across the Education Department, including offices that serve students with disabilities and English learners. 

    Eliminating this office has hampered efforts to decide which security controls, such as encryption or multi-factor authentication, should be in educational software and student information systems. 

    Many educators worry that without this federal coordination, student privacy is at risk. “My biggest concern is all the data that’s up in the cloud,” said Steve Smith, the founder of the Student Data Privacy Consortium and the former chief information officer for Cambridge Public Schools in Massachusetts. “Probably 80 to 90 percent of student data isn’t on school-district controlled services. It’s being shared with ed tech providers and hosted on their information systems.”

    Security controls

    “How do we ensure that those third-party providers are providing adequate security against breaches and cyber attacks?” said Smith. “The office of ed tech was trying to bring people together to move toward an agreed upon national standard. They weren’t going to mandate a data standard, but there were efforts to bring people together and start having conversations about the expected minimum controls.”

    That federal effort ended, Smith said, with the new administration. But his consortium is still working on it. 

    In an era when policymakers are seeking to decrease the federal government’s involvement in education, arguing for a centralized, federal role may not be popular. But there’s long been a federal role for student data privacy, including making sure that school employees don’t mishandle and accidentally expose students’ personal information. The Family Educational Rights and Privacy Act, commonly known as FERPA, protects student data. The Education Department continues to provide technical assistance to schools to comply with this law. Advocates for school cybersecurity say that the same assistance is needed to help schools prevent and defend against cyber crimes.

    “We don’t expect every town to stand up their own army to protect themselves against China or Russia,” said Michael Klein, senior director for preparedness and response at the Institute for Security and Technology, a nonpartisan think tank. Klein was a senior advisor for cybersecurity in the Education Department during the previous administration. “In the same way, I don’t think we should expect every school district to stand up their own cyber-defense army to protect themselves against ransomware attacks from major criminal groups.” 

And it’s not financially practical. According to the school network consortium, only a third of school districts have a full-time employee, or the equivalent, dedicated to cybersecurity. 

    Budget storms ahead

    Some federal programs to help schools with cybersecurity are still running. The Federal Communications Commission launched a $200 million pilot program to support cybersecurity efforts by schools and libraries. FEMA funds cybersecurity for state and local governments, which includes public schools. Through these funds, schools can obtain phishing training and malware detection. But with budget battles ahead, many educators fear these programs could also be cut. 

Perhaps the biggest risk is the end of the entire E-Rate program, which helps schools pay for internet access. The Supreme Court is slated to decide this term whether the program’s funding structure is an unconstitutional tax.

    “If that money goes away, they’re going to have to pull money from somewhere,” said Smith of the Student Data Privacy Consortium. “They’re going to try to preserve teaching and learning, as they should.  Cybersecurity budgets are things that are probably more likely to get cut.

    “It’s taken a long time to get to the point where we see privacy and cybersecurity as critical pieces,” Smith said. “I would hate for us to go back a few years and not be giving them the attention they should.”

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about student cybersecurity was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

    Source link

  • From Intuition to Intelligence: Leveraging Data to Guide Academic Portfolio Strategy


    In today’s competitive higher education landscape, institutions can no longer afford to rely on instinct alone when it comes to academic program planning. The stakes are too high and the margin for error too slim. 

    Leaders are facing increasing pressure to align their portfolios with market demand, institutional mission, and student expectations — all while navigating constrained resources and shifting demographics. 

    The good news? You don’t have to guess. Market intelligence offers a smarter, more strategic foundation for building and refining your academic program mix. 

    Why program optimization matters now more than ever 

    Most institutions have at least one program that’s no longer pulling its weight — whether due to declining enrollment, outdated relevance, or oversaturated competition. At the same time, there are often untapped opportunities for growth in emerging or underserved fields. 

    But how do you decide which programs to scale, sustain, or sunset? 

    Optimizing your portfolio requires more than internal performance metrics. It calls for an external lens — one that brings into view national and regional trends, labor market signals, and consumer behavior. When done effectively, academic portfolio strategy becomes less about trial and error, and more about clarity and confidence. 

    The first step: Start with the market 

    The strongest portfolio strategies begin with robust external data. At Collegis Education, we draw from sources like the National Center for Education Statistics (IPEDS), Lightcast labor market analytics, and Google search trends to assess program performance, student demand, and employment outlooks. 

    National trends give us the big picture and a foundation to start from. But for our partners, we prioritize regional analysis — because institutions ultimately compete and serve in specific geographic contexts, even with fully online programs. Understanding what’s growing in your state or region is often more actionable than knowing what’s growing nationwide. 

    Our proprietary methodology filters for: 

    • Five-year conferral growth with positive year-over-year trends 
    • Programs offered by a sufficient number of institutions (to avoid anomalies) 
    • Competitive dynamics and saturation thresholds 
    • Job postings and projected employment growth 

    This data-driven process helps institutions avoid chasing short-term trends and instead focus on sustainable growth areas. 
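    Collegis doesn’t publish its methodology in code form, but the screening criteria listed above can be sketched as a simple filter. In the Python sketch below, the field names, thresholds, and sample figures are all illustrative assumptions, not Collegis’ actual data or cutoffs:

    ```python
    # Hypothetical sketch of a program-screening filter of the kind described
    # above. Field names and thresholds are illustrative assumptions only.

    def screen_programs(programs, min_growth=0.0, min_institutions=10):
        """Keep programs with positive five-year conferral growth, a positive
        year-over-year trend, enough institutions offering them (to avoid
        anomalies), and positive projected employment growth."""
        selected = []
        for p in programs:
            growth = (p["conferrals_latest"] - p["conferrals_5yr_ago"]) / p["conferrals_5yr_ago"]
            if (growth > min_growth
                    and p["yoy_trend_positive"]
                    and p["num_institutions"] >= min_institutions
                    and p["projected_job_growth"] > 0):
                selected.append(p["name"])
        return selected

    # Illustrative records, not real conferral data
    programs = [
        {"name": "Computer Science", "conferrals_latest": 1200, "conferrals_5yr_ago": 900,
         "yoy_trend_positive": True, "num_institutions": 40, "projected_job_growth": 0.15},
        {"name": "Example Program B", "conferrals_latest": 300, "conferrals_5yr_ago": 420,
         "yoy_trend_positive": False, "num_institutions": 25, "projected_job_growth": 0.02},
    ]
    print(screen_programs(programs))  # → ['Computer Science']
    ```

    The point of the sketch is the shape of the process: every criterion is an explicit, checkable condition on external data, which is what makes the approach repeatable rather than intuitive.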


    Data in action: Insights from today’s growth programs 

    Collegis’ latest program growth analyses — drawing from 2023 conferral data — surface a diverse mix of high-opportunity programs. While we won’t detail every entry here, a few trends stand out: 

    • Technology and healthcare programs remain strong at the undergraduate level, with degrees like Computer Science and Health Sciences showing continued growth. 
    • Graduate credentials in education and nursing reflect both workforce need and strong student interest. 
    • Laddering potential is especially evident in fields like psychology and health sciences, where institutions can design seamless transitions from associate to bachelor’s. In fields such as education, options to ladder from certificate to master’s programs are growing in demand. 

What’s most important isn’t the specific programs but what they reveal: external data can confirm intuition, challenge assumptions, and unlock new strategic direction. And when paired with regional insights, these findings become even more powerful. 

    How to turn insight into strategy 

    Having market data is just the beginning. The true value lies in how institutions use it. At Collegis, we help our partners translate insight into action through a structured portfolio development process that includes the following: 

    1. Market analysis: Analyzing external data to identify growth areas, saturation risks, and demand signals — regionally and nationally. 
    2. Gap analysis: Identifying misalignments between current offerings and market opportunity. 
    3. Institutional alignment: Layering in internal metrics — enrollment, outcomes, mission fit, modality, and margin. 
    4. Strategic decisions: Prioritizing programs to expand, launch, refine, or sunset. 
    5. Implementation support: Developing go-to-market plans, supporting change management, and measuring results. 

    By grounding these decisions in both internal and external intelligence, institutions can future-proof their portfolios — driving enrollment, meeting workforce needs, and staying mission-aligned. 

    Put data to work for your portfolio 

    Program portfolio strategy doesn’t have to be a guessing game. With the right data and a trusted partner, institutions can make bold, confident moves that fuel growth and student success. 

    Whether you’re validating your instincts or exploring new academic directions, Collegis can help. Our market research and portfolio development services are built to support institutions at every step of the process — with national insights and regional specificity to guide your next move. 


    Source link

  • Microsoft and FFA help students use smart sensors and AI to learn about the future of farming and technology


    Microsoft Corp. and the National FFA Organization on Tuesday announced the national expansion of FarmBeats for Students, a cutting-edge educational program integrating smart sensors, data science and artificial intelligence (AI) to teach precision agriculture in classrooms. Starting today, FFA teachers and students throughout the United States, including FFA chapters in 185 middle and high schools, will receive a classroom set of FarmBeats for Students kits free of charge. The kits include ready-to-use sensor systems along with curriculum for teachers and are designed for classrooms of all kinds; no prior technical experience is required.

    More and more farmers are adopting advanced technology, including automating systems such as tractors and harvesters and using drones and data analysis to intervene early against pests and disease, to maximize crop yield, optimize resource usage, and adjust to changing weather patterns. Gaining hands-on experience with machine automation, data science and AI will help American agricultural students remain competitive in the global market.

    Using the FarmBeats for Students kits and free curriculum, students build environmental sensor systems and use AI to monitor soil moisture and detect nutrient deficiencies — allowing them to understand what is happening with their plants and make data-driven decisions in real time. Students can adapt the kit to challenges unique to their region — such as drought, frost and pests — providing them with practical experience in tackling real-world issues in their hometowns.
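    The article doesn’t detail the kit’s internals, but the kind of sensor-driven decision it describes — reading soil moisture and deciding whether to act — might look something like the Python sketch below. The readings, threshold, and function names are hypothetical, not FarmBeats specifics:

    ```python
    # Illustrative sketch only: a simple soil-moisture check of the kind
    # students might build with an environmental sensor kit. The threshold
    # and sample readings below are hypothetical, not FarmBeats values.

    DRY_THRESHOLD = 0.25  # volumetric water content below which soil counts as "dry"

    def needs_watering(readings, threshold=DRY_THRESHOLD):
        """Average recent soil-moisture readings and flag if below threshold."""
        avg = sum(readings) / len(readings)
        return avg < threshold

    # e.g. readings sampled from a sensor over an afternoon
    samples = [0.30, 0.26, 0.21, 0.19]
    print(needs_watering(samples))  # → True (average 0.24 is below 0.25)
    ```

    Averaging several samples before deciding, rather than reacting to a single reading, is the basic idea behind turning raw sensor data into a data-driven decision of the kind the program teaches.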

    “Microsoft is committed to ensuring students and teachers have the tools they need to succeed in today’s tech-driven world, and that includes giving students hands-on experience with precision farming, data science and AI,” said Mary Snapp, Microsoft vice president, Strategic Initiatives. “By teaming up with FFA to bring FarmBeats for Students to students across the country, we hope to inspire the next generation of agriculture leaders and equip them with the skills to tackle any and all challenges as they guide us into the future.”

    “Our partnership with Microsoft exemplifies the power of collaboration in addressing industry needs while fostering personal and professional growth among students,” said Christine White, chief program officer, National FFA Organization. “Supporting agricultural education and leadership development is crucial for shaping the next generation of innovators and problem solvers. Programs like this equip students with technical knowledge, confidence and adaptability to thrive in diverse and evolving industries. Investing in these young minds today sets the stage for a more sustainable, innovative and resilient agricultural future.”

In addition, teachers, students or parents interested in FarmBeats for Students can purchase a kit for $35 and receive free training at Microsoft Learn.

Any educator interested in implementing the FarmBeats for Students program can now access a new, free comprehensive course on the Microsoft Educator Learn Center. The course provides training on precision agriculture, data science and AI, and allows teachers to earn professional development hours and badges. 

    FarmBeats for Students was co-developed by Microsoft, FFA and agriculture educators. The program aligns with the AI for K-12 initiative guidelines; Agriculture, Food and Natural Resources career standards; Computer Science Teachers Association standards; and Common Core math standards.

    For more information about FarmBeats for Students, visit aka.ms/FBFS.

    Kevin Hogan

    Source link