Tag: Data

  • Using QILT data to up student satisfaction – Campus Review

    Students need to know universities are actively using Quality Indicators for Learning and Teaching (QILT) survey data to change processes, the director of the survey said on Thursday.

  • Helping professional services get confident with data

    “I don’t do data.”

    It’s a phrase heard all too often across professional services in UK higher education.

    Despite the sector’s growing reliance on data to inform strategic decisions, evaluate performance, and improve services, a significant skills gap remains—particularly among non-specialist staff.

    Critical skills

    Universities increasingly regard data as a critical asset. But while institutional expectations are rising, many professional services teams feel underprepared to meet what is now expected of them. The ability to interpret, contextualise, and communicate insights from data is now an essential part of most roles. And yet, for many professionals, data remains confusing, intimidating, or simply outside their perceived remit.

    This gap isn’t just about technical skills—it’s about confidence, culture, and collaboration. Professional services staff are often expected to make sense of complex datasets without the training or tools to do so effectively. Everyone is expected to engage with data daily, but few are properly equipped to do so. The result? Missed opportunities, reliance on specialist teams, and a growing divide between “data people” and everyone else.

    That divide threatens more than just productivity. In an era of AI and self-service analytics, the risk is that subject matter expertise gets lost or overridden by automated insights or misunderstood metrics. True value comes not just from accessing data, but from interpreting it through a lens of organisational understanding and professional experience. So how can we bridge the gap between those who do and those who don’t do data?

    The options

    Often the answer seems to be recruiting external data specialists – usually at considerable expense. While this brings in the needed expertise, it also creates silos rather than building capability across teams. This approach not only strains budgets—with specialist salaries commanding premium rates in today’s competitive market—but also creates dependency on individuals who may lack contextual understanding of higher education. There is also a problem of longevity. When these specialists eventually leave, they take their knowledge with them, leaving institutions vulnerable.

    By contrast, institutions that invest in developing data confidence across existing staff leverage their team’s deep sector knowledge while creating more sustainable, resilient capabilities. The return on investment becomes clear: upskilling current staff who understand institutional nuances creates more value than repeatedly recruiting external experts who require months to grasp the complexities of university operations.

    Meanwhile, higher education faces an ever-expanding regulatory and statutory data burden. From HESA returns and TEF submissions to access and participation plans and REF preparations, the volume and complexity of mandatory reporting continues to grow. Each new requirement brings not just additional work but increased scrutiny and consequences for inaccuracy or misinterpretation. This regulatory landscape demands that universities distribute data capabilities widely rather than concentrating them in specialist teams who come close to breaking point during reporting seasons.

    When professional services staff across the institution can confidently engage with data, universities can respond more nimbly to regulatory changes, identify compliance risks earlier, and transform what might otherwise be box-ticking exercises into meaningful insights that drive institutional improvement.

    Data confident

    Recognising this challenge, UHR and Strive Higher have developed the Developing Confident Data Partners programme—a practical, supportive course designed specifically for HR and People professionals in higher education. Drawing on insights from UHR’s 6,000+ members, the programme addresses the real barriers to data confidence and equips participants with the skills and language to contribute meaningfully to data-informed conversations.

    By bridging the gap between subject matter expertise and data literacy, this initiative empowers professionals to engage more fully with the data-driven culture of their institutions. As one participant put it:

    “The programme boosted my confidence and has taken away some of the mystery that some pure data experts can often create. I know what to do now before I ask for data, and what to say when I do want some.”

    In a sector where informed decision-making is critical, the data skills gap in professional services can no longer be ignored. The Confident Data Partners programme is one step toward a more inclusive, capable, and collaborative data culture across UK higher education.

    The journey is just beginning. The opportunities in a data-driven world are endless, but success hinges on individuals understanding how to use data to inform strategy, planning and continuous improvement, and being able to communicate and collaborate with their peers.

    This initiative has been a learning experience for us both. It’s shown how, when data aligns with real-world needs, the results are transformative. Because when data meets purpose – that’s where the magic happens.

  • Columbia University Settles Class Action Lawsuit Over Inflated Rankings Data for $9 Million

    Columbia University has reached a $9 million settlement agreement with undergraduate students who alleged the institution deliberately submitted false information to U.S. News & World Report to artificially boost its college rankings position.

    The preliminary settlement, filed last Monday in Manhattan federal court and pending judicial approval, resolves claims that Columbia misrepresented key data points to enhance its standing in the influential annual rankings. The university reached as high as No. 2 in the undergraduate rankings in 2022 before the alleged misconduct came to light.

    Students alleged that Columbia consistently provided inaccurate data to U.S. News, including the false claim that 83% of its classes contained fewer than 20 students. The lawsuit argued these misrepresentations were designed to improve the university’s ranking position and, consequently, attract more students willing to pay premium tuition rates.

    The settlement covers approximately 22,000 undergraduate students who attended Columbia College, the Fu Foundation School of Engineering and Applied Science, and the School of General Studies between fall 2016 and spring 2022.

    The controversy began in July 2022 when Columbia mathematics professor Dr. Michael Thaddeus published a detailed analysis questioning the accuracy of data underlying the university’s No. 2 ranking. His report alleged that much of the information Columbia provided to U.S. News was either inaccurate or misleading.

    Following the publication of Thaddeus’s findings, Columbia’s ranking plummeted to No. 18 in September 2022. The dramatic drop highlighted the significant impact that data accuracy has on institutional rankings and reputation.

    In response to the allegations, Columbia announced in June 2023 that its undergraduate programs would withdraw from participating in U.S. News rankings altogether. The university cited concerns about the “outsized influence” these rankings have on prospective students’ decision-making processes.

    “Much is lost when we attempt to distill the quality and nuance of an education from a series of data points,” Columbia stated in explaining its decision to withdraw from the rankings process.

    While denying wrongdoing in the settlement agreement, Columbia acknowledged past deficiencies in its reporting practices. The university stated it “deeply regrets deficiencies in prior reporting” and has implemented new measures to ensure data accuracy.

    Columbia now provides prospective students with information that has been reviewed by an independent advisory firm, demonstrating the institution’s commitment to transparency and accurate representation of its educational offerings.

    Columbia’s decision to withdraw from U.S. News rankings reflects a growing skepticism among elite institutions about the value and impact of college ranking systems. Harvard and Yale have also stopped submitting data to U.S. News for various programs, signaling a potential shift in how prestigious universities approach rankings participation.

    Under the terms of the agreement, student attorneys plan to seek up to one-third of the settlement amount for legal fees, which would leave approximately $6 million available for distribution among affected students. The settlement requires approval from a federal judge before taking effect.

    Student lawyers characterized the accord as “fair, reasonable and adequate” given the circumstances of the case and the challenges inherent in proving damages from ranking manipulation.

  • Do states have ‘statutory right’ to Education Department data and guidance?

    States suing the U.S. Department of Education over its mass layoffs claim the reduction in force is impacting the agency’s legally required functions, including research and grant distribution. But in documents submitted to the U.S. Supreme Court on Monday, the Education Department said states “have no statutory right to any particular level of government data or guidance.”

    The department is pushing the high court to let its massive RIF go through after being paused by both a federal district judge and the 1st U.S. Circuit Court of Appeals. In court documents, the agency said “it can carry out its statutorily mandated functions with a pared-down staff and that many discretionary functions are better left to the States.”

    Its request to carry through on the RIFs comes even as the agency notified “all impacted employees” on administrative leave in a June 6 email obtained by K-12 Dive that it is “actively assessing how to reintegrate you back to the office in the most seamless way possible” to comply with the court orders. 

    On June 16 — the same day as the agency’s latest Supreme Court filing — it also emailed RIFed staff for information to help the department in “understanding potential reentry timelines and identifying any accommodations that may be needed.”

    However, several of the more than 1,300 department employees put on administrative leave in March told K-12 Dive last week that they do not think the agency intends to actually bring them back. This is despite many of the employees having worked on legally required tasks the department has lagged on or trimmed down since the layoffs, as well as the department’s efforts to seemingly comply with the court orders. 

    “While they’re saying we’re coming back, they’re also still appealing the [RIF] process,” said one Education Department employee who is on administrative leave because of the RIF. “It feels like they’re slow-walking it.” 

    Employees ‘in limbo’ as department lags on statutory tasks

    The department is still paying all these employees’ salaries — amounting to millions of dollars — only for them to sit tight. 

    According to an email from American Federation of Government Employees Local 252, the union representing a majority of the laid-off employees, the Education Department is paying workers on leave at least $7 million in taxpayer dollars per month.

    That amount is, in fact, only for 833 of the 962 laid-off Education Department workers that the union represents and whom it was able to reach for its analysis. Thus, much more than $7 million is actually being spent per month to keep the more than 1,300 laid-off employees on payroll. 

    Since March, the department has spent approximately $21.5 million on just those 833 employees, according to data provided by AFGE Local 252.
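
    The union’s numbers hang together arithmetically. As a rough check, here is a minimal sketch using only the figures reported above; the scaled monthly total for all 1,300-plus employees is an extrapolation for illustration, not a reported figure.

    ```python
    # Back-of-the-envelope check of the AFGE Local 252 payroll figures
    # reported above. Inputs are the article's numbers; the scaled total
    # is an illustrative extrapolation, not a reported figure.

    monthly_cost = 7_000_000   # dollars per month for the 833 workers reached
    workers_reached = 833      # laid-off union members included in the analysis
    total_on_leave = 1_300     # approximate count of all employees on leave
    months_elapsed = 3         # roughly March through early June

    # Implied average salary for the 833 workers: about $8,400/month.
    avg_monthly_salary = monthly_cost / workers_reached

    # Cumulative spend on those 833 workers: about $21 million, in line
    # with the $21.5 million figure above.
    cumulative_spend = monthly_cost * months_elapsed

    # Applying the same average to all 1,300+ employees suggests the full
    # monthly payroll for staff on leave is closer to $11 million.
    estimated_total_monthly = avg_monthly_salary * total_on_leave

    print(f"Implied average salary: ${avg_monthly_salary:,.0f}/month")
    print(f"Spend on 833 workers over {months_elapsed} months: ${cumulative_spend:,.0f}")
    print(f"Scaled to all ~{total_on_leave:,} on leave: ${estimated_total_monthly:,.0f}/month")
    ```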

    While the Education Department emailed laid-off employees multiple times in the past month to gather information for “reintegration and space planning efforts” on government IDs, retirement plans and devices, among other things, several employees called this a superficial effort to comply with court orders. 

    In the meantime, employees are free to apply to other jobs, start their own organizations, and go on vacation if they so choose, according to employees K-12 Dive spoke with as well as an AFGE Local 252 spokesperson. 

    “We feel like we’re in limbo,” said an employee who has been on administrative leave since March. “They haven’t talked to us.” 

    This employee and the others who spoke to K-12 Dive asked to remain anonymous for fear that identification could negatively affect their employment status or severance terms.

    Condition of Education report falls behind

    This employee would have been working at the National Center for Education Statistics on data related to the Condition of Education Report, which is required by law — and for which the department missed its June 1 deadline “for the first time ever,” according to Sen. Patty Murray, D-Wash. 

    After leaving just a handful of employees in NCES, the department has so far released only a webpage titled “Learn About the New Condition of Education 2025: Part I,” which includes significantly less information than in previous years.

    “Now all we have is a bare-bones ‘highlight’ document with no explanation to Congress or to the public,” Murray said in a June 5 Senate Health, Education, Labor and Pensions Committee hearing. “And that is really unacceptable — students, families, teachers all deserve to see a full report.” 

    In 2024, the report was a 44-page document including new analysis, comparisons with past years, and graphs to visualize the data. It included over 20 indicators grouped by topics from pre-kindergarten through secondary and postsecondary education, labor force outcomes and international comparisons. Individual indicators ranged from school safety issues like active shooter incidents to recovery from the coronavirus pandemic. 
      
    This year, the department said it would be “updating indicators on a rolling basis” due to its “emphasis on timeliness” and would determine “which indicators matter the most.” More than two weeks after its missed June 1 deadline, however, the report still only includes a highlights page with five indicators linking to data tables, many of which had already been released. 

    Meanwhile, Democratic lawmakers have also expressed concerns that the department lagged on its statutory responsibilities to disburse key federal funds, including for Title I-A — which they said took three times as long to distribute as it did under the last administration. The delay in funding distribution gave states and districts less time to plan for helping underserved students, including those experiencing homelessness, lawmakers said.

    The U.S. Department of Education did not respond to K-12 Dive’s requests for comment on its missed June 1 deadline for the report or on how it will increase government efficiency and cut costs while spending millions on salaries for employees who are not working. 

    Photo: Senate Appropriations Committee ranking member Sen. Patty Murray, D-Wash., questions McMahon during a hearing about the proposed 15% cut to the Education Department’s budget on Capitol Hill June 3, 2025, in Washington, D.C. The budget was consistent with President Trump’s executive order to wind down the Education Department. (Chip Somodevilla/Getty Images)

    Department says RIF impacts are “speculative”

    However, in its Supreme Court filing on Monday, the department dismissed as “speculative allegations” states’ complaints of disruptions to services as a result of the RIFs.

    The states, in their filing last week seeking to block the RIFs, said that “collection of accurate and reliable data is necessary for numerous statutory functions within the Department that greatly affect the States.” 

    The department relies on this data “to allocate billions of dollars in educational funds among the States under Title I of the Elementary and Secondary Education Act,” the states said in their June 13 response to the administration’s plea to the Supreme Court to let its RIFs take effect. The department has given “no explanation of how such allocation can occur without the collection and analysis of underlying data, or of how the data can be collected or analyzed without staff,” their filing said.

    In its Monday response, the department maintained that it is not required by law to maintain “a particular quality of audit.” The states arguing to maintain the department’s previous staffing levels are trying to “micromanage government staffing based on speculation that the putative quality of statutorily mandated services will decline,” the agency said.

    However, when pressed by Sen. Murray in a budget hearing earlier this month, Education Secretary Linda McMahon said “no” analysis was conducted about how the firings would impact the agency’s functions or how it would continue its statutorily required responsibilities without much of its staff. The department did read “training manuals and things of that nature” prior to the layoffs, she said, and had conversations with “the department.” 

    But several laid-off staffers told K-12 Dive that they were never spoken to about how their responsibilities would continue to be fulfilled after their departure. 

    “They don’t understand what they’ve cut,” an employee said.

  • Data Shows Uptake of Statewide Digital Mental Health Support

    In 2023, New Jersey’s Office of the Secretary of Higher Education signed a first-of-its-kind agreement with a digital mental health provider, Uwill, to provide free access to virtual mental health services to college students across the state.

    Over the past two years, 18,000-plus students across 45 participating colleges and universities have registered with the service, representing about 6 percent of the eligible postsecondary population. The state considers the partnership a success and hopes to codify the offering to ensure its sustainability beyond the current governor’s term.

    The details: New Jersey’s partnership with Uwill was spurred by a 2021 survey of 15,500 undergraduate and graduate students from 60 institutions in the state, which found that 70 percent of respondents rated their stress and anxiety as higher in fall 2021 than in fall 2020. Forty percent indicated they were concerned about their mental health in light of the pandemic.

    Under the agreement, students can use Uwill’s teletherapy, crisis connection and wellness programming at any time. Like others in the teletherapy space, Uwill offers an array of diverse licensed mental health providers, giving students access to therapists who share their backgrounds or language, or who reside in their state. Over half (55 percent) of the counselors Uwill hires in New Jersey are Black, Indigenous or people of color; among them, they speak 11 languages.

    What makes Uwill distinct from its competitors is that therapy services are on-demand, meaning students are matched with a counselor within minutes of logging on to the platform. Students can request to see the same counselor in the future, but the nearly immediate access ensures they are not caught in long wait or intake times, especially compared to in-person counseling services.

    Under New Jersey’s agreement, colleges and students do not pay for Uwill services, but colleges must receive state aid to be eligible.

    The research: The need for additional counseling capacity on college campuses has grown over the past decade, as an increasing number of students enter higher education with pre-existing mental health conditions. The most recent survey of counseling center staff by the Association for University and College Counseling Center Directors (AUCCCD) found that while demand for services is on the decline compared to recent years, a larger number of students have more serious conditions.

    Over half of four-year institutions and about one-third of community colleges nationwide provide teletherapy to students via third-party vendors, according to AUCCCD data. Across institutions of all sizes, the average number of students who engaged with those services in 2024 was 453.

    Online therapy providers tout the benefits of having a service that supplements on-campus, in-person therapists’ services to provide more comprehensive care, including racially and ethnically diverse staff, after-hours support and on-demand resources for students.

    Eric Wood, director of counseling and mental health at Texas Christian University, told Inside Higher Ed that an ideal teletherapy vendor is one that increases capacity for on-campus services, expanding availability for on-campus staff and ensuring that students do not fall through the cracks.

    A 2024 analysis of digital mental health tools from the Hope Center at Temple University—which did not include Uwill—found they can improve student mental health, but there is little direct evidence regarding marginalized student populations’ use of or benefits from them. Instead, the greatest benefit appears to be for students who would not otherwise engage in traditional counseling or who simply seek preventative resources.

    One study featured in the Hope Center’s report noted the average student only used their campus’s wellness app or teletherapy service once; the report calls for more transparency around usage data prior to institutional investment.

    The data: Uwill reported that from April 2023 to May 2025, 18,207 New Jersey students engaged with its services at the 45 participating institutions, which include Princeton, Rutgers, Montclair State and Seton Hall Universities, as well as the New Jersey Institute of Technology and Stevens Institute of Technology. Engaged students were defined as any students who logged in to the app and created an account.

    New Jersey’s total college enrollment in 2022 was 378,819, according to state data. An Inside Higher Ed analysis of publicly available data found total enrollment (including undergraduate and graduate students) among the 45 participating colleges to be 327,353. Uwill participants in New Jersey, therefore, totaled around 4 percent of the state’s postsecondary students or 6 percent of eligible students.

    The state paid $4 million for the first year of the Uwill contract, as reported by Higher Ed Dive, pulling dollars from a $10 million federal grant to support pandemic relief and a $16 million budget allocation for higher education partnerships. That totals about $89,000 per institution for the first year alone, or $12 per eligible student, according to an Inside Higher Ed estimate.
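
    For readers who want to trace those estimates, here is a minimal sketch reproducing the arithmetic from the figures cited above; the computed values round to the percentages and dollar amounts reported in the text.

    ```python
    # Reproducing the rough estimates above from the cited figures.
    # These are illustrative approximations, not official calculations.

    engaged_students = 18_207      # Uwill registrations, April 2023 to May 2025
    state_enrollment = 378_819     # total NJ college enrollment, 2022
    eligible_enrollment = 327_353  # enrollment at the 45 participating colleges
    first_year_cost = 4_000_000    # first-year contract cost in dollars
    institutions = 45

    share_of_state = engaged_students / state_enrollment        # ~4.8%, "around 4 percent"
    share_of_eligible = engaged_students / eligible_enrollment  # ~5.6%, "6 percent"
    cost_per_institution = first_year_cost / institutions       # ~$88,900, "about $89,000"
    cost_per_student = first_year_cost / eligible_enrollment    # ~$12.20, "$12 per eligible student"

    print(f"Share of state enrollment:    {share_of_state:.1%}")
    print(f"Share of eligible enrollment: {share_of_eligible:.1%}")
    print(f"Cost per institution:         ${cost_per_institution:,.0f}")
    print(f"Cost per eligible student:    ${cost_per_student:,.2f}")
    ```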

    In a 2020 interview with Inside Higher Ed, Uwill CEO Michael London said the minimum cost to a college for one year of services is about $25,000, or $10 to $20 per student per year.

    New Jersey students met with counselors in more than 78,000 therapy sessions, or about six sessions per student between 2023 and 2025, according to Uwill data. Students also engaged in 548 chat sessions with therapists, sent 6,593 messages and requested 1,216 crisis connections during the first two years of service.

    User engagement has slowly ticked up since the partnership launched. In January 2024, the state said more than 7,600 students registered on the platform, scheduling nearly 20,000 sessions. By September 2024, Uwill reported more than 13,000 registered students on the platform, scheduling more than 49,000 sessions. The most recent data, published June 6, identified 18,000 students engaging in 78,000 sessions.

    Over 1,200 of Montclair State’s 22,000 students have registered with Uwill since June 2023, Jaclyn Friedman-Lombardo, Montclair State’s director of counseling and psychological services, said at a press conference, or approximately 6 percent of the total campus population.

    The state does not require institutions to track student usage data to compare usage to campus counseling center services, but some institutions choose to, according to a spokesperson for both the office of the secretary and Uwill. The secretary’s office can view de-identified campus-level data and institutions can engage with more detailed data, as well.

    Creating access: One of the goals of implementing digital mental health interventions is to expand access beyond traditional counseling centers, such as after hours, on weekends or over academic breaks.

    Roughly 30 percent of participants in the Uwill partnership completed a session between 5 p.m. and 9 a.m. on a weeknight or on the weekends. Over the 2024–25 winter break, students engaged in 3,073 therapy sessions. More than 90 of those took place outside New Jersey. Students also used Uwill services over summer vacation this past year (9,235 sessions from May 20 to Aug. 26, of which 10 percent took place outside New Jersey).

    A majority of users were traditional-aged college students (17 to 24 years old), and 32 percent were white, 25 percent Hispanic and 17 percent Black. The report did not compare participating students’ race to those using on-campus services or general campus populations.

    About 85 percent of New Jersey users were looking for a BIPOC therapist, and 9 percent requested therapists who speak languages other than English, including Hindi and Mandarin.

    Postsession assessments completed by students who do schedule an appointment have returned positive responses, with a feedback score of 9.5 out of 10 in New Jersey, compared to Uwill’s 9.2 rating nationally.

    Unanswered questions: Wood indicated the data leaves some questions unanswered, such as whether students were also clients at the on-campus counseling center, or whether the service improved students’ mental health over time from a clinical perspective.

    “Just because a student had four sessions with a telehealth provider, if they came right back to the counseling center, did it really make an impact on the center’s capacity to see students?” Wood said.

    The high cost of the service should also give counseling center directors pause, Wood said, because those dollars could be used for a variety of other interventions to create capacity.

    The data indicated some benefits to counseling center capacity, including diverse staff and after-hours support. But to create a true return on investment, counseling centers should calculate how much capacity the tele–mental health service created and its direct impact on student wellness, not just participation in services.

    “It would be ideal to compare the number of students receiving services (not just creating an account) through the platform to the number of students who would likely benefit from receiving treatment, as identified by clinically validated mental health screens on population surveys,” said Sara Abelson, assistant professor at the Hope Center and the report’s lead author.

    What’s next: New Jersey renewed its contract with Uwill first in January 2024 and then again in May, extending through spring 2026. State leaders said the ongoing services are still supported by pandemic relief funds.

    On May 2, New Jersey assemblywoman Andrea Katz from the Eighth District introduced a bill, the Mental Health Early Access on Campus Act, which would require colleges to implement mental health first aid training among campus stakeholders, peer support programs, mental health orientation education and teletherapy services to ensure counseling ratios are one to every 1,250 students per campus. The International Accreditation of Counseling Services recommends universities maintain a ratio of at least one full-time equivalency for every 1,000 to 1,500 students.

    “We know that mental health services that our kids need are not going to end when we change governors,” Katz said at a press conference. “We need to make sure that all of this is codified into law.”

  • How to Build a High-Impact Data Team Without the Full-Time Headcount [Webinar]

    You’re under increased pressure to make better, data-informed decisions. However, most colleges and universities don’t have the budget to build the kind of data team that drives strategic progress. And even if you can hire, you’re competing with other industries that pay top dollar, making it hard, if not impossible, to find the right data resource with all the skills to move your operation forward. Don’t let hiring roadblocks make you settle for siloed insights and stagnant dashboards.

    How to Build a High-Impact Data Team
    Without the Full-Time Headcount
    Thursday, June 26
    2:00 pm ET / 1:00 pm CT 

    In this webinar, Jeff Certain, VP of Solution Development and Go-to-Market, and Dan Antonson, AVP of Data and Analytics, break down how a managed services model can help you create a high-impact data team at a fraction of the cost and give you access to a robust bench of highly specialized data talent. They will also share some real-world examples of nimble, high-impact data teams in action. 

    You’ll walk away knowing: 

    • Which data roles are needed for success and scale in higher ed 
    • How to rapidly scale data operations without adding FTEs 
    • Why managed services are a smarter investment than full-time hires 
    • Ways to tap into cross-functional expertise on demand 
    • How to build a future-ready data infrastructure without ballooning your org chart 

    Whether you’re starting from scratch or trying to scale a lean team, this session will offer practical, flexible strategies to get there faster — and more cost-effectively.  

    Who Should Attend:

    If you are a data-minded decision-maker in higher ed or a cabinet-level leader being asked to do more with less, this webinar is for you. 

    • Presidents and provosts 
    • CFOs and COOs 
    • Enrollment and marketing leaders  

    Expert Speakers

    Jeff Certain

    VP of Solution Development and Go-to-Market

    Collegis Education

    Dan Antonson

    AVP of Data and Analytics

    Collegis Education

    It’s time to move past the piecemeal approach and start driving real outcomes with your data. Complete the form to reserve your spot! We look forward to seeing you on Thursday, June 26. 

  • Education Department reinstates some research and data activities

    Education Secretary Linda McMahon has repeatedly said that the February and March cancellations and firings at her department cut not only the “fat” but also into some of the “muscle” of the federal role in education. So, even as she promises to dismantle her department, she is also bringing back some people and restarting some activities. Court filings and her own congressional testimony illuminate what this means for the agency as a whole, and for education research in particular. 

    McMahon told a U.S. House committee last month she rehired 74 employees out of the roughly 2,000 who were laid off or agreed to separation packages. A court filing earlier this month says the agency will revive about a fifth of research and statistics contracts killed earlier this year, at least for now, though that doesn’t mean the work will look exactly as it did before.  

    The Trump administration disclosed in a June 5 federal court filing in Maryland that it either has or is planning to reinstate 20 of 101 terminated contracts to comply with congressional statutes. More than half of the reversals will restart 10 regional education laboratories that the Trump administration had said were engaged in “wasteful and ideologically driven spending,” but had been very popular with state education leaders. The reinstatements also include an international assessment, a study of how to help struggling readers, and Datalab, a web-based data analysis tool for the public. 

    Even some of the promised reinstatements are uncertain because the Education Department plans to put some of them up for new bids (see table below). That process could take months and potentially result in smaller contracts with fewer studies or hours of technical assistance. 

    These research activities were terminated by Elon Musk’s Department of Government Efficiency (DOGE) before McMahon was confirmed by the Senate. The Education Department’s disclosure of the reinstatements occurred a week after President Donald Trump bid farewell to Musk in the Oval Office and on the same day that the Trump-Musk feud exploded on social media. 

    See which IES contracts have been or are slated to be restarted, or under consideration for reinstatement
    1. Regional Education Laboratory – Mid Atlantic: Intends to seek new bids and restart contract
    2. Regional Education Laboratory – Southwest: Intends to seek new bids and restart contract
    3. Regional Education Laboratory – Northwest: Intends to seek new bids and restart contract
    4. Regional Education Laboratory – West: Intends to seek new bids and restart contract
    5. Regional Education Laboratory – Appalachia: Intends to seek new bids and restart contract
    6. Regional Education Laboratory – Pacific: Intends to seek new bids and restart contract
    7. Regional Education Laboratory – Central: Intends to seek new bids and restart contract
    8. Regional Education Laboratory – Midwest: Intends to seek new bids and restart contract
    9. Regional Education Laboratory – Southeast: Intends to seek new bids and restart contract
    10. Regional Education Laboratory – Northeast and Islands: Intends to seek new bids and restart contract
    11. Regional Education Laboratory – umbrella support contract: Intends to seek new bids and restart contract
    12. What Works Clearinghouse (website and reviewer training, but no reviewing of education research): Approved for reinstatement
    13. Statistical standards and data confidentiality technical assistance for the National Center for Education Statistics: Reinstated
    14. Statistical and confidentiality review of electronic data files and technical reports: Approved for reinstatement
    15. Datalab, a web-based data analysis tool for the public: Approved for reinstatement
    16. U.S. participation in the Program for International Student Assessment (PISA), an international test overseen by the Organization for Economic Cooperation and Development (OECD): Reinstated
    17. Data quality and statistical methodology assistance: Reinstated
    18. EDFacts, a collection of administrative data from school districts around the country: Reinstated
    19. Demographic and geospatial estimates (e.g., school poverty and school locations) used for academic research and federal program administration: Approved for reinstatement
    20. Evaluation of the Multi-tiered System of Supports in reading, an approach to help struggling students: Approved for reinstatement
    21. Implementation of the Striving Readers Comprehensive Literacy Program and feasibility of conducting an impact evaluation of it: Evaluating whether to restart
    22. Policy-relevant findings for the National Evaluation of Career and Technical Education: Evaluating whether to restart
    23. The National Postsecondary Student Aid Study (how students finance college, college graduation rates and workforce outcomes): Evaluating whether to restart
    24. Additional higher ed studies: Evaluating whether to restart
    25. Publication assistance on educational topics and the annual report: Evaluating whether to restart
    26. Conducting peer review of applications, manuscripts and grant competitions at the Institute of Education Sciences: Evaluating whether to restart

    The Education Department press office said it had no comment beyond what was disclosed in the legal brief. 

    Education researchers, who are suing the Trump administration to restore all of its previous research and statistical activities, were not satisfied.

    Elizabeth Tipton, president of the Society for Research on Educational Effectiveness (SREE), said the limited reinstatement is “upsetting.” “They’re trying to make IES as small as they possibly can,” she said, referring to the Institute of Education Sciences, the department’s research and data arm.

    SREE and the American Educational Research Association (AERA) are suing McMahon and the Education Department in the Maryland case. The suit asks for a temporary reinstatement of all the contracts and the rehiring of IES employees while the courts adjudicate the broader constitutional issue of whether the Trump administration violated congressional statutes and exceeded its executive authority.

    The 20 reinstatements were not ordered by the court, and in some instances, the Education Department is voluntarily restarting only a small slice of a research activity, making it impossible to produce anything meaningful for the public. For example, the department said it is reinstating a contract for operating the What Works Clearinghouse, a website that informs schools about evidence-based teaching practices. But, in the legal brief, the department disclosed that it is not planning to reinstate any of the contracts to produce new content for the site. 

    Related: Education researchers sue Trump administration, testing executive power

    In the brief, the administration admitted that congressional statutes mention a range of research and data collection activities. But the lawyers argued that the legislative language often uses the word “may” instead of “must,” or notes that evaluations of education programs should be done “as time and resources allow.”

    “Read together, the Department has wide discretion in whether and which evaluations to undertake,” the administration lawyers wrote. 

    The Trump administration argued that as long as it has at least one contract in place, it is technically fulfilling a congressional mandate. For example, Congress requires that the Education Department participate in international assessments. That is why it is now restarting the contract to administer the Program for International Student Assessment (PISA), but not other international assessments that the country has participated in, such as the Trends in International Mathematics and Science Study (TIMSS).

    The administration argued that researchers didn’t make a compelling case that they would be irreparably harmed if many contracts were not restarted. “There is no harm alleged from not having access to as-yet uncreated data,” the lawyers wrote.

    One of the terminated contracts was supposed to help state education agencies create longitudinal data systems for tracking students from pre-K to the workforce. The department’s brief says that states, not professional associations of researchers, should sue to restore those contracts. 

    Related: DOGE’s death blow to education studies

    In six instances, the administration said it was evaluating whether to restart a study. For example, the legal brief says that because Congress requires the evaluation of literacy programs, the department is considering a reinstatement of a study of the Striving Readers Comprehensive Literacy Program. But lawyers said there was no urgency to restart it because there is no deadline for evaluations in the legislative language.

    In four other instances, the Trump administration said it wasn’t feasible to restart a study, despite congressional requirements. For example, Congress mandates that the Education Department identify and evaluate promising adult education strategies. But after terminating such a study in February, the Education Department admitted that it is now too difficult to restart it. The department also said it could not easily restart two studies of math curricula in low-performing schools. One of the studies called for the math program to be implemented in the first year and studied in the second year, which made it especially difficult to restart. A fourth study the department said it could not restart would have evaluated the effectiveness of extra services to help teens with disabilities transition from high school to college or work. When DOGE pulled the plug on that study, those teens lost those services too. 

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about the reinstatement of education statistics and research was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


  • Data Shows Attendance Improves Student Success

    Prior research shows attendance is one of the best predictors of class grades and student outcomes, creating a strong argument for faculty to incentivize or require attendance.

    Attaching grades to attendance, however, can create its own challenges, because many students generally want more flexibility in their schedules and think they should be assessed on what they learn—not how often they show up. A student columnist at the University of Washington expressed frustration at receiving a 20 percent weighted participation grade, which the professor based on exit tickets students submitted at the end of class.

    “Our grades should be based on our understanding of the material, not whether or not we were in the room,” Sophie Sanjani wrote in The Daily, UW’s student paper.

    Keenan Hartert, a biology professor at Minnesota State University, Mankato, set out to understand the factors affecting students’ performance in his own course and found that attendance was one of the strongest predictors of their success.

    His finding wasn’t an aha moment, but reaffirmed his position that attendance is an early indicator of GPA and class community building. The challenge, he said, is how to apply such principles to an increasingly diverse student body, many of whom juggle work, caregiving responsibilities and their own personal struggles.

    “We definitely have different students than the ones I went to school with,” Hartert said. “We do try to be the most flexible, because we have a lot of students that have a lot of other things going on that they can’t tell us. We want to be there for them.”

    Who’s missing class? It’s not uncommon for a student to miss class for illness or an outside conflict, but higher rates of absence among college students in recent years are giving professors pause.

    An analysis of 1.1 million students across 22 major research institutions found that the number of hours students have spent attending class, discussion sections and labs declined dramatically from the 2018–19 academic year to 2022–23, according to the Student Experience in the Research University (SERU) Consortium.

    More than 30 percent of students who attended community college in person skipped class sometimes in the past year, a 2023 study found; 4 percent said they skipped class often or very often.

    Students say they opt out of class for a variety of reasons, including lack of motivation, competing priorities and external challenges. A professor at Colorado State University surveyed 175 of his students in 2023 and found that 37 percent said they regularly did not attend class because of physical illness, mental health concerns, a lack of interest or engagement, or simply because it wasn’t a requirement.

    A 2024 survey from Trellis Strategies found that 15 percent of students missed class sometimes due to a lack of reliable transportation. Among working students, one in four said they regularly missed class due to conflicts with their work schedule.

    High rates of anxiety and depression among college students may also impact their attendance. More than half of 817 students surveyed by Harmony Healthcare IT in 2024 said they’d skipped class due to mental health struggles; one-third of respondents indicated they’d failed a test because of poor mental health.

    A case study: MSU Mankato’s Hartert collected data on about 250 students who enrolled in his 200-level genetics course over several semesters.

    Using an end-of-term survey, class activities and his own grade book information, Hartert collected data measuring student stress, hours slept, hours worked, number of office hours attended, class attendance and quiz grades, among other metrics.

    Mapping out the various factors, Hartert’s case study modeled other findings in student success literature: a high number of hours worked correlated negatively with the student’s course grade, while attendance in class and at review sessions correlated positively with academic outcomes.

    Chart: Data analysis by Keenan Hartert, a biology professor at Minnesota State University, Mankato, found student employment negatively correlated with overall class grade. (Keenan Hartert)
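
    The analysis itself is straightforward to replicate. Below is a minimal sketch of this kind of grade-book correlation study; the file name and column names are hypothetical stand-ins, not Hartert’s actual data or code.

    ```python
    # A minimal sketch of the kind of analysis described above: correlating
    # per-student factors with final course grades. The CSV file and column
    # names are hypothetical stand-ins for a grade book and survey export.

    import pandas as pd

    df = pd.read_csv("genetics_course_data.csv")  # hypothetical export, one row per student

    factors = ["hours_worked", "attendance_pct", "review_sessions_attended", "hours_slept"]

    # Pearson correlation of each factor with the final course grade.
    correlations = df[factors].corrwith(df["course_grade"])
    print(correlations.sort_values())

    # The pattern reported above would show up here as a negative
    # coefficient for hours_worked and positive coefficients for
    # attendance_pct and review_sessions_attended.
    ```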

    The data also revealed to Hartert some of the challenges students face while enrolled. “It was brutal to see how many students [were working full-time]. Just seeing how many were [working] over 20 [hours] and how many were over 30 or 40, it was different.”

    Nationally, two-thirds of college students work for pay while enrolled, and 43 percent of employed students work full-time, according to fall 2024 data from Trellis Strategies.

    Hartert also asked students if they had any financial resources to support them in case of emergency; 28 percent said they had no fallback. Of those students, 90 percent were working more than 20 hours per week.

    Charts: Data analysis of student surveys shows that working students often lack financial resources to support them in an emergency, and that working more hours is connected to passing or failing a course.

    The findings illustrated to him the challenges many students face in managing their job shifts while trying to meet attendance requirements.

    A Faculty Aside

    While some faculty may be less interested in using predictive analytics for their own classes, Hartert found that tracking factors like how often a student attends office hours also served his own career goals, because he could include those measurements in his tenure review.

    An interpersonal dynamic: A less measured factor in the attendance debate is not a student’s own learning, but the classroom environment they contribute to. Hartert framed it as students motivating their peers unknowingly. “The people that you may not know that sit around you and see you, if you’re gone, they may think, ‘Well, they gave up, why should I keep trying?’ Even if they’ve never spoken to you.”

    One professor at the University of Oregon found that peer engagement positively correlated with academic outcomes. Raghuveer Parthasarathy restructured his general education physics course to promote engagement by creating an “active zone,” or a designated seating area in the classroom where students sat if they wanted to participate in class discussions and other active learning conversations.

    Compared to other sections of the course, the class was more engaged across the board, even among those who didn’t opt to sit in the participation zone. Additionally, students who sat in the active zone were more likely to earn higher grades on exams and in the course over all.

    Attending class can also create connections between students and professors, something students say they want and expect.

    A May 2024 student survey by Inside Higher Ed and Generation Lab found that 35 percent of respondents think their academic success would be most improved by professors getting to know them better. In a separate question, 55 percent of respondents said they think professors are at least partly responsible for becoming a mentor.

    The SERU Consortium found student respondents in 2023 were less likely to say a professor knew or had learned their name compared to their peers in 2013. Students were also less confident that they knew a professor well enough to ask for a letter of recommendation for a job or graduate school.

    “You have to show up to class then, so I know who you are,” Hartert said.

    Meeting in the middle: To encourage attendance, Hartert employs active learning methods such as creative writing or case studies, which help demonstrate the value of class participation. His favorite is a jury scenario, in which students put their medical expertise into practice with criminal cases. “I really try and get them in some gray-area stuff and remind them, just because it’s a big textbook doesn’t mean that you can’t have some creative, fun ideas,” Hartert said.

    For those who can’t make it, all of Hartert’s lectures are recorded and available online to watch later. Recording lectures, he said, “was a really hard bridge to cross, post-COVID. I was like, ‘Nobody’s going to show up.’ But every time I looked at the data [for] who was looking at the recording, it’s all my top students.” That was reason enough for him to leave the recordings available as additional practice and resources.

    Students who can’t make an in-person class session can receive attendance credit by sending Hartert their notes and answers to any questions asked live during the class, proving they watched the recording.

    Hartert has also made adjustments to how he uses class time to create more avenues for working students to engage. His genetics course includes a three-hour lab section, which rarely lasts the full time, Hartert said. Now, the final hour of the lab is a dedicated review session facilitated by peer leaders, who use practice questions Hartert designed. Initial data shows working students who stayed for the review section of labs were more likely to perform better on their exams.

    “The good news is when it works out, like when we can make some adjustments, then we can figure our way through,” Hartert said. “But the reality of life is that time marches on and things happen, and you gotta choose a couple priorities.”

    Do you have an academic intervention that might help others improve student success? Tell us about it.

  • Preserving the Federal Data Trump Is Trying to Purge

    Within days of taking office, the Trump administration began purging federal demographic data—on a wide range of topics, including public health, education and climate—from government websites to comply with the president’s bans on “gender ideology” and diversity, equity and inclusion initiatives.

    Over the past five months, more than 3,000 taxpayer-funded data sets—many congressionally mandated—collected by federal agencies including the Centers for Disease Control and Prevention, the National Center for Education Statistics, and the Census Bureau, have been caught in the cross fire.

    One of the first data sets to disappear was the White House Council on Environmental Quality’s Climate and Economic Justice Screening Tool, an interactive map of U.S. Census tracts “marginalized by underinvestment and overburdened by pollution,” according to a description written under a previous administration.

    It’s the type of detailed, comprehensive data academics rely on to write theses, dissertations, articles and books that often help to inform public policy. And without access to it and reams of other data sets, researchers in the United States and beyond won’t have the information they need to identify social, economic and technological trends and forge potential solutions.

    “Removing this data is removing a big piece of knowledge from humanity,” said Cathy Richards, a civic science fellow and data inclusion specialist at the Open Environmental Data Project, which aims to strengthen the role of data in environmental and climate governance. “A lot of science is about innovating on what people did before. New scientists work with data they may have never seen before, but they’re using the knowledge that came before them to create something better. I don’t think we fully understand the impact [that] deleting 50 years of knowledge will have on science in the future.”

    That’s why she and scores of other concerned academic librarians, researchers and data whizzes are collaborating—many of them as unpaid volunteers—to preserve as much of that data as they can on nongovernment websites. Some of the groups involved include OEDP, the Data Rescue Project, Safeguarding Research and Culture, the Internet Archive, the End of Term Archive, and the Data.gov Archive, which is run by the Harvard Law School Library.

    For Richards at OEDP, data-preservation efforts started right after Trump won the election in November.

    She and her colleagues remembered how Trump, a climate change denier, had removed some—mostly environmental—data in 2017, and they wanted to get a head start on preserving any data that could become a target during his second term. OEDP, which launched in 2020 in response to the first Trump administration’s environmental policies, which prioritized fossil fuel extraction, compiled a list of about 200 potentially vulnerable federal data sets researchers said would be critical to continuing their work. They spent the last two months of 2024 and the first weeks of 2025 collecting and downloading as many data sets as they could ahead of Trump’s Jan. 20 inauguration, which they then transferred to stable, independent and publicly accessible webpages.

    “That took time,” Richards said, noting that not every data set and its accompanying metadata was easy to replicate. “Each varied significantly. Some required scraping. In one case I had to manually download 400 files, clicking each one every few minutes.”
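
    The mechanical part of that work can be as simple as a download loop. Here is a minimal sketch, assuming a plain text file listing dataset URLs; real rescues, as Richards notes, also involve scraping and capturing metadata, which a loop like this does not attempt.

    ```python
    # A minimal sketch of bulk-downloading a list of dataset files for
    # re-hosting. The URL list and output directory are hypothetical.
    # Real archiving efforts also preserve metadata and provenance.

    import pathlib
    import time
    import urllib.request

    urls = pathlib.Path("dataset_urls.txt").read_text().splitlines()  # hypothetical list
    out_dir = pathlib.Path("archive")
    out_dir.mkdir(exist_ok=True)

    for url in urls:
        # Name the local file after the last path segment of the URL.
        target = out_dir / url.rstrip("/").rsplit("/", 1)[-1]
        if target.exists():  # skip files already fetched, so the run is resumable
            continue
        try:
            urllib.request.urlretrieve(url, target)
            print(f"saved: {target}")
        except OSError as err:  # log failures and keep going
            print(f"failed: {url} ({err})")
        time.sleep(2)  # throttle politely between requests
    ```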

    While they made a lot of headway, OEDP’s small team wasn’t able to preserve all of the data sets on their list by late January. And once Trump took office, the research community’s fears that the president would start scrubbing federal data were quickly realized.

    “Data started to go down very quickly,” Richards said, and at a much larger scale than in 2017; anything that mentioned race, gender or the LGBTQ+ community, among other keywords, became a target. “We started getting emails from people saying these websites were no longer working, panicking because they needed it to finish their thesis.”

    As of this month, OEDP has completed archiving about 100 data sets, including the CDC’s Pregnancy Mortality Surveillance System, the Census Bureau’s American Community Survey, and the White House’s Climate and Economic Justice Screening Tool. As it works to complete dozens more, it’s also in communication with the other data-preservation efforts to make sure the work isn’t duplicated and that researchers and the general public can maintain access to as much data as possible.

    ‘Disrupted Trust’

    Prior to Trump’s inauguration, 307,851 data sets were available on Data.gov. One month later, the number had dipped to 304,621. In addition to data-rescue efforts, the winnowing prompted outcry from the research community.

    “As scientists who rely on these data to understand the causes and consequences of population change for individuals and communities, but also as taxpayers who have supported the collection, dissemination, and storage of these data, we are deeply concerned,” read a joint statement that the Population Association of America and the Association of Population Centers published in early February. “Removing data indiscriminately, even temporarily, from secure portals maintained by federal agencies undermines trust in the nation’s statistical and scientific research agencies and puts the integrity of these data at risk.”

    Federal judges have since ordered the government to restore many of the deleted data sets—as of Sunday, Data.gov listed 311,609 available data sets—and the Trump administration has complied, albeit reluctantly. For instance, the CDC’s Social Vulnerability Index, which since 2007 has tracked communities that may need support before, during or after natural disasters, came back online in February. But it now has a warning label from the Trump administration, which claims that the information does “not reflect biological reality” and the government therefore “rejects it.”

    Richards, of OEDP, remains skeptical about the return of some of the data, speculating that the government may alter it to better fit its ideological narratives before restoring it. Thus, capturing the data before it gets taken down in the first place is “important for us to have that baseline proof that this is how things were on Jan. 18 and 19,” she said.

    Lynda Kellam, a longtime academic data librarian who is helping to run the Data Rescue Project—which has already finished archiving some 1,000 federal data sets with the help of hundreds of volunteers—said she’s also “a little bit pessimistic” about the future of data collection. That’s not only because the Trump administration has fired thousands of federal workers who carry out that data collection, canceled billions in research contracts and removed reams of public data; it’s also because the Department of Government Efficiency has accessed protected personal data contained within some of those data sets.

    “How do we actually talk to people about what’s protected and what those protections are for the data the government is collecting? DOGE has disrupted that trust,” she said. “For example, someone sent us a message asking us why they should participate in the American Community Survey when they weren’t sure what was going to happen with their (confidential, legally protected) data … There are still those protections in place, but there’s skepticism about whether those protections will hold because of what has happened in the past five months.”

    Some legal protections are already eroding. On Friday, the U.S. Supreme Court sided with the Trump administration in determining that DOGE should have—for now—access to information collected by the Social Security Administration, including Social Security numbers, medical and mental health records, and family court information. (The case is now headed to a federal appeals court in Virginia that will decide on its merits.)

    Henrik Schönemann, a digital history and humanities expert at Humboldt University of Berlin, who helps run the Safeguarding Research and Culture initiative, which has also archived high volumes of federal data since January, said efforts to rescue federal data collections are vital to the global research community. “Even if the United States falls out of it, we are still here and we still need this data,” he said. And if and when this political moment passes, “hopefully having this data can help [the United States] rebuild.”

    While Schönemann thinks it’s an “illusion” that independent efforts to preserve federal data can effectively counter the United States’ slide into autocracy, he believes they’re better than nothing.

    “It’s building communities and showing people they can do something about it,” he said. “And maybe this empowerment could lead them to feeling empowered in other areas and give people hope.”

    Source link

  • Data lag and ambition aggregation means APPs are fundamentally flawed

    Data lag and ambition aggregation means APPs are fundamentally flawed

    In her letter to the sector last November, Secretary of State Bridget Phillipson said that she expects universities to play a stronger role in expanding access and improving outcomes for disadvantaged students.

    Her letter noted that the gap in outcomes from higher education between disadvantaged students and others is unacceptably large and is widening, with participation from disadvantaged students in decline for the first time in two decades.

    She’s referring to the Free School Meals (FSM) eligible HE progression rate – 29 percent in 2022–23, down for the first time in the series.

    Of course in 2023–24, or this year, the numbers for FSM and any number of other factors could be much worse – but on the current schedule, we won’t be seeing an update to OfS’ access and participation data dashboard until “summer or autumn 2025”, and even then only for 2023–24.

    If you’re prepared to brave the long loading times – which for me generate the same level of frustration I used to experience watching Eurovision national finals 20 years ago – you can drill down into that dashboard by provider.

    It’s a mixed picture, with a lot of splits to choose from. But what the data doesn’t tell us is how providers are doing when compared to their signed off targets in their (mainly 2020–21 to 2024–25) access and participation plans.

    The last time OfS published any monitoring data was for the 2020–21 academic year – almost three years ago, in September 2022.

    That means that we can’t see how well providers are doing against their targets, nor do we have any sense of what action OfS may (or may not) have taken to tackle underperformance.

    So I decided to have a go. I restricted my analysis to the Russell Group, and extracted all of the targets from the 2020–21 to 2024–25 plans that were measurable via the dashboard.

    I then compared the 2022–23 performance with the relevant milestone, and with the original baseline. Where the target was unclear on what type of student was in scope, I assumed full-time, first degree students.
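    To make that comparison concrete, here is a minimal sketch of the check involved, assuming the targets have first been extracted by hand into a CSV. One wrinkle it has to handle: for most targets (gaps) a lower number is better, but for intake-style targets a higher one is, so each row needs a direction flag. The file name, column names and flag are my own illustrative assumptions, not the tooling behind this analysis.

    ```python
    # Minimal sketch of the milestone/baseline comparison; file and column
    # names are hypothetical assumptions, not OfS or provider tooling.
    import pandas as pd

    targets = pd.read_csv("rg_app_targets.csv")  # hand-extracted targets (hypothetical)

    def behind(actual: float, reference: float, lower_is_better: bool) -> bool:
        """True if 2022-23 performance is worse than the reference value."""
        return actual > reference if lower_is_better else actual < reference

    for ref in ("milestone", "baseline"):
        targets[f"behind_{ref}"] = [
            behind(a, r, lo)
            for a, r, lo in zip(
                targets["actual_2022_23"], targets[ref], targets["lower_is_better"]
            )
        ]

    # Tally the flags by lifecycle stage (ACCESS, CONT, ATTAIN, PROG)
    print(targets.groupby("stage")[["behind_milestone", "behind_baseline"]].sum())
    ```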

    The results are pretty worrying.

    | Stage | Group | Target | Baseline | 2022–23 milestone | 2022–23 actual | Behind milestone? | Behind baseline? |
    |---|---|---|---|---|---|---|---|
    | PROG | Disabled | Percentage difference in progression to employment and further study between disabled and non-disabled. | 3.00 | 2.00 | 0.10 | N | N |
    | PROG | Ethnicity | Percentage difference in graduate employability between white and black students | 7.9 | 4.70 | -2.50 | N | N |
    | CONT | Disabled | Percentage difference in non-continuation rates non-disabled and students with mental health conditions | 7.00 | 5.50 | 1.80 | N | N |
    | CONT | Disabled | Percentage difference in continuation rates between disabled students and non-disabled students. | 6.4 | 3 | 1.3 | N | N |
    | CONT | Low Participation Neighbourhood (LPN) | Percentage difference in non-continuation rates between POLAR4 quintile 5 and quintile 1 students. | 5 | 3.5 | 2.3 | N | N |
    | CONT | Low Participation Neighbourhood (LPN) | Percentage difference in non-continuation rates between POLAR4 quintile 5 and quintile 1 students. | 4 | 2.5 | 3.40 | Y | N |
    | CONT | Low Participation Neighbourhood (LPN) | Close the gap in non-continuation between POLAR 4 Q1 and Q5 undergraduate students from 3.8% in 2016/2017 to 1.5% in 2024/25 | 3.8 | 3 | 6.4 | Y | Y |
    | CONT | Low Participation Neighbourhood (LPN) | POLAR4 Q1 non-continuation gap v Q5 (relates to KPM3) | 4 | 3.25 | 6.1 | Y | Y |
    | CONT | Low Participation Neighbourhood (LPN) | Percentage difference in non-continuation rates between POLAR4 quintile 5 and quintile 1 students | 2.40 | 1.00 | 6.90 | Y | Y |
    | CONT | Low Participation Neighbourhood (LPN) | Percentage difference in continuation rates between the most (POLAR Q5) and least (POLAR Q1) representative groups. | 2.4 | 1.5 | 3.1 | Y | Y |
    | CONT | Mature | Percentage point difference in non-continuation rates between young (under 21) and mature (21 and over) students. | 10 | 9 | 6.8 | N | N |
    | CONT | Mature | Percentage difference in continuation rates of mature first degree entrants when compared to young students. | 10.2 | 7 | -0.4 | N | N |
    | CONT | Mature | Significantly raise the percentage of our intake from mature students | 5.90 | 7.00 | 4.10 | N | Y |
    | CONT | Mature | Percentage difference in non-continuation rates mature and non-mature students | 9.00 | 6.00 | 7.40 | Y | N |
    | CONT | Mature | Percentage difference in non-continuation rates between mature (aged 21+) and young (aged | 8.00 | 5.00 | 5.10 | Y | N |
    | CONT | Mature | Close the gap in non-continuation between young and mature full-time, first degree students from 7.8% in 2016/2017 to 4.4% in 2024/2025. | 7.8 | 6.8 | 10.2 | Y | Y |
    | CONT | Mature | Mature v Young non-continuation gap | 9 | 8.5 | 10.1 | Y | Y |
    | CONT | Mature | Close the gap in continuation rates between young and mature students (by 1pp each year) by 2024/25. | 5 | 3 | 6.1 | Y | Y |
    | CONT | Mature | Percentage difference in non-continuation rates between mature and young students | 5.30 | 3.80 | 5.80 | Y | Y |
    | ATTAIN | Disabled | Percentage difference in degree attainment (1st and 2:1) between disabled students and other students | 2.60 | 1.72 | 0.9 | N | N |
    | ATTAIN | Disabled | Disabled students attainment gap v non-disabled | 3 | 1.5 | 1.2 | N | N |
    | ATTAIN | Disabled | To significantly reduce the difference in degree attainment (1st and 2:1) between disabled students and students with no known disability | 4.4 | 2 | 0.30 | N | N |
    | ATTAIN | Disabled | Percentage point difference in good degree attainment (1st and 2:1) between disabled and not known to be disabled students. | 6 | 5 | -2.2 | N | N |
    | ATTAIN | Disabled | To remove the absolute gap in degree outcomes for students with a disability (OfS KPM5). | 4.0 | 2.0 | -0.60 | N | N |
    | ATTAIN | Disabled | Percentage difference in degree attainment (1st and 2:1) between disabled and non-disabled students | 3.90 | 2.00 | 3.60 | Y | N |
    | ATTAIN | Disabled | Percentage difference in degree attainment (1st and 2:1) between students with registered mental health disabilities and non-disabled students | 5.80 | 3.00 | 4.7 | Y | N |
    | ATTAIN | Disabled | Percentage difference in degree attainment (1st and 2:1) between disabled students and non-disabled students | 4.2 | 2.3 | 3.6 | Y | N |
    | ATTAIN | Ethnicity | Black students attainment gap v White (relates to KPM4) | 20 | 15.5 | 11.2 | N | N |
    | ATTAIN | Ethnicity | By 2025, reduce the attainment gap between Asian and white students | 8.4 | 5.2 | 4.80 | N | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between black and white students (5 year rolling average). | 12 | 8.6 | 4.60 | N | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and asian students. | 19 | 17 | 14.4 | N | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and black students. | 14.00 | 11.00 | 9.90 | N | N |
    | ATTAIN | Ethnicity | To close the gap between Black and White student continuation rates (reducing the gap by 4 percentage points, from 8% to 4%, by 2024/2025). | 8 | 5.6 | 5.5 | N | N |
    | ATTAIN | Ethnicity | To close the gap between BME and White student attainment (reducing the gap by 3 percentage points from 11% to 8% by 2024/25). | 17 | 13.1 | 11.6 | N | N |
    | ATTAIN | Ethnicity | Close the unexplained gap between proportion of BAME and white full-time, first degree students attaining a 2:1 or above from 12.7% in 2017/2018 to 5.5% in 2024/2025. | 12.7 | 10.3 | 10.8 | Y | N |
    | ATTAIN | Ethnicity | Significantly increase the percentage of our intake from Black students | 2.30 | 3.80 | 2.90 | Y | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and black students | 15.70 | 9.815 | 11.6 | Y | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and Asian students | 12.5 | 8.375 | 11.4 | Y | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between black and white students. | 20 | 15 | 19.00 | Y | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and BME students. | 5.20 | 2.00 | 4.60 | Y | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and black students. | 13.8 | 6 | 12.9 | Y | N |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between BAME and White students. | 7.00 | 4.00 | 7.50 | Y | Y |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and black students | 4.50 | 3.00 | 31.00 | Y | Y |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and BAME students | 9.50 | 6.00 | 11.60 | Y | Y |
    | ATTAIN | Ethnicity | By 2025, reduce the attainment gap between black and white students | 8.7 | 5.9 | 10.70 | Y | Y |
    | ATTAIN | Ethnicity | To significantly reduce the difference in degree attainment (1st and 2:1) between white and black students | 11.6 | 10 | 20.00 | Y | Y |
    | ATTAIN | Ethnicity | To significantly reduce the difference in degree attainment (1st and 2:1) between white and Asian students | 10.6 | 10 | 14.50 | Y | Y |
    | ATTAIN | Ethnicity | Percentage point difference in good degree attainment (1st and 2:1) between white and black students. | 18 | 14 | 22.1 | Y | Y |
    | ATTAIN | Ethnicity | Percentage difference in degree attainment (1st and 2:1) between white and black students. | 17 | 15 | 22.9 | Y | Y |
    | ATTAIN | Ethnicity | Halve the gap in attainment that are visible between black and white students (OfS KPM4). | 10.0 | 7.0 | 15.80 | Y | Y |
    | ATTAIN | Ethnicity | To close the gap between Black and White student attainment (by raising the attainment of Black students) reducing the gap by 8.5 percentage points from 17% to 8.5% by 2024/25 | 11 | 9.5 | 24 | Y | Y |
    | ATTAIN | Low Participation Neighbourhood (LPN) | Percentage difference in degree attainment (1st and 2:1) between POLAR4 quintile 5 and quintile 1 students | 9.10 | 4.645 | 8.7 | Y | N |
    | ATTAIN | Mature | Close the unexplained gap between proportion of mature and young full-time, first degree students attaining a 2:1 or above from 12.1% in 2017/2018 to 6.8% in 2024/2025. | 12.1 | 8.8 | 12.6 | Y | Y |
    | ATTAIN | Socio-economic | Percentage difference in degree attainment (1st and 2:1) between students from most and least deprived areas (based on IMD) | 10.20 | 6.00 | 12.30 | Y | Y |
    | ATTAIN | Socio-economic | To significantly reduce the difference in degree attainment (1st and 2:1) between the most and least advantaged as measured by IMD. | 10.4 | 8.8 | 15.60 | Y | Y |
    | ATTAIN | Socio-economic | Reduce the gaps in attainment that are visible between IMD Q1 and Q5 (OfS KPM3). | 10.0 | 7.0 | 13.70 | Y | Y |
    | ACCESS | Disabled | By 2025, increase the proportion of students with a declared disability enrolling from the baseline of 9% to 13% | 9 | 11 | 15.70 | N | N |
    | ACCESS | Ethnicity | Significantly increase the percentage of our intake from Asian students | 6.90 | 8.50 | 9.70 | N | N |
    | ACCESS | Ethnicity | Percentage of BAME entrants | 10.10 | 12.50 | 12.70 | N | N |
    | ACCESS | Ethnicity | Increase percentage proportion of students identifying as black entering to at least match or exceed sector average (11%). | 9.5 | 10.5 | 11.7 | N | N |
    | ACCESS | Ethnicity | To increase the proportion of Black, young, full-time undergraduate entrants by 1.2 percentage points, from 2.4% to 3.6% by 2024/25. | 2.4 | 2.8 | 2.1 | Y | Y |
    | ACCESS | Low Participation Neighbourhood (LPN) | Ratio in entry rates for POLAR4 quintile 5: quintile 1 students | 7.4:1 | 6:1 | 4.5 | N | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | Reduce the ratio in entry rates for POLAR4 quintile 5: quintile 1 students | 3.9:1 | 3.4:1 | 3.4:1 | N | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | By 2025, reduce the gap in access between those from the highest and lowest POLAR4 quintiles enrolling from the baseline of 49% to 41% | 49 | 45 | 41.00 | N | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | Ratio of students from POLAR Q1 compared to POLAR Q5. | 1:14 | 1:11 | 8.5 | N | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | Close the gap in access between Q1 and Q5 students from a ratio of 5.5 in 2017/2018 to 3.5 by 2024/2025. | 5.5 | 3.64 | 4.2 | Y | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | Reduce ratio in entry rates for POLAR4 quintile 5: quintile 1 students | 12:1 | 8:1 | 8.5 | Y | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | To reduce the gap in participation and ratio in entry rates for POLAR 4 Quintile 5: Quintile 1 students | Ratio Q5:Q1 of 5.2:1 | 500 students from POLAR 4 Q1 | 4.5 or 500 | Y | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | LPN determined by POLAR 4 data. Looking specifically at increasing the intake for LPN Quintile 1 students, and thereby reduce the ratio of Q5 to Q1. (Target articulated as both a percentage and number). | 8.0%, 391 | 10%, 490 | 8.6, 400 | Y | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. | 7.4:1 | 5.5:1 | 6.9 | Y | N |
    | ACCESS | Low Participation Neighbourhood (LPN) | Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. All undergraduates. | 6.2:1 | 5.1:1 | 6.3 | Y | Y |
    | ACCESS | Low Participation Neighbourhood (LPN) | Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. | 4.2:1 | 3.5:1 | 4.3 | Y | Y |
    | ACCESS | Low Participation Neighbourhood (LPN) | Ratio in entry rates for POLAR4 quintile 5: quintile 1 students. Reduce gap to 3.0 to 1.0 by 2024-25 (OfS KPM2). | 5.2 to 1 | 4 | 5.2 | Y | Y |
    | ACCESS | Low Participation Neighbourhood (LPN) | To increase the proportion of young, full-time undergraduate entrants from POLAR4 Q1 by 2.5 percentage points, from 7.8% to 10.3%, by 2024/25. | 7.8 | 8.9 | 10.3 | Y | Y |
    | ACCESS | Low Participation Neighbourhood (LPN) | To increase the proportion of young, full-time undergraduate entrants from POLAR4 Q2 by 2.5 percentage points, from 12.4% to 14.9%, by 2024/25. | 12.4 | 13.9 | 15.4 | Y | Y |
    | ACCESS | Mature | Percentage of mature entrants | 5.80 | 7.20 | 3.70 | Y | Y |
    | ACCESS | Mature | Percentage of mature students as part of the overall cohort. | 9.2 | 11.0 | 6.70 | Y | Y |
    | ACCESS | Multiple | Increase the proportion of BME students from Q1 and Q2 backgrounds | 5.2 | 8 | 7.6 | N | Y |
    | ACCESS | Socio-economic | Eliminate the IMD Q5:Q1 access gap by 2024/25. | 5 | 2 | -4.5 | N | N |
    | ACCESS | Socio-economic | By 2025, reduce the gap in access between those from the highest and lowest IMD quintiles from the baseline of 16.4% to 10.4% | 16.4 | 13.5 | 7.00 | N | N |
    | ACCESS | Socio-economic | Percentage point difference in access rates between IMD quintile 1 and 2 and quintile 3, 4 and 5 students. | 51.8 | 43.8 | 53.4 | Y | Y |

    Milestones and baselines

    If we start with access, of the 25 targets that can be analysed, 14 are behind milestone – and 10 show a worse performance than the baseline.

    On continuation, 11 of the 17 are behind milestone, and 9 are behind the baseline. And on attainment, 25 of the 37 are behind milestone, and 14 are behind the baseline.

    Notwithstanding that some of the other targets might have been smashed, and that in all cases the performance may well have improved since then, that looks like pretty poor performance to me.

    It’s the sort of thing that we might have expected to result in fines, or at least specific conditions of registration being imposed.

    But as far as we know, nothing beyond enhanced monitoring has been applied – and even then, we don’t know who has been under enhanced monitoring.

    And the results are a problem. When OfS launched this batch of plans, it noted that young people from the most advantaged areas of England were over six times as likely to attend one of the most selective universities – including Oxford, Cambridge and other members of the Russell Group – as those from the most disadvantaged areas, and that that gap had hardly changed despite a significant expansion in the number of university places available.

    At the rates of progress forecast under those plans, the ratio was supposed to be less than 4:1 by 2025. It was still at 5.44 in the Russell Group in 2022–23.

    It was supposed to mean around 6,500 extra students from the most disadvantaged areas attending those universities each year from 2024–25 onwards. The Russell Group isn’t the whole of “high tariff” – but it had only increased its total of POLAR1 students by 1,350 by 2022–23.

    OfS also said that nationally, the gap between the proportion of white and black students who are awarded a 1st or 2:1 degree would drop from 22 to 11.2 percentage points by this year. As we’ve noted before on the site, the apparent narrowing during Covid was more of a statistical trick than anything else. It was up at 22.4 in 2022–23.

    And the gap in dropout rates between students from the most and least represented groups was supposed to fall from 4.6 to 2.9 percentage points – it was up at 5.3pp in 2022–23.

    The aggregation of ambition into press-releasable targets appears to have suffered from a similar fate to the equivalent exercise over financial sustainability.

    What a wonderful thing

    Of course, much has happened since January 2020. To the extent to which there were challenges over the student life cycle, they were likely exacerbated by the pandemic and a subsequent cost of living crisis.

    But when a regulator approves four-year plans, changes in the external risk environment ought to mean that it revises what it now calls an Equality of Opportunity Risk Register to reflect them – and either allows providers to revise targets down, or requires more action and investment to meet the targets agreed.

    Neither of those things seems to have happened.

    It’s also the case that OfS has radically changed how it regulates in this area. Back then, the director for fair access and participation was Chris Millward. It’s now John Blake. And the guidance, nature of the plans expected and monitoring regimes have all been revamped.

    But when we’re dealing with long-term plans, a changing of the guard does run the risk that the expectations and targets agreed under the old regime get sidelined and forgotten – letting poor performers off the hook.

    It certainly feels like that’s the case. And while John Blake is widely respected, it’s hard to believe that he’ll still be the director for fair access and participation by the end of the latest round of plans – 2029.

    Hindsight is a wonderful thing, of course, but notwithstanding the external environment changes, few anticipated that any of the gaps, percentages or ratios would worsen for any of the targets set back in 2019.

    That matters because of that OfS aggregation issue. It’s not just that some providers can drag down the performance of the sector as a whole. It’s that no provider was set the target of not getting any worse on the myriad of measures that it didn’t pick for its plan.

    For all we know, while a certain number of providers might have set and agreed a target, say, on POLAR1 access or IMD attainment, performance could have worsened in all of those that didn’t – and that poses a major problem for the regulator and the design of the thing.

    It remains the case that we’re lacking clarity on the way in which the explosion of franchised, urban area business provision has impacted the stats of both the providers that have lit that blue touch paper, and the sector’s scores overall. For me, improvements in access via that method look like cheating – and declines in continuation, completion or progression ought to mean serious questions over funding policy within the Department for Education.

    We don’t really know – but need to know – the impact of other providers’ behaviour on an individual provider’s external environment. If, for example, high tariff universities scoop up more disadvantaged students (without necessarily actually narrowing the gap), that could end up widening the gap elsewhere too. There are only so many moles to whack when you’re looking at access.

    We still can’t see A&P performance by subject area – which has always been an issue when we think about access to the professions, but is an even bigger issue now that whole subject areas are being culled in the face of financial problems.

    And the size and shape question lingers too. UCAS figures at the close of clearing suggested that high tariff providers were set to balance the books by expanding in ways they claimed were impossible when the “mutant algorithm” hit in 2020.

    Much of continuation, completion and progression appears to be about the overall mix of students at a provider – something that’s made much more challenging in medium and lower tariff providers if high-tariff ones lower theirs.

    In the forthcoming skills white paper, we should expect exhortations from ministers that the sector improve its performance on access and participation. The government will have choices to make – on provider type, subject area, the types of disadvantage to focus on, and the balance between measures within its own control in the external environment and measures within providers’ control (or at least influence) that OfS should expect.

    Whatever it chooses, on the evidence available, it will have real problems judging its own performance, its regulator’s, or that of groups of providers or even individual ones. If you think the sector still has some distance to go on fairness, that just won’t do.

    Source link