  • The latest sector-wide financial sustainability assessment from the Office for Students

    As the higher education sector in England gets deeper into the metaphorical financial woods, the frequency of OfS updates on the sector’s financial position increases apace.

    Today’s financial sustainability bulletin updates the regulator’s formal annual assessment of sector financial sustainability, published in May 2025. The update takes account of the latest recruitment data and of any policy changes affecting the sector’s financial outlook that were not reflected in the financial returns providers submitted to OfS ahead of the May report.

    Recruitment headlines

    At sector level, UK and international recruitment for autumn 2025 entry has grown by 3.1 per cent and 6.3 per cent respectively. But this is still lower than the aggregate sector forecasts of 4.1 per cent and 8.6 per cent, a gap that OfS estimates could leave sector-wide tuition fee income £437.8m lower than forecast. “Optimism bias” in financial forecasting might have been dialled back in recent years following stiff warnings from OfS, but these figures suggest it’s still very much a factor.
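
    To make the arithmetic concrete, here is a minimal sketch of how a gap between forecast and actual growth feeds through to a fee income shortfall. The base income figures and the function itself are illustrative assumptions, not OfS’s published model.

    ```python
    # Hypothetical illustration only: OfS has not published its exact method in this bulletin,
    # and the base income figures below are invented. The sketch simply shows how a gap between
    # forecast and actual recruitment growth translates into fee income coming in below plan.

    def fee_income_shortfall(base_income: float, forecast_growth: float, actual_growth: float) -> float:
        """Return how far tuition fee income falls short of forecast (positive = shortfall)."""
        forecast_income = base_income * (1 + forecast_growth)
        actual_income = base_income * (1 + actual_growth)
        return forecast_income - actual_income

    # Growth rates are from the bulletin; the base income figures are placeholders.
    uk = fee_income_shortfall(base_income=10_000_000_000, forecast_growth=0.041, actual_growth=0.031)
    intl = fee_income_shortfall(base_income=7_000_000_000, forecast_growth=0.086, actual_growth=0.063)
    print(f"UK shortfall: £{uk / 1e6:.0f}m, international shortfall: £{intl / 1e6:.0f}m")
    ```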

    Growth has also been uneven across the sector, with large research-intensive institutions increasing UK undergraduate numbers by a startling 9.9 per cent in 2025 (despite apparently collectively forecasting a modest decline of 1.7 per cent), and pretty much everyone else coming in lower than forecast or taking a hit. Medium-sized institutions win a hat tip for producing the most accurate prediction of UK undergraduate growth – actual growth of 2.3 per cent compared to projected growth of 2.7 per cent.

    The picture shifts slightly when it comes to international recruitment, where larger research-intensives have issued 3.3 per cent fewer Confirmations of Acceptance for Studies (CAS) against a forecast 6.6 per cent increase, largely driven by a reduction in visas issued to students from China. Smaller and specialist institutions, by contrast, seem to have enjoyed growth well beyond forecast. The individual institutional picture will, of course, vary even more – and it’s worth adding that the data is not perfect, as not every student applies through UCAS.

    Modelling the impact

    OfS has factored in all of the recruitment data it has and added in new policy announcements, including an estimate of the impact of the indexation of undergraduate tuition fees and of increases to employers’ National Insurance contributions – but not the international levy, because nobody knows when that is happening or how it will be calculated. It has then applied its model to providers’ financial outlook.

    The headline makes for sombre reading – across all categories of provider, OfS is predicting that if no action were taken, the number of providers operating in deficit in 2025–26 would rise from 96 to 124, representing an increase from 35 per cent of the sector to 45 per cent.

    Contrary to the impression given by UK undergraduate recruitment headlines, the negative impact isn’t concentrated in any one part of the sector. OfS modelling suggests that ten larger research-intensive institutions could tip into deficit in 2025–26, up from five that were already forecasting themselves to be in that position. The only category of provider where OfS estimates indicate fewer providers in deficit than forecast is large teaching-intensives.

    Net liquidity of under 30 days is the number to keep an eye on, because for institutional survival running out of cash would be much more of a problem than running a deficit. OfS modelling suggests that the number of providers reporting net liquidity of under 30 days could rise from 41 to 45 in 2025–26, with overall numbers concentrated in the smaller and specialist/specialist creative groups.
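
    For readers less familiar with the metric, the sketch below shows one common way liquidity days are calculated: liquid assets divided by average daily expenditure. The figures, and the assumption that OfS uses this exact definition, are mine rather than the regulator’s.

    ```python
    # Rough sketch only: assumes the common definition of liquidity days
    # (liquid assets / average daily expenditure); OfS's precise definition may differ,
    # and the example figures are invented.

    def net_liquidity_days(liquid_assets: float, annual_expenditure: float) -> float:
        """Days of expenditure a provider could cover from cash and liquid investments."""
        daily_spend = annual_expenditure / 365
        return liquid_assets / daily_spend

    days = net_liquidity_days(liquid_assets=18_000_000, annual_expenditure=250_000_000)
    print(f"Net liquidity: {days:.0f} days")  # ~26 days, below the 30-day threshold discussed above
    ```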

    What it all means

    Before everyone presses the panic button, it’s important to note, as OfS points out, that providers will be well aware of their own recruitment data and its impact on their bottom line, and will have taken what action they can to reduce in-year costs – though nobody should underestimate the ongoing toll those actions will have taken on staff and students.

    Longer term, as always, the outlook appears sunnier, but that’s based on some ongoing optimism in financial forecasting. If, as seems to keep happening, some of that optimism turns out to be misplaced, then the financial struggles of the sector are far from over.

    Against this backdrop, the question remains less about who might collapse in a heap and more about how to manage longer-term strategic change to adapt providers’ business models to the environment they are operating in. Though government has announced that it wants providers to coordinate, specialise and collaborate, while the sector continues to battle heavy financial weather those aspirations will be difficult to realise, however desirable they might be in principle.

    Source link

  • Adult Student Priorities Survey: Understanding Your Adult Learners 

    The Adult Student Priorities Survey (ASPS) is the instrument in the family of Satisfaction-Priorities Surveys that best captures the experiences of graduate-level students and adult learners in undergraduate programs at four-year institutions. The Adult Student Priorities Survey provides the perspectives of these non-traditional student populations, along with external national benchmarks, to inform decision-making at nearly 100 institutions across the country.

    Why the Adult Student Priorities Survey matters

    As a comprehensive survey instrument, the Adult Student Priorities Survey assesses student satisfaction within the context of the level of importance that students place on a variety of experiences, both inside and outside of the classroom. The combination of satisfaction and importance scores enables the identification of institutional strengths (areas of high importance and high satisfaction) and institutional challenges (areas of high importance and low satisfaction). Strengths can be celebrated, and challenges can be addressed by campus leadership to build on the good where possible and to reinforce other areas where needed.
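
    As a concrete (and entirely hypothetical) illustration of that logic, the sketch below flags high-importance items as strengths or challenges depending on their satisfaction score. The item names, scores and cut-offs are invented; the published instrument uses its own scales and norms.

    ```python
    # Minimal sketch of the importance/satisfaction logic described above: items that score
    # high on importance are flagged as strengths (high satisfaction) or challenges (low
    # satisfaction). Item names, scores, and thresholds are invented for illustration.

    items = {
        "Quality of instruction": {"importance": 6.5, "satisfaction": 5.9},
        "Academic advising":      {"importance": 6.3, "satisfaction": 4.8},
        "Campus parking":         {"importance": 4.1, "satisfaction": 3.5},
    }

    IMPORTANCE_CUTOFF = 6.0    # "high importance" threshold (assumed)
    SATISFACTION_CUTOFF = 5.5  # "high satisfaction" threshold (assumed)

    for name, scores in items.items():
        if scores["importance"] >= IMPORTANCE_CUTOFF:
            label = "strength" if scores["satisfaction"] >= SATISFACTION_CUTOFF else "challenge"
            print(f"{name}: {label}")
    # Quality of instruction: strength
    # Academic advising: challenge
    ```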

    With the survey implementation, all currently enrolled students (based on who the institution wants to include) can provide feedback on their experiences with instruction, advising, registration, recruitment/financial aid, support services, and how they feel as a student at the institution. The results deliver external benchmarks against other institutions serving adult learners, including data specific to graduate programs, and the ability to monitor internal benchmarks when the survey is administered over multiple years. (The national student satisfaction results are published annually.) The delivered results also provide the option to analyze subset data for all standard and customizable demographic indicators, to understand where targeted initiatives may be required to best serve student populations.

    Connecting ASPS data to student success and retention

    Like the Student Satisfaction Inventory and the Priorities Survey for Online Learners (the other survey instruments in the Satisfaction-Priorities family), the data gathered by the Adult Student Priorities Survey can support multiple initiatives on campus, including informing student success efforts, providing the student voice for strategic planning, documenting priorities for accreditation purposes and highlighting positive messaging for recruitment activities. Student satisfaction has been positively linked with higher individual student retention and higher institutional graduation rates, getting right to the heart of higher education student success.

    Learn more about best practices for administering the online Adult Student Priorities Survey at your institution, which can be done any time during the academic year on the institution’s timeline.

    Ask for a complimentary consultation with our student success experts

    What is your best approach to increasing student retention and completion? Our experts can help you identify roadblocks to student persistence and maximize student progression. Reach out to set up a time to talk.

    Request now

    Source link

  • Decoding College Student Motivational Data:

    Two institutions, The College of New Jersey (TCNJ) and SUNY Morrisville, shared their experiences implementing the College Student Inventory (CSI) during a webinar I hosted. Both institutions found the CSI valuable for identifying at-risk students, gauging their willingness to accept help, and connecting them with relevant campus resources.

    The College of New Jersey (TCNJ)

    Jamel T. Johnson, director of the office of mentoring, retention, and success programs, spearheaded a campuswide implementation of the CSI in 2025, building on their previous use within the Educational Opportunity Fund program. Johnson aimed to increase completion rates from approximately 70% to 100%. They achieved a remarkable 93.7% completion rate and are now analyzing the data to inform targeted interventions and partnerships across campus. Johnson’s focus is on understanding the data gleaned from the CSI to inform broader campus initiatives, signaling an ongoing process of implementation and refinement. As Johnson stated, “We’re excited about what we have seen, and we’re excited about where we’re going to be going with the assessment.” 

    The CSI’s Overall Risk Index flagged commuter students as an area of concern for Johnson. He was able to get this data in front of a team within the student affairs division whose core task is to support commuter students. “We’ve met with them and now they’re deploying different efforts to meet the needs based upon what we have seen.” Johnson is set to administer the Mid-Year Student Assessment (MYSA) and will use the data to further their efforts for commuter students.

    When asked, “What types of early intervention strategies have you found to be most effective?” Johnson answered with the phrase “conversation versus correction,” again emphasizing that the CSI is not an aptitude test. Johnson did not want correction and score talk to be the first interaction his students had with his staff.

    Johnson emphasized the importance of stakeholders seeing themselves reflected in the data when discussing campus collaboration. When a campus fosters collaboration and effectively utilizes its data, the positive impact on students becomes evident.

    SUNY Morrisville

    Brenda Oursler-White, director of assessment and accreditation and interim dean for the School of Liberal Arts, Science, and Society, implemented the CSI in fall 2023 to improve first-time, full-time student retention rates. Completion rates increased significantly, rising from 73% in fall 2024 to 85.3% in fall 2025. Oursler-White attributes this success to student engagement, clear messaging about the benefits of the assessment, and connecting students to resources based on their results.

    SUNY Morrisville’s success was partly driven by showcasing the tangible benefits of completing the CSI, specifically the increased likelihood of returning for the spring semester compared to students who didn’t participate. Oursler-White stated, “The College Student Inventory isn’t like magic wand, meaning if you complete it, you’re going to be successful. They still have to put in the work.” With a target of improving first-time, full-time student retention rates, she said a key challenge was securing buy-in from faculty, staff, administration, and students.

    When asked, “What types of early intervention strategies have you found to be most effective?” Oursler-White’s response was similar to Johnson’s. She put an emphasis on using the word “ranking” rather than “score” and on working with the student to interpret their results. One student saw 65% and thought of it as a letter grade, when in reality they were above the national norm, at the 65th percentile. It was important to have clear communication and to allow the student to learn more about themselves while building a relationship and a sense of belonging. Oursler-White took it upon herself to hand out over 600 student reports, meeting students in their classrooms to work hand-in-hand through their results and next steps.

    Boost student success through motivational assessment

    We are grateful to these two campuses for sharing their experiences to assist others with understanding how the data can best be utilized on campus. If you are interested in learning more, download the webinar recording.

    To explore next steps and discover how the College Student Inventory (CSI) can impact retention and student success efforts, ask for a walkthrough or reach out to me via email.

    Source link

  • AI is unlocking insights from PTES to drive enhancement of the PGT experience faster than ever before

    If, like me, you grew up watching Looney Tunes cartoons, you may remember Yosemite Sam’s popular phrase, “There’s gold in them thar hills.”

    In surveys, as in gold mining, the greatest riches are often hidden and difficult to extract. This principle is perhaps especially true when institutions are seeking to enhance the postgraduate taught (PGT) student experience.

    PGT students are far more than an extension of the undergraduate community; they represent a crucial, diverse and financially significant segment of the student body. Yet, despite their growing numbers and increasing strategic importance, PGT students, as Kelly Edmunds and Kate Strudwick have recently pointed out on Wonkhe, remain largely invisible in both published research and core institutional strategy.

    Advance HE’s Postgraduate Taught Experience Survey (PTES) is therefore one of the few critical sources of insight we have into the PGT experience. But while the quantitative results offer a (usually fairly consistent) high-level view, the real intelligence required to drive meaningful enhancement inside higher education institutions is buried deep within the thousands of open-text comments collected. Faced with the sheer volume of data, the choice is between eyeball scanning, with the inevitable introduction of human bias, or laborious and time-consuming manual coding. The challenge for the institutions participating in PTES this year isn’t the lack of data: it’s efficiently and reliably turning that dense, often contradictory, qualitative data into actionable, ethical, and equitable insights.

    AI to the rescue

    The application of machine learning AI technology to analysis of qualitative student survey data presents us with a generational opportunity to amplify the student voice. The critical question is not whether AI should be used, but how to ensure its use meets robust and ethical standards. For that you need the right process – and the right partner – to prioritise analytical substance, comprehensiveness, and sector-specific nuance.

    UK HE training is non-negotiable. AI models must be deeply trained on a vast corpus of UK HE student comments. Without this sector-specific training, analysis will fail to accurately interpret the nuances of student language, sector jargon, and UK-specific feedback patterns.

    Analysis must rely on a categorisation structure that has been developed and refined against multiple years of PTES data. This continuity ensures that the thematic framework reflects the nuances of the PGT experience.

    To drive targeted enhancement, the model must break down feedback into highly granular sub-themes – moving far beyond simplistic buckets – ensuring staff can pinpoint the exact issue, whether it falls under learning resources, assessment feedback, or thesis supervision.

    The analysis must be more than a static report. It must be delivered through integrated dashboard solutions that allow institutions to filter, drill down, and cross-reference the qualitative findings with demographic and discipline data. Only this level of flexibility enables staff to take equitable and targeted enhancement actions across their diverse PGT cohorts.

    When these principles are prioritised, the result is an analytical framework specifically designed to meet the rigour and complexity required by the sector.

    The partnership between Advance HE, evasys, and Student Voice AI, which analysed this year’s PTES data, demonstrates what is possible when these rigorous standards are prioritised. We have offered participating institutions a comprehensive service that analyses open comments alongside the detailed benchmarking reports that Advance HE already provides. This collaboration has successfully built an analytical framework that exemplifies how sector-trained AI can deliver high-confidence, actionable intelligence.

    Jonathan Neves, Head of Research and Surveys at Advance HE, calls our solution “customised, transparent and genuinely focused on improving the student experience,” and adds, “We’re particularly impressed by how they present the data visually and look forward to seeing results from using these specialised tools in tandem.”

    Substance uber alles

    The commitment to analytical substance is paramount; without it, the risk to institutional resources and equity is severe. If institutions are to derive value, the analysis must be comprehensive. When the analysis lacks this depth, institutional resources are wasted acting on partial or misleading evidence.

    Rigorous analysis requires minimising what we call data leakage: the systematic failure to capture or categorise substantive feedback. Consider the alternative: when large percentages of feedback are ignored or left uncategorised, institutions are effectively muting a significant portion of the student voice. Or when a third of the remaining data is lumped into meaningless buckets like “other,” staff are left without actionable insight, forced to manually review thousands of comments to find the true issues.

    This is the point where the qualitative data, intended to unlock enhancement, becomes unusable for quality assurance. The result is not just a flawed report, but the failure to deliver equitable enhancement for the cohorts whose voices were lost in the analytical noise.

    Reliable, comprehensive processing is just the first step. The ultimate goal of AI analysis should be to deliver intelligence in a format that seamlessly integrates into strategic workflows. While impressive interfaces are visually appealing, genuine substance comes from the capacity to produce accurate, sector-relevant outputs. Institutions must be wary of solutions that offer a polished facade but deliver compromised analysis. Generic generative AI platforms, for example, offer the illusion of thematic analysis but are not robust.

    But robust validation of any output is still required. This is the danger of smoke and mirrors – attractive dashboards that simply mask a high degree of data leakage, where large volumes of valuable feedback are ignored, miscategorised or rendered unusable by failing to assign sentiment.

    Dig deep, act fast

    When institutions choose rigour, the outcomes are fundamentally different, built on a foundation of confidence. Analysis ensures that virtually every substantive PGT comment is allocated to one or more UK-derived categories, providing a clear thematic structure for enhancement planning.

    Every comment with substance is assigned both positive and negative sentiment, providing staff with the full, nuanced picture needed to build strategies that leverage strengths while addressing weaknesses.
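
    To illustrate the shape of that output – not the sector-trained model itself – here is a toy sketch in which each comment is allocated to zero or more themes, each theme assignment carries a sentiment, and anything left uncategorised is counted as leakage. The theme names, keyword rules and comments are all invented.

    ```python
    # Illustrative sketch only: a toy keyword-based categoriser standing in for the
    # sector-trained model described above. It shows the shape of the output (each
    # substantive comment allocated to one or more themes, each carrying a sentiment)
    # and a simple "data leakage" check. Theme names and keyword rules are invented.

    THEME_KEYWORDS = {
        "assessment feedback": ["feedback", "marking"],
        "thesis supervision":  ["supervisor", "supervision"],
        "learning resources":  ["library", "resources"],
    }
    NEGATIVE_CUES = ["slow", "late", "poor", "unclear", "never"]

    def categorise(comment: str):
        """Return a list of (theme, sentiment) pairs for one open-text comment."""
        text = comment.lower()
        sentiment = "negative" if any(cue in text for cue in NEGATIVE_CUES) else "positive"
        return [(theme, sentiment) for theme, keys in THEME_KEYWORDS.items()
                if any(k in text for k in keys)]

    comments = [
        "My supervisor replies quickly and supervision meetings are genuinely useful.",
        "Assessment feedback was late and the marking criteria were unclear.",
        "The timetable changed three times in week one.",   # falls outside the toy themes
    ]

    results = [categorise(c) for c in comments]
    leakage = sum(1 for r in results if not r) / len(comments)
    print(results)
    print(f"Uncategorised share: {leakage:.0%}")  # the 'data leakage' a rigorous pipeline minimises
    ```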

    This shift from raw data to actionable intelligence allows institutions to move quickly from insight to action. As Parama Chaudhury, Pro-Vice Provost (Education – Student Academic Experience) at UCL noted, the speed and quality of this approach “really helped us to get the qualitative results alongside the quantitative ones and encourage departmental colleagues to use the two in conjunction to start their work on quality enhancement.”

    The capacity to produce accurate, sector-relevant outputs, driven by rigorous processing, is what truly unlocks strategic value. Converting complex data tables into readable narrative summaries for each theme allows academic and professional services leaders alike to immediately grasp the findings and move to action. The ability to access categorised data via flexible dashboards and in exportable formats ensures the analysis is useful for every level of institutional planning, from the department to the executive team. And providing sector benchmark reports allows institutions to understand their performance relative to peers, turning internal data into external intelligence.

    The postgraduate taught experience is a critical pillar of UK higher education. The PTES data confirms the challenge, but the true opportunity lies in how institutions choose to interpret the wealth of student feedback they receive. The sheer volume of PGT feedback combined with the ethical imperative to deliver equitable enhancement for all students demands analytical rigour that is complete, nuanced, and sector-specific.

    This means shifting the focus from simply collecting data to intelligently translating the student voice into strategic priorities. When institutions insist on this level of analytical integrity, they move past the risk of smoke and mirrors and gain the confidence to act fast and decisively.

    It turns out Yosemite Sam was right all along: there’s gold in them thar hills. But finding it requires more than just a map; it requires the right analytical tools and rigour to finally extract that valuable resource and forge it into meaningful institutional change.

    This article is published in association with evasys. evasys and Student Voice AI are offering no-cost advanced analysis of NSS open comments delivering comprehensive categorisation and sentiment analysis, secure dashboard to view results and a sector benchmark report. Click here to find out more and request your free analysis.

    Source link

  • REF should be about technical professionals too

    Every great discovery begins long before a headline or journal article.

    Behind every experiment, dataset, and lecture lies a community of highly skilled technical professionals, technologists, facility managers, and infrastructure specialists. They design and maintain the systems that make research work, train others to use complex equipment, and ensure data integrity and reproducibility. Yet their contribution has too often been invisible in how we assess and reward research excellence.

    The pause in the Research Excellence Framework (REF) is more than a scheduling adjustment; it’s a moment to reflect on what we value within the UK research and innovation sector.

    If we are serious about supporting excellence, we must recognise all those who make it possible, not just those whose names appear on papers or grants, but the whole team, including technical professionals whose expertise enables every discovery.

    Making people visible in research culture

    Over the past decade, there has been growing recognition that research culture – including visibility, recognition, and support for technical professionals – is central to delivering world-class outcomes. Initiatives such as the Technician Commitment, now backed by more than 140 universities and research institutes, have led the way in embedding good practice around technical professional careers, progression, and recognition.

    Alongside this, the UK Institute for Technical Skills and Strategy (UK ITSS) continues to advocate for technical professionals nationally to ensure they are visible and their inputs are recognised within the UK’s research and innovation system. These developments have helped reshape how universities think about people, culture, and environment, creating the conditions where all contributors to research and innovation can thrive.

    A national capability – not a hidden workforce

    This shift is not just about fairness or inclusion; it’s about the UK’s ability to deliver on its strategic ambitions. Technical professionals are critical to achieving the goals set out in the UK Government’s Modern Industrial Strategy and to the success of frontier technologies such as artificial intelligence, quantum, engineering biology, advanced connectivity, and semiconductors. These frontier sectors rely on technical specialists to design, operate, and maintain the underpinning infrastructure on which research and innovation depend.

    Without a stable, well-supported technical professional workforce, the UK risks losing the very capacity it needs to remain globally competitive. Attracting, training, and retaining this talent necessitates that technical roles are visible and recognised – not treated as peripheral to research, but as essential to it.

    Why REF matters

    This is where the People, Culture and Environment (PCE) element of the REF becomes critical. REF has always shaped behaviour across the sector. Its weighting signals what the UK values in research and innovation. Some have argued that PCE should be reduced (or indeed removed) to simplify the REF process, ease administrative burden, or avoid what they see as subjectivity in the assessment of research culture. Others have suggested a greater emphasis on environment would shift focus away from research excellence, or that culture work is too challenging to assess consistently across institutions. But these arguments overlook something fundamental: the quality of our research – the excellence we deliver as a sector – is intrinsically tied to the conditions in which it is produced. As such, reducing the weighting of PCE would send a contradictory message: that culture, collaboration, and support for people are secondary to outputs rather than two sides of the same coin.

    The Stern Review and the Future Research Assessment Programme both recognised the need for a greater focus on research and innovation environments. PCE is not an optional extra; it is fundamental to research integrity, innovation, and excellence. A justifiably robust weighting reflects this reality and gives institutions the incentive to continue investing in healthy, supportive, and inclusive environments.

    Universities have already made significant progress on this by developing new data systems, engaging staff, and benchmarking culture change. There is clear evidence that the proposed PCE focus has driven positive shifts in institutional behaviour. To step away from this now would risk undoing that progress and undermine the growing recognition of technical professionals as central to research and innovation success.

    Including technical professionals explicitly within REF delivers real benefits for both technical professionals and their institutions, and ultimately strengthens research excellence. For technicians, recognition within the PCE element encourages universities to create the kind of environments in which they can thrive – cultures that value their expertise, provide clearer career pathways, invest in skills, and ensure they have the support and infrastructure to contribute fully to research. Crucially, REF 2029 also enables institutions to submit outputs led by technical colleagues, recognising their role in developing methods, tools, data, and innovations that directly advance knowledge.

    For universities, embedding this broader community within PCE strengthens the systems REF is designed to assess. It drives safer, more efficient and sustainable facilities, improves data quality and integrity, and fosters collaborative, well-supported research environments. By incentivising investment in skilled, stable, and empowered technical teams, the inclusion of technicians enhances the reliability, reproducibility, and innovation potential of research – ultimately raising the standard of research excellence across the institution.

    From hidden to central

    REF has the power not only to measure excellence, but to shape it. By maintaining a strong focus on people and culture, it can encourage institutions to build the frameworks, leadership roles, and recognition mechanisms that enable all contributors, whether technical, academic, or professional, to contribute and excel.

    In doing so, REF can help normalise good practice, embed openness and transparency, and ensure that the environments underpinning discovery are as innovative and excellence driven as the research itself.

    Technical professionals have always been at the heart of UK research. Their skill, creativity, and dedication underpin every discovery, innovation, and breakthrough. What’s changing now is visibility. Their contribution is increasingly recognised and celebrated as foundational to research excellence and national capability.

    As REF evolves, it must continue to reward the environments that nurture, develop, and sustain technical expertise. In doing so, it can help ensure that technical professionals are not just acknowledged but firmly established at the centre of the UK’s research and innovation system – visible, recognised, and vital (as ever) to its future success.

    Source link

  • New HEPI and the University of Central Lancashire Report: Student Working Lives

    Author:
    Professor Adrian Wright, Dr Mark Wilding, Mary Lawler and Martin Lowe

    A new major report from HEPI and the University of Central Lancashire reveals the realities of UK student life and highlights how paid work is increasingly an everyday part of the student experience.

    Student Working Lives (HEPI Report 195), written by Professor Adrian Wright, Dr Mark Wilding, Mary Lawler and Martin Lowe, draws on extensive research to show how students are juggling study, employment and caring responsibilities in the midst of a deepening cost-of-living crisis. The findings paint a striking picture of students for whom paid work has become a necessity, not a choice. The findings suggest two-thirds of students work to cover their basic living costs, and 26% work to support their families.

    The report looks at the type of work students are employed in, as well as the impact this has on their study. It calls for systemic reform across the sector to design a higher education that moves away from assuming a full-time residential model and supports the realities of students’ lives.

    You can read the press release and access the full report here.

    Source link

  • McMahon Says ED Agreements Are Temporary

    To Education Secretary Linda McMahon, outsourcing education-related grant programs to other federal departments is just a “proof of concept” for her larger goal—closing the 45-year-old agency.

    “Let’s move programs out on a temporary basis. Let’s see how the work is done. What is the result? What is the outcome?” she said in an all-staff meeting at the department Tuesday, shortly after publicly announcing six interagency agreements. “And if it has worked and we have proven that this is the best way to do it, then we’ll ask Congress to codify this and make it a permanent move.” (The meeting was closed to the public. All quotes are pulled from a recording obtained by Inside Higher Ed.)

    In 20 minutes, the secretary explained her plan and the framework through which she hopes her employees and the nation will view it.

    “We are not talking about shutting down the Department of Education. We are talking about returning education to states where it belongs,” she said. “That is the right messaging.”

    McMahon cited polling that she said showed that while the public doesn’t support shutting down ED, respondents are more supportive when they hear the plan still preserves ED’s programs by sending them to other agencies.

    A restructuring like the one in Tuesday’s announcement has been rumored for months, and the changes mirror recommendations outlined in Project 2025—a conservative blueprint that called for closing ED. (The education section of Project 2025 was spearheaded by Lindsey Burke, who is now the department’s deputy chief of staff for policy and programs.)

    To advance President Trump’s goal of shuttering the agency, McMahon has previously shipped career and technical education programs to the Department of Labor and laid off nearly half of her staff.

    But while the secretary said she understands the “unrest” and “uncertainty” the reductions in force have caused and stressed that they were hard decisions made with the “greatest of thought and care,” she stood firm on her belief that they were necessary.

    “I applaud and appreciate everything that every one of you in this room is doing and has done over the years,” she said. “I’m not saying to any one of you that your efforts aren’t good enough—what I’m saying is the policies behind those efforts have not been good enough.”

    McMahon then argued that the first agreement reached earlier this year with Labor has paid off.

    By co-managing, “we can be more efficient and economical,” she said. “For instance, we’ve utilized Labor’s system now on grant drawdowns, and we’ve drawn down over 500 already, and they work very proficiently. It’s a better system than we had here.”

    Although some conservatives praised the administration’s actions, others cast doubt on their magnitude or argued they were distracting attention from what really matters. For Margaret Spellings, former education secretary under President George W. Bush, that’s the “economic emergency” of improving student outcomes.

    “Moving programs from one department to another does not actually eliminate the federal bureaucracy, and it may make the system harder for students, teachers and families to navigate and get the support they need,” she said. “We need to keep the main thing the main thing, and that is how to improve education and outcomes for all students.”

    McMahon, on the other hand, told employees that this move is key to doing just that.

    “We want to make sure that [students] understand there are many opportunities for them … that there are programs that will give them a great livelihood, whether they want to be electricians or doctors or Indian chiefs,” she said. “We are not closing education; we are lifting education up.”

    Source link

  • Higher Education Labor United ("HELU") November 2025 Report

    November 2025 HELU Chair’s Message

    Billionaires and the ultra-wealthy have no place in setting the future agenda for higher ed. We – the students, community members, and workers who actually make the campus work – do.

    Upcoming Events:

    From the Blog:

    In Michigan, the MI HELU coalition decided that we wanted to get ahead of the curve by providing candidates with a forum that focused exclusively on Higher Education and the challenges we are facing.

    Together, we’re fighting back against the demonization of higher ed and we won’t cave to governmental bullying to water down our education system with the goal of elimination. Our students deserve better, and so do we.

    Founded in 2020 during the initial phase of the COVID-19 pandemic, Scholars for a New Deal for Higher Education (SNDHE) is a group of teachers and researchers committed to rebuilding our colleges and universities so that they can be a true public resource for everyone.

    And now [New York is] being punished by a federal government that sees organized labor, public education, and social investment as threats instead of strengths.

    Public protest and influencing public opinion are keeping UCW (CWA Local 3821) busy. Members have been fighting fiercely to Defend Remote Work at their state institutions.

    Want to support our work? Make a contribution.

    We invite you to support HELU’s work by making a direct financial contribution. While HELU’s main source of income is solidarity pledges from member organizations, these funds from individuals help us to grow capacity as we work to align the higher ed labor movement.

    Source link

  • University of Nebraska-Lincoln faculty vote no confidence in chancellor

    Dive Brief:

    • University of Nebraska-Lincoln’s faculty senate on Tuesday passed a no-confidence resolution in the public institution’s chancellor, Rodney Bennett, in part over allegations of poor leadership and financial management. 
    • In a 60-14 vote, faculty approved a measure calling for Bennett’s removal and formally stating no-confidence over allegations of “failures in strategic leadership, fiscal stewardship, governance integrity, external relations, and personnel management.”
    • The no-confidence vote — UNL’s first — follows fierce debate at the university over Bennett’s plan to cut a handful of academic programs as part of a broader effort to slash $27.5 million from UNL’s budget.

    Dive Insight:

    The no-confidence resolution reflects faculty pushback against Bennett since September, when the chancellor unveiled a proposal to slash six programs — which he later reduced to four — as part of a budget-reduction plan.  

    Criticisms have focused largely on what faculty say is a lack of transparency about how, precisely, programs were judged worthy of keeping or cutting. They also allege that Bennett, who joined UNL as chancellor in 2023, has largely failed to include faculty in the decision-making process. 

    The budget process and timeline precluded “meaningful faculty and departmental leadership consultation” and “undermines the possibility of completing a thorough review of evidence, consequences, and public comments,” according to a Nov. 3 memo the faculty senate circulated ahead of the no-confidence resolution. 

    As to the timeline, Bennett announced his initial proposal on Sept. 12, and roughly two months later issued his final recommendation, which the University of Nebraska System’s regents plan to vote on at a Dec. 5 meeting.

    The memo also questioned Bennett’s approach to reducing UNL’s deficit, saying that his plan relies on “immediate cost-reductions and across-the-board cuts rather than multi-year fiscal modeling or revenue diversification.”

    “This system is a $3.5 to $4 billion enterprise, and we are damaging it for $27.5 million,” Faculty Senate President John Shrader said in prepared remarks at a Nov. 4 meeting. “These cuts are going to be devastating to this campus. So damaging to be irreparable.”

    The memo further said Bennett had been “noticeably absent” from several faculty senate meetings and accused him of having periods of sparse contact with the senate’s executive committee, despite UNL bylaws calling for him to meet twice a month with the panel.

    “Faculty shared governance represents one of many voices of institutions of higher education,” University of Nebraska System President Jeffrey Gold said in a statement emailed Wednesday. “We value the voice of UNL’s faculty; however, ultimate decisions rest with the Board of Regents.”

    A UNL spokesperson said Wednesday that Bennett does not plan to comment on the no-confidence vote.

    In October, an academic advisory body of faculty, staff, students and administrators tasked with reviewing Bennett’s plan called for more time to consider alternatives to ending programs and voted against winding down four of the six programs Bennett originally put forward for closure.

    Bennett’s final plan spares two programs that were on the chopping block but still includes two others that the Academic Planning Committee voted against eliminating.

    “None of us want to be in this space, where the decisions we must make will inevitably impact the lives of individuals and change how we do some things on campus,” Bennett said in November when announcing his final proposal. “However, our reality is that UNL’s expenses have been greater than its revenue for many years.”

    The proposal would slash UNL’s statistics, educational administration, Earth and atmospheric sciences, and textile, merchandising and fashion design programs. 

    UNL’s chapter of the American Association of University Professors, which has actively opposed the cuts, lauded Tuesday’s no-confidence vote by the faculty senate. 

    “The faculty has made clear that this chancellor does not have what it takes to lead our flagship institution,” UNL AAUP President Sarah Zuckerman, who is an educational administration professor at the university, said in a statement Tuesday. “We will not accept a lack of transparency, the exclusion of faculty from decision-making, or the erosion of our university’s 156-year-old mission to educate Nebraska’s students.”

    Source link

  • He spent 37 days in jail for a Facebook post — now FIRE has his back

    A 61-year-old Tennessee man is finally free after spending a shocking 37 days in jail — all for posting a meme. 

    Retired police officer Larry Bushart told a local radio station he’s “very happy to be going home” after his nightmarish ordeal. 

    But for Larry and FIRE, the fight isn’t over.

    In September, after Charlie Kirk’s assassination, Larry shared a meme on a Facebook thread about a vigil in Perry County, Tennessee. The meme quoted President Donald Trump saying, “We have to get over it” following a January 2024 school shooting at Perry High School in Iowa. The meme included the commentary, “This seems relevant today …”

    The meme that Larry Bushart shared on Facebook.

    Just after 11 p.m. on Sept. 21, four officers came to Larry’s home, handcuffed him, and took him to jail. He was locked up for “threatening mass violence at a school.” His bond — an astronomical $2 million! 

    Police justified the arrest by saying that people took the meme as a threat to their high school, which has a similar name to the one where the school shooting occurred 20 months earlier. However, police have been unable to produce any evidence that members of the public took the meme as a threat. As The Intercept noted: “There were no public signs of this hysteria. Nor was there much evidence of an investigation—or any efforts to warn county schools.”

    Larry was jailed for more than five weeks. But that wasn’t the only thing he suffered. During that time, he lost his post-retirement job doing medical transportation and missed the birth of his granddaughter.

    Bushart during his arrest in September, Perry County, Tennessee.

    Prosecutors finally dropped the charges — only after the arrest went viral. Now a newly freed Larry, who spent over three decades with law enforcement and the Tennessee Department of Correction, is preparing to sue.

    “A free country does not dispatch police in the dead of night to pull people from their homes because a sheriff objects to their social media posts,” FIRE’s Adam Steinbaugh told The Washington Post. Now, FIRE is representing Larry to defend his rights — and yours.

    A meme doesn’t become a threat just because a sheriff says it is. In America, there are very few exceptions to the First Amendment, including true threats or incitement of imminent lawless action. 

    Jailing first, justifying later, flips those limits on their head. If officials can arrest you because they dislike your social media posts, then none of us are safe to express ourselves.

    Stay tuned for updates.

    Source link