Blog

  • 5 Steps to Update Assignments to Foster Critical Thinking and Authentic Learning in an AI Age – Faculty Focus

  • How can students’ module feedback help prepare for success in NSS?

    Since the dawn of student feedback there’s been a debate about the link between module feedback and the National Student Survey (NSS).

    Some institutions have historically doubled down on the idea that there is a read-across from the module learning experience to the student experience as captured by NSS and treated one as a kind of “dress rehearsal” for the other by asking the NSS questions in module feedback surveys.

    This approach arguably has some merits in that it sears the NSS questions into students’ minds to the point that when they show up in the actual NSS it doesn’t make their brains explode. It also has the benefit of simplicity – there’s no institutional debate about what module feedback should include or who should have control of it. If there isn’t a deep bench of skills in survey design in an institution there could be a case for adopting NSS questions on the grounds they have been carefully developed and exhaustively tested with students. Some NSS questions have sufficient relevance in the module context to do the job, even if there isn’t much nuance there – a generic question about teaching quality or assessment might resonate at both levels, but it can’t tell you much about specific pedagogic innovations or challenges in a particular module.

    However, there are good reasons not to take this “dress rehearsal” approach. NSS endeavours to capture the breadth of the student experience at a very high level, not the specific module experience. It’s debatable whether module feedback should even be trying to measure “experience” – there are other possible approaches, such as focusing on learning gains or skills development, especially if the goal is to generate actionable feedback data about specific module elements. For both students and academics, seeing the same set of questions repeated ad nauseam is really rather boring, and is as likely to create disengagement and alienation from the “experience” construct NSS proposes as a comforting sense of familiarity and predictability.

    But separating out the two feedback mechanisms entirely doesn’t make total sense either. Though the totemic status of NSS has been tempered in recent years it remains strategically important as an annual temperature check, as a nationally comparable dataset, as an indicator of quality for the Teaching Excellence Framework and, unfortunately, as a driver of league table position. Securing consistently good NSS scores, alongside student continuation and employability, will feature in most institutions’ key performance indicators and, while vice chancellors and boards will frequently exercise their critical judgement about what the data is actually telling them, when it comes to the crunch no head of institution or board wants to see their institution slip.

    Module feedback, therefore, offers an important “lead indicator” that can help institutions maximise the likelihood that students have the kind of experience that will prompt them to give positive NSS feedback – indeed, the ability to continually respond and adapt in light of feedback can often be a condition of simply sustaining existing performance. But if simply replicating the NSS questions at module level is not the answer, how can these links best be drawn? Wonkhe and evasys recently convened an exploratory Chatham House discussion with senior managers and leaders from across the sector to gather a range of perspectives on this complex issue. While success in NSS remains part of the picture for assigning value and meaning to module feedback in particular institutional contexts, there is a lot else going on as well.

    A question of purpose

    Module feedback can serve multiple purposes, and which of those purposes are considered legitimate is an open question that different institutions will answer differently. To give some examples, module feedback can:

    • Offer institutional leaders an institution-wide “snapshot” of comparable data that can indicate where there is a need for external intervention to tackle emerging problems in a course, module or department
    • Test and evaluate the impact of education enhancement initiatives at module, subject or even institution level, or capture progress with implementing systems, policies or strategies
    • Give professional service teams feedback on patterns of student engagement with and opinions on specific provision such as estates, IT, careers or library services
    • Give insight to module leaders about specific pedagogic and curriculum choices and how these were received by students to inform future module design
    • Give students the opportunity to reflect on their own learning journey and engagement
    • Generate evidence of teaching quality that academic staff can use to support promotion or inform fellowship applications
    • Depending on the timing, capture student sentiment and engagement and indicate where students may need additional support or whether something needs to be changed mid-module

    Needless to say, all of these purposes can be legitimate and worthwhile, but not all of them can comfortably coexist. Leaders may prioritise comparability – asking the same questions across all modules to generate comparable data and surface priorities. Similarly, those operating across an institution may be keen to map patterns and capture differences across subjects – one example offered at the round table was whether students had met with their personal tutor. Such questions may be experienced at department or module level as intrusive, and irrelevant next to more immediately purposeful questions about students’ learning experience on the module. Module leaders may want to design their own student evaluation questions, tailored to inform their pedagogic practice and future iterations of the module.

    There are also a lot of pragmatic and cultural considerations to navigate. Everyone is mindful that students get asked to feed back on their experiences A LOT – sometimes even before they have had much of a chance to actually have an experience. As students’ lives become more complicated, institutions are increasingly wary of the potential for cognitive overload that comes with being constantly asked for feedback. Additionally, institutions need to make their processes of gathering and acting on feedback visible to students, so that students can see there is an impact to sharing their views – and will confirm this when asked in the NSS. Some institutions are even building into their student surveys questions that test whether students can see the feedback loop being closed.

    Similarly, there is a strong appreciation of the need to adopt survey approaches that support and enable staff to take action and adapt their practice in response to feedback: this affects the design of the questions, the timing of the survey, how quickly staff can see the results, and the degree to which data is presented in a way that is accessible and digestible. For some, trusting staff to evaluate their modules in the way they see fit is a key tenet of recognising their professionalism and competence – but there is a trade-off in terms of visibility of data institution-wide, or even at department or subject level.

    Frameworks and ecosystems

    There are some examples in the sector of mature approaches to linking module evaluation data to NSS – it is possible to take a data-led approach that tests the correlation between particular module evaluation question responses and corresponding NSS question outcomes within particular thematic areas or categories, and builds a data model that proposes informed hypotheses about areas of priority for development or approaches that are most likely to drive NSS improvement. This approach does require strong data analysis capability, which not every institution has access to, but it certainly warrants further exploration where the skills are there. The use of a survey platform like evasys allows for the creation of large module evaluation datasets that could be mapped on to NSS results through business intelligence tools to look for trends and correlations that could indicate areas for further investigation.
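
    To make that concrete, here is a minimal sketch in Python of the kind of correlation test described above, assuming module evaluation and NSS results have already been aggregated to course level and exported. All file names, column names and question pairings here are hypothetical placeholders, not any institution’s actual data model:

    ```python
    import pandas as pd
    from scipy import stats

    # Hypothetical exports: one row per course, one column per question,
    # holding mean agreement scores.
    module_scores = pd.read_csv("module_eval_by_course.csv")
    nss_scores = pd.read_csv("nss_results_by_course.csv")

    merged = module_scores.merge(nss_scores, on="course_id")

    # Pair each module evaluation item with the NSS question in the same
    # thematic area. These pairings are illustrative, not an official mapping.
    question_pairs = [
        ("mod_q_teaching", "nss_q_teaching"),
        ("mod_q_assessment", "nss_q_assessment"),
        ("mod_q_feedback", "nss_q_feedback"),
    ]

    for mod_col, nss_col in question_pairs:
        r, p = stats.pearsonr(merged[mod_col], merged[nss_col])
        print(f"{mod_col} vs {nss_col}: r={r:.2f}, p={p:.3f}")
    ```

    A real analysis would also need to account for response rates, small cohorts and multiple comparisons before treating any correlation as a priority signal.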

    Others take the view that maximising NSS performance is something of a red herring as a goal in and of itself – if the wider student feedback system is working well, then the result should be solid NSS performance, assuming that NSS is basically measuring the right things at a high level. Some go even further and express concern that over-focus on NSS as an indicator of quality can be to the detriment of designing more authentic student voice ecosystems.

    But while thinking in terms of the whole system is clearly going to be more effective than a fragmented approach, given the various considerations and trade-offs discussed it is genuinely challenging for institutions to design such effective ecosystems. There is no “right way” to do it but there is an appetite to move module feedback beyond the simple assessment of what students like or don’t like, or the checking of straightforward hygiene factors, to become a meaningful tool for quality enhancement and pedagogic innovation. There is a sense that rather than drawing direct links between module feedback and NSS outcomes, institutions would value a framework-style approach that is able to accommodate the multiple actors and forms of value that are realised through student voice and feedback systems.

    In the coming academic year Wonkhe and evasys are planning to work with institutional partners on co-developing a framework or toolkit to integrate module feedback systems into wider student success and academic quality strategies – contact us to express interest in being involved.

    This article is published in association with evasys.

  • Higher education leadership is at an inflection point – we must transform, or be transformed

    At a recent “fireside chat” at a sector event, after I had outlined to those present some details of the transformational journey the University of East London (UEL) has been on in the past six years, one of those attending said to me: “Until UEL has produced Nobel Prize winners, you can’t say it has transformed.”

    While I chose not to address the comment immediately – the sharp intake of breath and rebuttals that followed from other colleagues present seemed enough at the time – it has played on my mind since.

    It wasn’t so much the comment’s narrow-mindedness that shocked, but the confidence with which it was delivered. Yet, looking at the ways in which we often celebrate and highlight sector success – through league tables, mission groups, or otherwise – it is little wonder my interlocutor felt so assured in his worldview.

    Value judgement

    This experience leads me to offer this provocation: as a sector, many of our metrics are failing us, and we must embrace the task of redefining value in 21st century higher education with increased seriousness.

    If you disagree, and feel that traditional proxies such as the number of Nobel Prizes awarded to an institution should continue to count as the bellwethers for quality, you may wish to pause and consider a few uncomfortable truths.

    Yes, the UK is a global leader in scientific excellence. But we are also among the worst in the OECD for translating that science into commercial or productivity gains. The UK is a leading global research hub, producing 57 per cent more academic publications than the US in per capita terms. Yet compared to the US, the UK lags significantly behind in development and scale-up metrics like business-funded R&D, patents, venture capital and unicorns.

    Universities have been strongly incentivised to increase research volume in recent years, but, as the outgoing chief executive of UKRI, Ottoline Leyser, recently posited to the Commons Science, Innovation and Technology Committee, do we need to address this relatively unstrategic expansion of research activity across a range of topics, detached from economic growth and national priorities? Our global rankings – built on proxies like Nobel Prizes – are celebrated, while our real-world economic outcomes stagnate. We excel in research, yet struggle in relevance. That disconnect comes at a cost.

    I recently contributed to a collection of essays on entrepreneurial university leadership, edited by Ceri Nursaw and published by HEPI – a collection that received a somewhat critical response in the pages of Research Professional, with the reviewer dismissing the notion of bold transformation on the basis that: “The avoidance of risk-taking is why universities have endured since the Middle Ages.”

    Yes. And the same mindset that preserved medieval institutions also kept them closed to women, divorced from industry, and indifferent to poverty for centuries. Longevity is not the same as leadership – and it’s time we stopped confusing the two. While we should all be rightfully proud of the great heritage of our sector, we’re at real risk of that pride choking progress at a critical inflection point.

    Lead or be led

    Universities UK chief executive Vivienne Stern’s recent keynote at the HEPI Annual Conference reminded us that higher education has evolved through tectonic shifts such as the industrial revolution’s technical institutes, the social revolution that admitted women, the 1960s “white heat” of technological change, and the rise of mass higher education.

    Now we are on the edge of the next seismic evolution. The question is: will the sector lead it, or be shaped by it? At the University of East London, we’ve chosen to lead by pressing ahead with a bold transformation built on a central premise that a careers-first approach can drive success in every part of the university – not on precedents that leave us scrambling for relevance in a changing world.

    Under this approach, we’ve achieved the UK’s fastest, most diversified, debt-free revenue growth. We’ve become an engine of inclusive enterprise, moving from 90th to 2nd in the UK for annual student start-ups in six years, with a more than 1,000 per cent increase in the survival of student-backed businesses. We’ve overseen a 25-point increase in positive graduate outcomes – the largest, fastest rise in graduate success – as well as ranking first in England for graduating students’ overall positivity. We use money like we use ideas: to close gaps, not widen them. To combat inequality, not entrench it.

    So, let me return to the Nobel Prize comment. The metrics that matter most to our economy and society, the achievements that tangibly improve lives, are not displayed in glass cabinets – rather those that matter most are felt every day by every member of our society. Recent polling shows what the public wants from growth: improved health and wellbeing, better education and skills, reduced trade barriers. Our government’s policy frameworks – from the industrial strategy to the AI strategy – depend on us as a sector to deliver those outcomes.

    Yet how well do our reputational rankings align with these national imperatives? How well does our regulatory framework reward the institutions that deliver on them? Are we optimising for prestige – or for purpose? We are living at a pivot point in history. The institutions that thrive through it will not be those that retreat into tradition. They will be those that rethink leadership, rewire purpose, and reinvent practice.

    Too much of higher education innovation is incremental; transformational innovation is rare. But it is happening – if we choose to see it, support it, and scale it. I urge others to join me in making the case for such a choice, because the next chapter of higher education will be written by those who act boldly now – or rewritten for those who don’t.

  • Why students reject second-language study – Campus Review

    Students are turning away from learning a language other than English because they don’t see it as a viable qualification, even though it is a core skill in other countries, experts have flagged.

  • Education research centre MCERA closes – Campus Review

    A not-for-profit research centre that provided media training for academics and disseminated education research to the public will close after eight years of operation.

  • University of Manchester VC – Campus Review

    Professor Duncan Ivison is vice-chancellor of the University of Manchester, the birthplace of the computer.

  • The Digital Twin: How to Connect and Enable Your Student Data for Outreach, Personalization, and Predictive Insights [Webinar]

    You’re sitting on mountains of student data scattered across CRMs, SIS, LMS, and advising tools. Systems don’t talk. Dashboards are disconnected. And AI? Not even close. Without connection, context, or clarity, that data is nothing more than a headache and a barrier to impact. 

    The Digital Twin: How to Connect and Enable Your Student Data for Outreach, Personalization, and Predictive Insights 
    Thursday, July 24 
    2:00 pm ET / 1:00 pm CT 

    In this webinar, Bryan Chitwood, Director of Data Enablement, breaks down how you can start building your students’ Digital Twin and turn your fragmented data into real-time, actionable intelligence. We’ll show you how unified student data profiles fuel more innovative outreach, personalized engagement, and predictive insights across the student lifecycle. 

    You’ll walk away knowing: 

    • How to connect siloed data sources into a unified, reliable student profile 
    • What a Digital Twin is and how it differs from your CRM or SIS data 
    • Use-cases for personalization, predictive engagement, and lifecycle outreach 
    • Real examples of how institutions are putting Digital Twins to work right now 

    If your campus is drowning in data but starving for strategy, this is the conversation you need. 
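
    To give a feel for what a unified profile means mechanically, here is a toy sketch of the underlying join in Python – every system, file and field name below is hypothetical, and this is emphatically not Collegis Education’s actual Digital Twin implementation:

    ```python
    import pandas as pd

    # Hypothetical per-system exports, each keyed by a shared student_id.
    crm = pd.read_csv("crm_contacts.csv")    # outreach and contact history
    sis = pd.read_csv("sis_enrollment.csv")  # program, credits, GPA
    lms = pd.read_csv("lms_activity.csv")    # logins, assignment submissions

    # One row per student, with fields from every system side by side.
    profile = (
        crm.merge(sis, on="student_id", how="outer")
           .merge(lms, on="student_id", how="outer")
    )

    # A simple derived signal: flag students with low recent LMS activity
    # for advising outreach (the threshold is arbitrary, for illustration).
    profile["needs_outreach"] = profile["logins_last_14d"].fillna(0) < 2
    print(profile.loc[profile["needs_outreach"], ["student_id", "program"]])
    ```

    In practice the hard part is identity resolution – real systems disagree about who a student is – so a production pipeline layers matching, deduplication and data-freshness rules on top of this naive join.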

    Who Should Attend: 

    If you are a data-minded decision-maker in higher ed or a cabinet-level leader being asked to do more with less, this webinar is for you. 

    • Presidents and Provosts 
    • VPs of Enrollment, Marketing, and Student Success 
    • Leaders charged with driving digital transformation and data-enabled decision making 

    Meet Your Presenter

    Bryan Chitwood

    Director of Data Enablement, Collegis Education

    Complete the registration form to reserve your spot! We look forward to seeing you on Thursday, July 24.

  • Higher Education Inquirer : IMPORTANT INFO for Sweet v Cardona (now Sweet v McMahon) CLASS

    Just dropping this IMPORTANT INFO from the DOE for Sweet v Cardona (now Sweet v McMahon) peeps who are CLASS – DECISION GROUPS and POST-CLASS.

    Edited To Add

    For the Decision Groups class, these are streamlined R and R (Revise and Resubmit) submissions.

    For post-class denials, you MUST ask the DOE for a reconsideration, which allows you to add additional evidence.

    Original Post:

    For REVISE and RESUBMIT (R and R) notices, the DOE is now saying that they WILL “disregard” R and R submissions if you EMAIL additional supporting documents or material. You CANNOT email the R and R back.

    You MUST submit a NEW BDTR APPLICATION and INCLUDE your previous BDTR application number, which can be found on the Denial letter.

    YOU HAVE 6 MONTHS TO RE-SUBMIT FROM THE RECEIPT OF THE R AND R NOTICE (here: https://studentaid.gov/borrower-defense/)

    The DOE states, “If you email supplemental information to the DOE or attempt to update your existing application, you will be treated as having failed to Revise and Resubmit”.

    ALSO, if you are still trying to add more evidence to your BDTR application this late in the game, you may want to wait for the decision letter to come out. We are reaching the Group 5 decision deadline, and Post-Class is 6 months after that. If you feel uneasy about your evidence, START collecting it now!

    Follow all DIRECTIONS on anything you get from the DOE relating to BDTR (except demanding payment, they can pound sand LOL).

    In Solidarity!!!

  • IPEDS Data Collection Schedule (US Department of Education)

    The IPEDS data collection calendar for 2025-26 has now been posted and is available within the Data Collection System’s (DCS) Help menu, and on the DCS login page at: https://surveys.nces.ed.gov/ipeds/public/data-collection-schedule

    What is IPEDS?

    IPEDS is the Integrated Postsecondary Education Data System. It is a system of interrelated surveys conducted annually by the U.S. Department of Education’s National Center for Education Statistics (NCES). IPEDS gathers information from every college, university, and technical and vocational institution that participates in the federal student financial aid programs. The Higher Education Act of 1965, as amended, requires that institutions that participate in federal student aid programs report data on enrollments, program completions, graduation rates, faculty and staff, finances, institutional prices, and student financial aid. These data are made available to students and parents through the College Navigator college search Web site and to researchers and others through the IPEDS Data Center. To learn more about IPEDS Survey components, visit https://nces.ed.gov/Ipeds/use-the-data/survey-components.

    How is IPEDS Used?

    IPEDS provides basic data needed to describe — and analyze trends in — postsecondary education in the United States, in terms of the numbers of students enrolled, staff employed, dollars expended, and degrees earned. Congress, federal agencies, state governments, education providers, professional associations, private businesses, media, students and parents, and others rely on IPEDS data for this basic information on postsecondary institutions.

    IPEDS forms the institutional sampling frame for other NCES postsecondary surveys, such as the National Postsecondary Student Aid Study.

    Which Institutions Report to IPEDS?

    The completion of all IPEDS surveys is mandatory for institutions that participate in or are applicants for participation in any federal student financial aid program (such as Pell grants and federal student loans) authorized by Title IV of the Higher Education Act of 1965, as amended (20 USC 1094, Section 487(a)(17) and 34 CFR 668.14(b)(19)).

    Institutions that complete IPEDS surveys each year include research universities, state colleges and universities, private religious and liberal arts colleges, for-profit institutions, community and technical colleges, non-degree-granting institutions such as beauty colleges, and others.

    To find out if a particular institution reports to IPEDS, go to College Navigator and search by the institution name.

    What Data are Collected in IPEDS?

    IPEDS collects data on postsecondary education in the United States in eight areas: institutional characteristics; institutional prices; admissions; enrollment; student financial aid; degrees and certificates conferred; student persistence and success; and institutional resources, including human resources, finance, and academic libraries.
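
    For analysts who want to work with the data directly, here is a small Python sketch of loading a file downloaded from the IPEDS Data Center. The file and column names below (hd2023.csv, UNITID, INSTNM, CITY, STABBR) follow the pattern of recent HD institutional-characteristics files but should be verified against the data dictionary for the collection year you download:

    ```python
    import pandas as pd

    # HD files carry institutional characteristics; IPEDS CSVs are often
    # encoded as Latin-1 rather than UTF-8.
    hd = pd.read_csv("hd2023.csv", encoding="latin-1")

    # Example: count reporting institutions by state.
    by_state = hd.groupby("STABBR")["UNITID"].count().sort_values(ascending=False)
    print(by_state.head(10))

    # Example: look up a single institution by name.
    print(hd.loc[hd["INSTNM"].str.contains("Michigan State", na=False),
                 ["UNITID", "INSTNM", "CITY", "STABBR"]])
    ```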

  • How the 2025 U.S. Department of Education Reorganization Fulfills Grover Norquist’s Dream (Glen McGhee)

    In 2001, conservative activist Grover Norquist declared that his goal was to shrink government “to the size where I can drag it into the bathroom and drown it in the bathtub.” More than two decades later, under the leadership of Secretary Linda McMahon, the U.S. Department of Education’s March 2025 reorganization delivers on that radical vision—not with fire and fury, but with vacancies, ambiguity, and quiet institutional collapse.

    Vacant Seats, Hollow Power

    With dozens of senior leadership roles left vacant, enforcement functions gutted, and policymaking handed over to political allies and industry insiders, the Department no longer resembles a federal agency tasked with protecting students and public investment. Instead, it has become a hollowed-out vessel primed for deregulation, privatization, and corporate exploitation.

    The new organizational chart is littered with the word “VACANT.” From Chiefs of Staff and Deputy Assistant Secretaries to senior advisors in enforcement, civil rights, and postsecondary education, entire divisions have been effectively immobilized. The Office of Civil Rights is barely staffed at the top. The Rehabilitation Services Administration is leaderless. The General Counsel’s office lacks oversight in key regulatory areas. This is not streamlining—it is strategic self-sabotage.

    Federal Student Aid (FSA), overseeing over $1.5 trillion in loans, is run by an acting chief. Critical offices such as the Office of Postsecondary Education (OPE) are fragmented, missing key leadership across multiple branches—especially those charged with accreditation, innovation, and borrower protections.

    The Kent Controversy: A Symptom of Systemic Rot

    The collapse of federal oversight is not only evident in the vacancies—it is also embodied in controversial political appointments. As education policy watchdog David Halperin has reported, the Trump administration’s nominee for Under Secretary of Education, Nicholas Kent, epitomizes the revolving door between the Department of Education and the for-profit college industry.

    Kent’s career includes roles at Education Affiliates, which in 2015 paid $13 million to settle a Department of Justice case involving false claims for federal student aid, and later at Career Education Colleges and Universities (CECU), the lobbying group for the for-profit college sector. Under Kent’s policy leadership at CECU, the organization actively fought against borrower defense rules, gainful employment regulations, and other safeguards meant to protect students from exploitative educational institutions.

    Despite this record, the Senate Health, Education, Labor and Pensions (HELP) Committee advanced Kent’s nomination on May 22, 2025, in a party-line 12–11 vote—without a hearing. HELP Ranking Member Bernie Sanders objected, saying, “In my view, we should not be confirming the former lobbyist that represented for-profit colleges.” Advocates, including Halperin and six education justice organizations, sent a letter to Chairman Bill Cassidy calling for public scrutiny of Kent’s background and the Trump administration’s destructive higher education agenda.

    Among their concerns are the elimination of key enforcement staff and research arms at the Department, the cancellation of ongoing research contracts, the rollback of borrower defense and gainful employment protections, the $37 million fine reversal against Grand Canyon University for deceptive practices, and the Department’s silence on accreditation reform and oversight of predatory schools. These developments, the letter argued, mark a decisive return to the era of unchecked corporate education—where taxpayer dollars are funneled to dubious institutions and students are left with mountains of debt and worthless credentials.

    “Mission Accomplished” for the Privatization Movement

    This version of the Department of Education, stripped of its regulatory muscle and stocked with industry sympathizers, is not an accident. It’s the culmination of decades of libertarian, neoliberal, and religious-right agitation to disempower public education. The policy pipeline now flows directly from organizations like the Heritage Foundation and ALEC to appointed officials with deep ties to the industries they were once charged with policing.

    Rather than serving the public, the department’s primary role now appears to be facilitating the private sector’s conquest of higher education—through deregulation, outsourcing, and the erosion of civil rights protections.

    A Shrinking Federal Presence, an Expanding Crisis

    The consequences are far-reaching. Marginalized students—Black, brown, low-income, first-generation, disabled—depend disproportionately on federal guarantees, oversight, and funding. As these protections recede, so too does their access to meaningful educational opportunity. Instead, they are increasingly funneled into high-debt, low-return programs or shut out entirely.

    Meanwhile, the political vacuum left by this strategic dismantling is being filled by corporate actors, right-wing religious institutions, and profit-seeking “ed-tech” startups. The dream of public education as a democratic equalizer is being replaced by a market of extraction and exploitation.

    The Dream Realized

    Grover Norquist’s fantasy of drowning the government has now been partially fulfilled in the U.S. Department of Education. What remains is an agency in name only—a shell that no longer enforces its core mission. In the name of efficiency and deregulation, the department has abandoned millions of students and ceded its authority to those who view education as a commodity rather than a public right.

    The danger now is not only what’s been lost, but what is being built in its place. The Higher Education Inquirer will continue to monitor the ongoing capture of education policy and fight for a system that serves students, not shareholders.

    Sources:

    U.S. Department of Education, Organizational Chart, March 17, 2025

    David Halperin, Republic Report, “The Senate Shouldn’t Vote on Trump Higher Education Pick without a Hearing”

    U.S. Department of Justice press releases on Education Affiliates

    Politico Pro Education updates, May 2025

    Senate HELP Committee voting record, May 22, 2025

    Heritage Foundation and CECU policy recommendations
