Tag: system

  • Designing the 2026 Classroom: Emerging Learning Trends in an AI-Powered Education System – Faculty Focus

    Source link

  • A more focused research system does not by itself solve structural deficits

    Financial pressures across the higher education sector have necessitated a closer look at the various incomes and associated costs of the research, teaching and operational streams. For years, larger institutions have relied upon the cross-subsidy of their research, primarily from overseas student fees – a subsidy that is under threat from changes in geopolitics and indeed our own UK policies on immigration and visa controls.

    The UK is now between a rock and a hard place: how can it support the volume and focus of research needed to grow the knowledge-based economy of our UK industrial strategy, while also addressing the financial deficits that even the existing levels of research create?

    Several research leaders have recently suggested that a more efficient research system is one where higher education institutions focus on their strengths and collaborate more. But while acknowledging that efficiency savings are required, and that the relentless growth of bureaucracy – partly imposed by government but also self-inflicted within the HEIs – can be addressed, the funding gulf is far wider than these savings could possibly bridge.

    Efficiency savings alone will not solve the scale of structural deficits in the system. Furthermore, given that grant application success rates are systemically below 20 per cent and frequently below ten or even five per cent, the sector is already funding only its strongest applications. Fundamentally, demand currently far outstrips supply, leading to inefficiency and poor prioritisation decisions.

    Since most of the research costs are those supporting the salaries and student stipends of the researchers themselves, significant cost-cutting necessitates a reduction in the size of the research workforce – a reduction that would fly in the face of our future workforce requirements. We could leave this inevitable reduction to market forces, but the resulting disinvestment will likely impact the resource-intensive subjects upon which much of our future economic growth depends.

    We recognise also that solutions cannot solely rely upon the public purse. So, what could we do now to improve both the efficiency of our state research spend and third-party investment into the system?

    What gets spent

    First of all, the chronic underfunding of the teaching of UK domestic students cannot continue, as it puts even further pressure on institutional resources. The recent index-linking of fees in England was a brave step to address this, but to maintain a viable UK research and innovation system, the other UK nations will also urgently need to address the underfunding of teaching. And in doing so we must remain mindful of the potential unintended consequences that increased fees might have on socio-economic exclusion.

    Second, paying a fair price for the research we do. Much has been made of the seemingly unrestricted “quality-related” funding (QR, or REG in Scotland) driven by the REF process. The reality is that QR simply makes good the missing component of research funding, which TRAC analysis now estimates covers less than 70 per cent of the true costs of the research.

    It ought to be noted that this missing component exists over all the recently announced research buckets extending across curiosity-driven, government-priority, and scale-up support. The government must recognise that QR is not purely the funding of discovery research, but rather it is the dual funding of research in general – and that the purpose of dual funding is to tension delivery models to ensure HEI efficiency of delivery.

    Next, there is a pressing need for UKRI to focus resource on the research most likely to lead to economic or societal benefit. This research spans all disciplines, from the hardest of sciences to the most creative of the arts.

    Although these claims are widely made within every grant proposal, perhaps the best evidence of their validity lies in the co-investment these applications attract. We note that schemes such as EPSRC’s prosperity partnerships and its quantum technology hubs show that, when packaged to encompass a range of technology readiness levels (TRL), industry is willing to support both low and high TRL research.

    We would propose that across UKRI more weighting be given to those applications supported by matching funds from industry or, in the case of societal impact, by government departments or charities. The next wave of matched co-funding of local industry-linked innovation should also privilege schemes which elicit genuine new industry investment, as opposed to in-kind funding, as envisaged in Local Innovation Partnership Funds. This avoids increasing research volume, which is already unsustainable.

    The research workforce

    In recent times, the UKRI budgets and funding schemes for research and training (largely support for doctoral students) have been separated from each other. This can mean that the work of doctoral students is separated from the cutting-edge research that they were once the enginehouse of delivering. This decoupling means that the research projects themselves now require allocated, and far more expensive, post-doctoral staff to deliver. We see nothing in the recent re-branding of doctoral support to “landscape” and “focal” awards that is set to change this disconnect.

    It should be acknowledged that centres for doctoral training were correctly introduced nearly 20 years ago to ensure our students were better trained and better supported – but we would argue that the sector has now moved on and graduate schools within our leading HEIs address these needs without need for duplication by doctoral centres.

    Our proposal would be that, except for a small number of specific areas and initiatives supported by centres of doctoral training (focal awards) and central to the UK’s skills need, the normal funding of UKRI-supported doctoral students should be associated with projects funded by UKRI or other sources external to higher education institutions. This may require the reassignment of recently pooled training resources back to the individual research councils, rebalanced to meet national needs.

    This last point leads to the question of what the right shape of the HEI-based research-focused workforce is. We would suggest that emphasis should be placed on increasing the number of graduate students – many of whom aspire to move on from the higher education sector after their graduation to join the wider workforce – rather than post-doctoral researchers who (regrettably) mistakenly see their appointment as a first step to a permanent role in a sector which is unlikely to grow.

    Post-doctoral researchers are of course vital to the delivery of some research projects and comprise the academic researchers of the future. Emerging research leaders should continue to be supported through, for example, future research leader fellowships, empowered to pursue their own research ambitions. This rebalancing of the research workforce will go some way to rebalancing supply and demand.

    Organisational change

    Higher education institutions are hotbeds of creativity and empowerment. However, typical departments have an imbalanced distribution of research resources where appointment and promotion criteria are linked to individual grant income. While not underestimating the important leadership roles this implies, we feel that research outcomes would be better delivered through internal collaborations of experienced researchers where team science brings complementary skills together in partnership rather than subservience.

    This change in emphasis requires institutions to consider their team structures and HR processes. It also requires funders to reflect these changes in their assessment criteria and selection panel working methods. Again, this rebalancing of the research workforce would go some way to addressing supply and demand while improving the delivery of the research we fund.

    None of these suggestions represent a quick fix for our financial pressures, which need to be addressed. But taken together we believe them to be a supportive step, helping stabilise the financial position of the sector, while ensuring its continuing contribution to the UK economy and society. If we fail to act, the UK risks a disorderly reduction of its research capability at precisely the moment our global competitors are accelerating.

    Source link

  • International students missing out under US Early Decision system

    Stakeholders are worried about the Early Decision (ED) system – where students apply early to their first-choice institution and, if admitted, are required to commit to attending. Although admission is not guaranteed, the common practice is that students must ‘lock in’ once accepted and withdraw all other applications, even in different countries.

    But with rising visa denials in Donald Trump’s United States, fears are rising that international students could be at an unfair disadvantage.

    Education consultant Elisabeth Marksteiner pointed out that even if a student applies for a visa as soon as they have been accepted by an institution, they could be denied in late August, with the semester due to start in early September.

    “Suddenly the student has no live applications anywhere in the entire world. There is no plan B – the whole point about ED is it takes out all insurance, effectively,” she told The PIE News.

    “There are some countries where we know it can be 11 months to get a visa appointment… there is no way that you are going to make it.”

    Advice from the National Association for College Admission Counseling (NACAC) on ED was updated in August to make it more specific and transparent for parents and school counselors alike.

    “The updates aim to ensure applicants, parents/guardians, and counselors fully understand the implications of an ED commitment under various possible scenarios,” it said.

    The practice has become a popular way for institutions to gauge their enrolment numbers ahead of time. And according to Marksteiner, enforcing binding ED agreements is a low-stakes approach for elite institutions – even if it means some international students won’t be able to take up their place.

    “The people who are most using ED are the ones at the top of the pile. They will always be able to fill their class,” she said.

    ED offers often use complicated wording and “legalese” that, according to Marksteiner, can leave parents and high schoolers feeling uneasy.

    “It seems to me that we have lost effectively our moral compass in holding ED agreements in the way that we do,” she explained.

    In September, Tulane made headlines when it slapped Colorado Academy with a one-year ban on ED applications after one of its students allegedly pulled out of an offer.

    However, some institutions are changing their policies to make sure that non-US applicants do not have to withdraw their applications from other parts of the world.

    Visa delays have been a persistent problem for US higher education institutions under the second Trump administration – part of an “escalating cascade” of attacks on international students, according to an address by Presidents’ Alliance CEO Miriam Feldblum at this week’s PIE Live North America conference in Chicago.

    Since taking office for the second time, President Trump has imposed a travel ban on 19 countries, enforced an immigration crackdown that has affected thousands of international students and suspended visa interviews across the world for several weeks – a move whose effects are still being felt.

    Source link

  • UC System Reverses Decision to End Incentives for Postdocs

    In a letter to system chancellors Tuesday, University of California system president James Milliken said he would not end financial support for hiring postdoctoral fellows out of the UC President’s Postdoctoral Fellowship Program. 

    A system spokesperson told Inside Higher Ed earlier this month that the UC office had decided to halt its hiring incentives of $85,000 per fellow, per year, beginning with fellows hired as full-time faculty after summer 2025.

    “Given the myriad challenges currently facing UC—including disruptions in billions of dollars in annual federal support, as well as uncertainty around the state budget—reasonable questions were raised in recent months about whether the University could maintain the commitment to current levels of incentive funding,” Milliken wrote in the Tuesday letter. 

    He said he considered a proposal to sunset the incentive program but ultimately decided against it. Still, he said, there may be some future changes to the program, including a potential cap on the number of incentives supported and changes to how they are distributed across system campuses. 

    “After learning more about the history and success of the program and weighing the thoughtful perspectives that have been shared, I have concluded that barring extraordinary financial setbacks, the PPFP faculty hiring incentive program will continue while the University continues to assess the program’s structure as well as its long-term financial sustainability.”

    Source link

  • Feds cannot withhold funding from UC system amid lawsuit, judge rules

    Dive Brief:

    • A federal judge on Friday issued a preliminary injunction barring the Trump administration from freezing the University of California system’s research funding as part of civil rights investigations. 
    • In a scathing ruling, U.S. District Judge Rita Lin found the administration’s actions unconstitutional, describing “a playbook of initiating civil rights investigations of preeminent universities to justify cutting off federal funding,” with the aim of “forcing them to change their ideological tune.”
    • While a lawsuit over the Trump administration’s actions is ongoing, Lin barred the federal government from using civil rights investigations to freeze UC grant money, condition its grants on any measure that would violate recipients’ speech rights, or seek fines and other money from the system.

    Dive Insight:

    In her ruling, Lin described a “three-stage playbook” that the Trump administration uses to target universities. First, an agency involved with the administration’s Task Force to Combat Anti-Semitism announces civil rights investigations or planned enforcement actions. Then, the administration issues mass grant cancellations without following legally mandated administrative procedures, Lin wrote.

    In the third stage, Lin said, the U.S. Department of Justice demands payment of millions or billions of dollars in addition to other policy changes in return for restored funding. A DOJ spokesperson on Monday declined to comment on the lawsuit. 

    In the case of UC, the judge ruled that plaintiffs — a coalition of faculty groups and unions, including the American Association of University Professors — provided “overwhelming evidence” of the administration’s “concerted campaign to purge ‘woke,’ ‘left,’ and ‘socialist’ viewpoints from our country’s leading universities.”

    “It is undisputed that this precise playbook is now being executed at the University of California,” wrote Lin, citing public statements by Leo Terrell, senior counsel in the DOJ’s civil rights wing and the head of the administration’s antisemitism task force. Terrell alleged that the UC system had been “hijacked by the left” and vowed to open investigations.

    The Trump administration did just that. In August, it froze $584 million in research funding at the University of California, Los Angeles after concluding that the institution violated civil rights law. It primarily cited UCLA’s decision to allow a 2024 pro-Palestinian protest encampment to remain on campus for almost a week before calling in the police. 

    The administration has sought a $1.2 billion penalty from UCLA to release the funds and settle the allegations. “The costs associated with this demand, if left to stand, would have far-reaching consequences,” Chancellor Julio Frenk said in a public message in August. 

    Lin noted in her Friday ruling that the administration also sought settlement terms “that had nothing to do with antisemitism,” including policy changes to how UCLA handles student protests, an adoption of the administration’s views on gender, and a review of its diversity, equity and inclusion programs.

    The administration’s campaign resulted in a significant and ongoing chilling of faculty’s actions, both in and out of the classroom, Lin said.

    In addition to teaching and conducting research differently, members of the plaintiff groups have also changed how they engage in public discourse and limited their participation in protest, Lin said. Faculty have self-censored on topics such as structural racism and scrubbed their websites of references to DEI out of fear of reprisal. 

    “These are classic, predictable First Amendment harms, and exactly what Defendants publicly said that they intended,” Lin concluded.

    While acknowledging the importance of combating antisemitism, Lin said the government was “silent on what actions UCLA took to address” antisemitism issues on its campus between May of 2024, when pro-Palestinian protesters established an encampment, and July 2025, when the DOJ concluded UCLA had violated civil rights law by not doing enough to protect Jewish students from harassment.

    As part of a separate lawsuit, Lin in September ordered the National Institutes of Health and other agencies to restore suspended grants to UCLA. 

    UCLA and the UC system are among several prominent universities similarly targeted by the federal government. At least five institutions so far have signed deals with the Trump administration to resolve federal civil investigations. The agreements brokered by Columbia, Brown and Cornell universities require each to pay millions of dollars to the federal government, to causes favored by the Trump administration, or both.

    Harvard University, on the other hand, has fought back against the administration’s tactics. After repeated federal attacks, accompanied by unprecedented ultimatums, the university sued the administration and successfully had the government’s $2.2 billion funding freeze against it reversed. The Trump administration has previously stated its intent to appeal. 

    Source link

  • Measuring What Matters: A Faculty Development System That Improves Teaching Quality – Faculty Focus

    Source link

  • Algorithms aren’t the problem. It’s the classification system they support

    The Office for Students (OfS) has published its annual analysis of sector-level degree classifications over time, and alongside it a report on Bachelors’ degree classification algorithms.

    The former is of the style (and with the faults) we’ve seen before. The latter is the controversial bit, both in the extent to which parts of it represent a “new” set of regulatory requirements, and in the “new” set of rules over what universities can and can’t do when calculating degree results.

    Elsewhere on the site my colleague David Kernohan tackles the regulation issue – the upshots of the “guidance” on the algorithms, including what it will expect universities to do both to algorithms in use now, and if a provider ever decides to revise them.

    Here I’m looking in detail at its judgements over two practices. Universities are, to all intents and purposes, being banned from any system which discounts credits with the lowest marks – a practice which the regulator says makes it difficult to demonstrate that awards reflect achievement.

    It’s also ruling out “best of” algorithm approaches – any universities that determine degree class by running multiple algorithms and selecting the one that gives the highest result will also have to cease doing so. Any provider still using these approaches by 31 July 2026 has to report itself to OfS.

    Powers and process do matter, as do questions as to whether this is new regulation, or merely a practical interpretation of existing rules. But here I’m concerned with the principle. Has OfS got a point? Do systems such as those described above amount to misleading people who look at degree results over what a student has achieved?

    More, not less

    A few months ago now on Radio 4’s More or Less, I was asked how Covid had impacted university students’ attainment. On a show driven by data, I was wary about admitting that as a whole, I think it would be fair to say that UK HE isn’t really sure.

    When in-person everything was cancelled back in 2020, universities scrambled to implement “no detriment” policies that promised students wouldn’t be disadvantaged by the disruption.

    Those policies took various forms – some guaranteed that classifications couldn’t fall below students’ pre-pandemic trajectory, others allowed students to select their best marks, and some excluded affected modules entirely.

    By 2021, more than a third of graduates were receiving first-class honours, compared to around 16 per cent a decade earlier – with ministers and OfS on the march over the risk of “baking in” the grade inflation.

    I found that pressure troubling at the time. It seemed to me that for a variety of reasons, providers may have, as a result of the pandemic, been confronting a range of faults with degree algorithms – for the students, courses and providers that we have now, it was the old algorithms that were the problem.

    But the other interesting thing for me was what those “safety net” policies revealed about the astonishing diversity of practice across the sector when it comes to working out the degree classification.

    For all of the comparison work done – including, in England, official metrics on the Access and Participation Dashboard over disparities in “good honours” awarding – I was wary about admitting to Radio 4’s listeners that it’s not just differences in teaching, assessment and curriculum that can drive someone getting a First here and a 2:2 up the road.

    When in-person teaching returned in 2022 and 2023, the question became what “returning to normal” actually meant. Many – under regulatory pressure not to “bake in” grade inflation – removed explicit no-detriment policies, and the proportion of firsts and upper seconds did ease slightly.

    But in many providers, many of the flexibilities introduced during Covid – around best-mark selection, module exclusions and borderline consideration – had made explicit and legitimate what was already implicit in many institutional frameworks. And many were kept.

    Now, in England, OfS is to all intents and purposes banning a couple of the key approaches that were deployed during Covid. For a sector that prizes its autonomy above almost everything else, that’ll trigger alarm.

    But a wider look at how universities actually calculate degree classifications reveals something – the current system embodies fundamentally different philosophies about what a degree represents, philosophies that produce systematically different outcomes for identical student performance, and philosophies that should not be written off lightly.

    What we found

    Building on David Allen’s exercise seven years ago, a couple of weeks ago I examined the publicly available degree classification regulations for more than 150 UK universities, trawling through academic handbooks, quality assurance documents and regulatory frameworks.

    The shock for the Radio 4 listener on the Clapham Omnibus would be that there is no standardised national system with minor variations, but there is a patchwork of fundamentally different approaches to calculating the same qualification.

    Almost every university claims to use the same framework for UG quals – the Quality Assurance Agency benchmarks, the Framework for Higher Education Qualifications and standard grade boundaries of 70 for a first, 60 for a 2:1, 50 for a 2:2 and 40 for a third. But underneath what looks like consistency there’s extraordinary diversity in how marks are then combined into final classifications.

    The variations cluster around a major divide. Some universities – predominantly but not exclusively in the Russell Group – operate on the principle that a degree classification should reflect the totality of your assessed work at higher levels. Every module (at least at Level 5 and 6) counts, every mark matters, and your classification is the weighted average of everything you did.

    Other universities – predominantly post-1992 institutions but with significant exceptions – take a different view. They appear to argue that a degree classification should represent your actual capability, demonstrated through your best work.

    Students encounter setbacks, personal difficulties and topics that don’t suit their strengths. Assessment should be about demonstrating competence, not punishing every misstep along a three-year journey.

    Neither philosophy is obviously wrong. The first prioritises consistency and comprehensiveness. The second prioritises fairness and recognition that learning isn’t linear. But they produce systematically different outcomes, and the current system does allow both to operate under the guise of a unified national framework.

    Five features that create flexibility

    Five structural features appear repeatedly across university algorithms, each pushing outcomes in one direction.

    1. Best-credit selection

    This first one has become widespread, particularly outside the Russell Group. Rather than using all module marks, many universities allow students to drop their worst performances.

    One uses the best 105 credits out of 120 at each of Levels 5 and 6. Another discards the lowest 20 credits automatically. A third takes only the best 90 credits at each level. Several others use the best 100 credits at each stage.

    The rationale is obvious – why should one difficult module or one difficult semester define an entire degree?

    But the consequence is equally obvious. A student who scores 75-75-75-75-55-55 across six modules averages 68.3 per cent. At universities where everything counts, that’s a 2:1. At universities using best-credit selection that drops the two 55s, it averages 75 – a clear first.
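
    To make the mechanics concrete, here is a minimal sketch in Python – illustrative only, assuming equal-credit modules, the standard 70/60/50/40 boundaries quoted above, and a hypothetical “drop the lowest two modules” rule – of how the same six marks classify under an everything-counts rule versus best-credit selection.

    ```python
    # Illustrative sketch only: equal-credit modules, standard UK boundaries.
    # The marks and the "drop the lowest two modules" rule are hypothetical examples.

    def classify(average: float) -> str:
        """Map a weighted average onto the standard classification boundaries."""
        if average >= 70:
            return "First"
        if average >= 60:
            return "2:1"
        if average >= 50:
            return "2:2"
        if average >= 40:
            return "Third"
        return "Fail"

    def mean_all(marks: list[float]) -> float:
        """'Totality of work': every module mark counts equally."""
        return sum(marks) / len(marks)

    def mean_best(marks: list[float], drop: int) -> float:
        """Best-credit selection: discard the lowest `drop` module marks."""
        kept = sorted(marks, reverse=True)[: len(marks) - drop]
        return sum(kept) / len(kept)

    marks = [75, 75, 75, 75, 55, 55]
    print(round(mean_all(marks), 1), classify(mean_all(marks)))          # 68.3 2:1
    print(mean_best(marks, drop=2), classify(mean_best(marks, drop=2)))  # 75.0 First
    ```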

    Best-credit selection is the majority position among post-92s, but virtually absent at Russell Group universities. OfS is now pretty much banning this practice.

    The case against rests on B4.2(c) (academic regulations must be “designed to ensure” awards are credible) and B4.4(e) (credible means awards “reflect students’ knowledge and skills”). Discounting credits with the lowest marks “excludes part of a student’s assessed achievement” and so:

    …may result in a student receiving a class of degree that overlooks material evidence of their performance against the full learning outcomes for the course.

    2. Multiple calculation routes

    These take that principle further. Several universities calculate your degree multiple ways and award whichever result is better. One runs two complete calculations – using only your best 100 credits at Level 6, or taking your best 100 at both levels with 20:80 weighting. You get whichever is higher.

    Another offers three complete routes – unweighted mean, weighted mean and a profile-based method. Students receive the highest classification any method produces.
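
    As a sketch of the “best of” mechanics – the routes, weightings and marks below are hypothetical stand-ins, not any named university’s actual regulations – the awarded mark is simply the maximum over every route.

    ```python
    # Hypothetical "best of" calculation: run several routes and award the highest.
    # Routes, weightings and marks are illustrative, not any provider's real rules.

    def weighted_mean(l5: list[float], l6: list[float], w5: float, w6: float) -> float:
        """Weighted average of Level 5 and Level 6 module marks."""
        return w5 * sum(l5) / len(l5) + w6 * sum(l6) / len(l6)

    def best_credits(marks: list[float], keep: int) -> float:
        """Mean of the best `keep` module marks only."""
        kept = sorted(marks, reverse=True)[:keep]
        return sum(kept) / len(kept)

    def best_of_routes(l5: list[float], l6: list[float]) -> float:
        routes = [
            weighted_mean(l5, l6, 0.2, 0.8),                                  # Route A: all marks, 20:80
            best_credits(l6, keep=5),                                         # Route B: best Level 6 marks only
            0.2 * best_credits(l5, keep=5) + 0.8 * best_credits(l6, keep=5),  # Route C: best marks, weighted
        ]
        return max(routes)  # the most favourable interpretation wins

    level5 = [58, 62, 65, 55, 60, 59]
    level6 = [68, 72, 66, 61, 70, 64]
    print(round(weighted_mean(level5, level6, 0.2, 0.8), 1))  # 65.4 under Route A alone
    print(round(best_of_routes(level5, level6), 1))           # 68.0 once every route is run
    ```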

    For those holding onto their “standards”, this sort of thing is mathematically guaranteed to inflate outcomes. You’re measuring the best possible interpretation of what students achieved, not what they achieved every time. As a result, comparison across institutions becomes meaningless. Again, this is now pretty much being banned.

    This time, the case against is that:

    …the classification awarded should not simply be the most favourable result, but the result that most accurately reflects the student’s level of achievement against the learning outcomes.

    3. Borderline uplift rules

    What happens on the cusps? Borderline uplift rules create all sorts of discretion around the theoretical boundaries.

    One university automatically uplifts students to the higher class if two-thirds of their final-stage credits fall within that band, even if their overall average sits below the threshold. Another operates a 0.5 percentage point automatic uplift zone. Several maintain 2.0 percentage point consideration zones where students can be promoted if profile criteria are met.
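
    A hedged sketch of one such rule – loosely modelled on the two-thirds test described above, with a 2.0 percentage point consideration zone as an assumed default rather than any regulator’s specification – might look like this.

    ```python
    # Illustrative borderline-uplift check, loosely modelled on the rules described
    # above; the 2.0-point zone and two-thirds share are assumptions, not OfS rules.

    def should_uplift(average: float, final_stage_marks: list[float],
                      boundary: float, zone: float = 2.0,
                      share_required: float = 2 / 3) -> bool:
        """True if a student just below `boundary` meets the profile-based uplift test."""
        if not (boundary - zone <= average < boundary):
            return False  # not in the borderline zone at all
        in_higher_band = sum(1 for mark in final_stage_marks if mark >= boundary)
        return in_higher_band / len(final_stage_marks) >= share_required

    # A 68.9 average with most final-stage marks in the First band is uplifted...
    print(should_uplift(68.9, [72, 74, 70, 68, 71, 65], boundary=70))  # True
    # ...while the same average with a weaker profile is not.
    print(should_uplift(68.9, [72, 74, 70, 55, 58, 60], boundary=70))  # False
    ```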

    If 10 per cent of students cluster around borderlines and half are uplifted, that’s a five per cent boost to top grades at each boundary – the cumulative effect is substantial.

    One small and specialist provider plays the counterfactual – when it gained degree-awarding powers, it explicitly removed all discretionary borderline uplift. The boundaries are fixed – and it argues this is more honest than trying to maintain discretion that inevitably becomes inconsistent.

    OfS could argue borderline uplift breaches B4.2(b)’s requirement that assessments be “reliable” – defined as requiring “consistency as between students.”

    When two students with 69.4 per cent overall averages receive different classifications (one uplifted to a First, one remaining at a 2:1) based on mark distribution patterns or examination board discretion, the system produces inconsistent outcomes for identical demonstrated performance.

    But OfS avoids this argument, likely because it would directly challenge decades of established discretion on borderlines – a core feature of the existing system. Eliminating all discretion would conflict with professional academic judgment practices that the sector considers fundamental, and OfS has chosen not to pick that fight.

    4. Exit acceleration

    Heavy final-year weighting amplifies improvement while minimising early difficulties. Where deployed, the near-universal pattern is now 25 to 30 per cent for Level 5 and 70 to 75 per cent for Level 6. Some institutions weight even more heavily, with year three counting for 60 per cent of the final mark.

    A student who averages 55 in year two and 72 in year three gets 67.2 overall with a typical weighting – a 2:1. A student who averages 72 in year two and 55 in year three gets 59.9 – just short of a 2:1.

    The magnitude of change is identical – it’s just that the direction differs. The system structurally rewards late bloomers and penalises any early starters who plateau.
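
    As a worked illustration of that asymmetry – using a hypothetical 25:75 Level 5 to Level 6 weighting, within the range quoted above but not the exact figures behind the numbers in the previous paragraph:

    ```python
    # Hypothetical 25:75 Level 5 : Level 6 weighting (illustrative only).
    W5, W6 = 0.25, 0.75

    def final_average(year_two_avg: float, year_three_avg: float) -> float:
        """Exit-weighted overall average across Levels 5 and 6."""
        return W5 * year_two_avg + W6 * year_three_avg

    print(final_average(55, 72))  # 67.75 -> comfortably a 2:1
    print(final_average(72, 55))  # 59.25 -> below the 60 boundary, on identical marks
    ```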

    OfS could argue that 75 per cent final-year weighting breaches B4.2(a)’s requirement for “appropriately comprehensive” assessment. B4 Guidance 335M warns that assessment “focusing only on material taught at the end of a long course… is unlikely to provide a valid assessment of that course,” and heavy (though not exclusive) final-year emphasis arguably extends this principle – if the course’s subject matter is taught across three years, does minimising assessment of two-thirds of that teaching constitute comprehensive evaluation?

    But OfS doesn’t make this argument either, likely because year weighting is explicit in published regulations, often driven by PSRB requirements, and represents settled institutional choices rather than recent innovations. Challenging it would mean questioning established pedagogical frameworks rather than targeting post-hoc changes that might mask grade inflation.

    5. First-year exclusion

    Finally, with a handful of institutional and PSRB exceptions, the practice of the first year not counting is now pretty much universal, removing what used to be the bottom tail of performance distributions.

    While this is now so standard it seems natural, it represents a significant structural change from 20 to 30 years ago. You can score 40s across the board in first year and still graduate with a first if you score 70-plus in years two and three.

    Combine it with other features, and the interaction effects compound. At universities using best 105 credits at each of Levels 5 and 6 with 30:70 weighting, only 210 of 360 total credits – 58 per cent – actually contribute to your classification. And so on.

    OfS could argue first-year exclusion breaches comprehensiveness requirements – when combined with best-credit selection, only 210 of 360 total credits (58 per cent) might count toward classification. But the practice is now so widespread, with only a handful of institutional and PSRB exceptions, that OfS appears to treat it as accepted practice rather than a compliance concern.

    Targeting something this deeply embedded across the sector would face overwhelming institutional autonomy defenses and would effectively require the sector to reinstate a practice it collectively abandoned over the past two decades.

    OfS’ strategy is to focus regulatory pressure on recent adoptions of “inherently inflationary” practices rather than challenging longstanding sector-wide norms.

    Institution type

    Russell Group universities generally operate on the totality-of-work philosophy. Research-intensives typically employ single calculation methods, count all credits and maintain narrow borderline zones.

    But there are exceptions. One I’ve seen has automatic borderline uplift that’s more generous than many post-92s. Another’s 2.0 percentage point borderline zone adds substantial flexibility. If anything, the pattern isn’t uniformity of rigour – it’s uniformity of philosophy.

    One London university has a marks-counting scheme rather than a weighted average – what some would say is the most “rigorous” system in England. And two others – you can guess who – don’t fit this analysis at all, with subject-specific systems and no university-wide algorithms.

    Post-1992s systematically deploy multiple flexibility features. Best-credit selection appears at roughly 70 per cent of post-92s. Multiple calculation routes appear at around 40 per cent of post-92s versus virtually zero per cent at research-intensive institutions. Several post-92s have introduced new, more flexible classification algorithms in the past five years, while Russell Group frameworks have been substantially stable for a decade or more.

    This difference reflects real pressures. Post-92s face acute scrutiny on student outcomes from league tables, OfS monitoring and recruitment competition, and disproportionately serve students from disadvantaged backgrounds with lower prior attainment.

    From one perspective, flexibility is a cynical response to metrics pressure. From another, it’s recognition that their students face different challenges. Both perspectives contain truth.

    Meanwhile, Scottish universities present a different model entirely, using GPA-based calculations across SCQF Levels 9 and 10 within four-year degree structures.

    The Scottish system is more internally standardised than the English system, but the two are fundamentally incompatible. As OfS attempts to mandate English standardisation, Scottish universities will surely refuse, citing devolved education powers.

    London is a city with maximum algorithmic diversity within minimum geographic distance. Major London universities use radically different calculation systems despite competing for similar students. A student with identical marks might receive a 2:1 at one, a first at another and a first with higher average at a third, purely over algorithmic differences.

    What the algorithm can’t tell you

    The “five features” capture most of the systematic variation between institutional algorithms. But they’re not the whole story.

    First, they measure the mechanics of aggregation, not the standards of marking. A 65 per cent essay at one university may represent genuinely different work from a 65 per cent at another. External examining is meant to moderate this, but the system depends heavily on trust and professional judgment. Algorithmic variation compounds whatever underlying marking variation exists – but marking standards themselves remain largely opaque.

    Second, several important rules fall outside the five-feature framework but still create significant variation. Compensation and condonement rules – how universities handle failed modules – differ substantially. Some allow up to 30 credits of condoned failure while still classifying for honours. Others exclude students from honours classification with any substantial failure, regardless of their other marks.

    Compulsory module rules also cut across the best-credit philosophy. Many universities mandate that dissertations or major projects must count toward classification even if they’re not among a student’s best marks. Others allow them to be dropped. A student who performs poorly on their dissertation but excellently elsewhere will face radically different outcomes depending on these rules.

    In a world where huge numbers of students now have radically less module choice than they did just a few years ago as a result of cuts, they would have reason to feel doubly aggrieved if modules they never wanted to take in the first place will now count when they didn’t last week.

    Several universities use explicit credit-volume requirements at each classification threshold. A student might need not just a 60 per cent average for a 2:1, but also at least 180 credits at 60 per cent or above, including specific volumes from the final year. This builds dual criteria into the system – you need both the average and the profile. It’s philosophically distinct from borderline uplift, which operates after the primary calculation.
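
    A minimal sketch of such a dual criterion, using the hypothetical 180-credit volume requirement described above:

    ```python
    # Hypothetical dual criterion: both the average and the credit profile must clear the bar.

    def meets_2_1(average: float, module_results: list[tuple[float, int]]) -> bool:
        """module_results is a list of (mark, credit_value) pairs across the counting levels."""
        credits_at_60_plus = sum(credits for mark, credits in module_results if mark >= 60)
        return average >= 60 and credits_at_60_plus >= 180

    modules = [(65, 20)] * 6 + [(55, 20)] * 6   # 240 credits: half at 65, half at 55
    print(meets_2_1(60.0, modules))             # average clears 60, but only 120 credits do: False
    ```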

    And finally, treatment of reassessed work varies. Nearly all universities cap resit marks at the pass threshold, but some exclude capped marks from “best credit” calculations while others include them. For students who fail and recover, this determines whether they can still achieve high classifications or are effectively capped at lower bands regardless of their other performance.

    The point isn’t so much that I (or OfS) have missed the “real” drivers of variation – the five features genuinely are the major structural mechanisms. But the system’s complexity runs deeper than any five-point list can capture. When we layer compensation rules onto best-credit selection, compulsory modules onto multiple calculation routes, and volume requirements onto borderline uplift, the number of possible institutional configurations runs into the thousands.

    The transparency problem

    Every day’s a school day at Wonkhe, but what has been striking for me is quite how difficult the information has been to access and compare. Some institutions publish comprehensive regulations as dense PDF documents. Others use modular web-based regulations across multiple pages. Some bury details in programme specifications. Several have no easily locatable public explanation at all.

    UUK’s position on this, I’d suggest, is something of a stretch:

    University policies are now much more transparent to students. Universities are explaining how they calculate the classification of awards, what the different degree classifications mean and how external examiners ensure consistency between institutions.

    Publication cycles vary unpredictably, cohort applicability is often ambiguous, and cross-referencing between regulations, programme specifications and external requirements adds layers upon layers of complexity. The result is that meaningful comparison is effectively impossible for anyone outside the quality assurance sector.

    This opacity matters because it masks that non-comparability problem. When an employer sees “2:1, BA in History” on a CV, they have no way of knowing whether this candidate’s university used all marks or selected the best 100 credits, whether multiple calculation routes were available or how heavily final-year work was weighted. The classification looks identical regardless. That makes it more, not less, likely that they’ll just go on prejudices and league tables – regardless of the TEF medal.

    We can estimate the impact conservatively. Year one exclusion removes perhaps 10 to 15 per cent of the performance distribution. Best-credit selection removes another five to 10 per cent. Heavy final-year weighting amplifies improvement trajectories. Multiple calculation routes guarantee some students shift up a boundary. Borderline rules uplift perhaps three to five per cent of the cohort at each threshold.

    Stack these together and you could shift perhaps 15 to 25 per cent of students up one classification band compared to a system that counted everything equally with single-method calculation and no borderline flexibility. Degree classifications say as much about institutional algorithm choices as they do about student learning or teaching quality.

    Yes, but

    When universities defend these features, the justifications are individually compelling. Best-credit selection rewards students’ strongest work rather than penalising every difficult moment. Multiple routes remove arbitrary disadvantage. Borderline uplift reflects that the difference between 69.4 and 69.6 per cent is statistically meaningless. Final-year emphasis recognises that learning develops over time. First-year exclusion creates space for genuine learning without constant pressure.

    None of these arguments is obviously wrong. Each reflects defensible beliefs about what education is for. The problem is that they’re not universal beliefs, and the current system allows multiple philosophies to coexist under a facade of equivalence.

    Post-92s add an equity dimension – their flexibility helps students from disadvantaged backgrounds who face greater obstacles. If standardisation forces them to adopt strict algorithms, degree outcomes will decline at institutions serving the most disadvantaged students. But did students really learn less, or attain to a “lower” standard?

    The counterargument is that if the algorithm itself makes classifications structurally easier to achieve, you haven’t promoted equity – you’ve devalued the qualification. And without the sort of smart, skills and competencies based transcripts that most of our pass/fail cousins across Europe adopt, UK students end up choosing between a rock and a hard place – if only they were conscious of that choice.

    The other thing that strikes me is that the arguments I made in December 2020 for “baking in” grade inflation haven’t gone away just because the pandemic has. If anything, the case for flexibility has strengthened as the cost of living crisis, inadequate maintenance support and deteriorating student mental health create circumstances that affect performance through no fault of students’ own.

    Students are working longer hours in paid employment to afford rent and food, living in unsuitable accommodation, caring for family members, and managing mental health conditions at record levels. The universities that retained pandemic-era flexibilities – best-credit selection, generous borderline rules, multiple calculation routes – aren’t being cynical about grade inflation. They’re recognising that their students disproportionately face these obstacles, and that a “totality-of-work” philosophy systematically penalises students for circumstances beyond their control rather than assessing what they’re actually capable of achieving.

    The philosophical question remains – should a degree classification reflect every difficult moment across three years, or should it represent genuine capability demonstrated when circumstances allow? Universities serving disadvantaged students have answered that question one way – research-intensive universities serving advantaged students have answered it another.

    OfS’s intervention threatens to impose the latter philosophy sector-wide, eliminating the flexibility that helps students from disadvantaged backgrounds show their “best selves” rather than punishing them for structural inequalities that affect their week-to-week performance.

    Now what

    As such, a regulator seeking to intervene faces an interesting challenge with no obviously good options – albeit one of its own making. Another approach might have been to cap the most egregious practices – prohibit triple-route calculations, limit best-credit selection to 90 per cent of total credits, cap borderline zones at 1.5 percentage points.

    That would eliminate the worst outliers while preserving meaningful autonomy. The sector would likely comply minimally while claiming victory, but oodles of variation would remain.

    A stricter approach would be mandating identical algorithms – but that would provoke rebellion. Devolved nations would refuse, citing devolved powers and triggering a constitutional confrontation. Research-intensive universities would mount legal challenges on academic freedom grounds, if they’re not preparing to do so already. Post-92s would deploy equity arguments, claiming standardisation harms universities serving disadvantaged students.

    A politically savvy but inadequate approach might have been mandatory transparency rather than prescription. Requiring universities to publish algorithms in standardised format with some underpinning philosophy would help. That might preserve autonomy while creating a bit of accountability. Maybe competitive pressure and reputational risk will drive voluntary convergence.

    But universities will resist even being forced to quantify and publicise the effects of their grading systems. They’ll argue it undermines confidence and damages the UK’s international reputation.

    Given the diversity of courses, providers, students and PSRBs, algorithms also feel like a weird thing to standardise. I can make a much better case for a defined set of subject awards with a shared governance framework (including subject benchmark statements, related PSRBs and degree algorithms) than I can for tightening standardisation in isolation.

    The fundamental problem is that the UK degree classification system was designed for a different age, a different sector and a different set of students. It was probably a fiction to imagine that sorting everyone into First, 2:1, 2:2 and Third was possible even 40 years ago – but today, it’s such obvious nonsense that without richer transcripts, it just becomes another way to drag down the reputation of the sector and its students.

    Unfit for purpose

    In 2007, the Burgess Review – commissioned by Universities UK itself – recommended replacing honours degree classifications with detailed achievement transcripts.

    Burgess identified the exact problems we have today – considerable variation in institutional algorithms, the unreliability of classification as an indicator of achievement, and the fundamental inadequacy of trying to capture three years of diverse learning in a single grade.

    The sector chose not to implement Burgess’s recommendations, concerned that moving away from classifications would disadvantage UK graduates in labour markets “where the classification system is well understood.”

    Eighteen years later, the classification system is neither well understood nor meaningful. A 2:1 at one institution isn’t comparable to a 2:1 at another, but the system’s facade of equivalence persists.

    The sector chose legibility and inertia over accuracy and ended up with neither – sticking with a system that protected institutional diversity while robbing students of the ability to show off theirs. As we see over and over again, a failure to fix the roof when the sun was shining means reform may now arrive externally imposed.

    Now the regulator is knocking on the conformity door, there’s an easy response. OfS can’t take an annual pop at grade inflation if most of the sector abandons the outdated and inadequate degree classification system. Nothing in the rules seems to mandate it, some UG quals don’t use it (think regulated professional bachelor’s degrees), and who knows where the White Paper’s demand for meaningful exit awards at Levels 4 and 5 fits into all of this.

    Maybe we shouldn’t be surprised that a regulator that oversees a meaningless and opaque medal system with a complex algorithm that somehow boils an entire university down to “Bronze”, “Silver”, “Gold” or “Requires Improvement” is keen to keep hold of the equivalent for students.

    But killing off the dated relic would send a really powerful signal – that the sector is committed to developing the whole student, explaining their skills and attributes and what’s good about them – rather than pretending that the classification makes the holder of a 2:1 “better” than those with a Third, and “worse” than those with a First.

    Source link

  • Only radical thinking will deliver the integrated tertiary system the country needs

    The post-16 white paper was an opportunity to radically enable an education and skills ecosystem that is built around the industrial strategy, and that has real resonance with place.

    The idea that skills exist in an entirely different space to education is just wrongheaded. The opportunity comes, however, when we can see a real connection, both in principle and in practice, between further and higher education: a tertiary system that can serve students, employers and society.

    Significant foundations are already in place, with the Lifelong Learning Entitlement providing sharp focus within the higher education sector and apprenticeships now well established and well regarded across both HE and FE. Yet we still have the clear problem that schools, FE, teaching in HE, research and knowledge transfer are fragmented across the DfE and other associated sector bodies.

    Sum of the parts

    The policy framework needs to be supported by a major and radical rethink of how the parts fit together so we can truly unlock the combined transformational power of education and innovation to raise aspirations, opportunity, attainment, and ultimately, living standards. This could require a tertiary commission along the lines of the Diamond and Hazelkorn reviews in the Welsh system in the mid-2010s.

    Such a commission could produce bold thinking on the scale of the academies movement in schools over the last 25 years. The encouragement to bring groups of schools together has resulted in challenge, but also significant opportunity. We have seen the creation of some excellent FE college groups following an area-based review around a decade ago. The first major coming together of HE institutions is in train with Greenwich and Kent. We have seen limited pilot FE/HE mergers. Now feels like the right time for blue sky thinking that enables the best of all of those activities in a structured and purposeful way that is primarily focused on the benefits to learning and national productivity rather than simply financial necessity.

    Creating opportunities for HE, FE and schools to come together not only in partnerships, but in structural ways, will enable the innovation that will create tangible change in local and regional communities. All parts of the education ecosystem face ever-increasing financial challenge. If an FE college and a university wished to offer shared services, then there would need to be a competitive tender for the purposes of best value. This sounds sensible, except that the cost of running such a process is high. If those institutions are part of the same group, then it can be done so much more efficiently.

    FE colleges are embedded in their place and even more connected to local communities. The ability to reach into more disadvantaged communities, and to take the HE classroom out of the traditional university setting, is a distinct benefit. The growth in private, for-profit HE provision often rests on its ability to reach into specific communities. The power of FE/HE collaboration into those same communities would bring both choice and exciting possibility.

    While in theory FE and HE institutions can merge through a section 28 application to the Secretary of State, the reality is that any activity to this point has been marginal and driven by motivations other than enhanced skills provision. If the DfE were to enable, and indeed drive, such collaboration, it could create both financial efficiencies and a much greater and more coordinated offer to employers and learners.

    The industrial strategy and the growth in devolved responsibility for skills create interesting new opportunities but we must find ways that avoid a new decade of confusion for employers and learners. The announcement of new vocational qualifications, Technical Excellence Colleges and the like are to be welcomed but must be more than headlines. Learners and employers alike need to be able to see pathways and support for their lifelong skills and learning needs.

    Path to integration

    The full integration of FE and HE could create powerful regional and place-based education and skills offers. Adding in schools and creating education trusts that straddle all levels means that employers could benefit from integrated offers, less bureaucracy and clear, accelerated pathways.

    So now is the moment to develop Integrated Skills and Education Trusts (ISETs): entities that sit within broad groups and benefit from the efficiencies of scale while maintaining local provision – taking the best of FE, with its understanding of skills and local needs, and the best of HE, and actively enabling them to come together.

    Our experience at Coventry, working closely and collaboratively with several FE partners, is that the barriers thrown up within the DfE are in stark and clear contrast to the policy statements of ministers and, indeed, of the Prime Minister. The post-16 white paper will only lead to real change if the policy and the “plumbing” align. The call has to be to think with ambition and to encourage and enable action that serves learners, employers and communities with an education and skills offer that is fit for the next generation.

    Source link

  • A university system reliant on international students has an obligation to understand them

    It is becoming difficult to ignore potential tension between the internationalisation of higher education and plans to cut net migration. Recent UK government policies, such as the reduction of the graduate visa from two years to 18 months, could have severe consequences for universities in Scotland.

    Scottish government funding per home student has not kept pace with inflation. To compensate for the subsequent gap in resources, universities have become more dependent on international enrolments.

    In addition, Scotland faces specific demographic challenges. By 2075, the number of working-age Scots is predicted to fall by 14.7 per cent and, without migration, the population would be in decline. Encouraging young people to remain after graduation could help to balance the ageing population. However, although the Scottish government favours a more generous post-study visa route, this is not supported by Westminster.

    Ability to adjust

    Rhetoric around internationalisation tends to emphasise positive factors such as increased diversity and cross-cultural exchange. Yet, as an English for Academic Purposes (EAP) practitioner, I have long been concerned that learners from diverse linguistic backgrounds are often viewed through a lens of deficiency. There is also a risk that their own needs will be overlooked in the midst of political and economic debate.

    To better understand how students’ sense of identity is affected by moving into new educational and social settings, I carried out interview-based research at a Scottish university. Like other “prestigious” institutions, it attracts a large number of applicants from abroad. In particular, some taught master’s degrees (such as those in the field of language education) are dominated by Chinese nationals. Indeed, when recruiting postgraduate interviewees, I was not surprised when only two (out of 11) came from other countries (Thailand and Japan).

    My analysis of data revealed typical reasons for choosing the university: ranking, reputation and the shorter duration of master’s courses. Participants described being met with unfamiliar expectations on arrival, especially as regards writing essays and contributing to discussion. For some, this challenged their previous identities as competent individuals with advanced English skills. These issues were exacerbated in “all-white” classes, where being in the minority heightened linguistic anxiety and the fear of being judged. They had varied experiences of group work: several reported – not necessarily intentional but nonetheless problematic – segregation of students by nationality, undermining the notion that a multi-national population results in close mixing on campus.

    In a survey administered to a wider cohort of respondents on a pre-sessional EAP programme, the majority agreed or strongly agreed when asked if they would befriend British people while at university.

    However, making such connections is far from straightforward. International students are sometimes criticised for socialising in monocultural groups and failing to fully “fit in”. Yet the fatigue of living one’s life in another language while simultaneously coping with academic demands means that getting to know locals is not a priority. At the same time, research participants expressed regret at the lack of opportunity to interact with other nationalities, with one remarking, “if everyone around me is Chinese, why did I choose to study abroad?” Some encountered prejudice or marginalisation, reporting that they felt ignored by “fluent” speakers of English. Understandably, this had a detrimental effect on their ability to adjust.

    Different ways to belong

    To gain different perspectives, I also spoke with teachers who work with international students. EAP tutors believed that their classes offer a safe space for them to gain confidence and become used to a new way of working. However, they wondered whether there would be a similarly supportive atmosphere in mainstream university settings. Subject lecturers did not invoke phrases such as “dumbing down”, but several had altered their teaching methods to better suit learners from non-Anglophone backgrounds.

    In addition, they questioned whether internationalisation always equated to diversity. One commented on the advantages of having a “multicultural quality”, but added that it “has to be a mix” – something which is not possible if, like on her course, there are no Scottish students. Another mentioned that the propensity to “stick with your own people” is not a uniquely Chinese phenomenon, but common behaviour regardless of background.

    A few academics had noticed that most Chinese students take an attitude of, “I’m doing my (one-year) master’s and maybe then I have to move back to China.” Chinese students are less likely than some other nationalities to apply for a graduate visa, suggesting that their investment in a degree abroad is of a transactional nature.

    The majority of survey respondents indicated that they would adapt to a new way of life while living abroad. However, during my last conversation with focal interviewees, I uncovered different levels of belonging, ranging from, “I feel like I’m from Scotland”, to “my heart was always in China”, to “I don’t have any home.” Participants generally viewed their stay as temporary: in fact, all but the Japanese student (who accepted a job in the US) returned to their home country after graduation. Although they described their time in Scotland in mostly positive terms, some were disappointed that it had not provided a truly intercultural experience.

    Meltdown

    It is clear that universities in Scotland have become overly reliant on international tuition for their financial sustainability. At the same time, there is conflict between the devolved administration’s depiction of Scotland as outward looking and welcoming, and the reality of stricter migration policies over which it has no control.

    Discourses which position international students as outsiders who add to high immigration numbers could deter some from coming. If they are seen only as economic assets, their own cultural capital and agency might be neglected. It is also important to problematise the notion of “integration”: even my small study suggests that there are different ways of belonging. No group of learners is homogeneous: even if they come from the same country, individual experiences will differ.

    To navigate the current financial crisis, Scottish universities need to do everything possible to maintain their appeal. With elections being held next year, higher education policy will continue to be a key area of discussion. At present, there are no plans to introduce fees for home students, making revenue from international tuition all the more essential.

    However, at a time of global uncertainty, taking overseas students for granted feels enormously unwise. Instead, it is crucial to ask how they can be made to feel like valued members of the academic community. The answer to this question might be different for everyone, but engaging with students themselves, rather than relying on unhelpful assumptions, would be a start.

    Source link