Category: Students

  • Making higher education work for international student carers

    Student carers – those juggling unpaid caring for family or friends, as well as student parents – can often feel invisible to their higher education provider. Their needs cut across multiple areas, including attendance, assessment, finances and mental health, with many (quietly) facing the complicated arithmetic of balancing time, money and labour.

    It is not only UK-domiciled students who face these challenges. Though little addressed in the academic literature, international student carers face challenges both similar to and distinct from those experienced by UK home students.

    Similar and distinct

    Student carers of all nationalities describe disrupted attendance when emergencies arise, lost concentration, and difficult trade-offs between paid work and academic engagement.

    Uncertainty amplifies these pressures: some students simply choose not to disclose their caring responsibilities for fear of stigma; others do not trust staff to handle with care what is a personal and sensitive dimension of their lives; still others do not know where to seek support.

    Identifying carers, therefore, is a necessary first step to providing support. However, it is not always straightforward – institutions commonly lack routine, reliable data on caring status, making targeted support ad hoc rather than systemic.

    Yet international student carers face additional, distinctive barriers that make the same problems harder to resolve. Visa rules are an illustrative example. These restrict when dependants can accompany students and cap the number of hours most international students can work during term-time.

    For instance, students on degree-level courses can generally work up to 20 hours per week, while those on foundation and pre-sessional English routes are limited to ten hours. Self-employment is not permitted, and internships or placements must be approved by the sponsor.

    For those caring for family overseas, the emotional load and logistical complexity are high: families divide care across borders, rely on remittances, and use digital tools to coordinate support at a distance. For those caring for dependants present in the UK, the absence of recourse to public funds, combined with the limits on working hours, further intensifies financial challenges. These are not abstract constraints – students I have spoken to flagged the restriction on working hours as a core stressor that diverted their attention from study.

    Making it work

    The UK policy context matters as it shapes what universities can and cannot do. While recent changes have tightened dependant rules for international students, universities still retain a significant degree of agency. Their options include proactive identification of student carers, flexible design of learning and assessment, targeted financial and careers advice, and culturally sensitive outreach.

    What does this look like in practice? First, it is time that institutions recognise that disclosure is not a single moment, but a process requiring trust. Rather than a “pray-and-hope” approach where students are asked to declare their caring status on a single form, universities should try to normalise conversations across the student lifecycle: in admissions, enrolment, welcome activities, academic tutorials and welfare checks. Staff training plays an important role here. Academic and professional services teams need concise guidance on how to spot signs of caring, how to ask sensitively, and how to go about making reasonable adjustments, be that through a Carer Passport or other means. This helps reduce the pressure on student carers to self-advocate.

    Next, administrative burden needs to be reduced as much as possible – student carers are often acutely time poor. Tools like the Carer Passport mentioned above can help here by making informal agreements more formal and removing the burden of repeated disclosure.

    Reasonable adjustments might include extended deadlines, alternative attendance arrangements, or priority access to recorded lectures and preferred seminar times. Such initiatives should not be designed over carers’ heads: carers should be involved in the development process. This co-production may also help tackle the trust deficit.

    Third, financial and careers support must be tailored to visa realities. Generic money advice may be helpful, but it is likely insufficient for international student carers’ needs, given the restrictions on working hours and access to benefits. Where budgets allow, support could include targeted bursaries, hardship funding that takes caring costs into account, and careers advice that specifically addresses visa restrictions and working-hour limits. Partnerships with external funders and local community organisations could also be beneficial.

    And finally, community can provide another support mechanism. Peer networks, carers’ groups and targeted social spaces allow student carers, particularly international ones who may be far from family networks, to share coping strategies and practical tips. These groups also provide powerful evidence to inform policy change within universities: student testimony should feed directly into institutional planning, not sit in a file.

    The effort required

    None of the above requires radical, let alone revolutionary, institutional reinvention – though it does demand time and the allocation of resources. That said, I would contend that the effort is worth it, for two reasons.

    The first is that supporting international student carers is simply a matter of fairness. The second, of equal importance, is that universities that make study feasible for (international) student carers will stand a better chance of attracting and retaining talent that might otherwise never apply – or withdraw part-way through.

    The absence of international student carers means a loss of enriching perspectives in the classroom – and, conversely, their presence provides a stronger evidence base from which to build inclusive practice.

    Source link

  • Uni boosts gender diversity by 30% in maths – Campus Review

    As the artificial intelligence (AI) and quantum computing industries explode, trained STEM professionals are in high demand. Mathematics is foundational to these fields.

    Source link

  • AI is unlocking insights from PTES to drive enhancement of the PGT experience faster than ever before

    If, like me, you grew up watching Looney Tunes cartoons, you may remember Yosemite Sam’s popular phrase, “There’s gold in them thar hills.”

    In surveys, as in gold mining, the greatest riches are often hidden and difficult to extract. This principle is perhaps especially true when institutions are seeking to enhance the postgraduate taught (PGT) student experience.

    PGT students are far more than an extension of the undergraduate community; they represent a crucial, diverse and financially significant segment of the student body. Yet, despite their growing numbers and increasing strategic importance, PGT students, as Kelly Edmunds and Kate Strudwick have recently pointed out on Wonkhe, remain largely invisible in both published research and core institutional strategy.

    Advance HE’s Postgraduate Taught Experience Survey (PTES) is therefore one of the few critical sources of insight we have into the PGT experience. But while the quantitative results offer a (usually fairly consistent) high-level view, the real intelligence required to drive meaningful enhancement inside higher education institutions is buried deep within the thousands of open-text comments collected. Faced with the sheer volume of data, the choice has been between eyeballing the comments – with the inevitable introduction of human bias – and laborious, time-consuming manual coding. The challenge for the institutions participating in PTES this year isn’t the lack of data: it’s efficiently and reliably turning that dense, often contradictory, qualitative data into actionable, ethical, and equitable insights.

    AI to the rescue

    The application of machine learning to the analysis of qualitative student survey data presents us with a generational opportunity to amplify the student voice. The critical question is not whether AI should be used, but how to ensure its use meets robust and ethical standards. For that you need the right process – and the right partner – to prioritise analytical substance, comprehensiveness, and sector-specific nuance.

    UK HE training is non-negotiable. AI models must be deeply trained on a vast corpus of UK HE student comments. Without this sector-specific training, analysis will fail to accurately interpret the nuances of student language, sector jargon, and UK-specific feedback patterns.

    Analysis must rely on a categorisation structure that has been developed and refined against multiple years of PTES data. This continuity ensures that the thematic framework reflects the nuances of the PGT experience.

    To drive targeted enhancement, the model must break down feedback into highly granular sub-themes – moving far beyond simplistic buckets – ensuring staff can pinpoint the exact issue, whether it falls under learning resources, assessment feedback, or thesis supervision.

    The analysis must be more than a static report. It must be delivered through integrated dashboard solutions that allow institutions to filter, drill down, and cross-reference the qualitative findings with demographic and discipline data. Only this level of flexibility enables staff to take equitable and targeted enhancement actions across their diverse PGT cohorts.
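
    As a purely illustrative sketch of what that drill-down involves under the hood – assuming a simple table of already-categorised comments with hypothetical column names, not the actual evasys dashboard or its data model – the underlying operation is a filter and cross-tabulation along these lines:

    ```python
    # Illustrative only: the kind of drill-down a dashboard performs behind the scenes.
    # Column names (theme, sentiment, domicile, discipline) are hypothetical.
    import pandas as pd

    comments = pd.DataFrame(
        {
            "theme": ["assessment feedback", "assessment feedback", "learning resources", "thesis supervision"],
            "sentiment": ["negative", "positive", "negative", "negative"],
            "domicile": ["UK", "International", "International", "UK"],
            "discipline": ["Business", "Business", "Engineering", "Law"],
        }
    )

    # Cross-reference qualitative themes with demographic data:
    # how does negative feedback on each theme split by domicile?
    breakdown = (
        comments[comments["sentiment"] == "negative"]
        .groupby(["theme", "domicile"])
        .size()
        .unstack(fill_value=0)
    )
    print(breakdown)
    ```

    A real dashboard layers interactivity, benchmarking and export on top, but this kind of theme-by-demographic cross-referencing is the analytical core that makes targeted, equitable action possible.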

    When these principles are prioritised, the result is an analytical framework specifically designed to meet the rigour and complexity required by the sector.

    The partnership between Advance HE, evasys, and Student Voice AI, which analysed this year’s PTES data, demonstrates what is possible when these rigorous standards are prioritised. We have offered participating institutions a comprehensive service that analyses open comments alongside the detailed benchmarking reports that Advance HE already provides. This collaboration has successfully built an analytical framework that exemplifies how sector-trained AI can deliver high-confidence, actionable intelligence.

    Jonathan Neves, Head of Research and Surveys at Advance HE, calls our solution “customised, transparent and genuinely focused on improving the student experience,” and adds, “We’re particularly impressed by how they present the data visually and look forward to seeing results from using these specialised tools in tandem.”

    Substance above all

    The commitment to analytical substance is paramount; without it, the risk to institutional resources and equity is severe. If institutions are to derive value, the analysis must be comprehensive. When the analysis lacks this depth, institutional resources are wasted acting on partial or misleading evidence.

    Rigorous analysis requires minimising what we call data leakage: the systematic failure to capture or categorise substantive feedback. Consider the alternative: when large percentages of feedback are ignored or left uncategorised, institutions are effectively muting a significant portion of the student voice. Or when a third of the remaining data is lumped into meaningless buckets like “other,” staff are left without actionable insight, forced to manually review thousands of comments to find the true issues.
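
    To make the idea of leakage concrete, here is a minimal sketch – assuming a hypothetical export format with made-up field names, not the actual PTES or Student Voice AI output – of how an institution might quantify how much substantive feedback is being lost:

    ```python
    # Minimal sketch (Python 3.10+), not a real pipeline: measures the share of
    # comments that are effectively lost to analysis. Field names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class CategorisedComment:
        comment_id: str
        text: str
        themes: list[str] = field(default_factory=list)  # e.g. ["assessment feedback"]
        sentiment: str | None = None                      # "positive", "negative", "mixed" or None

    def leakage_rate(comments: list[CategorisedComment]) -> dict[str, float]:
        """Share of comments that are uncategorised, dumped into a catch-all
        'other' bucket, or left without any sentiment assigned."""
        total = len(comments)
        if total == 0:
            return {"uncategorised": 0.0, "other_only": 0.0, "no_sentiment": 0.0}
        uncategorised = sum(1 for c in comments if not c.themes)
        other_only = sum(1 for c in comments if c.themes and set(c.themes) == {"other"})
        no_sentiment = sum(1 for c in comments if c.sentiment is None)
        return {
            "uncategorised": uncategorised / total,
            "other_only": other_only / total,
            "no_sentiment": no_sentiment / total,
        }

    # Two of the three example comments below would count as leakage.
    sample = [
        CategorisedComment("1", "Feedback on my dissertation was slow", ["assessment feedback"], "negative"),
        CategorisedComment("2", "Library hours are great", ["other"], "positive"),
        CategorisedComment("3", "Timetable clashes every week", [], None),
    ]
    print(leakage_rate(sample))
    ```

    The point of tracking figures like these is simple: the higher they climb, the more of the student voice has been muted before anyone reads a report.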

    This is the point where the qualitative data, intended to unlock enhancement, becomes unusable for quality assurance. The result is not just a flawed report, but the failure to deliver equitable enhancement for the cohorts whose voices were lost in the analytical noise.

    Reliable, comprehensive processing is just the first step. The ultimate goal of AI analysis should be to deliver intelligence in a format that seamlessly integrates into strategic workflows. While impressive interfaces are visually appealing, genuine substance comes from the capacity to produce accurate, sector-relevant outputs. Institutions must be wary of solutions that offer a polished facade but deliver compromised analysis. Generic generative AI platforms, for example, offer the illusion of thematic analysis but are not robust.

    But robust validation of any output is still required. This is the danger of smoke and mirrors – attractive dashboards that simply mask a high degree of data leakage, where large volumes of valuable feedback are ignored, miscategorised or rendered unusable by failing to assign sentiment.

    Dig deep, act fast

    When institutions choose rigour, the outcomes are fundamentally different, built on a foundation of confidence. Analysis ensures that virtually every substantive PGT comment is allocated to one or more UK-derived categories, providing a clear thematic structure for enhancement planning.

    Every substantive comment is assessed for both positive and negative sentiment, providing staff with the full, nuanced picture needed to build strategies that leverage strengths while addressing weaknesses.

    This shift from raw data to actionable intelligence allows institutions to move quickly from insight to action. As Parama Chaudhury, Pro-Vice Provost (Education – Student Academic Experience) at UCL noted, the speed and quality of this approach “really helped us to get the qualitative results alongside the quantitative ones and encourage departmental colleagues to use the two in conjunction to start their work on quality enhancement.”

    The capacity to produce accurate, sector-relevant outputs, driven by rigorous processing, is what truly unlocks strategic value. Converting complex data tables into readable narrative summaries for each theme allows academic and professional services leaders alike to immediately grasp the findings and move to action. The ability to access categorised data via flexible dashboards and in exportable formats ensures the analysis is useful for every level of institutional planning, from the department to the executive team. And providing sector benchmark reports allows institutions to understand their performance relative to peers, turning internal data into external intelligence.

    The postgraduate taught experience is a critical pillar of UK higher education. The PTES data confirms the challenge, but the true opportunity lies in how institutions choose to interpret the wealth of student feedback they receive. The sheer volume of PGT feedback combined with the ethical imperative to deliver equitable enhancement for all students demands analytical rigour that is complete, nuanced, and sector-specific.

    This means shifting the focus from simply collecting data to intelligently translating the student voice into strategic priorities. When institutions insist on this level of analytical integrity, they move past the risk of smoke and mirrors and gain the confidence to act fast and decisively.

    It turns out Yosemite Sam was right all along: there’s gold in them thar hills. But finding it requires more than just a map; it requires the right analytical tools and rigour to finally extract that valuable resource and forge it into meaningful institutional change.

    This article is published in association with evasys. evasys and Student Voice AI are offering no-cost advanced analysis of NSS open comments, delivering comprehensive categorisation and sentiment analysis, a secure dashboard to view results, and a sector benchmark report. Click here to find out more and request your free analysis.

    Source link

  • Students don’t think anything will change. They’re probably right

    The standout quote for me from new research on student consumer rights, commissioned by the Office for Students (OfS), comes from a 21-year-old undergrad in a focus group:

    If you were unhappy with your course, I don’t know how you’d actually say to them, ‘I want my money back, this was rubbish,’ basically. I don’t think that they would actually do that. It would just be a long, drawn-out process and they could just probably just argue for their own sake that your experience was your experience, other students didn’t agree, for example, on your course.

    There’s a lot going on in there. It captures the power imbalance between students and institutions, predicts institutional defensiveness, anticipates bureaucratic obstacles, and reveals a kind of learned helplessness – this student hasn’t even tried to complain, and has already concluded it’s futile.

    It’s partly about dissatisfaction with what’s being delivered, and a lack of clarity about their rights. But it’s also about students who don’t believe that raising concerns will achieve anything meaningful.

    Earlier this year, the regulator asked Public First to examine students’ perceptions of their consumer rights, and here we have the results of a nationally representative poll of 2,001 students at providers in England, alongside two focus groups.

    On the surface, things look pretty healthy – 83 per cent of students believe the information they received before enrolment was upfront, clear, timely, accurate, accessible and comprehensive, and the same proportion say their learning experience aligns with what they were promised.

    But scratch a bit and we find a student body that struggles to distinguish between promises and expectations, that has limited awareness of their rights, that doesn’t trust complaints processes to achieve anything meaningful, and that is largely unaware of the external bodies that exist to protect them.

    Whether you see this as a problem of comms, regulatory effectiveness, or student engagement probably depends on where you sit – but it’s hard to argue it represents a protection regime that’s working as intended.

    Learning to be helpless

    Research on complaints points to five interlocking barriers that prevent people from holding institutions and service providers to account – and each of them can be found in this data.

    There are opportunity costs (complaining takes time and energy), conflict aversion (people fear confrontation), confidence and capital (people doubt they have standing to complain), ignorance (people don’t know their rights), and fear of retribution (people worry about consequences). In this research, they combine to create an environment in which students who experience problems just put up with them.

    When they were asked about the biggest barrier to making a complaint, the top answer was doubt that it would make a difference – cited by 36 per cent of respondents. The polling also found that 26 per cent of students said they have “no faith” that something would change if they raised a complaint, and around one in six students (17 per cent) disagreed with the statement “at my university, students have a meaningful say in decisions that affect their education.”

    One postgrad described the experience of repeatedly raising concerns about poor organisation:

    People also just don’t think anything’s going to happen if they make a complaint, like I don’t think it would. With my masters’, it was so badly organised at the start, like we kept turning up for lectures and people just wouldn’t turn up and things like that […] We had this group chat and we were all like, ‘What’s going on? We’re paying so much money for this,’ and […] it just seemed like no one knew what was going on, but we raised it to the rep to raise it to like one of the lecturers and then […] it would just still happen. So it’s like they’re not going to change it.

    That’s someone who tried to work the system, followed the proper channels, raised concerns through the designated representative – and concluded it was futile.

    The second most common barrier captures the opportunity costs thing – lack of time or energy to go through the process, cited by 35 per cent. Combined with doubting it would make a difference, we end up with a decent proportion of students who have cost-benefit analysed complaining and decided it’s not worth the effort. Domestic students were particularly likely to cite futility as a barrier – 41 per cent versus 25 per cent of international students.

    They’ve learned helplessness – and only change their ways when failures impact their marks, only to find that “you should have complained earlier” is the key response they’ll get when the academic appeal goes in.

    Fear of retribution is also in there. About a quarter of students cited concern that complaining might affect their grades or relationships with staff (25-26 per cent) or said they felt intimidated or worried about possible consequences (23-26 per cent). A postgraduate put it bluntly:

    I think people are scared of getting struck off their course.

    Another student imagined what would happen if they tried to escalate to an external body:

    I think [going to the OIA] would have to be a pretty serious thing to do, and I think that because it’s external to the university, I’d feel a little bit like a snitch. I would have to have a lot of evidence to back up what I’m saying, and I think that it would be a really long, drawn-out process, that I ultimately wouldn’t really trust would get resolved. And so I just wouldn’t really see it as worth it to make that complaint.

    That’s the way it is

    What are students accepting as just how things are? The two things students were most likely to identify as promises from their university were a well-equipped campus, facilities and accommodation (79 per cent) and high quality teaching and resources (78 per cent).

    Over three-quarters of students said the promises made by their university had not been fully met – 59 per cent said they had been mostly met, 14 per cent partially met and 1 per cent not met at all, leaving just 24 per cent who thought promises had been fully met.

    Yet fewer than half of respondents said these were “clear and consistent parts of their university experiences” – 42 per cent for physical resources and just 37 per cent for teaching and resources. In other words, the things students most clearly remember being promised are precisely the things that, for a large minority, show up as patchy, unreliable features of day-to-day university life rather than dependable fixtures.

    There’s also a 41 percentage point gap between what students believe they were promised on teaching quality and what they report actually experiencing – 78 per cent say high quality teaching and resources were promised, but only 37 per cent say that kind of provision is a clear and consistent part of their experience. Public First note that “high quality” wasn’t explicitly defined in the polling, so these are students’ own judgements rather than a technical standard – but the size of the mismatch is still striking.

    About a quarter of students (23 per cent) reported receiving lower quality teaching than expected, rising to 26 per cent among undergraduates. Twenty-two per cent experienced fewer contact hours and more online or hybrid teaching than expected, and twenty-one per cent reported limited access to academic staff.

    One undergraduate described being taught by someone who made clear he didn’t want to be there:

    One of our lecturers, he wasn’t actually a sports journalism lecturer, he’s just off the normal journalism course, and he made it pretty clear that he didn’t like any of us and he didn’t want to be there when he was teaching us. And we basically got told that we had to go and get on with it, pretty much. So there wasn’t any sort of solution of, ‘We’ll change lecturers,’ or anything, it’s just, ‘You’ll get in more trouble if you don’t go, so just get on with it and finish it.

    When presented with a list of possible disruptions and asked which they’d experienced, 70 per cent identified at least one type. The most common was cancellation or postponement of in-person teaching, reported by 35 per cent of undergraduates. Industrial action affecting teaching or marking hit 20 per cent of students overall, and 16 per cent said it had significantly impacted their academic experience.

    Limited support from academic staff affected 20 per cent overall, rising to one in four postgraduate students – and this was the disruption that students were most likely to say had significantly impacted their experience (23 per cent overall, climbing to 32 per cent among international students).

    What’s telling is how dissatisfied students were with institutional responses to disruptions. Forty-two per cent said they were not that satisfied or not at all satisfied with their institution’s response to cancelled or postponed teaching – 45 per cent said the same about the response to strikes or industrial action. In other words, students experienced disruption, they weren’t happy with how it was handled, and yet most didn’t complain, because (again) they didn’t think it would achieve anything.

    Informal v informant

    Unsurprisingly, most students (65 per cent) had never lodged a formal complaint against their institution. On its face, that could look like satisfaction – if students aren’t complaining, perhaps things are generally fine. But when you dig into the reasons students give for not complaining, about one in four students (24 per cent) who hadn’t complained said they weren’t confident they’d know how to go about it – that’s the ignorance barrier.

    And the bigger obstacles weren’t procedural – they were about believing it was pointless or fearing consequences.

    When students did complain, they were at least twice as likely to have done so through informal channels (such as course representatives or conversations: 23 per cent) as through formal procedures (11 per cent). That’s your conflict aversion in action – you try the informal route first, to see if you can get something fixed quietly without escalating to a formal process that might create confrontation.

    But it also means the formal complaints processes that are supposed to provide accountability and redress (and documented institutional learning) are being bypassed by students who’ve concluded they’re not worth engaging with.

    Among those who did complain formally, around half (54 per cent) felt satisfied with their institution’s handling of it – which means nearly half didn’t. So if you’re a student considering whether to raise a complaint, you believe there’s roughly a 50-50 chance it won’t be handled satisfactorily, and you’ve already concluded there’s a strong likelihood it won’t change anything anyway, why would you bother?

    Especially when you add in the other barriers – concern it might affect grades or relationships with staff, feeling intimidated or worried about consequences, lack of trust in the university to handle it fairly.

    The focus groups reinforce the picture of systematic dismissal. One undergraduate explained the calculation:

    If you were unhappy with your course, I don’t know how you’d actually say to them, ‘I want my money back, this was rubbish,’ basically. I don’t think that they would actually do that. It would just be a long, drawn-out process and they could just probably just argue for their own sake that your experience was your experience, other students didn’t agree, for example, on your course.

    That’s someone who has already mapped out in their head exactly how the institution would respond – they’d argue it’s subjective, other students were happy, your experience doesn’t represent a breach of contract. And, of course, they’re probably right.

    An entitled generation

    If students don’t believe complaining will achieve anything, part of the reason is that they don’t really understand what they’re entitled to expect in the first place. The research found that only 50 per cent of students said they understood and could describe their rights and entitlements as a student – which very much undermines the whole premise of students as empowered consumers able to hold institutions to account.

    When asked how well informed they felt about various rights, the results were even worse. Only 32 per cent of students felt well informed about their right to fair and transparent assessment – the highest figure for any right listed. More than half (52 per cent) said they felt not that well informed or not at all informed about their right to receive compensation. You can’t assert rights you don’t know you have.

    The focus groups then show just how fuzzy students’ understanding of “promises” really is. Participants found it difficult to identify what had been explicitly promised to them, with received ideas about higher education playing a significant role in shaping student expectations.

    They could articulate areas where their experiences fell short – reduced contact hours, poor teaching quality, limited access to careers support – but struggled to identify where these amounted to broken promises.

    One undergraduate captured this confusion as follows:

    I personally think I do get what I was promised when I applied to university. Not like I’m an easy-going person or anything, but I do get what I need in the university, yes.

    Notice the subtle shift from “promised” to “need” – the student can’t quite articulate what was promised, so they fall back on whether they’re getting what they need, which is a much vaguer and more subjective standard.

    This matters a lot, because if you don’t know what you were promised, you can’t confidently assert that a promise has been broken. You might feel disappointed, you might think things should be better, but you can’t point to a specific commitment and say “you told me X and you’ve given me Y.”

    Which means that even when students want to complain, they’re starting from a position of uncertainty about whether they have grounds to do so. It’s the perfect recipe for learned helplessness – you’re dissatisfied, but you’re not sure if you’re entitled to be dissatisfied, so you conclude it’s safer to just accept it.

    The one clear exception? Doctoral students, who were confident they’d been promised the support of a supervisor:

    When I was applying for a PhD, I applied to several universities, so I was selected and accepted in [Institution A] and [Institution B], but I decided to come to [Institution A] for the supervisor – he interviewed me, he sent me the acceptance letter.

    Getting on the escalator

    If the picture so far suggests a system where students lack confidence in internal complaints processes, the findings on external avenues for redress make sense. Only 8 per cent of all students had heard of the Office of the Independent Adjudicator for Higher Education (OIA), and the focus groups confirm there was “little to no awareness of external organisations or avenues of redress for students”.

    More broadly, more than a third (35 per cent) of students said they were unaware of any of the external organisations or routes listed through which students in England can raise complaints about their university – rising to 41 per cent among undergraduates and 38 per cent among domestic students. The list they were shown included the OIA, the OfS, Citizens Advice, solicitors, local MPs, the QAA, and trade unions or SUs like NUS. More than a third couldn’t identify a single one of these as somewhere you might go with a concern about your university.

    As for OfS itself, just 18 per cent of students overall had heard of it, falling to 14 per cent among undergraduates. Let’s go ahead and assume that they’ve not read Condition B2.

    When asked where they would go for information about their rights, the most common answer was the university website (53 per cent) or just searching online (51 per cent). About 42 per cent said they’d look to their SU for information about rights. That’s positive – SUs are meant to provide independent advice and advocacy for students. But the fact that only 42 per cent think to go there, versus 53 per cent who’d go to the university website, suggests SUs aren’t being seen as the first port of call.

    Among postgraduates in the focus groups, there was “limited interest in the use of these avenues for redress”, with the implicit sense that if intra-institutional channels of redress seemed drawn-out, daunting and potentially fruitless, it was unlikely that “resorting to extra-institutional channels would make the situation better”. If students have concluded that internal processes are bureaucratic and ineffective, they’re not going to invest additional time and energy in external ones – especially when they don’t know those external routes exist in the first place.

    Explorations

    It’s an odd little bit of research in many ways. It’s hard to tell if recommendations have been deleted, or just weren’t asked for – either way, they’re missing. It’s also frustratingly divorced from OfS’ wider work on “treating students fairly” – I know from my own work over the decades that students tend initially to be overconfident about their knowledge of their rights, only to realise they’ve over- or underestimated it when you give them crunchier statements like these “prohibited behaviours” (which of course only seem to be “prohibited”, for the time being, in providers that will join the register in the future).

    More curious is the extent to which OfS knows all of this already. Six years ago this board paper made clear that consumer protection arrangements were failing students on multiple fronts. It knew that information available to support student choice was inadequate – insufficiently detailed about matters that actually concern students and poorly structured for meaningful comparisons between providers and courses, with disadvantaged students and mature learners particularly affected by lack of accessible support and guidance.

    It knew that the contractual relationship between students and providers remains fundamentally unequal, with ongoing cases of unclear or unfair terms that leave students uncertain about what they’re actually purchasing in terms of quality, contact time, support and costs, while terms systematically favoured providers.

    It also knew that its existing tools weren’t allowing intervention even when it saw evidence that regulatory objectives were not being delivered, and questioned whether a model requiring individual students to challenge providers for breaches was realistic or desirable.

    So many things would help – recognition of the role of student advocacy, closer adjudication, better coordination between OfS and the OIA, and a ban on NDAs beyond sexual misconduct cases are four that spring to mind – all of which should be underpinned by a proper theory of change that assumes that not all power over English HE is held in Westward House in Bristol.

    If students have concluded that complaining is futile, there are really three possible responses. One would be to conclude that the promises being made raise expectations too high. But there are so many actors specifically dedicated to never talking down a particular university, or the sector in general, that “tell them the reality” becomes a fairly futile strategy.

    Another is to try to convince them they’re wrong – better communications about rights, clearer signposting of redress routes, more prominent information about successful complaints. You obviously can’t give that job to universities.

    The third would be to ask what would need to change for complaining to actually be worthwhile. That would require processes that are genuinely quick and accessible, institutional cultures where raising concerns is welcomed rather than seen as troublemaking, meaningful remedies when things go wrong, and external oversight bodies that can intervene quickly and effectively.

    But there’s no sign of any of that. A cynic might conclude that a regulator under pressure to help providers manage their finances might need to keep busy and look the other way while modules are slashed and facilities cut.

    Why this matters more than it might seem

    Over the years, people have asserted to me that students-as-consumers, or even the whole idea of student rights, is antithetical to the partnership between students and educators required to create learning and its outcomes.

    “It’s like going to the gym”, they’ll say. “You don’t get fit just by joining”. Sure. But if the toilets are out of order or the equipment is broken, you’re not a partner in that moment – you’re a customer. The odd student will try it on. But most are perfectly capable of keeping both analogies in their head at the same time.

    In reality, it’s not rights but resignation, when it becomes systematic, that corrodes the basis on which the student-university relationship is supposed to work. If students don’t believe they can hold institutions to account, then all the partnership talk in the world becomes hollow.

    National bodies can write ever more detailed conditions about complaint processes, information provision, and student engagement. Universities can publish ever more comprehensive policies about policies and redress mechanisms. None of it matters if students have concluded that actually using those mechanisms is futile.

    There’s something profoundly upsetting about a system where three-quarters of students believe promises haven’t been kept, but most conclude there’s no point complaining because nothing will change. It speaks to a deeper breakdown than just poor communications or inadequate complaints processes.

    It’s precisely because students aren’t just consumers purchasing a service that we should worry. They’re participants in an institution that’s supposed to be about more than transactions. Universities ask students to trust them with years of their lives, substantial amounts of money (whether paid upfront by international students or through future loan repayments by domestic students), and significant life decisions about career paths and personal development.

    In return, students are supposed to be able to trust that universities will deliver what they promise, listen when things go wrong, and be held accountable when they fail to meet their end of the deal.

    The parallels with broader social contract failures are hard to miss. Just as students don’t believe complaining will change anything at their university, many young people don’t believe political engagement will change anything in society more broadly. Just as students have concluded that formal institutional processes are unlikely to deliver meaningful redress, many citizens have concluded that formal democratic processes are unlikely to deliver meaningful change.

    The learned helplessness this research documents in higher education mirrors learned helplessness – which later turns to extremism – in civic life.

    I don’t think I’ve ever heard of any uni willing to reimburse or cover if they’ve done a poor job of teaching. That’s never come to me.

    They’re right.

    Source link

  • Tutorials must persevere at unis: Opinion – Campus Review

    Monash University has announced it will replace tutorials for senior law students with seminars that encourage “active learning activities” but have significantly larger class sizes.

    Source link

  • Duty of care isn’t about mental health, it’s about preventing harm

    When people talk about a “duty of care” in higher education, the conversation almost always circles back to mental health – to counselling services, wellbeing strategies, or suicide prevention.

    It’s understandable. Those are visible, urgent needs. But the phrase “duty of care” carries far more weight than any one policy or pastoral initiative.

    It reaches into every space where universities hold power over students’ lives, and every context where harm is foreseeable and preventable.

    That misunderstanding has shaped national policy, too. When over 128,000 people petitioned Parliament for a statutory duty of care in 2023, the Government’s response was to establish the Higher Education Mental Health Implementation Taskforce – a body focused on mental health and suicide prevention.

    Its four objectives spoke volumes – boosting University Mental Health Charter sign-ups, expanding data analytics to flag students in distress, promoting “compassionate communications” to guide staff interactions with students and, where appropriate, with families, and overseeing a National Review of Higher Education Student Suicides.

    These were not bad aims – but they did not speak to the duty that had been demanded. None addressed the legal, structural, or preventative responsibilities that underpin a real duty of care.

    The Taskforce has tackled symptoms, not systems – outcomes, not obligations. By focusing on “student mental health,” the issue became more comfortable – easier to manage within existing policy frameworks and reputational boundaries.

    It allowed the sector to appear to act, while sidestepping the harder questions of legal clarity, parity, and the accountability owed to those who were harmed, failed, or lost.

    In a 2023 Wonkhe article, Sunday Blake made this point with striking clarity. “Duty of care,” she wrote, “is not just about suicide prevention.”

    Nor, by extension, is duty of care just about mental health. Universities shape students’ experiences through housing, assessment, social structures, disciplinary systems, placement arrangements, and daily communications.

    They wield influence that can support, endanger, empower or neglect. If the phrase “duty of care” is to mean anything, it must cover the full spectrum of foreseeable harm – not only the moments of crisis but the conditions that allow harm to build unseen.

    Importantly, this broader understanding of duty of care is not confined to campaigners or bereaved families. The British Medical Association has also recently called for a statutory duty of care across higher education, after hundreds of medical students reported sexual misconduct, harassment, and institutional neglect in a UK-wide survey.

    Drawing on evidence from its Medical Students Committee, the BMA argued that universities hold both knowledge and control, and therefore must bear legal responsibility for preventing foreseeable harm. Crucially, the BMA understands duty of care as a legal obligation – not a wellbeing initiative. Their intervention shows that this is not a niche debate about mental health, but a structural failure across the entire higher-education sector.

    That wider perspective is not merely theoretical. It has been tested – violently, publicly, and avoidably – in real life.

    The stabbing

    In October 2009, Katherine Rosen was a third-year pre-med student at UCLA, one of America’s leading public universities. She was attending a routine chemistry class – an ordinary academic setting – when another student, Damon Thompson, approached her from behind and stabbed her in the neck and chest with a kitchen knife. He nearly killed her.

    It was sudden. It was unprovoked. But it was not unexpected.

    Thompson had a long, documented history of paranoid delusions. University psychiatrists had diagnosed him with schizophrenia and major depressive disorder. He reported hearing voices and believed classmates were plotting against him.

    He had been expelled from university housing after multiple altercations. He told staff he was thinking about hurting others. He had specifically named Katherine in a complaint – claiming she had called him “stupid” during lab work.

    Staff knew. Multiple professionals were aware of his condition – and the risks he posed. Just one day before the attack, he was discussed at a campus risk assessment meeting. And yet – no action was taken. No warning was issued, no protection was offered, and no safeguarding plan was put in place.

    Katherine was left completely unaware. Because the university chose to do nothing.

    The legal battle

    After surviving the attack, Katherine took an action that would shape the future of student safety law in the United States – she sued her university.

    Her claim was simple but profound. UCLA, she argued, had a special relationship with her as a student. That relationship – based on enrolment, proximity, institutional control, and expectation of care – created a legal duty to protect her from foreseeable harm. And that duty, she said, had been breached.

    She wasn’t demanding perfection or suggesting universities could prevent every imaginable harm. She asked a basic question – if a student has been clearly identified as a threat, and the university knows it, doesn’t it have a legal responsibility to act before someone gets seriously hurt – or killed?

    UCLA’s response? No. The university claimed it had no legal duty to protect adult students from the criminal acts of others – even when it was aware of a risk. This wasn’t their responsibility, they said. Universities weren’t guardians, and students weren’t children. No duty, no breach, no liability.

    Their argument rested on a key principle of common law, shared by both the US and UK – that legal duties of care only arise in specific, established situations. Traditionally, adult-to-adult relationships – like those between a university and its students – did not automatically create such duties. Courts are cautious – they don’t want to impose sweeping responsibilities on institutions that may be unreasonable or unmanageable. But that argument ignores a crucial reality – the power imbalance, the structure, and the unique environment of university life.

    The judgment

    Katherine’s case wound its way through the California courts for almost ten years. At every level, the same question remained – does a university owe a duty of care to its students in classroom settings, especially when it is aware of a specific risk?

    Finally, in 2018, the California Supreme Court delivered a landmark ruling in her favour.

    The Court held – by a clear majority – that yes, universities do owe such a duty. Not universally, not in every context – but during curricular activities, and particularly when risks are foreseeable, they must take reasonable protective measures.

    The judgment clarified that a “special relationship” exists between universities and their students, based on the student’s dependence on the university for a “safe environment.” That relationship created not just moral expectations but legal ones.

    In the Court’s own words:

    Phrased at the appropriate level of generality, then, the question here is not whether UCLA could predict that Damon Thompson would stab Katherine Rosen in the chemistry lab. It is whether a reasonable university could foresee that its negligent failure to control a potentially violent student, or to warn students who were foreseeable targets of his ire, could result in harm to one of those students.

    That emphasis on warning mattered. The Court was clear that the duty it recognised did not demand extraordinary measures or perfect foresight. The minimum reasonable step UCLA could have taken — and failed to take — was to warn Katherine or put in place basic protective actions once staff knew she was a potential target. It was this failure at the most elementary level of safeguarding that brought the duty sharply into focus.

    And again:

    Colleges [universities] provide academic courses in exchange for a fee, but a college is far more to its students than a business. Residential colleges provide living spaces, but they are more than mere landlords. Along with educational services, colleges provide students social, athletic, and cultural opportunities. Regardless of the campus layout, colleges provide a discrete community for their students. For many students, college is the first time they have lived away from home. Although college students may no longer be minors under the law, they may still be learning how to navigate the world as adults. They are dependent on their college communities to provide structure, guidance, and a safe learning environment.

    This ruling was a seismic moment. It wasn’t just about Katherine – it was about thousands of other students, across hundreds of other classrooms, who could now expect, not merely hope, that their university would act when danger loomed.

    The precedent was narrow but profound

    This victory came at a cost. It took nearly a decade of litigation, immense emotional strength, and personal resilience. And even in success, the ruling was carefully limited in scope:

    … that universities owe a duty to protect students from foreseeable violence during curricular activities.

    The duty applied only to harm that was:

    • Foreseeable,
    • Tied to curricular activities, and
    • Within the university’s ability to prevent.

    It did not impose a sweeping obligation on universities to protect students in all circumstances – nor should it. But it decisively rejected the idea that universities have no duty to protect.

    This distinction – between the impossible and the reasonable – is crucial. The court did not ask universities to do the impossible. It simply expected them to act reasonably when aware of a real and specific risk to student safety. That principle sets a clear floor, not an unreachable ceiling, for institutional responsibility.

    It also highlights a broader truth – duty of care in higher education is not a binary. It is not all or nothing. A range of duties may arise depending on the setting – academic, residential, or social – or the nature of the risk. The more control a university exercises, and the more vulnerable the student, the greater the duty it may owe.

    This is not about creating impossible expectations – it is about recognising that responsibility must follow power.

    That same logic – and the emerging recognition of limited but enforceable duties – has begun to surface in UK courts. In Feder and McCamish v The Royal Welsh College of Music and Drama, a County Court held that higher education institutions have a duty of care to carry out reasonable investigations when they receive allegations of sexual assault:

    …by taking reasonable protective, supportive, investigatory and, when appropriate, disciplinary steps and in associated communications.

    Again, where institutions have knowledge and control, the law expects a proportionate response.

    But it is important to recognise just how narrow the duty was in Feder & McCamish. The College already had safeguarding procedures in place, and liability arose only because it failed to follow the process it had voluntarily adopted when students reported serious sexual assault.

    The court did not recognise any general duty to protect student welfare – it simply enforced the College’s own promises. It illustrates the limits of UK law – duties arise only in piecemeal, procedural ways, leaving large gaps in protection whenever an institution has not explicitly committed itself to a particular process, or chooses not to follow it.

    Why this story matters now

    The Rosen judgment exposes a truth that too many still miss. Duty of care in higher education is not about expanding counselling teams or implementing wellbeing charters. It’s about the structure of responsibility itself – who knows what, who can act, and who must act when risk is foreseeable.

    In Katherine Rosen’s case, mental health support for Damon Thompson already existed. What failed was the system around him – communication, coordination, and the willingness to protect others. The danger was known, the mechanisms to prevent it were available, and the decision to use them was not taken.

    That is why framing “duty of care” as a question of mental health provision misses the point. Whether the risk is psychological, physical, financial, or reputational, the same principle applies – when institutions hold both knowledge and control, they owe a duty to act with care.

    From assaults in halls to exploitation on placements, from harassment ignored to risks left unmonitored, the duty of care spans far more than mental health. It is about foreseeable harm in any form. It is about accountability that matches authority. It is about creating a culture in which doing nothing or ignoring what you know is no longer an option.

    As Parliament prepares to debate the issue once again, the Rosen case stands as a reminder that this conversation cannot stop at wellbeing. The question is not whether universities should care about students’ mental health – of course they should. The question is whether they will take responsibility for the predictable consequences of their own systems, structures, and decisions.

    Katherine Rosen’s survival – and her long legal struggle – gave the world a clearer definition of that responsibility. It showed that duty of care is not about offering sympathy after the fact, but about preventing foreseeable harm before it happens. That is the real meaning of duty of care in higher education – and it is the clarity the UK still urgently lacks.

    Source link

  • Teaching math the way the brain learns changes everything

    Far too many students enter math class expecting to fail. For them, math isn’t just a subject–it’s a source of anxiety that chips away at their confidence and makes them question their abilities. A growing conversation around math phobia is bringing this crisis into focus. A recent article, for example, unpacked the damage caused by the belief that “I’m just not a math person” and argued that traditional math instruction often leaves even bright, capable students feeling defeated.

    When a single subject holds such sway over not just academic outcomes but a student’s sense of self and future potential, we can’t afford to treat this as business as usual. It’s not enough to explore why this is happening. We need to focus on how to fix it. And I believe the answer lies in rethinking how we teach math, aligning instruction with the way the brain actually learns.

    Context first, then content

    A key shortcoming of traditional math curriculum–and a major contributor to students’ fear of math–is the lack of meaningful context. Our brains rely on context to make sense of new information, yet math is often taught in isolation from how we naturally learn. The fix isn’t simply throwing in more “real-world” examples. What students truly need is context, and visual examples are one of the best ways to get there. When math concepts are presented visually, students can better grasp the structure of a problem and follow the logic behind each step, building deeper understanding and confidence along the way.

    In traditional math instruction, students are often taught a new concept by being shown a procedure and then practicing it repeatedly in hopes that understanding will eventually follow. But this approach is backward. Our brains don’t learn that way, especially when it comes to math. Students need context first. Without existing schemas to draw from, they struggle to make sense of new ideas. Providing context helps them build the mental frameworks necessary for real understanding.

    Why visual-first context matters

    Visual-first context gives students the tools they need to truly understand math. A curriculum built around visual-first exploration allows students to have an interactive experience–poking and prodding at a problem, testing ideas, observing patterns, and discovering solutions. From there, students develop procedures organically, leading to a deeper, more complete understanding. Using a visual-first curriculum activates multiple parts of the brain, and introducing new concepts through visual context makes math more approachable and accessible by aligning with how the brain naturally learns.

    To overcome “math phobia,” we also need to rethink the heavy emphasis on memorization in today’s math instruction. Too often, students can solve problems not because they understand the underlying concepts, but because they’ve memorized a set of steps. This approach limits growth and deeper learning. Memorization of the right answers does not lead to understanding, but understanding can lead to the right answers.

    Take, for example, a third grader learning their times tables. They can memorize the answer in each square of the times table along with its corresponding multipliers, but that doesn’t mean they understand multiplication. If, instead, they grasp how multiplication works–what it means–they can figure out the times tables on their own. The reverse isn’t true. Without conceptual understanding, students are limited to recall, which puts them at a disadvantage when trying to build on previous knowledge.
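
    To make the contrast concrete, here is a minimal sketch in Python–purely an illustration of the argument above, not anything drawn from the article–showing the difference between recall and understanding: one function can only look up facts it has memorized, while the other can derive any product from what multiplication means (repeated addition).

    ```python
    # Illustrative only: contrasting memorised recall with conceptual understanding.
    # The "memorised" approach answers only the facts it has stored; the "understood"
    # approach can reconstruct any fact from the meaning of multiplication.

    MEMORISED_FACTS = {(3, 4): 12, (6, 7): 42}  # a partial, rote-learned times table

    def recall(a, b):
        """Look up a memorised fact; returns None for anything never memorised."""
        return MEMORISED_FACTS.get((a, b))

    def multiply_by_meaning(a, b):
        """Derive a * b as repeated addition: a groups of b."""
        total = 0
        for _ in range(a):
            total += b
        return total

    if __name__ == "__main__":
        print(recall(3, 4))               # 12  (memorised)
        print(recall(8, 9))               # None (never memorised)
        print(multiply_by_meaning(8, 9))  # 72  (derived from the concept)
    ```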

    Learning from other subjects

    To design a math curriculum that aligns with how the brain naturally learns new information, we can take cues from how other subjects are taught. In English, for example, students don’t start by memorizing grammar rules in isolation–they’re first exposed to those rules within the context of stories. Imagine asking a student to take a grammar quiz before they’ve ever read a sentence–that would seem absurd. Yet in math, we often expect students to master procedures before they’ve had any meaningful exposure to the concepts behind them.

    Most other subjects are built around context. Students gain background knowledge before being expected to apply what they’ve learned. By giving students a story or a visual context for the mind to process–breaking it down and making connections–students can approach problems like a puzzle or game, instead of a dreaded exercise. Math can do the same. By adopting the contextual strategies used in other subjects, math instruction can become more intuitive and engaging, moving beyond the traditional textbook filled with equations.

    Math doesn’t have to be a source of fear–it can be a source of joy, curiosity, and confidence. But only if we design it the way the brain learns: with visuals first, understanding at the center, and every student in mind. By using approaches that provide visual-first context, students can engage with math in a way that mirrors how the brain naturally learns. This shift in learning makes math more approachable and accessible for all learners.

    Source link

  • How CTE inspires long and fulfilling careers

    How CTE inspires long and fulfilling careers

    This post was originally published on iCEV’s blog and is republished here with permission.

    A career-centered education built on real experience

    One of the most transformative aspects of Career and Technical Education is how it connects learning to real life. When students understand that what they’re learning is preparing them for long and fulfilling careers, they engage more deeply. They build confidence, competence, and the practical skills employers seek in today’s competitive economy.

    I’ve seen that transformation firsthand, both as a teacher and someone who spent two decades outside the classroom as a financial analyst working with entrepreneurs. I began teaching Agricultural Science in 1987, but stepped away for 20 years to gain real-world experience in banking and finance. When I returned to teaching, I brought those experiences with me, and they changed the way I taught.

    Financial literacy in my Ag classes was not just another chapter in the curriculum–it became a bridge between the classroom and the real world. Students were not just completing assignments; they were developing skills that would serve them for life. And they were thriving. At Rio Rico High School in Arizona, we embed financial education directly into our Ag III and Ag IV courses. Students not only gain technical knowledge but also earn the Arizona Department of Education’s Personal Finance Diploma seal. I set a clear goal: students must complete their certifications by March of their senior year. Last year, 22 students achieved a 100% pass rate.

    Those aren’t just numbers. They’re students walking into the world with credentials, confidence, and direction. That’s the kind of outcome only CTE can deliver at scale.

    This is where curriculum systems designed around authentic, career-focused content make all the difference. With the right structure and tools, educators can consistently deliver high-impact instruction that leads to meaningful, measurable outcomes.

    CTE tools that work

    Like many teachers, I had to adapt quickly when the COVID-19 pandemic hit. I transitioned to remote instruction with document cameras, media screens, and Google Classroom. That’s when I found iCEV. I started with a 30-day free trial, and thanks to the support of their team, I was up and running fast. 

    iCEV became the adjustable wrench in my toolbox: versatile, reliable, and used every single day. It gave me structure without sacrificing flexibility. Students could access content independently, track their progress, and clearly see how their learning connected to real-world careers.

    But the most powerful lesson I have learned in CTE has nothing to do with tech or platforms. It is about trust. My advice to any educator getting started with CTE? Don’t start small. Set the bar high. Trust your students. They will rise. And when they do, you’ll see how capable they truly are.

    From classroom to career: The CTE trajectory

    CTE offers something few other educational pathways can match: a direct, skills-based progression from classroom learning to career readiness. The bridge is built through internships, industry partnerships, and work-based learning: components that do more than check a box. They shape students into adaptable, resilient professionals.

    In my program, students leave with more than knowledge. They leave with confidence, credentials, and a clear vision for their future. That’s what makes CTE different. We’re not preparing students for the next test. We’re preparing them for the next chapter of their lives.

    These opportunities give students a competitive edge. They introduce them to workplace dynamics, reinforce classroom instruction, and open doors to mentorship and advancement. They make learning feel relevant and empowering.

    As explored in the broader discussion on why the world needs CTE, the long-term impact of CTE extends far beyond individual outcomes. It supports economic mobility, fills critical workforce gaps, and ensures that learners are equipped not only for their first job, but for the evolution of work across their lifetimes.

    CTE educators as champions of opportunity

    Behind every successful student story is an educator or counselor who believed in their potential and provided the right support at the right time. As CTE educators, we’re not just instructors; we are workforce architects, building pipelines from education to employment with skill and heart.

    We guide students through certifications, licenses, career clusters, and postsecondary options. We introduce students to nontraditional career opportunities that might otherwise go unnoticed, and we ensure each learner is on a path that fits their strengths and aspirations.

    To sustain this level of mentorship and innovation, educators need access to tools that align with both classroom needs and evolving industry trends. High-quality guides provide frameworks for instruction, career planning, and student engagement, allowing us to focus on what matters most: helping every student achieve their full potential.

    Local roots, national impact

    When we talk about long and fulfilling careers, we’re also talking about the bigger picture:  stronger local economies, thriving communities, and a workforce that’s built to last.

    CTE plays a vital role at every level. It prepares students for in-demand careers that support their families, power small businesses, and fill national workforce gaps. States that invest in high-quality CTE programs consistently see the return: lower dropout rates, higher postsecondary enrollment, and greater job placement success.

    But the impact goes beyond metrics. When one student earns a certification, that success ripples outward—it lifts families, grows businesses, and builds stronger communities.

    CTE isn’t just about preparing students for jobs. It’s about giving them purpose. And when we invest in that purpose, we invest in long-term progress.

    Empowering the next generation with the right tools

    Access matters. The best ideas and strategies won’t create impact unless they are available, affordable, and actionable for the educators who need them. That’s why it’s essential for schools to explore resources that can strengthen their existing programs and help them grow.

    A free trial offers schools a way to explore these solutions without risk—experiencing firsthand how career-centered education can fit into their unique context. For those seeking deeper insights, a live demo can walk teams through the full potential of a platform built to support student success from day one.

    When programs are equipped with the right tools, they can exceed minimum standards. They can transform the educational experience into a launchpad for lifelong achievement.

    CTE is more than a pathway. It is a movement driven by student passion, educator commitment, and a collective belief in the value of hard work and practical knowledge. Every certification earned, every skill mastered, and every student empowered brings us closer to a future built on long and fulfilling careers for everyone.

    For more news on career readiness, visit eSN’s Innovative Teaching hub.

    Source link

  • The higher education “market” still doesn’t work

    The higher education “market” still doesn’t work

    When I was prepping for Policy Radar in October, I gave some brief thought to how students are positioned and imagined in the Post-16 Education and Skills White Paper.

    And if you’re not a fan of the student-as-consumer framing that has dominated policy for over a decade, I have bad news.

    “Good value for students” will be delivered through “quality”-related conditional fee uplifts, and better information for course choice.

    Ministers promise to “improve the quality of information for individuals” so they can pick courses that lead to “positive outcomes” – classic consumer-style transparency, outcome signalling and value propositions.

    And UCAS is leaned on as the main choice architecture for applicants, with promised work to improve the quality, prominence and timing of the information that applicants see.

    I won’t repeat here why I don’t think the student-as-consumer framing is anything like as damaging as some believe. It was the subject of the first thing I ever wrote for this site, and the arguments are well-rehearsed.

    What I am interested in here is the extent to which the protections that are supposed to exist for students as consumers are working. And to do that, I thought I’d take a little trip down memory lane.

    Consumers at the heart of the system

    Back in 2013, when reforms were being implemented in England to triple tuition fees to £9,000, there had been a very conscious effort in the White Paper that underpinned those changes to frame students as consumers.

    HEFCE was positioned as a “consumer champion for students” tasked with “promoting competition”, we learned that “putting financial power into the hands of learners makes student choice meaningful” and a partnership with Which? was to improve the presentation of course information to help students get “value for money”.

    The “forces of competition” were to replace the “burdens of bureaucracy” in “driving up the quality”, the system was to be designed to be “more responsive to student choice” as a market demand signal, the National Student Survey was positioned as a tool for consumer comparison, and the liberation of number controls that had previously “limit[ed] student choice” was to enable students to “vote with their feet”.

    Students were at the heart of the system – as long as you imagined them as consumers.

    The Office of Fair Trading (OFT) wasn’t so sure. The Competition and Markets Authority’s predecessor body had been lobbied by NUS over terms in student contracts that allowed academic sanctions for non-academic debt – and once that was resolved, it took a wider look at the “market” (for undergraduate students in England) to see whether it was working.

    It was keen to assess whether the risks inherent in applying market mechanisms to public services – information asymmetries, lock-in effects, regulatory gaps, and race-to-the-bottom dynamics – were being adequately managed.

    So it launched a call for information, and just before it got dissolved into the CMA, published a report of its findings with recommendations both for the successor body and government.

    Now, given the white paper has done little to change the framing, the question for me when re-reading it was whether any of the problems it identified are still around, or worse.

    The inquiry was structured around four explicit questions – whether students were able to make well-informed choices that drive competition, whether students were treated fairly once they got to university, whether there was any evidence of anti-competitive behaviour between higher education institutions, and whether the regulatory environment was designed to protect students while facilitating entry, innovation, and managed exit by providers.

    On that third one, it found no evidence of anti-competitive behaviour, and in the White Paper, the CMA is now said to be working with the Department for Education (DfE) to clarify how collaboration between providers can happen within the existing legal framework. It’s the others I’ve looked at in detail below.

    Enabling students to make informed choices

    The OFT’s first investigation area was whether students could make the well-informed choices that the marketisation model relied upon.

    The theoretical benefits of competition – providers competing on quality, students voting with their feet, market forces driving standards – were only going to work if consumers could assess what they were buying. Given education is a “post-experience good” that can’t be judged until after consumption, this was always going to be the trickiest part of making a market work.

    As such, it identified information asymmetry as one of three meta-themes underlying market dysfunction. Students were making life-changing, debt-incurring decisions with incomplete, misleading, inaccessible or outdated information – potentially in breach of the Consumer Protection from Unfair Trading Regulations, and leaving the entire choice-and-competition model built on sand.

    On teaching quality indicators, students couldn’t find basic information about educational experience. Graham Gibbs’ research had identified key predictors – staff-to-student ratios, funding per student, who teaches, class sizes, contact hours – yet none were readily available. Someone reviewing physics courses couldn’t tell whether they’d get eleven or 25 hours weekly.

    By 2014, the National Student Survey (NSS) was prominent but only indirectly measured teaching quality. Without observable process variables, institutions faced weak incentives to invest in teaching and students couldn’t exert competitive pressure. For the OFT, the choice mechanism was essentially decorative.

    On employment outcomes, career prospects were the major decision factor, yet DLHE tracked employment only six months post-graduation when many were in temporary roles. The 40-month longitudinal DLHE had sample sizes too small for course-level statistics – students couldn’t compare actual career trajectories. It was also worried about value-added – employment data didn’t control for intake characteristics. Universities taking privileged students looked advantageous regardless of what they actually contributed – for the OFT, that risked perverse incentives where institutions were rewarded for cream-skimming privileged students rather than adding educational value.

    It was also worried about prestige signals like entry requirements and research rankings crowding out quality signals. Presenting outcomes without contextualising intake breached a basic market principle – for the OFT, consumers should be able to assess product quality independently of customer characteristics. And on hidden costs, an NUS survey had found 69 per cent of undergraduates incurred additional charges beyond tuition – equipment hire, studio fees, bench fees – many of which were unknown when applying, raising legal concerns and practical affordability questions.

    The OFT recommended that HEFCE’s ongoing information review address coverage gaps around the learning environment including contact hours, class sizes and teaching approaches; that HEFCE and the sector focus on improving quality and comparability of long-term employment and salary data; that employment data account for institutions taking students with different backgrounds and abilities, acknowledging significant methodological challenges around controlling for prior attainment, socioeconomic background and subject mix; and that material information about additional costs be disclosed to avoid misleading omissions.

    A decade later, things are much worse. DiscoverUni replaced Unistats but core Gibbs indicators remain absent. Contact hours became a political football – piloted as a TEF metric in 2017, abandoned as unworkable, then demanded by ministers in 2022 with sector resistance fearing “Mickey Mouse degrees” tabloid headlines. Staff ratios, class sizes and teaching qualifications still aren’t standardised. The TEF provides gold/silver/bronze ratings but doesn’t drill down to process variables or subject areas predicting actual experience.

    On employment outcomes, things are marginally better but inadequate. Graduate Outcomes tracks employment at 15 months rather than six, but there’s still no standardised long-term earnings trajectory data at course level. On value-added, the situation is virtually unchanged. OfS uses benchmarks in regulation but these aren’t prominently displayed for prospective students. IFS research periodically demonstrates dramatic differences between raw and adjusted outcomes, but this isn’t integrated into official student-facing information.

    The Russell Group benefits enormously from selecting privileged students whose career prospects would be strong regardless of institutional quality. Students can’t distinguish educational quality from privilege – arguably worse given increased marketing of graduate salary data without the context that would make it meaningful. And on hidden costs, the picture is mixed and hard to assess. There is no standardised disclosure format, no regulatory requirement for prominence at application, and a real mess over wider participation costs. The fundamental issue persists.

    Most importantly, well-informed choices pretty much rely on the idea that information is predictive – whether you’re talking about higher education’s experience outputs or its outcomes, what a student is told is supposed to signal what they’ll get. But rapid contraction of courses (and modules within courses), coupled with significant changes in the labour market, all mean that prediction is becoming increasingly futile. That’s a market that, on OfT terms, doesn’t work.

    The student experience at university

    Back in 2013, the OFT identified lock-in effects as the second of three meta-themes undermining the market model.

    Once enrolled, students were effectively trapped by high switching costs, weak credit transfer, financial complications and social costs. For the regulator, that fundamentally broke the competitive mechanism that the entire reform package relied upon. If students couldn’t credibly exit poor provision, institutions faced weak pressure to maintain quality after enrolment. The threat of exit – essential to making markets work – was largely hollow. That enabled institutions to change terms, raise fees and alter courses with relative impunity.

    It found only 1.9 per cent of students switched institutions nationally. While around 90 per cent of institutions awarded credits in theory, there was no guaranteed right to transfer them, with assessment happening case by case. Information about credit transfer was technical and not user-friendly. Students faced multiple barriers including difficulty assessing credit equivalence, poor information, financial complications and the high social costs of relocating. And students leaving mid-year had to wait until the next academic year to access funding again, particularly trapping disadvantaged students in unsuitable courses.

    On fees and courses changing mid-stream, the OFT received reports of fees increasing mid-way through courses, particularly for international students – 58 per cent of institutions didn’t offer fixed tuition for international students on courses over one year. That contravened principles requiring students to know total costs upfront and potentially constituted aggressive commercial practices by exploiting students’ constrained positions.

    Course changes posed similar problems – locations changing, modules reduced, lectures moved to weekends, content changing, modules unavailable. Terms permitting key features to change without valid reason were potentially unfair.

    On misleading information, the OFT heard concerns about false or misleading information about graduate prospects, accreditation, qualification type, course content and facilities, breaching Consumer Protection from Unfair Trading Regulations. Institutions also failed to inform students of potential fee increases, course changes and mandatory additional charges – material omissions affecting informed decisions.

    On complaints and redress, resolution times were improving – the share of complaints taking over a year fell from 20 per cent in 2009 to five per cent – but 12 per cent still took six-plus months. Students often graduated before complaints were resolved. A power imbalance between students and institutions required accessible, clear pathways – yet students reported difficulty finding complaint forms, fear of complaining and being put off by bureaucratic processes. Many were unaware of the OIA or how to access it. There was no public data on complaints handled internally by institutions, meaning systemic problems remained hidden and students couldn’t make informed choices between institutions.

    The OFT didn’t make formal recommendations on credit transfer, noting that difficulties arose partly from inevitable variations in how institutions structure degrees, but highlighted that institutions appeared to lack processes for assessing credit equivalence. It implied that fees and course terms needed greater transparency and stability, that misleading information must be eliminated, that academic sanctions should only apply to academic debt, that complaint processes needed to be faster and more accessible with transparency about complaint volumes, that OIA coverage should be comprehensive, and that the structural barriers to price competition needed addressing.

    A decade later, the picture is bleak. Credit transfer has worsened substantially – despite being crucial to the Lifelong Learning Entitlement, it remains one of those old chestnuts where the collective impulse is to explain why it cannot happen. Multiple government attempts have been unsuccessful, and recent OIA complaints show students still don’t realise until too late that transferring will significantly impact loan funding or bursaries.

    On fees and courses changing, the problem persists and legal standards have tightened considerably with both Ofcom and the CMA now viewing inflation-linked mid-contract price increases as causing consumer harm. The 2024 increase to £9,535 exposed widespread non-compliance with many institutions lacking legally sound terms.

    Unilateral course changes without proper consent remain endemic. The CMA secured undertakings from UEA in 2017, recent OfS and Trading Standards interventions have identified unreasonably wide discretion in terms, and when I looked this summer, less than a third of providers had deleted industrial action from force majeure clauses.

    On misleading information, the DMCC Act has tightened requirements but enforcement is patchy and two-tier with new providers facing enhanced scrutiny while registered providers don’t face the same requirements. Students still cannot bring direct legal claims for misleading omissions.

    On complaints, in 2021 the OIA closed 2,654 complaints but failed to meet its KPI of closing 75 per cent within six months, and its influence seems to be waning – the proportion of providers implementing good practice recommendations on time has dropped from 88 per cent in 2018 to just 60 per cent recently – significantly worse than in 2014. Provider websites still include demotivating language about the OIA having no regulatory powers, and there’s still no public data on internal complaints.

    Almost every problem identified has persisted or worsened. Credit transfer remains a policy aspiration without practical implementation. Mid-course changes have intensified under financial pressure. Complaints resolution has deteriorated. Price competition remains absent. Students remain locked into courses with weak protections against opportunistic behaviour by institutions under financial strain.

    The regulatory environment

    The OFT identified regulatory-market misalignment as the third meta-theme. A framework designed for a government-funded sector was governing a student-funded market. As funding shifted, areas without direct public funding fell outside regulatory oversight, creating gaps in student protection and quality assurance. The regulatory architecture hadn’t caught up with the marketisation it was supposed to facilitate.

    It found a system that relied on ad hoc administrative arrangements layered on decades-old frameworks, lacking democratic legitimacy and a clear statutory basis. Multiple overlapping responsibilities created extreme complexity – the Regulatory Partnership Group produced an Operating Framework just to map the arrangements.

    The OFT’s recommendations were implicit – comprehensive reform with primary legislation, simplified structures, reduced uncertainty, accommodation of innovation, competitive neutrality, independent quality assurance, clear exit regimes and quality safeguards.

    Later in the decade, HERA 2017 provided primary legislation establishing the Office for Students (OfS) with statutory frameworks, attempting to address the funding model misalignment. But complexity has arguably worsened dramatically – and beyond OfS, providers and their students are supposed to navigate DfE, UKVI, HESA, QAA, OIA, EHRC, employment law, charity law, Foreign Influence Registration, Prevent and more.

    Crucially, from a student perspective, enrolling is now riskier. Student Protection Plans exist, but in a sudden insolvency the funds required are unlikely to be protected. OfS has limited teach-out quality monitoring. Plans are outdated and unrealistic – significantly worse than in 2014. With financial pressures, there’s evidence of quality degradation – staff leaving, class sizes growing, any warm body delivering modules – yet OfS has no meaningful monitoring.

    Survival strategies involve cutting contact hours, study support, module choices and learning resources. Quality floor enforcement remains weak. The OFT’s predicted race to the bottom may be materialising.

    What the OFT didn’t see coming

    The 2014 report identified market failures within domestic undergraduate provision but couldn’t anticipate how internationalisation would create entirely new categories of consumer harm. The report barely addressed international students – who by 2024 would represent over 30 per cent of the student body at many institutions.

    International student recruitment has spawned multiple interlocking problems. International postgraduate taught students face hefty non-refundable deposits. When students discover that agents pushed unsuitable courses, or accommodation falls through, they lose thousands – and hit a regulatory dead-end where the CMA refers complaints to OfS, OfS can’t update them on progress, and the OIA says applicants aren’t yet students. UK universities pay agents 5-30 per cent of first-year tuition, yet BUILA and UUKi guidelines advise against publishing commission fees. A BUILA survey found that a significant proportion of recruitment staff believe agents prioritise higher commission over best-fit programmes. A model where these “vulnerable consumers” are only around for a year, and whose immigration status is managed by the university, is not an ideal breeding ground for consumer confidence when something goes wrong.

    Fee transparency has also emerged as a distinct problem the OFT couldn’t anticipate. Universities’ fee increase policies fail to comply with DMCC drip pricing requirements, using vague language like “fees may rise with inflation” without specifying an index, amount or giving equal prominence. The DMCC Act Section 230 strengthens requirements around total cost presentation – yet widespread non-compliance exists with no enforcement.

    Time for a re-run

    David Behan’s 2024 review of OfS argued that regulating in the student interest required OfS to act as a consumer protection regulator, noting the unique characteristics of higher education as a market where students make one-off, life-changing choices that cannot easily be reversed.

    He recommended OfS be given new powers to address consumer protection issues systematically, including powers to investigate complaints, impose sanctions for unfair practices and require institutions to remedy harm.

    The Post-16 Education and Skills White Paper contains no sign of these powers. Instead, OfS has developed something called “treating students fairly” as part of its regulatory framework, which applies only to new providers joining the register, and exempts the 130-plus existing providers where the problems concentrate.

    The framework doesn’t address CAS allocation crises, agent commission opacity, accommodation affordability, the mess of participation costs information, mid-contract price increases, clauses that limit compensation for breach of contract to the total paid in fees, under- and over-recruitment, restructures that render promises meaningless, a lack of awareness of rights over changes, weak regulation on disabled students’ access, protection that doesn’t work, and a regulator that hopes students have paid their fees by credit card. The issues the OFT identified in 2014 have not been resolved – they have intensified and multiplied alongside entirely new categories of harm that never appeared in the original review. And in any case, OfS only covers England.

    There are also so many issues I’ve not covered off in detail – not least the hinterland of ancillary markets that quietly shape the “purchase”. Accommodation tie-ins and exclusive nomination deals that funnel applicants into PBSA on university letterheads. Guarantor insurance and “admin fees by another name”. Pressure-selling tactics at Clearing. Drip pricing across compulsory materials, fieldwork and resits with no total cost of ownership up front.

    International applicants squeezed by CAS timing, opaque visa-refusal refunds and agent commission structures the sector still won’t publish. And in the franchising boom, students can’t tell who their legal counterparty is, Student Protection Plans don’t bite cleanly down the chain, and complaints ping-pong between delivery partner, validator and redress schemes.

    Then there are the invisible digital and welfare layers that a consumer lens keeps missing. VLE reliability and service levels that would trigger service credits in any other sector but here are just “IT issues”. Prospectuses that promise personalised disability or welfare support without disclosing capacity limits or waiting times. Placements and professional accreditation marketed as features, then quietly downgraded with “not guaranteed” microprint when markets tighten.

    And the quiet austerity of mid-course “variation” – fewer options, thinner contact, shorter opening hours, more asynchronous delivery – with no price adjustment, no consent and no meaningful exit. If this is a market, where are the market remedies?

    What’s needed ideally is a bespoke set of student rights that recognise the distinctive features of higher education as an experience – the information asymmetries, the post-experience good characteristics, the lock-in effects, the visa and immigration entanglements and the power imbalances between institutions and individuals.

    But if that’s not coming – and the White Paper suggests it isn’t – then the market architecture remains, and with it the need for functioning regulation.

    The CMA should do its job. It should re-run the 2014 review to assess how the market has evolved over the past decade, expand its coverage to include the issues that have emerged, and use the powers that the DMCC Act has given it. By its own definitions, the evidence of harm is overwhelming.

    Source link

  • Helping students to make good choices isn’t about more faulty search filters

    Helping students to make good choices isn’t about more faulty search filters

    A YouTube video about Spotify popped into my feed this weekend, and it’s been rattling around my head ever since.

    Partly because it’s about music streaming, but mostly because it’s all about what’s wrong with how we think about student choice in higher education.

    The premise runs like this. A guy decides to do “No Stream November” – a month without Spotify, using only physical media instead.

    His argument, backed by Barry Schwartz’s paradox of choice research and a raft of behavioural economics, is that unlimited access to millions of songs has made us less satisfied, not more.

    We skip tracks every 20 to 30 seconds. We never reach the guitar solo. We’re treating music like a discount buffet – trying a bit of everything but never really savouring anything. And then going back to the playlists we created earlier.

    The video’s conclusion is that scarcity creates satisfaction. Ritual and effort (opening the album, dropping the needle, sitting down to actually listen) make music meaningful.

    Six carefully chosen options produce more satisfaction than 24, let alone millions. It’s the IKEA effect applied to music – we value what we labour over.

    I’m interested in choice. Notwithstanding the debate over what a “course” is, Unistats data shows that there were 36,421 of them on offer in 2015/16. This year that figure is 30,801.

    That still feels like a lot, given that the University of Helsinki only offers 34 bachelor’s degree programmes.

    Of course, a lot of the entries on DiscoverUni separately list “with a foundation year” versions, and there are plenty of subject combinations.

    But nevertheless, the UK’s bewildering range of programmes must be quite a nightmare for applicants to pick through – it’s just that once they’re on them, job cuts and switches to block teaching mean there’s less choice in elective pathways than there used to be.

    We appear to have a system that combines overwhelming choice at the point of least knowledge (age 17, alongside A-levels, with imperfect information) with rigid narrowness at the point of most knowledge (once enrolled, when students actually understand what they want to study and why). It’s the worst of both worlds.

    What the white paper promises

    The government’s vision for improving student choice runs to a couple of paragraphs in the Skills White Paper, and it’s worth quoting in full:

    “We will work with UCAS, the Office for Students and the sector to improve the quality of information for individuals, informed by the best evidence on the factors that influence the choices people make as they consider their higher education options. Providing applicants with high-quality, impartial, personalised and timely information is essential to ensuring they can make informed decisions when choosing what to study. Recent UCAS reforms aimed at increasing transparency and improving student choice include historic entry grades data, allowing students, along with their teachers and advisers, to see both offer rates and the historic grades of previous successful applicants admitted to a particular course, in addition to the entry requirements published by universities and colleges.

    As we see more students motivated by career prospects, we will work with UCAS and Universities UK to ensure that graduate outcomes information spanning employment rates, earnings and the design and nature of work (currently available on Discover Uni) are available on the UCAS website. We will also work with the Office for Students to ensure their new approach to assessing quality produces clear ratings which will help prospective students understand the quality of the courses on offer, including clear information on how many students successfully complete their courses.”

    The implicit theory of change is straightforward – if we just give students more data about each of the courses, they’ll make better choices, and everyone wins. It’s the same logic that says if Spotify added more metadata to every track (BPM, lyrical themes, engineer credits), you’d finally find the perfect song. I doubt it.

    Pump up the Jam

    If the Department for Education (DfE) was serious about deploying the best evidence on the factors that influence the choices people make, it would know about the research showing that more information doesn’t solve choice overload, because choice overload is a cognitive capacity problem, not an information quality problem.

    Sheena Iyengar and Mark Lepper’s foundational 2000 study in the Journal of Personality and Social Psychology found that when students faced 30 essay topic options versus six options, completion rates dropped from 74 per cent to 60 per cent, and essay quality declined significantly on both content and form measures. That’s a 14 percentage point completion drop from excessive choice alone, and objectively worse work from those who did complete.

    The same paper’s famous jam study showed customers were ten times more likely to buy when presented with six flavours rather than 24, even though more people initially stopped at the extensive display – around 60 per cent of passers-by, compared with 40 per cent at the smaller one. More choice is simultaneously more appealing and more demotivating. That’s the paradox.

    CFE Research’s 2018 study for the Office for Students (back when providing useful research for the sector was something it did) laid this all out explicitly for higher education contexts.

    Decision making about HE is challenging because the system is complex and there are lots of alternatives and attributes to consider. Those considering HE are making decisions in conditions of uncertainty, and in these circumstances, individuals tend to rely on convenient but flawed mental shortcuts rather than solely rational criteria. There’s no “one size fits all” information solution, nor is there a shortlist of criteria that those considering HE use.

    The study found that students rely heavily on family, friends, and university visits, and many choices ultimately come down to whether a decision “feels right” rather than rational analysis of data. When asked to explain their decisions retrospectively, students’ explanations differ from their actual decision-making processes – we’re not reliable informants about why we made certain choices.

    A 2015 meta-analysis by Chernev, Böckenholt, and Goodman in the Journal of Consumer Psychology identified the conditions under which choice overload occurs – it’s moderated by choice set complexity, decision task difficulty, and individual differences in decision-making style. Working memory capacity limits humans to processing approximately seven items simultaneously. When options exceed this cognitive threshold, students experience decision paralysis.

    Maximiser students (those seeking the absolute best option) make objectively better decisions but feel significantly worse about them. They selected jobs with 20 per cent higher salaries yet felt less satisfied, more stressed, frustrated, anxious, and regretful than satisficers (those accepting “good enough”). For UK applicants facing tens of thousands of courses, maximisers face a nearly impossible optimisation problem, leading to chronic second-guessing and regret.

    The equality dimension is especially stark. Bailey, Jaggars, and Jenkins’s research found that students in “cafeteria college” systems with abundant disconnected choices “often have difficulty navigating these choices and end up making poor decisions about what programme to enter, what courses to take, and when to seek help.” Only 30 per cent completed three-year degrees within three years.

    First-generation students, students from lower socioeconomic backgrounds, and students of colour are systematically disadvantaged by overwhelming choice because they lack the cultural capital and family knowledge to navigate it effectively.

    The problem once in

    But if unlimited choice at entry is a cognitive overload problem, what happens once students enrol should balance that with flexibility and breadth. Students gain expertise, develop clearer goals, and should have more autonomy to explore and specialise as they progress.

    Except that’s not what’s happening. Financial pressures across the sector are driving institutions to reduce module offerings – exactly when research suggests students need more flexibility, not less.

    The Benefits of Hindsight research on graduate regret says it all. A sizeable share of applicants later wish they’d chosen differently – not usually to avoid higher education, but to pick a different subject or provider. The regret grows once graduates hit the labour market.

    Many students who felt mismatched would have liked to change course or university once enrolled – about three in five undergraduates and nearly two in three graduates among those expressing regret – but didn’t, often because they didn’t know how, thought it was too late, or feared the cost and disruption.

    The report argues there’s “inherent rigidity” in UK provision – a presumption that the initial choice should stick despite evolving interests, new information, and labour-market realities. Students described courses being less practical or less aligned to work than expected, or modules being withdrawn as finances tightened. That dynamic narrows options precisely when students are learning what they do and don’t want.

    Career options become the dominant reason graduates cite for wishing they’d chosen differently. But that’s not because they lacked earnings data at 17. It’s because their interests evolved, they discovered new fields, labour market signals changed, and the rigid structure gave them no way to pivot without starting again.

    The Competition and Markets Authority now explicitly identifies as misleading actions “where an HE provider gives a misleading impression about the number of optional modules that will be available.” Students have contractual rights to the module catalogue promised during recruitment. Yet redundancy rounds repeatedly reduce the size and scope of optional module catalogues for students who remain.

    There’s also an emerging consensus from the research on what actually works for module choice. An LSE analysis found that adding core modules within the home department was associated with higher satisfaction, whereas mandatory modules outside the home department depressed it. Students want depth and coherence in their chosen subject. They also value autonomous choice over breadth options.

    Research repeatedly shows that elective modules are evaluated more positively than required ones (autonomy effects), and interdisciplinary breadth is associated with stronger cross-disciplinary skills and higher post-HE earnings when it’s purposeful and scaffolded.

    What would actually work

    So what does this all suggest?

    As I’ve discussed on the site before, at the University of Helsinki – Finland’s flagship institution with 40,000 students – there are 32 undergraduate programmes. Within each programme, students must take 90 ECTS credits in their major subject, while a further 75 ECTS credits must come from other programmes’ modules. That’s 42 per cent of the degree as mandatory breadth, but students choose which modules from clear disciplinary categories.

    The structure is simple – six five-credit introductory courses in your subject, then 60 credits of intermediate study with substantial module choice, including proseminars, thesis work, and electives. Add 15 credits for general studies (study planning, digital skills, communication), and you’ve got a degree. The two “modules” (what we’d call stages) get a single grade each on a one-to-five scale, producing a simple, legible transcript.
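
    For readers who want to check the arithmetic, here’s a minimal sketch of how the 180-credit structure described above breaks down. The figures are simply the ones quoted in this piece (90 ECTS in the major, 75 ECTS of breadth, 15 ECTS of general studies) – nothing here is official university data.

    ```python
    # A quick check of the Helsinki-style degree arithmetic quoted above.
    # Figures are those cited in the text, not official university data.

    credits = {
        "major subject": 90,             # six 5-credit intro courses + 60 intermediate credits
        "breadth (other programmes)": 75,
        "general studies": 15,           # study planning, digital skills, communication
    }

    total = sum(credits.values())        # 180 ECTS for a bachelor's degree

    for component, ects in credits.items():
        share = 100 * ects / total
        print(f"{component}: {ects} ECTS ({share:.0f}% of the degree)")

    # Breadth works out at 75/180, roughly 42 per cent - matching the figure in the text.
    ```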

    Helsinki runs this on a 22.2 to one staff-student ratio, significantly worse than the UK average, after Finland faced €500 million in higher education cuts. It’s not lavishly resourced – it’s structurally efficient.

    Maynooth University in Ireland reduced CAO (their UCAS) entry routes from about 50 to roughly 20 specifically to “ease choice and deflate points inflation.” Students can start with up to four subjects in year one, then move to single major, double major, or major with minor. Switching options are kept open through first year. It’s progressive specialisation – broad exploration early when students have least context, increasing focus as they develop expertise.

    Also elsewhere on the site, Técnico in Lisbon – the engineering and technology faculty of the University of Lisbon – rationalised to 18 undergraduate courses following a student-led reform process. Those 18 courses contain hundreds of what the UK system would call “courses” via module combinations, but without the administrative overhead. They require nine ECTS credits (of 180) in social sciences and humanities for all engineering programmes because “engineers need to be equipped not just to build systems, but to understand the societies they shape.”

    Crucially, students themselves pushed for this structure. They conducted structured interviews, staged debates, and developed reform positions. They wanted shared first years, fewer concurrent modules to reduce cognitive load, more active learning methods, and more curricular flexibility including free electives and minors.

    The University of Vilnius allows up to 25 per cent of the degree as “individual studies” – but it’s structured into clear categories – minors (30 to 60 credits in a secondary field, potentially leading to double diploma), languages (20-plus options with specific registration windows), interdisciplinary modules (curated themes), and cross-institution courses (formal cooperation with arts and music academies). Not unlimited chaos, just structured exploration within categorical choices.

    What all these models share is a recognition that you can have both depth and breadth, structure and flexibility, coherence and exploration – if you design programmes properly. You need roughly 60 to 70 per cent core pathway in the major for depth and satisfaction, 20 to 30 per cent guided electives organised into three to five clear categories per decision point, and maybe 10 to 15 per cent completely free electives.
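
    As a rough illustration only – the percentage ranges are the ones suggested above, applied to a notional 180-credit degree, not a prescription from any of the institutions mentioned – the split might look something like this:

    ```python
    # Illustrative only: applying the suggested proportions to a notional 180-credit degree.
    # The ranges come from the paragraph above; the credit figures are just one way to hit them.

    TOTAL_ECTS = 180

    proposed_split = {
        "core pathway in the major": (0.60, 0.70),              # depth and coherence
        "guided electives (3-5 clear categories)": (0.20, 0.30),
        "completely free electives": (0.10, 0.15),
    }

    for component, (low, high) in proposed_split.items():
        print(f"{component}: {low * TOTAL_ECTS:.0f}-{high * TOTAL_ECTS:.0f} ECTS "
              f"({low:.0%}-{high:.0%})")
    ```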

    The UK’s subject benchmark statements, if properly refreshed (and consolidated down a bit) could provide the regulatory infrastructure for it all. Australia undertook a version of this in 2010 through their Learning and Teaching Academic Standards project, which defined threshold learning outcomes for major discipline groupings through extensive sector consultation (over 420 meetings with more than 6,100 attendees). Those TLOs now underpin TEQSA’s quality regime and enable programme-level approval while protecting autonomy.

    Bigger programmes, better choice

    The white paper’s information provision agenda isn’t wrong – it’s just addressing the wrong problem at the wrong end of the process. Publishing earnings data doesn’t solve cognitive overload from tens of thousands of courses, quality ratings don’t help students whose interests evolve and who need flexibility to pivot, and historic entry grades don’t fix the rigidity that manufactures regret.

    What would actually help is structural reform that the international evidence consistently supports – consolidation to roughly 20 to 40 programmes per institution (aligned with subject benchmark statement areas), with substantial protected module choice within those programmes, organised into clear categories like minors, languages, and interdisciplinary options.

    Some of those groups of individual modules might struggle to recruit if they were whole courses – think music and languages. They may well sustain research-active academics if they can exist within broader structures (and across Europe, they do). Fewer, clearer programmes at entry when students have least context, and greater, structured flexibility during the degree when students have the expertise to choose wisely.

    The efficiency argument is real – maintaining thousands of separate course codes, each with approval processes, quality assurance, marketing materials, and UCAS coordination is absurd overhead for what’s often just different permutations of the same modules. See also hundreds of “programme leaders” each having to be chased to fill a form in.

    Fewer programme directors with more module convenors beneath them is far more rational. And crucially, modules serve multiple student populations (what other systems would call majors and minors, and students taking breadth from elsewhere), making specialist provision viable even with smaller cohorts.

    The equality case is compelling – guided pathways with structured choice demonstrably improve outcomes for first-in-family students, students of colour, and low-income students, populations that regulators are charged with protecting. If current choice architecture systematically disadvantages exactly these students, that’s not pedagogical preference – it’s a regulatory failure.

    And the evidence on what students actually want once enrolled validates it all – they value depth in their chosen subject, they want autonomous choice over breadth options (not forced generic modules), they benefit from interdisciplinary exposure when it’s purposeful, and they need flexibility to correct course when their goals evolve.

    The white paper could have engaged with any of this. Instead, we get promises to publish more data on UCAS. It’s more Spotify features when what students need is a curated record collection and the freedom to build their own mixtape once they know what they actually like.

    What little reform is coming is informed by the assumption that if students just had better search filters, unlimited streaming would finally work. It won’t.

    Source link