Category: Students

  • How cost of living is influencing UK student mobility

    Drive along any motorway in September and you will see car after car full of duvets, pots and pans, and clothes as students head off to pastures new. I remember my own experience, crossing the Severn Bridge with the bedding on the front seat of my Fiesta muffling Oasis’ Definitely Maybe.

    This stereotypical view of a literal journey into higher education isn’t the case for everyone, however. In fact, far more students live at home during their studies than you may think.

    The UCAS application asks students whether they intend to live at home. In 2024, 30 per cent of UK 18-year-olds said they planned to live at home during their studies – up from 25 per cent in 2019 and just 21 per cent in 2015.

    However, when we look beyond the headline numbers, over half of the most disadvantaged students (IMD Q1) live at home during their studies, compared to fewer than one in five of the least disadvantaged (IMD Q5). Regional distribution will have an impact here, particularly London.

    Scottish students are more likely to live at home during their studies. On a recent visit to Edinburgh, all the students I met spoke with excitement about their plans to study at their chosen university within the city. By contrast, Welsh domiciled students are the least likely to live at home during their studies.

    In London, 52 per cent of 18-year-olds progress to HE – with around half of those students staying in London, making it unsurprising that the capital sees the highest proportion of live-at-home students in England.

    Cost of living pressures

    Cost of living is undoubtedly influencing student choice. At the January equal consideration deadline, UCAS saw a 2.1 per cent increase in the number of UK 18-year-old applicants – a record high. However, regular readers of Wonkhe will know this also represents a decline in the application rate – the proportion of the 18-year-old population applying to HE, and UCAS insight increasingly points to the cost of living playing a role.

    Our latest survey insight suggests that 43 per cent of pre-applicants feel they are less likely to progress to HE due to cost-of-living pressures, up from 24 per cent in 2023 – although their commitment to going to university remains high.

    Financial support is also of growing importance to students when it comes to deciding where to study. While finding the perfect course content was the most important factor when shortlisting universities (49 per cent), the financial support available while studying (such as a scholarship or bursary) was a close second (46 per cent). Specific cost-of-living support offered by universities was third (34 per cent).

    The availability of support with the cost of living has risen in relative importance as a factor when shortlisting universities from 12th in 2022 to 3rd in 2024 – a significant shift, which suggests a change in student mindset. There have also been large changes in rank importance of “universities that are close to home” from 9th to 4th, “universities with low-cost accommodation” from 13th to 7th and “universities I can attend but still live with my parents” from 16th to 11th.

    Source: Potential applicants for 2025 entry, 1,023 UK respondents, Dec 2024–Jan 2025

    It isn’t just at the point of application where we see the cost of living impacting choice. In 2024, UCAS saw 43,000 students decline the place they were holding in favour of an alternative institution or subject – making this the largest group of students using Clearing.

    This is not a spur-of-the-moment decision, with 52 per cent having already decided to do this prior to receiving their results and a further one in five considering it based on their results.

    When asked what drove their decision, 23 per cent told us they had a change in personal circumstances and 17 per cent wanted to live somewhere cheaper. We also know this impacts all cohorts of students – 19 per cent of international students who don’t accept a university offer through UCAS tell us they have found a more attractive financial offer elsewhere.

    However, the primary reason that students use Decline My Place is linked to the course, with 31 per cent changing their mind about the subject they wish to study.

    Support measures

    It’s clear that cost of living and financial support are key factors influencing student choice, and so we must ensure this information is easily accessible to and understood by students.

    Students tell us they’d like more practical information about student discounts, financial support packages or bursaries/scholarships. UCAS will shortly be launching a scholarships and bursary tool to promote these opportunities to students.

    Around half of offer holders in 2024 recalled receiving information about cost of living support. This presents a timely opportunity for any university staff working in marketing, recruitment or admissions to ensure information about financial support is easy to find on their website, along with information about timetabling to help students understand how they may be able to balance work and study commitments.

    There will be certain groups of students that are even more acutely impacted by cost of living challenges. Last cycle saw a record number of students in receipt of Free School Meals – 19.9 per cent – enter HE. Whilst it is only a small part of the puzzle, UCAS has removed the application fee for these students.

    Cost of living pressures are likely to persist, with students continuing to assess the value of HE in this context. The sector should continue to highlight the benefits of university study as a vehicle for social mobility, along with the graduate premium – the higher earnings graduates typically achieve compared to their non-graduate peers. But we also need to make it clearer how HE of all forms remains accessible – from funds for travel to open days, to in-study commuter breakfasts, hardship funds, cost of living support, and high-quality careers guidance to support graduate employability.

    This article is published in association with UCAS. It forms part of our ongoing series on commuter students – you can read the whole series here

  • A review of student suicides suggests that standards are now necessary

    For years, bereaved families have fought for answers – and change – after losing their children to suicide at university.

    When life is difficult, Samaritans are here – day or night, 365 days a year. You can call them for free on 116 123, email them at jo@samaritans.org, or visit http://www.samaritans.org to find your nearest branch.

    Arguably the most high-profile have been Bob and Margaret Abrahart, who led this charge after their daughter Natasha died in April 2018 at the University of Bristol.

    Despite her severe social anxiety, Natasha was required to give oral presentations that filled her with dread, and in 2022, a judge ruled that Bristol had discriminated against Natasha under the Equality Act by not making reasonable adjustments.

    But he did not find the university owed a general duty of care to avoid causing psychiatric harm – noting that:

    …if a relevant duty of care did exist… there can be no doubt that the university would have been in breach.

    That distinction prompted the Abraharts and other bereaved families to launch the “#ForThe100” campaign, named after the estimated annual student suicide toll. Their petition for a statutory duty of care gathered over 128,000 signatures and triggered a Westminster Hall debate in 2023, where MPs across parties voiced support.

    The skills minister at the time, Robert Halfon, rejected the call for statutory change. Instead, as part of a higher education mental health implementation taskforce, he announced an independent review of student suicide deaths – a “watching brief” approach that effectively deferred the question of legal responsibility while monitoring the sector.

    The review has now been published – and it reveals a catalogue of missed opportunities, systematic failures, and inadequate protections for vulnerable students.

    It also evidences the patterns identified by campaigners for years – poor monitoring of disengagement, communication silos between academic and support services, inadequate training for staff, and safety concerns in university accommodation.

    The big question now is whether the evidence will drive the legal and cultural shifts needed to protect students and prevent future deaths – or whether it will become yet another well-intentioned PDF on the ever-growing pile of guidance that relies on voluntary implementation.

    A review of student suicides

    The National Confidential Inquiry into Suicide and Safety in Mental Health (NCISH) team from the University of Manchester was commissioned to conduct the review. Their approach was methodical – all higher education institutions in England were asked to submit redacted serious incident reports for suspected suicides and serious self-harm incidents occurring during the 2023-2024 academic year.

    The response was robust. Of the 115 Universities UK members, 113 (98 per cent) provided a nominated contact, and 110 (96 per cent) responded with information about serious incidents during the academic year. That does at least suggest that universities recognise the importance of addressing student suicide, even if some remain hesitant about legal frameworks for doing so.

    In total, universities reported 107 suspected suicide deaths and 62 incidents of non-fatal self-harm during the 2023-2024 academic year. Of these, 104 serious incident reports (79 for suspected suicides and 25 for self-harm) were submitted to NCISH for analysis. As such, it is the largest collection of detailed individual-level data on student suicide ever compiled in the UK.

    The team then analysed those reports against established standards, including both the Universities UK/PAPYRUS/Samaritans guidance for conducting serious incident reviews, and NCISH’s own 10 standards for investigating serious incidents. They examined student characteristics, identified risk factors, evaluated the quality of investigations, and assessed the recommendations and action plans arising from these reviews.

    Pressure and disengagement

    In 38 per cent of cases, students were experiencing academic problems or pressures. These ranged from exam-related stress (10 per cent) to anxiety about falling behind or meeting deadlines (19 per cent).

    Nearly a third (32 per cent) of reports identified evidence of non-attendance – a critical warning sign that was often met with inadequate response, if it was noticed at all. The most common intervention was an automated email from administrators, rather than proactive personal outreach.

    The report argues that this represents a significant missed opportunity for intervention – calling for students who are struggling academically to be recognised as potentially at risk, with an enhanced focus on providing a supportive response, as well as increased awareness of support at key pressure points in the academic calendar, especially during exam periods.

    The review also found that while 21 per cent of students were or had been part of “support to study” procedures or equivalent, there were clear instances where a cause for concern had not been appropriately escalated.

    The report identifies a need for additional or more robust processes for monitoring student engagement and non-attendance, including recommendations to review attendance triggers, the development of consistent approaches to responding to non-attendance, and the implementation of earlier interventions when disengagement is identified.

    The timing of incidents reinforces the connection to academic pressure, with peaks occurring in March and May – coinciding with assessment and exam periods – and notably fewer incidents during holiday periods, suggesting that academic stressors play a significant role in student distress.

    One thing I’d add here is that it really shouldn’t be a given that students in the UK all progress and complete at the same pace – that we are the country in the OECD whose students complete the fastest and drop out the least has some obvious downsides that the LLE, and a large dose of culture change, really ought to tackle.

    The other thing worth considering is culture. In our work on student health last month, academic culture popped up as a significant but often overlooked determinant of student health in survey responses, with students describing patterns of overwork, presenteeism, and a “meritocracy of difficulty” that rewards suffering over learning outcomes.

    Students’ comments revealed how unhealthy work patterns are normalised within academic environments, with concerns about overwhelming assessment deadlines, high-stakes exams disadvantaging students with health conditions, and the glorification of struggle across disciplines. Students also highlighted the disconnect between wellbeing messaging and impossible workloads, articulating a desire for intellectually challenging environments that don’t lead to burnout – as well as both personal and systems empathy.

    Their solutions included workload mapping, identifying assessment bottlenecks, flexible assessment strategies offering multiple ways to demonstrate learning, staff training on setting healthy work boundaries, health impact assessments for curriculum design, accessibility-focused policies, clear distinctions between challenging content and unnecessary stress, student workload panels with authority to flag unsustainable demands, and revised attendance policies to discourage presenteeism during illness. They are all worth considering – as are projects like the one referenced here.

    Mental health, neurodiversity and support services

    Nearly half (47 per cent) of reports identified mental health difficulties as a factor prior to the incident, with 31 per cent noting diagnosed mental health conditions. Most commonly, these were depression and anxiety disorders (20 per cent).

    Significantly, 30 per cent of reports described a diagnosis or suspected diagnosis of neurodiversity, including attention deficit hyperactivity disorder (ADHD), autism spectrum disorder, or dyslexia. Of these neurodivergent students, only 14 described reasonable adjustments or support/inclusion plans tailored to their needs, and 12 per cent also had a mental health diagnosis. That suggests big gaps in support for students with overlapping mental health and neurodevelopmental needs.

    Especially concerning is that 70 per cent of students were known to university support services before their death, most often wellbeing services. These weren’t cases where students were suffering in silence – they had actively reached out for help within the university system. In many cases, students had multiple touchpoints with support services, but there were often gaps in follow-up, inadequate assessment of risk severity, and insufficient intensity or continuity of support.

    It’s partly the silo problem again. The report identified problems with information sharing in 24 per cent of cases, where critical details about a student’s mental health were not communicated between clinical, pastoral, and academic staff. Communication breakdowns meant that while a student might disclose suicidal thoughts to a counsellor, personal/academic tutors remained unaware of the severity of their situation, continuing to apply normal academic pressures.

    Similarly, when academic staff noticed concerning changes in attendance or performance, this information wasn’t consistently shared with mental health professionals who could have intervened.

    The review specifically recommends improving information sharing internally and externally but notes that (often unfounded) concerns about confidentiality prevent effective coordination – leaving vulnerable students to navigate fragmented support systems and tell their story repeatedly to different university staff. What I’d note is that recommendations and guidance on this have been around for years now – universities clearly need to go further, and faster.

    And the realities of the funding system, the state of the sector’s finances and the resultant staff-student ratios in plenty of departments also need an honest conversation. If it’s noticing that matters, other students also need to be in the mix as well as academic staff.

    Location and transition

    Where location was known, 23 per cent of incidents occurred in university-managed accommodation – suggesting serious safety concerns in spaces directly controlled by institutions. The review specifically recommends reviewing the safety of university-managed accommodation, including physical safety, high-risk locations, the criteria for welfare checks, and signposting for support, particularly out-of-hours.

    I’d suggest that this should probably reflect, via the codes of practice the firms will be required to join to escape the regulation in the Renters’ Rights Act, standards in private halls too – although that would, of course, require a modicum of coordination between DfE and the Ministry of Housing, Communities and Local Government.

    Almost three-quarters (73 per cent) of students were undergraduates, with over a quarter (27 per cent) in their first year of undergraduate studies, backing up previous research that has indicated that the first year represents a particularly vulnerable transition period – often leaving home, managing independent living, forming new relationships, and adapting to university-level academic demands.

    The review suggests these changes create a perfect storm of risk factors – first-year students often lack established campus support networks while losing daily contact with home support systems, may struggle with imposter syndrome or academic uncertainty, and frequently hesitate to seek help, believing their struggles are just “normal” adjustment issues.

    The problem is then compounded by institutional factors – with no prior academic record to contextualise changes in engagement and larger first-year class sizes, warning signs frequently go unnoticed by staff. The review specifically calls for enhanced induction processes and early intervention systems for first-years, recognising that proactive support during this critical transition period could significantly reduce suicide risk.

    I remain convinced that near-universal systems of group social mentoring found on the continent could have a major role to play here – they’re even in the legislation in Finland – but I also wonder whether the other notable OECD comparison, that (together with Belgium) we have pretty much the youngest bachelor’s entrants in the world, could also do with some significant thought.

    DfE has, of course, had a previous run at coordinating a national piece of work on transition support and standards – but the less said about that the better. We almost certainly need something more consistent, substantial and credit-bearing – I sketched out what that could look like here.

    International students

    International students accounted for nearly a quarter (24 per cent) of all submitted reports – a disproportionately high percentage given their representation in the overall student population. The overrepresentation could suggest additional challenges, including potential cultural and language barriers, social isolation, and distance from established support networks.

    In many ways, they face much of what home students face – but with unfamiliar academic and cultural expectations, (often) studying in a second language, complex visa requirements, and significant financial pressures from higher fees and limited work rights piled on top. Many also experience intense pressure to succeed from family members who may have made substantial sacrifices to fund their education.

    The review found that cultural differences significantly impacted how international students experienced and expressed mental health difficulties. In some cases, cultural stigma around mental illness prevented students from seeking help, while in others, language barriers made it difficult to effectively communicate distress to university staff. The report also noted particular difficulties with international students who were isolated within their own cultural groups, making it harder for wider university systems to identify warning signs.

    Despite the overrepresentation of international students in suicide cases, the review found minimal evidence of culturally sensitive support services or targeted outreach. Many providers simply applied a one-size-fits-all approach to wellbeing support that failed to account for diverse cultural understandings of mental health.

    The review specifically recommends that universities develop more culturally competent services and proactive engagement strategies for international students – particularly those from countries with significant cultural differences from the UK.

    There’s a reason why new Office for Students Condition E6 on harassment and sexual misconduct specifically requires approaches that are tailored to a provider’s specific student population, and that systems and processes to help prevent and respond to harassment and sexual misconduct are accessible to international students. It’s true on this issue too.

    Investigation quality and university response

    Following a death by suicide, the review found significant gaps in postvention support – the care provided to those affected. While 41 per cent of reports showed evidence of support for peers following a suicide, there was significantly less support for affected staff (18 per cent) and bereaved families (9 per cent).

    The review recommends that anyone affected by a student’s death by suicide should be offered or signposted to appropriate support – acknowledging that effective postvention is itself a critical component of preventing further deaths.

    The review then found wide variation in how universities investigate student deaths and respond to them. In three-quarters (76 per cent) of all reviewed cases, families were not involved in any aspect of the suicide investigation process. While 72 per cent of reports indicated that the family was contacted after the death to offer condolences, only 11 per cent of families contributed to or were offered involvement in the investigation process. And just 6 per cent of reports had been shared with the families.

    As the report notes, families provided:

    …moving accounts of feeling excluded from the process of finding out what happened to their loved ones, and some had a perception that the university was evasive and reluctant to answer important and painful questions.

    The exclusion of those who knew the student best not only denies families closure but also prevents universities from gaining valuable insights about circumstances outside the institution.

    The review also raised significant questions about who conducts these investigations and their qualifications to do so. In 35 per cent of reports, information on the lead reviewer was not available. Only 13 per cent explicitly stated that the lead reviewer had no prior involvement with the student – a fundamental principle of independent investigation.

    There was also little evidence that those conducting the reviews had specific training or expertise in suicide prevention or investigation. As the report notes:

    …completing a serious incident review is an additional strategic-level responsibility, with no status of its own within someone’s job role.

    Most reviews focused narrowly on the university’s own processes and records, rarely seeking information from external sources. Despite 60 per cent of reports indicating the student had contact with other agencies (such as healthcare providers), only 6 per cent of these included contributions from those organisations in the review process.

    The gathering of information “did not generally extend to records and contributions from other agencies” such as primary care, secondary mental health care, and the criminal justice system. This was true even where the university was aware that those agencies had played a critical role in the student’s care. This inward-looking approach created significant knowledge gaps that could have been filled with input from families, health providers, and other external sources.

    The report also notes that there were examples of gaps in the chronology with little or no information between the student’s last contact with the HE provider and the incident. Without a comprehensive understanding of the student’s circumstances, universities can’t effectively identify all factors contributing to suicide risk.

    This won’t come as a surprise to anyone working in HE, but while 79 per cent of reports identified learning to help prevent future incidents (generating almost 300 recommendations in total), the implementation process was often weak. Over half (53 per cent) identified specific actions, but 18 per cent of these lacked clear owners and 40 per cent had no timescales for delivery.

    That raises questions about whether these recommendations are ever fully implemented or simply filed away. Learning points were “inconsistently assigned or scheduled,” with a lack of institutional commitment to following through on identified improvements. Without accountability mechanisms and clear follow-up processes, there’s little assurance that these recommendations will lead to meaningful change.

    Learning from tragedy

    The review makes 19 specific recommendations across four categories – safety concerns, suicide prevention within university systems, amendments to guidance, and wider system messages. They are comprehensive – but they largely represent guidance rather than enforceable standards.

    The first recommendation, for example, calls for “mental health awareness and suicide prevention training” to be available for all student-facing staff, with consideration for making such training mandatory – acknowledging the critical role staff play in identifying and responding to students in distress.

    But the report stops short of recommending that training be required – using the softer language of “consideration” for mandatory training. It’s a recommendation I’ve read hundreds of times over the years, and given the financial state the sector is in, and the redundancies underway, it is hard to believe it will happen without a requirement that it does.

    That’ll be why OfS is now requiring it in E6 for harassment and sexual misconduct, and why that includes a line on “no saying you can’t afford it – if you can’t afford it, don’t provide HE”. Something similar should surely apply here.

    Meanwhile recommendations 3 and 4 address academic pressures, calling for students struggling academically to be “recognised as potentially at risk” and for increased support at key academic calendar points. They are a shift toward viewing academic processes not just as educational tools but as potential risk factors for mental health – a perspective that aligns with campaigners’ arguments for a duty of care that encompasses the whole student experience.

    Although as I said above, some system-structural issues relating to age and pace ought to be on the list inside DfE’s reform plans for proper consideration.

    While it stops short of recommending a duty of care, it does call for “a duty of candour” to be introduced to the HE sector, setting out organisational responsibilities to be open and transparent with families after a suspected suicide. That would include a duty to provide information on what happened, at the earliest point.

    As it stands, Keir Starmer promised that such a duty, to apply to public authorities including universities, would appear by 15 April – the anniversary of the Hillsborough disaster. But it’s a deadline that was missed – with rumours that officials have been attempting to water it down and questions over whether it would apply in internal investigations as well as statutory inquiries. A decision will need to come soon.

    Mark Shanahan, on behalf of the LEARN Network, argues that universities are learning communities, but it is unclear from the research whether the learning leads to change. If nothing else, they’re supporting the idea that the exercise becomes annual:

    In some ways, it’s a vindication to see the concerns of bereaved families confirmed, when many feel so excluded when they try to find out what happened to their sons and daughters. Without families’ strength and persistence this report would not have been commissioned. We need to see it repeated annually if lessons are to be learned over the longer term.

    Given that so few University Mental Health Charter Awards have been achieved (just two in 2025), the network also argues that a legal duty of care owed by universities to students, delivered by statute and/or regulation, is the only way to accelerate the changes advocated in this report.

    Duty of care?

    The review comes, of course, amid ongoing confusion about what a “duty of care” would actually mean in a university context. The current government position, articulated by DfE minister Janet Daby, is that “a duty of care in HE may arise in certain circumstances” which “would be a matter for the courts to decide.”

    On BBC News, asked why a legal duty of care had not been introduced, skills minister Jacqui Smith said that “we do think that universities have a general duty of care to their students”, but that there were “some legal challenges”:

    We’ll be absolutely clear with universities that this is their responsibility. We’ve made resource available and we will continue to challenge them to deliver that.

    Being “absolutely clear” means establishing a legal duty and then asking your regulator to proactively monitor compliance with it – not a combo of endless finger wagging and a charter whose evaluation report found universities where mental health and wellbeing efforts were ad hoc, siloed, had limited proactive outreach, featured inconsistent and sometimes contradictory responses across departments, and lacked a strategic approach to mental health in curriculum design, community building and risk management.

    And “resource” probably doesn’t mean the paltry £5 per student in the grant letter.

    The position on duty of care contrasts sharply with the certainty provided in other contexts – such as the duty of care employers owe to their employees, or that schools owe to their pupils – and means students enter university without clarity on what protections they can expect, while universities operate without clear standards for their responsibilities.

    As Bob Abrahart argues:

    …students and universities need instead to know where they stand.

    The review signals pretty clearly that the ambiguity has real consequences – inconsistent practices, missed warning signs, and preventable tragedies. Valuable recommendations will mean nothing if their implementation remains voluntary without a statutory framework.

    And as I’ve argued before on the site, when students have rights and know their rights, they’re better able to contribute to decent conversations about how they might be implemented practically. The rest is all “in an ideal world”, and we’re very much not in an ideal world right now.

    A more comprehensive statutory duty of care would establish clear standards for prevention, requiring universities to take reasonable steps to avoid foreseeable harm. It would not, as opponents suggest, treat students as children or make universities responsible for all aspects of student wellbeing. It would provide clarity on the reasonable expectations students can have of their institutions, and ensure consistency across the sector.

    The review has shown where the problems lie – now ministerial courage is needed to implement solutions that are universally applied. The 107 students whose deaths formed the basis of this review deserved better. Future students deserve the protection of clear, enforceable standards – the kind their university staff already get.

  • EHRC is consulting on sex in the Equality Act. Universities should too

    24 hours after it promised to, the Equality and Human Rights Commission (EHRC) has launched a consultation on updates to its statutory Code of Practice for services, public functions and associations, following the Supreme Court’s ruling on the meaning of “woman” in the Equality Act.

    As a reminder, the kernel of the ruling was that the definition of sex in the Equality Act 2010 (the Act) should be interpreted as “biological” sex only. This means that, for the purposes of that Act, a person’s legal sex is the one that was recorded at their birth.

    That’s different to the previous interpretation adopted by the courts, which was that the definition of sex also includes people who have obtained a Gender Recognition Certificate (GRC). According to the new ruling, obtaining a GRC does not change your legal sex for Equality Act purposes.

    In this limbo period, one side of the debate has been calling for X or Y to change now, while another has argued that it’s the Codes of Practice that matter.

    Designed to help individuals, employers, and service providers understand and comply with equality laws, they cover areas like employment, services, education, and public functions – and while they are not legally binding, courts and tribunals often take them into account in discrimination cases.

    Given the pressure, consultation was going to be rapid – open for two weeks – but EHRC has now announced that it will run until 30 June 2025. Once the consultation is done, it will review responses, make amendments to the Code of Practice, and it will then be submitted to the Minister for Women and Equalities for approval and laying in Parliament.

    The consultation specifically focuses on sections of the Code of Practice that need to be updated following the judgment – the rest was consulted on between October 2024 and January 2025.

    The idea is to gather views on whether its proposed updates clearly articulate the practical implications of the judgment and enable those who will use the Code to understand, and comply with, the Equality Act 2010. The EHRC is at pains to point out that the Supreme Court made the legal position on the definition of sex clear, so the EHRC is not seeking views on those legal aspects.

    Thus far, my conversations around the sector indicate a “we’re waiting for the guidance” approach before anyone does anything – though there has been at least one case where a university has had to backtrack and apologise having taken a decision that anticipated the interpretation of the new ruling.

    And that creates both some “race against time” pressure for universities (and SUs) intending to wait for the final guidance, given the proximity to Welcome Weeks, and questions over the expectations people will have for what happens next – which I’ve explained below.

    Enforcement

    One of the major controversies has been about what we might call “enforcement” in a single-sex space, facility or service.

    In other words, if a toilet is marked up as “women”, will those operating said toilet be able to, or even expected to find some way of checking on someone’s biological birth sex?

    In the EHRC draft, asking someone about their birth sex publicly, or in a tone that is rude or combative, could amount to unlawful discrimination or harassment. Policies that apply different questioning standards to different people – for example, asking only those who “don’t look like they belong” in a particular space – also risk breaching indirect discrimination provisions, unless there is clear and justifiable reasoning behind them.

    The draft guidance also stresses that verifying someone’s birth sex – or requesting documents like a birth certificate – is a step that should be taken only in rare cases, and even then with great care. As such, the ability to enforce a single-sex policy by removing someone is more limited than it may initially appear, particularly for those on one side of the debate.

    If a trans person declines to answer a question about birth sex, or answers in a way that is later contested, a provider cannot simply demand proof unless they have a clear, lawful basis for doing so. And even then, their request must be framed within a structured policy that minimises legal risk and respects privacy. The idea that someone can simply be turned away on the basis of an appearance-based suspicion is not really viable under this framework.

    Another complication is that direct discrimination by perception applies to sex, even if the person doesn’t legally hold that characteristic under the Equality Act. The draft confirms that a trans woman, though not considered a woman in law, could still claim sex discrimination if they were treated less favourably because they are perceived to be a woman.

    This widens protection in practice – it confirms that discriminatory treatment based on how someone is seen, not just what they are, can still breach the Act – even where legal definitions of sex or gender reassignment wouldn’t otherwise apply.

    On maternity, protection from pregnancy and maternity discrimination now clearly rests on biological sex – the previous reliance on case law to justify protection for trans men with a GRC has been removed, because under the clarified interpretation, their legal sex remains female for Equality Act purposes.

    This simplifies the legal basis – trans men who are pregnant are now protected as women in law, not by exception or interpretation. It reinforces the Act’s grounding in biological sex across all protected characteristics, aligning pregnancy and maternity provisions with the rest of the guidance.

    Indirect discrimination

    Changes to section 5 of the code mean that, in EHRC’s view, the Equality Act’s provisions on indirect discrimination now more clearly include people who don’t share a particular protected characteristic but experience the same disadvantage as those who do. This reflects the “same disadvantage” principle more commonly understood in relation to race or disability, and applies it explicitly to sex and gender reassignment.

    A case study shows how a trans woman can claim indirect sex discrimination if they are disadvantaged in the same way as women – because of a shared experience with women of feeling unsafe. The protection doesn’t depend on whether they are legally female under the Act, nor on any formal association with women – it just depends on whether they face substantively the same disadvantage from the same policy.

    This extends legal protection in practice – it allows a broader group of individuals to challenge policies that disproportionately harm one group, even if their own legal status differs. It also reinforces the role of objective justification – so organisations must be able to show that their decisions are fair, necessary, and proportionate, or risk breaching the Act.

    Harassment

    Section 8 explains the general test for harassment under the Equality Act, and the change here is that harassment based on perception is now explicitly covered in the context of sex and gender reassignment. A new example involves a trans woman – showing that even if someone is wrongly perceived to have a protected characteristic, such as being biologically female, they are still protected under the Equality Act if they face unwanted conduct related to that perception.

    That broadens the scope of protection for trans people and others facing abuse based on assumptions, and it further clarifies that intent is irrelevant – what matters is the effect of the conduct and its link to a perceived protected characteristic.

    Associations

    Section 12 explains how the Equality Act applies to associations, and makes clear that women-only associations can lawfully refuse membership to trans women, based on the clarified interpretation that sex under the Equality Act means biological sex at birth.

    The example given shows that a trans woman does not share the protected characteristic of “sex as a woman” under the Act and therefore can be excluded from an association that lawfully restricts membership to women.

    But it doesn’t say that a trans woman must be excluded.

    The complicator that isn’t covered in the draft runs something like this. If a women-only association excludes cis men, that is lawful – and always has been – because the Equality Act 2010 explicitly allows associations to restrict membership to people who share a protected characteristic, such as sex.

    But if the association then permits trans women (whose legal sex is male under the Act, even if they have a GRC) but excludes cis men (also legally male), EHRC says that would be applying inconsistent treatment within the same legal sex category – undermining the justification for using the sex-based restriction in the first place.

    So if that association wants to use the lawful sex-based exception, it in theory has to apply that restriction consistently based on biological sex – exclude all who are legally male under the Act, including both cis men and trans women.

    If it instead admits some individuals with the legal sex of “male” (e.g. trans women), while excluding others (cis men), then the restriction is no longer based solely on sex, but on gender identity or appearance – and that is not a lawful basis for exclusion, and so could lead to claims of direct sex discrimination by cis men, since they are being treated less favourably than other legally male individuals (trans women).

    So if there’s a staff women’s group or an SU has a women’s officer, converting the group or position into one that’s about a topic rather than a membership characteristic looks OK. Even if it was about a membership characteristic, as long as it isn’t actively excluding cis men while including trans women, that would seem to be fine – notwithstanding there may be arguments about feelings of exclusion, or expectations raised in an inappropriate way and so on.

    Sport

    This has been a key issue in the commentary – EHRC’s draft clearly permits organisers of gender-affected competitive sports (i.e. sports where strength, stamina, or physique create a meaningful performance gap) to exclude or treat trans people differently if it is necessary for reasons of fair competition or safety.

    Crucially, the guidance affirms that exclusion must be justified, proportionate, and based on evidence. A blanket ban on trans participation would likely be unlawful unless organisers can show that it is essential to protect fairness or safety. For example, excluding a trans man from a men’s boxing event due to safety concerns is likely lawful – if justified with reference to physical risk.

    The guidance also clarifies that if an event is mixed-sex and so does not invoke the single-sex exception, sex discrimination claims may arise – for example, from cis women disadvantaged by trans women competitors.

    Basically it emphasises the need for clear, evidence-based policies, especially in sports where fairness and safety are contested. Organisers are encouraged to draw on medical guidance and national governing body rules, balancing inclusion with legal duties to all participants. For universities and SUs, there’s clearly a line to be drawn between what we might call “BUCS sport” and “a kickabout organised by reslife”, although that line is not especially clear here.

    Single sex services

    If you are operating a single-sex service, another revised section encourages service providers to develop clear, written policies on when and how they will deliver separate or single-sex services, while also allowing for limited, carefully considered exceptions in individual cases – such as admitting a male child to a women’s changing room – as long as it does not undermine the core purpose of the service (e.g. safeguarding women’s access, privacy, or safety).

    Another section draws a clearer line around how and when trans people may lawfully be excluded from single- or separate-sex services, while reinforcing that such exclusions must always be proportionate, justified, and considered on a case-by-case basis.

    And it reminds providers that admitting someone of the opposite biological sex – such as a trans person – to a single-sex service may legally change the nature of that service, making it no longer covered by the single-sex exceptions under the Equality Act, itself creating a legal risk of sex discrimination against those excluded from the redefined service.

    Providers are therefore advised to consider less intrusive alternatives, like offering additional mixed or separate services, or adapting facilities (private, unisex toilets), where feasible. But if alternative arrangements would be impractical or undermine the service itself, exclusion may still be proportionate and lawful – but that decision must also consider how the trans person presents, what alternatives exist, and whether the exclusion leaves them without access to essential services like toilets or changing rooms.

    Communal accommodation

    This is less common in these days of cluster flats and ensuite, but again there’s competing rights to weigh up. It’s lawful to restrict access to communal sleeping or sanitary facilities based on biological sex, particularly where shared use would compromise privacy. But excluding someone because of gender reassignment – such as a trans woman from women’s dorms – can also be lawful, but only if it is a proportionate means of achieving a legitimate aim, such as protecting privacy or avoiding distress.

    Upshots and implications

    So what now? The wait and see approach does mean that where universities (and their SUs) are making changes to facilities, groups, services, positions and so on, or not making changes, those decisions will now likely need to be made over the summer – which means a storing up of trouble for the new academic year.

    What’s clear from the guidance is that whether we’re talking about a women’s officer role in an SU, a changing room, a block of toilets or a women’s self defence group, there are going to be options.

    The draft seems to indicate that a sign on a toilet that’s painted pink saying “trans inclusive, use the toilet you are most comfortable with” would be legally fine – but it could also trigger indirect sex discrimination claims, particularly if women argue that the policy has created a space where they no longer feel safe or comfortable, especially in settings where privacy, safeguarding, or trauma concerns are significant.

    And that risk is heightened if an organisation fails to provide alternative single-sex spaces for those who need or expect them.

    In other words, the single sex provisions in the Equality Act are mainly about the ability to enforce and exclude the “other” sex from something, any expectations that are set (and then not met), and the availability of alternatives.

    If you can’t rely on the Equality Act to carry out your exclusion, you can’t exclude. As such, if we imagine a block of toilets, there will be choices:

    Option 1 is effectively to say “these toilets are open to all”, or “these are aimed at those who identify as women”. As long as it’s clear that the intention is not to actively exclude men from those toilets, and that people can use whichever toilet they feel most comfortable in, that appears to be legal.

    Option 2 is to take a toilet block currently marked as “women” and to make clear that these are a single sex facility as per the Equality Act 2010 – in other words, these are toilets specifically for biological women.

    The same goes for pretty much everything that’s currently gendered.

    The complicator in the case of toilets is that under UK law, employers are required to provide separate toilet facilities for men and women unless each facility is a fully enclosed, lockable room intended for single occupancy – and the tricky part is that a lot of toilets on a university campus may well act both as staff toilets, and (service provider) student toilets.

    But nevertheless, this still opens up considerable flexibility. Neither the ruling, nor the draft guidance, nor the finalised guidance, is going to tell universities (and their SUs) what to do about a given facility, service, group, scheme or position – and nor will it supply information on the expectations of staff and students on a given campus.

    And that sets up a problem for September.

    • If a toilet block currently marked “women” is marked up in an Option 1 way, some people on campus will be furious that it’s not been defined exclusively for biological women.
    • If a toilet block currently marked “women” is marked up in an Option 2 way, some people on campus will be furious that it’s not been defined in a way that is trans inclusive.

    The decisions across the portfolio of facilities, services, groups, schemes or positions almost by definition can’t be consistent – and so the way those decisions are made, and why, will be crucial – as will careful communication of them.

    In other words, here in the dying days of May, the time to roll sleeves up and get consulting with staff and students is now – not during Welcome Week.

  • Redistribution doesn’t work when there’s nothing left to redistribute

    Too many people across our country do not get the chance to succeed.

    So the government is committed to supporting the aspiration of every person who meets the requirements and wants to go to university or pursue an apprenticeship, regardless of their background, where they live and their personal circumstances.

    Those aren’t my words – they’re the words of the House of Commons’ HE supply teacher Janet Daby, who answers for actual (Lords) minister Jacqui Smith whenever a question comes up about universities or students.

    This answer is a typical one – in which she notes that in the summer, the department (for education) will set out its plan for HE reform and that it will expect providers to play an “even stronger” role in improving access and outcomes for all disadvantaged students.

    Specifically on financial support:

    Whilst many HE providers have demonstrated positive examples of widening access, including targeted outreach and bursaries, we want to see the sector go further.

    Back in 2004, partly to get “top-up fees” through Parliament, then secretary of state Charles Clarke announced that a new Office for Fair Access (OFFA) would be created – and that it would require universities to offer up some of their additional fee income in bursaries.

    Assuming that a proportion of student financial support should come from universities’ own budgets has always created a tension – between those who say that local decision making (aka institutional autonomy) is better at designing schemes that get the money to where it’s really needed, and those who argue that redistributing fee income within a provider rather than across the country means that financial support ends up being based not on need, but on the number of other students at your university that need it.

    We used to be able to see that clearly. OFFA used to track how many “OFFA countable” students each provider had and their spending on financial support, and it would generally show that providers doing the most for access tended to have the least to spend per student.

    Over time, direct student financial support declined in popularity. Research questioned bursaries’ impact on applications (unsurprising given how hard it was to find information on them), and struggled to find retention benefits between 2006 and 2011 – findings that then got extrapolated far beyond their timeframe.

    Pressure to demonstrate impact led providers to focus on entry and completion metrics rather than the experience students were having as a result. That seemed less critical in the mid-2010s when inflation was low and maintenance loans were cranked up to hide the fact that grants were eliminated. Students living at home (more likely from widening participation backgrounds) also got relatively generous maintenance support compared to their costs.

    Eventually, provider-level reporting on student financial support pretty much disappeared as the Office for Students started to emphasise outcomes over experience or spending transparency.

    But with maintenance support over the past few years some distance from inflation, and the income thresholds over which parents are expected to top up stuck at the level they were set at in the year that Madeleine McCann went missing (18 whole years ago), we really do need some sense of how the mix is panning out.

    So to help us to understand what’s been going on, for the fourth year running we’ve managed to extract some data out of OfS via an FOI request.

    The data

    Ever since the days of the Office for Fair Access (OFFA), HESA has collected data on the amounts of student financial support, and the number of students that helps, for each university in England – and here we have that data over the past few years.

    It covers four different types of spend on student financial support:

    • Cash: This covers any bursary/scholarship/award that is paid to students, where there is no restriction on the use of the award
    • Near cash: This includes any voucher schemes or prepaid cards awarded to students where there are defined outlets or services for which the voucher/card can be used
    • Accommodation discounts: This includes discounted accommodation in university halls / residences
    • Other: This includes all in-kind or cash support that is not included in the above categories and includes, but is not limited to, travel costs, laboratory costs, printer credits, equipment paid for, subsidised field trips and subsidised meal costs

    Some caveats: We remain less than 100 per cent convinced about the data quality, this doesn’t tell us how much money is going to disadvantaged students specifically, it doesn’t tell us about need (and the extent to which need is being met), I’ve yanked out most of what we used to call alternative providers for comparison purposes, and it only covers home domiciled undergraduates (and below, in terms of level of study).

    But it is, nevertheless, fascinating. Here are the numbers for each provider in England:

    [Interactive table: student financial support figures for each provider in England]

    Looking nationally at cash help alone, in 2023/24 just over £496m went to just under 311k students – a spend per head of £1,598, up from last year’s £1,464.

    But dive a little deeper and you find astonishing disparities. In the Russell Group the £ per head was £2,362 – about £40 up on the previous year. Across Million+ providers that figure was £726 – just £4 more than 2 years ago.

    Interestingly, the Russell Group’s cash spend per student helped was the same as it was in 2019. Maybe inflation doesn’t apply in elite universities, or maybe they’re getting worse at recruiting those on low incomes. Meanwhile, the cash spend per student helped across Million+ universities has almost halved from £1,309 in 2019/20.

    Clearly all universities are under financial pressure – but what we see is almost certainly an artefact of redistributing fee income around a provider rather than around a country, and it appears to result in manifest unfairness.

    Even if we don’t adjust for inflation, spend per student helped has fallen for 45 universities between 2022/23 and 2023/24, and since 2019 it’s fallen for 56 universities. If we do apply inflation (CPI), only five are beating their 2019 spend per head. No wonder students are struggling to come to campus.
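
    For anyone who wants to sanity-check that arithmetic, here is a minimal Python sketch of the two calculations involved – dividing cash spend by the number of students helped, and deflating a cash figure by CPI. The cumulative CPI figure and the per-provider numbers in it are illustrative assumptions, not the underlying OfS data.

```python
# Illustrative sketch only - assumed figures, not the underlying OfS dataset.

def spend_per_head(total_spend_gbp: float, students_helped: int) -> float:
    """Cash support divided by the number of students it reached."""
    return total_spend_gbp / students_helped

def real_terms(value_gbp: float, cumulative_cpi: float) -> float:
    """Deflate a cash figure by cumulative CPI growth since the base year."""
    return value_gbp / (1 + cumulative_cpi)

# National 2023/24 figures quoted above (rounded inputs, so roughly £1,595 here)
national_sph = spend_per_head(496_000_000, 311_000)

# Hypothetical provider: £1,600 per head in 2023/24 vs £1,500 in 2019/20,
# with an ASSUMED cumulative CPI rise of ~24% over the period
nominal_2324, baseline_1920, cpi_growth = 1_600.0, 1_500.0, 0.24
beats_2019 = real_terms(nominal_2324, cpi_growth) > baseline_1920  # False here

print(f"National cash spend per student helped, 2023/24: £{national_sph:,.0f}")
print(f"Real-terms equivalent of £{nominal_2324:,.0f}: "
      f"£{real_terms(nominal_2324, cpi_growth):,.0f} "
      f"(beats the 2019/20 baseline of £{baseline_1920:,.0f}: {beats_2019})")
```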

    Some may say that it might be better just to look at what’s been going on under the auspices of formal, declarable access and participation work. HESA finance data now includes a look at expenditure – but not the number of students that expenditure covers, nor the total amounts invested pre-pandemic, and nor the amounts allocated in premium funding, all of which would aid meaningful comparison.

    Moving money around

    I tend, in general, to be a fan of redistribution and cross-subsidy. It can help reduce economic inequality, promote social stability, and ensure that everyone has access to basic necessities. It reflects a commitment to fairness and the idea that a society should care for all its members.

    As such, the logical bit of my brain never had much of a problem with the Charles Clarke/OFFA expectation – it was at least aimed at ensuring that everyone got to have a decent experience at university.

    But the redistributive effects of moving money around a provider when some providers (which already tend to be the richest) have fewer poor kids to spend it on never really added up.

    If you really wanted the system to be fairer, and for the most money to reach those who need it most, you might start by acting regionally. I doubt that John Blake's regional partnership structures – which will involve cohort-level renewal for Access and Participation Plans – will actually go as far as expecting providers in a region to pool their bursary or hardship spend, but there's a very good logical case for that kind of approach.

    When students at Salford are getting £358 each in cash help while their neighbours at the University of Manchester are getting £1,829, there’s a very strong case for pooling the money.

    But even if that was to happen, beware the regional agglomeration effects. The region with the lowest higher education participation rate in the UK is the North East of England, at 33.4 per cent. London, with its 63 per cent rate, ought to be giving some of its spend on student financial support away to support participation up North.

    And once you’re there, you (re)realise what many said at the time of the Clarke announcement – that moving money around a university when participation in universities is so unequal to start with is no way to run a fair system.

    And even more importantly, it’s not fair on fee-paying students. When the assumption was that fees were a small part of the overall funding mix, we could say to students that the state’s contribution would be focussed more on those in need.

    Even with fees at £9,000, the redistributive effects of some paying much more than that through interest at RPI+3% and some much less via the repayment threshold and the cut-off – all while funding a moderately comfortable financial support system for all – were some sort of egalitarianism in action.

    But once the subsidy slips away, and students are expected to pay back almost all of the debt they incur, we end up expecting their personal debt to do what the state ought to do. And while it’s one thing for your fees to be spent subsidising other students at your own university, it would be quite another for them to be spent subsidising those at others in your region, or even around the UK.

    Then add in the fact that in UUK’s cuts survey, just under half of universities (49 per cent) say they may still need to cut hardship funding and 59 per cent say they may need to cut bursaries. Even if some sort of tougher APP regime was to find a way to stop that, that just means that wider cuts will fall on everyone – and so for some students, less and less of their actual contribution will end up being spent on their actual education.

    It turns out that progressive taxation – ensuring that those with higher incomes contribute a larger share of their earnings to public services – is the much better way to promote economic fairness and reduce income inequality. Who knew?

    Source link

  • Lessons from innovating in our student support model

    Over the last ten years – and particularly since the pandemic – the complexity of student wellbeing issues in higher education has increased significantly. It became clear to us at the University of Exeter that the traditional model of academic tutoring alone was no longer sufficient to meet the needs of our students.

    Like many other higher education institutions, we had long utilised an academic support model where most academic staff were allocated groups of tutees to provide both academic and pastoral support, alongside a range of professional services in areas such as welfare, wellbeing, accessibility and financial support. Our review and research into best practice across higher education institutions – both in the UK and internationally, and drawing on approaches from schools and further education providers – identified a clear need for dedicated expertise to provide pastoral support at Exeter.

    This led to the development of our Pastoral Mentor model, which we began piloting in autumn 2023. By 1 August 2025, we will have rolled out Pastoral Mentors to every department. Our model was described briefly in Wonkhe last year but you can also read more about it in the Journal of Learning Development in Higher Education. In summary, Pastoral Mentors are dedicated, non-teaching student support staff embedded in departments, serving as a friendly first point of contact for students facing challenges affecting their studies. They proactively reach out to students based on engagement and attainment data, offer a non-judgmental space for conversations, and connect students with specialist support services as needed. Our pastoral mentors work closely with discipline based staff and wider support services to identify the best way to assist students and ensure that the help they need is connected and timely.

    Lessons from transformation

    While institutions will adopt different approaches to student support, in this piece we reflect on what we’ve learned from implementing institutional change at Exeter, and share the key principles which underpin our model – offering insights we hope will be useful for others working in this space.

    Early identification is key. The earlier students identify that they are struggling, the easier it is to provide support and put remedies in place. Often, the causes of student failure and drop-out begin as relatively low-level challenges, but these can escalate over time – non-attendance leads to missed submissions, which in turn result in failed modules, referrals and potentially withdrawal. If we can identify students whose attendance drops early and support them to get back into the classroom, we can mitigate many of these larger issues.

    Data is key to this. All institutions now hold large amounts of data on their students: attendance, engagement with the VLE, submissions, grades. We need to use this to support students, and at Exeter we developed a bespoke engagement dashboard to enable us to identify students who might be struggling.
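    As an illustration only – the field names, thresholds and rule below are hypothetical assumptions, not Exeter's actual dashboard logic – the kind of early-warning check such a dashboard applies can be conceptually very simple:

```python
# Hypothetical sketch of a simple early-warning rule of the kind an
# engagement dashboard might apply. Field names and thresholds are
# illustrative assumptions, not the University of Exeter's actual model.
from dataclasses import dataclass

@dataclass
class EngagementRecord:
    student_id: str
    attendance_rate: float        # proportion of timetabled sessions attended
    vle_logins_last_14_days: int  # activity on the virtual learning environment
    missed_submissions: int

def needs_outreach(rec: EngagementRecord,
                   attendance_floor: float = 0.6,
                   vle_floor: int = 3) -> bool:
    """Flag a student for a proactive check-in from a Pastoral Mentor."""
    return (rec.attendance_rate < attendance_floor
            or rec.vle_logins_last_14_days < vle_floor
            or rec.missed_submissions > 0)

cohort = [
    EngagementRecord("A123", 0.85, 12, 0),
    EngagementRecord("B456", 0.45, 1, 2),  # low attendance, quiet on the VLE
]
print([r.student_id for r in cohort if needs_outreach(r)])  # ['B456']
```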

    Clear lines of responsibility are vital. It's no good having access to data if it's not clear who is going to act on it. Our Pastoral Mentors are responsible for using the engagement dashboard to identify students of concern and for making the initial reach-out. They are then responsible for linking students who require more specialist support with the correct service – not just telling the student who to contact, but in some cases making that contact for them or following up with the student later to ensure they have accessed the support they need. It's vital that students don't slip through the net – whether because no one acts on the data or because they fall unnoticed between services.

    Clear escalation processes need to be established. It’s critical to have a clear understanding of where one person’s responsibility ends and when a student should be confidently referred to a specialist. We’ve developed well-defined escalation processes so that our Pastoral Mentors don’t feel pressured to take on issues beyond their expertise and remit, and to ensure we make full use of the specialist staff elsewhere in the institution – helping to maintain the integrity of the overall support ecosystem.

    Presence is a must. Early feedback from our students' union and students' guild highlighted the importance of face-to-face, named support, with students finding it easier to seek help from someone they already know. Our Pastoral Mentors are present in departments: they attend welcome and transition events, informal department gatherings and department social events for students. Students should know who the Pastoral Mentor is before they need help, to facilitate that first conversation. As a core part of the education team, Pastoral Mentors also become specialists in the rhythm and challenges of the discipline and can thus provide contextualised support and advice relevant to the student's programme.

    Clarity of message for students is essential. Students are often put off seeking support because they fear disciplinary or fitness to study processes; in particular, international students sometimes do not seek support from traditional academic tutors because they do not want to disclose problems to those teaching them or marking their work. Our Pastoral Mentors aim to decouple support from formalised processes around unsatisfactory progress or visa compliance and instead focus on reaching out compassionately, emphasising the importance of a student's wellbeing and success. Students have reported that this enhanced their sense of belonging and mattering, making it easier to seek support early.

    Supporting colleagues through change

    Institutional change is never easy, and while many staff recognise the need to enhance our support offer to students, it remains an emotive issue. Some departments embraced the new model from the outset, while others found the transition more difficult. There's never "enough" evidence, particularly when the change you are implementing is both transformative and innovative.

    As academics we often spend a lot of time seeking and compiling evidence to support a theory, but sometimes we have to be brave enough to enact change because it’s the right thing to do and have confidence that we can bring people along over time. If everyone waits for the evidence from others, innovation will never happen. We have found that co-creation is powerful; in order to address the “evidence” challenge, we had to deploy compassion and communication rather than additional data.

    We have to meet colleagues where their concerns lie – not to diminish those concerns, but to listen to and recognise both the opportunities and risks associated with change. At Exeter, we adopted a phased co-creation model for our Pastoral Mentor approach, being open with departments that we didn't have all the answers upfront and that we needed to work together to meet students' needs. Through this iterative approach we were able to take all our departments with us at a pace that suited them, and subsequent feedback on the roll out has been overwhelmingly positive.

    Student support is an emotive area, and it's important to recognise existing best practice alongside the benefits of change. While we should acknowledge the great work many have done and continue to do, it is also important to recognise the pressure providing pastoral support can put on colleagues. We were keen to ensure that specialising support wasn't seen as a criticism, but as a way to relieve pressure on colleagues and ensure more sustainable support for our whole community.

    Source link

  • The End of Participation Growth

    One of the things that I find extremely worrying about higher education policy these days is that we’ve simply stopped talking about increasing access to the system. Oh, sure, you will hear lots of talk about affordability, that is, making the system cheaper—and hence arguments about the correct level of tuition fees—but that’s not the same. Even to the extent that these things did meaningfully affect accessibility (and it’s not at all clear that they do), no one phrases their case in terms of access anymore. We don’t care about outcomes. And I do mean no one. Not students, not governments, not institutions. They care about money, cost, all sorts of things—but actual outcomes with respect to participation rates of low-income students? At best, they are a rhetorical excuse to mask regressive spending policies which benefit the rich.

    This is a problem because it now seems as though the process of widening access – a project which began after World War II and has been proceeding for seven decades – has ground to a halt. Indeed, as some recently-released Statistics Canada data shows, participation rates are now actually in decline in Canada. And it's mainly because growth at the bottom has stalled.

    Below is the chart StatsCan released last month. It shows the post-secondary enrolment rate for 19-year-olds, which I will henceforth refer to as the “part rate” or “participation rate,” both for the entire population (the dotted red line) and by income quintile.

    Now, the first thing you may notice is that there are some pretty big gaps between the participation rates of youth from rich and poor families; the top quintile does not quite attend at double the rate of the lowest quintile, but it’s close. And you might be tempted to say, “Hey, I’ve taken Econ 101—That must be because of tuition fees!” Except, no. These kinds of part-rate disparities are pretty common internationally, regardless of tuition fees. Here are postsecondary enrolment rates by income quintile from the United States, which, on the whole, has higher fees than Canada:

    And here’s a similar chart from Poland, which mostly offers education tuition-free:

    And here’s one from France, where public universities are tuition-free but students are increasingly heading to the fee-paying private sector:

    I could go on, country-by-country, but I will spare you and instead point you to this rather good paper doing a cross-national analysis across over 100 countries by OISE’s Elizabeth Buckner. Trust me, it’s the same story everywhere.

    But let me point out what I think are the two important points in that chart. The first is that the red dotted line, which represents the participation rate of all 19-year-olds, basically plateaued back in about 2014 – the first year it broke 59% – and is currently headed downwards. This is a huge change from the previous period, 2000-2014, when overall participation rates rose from 46% to 59%. First growth, now stagnation.

    The second is that during the growth period, the biggest strides were being made at the bottom end of the income scale. The part rate gap between top and bottom quintiles fell from 38 percentage points in the early 2000s to about 32 percentage points in 2014, even as part rates for the wealthiest quintile increased. That is to say, more of our growth came from the bottom than from the top. That’s good! But the growth stopped across all income quintiles and went gently into reverse for the top four income quintiles.

    Now, you might think that it's not a bad thing that participation rates peaked – that maybe we were in a situation where we were overproducing postsecondary graduates, etc. Who knows, it's possible. I don't know of any evidence that would suggest that 57-59% of the youth population is some kind of hard maximum, but if we stipulate that such a maximum exists, then it might well be in this range.

    But since it’s quite clear that this overall plateauing of participation is happening entirely by way of freezing educational inequality at substantial levels, being OK with the present situation means being OK with major inequalities, and in any democracy which wishes to remain a democracy, that’s not really OK. It is true that, as I noted earlier, disparities are the global norm, but that doesn’t mean you don’t keep up the struggle against stasis. It might be the case that there is some kind of “natural barrier” to keep the country’s PSE part rate at 57-59%, but in what world does a “natural barrier” keep those rates at 75% for rich kids and 43% for poor kids?

    Increasing access overall and narrowing rich-poor access gaps is incredibly difficult. If it were as simple as making tuition free, we'd have it licked in no time, but countries with free tuition don't have noticeably narrower part rate gaps than those that charge fees. Narrowing these gaps requires a whole suite of policies: to narrow educational achievement gaps as well as financial ones, to offer young people a variety of flexible program types rather than an inflexible academic monoculture, and to ensure that advice and support exist for students not lucky enough to be able to access the kinds of cultural capital available to the top quintile.

    As I say, achieving success in this area is very difficult: solutions are neither easy nor quick. But what makes the problem even more intractable is ignoring it the way we are doing right now. Are we a country that actually cares about equal opportunity? Or is that just a myth to which we genuflect when we wish to pretend to be more socially progressive than Americans? I lean towards option #2 but would be overjoyed to be proven wrong.

    Source link

  • WSU continues industry partnership trend with Genetec – Campus Review

    Western Sydney University (WSU) will send some of its students to intern at a Sydney-based tech company amid continued calls for universities to partner with industry to produce better quality graduates.

    Source link

  • Asking students about value | Wonkhe

    To value something (or not) is a curious thing.

    You can value anything: someone's opinion, their feelings, their house – indeed, nothing is outside the scope of being valued.

    In its broadest philosophical sense, value can be considered as the importance of any object, feeling or action, ascribed by an individual before, during or after the fact.

    If we consult the ancient texts, then Plato offers a binary view of value. There is instrumental value, where something serves as a means to another end, and then there’s intrinsic value, which is just that.

    Its value exists by virtue of its own existence; it does not need to enable any other end or objective.

    The value of higher education

    So, are a degree and any student loan repayments just a means to graduate employment and taxpayer ROI (instrumental value), or is being within university education in and of itself valuable (intrinsic)?

    I’m going to dodge the question early doors, to be honest, and instead invite discussion alongside a presentation of the student view of all of this. I’m nearing the end of a three-year longitudinal data collection process, whereby I’ve been annually surveying and interviewing the same cohort of undergraduate students from five different HEIs since the end of their first year back in May 2023. This has largely been in service of my part-time PhD, but with a day job in student experience and enhancement there’s some ready employment applicability.

    How did we get here?

    Please do check out my PhD literature review when it's published for a fulsome answer, but in summary, a series of neoliberal policy interventions since the 1963 Robbins Report has led us to where we are today. The commodification of HE has crept in over time, and instruments like the NSS launched in 2005 (happy 20th anniversary!) and a new market regulator in 2017 are not insignificant markers of this creep.

    “Value for money” as a phrase – for the full villain origin story – appeared in the 1980s via the Local Government Finance Act, which defined value for money in terms of the 3Es: economy, efficiency and effectiveness. With the creation of the aforementioned HE regulator in 2017, value for money became part of regular policy parlance, given it was a central feature of the OfS' strategy documentation and purpose. It also inspired people like myself and others to get under the skin of what it actually means in this context.

    Right here, right now

    By annually surveying and interviewing the same cohort of students across five institutions throughout their university education so far, I’ve found a few threads to pull on that I want to share. The first one is all about time and the temporal location of student value for money perceptions.

    Current policy is at odds with how students think about the value of their education. It looks into a hazy future of graduate earnings and loan repayments, with the higher of each being the better for all concerned. From my research, and the addition of a “temporal location” to all my survey and interview responses, student perceptions of value for money are located in the present day or recent past. They are not looking to a near-future and PAYE potential; they are looking at what they currently get versus the expectations they had – and that is the challenge for institutions to overcome.

    Non-users and peer influencers

    A second research thread to dangle for readers here is that of non-user bias in student value for money perceptions. From my data, students are more likely to rate a particular aspect of their student experience as negative value for money when they haven’t used it. They don’t opt for neutral ratings; they go for negative as “I don’t know what they do.”

    As a counterpoint from my data, those students who do engage are far more likely to rate aspects as good value for money and on the whole are receiving excellent customer service (their words, not mine!). These two things in tandem really are a challenge for institutions, as while engagement leads to positive perceptions, very few will have the resource capacity to cater for all of their students.

    The influence of near-peers also can't be overstated. Students in my research will think something is bad value for money if a peer tells them so. This isn't perhaps a shocking revelation, but what it can create is a barrier to that student ever engaging with that service for themselves, as it didn't work out for their friend (as is their perception).

    How do you deliver timely (and personalised) messages to students in order to make them aware of the variety of things on offer for them? In an NSS context this is vital because students who think over the course of their degree that something hasn’t happened or not been available may well score you as such.

    Value for money when money is tight

    In my research I ask students about their value for money perceptions of student services and support. For positive perceptions, one thing is very apparent: they are largely driven by direct engagement with a particular service and, just as importantly, by that service meeting the student's expectations. They got what they thought they came in for.

    If you want students to think you offer value for money, then any investment you have in student support ought to focus on providing an excellent service, and meeting student expectations of that. This sounds simple, and indeed rather basic, but a bad experience leads to that student telling their peers, who may then not engage when they themselves need to access that particular service. In the current era we can’t give every student everything, and nostalgia for a more affluent time won’t help. All you can do is excel at the services you do offer to students and feed that positivity cycle.

    Dark and dangerous times lie ahead

    The sector is in a tricky financial situation, resources are shrinking, international numbers are in flux, and your current and next incoming cohorts are going to feed your APP, NSS and TEF metrics for the remainder of this decade. Looking through a value for money lens, the things that drive positive student perceptions are excellent service levels that align with what students were expecting to happen. Focusing on doing that very well is what you have to do when expansion and new projects aren't an option.

    As one last bit of insight from my research, I ask students each year if they feel like they know what their tuition fee is spent on, and the majority say no. I also ask them to rate their overall university experience for value for money, and 44 per cent give it a very good or good rating. That 44 per cent is slightly above what you see in the annual HEPI Student Academic Experience Survey, but for those in my data who do feel like they know where their tuition is spent, this rises significantly to 73 per cent. You don’t need an itemised Council Tax type bill, but something not far off that demonstrates the breadth of fee spend could work wonders.

    Source link

  • Tufts PhD Student Released After Six-Week Detention Raising Academic Freedom Concerns

    [Photo: Rümeysa Öztürk with her attorney]

    After six weeks in federal detention, Tufts University doctoral student Rümeysa Öztürk was released last Friday following a federal judge's ruling that her continued detention potentially violated her constitutional rights and could have a chilling effect on free speech across college campuses.

    U.S. District Judge William K. Sessions III ordered Öztürk’s immediate release, stating she had raised “substantial claims” of both due process and First Amendment violations. The 30-year-old Turkish national, who was arrested on March 25 outside her Somerville, Massachusetts home by masked federal agents, had been detained at the South Louisiana ICE Processing Center in Basile, Louisiana—more than 1,500 miles from her university.

    “Continued detention potentially chills the speech of the millions and millions of individuals in this country who are not citizens. Any one of them may now avoid exercising their First Amendment rights for fear of being whisked away to a detention center,” Judge Sessions stated during Friday’s hearing.

    Öztürk’s legal team argued that her detention was directly connected to her co-authoring a campus newspaper op-ed critical of Tufts University’s response to the war in Gaza. During the hearing, Judge Sessions noted that “for multiple weeks, except for the op-ed, the government failed to produce any evidence to support Öztürk’s continued detention.”

    The Trump administration had accused Öztürk of participating in activities supporting Hamas but presented no evidence of these alleged activities in court. Öztürk, who has a valid F-1 student visa, has not been charged with any crime.

    Öztürk’s case is part of what appears to be a growing pattern of detentions targeting international students involved in pro-Palestinian activism. Her arrest by plainclothes officers, captured on video showing her being surrounded as she screamed in fear, sparked national outrage and campus protests.

    “It’s a feeling of relief, and knowing that the case is not over, but at least she can fight the case while with her community and continuing the academic work that she loves at Tufts,” said Esha Bhandari, an attorney representing Öztürk.

    The same day as Öztürk’s release, the U.S. Second Circuit Court of Appeals in New York denied an administration appeal to re-arrest Columbia University student and lawful permanent resident Mohsen Mahdawi, another case involving a student detained after pro-Palestinian advocacy.

    During her six weeks in detention, Öztürk, who suffers from asthma, experienced multiple attacks without adequate medical care, according to testimony. At Friday’s hearing, she briefly had to step away due to an asthma attack while a medical expert was testifying about her condition.

    Judge Sessions cited these health concerns as part of his rationale for immediate release, noting Öztürk was “suffering as a result of her incarceration” and “may very well suffer additional damage to her health.”

    In his ruling, Judge Sessions ordered Öztürk’s release without travel restrictions or ICE monitoring, finding she posed “no risk of flight and no danger to the community.” Despite this clear order, her attorneys reported that ICE initially attempted to delay her release by trying to force her to wear an ankle monitor.

    “Despite the 11th hour attempt to delay her freedom by trying to force her to wear an ankle monitor, Rümeysa is now free and is excited to return home, free of monitoring or restriction,” said attorney Mahsa Khanbabai.

    Source link

  • Academic judgement? Now that’s magic

    Every day’s a school day.

    In my head, I thought I understood the line between what counts as “academic judgement” and what doesn’t in cases, processes, appeals and complaints.

    It matters because my understanding has long been that students can challenge and appeal all sorts of decisions – right up to the Office of the Independent Adjudicator (OIA) in England and Wales – but not if the decision is one that relates to matters of academic judgment.

    Thus in a simplistic coin sorter, “this essay looks like a 2:1 to me” can’t be challenged, but “they’ve chucked me out for punching someone when I didn’t” can.

    I’ve often wondered whether the position can hold when we think about the interaction between consumer protection law – which requires that services be carried out with reasonable skill and care – and this concept of unchallengeable academic judgement in the context of workloads.

    Back in the halcyon days of Twitter, I’d regularly see posts from academic staff bemoaning the fantasy workload model in their university that somehow suggested that a 2,000 word essay could be read, graded and fed back on in 15 minutes flat.

    Add in large numbers of assignments dribbling in late via extensions and accommodations, the pressure to hit turnaround times generated by the NSS question, wider workload issues and moderation processes that look increasingly thin (and which were often shredded or thinned out even further during the marking boycotts), and I imagined a judge evaluating a student's case and saying something along the lines of "to deploy your magic get out of jail free card, sunshine, you'll need to have used more… care."

    But that’s about an academic judgement being made in a way that isn’t academically defensible. I had a conversation with an SU officer this afternoon about academic misconduct off the back of a webinar they’d attended that OfS ran on AI, and now I’m more confused than ever.

    Do not pass go

    The bones first. The Higher Education Act 2004 mandated a body that would review complaints to replace the old “visitor” system, and it includes a line on what will and won’t qualify as follows:

    A complaint which falls within subsection (1) is not a qualifying complaint to the extent that it relates to matters of academic judgment.

    The concept is neither further defined nor mentioned anywhere else in UK law – but it has deep roots. In medieval universities scholarly masters enjoyed autonomous assessment rights, and the idea gained legal recognition as universities developed formal examination systems during the Enlightenment period.

    By the 20th century, academic judgment became legally protected from external interference, exemplified by landmark cases like Clark v. University of Lincolnshire and Humberside (2000), which established that courts should not intervene in academic assessments except in cases of procedural unfairness:

    This is not a consideration peculiar to academic matters: religious or aesthetic questions, for example, may also fall into this class. It is a class which undoubtedly includes, in my view, such questions as what mark or class a student ought to be awarded or whether an aegrotat is justified.

    The principle that most understand is that specialised academic expertise uniquely qualifies academics to evaluate student performance and maintain educational standards, free from political or economic pressures.

    The OIA takes the line in the legislation and further defines things as follows:

    Academic judgment is not any judgment made by an academic; it is a judgment that is made about a matter where the opinion of an academic expert is essential. So for example a judgment about marks awarded, degree classification, research methodology, whether feedback is correct or adequate, and the content or outcomes of a course will normally involve academic judgment.

    It also helpfully sets out some things that it doesn’t consider fall into the ambit:

    We consider that the following areas do not involve academic judgment: decisions about the fairness of procedures and whether they have been correctly interpreted and applied, how a higher education provider has communicated with the student, whether an academic has expressed an opinion outside the areas of their academic competence, what the facts of a complaint are and the way evidence has been considered, and whether there is evidence of bias or maladministration.

    I'm not convinced that that "what's in, what's out" list properly considers or incorporates the consumer protection law issue I discuss above – but nevertheless it hangs together.

    In his paper on whether the concept will hold in an era of consumerism, David Palfreyman argues that academic judgment properly applies to subjective assessments requiring specialised expertise – grading student work, designing curriculum, and evaluating learning outcomes.

    Educational institutions and courts generally consider these issues beyond external scrutiny to protect academic freedom and professional autonomy – because academics possess unique qualifications to make nuanced, context-dependent judgments about academic quality that outside parties lack the expertise to evaluate effectively.

    On the other side of that see-saw, he argues that academic judgment should not shield factual determinations, procedural errors, or administrative decisions from review. When institutions make claims about whether they properly applied their own rules, or failed to follow fair procedures, these issues fall outside protected academic judgment.

    Religious or aesthetic?

    But then back in the OIA’s guidance on its scheme rules at 30.4, there’s this curious line:

    Decisions about whether a student’s work contains plagiarism and the extent of that plagiarism will normally involve academic judgment, but that judgment must be evidence based.

    If that feels like a fudge, it’s because it is. “Whether” feels like a significantly different concept to “extent of that”, insofar as I can see how “did you punch the student” is about weighing up facts, but “how much harm did you cause” might require an expert medical judgement. But in a way, the fudge is topped off with that last sub-clause – what the OIA will insist on if someone uses that judgement is that they’ve used some actual evidence.

    And if they haven’t, that then is a process issue that becomes appealable.

    The problem here in 2025 is that 30.4 starts to look a little quaint. When someone was able to say “here’s one student’s script, and here’s another” with a red sharpie pointing out the copying, I get the sense that everyone would agree that that counts as evidence.

    Similarly, when Turnitin was able to trawl both the whole of the internet and every other essay ever submitted to its database, I get the sense that the Turnitin similarity score – along with any associated reports highlighting chunks of text – counts as evidence.

    But generative AI is a whole different beast. If this blog over-used the words "foster" and "emphasize", used Title Case for all the subheadings, and set up loads of sentences using "By…" and then "can…", not only would someone who reads a lot of essays "smell" AI, it would also be more likely to be picked up by software that purports to indicate whether I've used it.

    That feels less like evidence. It's guesswork based on patterns. Even if we ignore the research on who "false flags" disproportionately target, I might just like using those phrases and that style. In that scenario, I might expect a low mark for a crap essay, but it somehow feels wrong that someone can – without challenge – determine whether I'm "guilty" of cheating and therefore experience a warning, a cap on the mark or whatever other punishment can be meted out.
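    To make concrete just how thin this kind of "evidence" is, here is a deliberately crude, entirely hypothetical sketch of the sort of pattern-counting such detection amounts to – it bears no relation to how any real tool is implemented, and that's rather the point: a human who simply likes these words and this style scores exactly the same.

```python
import re

# A deliberately crude, hypothetical "detector": it counts stylistic tics
# commonly associated with LLM output. It cannot distinguish a machine from
# a human who happens to write this way - which is exactly the problem.
TELL_TALE_WORDS = {"foster", "emphasize", "delve", "moreover"}

def naive_ai_score(text: str) -> float:
    """Return a pseudo-score: higher means 'more AI-ish' on this crude logic."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    if not words:
        return 0.0
    word_hits = sum(1 for w in words if w in TELL_TALE_WORDS)
    # Treat short Title Case lines as "suspicious" subheadings.
    headings = len(re.findall(r"^(?:[A-Z][a-z]+ ){0,5}[A-Z][a-z]+$",
                              text, flags=re.MULTILINE))
    return word_hits / len(words) + 0.05 * headings

# A perfectly human sentence still scores "suspiciously" high.
print(naive_ai_score("Moreover, we must foster and emphasize resilience."))
```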

    And yes, all of this relates back to an inalienable truth – the asynchronous assessment of a(ny) digital asset produced without supervision as a way of assessing a student's learning will never again be reliable. There's no way to prove they made it, and even if they did, it's increasingly clear that making it doesn't necessarily signal that they've learned anything.

    But old habits and the economics of massification seem to be dying hard. And so in the meantime, increasing volumes of students are being “academically” judged to have “done it” when they may not have, in procedures and legal frameworks where, by definition, they can’t challenge that judgement. And an evaluation of whether someone’s done it based on concepts aligned to religion and aesthetics surely can’t be right.

    Cases in point

    There’s nothing that I can see in the OIA’s stock of case summaries that sheds any light on what it might or might not consider to count as “evidence” in its scheme rules.

    I don’t know whether it would take as its start point “whatever the provider says counts as evidence”, or whether it might have an objective test up its sleeve if a case crossed its desk.

    But what I do know is just how confusing and contradictory a whole raft of academic misconduct policies are.

    The very first academic misconduct policy I found online an hour or so ago says that using AI in a way not expressly permitted is considered academic misconduct. Fair enough. It also specifies that failing to declare AI use, even when permitted, also constitutes misconduct. Also fair enough.

    It defines academic judgement as a decision made by academic staff regarding the quality of the work itself or the criteria being applied. Fair enough. It also specifically states that academic judgement does not apply to factual determinations – it applies to interpretations, like assessing similarity reports or determining if the standard of work deviates significantly from a student’s usual output. Again, fair enough.

    But in another section, there’s another line – that says that the extent to which assessment content is considered to be AI generated is a matter of academic judgement.

    The in-principle problem with that for me is that a great historian is not necessarily an LLM expert, or a kind of academic Columbo. Expertise in academic subject matter just doesn’t equate to expertise in detecting AI-generated content.

    But the in-practice problem is the thing. AI detection tools supplying “evidence” are notoriously unreliable, and so universities using them within their “academic judgment” put students accused of using AI in an impossible situation – they can’t meaningfully challenge the accusation because the university has deemed it unchallengeable by definition, even though the evidence may be fundamentally flawed.

    Academic judgements that are nothing of the sort, supported by unreliable technology, become effectively immune from substantive appeal, placing the burden on students to somehow prove a negative (that they didn’t use AI) against an “expert judgment” that might be based on little more than algorithmic guesswork or subjective impressions about writing style.

    Policies are riddled with this stuff. One policy hedges its bets and says that the determination of whether such AI use constitutes academic misconduct is “likely to involve academic judgement”, especially where there is a need to assess the “extent and impact” of the AI-generated content on the overall submission. Oil? Water? Give it a shake.

    Another references “academic judgment” in the context of determining the “extent and nature” of plagiarism or misconduct, “including the use of AI” – with other bits of the policy making clear that that can’t be challenged if supported by “evidence”.

    One I’m looking at now says that the determination of whether a student has improperly used AI tools is likely to involve academic judgement, particularly when assessing the originality of the work and whether the AI-generated content meets the required academic standards. So is the judgement whether the student cheated, or whether the essay is crap? Or, conveniently, both?

    Set aside for a minute the obvious injustices of a system that seems to be profoundly incurious about how a student has come to think what they think, but seems obsessed with the method they’ve used to construct an asset that communicates those thoughts – and how redundant that approach is in a modern context.

    Game over

    For all sorts of reasons, I’ve long thought that “academic judgement” as something that can be deployed as a way of avoiding challenge and scrutiny is a problem. Barristers were stripped of their centuries-old immunity from negligence claims based on evolving expectations of professional accountability in the 2000s.

    In medicine, the traditional "Bolam test" was that a doctor was not negligent if they acted "in accordance with a responsible body of medical opinion" among their peers. But a case in the nineties added a crucial qualification – the court must be satisfied that the opinion relied upon has a "logical basis" and can withstand logical analysis.

    Or take accountancy. Prior to 2002, accountants around the world enjoyed significant protection through the principle of “professional judgment” that shielded their decisions from meaningful challenge, but the US Congress’s Sarbanes-Oxley Act radically expanded their liability and oversight following Arthur Andersen’s role in facilitating Enron’s aggressive earnings management and subsequent document shredding when investigations began.

    Palfreyman also picks out architects and surveyors, financial services professionals and insurance brokers, patent agents and trademark lawyers, software suppliers/consultants, clergy providing counseling services, and even sports officials – all of whom now face liability for their professional judgments despite the technical complexity of their work.

    As Palfreyman notes in his analysis of the Eckersley v Binnie case (which defined the standards for “reasonable skill and care” for professionals generally), the standard that a professional “should not lag behind other ordinarily assiduous and intelligent members of his profession” and must be “alert to the hazards and risks” now applies broadly across professions, with academics sticking out like a sore, often unqualified thumb.

    Maybe the principle is just about salvageable – albeit that the sorry state of moderation, external examining, workload modelling and so on does undermine the already shaky case for "we know better". But what I'm absolutely sure of is that extending the scope of unchallengeable decisions involving "academic judgement" to whether a student broke a set of AI-misconduct rules is not only a very slippery slope, but also a sure-fire way to hasten the demise of the magic power.

    Source link