Category: OfS

  • The latest sector-wide financial sustainability assessment from the Office for Students

    As the higher education sector in England gets deeper into the metaphorical financial woods, the frequency of OfS updates on the sector’s financial position increases apace.

    Today’s financial sustainability bulletin constitutes an update to the regulator’s formal annual assessment of sector financial sustainability published in May 2025. The update takes account of the latest recruitment data and any policy changes that could affect the sector’s financial outlook that would not have been taken into account at the point that providers submitted their financial returns to OfS ahead of the May report.

    Recruitment headlines

    At sector level, UK and international recruitment for autumn 2025 entry has grown by 3.1 per cent and 6.3 per cent respectively. But this is still lower than the aggregate sector forecasts of 4.1 per cent and 8.6 per cent – a shortfall that OfS estimates could leave sector-wide tuition fee income £437.8m lower than forecast. “Optimism bias” in financial forecasting might have been dialled back in recent years following stiff warnings from OfS, but these figures suggest it’s still very much a factor.

    Growth has also been uneven across the sector, with large research-intensive institutions increasing UK undergraduate numbers by a startling 9.9 per cent in 2025 (despite apparently collectively forecasting a modest decline of 1.7 per cent), and pretty much everyone else coming in lower than forecast or taking a hit. Medium-sized institutions win a hat tip for producing the most accurate prediction of UK undergraduate growth – actual growth of 2.3 per cent against projected growth of 2.7 per cent.

    The picture shifts slightly when it comes to international recruitment, where larger research-intensives have issued 3.3 per cent fewer Confirmations of Acceptance for Studies (CAS) against a forecast 6.6 per cent increase, largely driven by a reduction in visas issued to students from China. Smaller and specialist institutions, by contrast, seem to have enjoyed growth well beyond forecast. The individual institutional picture will, of course, vary even more – and it’s worth adding that the data is not perfect, as not every student applies through UCAS.

    Modelling the impact

    OfS has factored in all of the recruitment data it has and added in new policy announcements, including an estimate of the impact of the indexation of undergraduate tuition fees and of increases to employers’ National Insurance contributions – but not the international levy, because nobody knows when that is happening or how it will be calculated. It has then applied its model to providers’ financial outlook.

    The headline makes for sombre reading – across all categories of provider, OfS is predicting that if no action were taken, the number of providers operating in deficit in 2025–26 would rise from 96 to 124, representing an increase from 35 per cent of the sector to 45 per cent.
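    As a back-of-envelope check, the counts and shares quoted are mutually consistent: both imply a modelled population of roughly 275 providers. The helper below is purely illustrative – OfS does not publish its model in this form.

```python
# Back-of-envelope check of the OfS deficit figures: a count and its
# percentage share together imply a total population size.

def implied_population(count: int, share_pct: float) -> float:
    """Population implied by `count` being `share_pct` per cent of it."""
    return count / (share_pct / 100)

print(round(implied_population(96, 35)))   # 96 providers = 35% of sector
print(round(implied_population(124, 45)))  # 124 providers = 45% of sector
# Both figures come out at roughly 275, so the quoted counts and shares
# describe the same sector population to within rounding.
```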

    Contrary to the impression given by UK undergraduate recruitment headlines, the negative impact isn’t concentrated in any one part of the sector. OfS modelling suggests that ten larger research-intensive institutions could tip into deficit in 2025–26, up from five that were already forecasting themselves to be in that position. The only category of provider where OfS estimates indicate fewer providers in deficit than forecast is large teaching-intensives.

    Net liquidity of under 30 days is the number to keep an eye on, because for institutional survival running out of cash is a much bigger problem than running a deficit. OfS modelling suggests that the number of providers reporting net liquidity of under 30 days could rise from 41 to 45 in 2025–26, with overall numbers concentrated in the smaller and specialist/specialist creative groups.

    What it all means

    Before everyone presses the panic button, it’s important to remember, as OfS points out, that providers will know their own recruitment data and its impact on their bottom line, and will have taken what action they can to reduce in-year costs – though nobody should underestimate the ongoing toll those actions will have taken on staff and students.

    Longer term, as always, the outlook appears sunnier, but that’s based on some ongoing optimism in financial forecasting. If, as seems to keep happening, some of that optimism turns out to be misplaced, then the financial struggles of the sector are far from over.

    Against this backdrop, the question remains less about who might collapse in a heap and more about how to manage longer term strategic change to adapt providers’ business models to the environment that higher education providers are operating in. Though government has announced that it wants providers to coordinate, specialise and collaborate, while the sector continues to battle heavy financial weather those aspirations will be difficult to realise, however desirable they might be in principle.

  • What’s in the new Office for Students strategy?

    The Office for Students began a consultation on its 2025-30 strategy back in December 2024. Alongside the usual opportunities for written responses, a series of “feedback events” was held early in 2025, aimed specifically at higher education provider staff, FE college staff, and students and student representatives.

    In the past OfS has faced arguably justified criticism for failing to take sector feedback on proposals into account – but we should take heart that there are significant differences between what was originally proposed and what has just been finalised and published.

    Graphic design is our passion

    Most strikingly, we are presented with four new attitudes that we are told will “drive delivery of all our strategic goals in the interest of students” – and to hammer the point home, individual activities in the “roadmap” are labelled with coloured hexagonal markers where “a particular activity will exemplify certain attitudes”. We get:

    • Ambitious for all students from all backgrounds (an upward arrow in a pink hexagon)
    • Collaborative in pursuit of our priorities and in our stewardship of the sector (two stylised hands in the shape of a heart, yellow hexagon)
    • Vigilant about safeguarding public money and student fees (a pound sign on a teal hexagonal background)
    • Vocal that higher education is a force for good, for individuals, communities and the country (a stylised face and soundwave on a purple hexagon)

    Where things get potentially confusing is that the three broadly unchanged strategic goals – quality (tick, yellow circle), sector resilience (shield, blue circle), student experience and support (someone carrying an iPad, red circle) – are underpinned both by the attitudes and by the concept of “equality of opportunity” (teal ouroboros arrow). The only change at this conceptual level is that “the wider student interest” is now characterised as “experience and support”. Don’t worry – the subsections of these are the same as in the consultation.

    Fundamentally, OfS’ design language is giving openness and transparency, with a side order of handholding through what amounts to a bit of a grab-bag list of interventions. The list is pared down from the rather lengthy set of bullet points initially presented, and there are some notable changes.

    Quality

    In the quality section, what has been added is an assurance that OfS will do this “in collaboration with students, institutions, and sector experts”, and a commitment to “celebrate and share examples of excellence wherever we find them”. These are of course balanced with the corresponding stick: “Where necessary, we will pursue investigation and enforcement, using the full range of our powers.” This comes alongside clarification that the new quality system would be built on, rather than alongside, the TEF.

    What has gone is the Quality Risk Register. Though it seemed an eminently sensible addition to the OfS armoury of risk registers, the vibes from the consultation were that providers were concerned it might become another arm of regulation rather than a helpful tool for critical reflection.

    Also absent from the final strategy is any mention of exploring alignment with European quality standards, which featured in the consultation materials. Similarly, the consultation’s explicit commitment to bring transnational education into the integrated quality model has not been restated – it’s unclear whether this reflects a change in priority or simply different drafting choices.

    Students

    In the section on students, language about consumer rights is significantly softened, with much more on supporting students in understanding their rights and correspondingly less on seeking additional powers to intervene on these issues. Notably absent are the consultation’s specific commitments – the model student contract, plans for case-report publication, and reciprocal intelligence sharing. The roadmap leans heavily into general “empowerment” language rather than concrete regulatory tools. And, for some reason, language on working with the Office of the Independent Adjudicator has disappeared entirely.

    A tweak to language clarifies that OfS are no longer keen to regulate around extra-curricular activity – there will be “non-regulatory” approaches however.

    New here is a commitment to “highlight areas of concern or interest that may not be subject to direct regulation but which students tell us matter to them”. The idea looks to be that OfS can support institutions to respond proactively, working with sector agencies and other partners. It is pleasing to see a commitment to this kind of information sharing (I suspect this is where OIA has ended up) – though a commitment in the draft to continue to collect and publish data on the prevalence of sexual misconduct appears not to have made the final cut.

    Resilience

    The “navigation of an environment of increased financial and strategic risks” has been a key priority for OfS in the year since this strategy was first consulted on – and what’s welcome here is clearer drafting and a positive commitment to working with providers to improve planning for potential closures, and that OfS will “continue to work with the government to address the gaps in the system that mean that students cannot be adequately protected if their institution can no longer operate”.

    Governance – yes, OfS will not only consider an enhanced focus, it will strengthen its oversight of governance. That’s strategic action right there. Also, OfS will “work with government on legislative solutions that would stop the flow of public money when we [OfS, DfE, SLC] have concerns about its intended use.”

    Also scaled back is the consultation’s programmatic approach to governance reform. Where the consultation linked governance capability explicitly to equality and experience outcomes, the final version frames this primarily as assurance and capability support rather than a reform agenda. The shift suggests OfS moving toward a lighter-touch, collaborative posture on governance rather than directive intervention.

    Regulation

    OfS will now “strive to deliver exemplary regulation”, and interestingly the language on data has shifted from securing “modern real-time data” to embedding the principle of “collect once, use many times” – plus a pleasing promise to work with other regulators and agencies to avoid duplication.

    Two other consultation commitments have been quietly downgraded. The explicit language on working with Skills England to develop a shared view of higher education’s role in meeting regional and national skills needs has disappeared – odd given the government’s focus on this agenda. And while the Teaching Excellence Framework remains present, the consultation’s push to make TEF “more routine and more widespread” has been cooled – the final version steps back from any commitments on cadence or coverage.

    What’s missing within the text of the strategy, despite being in the consultation version, are the “I statements” – these are what Debbie McVitty characterised on Wonkhe as:

    intended to describe what achieving its strategic objectives will look and feel like for students, institutions, taxpayers and employers in a clear and accessible way, and are weighted towards students, as the “primary beneficiaries” of the proposed strategy.

    These have been published, but separately and with a few minor revisions. Quite what status they have is unclear:

    The ‘I statements’ are a distillation of our objectives, as set out in our strategy. They are not regulatory tools. We will not track the performance of universities and colleges against them directly.

  • Is there a place for LEO in regulation?

    The OfS has, following a DfE study, recently announced a desire to use LEO in regulation. In my view, this is a bad idea.

    Don’t get me wrong – the Longitudinal Education Outcomes (LEO) dataset is a fantastic and under-utilised tool for historical research. Nothing can compare to LEO for its rigour, coverage and the richness of the personal data it contains.

    However, it has serious limitations: it captures earnings, not salary, so for everyone who chooses to work part time it will seriously underestimate the salary they command.

    And fundamentally it’s just too lagged. You can add other concerns around those choosing not to work and those working abroad if you wish to undermine its utility further.

    The big idea

    The OfS is proposing using data from three years after graduation, which I assume means the third full tax year after graduation – though it could mean something different, as no details are provided. Assuming my interpretation is correct, the most recent LEO data, published in June this year, relates to the 2022-23 tax year, so for that to be the third full tax year after graduation we are looking at the 2018-19 graduating cohort (and even if you count the third tax year including the one they graduated in, it’s the 2019-20 graduates). The OfS also proposes to continue to use four-year aggregates, which makes a lot of sense to avoid statistical noise and deal with small cohorts, but it does mean that some of the data will relate to even earlier cohorts.
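    To make the lag concrete, here is a minimal sketch of that tax-year arithmetic. It assumes a summer graduation (so the first full tax year starts the following April) and labels tax years by their start year (2022-23 becomes 2022); the function names are mine, not anything OfS has specified.

```python
# Sketch of the LEO lag arithmetic, assuming "3 years after graduation"
# means the third full tax year after graduation. Tax years are labelled
# by their start year: the 2022-23 tax year is written as 2022.

def graduating_cohort(published_tax_year_start: int, lag_years: int = 3) -> str:
    """Academic year of the cohort whose `lag_years`-th full tax year
    is the published tax year.

    Graduating in the summer of calendar year Y means the first *full*
    tax year starts in April of year Y + 1, so the third starts in
    year Y + 3; working backwards, Y = published start year - 3.
    """
    grad_calendar_year = published_tax_year_start - lag_years
    # Academic years straddle two calendar years: summer-2019 graduates
    # belong to the 2018-19 cohort.
    return f"{grad_calendar_year - 1}-{str(grad_calendar_year)[-2:]}"

def entry_cohort(grad_cohort: str, course_length_years: int = 3) -> str:
    """Entry cohort for a standard course ending in `grad_cohort`."""
    start = int(grad_cohort.split("-")[0]) - (course_length_years - 1)
    return f"{start}-{str(start + 1)[-2:]}"

# The June 2025 publication covers the 2022-23 tax year...
print(graduating_cohort(2022))   # -> 2018-19
print(entry_cohort("2018-19"))   # -> 2016-17, for a three-year course
```

    So a regulator looking at the metric in 2025 is judging the outcomes of choices made by applicants in 2016 – and longer courses or four-year aggregates push the entry point back further still.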

    The problem, therefore, is that if the proposed regime had been in place this year, the OfS would only just have got its first look at outcomes for the 2018-19 graduating cohort – who were, of course, entrants in 2016-17 or earlier. Looked at through this lens, it is hard to see how one applies any serious regulatory tools to a provider failing on this metric but performing well on others – especially if it is performing well on those based on the still-lagged but more timely Graduate Outcomes survey.

    It is hard to conceive of any course that will not have had at least one significant change in the 9 (up to 12!) years since the measured cohort entered. It therefore won’t be hard for most providers to argue that the changes they have made since those cohorts entered will have had positive impacts on outcomes – and the regulator will have to give some weight to those arguments, especially if they are supported by changes in the existing progression indicator, or the proposed new skills utilisation indicator.

    A problem?

    And if the existing progression indicator is problematic, then why didn’t the regulator act on it when it had it four years earlier? The OfS could try to argue that it’s a different indicator capturing a different aspect of success, but this, at least to this commentator’s mind, is a pretty flimsy argument and is likely to fail, because earnings is a very narrow definition of success. Indeed, by having two indicators the regulator may well find itself in a situation where it can only take meaningful action if a provider is failing on both.

    OfS could begin to address the time lag by looking only at the first full tax year after graduation, but this would undoubtedly be problematic, as graduates take time to settle into careers (which is why GO is at 15 months), and of course the issues around interim study will be far more significant for this cohort. It would also still be less timely than the Graduate Outcomes survey, which itself collects the far more meaningful salary rather than earnings.

    There is of course a further issue with LEO in that it will forever be a black box for the providers being regulated using it. It will not be possible to share the sort of rich data with providers that is shared for other metrics meaning that providers will not be able to undertake any serious analysis into the causes of any concerns the OfS may raise. For example, a provider would struggle to attribute poor outcomes to a course they discontinued, perhaps because they felt it didn’t speak to the employment market. A cynic might even conclude that having a metric nobody can understand or challenge is quite nice for the OfS.

    The use of LEO in regulation is likely to generate a lot of work for the OfS and may trigger lots of debate but I doubt it will ever lead to serious negative consequences as the contextual factors and the fact that the cohorts being considered are ancient history will dull, if not completely blunt, the regulatory tools.

    Richard Puttock writes in a personal capacity.

  • The “regulatory burden” on sexual misconduct needs to lift the weight from students

    The problem with findings like “1.5 per cent of students said they were in intimate relationships with staff” is the danger of extrapolation.

    It’s in the results of the Office for Students’ (OfS) first sector-wide sexual misconduct survey – covering final year undergraduates in England who chose to take part in a clearly labelled bolt-on to the National Student Survey (NSS) earlier this year, with a response rate of just 12.1 per cent.

    But 1.5 per cent of final-year undergraduates at English providers reporting “intimate” staff-student relationships in the past 12 months still feels like a lot – especially when half involved staff members who were engaged in the student’s education and/or assessment.

    One in four respondents (24.5 per cent) said they have experienced sexual harassment since starting university, and 14.1 per cent reported experiencing sexual assault or violence.

    Most incidents involved fellow students – and of those taking place off-campus, 58.4 per cent of harassment cases and 44.1 per cent of assault cases involved someone connected to the victim’s institution.

    OfS has published a dashboard of the results, an analysis report, a guide for students and a press release where the bullets are slightly less careful about extrapolation than I’ve been above. Another report to come later will provide more detailed analysis, including results for different combinations of characteristics and findings by academic subject.

    The exercise represents OfS’ first real attempt to gather national prevalence data on sexual misconduct affecting students, having initially promised to do so back in 2022 in the context of its new Condition E6. That requires providers to take “multiple steps which could make a significant and credible difference in protecting students”.

    The survey covered three main areas – sexual harassment experiences, sexual assault and violence, and intimate staff-student relationships. Questions also included detailed behavioural descriptions to ensure accurate prevalence measurement.

    As such, the approach built on a 2023 pilot study involving volunteer providers. Since then, OfS has shortened the questionnaire whilst maintaining its core elements, leveraging NSS infrastructure to achieve national scale coverage – although for now, none of the devolved nations have taken part.

    It’s worth noting that response patterns showed quite a bit of variation between demographic groups. Students with disabilities, female students, and LGB+ students were both more likely to respond and more likely to report misconduct – creating some quite complex interpretation challenges for understanding true prevalence rates.

    Prevalence patterns and vulnerable groups

    Those caveats aside, the results show consistent vulnerability patterns across both harassment and assault. Female student respondents reported harassment rates of 33 per cent, compared to significantly lower rates among males. Student respondents with disabilities experienced harassment at 34.7 per cent and assault at 22.1 per cent – higher than those without disabilities.

    Sexual orientation showed significant differences. Lesbian, gay and bisexual respondents reported harassment rates of 46.6 per cent and assault rates of 29.8 per cent, nearly double the overall population rates. Those identifying as having “other sexual orientation” also showed elevated rates – at 40.1 per cent for harassment and 23.3 per cent for assault.

    Age was also a key factor, with those under 21 at course start showing higher vulnerability rates – 31.2 per cent experienced harassment and 18.2 per cent experienced assault.

    In terms of behaviours, the survey found “making sexually suggestive looks or staring at your body” affected 16.7 per cent of all respondents – the most common individual harassment behaviour. This was followed by “making unwelcome sexual comments or asking sexualised questions about your private life, body, or physical appearance.”

    The patterns have direct relevance for E6’s training requirements, which mandate that induction sessions ensure students “understand behaviour that may constitute harassment and/or sexual misconduct.” The prevalence of apparently “lower-level” behaviours like staring suggests providers need to address misconceptions about what constitutes harassment – particularly given the survey’s use of legal definitions from the Equality Act 2010 and Protection from Harassment Act 1997.

    There were also interesting patterns across socioeconomic and ethnic lines that deserve interrogation. Those from the least deprived areas (IMD quintile 5) reported higher harassment rates at 32.6 per cent, but so did those not eligible for free school meals, who showed elevated rates at 32.9 per cent. And mixed ethnicity respondents reported harassment at 31.5 per cent compared to 27.9 per cent among white students.

    Where groups showed higher misconduct rates, part of the problem is that we can’t be sure whether that reflects reporting confidence, different social environments, or varying exposure patterns – all things providers will need to understand to make progress on the “credible difference” thing.

    The ethnic dimension also intersects with religious identity, with Jewish respondents (29.8 per cent), those with no religion (30.5 per cent), and those from “any other religion” (35.5 per cent) showing elevated harassment rates. Again, differential intersectional patterns should align with E6’s requirements for providers to understand their specific student populations and tailor interventions accordingly.

    The reporting crisis

    One of the survey’s most concerning findings relates to formal reporting rates. Only 13.2 per cent of respondents experiencing harassment in the past year made formal reports to their institutions. For sexual assault (in a university setting or involving someone connected to the university) reporting varied dramatically by age – just 12.7 per cent of under-21s reported incidents compared to 86.4 per cent of those aged 31 and above.

    This reporting gap in turn creates a fundamental information deficit for universities attempting to understand campus culture and develop appropriate interventions. The data suggests institutions may be operating with incomplete intel – hampering attempts to comply with E6 requirements to understand student populations and implement effective protective measures.

    E6 explicitly requires providers to offer “a range of different mechanisms” for making reports, including online and in-person options, and to “remove any unnecessary actual or perceived barriers” that might make students less likely to report. The survey’s findings suggest the mechanisms may not be reaching their intended audiences, particularly younger students.

    Among those who did report, experiences were mixed. For harassment cases, 46.7 per cent rated their reporting experience as good whilst 39.3 per cent rated it as poor. Sexual assault reporting showed slightly better outcomes, with 57.3 per cent rating experiences as good and 32.4 per cent as poor. These are findings that directly relate to E6’s requirements – and suggest the sector has some way to go to build confidence in the processes it does have.

    The condition mandates that providers ensure “investigatory and disciplinary processes are free from any reasonable perception of bias” and that affected parties receive “sufficient information to understand the provider’s decisions and the reasons for them.” The proportion rating experiences as poor does suggest that some providers are struggling to meet E6’s procedural fairness requirements.

    University connections and scope of misconduct

    Jurisdiction has always been a contested issue in some policies – here, misconduct frequently involved university-connected individuals even when incidents occurred off-campus. Among harassment cases not occurring in university settings, 58.4 per cent involved someone connected to the victim’s university. For assault cases, that figure was 44.1 per cent.

    Student perpetrators dominated both categories. Staff perpetrators appeared less frequently overall, though older students were more likely than younger groups to report staff involvement in assault cases.

    In E6 terms, the condition explicitly covers “the conduct of staff towards students, and/or the conduct of students towards students” and applies to misconduct “provided in any manner or form by, or on behalf of, a provider.” The data suggests universities’ efforts will need to explicitly extend beyond physical premises to encompass behaviour involving community members regardless of location.

    In fact, most recent harassment incidents occurred either entirely outside university settings (39.7 per cent) or across mixed locations (45.1 per cent), with only 15.2 per cent occurring exclusively in university settings. For sexual assault, 61.9 per cent occurred outside university settings entirely.

    The patterns all point to providers needing sophisticated approaches to addressing misconduct that span campus boundaries. Traditional safety measures, or at least student perceptions of jurisdiction, might well miss the majority of incidents affecting students – broader community engagement and partnership approaches will need to be deployed.

    Support confidence

    The survey also examined students’ confidence in seeking institutional support – finding that 67.5 per cent felt confident about where to seek help, whilst 29.3 per cent lacked confidence. But confidence levels varied significantly across demographic groups, with particular variations by sexual orientation, sex, disability status, and age.

    The differential confidence patterns also justify the E6 requirement for providers to ensure “appropriate support” is available and targeted at different student needs. It specifically requires support for students “with different needs, including those with needs affected by a student’s protected characteristics.”

    The age-related reporting gap suggests younger students may face particular barriers to accessing institutional processes. This could relate to unfamiliarity with university systems, power dynamics, or different attitudes toward formal complaint mechanisms. For sexual assault cases, the contrast between 12.7 per cent reporting among under-21s versus 86.4 per cent among over-31s represents one of the survey’s most striking findings.

    The age-related patterns have specific relevance given E6’s training and awareness requirements. The condition requires providers to ensure students are “appropriately informed to ensure understanding” of policies and behaviour constituting misconduct. The survey suggests the requirement may need particular attention for younger students – they’re showing both higher vulnerability and lower reporting rates.

    Staff-student relationships

    The survey’s staff-student relationship findings concern a small proportion of the student population – but they raise real questions about power dynamics and institutional governance.

    Among the 1.5 per cent reporting those relationships, the high proportion involving educational or professional responsibilities suggests significant potential conflicts of interest.

    Respondent students without disabilities were more likely to report relationships involving educational responsibility (72.6 per cent versus 45.5 per cent for disabled students), and similar patterns emerged for professional responsibilities. The differences deserve investigation, particularly given disabled students’ higher overall misconduct rates.

    E6’s requirements on intimate personal relationships require that providers implement measures making “a significant and credible difference in protecting students from any actual or potential conflict of interest and/or abuse of power.”

    The survey’s power dynamic findings suggest the requirement is needed – although whether the most common approach that has emerged (a ban where there’s a supervisory relationship, and a register where there isn’t) creates the right “culture” is a remaining question, given students’ views in general on professional boundaries.

    Regulatory implications

    The survey’s findings raise real questions about how OfS will use prevalence data in its regulatory approach. Back in 2022, Susan Lapworth told the House of Commons Women and Equalities Committee hearing that the data would enable the targeting of interventions:

    “So a university with high prevalence and low reporting would perhaps raise concerns for us – and we would want to then understand in detail what was going on there and that would allow us to focus our effort.”

    Of course, as with Access and Participation, having national data on “which kinds of students in which contexts are affected by this” could well mean that what shows up in provider data as a very small problem could add up to a lot across the country. OfS’ levers in these contexts are always limited.

    The lack of survey coverage of postgraduate students turns up here as a major problem. We might theorise that many postgraduates exhibit multiple vulnerabilities, given the prevalence of international students and of students working closely with supervisors – patience with OfS’ focus on undergraduates really is wearing thin each time it manifests.

    The report also doesn’t look at home versus international student status, nor does it disaggregate results by provider mission group, size, type, or characteristics. It only states that all eligible English providers in NSS 2025 were included, and that data are weighted to be representative of final-year undergraduates across the sector. Providers are also (confidentially) receiving their own data – although response rates at provider level may make drawing conclusions in the way originally envisaged difficult.

    The dramatic under-reporting rates create monitoring challenges for both institutions and OfS. If only 13.2 per cent of harassment victims make formal reports, institutional complaint statistics provide limited insight into actual campus culture. The information gap complicates E6 compliance assessment – and suggests OfS may need alternative monitoring approaches beyond traditional complaint metrics.

    E6 does explicitly contemplate requiring providers to “conduct a prevalence survey of its whole student population to the OfS’s specification” where there are compliance concerns. The 2025 survey’s methodology and findings provide a template, but it also seems to me that more contextual research – like that found in Anna Bull’s research from a couple of years back – is desperately needed to understand what’s going on beneath many of the numbers.

    Overall though, I’m often struck by the extent to which providers argue that things like E6 are an over-reach or an example of “burden”. On this evidence, even with all the caveats, it’s nothing like the burden being carried by victims of sexual misconduct.


  • What OfS’ data on harassment and sexual misconduct doesn’t tell us

    New England-wide data from the Office for Students (OfS) confirms what we have known for a long time.

    A concerningly high number of students – particularly LGBTQ+ and disabled people, as well as women – are subjected to sexual violence and harassment while studying in higher education. Wonkhe’s Jim Dickinson reviews the findings elsewhere on the site.

    The data is limited to final year undergraduates who filled out the National Student Survey, who were then given the option to fill out this further module. OfS’ report on the data details the proportion of final year students who experienced sexual harassment or violence “since being a student” as well as their experiences within the last 12 months.

    It also includes data on experiences of reporting, as well as prevalence of staff-student intimate relationships – but its omission of all postgraduate students, as well as all undergraduates other than final year students means that its findings should be seen as one piece of a wider puzzle.

    Here, I try to lay out a few of the other pieces of the puzzle to help put the new data in context.

    The timing is important

    On 1st August 2025 the new condition of registration for higher education providers in England came into force, which involves regulatory requirements for all institutions in England to address harassment and sexual misconduct, including training for all staff and students, taking steps to “prevent abuses of power” between staff and students, and requiring institutions to publish a “single, comprehensive source of information” about their approach to this work, including support services and handling of reports.

    When announcing this regulatory approach last year, OfS also published two studies from 2024 – a pilot prevalence survey of a small selection of English HEIs, as well as a “poll” of a representative sample of 3,000 students. I have discussed that data, as well as the regulation more generally, elsewhere.

    In this year’s data release, 51,920 students responded to the survey, with an overall response rate of 12.1 per cent. This is a significantly larger sample size than both of the 2024 studies, which comprised responses from 3,000 and 5,000 students respectively.

    This year’s survey finds somewhat lower prevalence figures for sexual harassment and “unwanted sexual contact” than last year’s studies. In the new survey, sexual harassment was experienced by 13.3 per cent of respondents within the last 12 months (and by 24.5 per cent since becoming a student), while 5.4 per cent of respondents had been subjected to unwanted sexual contact or sexual violence within the last 12 months (since becoming a student, this figure rises to 14.1 per cent).

    By any measure, these figures represent a very concerning level of gender-based violence in higher education populations. But if anything, they are at the lower end of what we would expect.

    By comparison, in OfS’ 2024 representative poll of 3000 students, over a third (36 per cent) of respondents had experienced some form of unwanted sexual contact since becoming a student with a fifth (21 per cent) stating the incident(s) happened within the past year. 61 per cent had experienced sexual harassment since being a student, and 43 per cent of the total sample had experienced this in the past year.

    The lower prevalence in the latest dataset could be (in part) because it draws on a population of final year undergraduate students – studies from the US have repeatedly found that first year undergraduate students are at the greatest risk, especially when they start their studies.

    Final year students may simply have forgotten – or blocked out – some of their experiences from first year, leading to lower prevalence. They may also have dropped out. The timing of the new survey is also important – the NSS is completed in late spring, while we would expect more sexual harassment and violence to occur when students arrive at university in the autumn.

    A study carried out in autumn or winter might find higher prevalence. Indeed, the previous two studies carried out by OfS involved data collected at different times of year – in August 2023 (for the 3,000-strong poll) and “autumn 2023” (for the pilot prevalence study).

    A wide range of prevalence

    Systematic reviews published in 2023 by Steele et al and Lagdon et al, drawing on studies from across the UK, Ireland and the US, have found prevalence rates of sexual violence ranging from 7 per cent to 86 per cent.

    Steele et al.’s recent study of Oxford University found that 20.5 per cent of respondents had experienced at least one act of attempted or forced sexual touching or rape, and 52.7 per cent of respondents experienced at least one act of sexual harassment within the past year.

    Lagdon et al.’s study of “unwanted sexual experiences” in Northern Ireland found that a staggering 63 per cent had been targeted. And my own study of a UK HEI found that 30 per cent of respondents had been subjected to sexual violence since enrolling in their university, and 55 per cent had been subjected to sexual harassment.

    For now, I don’t think it’s helpful to get hung up on comparing datasets between last year and this year that draw on somewhat different populations. It’s also not necessarily important that respondents were self-selecting within those who filled out the NSS – a US study compared prevalence rates for sexual contact without consent among students between a self-selecting sample and a non-self-selecting sample, finding no difference.

    The key take-home message is that students are being subject to a significant level of sexual harassment and violence, and particularly women, LGBTQ+ and disabled students are unable to access higher education in safety.

    Reporting experiences

    The findings on reporting reveal some important challenges for the higher education sector. According to OfS’ new survey findings, rates of reporting to higher education institutions remain relatively low, at 13.2 per cent of those experiencing sexual harassment and 12.7 per cent of those subjected to sexual violence.

    Of students who reported to their HEI, only around half rated their experience as “good”. Women, as well as disabled and LGBTQ+ students, reported much lower rates of satisfaction with reporting than men, heterosexual and non-disabled students who reported incidents to their university.

    This survey doesn’t reveal why students were rating their reporting experiences as poor, but my study Higher Education After #MeToo sheds light on some of the reasons why reporting is not working out for many students (and staff).

    At the time of data collection in 2020-21, a key reason was that – according to staff handling complaints – policies in this area were not yet fit for purpose. It’s therefore not surprising that reporting was seen as ineffective and sometimes harmful for many interviewees who had reported. Four years on, hopefully HEIs have made progress in devising and implementing policies in this area, so other reasons may be relevant.

    A further issue focused on by my study is that reporting processes for sexual misconduct in HE focus on sanctions against the reported party rather than prioritising safety or other needs of those who report. Many HEIs do now have processes for putting in place safety (“precautionary” or “interim”) measures to keep students safe after reporting.

    Risk assessment practices are developing. But these practices appear to be patchy and students (and staff) who report sexual harassment or violence are still not necessarily getting the support they need to ensure their safety from further harm. Not only this, but at the end of a process they are not usually told the actions that their university has taken as a result of the report.

    More generally, there’s a mismatch between why people report and what is on offer from universities. Forthcoming analysis of the Power in the Academy data on staff-student sexual misconduct reveals that by the time a student gets to the point of reporting or disclosing sexual misconduct from faculty or staff to their HEI, the impacts are already being felt more severely than by those who do not report.

    In laywoman’s terms, if people report staff sexual misconduct, it’s likely to be having a really bad impact on their lives and/or studies. Reasons for reporting are usually to protect oneself and others and to be able to continue in work/study. So it’s crucial that when HEIs receive reports, they are able to take immediate steps to support students’ safety. If HEIs are listening to students – including the voices of those who have reported or disclosed to their institution – then this is what they’ll be hearing.

    Staff-student relationships

    The survey also provides new data on staff-student intimate relationships. The survey details that:

    By intimate relationship we mean any relationship that includes: physical intimacy, including one-off or repeated sexual activity; romantic or emotional intimacy; and/or financial dependency. This includes both in person and online, or via digital devices.

    From this sample, 1.5 per cent of respondents stated that they had been in such a relationship with a staff member. Of those who had been involved in a relationship, a staggering 68.8 per cent of respondents said that the university or college staff member(s) had been involved with their education or assessment.

    Even as someone who researches within this area, I’m surprised by how high both these figures are. While not all students who enter into such relationships or connections will be harmed, for some, deep harms can be caused. While a much higher proportion of students who reported “intimate relationships” with staff members were 21 or over, age of the student is no barrier to such harms.

    It’s worth revisiting some of the findings from 2024 to give some context to these points. In the 3000-strong representative survey from the OfS, a third of those in relationships with staff said they felt pressure to begin, continue or take the relationship further than they wanted because they were worried that refusing would negatively impact them, their studies or career in some way.

    Even consensual relationships led to problems when the relationship broke up. My research has described the ways in which students can be targeted for “grooming” and “boundary-blurring” behaviours from staff. These questions on coercion from the 2024 survey were omitted from the shorter 2025 version – but assuming such patterns of coercion are present in the current dataset, these findings are extremely concerning.

    They give strong support to OfS’ approach towards staff-student relationships in the new condition of registration. OfS has required HEIs to take “one or more steps which could make a significant and credible difference in protecting students from any actual or potential conflict of interest and/or abuse of power.”

    Such a step could include a ban on intimate personal relationships between relevant staff and students, but HEIs may instead choose to propose other ways to protect students from abuses of power by staff. While most HEIs appear to be implementing partial bans on such relationships, some have chosen not to.

    Nevertheless, all HEIs should take steps to clarify appropriate professional boundaries between staff and students – which, as my research shows, students themselves overwhelmingly want.

    Gaps in the data

    The publication of this data is very welcome in contributing towards better understanding patterns of victimisation among students in HE. It’s crucial to position this dataset within the context of an emerging body of research in this area – both the OfS’ previous publications, but also academic studies as outlined above – in order to build up a more nuanced understanding of students’ experiences.

    Some of the gaps in the data can be filled from other studies, but others cannot. For example, while the new OfS regulatory condition E6 covers harassment on the basis of all protected characteristics, these survey findings focus only on sexual harassment and violence.

    National data on the prevalence of racial harassment or on harassment on the basis of gender reassignment would be particularly valuable in the current climate. This decision seems to be a political choice – sexual harassment and violence is a focus that both right- and left-wing voices can agree should be addressed as a matter of urgency, while it is more politically challenging (and therefore, important) to talk about racial harassment.

    The data also omits stalking and domestic abuse, which young people – including students – are more likely than other age groups to be subjected to, according to the Crime Survey of England and Wales. My own research found that 26 per cent of respondents in a study of gender-based violence at a university in England in 2020 had been subjected to psychological or physical violence from a partner.

    It does appear that despite the narrow focus on sexual harassment and violence from the OfS, many HEIs are taking a broader approach in their work, addressing domestic abuse and stalking, as well as technology-facilitated sexual abuse.

    Another gap in the data analysis report from the OfS is around international students. Last year’s pilot study of this survey included some important findings on their experiences. International students were less likely to have experienced sexual misconduct in general than UK-domiciled students, but more likely to have been involved in an intimate relationship with a member of staff at their university (2 per cent of international students in contrast with 1 per cent of UK students).

    They were also slightly more likely to state that a staff member had attempted to pressure them into a relationship. Their experiences of accessing support from their university were also poorer. These findings are important in relation to any new policies HEIs may be introducing on staff-student relationships: as international students appear to be more likely to be targeted, communications around such policies need to be tailored to this group.

    We also know that the same groups who are more likely to be subjected to sexual violence and harassment are also more likely to experience more of it, i.e. a higher number of incidents. The new data from OfS do not report on how many incidents were experienced. Sexual harassment can be harmful as a one-off experience, but if someone is experiencing repeated harassment or unwanted sexual contact from one or more others in their university environment (and both staff and student perpetrators are likely to carry out repeated behaviours), then this can have a very heavy impact on those targeted.

    The global context

    Too often, policy and debate in England on gender-based violence in higher education fails to learn from the global context. Government-led initiatives in Ireland and Australia show good practice that England could learn from.

    Ireland ran a national researcher-led survey of staff as well as students in 2021, due to be repeated in 2026, producing detailed data that is being used to inform national and cross-institutional interventions. Australia has carried out two national surveys – in 2017 and 2021 – and informed by the results has just passed legislation for a mandatory National Higher Education Code to Prevent and Respond to Gender-based Violence.

    The data published by OfS is much more limited than these studies from other contexts in its focus on final year undergraduate students only. It will be imperative to make sure that HEIs, OfS, government or other actors do not rely solely on this data – and future iterations of the survey – as a tool to direct policy, interventions or practice.

    Nevertheless, in the absence of more comprehensive studies, it adds another piece to the puzzle in understanding sexual harassment and violence in English HE.


  • Higher education mergers are a marathon not a sprint

    When the announcement came last Wednesday that the universities of Kent and Greenwich are planning to merge, the two institutions did a fine job of anticipating all the obvious questions.

    In particular, announcing that the totemic decision has already been taken on who should lead the new institution – University of Greenwich vice chancellor Jane Harrington – was a pragmatic move that will save a great deal of gossip and speculation that could otherwise have derailed the discussions that will now commence on how to turn an “intention to formally collaborate” into the “first-of-its-kind multi university group.”

    But even with that really tricky bit of business out of the way, there is still a lot to work through. Broadly those questions fall into two baskets: the strategic direction and the practical fine detail. Practicalities are important for giving reassurance that people’s lives aren’t about to radically change overnight; albeit there are inevitably lots of issues that are either formally unknown at this stage or which can only be tackled in light of the evolution of the final agreement and organisational structure.

    With that in mind, it is really worth emphasising that the notion of a “multi university group” is a brand new idea, given a conceptual shape in the very recent publication Radical collaboration: a playbook from KPMG and Mills & Reeve, produced under the auspices of the Universities UK transformation and efficiency taskforce. The idea of a “multi university trust” explored in that report, derived from the school sector, posits the creation of a single legal entity that can nevertheless “house” a range of distinct “trading entities” with unique “brands” each with an agreed level of local autonomy.

    It answers the question of how you take two (or more) institutions, each with their own histories and characteristics and find ways to create the strength and resilience that scale might offer, while retaining the local distinctive characteristics that staff, students, and local communities value and feel a sense of affinity to. It also, as has been noted in the coverage following the announcement, leaves an option open for other institutions to join the new structure, if there’s a case for them to do so.

    “It is very positive to see institutions taking proactive steps to finding new ways to work together,” says Sam Sanders, head of education, skills and productivity for KPMG in the UK. “The group structure proposed is a model we have seen be successful elsewhere, where brand identity is retained but you get economies of scale, meaning institutions can focus on their core activities while sharing the burden of the overheads. If it goes well it could act as a blueprint for other similar ventures.”

    Sam’s reflection is that establishing a new entity might be the most straightforward part of the process: “The complicated part is moving to a new model that simultaneously preserves the right culture in the right places while achieving the savings you might want to see in areas like IT, infrastructure, and estates. These are multi-year agendas so everyone involved needs to be prepared for that.”

    The long and winding road

    With lots to work through, it’s really important to step back, and give space to the institutions to work this out. Because the big picture is about mapping what that critical path looks like from single-institution vulnerabilities to strength in numbers – and that is a path that these institutions and their governing bodies are, to a large extent, carving out as they go, potentially doing the wider sector a service in the process as others may look to follow the same path in the future.

    “The sector response has been overwhelmingly positive,” says Jane Harrington, who is already fielding calls from heads of institution who are curious about the planned new model. Both Jane and University of Kent acting vice chancellor Georgina Randsley de Moura have experience with group structures in schools and further education, knowledge they drew on in thinking through the options for formal collaboration – starting with ten different possible models which were narrowed down to two that were explored in more depth.

    “We started with what we wanted to achieve, and then we looked for models,” says Georgina. “We kept going back to our principles: widening participation, education without boundaries, high quality teaching and research, and what will make sense for our regions. Inevitably there is some focus in the news around finances and that is an important part of the context, but this would not work if our universities didn’t have values and mission alignment.”

    “We also had examples in mind of where we don’t want to end up,” adds Jane. “You see mergers where the brand identity is lost and it takes a decade to get it back. We have, right now, two student-facing brands that are strong in their own right. And in five or ten years time it might be that we have four or five institutions that are part of this structure – we don’t think it would make sense for them to become part of one amorphous brand.”

    It’s frequently observed that bringing together two or more institutions that are facing difficult financial headwinds may simply create a larger institution with correspondingly larger challenges. So having a very clear strategic sense of where the strengths and opportunities lie, as well as where risks and weaknesses might also be subject to force-multiplier effects, is pretty important at the outset.

    It’s clear that there is an efficiency agenda in play in the sense that merging allows for the adoption of a single set of systems and processes – an area where Jane is especially interested in curating creative thinking. But the wider opportunities afforded by scale are also compelling, especially in being more strategic about the collective skills and innovation offer to the region.

    Kent and Medway local councils and MPs have also responded enthusiastically to the universities’ proposal, the two heads of institution tell me – not least because navigating politics around different HE providers can be a headache for regional actors who want to engage higher education institutions in key regional agendas.

    “There are cold spots in our region where nobody is offering what is needed,” says Jane. “But developing new provision is much harder when you are acting alone. This region has pockets of multiple forms of deprivation: rural, urban and coastal. The capacity and scale afforded by combining means we can think strategically about how to do the regional growth work, and what our combined offer should be, including to support reskilling and upskilling.”

    Georgina makes a similar case for combining research strengths. “Our shared research areas, like health, food sustainability, and creative industries, play to regional strengths,” she says. “When research resources are constrained, by combining we can do more.”

    We can work it out

    The multi university group is not, in theory, a million miles from a federation in structure: in a federation, the constituent elements generally cede a degree of autonomy to a single governing body, but each entity retains its individual legal status. A critical difference is the extent to which a sharing economy among the entities would have to be painstakingly negotiated for a federation, which could erode the value that is created in collaborating. It could also raise tricky questions around things like VAT.

    But the sheer novelty of the multi university group also raises a bunch of regulatory questions, covered in all the depth you’d expect by DK elsewhere on the site – to give a flavour, can you use the word “university” for your trading entity without that existing as a legal entity with its own degree awarding powers?

    The supportive noises from DfE and OfS at the time of the initial announcement should give Kent and Greenwich some degree of comfort as they work through some of these questions. The sector has been making the argument for some time now that if the government and regulator want to see institutions seizing the initiative on innovative forms of collaboration, there will need to be some legal and regulatory quarter given, up to and including making active provision for forms of collaboration that emerge without a legal playbook.

    Aside from the formal conditions for collaboration, how OfS conducts itself in this period will be watched closely by others considering similar moves. While nobody would suggest that changing structure offers an excuse for dropping the ball on quality or student experience – and both heads of institution are very clear there is no expectation of that happening – OfS now has a choice. It can choose to be highly activist in requesting reams of documentation and evidence in response to events as they unfold, from institutions already grappling with a highly complex landscape. Or it can work out an approach that offers a degree of advance clarity to the institutions about what their accountabilities are in this time of transition, and how they can and should keep the regulator informed of any material risks to students arising from the process.

    Despite the generally positive response, there is no shortage of scepticism about whether a plan like the one proposed can work. The answer, of course, depends on what you think success looks like. Certainly, anyone expecting a sudden and material shrinkage in costs is bound to be disappointed. Decisions will be made along the way with which some disagree, perhaps profoundly.

    But I think what is often forgotten in these discussions is that the alternative to the decision to pursue a new structure is not to carry on in broadly the same way as before, but to pursue a different but equally radical and equally contentious course of action. If the status quo was satisfactory then there would be no case for the change. In that sense, being as useful as possible in helping these two institutions make the very best fist that they can of their new venture is the right thing for everyone to do, from government downwards.


  • Back to the future for the TEF? Back to school for OfS?

    As the new academic year dawns, there is a feeling of “back to the future” for the Teaching Excellence Framework (TEF).

    And it seems that the Office for Students (OfS) needs to go “back to school” in its understanding of the measurement of educational quality.

    Both of these feelings come from the OfS Chair’s suggestion that the level of undergraduate tuition fees institutions can charge may be linked to institutions’ TEF results.

    For those just joining us on TEF-Watch, this is where the TEF began back in the 2015 Green Paper.

    At that time, the idea of linking tuition fees to the TEF’s measure of quality was dropped pretty quickly because it was, and remains, totally unworkable in any fair and reasonable way.

    This is for a number of reasons that would be obvious to anyone who has a passing understanding of how the TEF measures educational quality, which I wrote about on Wonkhe at the time.

    Can’t work, won’t work

    First, the TEF does not measure the quality of individual degree programmes. It evaluates, in a fairly broad-brush way, a whole institution’s approach to teaching quality and related outcomes. All institutions have programmes of variable quality.

    This means that linking tuition fees to TEF outcomes could lead to significant numbers of students on lower quality programmes being charged the higher rate of tuition fees.

    Second, and even more unjustly, the TEF does not give any indication of the quality of education that students will directly experience.

    Rather, when they are applying for their degree programme, it provides a measure of an institution’s general teaching quality at the time of its last TEF assessment.

    Under the plans currently being considered for a rolling TEF, this could be up to five years previously – which would mean it gives a view of educational quality at least nine years before applicants will graduate. Even if it was from the year before they enrol, it will be based on an assessment of evidence that took place at least four years before they will complete their degree programme.

    Those knowledgeable about educational quality understand that, over such a time span, educational quality could have dramatically changed. Given this, on what basis can it be fair for new students to be charged the higher rate of tuition fees as a result of a general quality of education enjoyed by their predecessors?

    These two reasons would make a system in which tuition fees were linked to TEF outcomes incredibly unfair. And that is before we even consider its impact on the TEF as a valid measure of educational quality.

    The games universities play

    The higher the stakes in the TEF, the more institutions will feel forced to game the system. In the current state of financial crisis, any institutional leader is likely to feel almost compelled to pull every trick in the book in order to ensure the highest possible tuition fee income for their institution.

    How could they not given that it could make the difference between institutional survival, a forced merger or the potential closure of their institution? This would make the TEF even less of an effective measure of educational quality and much more of a measure of how effectively institutions can play the system.

    It takes very little understanding of such processes to see that institutions with the greatest resources will be in by far the best position to finance the playing of such games. Making the stakes so high for institutions would also remove any incentive for them to use the TEF as an opportunity to openly identify educational excellence and meaningfully reflect on their educational quality.

    This would mean that the TEF loses any potential to meet its core purpose, identified by the Independent Review of the TEF, “to identify excellence and encourage enhancement”. It will instead become even more of a highly pressurised marketing exercise with the TEF outcomes having potentially profound consequences for the future survival of some institutions.

    In its own terms, the suggestion about linking undergraduate tuition fees to TEF outcomes is nothing to worry about. It simply won’t happen. What is a much greater concern is that the OfS is publicly making this suggestion at a time when it is claiming it will work harder to advocate for the sector as a force for good, and also appears to have an insatiable appetite to dominate the measurement of educational quality in English higher education.

    Any regulator that had the capacity and expertise to do either of these things would simply not be making such a suggestion at any time but particularly not when the sector faces such a difficult financial outlook.

    An OfS out of touch with its impact on the sector. Haven’t we been here before?

    Source link

  • You’re not on the list

    The pause in accepting applications to the Office for Students register is to be lifted on 28 August and, as Jim Dickinson notes, new providers now have a whole set of extra conditions that will apply to them.

    As was spotted at the consultation, it will become odd that 430 or so “old” providers have less stringent rules than those who join the register afresh. OfS is very clear – the new rules protect students and taxpayers.

    Meanwhile DfE is suggesting raising the stakes of the register by requiring both larger providers that teach via franchise and those that want to deliver courses funded by the Lifelong Learning Entitlement (LLE) to join the register.

    Why a Register?

    One of the enduring regulatory mechanisms of English higher education is the list or register. From the outset of the regulated and funded system after the First World War there’s been a list that you needed to be on. These were invariably linked to hierarchies of status and funding. There were criteria with serious cut-offs: getting on the UGC funding list, making the list to be one of the colleges of advanced technology, or one of the polytechnics, or one of the colleges to be incorporated (and funded by PCFC) or to be allowed onto the HEFCE list.

    The intention with the OfS register was slightly different. It was more encompassing: many more providers were due to join, including those that didn’t want funding (or the encumbrances of funding), as there was no opt out. But it turned out that you could opt out after all; many providers have existed (some might say flourished) outside the register via the franchise route. Even benefits that were supposed to be just for those on the register – say the use of “university” in your title – have been granted to providers not on it.

    What’s recently vanished is the idea of a new (possibly lighter touch) category of registration for providers to offer the LLE. Back at the start of the OfS a registered (basic) category fell away. Given concerns about the behaviour of a few providers in the current registration categories, an Approved-lite wasn’t plausible.

    Consultation

    As you’d expect, the OfS says “We have decided to implement our proposals in broadly the form on which we consulted”. There are tweaks, of course, particularly splitting out the initial tests on governance and management. These are at the heart of many of the problems we’ve seen – in particular how governance interacts with directors who may also be the proprietors of a for-profit business.

    Being on the register is a major kitemark: the guarantee of quality. The framing of the new conditions is that the old ones weren’t sufficient. We can see that the Department for Education has been working around the register – the replacement for the DET (the FE training course, which was the source of many problems) being limited to providers with degree awarding powers, and those without TEF Gold or Silver going through a tougher process to run LLE modules.

    A group of providers who have previously tried to get on the register will have to try again. OfS has stopped formally refusing applications.

    We can see that some of the providers now franchising were refused in the past – and so could the Public Accounts Committee (who were not amused). OfS took the unusual step of publishing a case note on Oxford Business College’s application to register, clearly indicating there was no way it was getting on the list, but only after OBC had withdrawn its application and DfE turned off the funding taps. It’s worth noting that OBC’s purported governance arrangements (including a former VC as its chair) seemingly vanished when it started to unravel. OfS has reduced the period before you can apply again, but it’s clear they want to stop half-baked applications.

    Leaving the Register

    There remains an issue in that the threat of being removed from the register isn’t much of a threat if you are exiting HE and entering administration. OfS used the formulation “The provider no longer wishes to access the benefits of OfS registration” for the first colleges that left; Applied Business Academy left as “The provider is no longer able to provide higher education” and Brit College as “The provider will no longer seek to meet the conditions required to remain registered as a higher education provider in England”. Brit, another registered provider, has had its courses de-designated by the DfE.

    If the loophole of providing HE while not being on the register is closed, then there will be plenty of pressure on the system. We’ve seen plenty of complaints about the time it takes to register, and we will need to see whether OfS’s proposals make this better. OfS’s performance data does not record the time taken to resolve an application, but we’ve seen some take four years. Clearly OfS thinks that some providers are under-prepared for the registration process, and it also wants to speed up the refusal (or forced withdrawal) process.

    We can only see some of the issues that OfS and DfE are dealing with; these new conditions and processes are designed to close loopholes. As an example, we saw OfS refuse registration to Spurgeon’s College because its finances were poor, only to have to admit them a few months later after they had secured a loan. Spurgeon’s is now in administration. There are increased requirements for financial information in the application process. It’s hard to look at the list of prohibited behaviours linked to C5 on student fairness without imagining that a lot of these have been reported to OfS at some point.

    Gas panic

    The OfS register sits in the background – on the website this is literally the case, as it lurks under “for providers” and “regulatory resources”. The way it manifests itself could do with a spruce up – the web version has been marked “beta” since 2022, and parts of the functionality and data need an overhaul. Checking the OfS register is never quite like looking up whether your engineer has a safe gas certificate, but an increasing number of students have found their HE experience has blown up.

    It’s hard to argue against protecting students and taxpayers, but the 429 providers already on the register should take a look at these conditions. As Jim Dickinson notes, there’s a challenge here on fairness for all providers, and it’s hard to imagine that OfS won’t want to see both the fairness and the governance and management aspects of the new E conditions apply to all providers pretty soon.

    Source link

  • OfS pushes ahead with two tier fairness for students

    Good news for students in England. Providers will soon be subject to tough new rules that ensure they’re treated fairly. But only if they’re at a new provider. Elsewhere, it seems, the unfairness can reign on!

    Just a few days before applications to join its register reopen, the Office for Students (OfS) has published consultation outcomes and final decisions on reforms to its registration requirements.

    It sets out the regulator’s decisions following its February 2025 consultation on changes to the entry conditions that higher education providers have to meet to register with OfS, and therefore access student loan funding. It covers:

    • A new initial condition C5 (treating students fairly), replacing the old consumer protection and student protection plan conditions (C1 and C3).
    • New governance conditions E7, E8 and E9, replacing the old governance requirements (E1 and E2).
    • Tighter application requirements, including more detailed financial planning, declarations about investigations, and restrictions on resubmitting applications after refusal.

    Confusingly, the changes interact closely with two separate consultations on subcontracting.

    First, in January 2025 the Department for Education consulted on requiring delivery providers in franchised or subcontractual arrangements to register directly with OfS for their students to be eligible for student support.

    Then, in June 2025 OfS ran its own consultation on the regulation of subcontracted provision, focusing on how such providers would be assessed, overseen, and held accountable if brought into the system.

    These reforms don’t themselves impose registration on subcontracted delivery providers, but they prepare the ground – the new conditions clarify how subcontracted applicants could meet C5 and related requirements, and OfS signals that it is ready to align with whatever the government decides on the January DfE proposals.

    Chin plasters

    We’re several months on now from the initial jaw on the floor moment, but by way of reminder – the main proposals on treating students fairly are justified as follows:

    Providers are facing increasing financial challenges. They must have effective management and governance to navigate those challenges in a way that delivers good student outcomes. Where providers are making tough financial decisions, they must continue to meet the commitments they have made to students. Our engagement with students shows that being treated fairly is very important to them and suggests that too often this does not happen.

    Against that backdrop, and repeated never-met promises to act to address student protection issues, you’d have thought that there would be progress on what is happening inside the 429 providers already on the register. Alas not – the centrepiece proposals on treating students fairly are only to apply to new providers, with a vague commitment to consult on what might be applied to everyone else at some point down the line (closing the stable door once the horse has bolted).

    But worse than that, in its infinite wisdom OfS has somehow managed to concoct a situation where for this tiny group of new providers, it will:

    • Remix lots of existing consumer protection law so that instead of talking about consumer rights, it talks about treating students fairly
    • In some areas go further than consumer protection law, because OfS can and has decided to in the student interest
    • In some areas not go as far as consumer protection law, because… reasons?

    On the topline, what’s now being introduced is a new initial registration condition – C5, “treating students fairly” – that will replace the old consumer protection entry tests for providers seeking to join the OfS register.

    Instead of simply requiring a university or college to show that it has “had due regard” to CMA guidance, applicants will have to demonstrate that they treat students fairly in practice.

    To do that, OfS will review the policies and contracts they intend to use with students, and judge them against a new “prohibited behaviours” list, a detriment test, and any track record of adverse findings under consumer or company law. In effect, OfS is shifting from a box-ticking exercise about compliance to an upfront regulatory judgement about fairness.

    Providers will have to publish a suite of student-facing documents – terms and conditions, course change policies, refund and compensation policies, and complaints processes – which together will constitute their student protection plan.

    And the scope of the new condition is deliberately broad – it covers current, prospective, and former students, higher education and ancillary services like accommodation, libraries, or disability support, and information issued to attract or recruit students, including advertising and online material. In short, C5 sets a new standard of fairness at the point of entry to the system, at least for those providers trying to join it.

    Students aren’t consumers, but they are, or are they?

    The problem is the relationship with consumer law. OfS is at pains to stress that new Condition C5 sits comfortably alongside consumer law, drawing on concepts that will be familiar to anyone who has worked with CMA guidance.

    It makes use of the same building blocks – unfair terms, misleading practices, clarity of information – and even names the same statutes.

    But we’re also reminded that C5 is not consumer law – it’s a regulatory condition of registration, judged and enforced by OfS as a matter of regulatory discretion. That means satisfying C5 doesn’t guarantee compliance with the Consumer Rights Act 2015 or the Digital Markets, Competition and Consumers Act 2024, and conversely, complying with the Act doesn’t automatically secure a pass on C5. The frameworks overlap, but they don’t align.

    In some respects C5 goes further. By creating its own “prohibited behaviours list”, OfS has declared that certain contractual terms – which the Consumer Rights Act 2015 would only treat as “grey list” risks – will always be unfair in the student context. Examples include terms that allow a provider to unilaterally withdraw an offer once it has been accepted, clauses that limit liability for disruptions within the university’s own control (like industrial action), or refund policies that impose unreasonable hurdles or delays.

    The list also bans misleading representations such as claiming “degree” or “university” status without proper authority, omitting key information about additional compulsory costs, or publishing fake or cherry-picked student reviews. It even extends to the legibility and clarity of terms and policies, requiring that documents be accessible and understandable to students.

    C5 also sweeps in documents that may not ordinarily have contractual force, like course change policies or compensation arrangements, and makes them part of the fairness test. In that sense, the regulator is demanding a higher standard than the law itself, rooted in its view of the student interest.

    But in other senses, C5 lags behind. Where DMCC now treats omissions of “material information” as unlawful if they’re likely to influence a student’s decision, C5 only bites when omissions cause demonstrable detriment, judged against whether the detriment was “reasonable.”

    DMCC introduces explicit protections for situational vulnerability, and a statutory duty of professional diligence in overseeing agents and subcontractors – neither concept is reflected in C5. DMCC makes universities liable for what their agents say on TikTok about visas or jobs – C5 says providers are accountable too, but stops short of importing the full professional diligence duty that the law now demands. DMCC makes clear that the full price of a degree needs to be set out in advance – including anything you have to buy on an optional module. C5 not so much.

    We will protect you

    The problem with all of that from a student point of view is that the Competition and Markets Authority is going to take one look at all of this and think “that means we don’t have to busy ourselves with universities” – despite the rights being different, and despite no such regulation kicking in in the rest of the UK.

    And worse, it makes the chances of students understanding their rights even thinner than they are now. On that, some respondents asked for wider duties to ensure students actively understand their rights – but OfS’ response is that its focus is on whether documents are fair, clear, and not misleading, and that if issues arise in practice (like if notifications flag that students aren’t being given fair or accurate information), OfS can require further information from the provider and take action.

    How on earth students would know that their rights had been breached, and that they can email an obscure OfS inbox, is never explained. Even if they find the webpage, students are told that OfS “will not be able to update you on the progress or outcome of the issue that you have raised”.

    They’d likely make a complaint instead – but even if they got as far as the OIA, unless I’ve missed it I’ve never seen a single instance of OfS taking action (either at strategic/collective level or individual) off the back of the information I’m sure it gets regularly from its friends in Reading.

    I suspect this all means that OfS will now not publish two lots of information for students on their rights, depending on whether they’re new or existing members of the register – because like pretty much every other OfS strategy on the student interest, students are framed as people to be protected by a stretched mothership rather than by giving them some actual power themselves.

    I can make an argument, by the way, that sending complaints to lawyers to be assessed for legal risk to the provider, routinely ignoring the OIA Good Practice Framework, refusing to implement an OIA recommendation, not compensating a group when an individual’s complaint obviously applies to others who didn’t complain, using NDAs on complaints that don’t concern harassment and sexual misconduct, deploying “academic judgment” excuses on any appeal where the student is let down, or the practice of dragging out resolutions and making “deal or no deal” “goodwill” offers to coax exhausted students into settling are all pretty important fairness issues – but the relationship with the OIA in a whole document on fairness is barely mentioned.

    As usual, almost nothing has changed between proposals and outcome – but there are a few nuggets in there. “Information for students” has been replaced with “information about the provider” – to make clear the duty extends beyond enrolled students and covers all marketing and information materials. The problem is that under DMCC, misleading information on, say, the cost of living in a given city is material – but under OfS’s “treating students fairly”, it doesn’t appear to be “about” the provider.

    OfS has clarified that its concerns about “ancillary services” only apply where there is a contract between student and provider (not with third parties), but has added that providers are responsible for information they publish about third-party services, and it expects universities to exercise “due diligence” on them and their contracts.

    Some language has been more closely aligned with the DMCCA (on things like omissions and fake reviews), and in its “detriment” test providers must now do “everything reasonable” rather than “everything possible” to limit it.

    Banned practices

    In some ways, it would have been helpful to translate consumer law and then go further if necessary. But looking at the overlap between the CMA’s unfair commercial practices regime and OfS’s prohibited behaviours list reveals some odd gaps.

    OfS has borrowed much of the language around misleading marketing, fake reviews, false urgency, and misused endorsements, but it has not imported the full consumer protection arsenal. The result is that students don’t seem to be guaranteed the same protections they would enjoy if they were buying a car, a washing machine, or even a mobile phone contract.

    General CMA guidance prevents companies from mimicking the look of competitors to confuse buyers – but the practice is not explicitly barred by OfS. The CMA bans direct appeals to children – no mention of the vulnerable consumer / due diligence duties in OfS’ stuff. Under DMCC, a practice that requires a consumer to take onerous or disproportionate action in order to exercise rights that they have in relation to a product or service is banned – but there’s little on that from OfS.

    Fee increases

    One note on fees and increases – in the response, OfS points to a “statement” that anyone with an Access and Participation Plan has to submit on whether it will increase fees. It supposedly has to specify the “objective verifiable index” that would be used (for example, the Retail Price Index or the Consumer Price Index); in all cases the amount must not exceed the maximum amount prescribed by the Secretary of State for Education; and under consumer protection law, all students must have a right to cancel a contract in the event of a price increase, even where that price increase is provided for in the contract.

    Here are the first five I found in approved Access and Participation Plans on Google:

    • “Our intention is to charge the maximum fee, subject to the fee limits set out in Regulations” (that doesn’t seem compliant to me)
    • “We will not raise fees annually for 2024-25 new entrants” (that one from a provider that has announced that it will after all)
    • “We will not raise fees annually for 2024-25 new entrants” (that from a provider who now says that for those who started before 1 August 2025, the continuing fee will be £9,535)
    • “We will not raise fees annually for new entrants” (that from a provider that now says “the fee information and inflation statement provided on page 69 of our 2025/26 to 2028/29 Access and Participation Plan are no longer current”)
    • “Subject to the maximum fee limits set out in Regulations we will increase fees each year using RPI-X” (what it’s actually doing is increasing its fees by RPI-X as projected by the OBR, which is a very different figure, and no way would pass muster as an “objective verifiable index”)

    I’d add here to this utterly laughable situation that the CMA is very clear that the right to cancel in the event of a material change or price increase has to be exercisable in practice:

    In the HE sector, switching course or, in some cases, withdrawing and switching HE provider, is likely to be difficult or impractical in practice, bearing in mind that in many cases the student will not be able simply to transfer their credits to another HE provider, and so saying the student can switch may not improve matters for them, or alleviate the potential unfairness of a variation.

    I’m not sure there’s a provider in the country that’s compliant with that.

    Wider changes

    On its reforms to registration requirements, the exciting news is that rather than introduce one new Condition of Registration, there are going to be three – E7 (governing documents and business plan), E8 (fraud and inappropriate use of public funds) and E9 (on fit and proper persons, knowledge and expertise).

    In the future, providers will have to submit a defined set of governing documents at registration – replacing the previous reliance on self-assessment against public interest governance principles. Providers will also have to submit a clear and comprehensive five-year business plan showing objectives, risks, compliance with ongoing conditions, and consideration of students’ interests.

    Specific senior roles (chair of governing body, accountable officer, finance lead, and an independent governor) will have to demonstrate sufficient knowledge and expertise, usually tested through interviews. And a new fit and proper persons test will mean that those in senior governance and management roles will be subject to checks on past conduct (e.g. fraud, misconduct, behaviour undermining public trust).

    Providers will also have to have comprehensive and effective arrangements to prevent, detect, and stop fraud and the inappropriate use of public funds. A “track record” test also applies, the upshot of which is that relevant convictions or regulatory sanctions within the past 60 months could bar registration unless exceptional circumstances apply.

    You’ll not be surprised to learn that in the consultation, some worried that the changes would increase bureaucracy, slow down registration, and impose disproportionate burdens on smaller providers. Others objected to the removal of self-assessment against the Public Interest Governance Principles (PIGPs) at the point of registration, fearing this would dilute student protection or cause confusion given that PIGPs still apply on an ongoing basis.

    Concerns were also raised about creating a two-tier system where new entrants face tougher entry requirements than established providers, and about the practicality of requiring a five-year business plan when forecasting beyond two or three years is often unrealistic. Many also questioned a new interview requirement for key individuals, seeing it as costly, stressful, open to coaching, and potentially inconsistent. Just like student assessment!

    OfS was right all along, of course – arguing that the new conditions give stronger protection for students and taxpayers, that the five-year planning horizon is essential to test medium-term sustainability, and that fit and proper person interviews are the most effective way to test leadership capacity.

    If you were one of the handful of respondents, it wasn’t all in vain – the phrase “policies and procedures” is now “policies and processes”, OfS has clarified the level of knowledge required (the chair and independent governor only need “sufficient awareness” of student cohorts rather than detailed operational knowledge) and a minimum requirement for fraud prevention arrangements is now in the actual condition (rather than just in guidance).

    Registering with OfS

    Much of that is now reflected in a tightening of the registration process itself. Applicants will now be required to submit a defined set of final, governing-body-approved documents at the point of application – including governing documents, financial forecasts, business plans, and information on ownership and corporate structure.

    The idea is to eliminate the previous piecemeal approach, under which providers often submitted partial or draft materials, and to ensure that applications arrive complete, coherent, and capable of demonstrating that a provider has the resources and arrangements necessary to comply with the ongoing conditions of registration.

    Some argued that the shift makes the process more rigid and burdensome, particularly for smaller or specialist providers, and warned that requiring fully approved documents could create practical difficulties or delay applications. Others were worried about duplication with other regulators and barriers to entry for innovative providers.

    Again, OfS is pressing on regardless, arguing that a standardised approach will improve efficiency and consistency, while promising proportionate application of the rules, detailed guidance on the required documents, and limited flexibility where a final document cannot yet exist.

    To the extent that some might argue that a heavy and complex burden is a tough ask for small new providers – and runs counter to the original Jo Johnson “Byron Burgers” vision – the message seems to be that scale and complexity are required to protect public money and the student interest. It would arguably be a lot easier (on both OfS and Independent HE’s members) if DfE were to just say so.

    Defeat from the jaws of victory

    Sometimes, OfS gets close to getting it – finally, an education regulator properly thinking through the ways in which students are treated unfairly – only to go and spoil it and say something stupid like “this will only apply to new providers”.

    As I noted when the consultation came out, what we now have is one set of rights for students at a new(ly registering) provider that they’ll never be proactively told about, and another set of much weaker ones for everyone else that they’re not told about either – all in the name of “fairness”, at exactly the point that the regulator itself admits providers are under pressure not to deliver on some of the promises they made to students.

    The lack of justification or explanation for that remains alarming – and while cock up is often a better explanation than conspiracy, it’s hard to conclude anything other than OfS has proactively decided to turn a blind eye (while blindfolding students) to existing unfairness while everyone gets their cuts done. What a time to be a student.

    Source link

  • OfS Outcomes (B3) data, 2025

    The Office for Students’ release of data relating to Condition of Registration B3 is the centrepiece of the English regulator’s quality assurance approach.

    There’s information on three key indicators: continuation (broadly, the proportion of students who move from year one to year two), completion (pretty much the proportion who complete the course they sign up for), and progression (the proportion who end up in a “good” destination – generally high skilled employment or further study).

    Why B3 data is important

    The power comes from the ability to view these indicators for particular populations of students – everything from those studying a particular subject and those with a given personal characteristic, through to how a course is delivered. The thinking goes that this level of resolution allows OfS to focus in on particular problems – for example a dodgy business school (or franchise delivery operation) in an otherwise reasonable quality provider.

    The theory goes that OfS uses these B3 indicators – along with other information such as notifications from the public, Reportable Event notifications from the provider itself, or (seemingly) comment pieces in the Telegraph to decide when and where to intervene in the interests of students. Most interventions are informal, and are based around discussions between the provider and OfS about the identified problem and what is being done to address it. There have been some more formal investigations too.

    Of course, providers themselves will be using similar approaches to identify problems in their own provision – in larger universities this will be built into a sophisticated data-driven learner analytics approach, while some smaller providers will rely primarily on what is in this release (and this is partly why I take the time to build interactives that I feel are more approachable and readable than the OfS versions).

    Exploring B3 using Wonkhe’s interactive charts

    These charts are complicated because the data itself is complicated, so I’ll go into a bit of detail about how to work them. Let’s start with the sector as a whole:

    [Full screen]

    First choose your indicator: continuation, completion, or progression.

    Mode (whether students are studying full time, part time, or on an apprenticeship) and level (whether students are undergraduate, postgraduate, and so on) are linked: there are more options for full and part time study (including first degree, taught postgraduate, and PhD) and fewer for apprenticeships (where you can see either all undergraduates or all postgraduates).

    The chart shows various splits of the student population in question – the round marks show the actual value of the indicator, the crosses show the current numeric threshold (which is what OfS has told us is the point below which it would start getting stuck in to regulating).

    Some of the splits are self-explanatory, others need a little unpacking. The Index of Multiple Deprivation (IMD) is a standard national measure of how socio-economically deprived a small area is – quintile 1 is the most deprived, quintile 5 is the least deprived. Associations Between Characteristics of Students (ABCs) is a proprietary measure developed by OfS which is a whole world of complexity: here all you need to know is that quintile 5 is the most likely to have good outcomes on average, and quintile 1 the least likely.

    If you mouse over any of the marks you will get some more information: the year(s) of data involved in producing the indicator (by definition most of this data refers to a number of years ago and shouldn’t really be taken as an indication of a problem that is happening right now), and the proportion of the sample that is above or below the threshold. The denominator is simply the number of students involved in each split of the population.
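
    The arithmetic behind each mark is simple enough to sketch: the indicator is just a proportion of the denominator, compared against the relevant numeric threshold. Here is a minimal Python illustration – the split names, counts, and threshold values are invented for the purpose, not real OfS figures:

    ```python
    # Each split maps to (students with a positive outcome, denominator, threshold).
    # All figures below are hypothetical, for illustration only.
    splits = {
        "IMD quintile 1": (780, 1000, 0.80),
        "IMD quintile 5": (960, 1000, 0.80),
    }

    for name, (positive, denominator, threshold) in splits.items():
        indicator = positive / denominator  # e.g. a continuation rate
        status = "below threshold" if indicator < threshold else "at or above threshold"
        print(f"{name}: {indicator:.1%} ({status})")
    ```

    Note that in the real data the threshold varies by mode, level, and indicator, and OfS applies statistical uncertainty ranges before deciding anything – a raw proportion dipping below the threshold is a prompt for a closer look, not an automatic finding.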

    There’s also a version of this chart that allows you to look at an individual provider: choose that via the drop down in the middle of the top row.

    [Full screen]

    You’ll note you can select your population: “taught or registered” includes students taught by the provider plus students who are registered with the provider but taught elsewhere (subcontracted out); “taught only” is just those students taught by the provider (so, no subcontractual stuff); “partnership” includes only students where teaching is contracted out or validated (the student is both registered and taught elsewhere, but the qualification is validated by this provider).

    On the chart itself, you’ll see a benchmark marked with an empty circle: this is what OfS has calculated (based on the characteristics of the students in question) the value of the indicator should be – the implication being that any difference from the benchmark is down to the provider. In the mouse-over I’ve also added the proportion of students in the sample above and below the benchmark.
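    A benchmark of this kind is, broadly speaking, a form of indirect standardisation: sector-wide outcome rates for each student group are applied to the provider’s own mix of students. The sketch below illustrates that general idea with entirely invented numbers – it is not the actual OfS benchmarking methodology, which uses a much richer set of characteristics.

    ```python
    # Hedged sketch of a benchmark via indirect standardisation.
    # Sector rates and the provider's student mix below are invented.

    sector_rates = {  # sector-average positive-outcome rate per student group
        "IMD quintile 1": 0.76,
        "IMD quintile 5": 0.91,
    }

    provider_mix = {  # this provider's headcount per group
        "IMD quintile 1": 300,
        "IMD quintile 5": 100,
    }

    total = sum(provider_mix.values())
    # Benchmark = sector rate for each group, weighted by the provider's own mix
    benchmark = sum(sector_rates[g] * n for g, n in provider_mix.items()) / total

    actual = 0.81  # the provider's observed indicator value (also invented)
    print(f"benchmark {benchmark:.1%}, actual {actual:.1%}, gap {actual - benchmark:+.1%}")
    ```

    The point of weighting by the provider’s own mix is that a provider recruiting heavily from more deprived areas gets a correspondingly lower benchmark, so the gap between actual and benchmark is (in theory) the part the provider itself can answer for.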

    OfS take great pains to ensure that B3 measures can’t be seen as a league table, as this would make their quality assurance methodology look simplistic and context-free. Of course, I have built a league table anyway just to annoy them: the providers are sorted by the value of the indicator, with the other marks shown as above (note that not all options have a benchmark value). Here you can select a split indicator type (the group of characteristics you are interested in) and then the split indicator (specific characteristic) you want to explore using the menus in the middle of the top row – the two interact and you will need to set them both.

    You can find a provider of interest using the highlighter at the bottom, or just mouse over a mark of interest to get the details on the pop-up.

    [Full screen]

    With so much data going on there is bound to be something odd somewhere – I’ve tried to spot everything but if there’s something I’ve missed please let me know via an email or a comment. A couple of things you may stumble on – OfS has suppressed data relating to very small numbers of students, and if you ever see a “null” value for providers it refers to the averages for the sector as a whole.

    Yes, but does it regulate?

    It is still clear that white and Asian students have generally better outcomes than those from other ethnicities, that a disadvantaged background makes you less likely to do well in higher education, and that students who studied business are less likely to have a positive progression outcome than those who studied the performing arts.

    You might have seen The Times running with the idea that the government is contemplating restrictions on international student visas linked to the completion rates of international students. It’s not the best idea for a number of reasons, but should it be implemented a quick look at the ranking chart (domicile: non-UK) will let you know which providers would be at risk in that situation: for first degree it’s tending towards the MillionPlus end of things, for taught masters provision we are looking at smaller non-traditional providers.

    Likewise, the signs are clear that a crackdown on poorly performing validated provision is incoming – using the ranking chart again (population type: partnership, splits: type of partnerships – only validated) shows us a few places that might have completion problems when it comes to first degree provision.

    If you are exploring these (and I bet you are!) you might note some surprisingly low denominator figures – surely there has been an explosion in this type of provision recently? This demonstrates the Achilles heel of the B3 data: completion data relates to pre-pandemic years (2016-2019), continuation to 2019-2022. Using four years of data to find an average is useful when provision isn’t changing much – but given the growth of validation arrangements in recent years, what we see here tells us next to nothing about the sector as it currently is.

    Almost as if to illustrate this point, the Office for Students today announced an investigation into the sub-contractual arrangement between Buckinghamshire New University and the London School of Science and Technology. You can examine these providers in B3, and if you look at the appropriate splits you can see plenty of others that might have a larger problem – but it is what is happening in 2025 that has an impact on current students.
