  • Eight things to look for when we get the judgement on University of Sussex vs OfS

    The recent judicial review in the High Court of Justice (King’s Bench Division, Administrative Court – with Justice Lieven presiding) did not directly concern who was in the right and who was in the wrong regarding the substantive matter of Kathleen Stock’s experiences of a “chilling effect” at the University of Sussex.

    Rather – and by design – it examined the processes, powers, and principles relied upon by the Office for Students to come to the decision to make a regulatory finding of non-compliance with ongoing registration conditions E1 and E2, and thus to issue a record fine of £585,000.

    That’s what judicial reviews do. It’s not a matter of reworking the investigation – or making a substantive judgement on the merits or otherwise of any version of the Sussex Trans and Non-Binary Equality Policy Statement (TNBEPS) – it is a matter of procedure.

    It may sound like this would be deadly dull. Lawyers arguing at length as to whether particular letters have been appropriately adorned with dots and crosses doesn’t sound like big box office.

    But the arguments and rebuttals presented by Monica Carss-Frisk for OfS, and by Chris Butler and Katy Sheridan for Sussex had a poetry of their own – and the entire three days made for compelling viewing.

    I can’t hope to cover everything that was said in a single article – likewise, I am no lawyer so I cannot offer any expert commentary. But these – to me – are the things that are likely to be particularly interesting for ministers and the whole sector when Justice Lieven releases her written judgement in a few weeks’ time.

    Does OfS actually have the power to make decisions concerning governing documents?

    So, on the face of it, ongoing condition of registration E1 (that the governing documents uphold the public interest governance principles) and E2 (adequate management of governance arrangements and complying with governing documents) give the Office for Students the right to get stuck into your university’s governing documents. Whether it is a matter of those documents having the right things in them about academic freedom, or of whether the measures to ensure decisions are taken properly are actually being followed, any deviation from what is right and proper has regulatory consequences.

    However, if you are in a university founded via a Royal Charter the common law position is that a named senior cleric, aristocrat, or – in many cases – King Charles III as monarch is the only person (they have “exclusive jurisdiction”) able to rule on whether or not your charter and statutes (the governing documents that constitute the “laws” of your university) are being correctly implemented.

    You might think that the Higher Education and Research Act 2017 sections 13 and 14 trump some relic of medieval governance processes, but you would be wrong. The usual test is that parliament can only replace common law if it explicitly says that is what it is doing at the time – generally on the face of the act, but any other public statement (a speech in the House, a consultation document) can suffice at a push. This is what happened when visitors lost the ability to deal with student complaints to the Office for the Independent Adjudicator (in the Higher Education Act 2004) and with employment issues (in the Education Reform Act 1988).

    A trawl through the act, the green and white papers, and parliamentary debates about the bill does not help us – it really feels like nobody noticed this issue during the entire process of establishing OfS. And – to put it mildly – if the court rules that OfS does not in fact get to assess governing documents and how they are applied, it presents rather a problem for the way that the 51 universities who have visitors are regulated.

    What even are governing documents, anyway?

    So, I cited university statutes and university charters as “governing documents” – which feels pretty unarguable. But what else might be a governing document? The University of Sussex argues that the Trans and Non-Binary Equality Policy Statement that OfS was so unhappy with was not a governing document, and thus not something ongoing conditions of registration E1 and E2 could apply to.

    OfS contends that the approach to this definition should be broad enough to include that kind of statement. In condition E1, it says:

    Depending on the legal form of the provider its ‘governing documents’ may include a Royal Charter, Statutes and Ordinances, articles of association, or Instruments of Government and/or a trust deed or deeds. They are also likely to include documents such as…the provider’s policies on matters such as…support for freedom of speech or academic freedom…

    It would be helpful at this point to be able to cite a handy definition, in HERA, of the term “governing document” – something that would clearly draw a line around what does and does not count. And, of course, there isn’t one. The E1 definition sets out what OfS thinks, not what Parliament intended when it asked OfS to look at governing documents.

    Again, we end up trawling through the debates on the bill to get a taste of what the ministerial intention might be. It turns out that, in the eighth sitting of the Commons Bill Committee, Wes Streeting attempted to widen the definition (to include “practices” as well as “documents”) in amendment 25. He got a response from the minister at the time, Jo Johnson:

    The introduction of the term “practices” through the amendment would risk changing the scope of the public interest governance condition to give it a much wider and more subjective application and imposing a significant and ambiguous regulatory burden on the OfS. That would stray outside our stated policy objective and beyond the OfS’ regulatory remit.

    The University of Sussex argues that this response shows that a “subjective application” (basically, that OfS gets to decide what is in scope) was not what was intended, was not the government’s policy objective, and was beyond the OfS’ regulatory remit.

    If the court agrees, the OfS’ ability to say that something like the TNBEPS is in breach of conditions E1 and E2 is in serious doubt – something that would apply to the University of Sussex findings, and to everything else that OfS has done or tried to do with E1 or E2.

    What does “nor” mean?

    Universities change and update their policies all the time – to iron out issues and to make them work better, or to ensure compliance with changing regulatory requirements. Sussex updated TNBEPS a number of times – and in 2023 it added a “safeguarding statement”:

    For the avoidance of doubt, nothing in this Policy Statement should be taken to justify sanctioning academic staff for questioning or testing received wisdom or putting forward new ideas including controversial or unpopular opinions within the law, nor should this Policy Statement be taken to justify disproportionate restrictions on freedom of speech.

    You’d think that a change like this, made towards the end of the long OfS investigation, might keep the regulator happy. However, OfS ruled that this version of the policy still breached E1 – it met the requirements for safeguarding academic freedom, but did not meet the requirements for safeguarding freedom of speech.

    As a layperson, this seems odd – it’s literally the same sentence! That is what Sussex argues – the use of the word “nor” implies that everything stated applies to both free speech and academic freedom. The OfS position is that the tests for free speech and academic freedom are different, and that the existence of the other parts of the policy (the stuff about not seeking to rely on harmful stereotypes of trans people, for instance) would still have the potential to have a chilling effect.

    Lawyers do spend a lot of time talking about the meanings of words – here the decision on the word “nor” will have a bearing on whether the 2023 version of the policy was in breach of E1.

    Who made decisions about what to do with Sussex? And when? What was Arif Ahmed’s role?

    OfS presented a fascinating chronology of the many years that the investigation continued, and assigned key decisions to key people.

    For instance, the OfS Board had initially suggested that OfS needed to start using their powers on freedom of speech in the summer of 2021 – so when the issues at the University of Sussex hit the media in October of that year the OfS board discussed the case and decided it should be prioritised (of course, this would be an investigation into compliance with regulatory conditions – as OfS could not investigate an individual case).

    It was the OfS Director of Regulation – who was at that point Susan Lapworth – that put a preliminary analysis to the Provider Risk Committee. When she became interim chief executive in 2022, the day-to-day conduct of the investigation passed to David Smy and Hilary Jones, though Lapworth continued to have close supervisory oversight. It was the interim chief executive who wrote to Sussex to offer the opportunity to reach a settlement in 2022 (of which more later), and who wrote to Kathleen Stock to solicit a statement, which was taken in November 2023.

    On 6 July 2023, Lapworth established the board-level University of Sussex Compliance and Enforcement Committee (USCEC), appointing Martin Coleman as chair, and Elizabeth Fagan and Nisha Arora as members. USCEC was established as a decision-making body, to respond to recommendations that would be made by the investigative team. Accordingly, the committee met the investigation team on numerous occasions to discuss emerging findings and interim analyses.

    This led to the presentation of interim recommendations on 7 November 2023 – which became, following debate, the provisional decision that was communicated to Sussex on 21 March 2024. As was its right, the university responded with representations totalling around 2,000 pages on 30 May 2024, with two additional witness statements following on 7 June 2024.

    By this point one Arif Ahmed had been appointed OfS Director of Free Speech and Academic Freedom – he started work in August 2023, and joined the investigation team in the autumn of 2024. This presented an issue. The University of Sussex argues that when Ahmed joined the OfS he declared a possible conflict of interest in “cases involving gender” and cases related to Stock – whom he knew professionally. Beyond this, Ahmed had written and spoken publicly about the issue on a number of occasions.

    Initially (June 2023) Lapworth is on record as having said “I think he is conflicted”, a position that others involved in the investigation agreed with. This position changed on 15 October 2024, when Lapworth appointed Ahmed to head up the investigation team. She argued that his “potential conflicts of interest” were not a “material concern” because the team’s views had already “crystallised” at that point and Ahmed was not a decision maker (though he did present the team’s recommendation to the decision-making committee).

    The idea that the team’s views had crystallised by October 2024 was problematised in court – this date was before Stock had provided a second witness statement (in response to points made in the university’s representations) on 12 December 2024, before the team had completed the drafting of the final recommendation (24 December 2024), before the presentation of the recommendation at a committee meeting (15 January 2025), and before the amount of the fine was decided (14 February 2025).

    Sussex argued that the amount of work that was done after Ahmed’s appointment to the investigation – taking another witness statement from Stock, for example – suggests that the views of the team had not crystallised, and that Ahmed may have had the opportunity to “infect” the decision with bias. Proving bias in a regulatory decision is a very high bar – but whatever is decided, the ambiguity about how the conflict of interest was interpreted and applied is troubling, and will need to be resolved in future investigations of this kind.

    The final OfS decision letter was sent to Sussex on 20 March 2025. It was re-issued on 27 March 2025. And it has never been published in full.

    Has this process been “adversarial”?

    So that second Stock statement, the one made in December 2024 in response to the university’s representations, wasn’t shared with the university until the high court hearing started gathering steam. Beforehand, there were just a few points from the statement included in the final decision letter. As OfS was relying on aspects of this statement in reaching the final decision, Sussex was rather nonplussed about this – and asked to see the full thing.

    It was told “no”. The reason? Litigation privilege.

    To be clear, litigation privilege is absolutely a thing. If you are preparing for some kind of adversarial litigation (the test is “reasonably contemplated” – around a 50 per cent likelihood of you lawyering up) you are allowed not to share certain kinds of documents with the people you are litigating against. The question in this case is whether adversarial litigation was “reasonably contemplated” at the point the OfS took the second statement from Kathleen Stock, which was before the final decision (the thing that Sussex could reasonably be expected to litigate about) was issued.

    The Sussex argument is that this suggests a frame of mind at OfS that was “adversarial” at a point where it was supposed to be all impartial and regulatory. True, the response to the provisional decision and the idea of a settlement might have given the impression that Sussex was unhappy – but the actual investigation couldn’t really go forward on the basis that it was a preparation for the High Court. Could it?

    Does the OfS offer a “jolly odd” kind of settlement process?

    Those were the words of Justice Lieven, when she learned that to accept a settlement Sussex would have had to accept the entirety of the OfS’ case against it – admitting the breaches of registration conditions, in other words – before paying a reduced fine. And that this would include accepting parts of the OfS’ original draft decision (for instance, the idea that the Sussex Freedom of Speech policy was in breach of a B condition) that OfS later withdrew. It is very much like what happened in the investigations that led to OfS’ statement on degree algorithms, where universities felt pressured to accept a presumption that they had breached conditions in order to avoid reputational detriment.

    It is because Sussex didn’t accept these terms that it – famously – didn’t get the only planned meeting with OfS during the investigation: accepting the settlement was a requirement for the meeting to happen.

    OfS says that the university:

    was intending to challenge the OfS’ view about breaches of conditions of registration […] which was not an issue in respect of which the OfS was willing to negotiate during a settlement process […] settlement is only available under Regulatory Advice 19 if a provider is willing to accept breaches.

    To be clear, this wasn’t in any of the grounds Sussex raised – but Justice Lieven’s note of surprise when presented with these terms was interesting in itself, and I would be unsurprised to see more of this incredulity in her written judgement.

    Does the University of Sussex currently comply with governance conditions?

    On one level the whole point of this investigation and subsequent regulatory action was to ensure that the University of Sussex complied with freedom of speech and academic freedom principles. And on that reading, the end point would be when OfS was satisfied that any offending bits of policy or decision-making did indeed comply with E1 and/or E2.

    However, OfS reserved its position on the 2024 TNBEPS – it didn’t want to say if it complied with registration conditions or not. Indeed, Sussex had initially thought the note to this effect in the final decision letter meant it was still under investigation – it was only in preparation for the hearing that it learned it was not. Which is something you would think the regulator might have been a little clearer on if it was looking to drive compliance. Or if you wanted to say (as OfS did) something like:

    some of the issues with the TNBEPS which gave rise to breach of Condition E1 continued to exist as at 20 March 2024, and therefore it is possible that the breach continued beyond that date and could occur again.

    It would only know that if it had reached a decision on the current policy. Which it said it hadn’t.

    OfS argued that the investigation window had to end at some point, and that it had looked at a lot of iterations of the TNBEPS already. It argued that this was a common regulatory approach, and that to assess yet another iteration of the policy would take “considerable time”.

    What about other universities with a similar policy? What about AdvanceHE?

    The original Sussex TNBEPS bore a very close resemblance to a template originally published by the Equality Challenge Unit, which had by 2018 become a part of AdvanceHE. Sussex was not the only university to take this up – by various reckonings there were between eight and ten universities that had a similar policy based on the template.

    This was not an issue that had escaped Justice Lieven, who asked at what point OfS had approached either AdvanceHE (to note the issue with a commonly used template and to ask for changes) or other providers (to ask them to cease using or rethink their usage of the template).

    It had not. It claimed not to have been aware of specific other universities using the policy when it began the investigation (though it apparently considered this question in the initial conversation about prioritising the Sussex investigation), and there was no conversation with AdvanceHE until after the final decision was made.

    In Justice Lieven’s own words:

    You’ve fined Sussex half a million pounds for a policy, but didn’t ask other universities if they had the same policy?

    Sussex had raised this issue as a part of an argument that it had been “singled out” for punishment – that other providers had done the same thing (and even gone on to experience high profile controversies concerning gender-critical speech) and had not faced regulatory consequences. It had, it claimed, experienced a negative impact as a result – one that had a detrimental effect on fair competition (one of those pesky “have regard to” requirements of OfS in HERA).

    But Justice Lieven’s take opens a wider question – is going after one provider for a widespread sin a reasonable and fair way to regulate for compliance? OfS’ pour encourager les autres approach is perhaps not a model of change that is helpful if you want to claim a dispassionate and even-handed approach to regulation.

    Why it matters beyond this case

    There is higher education legislation on the way – a Skills Bill, which would straighten out some of the more egregious problems with HERA and the OfS. Some of Justice Lieven’s judgement – particularly if she gets stuck into the visitorial jurisdiction issue – is an intervention in the evolution of higher education regulation that would shift us from the Behanite “it would be nice if” to a situation where changes had to be made to ensure OfS can continue to regulate in the way it has come to assume it can.

    Much like in the aftermath of Office for Students vs Bloomsbury College, already knowable issues with OfS become pressingly urgent, and the government will be forced into action via primary legislation. A crisis can drive needed change, but it can also drive measures to shore up systems that need a more considered rethink.

  • How will OfS respond in Sussex vs the Office for Students?

    The University of Sussex has been commendably transparent in setting out both Facts and Grounds and a Skeleton Argument in public, with documentation available to anyone with an interest on a dedicated section of its website.

    This means that, while the contours of the Sussex arguments about the Office for Students decision to fine the university £585,000 for alleged breaches of conditions of registration E1 and E2 (concerning governance and the public interest) are well understood, it has not been clear how OfS intends to counter these arguments.

    I thought a summary would be helpful. To be absolutely clear – this is just a summary: I am no lawyer and am not saying anything about the quality or otherwise of arguments on either side of this hearing.

    Governing documents?

    Ground 1 of the University of Sussex arguments is that the “Trans and Non-binary Equality Policy Statement” (the nub of the OfS’ concerns) is not a “governing document”. While the Higher Education and Research Act 2017 (HERA) does not define the term “governing document”, OfS contends that the language it uses at paragraph 424 of the regulatory framework means that the term can be understood broadly as any document that:

    describe[s] any of the provider’s objectives or values, its powers, who has a role in decision making within the provider, how the provider takes decisions about how to exercise its functions, or how it monitors their exercise

    The framework adds a list of examples that includes “the provider’s policies on matters such as […] support for freedom of speech or academic freedom.”

    Regulatory guidance isn’t the same as a statutory definition, of course: the OfS argument here is that because the category in the act is broad and unlimited, and parliament intended to have the same regulation applied to all providers (of whatever form), the working definition needs to be suitably broad to cover everything that it may need to see.

    Visitors

    The University of Sussex argument in Ground 2 is that because the University Visitor has exclusive jurisdiction over its laws – and nothing in HERA or anything else changes this position – OfS therefore cannot have jurisdiction over university statutes and rules. As I set out elsewhere, this is potentially a very wide-reaching interpretation, and the status of the University Visitor has only ever been changed in specific ways by acts of parliament (that specifically make those changes).

    The OfS response relies on an interpretation of how acts of parliament interact with common law (there’s no legislation that sets out the ancient powers of the University Visitor). The OfS interpretation is that parliament intended that all higher education providers should be regulated in the same way – whether or not they have a charter and a University Visitor – so the powers it has to regulate in HERA sit alongside the visitorial jurisdiction. Otherwise, OfS simply couldn’t do what parliament intended.

    Governing documents as a whole

    On the face of it, Ground 3A sees Sussex arguing in the opposite direction to Ground 1 – that the OfS did not think that enough documents constituted governing documents. The university provided an enormous bundle of documents as a part of the investigation, and former OfS Deputy Chair (and chair of the investigative committee) Martin Coleman confirms that he did “read and give full and proper consideration to all of the papers, including the underlying supporting documents”.

    The concern of Sussex here is whether the investigation team understood the way that the various regulations, codes of practice, policies, and other documents interacted – especially when it came to academic disciplinary processes. Again, OfS claims that it had that understanding.

    Academic freedom

    Ground 3C posits that OfS didn’t run the right test relating to the academic freedom public interest guiding principle (PIGP) – which should have asked whether the policy in question put academics “in jeopardy of losing their jobs and privileges” at the provider. Because Sussex has a statute (VII.6) that sets out guiding principles on academic freedom – that protects against “adverse outcomes” for staff as a result of exercising academic freedom – it argues that anyone in breach of the trans and non-binary equality policy statement as a result of exercising academic freedom would not face adverse outcomes.

    The OfS position is that while this may be true, nothing in the statute suggests that a member of staff would not face disciplinary proceedings – and that even if there were no adverse outcomes (in terms of job loss or loss of privilege) a member of staff could suffer “stress, anxiety, and reputational damage” merely by being subject to the disciplinary process.

    Reasonable steps

    In Ground 3D the university argues that OfS didn’t properly understand the meanings of the terms “reasonably practicable” and “freedom of speech within the law”. What this comes down to is whether a provider can impose restrictions on lawful speech where (double negative, sorry) it would “not be reasonably practicable not to impose such restrictions”. The example given concerns things like poor academic quality – without the ability to restrict speech, the university would have to tolerate an academic designing curricula which lack academic rigour, or starting every lecture by swearing at and demeaning students.

    The OfS position is that it did consider the possibility of proportionate restrictions of lawful speech: in examining the university safeguarding policy and noting that it did not permit disproportionate restrictions of lawful speech, it confirmed that it considered that there was such a thing as a proportionate restriction. It argues that none of this applies to the way the Trans and Non-Binary Equality Policy Statement was to be implemented, and that a chilling effect suggested that reasonable steps had not been taken to support lawful free speech.

    It further notes that, because the university has removed or amended aspects of the policy statement since the opening of the investigation, it must have recognised that there was a problem.

    Interpreting policy

    Ground 4 covers similar ground to 3C, in that Sussex argues that OfS didn’t consider the trans and non-binary policy alongside Statute VII and the disciplinary policy: the argument is that the wording of the latter two rules out the possibility of disciplinary proceedings leading to job loss or loss of privilege for an academic in breach of the policy through lawful free speech. What would happen instead is that a brief “process of triage and investigation” would conclude that there was no case to answer, and then the academic would get a letter before the whole matter was “removed from the record”.

    Again, OfS argues that it is merely the threat of disciplinary action in the policy that constitutes the problem, as it would create a chilling effect – and it doesn’t go along with the university’s attempts to separate out the initial triage (the decision to use the disciplinary process) from the disciplinary process itself.

    Remedies

    Was there a way of ensuring that the University of Sussex complied with conditions of registration without finding a breach? That’s the question Ground 5A deals with, with Sussex understandably suggesting that, given the various updates to the policy made during the investigation, a finding of a breach and a fine wasn’t required to satisfy OfS that everything was in order – and that OfS should have considered other approaches.

    The OfS decision notes that:

    Findings on breaches and imposition of monetary penalties will act as strong incentives for the provider to address breaches of conditions E1 and E2(i) and ensure compliance in the future in addition to incentivising compliance from other providers

    This suggests (in the eyes of the OfS itself) that it did consider whether another remedy was available (which is all that it is required to do) before concluding that it wasn’t. A complicating factor here is that OfS decided not to take a view as to whether the (2024) version of the policy – the one currently in force – complied with the conditions of registration.

    Safeguarding

    The Trans and Non-Binary Equality Policy Statement (as amended in 2023) has various clauses, one of which relates to the need to safeguard academic freedom and freedom of speech. This is the part of that clause that Ground 5C relates to:

    For the avoidance of doubt, nothing in this Policy Statement should be taken to justify sanctioning academic staff for questioning or testing received wisdom or putting forward new ideas including controversial or unpopular opinions within the law, nor should this Policy Statement be taken to justify disproportionate restrictions on freedom of speech. Any person concerned that their rights of academic freedom or freedom of speech have been unjustifiably restricted may lodge a complaint

    The OfS found that this statement was compatible with the academic freedom public interest governance principle, but incompatible with the freedom of speech principle. The Sussex argument is that these are both addressed in the same sentence, with the word “nor” connecting the two issues which are dealt with in the same way.

    OfS doesn’t agree with this, arguing that “the suggestion that, as a matter of grammar, the word ‘nor’ indicates that the two halves of the relevant sentence are to be treated the same is wrong.”

    The other end of this ground regards a part of the policy that deals with the use of stereotypes of trans and non-binary people in teaching: Sussex argues that if OfS was satisfied that the policy as a whole did not have a chilling effect on academics (it was compatible with the academic freedom principle), and considering that it is academics who set the curriculum, there could therefore not be a chilling effect on the curriculum. OfS maintains that the “curriculum” is covered by free speech as well as academic freedom requirements.

    Unreasonable conclusion

    Ground 5E sets out that OfS concluded that the policy caused “significant and severe harm” to academic freedom and freedom of speech at the University of Sussex. The university itself argues that there was no evidence at all of a chilling effect on students, and that the evidence relating to staff referred only to the “potential” for a chilling effect. Because the evidence provided by Kathleen Stock was the only evidence provided for a chilling effect on staff, there was no evidence of a chilling effect after 2021 (when Stock resigned from the university), and the effect before that time was minimal (Stock did include extensive gender-critical materials written by herself and others on her reading lists).

    OfS maintains that it did not have to identify individuals who had specifically been chilled by each version of the policy to argue for a chilling effect and thus a breach – as the risk of harm is sufficient. It also argues that it was entitled, applying regulatory judgement, to conclude that the impact of the policy was “significant and severe”.

    Fairness and bias

    It’s worth thinking about grounds 6A, 6B, and 6C together, as they all deal with the nuts and bolts of the way the decisions to find the university in breach and to issue a fine were made.

    Sussex contends that OfS did not disclose key evidence (the second witness statement taken from Kathleen Stock, made after the university had responded to the OfS’ preliminary findings, was not shared with the university); that the final decision differed so substantially from the preliminary decision that the university effectively did not have the opportunity to comment on the final decision before it was issued; and – perhaps most importantly – that the investigation was predetermined, citing the refusal to meet the university during the investigation, the drafting of evidence by OfS itself, the treatment of the investigation as an “adversarial exercise”, and the fact that the investigation team was led by a person (Arif Ahmed) who was “personally connected to and proactively supported” Kathleen Stock.

    It’s heady stuff, but as you would expect the regulator disputes these claims. We get a canter through the process of the investigation: the board decision to investigate and prioritise the investigation, the conduct of the investigation (which was initially led by Susan Lapworth before the role was taken on by Arif Ahmed in 2024), the establishment of the University of Sussex Compliance and Enforcement Committee (USCEC) and the way it was given delegated authority.

    On whether Arif Ahmed was conflicted, the chief executive’s original opinion (June 2023) was that he did have a conflict of interest, because he knew Stock and would have discussed relevant issues with her. He was not involved in the investigation until October 2024, at which point she “changed her mind” – the argument being that his conflict of interest was by then no longer a material concern, as the investigative team’s view on the substantive issues had “crystallised” (though, to be clear, this was before the decision was taken to obtain a second witness statement from Stock).

    On this, OfS confirms that Ahmed joined the investigating team at a late stage – and contends that his involvement in drafting the final recommendation papers would not “infuse” the decision making process with bias. On Ahmed’s relationship with Stock, we are told that this was “not personal, and is one of limited professional acquaintance” as two academic philosophers.

    Other stuff

    The OfS skeleton argument doesn’t respond on Ground 5B (possibly because it was only raised in a footnote?), which argues that because many providers (around eight to ten) adopted a similar policy at the same time as Sussex, the decision to single out Sussex – for investigation and for punishment – is anti-competitive, and thus in contravention of the OfS’ “have regard” duty in HERA. In court, a witness statement from the chief executive stated that OfS had not been aware that other providers had similar policies until Sussex brought it to the regulator’s attention during the investigation.

    The hearing continues, with a judgement expected in writing at a later date.


  • Will the student protection stable door ever get closed?

    Will the student protection stable door ever get closed?

    The Office for Students (OfS) has published new polling on students’ perceptions of their providers’ response to financial challenges.

    I say new – it was actually carried out last April – but regardless of the delay, given the scale of job losses in the sector over the past year, we are very much in no shit Sherlock territory when it comes to the headlines.

    83 per cent of those polled thought that cost-cutting measures had changed the experience they felt they’d been promised – often through larger class sizes than expected, greater use of online learning, or reduced access to academic resources and student support.

    Around a quarter reported changes in support services, including services/funding offered via the SU, IT and technical support, or academic support services.

    Around two in five perceived impacts on access to academic resources and the quality of teaching, and a reduction in extracurricular activities, and 46 per cent of those polled expressed concern about the potential closure of their course or department – and nearly half were unaware of their options if that happened.

    An accompanying blog post remixes material from the new strategy, linking Ambitious, Vigilant, Collaborative and Vocal to the findings.

    Students might well ask whether “a little bit quicker than glacial” should be added to the list.

    They noticed

    Savanta conducted an online survey of 1,256 students studying at OfS-regulated universities and colleges in England, with fieldwork running 8-15 April 2025. The sample was designed with quotas on age, sex and ethnicity to match HESA data and the OfS Access and Participation dashboard, with all reported subgroup differences statistically significant at the 95 per cent confidence level and a margin of error of +/- 3 per cent.
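    As a sanity check on that headline figure, the standard margin-of-error formula for a simple random sample of 1,256 gives roughly the quoted figure (the actual Savanta design used quotas, so this is only indicative):

    ```python
    import math

    n = 1256   # reported sample size
    p = 0.5    # worst-case proportion (maximises the margin)
    z = 1.96   # z-score for a 95 per cent confidence level

    # Margin of error for a simple random sample: z * sqrt(p(1-p)/n)
    moe = z * math.sqrt(p * (1 - p) / n)
    print(f"+/- {moe * 100:.1f} percentage points")  # ~2.8, consistent with the quoted +/- 3
    ```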

    Just over half (52 per cent) of respondents reported noticing cost-cutting measures at their institution, with 56 per cent aware of perceived financial risks. Awareness was significantly stratified by student characteristics – older students (25+) were more likely than younger students to notice measures (58 per cent vs 48 per cent), as were postgraduates compared to undergraduates (60 per cent vs 44 per cent).

    Male students were also significantly more likely to be aware of financial risks than female students (67 per cent vs 47 per cent). Those already aware of financial risks were significantly more likely to notice measures than those not aware (62 per cent compared to 38 per cent).

    Students reported becoming aware of changes through multiple channels. Some received formal communications from their providers:

    The university management board sent out emails explaining various cost-cutting measures. (Undergraduate)

    Others learned through staff:

    My teachers told me about lay-offs. (Undergraduate)

    Many noticed changes through direct observation – reduced opening hours, cuts to society funding, larger classes:

    I noticed reduced hours of opening on the music studios and the library. (Undergraduate)

    I noticed the cuts in extracurricular because of my societies I was involved in, they got less funding from the university, so many activities that took place over the past few years are not happening anymore. (Postgraduate)

    For some, the changes were impossible to miss:

    It was obvious given that the previous master’s class had five students, and this one has over 50. Social media platforms also discussed it. (Postgraduate)

    Others pieced together the picture from multiple sources:

    I noticed some of these changes firsthand, such as the longer waiting times for counselling services and the reduced number of extracurricular activities. However, I was also informed about some of the cost-cutting measures through the university’s student union newsletter and social media channels, which reported on changes to IT support services and careers advising. (Postgraduate)

    Impacts and differences

    Among students who noticed cost-cutting measures, the most commonly reported impacts related to staffing – 44 per cent observed changes to staff availability and capacity, while 40 per cent reported increased class sizes, with postgraduates significantly more likely to report the latter (44 per cent vs 35 per cent of undergraduates).

    There are fewer staff and bigger classroom sizes, and there are lots of cutbacks on IT equipment. (Postgraduate)

    Beyond the academic core, 38 per cent noticed changes to financial support availability, and 93 per cent of those noticing any measures highlighted changes to support services.

    Postgraduates were significantly more likely to notice changes to careers and employability services (24 per cent vs 15 per cent) and IT/technical support (29 per cent vs 20 per cent). Those aware of financial risks were more attuned to “indirect” measures such as changes to placements (28 per cent vs 15 per cent) and academic and pastoral support (24 per cent vs 14 per cent).

    The perceived impact was predominantly negative – 39 per cent cited a negative impact on academic resource quality, 35 per cent on extracurricular activities, and 34 per cent on teaching quality. The report notes that some of these findings were “difficult to interpret and/or implausible”, suggesting respondents may not always have been clear what the question was asking.

    A striking 83 per cent of students reported noticing a gap between the experience they believed had been promised at enrolment and the reality – with postgraduates feeling this more acutely (90 per cent vs 77 per cent of undergraduates).

    The report notes an important distinction between institutional promises and student expectations – though the question referred to institutional commitments, responses mixed unmet personal expectations with unfulfilled promises.

    Class sizes featured prominently:

    I was promised a class of 15 but now there are 25 students per class. (Undergraduate)

    Classes are much larger than expected and some courses and/or resources they promised were either cut or moved online due to budget cuts. (Undergraduate)

    The shift to online delivery was a recurring theme:

    When I enrolled, I was promised access to regular in-person lectures and state-of-the-art facilities. However, due to the budget cuts, many of my lectures were moved online. (Postgraduate)

    Many lectures were moved online permanently after the pandemic even though we were back on campus. (Postgraduate)

    Support and resources also fell short of expectations:

    The support I get as a student has reduced compared to what I was told I would get when I enrolled. (Undergraduate)

    They promised to provide us with all the learning equipment in school, but now I have to bring my own laptops. (Undergraduate)

    My university experience has differed quite a bit from what was initially promised, mainly because of some minor cost-cutting measures. When I enrolled, they promised modern facilities and extensive student support. There is limited access to academic support, and modern facilities like labs and student spaces have seen budget cuts. (Postgraduate)

    Financial pressures compounded the sense of broken promises:

    Tuition fees increased slightly, but alongside living expenses and unexpected costs, it’s a lot more than I anticipated when enrolling. (Undergraduate)

    Prices have gone up for food and accommodation, and unexpected fees for items like printing. (Undergraduate)

    The consequences for student decisions are notable – a quarter (25 per cent) said they were more likely to consider dropping out, 36 per cent were considering transferring, and 24 per cent were considering deferring.

    Postgraduates were more likely to consider transferring than undergraduates (43 per cent vs 30 per cent), as were minority ethnic students compared to white students (41 per cent vs 33 per cent), and male students compared to female students (41 per cent vs 32 per cent).

    Awareness of financial risks significantly amplified these inclinations – 29 per cent of those aware were considering dropping out compared to 19 per cent of those unaware, and 43 per cent were considering transferring compared to 29 per cent.

    Preparedness and support

    Students were more likely to be unaware of what would happen should their course close than aware (48 per cent vs 44 per cent), with this lack of awareness particularly pronounced among undergraduates (53 per cent) compared to postgraduates (43 per cent), and among females (54 per cent vs 40 per cent of males).

    A majority (56 per cent) were unaware of their provider’s published student protection plans – rising to 68 per cent among undergraduates and 72 per cent among those not aware of financial risks. Males reported higher awareness of student protection plans than females (45 per cent vs 30 per cent).

    Concern about potential course or department closure sat at 46 per cent overall, with higher levels among postgraduates (59 per cent vs 34 per cent of undergraduates), those aged 25+ (64 per cent vs 36 per cent of 18-24 year olds), male students (51 per cent vs 42 per cent), and minority ethnic students (53 per cent vs 42 per cent).

    Subject differences were notable – engineering and technology students were most likely to be concerned (64 per cent vs 46 per cent average), while those studying subjects allied to medicine were among the most unconcerned (66 per cent vs 52 per cent average).

    When asked what support they would expect in a closure scenario, 61 per cent anticipated help transferring to another institution and 60 per cent expected support to complete their course. Clear guidance about available options was anticipated by 51 per cent.

    Notably, those unaware of financial risks had higher expectations of support than those who were aware – 71 per cent of those unaware expected transfer support compared to 53 per cent of those aware, and 66 per cent expected course completion support compared to 55 per cent. Females consistently had higher expectations than males across all support measures.

    On priorities for what providers should protect, 56 per cent said quality of teaching, followed by financial support options (47 per cent), student support services (47 per cent), and academic and pastoral support (39 per cent). Female and white students were more likely to prioritise teaching quality (59 per cent vs 52 per cent for male and minority ethnic students respectively).

    Minority ethnic students were distinctive in being equally likely to prioritise financial support options as teaching quality (51 per cent and 52 per cent) – the only demographic subgroup where this was the case. Those unaware of financial risks were consistently more likely to say each aspect should be prioritised than those who were aware.

    Financial hares and promises tortoises

    One way to read this – especially in conjunction with the polling OfS published last year on awareness of rights – is that there is a material risk that, to achieve savings, providers have been engaged in widespread breach of contract, effectively depending on students not understanding their rights, or not feeling able to exercise them, to pull off the changes. The respective risks have been weighed.

    The student on the Clapham omnibus might both pose that as a hypothesis, and then ask what OfS has done about that serious risk to the student interest. I expect they’d find the relative focus on phone calls to vice chancellors about their financial sustainability, with only occasional opinion-polled snapshots of the sort here, a disturbing comparison for a “student, not provider” interests regulator.

    It’s never unhelpful to have this sort of stuff written down – and if OfS feels it needs to prove (as with its reasonable adjustments and student rights research) that what student leaders have been telling it through panels and its “student debrief” events is true via polling, so be it.

    But notwithstanding the “jam tomorrow” in the blog on what will now be done with the results, there are important questions surrounding priorities, pace and the idea of the student interest during a period of significant cuts.

    I could go back further than this, but a Susan Lapworth board paper from 2019 (then Director of Regulation, now outgoing CEO) identified almost identical concerns – and acknowledged that the regulator’s existing tools were insufficient to address them.

    The paper noted that “students are not clear about what they are buying, in terms of quality, contact time, support, and so on” and that “students’ consumer protection rights are not enforced when what they have been promised, in terms of quality, contact time, support, and so on, is not delivered.”

    It explicitly framed the outcomes the OfS should be seeking in terms of student expectations being met – that teaching quality, contact time, academic support, learning resources and financial support should all be “appropriate and as they had expected.” Six years later, 83 per cent of students report a gap between what they believed they had been promised and the reality.

    The 2019 paper also raised concerns about the burden of enforcement falling on individual students, questioning “whether a model that relies primarily on individual students challenging a provider for a breach of contract places a burden on students in an undesirable way.” It acknowledged that “the contractual relationship between students and providers is unequal” and that “it is not easy for students to identify instances where they have not received the service they were promised and to seek redress.”

    The paper proposed developing new work to address these failings – work that appears never to have materialised. And the 2025 survey finds a majority of students unaware of student protection plans, and significant proportions now considering dropping out, transferring or deferring as a direct consequence of the gap between expectation and experience.

    Promises, promises

    Even if we put our fingers in our ears over the pandemic and fast forward to 2024 and 2025, John Blake’s work on “the student interest” was surfacing identical themes. In a speech to the SUs Membership Services Conference in August 2024, he acknowledged that OfS still has “challenges delivering that centrality of student interest” and that “students have not always felt the confidence in our regulation that they should.” He was explicit about what students were telling him:

    They tell us they are not getting the teaching hours they were promised, they tell us their work is not marked and support offered in a timely fashion, they tell us they don’t have the time resources and opportunities to get involved in the extracurricular activities so prominently featured in the prospectus and crucially, students tell us frequently that if they are unhappy about any of these things too often, their university or college does not respond speedily and effectively.

    Research then published by OfS in February 2025 reinforced the findings – 28 per cent of undergraduates felt contact hours had been insufficient, 32 per cent had issues with how their course was taught, and 40 per cent said financial support was one of the three biggest influences on their success.

    On both teaching and learning and accommodation, mental health support, and the cost of living, students reported consistent shortfalls. Yet the research also highlighted that students feel powerless to seek redress when promises are broken:

    Students are not really given consumers rights, as seen by Covid year students who want money back. If you are given a false promise… there should be a way to complain… but [there] is not really. (Female, 18, further education student, YouGov focus group)

    It is much more difficult to complain, and essentially impossible to claim a refund. (Female, 20, higher education student, YouGov focus group)

    I have a right to get what I was expecting when I signed up for the degree… This means having teaching provision in line with what was advertised. (Female, 20, higher education student, YouGov focus group)

    The 2025 Savanta survey – along with the work it put out in December on student awareness of rights – puts numbers to this picture: 83 per cent noticing a gap between promise and reality, a majority unaware of student protection plans, and significant proportions considering dropping out, transferring or deferring. The regulator has known what the problems are for some time.

    Surprise, surprise

    OfS’ own financial sustainability reporting makes clear that the sector’s response to financial pressure is directly affecting students. Its May 2025 report documented a third consecutive year of declining surpluses and liquidity, with modelling suggesting that without mitigating action, up to 200 providers could be in deficit by 2027-28.

    Roundtable discussions with finance directors revealed how providers were responding – providers acknowledged they had “a limited ability to continue to cut costs and maintain value for money for students”, with some having to consider “course closures” and “rationalisation”.

    The report also noted that “students appear to be less prepared for higher education than previously”, resulting in “increasing attrition rates and the need for greater ongoing pastoral support” – support that “often has significant cost implications, requiring specialist trained staff resource.”

    Last March, students told OfS exactly what the survey would later find. OfS said it was “at this very early stage of the project” to understand the impact of financial challenges on students, and wanted student “insight and advice” to “shape up the things that we might look at.”

    Libraries had been “cut down” with “reduced hours” – with one student noting they were “unable to use the building after 5pm including the library” meaning “students on placement are heavily affected.” There had been a “reduction in support services and support staff”, with “staffing being pared back to the bare minimum” and classes being cancelled outright.

    The impact on remaining staff was palpable – “less passionate than they used to be, some of the cuts impacting their morale” – resulting in “delay in responding to student queries” and “customer service not as good for students.” Students reported a “reduction in tests for neurodivergent students”, changes to “extension and deferral policies” meaning students were “no longer able to get extensions and deferrals”, and disruptive “changes in supervisors.”

    One student simply noted their “course is now totally different to when I came.” Perhaps most tellingly, students said they were “unable to raise issues” because “staff saying that it’s out of their hands.”

    OfS promised that “your input today will be used to shape our approach to protecting students in the face of University and college cutbacks” and that:

    this isn’t just an exercise that we will do now and think is really interesting… we will be looking at these and seeing what we can do and we will feedback what we’ve done at a future debrief event.

    You might have thought that almost a year on, another student event on consumer rights would have been a good time to feed back on what was done. Not so much.

    Students on the call were told that OfS would be working on some “real steps” towards meaningfully strengthening student protections – a consultation on changes to the regulatory framework at Easter.

    With time to feed in and the usual months of delay between consultation close and publication, plus some time to make changes, we’re probably looking at 2028.

    Worse – even on issues OfS has previously been clear about – attendees were given vague answers. Asked whether students were entitled to refunds over strike action, the answer was:

    I’m going to not answer that right now because you may know that there is a test case in the courts, by UCL students, about this very issue. Um, so I’m going to wait for the outcome of that.

    Can we imagine Arif Ahmed giving a similar response in the context of Sussex’s judicial review of its fines last year? We cannot.

    I have written a lot of articles on these issues since OfS’ inception. Every so often OfS announces progress, an intention to act, a new way to frame the issues, or whatever. Now and again, I get optimistic. But real action has been thin – there are only so many broken promises before it’s reasonable to conclude that nothing will ever change.

    The time that students really do need protection is not when the sun is shining – it’s when the cuts are on. It is very difficult to avoid the conclusion that OfS is waiting for the financial sustainability horse to fully bolt before it even gets close to closing the student protection stable door.


  • TEF proposals’ radical reconfiguration of quality risk destabilising the sector – here’s the fix

    TEF proposals’ radical reconfiguration of quality risk destabilising the sector – here’s the fix

    The post-16 education and skills white paper reiterates what the Office for Students’ (OfS) recent consultation on the future of the Teaching Excellence Framework (TEF) had already made quite clear: there is a strong political will to introduce a regulatory framework for HE that imposes meaningful consequences on providers whose provision is judged as being of low quality.

    While there is much that could be said about the extent to which TEF is a valid way of measuring quality or teaching excellence, we will focus on the potential unintended consequences of OfS’s proposals for the future of TEF.

    Regardless of one’s views of the TEF in general, it is relatively uncontroversial to suggest that TEF 2023 was a material improvement on its predecessor. In an analysis of the outcomes from the 2017 TEF exercise, it was clear that a huge volume of work had gone into establishing a ranking of providers which was far too closely correlated with the characteristics of their student body.

    Speaking plainly, the optimal strategy for achieving Gold in 2017 was to avoid recruiting too many students from socially and economically disadvantaged backgrounds. In 2017, the 20 providers with the fewest FSM students received no Bronze awards, while the 20 with the most received no Gold awards.

    Following the changes introduced in the next round of TEF assessments, there still appears to be a correlation between student characteristics and TEF outcomes, but the relationship is not as strong as it was in 2017. Here we have mapped the distribution of TEF 2023 Gold, Silver and Bronze ratings for providers with the lowest (Table 1) and highest (Table 2) proportions of students who have received free school meals (FSM).

    In TEF 2023, the link between student characteristics and TEF outcome was less pronounced. This is a genuine improvement, and one we should ensure is not lost under the new proposals for TEF.

    Reconfiguring the conception of quality

    The current TEF consultation proposes radical changes, not least of which is the integration of the regulator’s assessment of compliance with the B conditions of registration which deal with academic quality.

    At present, TEF differentiates between different levels of quality that are all deemed to be above minimum standards – built upon the premise that the UK higher education sector is, on average, “very high quality” in an international context – and operates in parallel with the OfS’s approach to ensuring compliance with minimum standards. The proposal to merge these two aspects of regulation is being posited as a way of reducing regulatory burden.

    At the same time, the OfS – with strong ministerial support – is making clear that it wants to ensure there are regulatory consequences associated with provision that fails to meet their thresholds. And this is where things become more contentious.

    Under the current framework, a provider is technically not eligible to participate in TEF if it is judged by the OfS to fall foul of minimum quality expectations. Consequently, TEF ratings of Bronze, Silver and Gold are taken to correspond with High Quality, Very High Quality and Outstanding provision, respectively. While a fourth category, Requires Improvement, was introduced for 2023, vanishingly few providers were given this rating.

    Benchmarked data on the publicly available TEF dashboard in 2023 were deemed to contribute no more than 50 per cent of the weight in each provider’s aspect outcomes. Crucially, data that was broadly in line with benchmark was deemed – as a starting hypothesis, if you will – to be consistent with a Silver rating: again, reinforcing the message that the UK HE sector is “Very High Quality” on the international stage.

    Remember this, as we journey into the contrasts with proposals for the new TEF.

    Under the proposed reforms, OfS has signalled that providers failing to be of sufficient quality would be subject to regulatory consequences. Such consequences could span from enhanced monitoring to – in extremis – deregistration; such processes and penalties would be led by OfS. We have also received a clear indication that the government may wish to link permission to grow, and to receive inflation-linked fee increases, to quality outcomes. In other words, providers that fail to achieve a certain rating in TEF may face student number caps and fee freezes.

    These are by no means minor inconveniences for any provider, and so one might reasonably expect that the threshold for implementing such penalties would be set rather high – from the perspectives both of the proportion of the sector that would, in a healthy system, be subject to regulatory action or governmental restriction at any one time, and the operational capacity of the OfS properly to follow through and follow up on the providers that require regulatory intervention. On the contrary, however, it is being proposed that both Requires Improvement- and Bronze-rated providers would be treated as inadequate in quality terms.

    While a provider rated as Requires Improvement might expect additional intervention from the regulator, it seems less obvious why a provider rated Bronze – which was previously defined as a High Quality provider – should expect to receive enhanced regulatory scrutiny and/or restrictions on their operation.

    It’s worse than we thought

    As the sector regulator, OfS absolutely ought to be working to identify areas of non-compliance and inadequate quality. The question is whether these new proposals achieve that aim.

    This proposal amounts to OfS making a fundamental change to the way it conceptualises the very notion of quality and teaching excellence, moving from a general assumption of high quality across the sector to a presumption of low quality at a scale hitherto unimagined. While the potential consequences of these proposed reforms matter at the level of an individual provider, and for students’ and prospective students’ perceptions, it is equally important to ask what they mean for the HE sector as a whole.

    Figure 1 illustrates the way in which ratings of quality across our sector might change, should the current proposals be implemented. This first forecast is based upon the OfS’s proposal that overall provider ratings will be defined by the lower of their two aspect ratings, and shows the profile of overall ratings in 2023 had this methodology been applied then.

    There are some important points to note regarding our methodology for generating this forecast. First, as mentioned above, OfS has indicated an intention to base a provider’s overall rating on the lower of the two assessed aspects: Student Experience and Student Outcomes. In TEF 2023, providers with mixed aspects – Bronze for one and Silver for the other, say – may still have been judged Silver overall, based on the TEF panel’s assessment of the evidence submitted. Under the new framework this would not be possible, and such a provider would be rated Bronze by default. In addition, we are of course assuming that there has been no shift in metrics across the sector since the last TEF, so these figures should be taken as indicative rather than definitive.
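    The proposed aggregation rule – the overall rating defaulting to the lower of the two aspect ratings – can be sketched as follows. The rating names are from the consultation; the function name and ordering are our own illustration, not OfS code:

    ```python
    # Ratings ordered from lowest to highest, per the proposed framework
    ORDER = ["Requires Improvement", "Bronze", "Silver", "Gold"]

    def overall_rating(student_experience: str, student_outcomes: str) -> str:
        """Overall rating under the proposal: the lower of the two aspect ratings."""
        return min(student_experience, student_outcomes, key=ORDER.index)

    # A provider with mixed aspects that might have been judged Silver overall
    # in 2023 would default to Bronze under the proposed rule
    print(overall_rating("Silver", "Bronze"))  # Bronze
    ```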

    Figure 1: Comparison of predicted future TEF outcomes compared with TEF 2023 actual outcomes

    There are two startling points to highlight:

    • The effect of this proposed TEF reform is to drive a downward shift in the apparent quality of English higher education, with a halving of the number of providers rated as Outstanding/Gold, and almost six times the number of providers rated as Requires Improvement.
    • The combined number of Bronze and Requires Improvement providers would increase from 50 to 89. Taken together with the proposal to reframe Bronze as being of insufficient quality, OfS could be subjecting nearly 40 per cent of the sector to special regulatory measures.

    In short, the current proposals risk serious destabilisation of our sector and, we argue, could end up making the very concept of quality in education less, not more, clear for students.

    Analysis by provider type

    Further analysis of this shift reveals that these changes would have an impact across all types of provider. Figures 2a and 2b show the distribution of TEF ratings for the 2023 and projected future TEF exercises, where we see high, medium and low tariff providers, as well as specialist institutions, equally impacted. For the 23 high tariff providers in particular, the changes would see four providers fall into the enhanced regulatory space of Bronze ratings, whereas none were rated less than Silver in the previous exercise. For specialist providers, of the current 42 with 2023 TEF ratings, five would be judged as Requires Improvement, whereas none received this rating in 2023.

    Figure 2a: Distribution of TEF 2023 ratings by provider type

    Figure 2b: Predicted distribution of future TEF ratings by provider type

    Such radical movement in OfS’s overall perception of quality in the sector requires explanation. Either the regulator believes that the current set of TEF ratings were overly generous and the sector is in far worse health than we have assumed (and, indeed, than we have been advising students via current TEF ratings), or else the very nature of what is considered to be high quality education has shifted so significantly that the way we rate providers requires fundamental reform. While the former seems very unlikely, the latter requires a far more robust explanation than has been provided in the current consultation.

    We choose to assume that OfS does not, in fact, believe that the quality of education in English HE has fallen off a cliff edge since 2023, and also that it is not intentionally seeking to radically redefine the concept of high quality education. Rather, in pursuit of a regulatory framework that does carry with it material consequences for failing to meet a robust set of minimum standards, we suggest that perhaps the current proposals have missed an opportunity to make more radical changes to the TEF rating system itself.

    We believe there is another approach that would help the OfS to deliver its intended aim, without destabilising the entire sector and triggering what would appear to be an unmanageable volume of regulatory interventions levelled at nearly 40 per cent of providers.

    Benchmarks, thresholds, and quality

    In all previous iterations of TEF, OfS has made clear that both metrics and wider evidence brought forward in provider and student submissions are key to arriving at judgements of student experience and outcomes. However, the use of metrics has very much been at the heart of the framework.

    Specifically, the OfS has gone to great lengths to provide metrics that allow providers to see how they perform against benchmarks that are tailored to their specific student cohorts. These benchmarks sit alongside the B3 minimum thresholds for key metrics, which OfS expects all providers to achieve. For the most part, providers eligible to enter TEF would have all metrics sitting above these thresholds, leaving the judgement of Gold, Silver and Bronze as a matter of the distance from the provider’s own benchmark.

    The methodology employed in TEF has also been quite simple to understand at a conceptual level:

    • A provider with metrics consistently 2.5 per cent or more above benchmark might be rated as Gold/Outstanding;
    • A provider whose metrics are consistently within ±2.5 per cent of their benchmarks would likely be assessed as Silver/Very High Quality;
    • Providers who are consistently 2.5 per cent or more below their benchmark would be Bronze/High Quality or Requires Improvement.

    There is no stated numerical threshold for the boundary between Bronze and Requires Improvement – it is a matter of holistic panel judgement, including but not limited to how far below -2.5 per cent of benchmark a provider’s data sits.

    It is worth noting here that in the current TEF, Bronze ratings (somewhat confusingly) could only be conferred for providers who could also demonstrate some elements of Silver/Very High Quality provision. Under the new TEF proposals, this requirement would be dropped.

    The challenge we see here is with the definition of Bronze being >2.5 per cent below benchmark; the issue is best illustrated with an example of two hypothetical Bronze providers:

    Let’s assume both Provider A and B have received a Bronze rating in TEF, because their metrics were consistently more than 2.5 per cent below benchmark, and their written submissions and context did not provide any basis on which a higher rating ought to be awarded. For simplicity, let’s pick a single metric, progression into graduate employment, and assume that the benchmark for these two providers happens to be the same, at 78 per cent.

    In this example, Provider A obtained its Bronze rating with a progression figure of 75 per cent, which is 3 percentage points below its benchmark. Provider B, on the other hand, had a progression figure of 63 per cent. While this is a full 12 percentage points worse than Provider A, it is nonetheless still 3 percentage points above the minimum threshold specified by OfS, which is 60 per cent, and so it was not rated as Requires Improvement.

    Considering this example, it seems reasonable to conclude that Provider A is doing a far better job of supporting a comparable cohort of students into graduate employment than Provider B, but under the new TEF proposals, both are judged as being Bronze, and would be subject to the same regulatory penalties proposed in the consultation. From a prospective student’s perspective, it is hard to see what value these ratings would carry, given they conceal very large differences in the actual performance of the providers.
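The conflation is easy to make concrete. Below is a minimal sketch of the rating rule as described in this example – the cut-offs, the 60 per cent minimum threshold, and the function name are illustrative assumptions, not drawn from any OfS specification:

```python
def proposed_rating(progression: float, benchmark: float, minimum: float) -> str:
    """Hypothetical rating rule under the current proposals (a sketch).

    A provider only falls to Requires Improvement when below the absolute
    minimum threshold; otherwise the rating follows distance from benchmark.
    """
    if progression < minimum:
        return "Requires Improvement"
    distance = progression - benchmark
    if distance >= 2.5:
        return "Gold"
    if distance > -2.5:
        return "Silver"
    return "Bronze"

rating_a = proposed_rating(75, 78, 60)  # Provider A: 3 points below benchmark
rating_b = proposed_rating(63, 78, 60)  # Provider B: 15 points below benchmark
# Both come out as "Bronze", despite a 12-point gap in actual performance.
```

The asymmetry is the point: above the absolute floor, Bronze absorbs every shortfall from 2.5 points below benchmark all the way down.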

    On the assumption that the Requires Improvement category would be retained for providers with more serious challenges – such as being below minimum thresholds in several areas – the obvious problem is that Bronze as a category in the current proposal is being stretched so far that it will lose any useful meaning. In short, the new Bronze category is too blunt a tool.

    An alternative – meet Meets Minimum Requirements

    As a practical solution, we recommend that OfS considers a fifth category, sitting between Bronze and Requires Improvement: a category of Meets Minimum Requirements.

    This approach would have two advantages. First, it would allow the continued use of Bronze, Silver and Gold in such a way that the terms retain their commonly understood meanings; a Bronze award, in common parlance, is not a mark of failure. Second, it would allow OfS to distinguish providers who, while below our benchmark for Very High Quality, are still within a reasonable distance of their benchmark such that a judgement of High Quality remains appropriate, from those whose gap to benchmark is striking and could indicate a case for regulatory intervention.

    The judgement of Meets Minimum Requirements would mean the provider’s outcomes do not fall below the absolute minimum thresholds set by the regulator, but are equally too far from their benchmark to be awarded a quality kitemark of at least a Bronze TEF rating. The new category would reasonably be subject to increased regulatory surveillance, given the borderline risk that providers so rated fail to meet minimum standards in future.

    We argue that such a model would be far more meaningful to students and other stakeholders. TEF ratings of Bronze, Silver and Gold would continue to represent an active recognition of High, Very High, and Outstanding quality, respectively. In addition, providers meeting minimum requirements (but not having earned a quality kitemark in the form of a TEF award) would be distinguishable from providers who would be subject to active intervention from the regulator, due to falling below the absolute minimum standards.

    It would be a matter for government to consider whether providers deemed to be meeting minimum requirements should receive inflation-linked uplifts in fees, and should be permitted to grow; indeed, one constructive use of the increased grading nuance we propose here could be that providers who meet minimum requirements are subject to student number caps until they can demonstrate capability to grow safely by improving to the point of earning at least a Bronze TEF award. Such a measure would seem proportionately protective of the student interest, while still differentiating those providers from providers who are actively breaching their conditions of registration and would be subject to direct regulatory intervention.

    Modelling the impact

    To model how this proposed approach might impact overall outcomes in a future TEF, we have, in the exercise that follows, used TEF 2023 dashboard data and retained the statistical definitions of Gold (more than 2.5 per cent above benchmark) and Silver (within ±2.5 per cent of benchmark) from the current TEF. We have modelled a proposed definition of Bronze as between 2.5 and 5 per cent below benchmark. Providers who Meet Minimum Requirements are defined as being between 5 and 10 per cent below benchmark, and Requires Improvement reflects metrics more than 10 per cent below benchmark.
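As a sketch, these five-category cut-offs, plus the lowest-of-two-aspects rule, can be written down directly. The boundary handling and function names here are our assumptions for illustration:

```python
def five_category_rating(distance: float, above_minimums: bool = True) -> str:
    """Sketch of the five-category model, using the modelled cut-offs.

    `distance` is the average distance from benchmark in percentage points;
    exact boundary handling is an assumption, not an OfS definition.
    """
    if not above_minimums:
        return "Requires Improvement"
    if distance >= 2.5:
        return "Gold"
    if distance > -2.5:
        return "Silver"
    if distance > -5:
        return "Bronze"
    if distance > -10:
        return "Meets Minimum Requirements"
    return "Requires Improvement"

def overall_rating(experience: str, outcomes: str) -> str:
    # Following the OfS proposal, the overall rating is the lower of the
    # two aspect ratings.
    order = ["Requires Improvement", "Meets Minimum Requirements",
             "Bronze", "Silver", "Gold"]
    return min(experience, outcomes, key=order.index)
```

Running the earlier hypothetical providers through this sketch, Provider A (3 points below benchmark) lands on Bronze while Provider B (15 points below) falls to Requires Improvement – the separation the current proposals cannot make.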

    For the sake of simplicity, we have taken the average distance from benchmark for all Student Experience and Student Outcomes metrics for each provider to categorise providers for each Aspect Rating. The outcome of our analysis is shown in Table A, and is contrasted in Table B with an equivalent analysis under OfS’s current proposals to redefine a four-category framework.

    Table A. Distribution of aspect ratings according to a five-category TEF framework

    Table B. Distribution of aspect ratings according to OfS’s proposed four-category TEF framework

    Following OfS’s proposal that a provider would be given an overall rating that reflects the lowest rating of the two aspects, our approach leads to a total of 32 providers falling into the Meets Minimum Requirements and Requires Improvement categories. This represents 14 per cent of providers, which is substantially fewer than the 39 per cent of providers who would be considered as not meeting high quality expectations under the current OfS proposals. It is also far closer to the 22 per cent of providers who were rated Bronze or Requires Improvement in TEF 2023.

    We believe that our approach represents a far more valid and meaningful framework for assessing quality in the sector, while OfS’ current proposals risk sending a problematic message that, since 2023, quality across the sector has inexplicably and catastrophically declined. Adding granularity to the ratings system in this way will help OfS to focus its regulatory surveillance where it will likely be the most useful in targeting provision that is of potentially low quality.

    Figure 4, below, illustrates the distribution of potential TEF outcomes based on OfS’s four category rating framework, contrasted with our proposed five categories. It is important to note that this modelling is based purely on metrics and benchmarks, and does not incorporate the final judgement of TEF panels, based on the narrative submissions providers submit.

    This is particularly important because previous analysis has shown that many providers whose metrics were not significantly above benchmark were nonetheless awarded Gold, and others whose metrics were not significantly at benchmark were awarded Silver – judgements that would have been based on robust narrative submissions and other evidence submitted by providers. Equally, some providers with data that was broadly in line with benchmark were awarded Bronze ratings overall, as the further evidence submitted in the narrative statements failed to convince the panel of an overall picture of very high quality.

    Figure 4: Predicted profile of provider ratings in a four- and five-category framework

    The benefits of a five-category approach

    First, the concept of a TEF award in the form of a Gold, Silver or Bronze rating retains its meaning for students and other stakeholders. Each of these three awards reflects something positive about a provider delivering beyond what we minimally expect.

    Second, the pool of providers potentially falling into categories that would prompt enhanced scrutiny and potential regulatory intervention or governmental restrictions would drop to a level that would be a much fairer reflection of the actual quality of our sector. We simply do not believe anyone can be convinced that as much as 40 per cent of our sector is not of sufficiently high quality.

    Third, referencing the socio-economic diversity data by 2023 TEF award in Tables 1 and 2, and the future TEF outcomes modelling in Figure 1, our proposal significantly reduces the risk that students who were previously eligible for free school meals (who make up a large proportion of the cohorts at Bronze-rated providers) would be further disadvantaged by their HE environment being impoverished via fee freezes and student number caps. We argue that such potential measures should be reserved for the Requires Improvement and, plausibly, Meets Minimum Requirements categories.

    Fourth, by expanding the range of categories, OfS would be able to distinguish between providers who are in fact meeting minimum expectations, but not delivering quality in experience or outcomes that would allow them to benefit from some of the freedoms proposed to be associated with TEF awards, and providers who are, in at least one of these areas, failing to meet even those minimum expectations.

    To recap, the key features of our proposal are as follows:

    • Retain Bronze, Silver and Gold in the TEF as ratings that reflect a positive judgement of High, Very High, and Outstanding quality, respectively.
    • Introduce a new rating – Meets Minimum Requirements – that recognises providers who are delivering student experience and outcomes that are above regulatory minimum thresholds, but are too far from benchmarks to justify an active quality award in TEF. This category would be subject to increased OfS surveillance, given the borderline risk of provision falling below minimum standards in future.
    • Retain Requires Improvement as a category that indicates a strong likelihood that regulatory intervention is required to address more serious performance issues.
    • Continue to recognise Bronze ratings as a mark of High Quality, and position the threshold for additional regulatory restrictions or intervention such that these would apply only to providers rated as Meets Minimum Requirements or Requires Improvement.

    Implementing this modest adaptation to the current TEF proposals would safeguard the deserved reputation of UK higher education for high-quality provision, while meeting the demand for a clear plan to secure improvements to quality and tackle pockets of poor quality.

    The deadline for responding to OfS’ consultation on TEF and the integrated approach to quality is Thursday 11 December. 


  • The latest sector-wide financial sustainability assessment from the Office for Students

    The latest sector-wide financial sustainability assessment from the Office for Students

    As the higher education sector in England gets deeper into the metaphorical financial woods, the frequency of OfS updates on the sector’s financial position increases apace.

    Today’s financial sustainability bulletin constitutes an update to the regulator’s formal annual assessment of sector financial sustainability published in May 2025. The update takes account of the latest recruitment data and any policy changes that could affect the sector’s financial outlook that would not have been taken into account at the point that providers submitted their financial returns to OfS ahead of the May report.

    Recruitment headlines

    At sector level, UK and international recruitment for autumn 2025 entry has grown by 3.1 per cent and 6.3 per cent respectively. But this is still lower than the aggregate sector forecasts of 4.1 per cent and 8.6 per cent – a shortfall that OfS estimates could leave sector-wide tuition fee income £437.8m lower than forecast. “Optimism bias” in financial forecasting might have been dialled back in recent years following stiff warnings from OfS, but these figures suggest it’s still very much a factor.

    Growth has also been uneven across the sector, with large research intensive institutions increasing UK undergraduate numbers at a startling 9.9 per cent in 2025 (despite apparently collectively forecasting a modest decline of 1.7 per cent), and pretty much everyone else coming in lower than forecast or taking a hit. Medium-sized institutions win a hat tip for producing the most accurate prediction of UK undergraduate growth – actual growth of 2.3 per cent compared to projected growth of 2.7 per cent.

    The picture shifts slightly when it comes to international recruitment, where larger research-intensives have issued 3.3 per cent fewer Confirmations of Acceptance for Studies (CAS) against a forecast 6.6 per cent increase, largely driven by a reduction in visas issued to students from China. Smaller and specialist institutions, by contrast, seem to have enjoyed growth well beyond forecast. The individual institutional picture will, of course, vary even more – and it’s worth adding that the data is not perfect, as not every student applies through UCAS.

    Modelling the impact

    OfS has factored in all of the recruitment data it has, and added in new policy announcements, including estimation of the impact of the indexation of undergraduate tuition fees and increases to employers’ National Insurance contributions, but not the international levy, because nobody knows when that is happening or how it will be calculated. It has then applied its model to providers’ financial outlook.

    The headline makes for sombre reading – across all categories of provider OfS is predicting that if no action were taken, the number of providers operating in deficit in 2025–26 would rise from 96 to 124, representing an increase from 35 per cent of the sector to 45 per cent.

    Contrary to the impression given by UK undergraduate recruitment headlines, the negative impact isn’t concentrated in any one part of the sector. OfS modelling suggests that ten larger research-intensive institutions could tip into deficit in 2025–26, up from five that were already forecasting themselves to be in that position. The only category of provider where OfS estimates indicate fewer providers in deficit than forecast is large teaching-intensives.

    Net liquidity of under 30 days is the number to keep an eye on, because for institutional survival, running out of cash would be much more of a problem than running a deficit. OfS modelling suggests that the number of providers reporting net liquidity of under 30 days could rise from 41 to 45 in 2025–26, with overall numbers concentrated in the smaller and specialist/specialist creative groups.

    What it all means

    Before everyone presses the panic button, it’s really important to be aware, as OfS points out, that providers will be well aware of their own recruitment data and the impact on their bottom line, and will have taken what action they can to reduce in-year costs, though nobody should underestimate the ongoing toll those actions will have taken on staff and students.

    Longer term, as always, the outlook appears sunnier, but that’s based on some ongoing optimism in financial forecasting. If, as seems to keep happening, some of that optimism turns out to be misplaced, then the financial struggles of the sector are far from over.

    Against this backdrop, the question remains less about who might collapse in a heap and more about how to manage longer term strategic change to adapt providers’ business models to the environment that higher education providers are operating in. Though government has announced that it wants providers to coordinate, specialise and collaborate, while the sector continues to battle heavy financial weather those aspirations will be difficult to realise, however desirable they might be in principle.


  • What’s in the new Office for Students strategy?

    What’s in the new Office for Students strategy?

    The Office for Students began a consultation process on its 2025-30 strategy back in December 2024. Alongside the usual opportunities for written responses there have been a series of “feedback events” promoted specifically to higher education provider staff, FE college staff, and students and student representatives held early in 2025.

    In the past OfS has faced arguably justified criticism for failing to take sector feedback on proposals into account – but we should take heart that there are significant differences between what was originally proposed and what has just been finalised and published.

    Graphic design is our passion

    Most strikingly, we are presented with four new attitudes that we are told will “drive delivery of all our strategic goals in the interest of students” – to hammer the point home individual activities in the “roadmap” are labelled with coloured, hexagonal, markers where “a particular activity will exemplify certain attitudes”. We get:

    • Ambitious for all students from all backgrounds (an upward arrow in a pink hexagon)
    • Collaborative in pursuit of our priorities and in our stewardship of the sector (two stylised hands in the shape of a heart, yellow hexagon)
    • Vigilant about safeguarding public money and student fees (a pound sign on a teal hexagon)
    • Vocal that higher education is a force for good, for individuals, communities and the country (a stylised face and soundwave on a purple hexagon)

    Where things get potentially confusing is that the three broadly unchanged strategic goals – quality (tick, yellow circle), sector resilience (shield, blue circle), student experience and support (someone carrying an iPad, red circle) – are underpinned both by the attitudes and by the concept of “equality of opportunity” (teal ouroboros arrow). The only change at this conceptual level is that “the wider student interest” is characterised as “experience and support”. Don’t worry – the subsections of these are the same as in the consultation.

    Fundamentally, OfS’ design language is giving openness and transparency, with a side order of handholding through what amounts to a grab-bag list of interventions. The list is pared down from the rather lengthy set of bullet points initially presented, and there are some notable changes.

    Quality

    In the quality section what has been added is an assurance that OfS will do this “in collaboration with students, institutions, and sector experts”, and a commitment to “celebrate and share examples of excellence wherever we find them”. These are of course balanced with the corresponding stick: “Where necessary, we will pursue investigation and enforcement, using the full range of our powers.” This comes alongside clarification that the new quality system would be built on, rather than alongside, the TEF.

    What is gone is the Quality Risk Register. Though it seemed an eminently sensible addition to the OfS armoury of risk registers, the vibes from the consultation were that providers were concerned it might become another arm of regulation rather than a helpful tool for critical reflection.

    Also absent from the final strategy is any mention of exploring alignment with European quality standards, which featured in the consultation materials. Similarly, the consultation’s explicit commitment to bring transnational education into the integrated quality model has not been restated – it’s unclear whether this reflects a change in priority or simply different drafting choices.

    Students

    In the section on students, language about consumer rights is significantly softened, with much more on supporting students in understanding their rights and correspondingly less on seeking additional powers to intervene on these issues. Notably absent are the consultation’s specific commitments – the model student contract, plans for case-report publication, and reciprocal intelligence sharing. The roadmap leans heavily into general “empowerment” language rather than concrete regulatory tools. And, for some reason, language on working with the Office for the Independent Adjudicator has disappeared entirely.

    A tweak to language clarifies that OfS is no longer keen to regulate around extra-curricular activity – there will, however, be “non-regulatory” approaches.

    New here is a commitment to “highlight areas of concern or interest that may not be subject to direct regulation but which students tell us matter to them”. The idea here looks to be that OfS can support institutions to respond proactively working with sector agencies and other partners. It is pleasing to see a commitment to this kind of information sharing (I suspect this is where OIA has ended up) – though a commitment to continue to collect and publish data on the prevalence of sexual misconduct in the draft appears not to have made the final cut.

    Resilience

    The “navigation of an environment of increased financial and strategic risks” has been a key priority of OfS over most of the year since this strategy was published – and what’s welcome here is clearer drafting and a positive commitment to working with providers to improve planning for potential closures, and that OfS will “continue to work with the government to address the gaps in the system that mean that students cannot be adequately protected if their institution can no longer operate”.

    Governance – yes, OfS will not only consider an enhanced focus, it will strengthen its oversight of governance. That’s strategic action right there. Also OfS will “work with government on legislative solutions that would stop the flow of public money when we [OfS, DfE, SLC] have concerns about its intended use.”

    Also scaled back is the consultation’s programmatic approach to governance reform. Where the consultation linked governance capability explicitly to equality and experience outcomes, the final version frames this primarily as assurance and capability support rather than a reform agenda. The shift suggests OfS moving toward a lighter-touch, collaborative posture on governance rather than directive intervention.

    Regulation

    OfS will now “strive to deliver exemplary regulation”, and interestingly the language on data has shifted from securing “modern real-time data” to embedding the principle of “collect once, use many times”, alongside a pleasing promise to work with other regulators and agencies to avoid duplication.

    Two other consultation commitments have been quietly downgraded. The explicit language on working with Skills England to develop a shared view of higher education’s role in meeting regional and national skills needs has disappeared – odd given the government’s focus on this agenda. And while the Teaching Excellence Framework remains present, the consultation’s push to make TEF “more routine and more widespread” has been cooled – the final version steps back from any commitments on cadence or coverage.

    What’s missing within the text of the strategy, despite being in the consultation version, are the “I statements” – these are what Debbie McVitty characterised on Wonkhe as:

    intended to describe what achieving its strategic objectives will look and feel like for students, institutions, taxpayers and employers in a clear and accessible way, and are weighted towards students, as the “primary beneficiaries” of the proposed strategy.

    These have been published, but separately and with a few minor revisions. Quite what status they have is unclear:

    The ‘I statements’ are a distillation of our objectives, as set out in our strategy. They are not regulatory tools. We will not track the performance of universities and colleges against them directly.


  • Is there a place for LEO in regulation?

    Is there a place for LEO in regulation?

    The OfS have, following a DfE study, recently announced a desire to use LEO for regulation. In my view this is a bad idea.

    Don’t get me wrong, the Longitudinal Education Outcomes (LEO) dataset is a fantastic and under-utilised tool for historical research. Nothing can compare to LEO for its rigour, coverage and the richness of the personal data it contains.

    However, it has serious limitations: it captures earnings rather than salary, so for everyone who chooses to work part time it will seriously underestimate the salary they command.

    And fundamentally it’s just too lagged. You can add other concerns around those choosing not to work and those working abroad if you wish to undermine its utility further.

    The big idea

    The OfS is proposing to use data from three years after graduation, which I assume to mean the third full tax year after graduation, although it could mean something different – no details are provided. Assuming my interpretation is correct, the most recent LEO data, published in June this year, relates to the 2022-23 tax year. For that to be the third full tax year after graduation, we are talking about the 2018-19 graduating cohort (and even if you count the third tax year including the one in which they graduated, it’s the 2019-20 graduates). The OfS also proposes to continue to use four-year aggregates, which makes a lot of sense to avoid statistical noise and deal with small cohorts, but it does mean that some of the data will relate to even earlier cohorts.
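To make the lag concrete, here is a small sketch of the cohort arithmetic under my interpretation – the tax-year convention and the function name are my assumptions, not anything specified by OfS:

```python
def first_visible_cohort(leo_tax_year_end: int,
                         include_graduation_tax_year: bool = False) -> str:
    """Graduating cohort first covered by a LEO release, assuming the
    measure means the third full tax year after graduation.

    `leo_tax_year_end` is the end year of the tax year a LEO release
    covers (2023 for the 2022-23 tax year). A summer graduate's own tax
    year is only partial, so by default it does not count as "full".
    """
    offset = 3 if include_graduation_tax_year else 4
    end = leo_tax_year_end - offset
    return f"{end - 1}-{str(end)[-2:]}"

# The June release covering the 2022-23 tax year first reaches:
first_visible_cohort(2023)        # the 2018-19 graduates, on my reading
first_visible_cohort(2023, True)  # 2019-20, on the more generous reading
```

Either way, the first regulatory sighting of a cohort comes the best part of a decade after those students entered.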

    The problem, therefore, is that if the proposed regime had been in place this year, the OfS would only just have got its first look at outcomes from the 2018-19 graduating cohort – who were, of course, entrants in 2016-17 or earlier. Looked at through this lens, it is hard to see how one applies any serious regulatory tools to a provider failing on this metric but performing well on others, especially if they are performing well on those based on the still lagged but more timely Graduate Outcomes survey.

    It is hard to conceive of any courses that will not have had at least one significant change in the 9 (up to 12!) years since the measured cohort entered. It therefore won’t be hard for most providers to argue that the changes they have made since those cohorts entered will have had positive impacts on outcomes and the regulator will have to give some weight to those arguments especially if they are supported by changes in the existing progression, or the proposed new skills utilisation indicator.

    A problem?

    And if the existing progression indicator is problematic, then why didn’t the regulator act on it when it had it four years earlier? The OfS could try to argue that it’s a different indicator capturing a different aspect of success, but this, at least to this commentator’s mind, is a pretty flimsy argument and is likely to fail, because earnings is a very narrow definition of success. Indeed, by having two indicators the regulator may well find itself in a situation where it can only take meaningful action if a provider is failing on both.

    OfS could begin to address the time lag by just looking at the first full tax year after graduation, but this would undoubtedly be problematic as graduates take time to settle into careers (which is why Graduate Outcomes surveys at 15 months), and of course the interim study issues will be far more significant for this cohort. It would also still be less timely than the Graduate Outcomes survey, which itself collects the far more meaningful salary rather than earnings.

    There is of course a further issue with LEO in that it will forever be a black box for the providers being regulated using it. It will not be possible to share the sort of rich data with providers that is shared for other metrics meaning that providers will not be able to undertake any serious analysis into the causes of any concerns the OfS may raise. For example, a provider would struggle to attribute poor outcomes to a course they discontinued, perhaps because they felt it didn’t speak to the employment market. A cynic might even conclude that having a metric nobody can understand or challenge is quite nice for the OfS.

    The use of LEO in regulation is likely to generate a lot of work for the OfS and may trigger lots of debate but I doubt it will ever lead to serious negative consequences as the contextual factors and the fact that the cohorts being considered are ancient history will dull, if not completely blunt, the regulatory tools.

    Richard Puttock writes in a personal capacity.

  • The “regulatory burden” on sexual misconduct needs to lift the weight from students

    The problem with findings like “1.5 per cent of students said they were in intimate relationships with staff” is the danger of extrapolation.

    It’s in the results of the Office for Students’ (OfS) first sector-wide sexual misconduct survey – covering final year undergraduates in England who chose to take part in a clearly labelled bolt-on to the National Student Survey (NSS) earlier this year, with a response rate of just 12.1 per cent.

    But 1.5 per cent of final-year undergraduates at English providers reporting “intimate” staff-student relationships in the past 12 months still feels like a lot – especially when half involved staff members who were engaged in the student’s education and/or assessment.

    One in four respondents (24.5 per cent) said they’ve experienced sexual harassment since starting university, and 14.1 per cent declare experiencing sexual assault or violence.

    Most incidents involved fellow students – and of those taking place off-campus, 58.4 per cent of harassment cases and 44.1 per cent of assault cases involved someone connected to the victim’s institution.

    OfS has published a dashboard of the results, an analysis report, a guide for students and a press release where the bullets are slightly less careful about extrapolation than I’ve been above. Another report to come later will provide more detailed analysis, including results for different combinations of characteristics and findings by academic subject.

    The exercise represents OfS’ first real attempt to gather national prevalence data on sexual misconduct affecting students, having initially promised to do so back in 2022 in the context of its new Condition E6. That requires providers to take “multiple steps which could make a significant and credible difference in protecting students”.

    The survey covered three main areas – sexual harassment experiences, sexual assault and violence, and intimate staff-student relationships. Questions also included detailed behavioural descriptions to ensure accurate prevalence measurement.

    As such, the approach built on a 2023 pilot study involving volunteer providers. Since then, OfS has shortened the questionnaire whilst maintaining its core elements, leveraging NSS infrastructure to achieve national scale coverage – although for now, none of the devolved nations have taken part.

    It’s worth noting that response patterns showed quite a bit of variation between demographic groups. Students with disabilities, female students, and LGB+ students were both more likely to respond and more likely to report misconduct – creating some quite complex interpretation challenges for understanding true prevalence rates.

    Prevalence patterns and vulnerable groups

    That aside, the results show consistent vulnerability patterns across both harassment and assault. Female student respondents reported harassment rates of 33 per cent compared to significantly lower rates among males. Student respondents with disabilities experienced harassment at 34.7 per cent and assault at 22.1 per cent – higher than those without disabilities.

    Sexual orientation showed significant differences. Lesbian, gay and bisexual respondents reported harassment rates of 46.6 per cent and assault rates of 29.8 per cent, nearly double the overall population rates. Those identifying as having “other sexual orientation” also showed elevated rates – at 40.1 per cent for harassment and 23.3 per cent for assault.

    Age was also a key factor, with those under 21 at course start showing higher vulnerability rates – 31.2 per cent experienced harassment and 18.2 per cent experienced assault.

    In terms of behaviours, the survey found “making sexually suggestive looks or staring at your body” affected 16.7 per cent of all respondents – the most common individual harassment behaviour. This was followed by “making unwelcome sexual comments or asking sexualised questions about your private life, body, or physical appearance.”

    The patterns have direct relevance for E6’s training requirements, which mandate that induction sessions ensure students “understand behaviour that may constitute harassment and/or sexual misconduct.” The prevalence of apparently “lower-level” behaviours like staring suggests providers need to address misconceptions about what constitutes harassment – particularly given the survey’s use of legal definitions from the Equality Act 2010 and Protection from Harassment Act 1997.

    There were also interesting patterns across socioeconomic and ethnic lines that deserve interrogation. Those from the least deprived areas (IMD quintile 5) reported higher harassment rates at 32.6 per cent, but so did those not eligible for free school meals, who showed elevated rates at 32.9 per cent. And mixed ethnicity respondents reported harassment at 31.5 per cent compared to 27.9 per cent among white students.

    Where groups showed higher misconduct rates, part of the problem is that we can’t be sure whether that reflects reporting confidence, different social environments, or varying exposure patterns – all things providers will need to understand to make progress on the “credible difference” thing.

    The ethnic dimension also intersects with religious identity, with Jewish respondents (29.8 per cent), those with no religion (30.5 per cent), and those from “any other religion” (35.5 per cent) showing elevated harassment rates. Again, differential intersectional patterns should align with E6’s requirements for providers to understand their specific student populations and tailor interventions accordingly.

    The reporting crisis

    One of the survey’s most concerning findings relates to formal reporting rates. Only 13.2 per cent of respondents experiencing harassment in the past year made formal reports to their institutions. For sexual assault (in a university setting or involving someone connected to the university) reporting varied dramatically by age – just 12.7 per cent of under-21s reported incidents compared to 86.4 per cent of those aged 31 and above.

    This reporting gap in turn creates a fundamental information deficit for universities attempting to understand campus culture and develop appropriate interventions. The data suggests institutions may be operating with incomplete intel – hampering attempts to comply with E6 requirements to understand student populations and implement effective protective measures.

    E6 explicitly requires providers to offer “a range of different mechanisms” for making reports, including online and in-person options, and to “remove any unnecessary actual or perceived barriers” that might make students less likely to report. The survey’s findings suggest the mechanisms may not be reaching their intended audiences, particularly younger students.

    Among those who did report, experiences were mixed. For harassment cases, 46.7 per cent rated their reporting experience as good whilst 39.3 per cent rated it as poor. Sexual assault reporting showed slightly better outcomes, with 57.3 per cent rating experiences as good and 32.4 per cent as poor. These are findings that directly relate to E6’s requirements – and suggest the sector has some way to go to build confidence in the processes it does have.

    The condition mandates that providers ensure “investigatory and disciplinary processes are free from any reasonable perception of bias” and that affected parties receive “sufficient information to understand the provider’s decisions and the reasons for them.” The proportion rating experiences as poor does suggest that some providers are struggling to meet E6’s procedural fairness requirements.

    University connections and scope of misconduct

    Jurisdiction has always been a contested issue in some policies – here, misconduct frequently involved university-connected individuals even when incidents occurred off-campus. Among harassment cases not occurring in university settings, 58.4 per cent involved someone connected to the victim’s university. For assault cases, that figure was 44.1 per cent.

    Student perpetrators dominated both categories. Staff perpetrators appeared less frequently overall, though older students were more likely than younger groups to report staff involvement in assault cases.

    In E6 terms, the condition explicitly covers “the conduct of staff towards students, and/or the conduct of students towards students” and applies to misconduct “provided in any manner or form by, or on behalf of, a provider.” The data suggests universities’ efforts will need to explicitly extend beyond physical premises to encompass behaviour involving community members regardless of location.

    In fact, most recent harassment incidents occurred either entirely outside university settings (39.7 per cent) or across mixed locations (45.1 per cent), with only 15.2 per cent occurring exclusively in university settings. For sexual assault, 61.9 per cent occurred outside university settings entirely.

    The patterns all point to providers needing sophisticated approaches to addressing misconduct that span campus boundaries. Traditional safety measures, or at least student perceptions of jurisdiction, might well miss the majority of incidents affecting students – broader community engagement and partnership approaches will need to be deployed.

    Support confidence

    The survey also examined respondents’ confidence in seeking institutional support – finding 67.5 per cent felt confident about where to seek help, whilst 29.3 per cent lacked confidence. But confidence levels varied significantly across demographic groups, with particular variations by sexual orientation, sex, disability status, and age.

    The differential confidence patterns also justify the E6 requirement for providers to ensure “appropriate support” is available and targeted at different student needs. It specifically requires support for students “with different needs, including those with needs affected by a student’s protected characteristics.”

    The age-related reporting gap suggests younger students may face particular barriers to accessing institutional processes. This could relate to unfamiliarity with university systems, power dynamics, or different attitudes toward formal complaint mechanisms. For sexual assault cases, the contrast between 12.7 per cent reporting among under-21s versus 86.4 per cent among over-31s represents one of the survey’s most striking findings.

    The age-related patterns have specific relevance given E6’s training and awareness requirements. The condition requires providers to ensure students are “appropriately informed to ensure understanding” of policies and behaviour constituting misconduct. The survey suggests the requirement may need particular attention for younger students – they’re showing both higher vulnerability and lower reporting rates.

    Staff-student relationships

    The survey’s staff-student relationship findings are a small proportion of the student population – but they do raise real questions about power dynamics and institutional governance.

    Among the 1.5 per cent reporting those relationships, the high proportion involving educational or professional responsibilities suggests significant potential conflicts of interest.

    Respondent students without disabilities were more likely to report relationships involving educational responsibility (72.6 per cent versus 45.5 per cent for disabled students), and similar patterns emerged for professional responsibilities. The differences deserve investigation, particularly given disabled students’ higher overall misconduct rates.

    E6’s requirements on intimate personal relationships require that providers implement measures making “a significant and credible difference in protecting students from any actual or potential conflict of interest and/or abuse of power.”

    The survey’s power dynamic findings suggest the requirement is needed – although whether the most common approach that has emerged (a ban where there’s a supervisory relationship, and a register where there isn’t) creates the right “culture” is a remaining question, given students’ views in general on professional boundaries.

    Regulatory implications

    The survey’s findings raise real questions about how OfS will use prevalence data in its regulatory approach. Back in 2022, Susan Lapworth told the House of Commons Women and Equalities Committee hearing that the data would enable the targeting of interventions:

    “So a university with high prevalence and low reporting would perhaps raise concerns for us – and we would want to then understand in detail what was going on there and that would allow us to focus our effort.”

    Of course, as with Access and Participation, having national data on “which kinds of students in which contexts are affected by this” could well mean that what shows up in provider data as a very small problem could add up to a lot across the country. OfS’ levers in these contexts are always limited.

    The lack of survey coverage of postgraduate students in general turns up here as a major problem. We might theorise that many postgraduates face multiple vulnerabilities, given the dominance among them of international students and of students with supervisors – and patience with OfS’ focus on undergraduates really is wearing thin each time it manifests.

    The report also doesn’t look at home vs international student status, and nor does it disaggregate results by provider mission group, size, type, or characteristics. It only states that all eligible English providers in NSS 2025 were included, and that data are weighted to be representative of final-year undergraduates across the sector. Providers are also (confidentially) receiving their data – although response rates down at provider level may make drawing conclusions in the way originally envisaged difficult.

    The dramatic under-reporting rates create monitoring challenges for both institutions and OfS. If only 13.2 per cent of harassment victims make formal reports, institutional complaint statistics provide limited insight into actual campus culture. The information gap complicates E6 compliance assessment – and suggests OfS may need alternative monitoring approaches beyond traditional complaint metrics.

    E6 does explicitly contemplate requiring providers to “conduct a prevalence survey of its whole student population to the OfS’s specification” where there are compliance concerns. The 2025 survey’s methodology and findings provide a template, but it also seems to me that more contextual research – like that found in Anna Bull’s research from a couple of years back – is desperately needed to understand what’s going on beneath many of the numbers.

    Overall though, I’m often struck by the extent to which providers argue that things like E6 are an over-reach or an example of “burden”. On this evidence, even with all the caveats, it’s nothing like the burden being carried by victims of sexual misconduct.

  • What OfS’ data on harassment and sexual misconduct doesn’t tell us

    New England-wide data from the Office for Students (OfS) confirms what we have known for a long time.

    A concerningly high number of students – particularly LGBTQ+ and disabled people, as well as women – are subjected to sexual violence and harassment while studying in higher education. Wonkhe’s Jim Dickinson reviews the findings elsewhere on the site.

    The data is limited to final year undergraduates who filled out the National Student Survey, who were then given the option to fill out this further module. OfS’ report on the data details the proportion of final year students who experienced sexual harassment or violence “since being a student” as well as their experiences within the last 12 months.

    It also includes data on experiences of reporting, as well as prevalence of staff-student intimate relationships – but its omission of all postgraduate students, as well as all undergraduates other than final year students means that its findings should be seen as one piece of a wider puzzle.

    Here, I try to lay out a few of the other pieces of the puzzle to help put the new data in context.

    The timing is important

    On 1st August 2025 the new condition of registration for higher education providers in England came into force, which involves regulatory requirements for all institutions in England to address harassment and sexual misconduct, including training for all staff and students, taking steps to “prevent abuses of power” between staff and students, and requiring institutions to publish a “single, comprehensive source of information” about their approach to this work, including support services and handling of reports.

    When announcing this regulatory approach last year, OfS also published two studies from 2024 – a pilot prevalence survey of a small selection of English HEIs, as well as a “poll” of a representative sample of 3000 students. I have discussed that data as well as the regulation more generally elsewhere.

    In this year’s data release, 51,920 students responded to the survey with an overall response rate of 12.1 per cent. This is a significantly larger sample size than both of the 2024 studies, which comprised responses from 3000 and 5000 students respectively.

    This year’s survey finds somewhat lower prevalence figures for sexual harassment and “unwanted sexual contact” than last year’s studies. In the new survey, sexual harassment was experienced by 13.3 per cent of respondents within the last 12 months (and by 24.5 per cent since becoming a student), while 5.4 per cent of respondents had been subjected to unwanted sexual contact or sexual violence within the last 12 months (since becoming a student, this figure rises to 14.1 per cent).

    By any measure, these figures represent a very concerning level of gender-based violence in higher education populations. But if anything, they are at the lower end of what we would expect.

    By comparison, in OfS’ 2024 representative poll of 3000 students, over a third (36 per cent) of respondents had experienced some form of unwanted sexual contact since becoming a student with a fifth (21 per cent) stating the incident(s) happened within the past year. 61 per cent had experienced sexual harassment since being a student, and 43 per cent of the total sample had experienced this in the past year.

    The lower prevalence in the latest dataset could be (in part) because it draws on a population of final year undergraduate students – studies from the US have repeatedly found that first year undergraduate students are at the greatest risk, especially when they start their studies.

    Final year students may simply have forgotten – or blocked out – some of their experiences from first year, leading to lower prevalence. They may also have dropped out. The timing of the new survey is also important – the NSS is completed in late spring, while we would expect more sexual harassment and violence to occur when students arrive at university in the autumn.

    A study carried out in autumn or winter might find higher prevalence. Indeed, the previous two studies carried out by OfS involved data collected at different times of year – in August 2023 (for the 3000-strong poll) and “autumn 2023” (for the pilot prevalence study).

    A wide range of prevalence

    Systematic reviews published in 2023 from Steele et al and Lagdon et al from across the UK, Ireland and the US have found prevalence rates of sexual violence ranging from 7 per cent to 86 per cent.

    Steele et al.’s recent study of Oxford University found that 20.5 per cent of respondents had experienced at least one act of attempted or forced sexual touching or rape, and 52.7 per cent of respondents experienced at least one act of sexual harassment within the past year.

    Lagdon et al.’s study of “unwanted sexual experiences” in Northern Ireland found that a staggering 63 per cent had been targeted. And my own study of a UK HEI found that 30 per cent of respondents had been subjected to sexual violence since enrolling in their university, and 55 per cent had been subjected to sexual harassment.

    For now, I don’t think it’s helpful to get hung up on comparing datasets between last year and this year that draw on somewhat different populations. It’s also not necessarily important that respondents were self-selecting within those who filled out the NSS – a US study compared prevalence rates for sexual contact without consent among students between a self-selecting sample and a non-self-selecting sample, finding no difference.

    The key take-home message is that students are being subjected to a significant level of sexual harassment and violence, and that women, LGBTQ+ and disabled students in particular are unable to access higher education in safety.

    Reporting experiences

    The findings on reporting reveal some important challenges for the higher education sector. According to OfS’ new survey findings, rates of reporting to higher education institutions remain relatively low – at 13.2 per cent of those experiencing sexual harassment, and 12.7 per cent of those subjected to sexual violence.

    Of students who reported to their HEI, only around half rated their experience as “good”. And women, disabled students and LGBTQ+ students reported much lower rates of satisfaction with reporting than men, heterosexual students and non-disabled students who reported incidents to their university.

    This survey doesn’t reveal why students were rating their reporting experiences as poor, but my study Higher Education After #MeToo sheds light on some of the reasons why reporting is not working out for many students (and staff).

    At the time of data collection in 2020-21, a key reason was that – according to staff handling complaints – policies in this area were not yet fit for purpose. It’s therefore not surprising that reporting was seen as ineffective and sometimes harmful for many interviewees who had reported. Four years on, hopefully HEIs have made progress in devising and implementing policies in this area, so other reasons may be relevant.

    A further issue focused on by my study is that reporting processes for sexual misconduct in HE focus on sanctions against the reported party rather than prioritising safety or other needs of those who report. Many HEIs do now have processes for putting in place safety (“precautionary” or “interim”) measures to keep students safe after reporting.

    Risk assessment practices are developing. But these practices appear to be patchy and students (and staff) who report sexual harassment or violence are still not necessarily getting the support they need to ensure their safety from further harm. Not only this, but at the end of a process they are not usually told the actions that their university has taken as a result of the report.

    More generally, there’s a mismatch between why people report, and what is on offer from universities. Forthcoming analysis of the Power in the Academy data on staff-student sexual misconduct reveals that by the time a student gets to the point of reporting or disclosing sexual misconduct from faculty/staff to their HEI, the impacts are already being felt more severely than those who do not report.

    In laywoman’s terms, if people report staff sexual misconduct, it’s likely to be having a really bad impact on their lives and/or studies. Reasons for reporting are usually to protect oneself and others and to be able to continue in work/study. So it’s crucial that when HEIs receive reports, they are able to take immediate steps to support students’ safety. If HEIs are listening to students – including the voices of those who have reported or disclosed to their institution – then this is what they’ll be hearing.

    Staff-student relationships

    The survey also provides new data on staff-student intimate relationships. The survey details that:

    By intimate relationship we mean any relationship that includes: physical intimacy, including one-off or repeated sexual activity; romantic or emotional intimacy; and/or financial dependency. This includes both in person and online, or via digital devices.

    From this sample, 1.5 per cent of respondents stated that they had been in such a relationship with a staff member. Of those who had been involved in a relationship, a staggering 68.8 per cent of respondents said that the university or college staff member(s) had been involved with their education or assessment.

    Even as someone who researches within this area, I’m surprised by how high both these figures are. While not all students who enter into such relationships or connections will be harmed, for some, deep harms can be caused. While a much higher proportion of students who reported “intimate relationships” with staff members were 21 or over, age of the student is no barrier to such harms.

    It’s worth revisiting some of the findings from 2024 to give some context to these points. In the 3000-strong representative survey from the OfS, a third of those in relationships with staff said they felt pressure to begin, continue or take the relationship further than they wanted because they were worried that refusing would negatively impact them, their studies or career in some way.

    Even consensual relationships led to problems when the relationship broke up. My research has described the ways in which students can be targeted for “grooming” and “boundary-blurring” behaviours from staff. These questions on coercion from the 2024 survey were omitted from the shorter 2025 version – but assuming such patterns of coercion are present in the current dataset, these findings are extremely concerning.

    They give strong support to OfS’ approach towards staff-student relationships in the new condition of registration. OfS has required HEIs to take “one or more steps which could make a significant and credible difference in protecting students from any actual or potential conflict of interest and/or abuse of power.”

    Such a step could include a ban on intimate personal relationships between relevant staff and students, but HEIs may instead choose to propose other ways to protect students from abuses of power from staff. While most HEIs appear to be implementing partial bans on such relationships, some have chosen not to.

    Nevertheless, all HEIs should take steps to clarify appropriate professional boundaries between staff and students – which, as my research shows, students themselves overwhelmingly want.

    Gaps in the data

    The publication of this data is very welcome in contributing towards better understanding patterns of victimisation among students in HE. It’s crucial to position this dataset within the context of an emerging body of research in this area – both the OfS’ previous publications, but also academic studies as outlined above – in order to build up a more nuanced understanding of students’ experiences.

    Some of the gaps in the data can be filled from other studies, but others cannot. For example, while the new OfS regulatory condition E6 covers harassment on the basis of all protected characteristics, these survey findings focus only on sexual harassment and violence.

    National data on the prevalence of racial harassment or on harassment on the basis of gender reassignment would be particularly valuable in the current climate. This decision seems to be a political choice – sexual harassment and violence is a focus that both right- and left-wing voices can agree should be addressed as a matter of urgency, while it is more politically challenging (and therefore, important) to talk about racial harassment.

    The data also omits stalking and domestic abuse, which young people – including students – are more likely than other age groups to be subjected to, according to the Crime Survey of England and Wales. My own research found that 26 per cent of respondents in a study of gender-based violence at a university in England in 2020 had been subjected to psychological or physical violence from a partner.

    It does appear that despite the narrow focus on sexual harassment and violence from the OfS, many HEIs are taking a broader approach in their work, addressing domestic abuse and stalking, as well as technology-facilitated sexual abuse.

    Another gap in the data analysis report from the OfS is around international students. Last year’s pilot study of this survey included some important findings on their experiences. International students were less likely to have experienced sexual misconduct in general than UK-domiciled students, but more likely to have been involved in an intimate relationship with a member of staff at their university (2 per cent of international students in contrast with 1 per cent of UK students).

    They were also slightly more likely to state that a staff member had attempted to pressure them into a relationship. Their experiences of accessing support from their university were also poorer. These findings are important in relation to any new policies HEIs may be introducing on staff-student relationships: as international students appear to be more likely to be targeted, communications around such policies need to be tailored to this group.

    We also know that the same groups who are more likely to be subjected to sexual violence/harassment are also more likely to experience more harassment/violence, i.e. a higher number of incidents. The new data from OfS do not report on how many incidents were experienced. Sexual harassment can be harmful as a one-off experience, but if someone is experiencing repeated harassment or unwanted sexual contact from one or more others in their university environment (and both staff and student perpetrators are likely to carry out repeated behaviours), then this can have a very heavy impact on those targeted.

    The global context

    Too often, policy and debate in England on gender-based violence in higher education fails to learn from the global context. Government-led initiatives in Ireland and Australia show good practice that England could learn from.

    Ireland ran a national researcher-led survey of staff as well as students in 2021, due to be repeated in 2026, producing detailed data that is being used to inform national and cross-institutional interventions. Australia has carried out two national surveys – in 2017 and 2021 – and informed by the results has just passed legislation for a mandatory National Higher Education Code to Prevent and Respond to Gender-based Violence.

    The data published by OfS is much more limited than these studies from other contexts in its focus on final year undergraduate students only. It will be imperative to make sure that HEIs, OfS, government or other actors do not rely solely on this data – and future iterations of the survey – as a tool to direct policy, interventions or practice.

    Nevertheless, in the absence of more comprehensive studies, it adds another piece to the puzzle in understanding sexual harassment and violence in English HE.

  • Higher education mergers are a marathon not a sprint

    When the announcement came last Wednesday that the universities of Kent and Greenwich are planning to merge, the two institutions did a fine job of anticipating all the obvious questions.

    In particular, announcing that the totemic decision has already been taken on who should lead the new institution – University of Greenwich vice chancellor Jane Harrington – was a pragmatic move that will save a great deal of gossip and speculation that could otherwise have derailed the discussions that will now commence on how to turn an “intention to formally collaborate” into the “first-of-its-kind multi university group.”

    But even with that really tricky bit of business out of the way, there is still a lot to work through. Broadly those questions fall into two baskets: the strategic direction and the practical fine detail. Practicalities are important for giving reassurance that people’s lives aren’t about to radically change overnight; albeit there are inevitably lots of issues that are either formally unknown at this stage or which can only be tackled in light of the evolution of the final agreement and organisational structure.

    With that in mind, it is really worth emphasising that the notion of a “multi university group” is a brand new idea, given a conceptual shape in the very recent publication Radical collaboration: a playbook from KPMG and Mills & Reeve, produced under the auspices of the Universities UK transformation and efficiency taskforce. The idea of a “multi university trust” explored in that report, derived from the school sector, posits the creation of a single legal entity that can nevertheless “house” a range of distinct “trading entities” with unique “brands” each with an agreed level of local autonomy.

    It answers the question of how you take two (or more) institutions, each with their own histories and characteristics and find ways to create the strength and resilience that scale might offer, while retaining the local distinctive characteristics that staff, students, and local communities value and feel a sense of affinity to. It also, as has been noted in the coverage following the announcement, leaves an option open for other institutions to join the new structure, if there’s a case for them to do so.

    “It is very positive to see institutions taking proactive steps to finding new ways to work together,” says Sam Sanders, head of education, skills and productivity for KPMG in the UK. “The group structure proposed is a model we have seen be successful elsewhere, where brand identity is retained but you get economies of scale, meaning institutions can focus on their core activities while sharing the burden of the overheads. If it goes well it could act as a blueprint for other similar ventures.”

    Sam’s reflection is that establishing a new entity might be the most straightforward part of the process: “The complicated part is moving to a new model that simultaneously preserves the right culture in the right places while achieving the savings you might want to see in areas like IT, infrastructure, and estates. These are multi-year agendas so everyone involved needs to be prepared for that.”

    The long and winding road

    With lots to work through, it’s really important to step back and give the institutions space to work this out. The big picture is about mapping the critical path from single-institution vulnerabilities to strength in numbers – and that is a path these institutions and their governing bodies are, to a large extent, carving out as they go, potentially doing the wider sector a service in the process, since others may look to follow the same path in the future.

    “The sector response has been overwhelmingly positive,” says Jane Harrington, who is already fielding calls from heads of institution who are curious about the planned new model. Both Jane and University of Kent acting vice chancellor Georgina Randsley de Moura have experience with group structures in schools and further education, knowledge they drew on in thinking through the options for formal collaboration – starting with ten different possible models which were narrowed down to two that were explored in more depth.

    “We started with what we wanted to achieve, and then we looked for models,” says Georgina. “We kept going back to our principles: widening participation, education without boundaries, high quality teaching and research, and what will make sense for our regions. Inevitably there is some focus in the news around finances and that is an important part of the context, but this would not work if our universities didn’t have values and mission alignment.”

    “We also had examples in mind of where we don’t want to end up,” adds Jane. “You see mergers where the brand identity is lost and it takes a decade to get it back. We have, right now, two student-facing brands that are strong in their own right. And in five or ten years time it might be that we have four or five institutions that are part of this structure – we don’t think it would make sense for them to become part of one amorphous brand.”

    It’s frequently observed that bringing together two or more institutions that are facing difficult financial headwinds may simply create a larger institution with correspondingly larger challenges. So having a very clear strategic sense of where the strengths and opportunities lie, as well as where the risks and weaknesses might also be subject to force-multiplier effects, is pretty important at the outset.

    It’s clear that there is an efficiency agenda in play in the sense that merging allows for the adoption of a single set of systems and processes – an area where Jane is especially interested in curating creative thinking. But the wider opportunities afforded by scale are also compelling, especially in being more strategic about the collective skills and innovation offer to the region.

    Kent and Medway local councils and MPs have also responded enthusiastically to the universities’ proposal, the two heads of institution tell me – not least because navigating politics around different HE providers can be a headache for regional actors who want to engage higher education institutions in key regional agendas.

    “There are cold spots in our region where nobody is offering what is needed,” says Jane. “But developing new provision is much harder when you are acting alone. This region has pockets of multiple forms of deprivation: rural, urban and coastal. The capacity and scale afforded by combining means we can think strategically about how to do the regional growth work, and what our combined offer should be, including to support reskilling and upskilling.”

    Georgina makes a similar case for combining research strengths. “Our shared research areas, like health, food sustainability, and creative industries, play to regional strengths,” she says. “When research resources are constrained, by combining we can do more.”

    We can work it out

    The multi university group is not, in theory, a million miles from a federation in structure: in federations the constituent elements generally cede a degree of autonomy to a single governing body, but each entity retains its individual legal status. A critical difference is the extent to which a sharing economy among the entities would have to be painstakingly negotiated for a federation, which could erode the value that is created in collaborating. It could also raise tricky questions around things like VAT.

    But the sheer novelty of the multi university group also raises a bunch of regulatory questions, covered in all the depth you’d expect by DK elsewhere on the site – to give a flavour, can you use the word “university” for your trading entity without that existing as a legal entity with its own degree awarding powers?

    The supportive noises from DfE and OfS at the time of the initial announcement should give Kent and Greenwich some degree of comfort as they work through some of these questions. The sector has been making the argument for some time now that if the government and regulator want to see institutions seizing the initiative on innovative forms of collaboration, there will need to be some legal and regulatory quarter given, up to and including making active provision for forms of collaboration that emerge without a legal playbook.

    Aside from the formal conditions for collaboration, how OfS conducts itself in this period will be watched closely by others considering similar moves. While nobody would suggest that changing structure offers an excuse for dropping the ball on quality or student experience – and both heads of institution are very clear there is no expectation of that happening – OfS now has a choice. It can choose to be highly activist in requesting reams of documentation and evidence in response to events as they unfold, from institutions already grappling with a highly complex landscape. Or it can work out an approach that offers the institutions a degree of advance clarity about what their accountabilities are in this time of transition, and how they can and should keep the regulator informed of any material risks to students arising from the process.

    Despite the generally positive response, there is no shortage of scepticism about whether a plan like the one proposed can work. The answer, of course, depends on what you think success looks like. Certainly, anyone expecting a sudden and material shrinkage in costs is bound to be disappointed. Decisions will be made along the way with which some disagree, perhaps profoundly.

    But I think what is often forgotten in these discussions is that the alternative to the decision to pursue a new structure is not to carry on in broadly the same way as before, but to pursue a different but equally radical and equally contentious course of action. If the status quo was satisfactory then there would be no case for the change. In that sense, being as useful as possible in helping these two institutions make the very best fist that they can of their new venture is the right thing for everyone to do, from government downwards.
