Category: Regulation

  • Subcontractual higher education beyond the headlines


    We’ve written a lot about subcontractual provision on Wonkhe, and it is fair to say that very little of it has been positive.

    What’s repeatedly hit the headlines, here and elsewhere, are the providers that teach large numbers of students in circumstances that have sparked concerns about teaching quality, academic standards, and indeed financial probity or even ethics.

There are a fair number of students who are getting a very bad deal out of subcontractual agreements and, although we’ve been screaming about this for several years, it is good to finally see the beginnings of some action.

    Student number tools

    The long-awaited release of OfS data is not perfect – there’s lots that we’d love to see that does not appear to have been delivered. One of these is proper student numbers: it should be possible to see data on how many students are studying at each subcontracted provider at the last census point.

    Instead, we are scrabbling around with denominators and suppressions trying to build a picture of this part of the sector that is both heavily caveated and three years out of date. This isn’t good enough.

And it is a shame. Because as well as the horror show, the data we do have offers a glimpse of a little-known corner of higher education that arguably deserves to be celebrated.

I’ve developed two new dashboards to help you explore the data – these add substantial new features to what I have previously published. Both work in broadly the same way: the first allows you to examine relationships at delivery providers, the second at lead providers. You choose your provider of interest at the top left, which shows the various relationships on a map on the left hand side. On the right you can see denominator numbers for each year of data – you can use the filter at the top right to see information about the total number of students who might be continuing, completing, or progressing in a given year.

    Each row on the right hand side shows a combination of provider (lead provider on the first dashboard, delivery provider on the second), mode, and level – with denominators and suppression codes available in the coloured squares on the right. The suppression codes are as follows:

• [DQ]: information suppressed due to low data quality from the 2022–23 collection
• [low]: there are more than 2 but fewer than 23 students in the denominator
• [none]: there are 2 students or fewer in the denominator
• [DP]: data redacted for reasons of data protection
• [DPL]: data redacted for reasons of data protection (very low numbers)
• [DPH]: data redacted for reasons of data protection (within 2 of the denominator)
• [RR]: below threshold response rate (for progression)
• [BK]: no benchmarks (the benchmark includes at least 50 per cent of the provider’s students)
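For anyone working with the underlying spreadsheets rather than the dashboards, the codes above are easy to capture as a lookup table. This is a hypothetical sketch based on the list in this article – the dictionary keys and wording follow my summary, not any official OfS data dictionary:

```python
# Hypothetical lookup of the suppression codes described above.
# Keys and descriptions mirror the list in this article, not an official OfS schema.
SUPPRESSION_CODES = {
    "DQ": "suppressed due to low data quality from the 2022-23 collection",
    "low": "more than 2 but fewer than 23 students in the denominator",
    "none": "2 students or fewer in the denominator",
    "DP": "redacted for reasons of data protection",
    "DPL": "redacted for reasons of data protection (very low numbers)",
    "DPH": "redacted for reasons of data protection (within 2 of the denominator)",
    "RR": "below threshold response rate (for progression)",
    "BK": "no benchmarks (benchmark includes at least 50 per cent of the provider's students)",
}

def explain(code: str) -> str:
    """Return a human-readable reason for a suppression code."""
    return SUPPRESSION_CODES.get(code, "unknown code")
```

A helper like this makes it straightforward to annotate rows when you pull the data into your own analysis.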

    You can see available indicators (including upper and lower confidence intervals at 95%), benchmarks, and numeric thresholds by mousing over one of the coloured squares. The filled circle is the indicator, the outline diamond is the benchmark, and the cross is the threshold.

    [Full screen]

    [Full screen]

    A typology

    It’s worth noting the range of providers that are subcontracted to deliver higher education for others. There were an astonishing 681 of these between 2014 and 2022.

    A third of those active in delivering provision for others (227) are registered with the Office for Students in their own right. Fifty-nine of these are recognisable as universities or other established higher education providers – including 14 in the Russell Group.

Why would that happen? In some cases, a provider may not have had the degree awarding powers necessary for research degrees, so would partner with another university to deliver particular courses. In other cases, the peculiarities of this data mean that apprenticeship arrangements are shown with the university partner. There are also some examples of two universities working together to deliver a single programme.

We also find many examples of longstanding collaborations between universities and teaching organisations in the arts. Numerous independent schools of dance, drama, and music have offered higher education qualifications with the support of a university – the Bird School’s relationship with the Doreen Bird College of Performing Arts began in 1997. Italia Conti used to have an arrangement with the University of East London; it now works with the University of Chichester.

There are 135 organisations delivering apprenticeships in a relationship with an OfS-registered higher education provider. Universities often offer end-point assessment and administrative support to employers and others who offer apprenticeships between level 4 and level 7.

Two large providers – Navitas and QA – offer foundation courses and accredited year one courses for international students at UK universities; QA also offers a range of programmes aimed at home undergraduates. We could also add Into as a smaller example. This dataset probably isn’t the best place to see this (QA is shown as multiple, linked, organisations) but it is a huge area of provision.

    Seventy-four subcontracted providers are schools, or school centred initial teacher training (SCITT) organisations. As teacher training has gradually moved closer to the classroom and away from the lecture hall, many schools offer opportunities to gain the industry-standard Postgraduate Certificate in Education (PGCE), which is the main route to qualified teacher status. A PGCE is a postgraduate qualification and is thus only awarded by organisations with postgraduate degree awarding powers.

    In total there are 144 providers subcontracted to deliver PGCE (initial teacher training) courses, primarily schools, local councils, and further education colleges (FECs). There are 166 FECs involved in subcontracted delivery – and this extends far beyond teacher training. Most large FECs have a university centre or similar, offering a range of foundation and undergraduate courses often (but not always) in vocational subjects. The Newcastle College Group used its experience of delivering postgraduate taught masters courses for Canterbury Christ Church University to successfully apply for postgraduate degree awarding powers – the first FEC to do so.

    We find 23 NHS organisations represented within the data. Any provider delivering medical, medical related, or healthcare subjects will have a relationship with one or more NHS foundation trust – as a means to offer student placements, and bring clinical expertise into teaching. This is generally an accreditation requirement. But in many cases, the relationship extends to the university awarding credit or qualifications for the learning and training that NHS staff do. The Oxford Health NHS Foundation Trust works with multiple providers (the University of Oxford, Oxford Brookes University, and Buckinghamshire New University), to offer postgraduate apprenticeships in clinical and non-clinical roles.

    Nine police organisations (either constabularies or police and crime commissioners) have subcontractual relationships with registered higher education providers. Teesside University works with the Chief Constable of Cleveland to offer an undergraduate apprenticeship for prospective police officers.

All three of the UK’s armed forces have subcontractual relationships with higher education providers. The British Army currently works with the University of Reading to offer undergraduate and postgraduate degrees in leadership and strategic studies – in the past it has offered a range of qualifications from Bournemouth University. Kingston University has a relationship with the Royal Navy, currently offering an MSc in Technology (Maritime Operations) undertaken entirely in the workplace.

    Ecosystem

When I talk to people about franchise and partnership arrangements, most (perhaps thinking of the examples that make the mainstream press) ask me whether it would not be easier to simply ban such arrangements. After all, it is very difficult to see any benefit from the possibly fraudulent and often low quality behaviour that is plastered all over The Times on a regular basis.

As I think the data demonstrates, an outright ban would be hugely damaging – swathes of national priorities and achievements (from NHS staff development, to offering higher education in “cold spots”, to the quality of performances on London’s West End) would be adversely affected. But the same could be said for increases in regulatory overheads.

There are a handful of very large providers (I’d start with Global Banking School, Elizabeth School of London, Navitas, QA, Into, London School of Science and Technology, and a few others – and from the data you’d have included Oxford Business Colleges) that are, effectively, university-like in size and scope. It is very difficult to understand why providers of this scale are not all registered (GBS, Into, and Navitas already are), and registration does seem to be the right direction of travel.

    There are a clutch of medium-sized delivery providers, often in a single long-standing relationship with a higher education institution. Often, these are nano-specialisms, particularly in the creative arts or in locally important industries. In many of these cases oversight on quality and standards from the lead provider has been proven over a number of years to work well – and there seems little benefit to changing this arrangement. I would hope for this group – as is likely to happen for the FECs, SCITTs, NHS, police, and armed forces – that a change to regulatory oversight only happens where there is an issue identified.

    There is also a long tail of very small arrangements, often linked to apprenticeships (and regulated accordingly). For others at this end of the scale it is difficult to imagine OfS having the time or the capacity to regulate, so almost by default oversight remains with the lead partners. I know I say this in nearly every article, but at this end it feels like we need some kind of regular review of the way quality processes work for external providers within lead providers – we need to be sure lead providers are able to do what can be a very difficult job and do it well.


  • Outcomes data for subcontracted provision


In 2022–23 there were around 260 full-time first degree students, registered to a well-known provider and taught via a subcontractual arrangement, who had a continuation rate of just 9.8 per cent: of those 260 students, just 25 or so actually continued on to their second year.
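The arithmetic behind that headline figure is worth spelling out – a quick sanity check using the rounded numbers quoted above:

```python
# Rounded figures quoted above: around 260 students, 9.8 per cent continuation.
denominator = 260
continuation_rate = 0.098

continuing = round(denominator * continuation_rate)
not_continuing = denominator - continuing

print(continuing)      # 25 students, give or take, continued
print(not_continuing)  # around 235 did not
```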

    Whatever you think about franchising opening up higher education to new groups, or allowing established universities the flexibility to react to fast-changing demand or skills needs, none of that actually happens if more than 90 per cent of the registered population doesn’t continue with their course.

    It’s because of issues like this that we (and others) have been badgering the Office for Students to produce outcomes data for students taught via subcontractual arrangements (franchises and partnerships) at a level of granularity that shows each individual subcontractual partner.

    And finally, after a small pilot last year, we have the data.

    Regulating subcontractual relationships

    If anything it feels a little late – there are now two overlapping proposals on the table to regulate this end of the higher education marketplace:

• A Department for Education consultation suggests that every delivery partner that has more than 300 higher education students would need to register with the Office for Students (unless it is regulated elsewhere)
    • And an Office for Students consultation suggests that every registering partner with more than 100 higher education students taught via subcontractual arrangements will be subject to a new condition of registration (E8)

    Both sets of plans address, in their own way, the current reality that the only direct regulatory control available over students studying via these arrangements is via the quality assurance systems within the registering (lead) partners. This is an arrangement left over from previous quality regimes, where the nation spent time and money to assure itself that all providers had robust quality assurance systems that were being routinely followed.

    In an age of dashboard-driven regulation, the fact that we have not been able to easily disaggregate the outcomes of subcontractual students has meant that it has not been possible to regulate this corner of the sector – we’ve seen rapid growth of this kind of provision under the Office for Students’ watch and oversight (to be frank) has just not been up to the job.

    Data considerations

    Incredibly, it wasn’t even the case that the regulator had this data but chose not to publish it. OfS has genuinely had to design this data collection from scratch in order to get reliable information – many institutions expressed concern about the quality of data they might be getting from their academic partners (which should have been a red flag, really).

    So what we get is basically an extension of the B3 dashboards where students in the existing “partnership” population are assigned to one of an astonishing 681 partner providers alongside their lead provider. We’d assume that each of these specific populations has data across the three B3 (continuation, completion, progression) indicators – in practice many of these are suppressed for the usual OfS reasons of low student numbers and (in the case of progression) low Graduate Outcomes response rates.

    Where we do get indicator values we also see benchmarks and the usual numeric thresholds – the former indicating what OfS might expect to see given the student population, the latter being the line beneath which the regulator might feel inclined to get stuck into some regulating.

One thing we can’t really do with the data – although we wanted to – is treat each subcontractual provider as if it was a main provider and derive an overall indicator for it. Because many subcontractual providers have relationships (and students) from numerous lead providers, we start to get to some reasonably sized institutions. Two – Global Banking School and the Elizabeth School of London – appear to have more than 5,000 higher education students: GBS is around the same size as the University of Bradford, the Elizabeth School is comparable to Liverpool Hope University.

    Size and shape

    How big these providers are is a good place to start. We don’t actually get formal student numbers for these places – but we can derive a reasonable approximation from the denominator (population size) for one of the three indicators available. I tend to use continuation as it gives me the most recent (2022–23) year of data.

    [Full screen]

    The charts showing numbers of students are based on the denominators (populations) for one of the three indicators – by default I use continuation as it is more likely to reflect recent (2022–23) numbers. Because both the OfS and DfE consultations talk about all HE students there are no filters for mode or level.

For each chart you can select a year of interest (I’ve chosen the most recent year by default) or the overall indicator (which, like on the main dashboards, is synthetic over four years). If you change the indicator you may have to change the year. I’ve not included any indications of error – these are small numbers and the possible error is wide, so any responsible regulator would have to do more investigating before stepping in to regulate.

    Recall that the DfE proposal is that institutions with more than 300 higher education students would have to register with OfS if they are not regulated in another way (as a school, FE college, or local authority, for instance). I make that 26 with more than 300 students, a small number of which appear to be regulated as an FE college.
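The DfE test described above amounts to a simple filter over a table of denominators. Here is an illustrative sketch – the partner names, student numbers, and field names are all invented for the example, not taken from the OfS release:

```python
# Invented example partners (names, numbers, and field names are mine,
# not from the OfS data release).
partners = [
    {"name": "Partner A", "he_students": 1200, "regulated_elsewhere": False},
    {"name": "Partner B", "he_students": 350, "regulated_elsewhere": True},  # e.g. an FE college
    {"name": "Partner C", "he_students": 280, "regulated_elsewhere": False},
]

# DfE proposal: more than 300 HE students, and not regulated in another way,
# means the delivery partner would need to register with OfS.
must_register = [
    p["name"]
    for p in partners
    if p["he_students"] > 300 and not p["regulated_elsewhere"]
]
print(must_register)  # ['Partner A']
```

Running this kind of filter over the continuation denominators is how I arrive at the count of 26 providers above the threshold.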

    You can also see which lead providers are involved with each delivery partner – there are several that have relationships with multiple universities. It is instructive to compare outcomes data within a delivery partner – clearly differences in quality assurance and course design do have an impact, suggesting that the “naive university hoodwinked by low quality franchise partner” narrative, if it has any truth to it at all, is not universally true.

    [Full screen]

    The charts showing the actual outcomes are filtered by mode and level as you would expect. Note that not all levels are available for each mode of study.

    This chart brings in filters for level and mode – there are different indicators, benchmarks, and thresholds for each combination of these factors. Again, there is data suppression (low numbers and responses) going on, so you won’t see every single aspect of every single relationship in detail.

    That said, what we do see is a very mixed bag. Quite a lot of provision sits below the threshold line, though there are also some examples of very good outcomes – often at smaller, specialist, creative arts colleges.

    Registration

    I’ve flipped those two charts to allow us to look at the exposure of registered universities to this part of the market. The overall sizes in recent years at some providers won’t be of any surprise to those who have been following this story – a handful of universities have grown substantially as a result of a strategic decision to engage in multiple academic partnerships.

    [Full screen]

Canterbury Christ Church University, Bath Spa University, Buckinghamshire New University, and Leeds Trinity University have always been the big four in this market. But of the 84 registered providers engaged in partnerships, I count 44 that would have met the 100 student threshold for the new E8 condition of registration had it applied in 2022–23.

Looking at the outcomes measures suggests that performance does not vary widely across multiple partners, although there will always be variation by teaching provider, subject, and population. It is striking that places with a lot of different partners tend to get reasonable results – lower indicator values tend to be found at places running just one or two relationships – so it does feel like some work on improving external quality assurance and validation would help.

    [Full screen]

    To be clear, this is data from a few years ago (the most recent available data is from 2022–23 for continuation, 2019–20 for completion, and 2022–23 for progression). It is very likely that providers will have identified and addressed issues (or ended relationships) using internal data long before either we or the Office for Students got a glimpse of what was going on.

    A starting point

    There is clearly a lot more that can be done with what we have – and I can promise this is a dataset that Wonkhe is keen to return to. It gets us closer to understanding where problems may lie – the next phase would be to identify patterns and commonalities to help us get closer to the interventions that will help.

    Subcontractual arrangements have a long and proud history in UK higher education – just about every English provider started off in a subcontractual arrangement with the University of London, and it remains the most common way to enter the sector. A glance across the data makes it clear that there are real problems in some areas – but it is something other than the fact of a subcontractual arrangement that is causing them.

    Do you like higher education data as much as I do? Of course you do! So you are absolutely going to want to grab a ticket for The Festival of Higher Education on 11-12 November – it’s Team Wonkhe’s flagship event and data discussion is actively encouraged. 


  • What the saga of Oxford Business College tells us about regulation and franchising


    One of the basic expectations of a system of regulation is consistency.

    It shouldn’t matter how prestigious you are, how rich you are, or how long you’ve been operating: if you are active in a regulated market then the same rules should apply to all.

    Regulatory overreach can happen when there is public outrage over elements of what is happening in that particular market. The pressure a government feels to “do something” can override processes and requirements – attempting to reach the “right” (political or PR) answer rather than the “correct” (according to the rules) one.

So when courses at Oxford Business College were de-designated by the Secretary of State for Education, there’s more to the tale than a provider where legitimate questions had been raised about the student experience getting its just deserts. It is a cautionary tale, involving a fascinating high-court judgment and some interesting arguments about the limits of ministerial power, of what happens when political will gets ahead of regulatory processes.

    Business matters

    A splash in The Sunday Times back in the spring concerned the quality of franchised provision from – as it turned out – four Office for Students registered providers taught at Oxford Business College. The story came alongside tough language from Secretary of State for Education Bridget Phillipson:

    I know people across this country, across the world, feel a fierce pride for our universities. I do too. That’s why I am so outraged by these reports, and why I am acting so swiftly and so strongly today to put this right.

    And she was in no way alone in feeling that way. Let’s remind ourselves, the allegations made in The Sunday Times were dreadful. Four million pounds in fraudulent loans. Fake students, and students with no apparent interest in studying. Non-existent entry criteria. And, as we shall see, that’s not even as bad as the allegations got.

    De-designation – removing the eligibility of students at a provider to apply for SLC fee or maintenance loans – is one of the few levers government has to address “low quality” provision at an unregistered provider. Designation comes automatically when a course is franchised from a registered provider: a loophole in the regulatory framework that has caused concern over a number of years. Technically an awarding provider is responsible for maintaining academic quality and standards for its students studying elsewhere.

    The Office for Students didn’t have any regulatory jurisdiction other than pursuing the awarding institutions. OBC had, in fact, tried to register with OfS – withdrawing the application in the teeth of the media firestorm at the end of March.

    So everything depended on the Department for Education overturning precedent.

    Ministering

    It is “one of the biggest financial scandals universities have faced.” That’s what Bridget Phillipson said when presented with The Sunday Times’ findings. She announced that the Public Sector Fraud Authority would coordinate immediate action, and promised to empower the Office for Students to act in such cases.

In fact, OBC was already under investigation by the Government Internal Audit Agency (GIAA) and had been since 2024. DfE had been notified by the Student Loans Company about trends in the data and other information that might indicate fraud at various points between November 2023 and February 2024 – notifications that we now know were summarised as a report detailing the concerns which was sent to DfE in January 2024. The eventual High Court judgement (the details of which we will get to shortly) outlined just a few of these allegations, which I take from the court documents:

    • Students enrolled in the Business Management BA (Hons) course did not have basic English language skills.
• Fewer than 50 per cent of students enrolled at the London campus participate, and the remainder instead pay staff to record them as in attendance.
    • Students have had bank details altered or new bank accounts opened in their name, to which their maintenance payments were redirected.
    • Staff are encouraging fraud through fake documents sent to SLC, fake diplomas, and fake references. Staff are charging students to draft their UCAS applications and personal statements. Senior staff are aware of this and are uninterested.
    • Students attending OBC do not live in the country. In one instance, a dead student was kept on the attendance list.
    • Students were receiving threats from agents demanding money and, if the students complained, their complaints were often dealt with by those same agents threatening the students.
    • Remote utilities were being used for English language tests where computers were controlled remotely to respond to the questions on behalf of prospective students.
    • At the Nottingham campus, employees and others were demanding money from students for assignments and to mark their attendance to avoid being kicked off their course.

    At the instigation of DfE, and with the cooperation of OBC, GIAA started its investigation on 19 September 2024, continuing to request information from and correspond with the college until 17 January 2025. An “interim report” detailing emerging findings went to DfE on 17 December 2024; the final report arrived on 30 January 2025. The final report made numerous recommendations about OBC processes and policies, but did not recommend de-designation. That recommendation came in a ministerial submission, prepared by civil servants, dated 18 March 2025.

    Process story

    OBC didn’t get sight of these reports until 20 March 2025, after the decisions were made. It got summaries of both the interim and final reports in a letter from DfE notifying it that Phillipson was “minded to” de-designate. The documentation tells us that GIAA reported that OBC had:

    • recruited students without the required experience and qualifications to successfully complete their courses
    • failed to ensure students met the English language proficiency as set out in OBC and lead provider policies
    • failed to ensure attendance is managed effectively
• failed to withdraw or suspend students who fell below the required thresholds for performance and/or engagement
    • failed to provide evidence that immigration documents, where required, are being adequately verified.

    The college had 14 days to respond to the summary and provide factual comment for consideration, during which period The Sunday Times published its story. OBC asked DfE for the underlying material that informed the findings and the subsequent decision, and for an extension (it didn’t get all the material, but it got a further five days) – and it submitted 68 pages of argument and evidence to DfE, on 7 April 2025. Another departmental ministerial submission (on 16 April 2025) recommended that the Secretary of State confirm the decision to de-designate.

    According to the OBC legal team, these emerging findings were not backed up by the full GIAA reports, and there were concerns about the way a small student sample had been used to generalise across an entire college. Most concerningly, the reports as eventually shared with the college did not support de-designation (though they supported a number of other concerns about OBC and its admission process). This was supported by a note from GIAA regarding OBC’s submission, which – although conceding that aspects of the report could have been expressed more clearly – concluded:

    The majority of the issues raised relate to interpretation rather than factual accuracy. Crucially, we are satisfied that none of the concerns identified have a material impact on our findings, conclusions or overall assessment.

    Phillipson’s decision to de-designate was sent to the college on 17 April 2025, and it was published as a Written Ministerial Statement. Importantly, in her letter, she noted that:

    The Secretary of State’s decisions have not been made solely on the basis of whether or not fraud has been detected. She has also addressed the issue of whether, on the balance of probabilities, the College has delivered these courses, particularly as regards the recruitment of students and the management of attendance, in such a way that gives her adequate assurance that the substantial amounts of public money it has received in respect of student fees, via its partners, have been managed to the standards she is entitled to expect.

    Appeal

Oxford Business College appealed the Secretary of State’s decision. Four grounds of challenge were pursued:

    • Ground 3: the Secretary of State had stepped beyond her powers in prohibiting OBC from receiving public funds from providing new franchised courses in the future.
    • Ground 1: the decision was procedurally unfair, with key materials used by the Secretary of State in making the decision not provided to the college, and the college never being told the criteria it was being assessed against
• Ground 4: By de-designating courses, DfE breached OBC’s rights under Article 1 of the First Protocol to the European Convention on Human Rights (to peaceful enjoyment of its possessions – in this case the courses themselves)
    • Ground 7: The decision by the Secretary of State had breached the public sector equality duty

    Of these, ground 3 was not determined, as the Secretary of State had clarified that no decision had been taken regarding future courses delivered by OBC. Ground 4 was deemed to be a “controversial” point of law regarding whether a course and its designation status could be a “possession” under ECHR, but could be proceeded with at a later date. Ground 7 was not decided.

    Ground 1 succeeded. The court found that OBC had been subject to an unfair process, where:

    OBC was prejudiced in its ability to understand and respond to the matters of the subject of investigation, including as to the appropriate sanction, and to understand the reasons for the decision.

    Judgement

    OBC itself, or the lawyers it engaged, have perhaps unwisely decided to put the judgement into the public domain – it has yet to be formally published. I say unwisely, because it also puts the initial allegations into the public domain and does not detail any meaningful rebuttal from the college – though The Telegraph has reported that the college now plans to sue the Secretary of State for “tens of millions of pounds.”

    The win, such as it is, was entirely procedural. The Secretary of State should have shared more detail of the findings of the GIAA investigation (at both “emerging” and “final” stages) in order that the college could make its own investigations and dispute any points of fact.

Much of the judgement deals with the criteria by which a sample of 200 students was selected – OBC was not made aware that this was a sample comprising those “giving the greatest cause for suspicion” rather than a random sample – and with OBC’s inability to identify the students whose circumstances or behaviour were mentioned in the report. These were omissions, but nowhere does OBC argue that these were not real students with real experiences.

Where allegations are made that students might be being threatened by agents and institutional staff, it is perhaps understandable that identifying details might be redacted – though DfE cited the “pressure resulting from the attenuated timetable following the order for expedition, the evidence having been filed within 11 days of that order” for the difficulties faced in redacting the report properly. On this point, DfE noted that OBC, using the materials provided, “had been able to make detailed representations running to 68 pages, which it had described as ‘comprehensive’ and which had been duly considered by the Secretary of State”.

    The Secretary of State, in evidence, rolled back from the idea that she could automatically de-designate future courses without specific reason, but this does not change the decisions she has made about the five existing courses delivered in partnership. Neither does it change the fact that OBC, having had five courses forcibly de-designated, and seen the specifics of the allegations underpinning this exceptional decision put into the public domain without any meaningful rebuttal, may struggle to find willing academic partners.

    The other chink of legal light came with an argument that a contract (or subcontract) could be deemed a “possession” under certain circumstances, and that Article 1 of Protocol 1 to the European Convention on Human Rights protects the peaceful enjoyment of possessions. The judgement admits that there could be grounds for debate here, but that debate has not yet happened.

    Rules

    Whatever your feelings about OBC, or franchising in general, the way in which DfE appears to have used a carefully redacted and summarised report to remove an institution from the sector is concerning. If the rules of the market permit behaviour that ministers do not like, then these rules need to be re-written. DfE can’t just regulate based on what it thinks the rules should be.

    The college issued a statement on 25 August, three days after the judgement was published. It claims to be engaging with “partner institutions” (named as Buckinghamshire New University, University of West London, Ravensbourne University London, and New College Durham – though all four had already ended their partnerships, with the remaining students being “taught out”) about the future of the students affected by the designation decision – many of whom had already transferred to other courses at other providers.

    In fact, the judgement tells us that of the 5,000 students registered at OBC on 17 April 2025, around 4,700 had either withdrawn or transferred out of OBC to be taught out. We also learn that 1,500 new students, who had planned to start an OBC-delivered course after 2025, would no longer be doing so. Four lead providers had given notice to terminate franchise agreements between April 2024 and May 2025. Franchise discussions with another provider – Southampton Solent University – which had been underway shortly before the decision to de-designate, had ended.

    OBC currently offers one course itself (no partnership offers are listed) – a foundation programme covering academic skills and English language, including specialisms in law, engineering, and business – which is designed to prepare students for the first year of an undergraduate degree course. It is not clear what award this course leads to, or how it is regulated. It is also expensive – a six-month version (requiring IELTS 5.5 or above) costs an eyewatering £17,500. And there is no information as to how students might enrol on this course.

    OBC’s statement about the court case indicates that it “rigorously adheres to all regulatory requirements”, but it is not clear which (if any) regulator has jurisdiction over the one course it currently advertises.

    If there are concerns about the quality of teaching, or about academic standards, in any provider in receipt of public funds they clearly need to be addressed – and this is as true for Oxford Business College as it is for the University of Oxford. This should start with a clear plan for quality assurance (ideally one that reflects the current concerns of students) and a watertight process that can be used both to drive compliance and take action against those who don’t measure up. Ministerial legal innovation, it seems, doesn’t quite cut it.

    Source link

  • Is there a place for LEO in regulation?

    Is there a place for LEO in regulation?

    The OfS has, following a DfE study, recently announced a desire to use LEO for regulation. In my view this is a bad idea.

    Don’t get me wrong, the Longitudinal Education Outcomes (LEO) dataset is a fantastic and under-utilised tool for historical research. Nothing can compare to LEO for its rigour, coverage and the richness of the personal data it contains.

    However, it has serious limitations. It captures earnings rather than salary, so for everyone who chooses to work part time it will seriously underestimate the salary they command.

    And fundamentally it’s just too lagged. You can add other concerns around those choosing not to work and those working abroad if you wish to undermine its utility further.

    The big idea

    The OfS is proposing to use data from three years after graduation, which I assume means the third full tax year after graduation – though it could mean something different, as no details are provided. Assuming my interpretation is correct, the most recent LEO data, published in June this year, relates to the 2022-23 tax year. For that to be the third full tax year after graduation, we are talking about the 2018-19 graduating cohort (and even if you count the third tax year as including the one in which they graduated, it’s the 2019-20 graduates). The OfS also proposes to continue to use four-year aggregates, which makes a lot of sense to avoid statistical noise and deal with small cohorts, but it does mean that some of the data will relate to even earlier cohorts.
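
    The lag arithmetic can be sketched in a few lines. This is only my reading of the proposal – the function and the “third full tax year” interpretation are assumptions of mine, not anything published by the OfS:

```python
# UK tax years run April to April, so a student graduating in the summer
# of year G has their first *full* tax year running April G+1 to April G+2.
# Under the "third full tax year" reading, the tax year starting in April
# of year T therefore covers the cohort that graduated in summer T - 3.

def graduating_cohort(tax_year_start: int, full_tax_years_after: int = 3) -> str:
    """Academic year of the graduating cohort whose Nth full tax year
    after graduation starts in April of `tax_year_start`."""
    grad_summer = tax_year_start - full_tax_years_after
    # Academic years are written e.g. "2018-19" for Aug 2018 to Jul 2019.
    return f"{grad_summer - 1}-{grad_summer % 100:02d}"

# The June release covers the 2022-23 tax year (starting April 2022):
print(graduating_cohort(2022))     # -> 2018-19
# Counting the tax year of graduation itself as the first of the three:
print(graduating_cohort(2022, 2))  # -> 2019-20
```

    Either way, the freshest possible LEO reading describes students who graduated six or seven years before the regulator sees it.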

    The problem, therefore, is that if the proposed regime had been in place this year, the OfS would have just got its first look at outcomes from the 2018-19 graduating cohort – who were, of course, entrants in 2016-17 or earlier. Looked at through this lens, it is hard to see how one applies any serious regulatory tools to a provider failing on this metric but performing well on others – especially if it is performing well on those based on the still lagged but more timely Graduate Outcomes survey.

    It is hard to conceive of any courses that will not have had at least one significant change in the 9 (up to 12!) years since the measured cohort entered. It therefore won’t be hard for most providers to argue that the changes they have made since those cohorts entered will have had positive impacts on outcomes and the regulator will have to give some weight to those arguments especially if they are supported by changes in the existing progression, or the proposed new skills utilisation indicator.

    A problem?

    And if the existing progression indicator is problematic, then why didn’t the regulator act on it when it had it four years earlier? The OfS could try to argue that it’s a different indicator capturing a different aspect of success, but this, at least to this commentator’s mind, is a pretty flimsy argument and is likely to fail, because earnings is a very narrow definition of success. Indeed, by having two indicators the regulator may well find itself in a situation where it can only take meaningful action if a provider is failing on both.

    OfS could begin to address the time lag by just looking at the first full tax year after graduation but this will undoubtedly be problematic as graduates take time to settle into careers (which is why GO is at 15 months) and of course the interim study issues will be far more significant for this cohort. It would also still be less timely than the Graduate Outcomes survey which itself collects the far more meaningful salary rather than earnings.

    There is of course a further issue with LEO in that it will forever be a black box for the providers being regulated using it. It will not be possible to share the sort of rich data with providers that is shared for other metrics meaning that providers will not be able to undertake any serious analysis into the causes of any concerns the OfS may raise. For example, a provider would struggle to attribute poor outcomes to a course they discontinued, perhaps because they felt it didn’t speak to the employment market. A cynic might even conclude that having a metric nobody can understand or challenge is quite nice for the OfS.

    The use of LEO in regulation is likely to generate a lot of work for the OfS and may trigger lots of debate but I doubt it will ever lead to serious negative consequences as the contextual factors and the fact that the cohorts being considered are ancient history will dull, if not completely blunt, the regulatory tools.

    Richard Puttock writes in a personal capacity.

    Source link

  • Testing Times & Interesting Discussions

    Testing Times & Interesting Discussions

    Last week, The Royal Bank of Canada (RBC) put out a discussion paper called Testing Times: Fending Off A Crisis in Post-Secondary Education, which is in part the outcome of a set of cross-country discussions held this summer by RBC, HESA, and the Business Higher Education Roundtable (BHER). The paper, I think, sums up the current situation pretty well: the system is not at a starvation point, but it is heading in that direction pretty quickly, and that needs to be rectified. On the other hand, there are some ways that institutions could be moving more quickly to respond to changing social and economic circumstances. What’s great about this paper is that it balances those two ideas pretty effectively.

    I urge everyone to read it themselves because I think it sums up a lot of issues nicely – many of which we at HESA will be taking up at our Re: University conference in January (stay tuned! the nearly full conference line-up will be out in a couple of weeks, and it’s pretty exciting). But I want to draw everyone’s attention to section 4 of the report in particular, which I think covers the sleeper issue of the year: the regulation of post-secondary institutions. One of the things we heard a lot on the road was how universities were being hamstrung – not just by governments but by professional regulatory bodies – in terms of developing innovative programming. This is a subject I’ll return to in the next week or two, but I am really glad that this issue might be starting to get some real traction.

    The timing of this release wasn’t accidental: it came just a few days before BHER had one of its annual high-level shindigs, and RBC’s CEO Dave McKay is also BHER’s Board Chair, so the two go hand-in-hand to some extent. I was at the summit on Monday – a Chatham House rule session at RBC headquarters entitled Strategic Summit on Talent, Technology and a New Economic Order – which attracted a good number of university and college presidents, as well as CEOs. The discussions took up the challenge in the RBC paper to look at where the country is going and where the post-secondary education sector can contribute to making a new and stronger Canada.

    And boy, was it interesting.

    I mean, partly it was some of the outright protectionist stuff being advocated by the corporate sector in the room. I haven’t heard stuff like that since I was a child. Basically, the sentiment in the room is that the World Trade Organization (WTO) is dead, the Americans aren’t playing by those rules anymore, so why should we? Security of supply > low-cost supply. Personally, I think that likely means that this “new economic order” is going to mean much more expensive wholesale prices, but hey, if that’s what we have to adapt to, that’s what we have to adapt to.

    But, more pertinent to this blog were the ways the session dealt with the issue of what in higher education needs to change to meet the moment. And, for me, what was interesting was that once you get a group of business folks in a room and ask what higher education can do to help get the country on track, they actually don’t have much to say. They will talk a LOT about what government can do to help get the country on track. The stories they can tell about how much more ponderous and anti-innovation Canadian public procurement policies are compared to almost any other jurisdiction on earth would be entertaining if the implications were not so horrific. They will talk a LOT about how Canadian C-suites are risk-averse, almost as risk-averse as government, and how disappointing that is.

    But when it comes to higher education? They don’t actually have all that much to say. And that’s both good and bad.

    Now before I delve into this, let me say that it’s always a bit tricky to generalize what a sector believes based on a small group of CEOs who get drafted into a room like this one. I mean, to some degree these CEOs are there because they are interested in post-secondary education, so they aren’t necessarily very representative of the sector. But here’s what I learned:

    • CEOs are a bit ruffled by current underfunding of higher education. Not necessarily to the point where they would put any of their own political capital on the line, but they are sympathetic to institutions.
    • When they think about how higher education affects their business, CEOs seem to think primarily about human capital (i.e. graduates). They talk a lot less about research, which is mostly what universities want to talk about, so there is a bit of a mismatch there.
    • When they think about human capital, what they are usually thinking about is “can my business have access to skills at a price I want to pay?” Because the invitees are usually heads of successful fast-growing companies, the answer is usually no. Also, most say what they want are “skills” – something they, not unreasonably, equate with experience, which sets up another set of potential misunderstandings with universities because degrees ≠ experience (but it does mean everyone can agree on more work-integrated learning).
    • As a result – and this is important here – at best, CEOs think about post-secondary education in terms of firm growth, not in terms of economy-wide innovation.

    Now, maybe that’s all right and proper – after all, isn’t it government’s business to look after the economy-wide stuff? Well, maybe, but here’s where it gets interesting. You can drive innovation either by encouraging the manufacture and circulation of ideas (i.e. research) or by diffusing skills through the economy (i.e. education/training). But our federal government seems to think that innovation only happens via the introduction of new products/technology (i.e., the product of research), and that to the extent there is an issue with post-secondary education, it is that university-based research doesn’t translate into new products fast enough – i.e. the issue is research commercialization. The idea that technological adoption might be the product of governments and firms not having enough people to use new technologies properly (e.g. artificial intelligence)? Not on anyone’s radar screen.

    And that really is a problem. One I am not sure is easily fixed because I am not sure everyone realizes the degree to which they are talking past each other. But that said, the event was a promising one. It was good to be in a space where so many people cared about Canada, about innovation, and about post-secondary education. And the event itself – very well pulled-off by RBC and BHER – made people want to keep discussing higher education and the economy. Both business and higher education need to have events like this one, regularly, and not just nationally but locally as well. The two sides don’t know each other especially well, and yet their being more in sync is one of the things that could make the country work a lot better than it does. Let’s keep talking.

    Source link

  • From improvement to compliance – a significant shift in the purpose of the TEF

    From improvement to compliance – a significant shift in the purpose of the TEF

    The Teaching Excellence Framework has always had multiple aims.

    It was partly intended to rebalance institutional focus from research towards teaching and student experience. Jo Johnson, the minister who implemented it, saw it as a means of increasing undergraduate teaching resources in line with inflation.

    Dame Shirley Pearce prioritised enhancing quality in her excellent review of TEF implementation. And there have been other purposes of the TEF: a device to support regulatory interventions where quality fell below required thresholds, and as a resource for student choice.

    And none of this should ignore its enthusiastic adoption by student recruitment teams as a marketing tool.

    As former Chair and Deputy Chair of the TEF, we are perhaps more aware than most of these competing purposes, and more experienced in understanding how regulators, institutions and assessors have navigated the complexity of TEF implementation. The TEF has had its critics – something else we are keenly aware of – but it has had a marked impact.

    Its benchmarked indicator sets have driven a data-informed and strategic approach to institutional improvement. Its concern with disparities for underrepresented groups has raised the profile of equity in institutional education strategies. Its whole institution sweep has made institutions alert to the consequences of poorly targeted education strategies and prioritised improvement goals. Now, the publication of the OfS’s consultation paper on the future of the TEF is an opportunity to reflect on how the TEF is changing and what it means for the regulatory and quality framework in England.

    A shift in purpose

    The consultation proposes that the TEF becomes part of what the OfS sees as a more integrated quality system. All registered providers will face TEF assessments, with no exemptions for small providers. Given the number of new providers seeking OfS registration, it is likely that the number to be assessed will be considerably larger than the 227 institutions in the 2023 TEF.

    Partly because of the larger number of assessments to be undertaken, TEF will move to a rolling cycle, with a pool of assessors. Institutions will still be awarded three grades – one for outcomes, one for experience and one overall, but their overall grade will simply be the lower of the two other grades. The real impact of this will be on Bronze-rated providers who could find themselves subject to a range of measures, potentially including student number controls or fee constraints, until they show improvement.
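
    The proposed grading rule is easy to state precisely: order the grades, then take the minimum of the two aspect grades. A minimal sketch of that rule – the grade labels follow the 2023 scheme, but the function itself is illustrative, not anything from the consultation:

```python
# TEF grades ordered from worst to best, per the 2023 scheme.
GRADE_ORDER = ["Requires Improvement", "Bronze", "Silver", "Gold"]

def overall_grade(student_experience: str, student_outcomes: str) -> str:
    """Proposed rule: the overall grade is the lower of the two aspect grades."""
    return min(student_experience, student_outcomes, key=GRADE_ORDER.index)

print(overall_grade("Gold", "Bronze"))    # -> Bronze
print(overall_grade("Silver", "Silver"))  # -> Silver
```

    Under this rule a single Bronze aspect grade drags the overall rating to Bronze, however strong the other aspect is.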

    The OfS consultation paper marks a significant shift in the purpose of the TEF, from quality enhancement to regulation and from improvement to compliance. The most significant changes are at the lower end of assessed performance. The consultation paper makes sensible changes to aspects of the TEF which always posed challenges for assessors and regulators, tidying up the relationship between the threshold B3 standards and the lowest TEF grades. It correctly separates measures of institutional performance on continuation and completion – over which institutions have more direct influence – from progression to employment – over which institutions have less influence.

    Pressure points

    But it does this at some heavy costs. By treating the Bronze grade as a measure of performance at, rather than above, threshold quality, it will produce just two grades above the threshold. In shifting the focus towards quantitative indicators and away from institutional discussion of context, it will make TEF life more difficult for further education institutions and institutions in locations with challenging graduate labour markets. The replacement of the student submission with student focus groups may allow more depth on some issues, but comes at the expense of breadth, and the student voice is, disappointingly, weakened.

    There are further losses as the regulatory purpose is embedded. The most significant is the move away from educational gain, and this is a real loss: following TEF 2023, almost all institutions were developing their approaches to and evaluation of educational gain, and we have seen many examples where this was shaping fruitful approaches to articulating institutional goals and the way they shape educational provision.

    Educational gain is an area in which institutions were increasingly thinking about distinctiveness and how it informs student experience. It is a real loss to see it go, and it will weaken the power of many education strategies. It is almost certainly the case that the ideas of educational gain and distinctiveness are going to be required for confident performance at the highest levels of achievement, but it is a real pity that it is less explicit. Educational gain can drive distinctiveness, and distinctiveness can drive quality.

    Two sorts of institutions will face the most significant challenges. The first, obviously, are providers rated Bronze in 2023, or Silver-rated providers whose indicators are on a downward trajectory. Eleven universities were given a Bronze rating overall in the last TEF exercise – and 21 received Bronze either for the student experience or student outcomes aspects. Of the 21, only three Bronzes were for student outcomes, but under the OfS plans, all would be graded Bronze, since any institution would be given its lowest aspect grade as its overall grade. Under the proposals, Bronze-graded institutions will need to address concerns rapidly to mitigate impacts on growth plans, funding, prestige and competitive position.

    The second group facing significant challenges will be those in difficult local and regional labour markets. Of the 18 institutions with Bronze in one of the two aspects of TEF 2023, only three were graded Bronze for student outcomes, whereas 15 were for student experience. Arguably this was to be expected when only two of the six features of student outcomes had associated indicators: continuation/completion and progression.

    In other words, if indicators were substantially below benchmark, there were opportunities to show how outcomes were supported and educational gain was developed. Under the new proposals, the approach to assessing student outcomes is largely, if not exclusively, indicator-based, for continuation and completion. The approach is likely to reinforce differences between institutions, and especially those with intakes from underrepresented populations.

    The stakes

    The new TEF will play out in different ways in different parts of the sector. The regulatory focus will increase pressure on some institutions, whilst appearing to relieve it in others. For those institutions operating at 2023 Bronze levels or where 2023 Silver performance is declining, the negative consequences of a poor performance in the new TEF, which may include student number controls, will loom large in institutional strategy. The stakes are now higher for these institutions.

    On the other hand, institutions whose graduate employment and earnings outcomes are strong, are likely to feel more relieved, though careful reading of the grade specifications for higher performance suggests that there is work to be done on education strategies in even the best-performing 2023 institutions.

    In public policy, lifting the floor – by addressing regulatory compliance – and raising the ceiling – by promoting improvement – at the same time is always difficult, but the OfS consultation seems to have landed decisively on the side of compliance rather than improvement.

    Source link

  • An assessor’s perspective on the Office for Students’ TEF shake-up

    An assessor’s perspective on the Office for Students’ TEF shake-up

    Across the higher education sector in England some have been waiting with bated breath for details of the proposed new Teaching Excellence Framework. Even amidst the multilayered preparations for a new academic year – the planning to induct new students, to teach well and assess effectively, to create a welcoming environment for all – those responsible for education quality have had one eye firmly on the new TEF.

    The OfS has now published its proposals along with an invitation to the whole sector to provide feedback on them by 11 December 2025. As an external adviser for some very different types of provider, I’m already hearing a kaleidoscope of changing questions from colleagues. When will our institution or organisation next be assessed if the new TEF is to run on a rolling programme rather than in the same year for everyone? How will the approach to assessing us change now that basic quality requirements are included alongside the assessment of educational ‘excellence’? What should we be doing right now to prepare?

    Smaller providers, including further education colleges that offer some higher education programmes, have not previously been required to participate in the TEF assessment. They will now all need to take part, so have a still wider range of questions about the whole process. How onerous will it be? How will data about our educational provision, both quantitative and qualitative, be gathered and assessed? What form will our written submission to the OfS need to take? How will judgements be made?

    As a member of TEF assessment panels through TEF’s entire lifecycle to date, I’ve read the proposals with great interest. From an assessor’s point of view, I’ve pondered on how the assessment process will change. Will the new shape of TEF complicate or help streamline the assessment process so that ratings can be fairly awarded for providers of every mission, shape and size?

    Panel focus

    TEF panels have always comprised experts from the whole sector, including academics, professional staff and student representatives. We have looked at the evidence of “teaching excellence” (I think of it as good education) from each provider very carefully. It makes sense that the two main areas of assessment, or “aspects” – student experience and student outcomes – will continue to be discrete areas of focus, leading to two separate ratings of either Gold, Silver, Bronze or Requires Improvement. That’s because the data for each of these can differ quite markedly within a single provider, so it can mislead students to conflate the two judgements.

    Diagram from page 18 of the consultation document

    Another positive continuity is the retention of both quantitative and qualitative evidence. Quantitative data include the detailed datasets provided by OfS, benchmarked against the sector. These are extremely helpful to assessors who can compare the experiences and outcomes of students from different demographics across the full range of providers.

    Qualitative data have previously come from 25-page written submissions from each provider, and from written student submissions. There are planned changes afoot for both of these forms of evidence, but they will still remain crucial.

    The written provider submissions may be shorter next time. Arguably there is a risk here, as submissions have always enabled assessors to contextualise the larger datasets. Each provider has its own story of setting out to make strategic improvements to their educational provision, and the submissions include both qualitative narrative and internally produced quantitative datasets related to the assessment criteria, or indicators.

    However, it’s reasonable for future submissions to be shorter as the student outcomes aspect will rely upon a more nuanced range of data relating to study outcomes as well as progression post-study (proposal 7). While it’s not yet clear what the full range of data will be, this approach is potentially helpful to assessors and to the sector, as students’ backgrounds, subject fields, locations and career plans vary greatly and these data take account of those differences.

    The greater focus on improved datasets suggests that there will be less reliance on additional information, previously provided at some length, on how students’ outcomes are being supported. The proof of the pudding for how well students continue with, complete and progress from their studies is in the eating, or rather in the outcomes themselves, rather than the recipes. Outcomes criteria should be clearer in the next TEF in this sense, and more easily applied with consistency.

    Another proposed change focuses on how evidence might be more helpfully elicited from students and their representatives (proposal 10). In the last TEF students were invited to submit written evidence, and some student submissions were extremely useful to assessors, focusing on the key criteria and giving a rounded picture of local improvements and areas for development. For understandable reasons, though, students of some providers did not, or could not, make a submission; the huge variations in provider size means that in some contexts students do not have the capacity or opportunity to write up their collective experiences. This variation was challenging for assessors, and anything that can be done to level the playing field for students’ voices next time will be welcomed.

    Towards the data limits

    Perhaps the greatest challenge for TEF assessors in previous rounds arose when we were faced with a provider with very limited data. OfS’s proposal 9 sets out to address this by varying the assessment approach accordingly. Where there is no statistical confidence in a provider’s NSS data (or no NSS data at all), direct evidence of students’ experiences with that provider will be sought, and where there is insufficient statistical confidence in a provider’s student outcomes, no rating will be awarded for that aspect.

    The proposed new approach to the outcomes rating makes great sense – it is so important to avoid reaching for a rating which is not supported by clear evidence. The plan to fill any NSS gap with more direct evidence from students is also logical, although it could run into practical challenges. It will be useful to see suggestions from the sector about how this might be achieved within differing local contexts.

    Finally, how might assessment panels be affected by changes to what we are assessing, and the criteria for awarding ratings? First, both aspects will incorporate the requirements of OfS’s B conditions – general ongoing, fundamental conditions of registration. The student experience aspect will now be aligned with B1 (course content and delivery), B2 (resources, academic support and student engagement) and part of B4 (effective assessment). Similarly, the student outcomes B condition will be embedded into the outcomes aspect of the new TEF. This should make even clearer to assessors what is being assessed, where the baseline is and what sits above that line as excellent or outstanding.

    And this in turn should make agreeing upon ratings more straightforward. It was not always clear in the previous TEF round where the lines between Requires Improvement and even meeting basic requirements for the sector should be drawn. This applied only to the very small number of providers whose provision did not appear, to put it plainly, to be good enough.

    But more clarity in the next round about the connection between baseline requirements and the ratings above them should aid assessment processes. Clarification that in future a Bronze award signifies “meeting the minimum quality requirements” is also welcome. Although the sector will need time to adjust to this change, it is in line with the risk-based approach OfS wants to take to the quality system overall.

    The £25,000 question

    Underlying all of the questions being asked by providers now is a fundamental one: how will we do next time?

    Looking at the proposals with my assessor’s hat on, I can’t predict what will happen for individual providers, but it does seem that the evolved approach to awarding ratings should be more transparent and more consistent. Providers need to continue to understand their own education-related data, both quantitative and qualitative, and commit to a whole-institution approach to embedding improvements, working in close partnership with students.

    Assessment panels will continue to take their roles very seriously, to engage fully with agreed criteria, and to do everything they can to make a positive contribution to encouraging, recognising and rewarding teaching excellence in higher education.


  • TEF6: the incredible machine takes over quality assurance regulation

    TEF6: the incredible machine takes over quality assurance regulation

    If you loved the Teaching Excellence Framework, were thrilled by the outcomes (B3) thresholds, lost your mind for the Equality of Opportunity Risk Register, and delighted in the sporadic risk-based OfS investigations based on years-old data, you’ll find a lot to love in the latest set of Office for Students proposals on quality assurance.

    In today’s Consultation on the future approach to quality regulation you’ll find a cyclical, cohort-based TEF that also includes a measurement (against benchmarks) of compliance with the thresholds for student outcomes inscribed in the B3 condition. Based on the outcomes of this super-TEF, and prioritised on an assessment of risk, OfS will make interventions (including controls on recruitment and the conditions of degree awarding powers) and conduct targeted investigations. This is a first stage consultation only; stage two will come in August 2026.

    It’s not quite a grand unified theory: we don’t mix in the rest of the B conditions (covering less pressing matters like academic standards, the academic experience, student support, assessment) because, in the words of OfS:

    Such an approach would be likely to involve visits to all providers, to assess whether they meet all the relevant B conditions of registration

    The students who are struggling right now with the impacts of higher student/staff ratios and a lack of capacity due to over-recruitment will greatly appreciate this reduction in administrative burden.

    Where we left things

    When we last considered TEF we were expecting an exercise every four years, drawing on provider narrative submissions (which included a chunk on a provider’s own definition and measurement of educational gain), students’ union narrative submissions, and data on outcomes and student satisfaction. Providers were awarded a “medal” for each of student outcomes and student experience – a matrix determined whether this resulted in an overall Bronze, Silver, Gold or Requires Improvement.

    The first three of these awards were deemed to be above minimum standards (with slight differences between each), while the latter was a portal to the much more punitive world of regulation under group B (student experience) conditions of registration. Most of the good bits of this approach came from the genuinely superb Pearce Review of TEF conducted under section 26 of the Higher Education and Research Act, which fixed a lot of the statistical and process nonsense that had crept in under previous iterations and then-current plans (though not every recommendation was implemented).

    TEF awards were last made in 2023, with the next iteration – involving all registered providers plus anyone else who wanted to play along – due in 2027.

    Perma-TEF

    A return to a rolling TEF rather than a quadrennial quality enhancement jamboree means a pool of TEF assessors rather than a one-off panel. There will be steps taken to ensure that an appropriate group of academic and student assessors is selected to assess each cohort – there will be special efforts made to use those with experience of smaller, specialist, and college-based providers – and a tenure of two-to-three years is planned. OfS is also considering whether its staff can be included among the storied ranks of those empowered to facilitate ratings decisions.

    Likewise, we’ll need a more established appeals system. Open only to those with Bronze or Requires Improvement ratings (Gold and Silver are, in effect, passing grades), an appeal would be a way to forestall the engagement and investigations triggered by an active risk to student experience or outcomes, or by a risk of a future breach of a condition of registration.

    Each provider would be assessed once every three years – all providers taking part in the first cycle would be assessed in either 2027-28, 2028-29, or 2029-30 (which covers only undergraduate students because there’s no postgraduate NSS yet – OfS plans to develop one before 2030). In many cases they’ll only know which one at the start of the academic year in question, which will give them six months to get their submissions sorted.

    Because Bronze is now bad (rather than “good but not great” as it used to be) the first year’s cohort could well include all providers with a 2023 Bronze (or Requires Improvement) rating, plus some with increased risks of non-compliance, some with Bronze in one of the TEF aspects, and some without a rating.

    After this, how often you are assessed depends on your rating – if you are Gold overall it is five years till the next try, Silver means four years, and Bronze three (if you are “Requires Improvement” you probably have other concerns beyond the date of your next assessment) but this can be tweaked if OfS decides there is an increased risk to quality or for any other reason.

    Snakes and ladders

    Ignore the gradations and matrices in the Pearce Review – the plan now is that your lowest TEF aspect rating (remember you got sub-awards last time for student experience and student outcomes) will be your overall rating. So Silver for experience and Bronze for outcomes makes for an overall Bronze. As OfS has decided that you now have to pay (likely around £25,000) to enter what is a compulsory exercise this is a cost that could lead to a larger cost in future.

    In previous TEFs, the only negative consequence for those outside of the top ratings has been reputational – a loss of bragging rights of, arguably, negligible value. The new proposals align Bronze with the (B3) minimum required standards and put Requires Improvement below these: in the new calculus of value the minimum is not good enough and there will be consequences.

    We’ve already had some hints that a link to fee cap levels is back on the cards, but in the meantime OfS is pondering a cap on student numbers expansion to punish those who turn out Bronze or Requires Improvement. The workings of the expansion cap will be familiar to those who recall the old additional student numbers process – increases of more than five per cent (the old tolerance band, which is still a lot) would not be permitted for poorly rated providers.

    For providers without degree awarding powers it is unlikely they will be successful in applying for them with Bronze and below – but OfS is also thinking about restricting aspects of existing providers’ DAPs, for example limiting their ability to subcontract or franchise provision in future. This is another de facto numbers cap in many cases, and is all ahead of a future consultation on DAPs that could make for an even closer link with TEF.

    Proposals for progression

    Proposal 6 will simplify the existing B3 thresholds, and integrate the way they are assessed into the TEF process. In a nutshell, the progression requirement for B3 would disappear, with the assessment made purely on continuation and completion; providers would be able to submit contextual and historic information to explain why performance is not above the benchmark or threshold as part of the TEF process.

    Progression will still be considered at the higher levels of TEF, and here contextual information can play more of a part – with what I propose we start calling the Norland Clause allowing providers to submit details of courses that lead to jobs that ONS does not consider as professional or managerial. That existing indicator will be joined by another based on (Graduate Outcomes) graduate reflections on how they are using what they have learned, and benchmarked salaries three years after graduation from DfE’s Longitudinal Educational Outcomes (LEO) data – in deference to that random Kemi Badenoch IFS commission at the tail end of the last parliament.

    Again, there will be contextual benchmarks for these measures (and hopefully some hefty caveating on the use of LEO median salaries) – and, as is the pattern in this consultation, there are detailed proposals to follow.

    Marginal gains, marginal losses

    The “educational gains” experiment, pioneered in the last TEF, is over: making this the third time a regulator in England has tried and failed to include a measure of learning gain in some form of regulation. OfS is still happy for you to mention your educational gain work in your next narrative submission, but it isn’t compulsory. The reason: reducing burden, and a focus on comparability rather than a diversity of bespoke measures.

    Asking providers what something means in their context, rather than applying a one-size-fits-all measure of student success, was an immensely powerful component of the last exercise. Providers who started on that journey at considerable expense in data gathering and analysis may be less than pleased at this latest development – and we’d certainly understood that DfE were fans of the approach too.

    Similarly, the requirement for students to feed back on outcomes in their submissions to TEF has been removed. The ostensible reason is that students found it difficult last time round – the result is that insight from the valuable networks between existing students and their recently graduated peers is lost. The outcomes end of TEF is now very much data driven, with only the chance to explain unusual results offered. It’s a retreat from some of the contextual sense that crept in with the Pearce Review.

    Business as usual

    Even though TEF now feels like it is everywhere and for always, there’s still a place for OfS’s regular risk-based monitoring – and annex I (yes, there are that many annexes) contains a useful draft monitoring tool.

    Here it is very good to see rising staff:student ratios, falling entry requirements, a large growth in foundation year provision, and a rapid growth in numbers among what are noted as indicators of risk to the student experience. It is possible to imagine an excellent system, designed outside of the seemingly inviolate framework of the TEF, where events like this would trigger an investigation of provider governance and quality assurance processes.

    Alas, the main use of this monitoring is to decide whether or not to bring a TEF assessment forward – punting an immediate risk to students into a process that deals with it retrospectively. If I’m a student on a first year that has ballooned from 300 to 900 from one cycle to the next there is a lot of good a regulator can do by acting quickly – I am unlikely to care whether a Bronze or Silver award is made in a couple of years’ time.

    International principles

    One of the key recommendations of the Behan review on quality was a drawing together of the various disparate (and, yes, burdensome) streams of quality and standards assurance and enhancement into a unified whole. We obviously don’t quite get there – but there has been progress made towards another key sector bugbear that came up both in Behan and the Lords’ Industry and Regulators Committee review: adherence to international quality assurance standards (to facilitate international partnerships and, increasingly, recruitment).

    OfS will “work towards applying to join the European Quality Assurance Register for Higher Education” at the appropriate time – clearly feeling that the long overdue centring of the student voice in quality assurance (there will be an expanded role for and range of student assessors) and the incorporation of a cyclical element (to desk assessments at least) is enough to get them over the bar.

    It isn’t. Standard 2.1 of the ESG requires that “external quality assurance should address the effectiveness of the internal quality assurance processes” – philosophically establishing the key role of providers themselves in monitoring and upholding the quality of their own provision, with the external assurance process primarily assessing whether (and how well) this has been done. For whatever reason OfS believes the state (in the form of the regulator) needs to be (and is capable of being!) responsible for all quality assurance, everywhere, all the time. It’s a glaring weakness of the OfS system that urgently needs to be addressed. And it hasn’t been, this time.

    The upshot is that while the new system looks ESG-ish, it is unlikely to be judged to be in full compliance.

    Single word judgements

    The recent use of single headline judgements of educational quality in ways that have far-reaching regulatory implications is hugely problematic. The government announced the abandonment of the old “requires improvement, inadequate, good, and outstanding” judgements for schools in favour of a more nuanced “report card approach” – driven in part by the death by suicide of headteacher Ruth Perry in 2023. The “inadequate” rating given to her Caversham Primary School would have meant forced academisation and deeper regulatory oversight.

    Regulation and quality assurance in education needs to be rigorous and reliable – it also needs to be context-aware and focused on improvement rather than retribution. Giving single headline grades cute, Olympics-inspired names doesn’t really cut it – and as we approach the fifth redesign of an exercise that has only run six times since 2016 you would perhaps think that rather harder questions need to be asked about the value (and cost!) of this undertaking.

    If we want to assess and control the risks of modular provision, transnational education, rapid expansion, and a growing number of innovations in delivery we need providers as active partners in the process. If we want to let universities try new things we need to start from a position that we can trust universities to have a focus on the quality of the student experience that is robust and transparent. We are reaching the limits of the current approach. Bad actors will continue to get away with poor quality provision – students won’t see timely regulatory action to prevent this – and eventually someone is going to get hurt.


  • What Ofsted inspections reveal about university leadership and culture

    What Ofsted inspections reveal about university leadership and culture

    The arrival of Ofsted inspections of degree apprenticeships in higher education was never going to be smooth. But what’s become clear is just how underprepared some universities were for the emotional and organisational demands that these inspections bring.

    As part of my doctoral research, I conducted a qualitative study, based on 20 semi-structured interviews with academic and professional services (PS) staff from 19 English universities. What I found reveals more than just overstretched teams or complaints about workload. It tells a story of institutional neglect within a sector whose rhetoric embraces apprenticeships while quietly sidelining the staff delivering this provision.

    As government policy surrounding apprenticeships, flexible/modular provision, and the growth and skills levy starts to become clearer, the findings act as a warning shot. The issues higher education staff face during Ofsted inspections reflect deeper structural and cultural problems – ones that won’t be solved with another “you’ve got this!” email from the vice chancellor’s office.

    A marginalised provision

    Apprenticeships have always had an awkward status in HE. They’re professionally significant and they can attract noteworthy employer relationships, but they remain institutionally peripheral. As one participant put it, “we’ve never been invited to a senior leader’s meeting to talk about apprenticeships.”

    Almost all academic participants described their apprenticeship work as invisible in workload models and poorly understood by senior leaders. One participant reported that they get “50 hours a year to look after apprenticeships, even though I would consider it to be my full-time role.” Another simply said, “we feel like the poor relation.” PS staff described the work during the Ofsted inspection creating “a permanent status of panic” and detailed 12-hour working days that ran through weekends until they were “running on fumes”. One cancelled a long-planned family holiday. Others reported stress-related illness, insomnia, extended sick leave, and the need for medication.

    The most striking point during many of the interviews wasn’t just the volume of work to support apprenticeship delivery or the Ofsted inspection – it was the sense that senior leaders within their institution didn’t acknowledge it or even care.

    Inspections as emotional events

    There are multiple other accountability mechanisms within HE: the Teaching Excellence Framework, the Office for Students’ conditions of registration, the Quality Assurance Agency, the Department for Education apprenticeship accountability framework, and professional accreditation processes. This results in a complex and multi-agency system of regulation and scrutiny. However, among participants, Ofsted inspections weren’t experienced as just another audit or review. They were felt as emotional, personal, a question of professional competence, and in many cases traumatic.

    The anticipation alone triggered stress symptoms and anxiety. One PS participant said:

    Before the inspection started, I was terrified because I was going to be representing my university. What if I get it wrong? I kept feeling sick.

    Another participant feared that the inspection outcome, if unsuccessful, could undermine years of hard work and this loss of control and emotional volatility left them feeling depleted and unwilling to experience an Ofsted inspection again:

    I cannot be here in five years’ time. I’m not going through that again. I had some stress symptoms which didn’t let up for six to eight months.

    Teaching staff viewed the inspection as a test of professional credibility and the emotional toll was compounded by the expectation to present calm professionalism: “I spent time telling everyone to be careful and not let your guard down” while managing their own fears and “the impending pit of doom” and those of their colleagues. Another said: “I was really worried about my colleague being pulled into an observation with an inspector. Her practice is wonderful, but she would have fallen apart. I wanted to protect her wellbeing.”

    The need to “perform professionalism” while internally unravelling created a specific kind of emotional labour which was often invisible to those in leadership roles. It was obvious that participants weren’t just preparing evidence: they were absorbing institutional risk. In doing so, they became the shock absorbers for their university’s unpreparedness.

    The problem isn’t Ofsted, it’s us

    One might assume the findings are a critique of Ofsted. In fact, most participants described the inspectors as “courteous”, “professional”, “kind”, “amazing” or “approachable”. The frustration wasn’t aimed at the inspectors; it was aimed at the system.

    One problem was the mismatch between Ofsted’s frameworks and the reality of delivering apprenticeships in higher education. Teaching staff spoke of “squeezing your programme, pedagogy, everything into an arbitrary box” that didn’t reflect their practice. Others questioned why Ofsted couldn’t operate more like consultants, “sharing best practice and providing exemplars” rather than simply evaluating.

    While almost all participants described inspectors as courteous and supportive, they also expressed concerns about the disempowering effects of inspection dynamics. One noted:

    The power dynamic is… ‘If we don’t think you’re good enough, we’re going to close you down’. There are other regulatory bodies that don’t have the ability to put people out of jobs. It’s crazy.

    That perception of existential risk was heightened because many institutions appeared to have no clear inspection plan. No training. No joined-up strategy. “We only got Ofsted training two days before the inspection,” said one participant. Others had to “design and deliver” their own training from scratch “without any support” from their leadership, which meant it was difficult to get people to engage with it.

    Teaching staff shared their views that traditional academic CPD (such as research outputs and pedagogic innovation) continues to be prioritised over compliance-linked work like Ofsted inspections, despite the institutional reputational risks:

    If any of us wanted to go off to London to present a research paper, we would have accommodation paid for us, we’d be able to go to that conference, no problem. But if we ask for £150 worth of CPD on how to improve apprenticeship delivery it wouldn’t be allowed. It’s not a business priority.

    Not malicious, just indifferent

    Overall, my research tells a story about institutional neglect. Unlike toxic leadership or micro-management, this form of harm is quieter. It’s not what leaders do; it’s what they fail to do. It’s the absence of engagement and the unwillingness to fund training. Most importantly, it’s the lack of psychological safety during a high-pressured event like an Ofsted inspection. As one participant said, “when the Ofsted inspectors came in, it was really hard to listen to senior leaders talking about how much they support staff… the reality is very different.”

    This isn’t about bad management, it’s about structural marginalisation. Apprenticeship provision was described as falling outside the strategic priorities of some institutions and their senior leaders were perceived as having “no awareness, no understanding” and that they “don’t particularly care about apprenticeships”. Research, undergraduate teaching, and the TEF occupied the centre of institutional gravity. Apprenticeships did not.

    Some participants said they almost wished for a “requires improvement” judgement just to get leadership to take them seriously. One observed:

    I had hoped that we would get ‘requires improvement’ because it would have made senior leadership pay attention to the changes we need to make. Senior staff have this sense of complacency as if the ‘good’ rating shows that we’re fine.

    The government is watching

    With this government promising a reshaping of apprenticeships and skills, and the growth and skills levy pushing modular/skills learning into new territory, the pressures experienced in apprenticeship provision in HE are likely to spread. Inspection and regulation in this space aren’t going away. Nor should they. But my findings suggest the real threat to quality and staff wellbeing is not external scrutiny, it’s internal culture.

    The risks here are reputational and ethical. Strategic responsibility for inspection readiness and staff wellbeing needs to sit at the top table, not with the most overworked and marginalised staff in the room. Here are five things that universities should do, right now:

    Stop marginalising apprenticeship teams. If universities are serious about their current apprenticeship provision and the imminent skills/flexible learning opportunities coming our way, the teams supporting these activities must be embedded into institutional strategy, not treated as marginalised, compliance-heavy provision.

    Build inspection readiness into annual planning, not panic-mode two days before the inspection starts.

    Invest in meaningful CPD for apprenticeships, including training on inspection frameworks, evidence expectations, managing emotional load during inspection periods, and conference attendance for the skills and apprenticeships agenda.

    Create psychological safety. No one should feel personally responsible for the entire institution’s regulatory fate.

    Use governance structures to ask hard questions. Boards and Senates should demand answers: how are we resourcing our skills and apprenticeship provision? What preparations do we have in place for the new skills/modular provision that will inevitably be inspected? Does leadership in schools/faculties understand their skills and apprenticeships provision fully? Do all colleagues get equal access to relevant CPD to do their job effectively?

    Ofsted didn’t bring stress into higher education; it just exposed a stretched system and the fragility of institutional operations and governance which relies on invisible labour.

    With the introduction of the growth and skills levy and a significant shift toward modular and flexible provision, the emotional and operational burdens seen in apprenticeship delivery and Ofsted inspections risk being replicated at scale unless universities adapt. When senior leaders are thinking about the structures and metrics for expanding into new opportunities such as modular/skills provision, they also need to carefully consider culture, responsibility, support, and compassionate leadership.

    If they replicate the same dynamics – underfunded, misunderstood, marginalised, and shouldered by isolated staff – universities risk institutionalising burnout and anxiety as conditions of participation in apprenticeships and skills.


  • Reputation versus sunlight – universities and the new Duty of Candour

    Reputation versus sunlight – universities and the new Duty of Candour

    The idea of a “Hillsborough Law” has been in circulation for years.

    Campaigners – led by families of those who died at Hillsborough Stadium in 1989, and joined more recently by those bereaved by Grenfell, Covid, and the death of headteacher Ruth Perry – have long argued that public authorities must be placed under a clear, statutory duty to tell the truth.

    Manchester Mayor (and emergent Labour leadership hopeful) Andy Burnham first introduced a Private Members’ Bill in 2017, but it fell with the general election.

    Labour then adopted the idea as policy in 2022, and after years of pressure – including a personal promise from Keir Starmer in the run-up to the 2024 election – the King’s Speech in July 2024 confirmed it would be brought forward.

    A year later, ministers missed the April anniversary deadline – triggering frustration from campaigners and months of rumour about officials attempting to water down the Bill – before finally introducing the Bill to Parliament, now under the stewardship of new Justice Secretary David Lammy.

    To campaigners’ relief, this is not just symbolic legislation – it’s about correcting a deep structural imbalance, and very much connects to what little there is in Starmer’s vision – the idea and ideals of public service and a public realm “on the side of truth and justice”.

    For decades, bereaved families navigating inquests have faced publicly funded barristers representing the police, the NHS, local councils, or universities – while they themselves have been forced to crowdfund. They have seen evidence lost, withheld, or destroyed, and have encountered institutions that default to defensive strategies – preferring to protect their reputation than face accountability.

    The Public Office (Accountability) Bill (along with its explanatory notes and multiple impact assessments) – colloquially known as the Hillsborough Law – attempts to change that dynamic. It is about “candour”, legal aid, and cultural reform. And although the national debate has focused on disasters and policing, the legislation will very much apply to universities.

    What the Bill does

    At its core, the Bill does two things. First, it imposes a statutory duty of candour on public authorities and officials. That means a proactive obligation to be frank, open, and transparent when dealing with inquiries, investigations, and inquests. In some cases, it criminalises obstruction, dishonesty, and selective disclosure.

    Second, it guarantees non-means-tested legal aid for bereaved families involved in inquests and inquiries where public authorities are represented. That ends the unjust asymmetry of families crowdfunding – while the state and its arms fund lawyers to defend themselves.

    Alongside this, the Bill codifies a replacement for the common law offence of misconduct in public office, creates new statutory misconduct offences, and requires public authorities to adopt and publish their own codes of ethical conduct embedding candour and the Nolan principles.

    The schedules name government departments, police forces, NHS bodies, schools, and further education corporations. But it also applies to any body carrying out “functions of a public nature” – a familiar phrase from the Human Rights Act and the Freedom of Information Act. Universities are covered.

    Pre-1992 universities were founded by Royal Charter or statute, and their governing bodies often include members approved by ministers or the Crown. Post-1992 universities are higher education corporations created by the 1992 Act. They fit easily within the test. Whether private providers – where they are registered with the Office for Students (OfS) and teach (quasi-)publicly funded students – will be caught under the “functions of a public nature” clause is less certain.

    For universities and their staff, this ought to be a profound change to the way they respond to tragedy, handle complaints, and manage their obligations to students and the public.

    Candour in inquiries and inquests

    In Part 2, Chapter 1, the Bill sets out the statutory duty of candour in relation to formal, statutory inquiries, investigations, and inquests.

    The duty is not passive – it requires public authorities to notify an inquiry if they hold relevant material, preserve records, provide assistance, and correct errors or omissions. Institutions can’t wait until a chair or coroner demands disclosure – they have to surface relevant material themselves.

    A new mechanism – a compliance direction – then strengthens the framework. Chairs of inquiries and coroners can issue formal directions requiring disclosure, written statements, clarifications, or corrections. These are binding. If an authority, or the official responsible for compliance, ignores, delays, or obstructs such directions, it becomes a criminal offence if done deliberately or recklessly.

    For universities, the most direct application will likely be to coroners’ inquests into student deaths. If, for example, a university was aware that it held key documents about a student’s support plan, assessment records, or internal communications, the duty would compel it to notify the coroner and disclose them proactively. The current norm – where families must ask precise questions and often guess at what exists – would be replaced by a statutory expectation of candour.

    If, as another example, a coroner designated a university as an interested person, a compliance direction could require a formal position statement explaining its role, structured disclosure of documents, and timely corrections if errors emerged. Senior officers will be personally responsible for compliance.

    And if relevant staff had first-hand knowledge of a critical incident – say, supervising an assessment where a student’s distress became acute – they could not quietly stay in the background. The university would be under pressure to identify and disclose their evidence candidly.

    The Bill also extends legal aid. Families would be guaranteed representation in any inquest where a public authority is an interested person. That means if, for example, a university and an NHS trust were both in scope, the family would not have to crowdfund tens of thousands of pounds to achieve parity of arms.

    At present, coroners have wide powers, but families often lack the leverage to ensure they are exercised fully. Coroners have to answer the four statutory questions – who, where, when, how – and they often interpret “how” narrowly. Families often push for broader scope, but institutions can resist. A statutory duty of candour would not change the coroner’s legal remit, but it should alter the behaviour of institutions within that remit. Selective disclosure, defensive positioning, and late document dumps would become high-risk strategies.

    It’s also notable that the Bill places the duty personally on those in charge of public authorities. In the university context, that means senior leadership cannot outsource disclosure entirely to lawyers or middle managers. Accountability flows up to the governing body and vice chancellor.

    And coroners’ Prevention of Future Deaths reports (PFDs) matter too. With fuller disclosure under candour, coroners are more likely to identify systemic failings in universities and recommend changes. While coroners cannot assign civil liability, their reports can shape policy and practice across the sector.

    Crucially, the Bill specifically recognises the problem of “information asymmetry.” Families can’t know what to ask for if they do not know what exists. By flipping the responsibility – making universities proactively disclose rather than forcing families to drag material into the open – the duty addresses that asymmetry head-on.

    The scope of this bit of the Bill is wide, but not limitless. It clearly applies to coroners’ inquests, Fatal Accident Inquiries in Scotland, and statutory public inquiries under the 2005 Act. It also extends to non-statutory inquiries set up by ministers, and there is a power for the Secretary of State (or devolved governments) to designate other investigations by regulation.

    But it does not automatically capture every process that universities are familiar with – complaints investigated by the OIA in England and Wales, regulatory investigations by OfS, Medr or the SFC, professional regulator fitness to practise panels, or independent reviews commissioned internally are all outside its scope as drafted.

    In those arenas, candour would only bite through the separate Chapter 2 duty to adopt and apply an ethical code (see below), rather than through the compliance-direction machinery of Chapter 1. But for those types of inquiry and investigation explicitly covered, it means candour is no longer optional or reputational – it is statutory, enforceable, and personal.

    Candour in day-to-day conduct

    If Part 2, Chapter 1 is about how institutions behave in high-profile inquiries, Chapter 2 is about how they behave every day. The Bill as drafted would require every public authority to adopt and publish a code of ethical conduct. In that code, universities will be required to:

    • articulate the Nolan principles (selflessness, integrity, objectivity, accountability, openness, honesty, leadership);
    • define a duty of candour for the authority’s context;
    • explain consequences for breaches, including disciplinary and professional sanctions;
    • set out whistleblowing and complaint routes for staff and the public;
    • ensure the code is public, regularly reviewed, and supported by training.

    For universities, this will mean embedding candour into teaching, research, administration, and student support.

    There are all sorts of potential implications. Consider complaints handling – at present, plenty of universities instruct lawyers at an early stage to assess litigation risk. For complainants, that shifts the emphasis to protecting the institution rather than resolving the complaint candidly. A student might receive partial explanations, documents only when pressed, or carefully worded responses that obscure institutional failings.

    If the idea is that the Code required under Chapter 2 incorporates and translates the principles reflected in Chapter 1, that approach to complaints would be unacceptable. The code should require:

    • proactive disclosure of relevant information during a complaint;
    • corrections when errors are identified;
    • clear explanations of decisions, not just outcomes;
    • openness even where disclosure is uncomfortable;
    • and recognition that a failure to act candidly could itself be misconduct, separate from the original complaint.

    For staff, the implications are significant. An academic accused of discrimination could no longer rely on the institution minimising disclosure to reduce liability. If records show concerns were raised earlier, candour might require acknowledging that, not burying it. Someone processing appeals could not quietly omit inconvenient information from a report.

    There are staff-side concerns too. The NHS experience shows that frontline workers often feel candour exposes them personally, while leadership remains insulated. In universities, staff already operate under high pressure – REF, TEF, student satisfaction surveys, and reputational risk all loom large.

    A candour duty could feel like additional personal exposure – unless universities design their codes carefully, the burden may fall disproportionately on individual staff rather than leadership.

    And the implications extend beyond complaints. In admissions, candour could mean being frank with applicants about course viability or resource constraints. In research, it could mean full disclosure of conflicts of interest. In governance, it could mean sharing risk assessments with staff and students rather than keeping them confidential.

    The duty also requires universities to build internal systems – staff will have to be trained to understand candour, managers will be required to reinforce it, and whistleblowing protections will have to be clear. And codes will need to specify sanctions for breaches – shifting candour from an abstract principle to a live HR and governance issue. If the Higher Education (Freedom of Speech) Act offers staff protection for saying things out loud, at least in theory the Public Office (Accountability) Bill will require universities to require staff to say (some) things out loud.

    Legal context

    There are still limits. The Bill is explicit that candour doesn’t override other legal restrictions – data protection, privilege, and statutory exemptions still apply. A university can’t disclose student medical records without consent, nor breach confidentiality agreements lawfully in place. But the default flips – the presumption is disclosure unless legally barred, not concealment unless forced.

    That will all interact directly with stuff like Equality Act duties and consumer protection law. Universities might resist making admissions in complaints because acknowledging discrimination or misleading marketing creates liability. Under Chapter 2, the risk is reversed – concealing those admissions would itself be a statutory breach. The Digital Markets, Competition and Consumers Act 2024 and CMA guidance already push towards transparency in student marketing. A candour duty would add a new, statutory dimension.

    In practical terms, universities will need to rewrite policies, retrain staff, and rethink how they interact with students. Complaints offices, HR teams, and legal advisers will all have to internalise the new default of candour. The reputational instinct to minimise admissions of fault will be directly challenged by statutory obligation.

    In theory, as liability risk increases, so should trust. Universities are often criticised for opacity, defensiveness, and spin – a statutory candour duty offers a chance to change that culture. Students making complaints would be entitled not just to process fairness but to institutional honesty, and staff accused of misconduct would know that concealment or minimisation would itself be a breach. Governing bodies would have to lead by example, publishing codes and demonstrating compliance.

    Regulators and adjudicators

    Of course if candour becomes law, regulators and adjudicators will need to respond. As it stands, no specific regulator is identified for monitoring compliance with the “devolved” duty under Chapter 2 – that may get added as the Bill progresses, but even if it doesn’t, the interactions with other areas of regulation make change advisable.

    In England and Wales, the Office of the Independent Adjudicator (OIA) already reviews individual complaints and publishes a Good Practice Framework. It emphasises fairness, transparency, and clarity, but not candour as a statutory duty per se.

    Once Chapter 2 is in force, the OIA would likely need to update its framework to reference candour explicitly. It would then be able to hold universities to account not just against good practice, but against a legal standard – did the university act candidly in its handling of this complaint?

    The Office for Students (OfS) then has wider systemic oversight. The regulatory framework includes Condition E2 on management and governance, and requires compliance with Public Interest Governance Principles. These currently cover accountability and academic freedom – but not candour. If universities are under a statutory candour duty, OfS will almost certainly need to amend the PIGPs or issue guidance to reflect it.

    How this all sits with other existing regimes like the Freedom of Information Act (FOIA) will be another big question. FOIA already imposes transparency duties, but universities often take a restrictive approach, especially private providers not designated as public authorities under FOIA. The candour duty would run in parallel – requiring disclosure in complaints and inquests even where FOIA might not apply.

    Other sections of the Bill

    While most attention has focused on the duty of candour and the reforms to inquests, the Bill also contains other important provisions that will reshape the accountability of public authorities.

    Part 1 of the Bill tackles the long-running debate around misconduct in public office. The common law offence – dating back centuries – has long been criticised as vague, inconsistently applied, and overly reliant on judicial interpretation.

    The Bill abolishes the common law offence and replaces it with a new statutory framework, creating clear offences for serious misconduct by public officials, defining more precisely what counts as abuse of position or wilful neglect of duty. For universities, where senior leaders or governors are increasingly seen as “public officials” when exercising functions of a public nature, this should provide sharper statutory clarity on when misconduct could cross from an HR or governance issue into criminal liability.

    The Bill also addresses investigations and inquiries more broadly. It enhances powers for inquiry chairs and coroners not just to compel evidence, but to ensure compliance is timely and truthful. The creation of compliance directions backed by criminal sanction sits here, but the wider context is about rebalancing relationships.

    Families and victims have long argued that inquiries too often become adversarial battles against obfuscating institutions. As the Bill shifts legal duties onto the institutions themselves, it tries to realign incentives so truth-seeking, not reputation-protection, dominates. And Part 2 expects those principles to be reflected inside universities too.

    Another significant element is the reform of legal aid at inquests. For the first time, non-means-tested legal aid will be automatically available for bereaved families whenever a public authority is represented at an inquest. This is not just a financial change – it’s another attempt to end the asymmetry that has often characterised high-profile inquests. For universities, it should mean that whenever they are an interested person, families will now face them on an equal legal footing.

    The Bill also contains provisions on whistleblowing and reporting duties – where staff often feel trapped between loyalty to the institution and responsibility to students or the public. Public authorities will have to create clear internal mechanisms to support those who raise concerns, and codes of conduct will have to integrate protections and processes for staff who disclose wrongdoing.

    Taken together, these other sections of the Bill flesh out the candour framework, create sharper criminal liability for misconduct, and give families, students, staff, and the public stronger levers for truth and accountability.

    Territorial application

    The Bill extends to England and Wales, with many provisions applying directly to public authorities operating there. Scotland and Northern Ireland have their own legal systems and inquest regimes, so the Bill’s application is more limited. But universities across the UK will need to pay attention.

    In Scotland, there is no coroner system, but Fatal Accident Inquiries serve a similar role. While the Bill itself does not apply wholesale, the Scottish Government and the Scottish Funding Council are likely to face pressure to adopt parallel reforms – particularly on candour and legal aid – to avoid a two-tier approach for bereaved families.

    In Wales, higher education is now regulated under the Tertiary Education and Research (Wales) Act 2022, with the new Commission for Tertiary Education and Research (CTER) taking over regulatory functions. Although the Bill applies to Wales, CTER will need to consider how candour duties interact with its quality and governance oversight.

    And in Northern Ireland, inquests operate differently again, and universities there are few in number. The territorial extent of the Bill is narrower, but questions will inevitably arise about parity of rights for families and students.

    For providers operating across borders – particularly cross-UK institutions or partnerships – the patchwork will be complex. Consistency will matter, and regulators in devolved nations might usefully align their governance principles and duties to ensure students and families are not disadvantaged by geography.

    Culture change

    Of course, policy is one thing – culture is another. The NHS has had its own statutory duty of candour for a decade, requiring openness with patients when things go wrong. But implementation has been patchy – studies and reviews have found variability, defensiveness, and resistance. In practice, candour clearly depends not just on statutory text but on leadership, training, and incentives.

    The same will be true in higher education. Universities are complex, professionalised, and reputationally sensitive – candour is simply not their default culture. Embedding it will require governing bodies and senior staff to model openness, leaders to embrace uncomfortable truths, and lawyers to reframe their advice.

    The risk is that candour becomes yet another procedural box-tick – a paragraph in a code, a slide in induction training – while the real behaviours remain defensive. The opportunity is for universities to embrace candour as a chance to rebuild trust with students, staff, and the public.

    A particularly thorny question is how the Bill will apply to the growing number of private higher education providers. A brief glance at WhatDoTheyKnow suggests that they routinely refuse Freedom of Information requests on the basis that they are not designated as public authorities under FOIA, despite (in England) often being registered with the Office for Students and enrolling thousands of publicly funded students.

    On the face of the Bill, they would only fall within scope of the candour duty where they are performing “functions of a public nature” – a phrase that has generated years of litigation under the Human Rights Act and remains contestable.

    That creates a risk of a two-tier candour regime in higher education – so one way to resolve it would be for OfS to hardwire candour into its Public Interest Governance Principles, explicitly requiring all registered providers – public and private – to adopt candour codes and to respond to FOI requests as a condition of registration (especially if registration does eventually end up covering franchised-to providers not on the OfS register).

    That would extend the protections in practice, ensuring that students and families do not see their access to information and honesty diluted simply because their provider is incorporated as a private company. Similar steps could be taken by the Scottish Funding Council and Medr in Wales, embedding candour and transparency as regulatory expectations across the UK.

    Oh – and the position of partners and contractors is also significant, and may need exploration as the Bill progresses. Under Chapter 1, some may be caught directly where they are exercising functions of a public nature or hold relevant health and safety responsibilities – for example, halls providers, outsourced counselling services, or teaching partners.

    And even where they are not formally within scope, the spirit of the Bill makes clear that universities cannot sidestep candour by outsourcing – they will effectively be expected to build equivalent obligations into contracts, ensuring that candour duties flow through to partners so that evidence and disclosure gaps do not open up when multiple organisations are involved.

    A different kind of leadership

    The coverage might not point directly at universities – but the Hillsborough Law is not just about disasters, policing, or health. It is about the way the state – and those who exercise public functions – treat people when things go wrong.

    For universities, inquests into student deaths should be different – candour will be mandatory, legal aid automatic, and compliance enforceable. Day-to-day complaints handling should be reshaped – defensive, lawyer-led strategies will sit uneasily alongside statutory candour codes. Regulators and adjudicators should respond, updating frameworks and guidance.

    But as I say, just as the OIA’s “Bias and the perception of bias” expectations haven’t automatically made complaints handling any less… biased, legislation of this sort alone will not fix culture. The challenge for leaders will be to embed candour not just in codes and conditions, but in the behaviours of academics, professional services staff, their partners, and themselves.

    In an ideal world, universities would embrace transparency organically, driven by their educational mission rather than legal compulsion. The best learning happens when trust and openness prevail, not when compliance regimes loom.

    But not only have academic careers forever been about reputation, but universities have also evolved into large, corporatised institutions with competing pressures – league tables, reputational risk, financial sustainability. In this environment, in the teeth of a crisis or complaint, the truth is that abstract appeals to academic values often lose out to immediate institutional interests.

    Rather than hoping for cultural transformation, the Hillsborough Law reshapes incentives. When concealment becomes legally riskier than disclosure, and when defensive strategies carry criminal liability, candour becomes not just morally right but institutionally smart.

    For students, families, and staff facing institutional defensiveness at vulnerable moments, legal leverage may be the only way to level the playing field. Too many public authorities have failed to redefine reputation to mean trustworthiness rather than unblemished image – now the law will redefine it for them.

    That will mean shifting from reputation management to truth telling, from legal defensiveness to openness, and from institutional self-interest to public accountability. In a sector so dominated by the powerful incentives of reputation, that will be no simple task – but it will be a vital one.
