It’s been a year since publication of the Behan review and six months since OfS promised to “transform” their approach to quality assessment in response. But it’s still far from clear what this looks like, or if the change is what the sector really needs.
In proposals for a new strategy published back in December, OfS suggested refocusing regulatory activity on three strategic priorities: quality, the wider student experience and financial resilience. But while much of the mooted activity within the experience and resilience themes felt familiar, when it came to quality, more radical change was clearly on the agenda.
The plans are heavily influenced by the findings of last summer’s independent review (the Behan review). This critiqued what it saw as minimal interaction between assessment relating to baseline compliance and excellence, and recommended bringing these strands together to focus on general improvement of quality throughout the sector. In response OfS pledged to “transform” quality assessment, retaining TEF at the core of an integrated approach and developing more routine and widespread activity.
Unfortunately, these bare-bones proposals raised more questions about the new integrated approach than they answered, and while OfS’ recent blog update was a welcome attempt to deliver more timely and transparent information to providers, it disappointed on detail. OfS have been discussing key issues such as the extent of integration, scope for a new TEF framework, and methods of assessment. But while a full set of proposals will be out for consultation in the autumn, in the meantime there’s little to learn other than to expect a very different TEF, which will probably operate on a rolling cycle (assessing all institutions over a four to five year period).
The inability to cement preparations for the next TEF will cause some frustration for providers. However, if, as the tone of communications suggests, OfS is aiming for more disruptive integration rather than simply an expansion of TEF, the proposals may present some bigger concerns for the sector.
A fundamental concern is whether an integrated approach aimed at driving overall improvement is the most effective way to tackle the sector’s current challenges around quality. Behan’s review warns against an overemphasis on baseline regulation, but below-standard provision from a significant minority of providers is where the most acute risks to students, taxpayers and sector reputation lie (as opposed to failure to improve quality for the majority performing above the baseline). Regulation should, of course, support improvement across the board too.
However, it’s not clear how shifting focus away from the former, let alone moving it within a framework designed to assess excellence periodically, will usefully help OfS tackle stubborn pockets of poor provision and emerging threats within a dynamic sector.
There is also an obvious tension inherent in any attempt to bring baseline regulation within a rolling cycle, which becomes manifest as soon as OfS find serious concerns about provider quality mid-cycle. Here we should expect OfS to intervene with investigation and enforcement where appropriate to protect the student and wider stakeholder interest. But doing so would essentially involve regulating on minimum standards on top of a system that’s aiming to do that already as part of an integrated approach. Moreover, if whistleblowing and the lead indicators which OfS seem keen to develop to alert them to issues operate effectively, and if OfS start looking seriously at franchise and potentially TNE provision, it’s easy to imagine this duplication becoming widespread.
There is also the issue of burden, for both regulator and providers, which should be recognised within any significant shift in approach. For OfS there’s a question of the extent to which developing and delivering an integrated approach is hindering ongoing quality assessment. Meanwhile, getting to grips with new regulatory processes, and aligning internal approaches to quality assurance and reporting, will inevitably absorb significant provider resource. At a time when pressures are profound, this is likely to be particularly unwelcome and could detract significantly from the focus on delivery and students. Ironically, it’s hard to see how transformative change would not hamper the across-the-board improvements in quality that Behan advocates, and prove somewhat counter-productive to the pursuit of OfS’ other strategic goals.
It’s crucial that OfS take time to consider how best to progress with any revised approach, and sector consultation throughout the process is welcome. Nevertheless, development appears to be progressing slowly, somewhat at odds with OfS’ positioning as an agile and confident regulator operating in a dynamic landscape. Maybe this should tell us something about the difficulties inherent in developing an integrated approach.
There’s much to admire about the Behan review and OfS’ responsiveness to the recommendations is laudable. But while Behan looks to the longer term, I’m not convinced that in the current climate there’s much wrong with the idea of maintaining the incumbent framework.
Let’s not forget that this was established by OfS only three years ago following significant development and consultation to ensure a judicious approach.
I wonder if the real problem here is that, in contrast to a generally well-received TEF (and as Behan highlights), OfS’ work on baseline quality regulation simply hasn’t progressed with the speed, clarity and bite that were anticipated and needed to drive positive change above the minimum. And I wonder if a better solution to pressing quality concerns would be for OfS to concentrate resources on improving the operation of the current framework. There certainly seems to be room to deliver more, and more responsive, more transparent and more impactful, baseline investigations without radical change. At the same time, the feat of maintaining a successful and much expanded TEF seems much more achievable without bringing a significant amount of assurance activity within its scope.
We may yet see a less intrusive approach to integration proposed by OfS. I think this could be a better way forward – less burdensome and more suited to the sector’s current challenges. As the regulator reflects on their approach over the summer, with a new chair at the helm who’s closer to the provider perspective and more distanced from the independent review, perhaps this is the one they will lean towards.

A decade since his passing, David Watson’s work remains a touchstone of UK higher education analysis.
This reflects the depth and acuity of his analysis, but also his ability as a phrasemaker.
One of his phrases that has stood the test of time is the “quality wars” – his label for the convulsions in UK higher education in the 1990s and early 2000s over the assurance of academic quality and standards.
Watson coined this phrase in 2006, shortly after the 2001 settlement that brought the quality wars to an end. A peace that lasted, with a few small border skirmishes, until HEFCE’s launch of its review of quality assessment in 2015.
I wasn’t there, but someone who was has described to me a meeting at that time involving heads of university administration and HEFCE’s chief executive. As told to me, at one point a registrar of a large and successful university effectively called out HEFCE’s moves on quality assessment, urging HEFCE not to reopen the quality wars. I’ve no idea if the phrase Pandora’s box was used, but it would fit the tenor of the exchange as it was relayed to me.
Of course this warning was ignored. And of course (as is usually the case) the registrar was right. The peace was broken, and the quality wars returned to England.
The staging posts of the revived conflict are clear.
HEFCE’s Revised operating model for quality assessment was introduced in 2016. OfS was established two years later, leading to the B conditions mark I, followed later the same year by a wholesale re-write of the UK quality code that was reportedly largely prompted and/or driven by OfS. Only for OfS to decide by 2020 that it wasn’t content with this, repudiating the UK quality code and implementing from 2022 the B conditions mark II (new, improved; well, maybe not the latter, but definitely longer).
And a second front in the quality wars opened up in 2016, with the birth of the Teaching Excellence Framework (TEF). Not quite quality assessment in the by then traditional UK sense, but still driven by a desire to sort the sheep from the goats – identifying both the pinnacles of excellence and depths of… well, that was never entirely clear. And as with quality assessment, TEF was a very moveable feast.
There were three iterations of Old TEF between 2016 and 2018. There was repeated insistence that subject-level TEF was a done deal, leading to huge amounts of time and effort on preparations in universities between 2017 and early 2020, only for subject-level TEF to be scrapped in 2021. At which point New TEF emerged from the ashes, embraced by the sector with an enthusiasm that was perhaps to be expected – particularly after the ravages of the Covid pandemic.
And through New TEF the two fronts allegedly became a united force. To quote OfS’s regulatory advice, the B conditions and New TEF formed part of an “overall approach” where “conditions of registration are designed to ensure a minimum level” and OfS sought “to incentivise providers to pursue excellence in their own chosen way … in a number of ways, including through the TEF”.
So in less than a decade English higher education experienced: three iterations of quality assessment; three versions of TEF (one ultimately not implemented, but still hugely disruptive to the sector); and a rationalisation of the links between the two that required a lot of imagination, and a leap of faith, to accept the claims being made.
Pandora’s box indeed.
No wonder that David Behan’s independent review of OfS recommended “that the OfS’s quality assessment methodologies and activity be brought together to form a more integrated assessment of quality.” Last week we had the first indications from OfS of how it will address this recommendation, and there are two obvious questions: can we see a new truce emerging in the quality wars; and given where we look likely to end up on this issue, was this round of the quality wars worth fighting?
Any assessment of where we are following the last decade of repeated and rapid change has to recognise that there have been some gains. The outcomes data used in TEF, particularly the approach to benchmarking at institutional and subject levels, is and always has been incredibly interesting and, if used wisely, useful data. The construction of a national assessment process leading to crude overall judgments just didn’t constitute wise use of the data.
And while many in the sector continue to express concern at the way such data was subsequently brought into the approach to national quality assessment by OfS, this has addressed the most significant lacuna of the pre-2016 approach to quality assurance. The ability to use this to identify specific areas and issues of potential concern for further, targeted investigation also addresses a problematic gap in previous approaches that were almost entirely focused on cyclical review of entire institutions.
It’s difficult though to conclude that these advances, important elements of which it appears will be maintained in the new quality assessment approach being developed by OfS, were worth the costs of the turbulence of the last 10 years.
What appears to be emerging from OfS’s development of a new integrated approach to quality assessment essentially feels like a move back towards central elements of the pre-2016 system, with regular cyclical reviews of all providers (with or without visits, to be decided) against a single reference point (albeit the B conditions rather than the UK Quality Code). Of course it’s implicit rather than explicit, but it feels like an acknowledgment that the baby was thrown out with the bathwater in 2016.
There are of course multiple reasons for this, but a crucial one has been the march away from the concept of co-regulation between regulators and higher education providers. This was a conscious and deliberate decision, and one that has always been slightly mystifying. As a sector we recognise and promote the concept of co-creation of academic provision by staff and students, while being able to maintain robust assessment of the latter by the former. The same can and should be true of providers and regulators in relation to quality assurance and assessment, and last week’s OfS blog gives some hope that OfS is belatedly moving in this direction.
It’s essential that they do.
Another of David Watson’s memorable phrases was “controlled reputational range”: the way in which the standing of UK higher education was maintained by a combination of internal and external approaches. It is clear from recent provider failures and the instances of unacceptable practices in relation to some franchised provision that this controlled reputational range is increasingly at risk. And while this is down to developments and events in England, it jeopardises this reputation for universities across the UK.
A large part of the responsibility for this must sit with OfS and its approach to date to regulating academic quality and standards. There have also been significant failings on the part of awarding bodies, both universities and private providers. The answer must therefore lie in partnership working between regulators and universities, moving closer to a co-regulatory approach based on a final critical element of UK higher education identified by Watson – its “collaborative gene”.
OfS’s blog post on its developing approach to quality assessments holds out hope of moves in this direction. And if this is followed through, perhaps we’re on the verge of a new settlement in the quality wars.

If you believe – as many do – that English higher education is among the best in the world, it can come as an unwelcome surprise to learn that in many ways it is not.
As a nation that likes to promote the idea that our universities are globally excellent, it feels very odd to realise that the rest of the world is doing things rather better when it comes to quality assurance.
And what’s particularly alarming about this is that the new state of the art is based on the systems and processes set up in England around two decades ago.
The main bone of contention between OfS and the rest of the quality assurance world – and the reason why England is coloured in yellow rather than green on the infamous EQAR map, and the reason why QAA had to demit from England’s statutory Designated Quality Body role – is that the European Standards and Guidelines (ESG) require a cyclical review of institutional quality processes that involves the opinions of students, while OfS wants things to be more risk-based and feels quality assurance is far too important to get actual students involved.
Harsh? Perhaps. In the design of its regulatory framework the OfS was aiming to reduce burden by focusing mainly on where there were clear issues with quality – with the enhancement end handled by the TEF and the student aspect handled by actual data on how students get on academically (the B3 measures of continuation, completion, and progression) and more generally (the National Student Survey). It has even been argued (unsuccessfully) in the past that, as TEF is kind of cyclical if you squint a bit and does sort of involve students, England is in fact ESG-compliant.
It’s not that OfS were deliberately setting out to ignore international norms; it was more that they were trying to address English HE’s historic dislike for lengthy external reviews of quality as they established a radically new system of regulation – and cyclical reviews with detailed requirements on student involvement were getting in the way of this. Obviously this was completely successful, as now nobody complains about regulatory burden and there are no concerns about the quality of education in any part of English higher education among students or other stakeholders.
Those ESG international standards were first published in 2005, with the (most recent) 2015 revision adopted by ministers from 47 countries (including the UK). There is a revision underway led by the E4 group: the European Association for Quality Assurance in Higher Education (ENQA), ESU, EUA and EURASHE – fascinatingly, the directors of three out of four of these organisations are British. The ESG are the agreed official standards for higher education quality assurance within the Bologna process (remember that?) but are also influential further afield (as a reference point for similar standards in Africa, South East Asia, and Latin America). The pandemic knocked the process off kilter a bit, but a new ESG is coming in 2027, with a final text likely to be available in 2026.
A lot of the work has already been done, not least via the ENQA-led and EU-funded QA-FIT project. The final report, from 2024, set out key considerations for a new ESG – it’s very much going to be a minor review of the standards themselves, but there is some interesting thinking about flexibility in quality assurance methodologies.
International standards are reflected more clearly in other parts of the UK.
Britain’s newest higher education regulator, Medr, continues to base higher education quality assurance on independent cyclical reviews involving peer review and student input, which reads across to widely accepted international standards (such as the ESG). Every registered provider will be assessed at least every five years, and new entrants will be assessed on entry. This sits alongside a parallel focus on teaching enhancement and a focus on student needs and student outcomes – plus a programme of triennial visits and annual returns to examine the state of provider governance.
Over at the Scottish Funding Council the Tertiary Quality Enhancement Framework (TQEF) builds on the success of the enhancement themes that have underpinned Scottish higher education quality for the past 20 years. The TQEF again involves ESG-compliant cyclical independent review alongside annual quality assurance engagements with the regulator and an intelligent use of data. As in Wales, there are links across to the assessment of the quality of governance – but what sets TQEF apart is the continued focus on enhancement, looking not just for evidence of quality but evidence of a culture of improvement.
Teaching quality and governance are also currently assessed by cyclical engagements in Northern Ireland. The (primarily desk-based) Annual Performance Review draws on existing data and peer review, alongside a governance return and engagement throughout the year, to give a single rating to each provider in the system. Where there are serious concerns an independent investigation (including a visit) is put in place. A consultation process to develop a new quality model for Northern Ireland is underway – the current approach simply continues the 2016 HEFCE approach (which was, ironically, originally hoped to cover England, Wales, and Northern Ireland while aligning to ESG).
You could see this as a dull, doctrinal dispute of the sort that higher education is riven with – you could, indeed, respond in the traditional way that English universities do in these kinds of discussions by putting your fingers in your ears and repeating the word “autonomy” in a silly voice. But the ESG is a big deal: it is near-essential to demonstrate compliance if you want to get stuck into any transnational education or set up an international academic partnership.
As more parts of the world are now demanding access to high quality higher education, it seems fair to assume that much of this will be delivered – in-country or online – by providers elsewhere. In England, we still have no meaningful way of assuring the quality of transnational education (something that we appear to be among the best in the world at expanding). Indeed, we can’t even collect individualised student data about TNE.
Almost by definition, regulation of TNE requires international cooperation and international knowledge – the quasi-colonial idea that if the originating university is in good standing then everything it does overseas is going to be fine is simply not an option. National systems of quality assurance need to be receptive to collaboration and co-regulation as more and more cross-border provision is developed, in terms of rigour, comparability (to avoid unnecessary burden) and flexibility to meet local needs and concerns.
Of course, concerns about the quality of transnational education are not unique to England. ENQA has been discussing the issue as part of conversations around the ESG – and there are plans to develop an international framework, with a specific project to develop this already underway (which involves our very own QAA). Beyond Europe, the International Network for Quality Assurance Agencies in Higher Education (INQAAHE – readers may recall that, at great expense, OfS is an associate member, and that the current chair is none other than the QAA’s Vicki Stott) works in partnership with UNESCO on cross-border provision.
And it will be well worth keeping an eye on the forthcoming UNESCO second intergovernmental conference of state parties to the Global Convention on Higher Education later this month in Paris, which looks set to adopt provisions and guidance on TNE with a view to developing a draft subsidiary text for adoption. The UK government ratified the original convention, which at heart deals with the global recognition of qualifications, in 2022. That seems to be the limit of UK involvement – there have been no signs that the UK government will even attend this meeting.
TNE, of course, is just one example. There’s ongoing work about credit transfer, microcredentials, online learning, and all the other stuff that is on the English to-do pile. They’re all global problems and they will all need global (or at the very least, cross system) solutions.
The mood music at OfS – as per some questions to Susan Lapworth at a recent conference – is that the quality regime is “nicely up and running”, with the various arms of activity (threshold assessment for degree awarding powers, registration, and university titles; the B conditions and associated investigations; and the Teaching Excellence Framework) finally and smoothly “coming together”.
A blog post earlier this month from Head of Student Outcomes Graeme Rosenberg outlined more general thinking about bringing these strands into better alignment, while taking the opportunity to fix a few glaring issues (yes, our system of quality assurance probably should cover taught postgraduate provision – yes, we might need to think about actually visiting providers a bit more as the B3 investigations have demonstrated). On the inclusion of transnational education within this system, the regulator has “heard reservations” – which does not sound like the issue will be top of the list of priorities.
To be clear, any movement at all on quality assurance is encouraging – the Industry and Regulators Committee report was scathing on the then-current state of affairs, and even though the Behan review solidified the sense that OfS would do this work itself, it was not at all happy with the current fragmentary, poorly understood, and internationally isolated system.
But this still keeps England a long way off the international pace. The ESG standards and the TNE guidance UNESCO eventually adopts won’t be perfect, but they will be the state of the art. And England – despite historic strengths – doesn’t even really have a seat at the table.
