Category: Access & WP

  • Universities in England can’t ignore the curriculum (and students) that are coming

    What has schools policy got to do with higher education?

    The Westminster government has published Becky Francis’s Curriculum and Assessment Review, unveiling what Education Secretary Bridget Phillipson calls “landmark reforms” to the national curriculum.

    Interestingly, the revitalised curriculum is to be a “core part” of how the government will deliver the Prime Minister’s target of two-thirds of young people participating in higher-level learning by age 25.

    The review treats higher education as an explicit destination, not a distant afterthought.

    When it invents a new “third pathway” at level 3, it insists those V Levels must carry higher education credibility and be built so that young people can progress to degree-level study as well as work – hence Ofqual regulation and sector-standard-linked content. In other words, this isn’t a dead-end vocational cul-de-sac – it is designed to be read and trusted by admissions tutors.

    On T Levels, the panel recognises reality on the ground – many universities do already accept T Level learners – but says the acceptance landscape is messy, confusing and poorly signposted. Its answer is that government should keep working with providers and HEIs to promote understanding across the HE sector so applicants know which courses take T Levels and on what terms. The implication for universities is making recognition statements clearer, and aligning them with national guidance as it emerges.

    Why the anxiety about clarity? Because the authors kept bumping into learners who don’t grasp how subject and qualification choices at 16–19 play out later for university admission. That includes confusion introduced by new badge-sets like AAQs and TOQs. It turns out that if you design a landscape that looks like alphabet soup, you shouldn’t be surprised when applicants misread the signposts.

    Bacc to the future

    The EBacc gets a particular dressing-down. It’s true that taking an academic portfolio at GCSE correlates with applying to – and attending – university. But the review finds that EBacc combinations do not boost the chance of getting into the Russell Group (although the only source for this is a 2018 paper, which doesn’t really come down conclusively against it), and that EBacc’s accountability pull has constrained subject choice in ways that squeeze arts and applied options. For HE, that means any lingering myth that EBacc equals elite-entry advantage gets killed off.

    There’s a financial edge to all this that the review politely doesn’t mention. When the previous government tried to defund BTECs, analysis showed the policy could strip £700 million in tuition fee income from the sector, with catastrophic effects for subjects like nursing, sport science, and computing – some facing 20 per cent recruitment losses. Those shortfalls would land heaviest on lower-tariff universities already wrestling with flat domestic recruitment and collapsing international numbers.

    The stakes for getting pathway reform right are existential for parts of the sector. If V Levels don’t recruit at scale, if T Level recognition remains patchy, and if the “simplification” just creates new barriers for disadvantaged students rather than removing old ones, some universities and programmes will struggle to recruit. The review’s optimism about legibility needs to meet reality – student choice is sticky, established qualifications have brand recognition, and centrally-planned qualification reform has a patchy track record. T Levels attracted just 6,750 students after £482 million of investment.

    As well as all of that, the panel seems super keen to stress the continuing strength of A levels as a pipeline, noting that in 2022/23 some 82 per cent of A level learners progressed to higher education by age 19. Whatever else changes, the academic route remains a robust feeder – and universities should expect the report’s other reforms to orbit around, not replace, that core.

    Crucially, the review refuses the tired binary that “vocational” equals “non-HE.” It records evidence that large applied or technical programmes can carry real weight with HE providers – precisely because they demonstrate breadth and depth in a way that can be benchmarked consistently across learners. If you run foundation years or applied degree routes, you are being invited to read these programmes seriously.

    It also acknowledges the contested evidence on outcomes for legacy qualifications like unreformed BTECs while still affirming their role in widening participation. The nuance matters – some qualifications have varied quality and mixed university performance data, yet for those who succeed in HE, BTECs and other AGQs have often been the bridge in. A credible vocational pathway that keeps that bridge open – while simplifying the current maze – is the intended fix.

    Are universities actually ready to make good on these promises? The sector has spent years documenting how BTEC students – despite “equivalent” tariff points – have systematically worse outcomes than A-level students. Arguably, the problem in some providers isn’t the qualification – it’s that first-year curricula and pedagogy remain stubbornly designed around A-level assumptions. Group projects, applied assessment, practical skills – the things BTEC students excel at – routinely get squeezed out in favour of essays and exams that privilege academic writing developed through A-levels.

    So when the review insists V Levels must “carry higher education credibility” and enable progression to degrees, the translation work required isn’t just clearer admissions statements – it’s a more fundamental rethink of how universities teach first-year students, assess them, and support their transition.

    Put together, the narrative runs something like this. Design V Levels to be legible to universities, clean up T Level recognition so applicants aren’t left guessing, stop pretending EBacc is a golden ticket to elite admission, and keep A levels stable, but value applied depth where it’s rigorous.

    And above all, help students understand how choices at 16–19 map to HE doors that open, or close, later.

    What (or who) is coming?

    There are some wider bits of note. The review has things to say about AI:

    …generative artificial intelligence has further heightened concerns around the authenticity of some forms of non-exam assessment… It is right, therefore, that exams remain the principal form of assessment.

    As such, it urges no expansion of written coursework and a subject-by-subject approach to non-exam assessment where it is the only valid way to assess what matters. It also tasks DfE and Ofqual to explore potential for innovation in on-screen assessment – particularly where this could further support accessibility for students with SEND – but cautions that evidence for wider rollout is thin and equity risks from the digital divide are real.

    Digital capability stops being taken for granted. Computing becomes the spine for digital literacy across all key stages, explicitly incorporating AI – what it is, what it can and can’t do – and broadening the GCSE so it reflects the full curriculum rather than a narrow slice of computer science. Other subjects are expected to reference digital application coherently, but the foundations live in Computing. Online safety and the social-emotional ethics of tech use sit in RSHE, while the “is this real?” critical discernment is anchored in Citizenship.

    The ambition is a cohort that can use technology safely and effectively, understands AI well enough to question it, and can interrogate digital content rather than drown in it.

    More broadly, English is recast so students study “the nature and expression of language” – including spoken language – and analyse multi-modal and so-called “ephemeral” texts. That builds media-literate readers and writers who can spot persuasion, evaluate sources, and switch register across platforms, backed by a Year 8 diagnostic to catch gaps early. Drama regains status as a vehicle for performance, confidence and talk.

    In parallel, an “oracy framework” is proposed to make speaking and listening progression explicit across primary and secondary – something schools say is currently fuzzy and inconsistently taught. The sector should expect clearer outcomes on expressing ideas, listening, turn-taking and audience awareness, with specific hooks in English and Citizenship.

    Citizenship is made statutory at primary with a defined core – financial literacy, democracy and government, law and rights, media literacy, climate and sustainability – and tightened at secondary for purpose, progression and specificity. The point is to guarantee exposure, not leave it to chance. If implemented properly, you’d expect clearer outcomes on budgeting and borrowing, evaluating claims and campaigns, understanding institutions and rights, and participating respectfully in debate.

    And climate education also steps out of the margins. Expect refreshed content in Geography and Science and an explicit sustainability lens in Design and Technology, with an eye on green skills and the realities of local, affordable fieldwork. The intent isn’t a new silo called “climate” – it’s to make the concepts visible, current and assessed where they logically belong.

    What’s next?

    If this all lands as intended – and that’s a big “if” given implementation timelines and school capacity – universities should expect a cohort that’s been taught to interrogate sources, question AI outputs, and articulate arguments aloud, not just on the page.

    Whether all of this survives contact with reality should be the sector’s real concern. The review’s timeline assumes schools can execute sweeping curriculum reform, embed new pathways, and deliver enhanced oracy and media literacy by 2028 – all while navigating funding pressures, teacher shortages, and the usual chaos of system change. That’s ambitious even in favourable conditions.

    And universities know from painful experience that when school reform stumbles, they inherit the mess. BTECs were supposed to be the accessible applied route, until differential outcomes data revealed the sector hadn’t actually adapted to teach those students effectively. The EBacc was positioned as the passport to elite universities, until evidence showed it just constrained subject choice without improving Russell Group entry. The Francis Review has laudable intentions – genuine pathways, informed choice, rigorous applied options – but intentions aren’t infrastructure.

    If the 2028 cohort arrives at university having been promised that V Levels are “trusted by admissions tutors” but finds patchy recognition, or discovers their oracy training doesn’t translate because seminars still privilege A-level-style discourse, the sector will be cleaning up another policy gap between aspiration and delivery. The review knows this risk exists – hence the repeated insistence on clarity, signposting, and sector cooperation.

    But cooperation requires capacity, and capacity requires resources neither schools nor universities currently have a box full of. Nevertheless, the intent is to send universities young people who can think critically, speak confidently, and navigate complexity.

    Source link

  • The white paper opens the door, but we need to ensure everyone gets in

    After months of anticipation, the post-16 education and skills white paper has finally landed.

    For many across the sector, the wait has been worth it. There are bold commitments on funding, skills pathways and structural reform. But for those of us focused on widening participation there are green shoots of ideas but very little detail, and the group of students most at risk of unequal access – care experienced and estranged students – is barely even mentioned. The paper feels more like a promising prologue than a full chapter.

    There are areas of positive progress. There is the previously trailed increase in maintenance support, which will help students better manage rising living costs – a critical issue for those without family safety nets.

    Plus the report commits to “provide extra support for care leavers, some of the most vulnerable in our society, who will automatically become eligible to receive the maximum rate of loan.” We would want to see these extended to estranged students as well as care experienced young people, as we know many report financial hardship without the support of parents to top up income. Data from the Student Academic Experience Survey showed us that both care experienced and estranged students work a statistically significantly higher number of hours per week – 11.3 and 11.1 hours respectively – than non-care experienced students, at 8.8 hours.

    But we must await further detail to see whether this makes any material difference for care leavers (and hopefully estranged students) – given that they’re currently already eligible for the maximum maintenance loan, and this maximum doesn’t cover anywhere near enough to support their living expenses, as recent work on minimum income standards has shown.

    A richer picture

    The promise of better information for applicants, combining UCAS data with graduate outcomes and completion rates, is an important move toward transparency and fairer choice. The work that UUK, Sutton Trust and UCAS have already started in this space is welcome but ensuring consistency will be key. This is especially important to consider when we know from UCAS research that 60 per cent of surveyed applicants said “they did not receive guidance at school around applying to higher education, specific to their status as a care-experienced student.”

    We’re also encouraged by the focus on regional disparities and disadvantage cold-spots, especially in coastal and low-participation areas. These are often the places where care experienced and estranged students are most at risk of being left behind.

    But while these commitments signal progress, there’s still much to be drawn out around widening participation. Care experienced and estranged students remain largely invisible in mainstream policy design. They’re not always captured in data. They’re rarely the headline. But they matter (which is why we welcomed HESA’s planned exploration of the issues involved in publishing data on this group of students more regularly). These students face some of the steepest barriers to access, retention and success.

    There are pockets of excellent practice, and growing awareness of this group of students is driving change in some areas. The commitment by Russell Group universities to develop a consistent offer of support is welcome, as is seeing more FE and HE institutions achieve the NNECL Quality Mark. These examples demonstrate that progress is achievable when there is institutional will and leadership – but there is still so little evidence about what works.

    At the Unite Foundation, we were pleased to see recognition that accommodation is a key issue. For care experienced and estranged students, having somewhere safe and stable to live is not just a nice-to-have – it’s a fundamental prerequisite for participation in education. If we’re serious about widening participation, then addressing the barrier of housing insecurity must be central to the conversation. And yet, the white paper is light on detail about how government will support access to accommodation. This is a missed opportunity.

    The Unite Foundation’s own scholarship programme remains the only intervention to meet OfS Level 2 standards for impact on retention, progression, and completion for this group. It’s a powerful testament to what targeted, sustained support can achieve – but it also highlights how little evidence we have about what works.

    The journey continues

    So while the white paper offers a welcome direction of travel, it’s not the final destination. I’m pleased to be joining the national access and participation task and finish group, chaired by access and participation champion Kathryn Mitchell, to work within government to ensure that we’re embedding care experienced and estranged students at the heart of this work as the detail starts to emerge.

    If we’re serious about change we need more than just warm words. We need system-wide commitments that embed equity in funding, housing, student support and success metrics. We need to listen to students and design policy that reflects their lived realities.

    The wrapping paper is off. Now it’s time to see what’s inside – and to make sure care experienced and estranged students aren’t left out of the picture.

    Source link

  • OfS Access and Participation data dashboards, 2025 release

    The sector level dashboards that cover student characteristics have a provider-level parallel – the access and participation dashboards do not have a regulatory role, but are provided as evidence to support institutions in developing access and participation plans.

    Much A&P activity is pre-determined – the current system pretty much insists that universities work with schools locally and address the risks highlighted in the national Equality of Opportunity Risk Register (EORR). It’s a cheeky John Blake way of embedding a national agenda into what are meant to be provider-level plans (that, technically, unlock the ability to charge fees up to the higher level), but it could also be argued that provider-specific work (particularly on participation measures rather than access) has been underexamined.

    The A&P dashboards are a way to focus attention on what may end up being institutionally bound problems – the kinds of things that providers can fix, and quickly, rather than the socio-economic learning revolution end of things that requires a radicalised cadre of hardened activists to lead and inspire the proletariat, or something.

    We certainly don’t get any detailed mappings between numeric targets declared in individual plans and the data – although my colleague Jim did have a go at that a while ago. Instead this is just the raw information for you to examine, hopefully in an easier to use and speedier fashion than the official version (which requires a user guide, no less).

    Fun with indicators

    There are four dashboards here, covering most of what OfS presents in the mega-board. Most of what I’ve done examines four year aggregations rather than individual years (though there is a timeseries at provider level). I’ve opted for the 95 per cent confidence interval to show the significance of indicator values, and there are a few other minor pieces that I’ve not bothered with or have set to a sensible default.

    I know that nobody reads this for data dashboard design tips, but for me a series of simpler dashboards is far more useful to the average reader than a single behemoth that can do anything – and the way HESA presents (in the main) very simple tables or plain charts to illustrate variations across the sector represents to me a gold standard for provider level data. OfS is a producer of official statistics, and as such is well aware that section V3.1 of the code of practice requires that:

    Statistics, data and explanatory material should be relevant and presented in a clear, unambiguous way that supports and promotes use by all types of users

    And I don’t think we are quite there yet with what we have, while the simple release of a series of flat tables might get us closer.

    If you like it you should have put a confidence interval on it

    To start with, here is a tool for constructing ranked displays of providers against a single metric – here defined as a life cycle stage (access, continuation, completion, attainment, progression) expressed as a percentage of successful achievements for a given subgroup.

    Choose your split indicator type on the top right, and the actual indicator on the top right – select the life cycle stage on the box in the middle, and set mode and level (note certain splits and stages may only be available for certain modes and levels). You can highlight a provider of interest using the box on the bottom right, and also find an overall sector average by searching on “*”. The colours show provider group, and the arrows are upper and lower confidence bounds at the standard 95 per cent level.
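
    For anyone wanting to sanity-check what an indicator value and its bounds represent, here is a minimal Python sketch that expresses a lifecycle indicator as a percentage and attaches a 95 per cent Wilson score interval. To be clear, this is an illustration under my own simplifying assumptions – the function name and example figures are invented, and OfS publishes its own methodology for the intervals used in the dashboards.

    ```python
    from math import sqrt

    def indicator_with_ci(successes: int, denominator: int, z: float = 1.96):
        """Indicator value (as a percentage) with an approximate 95 per cent
        Wilson score interval. Illustrative only - not OfS's own method."""
        p = successes / denominator
        centre = (p + z**2 / (2 * denominator)) / (1 + z**2 / denominator)
        half = (z * sqrt(p * (1 - p) / denominator + z**2 / (4 * denominator**2))
                / (1 + z**2 / denominator))
        return 100 * p, 100 * (centre - half), 100 * (centre + half)

    # Hypothetical example: 830 of 1,000 students in a split group continue
    value, lower, upper = indicator_with_ci(830, 1_000)
    print(f"indicator {value:.1f}%, 95% bounds {lower:.1f}% to {upper:.1f}%")
    ```

    The narrower those bounds, the more confident you can be that a provider’s position in the ranking reflects something real rather than small-number noise.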

    You’ll note that some of the indicators show intersections – with versions of multiple indicators shown together. This allows you to look at, say, white students from a more deprived background. The denominator in the tool tip is the number of students in that population, not the number of students where data is available.

    [singles rank]

    I’ve also done a version allowing you to look at all single indicators at a provider level – which might help you to spot particular outliers that may need further analysis. Here, each mark is a split indicator (just the useful ones – I’ve omitted stuff like “POLAR quintiles 1,2,4, and 5” which is really only worth bothering with for gap analysis). You can select provider, mode, and level at the top and highlight a split group (eg “Age (broad)”) or split (eg “Mature aged 21 and over”).

    Note here that access refers to the proportion of all entrants from a given sub-group, so even though I’ve shown it on the same axis for the sake of space it shows a slightly different thing – the other lifecycle stages relate to a success (be that in continuation, progression or whatever) based on how OfS defines “success”.

    [singles provider]

    Oops upside your head

    As you’ve probably spotted from the first section, to really get things out of this data you need to compare splits with other relevant splits. We are talking, then, about gaps – on any of the lifecycle stages – between two groups of students. The classic example is the attainment gap between white and Black students, but you can have all kinds of gaps.

    This first one is across a single provider, and for the four lifecycle stages (this time, we don’t get access) you can select your indicator type and two indicators to get the gap between them (mode and level are at the bottom of the screen). When you set your two splits, the largest or most common group tends to be on indicator 1 – that’s just the way the data is designed.
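
    As a rough illustration of what sits behind a gap figure, the sketch below takes two split indicators and returns the percentage-point gap along with an approximate 95 per cent interval on the difference, using a plain normal approximation. The function name and the example numbers are mine, not OfS’s – the regulator’s own interval methodology differs.

    ```python
    from math import sqrt

    def gap_with_ci(success_1: int, n_1: int, success_2: int, n_2: int, z: float = 1.96):
        """Percentage-point gap (indicator 1 minus indicator 2) with an
        approximate 95 per cent interval - illustrative, not OfS's method."""
        p1, p2 = success_1 / n_1, success_2 / n_2
        gap = p1 - p2
        se = sqrt(p1 * (1 - p1) / n_1 + p2 * (1 - p2) / n_2)
        return 100 * gap, 100 * (gap - z * se), 100 * (gap + z * se)

    # Hypothetical attainment split: 820 of 1,000 qualifiers in group one and
    # 120 of 200 in group two awarded a first or upper second
    gap, lower, upper = gap_with_ci(820, 1_000, 120, 200)
    print(f"gap {gap:.1f} percentage points ({lower:.1f} to {upper:.1f})")
    ```

    If that interval straddles zero, the gap you are looking at may not be distinguishable from noise for that provider and population size.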

    [gaps provider]

    For quick context you can look for “*” again on the provider name filter to get sector averages, but I’ve also built a sector ranking to help you put your performance in context against similar providers.

    This is like a cross between the single ranking and the provider-level gaps analysis – you just need to set the two splits in the same way.

    [gaps rank]

    Sign o’ the times

    The four year aggregates are handy for most applications, but as you begin to drill in you are going to start wondering about individual years – are things getting gradually worse or gradually better? Here I’ve plotted all the individual year data we get – which is, of course, different for each lifecycle stage (because of when data becomes available). This is at a provider level (filter on the top right) and I’ve included confidence intervals at 95 per cent in a lighter colour.

    [gaps provider timeseries]

    Source link

  • OfS characteristics dashboards, 2025 release

    The Office for Students releases a surprisingly large amount of data for a regulator that is supported by a separate “designated data body”.

    Some of it is painfully regulatory in nature – the stuff of nightmares for registrars and planning teams that are not diligently pre-preparing versions of the OfS’ bespoke splits in real time (which feels like kind of a burden thing, but never mind).

    Other parts of it feel like they might be regulatory, but are actually descriptive. No matter how bad your provider looks on any of the characteristics or access and participation indicators, it is not these that spark the letter or the knock on the door. But they still speak eloquently about the wider state of the sector, and of particular providers within it.

    Despite appearances, it is this descriptive data that is likely to preoccupy ministers and policymakers. It tells us about the changing size and shape of the sector, and of the improvement to life chances it does and does not offer particular groups of students.

    Outcomes characteristics

    How well do particular groups of students perform against the three standard OfS outcomes measures (continuation, completion, progression) plus another (attainment) that is very much in direct control of individual providers?

    It’s a very pertinent question given the government’s HE Reform agenda language on access and participation – and the very best way to answer it is via an OfS data release. Rather than just the traditional student characteristics – age, ethnicity, the various area based measures – we get a range of rarities: household residual income, socioeconomic status, parental higher education experience. And these come alongside greatly expanded data on ethnicity (15 categories) and detail on age.

    Even better, as well as comparing full time and part-time students, we can look at the performance of students by detailed (or indeed broad) subject areas – and at a range of levels of study.

    We learn that students from better off backgrounds (residual income of £42,601 or greater) are more likely to progress to a positive outcome – but so are students of nursing. Neither of these is at the level of medical students or distance learning students – but both are very slightly above Jewish students. The lowest scoring group on progression is currently students taught via subcontractual arrangements – but there are also detriments for students with communication-related disabilities, students from Bangladeshi backgrounds, and students with “other” sexual orientations.

    In some cases there are likely explanatory factors and probably intersections – in others it is anyone’s guess. Again and again, we see a positive relationship between parental income or status and doing well at higher education: but it is also very likely that progression across the whole of society would show a similar pattern.

    On this chart you can select your lifecycle stage on the top left-hand side, and use the study characteristics drop down to drill into modes of study or subject – there’s also an ability to exclude subcontractual provision outside of registered providers via the population filter. At the bottom you can set domicile (note that most characteristics are available only for UK students) and level of study (again note that some measures are limited to undergraduates). The characteristics themselves are seen as the individual blobs for each year: mouse over to find similar blobs in other years or use the student characteristic filter or sub-characteristic highlighter to find ones that you want.

    [Full screen]

    The “attainment” life cycle stage refers to the proportion of undergraduate qualifiers that achieve a first or upper second for their first degree. It’s not something we tend to see outside of the “unexplained first” lens, and it is very interesting to apply the detailed student characteristics to what amounts to awarding rates.

    It remains strikingly difficult to achieve a first or upper second while being Black. Only 60 per cent of Black UK full time first degree students managed this in 2023-24, which compares well to nearer 50 per cent a decade ago, but not so well with the 80 per cent of their white peers. The awarding gap remains stark and persistent.

    Deprivation appears to be having a growing impact on continuation – again for UK full time first degree students, the gap between the most (IMD Q1, 83.3 per cent) and least (Q5, 93.1 per cent) deprived backgrounds has grown in recent years. And the subject filters add another level of variation – in medicine the difference is tiny, but in natural sciences it is very large.

    Population characteristics

    There are numerators (number of students where data is included) and denominators (number of students with those characteristics) within the outcomes dashboard, but sometimes we just need to get a sense of the makeup of the entire sector – focusing on entrants, qualifiers, or all students.

    We learn that nearly 10 per cent of UK first degree students are taught within a subcontractual arrangement – rising to more than 36 per cent in business subjects. Counter-intuitively, the proportion of UK students on other undergraduate courses (your level 4 and 5 provision) taught via subcontractual arrangements has fallen – from 18 per cent in 2010 to just 13 per cent (of a far lower total) now. Again, the only rise is in business provision – subcontractual teaching is offered to nearly a quarter of UK-domiciled non-degree undergraduates there.

    Around a third (33.14 per cent) of UK medicine or dentistry undergraduates are from managerial or professional backgrounds, a higher proportion than any other subject area, even though this has declined slightly in recent years.

    Two visualisations here – the first shows student characteristics as colours on the bars (use the filter at the top) and allows you to filter what you see by mode or subject area using the filters on the second row. At the bottom you can further filter by level of study, domicile, or population (all, entrants, or qualifiers). The percentages include students where the characteristic is “not applicable” or where there is “no response” – this is different from (but I think clearer than) the OfS presentation.

    [by student characteristic]

    The second chart puts subject or mode as the colours, and allows you to look at the make-up of particular student characteristic groups on this basis. This is a little bit of a hack, so you need to set the sub-characteristic to “total” in order to alter the main characteristic group.

    [by study characteristic]

    Entry qualification and subject

    Overall, UK undergraduate business students are less likely to continue, complete, attain a good degree, or achieve a positive progression outcome than their peers in any other subject area – and this gap has widened over time. There is now a 1.5 percentage point progression gap between business students and creative or performing arts students: on average a creative degree is more likely to get you into a job or further study than one in business, and this has been the case since 2018.

    And there is still a link between level 3 qualifications and positive performance at every point of the higher education life cycle. The data here isn’t perfect – there’s no way to control for the well documented link between better level 3 performance (more As and A*s, fewer Cs, Ds and BTECs) and socioeconomic status or disadvantage. Seventy-two per cent of the best performing BTEC students were awarded a first or upper second, compared with 96 per cent of the best performing A level students.

    This is all taken from a specific plot of characteristics (entry qualification and subject) data – unfortunately for us it contains information on those two topics only, and you can’t even cross-plot them.

    [Full screen]

    What OfS makes of all this

    Two key findings documents published alongside this release detail the regulatory observations. The across-the-board decline in continuation appears to have been halted, with an improvement in 2022-23 – but mature entrants are still around 9 percentage points less likely to continue.

    We get recognition of the persistent gap in performance at all levels other than progression between women (who tend to do better) and men (who tend to do worse). And of the counterintuitive continuation benefits experienced by disabled students. And we do get a note on the Black attainment gap I noted above.

    Again, this isn’t the regulatory action end of OfS’ data operations – so we are unlikely to see investigations or fines related to particularly poor performance on some of these characteristics within individual providers. Findings like these at a sector level suggest problems at a structural rather than institutional level, and as is increasingly being made plain we are not really set up to deal with structural higher education issues in England – indeed, these two reports and millions of rows of data do not even merit a press release.

    We do get data on some of these issues at provider level via the access and participation dashboards, and we’ll dive into those elsewhere.

    Source link

  • The fifty per cent participation target is no more. Again.

    Dare we say he felt the hand of history on his shoulder?

    In his Labour Party Conference speech Prime Minister Keir Starmer set a new target for participation in education at level 4 and above (including higher education, further education, and some apprenticeships) among young people. He said:

    Two thirds of our children should either go to university or take on a gold standard apprenticeship

    As subsequently briefed, the target (which replaces, somehow, the old 50 per cent target from the Blair years) relates to higher skills, either through university, further education or taking on a gold standard apprenticeship. It will include at least ten per cent of young people pursuing higher technical education or apprenticeships that the economy needs by 2040, a near doubling of today’s figure.

    Alongside a restatement of recent further education policies (£800m extra into funding for 16-19 year olds in FE next year, and measures to make FE “world class”) Starmer couched the target in the language of “respect”, drawing on the now familiar tale of his father, the toolmaker.

    Because if you are a kid or a parent of a kid who chooses an apprenticeship, what does it say to you? Do we genuinely, as a country – afford them the same respect?

    The numbers now?

    We don’t really have the data at hand to judge progress against the target to date – we would imagine a new measure would be developed. The press release points to the most recent data we have relating to participation in any level four education before the age of 25 (CHEP-25 “all level four”): around half of the cohort that turned 15 in 2012-13 (and thus might have entered university in 2015-16) participated in the kind of provision the prime minister talked about. As this cohort turned 25 in 2022-23, we do not yet have data for future cohorts.

    In the last two recruitment cycles (2024, and 2025) 37 per cent of 18 year olds in England entered university directly from school via UCAS. This equates to 240,510 young people in 2024 and 249,780 in 2025 – out of an England domiciled 18 year old population of 650,710 in 2024 and 675,710 in 2025.

    In contrast just 15,085 adults (19+) participated in-year in provision at level four or above in the further education and skills sector during 2023-24. And there were 100,490 higher (level 4) apprenticeship starts in the same year.
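
    As a quick back-of-the-envelope check, the entry rates quoted above follow directly from the published counts – a minimal sketch, using only the figures in the preceding paragraphs:

    ```python
    # UCAS entrants and England-domiciled 18 year old population, per the figures above
    cohorts = {
        2024: (240_510, 650_710),
        2025: (249_780, 675_710),
    }
    for year, (entrants, population) in cohorts.items():
        print(f"{year}: {entrants / population:.1%} of 18 year olds entered directly via UCAS")
    # Both cycles come out at roughly 37 per cent - some distance from a two-thirds
    # level 4+ ambition, even before adding FE and apprenticeship routes.
    ```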

    The uncancellable target

    It was originally proposed by Tony Blair during his 1999 leader’s address to conference that the government should have:

    a target of 50 per cent of young adults going into higher education in the next century.

    The plan was reiterated in the 2001 manifesto, and the promise maintained in both 2005 and 2010:

    It is time for an historic commitment to open higher education to half of all young people before they are 30, combined with increased investment to maintain academic standards.

    The original target date was 2010, but by 2008 then universities minister John Denham had already conceded that this target would not be met. And it was not met under a Labour government.

    It was never universally popular – in 2009 the CBI made a high profile call to drop the 50 per cent aspiration. Under Coalition Prime Minister David Cameron, then Business Secretary Vince Cable was the first of many to formally cancel the target. On 12 October 2010 he told the House of Commons that:

    We must not perpetuate the idea, encouraged by the pursuit of a misguided 50% participation target, that the only valued option for an 18-year-old is a three-year academic course at university. Vocational training, including apprenticeships, can be just as valuable as a degree, if not more so

    Which you’d imagine would be the end of it – a non-binding (it never featured in legislation) aspiration set by the previous administration, rejected by a new minister.

    Cancel culture

    As the magic figure approached (the goal was achieved in 2019) the general disapproval of the long-scrapped target shifted into outright hostility. By 2017 Nick Boles (remember him?) was not outside the political mainstream in saying:

    The policy of unbridled expansion has now reached its logical conclusion.

    All to no avail. By 2020 the ever-thoughtful Gavin Williamson seemed to be turning it into a personal vendetta:

    When Tony Blair uttered that 50 per cent target for university attendance, he cast aside the other 50 per cent. It was a target for the sake of a target, not with a purpose… As Education Secretary, I will stand for the forgotten 50 per cent.

    While former universities minister Chris Skidmore was characteristically a little more measured in his critique. Just about the only politician willing to stick up for the idea was Tony Blair himself, who in 2022 backed calls for 70 per cent of young people to enter higher education.

    By this point, Rishi Sunak had become Prime Minister, and was telling the 2023 Conservative conference that the 50 per cent target had been a “false dream”, as he renewed another familiar attack on “rip off degrees”. This brought about a robust response from Keir Starmer as leader of the opposition:

    I never thought I would hear a modern Conservative Prime Minister say that 50 per cent of our children going to university was a “false dream”. My Dad felt the disrespect of vocational skills all his life. But the solution is not and never will be levelling-down the working class aspiration to go to university.

    If anything, Starmer missed the opportunity at that stage to point out the volume of vocational education going on in universities – but that probably speaks to the polling and public perception of “universities” that reinforces the challenge the sector has in surfacing it all.

    Delivery, delivery, delivery

    Targets and aspirations are all very well, but you would expect a government as focused on “delivery” as our current one to have a clear plan to drive up participation. And though welcome, the previously announced funding for further education is not it.

    Driving up participation to such a level is far beyond what can be achieved by tweaks around apprenticeship incentives or even the roll-out of the (surprisingly unpopular) Lifelong Learning Entitlement. We are promised more details about how it’s going to work in the forthcoming post-16 education white paper.

    History tells us that the majority of any increase in participation at level 4 will come from the efforts of our universities, through new courses and innovative delivery modes. And this will take participation in higher education far above the 50 per cent target.

    Source link

  • Better access to medical school shouldn’t need a deficit model

    Patients benefit from a diverse healthcare workforce. Doctors, particularly those from disadvantaged and minoritised backgrounds, play a crucial role in advocating for what is best for their patients.

    The NHS recognises this, linking workforce diversity with increased patient satisfaction, better care outcomes, reduced staff turnover, and greater productivity.

    A promising start

    Efforts to widen participation in higher education began at the turn of the century following the Dearing report. Over time, access to medical schools gained attention due to concerns about its status as one of the most socially exclusive professions. Medical schools responded in 2014 with the launch of the Selecting for Excellence report and the establishment of the Medical Schools Council (MSC) Selection Alliance, representing admissions teams from every UK medical school and responsible for fair admissions to medical courses.

    With medical school expansion under government review, institutions face increasing pressure to demonstrate meaningful progress in widening participation to secure additional places. Although medicine programmes still lag in representing some demographic groups, they now align more closely with wider higher education efforts.

    However, widening participation policy often follows a deficit model, viewing disadvantaged young people as needing to be “fixed” or “topped up” before joining the profession. Phrases like “raising aspirations” suggest these students lack ambition or motivation. This model shifts responsibility onto individuals, asking them to adapt to a system shaped mainly by the experiences of white, male, middle-class groups.

    Beyond access

    To create real change, organisations must move beyond this model and show that students from diverse backgrounds are not only welcomed but valued for their unique perspectives and strengths. This requires a systems-based approach that rethinks every part of medical education, starting with admissions. In its recent report, Fostering Potential, the MSC reviewed a decade of widening participation in medicine. Medical schools across the UK have increased outreach, introduced gateway year courses, and implemented contextual criteria into admissions.

    Contextual markers recognise structural inequalities affecting educational attainment. Students from low socioeconomic backgrounds often attend under-resourced schools and face personal challenges hindering academic performance. Yet evidence shows that, when given the chance, these students often outperform more advantaged peers at university. Contextual admissions reframe achievements in light of these challenges, offering a fairer assessment of potential.

    Despite progress, access remains unequal. Although acceptance rates for students from the most deprived areas have increased, their chances remain 37 per cent lower than those from the least deprived areas. Research indicates that a two-grade A-level reduction is needed to level the playing field – an approach several schools now adopt. Other policies include fast-tracking interviews, test score uplifts, and alternative scoring for widening participation candidates.

    Not just special cases

    These processes, however, are often opaque and hard to navigate. Many applicants struggle to determine eligibility. With no single definition of disadvantage, medical schools use varied proxy indicators, often poorly explained online. This confusion disproportionately affects the students these policies aim to support: those without university-educated parents, lacking insider knowledge, and attending under-resourced schools.

    A commitment to transparency is vital but must go beyond rhetoric. Transparency means all medical schools clearly outline contextual admissions criteria in one accessible place, provide step-by-step guides to applicants and advisors, and offer examples of how contextual data influences decisions. Medical schools could collaborate to agree on standardised metrics for identifying widening participation candidates. This would simplify eligibility understanding, reduce confusion, and promote fairness.

    Tools like MSC’s entry requirements platform are a good start but must be expanded, standardised, and actively promoted to the communities that need them most. Genuine transparency empowers applicants to make informed choices, selecting schools best suited to their circumstances and maximising success chances. This also eases the burden on schools, advisors, and outreach staff who struggle to interpret inconsistent criteria.

    Ultimately, moving away from the deficit model toward an open, systems-based approach is about more than fairness. It is essential for building a medical workforce that reflects society’s diversity, improving patient care, strengthening the profession, and upholding the NHS’s commitment to equity and excellence.

    Source link

  • Inclusivity should be about more than individual needs

    Assessment lies at the core of higher education. It helps focus students’ learning and helps them evidence, to themselves and to others, the progress they have made in their learning and growth.

    Setting, supporting and marking assessed student work takes up a substantial proportion of academic colleagues’ effort and time.

    Approaches to assessment and outcomes of assessment experiences underpin the narratives crafted by many higher education providers to showcase how they secure meaningful educational gains for their students.

    It’s not just what you know

    Educational gains go well beyond academic assessment, yet assessment is central to student experiences and should not be limited to academic knowledge gains. Indeed, a nuanced and insightful independent report commissioned by the Office for Students in March 2024 on how educational gains were defined and articulated in TEF 2023 submissions notes that providers rated gold for student outcomes

    “make reference to enhancing student assessment practices as a vehicle for embedding identified educational gains into the curriculum, explaining that their range of assessments is designed to assess beyond subject knowledge.”

    Assessments that require evidence of learning beyond subject knowledge are particularly pertinent to ponder, because these assessments are more likely to underpin the kind of inclusive higher education experiences that providers hope to create for their students, with inclusion understood in broad rather than narrow terms.

    The link between inclusion and assessment has been problematised by scholars of higher education. A narrow view of inclusive assessment focuses on individual adjustments in response to specific student needs. Higher education providers, however, would benefit from developing a broad definition of inclusive assessment if they are intent on meaningfully defining educational gains. Such a definition will need to move beyond implementing individual adjustments on a case by case basis, to consider intersecting and diverse student backgrounds that may impact how a student engages with their learning.

    Well-defined

    A good definition should also be mindful of (but not constrained by) needs and priorities articulated by external bodies and employers. It should be based on a thorough understanding of how to create equitable student assessment experiences in interdisciplinary settings (being able to operate flexibly across disciplines is key to solving societal challenges). It should appreciate that bringing co- and extra-curricular experiences into summative assessment does not dilute a course or programme’s academic core.

    It should be aligned to a view of assessment for and as learning. It should value impact that goes beyond individual student achievement and is experienced more broadly in the assessment context. Importantly, it should embrace the potential of generative artificial intelligence to enhance student learning while preserving the integrity of assessment decisions and the need for students to make responsible use of generative tools during and beyond their studies.

    All higher education providers are likely to be able to find at least some examples of good, broadly defined inclusive practice in their contexts – these may just need to be spotlighted for others to consider and engage with. To help with this task, providers should be exploring

    • Who is included in conversations about what is assessed, when and how?
    • How fully are experiences outside a more narrowly defined academic curriculum core included in summative evaluative judgements about student achievement of intended and desired outcomes?
    • To what extent does the range of assessments within a course or programme include opportunities for students to have their most significant strengths developed and recognised?

    Providers should develop their own narratives and frameworks of educational gains to create full inclusion in and through assessment. As they carefully implement these (implementation is key), they may also consider not just the gains that can be evidenced but also whether they could attract, welcome and evidence gains for a broader range of students than might have been included in the providers’ initial plans.

    And suppose energy to rethink assessment reaches a low point. In that case, it will be useful to remember that insufficient attention to inclusion, broadly defined, when assessing learning and measuring gains can (inadvertently) create further disadvantage for individuals, as it preserves the system that created the disadvantage in the first place.

    Source link

  • Care experienced students are assets, not risks

    We have spent decades asking what support care leavers need to “catch up” in education. But what if we focused instead on what they already bring?

    Thirty years since I left the care system, I reflect on low expectations, persistent awarding gaps, and why higher education needs to reframe the care experience.

    Low expectations

    “One GCSE is enough, you’re in care”. That’s what I was told as a teenager growing up in the care system. That message stayed with me: if one GCSE was enough for someone like me, then I was not expected to succeed, I was expected to survive.

    By the time I was studying for my A levels I was living independently and working full time. University at 18 was not an option – it was unthinkable. Years later, I found myself on a BTEC in health and social care as part of a role as a children’s rights worker, and that was where I discovered psychology.

    Suddenly, everything in my life made sense, my upbringing, my responses, the systems around me. I applied for university in 2002 and completed my first term while pregnant. At 36 I became a lecturer in education and psychology in higher education, teaching education through a psychological lens to education students, many of whom want to become teachers themselves.

    A full circle moment

    Recently, I hosted an A level psychology student for a placement. On the final day, she revealed that one of her teachers had been one of my undergraduate students. The moment was moving, not because she was care experienced (she wasn’t), or because the teacher was (they weren’t), but because it showed how my journey, rooted in care, had rippled out into the education system in ways I never imagined.

    That moment hit me like a wave. It was not just a neat coincidence, it was a full circle moment that challenged everything I had been told about my place in education.

    It reminded me that care experienced students are not simply passing through higher education as “at risk” individuals in need of support. Instead, we are contributing to it, we are building it and sometimes we are shaping the success of others in ways that last longer than we realise.

    Ditching deficit thinking

    What if we stopped asking what care experienced students lack? Too often, care leavers are described as “at risk” of exclusion, poor attainment, and drop-out. We talk about their trauma, instability, or disadvantage.

    Those challenges are real and need addressing – but rarely do we ask what strengths they bring with them. We bring resilience, not just as a feel good buzzword, but as a lived practice. We know how to manage under pressure, navigate uncertainty, and stay focused when stability is not guaranteed.

    We bring empathy, because we have seen how systems can fail people and we have learned how to listen, observe, and understand beneath the surface. We bring adaptability because when your life has taught you that plans change, support disappears, and people move on, you learn how to adjust quickly, quietly, and effectively and we bring purpose. Many of us enter education not out of expectation, but out of intent because we want to create the kind of impact we once needed. It is that intent that makes us powerful educators, mentors, and role models even for students who do not share our background.

    Within the classroom, I sometimes hear mature students described as “assets” because they bring work experience, life experience, and often support other students. Care experienced students who are appropriately nurtured and empowered bring their own strengths to their peers. They also bring different and valuable perspectives – particularly relevant to social sciences disciplines – about social inequity, systemic injustice, and resilience that can open up important conversations about theory and its relevance to the “real world” and prepare the students they learn alongside for work in a world in which they will encounter diverse and disadvantaged others.

    My time in care taught me skills that have defined my academic and professional life – I learned independence young and I developed empathy and adaptability not just emotionally, but practically, not as nice extras but as core strengths. They have helped me understand students better and helped shape the kind of lecturer I am.

    Care experienced students do not just overcome adversity, they carry rich insight, emotional intelligence, and a deep understanding of social systems and sometimes, like in my case, they help educate the people who go on to teach the next generation.

    Having said that, it’s 30 years since I left the care system – is it still the same?

    Not enough has changed

    In many ways, the system looks different today. Every looked-after child has a Personal Education Plan (PEP), schools appoint designated teachers, virtual school heads oversee progress, and there’s a £2,345 per-child annual Pupil Premium Plus. In principle, care-experienced learners are a priority. Some universities make contextual offers to care leavers in recognition of the challenges they faced on their way through the education system.

    Yet the numbers tell a different story:

    • only 37 per cent of looked-after children reach expected levels at Key Stage 2 (vs 65 per cent of peers)
    • only 7.2 per cent achieve grade 5+ in English and maths at GCSE (vs 40 per cent)
    • at age 19, just 13 per cent of care leavers enter higher education (vs 45 per cent of others).

    These gaps are not just statistical, they reflect structural inequalities, where expectations remain low and pathways to university feel closed off before they have even begun. For a care experienced student to find their way into higher education is a testament to their determination, resilience, and motivation before they even start.

    A fight not a right

    My mantra was “education was a fight not a right”. We may no longer say, “one GCSE is enough” out loud – but it is still heard in the subtext of our systems.

    We talk about “widening participation” and “belonging,” but too often, care experienced learners are left out of those conversations, or placed into categories of concern rather than capability. Recently, my ten-year-old said something that stopped me in my tracks: “children shouldn’t be judged on academic intelligence but on creative intelligence. School is more about following the rules than finding yourself.”

    They are right – the education system has moved from creativity to conformity and in doing so, we do not just risk excluding care experienced learners, we risk losing the individuality, emotional intelligence, and imaginative power that all students bring. The ones who have had to survive the most often bring innovation and creativity. When we centre care experienced voices in policy, in pedagogy, and in professional learning, we begin to close the awarding gap, the one that limits how we (and sometimes they) see their potential.

    Higher education did not just change my life. It gave me the chance to change other people’s too – and that is an opportunity we should provide to all our children.


  • Working with our places will help us to spread the benefits of higher education more widely

    Working with our places will help us to spread the benefits of higher education more widely

    In the North East of England, fewer than one in three 18-year-olds enter higher education, compared to a national average of 37 per cent.

    For higher education institutions, including my own, this is more than a regrettable statistic. It must be a call to action. The Sutton Trust’s Opportunity Index highlights that the North East ranks lowest of all English regions for social mobility prospects, with the poorest students in the region facing some of the most limited chances for progression into higher education and good employment.

    As a country we have undoubtedly made progress in widening participation, but as someone who spends their days thinking about such things, I worry: are we measuring that progress in the right ways? It’s not just about the gateway to university, it’s about the university journey and beyond. Or, to put it in more human terms: are people who previously wouldn’t have gone to university not only getting in, but thriving once they’re in?

    If we carry on measuring widening participation purely by entry stats and graduate salaries, we’ll miss the bigger picture, and what many of us went into higher education to try to achieve: deeper, transformative impact. A university education does more than prepare someone for a job. There is good evidence that links it to longer life expectancy, better health, and greater stability.

    The benefits of university go beyond the individual. Children of university graduates are much more likely to attend university and perform better once there. When a young person from a disadvantaged background earns a degree, it can spark a ripple effect that changes their family’s trajectory for good.

    There’s also a clear economic case for seeing success more broadly. Graduates typically pay more in tax, rely less on welfare services, and are more likely to engage in civic life. In regions like ours, where economic renewal and social mobility are deeply connected, that impact is amplified. A university education doesn’t just boost an individual’s prospects – it helps build stronger, more resilient communities.

    Whole-journey approach

    If we are truly serious about transforming lives and levelling up opportunity, especially in so-called “cold spots” like County Durham, then we need to dig deeper, beyond continuation rates and into attainment and the feeling of belonging. Financial strains, cultural barriers, wellbeing concerns, and more must be recognised and overcome. These are challenges not just for admissions, but across the entire student journey.

    Attainment gaps have a substantial impact, and disadvantaged students can be up to 22.7 months behind advantaged peers by the time they take their GCSEs. GCSE performance is strongly correlated with later life outcomes, including university attendance and employment quality. Early outreach is therefore pivotal in closing these long-standing gaps.

    It’s a challenge we take seriously. We’re not just widening the door – we’re reshaping the whole experience: investing nearly £1.5m in programmes for Key Stage 4 and 5 students, strengthening our foundation programme, and working with Sunderland AFC’s Foundation of Light to create a new health hub in one of our most deprived communities.

    One of the clearest messages of our new access and participation plan is how deeply place and perception are intertwined. Many young people in North East England don’t just lack opportunities – they’re not even sure those opportunities are meant for them. And, sadly, some still perceive Durham to be a place where they wouldn’t belong. Multiple studies show a strong link between a sense of belonging and academic success, particularly for underrepresented groups. So we’re investing in transition support and the Brilliant Club’s Join the Dots programme, which connects incoming students with peer coaches from results day onward.

    What we’re trying to achieve with our strategy cannot and should not be measured solely in continuation rates and degree classifications. Our evaluation strategy includes:

    • Sense of belonging as a core outcome: Building on Durham-led research, we are embedding a validated survey tool into our access and participation work. This tool captures students’ sense of belonging across multiple domains — from college life to academic confidence. These survey findings will help us identify and support groups at higher risk of exclusion.
    • Quasi-experimental design: Where sample sizes allow, we will use matched control groups and multiple regression analysis to compare outcomes between intervention participants and non-participants, tracking progress from outreach through to graduation. Intermediate metrics include not only continuation and attainment but also self-efficacy and engagement (see the sketch after this list).
    • Pre/post measures: Our use of TASO’s validated access and success questionnaire enables pre- and post-intervention analysis of psychosocial outcomes such as academic self-efficacy and expectations of higher education.
    • Theory of change models: These have been developed for each intervention strand and will be regularly updated to ensure our work is aligned with evidence and outcomes over time.
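
    To make the quasi-experimental strand more concrete, here is a minimal sketch of the kind of analysis it implies: match participants to comparable non-participants on baseline characteristics, then regress an outcome such as continuation on participation plus covariates. The variables, data, and model below are illustrative assumptions only – they are not Durham’s actual measures or evaluation pipeline.

    ```python
    # Illustrative sketch of a matched-comparison evaluation using synthetic data.
    # Column names (participant, gcse_score, polar4_q1, self_efficacy_pre, continued)
    # are hypothetical placeholders, not real access and participation variables.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "participant": rng.integers(0, 2, n),          # 1 = took part in outreach
        "gcse_score": rng.normal(5.5, 1.2, n),         # baseline attainment
        "polar4_q1": rng.integers(0, 2, n),            # low-participation area flag
        "self_efficacy_pre": rng.normal(0.0, 1.0, n),  # pre-intervention survey score
    })
    # Synthetic outcome: continuation into second year
    latent = -1 + 0.4 * df.gcse_score + 0.3 * df.participant - 0.2 * df.polar4_q1
    df["continued"] = (rng.random(n) < 1 / (1 + np.exp(-latent))).astype(int)

    # 1. Estimate propensity scores from baseline covariates
    X = df[["gcse_score", "polar4_q1", "self_efficacy_pre"]]
    ps_model = LogisticRegression(max_iter=1000).fit(X, df["participant"])
    df["pscore"] = ps_model.predict_proba(X)[:, 1]

    # 2. Match each participant to the nearest non-participant on propensity score
    treated = df[df["participant"] == 1]
    control = df[df["participant"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # 3. Regression on the matched sample, adjusting for remaining covariates
    outcome_model = smf.logit(
        "continued ~ participant + gcse_score + polar4_q1 + self_efficacy_pre",
        data=matched,
    ).fit(disp=False)
    print(outcome_model.params)  # the coefficient on `participant` is the estimate of interest
    ```

    The same structure extends to the pre/post psychosocial measures by swapping the outcome for a post-intervention survey score and including its pre-intervention counterpart as a covariate.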

    While our approach is rigorous, we anticipate several challenges. Students from disadvantaged backgrounds face cost-related pressures that may impact belonging and continuation. And persistent concerns about whether students from working-class or Northern backgrounds “belong” at Durham risk undermining recruitment and retention. We aim to confront this through co-designed interventions, but change in perception takes time.

    Co-development is key

    We believe we can only succeed for the North East by working with others. Through Universities for North East England – which includes Durham, Newcastle, Northumbria, Sunderland, and Teesside – and the new Durham Learning Alliance partnership with four local colleges, we must expand educational opportunities and drive economic growth.

    When people see that their goals and dreams are genuinely realisable, they’re far more likely to engage. After all, who are we to define what success should look like for someone else?

    The government’s opportunity mission gives higher education a rare, and much-needed, moment to pause and reset. Let’s not waste it. We’ve got a chance to rethink what success means – not just for universities, but for the people and places we serve. Let’s broaden the conversation beyond who gets through the door. Let’s put co-development at the heart of everything we do. And above all, let’s keep listening – not just to what students need, but to what they hope for. In the end, the real test of progress isn’t just who gets in. It’s who gets on – and how far they go, with us walking alongside them.


  • For some students, home doesn’t feel like home

    For some students, home doesn’t feel like home

    In Britain, we can be oddly squeamish when talking about class, whether it is stated outright or inferred from a person’s accent, appearance, or behaviour.

    But not having an honest conversation with ourselves and our institutions about it is actively harming our students, especially the ones who are from the area where our institutions sit.

    I was one of a team of authors that published a report at the back end of 2024 exploring the role of social class and UK home region at Durham University. Our research, which was supported by the university, found that students from North East England had a lower sense of belonging than their peers.

    This is in comparison to students from other northern regions, the rest of the UK, and international students. And it is true even if they are from more advantaged backgrounds.

    I’ll say that again – students from North East England feel excluded from Durham University, which is in… North East England. This highlights that the problem at Durham University is not only class, but also preconceived stereotypes based on how a person speaks or acts, or on their family background.

    This article explains how we built our evidence base, and how the university responded, including by integrating our recommendations into the new Access and Participation Plan, and resourcing new staff roles and student-led activity.

    From anecdote to evidence

    The student-led report came out of the First Generation Scholars group in the Anthropology department in 2022.

    Having repeatedly heard about the issues that first generation students were facing, and having felt them ourselves, we decided to move beyond the anecdotes already circulating in the university and produce something concrete and legible that couldn’t be denied.

    We devised a survey and sent it to every student, achieving a 10 per cent response rate. Follow-up focus groups were conducted to add context to the quantitative findings and to ensure the voices of those who had been let down were heard.

    The findings were grouped into seven areas – overall sense of belonging at Durham, peer relationships, experiences in teaching and learning, college events and activities, college staff relationships, experiences in clubs and societies, and financial considerations.

    Across all these areas, social class had the strongest and most consistent effect. Students from less privileged backgrounds were more likely to feel ashamed of the way they speak, dress, and express themselves.

    These students felt targeted based on their background or personal characteristics – and said they were:

    …being told countless times by a flatmate that I seem the ‘most chavvy’ and continuously refer to Northerners as degenerates.

    …at a formal dinner, students laughed at my North-east accent, they asked if I lived in a pit village.

    The irony is that due to rising housing costs, many students really are being forced to live in pit villages.

    These instances weren’t only present in peer interactions – but also took place in the teaching and learning spaces. One student said that during a lecture, the lecturer mentioned that they couldn’t understand what the IT staff member was saying due to his North East accent – which was the same as the students’.

    Another noted that their peers were “sniggering when I made a comment in a tutorial.” Comments like these have led to students self-silencing during classes and, in some cases, changing their accents entirely to avoid stigma.

    Anecdotally, I’ve heard students say that their families laugh when they hear their new accent. If we are implicitly telling students that they have to change who they are in their own region, their own city, amongst their own family in order to fit in, we are telling them that they are not safe to be authentically themselves. That message lingers beyond university.

    The report notes that other groups of students also experienced exclusion. These included women, LGBTQ+ students, and students with a disability – although only disability came close to the magnitude of effects explained by social class and region.

    It should be noted that these are protected characteristics, while class and region are not. But there was also an interaction between those characteristics, class, and region. Women from less advantaged backgrounds in North East England had a worse time than their southern peers – which they attributed to assumptions about their intelligence and sexual availability. One North East female student stated,

    I was a bet for someone to sleep with at a college party because ‘Northern girls are easy.’

    Tackling the sense of exclusion

    The report also highlights instances of real connection for students. These often came through the simplest gestures, such as having a cup of tea with their college principal, porters saying hello in the corridor, or a lecturer confirming that they deserved to be at Durham despite their working-class background.

    We were worried that the university might be quick to dismiss, bury, or simply ignore the report. However, they’ve stepped up. The report has been used in the new Access and Participation Plan (APP), underpinning an intervention strategy to increase students’ sense of belonging through student-led, funded activities.

    That builds on the creation of new staff positions that will be instrumental to this work. In discussions following the report’s launch event, there was a real buzz and momentum from colleagues who spotlighted the work they were doing in this area – but with an awareness that more needs to be done.

    A key issue is connecting up this discrete but related work. Many activities and initiatives are happening in silos – within departments, colleges, faculties, or the central university – with few outside those realms knowing about them.

    At a time when every university is tightening its belt, coordinating activities to share resources and successes seems like an easy win.

    It would be easy to dismiss the problem as unique to Durham – the university and its students have often been under fire for being elitist, tone deaf, or exclusionary. But it’s likely that students at other institutions are facing similar barriers, comments, and slights.

    I’ve spoken to enough colleagues in SUs to know that this isn’t just a Durham problem – it isn’t even just a Russell Group problem. There will be those who are afraid of what they might find if they turn over that particular stone and take a proper look at how social class affects students’ sense of belonging.

    But I’d argue it’s a positive thing to do. Bringing it into the light and confronting and acknowledging the problem means that we can move forward to make our students’ lives better.

    Read the full report here, including recommendations, and the university’s comments.
