Tag: Wonkhe

  • Identifying “mickey mouse” courses | Wonkhe

    St Valentine’s Day, 1966. Salem, Oregon.

    State legislator Morris Crothers (Salem-R), a qualified doctor, is unhappy with a Bachelor’s degree in Medical Technology offered by the Oregon Technical Institute (OTI, formerly Oregon Polytechnic).

    The Capital Journal reported his words:

    a mickey mouse degree that would not allow those earning it to practice in most Oregon laboratories.

    His issue isn’t with the content of the degree, but with his perception that it does not qualify a graduate to perform certain licensed tests (including the pre-marriage test for syphilis) in the state of Oregon. I say perception because it turned out he was wrong and the course was accredited – 131 graduates were already employed by the state. His real issue was that OTI wasn’t a proper four-year college, and had low entry requirements.

    OTI chancellor RE Lieuallen responded (as recorded in The Oregonian): “Here we get into the question of the liberal arts background … some people would say that a job-oriented programme is better”.

    Crothers withdrew his accusation, claiming “the news media quoted me a little out of context”.

    This is the earliest published newspaper use of the pejorative term “mickey mouse degree”. And it betrayed a lack of understanding, and a certain level of snobbery, rather than academic failings.

    From Morris to Maurice

    In the academic literature, a letter to the journal American Speech from Michigan State University’s Maurice Crane slightly predates Salem’s tawdry tale: in 1958 (volume 33, number 3) his letter (“Vox Bop”) offers a partial lexicon of historic midwestern jazz slang, in which he observes:

    Incidentally, a mickey or Mickey Mouse band is not merely a ‘pop tune’ band … but the kind of pop band that sounds as if it is playing background for an animated cartoon. […] This term, which has been around almost as long as Mickey Mouse himself, has also come into common parlance in another sense at Michigan State, where a ‘Mickey Mouse course’ means a snap course, or what Princeton undergraduates in my day called a gut course

    It’s unhelpful to have slang defined by reference to earlier slang, but Collins dictionary tells us a snap course was “an academic course that can be passed with a minimum of effort”.

    For things dismissed as “hobby courses” – usually arts, crafts, and leisure pursuits – there is a suspicion that such provision lacks academic rigour. The economic value argument is less pronounced here – the sheer size of the Disney industry is just one example of just how much money and time human beings devote to hobbies and interests.

    The jazzman’s derivation is interesting in that jazz is itself based on “pop tunes” – the distinction Crane draws is around the manner of playing rather than the repertoire itself. Whether you play them with a “hip” jazz inflection or a “square” pop sensibility these are difficult tunes that are challenging to play and perform well.

    Morris dancing

    The first UK press sighting of the term was in 1972 – the Nottingham Guardian Journal published a letter from an irate Loughborough resident concerning governance problems at the Institute of Race Relations (a “so-earnest group of sociologists, permissives, and mickey mouse degree holders all speaking at the same time.”)

    Here the mouse is used to imply suspicions about the political project underpinning a degree course – in the same way that the likes of the Taxpayers’ Alliance are able to classify courses on topics as complex and crucial as climate science and mental health as “mickey mouse.”

    Although Margaret Hodge famously used the term in a speech to the Institute for Public Policy Research on 13 January 2003 she did not coin the phrase. Her perhaps ill-chosen words masked the actual intent of her speech – she was attempting to encourage the growth of two-year foundation degree provision in subjects that met the needs of local industry. This is a diametrically opposite position to the one taken by Morris Crothers – which serves to illustrate why the idea has become so useful. A “mickey mouse degree” is simply a term for higher education provision that the speaker doesn’t like.

    Many of the early media examples on this side of the Atlantic are actually playful subversions of the trope (University of Exeter drama lecturer Robin Allan received “Britain’s first PhD on Walt Disney” in 1994 – the Torquay Herald Express tells us that Mickey himself turned up on graduation day!), suggesting that the term had currency long before it was introduced to the parliamentary record. That wasn’t Margaret Hodge either – Liberal Democrat MP Simon Hughes used the phrase to defend the University of Westminster from such attacks in the media, in a debate on the private City of Westminster Bill in June 1995.

    You’re so fine you blow my mind

    So to describe a course as “mickey mouse” is to make a judgement that it is either academically frivolous, politically suspect, or economically worthless: and – importantly – popular. A drawing of an anthropomorphic rodent is worthless, while Mickey Mouse himself is worth billions of dollars to the Disney corporation: to use the term is to ignore a widely perceived value in favour of your own judgement.

    For this reason, a list of “mickey mouse courses” – such as the one published by the Telegraph on 3 January – is the purest expression of the long-running “low quality courses” debate. It floats free of metrics and data simply to reinforce prejudices.

    The 787 courses identified by a researcher (Callum McGoldrick) at the Taxpayers’ Alliance were selected based on his own judgement and assigned to one of five categories:

    • Fashion (including textiles and jewellery)
    • Games (by which I mean courses related to the computer games industry)
    • Media (film, photography, and – with apologies to Maurice Crane – both jazz and popular music)
    • Woke (inevitably – mostly things to do with ethnicity, gender, mental wellbeing, and sustainability)
    • Misc (which includes specifically leisure-linked vocational courses, and more general arts and crafts provision)

    There’s no distinction drawn between undergraduate, postgraduate, and non-credit-bearing provision, and (as the article illustrates) not all of the courses described are currently recruiting or funded via student loans. Courses were drawn from a series of freedom of information requests – so the list, as well as being arbitrary, is not exhaustive. It covers just 51 providers.

    It feels like a horribly labour-intensive way of getting an article into the Telegraph, and as a service to contrarian think-tanks everywhere I’ve built a little tool to optimise the process. Just type a word that makes you angry into the box on the left and you get both a count and a complete list of currently recruiting undergraduate courses with that word in their title, to give you that special tingly feeling.

    [Full screen]
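
    For the terminally curious, the logic here is nothing fancier than a case-insensitive match of your chosen word against course titles. Below is a minimal sketch in Python of how such a counter could work – it assumes a hypothetical CSV extract (ug_courses.csv, with a course_title column) rather than the live dataset behind the embedded version.

    ```python
    # A minimal sketch of the "angry word" course counter, assuming a hypothetical
    # CSV extract of currently recruiting undergraduate courses with a
    # "course_title" column (not the live dataset behind the tool above).
    import csv

    def count_courses(keyword: str, path: str = "ug_courses.csv") -> tuple[int, list[str]]:
        """Return how many course titles contain the keyword, plus the matching titles."""
        keyword = keyword.lower()
        with open(path, newline="", encoding="utf-8") as f:
            titles = [row["course_title"] for row in csv.DictReader(f)]
        matches = [title for title in titles if keyword in title.lower()]
        return len(matches), matches

    if __name__ == "__main__":
        total, hits = count_courses("golf")
        print(f"{total} currently recruiting undergraduate courses mention 'golf'")
        for title in hits:
            print(" -", title)
    ```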

    The bigger question

    In a 2003 article for the Guardian, Emma Brockes examined the “mickey mouse” course industry in the light of Margaret Hodge’s comments, observing that “every generation has its Mickey Mouse degrees – arts subjects were mocked in the 60s and 70s, sociology in the 80s and gender studies in the 1990s.” She noted:

    “There are degrees made ludicrous by virtue of their specificity (a BA (Hons) in air-conditioning). There are degrees ridiculed for their non-specificity (citizenship studies, which, to its detractors, is so broad that it might as well be called “shit that happens in the world” studies). There are the apparent oxymorons – turfgrass science, amenity horticulture, surf and beach management and the BSc from Luton University in decision-making, which begs the cheap but irresistible observation, how did those on the course manage to make the decision to take it in the first place?”

    She hangs her piece on an interview with the news editor of the Coalville and Ashby Times – one Paul Marston, a recent media studies graduate from De Montfort University. Though he does mount a defence (which Brockes rather snootily describes as half-hearted) of the relevance and interest of his degree, he laments that:

    I’m finding it difficult to move on in my career now, and I do put that down partly to my degree. It was very general, very broad, good for keeping my options open, but it doesn’t seem to have prepared me for anything much else.

    The early 00s were perhaps not the most auspicious point to begin a career in local journalism, but LinkedIn does confirm that Marston has had a successful career in media and communications – currently leading internal communications for defence company MBDA. It’s not clear that his media studies degree directly prepared him for that role, but it feels reasonable to suggest it may have had an impact, in the same way that niche, broad, and oxymoronic courses help graduates into careers all the time.

    The “mickey mouse” accusations seldom have much to do with actual concerns about course quality. You’ll look in vain for any sign of the kind of courses that OfS and DfE are currently concerned about (franchise delivery, business studies), and the only time you see a link to metrics is with graduate salaries (which, I would argue, says more about low pay in certain industries than any failings of the courses themselves).

    It is easy and unsatisfying to critique the methodology, because (as with everything like this) the methodology isn’t the point. The prejudice, and the way people respond to it, is a much bigger issue.

    On the recent Taxpayers’ Alliance efforts, researcher Callum McGoldrick told the Telegraph:

    Taxpayers are sick of seeing their hard-earned cash subsidise rip-off degrees that offer little to no return on investment. These Mickey Mouse subjects are essentially a state-sponsored vanity project where universities fill their coffers while the public picks up the tab for loans which will never come close to being fully repaid. We need to stop funding hobby courses and start prioritising rigorous subjects that actually boost the economy and deliver value for money

    As much as the current fashion for skills planning (at levels from the local to the global) and vocational training speaks to the anxieties of a government and nation increasingly unsure of themselves in a radically changing world, there’s also a sense in which it is a kind of play-acting. Sometimes we don’t think the skills we need are skills at all: while manufacturing in the UK is healthier than popularly imagined, we obsess over ensuring we have the skills we need to do that, while far less attention is devoted to the myriad professions that keep our theatres and venues delighting audiences. We clearly need both for our economy to thrive.

    In 1960s Salem, Morris Crothers was concerned about prestige and employer value – but his perspective was at odds with that of actual employers. Mickey Mouse, as a cipher for the immense value embedded in things we dismiss or fail to understand, betrayed his anxiety that an old order was being disrupted and a new one was being born.

    Whatever the next ten years look like, in the sector or the wider economy, our starting point has to be that what will be economically (or humanly) valuable – and what skills are needed to make that value work – are at best unclear. The state may have a legitimate interest in the overall mix of subjects provided: it may have an interest as a purchaser where particular skills or expertise will be needed.

    But we also need to admit as a society that we don’t know what we will need, and that learning for the sake of learning has a value that is a little bit harder to measure.

  • Universities, climate, and COP30 | Wonkhe

    It was announced in October that Earth has reached its first catastrophic climate tipping point, with warm water coral reefs facing long-term decline.

    The report was produced by the University of Exeter’s Global Systems Institute, a world-leading centre in climate change research, and it carries the stark warning of further impacts – like melting of the polar ice sheets, and dieback of the Amazon rainforest – that “would cascade through the ecological and social systems we depend upon, creating escalating damages.”

    Despite such findings emphasising the need for faster and more effective progress, the consensus that existed in 2019 about the climate action needed to propel change has given way to scepticism and cynicism in some quarters.

    Political parties and influential figures in the UK (and beyond) are turning their attention away from the climate crisis and Net Zero, if they don’t dismiss it outright. The Conservatives have pledged to repeal the Climate Change Act, claiming it has harmed the economy. Reform supports the continuing use of fossil fuels. Donald Trump has recently called climate change a ‘con job’.

    So political support for Net Zero – and by extension for action on the climate emergency – is shaky. As a result, there is a risk that climate action is perceived by the wider public as unfair, as something that imposes a negative economic impact on their lives, and as out of touch with the concerns and needs of ordinary people.

    A fair COP?

    With these developments forming part of the backdrop to the COP30 global climate summit, which took place in Brazil in November, and mindful of the need to transform the narrative on climate action, the President of COP30 (Ambassador André Aranha Corrêa do Lago) made it his mission to turn the story about tipping points from one of doom and danger into one of hope, opportunity, and possibility.

    Although confidently billed in advance as “the implementation COP”, in fact it concluded as anything but. One positive outcome of the event was agreement to establish a just transition mechanism that would enable equitable and inclusive transitions for communities of workers in high-carbon industries shifting to clean energy and a climate-resilient future – though participation in the mechanism is voluntary.

    There was a modest step towards the phasing out of fossil fuels amidst accounts of fractious talks and frantic negotiations. Outside, indigenous groups protested, and a thunderstorm caused flooding and brought down trees – the climate crisis literally visiting the venue. Some commentators speak of underwhelm and disappointment, judging that COP30 did not deliver a turning point, and that not much will change if climate action is left to governments alone – it falls instead to other organisations and individuals to take collective responsibility.

    What it means for the sector

    Universities, with their core mission of delivering for the public good, have a pivotal role here. A poll published by the University of Cambridge last year demonstrated that nearly two-thirds of adults expect universities to come up with ways of fixing the climate crisis. The need for universities to “[ground] the realities of a sustainable future in the day to day of people’s lives” was advocated by James Coe in Wonkhe three years ago.

    How does what UK universities are doing on climate action now reflect these matters and respond to the attendant challenges?

    Here are some examples that offer reasons to have hope for the future.

    The reassuring starting point is that there continues to be consensus within higher education about the need for climate action. The UK Universities Climate Network includes academic and professional services staff from over 90 institutions advancing climate action and promoting a “zero carbon, resilient future”, and the University Alliance of professional and technical universities aims to find “practical solutions to pressing climate challenges, making a difference for people in their everyday lives”.

    Much of what universities do to exert influence on climate action is through research grounded in science which seeks to inform policy-making by developing human understanding of the consequences of climate change, and how its effects will play out in different geographical contexts. Leading academics in climate change contributed to the National Emergency Briefing in Westminster on 27 November, a gathering of political, media and business figures.

    Meanwhile, research that leads to the design and implementation of scientific solutions provides tangible evidence of public benefit. The Sheffield Institute for Sustainable Food has recently published work on transforming food systems, addressing public health and biodiversity challenges. Its recommendations include incentivising the growing of beans and peas, which are healthy for people, require less energy, land, and water to grow, have a lower carbon footprint than animal products, and are good for the soil.

    A connection is made

    One thing that’s persuasive about this is how it relates climate action to real lives and issues that carry significance for people, like health, food, and – this is the UK after all – the weather. Another fascinating example in this vein is the Weathering Identity: Weather and Memory in England project at the King’s Climate Research Hub. This involved the gathering of oral histories of the weather and how it has shaped individual lives, and considering how more frequent extreme weather events might alter human memory and sense of place.

    A different way in which universities convince with their action on climate change, especially in a world where many are cynical about the established order, is by demonstrating they are not simply acting in a business-as-usual way and ‘admiring the problems’ caused by current policy. One example of critical thinking with reflection on radical policy change is James Dyke’s System Update film, which challenges the “incremental and timid policies of today” in search of a better world, putting the spotlight on the role that continuous economic and material growth has in causing ecological degradation. The film also suggests that citizens’ assemblies of people who are invested in meeting climate change challenges in the longer term could play a larger role in determining policy. In recent years several UK universities have convened their own climate assemblies mobilising students and staff.

    Living through change

    Building on that idea of public empowerment, there is a strand of university research focusing on the abilities and education needed to help people live through and address the challenges of climate change. Researchers at University College London have suggested ways of embedding climate education into the secondary curriculum, promoting emotional engagement with the climate crisis as a means of helping young people avoid negative feelings, and motivating them to take action.

    Environmental and social justice go hand in hand and feature prominently in university climate action. The Priestley Centre for Climate Futures in Leeds has a study looking at how the climate crisis, decarbonisation, and net zero will impact the world of work, with the guiding principle that climate transformation measures should be just and fair for workers.

    And across the Pennines, the JUST centre based at the University of Manchester but involving researchers from a group of northern institutions focuses on the pursuit of sustainability transformations that are people-centred, joined-up and socially just for citizens in regions that benefit the least from dominant economic and political systems.

    The field of arts and cultures has always played a part in inspiring reflection – which is itself a form of action – sparking emotions, and firing the imagination of its audience, with the potential to ultimately lead that audience to doing something transformative. VOTUM at Hadrian’s Wall is an art project supported by Newcastle University, part of a programme where artists are invited to undertake research inspired by Roman archaeology and climate research. Interlocking mirrored shields arranged in the shapes of artefacts found at the Wall reflect the sky and show the viewer themselves in the landscape, holding a mirror up to people and challenging them to think about their impact on the environment.

    What the examples above show is that, through work responding to climate change, universities are collectively addressing some of the concerns that are important to everyone whatever their background: healthy living, sustainable places and communities, and empowering people to maximise their potential.

    They are aligned with the narrative of the future that the COP30 President said should be “not imposed by catastrophe, but designed through cooperation”. Universities are committed to urgent climate action, but the story these examples offer is not gloomy, alienating, or dispiriting; it is engaging, inclusive and hopeful.

    It has the power to counteract climate denialism and allay doubts over net zero; it speaks of collective responsibility and points towards the possibility of a world that is fairer and greener.

  • Alterni-TEF, 2026 | Wonkhe

    The proposal that the Office for Students put out for consultation was that the teaching excellence framework (TEF) would become a rolling, annual event.

    Every year would see an arbitrary number of providers undergo the rigours (and hefty administration load) of a teaching excellence framework submission – with results released to general joy/despair at some point in the autumn.

    The bad news for fans of medal-based higher education provider assessments is that – pending the outcome of the recent ask-the-sector exercise and another one coming in the summer – we won’t get the first crop of awards until 2028. And even then it’ll just be England.

    No need to wait

    Happily, a little-noticed release of data just before Christmas means that I can run my own UK Alterni-TEF. Despite the absence of TEF for the next two years, OfS still releases the underlying data each year – ostensibly to facilitate the update of the inevitable dashboard (though this year, the revised dashboard is still to come).

    To be clear, this exercise is only tangentially related to what will emerge from the Office for Students’ latest consultation. I’ve very much drawn from the full history of TEF, along similar lines to my previous releases.

    This version of the TEF is (of course) purely data driven. Here’s how it works.

    • I stole the “flags” concept from the original TEF – one standard deviation above the benchmark on an indicator is a single flag [+], two is a double flag [++], and one or two standard deviations below the benchmark gives me a single [-] or double [- -] negative flag. I turned these into flag scores for each sub-award: [++] is 2, [- -] is minus 2, and so on. This part of the process was made much more enjoyable by the OfS decision to stop publishing standard deviations – I had to calculate them myself from the supplied (95 per cent) confidence intervals. There’s a rough sketch of this scoring logic after this list.
    • If there’s no data for a split metric at a provider, even for just one flag, I threw it out of that competition. If you can’t find your subject at your provider, this is why.
    • For the Student Outcomes sub-award (covering continuation, completion, and progression) three or more positive flags (or the flag score equivalent of [+++] or above) gives you a gold, and three or more negative flags or equivalent gives you a bronze. Otherwise you’re on silver (there’s no “Needs Improvement” in this game, and a happy absence of regulatory consequences).
    • For the Student Experience sub-award, the flag score equivalent of seven or more positive flags lands you a gold, seven or more negative gets you a bronze.
    • Overall, if you get at least one gold (for either Outcomes or Experience) you are gold overall, and if you get at least one bronze you are bronze overall – except that one gold and one bronze nets out to silver. Otherwise you get a silver.
    • There are different awards for full-time, part-time, and apprenticeship provision. In the old days you’d get your “dominant mode”; here you get a choice (though, as above, if there’s no data on even one indicator, you don’t get an award).
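
    To make those rules concrete, here’s a rough sketch of the scoring logic in Python. The indicator values are placeholders, and recovering a standard deviation from the published 95 per cent confidence interval assumes a symmetric interval (half the width divided by 1.96) – my working assumption, since OfS no longer publishes the figure directly.

    ```python
    # A sketch of the Alterni-TEF scoring rules described above, with hypothetical
    # indicator values. Assumes the 95% confidence interval is symmetric, so the
    # standard deviation is recovered as half the interval width divided by 1.96.

    def flag_score(indicator: float, benchmark: float, ci_lower: float, ci_upper: float) -> int:
        """Convert one indicator into a flag score: +2, +1, 0, -1 or -2."""
        sd = (ci_upper - ci_lower) / (2 * 1.96)  # assumption: symmetric 95% CI
        diff = indicator - benchmark
        if diff >= 2 * sd:
            return 2    # double positive flag [++]
        if diff >= sd:
            return 1    # single positive flag [+]
        if diff <= -2 * sd:
            return -2   # double negative flag [- -]
        if diff <= -sd:
            return -1   # single negative flag [-]
        return 0

    def sub_award(total_flag_score: int, threshold: int) -> str:
        """Gold if the flag score reaches the threshold, bronze if it reaches minus
        the threshold, otherwise silver. Outcomes uses 3, Experience uses 7."""
        if total_flag_score >= threshold:
            return "Gold"
        if total_flag_score <= -threshold:
            return "Bronze"
        return "Silver"

    def overall_award(outcomes: str, experience: str) -> str:
        """One gold makes you gold, one bronze makes you bronze; a gold plus a
        bronze (or neither) nets out to silver."""
        awards = {outcomes, experience}
        if "Gold" in awards and "Bronze" in awards:
            return "Silver"
        if "Gold" in awards:
            return "Gold"
        if "Bronze" in awards:
            return "Bronze"
        return "Silver"

    # Hypothetical example: outcomes flag scores of +1, +2, +1; experience total of 2
    outcomes = sub_award(1 + 2 + 1, threshold=3)   # "Gold"
    experience = sub_award(2, threshold=7)         # "Silver"
    print(overall_award(outcomes, experience))     # "Gold"
    ```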

    There are multiple overall awards, one for each split metric. To be frank, though I have included overall awards to put in your prospectus (please do not put these awards in your prospectus), the split metric awards are much more useful given the way in which people in providers actually use TEF to drive internal quality enhancement.

    Because that’s kind of the point of doing this. I’ve said this before, but every time I’ve shown plots like this to people in a higher education provider the response is something along the lines of “ah, I know why that is”. There’s always a story of a particular cohort or group that had a bad time, and this is followed by an explanation as to how things are being (or most often, have been) put right.

    Doing the splits

    In previous years I’ve just done subject TEF, but there’s no reason not to expand what is available to cover the full range of split metrics that turn up in the data. Coverage includes:

    • Overall
    • Subject area (a variant on CAH level 2)
    • ABCS quintile (the OfS associations between characteristics of students dataset)
    • Deprivation quintile (using the relevant variant of IMD)
    • Sex
    • Ethnicity
    • Disability indicator
    • Age on course commencement (in three buckets)
    • Graduate outcomes quintile
    • Level of study (first degree, other undergraduate, UG with PG components)
    • Partnership type (taught in house, or subcontracted out)

    The release also purports to contain data on domicile (I couldn’t get this working with my rules above) and “year”, which refers to a year of data within each metric. To avoid confusion I haven’t shown these.

    In each case there is different data (and thus different awards) for full-time, part-time, and apprenticeship provision. It’s worth noting that where there is no data, even for a single indicator, I have not shown that institution as having provision for that split. So if you are standing in your department of mathematics wondering why I am suggesting it doesn’t exist, the answer is more than likely that there is missing data for one of your indicators.

    Here, then, is a version that lets you compare groups of students within a single institution.

    [Full screen]

    And a version that lets you look at a particular student subgroup for all providers.

    [Full screen]

    For each, if you mouse over an entry in the list at the top, it shows a breakdown by indicator (continuation, completion, and progression for student outcomes; six cuts of National Student Survey question group data for student experience) at the bottom. This allows you to see how each indicator compares against the benchmark, to view flag scores (the colours) by indicator, and to see how many data points are used in each indicator (the grey bar, showing the denominator).

    More detail

    The Office for Students did briefly consider the idea of a “quality risk register” before it was scrapped in the latest round of changes. In essence, it would have pointed out particular groups of students where an indicator was lower than what was considered normal. To be honest, I didn’t think it would work at a sector level as well as at the level of the individual institution – and I liked the idea of including NSS-derived measures alongside the outcomes (B3) indicators.

    So here’s an alternative view of the same data that allows you to view the underlying TEF indicators for every group (split) we get data for. There’s a filter for split type if you are interested in the differing experience of students across different deprivation quintiles, ethnicities, subjects, or whatever else – but the default view lets you view all sub-groups: a quality risk register of your very own.

    [Full screen]

    Here the size of the blobs shows the number of students whose data is included in each group, while the colour shows the TEF flag – a quick way to grasp the significance and direction of each finding.

    Understanding the student experience in 2026

    Data like this is the starting point for a series of difficult institutional conversations, made all the harder by two kinds of financial pressure: that faced by students themselves (and affecting the way they are able to engage with higher education) and that faced by providers (a lack of resources, and often staff, to provide supportive interventions).

    There’s no easy way of squaring this circle – if there were, everyone would be doing it. The answers (if there are answers) are likely to be very localised and very individual, so the wide range of splits from institutional data available here will help focus efforts where they are most needed. Larger providers will likely be able to supplement this data with near-real-time learner analytics – for smaller and less established providers, releases like this may well be the best available tool for the job.

    More than ever, the sector needs to be supplementing this evidence with ideas and innovations developed by peers. While a great deal of public funding has historically been put into efforts to support the sharing of practice, in recent years (and with a renewed emphasis on competition under the Office for Students in England) the networks and collaborations that used to facilitate this have begun to atrophy.

    Given that this is 2026, there will be a lot of interest in using large language models and similar AI-related technologies to support learning – and it is possible that there are areas of practice where interventions like this might be of use. As with any new technology, the hype can often run ahead of the evidence – I’d suggest that Wonkhe’s own Secret Life of Students event (17 March, London) would be a great place to learn from peers and explore what is becoming the state of the art, in ways that do not endanger the benefits of learning and living at a human scale.

    Meanwhile it is ironic that an approach developed as an inarguable data-driven way of spotting “poor quality provision” has data outputs that are so useful in driving enhancement but are completely inadequate for their stated purpose. Whatever comes out of this latest round of changes to the regulation of quality in England we have to hope that data like this continues to be made available to support the reflection that all providers need to be doing.

  • Higher education postcard: Ashridge | Wonkhe

    We looked a few weeks ago at Philip Stott College; this week we’ll go to the Bonar Law Memorial College, its rival and successor, and see what happened there.

    Earlier in my career, when I worked at what is now City St George’s, I was obliged to visit Ashridge in my official capacity. A magnificent stone building, with wonderful medieval fireplaces and mullioned windows; the childhood home of Elizabeth I, rich in history.

    Except, of course, that Ashridge House was built in the early nineteenth century. All of that history took place at Ashridge Priory, which stood on the same site but was demolished in 1803. And Ashridge House is now Grade I listed, with its grounds Grade II listed. It’s a fake, but it’s a glorious fake.

    It was built under the auspices of John Egerton, 7th Earl of Bridgewater. He was a descendant of Thomas Egerton, Lord Keeper of the Great Seal and Lord Chancellor under Elizabeth I and James VI and I, and a relative of Francis Egerton, 3rd Duke of Bridgewater, the canal magnate. And when complete it eventually passed into the Brownlow family, who in 1921 began to sell off the house and the grounds – much of the latter going to the National Trust.

    The house was bought by Urban H Broughton. Broughton was a civil engineer who in 1884 went to the USA to promote a hydro-pneumatic sewerage system. He clearly did well there: he promoted the system at the 1893 World’s Fair in Chicago, and in 1895 was hired by oil tycoon Henry Rogers to install it in his home community. And there he met Cara Leland Duff, Rogers’ widowed daughter. Sparks flew; they married. And, years later, he returned to Britain with his family, a rich man. He became a personage in society and a Conservative MP, and was just about to be ennobled when he died.

    Before he died, however, he gave Ashridge House to the Conservative Party to be used as a staff college.

    And so the Bonar Law Memorial College was born. Its original trustees were a roll-call of the Conservative party’s great and good: Stanley Baldwin MP, John Colin Campbell Davidson MP, Baron Fairhaven, John William Beaumont Pease, Viscount Hailsham, Neville Chamberlain MP, Viscount Astor, Col. John Buchan (he of The Thirty-Nine Steps), Viscountess Bridgeman, and Lady Greenwood, amongst others. The Leader and Chaiman [sic] of the Conservative and Unionist Party were trustees ex officio. It was named for Andrew Bonar Law, Prime Minister from 1922 to 1923.

    The Bonar Law Memorial College opened in 1929; it became known as a college of citizenship. During WW2 it was used as a field hospital. And it seems that its time as a Conservative college was not without tensions between the party and the college. Which is probably inevitable: the periodicity of vicissitudes in politics is, I claim, shorter than the periodicity of change in ideas and curricula.

    By 1954 the political nature of the college was coming to an end. By an Act of Parliament – the Ashridge (Bonar Law Memorial) Trust Act 1954 – the college became non-partisan, and known as the Ashridge Management College. It seems that the charitable aims were focused on the UK and the Commonwealth, meaning that the Ashridge (Bonar Law Memorial) Trust Act 1983 was necessary to enable the college to recruit students from countries outside the Commonwealth.

    In the 1990s Ashridge was validated by City University – which was how I got to go there – but then gained its own degree awarding powers. And rightly so. In 2015 it became part of Hult International Business School and now hosts executive education.

    The card is undated and unposted, but judging by the cars parked out front I would guess it stems from the 1950s, after it had become Ashridge. Here’s a jigsaw of the card – it’s a really tricky one this week!

  • Podcast: Labour Conference 2025 | Wonkhe

    This week on the podcast, as the dust settles on Labour conference in Liverpool, we unpack what Keir Starmer’s new higher education participation target really means – and whether universities have the capacity and funding to meet the moment.

    We also get into the surprise return of targeted maintenance grants – funded, controversially, by the levy on international student fees – and we reflect on the wider political atmosphere at the conference, from policy signals to sector perceptions, and what all this might tell us about Labour’s emerging offer and forthcoming White Paper.

    With Gary Hughes, Chief Executive at Durham Students’ Union, Eve Alcock, Director of Public Affairs at QAA, Michael Salmon, News Editor at Wonkhe and hosted by Jim Dickinson, Associate Editor at Wonkhe.

    The PM’s announcement on higher level participation is a win for the HE sector

    The fifty per cent participation target is no more. Again.

    Grants return, the levy stays

    Maybe the levy just moves money to where it’s needed most

    The Augar review is back, baby. Just don’t talk about yourself

    Students are being othered again – and everyone loses out

    Have universities got the capacity and cash to respond to the government’s agenda?

    How much should the new maintenance grant be?

    Universities should be central to rebuilding communities

    Students are working harder than ever – because they have to

    I have a lot of questions about the LLE

    Who’s ready for a debate at 9.30am on a Sunday?

    The education policy trap: will the Augar review avoid the mistakes of the past?

    You can subscribe to the podcast on Apple Podcasts, YouTube Music, Spotify, Acast, Amazon Music, Deezer, RadioPublic, Podchaser, Castbox, Player FM, Stitcher, TuneIn, Luminary or via your favourite app with the RSS feed.

  • The Wonkhe HE staff survey – how good is work in higher education?

    As financial pressures continue to bear down on higher education institutions across the UK, there is a visible impact on higher education staff, as resources shrink, portfolios are rationalised, and redundancy programmes are implemented. These are definitively tough times for the sector and its people.

    One way this plays out is in the industrial relations landscape, with unions balloting for industrial action, as well as, on some specific issues, advancing joint work with employers.

    But there is a wider, arguably more nuanced, lens to bring to bear, about how the current circumstances are reshaping staff experiences of working in higher education, and what options are available to those with responsibility for leading and supporting higher education staff.

    When the Wonkhe team came up with the idea of running a national survey for higher education staff we knew from the outset that we would not be able to produce definitive statements about “the HE staff experience” derived from a representative sample of responses. There is no consensus over how you would define such a sample in any case.

    The best national dataset that exists is probably found in UCEA publications that combine institutional staff experience survey datasets at scale – one published in 2024 titled “What’s it really like to work in HE?” and one in May this year diving into some of the reported differences between academic and professional staff, “A tale of two perspectives: bridging the gap in HE EX”.

    Instead we wanted to, firstly, ask some of the questions that might not get asked in institutional staff surveys – things like, how staff feel about their institution’s capacity to handle change, or the relative importance of different potential motivating factors for working in HE, or, baldly, how institutional cost-cutting is affecting individuals. And secondly, as best we can, to draw out some insight that’s focused on supporting constructive conversations within institutions about sustaining the higher education community during challenging times.

    We’ll be reporting on three key areas:

    1. “Quality of work” – discussed further below
    2. Professional motivations, the relative importance of different motivators for our sample group, and the gap between the level of importance afforded key motivators and the extent to which respondents believe they actually get to experience these in their roles – DK has tackled that subject and you can read about his findings here
    3. Views on institutional change capability – coming soon!

    We’ve not covered absolutely everything in this tranche of reporting – partly because of time pressures, and partly because of format constraints. We have a fair bit of qualitative data to dive into, as well as the third area of investigation on institutional change capability all still to come – watch this space.

    The methodology and demographics bit

    We promoted the survey via our mailing list (around 60,000 subscribers) during July and August 2025, yielding a total of 4,757 responses. We asked a whole range of questions that we hoped could help us make meaningful comparisons within our sample – including on things like nationality, and type and location of institutions – but only some of those questions netted enough positive responses to allow us to compare two or more good-sized groups.

    Our working assumption is that if there was a group of around 500 or more who share a particular characteristic it is reasonable to compare their responses to the group of respondents who did not have that particular characteristic. We have conducted analysis of the following subgroups:

    • Career stage: Early career (n=686), mid career (n=1,304), and late career (n=2,703)
    • Those with an academic contract (n=1,110) and those with a non-academic contract (n=3,394) – excluding some other kinds of roles/contracts
    • Time in higher education: five years or fewer (n=908); 6-10 years (n=981); 11-20 years (n=1,517) and more than 20 years (n=1,333)
    • Working arrangements: on-site (n=988); working from home or remotely (n=475); and flexible/hybrid (n=3,268)
    • Leadership role: respondents who said they have formal management or leadership responsibility in their current role for projects, programmes, resources, or people (n=3,506), and those who did not (n=1,214)

    And we also looked at the following identity characteristics:

    • Gender: men (n=1,386) and women (n=3,271)
    • Sexuality: those who identified as gay, lesbian, bisexual or queer (n=654) and those who did not (n=4,093)
    • Ethnicity: those who identified as being of a minoritised ethnicity (n=247) and those who did not (n=4,444)
    • Disability: those who identified as being disabled (n=478) and those who did not (n=4,269)

    In one case – that of respondents who identified as being of a minoritised ethnicity – our sample didn’t meet the threshold for wholly robust analysis, but we found some differences in reported experience, which we think is worth reporting given what we already know about this group of staff, and would caution that these findings should be viewed as indicative rather than definitive.

    In some cases we have combined subgroups to make larger groups – for example we’ve grouped various academic roles together to compare with roles on other kinds of contracts. In others we’ve ignored some very small (usually n=3 and below) groups to make for a more readable chart; for this reason we don’t often show all responses. And although our response rates are high you don’t have to refine things much to get some pretty low numbers, so we’ve not looked at intersections between groups.

    We have reported where we found what we considered to be a meaningful difference in response – a minimum of four percentage points difference.
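
    As a concrete illustration of that comparison rule, the sketch below shows the kind of check involved, using pandas on a tiny made-up respondent table (an academic_contract flag and an agrees flag are hypothetical column names) rather than the actual survey data.

    ```python
    # Illustrative sketch of the subgroup comparison: compare agreement rates for
    # respondents with and without a characteristic, and flag a gap of four
    # percentage points or more. The rows below are made up purely to show the sums.
    import pandas as pd

    df = pd.DataFrame({
        "academic_contract": [True, True, True, False, False, False, False, False],
        "agrees":            [False, False, True, True, True, True, False, True],
    })

    # Per cent agreement for each group
    rates = df.groupby("academic_contract")["agrees"].mean() * 100
    gap = abs(rates.loc[True] - rates.loc[False])

    print(rates.round(1))
    if gap >= 4:  # the reporting threshold described above
        print(f"Meaningful difference: {gap:.1f} percentage points")
    ```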

    The financial context

    88 per cent of respondents said their institution has taken material steps to reduce costs in the last 12 months – offering background context for answers to the wider survey, and assurance that what we are looking at is staff views against a backdrop of change.

    51.6 per cent said they personally had been negatively affected by cost reduction measures, while 41.9 per cent said the personal impact was neutral. This suggests that while cost reduction may be widely viewed as negative, that experience or the views that arise from it may not be universal.

    Of those who said they had been negatively affected we found no meaningful differences among our various comparator groups. Leaders and those later in their career were as likely to report negative impacts as those without leadership responsibilities or earlier in their career, suggesting that there is little mileage in making assumptions about who is more likely to be negatively impacted – though of course we did not try to measure the scale of the impact, and we’re mindful we were talking to people who had not lost their jobs as a result of cost-saving measures.

    The one exception was between those on academic contracts, of whom nearly two thirds (65.3 per cent) reported negative impacts, and those on non-academic contracts, of whom closer to half (47.4 per cent) did so. This difference gives important context for the wider findings, in which those on academic contracts are consistently more likely to offer a negative perspective than those on non-academic contracts across a range of questions. This tallies to some degree with the national picture explored in UCEA’s “Bridging the gap” report, in which academics were more likely than professional staff to report challenges with workload, work-life balance, and reward and recognition – though also higher levels of work satisfaction.

    Regretting and recommending HE

    We asked whether, taking into account what is known about other available career paths, respondents feel that choosing to work in HE was the right decision for them – two thirds said yes (66.9 per cent), while 23.8 per cent were unsure. Only 9 per cent said no.

    Those approaching the end of their career were more likely to agree (74.3 per cent) compared to those mid-career (65 per cent) or early career (61.2 per cent). Those with leadership responsibilities were also slightly more likely to agree, at 68.2 per cent, compared to 62.3 per cent for those without leadership responsibilities.

    Those on academic contracts were slightly less likely to agree, at 60.8 per cent compared to 68.9 per cent for those on non-academic contracts.

    However, the real divide opens up when we look at responses to our follow-up question: whether respondents would recommend a career in higher education to someone they cared about who was seeking their advice. A much smaller proportion of our sample agreed they would recommend a career in HE (42.2 per cent), with much higher rates of “unsure” (32.1 per cent) and “no” (24.5 per cent) – most likely reflecting the impact of current challenges as compared to people’s longer-term lived experience.

    For the recommend question, the career-stage trend reverses, with those approaching the end of their careers less likely to say they would recommend a career in HE (39.2 per cent) compared to 41.6 per cent for those mid-career and 50.4 per cent for early career respondents.

    There was a substantial difference by role: only 25.7 per cent of those on academic contracts would recommend a career in HE, compared to 46.9 per cent of those on non-academic contracts.

    We did not find any differences by gender, ethnicity, disability, or sexuality on either confidence in the decision to work in HE or willingness to recommend it as a career.

    Quality of work

    One of the great things about higher education as an employment sector is that there are lots of ways to be employed in it and lots of different types of jobs. What one person values about their role might be quite different from what another person appreciates – and the same for the perceived downsides of any given role.

    So rather than trying to drill down into people’s reported experiences based on our own probably biased views about what “good work” looks and feels like, we turned to the idea of “quality of work” as a guiding framework for looking at respondents’ experiences and perceptions. We asked 16 questions in total, derived from this 2018 Carnegie UK-RSA initiative on measuring job quality in the UK, which proposes seven distinct dimensions of work quality, including pay and conditions, safety and wellbeing, job design, social support, voice, and work-life balance.

    We also kept in mind that, while support, safety and wellbeing at work are foundational conditions for success, so is effective performance management and the opportunity to apply your skills. In the spirit of Maslow’s hierarchy of needs we clustered our questions broadly into four areas: safety, security, and pay/conditions; the balance between support and challenge; relationships with colleagues; and “self-actualisation” incorporating things like autonomy and meaningfulness.

    For each question, respondents were offered a choice of Strongly disagree, Disagree, Agree, and Strongly agree. Here we report overall levels of agreement (ie Agree and Strongly Agree).

    You can see the full findings for all our comparator groups in the visualisation below.

    [Full screen]

    Headlines on quality of work and interaction with willingness to recommend

    You can see all the workings out below where I’ve gone through the results line by line and reported all the variations we could see, but the TL;DR version is that the quality dimensions that jump out as being experienced comparatively positively are physical safety, good working relationships with colleagues, and meaningfulness of work. Two key areas that emerge as being experienced comparatively negatively are feeling the organisation takes your wellbeing seriously, and opportunities for progression – the level of agreement is startlingly low for the latter.

    We compared the various quality dimensions against whether people would recommend a career in higher education for the whole sample and found that across every question there was a direct correlation between a positive response and likelihood to recommend a career in HE – and the inverse for negative responses. We think that means we’re asking meaningful questions – though we’ve not been able to build a regression model to test which quality questions are making the largest contribution to the recommend question (which makes us sad).

    [Full screen]
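
    For what it’s worth, the model we’d have liked to fit is easy enough to express – a logistic regression of the binary “would recommend” response on the sixteen agreement items. Here’s a sketch of that setup using statsmodels, run on random placeholder data rather than the survey responses (so the coefficients here mean nothing).

    ```python
    # Sketch of the regression we would have liked to run: which quality-of-work
    # items contribute most to willingness to recommend a career in HE? Fitted here
    # on random placeholder data purely to show the setup - not the survey responses.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_respondents, n_items = 500, 16

    # 0/1 agreement with each of the sixteen quality-of-work statements (placeholder)
    X = rng.integers(0, 2, size=(n_respondents, n_items)).astype(float)
    # 0/1 "would recommend a career in HE" (placeholder)
    y = rng.integers(0, 2, size=n_respondents).astype(float)

    model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
    print(model.params)  # first value is the constant; larger coefficients = larger contribution
    ```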

    Going through the various comparator groups for the quality of work questions we find that there are three core “at risk” groups – one of which is respondents of a minoritised ethnicity, which comes with caveats regarding sample size. Another is those on academic contracts, and the third is disabled respondents. These groups did not consistently respond more negatively to every question on quality of work, but we did find enough differentiation to make it worth raising a flag.

    So, to try to find some core drivers for these “at risk” groups, we plotted the response to the “recommend” question against the responses to the quality questions just for these groups. At this point the samples for disabled and minoritised ethnic responses become just too small to draw conclusions – for example, under 100 respondents who identified as being of a minoritised ethnicity said they would not recommend a career in HE.

    However, over 400 of those on academic contracts said they would not recommend a career in HE, so we compared the answers of that group to those of respondents on non-academic contracts who also would not recommend a career in HE (just shy of 700 respondents). Interestingly, for a number of the quality questions there was no differentiation in response between the groups, but there was a noticeable difference for “reasonable level of control over work-life balance”, “able to access support with my work when I need it”, and “opportunities to share my opinion” – in the sense that among the group that would not recommend HE the academic cohort were more likely to give negative responses to these questions, giving a modest indication of possible priority areas for intervention.

    We also found that those who had worked in higher education for five years or fewer were frequently more likely to report agreement with our various propositions about quality work. While there’s clearly some overlap with those early in their career they are not entirely the same group – some may have entered HE from other sectors or industries – though early career respondents do also seem to emerge as having a slightly more positive view as well, including on areas like emotional safety, and wellbeing.

    Safety, security and pay and conditions

    The four statements we proposed on this theme were:

    • I feel reasonably secure in my job
    • I am satisfied with the pay and any additional benefits I receive
    • I feel physically safe at work
    • I feel emotionally safe at work

    On job security, overall two thirds (66.3 per cent) of our sample agreed or strongly agreed that they feel reasonably secure in their job. Those on academic contracts reported lower levels of agreement (57.8 per cent). Those who said they had been employed in higher education for five years or fewer reported higher levels of agreement (71.4 per cent). Respondents who identified as disabled reported slightly lower levels of agreement (61.9 per cent).

    On satisfaction with pay, conditions and additional benefits, overall 63.8 per cent of respondents agreed or strongly agreed that they were satisfied. Those on academic contracts reported lower levels of agreement (56.3 per cent). Those who identified as having a minoritised ethnicity had the lowest levels of agreement of all our various comparators (53.1 per cent), and were twice as likely as those from non-minoritised ethnicities to strongly disagree that they were satisfied with pay and benefits (15.2 per cent compared to 7.9 per cent). Those who identified as disabled had lower levels of agreement (54.6 per cent) compared to those who did not consider themselves disabled (64.9 per cent).

    On physical safety, the vast majority of respondents (95.8 per cent) agreed or strongly agreed they feel physically safe at work with very little variation across our comparator groups. While the overall agreement was similar between men and women, notably men were more likely to register strong agreement (66.3 per cent) than women (51.9 per cent).

    On emotional safety the picture is more varied. Overall 72 per cent agreed or strongly agreed they feel emotionally safe at work. Those who reported being earlier in their career reported higher levels of agreement (78.6 per cent), as did those who reported having worked in the HE sector for five years or fewer (78.6 per cent). Those with academic contracts reported lower levels of agreement (61.6 per cent). Those who identified as having a minoritised ethnicity had lower levels of agreement (62.7 per cent) and were more than twice as likely as those who are not minoritised to strongly disagree that they feel emotionally safe at work (14.2 per cent compared to 6.1 per cent).

    Balance, challenge, and performance

    The four statements we proposed on this theme were:

    • The work I do makes appropriate use of my skills and knowledge
    • I have a reasonable level of control over my work-life balance
    • My organisation demonstrates that it takes my wellbeing seriously
    • My organisation demonstrates that it takes my performance seriously

    On using skills and knowledge 79.2 per cent of our sample agreed or strongly agreed that their work makes appropriate use of their skills and knowledge. There was very little variation between comparator groups – the one group that showed a modest difference was those who reported being disabled, whose agreement levels were slightly lower at 75.3 per cent.

    On control over work-life balance, 80.7 per cent of our sample agreed or strongly agreed they have a “reasonable” level of control. Those who had worked in higher education for five years or fewer were more likely to agree (87.2 per cent). 86.5 per cent of those who work from home agreed, compared to 74.4 per cent of those who work on campus or onsite, and 81.7 per cent of those who have hybrid or flexible working arrangements. Those who reported having leadership responsibilities had lower levels of agreement (78.9 per cent) compared to those who did not (85.9 per cent).

    The biggest difference was between those on academic contracts (66 per cent agreement) and those on non-academic contracts (85.3 per cent agreement). There were also slightly lower scores for disabled respondents (74.7 per cent compared to 81.2 per cent for non-disabled respondents) and for minoritised ethnicities (76.6 per cent compared to 81 per cent for non-minoritised ethnicities).

    On wellbeing, 57.8 per cent of our sample agreed or strongly agreed that their organisation demonstrates that it takes their wellbeing seriously. This was higher for early-career respondents – 60 per cent agreement compared to 57.9 per cent for those in mid-career, and 55.5 per cent for those approaching the end of their career. Agreement was higher for those with five years or fewer in higher education at 68.4 per cent agreement, compared with 54.5 per cent for those with more than 20 years’ experience.

    Those on academic contracts were substantially less likely to agree with only 39.7 per cent agreement that their organisation demonstrates that it takes their wellbeing seriously. Disabled respondents were also much less likely to agree than non-disabled respondents, at 47.7 per cent and 59 per cent respectively. Those working from home reported slightly lower levels of agreement, at 52.6 per cent.

    On performance, 63.1 per cent of our sample reported that their organisation demonstrates that it takes their performance seriously. This was slightly higher for those who had five years or fewer in higher education, at 69.6 per cent. Again, there was a difference between those on academic contracts with 57.8 per cent agreement and those on non-academic contracts, with 64 per cent agreement. Disabled respondents were slightly less likely to agree (58 per cent agreement) than non-disabled (63.8 per cent agreement).

    Relationships with colleagues

    The four statements we proposed on this theme were:

    • I am able to access support with my work when I need it
    • I am given sufficient opportunities to share my opinion on matters that affect my work
    • For the most part I have a good working relationship with my colleagues
    • I generally trust that the people who work here are doing the right things

    On accessing support, 76.2 per cent of our sample agreed they are able to access support when they need it. There was higher agreement among those early in their career at 81.3 per cent, and similarly among those who had worked five years or fewer in HE, at 82.8 per cent. There was lower agreement among those on academic contracts: 62.3 per cent agreement versus 80.5 per cent for those on non-academic contracts. Those from a minoritised ethnicity had lower agreement at 70.6 per cent, as did disabled respondents at 67.4 per cent.

    On opportunities to share opinion, 70.4 per cent of our sample agreed or strongly agreed they were given sufficient opportunities to share their opinion on matters that affect their work. There was a small difference between those who held a leadership role and those who did not, at 71.9 per cent and 66 per cent agreement respectively. Again, those on academic contracts had lower levels of agreement, at 58.2 per cent compared to 73.9 per cent for those on non-academic contracts. Disabled staff also had lower agreement at 60.9 per cent.

    On working relationships, cheeringly, 96.1 per cent of our sample agreed or strongly agreed they have good working relationships with their colleagues. While this held true overall across all our comparator groups regardless of leadership role, working location, personal characteristics or any other factor, it is notable that those of a minoritised ethnicity strongly agreed at a lower rate than those who did not identify as being from a minoritised ethnicity (39.6 per cent strong agreement compared to 48.3 per cent).

    On trust, 70.8 per cent of our sample agreed or strongly agreed that they generally trust that the people they work with are doing the right things. This was slightly lower among those who work from home or remotely, at 65.9 per cent. Agreement was lower among those on an academic contract, at 61.6 per cent, compared to 73.4 per cent of those on a non-academic contract. Agreement was also lower among disabled respondents, at 63.8 per cent.

    “Self-actualisation”

    The four statements we proposed on this theme were:

    • My current job fits with my future career plans and aspirations
    • I am comfortable with the level of autonomy I have in my job
    • There are sufficient opportunities for progression from this job
    • The work I do in my job is meaningful

    On career plans, 76.1 per cent of our sample agreed or strongly agreed that their current job fits with their future career plans and aspirations. Those who said they work from home or remotely had slightly lower levels of agreement at 69.3 per cent. Those who said they do not have any kind of leadership role had slightly lower levels of agreement at 69.4 per cent.

    On autonomy, 82.5 per cent of our sample agreed or strongly agreed they were comfortable with the level of autonomy they have in their job. Those with an academic contract had slightly lower levels of agreement at 77.9 per cent, compared to 83.8 per cent agreement among those on non-academic contracts. Those of a minoritised ethnicity had lower levels of agreement at 73.9 per cent, as did disabled respondents, at 75.9 per cent agreement.

    On progression, a startling 29.5 per cent agreed or strongly agreed that there are sufficient opportunities for progression from their current position. There was a modest difference between those with leadership roles (31.1 per cent agreement) and those without (25 per cent). Those on academic contracts had higher levels of agreement at 38.5 per cent, compared to 26.8 per cent of those on non-academic contracts.

    On meaningful work, 86.1 per cent of our sample agreed or strongly agreed that the work they do in their job is meaningful. Those who work from home or remotely had somewhat lower levels of agreement, at 77.9 per cent, but otherwise this held true across all our comparator groups.

    Aspiration to lead and preparedness to lead

    We asked about whether respondents aspire to take on or further develop a leadership role in higher education, and if so, whether they are confident they know what a path to leadership in higher education involves in terms of support and professional development. These questions are particularly relevant given the generally negative view about opportunities to progress held by our survey respondents.

    Overall, 44.5 per cent of our sample said they aspire to take on or further develop a leadership role. Curiously, this was only slightly higher for those who already have some level of leadership responsibility, at 48.3 per cent. This can be explained to some degree by differentiation by career stage: 58.8 per cent of early career respondents aspired to take on or develop leadership roles, as did 50.9 per cent of mid-career respondents.

    Aspiration to lead was higher among those identifying as lesbian, gay, or bisexual at 52.6 per cent compared to 43.2 per cent for those who did not. Aspirations were also higher among respondents of a minoritised ethnicity, at 54.5 per cent, compared to 43.8 per cent among those not of a minoritised ethnicity.

    We also asked respondents whether they are confident they know what a path to leadership involves in terms of support and professional development, where we found some important variations. Confidence about pathways to leadership was lower among early career respondents, at 22.8 per cent agreement, and even mid-career respondents’ confidence was lower than the numbers reporting they aspire to leadership, at 36.6 per cent.

    While there was no difference in aspiration between respondents on academic contracts and those on non-academic contracts, those on academic contracts were more likely to say they are confident they know what a path to leadership involves, at 50.3 per cent compared to 34.8 per cent.

    While there was no difference in aspiration between men and women respondents, women were slightly less likely than men to report confidence in knowing about the path to leadership, at 37.5 per cent compared to 42 per cent. Those who identify as lesbian, gay or bisexual, those of a minoritised ethnicity, and disabled respondents were also slightly less likely than their comparator groups to express confidence, despite having expressed aspiration to lead at a higher rate.

    These findings around demographic difference suggest that there remains some work to be done to make leadership pathways visible and inclusive to all.

    We’ll be picking up the conversation about sustaining higher education community during tough times at The Festival of Higher Education in November. It’s not too late to get your ticket – find out more here.

  • KEF deserves a boost | Wonkhe

    The Knowledge Exchange Framework (KEF) is excellent in all kinds of ways.

    It eschews the competitiveness of league tables. It provides a multi-faceted look at everything that is going on in the world of knowledge exchange. And it is nuanced in comparing similar kinds of institutions.

    KEF is not overly bureaucratic and it is helpful for universities in understanding where they might improve their knowledge exchange work.

    It is a shame then that the release of the KEF dashboard is not as big a day for the sector as something like REF or even TEF.

    Keep on KEFing on

    The KEF is the friend that would help you move house even if it isn’t the first one you would call for a gossip. It is nice, it is helpful, it is realistic about what is and isn’t working. In the very kindest way possible it is straightforward.

    The problem is that the nuance of the KEF doesn’t make for sensational coverage. There isn’t an up and down narrative, there aren’t really winners and losers, and of course there is no funding attached. It is a mirror to the world of knowledge exchange that simply shows what is going on.

    And if you dig deep enough the stories are good. Queen Mary University of London is doing a superb job at IP and commercialisation, as well as public and community engagement, all while generating £760m of GVA. Birmingham Newman University is playing a significant role in local growth and regeneration through partnerships, placements, collaborations and consultancy. And the University of Plymouth has one of the most complete radar diagrams, with a distinct focus on its maritime work.

    Every single event about how the sector promotes its value discusses the need for universities to have a better story about their places, economic impact, and the tangible impact they make on people’s lives. The KEF is a single source of hundreds of such stories, but somehow it is not cutting through.

    Perhaps one of the reasons is that the consequences of doing badly (whatever badly means in the context of KEF) are so slight. It is not the public shaming tool of the TEF, it is not the funding mechanism of REF, and it doesn’t attract very much media attention. It could have been so different. As Jo Johnson, then Science Minister, said at the launch of KEF:

    Our ambition is that the new KEF will become an important public indicator of how good a job universities are doing at discharging their third mission, just as the REF rewards excellence in research and the TEF rewards excellence in teaching and student outcomes.

    The KEF does not reward anything, but it could (yes – its constituent parts are linked to HEIF but that isn’t quite the same thing.)

    My favourite gains

    Another model of funding distribution is possible. One of the major concerns about the REF is that it is becoming too complex. REF measures inputs and outputs, it looks at impact (though not in the same way as KEF), and there is also the ongoing debate about People, Culture, and Environment as a measure of research excellence.

    To make the REF more manageable and make the KEF more meaningful perhaps it is time to add funding consequences to KEF and just shift the pressure a little bit. Previously, I have made the argument that one way of doing this would be to rationalise all of the funding mechanisms that bump into KEF:

    As a starting point it would be sensible to rationalise HEIF allocations and KEF measurements. Without getting into the weeds at this stage a joint data set would likely draw from an enhanced HE-BCI survey, Innovate UK income, research income, journal data, and non-credit bearing course data from the Office for Students. The most straightforward way would be either to dispense with HEIF entirely and allocate the whole pot to KEF with a strengthened self-assessment element, like in REF, or use KEF as the sole basis for HEIF allocations. This would avoid both double counting funds and reduce administrative burden.

    Given the government agenda around universities and economic contribution, now might be the time to consider going further.

    One measure could be to attach a proper funding formula to KEF. In keeping with the spirit of KEF, each university would still be organised into a cluster, ensuring like is compared with like, and funding would be allocated on a formula basis depending on its contribution to each of the seven areas. Each area would not have to receive the same level of funding. Instead, the government could vary the weightings from time to time depending on national priorities, or alternatively universities could (in advance) make a pitch for their own growth priorities, ensuring they devote energy to, and are rewarded for, where their strengths lie. This would also help with greater specialisation.
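
    To make the mechanics concrete, here is a minimal sketch of how a weighted, cluster-based allocation of that kind might work. It is illustrative only: the provider names, scores, weights and pot size are invented, and the area labels simply echo the KEF perspectives discussed here rather than any actual allocation methodology.

    ```python
    # Illustrative sketch only: a weighted, cluster-based allocation.
    # All providers, scores, weights and the pot size below are invented.

    # Seven knowledge exchange areas, with weights the government (or a
    # provider's own pitch) could vary to reflect national priorities.
    WEIGHTS = {
        "research_partnerships": 1.0,
        "working_with_business": 1.5,
        "working_with_public_and_third_sector": 1.0,
        "skills_enterprise_and_entrepreneurship": 1.0,
        "local_growth_and_regeneration": 2.0,  # e.g. prioritised for growth
        "ip_and_commercialisation": 1.5,
        "public_and_community_engagement": 1.0,
    }


    def allocate(cluster_pot: float, cluster_scores: dict) -> dict:
        """Split one cluster's pot pro rata to each provider's weighted score.

        cluster_scores maps provider -> {area: normalised score between 0 and 1},
        so providers are only ever compared with others in the same cluster.
        """
        weighted = {
            provider: sum(WEIGHTS[area] * score for area, score in scores.items())
            for provider, scores in cluster_scores.items()
        }
        total = sum(weighted.values())
        return {provider: cluster_pot * w / total for provider, w in weighted.items()}


    # Two invented providers in the same cluster sharing a notional £10m.
    example_cluster = {
        "Provider A": {area: 0.6 for area in WEIGHTS} | {"local_growth_and_regeneration": 0.9},
        "Provider B": {area: 0.7 for area in WEIGHTS} | {"ip_and_commercialisation": 0.9},
    }
    print(allocate(10_000_000, example_cluster))
    ```

    The point is the shape of the calculation – weights applied per area, comparison confined to a cluster, allocation pro rata – rather than any particular numbers.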

    Simultaneously, the government could add in a more dynamic competition element that is tied to funding. For example, given the state of the economy it might make sense to provide greater reward for the institutions contributing to local growth and innovation. This then becomes a whole new kind of funding route: support for the things universities are good at, and a gentle nudge toward the things government wishes them to do.

    Something changed

    The trade-offs, and the arguments, would of course be significant. In a world of fiscal constraint one of the trade-offs would be reducing funding allocated through REF or through grants in order to fund KEF.

    Reducing funding through REF may help to reduce some pressure on it but it isn’t clear that reducing the pot for exploratory research would be a net economic good in the long-term. Reducing grant funding would mean simply trading off one lever to direct research activity for another.

    Simultaneously, adding in funding allocations to KEF would undoubtedly make it into a more high-pressure exercise which would then attract costs as universities looked to maximise their returns. The exercise would need to be carefully managed to, as far as possible, rely on public data and limited returns.

    Nonetheless, it seems a wasted opportunity to have an exercise which is primed for measuring engagement between universities and the wider society and economy, at precisely the time there seems to be a consensus that this is a good idea, but with few levers to enhance this work. The benefit of attaching a funding allocation to KEF could be a greater spread of providers rewarded for their work, greater focus on growth and social contribution, and greater attention on the work universities do alongside research and teaching.

    The road to a new kind of KEF is long. However, if the debate about REF has taught us anything, it’s that trying to change a single exercise is exceptionally hard. If the current arrangements feel tired, and reform feels piecemeal, perhaps now is the time to look at the whole ecosystem and move toward a system which prizes universities’ third mission as much as their other work.

  • Students as curriculum critics | Wonkhe

    When a member of staff claimed their course reading list was “diverse” because it included authors from the UK, North America, and Australia, it captured something problematic in higher education: an entrenched Eurocentric worldview dressed as global perspective.

    Despite growing sector-wide commitments to equity, diversity and inclusion, many UK university curricula remain bounded by the ideas, voices and assumptions of the Global North. This isn’t just an issue of representation. It cuts to the core of what and whose knowledge counts and who the curriculum is for.

    Universities have long functioned as what sociologist Remi Joseph-Salisbury calls “white spaces”: spaces where whiteness is not just numerically dominant, but culturally embedded. When students from minoritised backgrounds find their lived experiences absent from course readings, case studies, or teaching examples, the message is clear: they are not the imagined subject of the curriculum.

    Lost in translation

    This exclusion is rarely intentional. However, its effects are deeply felt. Students must expend emotional labour to navigate, challenge or mentally translate course content that does not reflect their experiences or worldviews. For some, this produces what scholars such as Smith and colleagues in 2014 have termed racial battle fatigue.

    Language plays a significant role here. In many classrooms standard academic English, which is rooted in the speech of white, middle-class Britons, is upheld as the norm. Yet this language can be very different to that adopted by young people. Students who speak dialects such as Multicultural London English (MLE), or whose cultural references differ from the mainstream, are often seen as less articulate or capable. Even institutional communications, usually packed with acronyms like NSS, OfS, PGCert, and so on, can create an impenetrable culture that alienates students unfamiliar with higher education’s bureaucratic vernacular.

    Such institutional language is frequently seen in formal assessment briefs and feedback mechanisms, and so existing insecurities and barriers are reinforced. Minoritised students can struggle to link the assessments to their own lived realities, and performance is impacted. These are factors that will contribute towards the ethnicity degree awarding gap at many universities, where white students tend to be awarded more top grades at the end of their studies than students from minoritised backgrounds.
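
    For readers less familiar with the metric, a minimal sketch of the usual calculation is below. The figures are invented for illustration, and the “first or 2:1” threshold reflects common sector reporting practice rather than anything specified in this article.

    ```python
    # Illustrative only: the awarding gap is usually reported as the percentage-point
    # difference between the share of each group awarded a first or 2:1.
    # These figures are invented, not drawn from any institution.
    white_top_grades = 0.82        # share of white students awarded a first or 2:1
    minoritised_top_grades = 0.68  # share of minoritised students awarded a first or 2:1

    gap = (white_top_grades - minoritised_top_grades) * 100
    print(f"Awarding gap: {gap:.1f} percentage points")  # 14.0
    ```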

    From audit to agency

    To address the ethnicity degree awarding gap, universities are now moving away from peer review or external examiner reports as assessments of teaching and learning materials. In their place, models of evaluation have been introduced that empower students to take the lead in assessing the inclusivity of their curriculum. This isn’t a symbolic gesture. Student reviewers can not only be trained in inclusive pedagogy and curriculum theory but bring a wealth of experiences and insights that add considerable value to the curriculum.

    Having a student review their teaching can be challenging for staff, but it should be viewed as a positive experience and an opportunity to develop. Practical, structured student feedback specifically about how the curriculum is experienced isn’t routinely available, but by positioning students as co-creators rather than consumers of education, and working with them to develop a negotiated curriculum, universities can begin to develop what Bovill and Bulley in 2011 call “student–staff partnerships” in curriculum design. More importantly, cultures of reflection will be built among both students and staff.

    Indeed, the insights gathered by students provide fresh ideas and impetus for change. Universities should expect to see changes such as staff diversifying reading lists, incorporating non-Western knowledge systems, adopting and adapting podcasts and visual content, removing or clarifying colloquialisms, and reflecting more critically on their own teaching assumptions. Students also benefit as they gain a deeper understanding of learning and teaching at a personal and institutional level while developing skills that make them more employable.

    Structural limits, and what they reveal

    It is important that inclusion is not viewed as a compliance exercise. Inclusivity isn’t just about content or the material on reading lists, but about systems. Class times, unit design or professional frameworks can prevent meaningful change. These barriers resonate with what Sara Ahmed calls “non-performativity”: institutional practices that talk inclusion but do little to change the power structures that sustain inequality.

    Too frequently in HE, students are asked for feedback at the end of a module; asked what is missing or could be improved. And too often, the work of inclusion is treated as a box-ticking exercise, rather than as a long-term commitment to changing institutional norms. Collaborations that give agency to students provide visible demonstration to staff and students that inclusion matters, and that they need to work together to take practical steps that make a real difference.

    Importantly, in the same way that it is vital to reduce the burden of navigating racialised systems and institutional language, inclusive practice cannot rest on the unpaid labour of those most affected by exclusion. Students employed to do this work should be compensated, recognising their expertise and time – institutions must demonstrate their commitment to this work.

    It is also important to address how gender, disability, class, and educational background shape curriculum experience. This move reflects the understanding, drawn from Kimberlé Crenshaw’s work, that inclusion must be multi-dimensional. One-size-fits-all interventions rarely address the complex realities students navigate.

    Reimagining who defines knowledge

    Curriculum audits on their own are not enough to drive change. A shift in culture is needed. The idea that academics alone define what is taught must be challenged, with students viewed not as passive recipients but as partners in learning.

    If universities are to genuinely respond to the challenges of structural inequality, they must go beyond slogans and statements. They must create space for critique, redistribute power, and invite students to shape the educational experiences that shape them in return.

    At a time when EDI initiatives in the higher education sector are facing pressure from right-wing populist leaders, it is perhaps now more important than ever that efforts to decolonise and diversify curricula continue to grow. The question for higher education institutions is whether they are willing to relax their control over whose knowledge counts.

    Practical takeaways for sector professionals

    • Start small, but with structure. Begin with a pilot group of trained student reviewers and focus on a manageable number of modules.
    • Pay students. Their labour and insights are valuable; budget for it.
    • Support staff engagement. Offer training and development to help staff reflect and act without defensiveness.
    • Embed responses in course planning. Ensure that student input leads to visible, documented changes.
    • Build community, not compliance. Inclusion isn’t just about metrics; it’s about shared commitment to equity and belonging.

    The authors are grateful for the contributions of Rebekah Kerslake and Parisa Gilani to this article.

  • Podcast: Mergers, reshuffle | Wonkhe

    This week on the podcast we examine the bombshell merger announcement between the University of Greenwich and the University of Kent, set to create the London and South East University Group – one of the largest higher education institutions in the UK.

    With a memorandum of understanding signed and contracts expected by Christmas, this “super university” is being hailed as a potential blueprint for sector transformation. But what does this new multi-university model really mean for students, staff, and the future of higher education consolidation?

    Plus we discuss the recent government reshuffle and its implications for the sector, as Angela Rayner’s departure triggers ministerial changes across departments with direct links to higher education – from Liz Kendall’s appointment as Secretary of State for Science, Innovation and Technology to questions about skills policy under Pat McFadden’s expanded brief at the newly configured Department for Work and Pensions.

    With Ben Vulliamy, Executive Director at the Association of Heads of University Administration, Emma Maslin, Senior Policy and Research Officer at AMOSSHE, Michael Salmon, News Editor at Wonkhe, and presented by Mark Leach, Editor-in-Chief at Wonkhe.

    The first multi-university group arrives

    Back to the future for the TEF? Back to school for OfS?

    The former student leaders entering Parliament

    You can subscribe to the podcast on Acast, Amazon Music, Apple Podcasts, Spotify, Deezer, RadioPublic, Podchaser, Castbox, Player FM, Stitcher, TuneIn, Luminary or via your favourite app with the RSS feed.

  • Higher education postcard: matriculation | Wonkhe

    It’s that time of year again. A level results have been and gone, the initial buzz of clearing has passed, and new students are about to turn up. It can only mean enrolment. Or, at some universities, this strange thing called matriculation.

    One internet definition of matriculation has it as “the process of matriculating”. Helpful.

    To get to the bottom of it, we need to remember that universities were medieval European creations, and medieval Europe was all about the corporation. A universitas was a single body of people, chartered by a king or a pope, or sometimes by both, and you had to become a member of the universitas to benefit from its protection and patronage.

    And the terminology stays with us – a degree refers to your class of membership of the universitas. A master had a licence to teach at a universitas, and being a master at one would often (but not always) give you licence to teach at another.

    Matriculation was the process whereby you became a student member of the university. At some universities (here’s Oxford, for example) it is a formal ceremony, dressing up and parading, and the whole works. At other universities it can be more administrative – in my own case, I got a letter from the University of London University Entrance Requirements office telling me that I’d matriculated. But I still had to queue up a long winding staircase at LSE to enrol, get my student ID and a grant cheque.

    Yes, I am that old.

    Enrolment is really the same as matriculation, but without the razzamatazz. It’s the moment when the contract between the student and the university is made by both sides; calling it enrolment rather than matriculation is a badge of the ongoing transition by universities from being medieval to being modern. Which I guess we should probably support. Before we need to transition to being postmodern.

    The card itself was issued by Clarkson School of Technology, in the USA. It’s actually a marketing card. Come to Clarkson, it says. There’s still time to matriculate and register, and start to learn. Note that the sequence is: exam for matriculation, matriculation, instruction begins. And note that the exam to matriculate isn’t the university’s, but is the New York Education Department’s. An external verification that standards had been met before enrolment could happen.

    The Thomas S. Clarkson Memorial School of Technology was founded in 1896. Thomas S Clarkson was a businessman, with multiple interests including a quarry. In August 1894 we are told that a worker at the quarry was in danger of being crushed by a derrick pump. Clarkson pushed the worker out of the way, being crushed himself instead. He died five days later of his injuries. His three sisters and his niece established the technical school in his name.

    In 1912 the University of the State of New York required the registration of all higher educational establishments, and it became the Thomas S. Clarkson Memorial College of Technology, commonly known as the Clarkson College of Technology. It became a university in 1984. The university has a more thorough account of its history on its webpages.

    The card itself was sent on 19 February 1910.

    Good morning, Leon:- Haven’t heard from you this week. Neither have we heard from Mayme. Had letter from Mabel R, her vacation began last Monday and lasts ‘til April 1st ….

    Here’s the actual message – if you can decipher more than I have, please share in the comments!

    Image: Hugh Jones

    And here’s a jigsaw of the postcard for you – hope you enjoy it.
