  • Inside the Retention Gap: Why Adult Student Retention Demands a New Strategy [Infographic] 

    Adult student retention has become a defining measure of institutional health. As colleges and universities expand online programs to serve working professionals and adult learners, persistence has become just as important as enrollment.

    To better understand today’s retention realities, Collegis partnered with UPCEA to survey online adult learners and higher education leaders nationwide. We examined what’s driving stop-out risk, where institutional strategies fall short, and how perceptions differ between students and leadership.

    The results reveal a clear disconnect and a significant opportunity to rethink adult student retention with greater alignment, flexibility, and data-informed strategy.

    Explore the key findings below.

    Closing the Adult Student Retention Gap Starts with Alignment

    The research reveals four clear disconnects: institutions prioritize structure while adult learners prioritize flexibility; students value dashboards more than leaders realize; support models remain generic despite varied life stages; and nearly half of institutions don’t track online retention at all.

    Improving adult student retention requires more than small adjustments. It demands student-centered strategy, integrated data, and proactive engagement built around the realities of adult learners’ lives.

    The infographic offers just a snapshot. Download the full eBook to access the complete findings and build a smarter, more effective adult student retention strategy.

    Download the Complete Report

    “The Retention Disconnect: What Adult Learners Need and What Institutions Miss”

  • The Society for Research into Higher Education in 2015

    by Rob Cuthbert

In SRHE News and the SRHE Blog, a series of posts has been chronicling, decade by decade, the progress of SRHE since its foundation 60 years ago in 1965. As always, our memories are supported by some music of the times. This is the last of the series.

2015 was a troubled year, as wars and terrorist outrages proliferated. Russia had invaded Crimea and eastern Ukraine in 2014 and a supposedly agreed ceasefire in 2015 broke down within days, as had a previous agreement in 2014. The war in Iraq involving Islamic State, which had started in 2013, would not end until 2017. Islamic State were also involved in the Syrian civil war, drawing in more and more major powers on opposite sides. It would continue until the Assad regime was overthrown in 2024, but elsewhere the Arab Spring popular uprisings had mostly faded. Massacres in Nigeria by Boko Haram killed more than 2,000 people. Al Qaeda gunmen killed 12 people and injured 11 more in Paris at the offices of newspaper Charlie Hebdo. Al-Shabaab killed 148 people, mostly students, at the Garissa University College in Kenya. A terrorist bomb probably brought down Metrojet Flight 9268, an Airbus A321 airliner which crashed in Sinai, killing 224 passengers and crew. Another Airbus was deliberately crashed by its first officer in the French Alps, killing all 150 people on board. An earthquake in Nepal killed 9,000 people, and at least 2,200 people died in a stampede at the Hajj pilgrimage in Mecca.

    Xi Jinping had been leader of China since 2012, as had François Hollande in France; Angela Merkel was in her tenth year as German Chancellor and Barack Obama was halfway through his second term as US President. The UK general election in 2015 was won by the Conservatives under David Cameron; their former coalition partners the Liberal Democrats suffered their worst result in recent history, paying for their betrayal of Nick Clegg’s “pledge” before the 2010 election to abolish HE tuition fees, even though they almost said sorry. The Labour Party elected Jeremy Corbyn as leader. Queen Elizabeth II became the longest-serving British monarch. The Paris Agreement at COP 21 saw countries agreeing to “do their best” to keep global warming to “well below 2 degrees C” and Greece became the first advanced economy ever to default on a payment to the International Monetary Fund.

    Australia beat New Zealand to win the Cricket World Cup, jointly hosted by Australia and New Zealand. The Rugby World Cup was held in England but the hosts flopped as New Zealand beat Australia in the final. Microsoft launched Windows 10, and a new startup called OpenAI was founded.

    Higher education in 2015

    In 2015 the dominant theme in higher education was internationalisation. A 2016 book by Paul Zeleza (Case Western Reserve University, US), The Transformation of Global Higher Education 1945-2015 argued that “Internationalization emerged as one of the defining features of higher education, which engendered new modes, rationales, and practices of collaboration, competition, comparison, and commercialization. External and internal pressures for accountability and higher education’s value proposition intensified, which fueled struggles over access, affordability, relevance, and outcomes that found expression in the quality assurance movement.”

The Economist leader in March said the world was going to university, but: “More and more money is being spent on higher education. Too little is known about whether it is worth it”. Students in Canada, the Netherlands, the UK and elsewhere were still protesting, trying to hold back the river of commercialisation, but they were just washed away.

    Simon Marginson (by then at the UCL Institute of Education) naturally provided the authoritative commentary in his 2016 article in Higher Education: “Worldwide participation in higher education now includes one-third of the age cohort and is growing at an unprecedented rate. The tendency to rapid growth, leading towards high participation systems (HPS), has spread to most middle-income and some low-income countries. Though expansion of higher education requires threshold development of the state and the middle class, it is primarily powered not by economic growth but by the ambitions of families to advance or maintain social position. However, expansion is mostly not accompanied by more equal social access to elite institutions.“

The Going Global conference in 2015 in London had 1,000 VCs and others debating “the impact of the greatest global massification of higher education ever experienced”, as NV Varghese, Jinusha Panigrahi and Lynne Heslop reported for University World News on 27 February 2015. Oxford University provided its own report on International Trends, and there was continuing progress towards a common European Higher Education Area, as the 2015 Implementation Report said: “The European Higher Education Area (EHEA) has evolved towards a more common and much more understandable structure of degrees. There is, however, no single model of first-cycle programmes in the EHEA.” No single model for pop music either, as the Eurovision Song Contest in Vienna was won by Sweden with “Heroes” (no, me neither) and George Ezra’s European tour included Budapest.

    UK HE in 2015

In 2015-2016 there were 162 publicly-funded HE providers in the UK; HESA held data on all of them, plus the decreasingly private University of Buckingham. In addition there was HE provision in FE colleges and other places. Of the 2.3 million HE students, 60% were full-time undergraduates. 56.5% of all students were female, 43.5% male. Total numbers had been falling since 2011-2012, because the decline in part-time numbers had outstripped the continuing growth of full-time and sandwich student numbers, up by 5.8% over the same period. Business and administrative studies was the most heavily populated at both UG and PG levels, as in previous years; at PG level Education was second. Reflecting the globalisation of HE, UK universities in 2015-2016 had over 700,000 students registered in transnational education.

The 2004 Higher Education Act (2004 c. 8) had established the Arts and Humanities Research Council and provided for the appointment of a Director of Fair Access to Higher Education. It set out arrangements for dealing with students’ complaints about higher education institutions and made provisions on grants and loans for FHE students. Then came the 2005 Education Act (2005 c. 18), which renamed the Teacher Training Agency (established by the 1994 Education Act) as the Training and Development Agency for Schools (TDA). The Learning and Skills Council was set up by the 2007 Further Education and Training Act (2007 c. 25) and the 2008 Sale of Student Loans Act (2008 c. 10) allowed the government to sell student loans to private companies. The school leaving age went from 16 to 18 under the 2008 Education and Skills Act (2008 c. 25) and the 2009 Apprenticeships, Skills, Children and Learning Act (2009 c. 22) created a statutory framework for apprenticeships, and established among other things the Young People’s Learning Agency for England (YPLA), the office of Chief Executive of Skills Funding and the Office of Qualifications and Examinations Regulation (Ofqual).

    Labour might have had a head full of dreams, but many of their new structures were dismantled after the coalition government was elected in 2010. There was bad blood between Education Secretary Michael Gove and the teacher unions’ ‘blob’; his 2011 Education Act (2011 c. 21) put an end to the General Teaching Council for England, the Training and Development Agency for Schools, the School Support Staff Negotiating Body, the Qualifications and Curriculum Development Agency and the Young People’s Learning Agency for England. The Act also ended the diploma entitlement for 16 to 18 year olds and abandoned Labour’s aim of making 18 the upper age limit for participation in education.

The tortuous rise of HE fees for undergraduates was usefully summarised in a 2015 House of Commons Library Briefing Note. The £1000 fee introduced in 1998 had risen to £3000 after 2006, in a move which almost brought down the Labour government. The 2010 election saw the Liberal Democrats renege on their pre-election ‘pledge’ to abolish tuition fees, instead agreeing with their Conservative coalition partners to triple them, which had many asking ‘What do you mean?’ The £9000 fees were partly a consequence of the Browne Review, but the government as always cherry-picked the recommendations it liked and ignored the package which was proposed. A 2010 vote set fees at between £6000 and £9000, but as everyone had predicted – except the Universities Minister David Willetts – English universities scrambled en masse to charge £9000, for fear of otherwise being labelled as inferior. The £9000 fees took effect in September 2012, while in other parts of the UK tuition fee arrangements increasingly diverged from England’s world-beating fee levels. The fee rose with inflation to £9250 but was then frozen, fiscal drag which would cost HE many £billions in revenue and lead to today’s widespread financial problems.

    In June 2011 the government published the White Paper Higher Education: Students at the Heart of the System, but the anticipated Higher Education Bill did not follow. Minister David Willetts was not letting go; he brought forward a package of reforms to change HE regulation: placing the funding council in an oversight and coordination role; establishing a Register of Higher Education Provision; introducing designation conditions for HEIs, and a new designation system for alternative providers; updating the Financial Memorandum; reforming student number controls, including a system for alternative providers; and creating a Designation Resolution Process. Once again Sue Hubble of the House of Commons Library provided a definitive record in September 2013, noting some commentators’ criticisms that such sweeping changes had been achieved by administrative procedures rather than primary legislation.

In July 2015 DBIS updated the statistics on widening participation, which showed continuing but erratic progress despite too many policy interventions. We had to wait until November 2015 for a Green Paper, Fulfilling Our Potential, which proposed establishing a Teaching Excellence Framework, abolishing the Higher Education Funding Council for England and replacing it with the Office for Students. It would not be until 2017 that the Higher Education and Research Act confirmed and enshrined these changes in statute. HEPI Report 161, edited by SRHE member Helen Carasso (Oxford), looked back in 2022-2023 on the 20 years since HEPI’s formation. It included a chapter by SRHE Fellow Michael Shattock (UCL) on how ‘self-governed’ universities (I doubt if we’ll see you again) were forced to say Hello to a ‘regulated’ university system: “The year 2003 can be seen as starting point in a process of systemic governance change in UK higher education.”

    SRHE and research into higher education in 2015

    By 2015 research into higher education had been noticed even in the furthest corners of academe. A 2012 book chapter by philosopher Andre V Rezaev (St Petersburg State University) was thinking out loud: “… to articulate a possibility for integrating a number of perspectives in studying higher education as a scholarly subject in current social science. We begin with the reasons for such an undertaking and its relevance. We then develop several basic definitions in order to establish a common conceptual basis for discussion. The final section presents new institutionalism as one of the ways to integrate several approaches in understanding higher education. This chapter is rather theoretical and methodological in its outlook. We develop the basic approach that, in many respects, is still a work in progress. We take in this approach a set of arguments that open up new research agenda rather than settled a perception to be accepted uncritically.” Even latecomers were of course welcome.

    With due ceremony SRHE staged a 50th Anniversary Colloquium in London on 26 June 2015. The congregation of more than 200 people included almost everyone who had been anyone in HE research in the UK, and many places beyond, gathered in Westminster for discussion and celebration, primed by ‘think pieces’ from SRHE Fellows past and future. The themes encapsulated the scope of research into HE: Learning, Teaching and the Curriculum (Marcia Devlin); Academic Practice, Identity and Careers (Bruce Macfarlane); The Student Experience (Mary Stuart); Transnational Perspectives (Rajani Naidoo); Research on HE Policy (Jeroen Huisman); Going Global (Paul Ashwin); Access and Widening Participation in HE (Penny-Jane Burke); and, Reflective Teaching in HE (Kelly Coate).

    The Society had managed to shake off its financial woes and was flourishing in financial and academic terms. The chairs from 2005 were Ron Barnett (UCL), George Gordon (Strathclyde), Yvonne Hillier (Brighton), and Jill Jameson (Greenwich). The successful series of books published by the Open University Press had ended when it was swallowed by McGraw-Hill, but a seamless change led to a new and even more successful series with Routledge from 2012. SRHE News was reimagined and relaunched in February 2010, and the SRHE Blog followed from 2012. The Society’s office moves continued, switching in 2009 from the Institute of Physics in Portland Place to a brief sojourn at Open University offices in 44 Bedford Row, London, before finding a longer-term home on the second floor at 73 Collier Street in London. In 2009 the annual Research Conference was held for the first time at Celtic Manor in Newport, Wales (where François Smit might often have said shut up and dance). It would return every year until 2019, just before Covid disrupted the world, including the world of research into higher education. The Society would however emerge even stronger, having discovered the power of online meeting (if you don’t believe me, just watch) to expand its global reach, as a more prominent complement to the still essential face-to-face meetings in networks and conferences.

    Rob Cuthbert is editor of SRHE News and the SRHE Blog, Emeritus Professor of Higher Education Management, University of the West of England and Joint Managing Partner, Practical Academics. Email [email protected]. Twitter/X @RobCuthbert. Bluesky @robcuthbert22.bsky.social.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education

  • Ed Department Weaponizes FERPA to Restrict Voting (opinion)

    Earlier this month, the U.S. Department of Education sent a letter to every college and university president with the goal of continuing its efforts to curb voting among college students. This latest letter threatens colleges and universities if they participate in or use the data from the National Study of Learning, Voting and Engagement, claiming that if they do so, they “could be at risk of being found in violation of FERPA.”

    The Family Educational Rights and Privacy Act is the federal law that protects the privacy of student education records and applies to any institution that accepts Department of Education funds. Like many of this administration’s actions, this letter is designed to have a chilling effect, since no determination has been made by the department that participation in, or use of, NSLVE studies violates any privacy statutes.

    In existence since 2013, with more than 1,000 colleges and universities nationwide currently choosing to participate, the NSLVE is a study of student political engagement at higher education institutions. The NSLVE uses data that colleges and universities voluntarily provide to the National Student Clearinghouse, which matches student enrollment records with public voting files to determine whether students registered to vote and whether they voted—not whom they voted for. NSLVE, which is housed at Tufts University, then uses the de-identified data it receives to send a confidential report to participating campuses about their own students’ voting participation.

Under the guise of protecting student privacy, the Department of Education is weaponizing FERPA to advance the Trump administration’s goal of weakening voter participation, especially among college students, for political reasons. Secretary of Education Linda McMahon herself stated in the press release announcing the new guidance that “American colleges and universities should be focused on teaching, learning, and research—not influencing elections.” And the department admits in its guidance letter that its assessment that NSLVE is in violation of FERPA is based on a “preliminary analysis” and that ED merely has “concerns” about NSLVE’s use of data. The department does not conclude that NSLVE or the use of the NSLVE data violates any laws, including privacy laws.

    The NSLVE primarily uses directory information—name, address and date of birth—which institutions may disclose without consent as long as they have given general public notice (including notice of the option to opt out of disclosure) at the beginning of the academic year. In addition, when other information is provided—such as gender, race/ethnicity and degree-seeking status—it is allowable because it falls under FERPA’s “studies exception.”

    This exception allows information to be shared for studies that “improve instruction.” The NSLVE’s research is designed to enable colleges to improve civic education on campus—something that is a stated goal of this administration. Furthermore, NSLVE reports do not contain individually identifiable information and are only shared with the institution itself. It is for these reasons that the Department of Education, since the program’s inception more than a decade ago, has found this work to be allowable under FERPA.

    It is critical for colleges to understand what this letter is saying—and what it isn’t. Students deserve to have their data protected, and the federal government has a critical role to play in safeguarding their data. It is the Department of Education’s obligation to use its resources to do so. It is paramount that the government ensures any actions taken by institutions put student privacy first. But alleging potential student privacy violations when there are none is a waste of resources and undermines what is really at stake.

As recognized by the Higher Education Act’s requirement that higher education institutions provide voter registration forms to all their students, colleges have an important role to play in promoting civic engagement and participation in democracy among students. As long as they are doing it in a way that is compliant with the data sharing allowed in FERPA, the federal government must not interfere with colleges’ participation in the NSLVE—especially with threats that are not backed up with legal findings. Insights from the NSLVE are critical to strengthening nonpartisan civic engagement for college students. Restricting use of the data in an election year is not about protecting students—but instead is harmful to them and to our democracy.

    Amanda Fuchs Miller is the president of Seventh Street Strategies and former deputy assistant secretary for higher education programs at the U.S. Department of Education in the Biden-Harris administration.

  • What’s the real political problem with higher education funding?

    This blog was kindly authored by Johnny Rich, Chief Executive of Push and Chief Executive of the Engineering Professors’ Council. It has been written in a personal capacity.

    In recent weeks, the issue of burdensome student loans has metastasised across the media. Then last week the Treasury’s Supplementary Estimates revealed that, even with frozen thresholds and above-inflation interest rates, the current system is necrotising public finances.

    It seems time is up for the system that protected the English higher education sector from the worst ravages of austerity after 2012. It was always, as the OBR called it, a ‘fiscal illusion’ and, when the magician’s sleeves were clipped in 2018 with new ways to account for the cost of loans, it forced successive governments to squeeze students and graduates ever harder.

    It’s still not enough though. Graduates feel their debts were ‘mis-sold’ and the cost to taxpayers is breaking the social compact with universities, which are in turn facing financial crisis.

    Should we tweak the system to make it even more expensive for students and taxpayers? Should we slash the cost of delivering higher education and face the consequences of, presumably, worse outcomes and lower graduate premiums, which would only make matters worse? Should we abandon mass higher education altogether and return to a system where the privileged have access to universities that beget more privilege?

    The system needs a reset

    The real problem – for politicians and for us all – is that the current system doesn’t incentivise the outcomes it needs to drive. Indeed, it actively drives other undesirable outcomes like high debts, skill shortages and unemployed or underemployed graduates.

    Key to the public interest in higher education is the need to produce a graduate population that meets the economic and social needs of the future. That means raising their employability in the broadest sense – their skills, knowledge, behaviours, values and, perhaps most of all, their ability to adapt to shifting labour markets.

    We need a system that plugs the yawning gaps in sectors such as engineering, but which also ensures that people like me (who studied English & Philosophy as an undergraduate) emerge not only with cultural capital, but with an awareness of their transferable employability that they can articulate to themselves and to potential employers and deploy in their work and lives.

    Yet that’s not what the current system is designed for. It was designed to ‘put students at the heart of the system’. It places the future of the highly skilled labour market at the mercy of the choices of 17-year-olds (14-year-olds, in fact, given that that’s the age when most set out on circumscribed pathways). (I’m well aware I am oversimplifying higher education to the school-leaver model. Please forgive the rhetorical device. The point remains whatever the age of applicants: they have neither the information nor the incentive to make individual choices that will aggregate to match future labour market needs.)

    The money follows the student. If they want to study Mickey Mouse courses (i.e. History of Animation), then universities are incentivised to offer those courses with little regard to our national need for Disney historians or fostering those students to transfer the employability they’ve developed to where gaps do exist.

Meanwhile, if they want to study engineering – where there are skills shortages running into the tens of thousands a year and yet able applicants outnumber places – then the system cannot expand, because the income a university can receive is capped at a level around £7,600 below the cost of providing that degree.

    The system relies on the economic foresight of young people in a world where high-quality careers support is a tiny voice amid noisy misinformation and confusing heuristics. But it’s worse than that. It also relies on them putting the national interest ahead of their own ambitions and interests – making what appear to be ‘safer’ choices, rather than pursuing what actually interests them and the career they want.

    It’s no wonder it doesn’t work

    We need to redesign the higher education funding system so that it balances the needs of employers, society and the economy with opportunities for prospective students. That system must be sustainable in the long-term, and provide sufficient funding and incentive to universities to deliver a high-quality education that meets those needs.

    If we could do that, then employers would find their skills gaps met, the economy would prosper, graduates would find suitable jobs and universities would thrive. Rather than trying to slice an increasingly expensive cake ever more thinly, we’d be baking a bigger cake.

    There is a way to do this: bring employers in from the cold.

An increasing number of experts, think tanks, economists and others are calling for some form of graduate employer contribution. The government flirted with the idea of a contribution before the election, but it fell out of favour after the increase in employer national insurance contributions looked like a tax on business growth and the labour market. Instead, Labour took sanctuary in tweaks to the current system.

    However, employer contributions can be designed so they don’t cost employers any more – or potentially less – than the current system for the next 25 years or so. By that time, even if there were additional costs, they would be barely detectable in long-term labour market corrections.

There are five steps:

    1. Abolish tuition fees. Retain (and improve) loans for maintenance (and continue with the reintroduction of grants).
    2. Instead of graduates paying 9 per cent repayments on earnings over the threshold, reduce it to 3 per cent for their maintenance loan (time-limited) and charge employers a 3 per cent contribution (not time-limited). The other 3 per cent could be a saving for employers, a boost to graduates’ take-home pay or, most likely, a combination determined by the labour market.
    3. The employer contribution is paid to the institution where the graduate studied (or multiple institutions according to credit accumulation under LLE arrangements).
    4. Universities that do not meet access and participation benchmarks must pay into a national access fund, which is redistributed to universities that exceed them.
    5. To manage the transition to this system, instead of the government lending to individual students for 40 years, they lend to universities for longer (or indefinite) periods based on student numbers, but otherwise with repayment terms parallel to current graduates.
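The arithmetic in step 2 can be sketched in a few lines. The £25,000 threshold and £35,000 salary below are purely illustrative assumptions, not figures from the proposal; only the current 9 per cent rate and the proposed 3 + 3 per cent split come from the steps above.

```python
# Illustrative sketch of the step-2 arithmetic. The threshold and salary
# are invented for this example; only the 9% rate and the 3% + 3% split
# are taken from the proposal above. Integer pounds keep the sums exact.

THRESHOLD = 25_000  # assumed repayment threshold, in pounds (illustrative)


def current_repayment(salary: int) -> int:
    """Current system: the graduate repays 9% of earnings above the threshold."""
    return max(0, salary - THRESHOLD) * 9 // 100


def proposed_contributions(salary: int) -> dict:
    """Proposed split: 3% from the graduate (maintenance loan, time-limited),
    3% from the employer (not time-limited); the remaining 3% is freed up as
    some labour-market-determined mix of employer saving and take-home pay."""
    base = max(0, salary - THRESHOLD)
    return {
        "graduate": base * 3 // 100,
        "employer": base * 3 // 100,
        "freed_up": base * 3 // 100,
    }


if __name__ == "__main__":
    salary = 35_000  # assumed graduate salary (illustrative)
    print(current_repayment(salary))       # 900
    print(proposed_contributions(salary))  # {'graduate': 300, 'employer': 300, 'freed_up': 300}
```

The point of the sketch is only that the total levy on earnings above the threshold stays at 9 per cent; what changes is who pays each slice, and for how long.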

    This would give universities an incentive to restrict entry to courses that do not deliver what employers or society value. It would also ensure every course fosters wider employability and every university supports its graduates.

    It would give employers skin in the game: influence over higher education provision, but not transactional control for short-term goals.

    And it would give students assurance that, whatever their course, their university is as invested as they are in it leading somewhere.

    This is the graduate employer contribution model I first outlined in a HEPI paper in 2018 and articulated in more depth in 2024 when HEPI commissioned an independent economic analysis and polling of the proposal alongside various other models.

    The projections by London Economics suggested this would save the exchequer many billions per year, and the polling placed it as even more popular among students than an NUS proposal to reinstate a fee-free, full-grant model. Other polling by Public First suggested it would be popular with the wider voting public too.

The issue of higher education funding gathers political salience every few years and, when it does, it quickly becomes an irresistible tide. The conditions are aligning for it to become a perfect storm. The Green Party is polling at over 20 per cent among under-24s (more of whom will have the vote at the next general election), ahead of Labour. Reform UK is touting an (unaffordable) removal of interest on loans. And a prominent candidate for the next prime minister is a former NUS President who will face questions about higher education funding, whether because he wants that debate or because his opponents do.

Serious political parties can no longer simply command the waves to turn back. They need to get ahead of this issue with a fair, affordable, workable solution that fixes the real problem of generating growth through opportunity.

  • HESA Spring 2026: Staff | Wonkhe

If you have been keeping track of the redundancy announcements that have been the backdrop to the last few years in UK higher education, chances are you have Queen Mary UCU’s “HE shrinking” page bookmarked.

    Since at least 2023 this branch of the union has been keeping records of announcements and internal communications – and necessarily it is in the form of a narrative rather than reusable data.

    I say “necessarily” because the announcement is just the start of a conversation. The provider wants (needs) to save money, and has decided that a lower staff headcount is the way – the union, departments, faculties, and students have alternative views. This seemingly hopeless situation can often resolve in ways where everyone is unhappy, but less unhappy than the initial announcement would suggest.

    For this reason, the “hard” numbers (there are limitations, which we will get to) released as HESA Staff open data represent the middle-to-end of ongoing discussions, and a straight comparison between the numbers initially announced and the changes experienced in a provider (and documented in the HESA return) doesn’t always hold.

    Staff overall

The big limitation is that – in England and Northern Ireland – it is not mandatory to return data about staff who are not on academic contracts. This severely limits our overall understanding of what is happening with staff at providers in these countries – but happily, from 2029, non-academic contract data for the whole UK will return to the collection.

    In the meantime, the majority of visualisations in this article default (or are pre-set) to look at academic (non-atypical) contracts. In this context “atypical” refers to staff who are not permanent, work for less than four consecutive weeks and/or on one-off and short term tasks, and generally experience a high degree of flexibility. These are generally not teaching staff – common examples include student demonstrators, conference catering, and such like.

    It is also worth noting that atypical staff are not necessarily on zero-hours contracts (though clearly they may be, given the need to work for less than four consecutive weeks). In the non-atypical world HESA has information about 3,440 individuals on an academic zero hours contract in 2024–25: the majority (3,180) are hourly paid. These numbers are down slightly on last year, and on the previous year.

    Overall, there were 490,510 academic non-atypical full time equivalent staff in 2024–25 – including 322,445 full time staff and the equivalent of 167,050 FTE in part time roles. The full time number is up on last year (and is the highest on record), whereas both the part time and overall numbers have fallen.

    By default this chart shows you the whole sector, but you can use the “provider” filter to choose an institution of interest. You can also analyse these numbers by academic employment function (teaching, research, and so on), contract level (professors), terms of employment (open-ended or fixed term), source of basic salary, and sex.

    [Full screen]

    On those points, it is notable that the proportion of academic staff on teaching and research contracts is up slightly on recent years at 43.16 per cent, which remains far below the nearly half that was standard a decade ago. Teaching only contracts are down on the last couple of years at 34.8 per cent. The proportion of staff on an open-ended or permanent contract is – at 71.45 per cent – the highest in a decade.

    Age and pay

    We can see academic non-atypical staff by age (note this is headcount rather than FTE, so overall numbers may look a little different) split by cost centre. While we may be more familiar with the CAH and HECoS subject categories, in staff (and finance) data we are trying to associate data with physical bits of the university.

    [Full screen]

    This view defaults to “all academic cost centres” – in other words all named subject areas. There are of course many people on academic contracts who work in central departments (senior managers are the obvious example), which it makes sense to me to exclude here. The sense that the shape of the chart gives me is one of an ageing academic population: from nearly a third of academic staff being under 35 a decade ago, we now have just over a quarter in that age bracket.

    Clearly there are differences by subject areas: scientists tend to be younger, while academics in the humanities tend to be older. You can explore cost centres at a wider group level, or drill down to individual cost centres.

    HESA also gives us data on pay we can filter by age (and this time, by provider rather than cost centre). As usual this is expressed in bands linked to the single pay scale: and as usual I have translated these to show how the proportion at each spine point changes year on year (the salaries themselves are visible on the tooltips).

    [Full screen]

    The first thing you are going to want to do here is look at salaries at your own institution – you can also filter by contract level (professors), job function (teaching or research) and age. I will note on a sector level that proportionally more staff are in the higher two salary bands (spine point 40 and above) than last year, but proportions are not substantially different from previous years. Some 510 non-atypical academic staff are currently on a spine point below 20 – equivalent to a salary below £29,659. Twenty are on a salary below £23,581, which is below the 2025 minimum wage and very likely to be a data error.

    I should note that with very low numbers we run into the “HESA rounds everything by five” issue. This is done for noble reasons (to avoid identifying individual staff) but has the side effect of making it very difficult to be certain about the size of small populations.
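    That effect is easy to see with a little arithmetic. Here is a minimal Python sketch of rounding to the nearest multiple of 5 – note that the tie-breaking rule (halves round up) is my assumption, not something the HESA data itself tells us:

```python
# Illustrative sketch: round counts to the nearest multiple of 5, as HESA
# does before publication. Tie-breaking (halves round up) is an assumption.
def round_to_5(count: int) -> int:
    return 5 * ((count + 2) // 5)

def true_range(published: int) -> tuple[int, int]:
    """Range of true counts consistent with a published (rounded) figure."""
    return (max(0, published - 2), published + 2)
```

    For the twenty staff reported on a salary below £23,581, for example, true_range(20) gives (18, 22) – an uncertainty of ten per cent either way, which is why very small published populations should be treated with caution.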

    Starters and leavers

    The non-atypical academic staff headcount of the sector has fallen by 2,295 over the last year – but the full time population has risen by just under 2,500. As such, the first drop in academic staff numbers for more than a decade is driven by a loss of part time roles.

    There were, overall, 43,050 members of staff who left an academic role in a UK higher education provider, and 40,775 who started a new role. Some of these – as we shall see – are likely to be the same people.

    [Full screen]

    Other than the link to immediate financial concern during the 2024–25 academic year, there is not a lot to separate out providers that grew staff numbers from those that cut back. The need to drive efficiencies with redundancy and cuts is no respecter of institutional age and status: likewise there is no sure-fire recipe for growth (though larger Russell Group providers are doing a lot of growing).

    We can also look at growth within providers and cost centres – the filter is at the top left and we once again default to academic non-atypical: you can use the controls at the bottom to zero in on the area of an institution you are interested in.

    [Full screen]

    What strikes me here is that the sciences have held up comparatively well, while humanities, arts, and social sciences have seen cuts – though honestly the difference is less stark than you may expect.

    Here’s another view of the same data allowing you to compare 2023–24 and 2024–25 directly: this is probably an easier way to see recent history. Again, select a provider (or “total” for overall) at the top, and switch between cost centre or cost centre group on the left.

    [Full screen]

    Remember above we hinted that some people who left academic roles in UK higher education moved to other academic roles in higher education? This plot (non-atypical academic headcount) shows where staff have come from and moved to in terms of employment.

    We learn that 4,590 staff left an academic role in the sector for another one during 2024–25: that’s not as many as the post-pandemic peak but in line with historic norms. In the same year 7,930 staff joined a UK higher education provider from another, a historic low – the two numbers are not equivalent because of the time taken to recruit staff and the nature of in-year movement.

    [Full screen]

    You can filter by age, mode of employment, and academic contract type (teaching, research) – all the data is sector level only. It is notable that mid-career staff tend to be those who move within the sector, and that staff of all ages (3,600 of them) can find themselves outside of regular employment after leaving a higher education provider.

    Source link

  • Higher education postcard: big science

    Higher education postcard: big science

    When I was maybe 10 or 11 I remember going on a trip to Jodrell Bank. It was huge, and quite a lot of the material must have been beyond my comprehension, but I recall being fascinated by the ticker tape with which they recorded the data received by the telescope.

    Anyway, welcome to Cheshire. What you can see on the card is the Lovell Telescope – not an optical telescope but one which uses radio waves to explore the universe. The facility is part of the University of Manchester, which is how it qualifies for this write up.

    The telescope’s story begins in 1945, but we need to go a bit further back for context. So, first, how is this a telescope? Where do you put your eye? Well, it’s a radio telescope: rather than using the visible spectrum it looks for radio waves. There’s a good explainer video here – worth a few minutes of your time. And the telescope on the card makes an appearance at about 2:20.

    Radio astronomy was discovered in 1933 by Karl Jansky, an American telecoms engineer who was trying to work out why there was a background hiss on short wave transmissions. His discovery of radio waves coming from the Milky Way paved the way for others. In 1937 an amateur astronomer, Grote Reber, built a radio telescope with a nine metre wide dish in his backyard, confirmed Jansky’s results, and demonstrated that radio astronomy was feasible.

    (Just stop and read that again. An amateur astronomer. A nine-metre diameter dish. A new field of science. What a life!)

    And now we can come closer to home. Bernard Lovell – after whom the telescope is now named – was an astronomer studying cosmic rays. After World War 2 (during which he had been working on radar for the UK government) he returned to his research on cosmic rays at the University of Manchester. But the electric cables for the Manchester trams interfered with his equipment, so he took his rig to a field station in Cheshire, owned by the university and used by its botany department. And got much better signals. He’d discovered meteors, though, not cosmic rays. And this got him and his fellow scientists very interested.

    The field station became known as the Jodrell Bank Experimental Station. A team grew, building and installing steadily more powerful instruments. Lovell teamed up with a civil engineer from Sheffield, Charles Husband (the son of the founder of the University of Sheffield’s civil engineering department). They designed a 250ft diameter radio telescope that could be pointed at any point in the sky above Cheshire, and work commenced in 1952 to build it.

    This drawing of the telescope, published by the Manchester Evening News on 10 June 1954, shows how it had captured the public imagination.

    Anyone familiar with university capital projects will not be surprised to learn that there were delays, cost overruns and building problems, and by 1957 the future of the uncompleted-but-partly-operational telescope was uncertain. But when the Soviet Union launched Sputnik 1, the UK government realised that it would be quite handy to be able to track fast moving objects in space. And the telescope at Jodrell Bank was the only instrument in the world able to do it. Money was found, problems solved, and the telescope – then called the Mark 1 telescope – was completed and became fully operational.

    Jodrell Bank then began a dual life. As a fully-functioning scientific telescope, contributing tremendously to our knowledge of the galaxy, its formation and the different phenomena in it. And also as a secret military establishment, part of the UK’s early warning system for attack by ballistic missiles, and also used in tracking space activity by NASA and the Soviet Union.

    It is still a functioning and useful telescope, although advances in our understanding of radio astronomy mean that distributed arrays of smaller telescopes are now preferred – more powerful and easier and cheaper to run.

    The telescope was renamed the Lovell Telescope in 1987, and was given Grade I listed building status the following year (can you imagine the challenges in maintaining a piece of scientific equipment which is heritage listed?). The whole site was named a UNESCO World Heritage Site in 2019. And it is well worth a visit! The centre’s website has an excellent short history, on which I have drawn for some of this blog – it is also well worth reading, not least for the great images.

    Here’s a jigsaw of the card. It hasn’t been posted, but looks to be from the 1960s. It tells us on the back that

    the 250 ft Radio Telescope is the major instrument belonging to the Department of Radio Astronomy [of the University of Manchester]. It is set in the rural county of Cheshire, about 20 miles south of Manchester.

    Source link

  • The consultant you need is already on campus

    The consultant you need is already on campus

    When questions come up about whether services are working, overlapping, or reaching the right students, the instinct is usually the same – commission a review.

    But the expertise to do that work often already exists on campus – if institutions are willing to use it.

    I’ve been on both sides of that equation. I went to management school, worked as a management consultant, and later as a careers consultant across several business schools. I’ve watched students walk straight from campus into firms like McKinsey, solving high-stakes, real-world problems.

    So when it came time to review our student support teams (Advice, Basic Student Needs and Wellbeing), asking Grishma, an MBA student, to take on the brief didn’t feel risky – it felt obvious. What followed challenged our assumptions, tested our data, and raised a bigger question about why higher education so often overlooks the expertise sitting on its own campuses.

    What the review focussed on

    Grishma led the review across the three areas using a mix of interviews, process mapping, data analysis, and systems review. Enough structure to test our assumptions, without losing sight of how students actually experience services.

    The analysis covered four questions:

    • Are we set up to deliver the biggest possible impact with the resources we have?

    • Are services designed around real student journeys?

    • Do students know what’s available, and do they trust it?

    • Can we clearly evidence impact and adapt as needs change?

    Once we began working through them, the questions revealed far more than expected.

    What we learned

    One of the most useful aspects of the review was the lens Grishma brought as a business school student.

    She didn’t need any translation between how services are meant to work and how they actually feel from the student side. At the same time, her business school education forced us to be more disciplined – clearer about purpose, tighter on prioritisation, and more honest about trade-offs.

    Additionally, the action research approach meant this wasn’t a hands-off consultancy exercise. Being embedded in the organisation allowed Grishma to see how decisions played out in practice, not just in theory. That depth of involvement significantly improved the quality of the output in a way that a more traditional consultant model rarely achieves.

    That combination exposed gaps we’d normalised over time, and the same themes kept coming up.

    First, clarity of purpose. Teams were doing good work, but not always towards clearly shared goals. Planning often happened in response to immediate need, which led to lots of small initiatives rather than a few well-defined priorities. For example, different teams would each run their own welcome events or student workshops, duplicating effort and confusing students, instead of coordinating around a single, high-impact programme.

    Secondly, silos. Ways of working had evolved separately, limiting visibility across services and making collaboration harder than it needed to be. In a few cases, this meant duplicated effort or students being passed between teams at key moments. For example, a student seeking financial support might contact both the advice team and the basic needs team, but without defined ways of working together, neither team had a clear picture of the student’s situation or next steps.

    Finally, data. Insight was being gathered through multiple systems and methods, which made it difficult to build a consistent picture of reach, equity, outcomes, or impact. We could tell strong individual stories, but struggled to answer bigger questions about who we were missing or what was working best.

    For instance, in the advice team, advisors could point to powerful examples of students helped through complex situations, but the data couldn’t easily show whether those successes were typical, who never made it to an appointment, or whether some groups were consistently under-represented.

    Taken together, these issues created a student experience that could feel disjointed – particularly at transition points – and made it harder to confidently demonstrate impact to partners and stakeholders.

    None of this is unique to us. And if anything, the problem is more acute on the university side. An SU might have three or four support teams working in parallel – a university can have dozens, spread across academic departments, central services, faculties, and colleges.

    A student might interact with disability services, personal tutoring, counselling, hardship funds, and academic skills support without any of those teams sharing data or referral pathways. The fragmentation we found in a relatively small organisation is a microcosm of something much bigger.

    But seeing it laid out clearly, using our own evidence, challenged the comforting assumption that being busy and well-intentioned automatically adds up to effectiveness.

    What changed as a result

    The work hasn’t sat on a shelf. Each of the three teams received tailored recommendations reflecting their specific pressures, maturity, and opportunities.

    We’ve restructured a team to align roles more closely with student journeys rather than service silos. Student development was split into two directorates – wellbeing and volunteering – with wellbeing brought alongside other student support teams where there was clearer overlap in purpose and practice.

    We also created a new role to bring together processes for advice and basic needs, giving students a single point of contact and enabling internal triage rather than passing students between teams.

    We tightened how we collect and use data so that demand, access, and outcomes can be viewed together. Feedback is now gathered consistently across services, supported in some areas by psychology student placements focused on survey design and analysis using the PERMA model, which looks at Positive emotion, Engagement, Relationships, Meaning, and Accomplishment. All teams produce end-of-semester reports, and we are working with the Data Science department to develop an annual impact report that brings together reach, trends, outcomes, and student journeys in a form suitable for senior leaders, trustees, and funders.

    We also introduced clearer governance for new projects. This includes being explicit about goals from the outset, who makes which decisions, how student journeys will be reviewed, and the points at which a project should be adapted or stopped. Projects are now reviewed regularly against agreed KPIs, spend, and evidence of impact.

    For example, we introduced the ‘SU How’s You Strategic Oversight Board’, meeting every six weeks to track progress on our student wellbeing calls. With a target of 22,000 calls this year and a fixed budget, the board has improved transparency and accountability, helping teams stay on track, maintain call volumes and quality, and make timely decisions about priorities and resources.

    Perhaps just as importantly, it shifted how we think about service design. We’re developing a shared student–staff model in which students contribute as partners across the system, through rotational roles that span services and intentionally build future student support leaders, rather than being recruited into single, siloed teams.

    What we’d take forward

    This won’t work everywhere, and it isn’t a plug-and-play solution. It needs a clear brief, a committed student, good support, and a willingness to test ideas – and leave some behind.

    It also required us to take the idea of being student-led seriously: not just consulting students on decisions already made, but trusting them with real responsibility for analysing systems, challenging assumptions, and shaping change. In doing so, we were forced to look more critically at our own practices, and to recognise expertise sitting much closer to home than we often assume.

    Next time we’re tempted to reach for a consultant, we’ll start by looking across campus first.

    Getting started

    For many university departments and students’ unions, the biggest barrier isn’t whether this would be useful – it’s not knowing where to begin.

    Most business schools already run modules built around real-world projects. These often sit under headings like placements, consulting projects, capstone projects, or experiential learning. The quickest route in is usually the placements or employability team, whose job is to source organisations willing to host applied projects for students.

    If you search your university website for terms like “MBA projects”, “consulting project”, or “industry partnerships”, you’ll often find a named contact or generic inbox. That’s usually the best place to start.

    If there’s no obvious placements team, look for programme directors or module leads, particularly for MBAs, MSc Management, Data Science or Psychology. Programme directors are often keen to find credible, well-scoped projects that give students meaningful experience – especially ones based inside the university, where governance, ethics and access are easier to manage.

    An initial email doesn’t need to be polished or technical. What matters is clearly setting out the problem you’re trying to solve, why it’s complex or interesting, what students would gain from working on it, and the level of access and support you can offer.

    Not every department will say yes, and that’s normal. Timings, assessment cycles, and capacity vary widely. Treat the first conversation as relationship-building rather than transactional. Once one project lands successfully, future collaborations become much easier.

    For university professional services teams, the logic is the same but the institutional politics can be trickier. Approaching your own business school to review your own services means navigating sensitivities that an external partnership wouldn’t trigger – academic departments don’t always see professional services as worthy project hosts, and professional services teams can be defensive about student scrutiny.

    A director of student services reaching out to an MBA programme director with a well-scoped brief and genuine openness to findings is the way through – but it helps to have senior sponsorship and to frame the work as a genuine learning opportunity, not a cost-saving exercise.

    For universities, there’s also a stronger regulatory case. OfS expects providers to be able to evidence the impact of student support – particularly around access, continuation, and attainment gaps. TEF panels look for evidence that institutions understand and act on the student experience. Work like this can feed directly into access and participation monitoring and condition of registration evidence in ways that traditional consultancy reports – which tend to arrive, get filed, and gather dust – rarely do.

    For students’ unions and universities alike, this approach isn’t about cutting corners. It’s about recognising that universities already contain extraordinary expertise – and that with a bit of confidence and curiosity, organisations can turn themselves into powerful learning environments while improving services at the same time.

    Source link

  • Do students lack skills, or does the system struggle to surface them?

    Do students lack skills, or does the system struggle to surface them?

    Plenty of people I talk to who work in graduate employability tell me there’s a problem.

    They say that students develop skills at university but struggle to name them, evidence them, and talk about them convincingly to employers.

    The recent “Uncovering Skills” research from AGCAS (now the Graduate Futures Institute) put numbers and texture to it.

    Focus groups with higher education professionals found that students routinely fail to recognise the value of their experiences, particularly informal ones.

    Participants observed that students “often think if it’s not linked to their degree then it is not relevant” and “disregard skills gained from everyday life – like being a parent or managing during Covid.”

    The research identified widespread confusion about terminology, with one participant noting that:

    the language around ‘skills’, ’employability’, ‘attributes’, etc. is very confusing, and there is no one central list of what ’employability skills’ are.

    Students engage with reflection “just in time” – only when prompted by imminent job applications – limiting their ability to build coherent narratives about their development.

    Meanwhile, employers report persistent concerns about graduate readiness. The Institute of Student Employers’ Student Development Survey tracks whether graduates meet expectations across various attributes, and the trajectory on resilience is fascinating.

    In 2023, 25 per cent of employers said graduates showed “less than expected” resilience, but by 2024 that figure had risen to 35 per cent, and the 2025 survey recorded 48 per cent.

    Nearly half of employers now think graduates lack resilience, and the problem is getting worse rather than better. ISE’s 2025 survey also recorded rising concerns about self-awareness, time management, and work-appropriate communication.

    When the University Alliance commissioned CBI Economics to survey 252 employers in April 2024, the top factor in graduate recruitment was “the graduate’s enthusiasm and positive attitude towards the role” – selected by 68 per cent – while interpersonal and communication skills were the most important determinant of interview success according to 84 per cent.

    Add it all up, and it looks like students can’t articulate their skills, employers say key attributes are missing, and the system seems unable to bridge the gap.

    The “Uncovering Skills” research called for better pedagogy – helping students “uncover” what they’ve learned – but pedagogy alone can’t solve a problem that’s also about vocabulary and documentation.

    Common tongue

    Skills England’s UK Standard Skills Classification arrived in November – a new framework for describing skills across the labour market.

    As my colleague David Kernohan outlined at the time, the SSC organises 3,343 occupational skills into a four-level hierarchy, with 22 skill domains at the top level breaking down into 106 skill areas, 606 skill groups, and individual occupational skills; alongside these sit 13 core transferable skills meant to capture generic capabilities valuable across sectors.
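    For what it’s worth, that shape is easy to picture as a simple tree. In this sketch only the level names and counts come from the classification as described above – the example entries are invented purely for illustration:

```python
from dataclasses import dataclass, field

# Sketch of the SSC's four-level hierarchy: 22 skill domains -> 106 skill
# areas -> 606 skill groups -> 3,343 occupational skills, with 13 core
# transferable skills sitting alongside. Example names below are invented.
@dataclass
class SkillGroup:
    name: str
    skills: list[str] = field(default_factory=list)  # occupational skills

@dataclass
class SkillArea:
    name: str
    groups: list[SkillGroup] = field(default_factory=list)

@dataclass
class SkillDomain:
    name: str
    areas: list[SkillArea] = field(default_factory=list)

example = SkillDomain(
    name="Construction",  # invented, not a real SSC domain name
    areas=[SkillArea("Site work", [SkillGroup("Surveying", ["Setting out levels"])])],
)

def count_occupational_skills(domain: SkillDomain) -> int:
    """Total individual skills sitting under a domain."""
    return sum(len(g.skills) for a in domain.areas for g in a.groups)
```

    Nothing about the structure itself is complicated – which is rather the point: the hard part, as the rest of this piece explores, is what the categories do and don’t contain.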

    The SSC maps to HECoS subject codes, Ofqual qualifications, and SOC occupational classifications, and around 43 per cent of SSC skills matched to AGCAS graduate job profile skills.

    If this becomes the common vocabulary for skills conversations – and that’s clearly the intention – it could help address the “no central list” problem identified in “Uncovering Skills”.

    But does the vocabulary actually capture what employers say they want?

    Mind the gap

    An earlier attempt to codify graduate employability is worth a look. In 2011, the CBI and NUS jointly published Working towards your future – making the most of your time in higher education, and the timing was significant – just as £9,000 fees were arriving. The guide drew on a survey of 2,823 students across 71 institutions.

    The CBI/NUS framework had a distinctive structure, with “positive attitude” positioned as the foundation underpinning everything else:

    a readiness to take part, openness to new activities and ideas, and a desire to achieve results.

    The other building blocks were self-management (including resilience, flexibility, self-starting, time management, and “readiness to improve your own performance based on feedback and reflective learning”), team working (“respecting others, co-operating, negotiating, persuading, contributing to discussions, awareness of interdependence with others”), business and customer awareness, problem solving, communication (written and oral, including listening and questioning), numeracy, and IT skills.

    Compare that with the SSC’s 13 core skills – Planning and Organising, Adapting, Working With Others, Listening, Speaking, Leadership, Learning and Investigating, Creating, Problem Solving and Decision Making, Reading, Digital Literacy, Numeracy, and Writing.

    Some mappings work well – communication has been disaggregated into four components, which is more precise, and Problem Solving, Numeracy, and Digital Literacy have clear equivalents, while Leadership appears where it was absent from 2011.

    But several 2011 concepts have no obvious SSC home. Positive attitude – the foundation of the entire CBI/NUS framework – has no equivalent, and resilience, a component of self-management, isn’t present either. “Adapting” covers flexibility but not recovery from setbacks.

    Business and customer awareness doesn’t appear as a core transferable skill, and reflective learning is only weakly captured (since “Learning and Investigating” emphasises information gathering rather than improving through experience). Negotiating and persuading have been collapsed into the generic “Working With Others”, and self-starting and initiative don’t feature at all.

    It’s all partly because the 2011 framework was dispositional – describing how people approach work, not just what they can do. The SSC is more taxonomic – categorising capabilities but lighter on attributes that shape how those capabilities are deployed.

    The CBI Economics 2024 survey found “enthusiasm and positive attitude” was employers’ top recruitment factor, and ISE data shows resilience concerns have nearly doubled in three years.

    The attributes employers emphasise most strongly are either absent from the SSC or significantly diluted compared to earlier frameworks.

    Off the record

    The gap between frameworks matters particularly for informal learning – the 2011 guide was explicit that employability skills developed through extracurricular activities as well as courses, and it devoted substantial space to SU roles (employed positions in SU venues, society and club committees, elected officer roles, course representative positions) and to volunteering.

    It’s worth remembering what these roles actually involve. A student leader may spend a year managing staff, overseeing budgets, navigating institutional politics, handling crises – and the role requires resilience (things go wrong regularly), initiative (nobody prescribes priorities), and sustained commitment to a demanding public position.

    A society treasurer, meanwhile, manages a budget ranging from hundreds to tens of thousands of pounds, planning expenditure, tracking spending, and ensuring compliance – developing numeracy in applied contexts, certainly, but also accountability and understanding of how organisations function.

    Course representatives might gather peer feedback, attend committees, present the student perspective, and navigate disagreement, requiring listening and communication but also willingness to put yourself forward and persist when progress is slow. And a regular volunteer commits time to community projects over months or years, developing service orientation, reliability, and adaptability.

    These activities develop precisely the attributes employers say are lacking – positive attitude, resilience, initiative. But those attributes have limited presence in the SSC vocabulary, and if the framework becomes the common language for skills, and extracurricular learning can’t be expressed in that language, then a vocabulary gap becomes a recognition gap.

    Paper trail

    Ironically, there is existing infrastructure designed to make qualifications transparent and comparable – infrastructure that could help with exactly these problems. The UK signed up to it – and it’s not being used properly.

    The Diploma Supplement is a standardised document attached to a higher education qualification, describing what the qualification means, what the graduate studied, how the grading system works, and where the qualification sits within national frameworks.

    It’s been a core Bologna Process transparency instrument since 1999, and ministers from participating countries – including the UK – agreed in 2003 that every graduate should receive the Diploma Supplement automatically, free of charge, and in a widely spoken European language.

    The UK remains a full participating member of the European Higher Education Area, and Bologna membership isn’t contingent on EU membership – Norway, Switzerland, and other non-EU countries participate as full members.

    A 2017 European Commission-funded study found that around 90 per cent of HR and recruitment professionals surveyed had used the Diploma Supplement or similar documents to obtain information about job candidates, and more than half of surveyed enterprises requested such documents “often” or “very often” from applicants.

    Students most commonly used the document for job applications — either submitting it directly or consulting it when preparing CVs. Employers actively use the infrastructure when making selection and hiring decisions.

    Partial credit

    But the most recent Bologna implementation report tells a difficult story – of 48 European higher education systems, 39 now fully comply with all ministerial requirements for Diploma Supplement provision, but the UK is not among them.

    The report explicitly identifies England, Wales, and Northern Ireland as partial and inconsistent implementers:

    The Diploma Supplement is not universally issued. Some institutions issue the Diploma Supplement. Others issue the Higher Education Achievement Report (HEAR), which is ‘based upon and virtually reflects the Diploma Supplement, whilst remaining distinctly British’. Some institutions issue only a transcript, without either the Diploma Supplement or HEAR.

    The UK doesn’t meet the commitment it signed up to. Some graduates get a Diploma Supplement, some get HEAR, and some get only a bare-bones transcript, with no system-level guarantee.

    Worse still, HEAR adoption is on the wane – plenty of institutions that championed it a decade ago have quietly dropped it, citing administrative burden or low perceived demand.

    The UK’s documentation infrastructure isn’t just incomplete – it’s moving backwards while Europe consolidates around a standard.

    Plenty of UK graduates are missing out on the whole document – all eight sections of standardised qualification information that European employers are trained to read and trust.

    Many don’t get the core transparency function of explaining what their degree means, how their grades work, and where their qualification sits in national frameworks.

    For graduates seeking work or further study in Europe – facing reduced automatic recognition arrangements post-Brexit – this is a disadvantage built into the system. When European HR professionals expect the Diploma Supplement and UK graduates can’t provide one, or provide inconsistent alternatives, they start on the back foot.

    White space

    Beyond the basic provision problem, there’s a very specific opportunity being missed. The Diploma Supplement has eight sections, and Section 6.1 (“Additional Information”) is designed as flexible space for recording relevant information that doesn’t fit the core academic record but deserves formal recognition.

    European universities use it systematically to record extracurricular activity, and several models are well-established.

    In Germany, the Technical University of Munich publishes a detailed activity list specifying what qualifies for Section 6.1 recording – student council president, student representatives on Senate and University Council, student representatives on appointment committees, membership of quality circles, and voluntary work in faculty student council units – using a 90-hour threshold (equivalent to three ECTS credits) to determine what’s substantial enough to record. The Katholische Universität Eichstätt-Ingolstadt operates a similar model with a formal application process, and the standard wording is simple:

    The student has served in an honorary capacity as [role].

    In Poland, Kazimierz Wielki University publishes a “catalogue of achievements” for Section 6.1 explicitly including volunteering (verified by the university volunteering centre) and active involvement in students’ government (confirmed by the students’ government chair), while the University of Wrocław specifies eligible activities including Senate membership, faculty council roles, students’ government positions, and year representative roles.

    In Portugal, the Instituto Politécnico de Bragança lists activity types including membership of institutional bodies and student bodies, with published examples showing entries like “The student contributed to R&D activities relevant for the Institution… under supervision of the lecturer…” and “The student represented the IPB in the event ’35º Congresso Técnico Científico da APTN’ that took place in 29/04/2012.”

    In Sweden, Lund University uses capability-style descriptions naming leadership, communication skills, administrative skills, democratic decision-making, and cooperation.

    In other words, systematic infrastructure exists across multiple European systems – treating extracurricular activity as a category of achievement worth formal recording on qualification documentation.

    The few UK institutions that do issue Diploma Supplements aren’t generally using Section 6.1 this way, and the opportunity sits unused while European counterparts have built processes around it.

    Show your working

    If UK universities did adopt European approaches, Section 6.1 entries might look something like this.

    A sabbatical officer’s entry might read:

    The student served as Education Officer of the Students’ Union from September 2024 to June 2025, a full-time elected position. The role involved policy development, representation on University Senate and Academic Board, line management of student staff, and budget oversight. Approximately 1,600 hours across the academic year.

    For a student representative on Senate:

    The student served as elected student representative on University Senate from October 2023 to June 2025, attending scheduled meetings and contributing to governance decisions. Verified by the University Secretary.

    A society president’s record might state:

    The student served as President of the Economics Society during 2024-25, leading a committee of six officers and coordinating activities for approximately 200 members. Responsibilities included chairing meetings, event planning, and budget oversight. Approximate commitment: 150 hours.

    And for volunteering:

The student participated in the Community Tutoring Programme from October 2023 to May 2025, contributing approximately 120 hours of educational support. The student completed safeguarding training. Verified by the Students’ Union Volunteering Coordinator.

    The German 90-hour threshold provides a ready standard – some roles far exceed it, significant committee positions typically meet it, and casual participation does not.

    Using SSC vocabulary where it fits (planning and organising, problem solving, numeracy) while adding supplementary language for what the SSC doesn’t capture could help bridge the vocabulary gap for extracurricular learning.
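To make the logic concrete, here is a minimal sketch of how an institution might combine the German-style 90-hour threshold with standard wording to generate Section 6.1 entries. Everything here is illustrative — the field names, the `section_6_1_entry` function, and the wording template are hypothetical and not drawn from any official Diploma Supplement schema.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record of a verified extracurricular role. Field names are
# illustrative, not taken from any official Diploma Supplement schema.
@dataclass
class Activity:
    role: str
    organisation: str
    period: str
    hours: int
    verifier: str

# German-style threshold: 90 hours, roughly equivalent to three ECTS credits.
THRESHOLD_HOURS = 90

def section_6_1_entry(activity: Activity) -> Optional[str]:
    """Format a Section 6.1 entry, or return None if below the threshold."""
    if activity.hours < THRESHOLD_HOURS:
        return None
    return (
        f"The student served as {activity.role} of {activity.organisation} "
        f"during {activity.period}. Approximate commitment: {activity.hours} hours. "
        f"Verified by {activity.verifier}."
    )

president = Activity("President", "the Economics Society", "2024-25",
                     150, "the Students' Union")
print(section_6_1_entry(president))

helper = Activity("occasional volunteer", "the Film Society", "2024-25",
                  20, "the Students' Union")
print(section_6_1_entry(helper))  # below the 90-hour threshold, so None
```

The point of the sketch is that the hard part isn't formatting — it's the verification step behind the `verifier` field, which is exactly what the German and Polish catalogues formalise.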

    Assembly required

    Three interconnected problems need addressing – documentation, vocabulary and connection – but they’re currently being discussed in entirely separate conversations, if at all.

    The documentation question is whether the UK is serious about the Bologna commitment it signed up to – the current patchwork of some institutions issuing Diploma Supplements, some issuing HEAR, and some issuing only transcripts fails that commitment and disadvantages graduates.

    That might require national guidance, sector coordination, or regulatory attention – and institutions still issuing HEAR should consider whether declining sector adoption makes that sustainable and whether the Diploma Supplement’s European recognition makes it more valuable for graduates entering international labour markets.

    For those that do issue the Diploma Supplement, guidance on Section 6.1 practice would help – defining what activities qualify, what evidence is required, and what verification applies, with German and Polish catalogues providing ready models.

    The vocabulary question is whether the SSC’s core skills adequately capture the dispositional attributes that employer surveys highlight – the evidence suggests gaps, and adding concepts like positive attitude, resilience, and business awareness (or expanding existing definitions) would better reflect what employers actually prioritise.

    Consultation with bodies interested in informal learning – GFI, SUs, student volunteering departments – could help identify where the gaps bite hardest.

    And the connection question is simply whether anyone will notice these issues are related.

    Skills England is developing the SSC, Bologna compliance gets periodic implementation reports from panicked civil servants filling forms ahead of meetings, GFI is working on helping students articulate skills, and Advance HE promulgates employability and graduate attributes matrices.

    Universities UK is running “Future Jobs” roundtables with employers about matching graduate skills to labour market needs, Russell Group students’ unions are gathering data on their collective employability contribution, and SUs across the sector facilitate activities that could help populate all of it.

    But nobody seems to be connecting the vocabulary that could describe learning, the documentation that could record it, and the pedagogy that helps students recognise what they’ve gained.

    The fixes aren’t technically difficult – they require using existing infrastructure properly, filling vocabulary gaps that employer evidence identifies, and connecting work happening in different parts of the system.

    We no longer, it seems, have a government department with the capacity or disposition to undertake work like this – but it is an opportunity that’s there for the taking.

    Source link

  • After Research, Tennessee Lawmaker Drops Bill to End Tenure

    After Research, Tennessee Lawmaker Drops Bill to End Tenure


    Tennessee’s House Higher Ed Subcommittee chair has withdrawn his bill to end tenure in public universities after saying he “stumbled into a little bit of the history” and “got a little deeper than I thought I would.”

    “It got me to thinking about political lines, pendulums, they’re always moving … I kind of think that way about tenure,” Republican Justin Lafferty told his subcommittee Wednesday in a brief but wide-ranging explanation for dropping the bill.

    According to a video of the meeting posted on the state General Assembly’s website, Lafferty said he learned tenure goes back to the 1600s or 1700s, “a time when there weren’t that many highly educated folks,” so “it was very important to keep the best and the brightest.”

    Though he didn’t use the words “academic freedom,” he echoed arguments for protecting it that proponents of tenure often use. Mentioning the Vietnam War era, Lafferty said, “In a controversial time, I kind of understand you want those protections in place to not lose the talent that you’ve been able to acquire.”

    But he also suggested that he filed his bill in opposition to controversial faculty speech. He didn’t mention Charlie Kirk, but he complained about faculty speech regarding someone’s death and a “half a million” payout. (Darren Michael, a tenured theater professor at Tennessee’s Austin Peay State University, was terminated for reposting a news headline about Kirk but was later reinstated and paid $500,000.)

    “With tenure now, the pendulum has swung so far that we can have state employees that we pay with our tax dollars—‘mock’ might not be the right word, but can certainly be very insensitive towards the death of another human being,” Lafferty said. “And as a Tennessean, I’m not comfortable with the fact that that person cannot be removed from a job.”

    Lafferty withdrew his bill, but he may not be done targeting tenure. He said during the meeting that “we’ll maybe be back.” News Channel 5 reported that Lafferty said the bill likely didn’t have a path forward this year. He didn’t return Inside Higher Ed’s requests for comment Thursday.

    Source link

  • How to Build a Higher Education Web Team

    How to Build a Higher Education Web Team

    Your institution’s website is one of your strongest branding assets. Creating a best-in-class web experience for your audiences and maintaining it requires the right people in the right roles. But how many people do you need? What roles should they play? And how much of the answer depends on your CMS, your content production strategy, or the few dozen stakeholders who “just want to make a quick edit”?

    Here’s the good news: you don’t need a huge team as long as you have a strategic team. Whether your model of web governance is centralized, decentralized, or somewhere in between, there’s a way to structure your university’s web team so it supports your goals and those of your contributors.

    Assess Your Current CMS, Site Complexity, and Staffing

    Get some clarity on your situation. Before you go drafting org charts or rewriting job descriptions, ask three questions:

    1. What CMS are we using?

    A WordPress site with 30 content contributors requires a different level of support than an enterprise Drupal install.

    2. How big and complex is the site?

    A single, streamlined .edu with clear governance is one thing. A legacy multi-domain labyrinth of program pages, faculty bios, and microsites requires a different level of care.

    3. What support do we already have?

    Is there an understanding that the developers within your IT department will also serve Marcomm? Do you keep an agency on retainer for web support? How staffed-up is your central marketing team? Identifying any gaps will help you pinpoint the roles you actually need to fill.

    Tip: Document the true state of things today, not what’s supposed to be true. Gaps, workarounds, and unofficial duties all provide valuable clues to where your web team structure might need more support and reinforcement.

    Define the Core Web Team Roles Every Institution Needs

    There’s no one-size-fits-all structure for every institution, but there are core functions that have to be accounted for on every successful higher education web team. Whether these roles live in one person or four, they need to be covered.

    1. Strategy

    This person or team holds the vision. They think in systems and are able to connect the dots across teams. A strategist ensures that the website does more than function; it actively moves the needle on broader institutional goals.

    2. Content

    Your web content is not self-managing. You need someone who understands how to write, edit, and maintain content for humans (and search engines and generative AI, too).

    3. User Experience & Design

    This role shapes the website experience, ensuring that every page is visually consistent, accessible to all users, and designed to support institutional goals through thoughtful UX and a cohesive design system.

    4. Development

    Even the tidiest CMS needs attention from a developer. Whether it’s minor front-end changes, troubleshooting plugin issues, or core updates, you need someone technical to keep things running.

    Tip: If you don’t have in-house developers, make sure your CMS isn’t so customized with plugins that it makes your implementation unwieldy, fragile, and difficult to keep updated.

    Establish a Clear Web Governance Model

    “Everyone owns the website” sounds collaborative, but without a defined structure, it’s chaos. That doesn’t mean all ownership should be centralized. After all, many university web teams want to, or are best resourced to, rely on decentralized academic and departmental units to support web work. However, it does mean you need a clear model.

    Here’s what we know holds up well when it comes to higher education web governance:

    • Defined roles: Who owns what, who approves what, and who’s responsible when something breaks?
    • Governance structure: Its policies as well as working norms. What’s expected, what’s supported, and what happens when someone goes rogue?
    • Guardrails: Templates, standards, permissions, and training will keep your site consistent, cohesive, and professional, safeguarding your brand and ensuring the best UX for your site visitors.
    • Community: Build your editor community like you’d build your brand. You’ll need to support this community, stay in close communication, and seek out feedback regarding pain points, feature requests, and other challenges and opportunities that may arise among your power users.

    Decentralized content models work beautifully when they’re supported with intention. This prevents the inadvertent distribution of chaos across your web properties.

    Build a Sustainable Training and Contributor Support System

    Training isn’t optional, especially if your web team supports distributed content contributors. It’s the difference between a brand-aligned, accessible site and a digital free-for-all.

    Here’s what we’ve seen work well:

    • Practical CMS guides tailored to your setup. These can be in the form of short videos or lightweight documentation.
    • Quick-start templates for content contributors. Look to your support queue for common CMS asks and frequent stumbling points for your user base. This will ensure you can address the issues that are vexing your user base and also let them know you’re focused on continuous improvement.
    • Style and accessibility checklists. Even if your style guide is in early stages or your accessibility guidance doesn’t cover every WCAG guideline, start with what you have and build as you go.
    • Ongoing refreshers. Think lunch-and-learns, active Slack or Teams CMS knowledge-sharing groups, or a regular “web best practices” newsletter for CMS users. Provide opportunities to upskill while fostering an active community of practice on your campus.

    Real-world documentation should follow the same best practices as other content so it can be easily read, understood, and followed. It can be snappy and nimble, delivered in the same tone and clarity you want to see reflected in your web content.

    TL;DR: Build What You Need—No More, No Less

    Your university’s web team doesn’t need to be huge. But it does need to be built deliberately around your campus reality: your CMS, your site, your people, and your capacity. Start with clarity, invest in training, and build the structure that makes a great website possible.

    Need help figuring out what structure makes sense for your team?

    Carnegie helps institutions of all sizes map the roles, training, and web governance they actually need to build and maintain a successful website. Let’s dig in together.


    Frequently Asked Questions About Building a Higher Ed Web Team

    How many people should be on a higher education web team?

    There is no universal number. The right size depends on your CMS, site complexity, governance model, and content volume. What matters most is ensuring four core functions are covered: strategy, content, user experience/design, and development. In smaller institutions, one person may cover multiple functions. In larger institutions, these may be distributed across specialized roles.

    What roles are essential for a successful university web team?

    Every successful web team must account for:

    • Strategic leadership aligned with institutional goals
    • Content creation and optimization
    • UX and design oversight
    • Technical development and CMS support

    Whether centralized or decentralized, these responsibilities must be clearly defined to prevent gaps or duplication.

    Should higher education websites be centrally or decentrally managed?

    Both models can work. Centralized governance creates consistency and control. Decentralized models increase agility and subject-matter expertise. The most effective approach defines clear guardrails, approval processes, and training structures so distributed contributors operate within a cohesive system.

    How does governance improve website performance?

    Governance clarifies ownership, approval workflows, standards, and expectations. Without it, websites become inconsistent, outdated, and difficult to maintain. Strong governance improves accessibility, brand consistency, SEO performance, and long-term sustainability.

    How often should web contributors receive training?

    Training should be ongoing, not one-time. Institutions benefit from:

    • Initial CMS onboarding
    • Accessibility and style refreshers
    • Documentation updates
    • Quarterly or semi-annual best-practice sessions

    Ongoing training prevents content drift and strengthens distributed web communities.

    When should an institution seek external web support?

    External support can be helpful when:

    • Your team lacks development resources
    • Governance is unclear or difficult to enforce
    • Accessibility compliance needs monitoring
    • SEO and analytics insights are underutilized
    • Your CMS requires optimization or reconfiguration

    Carnegie partners with institutions to strengthen governance, improve accessibility, optimize content performance, and build sustainable web team structures that support long-term success.

    Source link