Category: Data

  • Re-capturing the early 80s | HESA


    Most of the time when I talk about the history of university financing, I show a chart that looks like this, showing that government funding to the sector is up by a factor of about 2.3 after inflation since 1980, while total funding is up by a factor of 3.6.

    Figure 1: Canadian University Income by source, 1979-80 to 2022-23, in billions of constant $2022

    That’s just a straight up expression of how universities get their money. But what it doesn’t take account of are changes in enrolment, which, as Figure 2 shows, were a pretty big deal. Universities have admitted a *lot* more students over time. The university system has nearly doubled since the end of the 1990s and nearly tripled since the start of the 1990s.

    Figure 2: Full-time Equivalent Enrolment, Canada, Universities, 1978-79 to 2022-23

    So, the question is, really, how have funding pattern changes interacted with changes in enrolment? Well, folks, wonder no more, because I have toiled through some unbelievably badly-organized Excel data to bring you funding data on this that goes back to the 1980s (I did a version of this back here, but I only captured national-level data—the toil here involved getting data granular enough to look at individual provinces). Buckle up for a better understanding of how we got to our present state!

    Figure 3 is what I would call the headline graph: University income per student by source, from 1980-81 to the present, in constant $2022. Naturally, it looks a bit like Figure 1, but more muted because it takes enrolment growth into account.

    Figure 3: University income per student by source, from 1980-81 to the present, in constant $2022

    There’s nothing revolutionary here, but it shows a couple of things quite clearly. First, government funding per student has been falling for most of the past 40 years; the brief period from about 1999 to 2009 stands out as the exception rather than the norm. Second, despite that, total funding per student is still quite high compared with the 1990s. Institutions have found ways to replace government income with income from other sources. That doesn’t mean the quality of the money is the same. As I have said before, hustling for money incurs costs that don’t occur if governments are just writing cheques.

    As usual, though, looking at the national picture often disguises variation at the provincial level. Let’s drill one level down and see what happened to government spending at the sub-national level. A quick note here: “government spending” means *all* government spending, not just provincial government spending. So, Ontario and Quebec probably look better than they otherwise would because they receive an outsized chunk of federal government research spending, while the Atlantic provinces probably look worse. I doubt the numbers are affected much because overall revenues from federal sources are pretty small compared to provincial ones, but it’s worth keeping in mind as you read the following.

    Figure 4 looks at government spending per student in the “big three” provinces, which together make up over 75% of the Canadian post-secondary system. Nationally, per-student spending fell from $22,800 per year to $17,600 per year. But there are differences here: Ontario spent the entire 42-year period below that average, while BC and Quebec spent nearly all of that period above it. Quebec has notably seen very little in terms of per-student fluctuations, while BC has been more volatile. Ontario saw a recovery in spending during the McGuinty years, but has since experienced a drop of about 35%. Of note, perhaps, is that most of this decline happened before the arrival of the current Ford government.

    Figure 4: Per-Student Income from Government Sources, in thousands of constant $2022, Canada and the “Big Three” provinces, 1980-81 to 2022-23

    Figure 5 shows that spending volatility was much higher in the three oil provinces of Alberta, Saskatchewan, and Newfoundland & Labrador. All three provinces spent virtually the entirety of our period with above-average spending levels, but the gap between these provinces and the national average was quite large both in the early 1980s and from about 2005 onwards: i.e. when oil prices were at their highest. Alberta, of course, has seen per-student funding drop by about 50% in the last fifteen years, but at the same time, it is close to where it was 25 years ago. So, was it the dramatic fall or the precipitous rise that was the outlier?

    Figure 5: Per-Student Income from Government Sources, in thousands of constant $2022, Canada and the “Oil provinces”, 1980-81 to 2022-23

    Figure 6 shows the other four provinces for the sake of completeness. New Brunswick and Nova Scotia were the lowest spenders in the country for most of the period we’re looking at, only catching up to the national average in the mid-aughts. Interestingly, the two provinces took two different paths to raise per-student spending: Nova Scotia did it almost entirely by raising spending, while in New Brunswick this feat was to a considerable extent “achieved” by a significant fall in student numbers (this is a ratio, folks, both the numerator and the denominator matter!).

    Figure 6: Per-Student Income from Government Sources, in thousands of constant $2022, Canada and selected provinces, 1980-81 to 2022-23

    An interesting question, of course, is what it would have cost to have kept public spending at 1980 per-student levels. It’s an interesting question because, remember, total spending did in fact rise quite substantially (see Figure 1): it just didn’t rise as fast as student numbers. So, in Figure 7, I show what it would have cost to keep per-student expenditures stable at 1980-81 levels, both assuming student numbers had stayed constant and given actual student numbers.

    Figure 7: Funds required to return to 1980-81 levels of per-student government investment in universities, Canada, in millions of constant $2022

    Weird-looking graph, right? But here’s how to interpret it. Per-student public funding did fall in the 80s and early 90s. But it rose again in the early aughts, to the point where per-student funding went back to where it was in 1980, even though the number of students in the system had doubled in the meanwhile. From about 2008 onwards, though, public investment started falling off again in per-student terms, going back to mid/late-90s levels even as overall student numbers continued to rise. We are now at the point where getting back to the levels of 1980-81, or even just 2007-08, would require a rise of between $6 billion and $6.5 billion.
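    The arithmetic behind Figure 7 is essentially (per-student funding gap) × (enrolment). A minimal sketch, using the national per-student figures quoted earlier in this post; the FTE enrolment count is my own illustrative assumption, not a number from the post:

    ```python
    # Sketch of the Figure 7 calculation: the cost of restoring 1980-81
    # per-student government funding at current enrolment levels.
    # Per-student figures are from the post; the FTE count is an
    # illustrative assumption.
    per_student_1980 = 22_800    # constant $2022, per FTE student
    per_student_now = 17_600     # constant $2022, per FTE student
    fte_students = 1_200_000     # assumed current FTE enrolment

    gap = (per_student_1980 - per_student_now) * fte_students
    print(f"Funds required: ${gap / 1e9:.2f} billion")
    ```

    With those assumptions the gap comes out around $6.2 billion, consistent with the $6-6.5 billion range cited above.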

    Anyways, that’s enough sunshine for one morning. Have a great day.


  • College Financials 2022-23 | HESA


    StatsCan dropped some college financial data over the XMAS holidays.  I know you guys are probably sick of this subject, but it’s still good to have some national data—even if it is eighteen months out of date and doesn’t really count the last frenzied months of the international student gold rush (aka “doing the Conestoga”).  But it does cover the year in which everyone now agrees student visa numbers “got out of control,” so there are some interesting things to be learned here nonetheless.

    To start, let’s look quickly at college income by source. Figure 1, below, shows that college income did rise somewhat in 2022-23, due mainly to an increase in tuition income (up 35% between the nadir COVID year of 2020-21 and 2022-23). But overall, once inflation is taken into account, the increase in college income really wasn’t all that big: about a billion dollars in 2021-22 and about the same again in 2022-23, or about 6-7% per year after inflation. Good? Definitely. Way above what universities were managing, and well above most sectors internationally. But it’s not exactly the banditry that some communicators (including the unofficial national minister of higher education, Marc Miller) like to imply.

    Figure 1: College Income by Source, Canada, 2017-18 to 2022-23, in Billions of $2022

    Now I know a few of you are looking at this and scratching your heads, asking what the hell is going on in Figure 1.  After all, haven’t I (among others) made the point about record surpluses in the college sector?  Well, yes.  But I’ve only ever really been talking about Ontario, which is the only province where international tuition fees have really taken flight.  In Figure 2, I put the results for Ontario and for the other nine provinces side-by-side.  And you can see how different the two are.  Ontario has seen quite large increases in income, mainly through tuition fees and by ancillary income bouncing back to where it was pre-COVID, while in the other nine provinces income growth is basically non-existent in any of the three categories.

    Figure 2a/b: College Income by Source, Ontario vs Other Nine Provinces, 2017-18 to 2022-23, in Billions of $2022

    (As an aside, just note here that over 70% of all college tuition income is collected in the province of Ontario, which is kind of wild. At the national level, Canada’s college sector is not really a sector at all: the provincial systems’ aims, goals, tools, and income patterns all diverge enormously.)

    Figure 3 drills down a little bit on the issue of tuition fee income to show where it has been growing and where it has not. One might look at this and think it’s irreconcilable with Figure 2, since tuition fees in the seven smaller provinces seem to be increasing at a rate similar to Ontario’s. What that should tell you, though, is that the base tuition from which these figures are rising is pretty meagre in the seven smallest provinces, and quite significant in Ontario. (Also, remember that in Ontario, domestic tuition fees fell by over 20% after inflation between 2019-20 and 2022-23, so this chart is actually underplaying the growth in international fees in that province a bit.)

    Figure 3: Change in Real Aggregate Tuition Income by Province, 2017-18 to 2022-23, (2017-18 = 100)

    Now I want to look specifically at some of the data with respect to expenditures and to ask the question: where did that extra $2.2 billion that the sector acquired in 21-22 and 22-23 (of which, recall, over 70% went to Ontario alone) go?

    Figure 4 answers this question in precise detail, and once again the answer depends on whether you are talking about Ontario or the rest of the country. The biggest jump in expenditures by far is “contracted services” in Ontario—an increase of over $500M in just two years. This is probably as close a look as we will ever get at the economics of those PPP colleges that were set up around the GTA, since most of this sum is almost certainly made up of public college payments to those institutions for teaching the new students who had arrived in those two years. If you assume the increase in international students at those colleges was about 40,000 (for a variety of reasons, an exact count on this is difficult), then that implies that colleges were paying their PPP partners about $12,500 per student on average and pocketing the difference, which would have been anywhere between about $2,500 and $10,000, depending on the campus and program. And of course, most of the funds paid to PPP partners were spent one way or another on teaching expenses for these students.
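    A quick back-of-envelope version of that arithmetic (the $500M and 40,000-student figures are the rough estimates given above, so treat the outputs as illustrative rather than audited):

    ```python
    # Back-of-envelope check of the PPP payment arithmetic described above.
    # Both inputs are rough estimates, not audited figures.
    contracted_services_increase = 500_000_000  # dollars, over two years
    new_ppp_students = 40_000                   # estimated new international students

    per_student_payment = contracted_services_increase / new_ppp_students
    print(f"Implied payment to PPP partner: ${per_student_payment:,.0f} per student")

    # If the public college's margin was $2,500-$10,000 per student,
    # the implied international tuition charged would be:
    for margin in (2_500, 10_000):
        print(f"Margin ${margin:,} implies tuition of ${per_student_payment + margin:,.0f}")
    ```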

    Figure 4: Change in Expenditures/Surplus, Canadian Colleges 2022-23 vs 2020-21, Ontario vs. Other 9 Provinces, in millions of $2022

    On top of that, Ontario colleges threw an extra $300 million into new construction (this is a bit of an exaggeration because 2020-21 was a COVID year and building expenses were abnormally low), and an extra $260 million (half a billion in total) into reserve funds for future years. This last is money that probably would have ended up as capital expenditures in future years if the feds hadn’t come crashing in and destroyed the whole system last year, but it will now probably get used to cover losses over the next year or two instead. Meanwhile, in the rest of Canada, surpluses decreased between 2020-21 and 2022-23, and such spending increases as occurred came mostly under the categories “miscellaneous” and “ancillary enterprises.”

    2022-23 of course was not quite “peak international student,” so this analysis can’t quite tell the full story of how international students affected colleges. We’ll need to wait another 11 months for that data to show up. But I doubt that the story I have outlined, based on the data available to date, will change much. In short, the financials show that:

    • Colleges outside Ontario were really not making bank on international students.
    • Within Ontario, over a third of the additional revenue from international students generated in the 2020-21 to 2022-23 period was paid out to PPP partners, who would have spent most of that on instruction.
    • Of the remaining billion or so, about a third went into new construction and another 20% was “surplus,” which probably meant it was intended for future capital expenditure.
    • The increase in core college salary mass was minuscule—in fact only about 3% after inflation.

    If there was “empire building” going on, it was in the form of constructing new buildings, not in terms of massive salary rises or hiring sprees. 


  • Deafening Silence on PIAAC | HESA


    Last month, right around the time the blog was shutting down, the OECD released its report on the second iteration of the Programme for the International Assessment of Adult Competencies (PIAAC), titled “Do Adults Have the Skills They Need to Thrive in a Changing World?”. Think of it perhaps as PISA for grown-ups, providing a broadly useful cross-national comparison of basic cognitive skills which are key to labour market success and overall productivity. You are forgiven if you didn’t hear about it: its news impact was equivalent to the proverbial tree falling in a forest. Today, I will skim briefly over the results, but more importantly, ponder why this kind of data does not generate much news.

    First administered in 2011, PIAAC consists of three parts: tests of literacy, numeracy, and what they call “adaptive problem solving” (this last has changed a bit—in the previous iteration it was something called “problem-solving in technology-rich environments”). The test scale runs from 0 to 500, and individuals are categorized as being in one of six “bands” (1 through 5, with 5 being the highest, plus a “below 1,” which is the lowest). National scores across all three of these areas are highly correlated, which is to say that if a country is at the top or bottom, or even in the middle, on literacy, it’s almost certainly pretty close to the same rank order for numeracy and problem solving as well. National scores all cluster in the 200 to 300 range.

    One of the interesting—and frankly somewhat terrifying—discoveries of PIAAC 2 is that literacy and numeracy scores are down in most of the OECD outside of northern Europe. Across all participating countries, literacy is down fifteen points, and numeracy by seven. Canada is about even in literacy and up slightly in numeracy—this is one trend it’s good to buck. The reason for this is somewhat mysterious—an aging population probably has something to do with it, because literacy and numeracy do start to fall off with age (scores peak in the 25-34 age bracket)—but I would be interested to see more work on the role of smart phones. Maybe it isn’t just teenagers whose brains are getting wrecked?

    The overall findings actually aren’t that interesting. The OECD hasn’t repeated some of the analyses that made the first report so fascinating (the results were a little too interesting, I guess), so what we get are some fairly broad banalities—scores rise with education levels, but also with parents’ education levels; employment rates and income rise with skill levels; there is broadly a lot of skill mismatch across all economies, and this is a Bad Thing (I am not sure it is anywhere near as bad as the OECD assumes, but whatever). What remains interesting, once you read through the whole report, are the subtle differences one picks up in the results from one country to another.

    So, how does Canada do, you ask? Well, as Figure 1 shows, we are considered to be ahead of the OECD average, which is good so far as it goes. However, we’re not at the top. At the head of the class across all measures are Finland, Japan, and Sweden, followed reasonably closely by the Netherlands and Norway. Canada is in a peloton behind that, with a group including Denmark, Germany, Switzerland, Estonia, the Flemish region of Belgium, and maybe England. This is basically Canada’s sweet spot in everything when it comes to education, skills, and research: good but not great, and it looks worse if you adjust for the amount of money we spend on this stuff.

    Figure 1: Key PIAAC scores, Canada vs OECD, 2022-23

    Canadian results can also be broken down by province, as in Figure 2, below. Results do not vary much across most of the country. Nova Scotia, Ontario, Saskatchewan, Manitoba, Prince Edward Island, and Quebec all cluster pretty tightly around the national average. British Columbia and Alberta are significantly above that average, while New Brunswick and Newfoundland are significantly below it. Partly, of course, this has to do with things you’d expect, like provincial income, school policies, etc. But remember that this is across entire populations, not school leavers, and so interprovincial migration plays a role here too. Broadly speaking, New Brunswick and Newfoundland lose a lot of skills to places further west, while British Columbia and Alberta are big recipients of migration from places further east (international migration tends to reduce average scores: language skills matter, and taking the test in a non-native tongue tends to result in lower overall results).

    Figure 2: Average PIAAC scores by province, 2022-23

    Anyways, none of this is particularly surprising or perhaps even all that interesting. What I think is interesting is how differently this data release was handled from the one ten years ago. When the first PIAAC was released a decade ago, Statistics Canada and the Council of Ministers of Education, Canada (CMEC) published a 110-page analysis of the results (which I analyzed in two posts, one on Indigenous and immigrant populations, and another on Canadian results more broadly) and an additional 300(!)-page report lining up the PIAAC data with data on formal and informal adult learning. It was, all in all, pretty impressive. This time, CMEC published a one-pager which linked to a StatsCan page containing all of three charts and two infographics (fortunately, the OECD itself put out a 10-pager that is significantly better than anything produced domestically). But I think all of this points to something pretty important, which is this:

    Canadian governments no longer care about skills. At least not in the sense that PIAAC (or PISA for that matter) measures them.

    What they care about instead are shortages of very particular types of skilled workers, specifically health professions and the construction trades (which together make up about 20% of the workforce). Provincial governments will throw any amount of money at training in these two sets of occupations because they are seen as bottlenecks in a couple of key sectors of the economy. They won’t think about the quality of the training being given or the organization of work in the sector (maybe we wouldn’t need to train as many people if the labour produced by such training was more productive?). God forbid. I mean that would be difficult. Complex. Requiring sustained expert dialogue between multiple stakeholders/partners. No, far easier just to crank out more graduates, by lowering standards if necessary (a truly North Korean strategy).

    But actual transversal skills? The kind that make the whole economy (not just a politically sensitive 20%) more productive? I can’t name a single government in Canada that gives a rat’s hairy behind. They used to, twenty or thirty years ago. But then we started eating the future. Now, policy capacity around this kind of thing has atrophied to the point where literally no one cares when a big study like PIAAC comes out.

    I don’t know why we bother, to be honest. If provincial governments and their ministries of education in particular (personified in this case by CMEC) can’t be arsed to care about something as basic as the skill level of the population, why spend millions collecting the data? Maybe just admit our profound mediocrity and move on.


  • More Eating the Future | HESA


    Morning everyone. Welcome back. Some statistical wonkery today, with respect to the analysis of government expenditures on postsecondary education.

    Many of you will recognize Figures 1 and 2 from earlier blogs or the State of Postsecondary Education 2024. They represent the two most common ways to look at commitments to postsecondary education: the first in per-student terms, and the second in per-GDP terms.

    Figure 1: Provincial Expenditures per FTE Student by Sector, 2022-23

    Figure 2: Provincial PSE Expenditures, by Sector, as a Percentage of Provincial GDP, 2022-23

    These two approaches have their respective strengths and weaknesses, and not surprisingly they generate slightly different conclusions about how strong each jurisdiction’s commitment to postsecondary education is: one is focused on the “recipients” of funding (students) and the other on the source of the funding (the local economy). Neither is definitive; both are useful.

    But there is another way to look at this funding, and that is not to look at how much institutions receive as a proportion of local jurisdictional output, but to look at what percentage of government spending is devoted to educational institutions. Examined over time, this figure tells you the changing status of postsecondary education compared to other policy priorities; examined across provinces, it can tell you which provinces put more emphasis on postsecondary education. Of course, no one tracks this in Canada, because it involves a lot of tedious mucking around in government documents, but what is this blog for if not precisely that? I wasn’t doing anything on my holidays anyways.

    So I decided to pair my long-term data series on provincial budgets (the most recent one posted back in April) with a new data series on total provincial spending, which I derived simply by looking at consolidated expenditures in each province since 2006, as expressed in these same budgets. Usual disclaimers apply: provincial spending definitions aren’t entirely parallel (or at least they use different words to describe what they are doing), particularly with respect to capital, so inter-provincial comparisons are probably a tiny bit apples-to-oranges even if each province’s data is consistent over time. Take the exact numbers with a grain of salt, but I think they will mostly stand up to scrutiny.

    Figure 3 shows provincial transfers to postsecondary institutions across all ten provinces as a percentage of total provincial spending. And it’s…well, it’s not good. As recently as 2011-12, provinces spent five percent of their budgets on postsecondary education. Now it’s three and a half percent. Or to put it another way, as a proportion of total spending, it’s down by about thirty percent.

    Figure 3: Provincial Spending on PSE as a Percentage of Total Provincial Spending, Canada, 2006-07 to 2024-25

    Is this due to particular events in particular provinces? Not really. Let’s just take a look at the four big provinces (which make up 85% of the postsecondary system). The provinces all started in different places (Alberta, famously, spent a heck of a lot more than other provinces back in the day) and the slope of decline is gentler in Quebec than elsewhere, but the basic path of decline and the eventual destination is similar everywhere. Notably absent in all four provinces are any clear break-lines coinciding with a change in administration—these declines are pretty consistent regardless of whether governments are left, right, or centre. It’s not a partisan thing.

    Figure 4: Provincial Spending on PSE as a Percentage of Total Provincial Spending, Selected Provinces, 2006-07 to 2024-25

    Figure 5 shows each province’s performance in both 2006-07 and 2024-25. As can clearly be seen, every province saw a decline over the 18-year period. This was not especially driven by one or two provinces: all provinces seem to have come to an identical conclusion that postsecondary institutions are not worth investing in. The size of the drop was in most cases proportionate to how high spending was back in the initial period. The biggest drops were in Alberta and Newfoundland, which back in the day were the two highest spenders, riding high as they were on oil revenues. The smallest drop was in New Brunswick, which was the weakest performer back in 2006-07. Ontario…is Ontario. But basically, the entire country is converging on the idea that investments in postsecondary should be in the 2.5%-4.5% range rather than the 4-7.5% range of 20 years ago.

    Figure 5: Provincial Spending on PSE as a Percentage of Total Provincial Spending, by Province, 2006-07 vs 2024-25

    Now, the obvious conclusion you might draw from this is “hey! Huge declines in public support for public postsecondary education!” But this is not quite correct. Remember: these are ratios we are looking at. Some of the delta will be due to changes in the numerator, some will be due to changes in the denominator. Figure 6 shows changes in both postsecondary spending and total provincial spending. And what’s clear is that the changes we have been examining in Figures 3 through 5 have more to do with the expansion of total spending than with a decline in PSE spending.
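    Since these are ratios, the share change can be decomposed into numerator and denominator effects. A rough sketch using round numbers in the spirit of the post (a share falling from about 5% to 3.5% while real total spending grows about 30%; the exact values are illustrative):

    ```python
    # Decomposing the fall in the PSE share of provincial spending.
    # Illustrative round numbers: the share falls from ~5.0% to ~3.5%
    # while total real provincial spending grows ~30%.
    share_start, share_end = 0.050, 0.035
    total_growth = 1.30  # real growth factor for total spending

    share_change = share_end / share_start - 1
    # Implied real growth factor for PSE transfers themselves:
    implied_pse_growth = share_end * total_growth / share_start

    print(f"Change in PSE share: {share_change:.0%}")                      # -30%
    print(f"Implied real change in PSE transfers: {implied_pse_growth - 1:.0%}")
    ```

    Under these assumptions, a 30% drop in the share is consistent with transfers falling only modestly in real terms: most of the move comes from the growing denominator.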

    Figure 6: Real Change Provincial Spending on PSE Institutions vs Real Change Total Provincial Spending, Canada, 2006-07 to 2024-25 (2006-07 = 100)

    That increase in provincial spending in the last decade—30% over and above inflation—is wild. And deeply inconvenient for anyone who wants to build a narrative around generalized “austerity.” But what is clear here is:

    1. transfers to universities and colleges have trailed provincial spending everywhere and without reference to the ideology of the governments in question, and
    2. if transfers had not trailed general spending, they would be roughly $9.5 billion better off than they are today.

    And by a simply *amazing* coincidence, $9.5 billion—in real dollars—is almost identical to the increase in revenue postsecondary institutions have seen from international students over the same period (it’s about a $9.2 billion increase from 2007-08 to 2022-23, the last year for which we have useful data—the 2024-25 figure is likely somewhat higher, but we don’t know by how much).

    There are a number of conclusions one could draw from this, but the ones I draw are:

    • Governments are spending more. A lot more. They just aren’t spending it on PSE. Instead, they are spending it on an ageing population and other things that juice consumption. Eating the future, basically.
    • The drop in government support for PSE relative to overall spending increases is universal. No government provides any evidence of contrarian thinking. None of them think PSE is worth greater investment.
    • Changes of government are also almost irrelevant. They may change the “vibe” around postsecondary education, but they don’t change financial facts on the ground.
    • There is a really basic argument about the value of postsecondary education which somehow, postsecondary institutions are losing with governments and, I think by implication, the public. That, and nothing else, needs to be the focus of institutional efforts on external relations.

    Provincial governments are eating the future. But the data above, showing that the trend transcends geography and political ideology, suggests that, at base, the problem is that the Canadian public does not think postsecondary education is worth investing in. Working out how to reverse this view really needs to be job one for the whole sector.

    (Or, to be a bit cuter: the sector needs to do a lot less Government Relations and a lot more Community Relations.)


  • Your 2025 higher education policy almanac


    Well, it’s January again.

    The early months of last year were dominated by the Conservatives’ slow swan dive into electoral oblivion, and then we got a general election that saw little serious discussion of the sector’s future, aside from the trotting out of a few old canards.

    And since Labour took power in July, there have been two broad phases: an initial “these things take time” framing in which universities – as well as many other groups and industries – were asked to be patient. In November we got the tuition fee uplift in England (in cash terms, for one year) and news of a bigger reform plan due next summer. A little movement, but in grand terms it was still can-kicking. Even the concrete announcements we’ve had, such as on level 7 apprenticeships, have not been accompanied by detailed policy papers or formal consultations.

    There’s reason to think that 2025 will have more for wonks to get their teeth into. There’s plenty pending, promised, or otherwise pretty damn urgent. So the below is an attempt to reckon with absolutely everything we know is on its way that matters for HE. Please charitably ascribe any oversights to a post-holidays sugar crash on my part rather than wilfully turning a blind eye, and let me know what I’ve missed in the comments.

    Big ticket items

    In Westminster politics, the first half of next year is going to be completely dominated by the spending review, which will set departmental budgets for three financial years (2026–29) as well as lay out a five-year programme of capital spending. It has always been described as being “in the spring”, but recent reports suggest that Labour will fly as close to the summer solstice as they can with this definition, so make sure you’ve got some free time in June to deal with the fallout.

    If what we read in the papers is to be believed, what is – counterintuitively – the default policy of inflation-linked tuition fees will be confirmed for England at this point, taking us up over £10,000 a year by the end of the Parliament.

    This is also when we’ll hear more about the government’s plans for ten-year R&D budgets. Attendees of the 2023 Labour conference may recall science secretary Peter Kyle promising a decade of confirmed funding for UKRI and ARIA – this commitment has been repeatedly qualified since then, partly due to issues of practicality (given that it’s not a ten-year spending review) and partly due to a question mark over whether fixing research spending in this way is really a good idea. It’s likely to be restricted now to “specific R&D activities” – the (much) bigger question will be around levels of investment in R&D. Plus we’ll see to what extent the government really wants to commit to linking research and its missions – last autumn brought only a small pot of cash for this in 2025–26.

    Also due alongside the spending review is “further detail and plans for delivery” for the Lifelong Learning Entitlement – so don’t expect to hear much more before then, though the delayed commencement in 2027 makes the need for information marginally less pressing. And the finalised industrial strategy will also arrive, “aligned with” (and likely published together with) the spending review, laying out specific sector plans for areas like the creative industries, the life sciences, and professional services. Once complete, the idea is that these plans can then inform Skills England’s work, and potentially migration policy – it’s all very ambitious.

    The HE reform announcement in England that we’ve been promised for “the summer” will land – it appears – fairly hot on the heels of the spending review settlements, and any money needed for it will need to have been allocated already, or at least tucked into Office for Budget Responsibility (OBR) projections in some way. On the topic of the OBR, its spring forecast is due on 26 March – there are rumblings that its revised projections could spell fiscal trouble for the government.

    There are also clear indications that the HE reform statement will be preceded, or possibly accompanied, by a review of some kind. There have been rumours of a panel in place, and the indications are that this will fly under the radar somewhat and happen quickly – think Becky Francis’ curriculum review or Lord Darzi’s NHS audit, rather than a grand commission in the traditional “major review” style we have become used to.

    Around the sector

    Part of the Westminster government’s reform agenda is predicated on the sector coming up with ideas itself, which may end up drawing quite a lot on Universities UK’s blueprint from back in September. UUK’s own “efficiency and transformation taskforce” will be busy putting out recommendations on business models and collaboration, with the endorsement of education secretary Bridget Phillipson – “all options are on the table,” we are told, with plenty of policy debate likely to ensue once publications begin to appear.

    With many universities in poor financial shape, the search for longer-term sustainability will likely be derailed at regular intervals by news of redundancies and course closures. National industrial action is a possibility, though there are real questions around the willingness of struggling union members to take action on pay at this point. Local disputes will continue to flare up. Alongside this we have a renewed push for newer English universities to be exempted from the Teachers’ Pension Scheme due to the massively increased costs it is now carrying, a move which would substantially inflame industrial relations if it came to pass.

    And looming over all of this is the possibility of a disorderly market exit, and the question of whether the government has a viable plan in place to step in if a large institution were to hit the wall. All the other policy developments we are highlighting here could be hugely complicated by a sudden shock to the system and what is likely to be a political rather than a strategic response.

    The world of regulation

    There’s a lot to look out for from the Office for Students, from the appointment of a new permanent chair down (interviews are being held this month).

    There’s the ongoing consultation on a new strategy, the continuing fallout from the temporary closure of the register (this should supposedly also bring new proposals on improvements to the registration process), whispers of a more “integrated approach” to quality and whatever that means for the TEF, and a greater regional focus to access and participation.

    We should start getting assessment reports for the second round of quality investigations (where franchising and foundation years will be a focus) as well as the belated release of those grade inflation investigations that were announced on 2 September 2022. We’re waiting for consultation responses on a new approach to public grant funding and even on LLE regulation, though you can’t blame them for waiting to see what exactly the government is planning with this one.

    According to last summer’s business plan, there should also be consultations on potential new initial conditions of registration covering both management and governance, and consumer protection. And this year’s National Student Survey will have a sexual misconduct questionnaire appended – though it’s not clear at the time of writing to what extent the results will be made public.

    Over in Wales, Medr is taking shape, with a finalised strategic plan due to have been submitted to the Welsh government for approval just before the Christmas break – we should hear more of this soon, along with the consultation response.

    And if all that sounds like a lot, in Scotland we are due a Post-School Education Reform Bill at some point in the 2024–25 parliamentary session, which will make big changes to how the Scottish Funding Council (SFC) and Student Awards Agency Scotland operate. A consultation which closed in September asked stakeholders for thoughts on what the funding agency landscape should look like – we haven’t heard much since then. The sector is keen to stress the importance of universities retaining their autonomy, whatever happens – legislative passage could see MSPs push for new duties on the SFC.

    We’ve been aware for a long time that the Office for National Statistics is undertaking a review into whether higher education should be seen as “public sector” in the national accounts – it’s now been slightly rejigged into a review of the statistical classification of “the transactions in which UK universities engage.” For what is a very technical definition, an eye over the recent travails of the FE sector suggests that there are potential implications for everything from procurement to senior staff pay. The long-delayed work will kick off early in 2025.

    The research agenda

    What little research policy we’ve seen come out of the new government so far has been limited to haggling over budgets and science minister Patrick Vallance stressing that ministers should not meddle in university research. There’s no reason to think we will get big policy pronouncements out of the Department for Science, Innovation and Technology, which feels more interested in the tech and digital side of its remit, both legislatively and aesthetically. But there’s lots going on around the margins that could end up being quite consequential.

    First up we have the appointment of a new UKRI chief executive, where there’s already evidence the new minister has been having a think about longer-term strategic direction. While the new roleholder won’t take up office until June, we should get news of the appointment fairly soon.

    In the Research Excellence Framework world, the “modular” approach to releasing different policies on a staggered timetable will see the release of the volume measure policy (imminently) and the contribution to knowledge and understanding policy (scheduled for the summer). The more contentious people, culture and environment pilot will continue throughout the year, with criteria and definitions due for the winter – any slippage on this will likely provoke controversy.

    At UKRI, January will bring an update on its work reviewing how PGR stipends are set (as well as the stipend level for 2025–26). Elsewhere, the ongoing National Audit Office work looking at UKRI grants and loans could be a wildcard – it’s due to report in spring 2025 – and at the very least is a moment where the government will need to comment on how the research funding system is operating. Research England is also thinking about the current state of research infrastructure, via its condition of the estate survey, and how the sector’s financial challenges are affecting research – for both of these pieces of ongoing work, it’s doubtful that much will be shared publicly.

    Further afield, a European Commission proposal for the successor to Horizon Europe is due midway through the year, preceded by an interim evaluation of the current funding programme which will likely give an indication of its plans. We will also get regulations for the Foreign Influence Registration Scheme in the new year – the measures, which will speak to research security, are now expected to come into effect in the summer. It’s been reported that the government is resisting calls to put China on the “enhanced tier” of the scheme, a move that would have greatly complicated UK-China academic partnerships. On a related note, the government has quietly been conducting a “China audit” – this will be released in the coming months, and in theory will spell out the policy areas where closer ties will be permitted.

    Finally, the House of Commons Science, Innovation and Technology Committee will be conducting a timely inquiry into regional R&D, which should be a good opportunity for some more insight into how the government’s English devolution-related plans for more mayoral involvement in the research system will come together.

    International

    If you had to pick a policy area that will have the biggest macro impacts on the sector in 2025, you could do a lot worse than opt for international recruitment (you would arguably have been proved right if you’d chosen it in any of the last few years).

    Two big policy items are on their way here: a legal migration white paper, spelling out how the government will fulfil its electoral promise to bring net migration down, and a revised international education strategy (IES), which we’re told is coming “early spring”. Whether the strategy will appear before, after, or alongside the white paper remains to be seen – the sequencing could be significant.

    The big questions here are whether the government will put a recruitment target on the face of the strategy – the aspiration for 600,000 students in the last one ended up coming back to haunt the Conservatives among their own base – and what the plan for education exports targets might be. But there are other areas where we could see movement, such as on post-study work, where some in the sector seem hopeful that a little improvement could be on offer, despite the enormous political pushback the Graduate route has faced over the last couple of years. It feels like an outside bet.

    More important to keep an eye on will be whether some kind of arrangement is arrived at with net migration statistics – we know that the Office for National Statistics is looking at how estimates excluding students could be arrived at, and it’s been on the higher education sector’s wishlist for years.

    If it did come to pass, the devil would very much be in the detail – the Migration Advisory Committee annual report has already been noting the contribution that students make to long-term net migration, and Starmerite think tank Labour Together’s recent proposal is for visa routes such as Skilled Worker and Graduate to have multi-year targets, even if the Student visa does not. Put like that, it sounds like a recipe for universities to recruit pretty freely but for students’ post-study options to remain a political football – the seeming lack of student involvement in the IES review would appear all the more glaring in this case. The Universities UK blueprint did promise a kind of quid pro quo on responsible international recruitment, and it has been notable that government ministers have stressed the importance of housing availability when the question has come up in Parliament recently.

    Whatever comes out of it, it looks clear that the Home Office will continue to tread a careful line on student visas, and continue to implement the last government’s Graduate route review response. The use of “action plans” by UKVI for certain providers will continue, even if there is no substantive public comment from the Home Office about what these are and why they are being imposed. And there will also be a review of English language self-assessment policies over the next few months, “driven by growing concern around underlying reasons for reports of students being picked up at the border or entering UKHE with low levels of English” (in UUKi’s words). It’s unlikely much will be shared publicly about these, but they are items to watch, especially in the event that there is further negative publicity about international students in the media.

    It’s worth stressing that developments in migration and visa policy do not only affect students – the House of Lords Science and Technology Committee is next week highlighting the interplay between visas and international researchers, and there are ongoing issues such as the future of the family visa income threshold where the government will eventually need to take a position.

    And despite all this policy in play, the three most significant factors for future international recruitment will likely be the Australian federal election – where the incumbent government’s attempts to impose number caps have been thwarted by an opposition that wants bigger caps – the Canadian election – which could happen at any minute if Justin Trudeau is forced out, and where the Conservatives are strongly favoured to take power – and the impact of Donald Trump on the USA, where universities are already reportedly asking international students to return before he takes office. All these things have the potential to greatly benefit the UK “market”.

    Skills, skills, skills

    Before we get any HE reform news out of Westminster, there’s going to be policy elsewhere in the post-compulsory system, with Skills England gearing up for action – we’ll learn the appointments of chief executive and permanent chair pretty soon – and various policy pronouncements at this end of the tertiary sector are overdue.

    Probably the most impactful for higher education is confirmation about exactly what is happening with the apprenticeship levy, both in wider terms of the planned additional flexibility for non-apprentice courses (this will be less than the 50 per cent originally proposed… at least probably), and for the “defunding” of level 7 apprenticeships.

    Many universities are big operators in this space, and it appears that most if not all of these programmes will be removed from the levy’s scope (“a significant number” is the most recent framing from the government). Over Christmas the Telegraph reported that the much-feted doctor apprenticeship is now “paused in perpetuity”. We should get the full picture very soon, as well as the much-awaited post-16 strategy, which you would hope would give a decent insight into the government’s wider vision for tertiary education. Though it may not.

    The defunding of level 7 apprenticeships is also relevant for those higher education institutions that have been spending their levy contributions on such courses for their staff as part of their professional development offer. DfE assures us all that employers are more than welcome to pay for them using different funds, “where they feel they provide a good return on their investment.”

    Our world in data

    We’re getting the outcome of the Data Futures review soon! There may be some lessons to learn about programme management and platform delivery, which could play out as a shared commitment to improving processes or as an unedifying multi-agency row. Whatever the case, this year’s HESA Student data will arrive later than usual – “in the spring, earlier than last year’s August publication but later than the January release date achieved in previous years.” Whether this is spring as in daffodils, or spring as in spending review, remains to be seen – but the delay (and issues with data quality, as we saw last year) will have a knock-on effect on data releases elsewhere, once again.

    At the end of this month we are getting HESA Staff data for 2023–24. The headline figures from last year’s release did get quoted the odd time by the previous government – in answer to questions about the impact of redundancies and cuts, it would occasionally be pointed out by ministers that (academic) staff numbers were still rising when you look at the sector as a whole. These figures won’t show the impact of this academic year’s cuts, however.

    Of course, elsewhere we have the usual releases which make up the HE wonk’s annual working rhythm. UCAS end-of-cycle numbers, at provider level, are due out at the end of January, and further down the line (probably around spending review time!) we have HESA Finance data and the Office for Students’ accompanying financial sustainability report, which will likely once again be a moment of maximum attention for higher education’s bottom line.

    One other piece of data we are getting this spring is a new ONS release on student suicides. This will come alongside the independent review commissioned by the last government, and whatever the findings, it is likely to generate a lot of press coverage and renewed pushes from campaign groups and opposition parties for a statutory duty of care. Early indications from the current government are that it is happy with the voluntary, sector-led approach to mental health – but things can change.

    Elsewhere in government

    It’s amazing it’s taken us this long to get to it, but probably the biggest, most controversial item on DfE’s to-do list is a decision on the fate of the free speech act and its associated provisions and complaints scheme. The Free Speech Union has its day in court on 23 January as part of a legal challenge over the pausing of the bill’s commencement – it’s just possible that the government will try to get a decision out before then. Or it could all drag on intractably for several more months, very much in keeping with the legislation’s passage through Parliament.

    Another hugely consequential move which we may see from DfE this month is the launch of a consultation on proposals to “strengthen oversight of partnership delivery in higher education” in conjunction with OfS. The department “will be developing options for legislative change, if required,” the Public Accounts Committee was told back in September, with a target date of January 2025 for an update.

    We’re due impact assessments and regulations for the tuition fee and maintenance “increases”, which should also involve a government pronouncement on how much the national insurance increase will cost the sector. And while it’s not higher education business, the soon-to-appear curriculum review (covering the curriculum in England from key stage 1 to key stage 5) will have long-term consequences for the wider education system – as well as likely sparking further backlash among those worried about it recklessly promoting diversity and risking PISA scores.

    Elsewhere in Westminster, the ongoing parliamentary passage of massive pieces of legislation will have big consequences for universities and students. The Employment Rights Bill and the Renters’ Rights Bill will both likely see some amendments, and we’re still awaiting the text of the English Devolution Bill and the promised “Hillsborough” bill. The government’s NHS plan for change – again, due at some point in the spring – and proposed updates to the NHS Long-Term Workforce Plan are important to keep an eye on as well.

    Up in Scotland, one day we may see the fruits of the ongoing review of student maintenance for part-time students. Negotiations over the 2025–26 budget will dominate the parliamentary agenda in the early part of the year, with ministers appearing in front of committees to get into the details of what exactly will be funded and what will not – and then the countdown to 2026 elections begins (all of this sentence is also true in Wales).

    It’s dangerous to go alone – take this

    If you’ve made it this far, congratulations. It feels like there is currently a huge number of moving parts in play in policy-land, all of which will contribute to the future shape and operations of the UK higher education sector in various, often hard-to-predict ways. Some are pretty immediate, others are issues that should have been tackled long ago, and then there are long-term policy changes that will be massive news in the 2030s.

    Here at Wonkhe we try to cover every single policy development that affects the sector, especially in our Daily Briefings (which restart on Tuesday 7 January – my alarm is already set).

    So if you’re interested in following even a fraction of the stuff that’s set out above, do join us for the ride this year. And fair warning, it’s likely that a good number of the most important developments that 2025 has in store for us are not even on this list. We’ll cover those as well, the moment they arrive.


  • The data dark ages | Wonkhe


    Is there something going wrong with large surveys?

    We asked a bunch of people but they didn’t answer. That’s been the story of the Labour Force Survey (LFS) and the Annual Population Survey (APS) – two venerable fixtures in the Office for National Statistics (ONS) arsenal of data collections.

    Both have just lost their accreditation as official statistics. A statement from the Office for Statistics Regulation highlights just how much of the data we use to understand the world around us is at risk as a result: statistics about employment are affected by the LFS concerns, whereas the APS covers everything from regional labour markets, to household income, to basic stuff about the population of the UK by nationality. These are huge, fundamental sources of information on the way people work and live.

    The LFS response rate has historically been around 50 per cent, but it had fallen to 40 per cent by 2020 and is now below 20 per cent. The APS is an additional sample using the LFS approach – current advice suggests that response rates have deteriorated to the extent that it is no longer safe to use APS data at local authority level (the resolution it was designed to be used at).

    What’s going on?

    With so much of our understanding of social policy issues coming through survey data, problems like these feel almost existential in scope. Online survey tools have made it easier to design and conduct surveys – and often design in the kind of good survey development practices that used to be the domain of specialists. Theoretically, it should be easier to run good quality surveys than ever before – certainly we see more of them (we even run them ourselves).

    Is it simply a matter of survey fatigue? Or are people less likely to (less willing to?) give information to researchers for reasons of trust?

    In our world of higher education, we have recently seen the Graduate Outcomes response rate drop below 50 per cent for the first time, casting doubt on its suitability as a regulatory measure. The survey still has accredited official statistics status, and there has been important work done on understanding the impact of non-response bias – but it is a concerning trend. The National Student Survey (NSS) is an outlier here – it had a 72 per cent response rate last time round (so you can be fairly confident in validity right down to course level), but it does enjoy an unusually good level of survey population awareness, even despite the removal of a requirement for providers to promote the survey to students. And of course, many of the more egregious issues with HESA Student have centred on student characteristics – the kind of thing gathered during enrolment or entry surveys.

    A survey of the literature

    There is a literature on survey response rates in published research. A meta-analysis by Wu et al (Computers in Human Behavior, 2022) found that the average online survey response rate was 44.1 per cent – and found benefits in using (as NSS does) a clearly defined and refined population, pre-contacting participants, and sending reminders. A smaller study by Daikeler et al (Journal of Survey Statistics and Methodology, 2020) found that, in general, online surveys yield lower response rates (on average, 12 percentage points lower) than other modes.

    Interestingly, Holtom et al (Human Relations, 2022) show an increase in response rates over time across a sample of 1,014 published studies, and do not find a statistically significant difference linked to survey mode.

    ONS itself works with the ESRC-funded Survey Futures project, which:

    aims to deliver a step change in survey research to ensure that it will remain possible in the UK to carry out high quality social surveys of the kinds required by the public and academic sectors to monitor and understand society, and to provide an evidence base for policy

    It feels like timely stuff. Nine strands of work in the first phase included work on mode effects, and on addressing non-response.

    Fixing surveys

    ONS have been taking steps to repair the LFS – implementing some of the recontact and reminder approaches that have been successfully documented in the academic literature. There’s a renewed focus on households that include young people, and a return to the larger sample sizes we saw during the pandemic (when the whole survey had to be conducted remotely). Reweighting has brought a bunch of tweaks to the way samples are chosen and non-responses are accounted for.
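    The reweighting idea can be sketched with a toy example – all the numbers below are invented for illustration, not ONS figures, and real LFS weighting is considerably more elaborate. The principle is that if (say) younger households under-respond, each respondent group gets a weight equal to its population share divided by its achieved sample share, so the weighted sample matches the known population structure:

    ```python
    # Post-stratification reweighting sketch. Hypothetical population shares
    # by age band versus the sample actually achieved after non-response.
    population_share = {"16-29": 0.25, "30-49": 0.35, "50+": 0.40}
    respondents = {"16-29": 100, "30-49": 300, "50+": 600}  # younger bands under-respond
    total = sum(respondents.values())

    # Weight = population share / sample share, so each band's weighted
    # contribution matches its share of the population.
    weights = {g: population_share[g] / (n / total) for g, n in respondents.items()}

    # Illustrative outcome observed within each band (again, invented numbers).
    employment_rate = {"16-29": 0.60, "30-49": 0.85, "50+": 0.55}

    # Unweighted estimate is dragged towards the over-represented 50+ band;
    # the weighted estimate restores each band to its true population share.
    unweighted = sum(employment_rate[g] * respondents[g] for g in respondents) / total
    weighted = sum(employment_rate[g] * population_share[g] for g in respondents)

    print(f"unweighted estimate: {unweighted:.3f}")  # 0.645
    print(f"weighted estimate:   {weighted:.3f}")    # 0.667
    ```

    Note what reweighting cannot do: it corrects for *observed* imbalances (age, region, and so on), but if the people who refuse differ from respondents in ways the weighting variables don’t capture, bias remains – which is why falling response rates are a problem that weights alone can’t fix.
    
    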

    Longer term, the Transformed Labour Force Survey (TLFS) is already being trialled, though the initial March 2024 plans for full introduction have been revised to allow for further testing – important given a bias towards older age groups among responses, and an increased level of partial responses. Yes, there’s a lessons-learned review. The old LFS and the new, online-first TLFS will be running together at least until early 2025 – with a knock-on impact on the APS.

    But it is worth bearing in mind that, even given the changes made to drive up responses, trial TLFS response rates have been hovering just below 40 per cent. This is a return to 2020 levels, addressing some of the recent damage, but a long way from the historic norm.

    Survey fatigue

    More usually the term “survey fatigue” is used to describe the impact of additional questions on completion rate – respondents tire during long surveys (as Jeong et al observe in the Journal of Development Economics) and deliberately choose not to answer questions to hasten the end of the survey.

    But it is possible to consider the idea of a civilisational survey fatigue. Arguably, large parts of the online economy are propped up on the collection and reuse of personal data, which can then be used to target advertisements and reminders. Increasingly, you now have to pay to opt out of targeted ads on websites – assuming you can view the website at all without paying. After a period of abeyance, concerns around data privacy are beginning to re-emerge. Forms of social media that rely on a constant drive to share personal information are unexpectedly beginning to struggle – for younger generations participatory social media is more likely to be a group chat or Discord server, while formerly participatory services like YouTube and TikTok have become platforms for media consumption.

    In the world of public opinion research the struggle with response rates has partially been met by a switch from randomised phone or in-person sampling to pre-vetted online panels. This (as with the rise of focus groups) has generated a new cadre of “professional respondents” – with huge implications for the validity of polling, even when weighting is applied.

    Governments and industry are moving towards administrative data – the most recognisable example in higher education being the LEO dataset of graduate salaries. But this brings problems of its own – LEO lets us know how much income graduates pay tax on from their main job, but deals poorly with the portfolio careers that are the expectation of many graduates. LEO never cut it as a policymaking tool precisely because of how broad-brush it is.

    In a world where everything is data driven, what happens when the quality of data drops? If we were ever making good, data-driven decisions, a problem with the raw material suggests a problem with the end product. There are methodological and statistical workarounds, but the trend appears to be shifting away from people being happy to give out personal information without compensation. User interaction data – the traces we create as we interact with everything from ecommerce to online learning – are for now unaffected, but are necessarily limited in scope and explanatory value.

    We’ve lived through a generation where data seemed unlimited. What tools do we need to survive a data dark age?
