Category: Featured

  • UCAS End of Cycle, 2025: access and participation

    UCAS End of Cycle, 2025: access and participation

    While one end of your university is focused entirely on the number of undergraduate students that get a place (and pay a fee) each year, another equally important driver is who these students are and where they come from.

    A part of the initial quid pro quo offered to the sector when we lost the last vestiges of student number control and managed expansion in 2012 was that some of this new capacity would be made available for students from non-traditional backgrounds – and that this would happen from everywhere: from the poshest ancient university to the most practical and locally-focused further education college.

    Though regulators all over the UK do keep an eye on how providers are doing at making this egalitarian dream a reality, in England at least the focus has been more on what providers are doing to widen access (and how they know it is working) and less on the actual numbers or entry rates.

    Deprivation

    UCAS has always provided data on what proportion of main scheme UK applicants from major demographics end up with an offer. Because of some smart choices by UCAS in its data design, I can also offer you a main scheme acceptance rate: the proportion of applications that end up with an accepted offer.

    (UCAS main scheme? That’s the one where an applicant applies to up to five courses before the 30 June deadline. It doesn’t include stuff like direct entry to clearing, or records of prior acceptance – where someone applies directly to the provider.)

    We don’t get as many metrics as we used to (what’s happened to UCAS’ own Multiple Equality Measure, or MEM, I wonder) – and I’ve chosen to look at indices of multiple deprivation as a common way of thinking about participation from economically disadvantaged small areas. There are four of them (SIMD, WIMD, NIMD, and IMD – one for each home nation) and it makes no sense to see them all on one graph. By default we are seeing England (more data points!) but you can also choose to see Wales, Scotland, or Northern Ireland using the “nations/regions” filter.

    You choose your quintile of interest at the top (default is one, the most deprived 20 per cent), a year (default is 2025), a measure (offer rate or acceptance rate) and an age group (default is “all”). This changes the display at the top: an ordered plot of providers, with the size of the dot showing the number of accepted students. Mouse over a dot to show annual proportions by quintile for main scheme applications, offers, and accepted applicants.

    [Full screen]

    By default you can see the proportion of applications that end with an accepted applicant – but a low score does not mean a provider is terrible at widening access. Recall that there are a lot of variables at play here, which have as much to do with student choice (or portfolio) and performance as with what the provider does. For this reason the offer rate (how many applications end with an offer being made) is a more popular measure.

    Entry qualifications

    I feel like I keep saying this, but you can’t really talk about access without talking about what qualifications an applicant is likely to be bringing with them. A level performance is a spectacular proxy for how rich your parents are and how nice your house is – even the choice to take A levels is less common among disadvantaged groups.

    On the first issue we still don’t get data on actual (A level or tariff) points at provider level as structured data. The data exists – it’s on course pages at an individual course level, but supposedly it is far too commercially powerful to publish openly in a structured way at provider level. It feels like a policy from another age, and it doesn’t make anyone look good.

    The best we get is a provider-level look at the types of qualification held by accepted applicants (and those that get offers). I’ve not plotted this to enable comparison, but it is fascinating to find individual providers slowly moving away from recruiting A level students only and into the “other” qualifications that suggest mature learners, and (less clearly) local rather than national recruitment.

    [Full screen]

    Unconditional

    Back at the end of the 2010s there was a great deal of policy concern around the idea of unconditional offers. This was eventually refined into the “conditional unconditional offer”, a situation where a “firm” commitment from an applicant was rewarded with a lack of insistence on a particular set of grades or tariff points.

    Though there were often valid reasons given for direct unconditional offers (for example, when admission to an arts course was by portfolio, or where – rarely – a provider set its own entrance exams or used a detailed interview process to inform selection) nobody ever really managed to convincingly defend the conditional unconditional offer in a way that stopped it being banned (with the briefest of blips when it was accidentally unbanned for a month or so in the 2022 cycle). It was odd, as the best available evidence showed that such offers didn’t have an impact on student outcomes.

    I’ve been starting to hear stories about a growth in other forms of unconditional offers in this last cycle – the pressure to lock in applicants may be prompting usual academic requirements to be suspended or lowered. The available data suggest a very slight growth in “other unconditional offers” that regulators may want to keep an eye on, but only back to roughly 2023 levels from a slight dip last year.

    [Full screen]

    In England, at least, we’ve rather taken our eye off the ball when it comes to participation metrics – they exist, but there’s very little (other than the required existence of an access and participation plan for those who want to charge higher fees) to connect them to regulation. There have been some suggestions from ministers that this may change, and if you are in planning or strategy you may wish to get yourself reacquainted with the state of the art in 2025.

    Source link

  • UCAS End of Cycle, 2025: provider recruitment strategies

    UCAS End of Cycle, 2025: provider recruitment strategies

    On the face of it, running a successful recruitment round is fairly straightforward.

    It’s a bit like making a salad. Everything needs to look fresh and appetising, and you don’t want too much of one thing in case people don’t like it.

    I mean, it’s not rocket science.

    The provider level data from UCAS nicely illustrates the other, less straightforward end of the equation. We know surprisingly little about what applicants actually want to do, and where they want to do it.

    Sure, there are near-certainties – medicine at UCL is unlikely to want for well-qualified applicants any time soon – but some things are rather less expected. Computing and IT focused courses, which have been growing in popularity for years, appear to have hit a wall. Is it the onset of generative AI “vibe coding” hitting employment prospects? Is it a change in the public perception of technology companies?

    We pretty much know it is affordability (and the slow atrophy of the student maintenance system) that prompts applicants from less advantaged socio-economic backgrounds to choose to study locally. But we don’t know why selective providers that have historically recruited nationally have decided en masse to move into this very specialised market, or what changes they have made to their standard teaching (and indeed offer-making) approach to make this work.

    It’s questions like these that make the insights available from this year’s UCAS End of Cycle data so fascinating, and the choice of data that is released so frustrating.

    The Russell Group ate my students!

    There’s been a lot of talk (and a lot of quite informed data-driven evidence) to suggest that traditionally selective providers have been accepting students with uncharacteristically low grades in greater numbers than in previous years.

    A couple of unexpected new data tables shed a little more light. This last (2025) cycle saw selective (high tariff) providers recruit more students with 15 A level points or below than in any previous year – while medium tariff providers are doing less well with students holding between 9 and 11 points than in any year outside the pandemic, and low tariff providers had their worst year on record for between 10 and 12 points, and their worst year since the pandemic for between 6 and 8 points.

    [Full screen]

    A level points? Yes, for reasons best known to UCAS this is not the same as tariff points (so only includes A level performance, not vocational qualifications or grade 8 piano). You get 6 points for an A*, down to 1 point for an E – and only your best three A levels count. So 12 points means three Bs or thereabouts.
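
    For anyone who wants to check the arithmetic, here is a quick illustrative sketch of that scale (my own toy example, not anything UCAS publishes as code):

    ```python
    # UCAS End of Cycle "A level points": A* = 6 down to E = 1, best three
    # A levels only. The grade combinations below are made-up examples.
    POINTS = {"A*": 6, "A": 5, "B": 4, "C": 3, "D": 2, "E": 1}

    def a_level_points(grades):
        """Sum of the best three A level grades on the 1-6 scale."""
        best_three = sorted((POINTS[g] for g in grades), reverse=True)[:3]
        return sum(best_three)

    print(a_level_points(["B", "B", "B"]))        # 12 - "three Bs or thereabouts"
    print(a_level_points(["A*", "A", "B", "C"]))  # 15 - the fourth subject is ignored
    ```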

    The counter story is that this change in behaviour hasn’t shifted the overall averages by that much. For high tariff providers the average accepted applicant has 13.9 A level points (down from 14 last year and 14.3 in 2016) – that’s round about AAB. Medium tariff is about BCC (10.4), and lower tariff is near enough CCC (9.4 – up very slightly on the historic average).

    [Full screen]

    Usually I’d suggest that this stasis is down to a regular recalculation of tariff groups – but I know that the last time UCAS allocated providers to groups was back in 2012. We’ve also never been told which providers are in which tariff group – this is a different split to the DfE or OfS variants, unhelpfully. And we don’t get data on A level (or tariff) points by provider, which would offer a much more helpful level of granularity to this point of sector-wide interest.

    A peep at provider strategies

    There’s been a welcome update to the release of the provider level End of Cycle dataset: previously we got offermaking data only within a rather vestigial dataset known as “equalities” – 2025 adds the offermaking data plus a range of new equalities parameters to the main provider level release.

    For all that tariff band or sector-level data is interesting, the increasing diversity of (and increasing competition within) the sector means that provider-level changes in behaviour are by far the most interesting component of this release. The new information means that the chart that you lost your morning to last year is now looking very likely to make you lose your entire day.

    This is a complex but powerful dashboard, which shows the difference between the most recent year (2025) of data and a comparator year you can choose (by default last year but you can choose any year since 2019) across two dimensions (you can choose from applications, offers, and accepted applicants for each). I’ve added filters by domicile (UK, international, or all) and subject group (the familiar top level – CAH1 – list).

    It’s a lot of data on one chart, so I’ve added a group filter, which by default removes some smaller providers from the display – and there’s a highlighter to help you find a provider of interest.

    A dot being further up or further right means that measure has grown between the comparator year and the current year; further down or further left means it has shrunk.

    [Full screen]

    There’s a nearly infinite number of stories to tell from this chart. Here are some notable ones.

    Firstly, Canterbury Christ Church University has accepted substantially fewer applicants in 2025 than in 2024. A dig around in the data suggests that the decline is focused on UK domiciled applicants studying business subjects, which suggests to me the end of one or more franchise or partnership arrangements. I asked Canterbury Christ Church University for a comment – nothing yet, but I’ll add it if it comes in. I’d imagine that this is the most visible of a wave of providers calculating that the increasing regulatory risk (with both OfS and DfE taking action) is not worth the hassle of running such provision – I’m tentatively pointing at Buckinghamshire New University and Oxford Brookes University as other, similar (but smaller) examples.

    Not all of the Russell Group is following the same recruitment strategy – there are instances (Nottingham, Glasgow, Cardiff) where fewer applicants have been accepted than in 2024. Some Russell Group providers (for example Leeds, York, Southampton, and Cardiff) have seen fewer applications than in previous years – the first three in that list have nevertheless increased acceptances over last year. Because we can now see the number of offers made using the filters at the top, it is apparent that the entire group (excepting Cardiff and Southampton) made more offers than last year.

    League leaders

    If you are playing along with the dashboard you’ll have spotted that University College London accepted nearly 2,500 more applicants than last year (after making a genuinely startling 12,000 more offers). The majority of this increase (2,290 accepted applicants, 10,650 offers) related to international applicants – with growth in pretty much every subject area contributing to this performance.

    That’s not the largest growth in accepted applicants, however (it’s the second largest). For the league leaders, we look to the University of Wolverhampton – which accepted an impressive 3,625 extra applicants compared to last year. Unlike UCL, these are all UK-domiciled students, and nearly all (2,970) are studying business subjects. To me, this suggests a new partnership – I asked Wolverhampton about this, and am waiting to hear back.

    But who made the most offers in 2025? For international students, it’s UCL and it isn’t even close. But for home students it was the University of Exeter, which made 7,130 more UK domiciled offers this year than last year (a total of 37,515 offers in the 2025 cycle!) across a mix of subject areas. Exeter wasn’t able to get me a comment before publication – I’ll add one if it comes in later.

    And I did promise a look at computing recruitment. There has been a decline in both applications and acceptances pretty much across the board – with the exception of an 800 student growth in accepted applicants at Bath Spa University. UCL did recruit 40 more students than last year, but this is against a 1,520 decline in applications. There’s still a bit of growth at the University of Manchester and the University of York – but note also Escape Studios (a growing independent visual effects specialist that was once known as Pearson College, which delivers degrees validated by Coventry University).

    School leavers

    I’ve also put together a version of this chart that shows only the recruitment of 18 year olds. The direct path between school or college and university is no longer the dominant one in the UK, and hasn’t been for some time – but in policymaking and political discussions it is still where minds tend to go.

    [Full screen]

    Focusing on UK 18 year olds, we can see that the University of Exeter has grown most spectacularly compared to last year on applications, offers, and acceptances. Most of the growth in this part of the market tends to be concentrated in more selective providers, but we can also see credible performances from big civic providers like Nottingham Trent University, Manchester Metropolitan University, and Liverpool John Moores University.

    Conversely, we can see smaller but notable declines in applications and acceptances from providers including the University of the West of England, Birmingham City University, and the University of East London. The noticeable pattern is that there is no pattern – recruitment among school leavers can go cold anywhere, at any time, it would seem. And there are some ways around this – both the University of York (up 1,285) and the University of Leeds (up 3,180) upped school-leaver offer making despite a small decline in applications.

    A sense of the sector

    Competition is clearly heating up. For those who have hit on a winning recruitment formula, the challenge becomes ensuring that every additional undergraduate gets the high quality experience they have been led to expect. Almost all of any increase in fee income will go on investment in capacity (be that more staff, retaining existing staff, or providing more resources). If your expansion has been into applicant groups you have little experience in teaching, the need to invest rises.

    Conversely, for those who have yet to hit upon a way to attract applications reliably, there will already have been internal discussions about what needs to be done or what needs to change. Recruitment can and does figure in portfolio review and course revalidation questions: all of which comes down to whether a provider can afford to do what it would like to continue doing. Losing resources or capacity is a very last resort – once you wave goodbye to a course or department it is very difficult to spin it back up.

    There will also be attention paid to sector trends – the kind of stuff I plotted back in December when we got the first phase of the End of Cycle release. Is it something your provider is doing, or a more general societal change, that means recruitment is growing or shrinking on a particular course? These are difficult, painful conversations, and need careful, considered responses.

    Source link

  • Can Congress subpoena a journalist for reporting a Delta Force commander’s name?

    Can Congress subpoena a journalist for reporting a Delta Force commander’s name?

    On Jan. 7, the House Oversight Committee approved a subpoena for Seth Harp, an investigative journalist and contributing editor at Rolling Stone, for posting information about a Delta Force commander. Congress has broad authority to issue subpoenas. But it must show far more restraint when aiming them at journalists without any evidence of wrongdoing. 

    In early January, Harp reposted a screenshot identifying a commander involved in the U.S. capture of Nicolás Maduro, Venezuela’s former dictator. X reportedly locked Harp’s account until he deleted the post. The House Oversight Committee then voted to approve a subpoena “for leaking classified information.” Republican Rep. Anna Paulina Luna of Florida’s 13th congressional district, who introduced the motion to subpoena Harp, said, “Putting a service member and their family in danger is dishonorable and feckless. Leaking classified information demands explanation and a criminal investigation.” 

    But publishing the news, even when the news contains classified information, is exactly the role of a journalist. And Rep. Luna did not cite any evidence that Harp broke the law to obtain the information.

    Can Congress subpoena Harp over his reporting? 

    Congress has a broad subpoena power, subject to some constitutional limits. 

    Congress does have broad investigative authority tied to its legislative power, and subpoenas are a standard tool of that authority. It cannot investigate without the ability to compel people to share information. 

    But that authority still has limits. In Watkins v. United States (1957), a McCarthy-era congressional subpoena case, the Supreme Court held that while it is “unquestionably the duty” of citizens to cooperate with such subpoenas, the power to issue subpoenas at all “assumes that the constitutional rights of witnesses will be respected by the Congress as they are in a court of justice. The Bill of Rights is applicable to investigations as to all forms of governmental action.” The First Amendment prohibits government retaliation for engaging in protected speech. So under Watkins’ rationale, Congress should not subpoena a journalist merely because it dislikes their reporting.

    If Congress abuses its subpoena power, will courts stop it?

    In practice, the Speech or Debate Clause weakens Watkins’ constitutional limit on congressional subpoenas.

    Even after Watkins, abusive congressional subpoenas are difficult to preemptively fight in court. One reason is the Speech or Debate Clause, which gives members of Congress immunity for legislative acts or statements, including subpoenas. 

    In Eastland v. U.S. Servicemen’s Fund (1975), the Senate investigated the defendant organization (including a subpoena for bank records) after it distributed anti-Vietnam war publications to the military. When the Servicemen’s Fund challenged the subpoena all the way to the Supreme Court, the Court held that the subpoena fell within Congress’s “legitimate legislative sphere” of investigating the “effect of subversive activities.” Because the committee acted within its investigatory powers, the Court concluded, the Speech or Debate Clause immunized the committee and its staff from suit. The subpoena remained on the books. 

    Eastland thus stands for the proposition that courts may not “look behind” a subpoena for constitutionally improper motives. It would be unconstitutional for Congress to investigate a nonprofit’s bank accounts, or a reporter’s sources, based on First Amendment-protected expression. But so long as Congress can prove it acted within the bounds of its power, any remedy for the constitutional violation must be found outside the courts. 

    Even if Congress can use its subpoena power to do an end-run around the First Amendment, should it?

    Standardless subpoenas against reporters risk chilling journalism.

    Even when Congress has facially legitimate (if arguably pretextual) grounds for its investigation, forced investigative questioning is a direct threat to the conditions that make journalistic inquiry possible. Freedom of the press — and of speech — requires the ability to pursue knowledge and ideas without fear of retribution. Otherwise, our knowledge grows stale, and our ability to assess the truth trends toward the state’s mandated line. As the Supreme Court noted in Sweezy v. New Hampshire (1957), “scholarship cannot flourish in an atmosphere of suspicion and distrust.” Replace “scholarship” with “protesting” or “reporting,” and the principle remains the same. 

    Floyd Abrams, who has spent his career litigating press cases, puts it plainly — such legal battles “cost an enormous amount of money, have enormous disruptive effects” and represent “an institutional threat to the behavior of a newspaper.” Subpoenas signal to sources that talking to the press could put them under a governmental spotlight. They force reporters and editors to ask: This story is accurate, but can we afford the cost of printing it? 

    FIRE’s recent work shows that when Congress goes overboard with investigations, it can scare people into silence — even when their speech is perfectly legal. Tyler Coward, lead government counsel at FIRE, condemned congressional investigations into student groups and nonprofits associated with pro-Palestine protests as “fishing expedition[s]” based on groups’ viewpoint. 

    Likewise, John Coleman, legislative counsel at FIRE, criticized the House’s investigation of Stanford researchers studying “misinformation.” Targeting protected academic inquiry might serve some legitimate congressional objective, Coleman argued, but such investigations deter future inquiry. For reporters, the same lesson is obvious: even if a subpoena is ultimately narrowed or withdrawn, if you want to avoid the risk, avoid the subject. 

    The issuance of speech-chilling subpoenas knows no partisan bounds, either. Republicans led the investigations into pro-Palestine groups and Stanford researchers. But in 2021, the House Select Committee on January 6th — chaired by Democratic Rep. Bennie Thompson — subpoenaed a photojournalist’s phone records from Verizon. At the time, the Reporters Committee for Freedom of the Press called on Thompson to withdraw the subpoena, calling it a “direct threat to newsgathering.” In Seth Harp’s case, the House Oversight Committee’s top Democrat, Rep. Robert Garcia, supported Rep. Luna’s motion to subpoena, and it was approved unanimously. 

    If courts are unlikely to stop Congress, who will protect journalists?

    Even if constitutional, Congress should refrain from issuing standardless subpoenas against journalists.

    Despite the fact that the Speech or Debate Clause largely immunizes Congress when it issues subpoenas, Congress has an independent obligation to follow the Constitution. Recall that Eastland held that courts may not “look behind” a subpoena to test whether the real aim was retaliation or harassment. Facially legitimate subpoenas will stand, even if they’re arguably illegal. That means Congress itself is the main check on subpoenas meant to retaliate against or harass reporters. And Congress must better police its subpoena process — otherwise it imperils not only our free press, but also free speech and our collective pursuit of truth and knowledge. 

    Congress should keep Watkins in mind when crafting subpoenas. At a base level, that means Congress should not issue subpoenas to journalists for merely reporting the news. Beyond that, Congress should ensure that there are no other means to obtain the requested information. It should tailor requests to avoid sweeping in things like sources, editorial deliberations, or other discussions essential to the newsgathering process. These suggestions are modest but vital institutional firewalls against congressional abuse of its oversight power. 

    The public cannot be informed — cannot check officials, evaluate policy, or hold politicians accountable — without strong protections for the press freedom to share information and to criticize without retaliation. Alexander Hamilton warned that constitutional safeguards are often not enough. The freedom of the press, he wrote in Federalist No. 84, rested not in “fine declarations” but rather in the “general spirit of the people and of the government.” 

    Congress must take it upon itself, in Harp’s case and others, to embody that spirit of a free press and refrain from investigating journalists for merely doing their job.

    Source link

  • Unlocking GA4 for Student Recruitment Journey

    Unlocking GA4 for Student Recruitment Journey

    Reading Time: 15 minutes

    Google Analytics 4 (GA4) has reshaped how colleges and universities track prospective student behaviour online. With the retirement of Universal Analytics (UA) in 2023, GA4 is now the default analytics platform, and for many higher ed marketers, the transition has been disorienting. Gone are the familiar sessions and pageviews; in their place is an event-based model, a redesigned interface, and new metrics that require a shift in thinking.

    But while the learning curve is real, so are the opportunities. GA4 offers deeper insights into student intent, behaviour, and engagement, insights that, when used effectively, can support measurable enrollment growth.

    This guide breaks down GA4 in a practical, approachable way. We’ll walk through how to use its core features at each stage of the student recruitment funnel: Discovery, Engagement, Decision-Making, and Enrollment. You’ll learn which reports matter, which metrics to ignore, and how to use GA4’s exploration tools to uncover new conversion opportunities. Throughout, we’ll also highlight how Higher Education Marketing (HEM) can help you make the most of GA4, from free audits to CRM integration support.

    Let’s start by shifting our perspective on what analytics can do, and then dive into how GA4 can support every phase of your student journey.

    GA4 unlocks powerful enrolment insights.

    Turn student journey data into smarter recruitment decisions with HEM.

    GA4’s Event-Based Mindset vs. Universal Analytics

    The most significant shift from Universal Analytics (UA) to Google Analytics 4 (GA4) is the underlying measurement model. UA was centred on sessions and pageviews, essentially counting a sequence of “hits” during a user’s visit. GA4, by contrast, is entirely event-based. Every interaction, whether it’s a pageview, a button click, a form submission, or a video play, is captured as an event. This model allows for a more flexible, granular view of user behaviour across devices and platforms, reflecting the idea that “everything is an event that signals user intent.”

    What makes GA4 different from Universal Analytics for higher ed marketers? Higher ed marketers accustomed to UA’s pageviews and sessions are now confronted with a new event-based model, a slew of unfamiliar reports, and an interface that looks nothing like the old Google Analytics. GA4 offers richer insights into student behaviour and intent, which can directly fuel enrollment growth.

    Crucially, GA4 is built for today’s privacy-first, multi-device world. It can track a single user’s journey across devices using User IDs or Google Signals and relies less on cookies, instead using machine learning to fill in data gaps, helping you stay compliant with emerging privacy standards.

    For higher ed marketers, this opens up richer insight into the prospective student journey. GA4 for student recruitment automatically tracks many common interactions (like scrolls and file downloads) and lets you define custom events aligned to your goals.

    New metrics also reflect this shift. Engagement Rate replaces bounce rate, highlighting sessions that last 10+ seconds, include 2+ pageviews, or trigger a conversion. Other core metrics include Engaged Sessions per User and Average Engagement Time, which are helpful indicators of whether your content holds attention or needs refinement.
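
    As a quick illustration of that definition (a sketch of the logic only, with made-up session fields, not how GA4 computes it internally):

    ```python
    # GA4 counts a session as "engaged" if it lasts 10+ seconds, has 2+
    # pageviews, or triggers a conversion; Engagement Rate is the share of
    # sessions that qualify. The sessions below are invented for the example.
    def is_engaged(duration_seconds, pageviews, conversions):
        return duration_seconds >= 10 or pageviews >= 2 or conversions >= 1

    sessions = [
        {"duration_seconds": 4,  "pageviews": 1, "conversions": 0},   # bounce-like
        {"duration_seconds": 45, "pageviews": 3, "conversions": 0},   # engaged
        {"duration_seconds": 8,  "pageviews": 1, "conversions": 1},   # engaged
    ]
    engaged = sum(is_engaged(**s) for s in sessions)
    print(f"Engagement rate: {engaged / len(sessions):.0%}")  # 67%
    ```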

    GA4 also brings predictive capabilities. With built-in machine learning, it can surface emerging trends or flag anomalies in student behaviour. While some advanced features like Predictive Metrics may feel out of reach initially, knowing they exist helps future-proof your analytics approach.

    It’s true, GA4 isn’t just an upgrade, it’s an entirely new platform. Many familiar reports have been retired or redesigned, and the interface now favours customizable dashboards over static reports. But don’t let the overhaul overwhelm you.

    The key is to focus on the metrics that support your enrollment goals. In the next section, we’ll show how GA4’s event-based model aligns with each stage of the student journey, from first visit to application.

    If you need support getting started, HEM offers a free GA4 audit to help identify top-performing lead sources, evaluate your marketing ROI, and ensure your setup is recruitment-ready.

    Mapping GA4 to the Student Journey Stages

    Every prospective student moves through distinct phases on the path to enrollment. GA4 can provide actionable insights at each stage if you know where to look. Below, we break down how to use GA4 effectively across the four stages of the student journey: Discovery, Engagement, Decision-Making, and Enrollment. We’ll also highlight key metrics to prioritize and reports you can skip to avoid analysis paralysis.

    Stage 1: Discovery: Awareness & Early Interest

    What it is:
    At this stage, prospective students are just beginning to explore postsecondary options. They may land on your site via a Google search, a digital ad, or a social post. They’re not ready to apply yet, but they’re starting to investigate. Your goal is to attract the right audiences and create a strong first impression.

    What to use in GA4:
    Focus on the Acquisition reports under Life cycle > Acquisition:

    • User Acquisition Report
      Shows how new users first arrive, by channel, campaign, or source. This answers, “Where are our new prospects coming from?” and helps assess brand awareness performance.
    • Traffic Acquisition Report
      Tracks sessions from all users (new and returning). Use it to evaluate which traffic sources deliver engaged sessions and prompt interaction.

    Key metrics to monitor:

    • Engaged Sessions per User: Are visitors exploring more than one page?
    • Engagement Rate: What percentage of sessions include meaningful interaction?
    • Event Count per Session: Are users watching videos, downloading brochures, or clicking calls-to-action?

    These metrics reflect traffic quality, not just quantity. For example, if organic search traffic has a 75% engagement rate while paid social sits at 25%, that’s a clear sign of where to invest.

    Landing Pages: Your Digital First Impression
    Check Engagement > Pages and Screens to see which pages users land on most. Are your program or admissions pages pulling in traffic? Are they generating long engagement times? That’s a signal they’re working. If top landing pages show low engagement, it’s time to refine content, CTAs, or UX.

    What to skip:

    • Demographics and Tech Reports: Too broad to act on for now.
    • Real-time Report: Interesting, but not useful for strategic planning.

    Pro tip:
    HEM’s free GA4 assessment can help you identify your highest-quality channels and flag low-performing ones so you can optimize marketing spend and attract better-fit prospects.

    Stage 2: Engagement & Consideration: Mid-Funnel Interest

    Once prospective students are aware of your institution and begin browsing your site in earnest, they enter the engagement or consideration stage. Here, they’re comparing programs, evaluating fit, and building interest, but may not yet be ready to contact you. Your goal is to nurture their intent by providing relevant content, encouraging micro-conversions, and guiding them toward decision-making.

    GA4 Focus: Engagement & Behaviour Reports

    In GA4, shift your attention to the Engagement reports under Life cycle > Engagement. These include:

    • Pages and Screens
    • Events
    • Conversions
    • Landing Pages

    As HEM notes, “Engagement reports are all about what prospects do after landing on your site”, whether they go deeper or drop off.

    1. Pages and Screens Report

    This is your new “Top Pages” view. Use it to identify high-interest pages such as:

    • Program descriptions
    • Tuition and aid
    • Admissions criteria
    • Campus life

    Key metrics:

    • Average Engagement Time
    • Conversions per Page
    • User Navigation Paths (Where users go next)

    If your BBA program page has high engagement and links to “Schedule a Tour,” make sure the CTA is prominent and functional. If engagement is low, revise the content or layout.

    2. Events Report

    GA4 automatically tracks events like:

    • Scroll depth (90%)
    • File downloads
    • Outbound clicks
    • Video plays

    You should also configure custom events for micro-conversions, such as:

    • “Request Info” form submissions
    • Brochure downloads
    • “Schedule a Visit” or “Start Application” clicks

    These are the mid-funnel signals that indicate increasing interest. Mark them as Conversions in GA4 to elevate their importance in reporting.
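
    For illustration, here is a minimal sketch of recording one such micro-conversion server-side with the GA4 Measurement Protocol (Python with the requests library; the measurement ID, API secret, client ID, event name, and parameters are all placeholders). In practice most sites fire these events client-side with gtag.js or Google Tag Manager, then mark them as conversions in the GA4 admin interface as described above.

    ```python
    # Hypothetical sketch: send a custom "request_info" event to GA4 via the
    # Measurement Protocol. All IDs and parameter values are placeholders.
    import requests

    MEASUREMENT_ID = "G-XXXXXXXXXX"   # your GA4 web data stream measurement ID
    API_SECRET = "your_api_secret"    # created under the data stream's settings

    payload = {
        "client_id": "555.1234567890",              # the visitor's GA client ID
        "events": [{
            "name": "request_info",                 # custom mid-funnel event
            "params": {"program": "bba", "form_location": "program_page"},
        }],
    }

    requests.post(
        "https://www.google-analytics.com/mp/collect",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        json=payload,
        timeout=10,
    )
    ```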

    Pro tip: Track 3–5 key events that correlate strongly with application intent.

    3. Conversions Report

    Once key events are marked as conversions, the report will show:

    • Total conversions by event type
    • Event frequency over time
    • Value (if assigned)

    This helps determine which micro-conversions are driving engagement and which campaigns or pages are most effective.

    4. Path Exploration

    GA4’s Explorations > Path Analysis lets you visualize what users do after key pages or events. For example, if many students visit the “Admissions FAQ” after reading a program page, that suggests rising intent. Use this to improve internal linking and user flow.

    What to Skip

    Avoid advanced GA4 reports like:

    • Cohort Analysis
    • User Lifetime
    • User Explorer

    These are often too detailed or irrelevant for short-term funnel optimization. Also, don’t feel obligated to use every Exploration template; build your own around your specific enrollment steps instead.

    HEM Insight: Unsure if your GA4 is tracking these mid-funnel behaviours correctly? HEM offers audits, event configuration, and CRM integration support, ensuring that when a student requests info, that action is tracked, stored, and acted upon.

    Ready for the next stage? Let’s move on to how GA4 supports Decision-Making.

    Stage 3: Decision-Making: High Intent & Lead Conversion

    In the decision-making stage, prospective students move from casual interest to serious consideration. They’re comparing programs, costs, outcomes, and culture. By now, they’ve likely returned to your site several times. The goal here is clear: convert an engaged visitor into a lead or applicant.

    GA4 Focus: Conversion Tracking & Funnel Analysis

    This is where your earlier GA4 setup pays off. With key conversion events (e.g., “Request Info,” “Submit Application”) defined, you can now analyze how and where those conversions happen. GA4’s Traffic Acquisition, Explorations, and Conversions tools are central at this stage.

    Conversions by Source/Medium

    To understand which marketing channels drive high-intent actions, use the Traffic Acquisition report and add columns for specific conversions (e.g., “Request Info count” and conversion rate). Alternatively, build an Exploration with source/medium as the dimension and conversion events as metrics.
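
    As a rough sketch of pulling that split programmatically (using the GA4 Data API Python client; the property ID and the conversion event names are placeholders, and the same view is available in the GA4 interface without any code):

    ```python
    # Sketch: conversion-style event counts by source/medium via the GA4
    # Data API (pip install google-analytics-data). Assumes Google Cloud
    # credentials are already configured; property ID and event names are
    # placeholders.
    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Metric, RunReportRequest,
    )

    PROPERTY_ID = "123456789"
    CONVERSION_EVENTS = {"request_info", "submit_application"}

    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property=f"properties/{PROPERTY_ID}",
        dimensions=[Dimension(name="sessionSourceMedium"), Dimension(name="eventName")],
        metrics=[Metric(name="eventCount")],
        date_ranges=[DateRange(start_date="90daysAgo", end_date="today")],
    )

    for row in client.run_report(request).rows:
        source_medium, event_name = (d.value for d in row.dimension_values)
        if event_name in CONVERSION_EVENTS:
            print(source_medium, event_name, row.metric_values[0].value)
    ```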

    HEM’s webinar emphasizes looking beyond raw volume: ask “Which sources deliver my highest-intent leads?” For example:

    • Organic Search: 30 info requests, 10 applications
    • Paid Social: 5 info requests, 0 applications

    This data helps optimize channel strategy. If certain channels underperform in lead quality, revisit targeting, messaging, or landing pages.

    Funnel Exploration

    GA4’s Funnel Exploration is ideal for visualizing conversion paths. You can define steps like:

    1. View Program Page
    2. Click “Request Info”
    3. Submit RFI Form
    4. Start Application
    5. Submit Application

    Example funnel insight:

    • 1,000 users view program pages
    • 200 click “Inquire” (20%)
    • 50 submit forms (25% of clicks)
    • 30 start applications
    • 20 submit applications (67% of starters)

    This highlights where friction occurs, perhaps a clunky form (25% completion) or weak CTAs (20% inquiry rate). Use this to improve form UX, reinforce CTAs, or add nurturing touchpoints.
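
    To make the drop-off arithmetic explicit, here is a tiny sketch reproducing the step-to-step rates from the illustrative numbers above:

    ```python
    # Step-to-step conversion rates for the example funnel above
    # (illustrative counts, not real data).
    funnel = [
        ("View program page",  1000),
        ("Click 'Inquire'",     200),
        ("Submit RFI form",      50),
        ("Start application",    30),
        ("Submit application",   20),
    ]

    for (prev_step, prev_n), (step, n) in zip(funnel, funnel[1:]):
        print(f"{prev_step} -> {step}: {n / prev_n:.0%}")
    print(f"End-to-end: {funnel[-1][1] / funnel[0][1]:.1%}")  # 2.0%
    ```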

    You can also segment student recruitment funnels by device or user type (e.g., international vs. domestic). If drop-off is worse on mobile, consider layout changes; if international students abandon applications, address barriers like unclear visa info.

    Path Exploration

    GA4’s Path Exploration can show common user journeys leading to conversion. Start with “Application Submitted” and trace backward. If scholarship pages, FAQs, or department overviews frequently appear in these paths, you’ve identified key conversion content.

    Conversely, if users loop across pages without converting, that may signal confusion. Use these insights to surface critical info sooner or rework unclear sections.

    User Explorer: Qualitative Insights

    While not scalable, inspecting User Explorer for select journeys (e.g., converters vs. non-converters) can offer qualitative insight. One user might watch webinars and return five times before applying, proving content value. Others bounce after one visit, highlighting the need for nurturing.

    Metrics That Matter

    Focus on:

    • Conversion counts and rates per channel and funnel stage
    • Engaged sessions per user
    • Average engagement time for converters

    Example: applicants average 5 sessions and 10 engagement minutes; non-converters average 1 session and 2 minutes. Clearly, repeat engagement correlates with conversion, and nurturing campaigns (email, retargeting) are essential.

    What to Skip

    Avoid getting distracted by:

    • Cohort Analysis or User Lifetime
    • Attribution modelling (unless you’re running major ad campaigns)
    • Default GA4 templates that don’t fit your student recruitment funnel

    Stick with the custom funnel and path reports that reflect your application process.

    Pro Tip: Not confident in GA4 setup? HEM’s experts can build your funnels, configure conversion tracking, and connect GA4 to your CRM, giving you clear, enrollment-focused dashboards and team training to act on the insights confidently.

    Stage 4: Enrollment: Application to Enrollment (Bottom of Funnel)

    The enrollment stage is the final stretch, transforming applicants into enrolled students. While much of this process shifts to admissions and offline workflows (e.g., application review, acceptance, deposit), digital analytics still play a critical role. GA4 helps marketing teams identify friction points, evaluate channel performance, and inform efforts that influence yield. It also closes the loop on campaign effectiveness, especially if tied to downstream outcomes.

    GA4 Focus: Funnel Completion, Attribution, and Post-Application Insights

    Application Funnel Completion

    Using Funnel Exploration, ensure your funnel captures key milestones like “Apply Clicked” and “Application Submitted.” If many click “Apply” but few complete the form, GA4 highlights a clear drop-off. For instance, if desktop converts at 30% but mobile only 10%, there may be UX issues on mobile or a third-party form that isn’t optimized. This insight can guide IT discussions or quick fixes (e.g., warning banners or responsive design improvements).

    Attribution Paths

    GA4’s Advertising > Attribution > Conversion Paths report reveals the sequence of marketing touches that lead to applications. Common patterns in higher ed include:

    • Organic Search → Direct → Conversion
    • Paid Search → Organic → Direct → Conversion
    • Email → Direct → Conversion

    These paths underscore that enrollment isn’t a single-touch journey. For instance, Organic Search may start the process, while Direct or Email closes it. If you frequently see Email leading to conversions, it validates your nurture sequences. Also, keep an eye on new referral sources, like “Chat” or “Perplexity”, which may signal traffic from AI tools, as teased in HEM’s presentation.

    Post-Application Engagement

    Some schools track events beyond submission (e.g., clicking an admitted student portal link, viewing housing or financial aid info). While GA4 may not capture yield or melt directly, it can show post-application interest signals. Continued engagement, like visiting tuition or residence life pages, suggests intent to enroll or lingering questions that marketing content can address.

    Benchmarking and Outcomes

    Use GA4 to evaluate ROI by channel. For example, if Paid Search generates 10 applications at $5,000, while Organic Search drives 30 at no direct ad cost, that’s a critical insight. While GA4 doesn’t include media spend (unless connected to Google Ads), you can overlay cost data offline to calculate rough efficiency.
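
    As a rough sketch of that overlay (the spend figures are the hypothetical ones above; GA4 itself only supplies the application counts, plus Google Ads cost if the accounts are linked):

    ```python
    # Rough cost-per-application by channel, combining GA4 application counts
    # with offline spend figures (all numbers are the illustrative ones above).
    channels = {
        "Paid Search":    {"applications": 10, "spend": 5000.0},
        "Organic Search": {"applications": 30, "spend": 0.0},
    }

    for name, c in channels.items():
        cpa = c["spend"] / c["applications"] if c["applications"] else float("nan")
        print(f"{name}: {c['applications']} applications, ${cpa:,.0f} per application")
    ```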

    You can also segment Applicants vs. Non-Applicants using GA4’s Explorations. Let’s say applicants averaged 8 sessions while non-applicants averaged 2. That suggests high engagement correlates with conversion, reinforcing the value of remarketing, email campaigns, and sticky content.

    Research supports this: EAB found that highly engaged users (multiple sessions, longer duration) were significantly more likely to apply.

    What to Skip

    Once a student applies, most enrollment decisions move to CRM or SIS platforms, not GA4. Don’t expect GA4 to tell you who enrolled, who melted, or who was denied. Similarly, ignore reports like Predictive Metrics, User Lifetime, and Cohort Analysis, which are less actionable for enrollment marketing. Focus instead on your core funnel, attribution, and engagement data.

    Final Takeaway

    By now, your GA4 setup should illuminate your recruitment funnel: how students find you, how they behave, when they convert, and where they fall off. This data is crucial for optimizing spend, improving user experience, and shaping strategic decisions.

    Priority GA4 Reports:
    • Traffic & User Acquisition (channel quality)
    • Pages and Screens (top content, engagement)
    • Events & Conversions (key actions)
    • Funnel & Path Explorations (journey analysis)
    • Attribution Paths (multi-touch influence)
    Reports to Skip:
    • Demographics & Tech (unless troubleshooting)
    • Realtime (not strategic)
    • Cohorts, LTV, Default Templates (too advanced or unfocused)

    Pro tip: HEM can help you build enrollment-specific GA4 funnels, connect data to your CRM, and surface dashboards that show “visits → inquiries → apps → yield” at a glance, so you can finally act on your data with confidence.

    Real-World Examples: GA4 Insights Driving Enrollment in Higher Ed (from various colleges & universities)

    Clemson University (College of Business). Clemson’s Wilbur O. and Ann Powers College of Business leveraged targeted digital campaigns and GA4 event tracking to dramatically increase prospective student engagement.

    The college saw a 207% increase in page engagement and a 222% growth in program page views for a key graduate program after the campaign. In just a two-month push, GA4 recorded 498 users requesting information and 44 clicking “Apply” to begin their applications.

    Source: Clemson University

    University College Dublin (UCD). This university fully transitioned to GA4 and implemented a unified analytics dashboard via a data warehouse for all its websites. The new GA4-powered reporting interface, featuring Overview, Page Performance, and User Engagement reports, loads much faster and retains up to two years of data.

    This enables UCD’s faculties and departments to easily track user behaviour across the university’s web presence, gaining insights into what content is engaging visitors and where improvements can be made.

    Source: University College Dublin

    Boise State University. Boise State created a centralized GA4 “Comprehensive Dashboard” accessible to campus stakeholders and paired it with training tutorials on common GA4 tasks. Their web team produced self-paced video guides on how to filter GA4 data to answer specific questions (such as finding top pages, viewing traffic sources, or seeing visitor geolocation).

    This approach empowers individual departments to slice the raw GA4 data for their own needs and quickly get answers about user behaviour, for example, identifying the most popular pages or where visitors are coming from, without needing advanced technical skills.

    Source: Boise State University

    UC Riverside. UC Riverside moved all its many departmental and unit websites to GA4 under a centralized analytics structure. The university’s web team built a curated “Web Analytics for Campus Partners” GA4 dashboard with custom reports, including a Broken Links report and a Top Landing Pages report.

    These tailored GA4 dashboards help site owners across campus quickly spot issues (e.g. finding and fixing 404 error pages) and identify content that attracts new traffic. By giving each department actionable insights, such as which pages are bringing in the most new visitors, UCR has improved user experience and informed content strategy across dozens of sites in its domain.

    Source: UC Riverside

    Texas A&M University. Texas A&M established an Analytics Community of Practice that meets monthly, bringing together marketers and communicators from different colleges and units to share GA4 insights and techniques.

    In these sessions, participants discuss recent findings (for example, which pages on their sites show unusually high engagement rates, or how referral traffic patterns are shifting) in a collaborative forum. This ongoing knowledge exchange ensures continuous learning and helps cultivate a data-informed culture campus-wide.

    Source: Texas A&M University

    Turning GA4 Insights into Enrollment Growth

    Embracing GA4’s event-based, student-centric model can reshape how your team drives recruitment outcomes. By moving beyond vanity metrics like pageviews, GA4 prompts higher ed marketers to focus on real indicators of student intent, such as engaged sessions, application clicks, and program page sequences. Across each funnel stage, GA4 reveals which channels attract interest, what content sustains it, and which actions convert it.

    This clarity empowers you to refine campaign targeting, improve website performance, and simplify the inquiry or application path. GA4 also bridges the long-standing gap between marketing and admissions by giving both teams shared metrics and a common funnel narrative. Instead of saying, “We got 10,000 visits,” marketing can report: “We drove 300 info requests and 50 applications, and here’s what influenced them.”

    It’s true, GA4 can feel overwhelming at first. But by focusing on core engagement metrics, key conversion events, and simple funnel analyses, you can avoid the noise and surface what truly matters. Start small, then grow into more advanced insights as you gain confidence. What should higher ed marketers avoid focusing on in GA4? For a start, don’t worry if GA4 isn’t tracking beyond the application.

    Also, avoid misattributing things to GA4 that it can’t measure – e.g., GA4 won’t tell you ‘admitted vs. denied’ or ‘enrolled vs. melt’ – that’s outside its scope. Focus on what GA4 can concretely tell you about the marketing funnel leading up to enrollment.

    Above all, GA4 is most powerful when used collaboratively. Share funnel data with admissions. Highlight high-performing content to your copy team. Use insights to inform international recruitment or retargeting campaigns. And if needed, partner with specialists. At HEM, we help institutions build clear, actionable GA4 setups, from audits and event tracking to CRM integrations, so your analytics directly support enrollment.

    GA4 isn’t just an upgrade; it’s a strategic advantage. When aligned with your funnel, it can become your most effective tool for enrollment growth.

    GA4 unlocks powerful enrolment insights.

    Turn student journey data into smarter recruitment decisions with HEM.

    FAQs

    What makes GA4 different from Universal Analytics for higher ed marketers?
    Higher ed marketers accustomed to UA’s pageviews and sessions are now confronted with a new event-based model, a slew of unfamiliar reports, and an interface that looks nothing like the old Google Analytics. GA4 offers richer insights into student behaviour and intent, which can directly fuel enrollment growth.

    What should higher ed marketers avoid focusing on in GA4?
    Don’t worry if GA4 isn’t tracking beyond the application. Also, avoid misattributing things to GA4 that it can’t measure, e.g., GA4 won’t tell you ‘admitted vs. denied’ or ‘enrolled vs. melt’, that’s outside its scope. Focus on what GA4 can concretely tell you about the marketing funnel leading up to enrollment.

    Which GA4 reports should we prioritize for enrollment marketing?
    Focus on the critical reports:

    • Traffic Acquisition & User Acquisition (for awareness channel quality)
    • Engagement > Pages and Screens (for top content and engagement per page)
    • Engagement > Events & Conversions (for tracking micro and macro conversions)
    • Explorations: Funnel Analysis (for visualizing the enrollment funnel and drop-offs)
    • Explorations: Path Analysis (for seeing common user journeys and sequences)
    • Advertising > Attribution Paths (for understanding multi-touch conversion paths)

    Source link

  • Carnegie Navigates Change in Higher Ed With Student Connection

    Carnegie Navigates Change in Higher Ed With Student Connection

    Carnegie announced a continued commitment to higher education that places student connection at the center of institutional strategy, aligning research, strategy, storytelling, media, and technology to help colleges and universities navigate today’s interconnected challenges. The update reflects an evolution in how Carnegie supports enrollment, trust, relevance, and student success in an era shaped by demographic change and AI-driven discovery.

    A Moment of Change for Higher Education

    As colleges and universities confront a period of sustained pressure, rising scrutiny, and rapid change, Carnegie today announced a continued commitment to how it supports higher education—placing student connection at the center of institutional strategy, decision-making, and long-term success.

    The Announcement at the 2026 Carnegie Conference

    The announcement was made on stage at the opening of the 2026 Carnegie Conference, where more than 400 higher education leaders and professionals gathered to examine the forces reshaping enrollment, reputation, strategy, and the student experience.

    More Than a Brand Update—A Strategic Evolution

    While Carnegie introduced an updated brand identity as part of the moment, company leaders emphasized that the announcement reflects a broader evolution in how the company is responding to the realities facing institutions today. 

    Carnegie is aligning its strategy around integrated, innovative approaches—bringing together research, data, AI-enabled technology, and strategy—to help leaders address challenges that are increasingly interconnected and complex.

    Why This Shift Matters Now

    “Higher education leaders are operating in an environment where the stakes are higher and the margin for error is smaller,” said Gary Colen, chief executive officer of Carnegie. “Our responsibility is to innovate with purpose—delivering clarity, focus, and solutions that help institutions make decisions that lead to better outcomes for students.”

    Student Connection as a Strategic Imperative

    Carnegie’s work is grounded in a single belief: when students succeed, higher education thrives—and the world wins.

    As demographic shifts, changing learner expectations, technological disruption, and public accountability reshape the sector, Carnegie has aligned its strategy around helping higher ed institutions build meaningful, lasting connections with today’s diverse learners.

    Meeting the Moment Higher Education Leaders Are Facing

    According to Michael Mish, Chief Growth Officer, the timing of the announcement reflects what the company is hearing from campus leaders. “Higher education leaders need partners who deliver strategic expertise and forward-thinking innovation,” Mish said. “Our evolution is about connecting strategy and innovation in practical ways—so institutions can address today’s challenges while preparing for what’s next.”

    What the Updated Carnegie Brand Represents

    The updated brand brings greater cohesion to how Carnegie delivers research, strategy, storytelling, media, and technology—reinforcing its role as a strategic higher education partner focused on trust, relevance, and results rather than short-term wins.

    A More Integrated Approach to Research, Strategy, and Execution

    “Our intent wasn’t to make a statement about ourselves,” said Tyler Borders, Chief Brand Officer. “It was to be more precise about our role and our responsibility in this moment. The brand reflects how our work has evolved and the standard we expect of ourselves as a partner to higher education.”

    What’s Launching Next

    As part of the rollout, Carnegie has launched an updated digital experience and will introduce new research, offerings, and insights. 

    New Research and Insights

    This week, the company is releasing a comprehensive research report focused on online learners. In February, Carnegie will debut an updated Carnegie Intelligence newsletter, expanding how it shares perspective and practical guidance with higher education leaders.

    Introducing Answer Engine Optimization (AEO)

    Carnegie is also introducing a new Answer Engine Optimization (AEO) solution designed to help higher education institutions improve visibility in AI-powered search experiences—ensuring institutions are accurately represented as students increasingly rely on AI to answer questions about programs, outcomes, cost, and fit.

    Navigating the Now and the Next—Together

    “This is ongoing work,” Colen added. “Our commitment is to keep earning trust—by helping institutions navigate what’s next without losing sight of what matters most: changing students’ lives for good.”

    For every college and university facing urgent and complex challenges, Carnegie is the student connection company that helps you navigate the now and the next in higher education. Our experts design custom strategies fueled by data, technology, and insights—empowering you to connect with today’s diverse learners and stay focused on what matters most: changing students’ lives for good. 

    Frequently Asked Questions About Carnegie and Student Connection

    Who is Carnegie in higher education?

    Carnegie is a strategic partner to colleges and universities focused on enrollment, reputation, strategy, and student success. The company helps institutions navigate complex, interconnected challenges by aligning research, strategy, storytelling, media, and technology around what matters most: students.

    What does it mean to be a “student connection company”?

    Being a student connection company means helping institutions build meaningful, lasting relationships with today’s diverse learners. Carnegie focuses on connecting strategy, data, storytelling, and execution so institutions can support student success, institutional relevance, and long-term impact.

    What prompted Carnegie’s updated brand and renewed commitment?

    Carnegie’s updated brand reflects an evolution in how the company responds to the realities facing higher education today, including demographic shifts, technological disruption, and increased public accountability. The refresh clarifies Carnegie’s role as a strategic partner helping institutions navigate these interconnected challenges without losing focus on students.

    How does Carnegie help colleges and universities navigate change?

    Carnegie supports institutions through integrated research, strategic planning, brand and storytelling, media and digital marketing, and technology-enabled solutions. This approach helps leaders align enrollment goals, reputation, data, and execution to drive meaningful outcomes.

    What is Carnegie’s Answer Engine Optimization (AEO) solution?

    Carnegie’s Answer Engine Optimization (AEO) solution helps colleges and universities improve how they are represented in AI-powered search environments like ChatGPT, Google AI Overviews, and other answer engines. The solution focuses on content clarity, factual alignment, and structured optimization so institutions are trusted sources when students ask AI-driven questions.
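    As a loose illustration of what “structured optimization” can involve in practice, the sketch below shows generic schema.org FAQ markup of the kind institutions commonly publish so that answer engines have machine-readable facts to draw on. It is a hypothetical example, not a description of Carnegie’s product or methodology, and the program, question and tuition figure are invented.

    ```python
    import json

    # Hypothetical illustration only: schema.org FAQPage markup that a university
    # might embed on a program page so answer engines can quote accurate facts.
    # The question and tuition figure below are invented.
    faq_jsonld = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": "How much does the online MBA cost per credit?",
                "acceptedAnswer": {
                    "@type": "Answer",
                    "text": "Tuition is $750 per credit for the 2026-27 academic year.",
                },
            }
        ],
    }

    # Emit a <script> tag ready to paste into the page's <head>.
    print('<script type="application/ld+json">')
    print(json.dumps(faq_jsonld, indent=2))
    print("</script>")
    ```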

    Source link

  • Why are Miami police questioning a woman over Facebook posts?

    Why are Miami police questioning a woman over Facebook posts?

    This essay was originally published by UnHerd on Jan. 20, 2026.


    “This is freedom of speech. This is America, right?”

    Those were the incredulous words of Raquel Pacheco, a U.S. Army veteran and three-time candidate for local office. She made the remark while being questioned by police at her Miami Beach home last week for criticizing her mayor on Facebook.

    On Jan. 6, Miami Beach Mayor Steven Meiner posted a message on his official Facebook page saying, among other things, that “Miami Beach is a safe haven for everyone” and that the city “is consistently ranked by a broad spectrum of groups as being the most tolerant in the nation.”

    That is, apparently, unless you criticize him. Pacheco’s response — accusing Meiner of “consistently call[ing] for the death of all Palestinians,” trying “to shut down a theater for showing a movie that hurt his feelings,” and “REFUS[ING] to stand up for the LGBTQ community in any way” — appears to have been too much free speech for the mayor to tolerate.

    Six days later, two Miami Police officers knocked on Pacheco’s door, claiming they were there “to have a conversation” and confirm that it was her who made those comments. In a video of the interaction, the officers justify their visit by saying they wanted to prevent “somebody else getting agitated or agreeing with” Pacheco’s post. They added that the line about Meiner’s views on Palestinians “can probably incite somebody to do something radical,” and advised her “to refrain from posting things like that because that could get something incited.”

    What occurred at Pacheco’s home raises serious concerns in a free society. Her statements fall well short of the legal threshold for incitement, which applies only to speech that urges imminent unlawful action and is likely to produce it. A careful reading of her post reveals no call for illegal activity, nor any indication that it would prompt others to act unlawfully.

    Residents of the United Kingdom are all too familiar with police interventions over social media content. In September, blogger Pete North was arrested for posting a meme displaying the text “F— Palestine F— Hamas F— Islam… Want to protest? F— off to a Muslim country & protest.” That same month, Deborah Anderson, an American who had been living in England for years, was visited by police for Facebook posts that “upset someone.” And last January, a couple were arrested on suspicion of harassment, evidently for comments as mild as describing an employee at their daughter’s school as a control freak in a parents’ WhatsApp chat. Sadly, such incidents are just a fraction of longstanding limitations on speech in the UK.

    These examples demonstrate why the First Amendment sets the bar so high for its few, narrow exceptions. Democracy requires ample breathing room to speak about public issues. If sharp but non-threatening criticism and political commentary can be treated as unlawful incitement, freedom of speech ceases to exist in any meaningful sense.

    Such cases highlight the need to safeguard free expression in both the U.S. and the UK. Censorious practices that take hold in one place often spread elsewhere, and across the West, law enforcement responses to online criticism are becoming more common. Without vigilance, such interventions will continue. The principle is clear: free expression must be protected.

    Source link

  • Erasmus or Turing? Why it’s not an either-or

    Erasmus or Turing? Why it’s not an either-or

    Join HEPI Director Nick Hillman OBE and SUMS Consulting at 11am tomorrow (22nd January 2026) for a webinar based on the report ‘University Lands: Mapping Risks and Opportunities for the HE Sector’. Sign up for the webinar here. Read the blogs HEPI has published on the report here, here and here.

    This blog was kindly authored by Beverley Orr-Ewing, Consultant and Student Mobility Lead, Cormack Consulting Group, and Professor Sally Wheeler, Vice-Chancellor, Birkbeck, University of London.

    As the UK edges closer to a return to Erasmus+, attention is turning towards what this might mean for the Turing Scheme. For many universities, particularly those with deep European partnerships and strong Modern Language provision, the prospect of rejoining Erasmus is genuinely welcome. Politically, it is also an attractive signal – a step towards reversing some of the damage caused to relationships by Brexit and restoring a sense of connection with our European partners.

    Against that backdrop, our concern is less about a return to Erasmus itself, and more about the assumption that Erasmus can simply take on the role Turing currently plays. The real question, therefore, is not which scheme is better but what we will lose if Turing disappears.

    Different schemes, different problems

    Erasmus+ and Turing were created to address different policy challenges, at different moments in time and in very different political contexts. Erasmus, the forerunner of today’s Erasmus+, emerged in the late 1980s as part of a broader project of European social integration, designed to support long-term cooperation through reciprocal partnerships between largely publicly funded higher education systems.

    By contrast, Turing was designed in a post-Brexit landscape where the loss of Erasmus made a contraction in outward student mobility all but inevitable unless a new mechanism was put in place. From the outset, it placed greater emphasis on two areas Erasmus historically struggled with: widening participation and global reach. Turing was never intended to be a like-for-like replacement for Erasmus+, and treating it as such risks misunderstanding both its purpose and its value.

    What Turing has enabled

    A useful starting point is what Turing has enabled, and how it has reshaped and added value to student mobility.

    First, a genuinely global approach. The Turing Scheme has supported tens of thousands of UK students each year to undertake study and work placements overseas. In the 2023–24 funding round alone, nearly 23,000 higher-education students were supported, with placements spanning more than 160 countries worldwide. Alongside European destinations, the top host countries also show how non-European mobility has become central to Turing. In 2023–24, six of the ten most common higher-education destination countries were outside the EU (the United States, Australia, Canada, Japan, South Korea and China), and these non-EU destinations accounted for just over half of learners across the top ten. This reflects a shift away from a primarily Europe-centred model towards mobility that aligns more closely with student interests and the increasingly global outlook of UK higher education.

    Second, a more flexible, student-focused funding model. Turing places fewer structural requirements on the form mobility must take, allowing funding to follow the student rather than being shaped by institutional exchange frameworks. This has made it easier to support a wider range of mobility types, including short programmes, work placements and volunteering. While Erasmus also supports work and volunteering, Turing’s design has provided greater flexibility in contexts where traditional exchange models are difficult to sustain or risk excluding particular groups of students.

    Third, a clear priority around widening participation. From its inception, Turing was explicitly framed around improving access to mobility for students who have historically been less likely to participate. Funding outcomes for 2023–24 indicate that close to half of higher-education Turing participants were from under-represented or disadvantaged backgrounds, reflecting the scheme’s prioritisation of access. Its support for shorter, more flexible forms of mobility has been particularly important in widening participation, creating opportunities that are more manageable alongside work, family and financial commitments.

    This focus reflects purposeful choices shaped by both the diversity of today’s student body and the way UK universities now operate within a competitive global higher education environment.

    Equity and access

    The question, then, is how well Erasmus can sustain the broader patterns of participation that Turing has helped to establish. While Erasmus offers many strengths, its traditional models work best for students who are already well placed to participate – those who can commit to longer periods abroad, manage higher upfront costs and navigate study in another European language or academic system.

    Turing marked a deliberate shift away from that default. It was designed not just to increase the number of students going abroad, but to ensure that students from less financially secure backgrounds could access similar international opportunities to their peers. Removing it risks a return to mobility models that remain open in principle, but are easier to take up for students with greater financial flexibility and fewer competing pressures. If widening participation is to be meaningful, that risk requires careful consideration.

    Global breadth and strategic reach

    One of Erasmus’s greatest strengths is the stability secured by long-term institutional partnerships, but the flexibility of Turing has enabled UK student mobility well beyond Europe at a time when global competition for talent, partnerships and influence is intensifying.

    Erasmus does allow some third-country mobility through KA171, but this remains capped and partnership-led, with limited flexibility and scale. If Erasmus were to become the sole mobility mechanism, the geographic frame and shape of UK student mobility would inevitably narrow, at a time when UK universities are being required to think more globally.

    A different operating environment

    There is also a more fundamental structural issue at play. UK universities increasingly operate within a funding and policy environment that differs markedly from that of many European counterparts. They are less directly publicly funded, more dependent on international engagement, and therefore more globally oriented in both strategy and outlook. For many institutions, this is not a matter of ambition alone but of long-term sustainability.

    Erasmus was designed to support publicly funded systems with strong regional integration. For EU member states, it also sits within a policy framework that they are able to shape and influence over time. Turing, by contrast, aligns more closely with the strategic reality of UK institutions, particularly those seeking to build sustainable engagement in growth regions such as India, Africa and Southeast Asia. In practice, universities were only beginning to realise how Turing could be used as a strategic lever alongside wider international priorities – removing it now risks cutting off that line of development just as it was starting to take shape.

    Disruption and uncertainty

    There is also a practical reality to consider. Over the past five years, universities have rebuilt systems, processes and partnerships around Turing. Removing it would create partnership instability, impose significant transition costs and erode institutional capacity at a point when resources are already under pressure.

    Alongside this sits a wider issue of political volatility. If student mobility funding is subject to repeated policy shifts, institutions are forced into short-term planning, designing programmes that may not survive the next change of government. That environment makes sustained investment, partnership-building and long-term student opportunity much harder to achieve.

    Design and delivery: what can be improved

    None of this is to deny that Turing has significant shortcomings. Late funding announcements, heavy administrative requirements and uncertainty over funding from one year to the next have created real challenges for institutions, which have not been able to embed Turing fully into long-term strategy.

    There are also limits that stem from Turing’s underlying design. Most notably, unlike Erasmus, which embeds exchange as a core principle, Turing does not fund reciprocal mobility; it was deliberately constructed as an outward-only scheme. This has made it harder for institutions to sustain balanced international partnerships, but it reflects a conscious policy choice.

    Challenges around timing, administration and predictability are matters of delivery rather than principle. With clearer commitment, earlier confirmation of funding and greater multi-year certainty – drawing on the planning cycles familiar from Erasmus – many of these could be substantially mitigated. The experience of recent years suggests not that Turing lacks purpose, but that it has not yet been given the conditions it needs to thrive.

    A question of balance

    The most sustainable outcome is not a choice between Erasmus or Turing, but an approach that recognises the distinct value of both. Together, they support different dimensions of student mobility: European depth alongside global reach; long-standing partnerships alongside flexibility; stability alongside responsiveness.

    In a relatively short period of time, Turing has begun to reshape who participates in mobility, where students are able to go, and how international experience fits alongside students’ other commitments. It has opened doors to experiences previously beyond the reach of many students, supported more inclusive forms of participation and given UK institutions a tool that better reflects the global realities in which they operate. It would be a loss to see that progress curtailed.

    Rejoining Erasmus may be both desirable and beneficial. But allowing it to replace Turing entirely would mean stepping back from gains in widening participation and global engagement that align closely with the strategic direction of UK higher education and with the needs of students.

    Source link

  • Why Ind. Fans Are Excited About First Football National Champs

    Why Ind. Fans Are Excited About First Football National Champs

    The Indiana Hoosiers defeated the Miami Hurricanes 27 to 21 to win the university’s first-ever NCAA Division I college football national championship this week. Any school would be thrilled to clinch this title and take home the trophy that accompanies it. But I will explain in this article why it hits different for IU students, alumni, employees and other supporters. Before doing so, I’ll first disclose how I know.

    Five of the best years of my life were spent in Bloomington. I have a master’s degree and Ph.D. from the extraordinary university that is the heartbeat of that beloved community. IU subsequently bestowed upon me two distinguished alumni awards. The university presented its first Bicentennial Medal to Indiana governor Eric Holcomb in July 2019; that same month, I became the second recipient.

    Since graduating with my doctorate 23 years ago, I have returned to campus to deliver several lectures and keynote speeches, including the 2024 Martin Luther King Jr. Day Address. My favorite trip back was in 2011 to celebrate my fraternity’s centennial. Ten visionary Black male students founded Kappa Alpha Psi there, a brotherhood that now has more than 150,000 members. I am proud to be one of them. These are just a few of countless reasons why I have long been one of IU’s proudest alums.

    Here is what I remember about football games in the late ’90s and early 2000s: Whew, yikes! Tons of people showed up to tailgate outside our stadium on Saturday mornings before home games. I was often one of them. Those gatherings were probably just as fun there as they were at schools that had won Power 4 conference titles and national championships. But there was one embarrassing feature of our pregame tailgates: Few people actually went inside Memorial Stadium for games. When I say “few,” I mean at least two-thirds of stadium seats were empty. I thought it rude and unsupportive of student athletes to eat and drink in the parking lot for hours then skip the game—hence, I opted for the tailgate-only experience no more than four times each season. I was inside cheering all the other times.

    Despite what had long been its shady tailgating culture, IU has amazing fans. I often screamed alongside them at basketball games. During one of my most recent visits to campus, President Pam Whitten generously hosted me for a Big Ten matchup in her fabulous suite inside the iconic Assembly Hall. I was instantly reminded that my beloved alma mater has an electrifying, inspiringly loyal fan base—for basketball. As it turns out, winning five men’s national basketball championships, clinching 22 Big Ten conference titles and making 41 NCAA tournament appearances (advancing to the Final Four eight times) excites people. Suffering so many defeats in football year after year, not so much.

    Throughout the last two seasons, ESPN commentators and other sportscasters have annoyingly repeated that Indiana has long been the losingest major college football team of all time; I will leave it to someone else to fact-check that. The turnaround from being so bad for so long to an 11–2 season and playoff berth last year, followed by a Big Ten Championship, a flawless 16–0 season and a national championship win this year, is just one reason why IU alumni and others are so excited. Oh, and then there is Fernando Mendoza, our first-ever Heisman Trophy winner, and Curt Cignetti, the inspirational head coach who accelerated our football program to greatness in just two seasons.

    Instantly improving from (reportedly) worst of all time to college football’s undisputed best is indeed exciting. Nevertheless, it is not the only reason why the Indiana faithful are so amped. Our university is beyond extraordinary in numerous domains. Academic programs there are exceptional; many, including the one from which I graduated, are consistently ranked among the best in the nation. The university employs many of the world’s best professors and researchers. Its connection to the Hoosier State is deep, measurable and in many ways transformative. The Bloomington campus, framed by its gorgeous tulip-filled Sample Gates, is a vibrant, exciting place to be a student. It feels like a great university because it has long been one, still is and forever will be. It is the birthplace of the greatest collegiate fraternity, a fact that requires no verification.

    Finally having a football program that matches all the other great things that IU is and does is why those of us who have experienced the place are so freakin’ excited about our first-ever college football national championship. Greatness deserves greatness. Thanks to Cignetti and his staff, Mendoza and every other student athlete on their team, Indiana University has finally achieved football greatness. They have given others and me one more reason to be incredibly proud of a great American university that excels in academics, public outreach, athletics and so many other domains. I conclude with this: Hoo-Hoo-Hoo-Hoosiers!

    Shaun Harper is University Professor and Provost Professor of Education, Business and Public Policy at the University of Southern California, where he holds the Clifford and Betty Allen Chair in Urban Leadership. His most recent book is titled Let’s Talk About DEI: Productive Disagreements About America’s Most Polarizing Topics.

    Source link

  • Settlements Cost Higher Ed Hundreds of Millions in 2025

    Settlements Cost Higher Ed Hundreds of Millions in 2025

    A new report from the United Educators insurance company, based on an analysis of publicly reported settlements, shows that colleges and universities paid out hundreds of millions of dollars in damages in 2025.

    The legal cases involved a variety of issues, ranging from deaths on campus to antitrust claims, cybersecurity breaches, discrimination, sexual misconduct and pandemic-era policy fallout.

    Columbia University and NewYork-Presbyterian Hospital had the largest settlement at $750 million in a case related to hundreds of instances of sexual abuse by Robert Hadden, a former doctor who worked at both Columbia’s Irving Medical Center and the hospital. United Educators noted that there is no clear breakdown of which entity shouldered the brunt of the settlement.

    Michigan State University followed with the next-largest settlement at $29.7 million. Michigan State settled with three victims injured in a campus shooting that killed three students in 2023.

    Other notable settlements include:

    • Pennsylvania State University paid $17 million to settle claims that it overcharged students when officials shifted from in-person to remote instruction during the coronavirus pandemic. Penn State was one of five institutions in the report to settle lawsuits amid allegations that they overcharged students, with damages ranging from a high of $17 million to a low of $3.5 million.
    • The University of Colorado Anschutz reached a $10 million settlement with 18 plaintiffs, both staff and students, who were denied religious exemptions to a COVID-19 vaccine mandate.

    The report noted that most of the incidents highlighted did not involve United Educators members. The full report, which can be read here, also covers major losses for K–12 schools.

    Source link

  • Honoring Martin Luther King, the Nobel Peace Prize He Earned

    Honoring Martin Luther King, the Nobel Peace Prize He Earned

    The United States celebrated the life and legacy of Rev. Dr. Martin Luther King Jr. this week. On the national holiday named for him and at numerous other times throughout each year, I reflect on what King taught the world through his justice-seeking philosophies, agendas and actions. I typically do so in writing, with the aim of thoughtfully connecting King to what is happening in our country at the time. For example, two years ago, I published an article in which I contended that he would be appalled by the politicized attacks on and dismantling of diversity, equity and inclusion efforts. This year, I decided to write about something else that has been in the news lately for all the wrong reasons.

    The Nobel Peace Prize was awarded to King in 1964, four years before he was assassinated. He earned it. King did not beg for it or annoyingly insist that it should be awarded to him. He did not make boastful claims about all he had single-handedly done to help end human suffering in America and abroad. Instead, he bravely put his life on the line for peace and justice, not for a prize.

    The Nobel Foundation was persuaded enough by King’s impact to celebrate it. No one had to donate their award to the civil and human rights icon. Same with Barack Obama—his 2009 Nobel Peace Prize did not come via whining, self-aggrandizement, public expressions of entitlement or donation from a prior recipient who desperately endeavored to gain political favor with a U.S. president.

    I learned very little about the prize in my K–12 schools, college or graduate school. I did at least know that King had been awarded it, because it is often a prominent detail in his biography. There is a chance that today’s students (including collegians) still do not learn much about the prize in textbooks or anyplace else. Perhaps few would be able to name five prior recipients. But King would probably be one name that most of them could call.

    Beyond being unable to name many past recipients, few students likely know much about the origins of the prize and the process by which laureates are selected. Because “peace” is in its name, most would likely deduce that the honor is in recognition of recipients’ extraordinary efforts to promote peace. Students also would likely presume the awardees to have themselves been peaceful people, certainly not sustainers of chaos or promoters of divisiveness.

    King had lots of opponents. But he did not waste time in pulpits, in his Birmingham jail cell, on streets all over America or on the steps of the Lincoln Memorial (the site of his famed “I Have a Dream” speech) talking about how much he hated those who violently challenged and rejected his agenda. Love, forgiveness, unity and peace are what he extended to and invited from them. He urged others to pursue the same with neighbors and co-workers who were from different races, socioeconomic circumstances, religions and political parties. King hated racism. He hated poverty. Notwithstanding, he proposed and aggressively pursued remedies for them from a standpoint of love.

    I know for sure that were he still alive, King would be fighting like hell right now to ensure that millions of Americans—including whites who jailed him, spat in his face and wanted him dead—get to keep access to high-quality, affordable health care. There is no way he would have sat idly by as the recent politicization of food-stamp benefits placed low-income citizens at risk of starvation. I suspect that King would make the point that poverty and sickness unfairly place people in desperate, unhealthy contexts in which conflict ensues. In myriad ways, equity and equality are strongly connected to his writings about peace, several of which are published in a 736-page anthology of speeches, letters, sermons and op-eds.

    On the eve of this year’s MLK holiday here in the U.S., instead of devoting full attention to honoring one of its most recognizable laureates, the Nobel Foundation had to spend its time articulating the sacredness of its award and making sure people understand that “a laureate cannot share the prize with others, nor transfer it once it has been announced.” Its statement released last week went on to specify, “A Nobel Peace Prize can also never be revoked. The decision is final and applies for all time.”

    Absurdity will diminish neither King’s irrefutable impact nor the Nobel Peace Prize bestowed upon him. In the most dignified manner, King accepted the honor in Oslo 62 years ago: “Sooner or later all the people of the world will have to discover a way to live together in peace,” he declared in his acceptance speech. “If this is to be achieved, man must evolve for all human conflict a method which rejects revenge, aggression and retaliation. The foundation of such a method is love.”

    In celebration of what would have been his 97th birthday, I chose to reflect on King as a courageous, relentless pursuer of peace who himself was a peaceful leader.

    Shaun Harper is University Professor and Provost Professor of Education, Business and Public Policy at the University of Southern California, where he holds the Clifford and Betty Allen Chair in Urban Leadership. His most recent book is titled Let’s Talk About DEI: Productive Disagreements About America’s Most Polarizing Topics.

    Source link