Tag: activity

  • Subject-level insights on graduate activity

    We know a lot about what graduates earn.

    Earnings data—especially at subject level—has become key to debates about the value of higher education.

    But we know far less about how graduates themselves experience their early careers. Until now, subject-level data on graduate job quality—how meaningful their work is, how well it aligns with their goals, and whether it uses their university-acquired skills—has been missing from the policy debate.

    My new study (co-authored with Fiona Christie and Tracy Scurry and published in Studies in Higher Education) aims to fill this gap. Drawing on responses from the 2018-19 graduation cohort in the national Graduate Outcomes survey, we provide the first nationally representative, subject-level analysis of these subjective graduate outcomes.

    What we find has important implications for how we define successful outcomes from higher education—and how we support students in making informed choices about what subject to study.

    What graduates tell us

    The Graduate Outcomes survey includes a set of questions—introduced by HESA in 2017—designed to capture core dimensions of graduate job quality. Respondents are asked (around 15 months after graduation) whether they:

    • find their work meaningful
    • feel it aligns with their future plans
    • believe they are using the skills acquired at university

    These indicators were developed in part to address the over-reliance on income as a measure of graduate success. They reflect a growing international awareness that economic outcomes alone offer a limited picture of the value of education—in line with the OECD’s Beyond GDP agenda, the ILO’s emphasis on decent work, and the UK’s Taylor Review focus on job quality.

    Subject-level insights

    Our analysis shows that most UK graduates report positive early-career experiences, regardless of subject. Across the sample, 86 per cent said their work felt meaningful, 78 per cent felt on track with their careers, and 66 per cent reported using their degree-level skills.

    These patterns generally hold across disciplines, though clear differences emerge. The chart below shows the raw, unadjusted proportion of graduates who report positive outcomes. Graduates from vocational fields—such as medicine, subjects allied to medicine, veterinary science, and education—tend to report particularly strong outcomes. For instance, medicine and dentistry graduates were 12 percentage points more likely than average to say their work was meaningful, and over 30 points more likely to report using the skills they acquired at university.

    However, the results also challenge the narrative that generalist or academic degrees are inherently low value. As you can see, most subject areas—including history, languages, and the creative arts, often targeted in these debates—show strong subjective outcomes across the three dimensions. Only one field, history and philosophy, fell slightly below the 50 per cent threshold on the skills utilisation measure. But even here, graduates still reported relatively high levels of meaningful work and career alignment.

    Once we adjusted for background characteristics—such as social class, gender, prior attainment, and institutional differences—many of the remaining gaps between vocational and generalist subjects narrowed and were no longer statistically significant.
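To make the adjustment step concrete: one simple way to adjust a raw proportion for a background characteristic is direct standardization, i.e. reweighting each subgroup's rate by a common reference distribution. The sketch below uses invented numbers for a single binary characteristic purely for illustration (the paper itself models the survey microdata, which is a more sophisticated version of the same idea):

```python
# Toy direct standardization: adjust each subject group's raw "meaningful work"
# rate for one background characteristic (a binary prior-attainment split).
# All numbers are invented for illustration.

# (subject_group, stratum) -> (positive responses, respondents)
data = {
    ("vocational", "high_attainment"): (900, 1000),
    ("vocational", "low_attainment"): (160, 200),
    ("generalist", "high_attainment"): (450, 500),
    ("generalist", "low_attainment"): (560, 700),
}

# Reference distribution of the characteristic across the whole sample.
reference = {"high_attainment": 0.625, "low_attainment": 0.375}

def raw_rate(subject):
    """Unadjusted proportion positive, using the group's own attainment mix."""
    pos = sum(p for (s, _), (p, n) in data.items() if s == subject)
    tot = sum(n for (s, _), (p, n) in data.items() if s == subject)
    return pos / tot

def adjusted_rate(subject):
    """Weight each stratum's rate by the reference share, not the group's own mix."""
    return sum(
        reference[stratum] * (p / n)
        for (s, stratum), (p, n) in data.items()
        if s == subject
    )

for subject in ("vocational", "generalist"):
    print(subject, round(raw_rate(subject), 3), round(adjusted_rate(subject), 3))
```

With these invented inputs the vocational group's raw rate exceeds the generalist group's, but the gap vanishes once both are standardized to the same attainment mix, which is the shape of the finding described above.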

    This chart shows the raw proportion of 2018-19 graduates who agree or strongly agree that their current work is meaningful, on track and using skills, by field of study (N = 67,722)

    Employment in a highly skilled occupation—used by the Office for Students (OfS) as a key regulatory benchmark—was not a reliable predictor of positive outcomes. This finding aligns with previous HESA research and raises important questions about the appropriateness of using occupational classification as a proxy for graduate success at the subject level.

    Rethinking what we measure and value

    These insights arrive at a time when the OfS is placing greater emphasis on regulating equality of opportunity and ensuring the provision of “full, frank, and fair information” to students. If students are to make informed choices, they need access to subject-level data that reflects more than salary, occupational status, or postgraduate progression. Our findings suggest that subjective outcomes—how graduates feel about their work—should be part of that conversation.

    For policymakers, our findings highlight the risks of relying on blunt outcome metrics—particularly earnings and occupational classifications—as indicators of course value. Our data show that graduates from a wide range of subjects—including those often labelled as “low value”—frequently go on to report meaningful work shortly after graduation that aligns with their future plans and makes use of the skills they developed at university.

    And while job quality matters, universities should not be held solely accountable for outcomes shaped by employers and labour market structures. Metrics and league tables that tie institutional performance too closely to job quality risk misrepresenting what higher education can influence. A more productive step would be to expand the Graduate Outcomes survey to include a wider range of job quality indicators—such as autonomy, flexibility, and progression—offering a fuller picture of early career graduate success.

    A richer understanding

    Our work offers the first nationally representative, subject-level insight into how UK graduates evaluate job quality in the early stages of their careers. In doing so, it adds a missing piece to the value debate—one grounded not just in earnings or employment status, but in graduates’ own sense of meaning, purpose, and skill use.

    If we are serious about understanding what graduates take from their university experience, it’s time to move beyond salary alone—and to listen more carefully to what graduates themselves are telling us.

    DK notes: Though the analysis that Brophy et al have done (employing listwise deletion, examining UK domiciled first degree graduates only) enhances our understanding of undergraduate progression and goes beyond what is publicly available, I couldn’t resist plotting the HESA public data in a similar way, as it may be of interest to readers:

    [Interactive chart – full screen]

  • Lawsuit challenges Trump ICE raid policy, citing LAUSD activity

    The Trump administration’s Immigration and Customs Enforcement policy allowing ICE raids on school grounds and other sensitive locations was challenged in a lawsuit filed this week on behalf of an Oregon-based Latinx organization and faith groups from other states. 

    The lawsuit cites ICE activity at two Los Angeles elementary schools last month, as well as parents’ fears of sending their children to school in other locations across the country. 

“Teachers cited attendance rates have dropped in half and school administrators saw an influx of parents picking their children up from school in the middle of the day after hearing reports that immigration officials were in the area,” said the lawsuit filed April 28 by the Justice Action Center and the Innovation Law Lab. It was filed in the U.S. District Court for the District of Oregon’s Eugene Division.

The two organizations filed on behalf of Oregon’s farmworker union Pineros y Campesinos Unidos del Noroeste, whose members say they are “afraid to send their children to school,” per the draft complaint. The farmworker union’s members, especially those who are mothers, say their livelihood depends on sending their children to school during the day while they work.

    “They now must choose between facing the risk of immigration detention or staying at home with their children and forfeiting their income,” the lawsuit said. One of the members of the union said her children were “afraid of ICE showing up and separating their family.” 

    The lawsuit challenges a Department of Homeland Security directive, issued one day after President Donald Trump’s inauguration, that undid three decades of DHS policy that prevented ICE from raiding sensitive locations like schools, hospitals and churches. 

    “Criminals will no longer be able to hide in America’s schools and churches to avoid arrest,” a DHS spokesperson said in a January statement on the order. “The Trump Administration will not tie the hands of our brave law enforcement, and instead trusts them to use common sense.”

    When asked for comment on the lawsuit, an ICE spokesperson said the agency does not comment on pending or ongoing litigation. 

    Monday’s lawsuit and others filed against the directive say the change in policy is impacting students’ learning and districts’ ability to carry out their jobs. 

    A lawsuit filed in February by Denver Public Schools said the DHS order “gives federal agents virtually unchecked authority to enforce immigration laws in formerly protected areas, including schools.” It sought a temporary restraining order prohibiting ICE and Customs and Border Protection from enforcing the policy. 

    According to the American Immigration Council, over 4 million U.S. citizen children under 18 years of age lived with at least one undocumented parent as of 2018. A 2010 study cited by the council found that immigration-related parental arrests led to children experiencing at least four adverse behavioral changes in the six months following the incidents.

    Another study cited by the organization, conducted in 2020, found that school districts in communities with a large number of deportations saw worsened educational outcomes for Latino students.

  • Exploring the explosion in franchise and partnership activity

    There’s a clear need for more regulatory oversight of franchise and partnership teaching arrangements, but – as regulators are finding – there’s no easy way to track which students are being registered, taught, or physically located at which provider.

    Knowing where students study feels like a straightforward matter – indeed “where do HE students study” is one of the top level questions posed in HESA’s Student open data collection. If you click on that, it takes you to an up-to-date (2022-23 academic year) summary of student numbers by registering provider.

    But as we’ve learned from concerns raised by the Office for Students, the Student Loans Company, the National Audit Office, and (frankly) Wonkhe there is a bit more to it than that. And it is not currently possible to unpick this to show the number of students at each provider – for any given value of “at” other than registering – using public data. But we can do it for the number of courses.

    The forgotten open data set

Yes – I’ve started the year abusing the Unistats open data release (the fact that it’s the only open data release that lets you find details of courses was the clue). And you sort of, kind of, can unpick some of these relationships using it. After a fashion.

    It is worth unpacking our terms a bit:

    • A student’s registration provider is the provider that returns that student as a part of their official data returns. If the registration provider has degree awarding powers, this is generally the provider that awards the qualification the student is working towards.
    • A student’s teaching provider is the provider where a student is actually taught – in the instances where this is different to the registering provider this usually happens via a partnership arrangement of some sort.
    • A student’s location provider is the actual place a student is taught – usually, this is the same as a teaching provider, but not always – for example a “university centre” based at an FE college counts as the university in question doing the teaching, but the location would be the FE provider that hosts the centre.
    • We’ve also got to deal with the idea of an awarding provider, the place that actually awards the degree the student is working towards. In the main this is the same as the registering provider, but where the registering provider can’t award the degree in question it will be someone else by arrangement.

    How does this appear in Unistats? You are probably familiar with the notion of the UK Provider Reference Number (UKPRN): a unique identifier for educational settings. In the unistats data we get something called PUBUKPRN, which identifies where a course is primarily taught. We get something called UKPRN, which identifies where students on a given course are registered. And we get LOCUKPRN, which identifies the location a student is taught at – where this is a location with its own UKPRN that is not the same as the PUBUKPRN.
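As a sketch of how those three fields can be read together, here is a minimal Python classification of a Unistats-style course row. The field names (UKPRN, PUBUKPRN, LOCUKPRN) are from the release; the sample rows, UKPRN values, and category labels are my own invention:

```python
# Classify a Unistats-style course row by its provider relationship.
# Field semantics (per the text above):
#   UKPRN    - registering provider
#   PUBUKPRN - provider where the course is primarily taught
#   LOCUKPRN - teaching location, populated only when it differs from PUBUKPRN

def classify(row):
    """Return a rough label for the arrangement a course row implies."""
    registered = row["UKPRN"]
    taught = row["PUBUKPRN"]
    location = row.get("LOCUKPRN") or taught  # blank means same as teaching provider
    if registered != taught:
        return "franchise/partnership"  # registered at one provider, taught at another
    if location != taught:
        return "branch location"        # taught by the provider at a separate site
    return "in-house"

# Hypothetical rows -- the UKPRN values are invented for illustration.
rows = [
    {"UKPRN": "10000001", "PUBUKPRN": "10000001", "LOCUKPRN": ""},
    {"UKPRN": "10000001", "PUBUKPRN": "10000002", "LOCUKPRN": ""},
    {"UKPRN": "10000001", "PUBUKPRN": "10000001", "LOCUKPRN": "10000003"},
]
print([classify(r) for r in rows])
```

This is only a sketch of the field logic, not of any official classification used by HESA or the OfS.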

    Limits to sector growth visualisation

    What’s missing – we don’t get a UKPRN for the awarding body. Not in public data. It is collected (OTHERINST) and used on the Discover Uni website but it is not published for me to mess with. Not yet, anyway.

    So what I can show you is the number of courses at each combination of registering, teaching, and location provider. This shows instances where students may be registered at one provider but taught at another (your classic partnership arrangement), and the evolving practice of declaring unilateral branch campuses (where students are registered and taught at one provider, but located at a different one).
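The counting itself is straightforward once each course row carries those three fields. A minimal stdlib sketch, again with invented UKPRN values (LOCUKPRN is blank when the location matches the teaching provider, so PUBUKPRN is substituted to complete the key):

```python
from collections import Counter

# Count courses at each (registering, teaching, location) provider combination.
# All UKPRN values are invented for illustration.
courses = [
    {"UKPRN": "10000001", "PUBUKPRN": "10000001", "LOCUKPRN": ""},
    {"UKPRN": "10000001", "PUBUKPRN": "10000002", "LOCUKPRN": ""},
    {"UKPRN": "10000001", "PUBUKPRN": "10000002", "LOCUKPRN": ""},
    {"UKPRN": "10000004", "PUBUKPRN": "10000004", "LOCUKPRN": "10000005"},
]

combos = Counter(
    (c["UKPRN"], c["PUBUKPRN"], c["LOCUKPRN"] or c["PUBUKPRN"]) for c in courses
)

for (reg, taught, loc), n in sorted(combos.items()):
    print(f"registered={reg} taught={taught} located={loc}: {n} course(s)")
```

Applied to the real Unistats course file, each key with registering and teaching providers that differ is a candidate partnership arrangement, and each key with a distinct location is a candidate branch campus.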

    That latter one explains the explosion of London “campuses” that are really independent providers (that may cater to multiple institutions in a similar way). The whole thing gives some indication of where a given provider is involved in franchise/partnership activity – but only where this is shown on unistats.

    What I found

    First up – sorted by registering provider. You can filter by registering provider if you don’t want the whole list (whyever not?), and I’ve included a wildcard filter for course titles. This is instead of a subject filter – each course in unistats is meant to have a subject associated with it, but this tends not to happen for the kinds of courses we are interested in here:

    [Interactive table – full screen]

    And the same thing, sorted (and searchable) by teaching provider:

    [Interactive table – full screen]

    Many limitations, but a space to keep asking

    The limitations here are huge. What we really want is the number of students registered on each course – we can get this at various levels of aggregation but not reliably at a single course, single cohort level. HESA’s policy of rounding and aggregating to avoid identifying individual students (and of course the historic nature of much of the data presented on unistats) means that most of the information in the data set (entry qualifications, NSS, graduate destinations) is combined across multiple years and numerous related subjects.

    In the usual unistats fashion “courses” are a unique combination of qualification aim, title, and mode for each location. So the handful of remaining providers that offer defined “joint” degrees (two or more subjects) look like they are spectacularly busy. And, as always, the overall quality of data isn’t brilliant, so there will be stuff that doesn’t look right (pro tip: yes, tell me, but also tell HESA).

    Here, then, is a partial explanation as to why regulators and others have been slow to respond to the growth in franchise, partnership, and other joint provision: almost by definition, the novel things providers are up to don’t show up in data collection. That’s kind of the point.

    But is a simple list of available courses, where they are taught, what (and whose) awards they lead to, what subjects they cover, and how many students are on each too much to ask? It appears so.
