  • Actually, It’s a Good Time to Be an English Prof (opinion)


    It may sound perverse to say so. Our profession is under attack, our students are reading less, jobs are scarce and the humanities are first on the chopping block. But precisely because the outlook is dire, this is also a moment of clarity and possibility. The campaign against higher education, the AI gold rush and the dismantling of our public schools have made the stakes of humanistic teaching unmistakable. For those of us with the privilege of relative job security, there has never been a more urgent—or more opportune—time to do what we were trained to do.

    I am an English professor, so let me first address my own. Colleagues, this is the moment to make the affirmative case for our existence. This is our chance to demonstrate the worth of person-to-person pedagogy; to speak the language of knowledge formation and the pursuit of truth; to reinvigorate the canon while developing new methods for the study of ethnic, postcolonial, feminist, queer and minority literatures and cultural texts; to stand for the value of human intelligence. Now is when we seize the mantle and opportunity of “English” as both a privileged signifier and a sign of humility as we fight alongside our colleagues in the non-Western languages and literatures who are even more endangered than we are—and for our students, without whom we have no future.

    I’m not being Pollyannaish. Between Trump 1 and Trump 2 sit the tumultuous COVID years, which means U.S. universities have been reeling, under direct attacks and pressures, for a decade. I started my first job in 2016, so that is the entirety of the time that I have worked as an academic. I spent six years in public universities in purple-red states, where austerity was the name of the game—and then I moved to Texas.

    There have been years of insults and incursions into the profession. We have been scapegoated as an out-of-touch elite and called enemies of the state. And no, we haven’t always responded well. In the face of austerity, we let our colleagues be sacrificed. Despite the bad-faith weaponization of “CRT,” “DEI” and “identity politics,” we disavowed identity. Against our better judgment, we assimilated wave after wave of new educational technologies, from MOOCs to course management platforms to Zoom.

    Now, we face a new onslaught: the supposedly unstoppable and inevitable rise of generative AI—a deliberately misleading misnomer for the climate-destroying linguistic probability machines that can automate and simulate numerous high-level tasks, but stop short of demonstrating human levels of intelligence, consciousness and imagination. “The ultimate unaccountability machine,” as Audrey Watters puts it.

    From Substack to The New York Times to new collaborative projects Against AI, humanities professors are sounding the alarm. At the start of this semester, philosopher Kate Manne reflected that her “job just got an awful lot harder.”

    Actually, I think our jobs just got a whole lot easier, because our purpose is sharper than ever. Where others see AI as the end of our profession, I see a clarifying opportunity to recommit to who we are. No LLM can reproduce the deep reading, careful dialogue and shared meaning-making of the humanities classroom. We college professors stand alongside primary and secondary school teachers who have already faced decades of deprofessionalization, deskilling and disrespect.

    There is a war on public education in this country. Statehouses in places like Texas are rapidly dismantling the infrastructure and independence of public institutions at all levels, from disbanding faculty senates to handing over curriculum development to technologists who have no understanding of the dialogical, improvisatory nature of teaching. These are folks who gleefully predict that robots with the capacity to press “play” on AI-generated slide decks can replace human teachers with years of experience. We need them out of our schools at every level.

    Counter to what university administrators and mainstream pundits seem to believe, students are not clamoring to use AI tools. Tech companies are aggressively pushing them. All over the country, school districts and universities are partnering with companies like Microsoft and OpenAI for fear of being left behind. My own institution has partnered with Google. Earlier this semester, “Google product experts” came to campus to instruct our students on how to “supercharge [their] creativity” and “boost [their] productivity” using Gemini and NotebookLM tools. Faculty have been invited to join AI-focused learning communities and enroll in trainings and workshops (or even a whole online class) on integrating AI tools into our teaching; funds have been allotted for new grant programs in AI exploration and course development.

    I didn’t spend seven years earning a doctorate to learn how to teach from Google product experts. And my students didn’t come to university to learn how to learn from Google product experts, either. Those folks have their work, motivations and areas of expertise. We have ours, and it is past time to defend them. We are keepers of canon and critique, of traditions and interventions, of discipline-specific discourses and a robust legacy of public engagement. The whole point of education is to hand over what we know to the next generation, not to chase fads alongside the students we are meant to equip with enduring skills. It is our job to strengthen minds, to resist what Rebecca Solnit calls the “technological invasion of consciousness, community, and culture.”

    Many of us have been trying to do this for some time, but it’s hard to swim against the tides. In 2024, I finally banned all electronics from my English literature classes. I realized that sensitivity to accessibility need not prevent us from exercising simple common sense. We know that students learn more and better when they take notes by hand, annotate texts and read in hard copy. Because my students do not have access to free printing, and because a university librarian told me that “we only go from print to digital, not the other way around,” I printed copies of every reading for every student. With the words on paper before them, they retained more, they made eye contact, they took marginal notes, they really responded to each other’s interpretations of the texts.

    That’s the easy part. As we college professors plan our return to blue books, in-class midterms and oral exams, the challenge is how to intervene before our students come to class. If AI is antithetical to the project of higher education, it’s even more insidious and damaging in the elementary, middle and high schools.

    My children attend Texas public schools in the particularly embattled Houston Independent School District, so I have seen firsthand the app-ification of education. Log in to the middle school student platform—which some “innovator” had the audacity to name “Clever”—and you’ll get a page with more than three dozen apps. Not just the usual suspects like Khan Academy and Epic, but also ABC-CLIO, Accelerate Learning, Active Classroom, Amplify, Britannica, BrainPOP, Canva, Carnegie Learning, CK-12 Foundation, Digital Theatre Plus, Discover Magazine, Edgenuity, Edmentum, eSebco, everfi, Gale Databases, Gizmos, IPC, i-Ready, iScience, IXL, JASON Learning, Language! Live, Learning Ally Audiobook, MackinVIA, McGraw Hill, myPLTW, Newsela, Raise, Read to Achieve, Savvas EasyBridge, STEMscopes, Summit K12, TeachingBooks, Vocabulary.com, World Book Online, Zearn …

    As both a professor and a parent, I have decided to intervene directly. Last year, I started leading a reading group for my 12-year-old daughter and a group of her classmates. They call it a book club. Really, it’s a seminar. Once a month, they convene around our dining table for 90 minutes, paperbacks in hand, to engage in close reading and analysis. They do all the stuff we English professors want our college students to do: They examine specific passages, which illuminate broader themes; they draw connections to other books we’ve read; they ask questions about the historical context; they make motivated references to current social, cultural and political issues; they plumb the space between their individual readings and the author’s intentions.

    No phones, no computers, no apps. We have books (and snacks). And conversation. After each meeting, my daughter and I debrief. About four months in, she said, “You know, a lot of the previous meetings I felt like we were each just giving our own takes. But this time, I feel like we arrived at a new understanding of the book by talking about it together.” The club members had challenged and pushed each other’s interpretations, and together exposed facets of the text they wouldn’t have seen alone.

    The literature classroom is a space of collaborative meaning-making—one of the last remaining potentially tech-free spaces out there. A precious space that we need to renew and defend, not give up to the anti-intellectual mob and not transform at the behest of tech oligarchs. We have an opportunity here to stand up for who we are, for the mission of humanistic education, in affirmative, unapologetic terms—while finding ways to build new alliances and enact solidarity beyond the walls of our college classrooms.

    This moment is clarifying, motivating, energizing. It’s time to remember what we already know.


  • Helping students to make good choices isn’t about more faulty search filters


    A YouTube video about Spotify popped into my feed this weekend, and it’s been rattling around my head ever since.

    Partly because it’s about music streaming, but mostly because it’s all about what’s wrong with how we think about student choice in higher education.

    The premise runs like this. A guy decides to do “No Stream November” – a month without Spotify, using only physical media instead.

    His argument, backed by Barry Schwartz’s paradox of choice research and a raft of behavioural economics, is that unlimited access to millions of songs has made us less satisfied, not more.

    We skip tracks every 20 to 30 seconds. We never reach the guitar solo. We’re treating music like a discount buffet – trying a bit of everything but never really savouring anything. And then going back to the playlists we created earlier.

    The video’s conclusion is that scarcity creates satisfaction. Ritual and effort (opening the album, dropping the needle, sitting down to actually listen) make music meaningful.

    Six carefully chosen options produce more satisfaction than 24, let alone millions. It’s the IKEA effect applied to music – we value what we labour over.

    I’m interested in choice. Notwithstanding the debate over what a “course” is, Unistats data shows that there were 36,421 of them on offer in 2015/16. This year that figure is 30,801.

    That still feels like a lot, given that the University of Helsinki only offers 34 bachelor’s degree programmes.

    Of course a lot of the entries on DiscoverUni separately list “with a foundation year” and there are plenty of subject combinations.

    But nevertheless, the UK’s bewildering range of programmes must be quite a nightmare for applicants to pick through – it’s just that once they’re on them, job cuts and switches to block teaching mean that elective pathways offer less choice than they used to.

    We appear to have a system that combines overwhelming choice at the point of least knowledge (age 17, alongside A-levels, with imperfect information) with rigid narrowness at the point of most knowledge (once enrolled, when students actually understand what they want to study and why). It’s the worst of both worlds.

    What the white paper promises

    The government’s vision for improving student choice runs to a couple of paragraphs in the Skills White Paper, and it’s worth quoting in full:

    We will work with UCAS, the Office for Students and the sector to improve the quality of information for individuals, informed by the best evidence on the factors that influence the choices people make as they consider their higher education options. Providing applicants with high-quality, impartial, personalised and timely information is essential to ensuring they can make informed decisions when choosing what to study. Recent UCAS reforms aimed at increasing transparency and improving student choice include historic entry grades data, allowing students, along with their teachers and advisers, to see both offer rates and the historic grades of previous successful applicants admitted to a particular course, in addition to the entry requirements published by universities and colleges.

    As we see more students motivated by career prospects, we will work with UCAS and Universities UK to ensure that graduate outcomes information spanning employment rates, earnings and the design and nature of work (currently available on Discover Uni) are available on the UCAS website. We will also work with the Office for Students to ensure their new approach to assessing quality produces clear ratings which will help prospective students understand the quality of the courses on offer, including clear information on how many students successfully complete their courses.

    The implicit theory of change is straightforward – if we just give students more data about each of the courses, they’ll make better choices, and everyone wins. It’s the same logic that says if Spotify added more metadata to every track (BPM, lyrical themes, engineer credits), you’d finally find the perfect song. I doubt it.

    Pump up the Jam

    If the Department for Education (DfE) were serious about deploying the best evidence on the factors that influence the choices people make, it would know about the research showing that more information doesn’t solve choice overload, because choice overload is a cognitive capacity problem, not an information quality problem.

    Sheena Iyengar and Mark Lepper’s foundational 2000 study in the Journal of Personality and Social Psychology found that when students faced 30 essay topic options versus six options, completion rates dropped from 74 per cent to 60 per cent, and essay quality declined significantly on both content and form measures. That’s a 14 percentage point completion drop from excessive choice alone, and objectively worse work from those who did complete.

    The same paper’s famous jam study found customers were ten times more likely to buy when presented with six flavours rather than 24, even though the bigger display initially attracted more browsers (60 per cent of passers-by stopped, against 40 per cent). More choice is simultaneously more appealing and more demotivating. That’s the paradox.

    CFE Research’s 2018 study for the Office for Students (back when providing useful research for the sector was something it did) laid this all out explicitly for higher education contexts.

    Decision making about HE is challenging because the system is complex and there are lots of alternatives and attributes to consider. Those considering HE are making decisions in conditions of uncertainty, and in these circumstances, individuals tend to rely on convenient but flawed mental shortcuts rather than solely rational criteria. There’s no “one size fits all” information solution, nor is there a shortlist of criteria that those considering HE use.

    The study found that students rely heavily on family, friends, and university visits, and many choices ultimately come down to whether a decision “feels right” rather than rational analysis of data. When asked to explain their decisions retrospectively, students’ explanations differ from their actual decision-making processes – we’re not reliable informants about why we made certain choices.

    A 2015 meta-analysis by Chernev, Böckenholt, and Goodman in the Journal of Consumer Psychology identified the conditions under which choice overload occurs – it’s moderated by choice set complexity, decision task difficulty, and individual differences in decision-making style. Working memory capacity limits humans to processing approximately seven items simultaneously. When options exceed this cognitive threshold, students experience decision paralysis.

    Maximiser students (those seeking the absolute best option) make objectively better decisions but feel significantly worse about them. They selected jobs with 20 per cent higher salaries yet felt less satisfied, more stressed, frustrated, anxious, and regretful than satisficers (those accepting “good enough”). For UK applicants facing tens of thousands of courses, maximisers face a nearly impossible optimisation problem, leading to chronic second-guessing and regret.

    The equality dimension is especially stark. Bailey, Jaggars, and Jenkins’s research found that students in “cafeteria college” systems with abundant disconnected choices “often have difficulty navigating these choices and end up making poor decisions about what programme to enter, what courses to take, and when to seek help.” Only 30 per cent completed three-year degrees within three years.

    First-generation students, students from lower socioeconomic backgrounds, and students of colour are systematically disadvantaged by overwhelming choice because they lack the cultural capital and family knowledge to navigate it effectively.

    The problem once in

    But if unlimited choice at entry is a cognitive overload problem, the structure students meet once they enrol ought to compensate with flexibility and breadth. Students gain expertise, develop clearer goals, and should have more autonomy to explore and specialise as they progress.

    Except that’s not what’s happening. Financial pressures across the sector are driving institutions to reduce module offerings – exactly when research suggests students need more flexibility, not less.

    The Benefits of Hindsight research on graduate regret says it all. A sizeable share of applicants later wish they’d chosen differently – not usually to avoid higher education, but to pick a different subject or provider. The regret grows once graduates hit the labour market.

    Many students who felt mismatched would have liked to change course or university once enrolled – about three in five undergraduates and nearly two in three graduates among those expressing regret – but didn’t, often because they didn’t know how, thought it was too late, or feared the cost and disruption.

    The report argues there’s “inherent rigidity” in UK provision – a presumption that the initial choice should stick despite evolving interests, new information, and labour-market realities. Students described courses being less practical or less aligned to work than expected, or modules being withdrawn as finances tightened. That dynamic narrows options precisely when students are learning what they do and don’t want.

    Career options become the dominant reason graduates cite for wishing they’d chosen differently. But that’s not because they lacked earnings data at 17. It’s because their interests evolved, they discovered new fields, labour market signals changed, and the rigid structure gave them no way to pivot without starting again.

    The Competition and Markets Authority now explicitly identifies as misleading actions “where an HE provider gives a misleading impression about the number of optional modules that will be available.” Students have contractual rights to the module catalogue promised during recruitment. Yet redundancy rounds repeatedly reduce the size and scope of optional module catalogues for students who remain.

    There’s also an emerging consensus from the research on what actually works for module choice. An LSE analysis found that adding core modules within the home department was associated with higher satisfaction, whereas mandatory modules outside the home department depressed it. Students want depth and coherence in their chosen subject. They also value autonomous choice over breadth options.

    Research repeatedly shows that elective modules are evaluated more positively than required ones (autonomy effects), and interdisciplinary breadth is associated with stronger cross-disciplinary skills and higher post-HE earnings when it’s purposeful and scaffolded.

    What would actually work

    So what does this all suggest?

    As I’ve discussed on the site before, at the University of Helsinki – Finland’s flagship institution with 40,000 students – there are 32 undergraduate programmes. Within each programme, students must take 90 ECTS credits in their major subject, but the other 75 ECTS credits must come from other programmes’ modules. That’s 42 per cent of the degree as mandatory breadth, but students choose which modules from clear disciplinary categories.

    The structure is simple – six five-credit introductory courses in your subject, then 60 credits of intermediate study with substantial module choice, including proseminars, thesis work, and electives. Add 15 credits for general studies (study planning, digital skills, communication), and you’ve got a degree. The two “modules” (what we’d call stages) get a single grade each on a one-to-five scale, producing a simple, legible transcript.

    Helsinki runs this on a 22.2 to one staff-student ratio, significantly worse than the UK average, after Finland faced €500 million in higher education cuts. It’s not lavishly resourced – it’s structurally efficient.

    Maynooth University in Ireland reduced CAO (their UCAS) entry routes from about 50 to roughly 20 specifically to “ease choice and deflate points inflation.” Students can start with up to four subjects in year one, then move to single major, double major, or major with minor. Switching options are kept open through first year. It’s progressive specialisation – broad exploration early when students have least context, increasing focus as they develop expertise.

    Also elsewhere on the site, Técnico in Lisbon – the engineering and technology faculty of the University of Lisbon – rationalised to 18 undergraduate courses following a student-led reform process. Those 18 courses contain hundreds of what the UK system would call “courses” via module combinations, but without the administrative overhead. They require nine ECTS credits (of 180) in social sciences and humanities for all engineering programmes because “engineers need to be equipped not just to build systems, but to understand the societies they shape.”

    Crucially, students themselves pushed for this structure. They conducted structured interviews, staged debates, and developed reform positions. They wanted shared first years, fewer concurrent modules to reduce cognitive load, more active learning methods, and more curricular flexibility including free electives and minors.

    The University of Vilnius allows up to 25 per cent of the degree as “individual studies” – but it’s structured into clear categories – minors (30 to 60 credits in a secondary field, potentially leading to double diploma), languages (20-plus options with specific registration windows), interdisciplinary modules (curated themes), and cross-institution courses (formal cooperation with arts and music academies). Not unlimited chaos, just structured exploration within categorical choices.

    What all these models share is a recognition that you can have both depth and breadth, structure and flexibility, coherence and exploration – if you design programmes properly. You need roughly 60 to 70 per cent core pathway in the major for depth and satisfaction, 20 to 30 per cent guided electives organised into three to five clear categories per decision point, and maybe 10 to 15 per cent completely free electives.
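
    To make those bands concrete, here is a minimal sketch in Python of one possible split for a standard 180-credit ECTS bachelor’s degree – the 65/25/10 division is an illustrative assumption within the ranges above, not a prescription:

    ```python
    # One illustrative allocation of a 180-ECTS bachelor's degree using the
    # bands described above. The 65/25/10 split is an assumption for the
    # example, not a recommendation.
    TOTAL_ECTS = 180

    core = round(0.65 * TOTAL_ECTS)    # 117 credits: core pathway in the major (60-70% band)
    guided = round(0.25 * TOTAL_ECTS)  # 45 credits: guided electives in 3-5 clear categories (20-30% band)
    free = TOTAL_ECTS - core - guided  # 18 credits: completely free electives (10-15% band)

    print(core, guided, free)          # 117 45 18
    print(free / TOTAL_ECTS)           # 0.1 -- the free-elective share, within 10-15%
    ```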

    The UK’s subject benchmark statements, if properly refreshed (and consolidated down a bit), could provide the regulatory infrastructure for it all. Australia undertook a version of this in 2010 through their Learning and Teaching Academic Standards project, which defined threshold learning outcomes for major discipline groupings through extensive sector consultation (over 420 meetings with more than 6,100 attendees). Those TLOs now underpin TEQSA’s quality regime and enable programme-level approval while protecting autonomy.

    Bigger programmes, better choice

    The white paper’s information provision agenda isn’t wrong – it’s just addressing the wrong problem at the wrong end of the process. Publishing earnings data doesn’t solve cognitive overload from tens of thousands of courses, quality ratings don’t help students whose interests evolve and who need flexibility to pivot, and historic entry grades don’t fix the rigidity that manufactures regret.

    What would actually help is structural reform that the international evidence consistently supports – consolidation to roughly 20 to 40 programmes per institution (aligned with subject benchmark statement areas), with substantial protected module choice within those programmes, organised into clear categories like minors, languages, and interdisciplinary options.

    Some of those groups of individual modules might struggle to recruit if they were whole courses – think music and languages. But they may well sustain research-active academics (and across Europe, they do) if they exist within broader structures. Fewer, clearer programmes at entry when students have least context, and more, structured flexibility during the degree when students have expertise to choose wisely.

    The efficiency argument is real – maintaining thousands of separate course codes, each with approval processes, quality assurance, marketing materials, and UCAS coordination is absurd overhead for what’s often just different permutations of the same modules. See also hundreds of “programme leaders” each having to be chased to fill a form in.

    Fewer programme directors with more module convenors beneath them is far more rational. And crucially, modules serve multiple student populations (what other systems would call majors and minors, and students taking breadth from elsewhere), making specialist provision viable even with smaller cohorts.

    The equality case is compelling – guided pathways with structured choice demonstrably improve outcomes for first-in-family students, students of colour, and low-income students, populations that regulators are charged with protecting. If current choice architecture systematically disadvantages exactly these students, that’s not pedagogical preference – it’s a regulatory failure.

    And the evidence on what students actually want once enrolled validates it all – they value depth in their chosen subject, they want autonomous choice over breadth options (not forced generic modules), they benefit from interdisciplinary exposure when it’s purposeful, and they need flexibility to correct course when their goals evolve.

    The white paper could have engaged with any of this. Instead, we get promises to publish more data on UCAS. It’s more Spotify features when what students need is a curated record collection and the freedom to build their own mixtape once they know what they actually like.

    What little reform is coming is informed by the assumption that if students just had better search filters, unlimited streaming would finally work. It won’t.


  • When Was Higher Education Truly a Public Good? (Glen McGhee)


    Like staring at the sun too long, that brief window in time when higher ed was a public good has left a permanent hole for nostalgia to leak in. It has become a massive black hole for trillions of dollars, and a blind spot for misguided national policies and scholars alike.

    The notion that American higher education was ever a true public good is largely a myth. From the colonial colleges to the neoliberal university of today, higher education has functioned primarily as a mechanism of class reproduction and elite consolidation—with one brief, historically anomalous exception during the Cold War.


    Colonial Roots: Elite Reproduction in the New World (1636–1787)

    The first American colleges—Harvard, William and Mary, Yale, Princeton, and a handful of others—were founded not for the benefit of the public, but to serve narrow elite interests. Their stated missions were to train Protestant clergy and prepare the sons of wealthy white families for leadership. They operated under monopoly charters and drew funding from landowners, merchants, and slave traders.

    Elihu Yale, namesake of Yale University, derived wealth from his commercial ties to the East India Company and the slave trade. Harvard’s early trustees owned enslaved people. These institutions functioned as “old boys’ clubs,” perpetuating privilege rather than promoting equality. Their educational mission was to cultivate “gentlemen fit to govern,” not citizens of a democracy.


    Private Enterprise in the Republic (1790–1860)

    After independence, the number of colleges exploded—from 19 in 1790 to more than 800 by 1880—but not because of any commitment to the public good. Colleges became tools for two private interests: religious denominations seeking influence, and land speculators eager to raise property values.

    Ministers often doubled as land dealers, founding small, parochial colleges to anchor towns and boost prices. State governments played a minimal role, providing funding only in times of crisis. The Supreme Court’s 1819 Dartmouth College decision enshrined institutional autonomy, shielding private colleges from state interference. Even state universities were created mainly out of interstate competition—every state needed its own to “keep up with its neighbors.”


    Gilded Age and Progressive Era: Credential Capitalism (1880–1940)

    By the late 19th century, industrial capitalism had transformed higher education into a private good—something purchased for individual advancement. As family farms and small businesses disappeared, college credentials became the ticket to white-collar respectability.

    Sociologist Burton Bledstein called this the “culture of professionalism.” Families invested in degrees to secure middle-class futures for their children. By the 1920s, most students attended college not to seek enlightenment, but “to get ready for a particular job.”

    Elite universities such as Harvard, Yale, and Princeton solidified their dominance through exclusive networks. C. Wright Mills later observed that America’s “power elite” circulated through these same institutions and their associated clubs. Pierre Bourdieu’s concept of cultural capital helps explain this continuity: elite universities convert inherited privilege into certified merit, preserving hierarchy under the guise of meritocracy.


    The Morrill Acts: Public Promise, Private Gains (1862–1890)

    The Morrill Act of 1862 established land-grant colleges to promote “practical education” in agriculture and engineering. While often cited as a triumph of public-minded policy, the act’s legacy is ambivalent.

    Land-grant universities were built on land expropriated from Indigenous peoples—often without compensation—and the 1890 Morrill Act entrenched segregation by mandating separate institutions for Black Americans in the Jim Crow South. Even as these colleges expanded access for white working-class men, they simultaneously reinforced racial and economic hierarchies.


    Cold War Universities: The Brief Public Good (1940–1970)

    For roughly thirty years, during World War II and the Cold War, American universities functioned as genuine public goods—but only because national survival seemed to depend on them.

    The GI Bill opened college to millions of veterans, stabilizing the economy and expanding the middle class. Massive federal investments in research transformed universities into engines of technological and scientific innovation. The university, for a moment, was understood as a public instrument for national progress.

    Yet this golden age was marred by exclusion. Black veterans were often denied GI Bill benefits, particularly in the South, where discriminatory admissions and housing policies blocked their participation. The “military-industrial-academic complex” that emerged from wartime funding created a new elite network centered on research universities like MIT, Stanford, and Berkeley.


    Neoliberal Regression: Education as a Private Commodity (1980–Present)

    After 1970, the system reverted to its long-standing norm: higher education as a private good. The Cold War’s end, the tax revolt, and the rise of neoliberal ideology dismantled the postwar consensus.

    Ronald Reagan led the charge—first as California governor, cutting higher education funding by 20%, then as president, slashing federal support. He argued that tuition should replace public subsidies, casting education as an individual investment rather than a social right.

    Since 1980, state funding per student has fallen sharply while tuition at public universities has tripled. Students are now treated as “customers,” and universities as corporations—complete with branding departments, executive pay packages, and relentless tuition hikes.


    The Circuit of Elite Network Capital

    Today, the benefits of higher education flow through a closed circuit of power that links elite universities, corporations, government agencies, and wealthy families.

    1. Elite Universities consolidate wealth and prestige through research funding, patents, and endowments.

    2. Corporations recruit talent and license discoveries, feeding the same institutions that produce their executives.

    3. Government and Military Agencies are staffed by alumni of elite universities, reinforcing a revolving door of privilege.

    4. Elite Professions—law, medicine, finance, consulting—use degrees as gatekeeping mechanisms, driving credential inflation.

    5. Wealthy Families invest in elite education as a means of preserving status across generations.

    What the public receives are only residual benefits—technologies and medical innovations that remain inaccessible without money or insurance.


    Elite Network Capital, Not Public Good

    The idea of higher education as a public good has always been more myth than reality. For most of American history, colleges and universities have functioned as institutions of elite reproduction, not engines of democratic uplift.

    Only during the extraordinary conditions of the mid-20th century—when global war and ideological conflict made mass education a national imperative—did higher education briefly align with the public interest.

    Today’s universities continue to speak the language of “public good,” but their actions reveal a different truth. They serve as factories of credentialism and as nodes in an elite network that translates privilege into prestige. What masquerades as a public good is, in practice, elite network capital—a system designed not to democratize opportunity, but to manage and legitimize inequality.


    Sources:

    Labaree (2017), Bledstein (1976), Bourdieu (1984, 1986), Mills (1956), Geiger (2015), Thelin (2019), and McGhee (2025).


  • Higher education data explains why digital ID is a good idea


    Just before the excitement of conference season, your local Facebook group lost its collective mind. And it shows no sign of calming down.

    Given everything else that is going on, you’d think that reinforcing the joins between key government data sources and giving more visibility to the subjects of public data would be the kind of nerdy thing that the likes of me write about.

    But no. Somebody used the secret code word. ID Cards.

    Who is she and what is she to you?

    I’ve written before about the problems our government faces in reliably identifying people. Any entitlement- or permission-based system needs a clear and unambiguous way of assuring the state that a person is indeed who they claim to be, and has the attributes or documentation they claim to have.

    As a nation, we are astonishingly bad at this. Any moderately serious interaction with the state requires a parade of paperwork – your passport, driving licence, birth certificate, bank statement, bank card, degree certificate, and two recent utility bills showing your name and address. Just witness the furore over voter ID – to be clear, a pointless idea aimed at solving a problem that the UK has never faced – and the wild collection of things that you might be allowed to pull out of your voting day pocket that do not include a student ID.

    We are not immune from this problem in higher education. I’ve been asking for years why you need to apply to a university via UCAS and for funding via the Student Loans Company – two different systems. It’s never been clear to me why you then need to submit largely similar information to your university when you enrol.

    Sun sign

    Given that organs of the state have this amount of your personal information, it is then alarming that the only way it can work out what you earn after graduating is by either asking you directly (Graduate Outcomes) or by seeing if anyone with your name, domicile, and date of birth turns up in the Inland Revenue database.

    That latter one – administrative matching – is illustrative of the government’s current approach to identity. If it can find enough likely matches of personal information in multiple government databases it can decide (with a high degree of confidence) that records refer to the same person.

    That’s how they make LEO data. They look for National Insurance Number (NINO), forename, surname, date of birth, postcode, and sex in both HESA student records and the Department for Work and Pension’s Customer Information System (which itself links to the tax database). Keen Wonkhe readers will have spotted that NINO isn’t returned to HESA – to get this they use “fuzzy matching” with personal data from the Student Loans Company, which does. The surname thing is even wilder – they use a sound-based algorithm (SOUNDEX) to allow for flexibility on spellings.

    This kind of nonsense actually has a match rate of more than 90 per cent (though this is lower for ethnically Chinese graduates because sometimes forenames and surnames can switch depending on the cultural knowledge of whoever prepared the data).
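
    To see why the sound-based step helps, here is a minimal sketch in Python – a simplified American Soundex plus invented example records; the real LEO pipeline, with its NINO-based fuzzy matching via SLC data, is considerably more elaborate:

    ```python
    def soundex(name: str) -> str:
        """Simplified American Soundex: first letter plus up to three digit codes."""
        codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
                 **dict.fromkeys("dt", "3"), "l": "4",
                 **dict.fromkeys("mn", "5"), "r": "6"}
        name = name.lower()
        if not name:
            return ""
        out, prev = name[0].upper(), codes.get(name[0], "")
        for ch in name[1:]:
            code = codes.get(ch, "")
            if code and code != prev:
                out += code
            if ch not in "hw":  # h and w do not break a run of same-coded letters
                prev = code
        return (out + "000")[:4]

    # Two invented records that disagree on spelling still match on sound,
    # combined with exact agreement on the other linking fields.
    hesa = {"surname": "Macintyre", "dob": "1999-04-01", "postcode": "LS2 9JT", "sex": "F"}
    dwp = {"surname": "McIntyre", "dob": "1999-04-01", "postcode": "LS2 9JT", "sex": "F"}

    match = (soundex(hesa["surname"]) == soundex(dwp["surname"])
             and all(hesa[k] == dwp[k] for k in ("dob", "postcode", "sex")))
    print(match)  # True: both surnames code to M253, and the exact fields agree
    ```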

    It’s impressive as a piece of data engineering. But given that all of this information was collected and stored by arms of the same government it is really quite poor.

    The tale of the student ID

    Another higher education example. If you were ever a student you had a student ID. It was printed on your student card, and may have turned up on various official documents too. Perhaps you imagined that every student in the UK had a student number, and that there was some kind of logic to the way that they were created, and that there was a canonical national list. You would be wrong.

    Back in the day, this would have been a HESA ID, itself created from your UCAS number and your year of entry (or your year of entry, HESA provider ID, and an internal reference number if you applied directly). Until just a few years ago, the non-UCAS alternative was in use for all students – even including the use of the old HESA provider ID rather than the more commonly used UKPRN. Why the move away from UCAS? Well, UCAS had changed how it did identifiers and HESA’s systems couldn’t cope.

    You’re expecting me to say that things are far more sensible now, but no. They are not. HESA has finally fixed the UKPRN issue within a new student ID field (SID). This otherwise replicates the old system but with one important difference: it is not persistent.

    Under the old approach, the idea was you had one student number for life – if you did an undergraduate degree at Liverpool, a masters at Manchester Met, and a PhD at Royal Holloway these were all mapped to the same ID. There was even a lookup service for new providers if the student didn’t have their old number. I probably don’t even need to tell you why this is a good idea if you are interested – in policy terms – in the paths that students take within their careers in higher education. These days we just administratively match if we need to. Or – as in LEO – assume that the last thing a student studied was the key to or cause of their glittering or otherwise career.

    The case of the LLE

    Now I hear what you might be thinking. These are pretty terrible examples, but they are just bodges – workarounds for bad decisions made in the distant past. But we have the chance to get it right in the next couple of years.

    The design of the Lifelong Learning Entitlement means that the government needs tight and reliable information about who does what bit of learning in order that funds can be appropriately allocated. So you’d think that there would be a rock-solid, portable, unique learner number underpinning everything.

    There is not. Instead, we appear to be standardising on the Student Loans Company customer reference number. This is supposed to be portable for life, but it doesn’t appear in any other sector datasets (the “student support number” is in HESA, but that is somehow different – you get two identifiers from SLC, lucky you). SLC also holds your NINO (you need one to get funding!), and has capacity to hold another additional number of an institution’s choice, but not (routinely) your HESA student ID or your UCAS identifier.

    There’s also space to add a Unique Learner Number (ULN) but at this stage I’m too depressed to go into what a missed opportunity that is.

    Why is standardising on a customer reference number not a good idea? Well, think of all the data SLC doesn’t hold but HESA does. Think about being able to refer easily back to a school career and forward into working life on various government data. Think about how it is HESA data and not SLC data that underpins LEO. Think about the palaver I have described above and ask yourself why you wouldn’t fix it when you had the opportunity.

    Learning to love Big Brother

    I’ll be frank, I’m not crazy about how much the government knows about me – but honestly compared to people like Google, Meta, or – yikes – X (formerly twitter) it doesn’t hugely worry me.

    I’ve been a No2ID zealot in my past (any employee of those three companies could tell you that) but these days I am resigned to the fact that people need to know who I am, and I’d rather be more than 95 per cent confident that they could get it right.

    I’m no fan of filling in forms, but I am a fan of streamlined and intelligent administration.

    So why do we need ID cards? Simply because in proper countries we don’t need to go through stuff like this every time we want to know if a person that pays tax and a person that went to university are the same person. Because the current state of the art is a mess.


  • The Wonkhe HE staff survey – how good is work in higher education?


    As financial pressures continue to bear down on higher education institutions across the UK, there is a visible impact on higher education staff, as resources shrink, portfolios are rationalised, and redundancy programmes are implemented. These are definitively tough times for the sector and its people.

    One way this plays out is in the industrial relations landscape, with unions balloting for industrial action, as well as, on some specific issues, advancing joint work with employers.

    But there is a wider, arguably more nuanced, lens to bring to bear: how the current circumstances are reshaping staff experiences of working in higher education, and what options are available to those with responsibility for leading and supporting higher education staff.

    When the Wonkhe team came up with the idea of running a national survey for higher education staff we knew from the outset that we would not be able to produce definitive statements about “the HE staff experience” derived from a representative sample of responses. There is no consensus over how you would define such a sample in any case.

    The best national dataset that exists is probably found in UCEA publications that combine institutional staff experience survey datasets at scale – one published in 2024 titled “What’s it really like to work in HE?” and one in May this year diving into some of the reported differences between academic and professional staff, “A tale of two perspectives: bridging the gap in HE EX”.

    Instead we wanted, firstly, to ask some of the questions that might not get asked in institutional staff surveys – things like how staff feel about their institution’s capacity to handle change, the relative importance of different potential motivating factors for working in HE, or, baldly, how institutional cost-cutting is affecting individuals. And secondly, as best we can, to draw out some insight focused on supporting constructive conversations within institutions about sustaining the higher education community during challenging times.

    We’ll be reporting on three key areas:

    1. “Quality of work” – discussed further below
    2. Professional motivations, the relative importance of different motivators for our sample group, and the gap between the level of importance afforded key motivators and the extent to which respondents believe they actually get to experience these in their roles – DK has tackled that subject and you can read about his findings here
    3. Views on institutional change capability – coming soon!

    We’ve not covered absolutely everything in this tranche of reporting – partly because of time pressures, and partly because of format constraints. We have a fair bit of qualitative data to dive into, as well as the third area of investigation on institutional change capability all still to come – watch this space.

    The methodology and demographics bit

    We promoted the survey via our mailing list (around 60,000 subscribers) during July and August 2025, yielding a total of 4,757 responses. We asked a whole range of questions that we hoped could help us make meaningful comparisons within our sample – including on things like nationality, and type and location of institutions – but only some of those questions netted enough positive responses to allow us to compare two or more good-sized groups.

    Our working assumption is that if there was a group of around 500 or more who share a particular characteristic it is reasonable to compare their responses to the group of respondents who did not have that particular characteristic. We have conducted analysis of the following subgroups:

    • Career stage: Early career (n=686), mid career (n=1,304), and late career (n=2,703)
    • Those with an academic contract (n=1,110) and those with a non-academic contract (n=3,394) – excluding some other kinds of roles/contracts
    • Time in higher education: five years or fewer (n=908); 6-10 years (n=981); 11-20 years (n=1,517) and more than 20 years (n=1,333)
    • Working arrangements: on-site (n=988); working from home or remotely (n=475); and flexible/hybrid (n=3,268)
    • Leadership role: respondents who said they have formal management or leadership responsibility in their current role for projects, programmes, resources, or people (n=3,506), and those who did not (n=1,214)

    And we also looked at the following identity characteristics:

    • Gender: men (n=1,386) and women (n=3,271)
    • Sexuality: those who identified as gay, lesbian, bisexual or queer (n=654) and those who did not (n=4,093)
    • Ethnicity: those who identified as being of a minoritised ethnicity (n=247) and those who did not (n=4,444)
    • Disability: those who identified as being disabled (n=478) and those who did not (n=4,269)

    In one case – that of respondents who identified as being of a minoritised ethnicity – our sample didn’t meet the threshold for wholly robust analysis, but we found some differences in reported experience which we think are worth reporting, given what we already know about this group of staff. We would caution that these findings should be viewed as indicative rather than definitive.

    In some cases we have combined subgroups to make larger groups – for example we’ve grouped various academic roles together to compare with roles on other kinds of contracts. In others we’ve ignored some very small (usually n=3 and below) groups to make for a more readable chart; for this reason we don’t often show all responses. And although our response rates are high you don’t have to refine things much to get some pretty low numbers, so we’ve not looked at intersections between groups.

    We have reported where we found what we considered to be a meaningful difference in response – a minimum of four percentage points difference.
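
    As a minimal sketch of those two rules – compare subgroups only when each is large enough, and flag gaps of at least four percentage points – here is how the logic looks in Python, using the academic versus non-academic agreement rates on the “right decision” question reported below:

    ```python
    # A sketch of the comparison rules described above. Group sizes and rates
    # are taken from this piece; the thresholds mirror the stated approach.
    MIN_GROUP_SIZE = 500  # rough minimum for a comparable subgroup
    MIN_GAP_PP = 4.0      # minimum meaningful difference, in percentage points

    def meaningful_difference(rate_a: float, n_a: int, rate_b: float, n_b: int):
        """Rates are percentage agreement (0-100); n_a and n_b are subgroup sizes."""
        if min(n_a, n_b) < MIN_GROUP_SIZE:
            return None  # too small to compare robustly
        return abs(rate_a - rate_b) >= MIN_GAP_PP

    # Academic (n=1,110, 60.8% agree) vs non-academic (n=3,394, 68.9% agree)
    print(meaningful_difference(60.8, 1110, 68.9, 3394))  # True: an 8.1pp gap
    ```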

    The financial context

    88 per cent of respondents said their institution has taken material steps to reduce costs in the last 12 months – background context for answers to the wider survey, and assurance that what we are looking at really is staff views against a backdrop of change.

    51.6 per cent said they personally had been negatively affected by cost reduction measures, while 41.9 per cent said the personal impact was neutral. This suggests that while cost reduction may be widely viewed as negative, that experience or the views that arise from it may not be universal.

    Of those that said they had been negatively affected we found no meaningful differences among our various comparator groups. Leaders and those later in their career were as likely to report negative impacts as those without leadership responsibilities or earlier in their career, suggesting that there is little mileage in making assumptions about who is more likely to be negatively impacted – though of course we did not try to measure the scale of the impact, and we’re mindful we were talking to people who had not lost their jobs as a result of cost-saving measures.

    The one exception was between those on academic contracts, of whom nearly two thirds (65.3 per cent) reported negative impacts, and those on non-academic contracts, of whom the number reporting negative impact was closer to half (47.4 per cent). This difference gives important context for the wider findings, in which those on academic contracts are consistently more likely to offer a negative perspective than those on non-academic contracts across a range of questions. This tallies to some degree with the national picture explored in UCEA’s “Bridging the gap” report, in which academics were more likely than professional staff to report challenges with workload, work-life balance, and reward and recognition – though also higher levels of work satisfaction.

    Regretting and recommending HE

    We asked whether, taking into account what is known about other available career paths, respondents feel that choosing to work in HE was the right decision for them – two thirds said yes (66.9 per cent) while 23.8 per cent were unsure. Only 9 per cent said no.

    Those approaching the end of their career were more likely to agree (74.3 per cent) compared to those mid-career (65 per cent) or early career (61.2 per cent). Those with leadership responsibilities were also slightly more likely to agree, at 68.2 per cent, compared to 62.3 per cent for those without leadership responsibilities.

    Those on academic contracts were slightly less likely to agree, at 60.8 per cent compared to 68.9 per cent for those on non-academic contracts.

    However, the real divide opens up in the responses to our follow-up question: whether respondents would recommend a career in higher education to someone they cared about who was seeking their advice. A much smaller proportion of our sample agreed they would recommend a career in HE (42.2 per cent), with much higher rates of “unsure” (32.1 per cent) and “no” (24.5 per cent) – most likely reflecting the impact of current challenges as compared to people’s longer-term lived experience.

    For the recommend question, the career-stage trend reverses, with those approaching the end of their careers less likely to say they would recommend a career in HE (39.2 per cent) compared to 41.6 per cent for those mid-career and 50.4 per cent for early career respondents.

    There was a substantial difference by role: only 25.7 per cent of those on academic contracts would recommend a career in HE, compared to 46.9 per cent of those on non-academic contracts.

    We did not find any differences by gender, ethnicity, disability, or sexuality on either confidence in the decision to work in HE or willingness to recommend it as a career.

    Quality of work

    One of the great things about higher education as an employment sector is that there are lots of ways to be employed in it and lots of different types of jobs. What one person values about their role might be quite different from what another person appreciates – and the same for the perceived downsides of any given role.

    So rather than trying to drill down into people’s reported experiences based on our own probably biased views about what “good work” looks and feels like, we turned to the idea of “quality of work” as a guiding framework to look at respondents’ experiences and perceptions. We asked 16 questions in total, derived from this 2018 Carnegie UK-RSA initiative on measuring job quality in the UK, which proposes seven distinct dimensions of work quality, including pay and conditions, safety and wellbeing, job design, social support, voice, and work-life balance.

    We also kept in mind that, while support, safety and wellbeing at work are foundational conditions for success, so are effective performance management and the opportunity to apply your skills. In the spirit of Maslow’s hierarchy of needs we clustered our questions broadly into four areas: safety, security, and pay/conditions; the balance between support and challenge; relationships with colleagues; and “self-actualisation”, incorporating things like autonomy and meaningfulness.

    For each question, respondents were offered a choice of Strongly disagree, Disagree, Agree, and Strongly agree. Here we report overall levels of agreement (ie Agree and Strongly agree).

    You can see the full findings for all our comparator groups in the visualisation below.


    Headlines on quality of work and interaction with willingness to recommend

    You can see all the workings out below where I’ve gone through the results line by line and reported all the variations we could see, but the TL;DR version is that the quality dimensions that jump out as being experienced comparatively positively are physical safety, good working relationships with colleagues, and meaningfulness of work. Two key areas that emerge as being experienced comparatively negatively are feeling the organisation takes your wellbeing seriously, and opportunities for progression – the level of agreement is startlingly low for the latter.

    We compared the various quality dimensions against whether people would recommend a career in higher education for the whole sample and found that across every question there was a direct correlation between a positive response and likelihood to recommend a career in HE – and the inverse for negative responses. We think that means we’re asking meaningful questions – though we’ve not been able to build a regression model to test which quality questions are making the largest contribution to the recommend question (which makes us sad).
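
    That cross-tabulation is straightforward to reproduce. Here is a hedged sketch in Python/pandas with hypothetical column names – the layout below (one row per respondent, 1-4 agreement coding) is an assumption for illustration, not our actual pipeline:

    ```python
    import pandas as pd

    # Assumed layout: one row per respondent, sixteen quality-of-work columns
    # coded 1-4 (Strongly disagree .. Strongly agree), plus a "recommend"
    # column holding "yes", "unsure" or "no". Names are invented.
    def agreement_gap(df: pd.DataFrame, question: str) -> float:
        """Agreement rate among would-recommenders minus the rate among the rest."""
        agrees = df[question] >= 3               # Agree or Strongly agree
        recommends = df["recommend"] == "yes"
        return agrees[recommends].mean() - agrees[~recommends].mean()

    # A positive gap on every question is the pattern described above:
    # gaps = {q: agreement_gap(df, q) for q in quality_questions}
    ```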


    Going through the various comparator groups for the quality of work questions we find that there are three core “at risk” groups – one of which is respondents of a minoritised ethnicity, which comes with caveats regarding sample size. Another is those on academic contracts, and the third is disabled respondents. These groups did not consistently respond more negatively to every question on quality of work, but we did find enough differentiation to make it worth raising a flag.

    So to try to see whether we could find some core drivers for these “at risk” groups, we plotted the response to the “recommend” question against the responses to the quality questions just for these groups. At this point the samples for disabled and minoritised ethnic responses become just too small to draw conclusions – for example, under 100 respondents who identified as being of a minoritised ethnicity said they would not recommend a career in HE.

    However, over 400 of those on academic contracts said they would not recommend a career in HE, so we compared the answers of that group to those of respondents on non-academic contracts who also would not recommend a career in HE (just shy of 700 respondents). Interestingly, for a number of the quality questions there was no differentiation in response between the groups, but there were noticeable differences for “reasonable level of control over work-life balance”, “able to access support with my work when I need it”, and “opportunities to share my opinion”. Among the group that would not recommend HE, the academic cohort was more likely to give negative responses to these questions, giving a modest indication of possible priority areas for intervention.

    We also found that those who had worked in higher education for five years or fewer were frequently more likely to report agreement with our various propositions about quality of work. While there’s clearly some overlap with those early in their career, they are not entirely the same group – some may have entered HE from other sectors or industries – though early career respondents also emerge as having a slightly more positive view, including in areas like emotional safety and wellbeing.

    Safety, security and pay and conditions

    The four statements we proposed on this theme were:

    • I feel reasonably secure in my job
    • I am satisfied with the pay and any additional benefits I receive
    • I feel physically safe at work
    • I feel emotionally safe at work

    On job security, overall two thirds (66.3 per cent) of our sample agreed or strongly agreed that they feel reasonably secure in their job. Those on academic contracts reported lower levels of agreement (57.8 per cent). Those who said they had been employed in higher education for five years or fewer reported higher levels of agreement (71.4 per cent). Respondents who identified as disabled reported slightly lower levels of agreement (61.9 per cent).

    On satisfaction with pay, conditions and additional benefits, overall 63.8 per cent of respondents agreed or strongly agreed that they were satisfied. Those on academic contracts reported lower levels of agreement (56.3 per cent). Those who identified as having a minoritised ethnicity had the lowest levels of agreement of all our various comparators (53.1 per cent), and were nearly twice as likely to strongly disagree that they were satisfied with pay and benefits as those from non-minoritised ethnicities (15.2 per cent compared to 7.9 per cent). Those who identified as disabled had lower levels of agreement (54.6 per cent) compared to those who did not consider themselves disabled (64.9 per cent).

    On physical safety, the vast majority of respondents (95.8 per cent) agreed or strongly agreed that they feel physically safe at work, with very little variation across our comparator groups. While overall agreement was similar between men and women, men were notably more likely to register strong agreement (66.3 per cent) than women (51.9 per cent).

    On emotional safety the picture is more varied. Overall 72 per cent agreed or strongly agreed that they feel emotionally safe at work. Those who reported being earlier in their career reported higher levels of agreement (78.6 per cent), as did those who reported having worked in the HE sector for five years or fewer (78.6 per cent). Those with academic contracts reported lower levels of agreement (61.6 per cent). Those who identified as having a minoritised ethnicity had lower levels of agreement (62.7 per cent) and were more than twice as likely to strongly disagree that they feel emotionally safe at work as those who are not minoritised (14.2 per cent compared to 6.1 per cent).

    Balance, challenge, and performance

    The four statements we proposed on this theme were:

    • The work I do makes appropriate use of my skills and knowledge
    • I have a reasonable level of control over my work-life balance
    • My organisation demonstrates that it takes my wellbeing seriously
    • My organisation demonstrates that it takes my performance seriously

    On using skills and knowledge, 79.2 per cent of our sample agreed or strongly agreed that their work makes appropriate use of their skills and knowledge. There was very little variation between comparator groups – the one group that showed a modest difference was those who reported being disabled, whose agreement levels were slightly lower at 75.3 per cent.

    On control over work-life balance, 80.7 per cent of our sample agreed or strongly agreed they have a “reasonable” level of control. Those who had worked in higher education for five years or fewer were more likely to agree (87.2 per cent). 86.5 per cent of those who work from home agreed, compared to 74.4 per cent of those who work on campus or onsite, and 81.7 per cent of those who have hybrid or flexible working arrangements. Those who reported having leadership responsibilities had lower levels of agreement (78.9 per cent) compared to those who did not (85.9 per cent).

    The biggest difference was between those on academic contracts (66 per cent agreement) and those on non-academic contracts (85.3 per cent agreement). There were also slightly lower scores for disabled respondents (74.7 per cent compared to 81.2 per cent for non-disabled respondents) and for minoritised ethnicities (76.6 per cent compared to 81 per cent for non-minoritised ethnicities).

    On wellbeing, 57.8 per cent of our sample agreed or strongly agreed that their organisation demonstrates that it takes their wellbeing seriously. This was higher for early-career respondents – 60 per cent agreement compared to 57.9 per cent for those in mid-career, and 55.5 per cent for those approaching the end of their career. Agreement was higher for those with five years or fewer in higher education at 68.4 per cent agreement, compared with 54.5 per cent for those with more than 20 years’ experience.

    Those on academic contracts were substantially less likely to agree that their organisation demonstrates that it takes their wellbeing seriously, at only 39.7 per cent agreement. Disabled respondents were also much less likely to agree than non-disabled respondents, at 47.7 per cent and 59 per cent respectively. Those working from home reported slightly lower levels of agreement, at 52.6 per cent.

    On performance, 63.1 per cent of our sample reported that their organisation demonstrates that it takes their performance seriously. This was slightly higher for those who had five years or fewer in higher education, at 69.6 per cent. Again, there was a difference between those on academic contracts with 57.8 per cent agreement and those on non-academic contracts, with 64 per cent agreement. Disabled respondents were slightly less likely to agree (58 per cent agreement) than non-disabled (63.8 per cent agreement).

    Relationships with colleagues

    The four statements we proposed on this theme were:

    • I am able to access support with my work when I need it
    • I am given sufficient opportunities to share my opinion on matters that affect my work
    • For the most part I have a good working relationship with my colleagues
    • I generally trust that the people who work here are doing the right things

    On accessing support, 76.2 per cent of our sample agreed they are able to access support when they need it. There was higher agreement among those early in their career at 81.3 per cent, and similarly among those who had worked five years or fewer in HE, at 82.8 per cent. There was lower agreement among those on academic contracts: 62.3 per cent agreement versus 80.5 per cent for those on non-academic contracts. Those from a minoritised ethnicity had lower agreement at 70.6 per cent, as did disabled respondents at 67.4 per cent.

    On opportunities to share opinion, 70.4 per cent of our sample agreed or strongly agreed they were given sufficient opportunities to share their opinion on matters that affect their work. There was a small difference between those who held a leadership role and those who did not, at 71.9 per cent and 66 per cent agreement respectively. Again, those on academic contracts had lower levels of agreement, at 58.2 per cent compared to 73.9 per cent for those on non-academic contracts. Disabled staff also had lower agreement at 60.9 per cent.

    On working relationships, cheeringly, 96.1 per cent of our sample agreed or strongly agreed that they have good working relationships with their colleagues. While this held true overall across all our comparator groups regardless of leadership roles, working location, personal characteristics or any other factor, notably those of a minoritised ethnicity strongly agreed at a lower rate than those who did not identify as being from a minoritised ethnicity (39.6 per cent strong agreement compared to 48.3 per cent).

    On trust, 70.8 per cent of our sample agreed or strongly agreed that they generally trust that the people they work with are doing the right things. This was slightly lower among those who work from home or remotely, at 65.9 per cent. Agreement was lower among those on an academic contract, at 61.6 per cent, compared to 73.4 per cent of those on a non-academic contract. Agreement was also lower among disabled respondents, at 63.8 per cent.

    “Self-actualisation”

    The four statements we proposed on this theme were:

    • My current job fits with my future career plans and aspirations
    • I am comfortable with the level of autonomy I have in my job
    • There are sufficient opportunities for progression from this job
    • The work I do in my job is meaningful

    On career plans, 76.1 per cent of our sample agreed or strongly agreed that their current job fits with their future career plans and aspirations. Those who said they work from home or remotely had slightly lower levels of agreement at 69.3 per cent. Those who said they do not have any kind of leadership role had slightly lower levels of agreement at 69.4 per cent.

    On autonomy, 82.5 per cent of our sample agreed or strongly agreed that they were comfortable with the level of autonomy they have in their job. Those with an academic contract had slightly lower levels of agreement at 77.9 per cent, compared to 83.8 per cent agreement among those on non-academic contracts. Those of a minoritised ethnicity had lower levels of agreement at 73.9 per cent, as did disabled respondents, at 75.9 per cent agreement.

    On progression, a startling 29.5 per cent agreed or strongly agreed that there are sufficient opportunities for progression from their current position. There was a modest difference between those with leadership roles, 31.1 per cent of whom agreed, compared to 25 per cent of those without a leadership role. Those on academic contracts had higher levels of agreement at 38.5 per cent, compared to 26.8 per cent of those on non-academic contracts.

    On meaningful work, 86.1 per cent of our sample agreed or strongly agreed that the work they do in their job is meaningful. Those who work from home or remotely had very slightly lower levels of agreement at 77.9 per cent but otherwise this held true across all our comparator groups.

    Aspiration to lead and preparedness to lead

    We asked whether respondents aspire to take on or further develop a leadership role in higher education, and if so, whether they are confident they know what a path to leadership in higher education involves in terms of support and professional development. These questions are particularly relevant given the generally negative view about opportunities to progress held by our survey respondents.

    [Interactive visualisation]

    Overall, 44.5 per cent of our sample said they aspire to take on or further develop a leadership role. Curiously, this was only slightly higher for those who already have some level of leadership responsibility, at 48.3 per cent. This can be explained to some degree by differentiation by career stage: 58.8 per cent of early career respondents aspired to take on or develop leadership roles, as did 50.9 per cent of mid-career respondents.

    Aspiration to lead was higher among those identifying as lesbian, gay, or bisexual at 52.6 per cent compared to 43.2 per cent for those who did not. Aspirations were also higher among respondents of a minoritised ethnicity, at 54.5 per cent, compared to 43.8 per cent among those not of a minoritised ethnicity.

    We also asked respondents whether they are confident they know what a path to leadership involves in terms of support and professional development, where we found some important variations. Confidence about pathways to leadership was lower among early career respondents, at 22.8 per cent agreement, and even mid-career respondents’ confidence was lower than the numbers reporting they aspire to leadership, at 36.6 per cent.

    While there was no difference in aspiration between respondents on academic contracts and those on non-academic contracts, those on academic contracts were more likely to say they are confident they know what a path to leadership involves, at 50.3 per cent compared to 34.8 per cent.

    While there was no difference in aspiration between men and women respondents, women were slightly less likely than men to report confidence in knowing about the path to leadership, at 37.5 per cent compared to 42 per cent. Those who identify as lesbian, gay or bisexual, those of a minoritised ethnicity, and disabled respondents were also slightly less likely than their comparator groups to express confidence, despite having expressed aspiration to lead at a higher rate.

    These findings around demographic difference suggest that there remains some work to be done to make leadership pathways visible and inclusive to all.

    We’ll be picking up the conversation about sustaining higher education community during tough times at The Festival of Higher Education in November. It’s not too late to get your ticket – find out more here.


  • Good Marketing Won’t Fix Unpopular Programs

    In full disclosure, I work in higher education marketing. But I’m here to say: Marketing can’t fix a bad program. OK, maybe “bad” is too strong a word, but degree programs that aren’t aligned to the modern learner’s needs and expectations — or the job market — can be challenging. Let’s discuss.

    For this article, we’ll primarily focus on adult online learners. And these prospective students are very different from those coming right out of high school. According to Common App, first-time college students apply to about six different colleges, on average. The online learner typically inquires with only two institutions, according to an EducationDynamics report, and 45% apply to just one.

    What does this mean for schools with online programs? You have to get in front of your target audience quickly and make your case clearly. But if you don’t have the right mix of features or programs for these students, it doesn’t matter if your marketing is excellent.

    Give Online Learners What They Need 

    Online learners typically work at least part time and often full time. They have different needs and expectations for their higher education experience. They need flexibility. They also don’t want to be in school longer than necessary. Most are earning a degree to improve their career options. 

    Below are a few things to consider when formatting your programs and processes for online students.

    Efficiency 

    Once online learners have decided to take the step of applying, they’re committed and want to get started quickly. According to the EducationDynamics report, 80% enroll in the school that admits them first, and more than 50% expect to begin courses within a month of being admitted. 

    That means admissions teams have to move quickly and the programs must offer multiple start dates per year. If you make prospective students wait, you lose out. Delays can make an otherwise good program fall into the “bad” category.

    This one can be challenging. You need enough students to merit multiple start dates. That’s where that good marketing comes in!

    Relevant Skills

    Online learners choose online because they’re working and need a flexible school schedule to accommodate their work and personal commitments. But let’s focus on the work part here. These students need skills and credentials that will boost their earnings and opportunities. That’s one of the most cited reasons for returning to school.

    So, again, the degree must match the skills students need to find work. If the only online programs you offer are in computer science, you may find that you’re wasting your marketing dollars. Yes! Computer science! In the age of artificial intelligence (AI), computer science and engineering graduates are struggling to find work. 

    Personal opinion: Liberal arts and studies will become more important if they can teach students the durable skills needed in the AI era — communication, critical thinking, and research skills.

    Clear Information

    Degree program pages and websites sometimes obscure information users need to make decisions. And we saw above how quickly online learners are making decisions and want to get started. If your program page hides costs, financial aid information, credit hours, and requirements, you’re going to drop out of their consideration set. 

    Online learners want to weigh available information and make informed decisions. Some will certainly have price sensitivity, but it’s not the only consideration, so don’t hide tuition rates and fees. The EducationDynamics report notes that “flexibility can even overcome cost, with 30% of respondents indicating they would enroll at a more expensive institution if the available format, schedule, or location were ideal.” Show your cards. Let the students make their decisions with the information available.

    If your program doesn’t meet student requirements in this area, marketing won’t make a significant impact on your enrollments.

    Be Discriminating in Your Marketing Spend

    Sometimes there are politics at play or other reasons to market or support certain programs, but when possible, be thoughtful and intentional about where you spend your marketing dollars. Because marketing can’t solve for a challenging program, you must put your budget toward programs that meet student needs, including those that meet the criteria above.

    It’s tempting to give equal shares to all programs, but unless you have an unlimited budget, that’s not the best use of your funds. 

    If you must give some marketing love to all programs, even the “bad” ones, try a brand-focused approach that connects to an all-programs page. For example, send some limited traffic to a dedicated landing page that briefly covers all available programs. That way, you’ve covered the challenged programs without dedicated resources.

    Use the remainder of your budget on programs that align with students’ needs, so you can enjoy a lower cost per enrollment. Who doesn’t love a “chase the winners” strategy?

    Need More Help?

    Archer Education has deep expertise in both of these areas: marketing and program assessments. Our Strategy and Development team can help you take an unfiltered view of your programs and processes to create a plan for future success, even as the market shifts. If you have good programs and need marketing support, we’re here for that, too.


  • Sarah Bendall on good governance – Campus Review

    NSO First Assistant Ombudsman Sarah Bendall spoke to Campus Review editor Erin Morley about how student complaints reflect current sector issues, like governance, and how the NSO will work with the Australian Tertiary Education Commission (ATEC).


  • Top Tips: Take good notes

    The key to a great news story is a great interview. But all your work getting that interview will be wasted if you don’t have great notes. 

    When I first started out as a journalist, people didn’t always record their interviews. I rarely used a recorder. I found that it did something to my brain. Part of my brain would be worried that the recording wasn’t working. 

    When I became an editor, I found that I could tell when a reporter had used a recorder. The quotes in the story were often too long or too flat — they lacked something, maybe emotion or emphasis. 

    When you interview someone without taping it, you have to listen carefully. There isn’t any backup. And because it is difficult to take down everything someone says word for word, your brain works with your ears and your hand to take down what is most important – the essential facts and details, the emotion, the surprising things someone says. 

    If you have recorded that same interview, you won’t be doing that. You know you have a backup. And when you go back and listen to the recording, something is different. The statements all flatten out and you end up putting in the story what sounds most explanatory or most impressive. In other words, you can’t tell what was most interesting when you were sitting there or on the phone. 

    A recording is not enough.

    These days it is standard practice to record interviews, if for no other reason than we need the audio for podcasts or audio clips. 

    But for a great story and to be a great storyteller you should master the art of notetaking. When doing an interview, forget that the recorder is on. Imagine it isn’t working (and it might not be!). So here are some tips for taking notes:

    First, don’t try to take down every word. Instead, listen for what is important. 

    “Quotes can be short,” said News Decoder Educational News Director Marcy Burstiner.

    Don’t try to write everything down word for word. It’s OK to paraphrase. Put quote marks only around actual quotes. If you didn’t put quote marks around something in your notes, DON’T put the quote marks in your story.

    Master shorthand.

    Second, create your own system of shorthand.

    Shorthand is a system of writing in a code that allows you to take down words fast and accurately. There are some standard ways of doing that and courses to teach you how. It was developed for stenographers. Before recorders came along, offices employed people to take down dictation. The boss would dictate letters and reports to their secretaries who would then type them up. But you can create your own system of shorthand.

    For example, instead of writing down the person’s name every time they start talking (when you are talking to multiple people at the same time) use their initials. You can also lv out the vwls of common wrds. 

    U can write in text message 4mat b/c that also wrks. I am not a fast writer so I came up with my own system early on in my career. I put ?? when I’m not sure what the person said but I don’t want to interrupt them. I put ** when I want to go back to it to follow up. I circle words or underline them when I sense it is important.

    For something outrageous I write !!

    Take lots and lots of notes.

    Writing down words and ideas cements them in your mind.

    Third, use a pen and paper. There are a number of reasons to do this. If you are interviewing someone in person and you try to take your notes on a laptop or tablet, your head will be down half the time and you can’t circle stuff easily.

    There is also some science behind the notion that we retain information better when we write things down by hand on paper.

    As a journalist, I was a messy note taker. That piece of paper in the photo at the top of the story is an actual page of notes I once took. If I had time before I had to submit my article, I would make the effort to type my notes into a Word or Google Doc. Later I fell in love with spreadsheeting and would type my notes into a Google Spreadsheet, which would allow me to match up information from different interviews and sort it. This became handy when I was doing a story that involved a lot of interviews and complicated information.

    When going to interviews I sometimes forgot a notepad and would have to grab paper any way I could. I’ve taken notes in the margins of flyers and brochures and on the backside of stuff I got in the mail.

    But the best practice is to always keep a notepad on you, just as photographers always keep a camera on them. 

    Finally, when you are ready to start writing, write from your notes first before going to the recording. Use the recording to make sure you got your quotes right and that you paraphrased what the person said correctly. Trust that your brain and your ears and your hand will have taken down the best information and the most engaging quotes. 


    Questions to consider:

    1. Why take notes if you are recording an interview?

    2. What is the difference between quoting someone and paraphrasing something they said?

    3. How good a notetaker are you?


     


  • The Good Enough Manuscript (opinion)

    I recently coached a scholar through drafting a proposal for her second book. Her manuscript is almost complete, and our work together involved putting together a strong pitch for a few of the university presses that publish in her field. As she shared the last component of her book proposal with me for feedback, she observed with satisfaction that the proposal was indeed coming together but that the hard part would be working up the nerve to send the project off. If she only knew how many times I’ve seen that “hard part” be the step that kept people from realizing the publishing success they so deserve.

    As a professional developmental editor and publishing consultant who has spent the last 10 years helping academics bring their books and articles to print, I’m well-versed in the struggles of the scholarly writer. It’s no small feat to find time to research and write amid other professional obligations (like teaching and service) and personal commitments (like childcare, eldercare and self-care), not to mention national and global turbulence. Those who manage to complete a scholarly manuscript under these conditions should be applauded. But then the writer who has already accomplished so much faces another hurdle: persuading a press or journal to publish the text they’ve written.

    A common reaction to this hurdle is to find ways to delay having to confront it. I see writers get stuck in endless rounds of revision, going back and forth about which citations to include, tinkering with sentence structure and word choice, waiting to contact publishers until they’ve landed on the perfect phrasing for their cover letters.

    The truth is that the minute details don’t matter as much in the first submission as authors might think, especially at book publishers. You do want to put your best foot forward, to show that you value an editor’s time and that of the peer reviewers who will consider your work for publication. But it’s expected that your manuscript will evolve with the input of peer reviewers and that polishing words and sentences will happen during the final revision and copyediting stages. The writer’s goal when submitting to publishers should therefore not be a perfectly finished text, but a “good enough” manuscript that allows a press or journal to seriously consider whether they want to give a greater platform to the writer’s ideas.

    But what constitutes “good enough” in the eyes of scholarly publishers? The first criterion publishers are looking for is a sense of fit with their existing offerings. This actually has little to do with the quality of your writing. It’s more about whether the readership that the press or journal has already cultivated is generally welcoming to the topic, methods and theoretical framework of your piece.

    To ensure your manuscript is good enough in the area of fit, do your homework on what your target press or journal has published in the last year or two. Get clear on whom you are writing for and find outlets where those readers are already gathering. The risk of rejection goes down dramatically when you send your manuscript to the right place.

    Turning to your manuscript itself, before sending it to a publisher, you should evaluate it for what I call the four pillars of scholarly writing: argument, evidence, structure and style. Scholarly manuscripts must have a solid foundation in all four areas to be successful in the publishing process, because each of these fundamental aspects of the text has the potential to make or break the text’s chances of being received well by peer reviewers, getting approved for publication and ultimately reaching readers in the author’s scholarly field and beyond.

    Your argument is the main claim that drives your text and that you want readers to accept. Is it clearly stated near the beginning and does it remain present throughout the text? Your evidence backs up the argument for the reader. Do you have sufficient evidence and do you analyze it effectively to guide your reader to the points you want them to accept?

    The structure of your manuscript supports the reader in encountering your evidence and absorbing your points in a logical and engaging order. Structural concerns include the way the text is organized into chapters, sections and paragraphs, as well as your use of titles, headings, transitions and other signposts to move your reader along. Have you put thought into why the components of your text are organized the way they are, and have you used appropriate cues to make the structural logic obvious to your reader? By style, I mean the overall presentation of your writing, including how your attitude toward both reader and subject matter shows up on the page. Depending on the publishing venue, the style of a scholarly manuscript may be informal or formal, passionate or detached. Consider what will be most effective with your most important readers and ensure stylistic consistency across your text.

    After attending to big-picture matters, you will want to double-check that everything in your text is accurate and that sloppy errors don’t interfere with a reader’s understanding of what you want to say. But resist the temptation to tinker endlessly with superficial details. Everyone’s time, labor and mental fortitude are limited these days, so spend yours where they will get you the greatest return on investment.

    Try to reframe the editorial and publication process in your mind, thinking of it not as an adversarial set of gatekeeping encounters—though it can be that at times—but as a process designed to make your work the best it can be before it goes public. Your manuscript doesn’t have to be perfect when entering the process, because you’ll be taking it through several cycles of development before considering it to be finished. There will be multiple opportunities to improve it, and editors, peer reviewers and supportive readers will be alongside to help.

    The prospect of hitting “send” on your manuscript can be incredibly nerve-racking, but your ideas can’t reach anyone, let alone do good work in the world, if you don’t put them out there. You must eventually let go of the manuscript so it can go do its work.

    Doubts are natural. You may worry that a reader you respect will have reasonable objections or that you’ve missed something important. Perhaps you also worry about exposing yourself to criticism or rejection on the basis of your ideas, identity, background or political beliefs. Such fears are legitimate, especially for those scholars who are already marginalized in the academy. Name these fears and acknowledge that you have a right to feel your anxieties. Then assess whether the actual risks are worth silencing yourself by not putting your work out there at all.

    Laura Portwood-Stacer is the author of Make Your Manuscript Work (Princeton University Press, 2025), which offers a practical method to develop scholarly texts for publication, including a list of the most common areas where manuscripts need improvement. She is also the author of The Book Proposal Book (Princeton, 2021) and the Manuscript Works Newsletter, providing weekly guidance for scholarly writers and publishing professionals.


  • Sharing is good, except when it isn’t

    In the wake of the floods in the U.S. state of Texas earlier this month, news circulated on social media of two girls being rescued. One of the first posts sharing the story included a screenshot of a social media post that read:

    Rescuers find 2 girls in tree, 30-feet up, near Comfort

    The dramatic rescue occurred closer to Comfort, which is in Kendall County, witnesses said. The girls were found in the tree during ongoing search operations for victims of Friday’s catastrophic flooding that has killed 59 people across Kerr County.

    A Facebook search of the post’s keywords returned dozens of identical or similarly worded posts retelling the harrowing rescue. Other versions of the story were also shared across social media platforms like Instagram Threads, as well as in now-deleted articles across various news outlets.

    But the story was fabricated. 

    It was a prime example of a type of misinformation known as “copypasta.” 

    Inciting fear

    Social media posts that utilize copypasta — a portmanteau of “copy” and “paste” — are often used to incite fear or evoke emotions, prompting users to like and share the content. These posts are used for various reasons, whether to polarize different political groups further or to attract a broader audience and spread misinformation. 

    Alex Kasprak is an investigative journalist who reported for the digital fact-checking website Snopes for nearly a decade. In his experience, Kasprak says copypasta plays a central role in online misinformation. (For more on Snopes’ take on copypasta, head to this link.) 

    “The simplest way to put it, is that copypasta is a text that you see that is identical or nearly identical posted either with somebody’s name as an author or without it in an identical form on multiple posts such that it’s clear that whoever is posting it copied it from somewhere else,” said Kasprak. 

    “What you end up getting in that sort of phenomenon is a game of telephone.”  

    Copypasta serves as a new-age version of the chain mail seen in the early days of email, which promised good luck for forwarding a message or foretold misfortune if you let the email sit in an inbox.

    Lacking credibility

    In the case of copypasta, social media users are encouraged to comment, share or tag their friends in a post to boost engagement. Such emotion-evoking messages can serve as an entry point into more polarizing content, which is often rife with false information. 

    To identify copypasta, look for signs of vague or generic information that lacks a credible source or call to action. The way a post is written can also serve as an indication that it may be a copy-and-paste text. 

    “With copypasta, everything generally kind of travels forward, including errors in grammar or mistranslations,” Kasprak said. “If there are weird sentences that just kind of end or don’t fully make grammatical sense, that is an indicator that the tone of the message doesn’t match.”

    If the post is shared by someone that you know on your feed, but the tone is different than how they usually post or talk, the content likely originated from another source — credible or not, Kasprak said. 
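
    As a toy version of the “identical or nearly identical” test Kasprak describes, one can flag posts whose normalised text closely matches an already-seen post. This is a minimal sketch with an assumed similarity threshold, not a production moderation tool:

    ```python
    from difflib import SequenceMatcher

    def normalise(text: str) -> str:
        # Lowercase and collapse whitespace so trivial edits don't hide a match
        return " ".join(text.lower().split())

    def is_copypasta(post: str, corpus: list[str], threshold: float = 0.9) -> bool:
        """Flag a post that is identical or nearly identical to one already seen."""
        a = normalise(post)
        return any(
            SequenceMatcher(None, a, normalise(other)).ratio() >= threshold
            for other in corpus
        )

    seen = ["Rescuers find 2 girls in a tree, 30 feet up, near Comfort!"]
    print(is_copypasta("Rescuers find 2 girls in tree, 30-feet up, near Comfort", seen))
    # True: the texts differ only in the small edits that "travel forward"
    ```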

    In addition to spreading false information, copypasta can be used as part of bigger campaigns to push particular sentiments or ideologies. For example, back in 2017, U.S. government officials found evidence that Russian “trolls” took to social media, deploying coordinated campaigns to connect certain users to various organizations or movements.

    Danger to the infosphere

    During these online campaigns, nefarious actors meddled in the election by posting emotional content to get users to engage, gradually bringing them down a digital rabbit hole of more polarizing issues.

    Kasprak adds that copypasta content also harms the “infosphere,” or public knowledge otherwise rooted in fact. When copypasta becomes widespread and is presented as a “pseudofact,” people begin to cite it as common knowledge. A commonly held belief that many people cite as fact, for example, is that a mother bird will abandon its offspring if a human touches it. Experts agree that this notion is not true. 

    Another tactic of those who post copypasta is to poison AI models in much the same way that fake news websites do. When enough content on the internet makes a particular claim, AI technologies may focus on this noise and refer to it as fact. In this way, AI programs are “trained” to focus on and “believe” those posts over other sources of information.

    Emotion-evoking posts may also fall into the copypasta category if they are not rooted in unbiased facts. If emotional language used in the post immediately sparks anger, sadness or another strong emotion, it may be a fake post. 

    “In general, the big thing to watch out for is if something fits perfectly into your notion of how the world works,” said Kasprak. Posts that validate a person’s view of the world or evoke strong emotions in a positive or negative way are more likely to be a red flag. 

    Kasprak advises users to check their biases when reading potential copypasta content; if something makes you angry or sad, double-check its source and legitimacy. 

    “Pause if you feel strongly about wanting to share something, because those posts are the ones where the risk of copypasta is higher,” said Kasprak. When he comes across a post he believes to be copypasta, Kasprak says that he tries to “tear apart” the argument, primarily if it supports his beliefs, until it dissolves. 

    “Check your blind spots and be vigilant in checking your work,” said Kasprak. 

    When in doubt, don’t share.


     

    Questions to consider:

    1. What is meant by “copypasta”?

    2. How can something false become commonly believed?

    3. Can you remember the last thing you reposted on social media? What kind of things do you share with your network?


     
