  • Anatomy of the Research Statement (opinion)

    The research statement that you include in your promotion and tenure dossier is one of the most important documents of your scholarly career—and one you’ll have little experience writing or even reading, unless you have a generous network of senior colleagues. As an academic editor, I support a half dozen or so academics each year as they revise (and re-revise, and throw out, and retrieve from the bin, and re-revise again) and submit their research statements and P&T dossiers. My experience is with—and so these recommendations are directed at—tenure-track researchers at American R-1s and R-2s and equivalent Canadian and Australian institutions.

    In my experience, most academics are good at describing what their research is and how and why they do it, but few feel confident in crafting a research statement that attests to the impact of their accomplishments. And “impact” is a dreaded word across the disciplines—one that implies reducing years of labor to mere numbers that fail to account for the depth, quality or importance of your work.

    When I think about “impact,” I think of course of the conventional metrics, but I think as well of your work’s influence among your peers in academia, and also of its resonance in nonacademic communities, be they communities of clinicians, patients, people with lived experiences of illness or oppression, people from a specific equity-deserving group, or literal neighborhoods that can be outlined on a map. When I edit research statements, I support faculty to shift their language from “I study X” to “My study of X has achieved Y” or “My work on X has accomplished Z.” This shift depends on providing evidence to show how your work has changed other people’s lives, work or thinking.

    For researchers who seek to make substantial contributions outside of academia—to cure a major disease, to change national policy or legislation—such a focus on impact, influence and resonance can be frustratingly short-termist. Yet if it is your goal to improve the world beyond the boundaries of your classroom and campus, then it seems worthwhile to find ways to show whether and how you are making progress toward that goal.

    If you’re preparing to go up for tenure or promotion, here’s a basic framework for a research statement, which you can adopt and adapt as you prepare your own impact-, influence- or resonance-focused research statement:

    Paragraph 1—Introduction

    Start with a high-level description of your overarching program of research. What big question unites the disparate parts of your work? What problem are you working toward solving? If your individual publications, presentations and grants were puzzle pieces, what big picture would they form?

    Paragraph 2—Background (Optional)

    Briefly sketch the background that informed your current preoccupations. Draw, if relevant, on your personal or professional background before your graduate studies. This paragraph should be short and should emphasize how your pre-academic life laid the foundation that has prepared you, uniquely, to address the key concerns that now occupy your intellectual life. For folks in some disciplines or institutions, this paragraph will be irrelevant and shouldn’t be included: trust your gut, or, if in doubt, ask a trusted senior colleague.

    Middle Paragraphs—Research Themes, Topics or Areas

    Cluster thematically—usually into two, three or four themes—the topics or areas into which your disparate projects and publications can be categorized. Within each theme, identify what you’re interested in and, if your methods are innovative, how you work to advance scholarly understandings of your subject. Depending on the expected length of your research statement, you might write three or four paragraphs for each theme. Each paragraph should identify external funding that you secured to advance your work and point to any outputs—publications, conference presentations, journal special issues, monographs, edited books, keynotes, invited talks, events, policy papers, white papers, end-user training guides, patents, op-eds and so on—that you produced.

    If the output is more than a few years old, you’ll also want to identify what impact (yes) that output had on other people. Doing so might involve pointing at your numbers of citations, but you might also:

    • Describe the diversity of your citations (e.g., you studied frogs but your research is cited in studies of salmon, belugas and bears, suggesting the broad importance of your work across related subfields);
    • Search the Open Syllabus database to identify the number of institutions that include your important publication in their teaching, or WorldCat, to identify the number of countries in which your book is held;
    • Link your ORCID account to Sage’s Policy Profiles to discover the government ministries and international bodies that have been citing your work;
    • Summarize media mentions of your work, highlighting big, important stories in news media such as magazine covers or features in national newspapers (e.g., “In August 2025, this work was featured in The New York Times (URL)”);
    • Name awards you’ve won for your outputs or those won by trainees you supervised on the project, including a description of why the award-giving organization selected your or your trainee’s work;
    • Identify lists of top papers in which your article appears (e.g., most cited or most viewed in that journal in the year it was published); or,
    • Explain the scholarly responses to your work, e.g., conference panels discussing one of your papers or quotations from reviews of your book in important journals.

    Closing Paragraphs—Summary

    If you’re in a traditional research institution—one that would rarely be described by other academics as progressive or politically radical—then it may be advantageous for you to conclude your research statement with three summary paragraphs.

    The first would summarize your total career publications and your publications since appointment, highlighting any that received awards or nominations or that are notable for the number of citations or the critical response they have elicited. This paragraph should also describe, if your numbers are impressive, your total number of career conference presentations and invited talks or keynotes as well as the number since either your appointment or your last promotion, and the total number of publications and conference presentations you’ve co-authored with your students or trainees or partners from community or patient groups.

    A second closing paragraph can summarize your total career research funding and funding received since appointment, highlighting the money you have secured as principal investigator, the money that comes from external (regional, national and international) funders, and, if relevant, the new donor funding you’ve brought in.

    A final closing paragraph can summarize your public scholarship, including numbers of media mentions, hours of interviews provided to journalists, podcast episodes featured on or produced, public lectures delivered, community-led projects facilitated, or numbers of op-eds published (and, if available, the web analytics associated with these op-eds; was your piece in The Conversation one of the top 10 most cited in that year from your institution?).

    Final Paragraph—Plans and Commitments

    Look forward with excitement. Outline the upcoming projects, described in your middle paragraphs, to which you are already committed, including funding applications that are still under review. Paint for your reader a picture of the next three to five years of your research and then the rest of your career as you progress toward achieving the overarching goal that you identified in your opening paragraph.

    While some departments and schools are advising their pretenure faculty that references to metrics aren’t necessary in research statements, I—perhaps cynically—worry that the senior administrators who review tenure dossiers after your department head will still expect to see your h-index, total number of publications, number of high-impact-factor journals published in and those almighty external dollars awarded.

    Unless you are confident that your senior administrators have abandoned conventional impact metrics, I’d encourage you to provide these numbers along with your disciplinary context. I’ve seen faculty members identify, for example, the average word count of a journal article in their niche, to show that their number of publications is not low but rather is appropriate given the length of a single article. I’ve seen faculty members use data from journals like Scientometrics to show that their single-digit h-index matches the average h-index for associate professors in their field, even though they are not yet tenured. Such context will help your reader understand that your h-index of eight is, in fact, a high number, and should be read as such.
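
    If you do cite an h-index, it can help to remind nonspecialist readers what the number means: an h-index of h says that h of your publications have each been cited at least h times. Here is a minimal, illustrative sketch of the computation (the citation counts below are invented):

    ```python
    def h_index(citations: list[int]) -> int:
        """Largest h such that h papers have at least h citations each."""
        h = 0
        for rank, count in enumerate(sorted(citations, reverse=True), start=1):
            if count >= rank:
                h = rank
            else:
                break
        return h

    # Invented citation counts for ten papers:
    print(h_index([50, 40, 22, 15, 12, 10, 9, 8, 3, 1]))  # -> 8
    ```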

    You’ll additionally receive any number of recommendations from colleagues and mentors; for those of you who don’t have trusted colleagues or mentors at your institution, I’ve collected the advice of recently tenured and promoted associate professors and full professors from a range of disciplines and institutional contexts in this free 30-page PDF.

    I imagine that most of the peers and mentors whom you consult will remind you to align with any guidelines that your institution provides. Definitely, you should do this—and you should return to those guidelines and evaluation criteria, if they exist, as you iteratively revise your draft statement based on the feedback you receive from peers. You’ll also need to know which pieces of your P&T dossier will be read by which audiences—external readers, a departmental or faculty committee, senior administrators. Anyone can tell you this: every piece of writing needs to consider both audience and context.

    But my biggest takeaway is something no client of mine has ever been told by a peer, colleague or mentor: Don’t just describe what you’ve done. Instead, point to the evidence that shows that you’ve done your work well.

  • Students, Unions to Protest Trump’s Higher Ed Agenda Friday

    Members of the American Association of University Professors, the affiliated American Federation of Teachers and student groups are planning protests in more than 50 cities Friday against “the Trump administration’s broad assault” on higher ed, the AAUP announced in a news release.

    The AAUP said demonstrators will urge institutions to continue rejecting Trump’s “Compact for Academic Excellence in Higher Education” and instead “commit to the freedom to teach, learn, research, and speak out without government coercion or censorship.”

    “From attacks on academic freedom in the classroom to the defunding of life-saving scientific research to surveilling and arresting peaceful student protesters, Trump’s higher education policies have been catastrophic for our communities and our democracy,” AAUP president Todd Wolfson said in the release. “We’re excited to help build a coalition of students and workers united in fighting back for a higher education system that is accessible and affordable for all and serves the common good.”

    The protests are part of a progressive movement called Students Rise Up, or Project Rise Up. The Action Network website says there will be “walkouts and protests at hundreds of schools” Friday—the start of a buildup “to a mass student strike on May 1st, 2026, when we’ll join workers in the streets to disrupt business as usual.”

    “We’re demanding free college, a fair wage for workers, and schools where everyone is safe to learn and protest—regardless of their gender or race or immigration status,” the website says.

    Other groups listed as organizing or supporting the protests include the Campus Climate Network, College Democrats of America, Florida Youth Action Fund, Frontline for Freedom, Higher Ed Labor United, Ohio Student Association, Sunrise Movement, Dissenters, Feminist Generation, Gen-Z for Change, Generation Vote (GenVote), March for Our Lives, Oil and Gas Action Network, Socialist Alternative, Together Across America, Voters of Tomorrow, Blue Future, Get Free, and NOW Young Feminists.

    Asked for a comment from the Education Department, Madi Biedermann, deputy assistant secretary for communications, repeated statements the department previously made, saying, “The Trump Administration is achieving reforms on higher education campuses that conservatives have dreamed about for 50 years.”

    “Institutions are once again committed to enforcing federal civil rights laws consistently, they are rooting out DEI and unconstitutional race preferences, and they are acknowledging sex as a biological reality in sports and intimate spaces,” she wrote.

  • Teaching students how to talk: why dialogue belongs at the heart of higher education

    This blog was kindly authored by Estefania Gamarra, Postdoctoral Research Fellow, and Marion Heron, Associate Professor in Educational Linguistics, both from the University of Surrey Institute of Education, together with Harriet R. Tenenbaum, Professor in Developmental and Social Psychology, and Lewis Baker, Senior Lecturer in Chemical and Process Engineering (Foundation Year), both from the University of Surrey.

    Today’s higher education sector faces a need to increase student progression and improve retention, a goal that is especially pressing for Foundation Year programmes. A proposed solution is active learning. Yet amid the push to make lectures more interactive, one approach stands out – dialogue.

    Dialogue transforms students from passive listeners into active participants. But while universities increasingly encourage discussion in classrooms and put students in pairs, they often overlook a crucial question: do students know how to talk to each other in academic contexts?

    For years, the emphasis has been on teaching students how to write academically, while teaching them how to engage in academic talk – how to reason aloud, build on others’ ideas, and disagree respectfully – has been largely ignored. Academic dialogue is not a natural skill: it is a learnt one. For many students, particularly those from ethnic minoritised or first-generation backgrounds, the language of higher education can feel like a second language. Expecting them to navigate complex, often implicit norms of discussion without support risks reproducing the very inequalities universities seek to address.

    What we mean by educational dialogue

    Educational dialogue refers to purposeful, structured talk that supports reasoning, collaboration, and shared understanding. It differs from casual conversation because it asks participants to listen actively, build connections between ideas, and make their thinking explicit. In this way, dialogue makes learning visible – students co-construct understanding through talk.

    Despite a growing body of research in schools showing the benefits of educational dialogue for reasoning, collaboration, and attainment, there has been little work examining how this plays out in higher education. Our project, funded by the Nuffield Foundation, aimed to fill that gap by exploring how Foundation Year students across six UK universities talk to one another when given structured opportunities for dialogue – and whether a targeted intervention could enhance the quality of these interactions.

    What we found

    We observed clear disciplinary differences in the ways students engaged in dialogue. Psychology students, for instance, tended to make more connections to topics beyond the classroom, while Engineering students often built on one another’s ideas in a collaborative effort to solve the problems presented. Recognising these differences is crucial: subject cultures shape how students learn to talk, and this understanding can help educators design more inclusive, discipline-sensitive approaches to active learning. At the same time, if our goal is to prepare students for an increasingly interdisciplinary world, we must also help them become aware of how other disciplines talk and encourage them to develop the flexibility to communicate across disciplinary boundaries.

    The intervention itself had a tangible effect. Discussion time increased, and we observed a higher frequency of dialogic moves such as connecting ideas and making reasoning explicit. In simple terms, students were not just talking more; they were engaging in higher-quality dialogue.

    Both students and teachers noticed the change. Students reported greater confidence in contributing to class discussions and felt more comfortable expressing disagreement respectfully. Teachers in the intervention group described classroom talk as ‘more professional’ and ‘more purposeful’, noting that students participated more readily and that discussions felt more structured.

    Why this matters for policy

    These findings underscore a simple yet powerful message: if universities want students to collaborate effectively and communicate professionally, they must teach them how to talk.

    This is not merely a matter of classroom technique but of educational equity. All students are expected to adopt the norms of academic discourse without being taught what these norms are. By treating dialogue as a teachable skill – much like academic writing – universities can make participation more equitable and support a sense of belonging for all learners.

    Embedding educational dialogue within curricula also has broader policy implications. It aligns directly with the sector’s commitments to widening participation, student engagement, and the development of graduate attributes. In an increasingly interdisciplinary world, helping students learn how to communicate across disciplinary and cultural boundaries is not an optional extra – it is essential preparation for both professional and civic life.

    A call to action

    Universities already invest heavily in teaching academic writing. It is time to afford talk the same status. Embedding structured opportunities for educational dialogue – and explicitly teaching the skills that underpin it – can help create classrooms where every student, regardless of background, can find and use their voice.

    If higher education is serious about inclusion, engagement, and progression, it must teach students not just what to say, but how to say it.

  • The new AI tools are fast but can’t replace the judgment, care and cultural knowledge teachers bring to the table

    by Tanishia Lavette Williams, The Hechinger Report
    November 4, 2025

    The year I co-taught world history and English language arts with two colleagues, we were tasked with telling the story of the world in 180 days to about 120 ninth graders. We invited students to consider how texts and histories speak to one another: “The Analects” as imperial governance, “Sundiata” as Mali’s political memory, “Julius Caesar” as a window into the unraveling of a republic. 

    By winter, our students had given us nicknames. Some days, we were a triumvirate. Some days, we were Cerberus, the three-headed hound of Hades. It was a joke, but it held a deeper meaning. Our students were learning to make connections by weaving us into the histories they studied. They were building a worldview, and they saw themselves in it. 

    Designed to foster critical thinking, this teaching was deeply human. It involved combing through texts for missing voices, adapting lessons to reflect the interests of the students in front of us and trusting that learning, like understanding, unfolds slowly. That labor can’t be optimized for efficiency. 

    Yet, today, there’s a growing push to teach faster. Thousands of New York teachers are being trained to use AI tools for lesson planning, part of a $23 million initiative backed by OpenAI, Microsoft and Anthropic. The program promises to reduce teacher burnout and streamline planning. At the same time, a new private school in Manhattan is touting an AI-driven model that “speed-teaches” core subjects in just two hours of instruction each day while deliberately avoiding politically controversial issues. 

    Marketed as innovation, this stripped-down vision of education treats learning as a technical output rather than as a human process in which students ask hard questions and teachers cultivate the critical thinking that fuels curiosity. A recent analysis of AI-generated civics lesson plans found that they consistently lacked multicultural content and prompts for critical thinking. These AI tools are fast, but shallow. They fail to capture the nuance, care and complexity that deep learning demands. 

    When I was a teacher, I often reviewed lesson plans to help colleagues refine their teaching practices. Later, as a principal in Washington, D.C., and New York City, I came to understand that lesson plans, the documents connecting curriculum and achievement, were among the few steady examples of classroom practice. Despite their importance, lesson plans were rarely evaluated for their effectiveness.  

    When I wrote my dissertation, after 20 years of working in schools, lesson plan analysis was a core part of my research. Analyzing plans across multiple schools, I found that the activities and tasks included in lesson plans were reliable indicators of the depth of knowledge teachers required and, by extension, the limits of what students were asked to learn. 

    Reviewing hundreds of plans made clear that most lessons rarely offered more than a single dominant voice — and thus confined both what counted as knowledge and what qualified as achievement. Shifting plans toward deeper, more inclusive student learning required deliberate effort to incorporate primary sources, weave together multiple narratives and design tasks that push students beyond mere recall. 

     I also found that creating the conditions for such learning takes time. There is no substitute for that. Where this work took hold, students were making meaning, seeing patterns, asking why and finding themselves in the story. 

    That’s the transformation AI can’t deliver. When curriculum tools are trained on the same data that has long omitted perspectives, they don’t correct bias; they reproduce it. The developers of ChatGPT acknowledge that the model is “skewed toward Western views and performs best in English” and warn educators to review its content carefully for stereotypes and bias. Those same distortions appear at the systems level — a 2025 study in the World Journal of Advanced Research and Reviews found that biased educational algorithms can shape students’ educational paths and create new structural barriers. 

    Ask an AI tool for a lesson on westward expansion, and you’ll get a tidy narrative about pioneers and Manifest Destiny. Request a unit on the Civil Rights Movement and you may get a few lines on Martin Luther King Jr., but hardly a word about Ella Baker, Fannie Lou Hamer or the grassroots organizers who made the movement possible. Native nations, meanwhile, are reduced to footnotes or omitted altogether. 

    Curriculum redlining — the systematic exclusion or downplaying of entire histories, perspectives and communities — has already been embedded in educational materials for generations. So what happens when “efficiency” becomes the goal? Whose histories are deemed too complex, too political or too inconvenient to make the cut? 

    None of this is theoretical. It’s already happening in classrooms across the country. Educators are under pressure to teach more with less: less time, fewer resources, narrower guardrails. AI promises relief but overlooks profound ethical questions. 

    Students don’t benefit from autogenerated worksheets. They benefit from lessons that challenge them, invite them to wrestle with complexity and help them connect learning to the world around them. That requires deliberate planning and professional judgment from a human who views education as a mechanism to spark inquiry. 

    Recently, I asked my students at Brandeis University to use AI to generate a list of individuals who embody concepts such as beauty, knowledge and leadership. The results, overwhelmingly white, male and Western, mirrored what is pervasive in textbooks.  

    My students responded with sharp analysis. One student created color palettes to demonstrate the narrow scope of skin tones generated by AI. Another student developed a “Missing Gender” summary to highlight omissions. It was a clear reminder that students are ready to think critically but require opportunities to do so.  

    AI can only do what it’s programmed to do, which means it draws from existing, stratified information and lags behind new paradigms. That makes it both backward-looking and vulnerable to reproducing bias.  

    Teaching with humanity, by contrast, requires judgment, care and cultural knowledge. These are qualities no algorithm can automate. When we surrender lesson planning to AI, we don’t just lose stories; we also lose the opportunity to engage with them. We lose the critical habits of inquiry and connection that teaching is meant to foster. 

    Tanishia Lavette Williams is the inaugural education stratification postdoctoral fellow at the Institute on Race, Power and Political Economy, a Kay fellow at Brandeis University and a visiting scholar at Harvard University. 

    Contact the opinion editor at [email protected].  

    This story about AI and teaching was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.

    This article first appeared on The Hechinger Report (https://hechingerreport.org) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).

  • Higher education should back a national digital skills wallet

    Across the UK, millions of people struggle to prove what they’re capable of. From students juggling part-time work, to graduates volunteering in their communities, much of their learning sits outside formal qualifications and is effectively invisible to employers and institutions.

    That invisibility costs everyone. It holds back individuals who can’t evidence their abilities and employers who struggle to identify the right talent, and it also drags on national productivity.

    Last month the RSA and Ufi VocTech Trust published the final report of the Digital Badging Commission, From Skills to Growth: A Plan for Digital Badging in the UK. The conclusion is clear: the UK urgently needs a national digital skills wallet, linked to emerging plans for a national digital ID and built on open, interoperable standards. It would allow people to collect, store and share digital badges and credentials for their skills and capabilities that sit alongside their formal qualifications for a more holistic approach to education.

    This recommendation now aligns directly with the UK government’s post-16 education and skills white paper which commits to a digital-first, lifelong learning system and the development of a national digital identity infrastructure. The commission’s proposals are therefore timely, practical and well placed to support the government’s agenda.

    The missing infrastructure

    While individual institutions issue transcripts in PDF format, and some pioneer digital badges alongside them, there is no shared infrastructure to make those records connected or visible across different sectors. As education and work become increasingly digital, paper certificates should be giving way to verifiable, portable digital records, alongside digital badges and credentials.

    These are not just icons of achievement, but verified records embedded with information about who issued them, what they recognise and when they were awarded. Built on open standards, they can be issued by any organisation that follows the same open technical framework, ensuring compatibility across sectors.
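
    To make this concrete, here is a minimal sketch of what one such verified record can look like, loosely following the public Open Badges 2.0 vocabulary. The issuer, badge and URLs are hypothetical, and real deployments add hosting or cryptographic signing details:

    ```python
    # An illustrative badge assertion, loosely following the Open Badges 2.0
    # vocabulary (https://www.imsglobal.org/spec/ob/v2p0/). All names and URLs
    # below are hypothetical.
    assertion = {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": "https://badges.example.ac.uk/assertions/1234",  # where the record lives
        "recipient": {  # who earned it
            "type": "email",
            "hashed": False,
            "identity": "learner@example.ac.uk",
        },
        "issuedOn": "2025-06-30T00:00:00Z",  # when it was awarded
        "verification": {"type": "hosted"},  # how a third party verifies it
        "badge": {  # what it recognises, and who issued it
            "type": "BadgeClass",
            "id": "https://badges.example.ac.uk/badges/applied-data-skills",
            "name": "Applied Data Skills",
            "description": "Completed a 20-hour applied data analysis workshop.",
            "criteria": {"narrative": "Delivered a data project assessed by staff."},
            "issuer": {
                "type": "Issuer",
                "id": "https://badges.example.ac.uk/issuer",
                "name": "Example University",
            },
        },
    }
    ```

    Because every issuer follows the same vocabulary, any wallet that speaks the standard can import, display and verify such a record, which is what makes cross-sector compatibility possible.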

    Globally, digital badges and credentials are part of richer digital profile infrastructures, including Comprehensive Learner Records (CLRs) and Learning and Employment Records (LERs). CLRs capture academic, professional and co-curricular achievements, while LERs extend this to employment history, creating a portfolio of verified experience. Both can be stored in digital wallets, secure platforms that give individuals control over how and when they share their data with employers or education providers.

    Together, these systems represent a shift towards lifelong, learner-owned digital records, combining qualifications, skills and experience into one trusted framework. This is precisely the direction outlined in the government’s white paper, which calls for a more joined-up and data-driven post-16 education landscape.

    The Digital Badging Commission’s modelling shows that a trusted digital credentialing ecosystem could unlock billions in productivity across the wider economy through faster hiring and retention. But the gains go deeper than economics: visibility of skills drives inclusion. It means every learner, whatever their route, can have their capabilities recognised.

    Keeping up with global trends

    The use of digital skills wallets and LERs is accelerating worldwide. In the UK, the idea of a skills passport is not new – in 2022, the Council of Skills Advisers, chaired by David Blunkett, proposed a Learning and Skills Passport, a modular, assessment-based record built over a lifetime, linked to Individual Learning Accounts. More recently, under the industrial strategy, government confirmed that Skills England will work with industry to develop such passports.

    Across Europe, the EU’s Digital Identity Wallet is being piloted in several member states ahead of full rollout in 2026. It will let citizens store and share verified digital credentials, from qualifications to identity documents, through a secure mobile app integrated with Europass. The system aims to make skills and qualifications transferable across borders, supporting the EU’s vision for a flexible, skills-based digital economy.

    The Digital Badging Commission is calling for an interoperable skills wallet that begins as an evolution of the Department for Education’s digital Education Record (which will be rolled out to all school pupils from August 2026). It will initially hold GCSEs for school leavers but could expand to become a lifelong, portable record – potentially linking to whatever comes out of emerging plans for a national digital ID.

    Here the white paper is welcome but incomplete. It describes an “education record app” focused on qualifications and support information, yet it does not set out how essential, non-accredited learning (workplace skills, volunteering, micro-credentials) will be imported. To avoid a two-tier system, the government should seize the chance to ensure one integrated wallet – rather than a separate “skills app” – so a person’s full skillset is represented, not just formal assessments. Not only that, but individuals should be able to choose what they share, and with whom.

    This directly complements the white paper’s ambition for a unified skills and qualifications framework, ensuring that learning follows the individual, not the institution, across life and work.

    Crucially, it must adopt open standards, so that every education provider can issue records that align with it and that can be exported into a shared national wallet or into interoperable proprietary ones. Degree transcripts from universities would no longer be static PDFs but living records conforming to the same technical rules as other credentials. A learner could move a verified transcript directly into a skills and qualifications wallet, combine it with badges from professional training or volunteering, and share it securely with employers anywhere in the UK.

    That interoperability matters. Without it, the Education Record risks excluding lifelong learning altogether. With it, we can create a single, trusted architecture connecting higher education, workplace learning and civic participation. For universities, it means the qualifications they issue remain visible and valuable in a joined-up system.

    What higher education stands to gain

    There’s a strategic choice here. Universities can either wait until government or private platforms dictate the standards or help design them now.

    By engaging early, HEIs stand to gain on three fronts:

    • Reputation and competitiveness: being seen as innovators in trusted digital credentials strengthens their global profile and their appeal to learners and employers seeking transparent, skills-focused education;
    • Learner outcomes: meaningfully capturing students’ broader skills, placements and co-curricular learning improves employability and lifelong learning pathways – taking the former Higher Education Achievement Record to its natural digital conclusion, and making lifelong learning tangible, rather than rhetorical, to students; and
    • Smoother integration: early involvement with existing systems (VLEs and student records) reduces future compliance costs and avoids disruptive retrofitting when standards are mandated.

    The post-16 white paper’s call for a coherent digital skills framework reinforces this opportunity for universities to lead, not follow, in shaping the standards and technology that will define post-16 learning.

    The obvious concerns are trust, quality and cost. Without consistent quality assurance, digital credentials risk being untrusted markers of skill. That’s why the Commission also calls for a national registry for digital credential quality assurance – a registry that defines standards, metadata requirements and approved issuers.

    Quality is not the enemy of flexibility in this case; it is the enabler of trust. If universities lead in shaping these standards, they can ensure rigour and learner protection are built in from the start.

    Adopting open standards in education and digital skills systems will not necessarily be straightforward, particularly in environments where legacy systems have been modified incrementally over years. For many institutions, both large and small, modifying student information systems and integrating open standards to bring them into line will require significant overhaul of systems, as well as underpinning investment in staff training.

    However, the implementation of the Lifelong Learning Entitlement (LLE) is requiring all institutions to explore the extent to which their student record systems are fit for purpose, and this represents a real opportunity to think broadly about what systems will be required in the future.

    A call to lead, not follow

    The RSA once helped invent the modern exam system. Today we need the same leap of imagination for the digital age. If we want an inclusive, high-trust and high-skill economy, recognition must catch up with reality.

    Universities are uniquely placed to lead this transition, rooted in evidence, trusted by learners and central to the national conversation about growth. The question is not whether digital credentials will become part of our landscape, but who will set the standards and values that shape them.

    By engaging with open standards for degree transcripts and flexing VLEs to deliver digital badges, higher education can ensure that the national digital skills wallet reflects academic quality, learner autonomy and social purpose. In the wake of the government’s post-16 white paper – and with clarity now needed on integrating non-accredited learning – the timing could not be better. It’s an opportunity to turn invisible learning into visible value.

    You can download the Digital Badging Commission’s final report, From Skills to Growth: A Plan for Digital Badging in the UK here.

  • What might lower response rates mean for Graduate Outcomes data?

    The key goal of any administered national survey is for it to be representative.

    That is, the objective is to gather data from a section of the population of interest in a country (a sample), which then enables the production of statistics that accurately reflect the picture among that population. If this is not the case, the statistic from the sample is said to be inaccurate or biased.

    A consistent pattern that has emerged both nationally and internationally in recent decades has been the declining levels of participation in surveys. In the UK, this trend has become particularly evident since the Covid-19 pandemic, leading to concerns regarding the accuracy of statistics reported from a sample.

    Much of the focus in the media has been on the falling response rates to the Labour Force Survey and the consequences of this on the ability to publish key economic statistics (hence their temporary suspension). Furthermore, as the recent Office for Statistics Regulation report on the UK statistical system has illustrated, many of our national surveys are experiencing similar issues in relation to response rates.

    Relative to other collections, the Graduate Outcomes survey continues to achieve a high response rate. Among the UK-domiciled population, the response rate was 47 per cent for the 2022-23 cohort (once partial responses are excluded). However, this is six percentage points lower than what we saw in 2018-19.

    We recognise the importance to our users of being able to produce statistics at sub-group level and thus the need for high response rates. For example, the data may be used to support equality of opportunity monitoring and regulatory work, and to understand course outcomes to inform student choice.

    So, HESA has been exploring ways in which we can improve response rates, such as through strategies to boost online engagement and offering guidance on how the sector can support us in meeting this aim by, for example, outlining best practice in relation to maintaining contact details for graduates.

    We also need, on behalf of everyone who uses Graduate Outcomes data, to think about the potential impact of an ongoing pattern of declining response rates on the accuracy of key survey statistics.

    Setting the context

    To understand why we might see inaccurate estimates in Graduate Outcomes, it’s helpful to take a broader view of survey collection processes.

    It will often be the case that a small proportion of the population will be selected to take part in a survey. For instance, in the Labour Force Survey, the inclusion of residents north of the Caledonian Canal in the sample to be surveyed is based on a telephone directory. This means, of course, that those not in the directory will not form part of the sample. If these individuals have very different labour market outcomes to those that do sit in the directory, their exclusion could mean that estimates from the sample do not accurately reflect the wider population. They would therefore be inaccurate or biased. However, this cause of bias cannot arise in Graduate Outcomes, which is sent to nearly all those who qualify in a particular year.

    Where the Labour Force Survey and Graduate Outcomes are similar is that submitting answers to the questionnaire is optional. So, if the activities in the labour market of those who do choose to take part are distinct from those who do not respond, there is again a risk of the final survey estimates not accurately representing the situation within the wider population.

    Simply increasing response rates will not necessarily reduce the extent of inaccuracy or bias that emerges. For instance, a survey could achieve a response rate of 80 per cent, but if it does not capture any unemployed individuals (even when it is well known that there are unemployed people in the population), the labour market statistics will be less representative than a sample based on a 40 per cent response rate that captures those in and out of work. Indeed, the academic literature also highlights that there is no clear association between response rates and bias.
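
    A toy simulation illustrates this point: an 80 per cent response rate that captures no unemployed graduates produces a far worse estimate than a 40 per cent response rate drawn at random. This is an invented, self-contained sketch, not Graduate Outcomes data or methodology:

    ```python
    import random

    random.seed(42)

    # Hypothetical cohort: 10,000 graduates, 10% of whom are unemployed.
    population = [{"unemployed": i < 1_000} for i in range(10_000)]
    employed = [p for p in population if not p["unemployed"]]

    def rate(sample):
        """Share of a sample that is unemployed."""
        return sum(p["unemployed"] for p in sample) / len(sample)

    # Scenario A: 80% response rate, but no unemployed graduate responds.
    sample_a = random.sample(employed, 8_000)
    # Scenario B: 40% response rate, respondents drawn uniformly at random.
    sample_b = random.sample(population, 4_000)

    print(f"true unemployment rate:    {rate(population):.3f}")  # 0.100
    print(f"scenario A (80% response): {rate(sample_a):.3f}")    # 0.000, badly biased
    print(f"scenario B (40% response): {rate(sample_b):.3f}")    # ~0.100, close to truth
    ```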

    It was the potential for bias to arise from non-response that prompted us to commission the Institute for Social and Economic Research back in 2021 to examine whether weighting needed to be applied. Their approach to this was as follows. Firstly, it was recognised that for any given cohort, it is possible that the final sample composition could have been different had the survey been run again (holding all else fixed). The sole cause of this would be a change in the group of graduates who choose not to respond. As Graduate Outcomes invites almost all qualifiers to participate, this variation cannot be due to the sample randomly chosen to be surveyed being different from the outset if the process were to be repeated – as might be the case in other survey collections.

    The consequence of this is that we need to be aware that a repetition of the collection process for any given cohort could lead to different statistics being generated. Prior to weighting, the researchers therefore created intervals – including at provider level – for the key survey estimate (the proportion in highly skilled employment and/or further study) which were highly likely to contain the true (but unknown) value among the wider population. They then evaluated whether the weighted estimates sat within these intervals, treating an estimate that fell inside its interval as consistent with zero bias. This was what they found in the majority of cases, leading them to state that there was no evidence of substantial non-response bias in Graduate Outcomes.

    What would be the impact of lower response rates on statistics from Graduate Outcomes?

    We are not the only agency running a survey that has examined this question. For instance, the Scottish Crime and Justice Survey (SCJS) has historically had a target response rate of 68 per cent (in Graduate Outcomes, our target has been to reach a response rate of 60 per cent for UK-domiciled individuals). In SCJS, this goal was never achieved, leading to a piece of research being conducted to explore what would happen if lower response rates were accepted.

    SCJS relies on face-to-face interviews, with a certain fraction of the non-responding sample being reissued to different interviewers in the latter stages of the collection process to boost response rates. For their analysis, they looked at how estimates would change had they not reissued the survey (which tended to increase response rates by around 8-9 percentage points). They found that choosing not to reissue the survey would not make any material difference to key survey statistics.

    Graduate Outcomes data is collected across four waves from December to November, with each collection period covering approximately 90 days. During this time, individuals have the option to respond either online or by telephone. Using the 2022-23 collection, we generated samples that would lead to response rates of 45 per cent, 40 per cent and 35 per cent among the UK-domiciled population by assuming the survey period was shorter than 90 days. Similar to the methodology for SCJS therefore, we looked at what would have happened to our estimates had we altered the later stages of the collection process.

    From this point, our methodology was similar to that deployed by the Institute for Social and Economic Research. For the full sample we achieved (i.e. based on a response rate of 47 per cent), we began by generating intervals at provider level for the proportion in highly skilled employment and/or further study. We then examined whether the statistic observed at a response rate of 45 per cent, 40 per cent and 35 per cent sat within this interval. If it did, we concluded that there was no material difference in the estimates.
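
    As a rough illustration of this style of check, here is a minimal sketch using a normal-approximation interval with a finite population correction; the figures are invented, and the production methodology will differ in detail:

    ```python
    import math

    def interval(p_hat: float, n: int, N: int, z: float = 1.96) -> tuple[float, float]:
        """Normal-approximation interval for a proportion, with a finite
        population correction (nearly the whole cohort is invited to respond)."""
        fpc = (N - n) / (N - 1)
        se = math.sqrt(p_hat * (1 - p_hat) / n * fpc)
        return p_hat - z * se, p_hat + z * se

    # Hypothetical provider: 2,000 qualifiers; 47% respond (n = 940); 71% of
    # respondents are in highly skilled employment and/or further study.
    low, high = interval(p_hat=0.71, n=940, N=2_000)

    # The same statistic recomputed from a simulated 35% response rate.
    p_reduced = 0.69
    print(f"interval: ({low:.3f}, {high:.3f})")                    # ~ (0.689, 0.731)
    print(f"35% estimate consistent: {low <= p_reduced <= high}")  # True
    ```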

    Among the 271 providers in our dataset, we found that, at a 45 per cent response rate, only one provider had an estimate that fell outside the intervals created based on the full sample. This figure rose to 10 (encompassing 4 per cent of providers) at a 40 per cent response rate and 25 (representing 9 per cent of providers) at a 35 per cent response rate, though there was no particular pattern to the types of providers that emerged (aside from them generally being large establishments).

    What does this mean for Graduate Outcomes users?

    Those who work with Graduate Outcomes data need to understand the potential impact of a continuing trend of lower response rates. While users can be assured that the survey team at HESA are still working hard to achieve high response rates, the key take-away message from our study is that a lower response rate to the Graduate Outcomes survey is unlikely to lead to a material change in the estimates for the proportion in highly skilled employment and/or further study among the bulk of providers.

    The full insight and associated charts can be viewed on the HESA website:
    What impact might lower response rates have had on the latest Graduate Outcomes statistics?

    Read HESA’s latest research releases. If you would like to be kept updated on future publications, please sign up to our mailing list.

  • Trump Partially Funds SNAP, Colleges Scramble

    In the last week, campuses scrambled to shore up resources as 42 million Americans, including over a million college students, prepared to lose federal assistance to buy food. Payments for the Supplemental Nutrition Assistance Program, or SNAP, didn’t go out on the first of the month as they normally would amid the ongoing government shutdown.

    Now the Trump administration plans to dole out some of the benefits this month—but not all—in response to two federal court orders.

    In court filings Monday, the Trump administration agreed to expend emergency reserves to issue partial benefits this month, but also said the funds will only cover half of eligible households’ current benefits. And for at least some states, payments could take months to come through because of bureaucratic hurdles.

    Erika Roberson, senior policy associate at the Institute for College Access and Success, said she worries students who rely on SNAP will still get less food than they need.

    “Some food is not nearly enough food—especially when students are left to decide between finding their next meal and studying for an exam,” Roberson said in a statement to Inside Higher Ed. “Food should not be a luxury, but today, sadly, many college students are finding themselves in a position where that’s their reality.”

    And while partial benefits are better than none at all, some questions remain unanswered. It’s unclear whether all SNAP recipients will get half of their benefits or whether some will get less than others this month, said Mark Huelsman, director of policy and advocacy at the Hope Center for Student Basic Needs at Temple University. He also expects payments to be delayed.

    “I think that it still holds that campuses and food pantries and community organizations are going to be stretched pretty thin in the coming weeks,” Huelsman said, “even if the courts did the right thing here and stepped in and made sure that people’s benefits weren’t completely withheld.”

    Campuses ‘Plan for the Worst’

    Colleges and universities across the country have been furiously stocking up their campus pantries and expanding on-campus food programs in preparation for a pause in SNAP.

    Southeast Community College in Nebraska typically runs a food drive in November for the food pantries on its three campuses. But this year, the college started its drive a month early, predicting a surge of students in need. Already, the Lincoln campus’s pantry went from serving 49 students two years ago to 505 students this September, said Jennifer Snyder, communications specialist at Southeast Community College. That number is only expected to grow. The college also plans to run a fundraising campaign for its emergency scholarship fund in case more students need aid than usual.

    Ramping up these supports comes with challenges, Snyder said. Campus pantries used to be able to stock up by buying items at a low price from local food banks, but food banks are holding on to more of their goods as they also prepare for increases in demand. As campus pantries become harder to fill, Snyder worries staff members will have to make difficult decisions about how much food students can take.

    “The need is there, and the demand is there, but the supply just keeps dwindling,” Snyder said. “So, how do you make it even? How do you make it fair for everybody so that everybody has access?”

    Snyder said the Trump administration’s promise to partially fund SNAP this month hasn’t changed the college’s plans.

    “If it’s partial funding, that’s a benefit,” she said. But “you just don’t know when it’s going to be taken away, so we should plan for the worst.”

    Keith Curry, president of Compton College in Los Angeles, also sprang into action when he realized his students’ SNAP benefits were at risk.

    The college already offers students one free meal per day through a partnership with the nonprofit Everytable. Starting Wednesday, the college is upping the number to two free meals daily for students participating in CalFresh, the state’s SNAP program, and CalWORKs, a state benefit program for low-income families. CalWORKs students will also get $50 in grocery vouchers per week, and students in either program get an extra $20 in farmers market vouchers per week.

    Compton College also has a data-sharing agreement with the Los Angeles County Department of Public Social Services that helps the college identify students who are eligible for CalFresh and CalWORKs to offer them extra supports, if students sign a waiver allowing it. The college plans to lean on that partnership to verify more students participating in these programs who are now eligible for Compton College’s new supports. The college and Everytable are splitting the costs of the additional free meals, and the college plans to reassess the political situation every Friday to determine whether the extra measures are still needed.

    “We’re moving forward, because we don’t know what the impact will be to our students,” Curry said. “We don’t know how much they will actually receive. And our students need us more now than ever before. People are waiting for their benefits, and they’ve got to figure it out. Students are in a precarious position where they already have other needs.”

    The Foundation for California Community Colleges expects more than 275,000 students in the system will be affected by SNAP payment delays, according to an emergency fundraising campaign launched Monday.

    Grant Tingley, 41, is one of those students. He’s a student at Cypress College and an ambassador for the foundation whose job is to spread information about student food and housing resources. He’s also a SNAP recipient himself. In preparation for SNAP’s lapse, he’s been working with community organizations and other students to create a database of local food pantries and is pushing his campus food pantry to expand its hours.

    Tingley emphasized that hunger makes it harder for the most vulnerable students to focus on their schoolwork. He’s also a student worker at Rising Scholars, a support program for formerly incarcerated students, students with incarcerated family members or students recovering from substance use, like himself. He fears these students in particular are at risk of losing academic momentum.

    “They’re a group of people that have been beaten down repeatedly, time after time, and sometimes a small roadblock can really be a huge impediment for them going forward and continuing on their path,” he said. “Every little roadblock that we put in front of these students is almost make or break.”

    Huelsman, of the Hope Center, encouraged colleges and universities to keep pushing forward plans to bolster student food supports and emergency aid as students divert funds they use for housing and other necessities to groceries. The Hope Center also put out a guide to help colleges navigate how to support students through disrupted SNAP benefits.

    Even with partial benefits flowing, “every contingency plan and every preparation that institutions were making to help students weather this is still live,” he said. “Students are going to still feel a pretty severe disruption. And there’s just general confusion about what’s next.”

  • Gen Z’s Career Apocalypse Just Got Worse (Vincent Chan)


    The job market is the worst it’s been in over a decade, especially for the younger generation. Gen Z’s unemployment rate is nearly double the national average, and nearly 60% of recent college grads are still looking for their first job, compared to just 25% of recent college graduates in previous generations. But is this all Gen Z’s fault, or is there something else going on?

  • Why Even Harvard’s Smartest Graduates Can’t Get a Job Now (Economy Media)


    Generation Z faces a challenging labor market as unemployment among recent graduates reached 8.6% in June 2025. Entry-level jobs often demand two to three years of experience, creating a catch-22 for young workers. Stagnant starting salaries, rising living costs, and student debt averaging $33,500 per borrower add economic pressure. Companies prioritize retaining staff, while tariffs, inflation, and hiring freezes limit new opportunities. Gig work and delayed financial independence are common, with only 29% of Gen Z workers feeling engaged. Long application processes, reduced internships, and intense competition further hinder career entry, creating widespread professional anxiety and underemployment.
