Tag: Matters

  • WEEKEND READING: The one strategic role almost every university underestimates – and why it matters now more than ever

    This blog was kindly authored by Caroline Dunne, Leadership Coach, Change Mentor and former Chief of Staff.  

    For many Vice-Chancellors, the challenge is one of bandwidth. Leading a university today is equivalent to running a major regional employer – complex multi-campus operations, often turning over hundreds of millions of pounds, under intensifying public and political scrutiny. In this environment, strategic support is not a luxury; it is a prerequisite for strong, steady leadership that can hold the line between urgent pressure and long-term ambition.

    Within this context, one critical role remains under-recognised in much of the sector: the Chief of Staff.

    Drawing on insights from interviews conducted in the first quarter of this academic year with Chiefs of Staff and senior Higher Education leaders across the UK, this piece explores the strategic value of the role and why, in a period of profound turbulence, now could be the right time to put more “Chief” into the Chief of Staff.

    An untapped strategic asset

    Outside higher education, the Chief of Staff is a well-understood part of modern executive infrastructure: a senior adviser who expands the horizon of the chief executive, drives alignment, absorbs complexity and enables organisational agility.

    Inside higher education, the role is far more variable. In some institutions, the role is positioned as a strategic partner to the Vice-Chancellor; in others, it is mistaken for an ‘executive assistant-plus’ or folded into a different portfolio. Reporting lines, authority and remit differ widely, sometimes limiting the role’s ability to deliver its full strategic value.

    What emerged consistently from my interviews is this: the absence of a portfolio is the Chief of Staff’s greatest strategic advantage. It enables the role to traverse boundaries, ‘keep things moving in the grey areas’ and view institutional issues through an enterprise lens rather than a single-portfolio perspective.

    As one interviewee described it, not having a portfolio makes you:

    A free agent with an aerial view.

    Greater understanding of this untapped role is overdue. Paradoxically – and perhaps counterintuitively in a resource-constrained sector – it is precisely in this context that a well-positioned Chief of Staff becomes most critical to institutional success.

    Five modes of strategic influence

    In a sector facing systemic pressures, where, as one respondent put it, “driving change and transformation… is like pushing a boulder uphill”, the Chief of Staff plays an important catalytic role – shaping thinking, absorbing complexity and helping the organisation respond with coherence rather than fragmentation.

    I conducted 11 interviews, which revealed five modes of strategic influence that a Chief of Staff brings to university leadership:

    Sense-making: turning complexity into coherence.

    Not being tied to a portfolio gives the Chief of Staff a rare vantage point. They see the connections, gaps and risks that others – focused on their own areas – may miss.

    A seat at the top table, even without formal membership, brings influence through insight rather than authority. Chiefs of Staff challenge assumptions, sharpen strategic issues and help Vice-Chancellors translate vision into coordinated action.

    One interviewee captured the essence of the role well:

    “We help make things happen, but we belong in the background.”

    Alignment and flow: moving decisions through the system.

    Universities are structurally complex, often siloed and prone to initiatives moving at different speeds in different directions. Chiefs of Staff surface dependencies, shepherd decisions through the right governance bodies, and ensure that decisions, conversations and projects maintain momentum.

    As one Chief of Staff noted:

    We make sure everyone is rowing in the same direction – even if they’re in separate boats.

    Trusted connectivity: the organisational glue

    Nearly every interviewee emphasised the relational character of the role. Chiefs of Staff build trust across formal and informal networks, read the room, join dots, create spaces for candid conversations and offer a safe space to rehearse potentially difficult issues.

    Much of their impact is intentionally invisible. As one Chief of Staff reflected:

    The most significant unseen impact is behind-the-scenes relationship building.

    Another colleague added:

    Real mastery is knowing when to be visible and when to be invisible… knowing how to master ego.

    Influence in universities is exercised as much between meetings as it is within them.

    Strategic counsel: a second pair of eyes

    Vice-Chancellors face relentless external demands. Chiefs of Staff help maintain strategic momentum by offering:

    • operational realism
    • political insight
    • institutional memory
    • horizon scanning
    • a safe environment to test ideas

    Several described themselves as the “second pair of eyes” – seeing risks early and raising issues before they land.

    We clear barriers, trial new approaches, and give leaders the space to act confidently without being swamped by operational detail – enabling principled, well-understood risks.

    Steadying influence: calm in a volatile environment


    With no portfolio interests and a broad institutional view, Chiefs of Staff help manage tension within senior teams, support leadership transitions and create calm judgement in moments of pressure.

    As one interviewee said:

    A Chief of Staff can help calm the waters – up and down and sideways.

    Another added:

    When an institution is facing uncertainty, you need someone with no skin in the game – someone invested in the success of the collective.

    “A Chief of Staff takes it to the finish line – but you’re nowhere near the ribbon.”

    The point is clear: the role is not about visibility. It is about capacity, coherence, relationships, pace and judgement.

    In a sector where senior leaders are stretched, where decisions carry political and human consequences, and where the pace of change is only accelerating, the question for institutions is no longer whether to invest in a Chief of Staff – but how to position the role for maximum effect:

    • reporting lines that enable influence
    • clarity of remit
    • proximity to decision-making
    • and a mandate that embraces both people and strategy

    As the higher education sector faces continued uncertainty, one thing is clear: well-positioned Chiefs of Staff are not a luxury. They are a source of resilience, coherence and leadership capacity – precisely when the sector needs them most.

    In developing this piece, I am deeply grateful to the colleagues who generously contributed their insights, including:

    Dr Giles Carden, Chief Strategy Officer and Chief of Staff, University of Southampton

    Dr Clare Goudy, Chief of Staff, Office of the President and Provost, UCL

    Thomas Hay, Head of Vice-Chancellor’s Office, Cardiff University

    Jhumar Johnson, former Chief of Staff to the former Vice-Chancellor at the Open University

    Dr Chris Marshall, Chief of Staff and Head of the Vice-Chancellor’s Office, University of Wales Trinity Saint David

    Mark Senior, Chief of Staff (Vice-Chancellor’s Office), University of Birmingham

    Rachel Stone, Head of Governance and Vice-Chancellor’s Office, University of Roehampton 

    Luke Taylor, Chief of Staff to the President & Vice-Chancellor, University of Manchester

    Becca Varley, Chief of Staff, Vice-Chancellor’s Office, Sheffield Hallam University


  • Teaching in the age of generative AI: why strategy matters more than tools

    This blog was kindly authored by Wioletta Nawrot, Associate Professor and Teaching & Learning Lead at ESCP Business School, London Campus.

    Generative AI has entered higher education faster than most institutions can respond. The question is no longer whether students and staff will use it, but whether universities can ensure it strengthens learning rather than weakens it. Used well, AI can support personalised feedback, stimulate creativity, and free academic time for deeper dialogue. Used poorly, it can erode critical thinking, distort assessment, and undermine trust.

    The difference lies not in the tools themselves but in how institutions guide their use through pedagogy, governance, and culture.

    AI is a cultural and pedagogical shift, not a software upgrade

    Across higher education, early responses to AI have often focused on tools. Yet treating AI as a bolt-on risks missing the real transformation: a shift in how academic communities think, learn, and make judgements.

    Some universities began with communities of practice rather than software procurement. At ESCP Business School, stakeholders, including staff and students, were invited to experiment with AI in teaching, assessment, and student support. These experiences demonstrated that experimentation is essential, but only when it contributes to a coherent framework of shared principles and staff development.

    Three lessons have emerged from these rollouts:

    • Staff report using AI to draft feedback or generate case study variations, but final decisions and marking remain human.
    • Students learn more when they critique AI, not copy it. Exercises where students compare AI responses to academic sources or highlight errors can strengthen critical thinking.
    • Governance matters more than enthusiasm. Clarity around data privacy, authorship, assessment and acceptable use is essential to protect trust.

    Assessment: the hardest and most urgent area of reform

    Once students can generate fluent essays or code in seconds, traditional take-home assignments are no longer reliable indicators of learning. At ESCP we have responded by: 

    • Introducing oral assessments, in-class writing, and step-by-step submissions to verify individual understanding.
    • Asking students to reference class materials and discussions, or unique datasets that AI tools cannot access.
    • Updating assessment rubrics to prioritise analytical depth, originality, transparency of process, and intellectual engagement.

    Students should be encouraged to state whether AI was used, how it contributed, and where its outputs were adapted or rejected. This mirrors professional practice by acknowledging assistance without outsourcing judgement. It also shifts universities from policing to encouraging: from detecting misconduct to teaching responsible use.

    AI literacy and academic inequality

    AI does not benefit all students equally. Those with strong subject knowledge are better able to question AI’s inaccuracies; others may accept outputs uncritically. 

    Generic workshops alone are insufficient. AI literacy must be embedded within disciplines, for example, in law through case analysis; in business via ethical decision-making; and in science through data validation. Students can be taught not just how to use AI, but how to test it, challenge it, and cite it appropriately.

    Staff development is equally important. Not all academics feel confident incorporating AI into feedback, supervision or assessments. Models such as AI champions, peer-led workshops, and campus coordinators can increase confidence and avoid digital divides between departments.

    Policy implications for UK higher education

    If AI adoption remains fragmented, the UK’s higher education sector risks inconsistency, inequity, and reputational damage. A strategic approach is needed at an institutional and a national level. 

    Universities should define the educational purpose of AI before adopting tools, and consider reforming assessments to remain robust. Structured professional development, opportunities for peer exchange, and open dialogue with students about what constitutes legitimate and responsible use will also support the effective integration of AI into the sector.

    However, it’s not only institutions that need to take action. Policymakers and sector bodies should develop shared reference points for transparency and academic integrity. As a nation, we must invest in research into AI’s impact on learning outcomes and ensure quality frameworks reflect AI’s role in higher education processes, such as assessment and skills development.

    The European Union Artificial Intelligence Act (Regulation (EU) 2024/1689) sets a prescriptive model for compliance in education. The UK’s principles-based approach gives universities flexibility, but this comes with accountability. Without shared standards, the sector risks inconsistent practice and erosion of public trust. A reduction in employability may also follow if students are not taught how to use AI ethically while continuing to develop their critical thinking and analytical skills.

    Implications for the sector

    The experience of institutions like ESCP Business School shows that the quality of teaching with AI depends less on the technology itself than on the judgement and educational purpose guiding its use. 

    Generative AI is already an integral part of students’ academic lives; higher education must now decide how to shape that reality. Institutions that approach AI through strategy, integrity, and shared responsibility will not only protect learning, but renew it, strengthening the human dimension that gives teaching its meaning.


  • Not everyone goes home: why inclusive winter planning matters for student success

    This blog was kindly authored by Fiona Ellison and Kate Brown, Co-Directors, Unite Foundation.

    It is the third blog in HEPI’s series with The Unite Foundation on how to best support care experienced and estranged students. You can find the first blog here and the second blog here.

    Every December, universities flood inboxes with references to “going home” and “family time.” But thousands of students will not go home, because there is no home away from university to go to. For care experienced and estranged students, winter magnifies isolation, financial pressures and risk. This isn’t a welfare sidebar; it’s a retention issue, central to building a sense of belonging for this group of students.

    The Unite Foundation supports All of Us – the UK-wide community for all care experienced and estranged students – where students can find friends who get it and allies to organise with. We know first-hand from students how challenging this time of year can be. That’s why we’re re-issuing our winter guide with practical examples of how you can support care experienced and estranged students this winter.

    Why does it matter?

    The – perhaps forgotten – Office for Students’ Equality of Opportunity Risk Register (EORR) identified risks that disproportionately affect under‑represented groups – including care experienced and estranged students – across access, continuation, and progression. These include insufficient academic and personal support, mental health challenges, cost pressures, and lack of suitable accommodation – all of which were shown to be particularly acute for care experienced and estranged students, and all of which come even more to the fore as we approach the winter period.

    Three quick wins

    Your institution’s winter break is a stress test for belonging. When libraries close, halls empty and festive messaging assumes family gatherings, care experienced and estranged students can feel invisible. There are three foundational moves that every provider should implement immediately:

    1. Mind your language – Drop “going home for Christmas” and family‑centric messaging; use inclusive language (“winter break,” “happy holidays”) across all channels.
    2. Keep the place alive – Maintain open, warm spaces (library, SU, study hubs) with skeleton staff and programmed activities for residents; publish clear opening hours and what’s on.
    3. Proactively signpost specifics – Put support routes (welfare, counselling, emergency contacts, hardship funds) in email signatures, posters and social media – not buried webpages.

    Everyone’s role

    Supporting care experienced and estranged students during the winter break isn’t just a widening participation problem – activity should run through everyone within the university. Here are a few suggestions of what you could be doing:

    • Academics: Make proactive check-ins part of your routine, ask students where they’ll be during the break and whether they need support. Clearly publish extenuating circumstances routes and deadlines, and consider scheduling optional study drop-ins for those staying on campus.
    • Estates and Library teams: Keep central, warm spaces open on a rota so students have somewhere to study and socialise. Publish opening hours well in advance and ensure signage at entrances makes this information visible.
    • Residence Life: Maintain a skeleton support service throughout the holiday period and actively include care experienced and estranged students in any events planned for international students, making it clear they are welcome.
    • Security: Brief your team on the heightened risks these students may face, such as harassment or stalking, and incorporate welfare checks into your holiday protocols.
    • Students’ Union: Organise inclusive social events to reduce loneliness, advertise them relentlessly across channels, and partner with local food banks or community projects to provide essential support.
    • Welfare, Counselling, and Mental Health services: Keep services running, even at reduced capacity, and promote crisis lines and emergency contacts prominently so students know help is available.
    • Widening Participation and APP leads: Ensure term-time employment opportunities continue into the break, and name a real person as a designated contact for care experienced and estranged students.

    We need everyone to be proactive with their intentions – could you forward this to three people to encourage them to take action?

    Act now

    • If you’re a senior leader in your institution, how can you fund at least one visible, winter‑specific intervention? It could be a staffed warm hub, hardship vouchers, or a winter get-together.
    • Choose one immediate change and implement it this week. Whether it’s using inclusive language in your emails, ensuring a key space stays open, or adding support details to your signature, small actions make a big difference. Belonging is built through everyday signals of care.
    • Make sure students know about existing communities. Connect peers to All of Us, the community for care experienced and estranged students. Peer networks reduce isolation and create a sense of solidarity – especially during the winter break, when loneliness can peak.

    If you’re working in higher education and want to explore this work further – so you’re not making last-minute plans next year – why not join our HE Peer Professionals network, a member-curated, termly meeting of fellow professionals?

    When you’re thinking about going ‘home for Christmas’, have you thought about what you can do to support a home for care experienced and estranged students? Find out more about the wider work of the Unite Foundation and how our Blueprint framework can support your institution in building a safe and stable home for care experienced and estranged students, improving retention and attainment outcomes.


  • What external examiners do and why it matters

    Within the big visions presented in the Post-16 Education and Skills White Paper, a specific element of the academic standards landscape has found itself in the spotlight: external examining.

    Within a system predicated on the importance of academic freedom and academic judgement, where autonomous institutions develop their own curricula, external examining provides a crucial UK-wide, peer-led quality assurance mechanism supporting academic standards.

    It assures students that their work has been marked fairly and reassures international stakeholders that degrees from each UK nation have consistent academic standards.

    So when a Minister describes the system as “inward-focused” and questions its objectivity and consistency, as Jacqui Smith did at Wonkhe’s Festival of Higher Education, the sector needs to respond.

    What external examiners actually do

    External examiners typically review a sample of student work to check that marking criteria and internal moderation processes have been correctly applied and therefore that it has been graded appropriately. They comment on the design, rigour and academic level of assessments, provide external challenge to course teams, and identify good practice and innovation. External examiner reports are escalated through the relevant academic governance processes within a provider, forming a foundation for critical self-reflection on the institution’s maintenance of academic standards.

    Education policy may be devolved, but the systems and infrastructure that maintain academic standards of UK degrees are UK-wide: external examiners frequently examine institutions across UK nation borders. Indeed, the system is also embedded in the Republic of Ireland, with Irish providers drawing some of their external examiners from the UK pool, of which England is the largest source. The system is also intertwined with the work of PSRBs. External examiner reports are often used by PSRBs in their own assurance and accreditation processes, with some PSRBs appointing and managing external examiners directly.

    Tale as old as time

    Scepticism of the system is not new. Over the last quarter of a century, there have been periodic reviews in response to critiques. The most recent of these system reviews was undertaken by QAA in 2022 in partnership with UUK, Guild HE and what is now the Quality Council for UK Higher Education.

    The review compiled insight from a survey across 44 institutions and over 100 external examiners and senior quality professionals, roundtables with 170 individuals from across the sector, in addition to workshops with PSRBs and students.

    It surfaced the importance of the system in maintaining the reputation of UK degrees through impartial scrutiny and triangulation of practice, especially when we know that international audiences view it as an important extra layer of assurance.

    And institutions value the critical friendship provided, and the challenge to course teams which is not always achieved through other routes. External examiner feedback is consistently seen as important in enhancing teaching delivery and assessment practices, as well as upholding due process and internal consistency.

    But our review also revealed thorny problems. The roles can be ambiguously defined, leading to confusion about whether examiners are expected to audit processes, assess standards, or act as enhancement partners. Standards can be interpreted and applied inconsistently – and institutional approaches to examiner engagement, training, and reporting can differ widely. Examiners often reported inadequate support from their home institutions, poor remuneration, and limited recognition for their work.

    To respond to these problems, QAA developed external examining principles, and guidance on how they should be implemented. These principles represented a UK-wide sector agreement on the role and responsibilities of external examiners, bringing a consistent and refreshed understanding across the nations.

    Where do we go from here?

    Given its embedded, UK-wide nature, the Westminster government will need to tread carefully and collaboratively in any “review” of the system. A unilateral choice to ditch the system in England would have significant implications. It would impact upon the experience and currency of the pool of external examiner expertise available across the rest of the British Isles, and would undermine the network of general reciprocity on which the system (like that for the peer review of research) is based.

    It would also impact those PSRBs whose accreditation requirements rely on external examiner reports, and in some cases the ability to appoint their own external examiners to courses. To mitigate these risks, work should focus on further strengthening the system to address the English minister’s concerns. This should be sector-led.

    St Mary’s University Twickenham’s recent degree algorithm report demonstrated that sector-led initiatives into these topics do lead to changes within institutional practice; their decision to review their algorithm practice in 2021 was in response to QAA’s work on the appropriate design of degree algorithms, done in conjunction with UUK and GuildHE through the UK Quality Council.

    Using the same model, the Westminster government could work through the UK Quality Council to instigate a sector-led UK-wide review by QAA of how well the 2022 External Examining principles have been implemented across the sector since their creation. This would identify barriers in implementing the principles and surface where further work is needed. The barriers may be as simple as a lack of awareness, or might reveal more systemic challenges around an institution’s ability to encourage independent externals to follow a standardised approach.

    This review could result in updating the principles or proposing more radical solutions to address the system’s weaknesses. Crucially, this mechanism would incorporate the devolved governments and funder regulators, ensuring any changes are done with them, not despite them.

    An external red herring?

    The apparent link between external examining and concerns over grade inflation must also be interrogated. QAA’s 2022 research found that only a third of external examiners were asked by institutions to comment on degree algorithms, and indeed further conversations with quality professionals suggested that it was not perceived as appropriate for external examiners to pass comments on those algorithms. Either that needs to change, or the sector needs to demonstrate that scrutinising external examining in response to grade inflation concerns is like changing the curtains because the roof is leaking.

    If the core Government concern really is grade inflation, then perhaps another sector-led progress review against the UK sector’s 2019 Statement of Intent could be in order. This could look at the sector’s continued engagement with the guidance around producing degree outcome statements, the principles for effective degree algorithm design, and the outcome classification descriptors in the frameworks for higher education qualifications to address broader concerns around grade inflation in a way that is truly UK-wide.

    One nation’s government extricating itself from these interwoven, mutually reinforcing systems risks undermining the whole thing. It would be another enormous and eminently avoidable risk to the UK-ness of a sector that continues to be seen as one entity to anyone outside of the hallowed halls of domestic higher education policy.

    The best way therefore to preserve the continuation of a system that is deeply valued by institutions across the UK is for the sector to lead the critical self-reflection itself, identify its value and merits, and address its weaknesses, preventing a painful fracturing of the ways that academic standards are maintained across the UK.

    This will ensure that degrees awarded by institutions in each UK nation remain trusted and comparable. As a result, governments, students, and international stakeholders can continue to have confidence in the standards of UK degrees.


  • The R&D buckets are here to stay – what matters now is how they’re used

    The Budget and the introduction of DSIT’s new bucket framework mark a shift in how government wants to think and talk about research and innovation. With growth now central to the government’s agenda, it is a clear attempt to answer Treasury’s perennial question: what does the public get for its money?

    At the centre of this shift sits the idea of R&D “buckets”: a four-part categorisation of public R&D funding into curiosity-driven research, government priorities, innovation support and cross-cutting infrastructure.

    The logic behind the buckets is easy to understand. The UK system is complex, with budget lines stretching across a maze of research councils, departments, institutes, academies and government labs. Even seasoned insiders need a cup of coffee before attempting to decipher the charts on one of UKRI’s much-valued budget explainers.

    From the Treasury’s perspective, this lack of clarity is a barrier to demonstrating the value of government investment. DSIT’s response is the bucket model: a clearer way of presenting public investment that moves the conversation away from budget lines and towards outcomes that matter to citizens. If this helps build broader support for R&D across departments and with the public, as CaSE’s latest research suggests is needed, it could be hugely valuable.

    The outcomes challenge

    One consequence of an outcomes-driven model, however, is that different types of research will find it easier or harder to demonstrate their value. Basic and curiosity-driven research can be difficult to evidence through simple KPIs or narrow ROI measures.

    In contrast, some forms of applied R&D lend themselves more easily to straightforward metrics. The Higher Education Innovation Fund (HEIF) is a good example. It can demonstrate a return on investment of £14.80 to £1 in ways that are simple to communicate and easy for officials to interpret. In a system that places a premium on measurable outcomes, this kind of clarity is powerful.

    If outcomes become the dominant organising logic, there is a risk that bucket one, which covers curiosity-driven research, could appear on paper to be the least “investable” – especially under a future minister who is less supportive of blue-skies research. The danger is not deliberate neglect, but an unintended shift in perception, whereby discovery research is viewed as separate from, rather than essential to, mission-led or innovation-focused work.

    The challenge becomes even clearer when we look at quality-related research funding (QR). Few funding mechanisms are as versatile or as important to the health of the research ecosystem. QR supports discovery research, helps universities leverage private investment, underpins mission- and place-based activity, and fills the gaps left by research council and charity grants. It is the flexible connective tissue that keeps the system functioning.

    Trying to code QR neatly into a single bucket, in this case bucket one, doesn’t reflect reality. It may make the diagrams tidier, but it also risks narrowing Whitehall’s understanding of how QR actually works. Worse, it could make QR more vulnerable at fiscal events if bucket one is cast as the “future problem” bucket, the category that can be trimmed without immediately visible consequences.

    The trap of over-simplification

    That brings us to a wider point about the buckets themselves. The intention with buckets is to draw a much more explicit line between priorities, investment and impact. This is a reasonable goal. But the risk is that it invites interpretations that are too neat. Most research does not sit cleanly in any one category. The system is interdependent, porous and overlapping. Innovation depends on discovery research. Regional growth depends on long-term capability. And capability only exists if the UK continues to invest in talent, infrastructure and basic research.

    Rather than accepting a model that implies hard boundaries, it may be more helpful to embrace, and actively communicate, this interdependence. A Venn diagram might be a more honest reflection than three or four boxes with solid walls.

    The aim is not to relabel the buckets, but to strengthen the narrative around how the types of research we fund reinforce each other, rather than competing for space in a zero-sum system. This kind of framing could also help government understand why certain funding streams look costly on paper, but yield value across a wide range of outcomes over time.

    One argument is that by identifying curiosity-driven research as a distinct bucket, it will be harder for future governments to cut it without doing so publicly. There is some truth in this. Transparency can raise the political cost of reducing support for basic research. But the counterargument is also important. Once bucket one becomes a visible and discrete line of spend, it could also become more vulnerable during fiscal consolidations. Ministers looking to free up resources for missions or innovation-focused interventions may see it as an easier place to make adjustments, especially if the definition of “impact” narrows over time.

    Shovel ready

    This is why the narrative around the buckets matters as much as the buckets themselves. If they are understood as three separate spaces competing for limited resources, the system loses coherence. Discovery becomes something distant from growth, rather than the engine that drives it. Missions appear disconnected from the long-term capability required to achieve them. Innovation emerges as a standalone activity rather than as part of a pipeline that begins with public investment in fundamental science.

    The bucket framework is not going away. It will shape how government talks about R&D for years to come. This makes the next phase critical: there is an opportunity now to influence how the buckets are interpreted, how they are used in practice and how the narrative around them is constructed.

    If treated as rigid boundaries, the buckets risk weakening the case for long-term investment in capability. But if used as a way of telling a more coherent story about the interdependence of discovery, missions and innovation, they could help build stronger cross-government support for R&D. The challenge is to make sure the latter happens.


  • If free speech only matters when convenient, it isn’t free at all

    The recent controversies surrounding Charlie Kirk — and the extraordinary reaction that followed his campus appearances and commentary — offer a revealing window into the fragile state of free expression in contemporary America. 

    Two recent New York Times opinion pieces examining the backlash were right to highlight how quickly public discourse has hardened into a zero-sum contest in which speech itself becomes grounds for professional punishment, social ostracism, and institutional retaliation. But the deeper lesson is even more unsettling: Free speech is increasingly treated not as a constitutional principle, but as a conditional privilege — one that applies only when speech is politically comfortable.

    This concern is not confined to the Kirk episode alone.

    In recent essays and commentary in the Times, Steven Pinker and Greg Lukianoff have voiced parallel anxieties about the narrowing of permissible speech in American life. Pinker, writing in response to the wave of cancellations following Kirk’s assassination, argued that the public reaction revealed something larger than partisan outrage: It exposed a culture increasingly governed by moral intimidation rather than democratic confidence. He warned that Americans have begun to treat disagreement itself as a form of complicity, a dynamic that pressures institutions to distance themselves from speakers not because of what they say, but because of how others might react.

    In Pinker’s telling, this logic shrinks what he calls the “theater of ideas,” replacing open argument with reputational panic, association anxiety, and pre-emptive suppression. When leaders apologize not for their own actions but for the mere fact of conversation, he argued, they signal their inability to withstand the volatility of public outrage — a sign that our intellectual ecosystem is growing narrower, thinner, and more brittle.

    Lukianoff’s column makes a complementary point from a different angle. Drawing on years of work at FIRE, he noted how quickly both institutions and individuals abandon their stated commitments to free expression the moment those commitments become uncomfortable. The Kirk episode, he wrote, was simply the latest example of a pattern he has watched unfold across campuses for more than a decade: a willingness to tolerate speech only when it fits within prevailing ideological or cultural fashions. 

    Lukianoff emphasized that the most troubling aspect is not the criticism of Kirk — criticism is central to free speech — but the eagerness to impose professional penalties, public shaming, or formal censure on anyone associated with him. The principle collapses the instant it is tested. Taken together, Pinker and Lukianoff reveal with unusual clarity that America is drifting toward a model of free expression that survives only when it flatters majority sentiment — a vision entirely at odds with the core purpose of the First Amendment.


    This is not an argument about whether one agrees with Kirk’s public statements. Many do not. Nor is it a defense of every remark, posture, or provocation associated with his political brand. That is beside the point. A mature liberal democracy does not protect speech because it is agreeable. It protects speech precisely because it is controversial — because democracy requires open contestation, not the selective silencing of whatever unsettles the cultural majority.

    And yet, across universities, professional settings and online spaces, we have witnessed a familiar pattern repeat itself: organized efforts to deplatform, disrupt, shame, or punish those associated with political positions deemed unacceptable. Speakers are shouted down. Venues are pressured. Faculty and students who express dissenting views risk reputational harm or institutional discipline. Even civil engagement becomes suspect if it involves “the wrong people.”

    This reflex is often defended as moral clarity. In reality, it is institutional cowardice.

    There is a great irony here. The very individuals and institutions that loudly proclaim their commitment to diversity, inclusion, and pluralism often prove least capable of tolerating genuine intellectual diversity. They champion the language of openness even as they tighten the boundaries of permissible speech. What results is a shallow performance of tolerance that collapses the moment speech becomes genuinely uncomfortable.

    Free speech is not a decorative ideal meant for ceremonial brochures or abstract jurisprudence seminars. It is a living civic discipline, and it demands that we cultivate tolerance even — especially — when it offends our sensibilities. That discipline has historically been one of the United States’ most distinguishing features: the belief that robust public debate, rather than enforced consensus, is the engine of democratic resilience.

    But today’s culture increasingly treats emotional discomfort as a kind of injury, speech as a form of violence, and dissent as a moral failing. Within that framework, the logic of suppression becomes not only tempting but virtuous: If speech causes harm, then silencing it becomes an act of justice. Once adopted, that logic expands rapidly. Today it is Charlie Kirk. Tomorrow it will be someone else. The principle does not survive the politics.

    The Times essays were right to note how the fear of association now extends far beyond extremist rhetoric to include basic engagement. Students who meet with controversial speakers, professors who host debates, and institutions that tolerate ideological diversity all find themselves scrutinized. The mere act of conversation becomes dangerous territory. That should alarm anyone who values the university as a space for intellectual exploration rather than ideological enforcement.

    This is not merely a cultural concern. It is institutional. When administrators respond to pressure campaigns by canceling speakers, disciplining faculty, or issuing vague statements about “community harm,” they send a powerful message: Conformity is safer than inquiry. Over time, this breeds self-censorship. Students learn that advancement depends not on argumentation but on alignment. Faculty learn that silence is prudent. The public sphere narrows, not because debate has been resolved, but because people have learned to be afraid.


    History tells us where this road leads. Societies that abandon free expression do not become kinder or more just. They become brittle. They lose the capacity for correction. Without dissent, errors calcify into doctrine. Without debate, divisions deepen underground until they erupt elsewhere, often violently.

    A healthy society requires a different posture: one that refuses to reward political violence or celebrate rhetorical cruelty, but also refuses to treat speech as a crime. It is possible and necessary to maintain both moral standards and civic tolerance. We can condemn genuinely hateful language without constructing an environment where only preapproved opinions are allowed to exist.

    This distinction matters. There is a difference between criticism and coercion, between moral disagreement and institutional suppression. The first is essential to democratic life. The second corrodes it.

    The American tradition of free speech was never intended to be easy. It was built to withstand tension, disagreement, even anger. It requires a certain moral maturity — the ability to hear something one detests without immediately seeking to destroy the speaker. That maturity is thinning. And institutional leadership has not helped. Rather than modeling resilience and restraint, too many leaders respond to every controversy with ritualized apologies and performative distancing.

    This, in turn, reinforces a culture in which power flows not through argument but through outrage. The loudest voices do not persuade; they intimidate. The most extreme reactions set the rules. The center retreats.

    Defending free speech in this environment is not a partisan exercise; it is a civic one. Conservatives should care when progressive speech is suppressed. Progressives should care when conservative speech is silenced. And all citizens should recognize that the erosion of expressive freedom is rarely symmetrical or stable. It expands. It metastasizes. It eventually reaches those who once applauded it.

    Supporting the right to speak does not mean endorsing what is said. It means believing that a free society is strong enough to withstand unpopular ideas without resorting to coercion. It means valuing persuasion over prohibition. It means recognizing that democracy requires friction.

    Charlie Kirk may be a lightning rod, but the underlying issue is larger than any one figure. The question is whether we still believe in a public square robust enough to sustain disagreement. Whether our institutions still trust citizens to confront ideas rather than suppress them. Whether discomfort is something to be navigated or eliminated.

    If free speech only survives during moments of convenience, it’s not really free. It is permission masquerading as principle. And permission always has an expiration date.

    What this moment demands is not perfect harmony but civic courage: the willingness to say that speech should be protected even when we dislike the speaker, that debate should remain open even when it unsettles us, and that the strength of a liberal society lies not in silencing dissent but in enduring it.

    That endurance is not weakness. It is democracy.


  • Measuring What Matters: A Faculty Development System That Improves Teaching Quality – Faculty Focus

  • In “Rocky” Labor Market, Your College Major Matters

    Despite mounting public skepticism about the value of a college degree, the data is still clear: Over all, college graduates have much higher earning potential than their peers without a bachelor’s degree. But the limits of those boosted earnings are often decided by a student’s major.

    American workers with a four-year degree ages 25 to 54 earn a median annual salary of $81,000—70 percent more than their peers with a high school diploma alone, according to a new report that Georgetown University’s Center on Education and the Workforce published Thursday. However, the salary range for workers with a bachelor’s degree can span anywhere from $45,000 a year for graduates of education and public service to $141,000 for STEM majors.

    And even within those fields, salary levels have a big range. Humanities majors in the prime of their careers earn between $48,000 and $105,000 a year, with a median salary of $69,000. Meanwhile, business and communications majors earn between $58,000 and $129,000 a year, with a median salary of $86,000.

    “Choosing a major has long been one of the most consequential decisions that college students make—and this is particularly true now, when recent college graduates are facing an unusually rocky labor market,” said Catherine Morris, senior editor and writer at CEW and lead author of the report, “The Major Payoff: Evaluating Earnings and Employment Outcomes Across Bachelor’s Degrees.”

    “Students need to weigh their options carefully.”

    The report, which analyzed earnings and unemployment data collected by the U.S. Census Bureau’s American Community Survey from 2009 to 2023, also documented rising unemployment for recent college graduates. In 2008, recent graduates had lower unemployment rates relative to all workers (6.8 percent versus 9.8 percent). But that gap has narrowed over the past 15 years; since 2022, recent college graduates have faced higher levels of unemployment relative to all workers.

    Morris attributed rising unemployment for recent college graduates to a mix of factors, including increased layoffs in white-collar fields, the rise of artificial intelligence and general economic uncertainty. At the same time, climbing tuition prices and the student debt crisis have heightened consumer concern about a degree’s return on investment.

    “Over the past 15 years, there’s been more and more of a shift toward students wanting to get degrees in majors that they perceive as lucrative or high-paying,” said Morris, who noted that STEM degrees, especially computer science, have become increasingly popular. Meanwhile, the popularity of humanities degrees has declined.

    But just because a degree has higher earning potential doesn’t mean it’s immune to job instability. In 2022, 6.8 percent of recent graduates with computer science degrees were unemployed, while just 2.2 percent of education majors—who typically earn some of the lowest salaries—were unemployed.

    “The more specific the major, the more sensitive it is to sectoral shocks,” said Jeff Strohl, director of the center at Georgetown. “More general majors actually have a lot more flexibility in the labor market. I would expect to see some of the softer majors that start with higher unemployment than the STEM majors be a little more stable.”

    And earning a graduate degree can also substantially boost earnings for workers with a bachelor’s degree in a more general field, such as multidisciplinary studies, social sciences or education and public service. Meanwhile, the graduate earnings premium for more career-specific fields isn’t as high.

    “About 25 percent of bachelor of arts majors don’t by themselves have a positive return on investment,” Strohl said. “But we need to look at the graduate earnings premium, because many B.A. majors don’t stand by themselves.”

    Although salaries for college graduates are one metric that can help college students decide on a major, Morris said it shouldn’t be the only consideration.

    “Don’t just chase the money,” she said. “The job market can be very unpredictable. Students need to be aware of their own intrinsic interests and find ways to differentiate themselves.”


  • Why Area Studies Matters (opinion)

    Area studies, the interdisciplinary study of region-specific knowledge, is under threat in the United States. Some area studies programs are facing immediate dismantling by red-state legislatures. Others, at private universities or in blue states, are more likely to experience a slow decline through dozens of small cuts that may leave them untenable. While most area studies programs are small, their loss would ripple through a wide range of disciplines, impoverishing teaching, research and scholarship across the humanities and social sciences.

    Most contemporary area studies departments were developed and funded in part to meet perceived U.S. national security needs during the Cold War. Nonetheless, area studies programs have, from the outset, reached far beyond policy concerns. They should be saved, not (just) out of concern for the national interest, but because they are fundamental to our modern universities. Area studies have helped to pluralize our understanding of the drivers of history, the sources of literary greatness and the origins and uses of the sciences, enabling scholars to challenge narratives of “Western” normativity.

    As the second Trump administration has thrown federal support for area studies into question, some scholars have come to the field’s defense from the perspective of U.S. security and national interests. They have noted that cutting government funding for programs such as the Foreign Language Area Studies (FLAS) fellowships will linguistically and intellectually impoverish future cadres of policymakers. But in the present political landscape, in which the Trump administration has demonstrated little if any interest in maintaining the trappings of U.S. soft power, it seems unlikely that the federal government will restore funding for language education and the development of regionally specific knowledge. Their ability to contribute to U.S. soft power will not save area studies.

    The future of area studies lies beyond state security and policy interests and instead with the core mission of our universities. If we are to save area studies, we must admit—and celebrate—the fact that the benefits of area studies have never been just about U.S. national interests. Indeed, area studies have decisively shaped how scholarship and education are practiced on U.S. university campuses.

    Since the 1950s, area studies programs have quietly informed disciplinary practices across the humanities and social sciences, changing education even for students who never take courses offered by formal area studies departments. In part, this is because scholars educated through area studies programs teach in history, anthropology, political science, religious studies and a bevy of other programs that require a depth of linguistic and regional knowledge. These scholars introduce global, regional and non-Western knowledge to students at colleges and universities that may not host their own area studies programs, but that rely on the cultivation of regionally specific knowledge at institutions that have invested in and embraced the area studies model. Some of these scholars undertook area studies as their primary field of research. In other cases, including my own, they hold Ph.D.s in other disciplines but would not have been able to conduct their research without access to the language and regionally specific courses offered by area studies programs at their universities.

    The influence of area studies stretches beyond this immediate impact on scholars and their students. Area studies scholars have insisted that there is just as much to be learned within Middle Eastern, Latin American or sub-Saharan African literature, histories and cultures as there is in Western European or the modern North American Anglophone traditions. At their best, area studies have reminded us that none of these formations or knowledge traditions exist in isolation, that there are no “pure” or untouched civilizations and that ideas and practices have always circulated and shaped each other, whether violently or peacefully. Certainly, many scholars knew and studied these realities well before the advent of the contemporary area studies model. Nonetheless, the presence of area studies in many prominent U.S. universities from the 1950s onward enabled a quiet but certain reckoning with historical scholarly exclusions and helped to internationalize U.S. campus communities.

    Federal and state cuts and institutional austerity are now reshaping university departments and programs across many disciplines. But area studies programs are especially at risk in part because they are excluded from some calls for the defense of the humanities or liberal arts that take an older, pre–area studies view of our shared cultural and historical knowledge. Even more troublingly, the far right is eager to claim and weaponize the humanities for itself. Its vision of the humanities, and of the liberal arts more generally, is one that not only rejects area studies, but also seeks to undo critical approaches to European and Anglophone literature and history. The far right portrays the humanities in triumphalist civilizational terms, imagining a fallacious pure Western (white) tradition that justifies contemporary forms of dominance and exclusion.

    Scholars within the fields that have seen increased interest from the far right are fighting their own battles against these imagined, reactionary pasts. But those of us within area studies—and fields that have been enriched by area studies—also have our part to play. We must refuse to concede to narratives of human history, literature, culture and politics that write out the experiences and contributions of non-European, non-Anglophone or nonwhite individuals and communities.

    The most extreme current threat to area studies, like many threats to the humanities and social sciences more generally, comes from hostile red-state legislatures. I completed an area studies M.A. in central Eurasian studies at Indiana University, a program that hosts languages such as Mongolian, Kurdish and Uyghur, which are rarely if ever taught at other institutions in North America. That program, like many of Indiana’s other vaunted area studies degrees (and many other programs) is currently slated for suspension with “teach-out toward elimination.”

    Yet even institutions seemingly removed from such direct political pressure seem poised to reduce their engagement with area studies. I am now an assistant professor in South Asian languages and civilizations at the University of Chicago, a program that has produced renowned scholars of South Asia globally and offers languages ranging from Tibetan to Tamil. The university has proposed decreasing the number of departments within its Division of the Arts and Humanities and limiting offerings in language classes that do not regularly attract large numbers of students. These policies could result in significant cuts to relatively small area studies programs like my own. And none of these proposals are unique. Whether rapidly or slowly, universities across the country are walking back their commitments to area studies, especially the study of non-Western languages.

    There are actions that we, as area studies scholars, can take to ensure the longevity of our work. As we revel in the complexities of the regions we have chosen to study, we sometimes forget how unfamiliar they remain to many American undergraduate students. Unfamiliarity, however, should not mean inaccessibility. The Shahnameh or the Mahabharata may be less familiar to many of our students than The Iliad and The Odyssey, but there is no reason they should be less accessible. The study of modern sub-Saharan African histories or Southeast Asian languages is not intrinsically more esoteric than the study of modern North American histories or Western European languages. Our goal must be to welcome students into topics that seem unfamiliar and to share in their joy as what was once unfamiliar slowly becomes part of their system of knowledge.

    Likewise, one of the most significant challenges stemming from the Cold War foundations of area studies is that the discipline is often organized along a mid-20th century, U.S.-centric understanding of global political fault lines and cultural boundaries associated with nation-states. These boundaries, as many scholars have shown, do not always reflect how people experience and understand their own cultures and histories. Yet scholars in area studies have become increasingly adept at working beyond these boundaries. Many of us use the framework of area studies to challenge understandings of regional borders as natural, identifying forms of mobility and connectivity that upend assumptions built on the locations of modern lines on modern maps.

    Even as we make area studies more accessible and more reflective of transregional cultural worlds, area studies programs will never be moneymakers for U.S. universities. As the novelist Lydia Kiesling, a beneficiary of area studies and specifically of FLAS funding, noted in Time, “The market will never decide that Uzbek class is a worthwhile proposition, or that it is important for a K–12 teacher in a cash-strapped district to attend a free symposium on world history.” And so, in the absence of federal funding for these programs, any defense of area studies must ultimately come down to asking—begging!—our universities to look beyond the financial motives that seem to have overtaken their educational missions.

    Ultimately, area studies allows us to embrace, even revel in, cultural, social and linguistic particularity and specificity and, through understanding these differences, recognize our shared humanity. At their best, area studies programs help students and the public dismantle cultural hierarchies through knowledge of non-Western traditions that have depth and heterogeneity equal to that of their European and Anglophone counterparts. In our present moment, as a dizzying range of university programs are destroyed by right-wing legislatures or threatened by aggressive institutional austerity, it may seem futile to call for the preservation of this seemingly small corner of the U.S. intellectual universe. Yet in an era when governments, both in the U.S. and abroad, seem beholden to narrow and exclusionary nationalist interests, fields of study that center the pluralism within our shared global histories and cultures are needed in our universities more than ever.

    Amanda Lanzillo is an assistant professor in South Asian languages and civilizations at the University of Chicago.
