As the world of higher ed continues to evolve at lightning speed, many students are understandably feeling some pressure to keep up. And that’s having a significant impact on the way they’re operating day-to-day in and out of the classroom. Over the past few years, faculty have reported noticeable changes in student expectations, with 49% recently telling us that the need to adapt to those expectations is a top challenge.
So, what’s shifting, and how can faculty better adapt to meet their students where they are without going overboard? Let’s examine two examples of how needs and expectations are changing: AI use and deadline extension requests.
The line between responsible AI use and cheating is fuzzy for some students
Last year, 46% of those we surveyed in our annual Faces of Faculty report named combating cheating and plagiarism as a top challenge, down only slightly from 49% in 2023. And as AI becomes a bigger, more integral part of the higher ed experience, it’s growing increasingly difficult for many students to distinguish between responsible AI use and academic dishonesty.
Forty-two percent of faculty we surveyed say they see significant or severe ethical and legal risks associated with generative AI in education, with 82% of instructors expressing concern specifically about AI and academic integrity. While today’s students expect AI to play some kind of role in the learning process, many stand on shaky ground when it comes to applying it ethically in their coursework.
Many faculty have told us that they like to set clear expectations around AI use at the beginning of the semester or term, either verbally or within their syllabus. By doing so, they can provide students with clear-cut guidance on how they should approach their coursework.
Faculty are finding that the more they know about AI, the better they can safeguard assignments from potential overreliance. One educator from Missouri told us, “I am learning more about AI and AI detection this year, and am making quite a few adjustments to my assignments so they are more personal and reflective, rather than AI-tempting assignments.”
Anti-cheating software has also become a popular safeguard, with many instructors using online plagiarism detection tools like Turnitin or “locked down” browsers.
“Spending a lot more time and effort identifying and using reliable plagiarism detection software, especially AI detectors.” – Faculty member
Extension requests: pushing deadlines and boundaries
Another example of changing student needs is the growing expectation from students that their extension requests will be granted. But this has left many instructors feeling overwhelmed, not only by the number of requests to keep track of, but by a rising uncertainty over which requests are based on legitimate reasons. This may very well be a contributing factor for 35% of faculty who cited perceived dishonesty and lack of accountability from students as a top driver of dissatisfaction in 2024.
An adjunct professor from Virginia told us, “I leaned into adapting to students’ expectations, but this became somewhat unmanageable when teaching multiple courses. I am also concerned with setting a precedent for future students in my courses if current students share that accommodations are easily given.”
How instructors are responding
Despite the challenges that this shift presents, many instructors are jumping in to accommodate extension requests from students, offering both patience and a generally high level of understanding. Faculty acknowledge that today’s students have a lot to contend with — from financial stressors to academic and social pressures — and they’re prepared to flex to meet those challenges.
“I became more flexible. I get annoyed by professors my age who ignore the fact that today’s students are under ten times the pressure we were when we were undergraduates. Some of these students are carrying a full course load while working two jobs.” – Other professor role/lecturer/course instructor, Ontario
While they’re empathetic to students’ evolving needs, instructors are ready to set their own boundaries when necessary. One faculty member told us, “For the most part, I held firm in my deadlines. I did however increase the number of reminders I sent.”
Regardless of the approach, clear communication with students remains at the heart of how faculty are dealing with this particular shift. Another instructor said, “I look at the individual situation and adapt…I remind students to complete items early to avoid unexpected delays. If there is a technology issue, then I will extend if it is communicated timely.”
We’re happy to see our faculty skillfully weaving through these obstacles while remaining committed to adapting to new student expectations.
To get a full picture of what 1,355 surveyed U.S. and Canadian faculty had to say about changes in student expectations, read our 2024 Faces of Faculty report.
The UK needs a plan for growth and innovation – and an industrial strategy is a way of picking winners: choosing which sectors to prioritise for investment.
Today’s iteration (the fourth in recent times, with Theresa May’s government providing the previous one) chooses eight high-potential sectors to prioritise funding and skills interventions, with the overall intention of encouraging private investment over the long term.
Picking winners for the long term
The choices are the important bit – as the strategy itself notes:
Past UK industrial strategies have not lasted because they have either refused to make choices or have failed to back their choices up by reallocating resources and driving genuine behaviour change in both government and industry.
And there are clear commonalities between previous choices and the new ones. Successive governments have prioritised “clean growth”, data and technology, and health – based both on the potential for growth and the impact that investment could have.
What is different this time is the timescale on which the government is thinking – much of the spending discussed today is locked into the next five years of departmental spending via the spending review, and of course we have those infamous 10-year research and development plans in some areas: the Aerospace Technology Institute (linked to Cranfield), the National Quantum Computing Centre (at the Harwell STFC campus), the Laboratory of Molecular Biology (MRC supported at Cambridge), and the new DRIVE35 automotive programme are the first to be announced.
Government-funded innovation programmes will prioritise the IS-8, within a wider goal of focusing all research and development funding on long-term economic growth. This explicitly does not freeze out curiosity-driven research – but it is clear that there will be a focus on the other end of the innovation pipeline.
At a macro level UKRI will be pivoting financial support towards the IS-8 sectors – getting new objectives around innovation, commercialisation, and scale-up. If you are thinking that this sounds very Innovate UK, you would be right; the Catapult Network will also get tweaks to refocus.
The £500m Local Innovation Partnerships Fund is intended to generate a further £1bn of additional investment and £700m of value to local economies, and there are wider plans to get academia and industry working together: a massive expansion in supercomputer resources (the AI research resource, inevitably) and a new Missions Accelerator programme supported by £500m of funding. There’s also the Sovereign AI Unit within government (another £500m of industry investments in “frontier AI”). On direct university allocation we get the welcome news that the Higher Education Innovation Fund (HEIF) is here to stay.
There’s an impressively hefty chunk of plans for getting the most out of public sector data – specifically the way in which government (“administrative”) data can be used by research and industry. Nerds like me will have access to a wider range of data under a wider range of licences – the government will also get better at valuing data in order to maximise returns for the bits it does sell, and there will be ARDN-like approaches available to more businesses to access public data in a safe and controlled way (if parliamentary time allows, legislation will be brought forward) – plus £12m for data sharing infrastructure and £100m for the national data library.
By sector
The sector plans themselves have a slant towards technology adoption (yes, even the creative sector – “createch” is absolutely a thing). But there are plenty of examples throughout of specific funding to support university-based research, innovation, and bringing discoveries to market – alongside (as you’ll see from Michael’s piece) plenty on skills.
Clearly the focus varies between sectors. For example, there will be a specific UKRI professional and business services innovation programme; while digital and technologies work is more widely focused on the entirety of the UK’s research architecture: there we get promises of “significant” investment via multiple UKRI and ARIA programmes alongside a £240m focus on advanced communication technologies (ACT). The more research-focused sectors also get the ten-year infrastructure-style investments like the £1bn on AI research resources.
Somewhat surprisingly, clean energy is not one of the big research funding winners – there’s just £20m over seven years for the sustainable industrial futures programme (compare the £1bn energy programme in the last spending review). Because sustainability is also a mission, clean energy gets a share of the missions accelerator programme (£500m), but for such a research-intensive field that doesn’t feel like a lot.
The creative industries, on the other hand, get £100m via UKRI over the spending review period – there’s a specific creative industries research and development plan coming later this year, alongside (£500m) creative clusters, and further work on measuring the output of the sector. And “createch” (the increasingly technical underpinnings of the creative industries) is a priority too.
It’s also worth mentioning advanced manufacturing as a sector where business and industry are major funders. Here the government is committing “up to £4.3bn” for the sector, with £2.8bn of this going to research and development. Key priorities include work on SME technology adoption, and advanced automotive technologies – the focus is very much on commercialisation, and there is recognition that private finance needs to be a big part of this.
Choice cuts?
The IS-8 are broadly drawn – it is difficult to think of an academic research sector that doesn’t get a slice. But there will be a shaking out of sub-specialisms, and the fact that one of the big spenders (health and medicine) is currently lacking detail doesn’t help us understand how the profile of research within that area will shift during the spending review period.
Industrial policy has always been a means of picking winners – focusing necessarily limited investment on the places where it will drive the most benefit. The nearly flat settlement for UKRI in the spending review was encouraging, but it is starting to feel like new announcements like these need to be seen both as net benefits (for the lucky sectors) and as funding cuts (for the others).
In higher education institutions, we often speak of “developing talent,” “building capacity,” or “supporting our people.” But what do those phrases really mean when you’re a researcher navigating uncertainty, precarity, or a system that too often assumes resilience, but offers limited resources?
With the renewed focus of REF 2029 on people, culture and environment, and the momentum of the Concordat to Support the Career Development of Researchers, there’s a growing imperative to evidence not only what support is offered, but how it’s experienced.
That’s where I believe that coaching comes in – as a strategic, systemic tool for transforming research culture from the inside out.
At a time when UK higher education is facing significant financial pressures, widespread restructuring, and the real threat of job losses across institutions, it may seem counterintuitive to invest in individuals’ development. But it is precisely because of this instability that our commitment to people must be more visible and deliberate than ever. In moments of systemic strain, the values we choose to protect speak volumes. Coaching offers one way to show – through action, not just intention – that our researchers matter, that their growth is not optional, and that culture isn’t a casualty of crisis, but a lever for recovery.
By coaching, I mean a structured, confidential, and non-directive process that empowers individuals to reflect, identify goals and navigate challenges. Unlike mentoring, which often involves sharing advice or experience, coaching creates a thinking space led by the individual, where the coach supports them to surface their own insights, unpick the unspoken dynamics of academia, build confidence in their agency, and cultivate their personal narrative of progress.
Coaching is not just development – it’s disruption
We tend to associate coaching with senior leadership, performance management, or executive transition. But over the last seven years, I’ve championed coaching for researchers – especially early career researchers – as a means of shifting the developmental paradigm from “this is what you need to know” to “what do you need, and how can we co-create that space?”
When coaching is designed well – thoughtfully matched, intentionally scaffolded, and skilfully led – it becomes a quiet form of disruption. It gives researchers the confidence to think through difficult questions. And it models a research culture where vulnerability is not weakness but wisdom.
This is especially powerful for those who feel marginalised in academic environments – whether due to career stage, background, identity or circumstance. One early career researcher recently told me that coaching “helped me stop asking whether I belonged in academia and start asking how I could shape it. For the first time, I felt like I didn’t have to shrink myself to fit in.” That’s the kind of feedback you won’t find in most institutional KPIs – but it says a lot about the culture we’re building.
Why coaching belongs in your research strategy
Coaching still suffers from being seen as peripheral – a nice-to-have, often under-resourced and siloed from mainstream provision. Worse, it’s sometimes positioned as remedial, offered only when things go wrong.
As someone who assesses UK institutions for the European Commission-recognised HR Excellence in Research Award, I’ve seen first-hand how embedding coaching as a core element of researcher support isn’t just the right thing to do – it’s strategically smart. Coaching complements and strengthens the implementation of institutional actions for the Concordat to Support the Career Development of Researchers, by centring the individual researcher experience – not just a tick-box approach to the principles.
What’s striking is how coaching aligns with the broader institutional goals we often hear in strategy documents: autonomy, impact, innovation, wellbeing, inclusion. These are not incidental outcomes; they’re the foundations of a healthy research pipeline, and coaching delivers on these – but only if we treat it as a central thread of our culture, not a side offer.
Crucially, coaching is evidence of how we live our values. It offers a clear, intentional method for demonstrating how people and culture are not just statements but structures – designed, delivered, and experienced.
In REF 2029 institutions will be asked to evidence the kind of environment where research happens. Coaching offers one of the most meaningful, tangible ways to demonstrate that such an environment exists through the lived experiences of the people working within it.
Culture is personal – and coaching recognises that
In higher education, we often talk about culture as though it’s something we can declare or design. But real culture – the kind that shapes whether researchers thrive or withdraw – is co-created, day by day, through dialogue, trust, and reflection.
Culture lives in the everyday, unrecorded interactions: the invisible labour of masking uncertainty while trying to appear “resilient enough” to succeed; the internal negotiation before speaking up in a lab meeting; or the emotional weight carried by researchers who feel like they don’t belong.
Coaching transforms those invisible moments into deliberate acts of empowerment. It creates intentional, reflective spaces where researchers – regardless of role or background – are supported to define their own path, voice their challenges, and recognise their value. It’s in these conversations that inclusion stops being an aspiration and becomes a lived reality.
This is especially needed in environments where pressure to perform is high, and space to reflect is minimal. Coaching doesn’t remove the pressures of academia. But it builds capacity to navigate them with intention – and that’s culture work at its core.
Embedding a coaching culture as part of researcher development shouldn’t be a fringe benefit or pilot project – it should be an institutional expectation. We need more trained internal coaches who understand the realities of academic life and more visibly supported coaching opportunities aligned with the Researcher Development Concordat. The latter encourages a minimum of ten days’ (pro rata) professional development for research staff per year. Coaching is one of the most impactful ways those days can be used – not just to develop researchers, but to transform the culture they inhabit.
A call to embed – not bolt on
If we’re serious about inclusive, people-centred research environments, then coaching should be treated as core business. It should not be underfunded, siloed, or left to goodwill. It must be valued, supported, and embedded – reflected in institutional KPIs, Researcher Development Concordat and Research Culture Action Plans, and REF narratives alike.
And in a sector currently under intense financial pressure, we should double down on culture as a lived commitment to those we ask to do difficult, meaningful work during difficult, uncertain times. Coaching is a strategic lever for equity, integrity, and excellence.
What happened? On May 14, U.S. Defense Secretary Pete Hegseth issued a memo declaring that the Defense Department would move to cap reimbursement for indirect research costs at 15% for all new grants to colleges. Hegseth also ordered officials to renegotiate rates on existing awards. If colleges do not agree, DOD officials should terminate previously awarded grants and reissue them under the “revised terms,” he said.
Overall, Hegseth estimated the move would save the agency $900 million annually.
A group of higher education associations and research universities sued on June 16, arguing that the Defense Department overstepped its authority and noting that other courts had blocked the Trump administration’s caps at other agencies.
“As with those policies, if DOD’s policy is allowed to stand, it will stop critical research in its tracks, lead to layoffs and cutbacks at universities across the country, badly undermine scientific research at United States universities, and erode our nation’s enviable status as a global leader in scientific research and innovation,” they wrote in court documents.
The next day, U.S. District Judge Brian Murphy granted a temporary restraining order blocking the Defense Department from implementing its policy until further ordered.
What’s next? Murphy has scheduled a July 2 hearing on the temporary restraining order.
After the National Institutes of Health tried earlier this year to cut funding for universities’ costs indirectly related to research and set off alarm bells across higher education, 10 higher education associations decided to come up with their own model for research funding rather than having the government take the lead.
Now, after just over six weeks of work, that group, known as the Joint Associations Group, is homing in on a plan to rework how the government funds research, and its members want feedback from the university research community before they present a proposal to Congress and the Trump administration at the end of the month.
“Unfortunately, something is going to change,” said Barbara Snyder, president of the Association of American Universities. “Either we will be part of it or it will be imposed upon us … Significant division in the research community is going to kill us.”
Snyder and other JAG members said at a virtual town hall Tuesday that the current system for direct and indirect research funding costs has served the community well, but it isn’t transparent and leads to confusion about how the rates are calculated, among other challenges. AAU and other higher ed groups sued the NIH in February after the agency proposed capping indirect expenses for all institutions at 15 percent of the direct research costs—down from the average of 28 percent. (Historically, colleges negotiate their own reimbursement rates directly with the federal government.)
The White House said the cap would make more money available for “legitimate scientific research,” but universities warned that the change would halt lifesaving research and lead to job losses, among other consequences. The NIH rate cap would mean a cut of $4 billion for university-based research.
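For readers unfamiliar with how these rates work, here is a minimal illustrative sketch: indirect reimbursement is calculated as a percentage of a grant’s direct costs, so cutting a negotiated rate from the 28 percent average to a 15 percent cap shrinks the total award for the same research. The $1 million grant below is a hypothetical figure for illustration only; the 28 percent and 15 percent rates are the ones cited above, and real negotiated rates vary by institution.

```python
# Hypothetical illustration of how an indirect cost rate cap changes reimbursement.
# Only the 28% average and 15% cap come from the article; the grant size is made up.
direct_costs = 1_000_000        # direct research costs on a hypothetical grant (salaries, supplies, etc.)
negotiated_rate = 0.28          # average negotiated indirect rate cited above
capped_rate = 0.15              # proposed cap

indirect_current = direct_costs * negotiated_rate   # $280,000 for facilities and administration
indirect_capped = direct_costs * capped_rate        # $150,000 under the cap

print(f"Current total award: ${direct_costs + indirect_current:,.0f}")   # $1,280,000
print(f"Capped total award:  ${direct_costs + indirect_capped:,.0f}")    # $1,150,000
print(f"Shortfall per grant: ${indirect_current - indirect_capped:,.0f}")  # $130,000
```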
Court challenges have since halted the NIH plan, as well as similar caps proposed by two other federal agencies; meanwhile, the Department of Defense is working on its own plan related to indirect costs. Snyder said the lawsuits are about fiscal year 2025, while the JAG effort looks ahead to fiscal year 2026 and beyond.
Over the years, Congress and federal agencies have sought to rethink the funding model but didn’t reach an agreement. In fact, after the first Trump administration proposed a 15 percent cap on indirect costs in 2017, Congress specifically prohibited such a move. But now that prohibition doesn’t seem likely to stick as lawmakers consider bills to fund the government for fiscal year 2026, so a new model is necessary. Adding to the pressure on universities, Trump has proposed significant cuts to research funding in his budget.
JAG’s panel of experts presented two options to the university research community at a webinar last week and then answered questions at the town hall Tuesday. Colleges and universities have until June 22 to test the proposed models and provide feedback before JAG sends its final proposal to the government June 27, though any model will likely need additional work.
“No one would choose to work at this rapid pace and rethink how to effectively, fairly and transparently cover these real and unavoidable costs,” said Matt Owens, president of the Council on Governmental Relations, at last week’s webinar. “But we are where we are, and it’s vital that we meet this moment so that we can emerge with an improved and sustainable indirect cost policy that will enable our country to continue leading the world in research and innovation.”
Proposed Models
Both versions of what JAG is calling the Fiscal Accountability in Research model, or FAIR, are geared toward offering more accountability and transparency about how federal research dollars are spent. JAG hopes that in the end, the new model will be simpler than the current one. They also want to replace terms like “indirect cost rate” and “overhead” with either “essential research support” or “general research operations,” in an effort to underscore that the money goes toward the real costs of research.
“This will require a bit of a culture change in institutions, but we think the benefit of that far outweighs the downsides,” said Kelvin Droegemeier, a professor and special adviser to the chancellor for science and policy at the University of Illinois at Urbana-Champaign, who led the JAG effort, at the webinar.
One model, which the group calls FAIR No. 1, would include costs related to managing the grant, general research operations and facilities as a fixed percentage of the total budget. The percentage would be based in part on the type of institution and research. This approach is designed to be simple and reasonable, according to the group’s presentation, but it’s more general, which makes it “difficult to account for the wide array of research frameworks that now exist.”
The other model, FAIR No. 2, would more accurately reflect the actual costs of a project and make the structure for federal grants more like those from private foundations. Under this model, essential research support would be lumped into the project costs while funding for general research operations, such as payroll and procurement, would be a fixed percentage of the total budget. That change would likely increase the direct costs of the project.
Droegemeier and other members of JAG’s expert panel noted that FAIR No. 2 would be a “significant departure” from the current approach, and universities would likely need more time to overhaul their processes for tracking costs. Still, the group said this model would better show what the money goes toward, addressing a key concern from Congress.
Droegemeier described the two models as “bookends” and said the group would probably end up somewhere between the two.
‘In a Good Spot’
At Tuesday’s town hall, attendees questioned whether Congress or the Trump administration would even consider JAG’s proposal and why any change was necessary.
Droegemeier said he’s met with members of Congress who have endorsed their process, and he’s kept in touch with Trump administration officials about the group’s work. So far, he’s seen a positive response to the models, adding that officials at the Office of Management and Budget indicated that they weren’t “oceans apart.”
“We’ve done everything possible to build goodwill and trust,” he said. “There’s a long road ahead of us, but I think we’re in a good spot.”
Other speakers echoed that point, noting that Sen. Susan Collins, a Republican from Maine and chair of the powerful Appropriations Committee, publicly supported the models at a recent hearing. And NIH director Jay Bhattacharya called the proposals “quite promising” at the same hearing, STAT News reported.
Additionally, the House’s appropriations bill for the Department of Defense calls on the agency to “work closely with the extramural research community to develop an optimized Facilities and Administrative cost reimbursement solution for all parties that ensures the nation remains a world leader in innovation.”
Across the board, speakers at the town hall said they must act to have a say in discussions about the future of research funding.
“The two models are a significant change,” said Deborah Altenburg, vice president for research policy and advocacy at the Association of Public and Land-grant Universities. “But all of our organizations are responding to a new political situation.”
The University Policy Engagement Network (UPEN) recently announced that it had been successful in a UKRI bid to develop and expand UK policy-to-research infrastructure, facilitating connections and engagement between public and civil servants on one hand, and research organisations on the other.
This call is a recent manifestation of a perennial and important interest in evidence-informed policymaking, and policy and research engagement. Policy engagement is also part of an increased focus on engagement with and impact of research, driven by the Research Excellence Framework.
University-based researchers and policymakers respond to different incentives in ways that are not always conducive to engagement. Interviewees described a wide range of influences on policy, including many types of research, much of which is produced outside the university sector. For some types of research, such as rapid research, researchers at higher education institutions were seen as being at a disadvantage. To address these considerations, our interviewees suggested that research co-creation – involving policymakers earlier in the process to develop research ideas and design projects – could promote engagement with policy.
Engagement from the start
In a typical research process, university-based researchers develop, conduct, and publish their research with a high degree of independence from the stakeholders of their research. Once the research is completed, researchers disseminate their findings, hoping to reach external stakeholders, including policymakers. In contrast, co-created research brings research stakeholders into the research process at the beginning and maintains stakeholder influence and co-creation throughout.
When asked how researchers can increase engagement with policy, one participant said:
Co-designing projects with people involved in policy from the outset rather than, you know, what I often see, which is ‘we’ve done this stuff and now, who can we send it to?’ So, getting people involved from the outset and the running of it through advice.
Because policy priorities shift and because research often takes a long time to complete, co-creation is not a perfect solution for policy research engagement. But co-creation may increase the likelihood that research findings are relevant to and usable for the specific needs of policymakers. Another benefit of co-creation is that, by taking part in the research process, policymakers are more likely to feel invested in the research and inclined to use its findings.
Co-creation of research with policymakers requires access to and some form of relationship with relevant policymakers. While some researchers have easier access to policymakers than others, there are structures in place to facilitate the networking required to build relevant relationships, including academic fellowships with the UK Parliament. Researchers can sometimes connect more easily to ministers and policymakers via intermediary organisations such as mission groups, representative bodies, think tanks, and professional organisations.
Designing successful co-creation
In a policy-research co-creation model, one of the questions worth asking is what is being co-created: the research, the policy, or both? For example, one participant in our study viewed researcher engagement with policymakers as policy co-creation, rather than as research co-creation. Researchers can ask themselves: “What policy am I well-positioned to co-create based on my research?” as well as “How can my research benefit from co-creation with its stakeholders?”
Our article highlights that one of the more frequent pathways for researchers based at universities to engage with policy is through conducting commissioned research. Commissioned research is often aligned with policy needs and facilitates co-creation. Yet independence, rigour, and criticality – markers of quality research – still need to be ensured even as part of co-created and commissioned research.
Commissioned research was not the only type of research discussed by our participants that led to policy engagement. Interviewees provided examples of researchers who were successful in shaping policy because they had an established and rigorous body of work that answered policy-relevant questions. Sometimes, a body of research developed over time and over multiple studies is better suited for policy engagement. Sometimes this takes the form of a systematic review designed to bring a large body of research literature to bear on a current policy problem.
This raises an important consideration for mechanisms that incentivise engagement: how does incentivising engagement affect the multiple priorities that researchers based at higher education institutions need to meet? The danger here is that, as more policy engagement is incentivised, researchers at higher education institutions might prioritise forms and qualities of research which lend themselves to engagement over those which higher education is uniquely placed to offer.
As current efforts to expand UK-wide policy-to-research infrastructure develop, it is important to consider the multiple complexities associated with policy research engagement. In our view, for policy and research engagement to be meaningful, policy-to-research infrastructure needs to support high quality research and targeted engagement, and to have a clear sense of what each of these means in practice.
Education Secretary Linda McMahon has repeatedly said that the February and March cancellations and firings at her department cut not only the “fat” but also into some of the “muscle” of the federal role in education. So, even as she promises to dismantle her department, she is also bringing back some people and restarting some activities. Court filings and her own congressional testimony illuminate what this means for the agency as a whole, and for education research in particular.
McMahon told a U.S. House committee last month she rehired 74 employees out of the roughly 2,000 who were laid off or agreed to separation packages. A court filing earlier this month says the agency will revive about a fifth of research and statistics contracts killed earlier this year, at least for now, though that doesn’t mean the work will look exactly as it did before.
The Trump administration disclosed in a June 5 federal court filing in Maryland that it either has or is planning to reinstate 20 of 101 terminated contracts to comply with congressional statutes. More than half of the reversals will restart 10 regional education laboratories that the Trump administration had said were engaged in “wasteful and ideologically driven spending,” but had been very popular with state education leaders. The reinstatements also include an international assessment, a study of how to help struggling readers, and Datalab, a web-based data analysis tool for the public.
Even some of the promised reinstatements are uncertain because the Education Department plans to put some of them up for new bids (see table below). That process could take months and potentially result in smaller contracts with fewer studies or hours of technical assistance.
These research activities were terminated by Elon Musk’s Department of Government Efficiency (DOGE) before McMahon was confirmed by the Senate. The Education Department’s disclosure of the reinstatements occurred a week after President Donald Trump bid farewell to Musk in the Oval Office and on the same day that the Trump-Musk feud exploded on social media.
See which IES contracts have been or are slated to be restarted, or under consideration for reinstatement:

1. Regional Education Laboratory – Mid Atlantic: Intends to seek new bids and restart contract
2. Regional Education Laboratory – Southwest: Intends to seek new bids and restart contract
3. Regional Education Laboratory – Northwest: Intends to seek new bids and restart contract
4. Regional Education Laboratory – West: Intends to seek new bids and restart contract
5. Regional Education Laboratory – Appalachia: Intends to seek new bids and restart contract
6. Regional Education Laboratory – Pacific: Intends to seek new bids and restart contract
7. Regional Education Laboratory – Central: Intends to seek new bids and restart contract
8. Regional Education Laboratory – Midwest: Intends to seek new bids and restart contract
9. Regional Education Laboratory – Southeast: Intends to seek new bids and restart contract
10. Regional Education Laboratory – Northeast and Islands: Intends to seek new bids and restart contract
11. Regional Education Laboratory – umbrella support contract: Intends to seek new bids and restart contract
12. What Works Clearinghouse (website, training reviewers, but no reviewing of education research): Approved for reinstatement
13. Statistical standards and data confidentiality technical assistance for the National Center for Education Statistics: Reinstated
14. Statistical and confidentiality review of electronic data files and technical reports: Approved for reinstatement
15. Datalab, a web-based data analysis tool for the public: Approved for reinstatement
16. U.S. participation in the Program for International Student Assessment (PISA), an international test overseen by the Organization for Economic Cooperation and Development (OECD): Reinstated
17. Data quality and statistical methodology assistance: Reinstated
18. EDFacts, a collection of administrative data from school districts around the country: Reinstated
19. Demographic and geospatial estimates (e.g. school poverty and school locations) used for academic research and federal program administration: Approved for reinstatement
20. Evaluation of the Multi-tiered System of Supports in reading, an approach to help struggling students: Approved for reinstatement
21. Implementation of the Striving Readers Comprehensive Literacy Program and feasibility of conducting an impact evaluation of it: Evaluating whether to restart
22. Policy-relevant findings for the National Evaluation of Career and Technical Education: Evaluating whether to restart
23. The National Postsecondary Student Aid Study (how students finance college, college graduation rates and workforce outcomes): Evaluating whether to restart
24. Additional higher ed studies: Evaluating whether to restart
25. Publication assistance on educational topics and the annual report: Evaluating whether to restart
26. Conducting peer review of applications, manuscripts and grant competitions at the Institute of Education Sciences: Evaluating whether to restart
The Education Department press office said it had no comment beyond what was disclosed in the legal brief.
Education researchers, who are suing the Trump administration to restore all of the department’s previous research and statistical activities, were not satisfied.
Elizabeth Tipton, president of the Society for Research on Educational Effectiveness (SREE), said the limited reinstatement is “upsetting.” “They’re trying to make IES as small as they possibly can,” she said, referring to the Institute of Education Sciences, the department’s research and data arm.
SREE and the American Educational Research Association (AERA) are suing McMahon and the Education Department in the Maryland case. The suit asks for a temporary reinstatement of all the contracts and the rehiring of IES employees while the courts adjudicate the broader constitutional issue of whether the Trump administration violated congressional statutes and exceeded its executive authority.
The 20 reinstatements were not ordered by the court, and in some instances, the Education Department is voluntarily restarting only a small slice of a research activity, making it impossible to produce anything meaningful for the public. For example, the department said it is reinstating a contract for operating the What Works Clearinghouse, a website that informs schools about evidence-based teaching practices. But, in the legal brief, the department disclosed that it is not planning to reinstate any of the contracts to produce new content for the site.
In the brief, the administration admitted that congressional statutes mention a range of research and data collection activities. But the lawyers argued that the legislative language often uses the word “may” instead of “must,” or notes that evaluations of education programs should be done “as time and resources allow.”
“Read together, the Department has wide discretion in whether and which evaluations to undertake,” the administration lawyers wrote.
The Trump administration argued that as long as it has at least one contract in place, it is technically fulfilling a congressional mandate. For example, Congress requires that the Education Department participate in international assessments. That is why it is now restarting the contract to administer the Program for International Student Assessment (PISA), but not other international assessments that the country has participated in, such as the Trends in International Mathematics and Science Study (TIMSS).
The administration argued that researchers didn’t make a compelling case that they would be irreparably harmed if many contracts were not restarted. “There is no harm alleged from not having access to as-yet uncreated data,” the lawyers wrote.
One of the terminated contracts was supposed to help state education agencies create longitudinal data systems for tracking students from pre-K to the workforce. The department’s brief says that states, not professional associations of researchers, should sue to restore those contracts.
In six instances, the administration said it was evaluating whether to restart a study. For example, the legal brief says that because Congress requires the evaluation of literacy programs, the department is considering a reinstatement of a study of the Striving Readers Comprehensive Literacy Program. But lawyers said there was no urgency to restart it because there is no deadline for evaluations in the legislative language.
In four other instances, the Trump administration said it wasn’t feasible to restart a study, despite congressional requirements. For example, Congress mandates that the Education Department identify and evaluate promising adult education strategies. But after terminating such a study in February, the Education Department admitted that it is now too difficult to restart it. The department also said it could not easily restart two studies of math curricula in low-performing schools. One of the studies called for the math program to be implemented in the first year and studied in the second year, which made it especially difficult to restart. A fourth study the department said it could not restart would have evaluated the effectiveness of extra services to help teens with disabilities transition from high school to college or work. When DOGE pulled the plug on that study, those teens lost those services too.
An environmental researcher at Tulane University resigned Wednesday after accusing campus officials, reportedly under pressure from Gov. Jeff Landry, of issuing a “gag order” that prevented her from publicly discussing her work, which focused on racial disparities in the petrochemical workforce.
“Scholarly publications, not gag orders, are the currency of academia,” Kimberly Terrell, the now-former director of community engagement at Tulane’s Environmental Law Clinic, wrote in her resignation letter. “There is always room for informed debate. But Tulane leaders have chosen to abandon the principles of knowledge, education, and the greater good in pursuit of their own narrow agenda.”
Terrell’s resignation comes amid wider efforts by the Trump administration and its allies to control the types of research academics are permitted to pursue—including projects related to environmental justice—and to punish campus protesters for espousing messages the president and other public officials disagree with.
“It started with the pro-Palestinian activism on our campus and others across the country. It’s emboldened a lot of political leaders to feel they can make inroads by silencing faculty in other areas,” Michelle Lacey, a math professor and president of Tulane’s chapter of the American Association of University Professors, told Inside Higher Ed. “That was the catalyst for creating a climate where university administrators are very nervous, especially now as we see the government pulling funding for areas of research they don’t like.”
Last spring, Landry praised Tulane president Michael Fitts and university police for removing students who were protesting Israel’s attacks on Gaza. Soon after, the Legislature passed a provision creating harsher punishments for protesters who disrupt traffic, which Landry later signed into law.
Landry, a Republican aligned with Trump, has a history of trying to exert control over the state’s public higher education institutions.
Last summer, he enacted a law that allows him to directly appoint board chairs at the state’s public colleges and universities. And in November, following Trump’s election, Landry publicly called on officials at Louisiana State University to punish a law professor who allegedly made brief comments in class about students who voted for the president.
Landry’s office denied to the Associated Press (which first reported on Terrell’s resignation) that it pressured Tulane to silence research from the law clinic. Michael Strecker, a Tulane spokesperson, also told the outlet that the university “is fully committed to academic freedom and the strong pedagogical value of law clinics” and declined to comment on “personnel matters.”
Strecker added in a statement that Tulane administrators have been working with the law school’s leadership on how the law clinics could better support the university’s education mission.
“Debates about how best to operate law clinics’ teaching mission have occurred nationally and at Tulane for years—this is nothing new,” Strecker said. “This effort includes most recently input from an independent, third-party review.”
But Terrell’s account of the events that led to her resignation calls the university’s academic freedom commitments into question, while also implying that Landry—and powerful industry groups—wield some influence over private higher education institutions in the state.
And it’s not something Tulane, a private university in New Orleans, should tolerate, Lacey said.
“The academic freedom of all university researchers must be unequivocally defended at both public and private institutions,” Lacey wrote in a statement. “This includes the right to conduct and disseminate research that may be unfavorably viewed by government officials or corporate entities. Political demands to stifle controversial research are an affront to the advancement of knowledge and open exchange of ideas, as is the voluntary compliance with such requests by university leadership.”
The latest controversy at Tulane stems from a paper Terrell published April 9 in the peer-reviewed journal Ecological Economics. Her research found that while Black people in Louisiana are underrepresented in the state’s petrochemical workforce, they are overexposed to toxic pollutants the industry releases into an area of the state between New Orleans and Baton Rouge known as “Cancer Alley.”
But according to emails obtained by Inside Higher Ed and other outlets, Fitts worried that publicizing Terrell’s research and the clinic’s other work, which includes legal advocacy, could jeopardize funding for the university’s $600 million plan to redevelop New Orleans’ historic Charity Hospital into residential and commercial spaces as part of a broader downtown expansion plan.
As Terrell explained in her resignation letter, Fitts and other top Tulane executives were at Louisiana’s state capitol on April 16 lobbying for the project when “someone accused the university of being anti–chemical industry” and cited her study, which was receiving media attention after it was published the week prior. According to Terrell, “the story that came down to me through the chain of command was that Governor Landry threatened to veto any bill with funding for Tulane’s Charity project unless Fitts did something about the Environmental Law Clinic.”
‘Complete Gag Order’
After that, Terrell says, she was “placed under a complete gag order,” which the emails appear to confirm.
“Effective immediately all external communications that are not client-based—that is, directly related to representation—must be pre-approved by me,” Marcilynn Burke, dean of Tulane’s law school, wrote in an April 25 email to law clinic staff. “Such communications include press releases, interviews, videos, social media postings, etc. Please err on the side of over-inclusion as we work to define the boundaries through experience.”
A week later, on May 4, Burke wrote another email to clinic staff explaining that “elected officials and major donors have cited the clinic as an impediment to them lending their support to the university generally and this project specifically,” referring to Fitts’s plans to redevelop the old hospital. Terrell wrote that when she pleaded her case to Provost Robin Forman, “he refused to acknowledge my right to freely conduct and disseminate research” and also “let slip that my job description was likely going to be rewritten.”
Terrell described the entire law clinic as being “under siege” and said she would rather leave her position “than have my work used as an excuse for President Fitts to dismantle the Tulane Environmental Law Clinic.”
Other academics, free speech experts and environmental justice advocates also believe Tulane’s moves to silence Terrell’s work amount to an attack on academic freedom with implications beyond the campus.
“The administration of Tulane University, far from standing up for academic freedom, is participating in the effort to suppress free inquiry and the pursuit of knowledge by scientific methods,” Michael Ash, an economics and public policy professor at the University of Massachusetts, said in a statement. “Any effort to reduce academic freedom for Dr. Terrell either by changing her job classification or by redefining whether the protection applies is a blatant and un-American attempt to suppress the type of free inquiry that has made this country great.”
Joy Banner, co-founder and co-director of the Descendants Project, a community organization that works in Cancer Alley, added that the Tulane Environmental Law Clinic is a vital public health resource.
Without the clinic, “it would be far more difficult to show the racially discriminatory practices of the industry, from preferential hiring practices to a pattern of concentrating pollution in majority Black neighborhoods,” she said in a statement. “President Fitts must commit to protecting it at all costs.”
I’ve been walking these streets so long: in the SRHE Blog a series of posts is chronicling, decade by decade, the progress of SRHE since its foundation 60 years ago in 1965. As always, our memories are supported by some music of the times, however bad it might have been[1].
Some parts of the world, like some parts of higher education, were drawing breath after momentous years. The oil crisis of 1973-74 sent economic shocks around the world. In 1975 the Vietnam war finally ended, and the USA also saw the conviction of President Richard Nixon’s most senior staff John Mitchell, Bob Haldeman and John Ehrlichman, found guilty of the Watergate cover-up. Those were the days when the Washington Post nailed its colours to the mast rather than not choosing sides, days when the judicial system and the fourth estate could still expose and unseat corrupt behaviour at the highest levels. Washington Post publisher Katharine Graham supported her journalists Woodward and Bernstein against huge establishment pressure, as Tammy Wynette sang Stand by your man. How times change.
Higher education in the UK had seen a flurry of new universities in the 1960s: Aston, Brunel, Bath, Bradford, City, Dundee, Heriot-Watt, Loughborough, Salford, Stirling, Surrey, the New University of Ulster, and perhaps most significant of all, the Open University. All the new UK universities were created before 1970; there were no more in the period to 1975, but the late 60s and early 1970s saw the even more significant creation of the polytechnics, following the influential 1966 White Paper A Plan for Polytechnics and Other Colleges. The Times Higher Education Supplement, established in 1971 under editor Brian Macarthur, had immediately become the definitive trade paper for HE with an outstanding journalistic team including Peter (now Lord) Hennessy, David Hencke and (now Sir) Peter Scott (an SRHE Fellow), later to become the THES editor and then VC at Kingston. THES coverage of the polytechnic expansion in the 1970s was dominated by North East London Polytechnic (NELP, now the University of East London), with its management team of George Brosan and Eric Robinson. They were using a blueprint created in their tenure at Enfield College, and fully developed in Robinson’s influential book, The New Polytechnics – the People’s Universities. NELP became “a byword for innovation”, as Tyrrell Burgess’s obituary of George Brosan said, developing an astonishing 80 new undergraduate programmes validated by the Council for National Academic Awards, created like SRHE in 1965. Burgess himself had been central to NELP’s radical school for independent study and founded the journal, Higher Education Review, working with its long-time editor John Pratt (an SRHE Fellow), later the definitive chronicler of The Polytechnic Experiment. In Sheffield one of the best of the polytechnic directors, the Reverend Canon Dr George Tolley, was overseeing the expansion of Sheffield Polytechnic as it merged with two colleges of education to become Sheffield City Polytechnic.
As in so many parts of the world the HE system was increasingly diverse and rapidly expanding. In Australia nine universities had been established between 1964 and 1975: Deakin, Flinders, Griffith, James Cook, La Trobe, Macquarie, Murdoch, Newcastle, and Wollongong. The Australian government had taken on full responsibility for HE funding as Breen (Monash) explained, and had even abolished university fees in 1974, which Mangan’s (Queensland) later review regarded as not necessarily a good thing. How times change.
In the USA the University of California model established under president Clark Kerr in the 1960s dominated strategic thinking about HE. Berkeley’s Martin Trow had already written The British Academics with AH Halsey (Oxford) and was about to become the Director of the Centre for Studies of Higher Education at Berkeley, where his elite-mass-universal model of how HE systems developed would hold sway for decades.
In the UK two new laws, the Sex Discrimination Act 1975 and the Equal Pay Act 1970, came into force on 29 December, aiming to end sex discrimination and unequal pay in the workplace. In the USA the Education Amendments of 1972, with their Title IX, had been a hugely influential piece of legislation which prohibited sex discrimination in educational institutions receiving federal aid. How times change. Steve Harley’s 1975 lyrics would work now with President Trump: You’ve done it all, you’ve broken every code.
Some things began in 1975 which would become significant later. In HE, institutions that had mostly been around for years or even centuries but started in a new form included Buckinghamshire College of Higher Education (later Buckinghamshire New University), Nene College of Higher Education (University of Northampton), Bath Spa University College, Roehampton, and Dublin City University. Control of Glasgow College of Technology (Glasgow Caledonian University) transferred from Glasgow Corporation to the newly formed Strathclyde Regional Council. Nigeria had its own flurry of new universities in Calabar, Jos, Maiduguri and Port Harcourt.
Everyone knew that “you’re gonna need a bigger higher education system” as the blockbuster hit Jaws was released. 1975 was the year when Ernő Rubik applied for a patent for his invention the Magic Cube, Microsoft was founded as a partnership between Bill Gates and Paul Allen, and Margaret Thatcher defeated Edward Heath to become leader of the Conservative Party. Bruce Springsteen was already ‘The Boss’ when Liz Truss was Born to run on 26 July; she would later briefly become a THES journalist and briefly Shadow Minister for Higher Education, before ultimately, and briefly, becoming the boss herself. 1970s terrorism saw a bomb explode in the Paris offices of Springer publishers: the March 6 Group (connected to the Red Army Faction) demanded amnesty for the Baader-Meinhof Group.
Higher education approaching a period of consolidation
Guy Neave, then perhaps the leading continental European academic in research into HE, later characterised 1975-1985 as a period of consolidation. In the UK the government was planning for (reduced) expansion and Labour HE minister Reg Prentice was still quoting the 1963 Robbins Report in Parliament: “The planning figure of 640,000 full-time and sandwich course students in Great Britain in 1981 which I announced in November is estimated to make courses of higher education available for all those who are qualified by ability and attainment to pursue them and who wish to do so. It allows for the number of home students under 21 entering higher education in Great Britain, expressed as a proportion of the population aged 18, to rise from 14% in 1973 to 17% in 1981. … the reductions in forecast higher education expenditure in the recent Public Expenditure White Paper are almost entirely attributable to the lower estimate of prospective student demand.” Government projections of student numbers were always wrong, as Maurice Kogan (Brunel) might have helped to explain – I thought by now you’d realise. 1975 was the year when Kogan, a former senior civil servant in the Department of Education and Science, published his hugely influential Educational Policy-making: A Study of Interest Groups and Parliament.
“By 1973, when the university system was in crisis with the collapse of the quinquennial funding system, it was clear that the Society was significantly failing to meet the ambitious targets it had started out with: it held annual conferences but attendance at 100 to 120 ensured that any surplus was low. It had successfully launched the valuable Research into Higher Education Abstracts but its … monographs, … while influential among specialists did not command a wide readership. The Society appeared to be at a crossroads as to its future: so far it had succeeded in expanding its membership, both corporate and individual, but this could easily be reversed if it failed to generate sufficient activity to retain it. Early in 1973 the Governing Council agreed to hold a special meeting … and commissioned a paper from Leo Evans, one of its members, and Harriet Greenaway, the Society’s Administrator … The “Discussion Paper on the Objectives of the Society” … quoted the aims set out in the Articles of Association “to promote and encourage research in higher education and related fields” and argued that the Society’s objectives needed to be broadened. … The implied thrust of the paper was that the Society had become too narrow in its research interests and that it should be more willing to address issues related to the development of the higher education system.”
In the end the objectives were expanded to include concern for the development of the HE sector, but the Society’s direction was not wholly settled, according to Shattock. Moreover: “Both in 1973-74 and 1974-75 there was great concern about the Society’s continued financial viability, and in 1976 the Society moved its premises out of London to the University of Surrey where it was offered favourable terms.” (Someone Saved My Life Tonight). In 1976 Lewis Elton of Surrey, one of SRHE’s founders, would become Chair of the Society when the incumbent Roy Niblett suffered ill health. It was the same year that the principal inspiration for the foundation of SRHE (as Shattock put it), Nicholas Malleson, died at only 52. SRHE’s finances were soon back on an even keel, but it would be more than 25 years before they achieved long-term stability.
Rob Cuthbert is editor of SRHE News and the SRHE Blog, Emeritus Professor of Higher Education Management, University of the West of England and Joint Managing Partner, Practical Academics. Email [email protected]. Twitter/X @RobCuthbert.
[1] The top selling single of 1975 was Bye Bye Baby by the Bay City Rollers, and the Eurovision Song Contest was won by Ding-a-Dong. The album charts were dominated by greatest hit albums from Elton John, Tom Jones, The Stylistics, Perry Como, Engelbert Humperdinck and Jim Reeves. I rest my case. As always, there were some exceptions.
By Professor Isabel Lucas, Pro-Vice Chancellor for Education at the Liverpool School of Tropical Medicine and outgoing Chair of the national Heads of Educational Development Group (HEDG).
In higher education, prestige and promotion have long hinged on research output. But with growing numbers of academics focused on teaching, educational leadership and knowledge exchange, the old metrics no longer fit. A report by the European University Association places academic career reform at the heart of its 2030 vision, highlighting the need to recognise impact beyond traditional research publications. This shift is not only about fairness – it’s about organisational effectiveness and employee wellbeing.
Research has long dominated academic prestige, promotions, and funding. Sterling et al. (2023) argue that current academic career frameworks are weighted heavily toward research, often sidelining innovative teaching and educational leadership. Yet, as the higher education sector evolves, so too must our understanding of what counts as impactful academic work.
The reality is already shifting. Data from HESA (2022) shows a 10% rise in teaching-only contracts between 2015 and 2022, matched by a 9% decrease in research-related roles. This suggests a growing academic population for whom the current research-heavy promotion pathways simply don’t apply. However, ‘teaching-only’ staff (a problematic term, since the work is never only teaching) often find themselves ineligible – or unrecognised – within traditional academic progression systems. The lack of progression routes for these high-quality staff, capable of transforming education and the student experience at a strategic level, risks undermining job satisfaction and retention.
What’s more, staff on Professional Service contracts, including roles like educational developers and academic skills support tutors, are engaged in academic work without the benefits or recognition of an academic title. HESA’s own definitions blur the lines: academic function is tied to the contract, not necessarily the work performed. This disconnect creates a situation where talented, impactful educators are ‘othered’ – excluded from meaningful recognition and progression.
Key findings from a sector analysis undertaken in 2024 through the Heads of Educational Development Group (HEDG) revealed alarming disparities among middle managers with institutional responsibility for learning and teaching:
Career Blockages:
100% of academic contract holders in the study had access to promotion to Reader/Professor.
Only 39% of Professional Service contract holders had similar access – even when doing the same academic work as peers on academic contracts.
Misalignment of Identity and Contract:
Staff whose professional identity did not match their contract type (e.g. self-identifying as academic but on a Professional Service contract) reported significantly lower satisfaction and empowerment scores.
Promotion Criteria Gaps:
Respondents noted they could meet academic promotion criteria, but were ineligible due to contract type.
Job satisfaction scores were lowest where staff reported that promotion routes existed but were inaccessible due to the nature of their role.
So, how can HE evolve its career structures beyond research? Establishing clear, visible academic promotion routes to Reader/Professor that recognise leadership in education, curriculum innovation, and pedagogic research would be a good starting point. Making sure promotion frameworks include non-research excellence – impact on student learning, institutional strategy, and sector-wide education initiatives – would be even more inclusive. Neither change should pose a significant operational or cost challenge to universities, and both would reap significant rewards in staff retention and satisfaction.
Institutions that fail to adapt risk not just losing talent, but falling behind in impact, innovation, and reputation. It’s time to value all forms of academic excellence. The future of higher education, now more than ever, depends on it.