Researchers at the University of Kansas have produced a set of guidelines to help educators from preschool through higher education responsibly implement artificial intelligence in a way that empowers teachers, parents, students and communities alike.
Earlier this year, President Donald Trump issued an executive order instructing schools to incorporate AI into their operations. The framework is intended to help all schools and educational facilities do so in a manner that fits their unique communities and missions.
“We see this framework as a foundation,” said James Basham, director of CIDDL and professor of special education at KU. “As schools consider forming an AI task force, for example, they’ll likely have questions on how to do that, or how to conduct an audit and risk analysis. The framework can help guide them through that, and we’ll continue to build on this.”
The framework features four primary recommendations:

1. Establish a stable, human-centered foundation.
2. Implement future-focused strategic planning for AI integration.
3. Ensure AI educational opportunities for every student.
4. Conduct ongoing evaluation, professional learning and community development.
First, the framework urges schools to keep humans at the forefront of AI plans, prioritizing educator judgment, student relationships and family input on AI-enabled processes, and not relying on automation for decisions that affect people. Transparency is also key: schools should communicate how AI tools work and how decisions are made, and ensure compliance with student protection laws such as the Individuals with Disabilities Education Act and the Family Educational Rights and Privacy Act, the report authors write.
The document also outlines recommendations for how educational facilities can implement the technology. Chief among them is establishing an AI integration task force that includes educators, administrators, families, legal advisers and specialists in instructional technology and special education. The document also shares tips on how to conduct an audit and risk analysis before adoption, considering how tools can affect student placement and identification and watching for possible algorithmic error patterns. Because the technologies are trained on human data, they run the risk of repeating the same mistakes and biases humans have made, Basham said.
That idea is also reflected in the framework’s third recommendation. The document encourages educators to commit to learner-centered AI implementation that considers all students, from those in gifted programs to students with cognitive disabilities. AI tools should be prohibited from making final decisions on IEP eligibility, disciplinary actions and student progress, and mechanisms should be put in place that allow students, teachers and parents to give feedback on their AI educational experiences, the authors wrote.
Finally, the framework urges ongoing evaluation, professional learning and community development. As the technology evolves, schools should regularly re-evaluate it for unintended consequences and gather feedback from those who use it. Training, both at implementation and in ongoing installments, will be necessary to address overuse or misuse, clarify who is responsible for monitoring AI use, and ensure both the school and community stay informed about the technology.
The framework was written by Basham; Trey Vasquez, co-principal investigator at CIDDL, operating officer at KU’s Achievement & Assessment Institute and professor of special education at KU; and Angelica Fulchini Scruggs, research associate and operations director for CIDDL.
“The priority at CIDDL is to share transparent resources for educators on topics that are trending and in a way that is easy to digest,” Fulchini Scruggs said. “We want people to join the community and help them know where to start. We also know this will evolve and change, and we want to help educators stay up to date with those changes to use AI responsibly in their schools.”
Mike Krings, the University of Kansas
Mike Krings is a Public Affairs Officer with the KU News Service at the University of Kansas.
Are you thinking about starting a podcast? I invited Dr. Anna Clemens to share her podcasting journey. We talk about how social media and online presence have changed for researchers in 2025, and how storytelling can help people connect with your research in meaningful ways.
Dr. Anna Clemens is an academic writing coach who specializes in scientific research papers. She runs the Researchers’ Writing Academy, an online course where she helps researchers get published in high-ranking journals through a structured writing process.
Jennifer van Alstyne: Hi everyone, this is Jennifer van Alstyne. Welcome to the Social Academic Podcast. I’m here with Dr. Anna Clemens of the Researchers’ Writing Academy. Anna, I’m so happy to have you here today. First, because you’re my friend and we’ve been trying to do this for multiple years now. I’m so happy! And second, because I want to share the program that you’ve created for scientists to help them write better. It’s actually something I’ve recommended to clients of mine, something clients of mine have participated in. So I wanted to share you with everyone who listens to the podcast. Would you please introduce yourself?
Dr. Anna Clemens: Yeah, of course. Thank you so much for having me. And I’m super excited. And it’s been such a joy having some of your clients in the program.
I run a program called the Researchers’ Writing Academy, where we help researchers, well, kind of develop a really structured writing process so they can get published in the journals they want to get published in. We kind of look a bit more toward top-tier journals, high-impact journals. But honestly, what we teach kind of helps you wherever you want to go.
I have a background in chemistry. So my PhD’s in chemistry and I transitioned into writing after that. So it’s a really fun way to be able to combine kind of my scientific knowledge with writing and helping folks to get published and make that all really time efficient.
Jennifer: Gosh, that’s amazing. I think that I did not have a lot of writing support when I was in grad school. And I really felt like even though I’m an excellent writer, like I’m a creative writer, like that’s what I went to school for.
Anna: You write poetry.
Jennifer: I write poetry and I think I’m a good academic writer, but I feel like I had to teach myself all of that. And it was a lot of correction after something was already submitted in order to bring it closer to what was actually publishable.
Anna: Right.
Jennifer: I lost so much time by not knowing things. So I love that you created a program to support people who maybe aren’t getting the training that they need to publish in those high impact journals.
Anna: Yeah, because that’s so common. Like, honestly, who gets good academic writing training? That’s really almost nobody.
I often see even people who do go on, do some kind of course of their university if they offer some kind of course. They’re often not really so focused on the things that I’m teaching, which is like a lot of storytelling and a lot like being efficient with your writing, like kind of the step by step. You kind of often know just like academic English, how do I sound good? And I think honestly, this is less important than knowing how to really tell a story in your paper and having that story be consistent and not losing time by all the like edits and rewrites, etc., that are so frustrating to do.
Storytelling for Researchers and Scientists
Jennifer: Hmm, you brought up storytelling. That’s really insightful.
As a creative writer, story is so important to the words that we create and how people can connect with them. Why is storytelling important for researchers?
Anna: Well, I think it’s because we’re all humans, right? So we just as humans, really need storytelling to be able to access information in the best way and to connect to that information and to kind of put it into the kind of frameworks that we have already in our minds.
This is something a lot of researchers really overestimate: your research is so incredibly specific, right? When you’re doing it, of course you know every detail about it. And you just forget how little other people know, even if they’re in the same field, because we always think, “Oh, no, everyone knows what I know.” It’s also called the expert’s curse, I think: when you are an expert in something, you don’t realize how little other people know. And you kind of undervalue what you know.
So anyway, if you really want your papers to be read, if you want to get published, you need to be able to, to make it accessible to like the journal editor, right? The peer reviewers, but also the readers later, they need to be able to understand the data in a way that makes sense to them. And I think that’s where storytelling comes in. Also, it really helps with structuring the writing process. Like honestly, if you think about storytelling first, the really nice side effect is your writing process will be a lot easier because you don’t have to go back and edit quite so many times.
Jennifer: Oh, that’s fascinating. So not only does it improve how the research is being communicated. It improves the process of writing it too.
Anna: I think so. Yeah, because when you’re clear on the story, everything is clear in your head from the start. And you don’t need to kind of . . . I mean, when you write a paper for the first time, or even people who’ve written a few papers, they still sometimes start writing with the introduction. And it’s such a waste of time. Like they just start at the start, right? And then they end up like deleting all those paragraphs and all those words after when they actually have written so much that they then after a while understand the story that they want to tell. And instead, what I’m suggesting is like, define the story first. And I like guide people through how to do that.
Because I think the problem is you don’t really know how to do it when you don’t have like a framework for it. You have kind of the framework there from the start. So you know what the story is and you don’t have to kind of figure out the story while you’re writing. Instead, you know what the story is and the way I’m teaching it, I’m like giving people prompts so that it’s really easy to define the story because also story is really elusive, I think. Or we use it in this elusive way often when we like we kind of use it as like a throwaway term. Oh, yeah, you you should tell a story in your paper. And you go like, “Yeah, I guess. But what does that mean?” I’m trying to like give a definition for that. So that is like really clear. Okay?
Jennifer: I appreciate that. I think so many people aren’t sure what it means. And even if they think they know what it means, they don’t necessarily know how it applies to their scientific writing. So that’s really interesting.
Researchers’ Writing Academy
Jennifer: I want to talk about podcasts, but actually, since we’re already talking about program stuff right now, I’m curious about the format of your program because people who are listening to this may not be familiar with your work. And I want to make sure that they get to hear about all the cool things that they get if they join.
Anna: Yeah, the Researchers’ Writing Academy is very comprehensive.
Jennifer: Yeah, in a good way.
Anna: It’s almost hard to tell people about it because there’s so much in there. So, what people get is like, there’s an online course, we call it the journal publication formula, that’s like the step-by-step system, walks you through online lessons that you can watch, super short digestible lessons that walk you through step-by-step. So you can just write your paper alongside the lessons.
And then because we noticed that you really may want some help actually writing in your day to day work, right? Because we’re also incredibly busy. And then it’s just helpful to have some kind of accountability, some community, and that’s what we offer as well. So we do a lot of things around accountability and we have like, cowriting sessions, for example, where we meet, we have six now, six per week across time zones.
Jennifer: Wow, that’s amazing! So if you’re anywhere in the world, there’s a chance that one of those six times during the day will work for you. Oh my gosh, that’s so cool.
Anna: Yeah. I mean, they should work. I mean for Europe and the US, most of them will work. Or not, but it depends where in the US you are, etc. But even like a few in Australia, there’s at least one per week that will work for you depending on how long you want to stay up. Some people do, we have one client who comes, he likes to do writing after his kids are in bed. So he loves nine to 10pm, you know, like, yeah. So yeah, there’s a lot. And we do like, writing retreats every now and again, and writing sprints. So we like offer a lot of support around that. And we have like a really lovely community that are so supportive. Actually, I just talked to one member today, and she just got promoted to full professor.
Jennifer: Exciting
Anna: And she was like, “I couldn’t have done it without this community.” This was so like, valuable, not only getting the feedback on her article, but also, just knowing that like, there’s the support. And that’s really, I mean, that’s so lovely for me to hear, because this is honestly what I dreamed of. This is what I wanted to build. And it’s really nice knowing that people do, you know, really, not only reach career goals, but have a supportive community because academia can be a little toxic.
Jennifer: Yeah, yeah, there’s so many reports that have come out and said, mental health struggles, toxicity, it’s consistent. Yeah.
Anna: And honestly, writing plays a big part in that, because of the way we normally don’t talk about writing. You sometimes see more seasoned academics who are really good at writing and then act as if they have it all figured out, but don’t share their process. So you as a novice writer think, “Shit, I should have figured it out. Like, why do I not know how this works?”
Jennifer: This is easy for them.
Anna: Yeah, exactly. The other day, someone said to me, “Yeah, I know this professor and he just writes his paper while I’m talking to him at a conference.” And I’m like, “Oh, okay, this is an interesting process.”
Jennifer: Wow. Like, it’s so clear in his brain that he can focus on that and a conversation at the same time. Fascinating.
Anna: Fascinating. And honestly, you don’t have to do that. But she kind of thought like, “This is who I have to be. This is how I have to do it.” That creates so much pressure. And yeah, writing just hits like, emotionally, it’s really hard, right? When we feel like we are procrastinating, when we have really low confidence in our writing and just feel really disappointed in ourselves because we’re like overly perfectionistic, can’t send stuff off, keep like, you know, refining sentences. It’s just really, really hard.
This is really why a community is so beautiful when we can all just open up about how hard it is and also give each other tips. Like, I just love when people, you know, share also what’s working for them. And like, down to little techniques. Like the other day, someone was sharing in the community about how they started having like their Friday afternoons as like a margin in their calendar. So, if they didn’t get, you know, to all the things they had done, if there was any derailing event, they still had like time on a Friday. A little hack like that, right?
That just like makes you more productive, makes you just honestly feel better about your work. Because we’re really tough on ourselves often. Like we’re really harsh and just, you know, having like a community that has this kind of spirit of being kind to yourself and working with your brain and not against it. Yeah, that’s really, really . . . that’s a really lovely place. Really supportive.
Jennifer: That sounds amazing. I’m curious about who should join your program because it sounds like it’s so supportive. It sounds like there’s community and accountability and training. So, I love all of that, but there’s probably some people who the program’s not right for. So, like, maybe who shouldn’t join and who should definitely join?
Anna: Yeah, that’s a good question. I mean, it is in terms of like career stage, it’s pretty open from PhD student up to professor. And we have all of those kind of career stages in the program. The biggest group is assistant professors, just so you know, like who you can expect to be in the program. And also the PhD students who are in there are often older. It’s really interesting. They’re often like second kind of career type students who maybe have, you know, chosen that path a little later in life. Just a little side note. It’s kind of interesting.
Jennifer: I think that makes so much sense because if I’m going back for like a PhD later on, I’m like, “I’m going to get all the support that I can to make the most of this time.” And joining a program like yours would make so much sense to me.
Anna: Yeah, they’re probably also busier most of the time because they’re parents or have other stuff going on in their lives already.
Jennifer: Yeah, that’s what makes it easier to have time for like the life and the people that you care about because you already have these processes in place.
Anna: Yeah, yeah. So as to who shouldn’t join, or who this wouldn’t be a good fit for, we don’t actually serve researchers in the humanities. It’s really science-based, social sciences included. And you know, physical sciences, life science, earth science, all the sciences we are super happy to have inside the program, just because the journal publication formula is super focused on just that type of research, and really honestly quite focused on original research papers, even though we have members who write review papers using it, because honestly, the process isn’t very different. But the examples, everything, is from original research papers. So just FYI.
Otherwise, I would say like we’re really super supportive and we don’t have like a lot of this like hustle culture, you know. This is all about, we don’t believe in like, having to wake up at 4am to have your whole three hour morning routine, including writing done, because a lot of us like have kids or have other kinds of commitments. So there is a lot of like kind of understanding that, you know, all of this has to work for real life. And not just for, I don’t know, people who have, yeah, men I guess who have a lot of support in the background traditionally, right? This is how research has been done. And yeah, even though we do have really lovely men in the program as well. So it’s not just women, but I guess this is kind of the approach that, yeah, we have in the community, in the academy.
Jennifer: I love that. So not hustle culture. More let’s learn these processes and have accountability together so that we can move towards this goal of publishing with kindness.
Anna: Yeah. It’s so funny, like this being kind. I mean, we often say like, “Be kind to yourself,” because sometimes we don’t achieve the goals we set, often we don’t achieve the goals we set ourselves, right? And what I always say is it’s a data point. Like, this was a really good data point this week, because just reflect on what happened. Oh, did your child get sick? Oh, there you go. So maybe you now need to have a process, what happens if my child gets sick? Because then, you can’t plan that, right? So you have to have, or it’s good to have in your kind of system, in your writing system, in your writing practice, that you account for that. Some kind of strategy, what you do when that happens. Or like, this took me a lot longer to complete, like, I thought I would get my introduction section done this week, but actually, I didn’t. Well, really good data point. Actually, maybe it takes you longer.
Look at how where you spend the time doing this section. This is really good to know for next time. Actually, maybe schedule one or two days more for this. So that’s kind of like the approach, the vibe that like is in there. So it’s not so, it’s not harsh.
Jennifer: Yeah, I like that vibe. That’s my kind of vibe.
Anna: Mine too. Yeah, mine too. And it really crystallized for me because I once was in a business coaching program where the vibe was really different. You probably remember me talking about this because I did tell you at the time, and it was so awful for me. And I really. . . but until then, it was really a bummer because I spent a lot of money on it.
Jennifer: And you’re like, “My community needs kindness and support for each other.”
Anna: This was my big learning. Apparently, I needed to spend a lot of money to really have this like so, so clear that this is not for me. Like the bro-y culture is not for me. I need the kindness. Because otherwise, it doesn’t work. I don’t work like that if someone tells me I have to, I don’t know, have all these non-negotiables everyday.
Jennifer: Yeah, like change who you are.
Anna: Yeah, like you just have to do it. Like it’s just about the discipline. You know, I don’t think that works. I honestly don’t think it works in the long term. Like maybe you can force yourself for like a few months or years and then you’re burning out or something. Like, I just don’t see how this is a sensible approach.
Jennifer: No. And I remember at the time you mentioned that you felt burned out. Like you were being affected by the culture that you were experiencing. So creating a warm culture for people inside your program, the Researchers’ Writing Academy is wonderful. Everyone gets to benefit from your research.
Anna: Right? Yeah!
Anna’s podcasting journey
Jennifer: So I want to chat a little bit about online presence because I mean, we met online, we mostly communicate online, but also like you have taken some actions this year in particular to have a stronger online presence through a new avenue, which is podcasting. I’m curious because when I started my podcast, it was like not very intentional. It was like, “Oh, I just better record this thing and like, it’s going to make it like a little more accessible than if it was just in writing.” And the podcast kind of evolved into a regular series after I had already decided to start it. Whereas you came in more with a plan, you had purpose, you had drive to do more episodes than I could imagine. And so what was it like to kind of get that spark of an idea that like, I want a podcast?
Anna: Yeah, I’ve had this, I mean, I had this desire for a long time. Many, many years. I always wanted to have a podcast.
Jennifer: Really?
Anna: Really because I listen to podcasts a lot. Like I’m really into them. And years ago, someone told me you would have such a good voice for podcasts. I was like, really? I don’t, because when you listen to your own voice, you’re like, “No, I don’t think so.” And I still don’t know whether this is really true, but I wanted to be more online. Like kind of, I wanted to have an online presence that wasn’t just social media.
Because honestly, I have such a weird relationship to social media, myself. It does like cognitively do something to my brain that isn’t always good, you know. Like hanging out there too much or getting sucked in, especially back on Twitter, now on Bluesky it’s a little bit like that too. There’s sometimes a lot of negativity. And I feel like people are too harsh, coming back to the being too harsh. I just can’t take it. Like, it’s not for me, but also just the fact that there’s just a lot going on there.
I wanted to be available to people somewhere else. And a podcast and I did actually simultaneously, like launch my podcast on YouTube as well. So it’s like a video podcast. That just made sense to me. Like, that just felt really aligned with what I like to consume, what I think my ideal clients like to consume. And where I also felt like I can like express myself, I guess, in a really good way. I mean, I do love writing, I do actually have a blog too. But it’s almost like when you have a blog, unless you’re like really, really good at SEO, which is a little hard in my niche, to be honest. Like nobody reads it, right? Unless you like amplify it through social media.
Jennifer: Actively sharing it. It’s its own marketing.
Anna: Yeah, yeah. So it’s still like social media connected. And I kind of wanted to have another avenue. Anyway, yeah. Talking also, I also like talking. So podcast made sense.
Jennifer: That’s amazing. When I started my podcast, it was kind of just like, you know, going on zoom and hitting record. What is your process like? Are there other people involved? What is the kind of behind the scenes for your podcast?
Anna: Yes, I have solo episodes. And I also have episodes with former clients or current clients actually, like members of the Researchers’ Writing Academy or alumni. And I also had one with one of my team members, our kind of client experience manager, Yvonne, where we talked about community. And I also had you on, right, as a guest expert. I think you’re the only guest expert actually we’ve had so far.
Jennifer: I feel so special. That’s amazing.
Anna: So yeah. The process for interviews, I would think of questions ahead of time. And we, for example, then chatted about the questions. This is also what I did with Yvonne. Just have a quick chat. I think both times it was written, like through Slack, just like, “Hey, does this make sense? Where do we want to go with this? Okay, maybe this should be a different discussion. Let’s focus on that.” And similar, actually, with the clients I interviewed. I would just send them a list of questions and be like,” Hey, you don’t need to prepare anything, but if you want to do” and then basically hop on and have a conversation and it’d be quite natural. And like this one where, you know, you don’t necessarily have to follow a script, you just go where it takes you.
For my solo episodes, it’s a little bit different where I do write an outline. And honestly, like, what surprised me was this took a lot of time. Even when I knew what I wanted to say, and maybe this is me being too perfect, too much of a perfectionist, because I would go back. So I’d write the outline, I would go back the next day or the day after I read it again and have more ideas. I’d be like, “No, no, this should be like this.” So, it took me a lot of time. But then also, I think the outlines got better and better and better. And then I was really, you know, proud of the episodes. I was like, “Yeah, I really expressed this, I think, in a good way.” Because what I did afterwards then is I took this transcript from that episode and turned those into a blog post.
Then with the blog post, I’m like, “Yeah, they’re really meaty. There’s so much in there.” Like, there’s so much longer than my other blog posts that were just blog posts without podcast episodes. So that was really interesting to me. Just like, you know, understanding I guess a little bit more about the process of writing or synthesizing ideas and concepts. And yeah, after the outline, I would record on my own, I would record the episodes with that outline like in front of me. So kind of a bullet point outline.
Jennifer: It sounds like your brain really likes the outlining process. And when you come back the second time, you have ideas to flesh it out and tell the story even better. That’s really cool.
Anna: Yeah, it was honestly really fun writing those outlines. Because recording sometimes, especially in the beginning, was a little more stressful than I expected. It was shockingly stressful because I’m on video a lot. I thought it would be rather easy to record because of my experience. And I think it would have been pretty easy if I had just done audio, but because I was also doing video, it felt a lot harder, because it’s really hard to read an outline and look in the camera at the same time.
Jennifer: Oh yeah.
Anna: Like really, really hard. And I also couldn’t spend even more time like rehearsing the outline to the point where I didn’t need to look at it anymore. Like I didn’t feel like that made sense. And I was really struggling with that. And I was just like, being a little unhappy about it. Because when I talk, like when I’m like, I’m on a lot of calls, you know, inside the Academy, for example, or like interviews like this. And I find, for me it’s quite natural already to look at the camera. Like, I look at the camera a lot. But when I have an outline, you know, it’s like you do look at it. It was so hard. And actually, you helped me a lot with that.
Yeah, because I was sharing this, that I was really unhappy with my recordings because of, I wasn’t looking at the camera. And you said, “Well, look, so many people aren’t even recording video for that exact reason. And you’re putting something out that is less perfect than you hope will still be so useful to the people, to people watching it. Honestly, that doesn’t matter.” And then I was like, “Yeah, this is like perfectionism.” It was all right. I just wanted to have it perfect. And I had a different standard for myself. But I didn’t need to be there. Like I was just not there. And that was totally fine. It didn’t need to be quite as polished as I thought maybe it should be.
Jennifer: Yeah, and I think that we don’t give ourselves enough grace for like our first things, right? Like the first episodes, like the first launch of something new. Like, we want it to be really great because it’s new and because it represents us. But sometimes like, we’re just not there in terms of our own practice or our own skills, like something may need to build or improve for us to get to where we dream about being. And that’s okay. I really didn’t think, I didn’t have those negative feelings when I started my podcast, but so many of my clients and so many of the people that I’ve met along the way have talked about the first maybe five or six episodes being just such a struggle.
Looking at themselves on video, listening to themselves speak, doing the editing themselves. It brought up all of those feelings about like watching themselves and what it would be like for other people to watch them. But the truth is that like you are watching yourself and doing all of those things more than anyone else is. Like, if someone else is watching it, they may not even listen to or watch the entire thing. And if they are, maybe they’re doing something else, like cleaning up their room. You know, if it’s a podcast, it’s not something that people will always sit there and like stare at your face and look at everything you did that was wrong. That’s what we’re doing.
Anna: Yeah, yeah. Yeah. You’re so right.
Jennifer: For me, this year I have Sir Nic who does all of this kind of sound editing for me and he’s here in the virtual studio with us making sound levels all good. And then my husband Matthew does the video editing. So I don’t have to look at myself anymore or listen to myself. And it is so nice! It’s, oh my goodness, it’s such a relief for me to have those things off my plate. Do you have support on your team for podcast things or is it just the people who are working on, you know, the different kind of accountability coaching and things that are in the program?
Anna: Yeah, I did have support. So I outsourced the editing, video and audio editing.
Jennifer: Love that.
Anna: I couldn’t have done it myself, honestly, like not so much. I mean, it takes a lot of time. I think people often underestimate just how much time this takes. And especially if you want the audio to be kind of good, you do want someone, an audio engineer I think. This was important to me to have like a decent microphone, decent audio. So I actually invested quite a lot in this space. I started recording in my former office. I’m not in there now anymore, but it had really high ceilings. So I put all these sound panels up, these like boards and I bought curtains that I now brought into this room as well to like reduce the echo. And that was just worth it to me. But yeah, I did have support. And then in-house, like on my team, my operations manager, she also helped me with the podcast. Like she would do a lot of like even reviewing episodes and suggesting maybe further edits. So I didn’t have to watch myself very much.
Jennifer: Oh, that’s great.
Anna: She would also take out little like clips from the episode that we then put on social media. Like as YouTube shorts, for example.
Jennifer: Yeah.
Anna: Yeah, so it was a really, really smooth process with a lot of support.
Jennifer: Yeah, getting support was something that I didn’t think my podcast deserved in the beginning, but now I feel like my listeners do. My listeners deserve that. If I can keep doing it for them, I’m going to. So I’m glad we got to chat about that because a lot of people are like, “Oh, I’m just going to go on Zoom and record.” And then maybe they’re surprised when the editing process is a lot longer. But also the first few episodes, if you’re starting something new like editing, like audio stuff, like even just being on video, it’s going to be hard. And it might not be as good as you want it to be at first, but it’s going to get better. It’s going to get better. Oh, before we… Oh, sorry. Go ahead.
Anna: No, no, no. I just said so true.
Social media for academics post-Elon
Jennifer: Well, I wanted to chat about the social media landscape and how things have been changing since Elon took over Twitter. I know you are on Bluesky now. I would love to hear a little bit about your experience of that platform.
Anna: Yeah, I’m on Bluesky now and I’m not on X or Twitter anymore. I mean, I do still have the account, but I don’t check it anymore. Some people are still finding me through there, though. That’s kind of interesting. I see it in my data, but I haven’t logged in in like months. Bluesky is very similar to Twitter, honestly, in the sense of the type of conversations that are happening there. But at least for me, there’s a lot less engagement than there was. And I’m actually wondering whether a lot of academics gave up on social media after Twitter went downhill, because there was this like really great academic community on Twitter through which I guess we met.
Jennifer: Yeah.
Anna: Back in the day. And I don’t see that happening on Bluesky. Bluesky does have a few other features, like additional features though that I really like. Like the way you can customize your feed a lot better. You can create those lists. So if you’re new to Bluesky, you can just like, there’s probably a list for researchers in your field.
Jennifer: Yeah, like the starter packs and the different lists you could put together.
Anna: Exactly, starter packs. That’s what it’s called. Yeah. So you can just like hit follow all and you already have a feed full of people you want to have in your feed. And getting started is kind of really cool on Bluesky. I do think, I don’t know, something is different about the algorithm over there, but I’m not an expert. I don’t really know, but it feels like not as much things are like going viral per se.
Jennifer: Yeah.
Anna: Maybe a little more one to one.
Jennifer: Yeah. Oh, that’s really interesting. When I first joined Bluesky, which was much later than everyone else (it was really just last month), I found that it was very quiet. I connected with the people that were like the most talkative on Twitter. I hadn’t run Sky Follower Bridge or any of the tools to help me get connected yet because I wanted to see what the platform was like naturally. Like if someone was just signing up for the first time without having been on Twitter. And I was able to find people pretty easily. Like the people that I most often talked to or connected with, guests on The Social Academic, those kinds of things. But I wasn’t finding conversations. Like the people who I knew from social media weren’t talking all that much. They weren’t posting original content the way that they had on other platforms.
And when I did run Sky Follower Bridge and found all of the people from Threads, from X, etc. I realized that like so many people had accounts that they just hadn’t connected with people yet. Like they, you know, maybe started their account during the big X exodus and then they connected with 12 people because that’s who they found when they first got there. And when they didn’t find their community, it’s like maybe they stopped logging in. And I think that’s really normal for people. Like you’re going to look for the warmth in the conversations or just like the people talking and watching it, being able to see it without even participating in it. Like if you don’t see when you get there, it’s kind of like, “Well, why am I going to spend time in this space?” I had to do a lot more work than I expected in order to find the conversations. And I had to connect with a lot more people without knowing that they were going to follow me back. Like without that anticipation in order for me to feel connected. But once I did that, once I was following, like I follow like over a thousand people now, once I did that, it started to feel like old Twitter to me. Like the community and conversation. Yeah, there’s a lot of people who aren’t talking there, but I was just surprised how much effort it took to get to that feeling. More than other platforms for me.
Anna: Do you enjoy it now? Like the way you liked Twitter?
Jennifer: You know, I don’t think I really enjoy any one social media platform over another anymore. I feel like my relationship with creating content has changed a lot in that I found more ease and I found less pressure and I found like good processes that work for me. And because of that, I don’t spend a lot of time on social media. Like I’m not on there browsing for conversations the way that I think I did when I was on X. Like old Twitter, I liked spending time there and jumping into conversations. And now social media is more, I don’t intentionally put in my day as much anymore. That’s what it is. And I like that. I like how my relationship with social media has changed. But no, I haven’t gone back to how I engaged in old Twitter, I think. What about you?
Anna: That makes sense. Yeah, it’s similar for me, actually. I have to say I go through phases with it. So I do put out like content on several platforms like Threads, Bluesky and LinkedIn and then like YouTube as shorts. And I do go in and kind of check, does anyone comment? Like is anyone starting a conversation? I do this several times a week. But I don’t get sucked in as much anymore, if ever. Yeah, and I’m like super intentional about the time I spend there, I guess.
Jennifer: How are you intentional?
Anna: Well, I kind of set myself a timer as well.
Jennifer: Oh, like a literal timer.
Anna: So I don’t let myself like do more than, I don’t know, five minutes per platform.
Jennifer: Really?!
Anna: If there is like, of course, if there is comments, like actual, interesting conversations to join, I will, you know, override, but I’m really trying not to, not to get sucked in because it’s so easy for me. I don’t know. My brain is really-
Jennifer: That is really smart. I’ve never set a timer for that short amount of time. I’ll be like 30 minutes, you know, 30 minutes a day. Like if I’m going to have a timer maybe that’s what I would set it for. But five minutes is so much more specific, direct. That would wake my brain up. I should try something like that if I get sucked in again.
Anna: Yeah, I like it. I do like it. And because now I feel like the social media landscape for academics has changed in a way. There used to be, or for me there used to be, just Twitter. I was basically just on Twitter and I didn’t really do anything on any other platform, whereas now it’s a lot more spread out. And, I don’t know, there’s good and bad things about that. But now I feel like, “Okay, I need to spend time on LinkedIn. I need to spend time on Bluesky and on Threads.” So, you know, I just can’t spend like that much time anymore on just one platform. So it has to be kind of a bit more time efficient.
Jennifer: Okay, so you’re on Bluesky, Mastodon, YouTube, LinkedIn-
Anna: I’m not on Mastodon. Threads.
Jennifer: Not on Mastodon. Threads, LinkedIn and YouTube.
Where can people find your blog and your podcast? I want people to be able to get connected with you after this.
Anna: Thank you so much for that lovely conversation. And it was so fun finally being a guest on your show.
Jennifer: I’m so happy. Anna, I am so happy to have shared the Researchers’ Writing Academy with people because I really believe in your program. I believe in the process. And I know that you’re someone who goes in and updates things and improves them. And so I’ve always recommended the Researchers’ Writing Academy to professors. And I really encourage you if you’re listening to this to check it out.
Jennifer receives no money or gifts when you sign up for the Researchers’ Writing Academy or any of the other recommendations she shares on The Social Academic.
Dr Anna Clemens is an academic writing coach who specializes in scientific research papers. She runs the Researchers’ Writing Academy, an online course where she helps researchers get published in high-ranking journals by following a structured writing process.
Sign up for Anna’s free training on how to develop a structured writing process to get published in top-tier journals efficiently.
Months after individual researchers, advocacy groups and a coalition of Democratic state attorneys general filed two lawsuits against the National Institutes of Health for terminating hundreds of active research grants it deemed misaligned with the Trump administration’s ideologies, some scientists are hopeful that the agency will soon restore the grants and allow them to resume their research.
Last week, a federal judge in Massachusetts ordered the NIH to restore the roughly 900 grants named in the lawsuits, including many focused on studying vaccine hesitancy, LGBTQ+ health and diversity, equity and inclusion in the medical field. U.S. District Judge William Young, who was appointed by President Ronald Reagan, ruled the terminations void and unlawful, stating during a hearing that in all his years on the bench he’d “never seen” discrimination by the government to this extent.
Although Science reported Thursday morning that the NIH has internally communicated plans to restore those grants “as soon as practicable”—and also cease further grant terminations—researchers say they still don’t know when they can expect to get the money they were promised.
“Since the ruling, we are really encouraged,” said Heidi Moseson, a plaintiff in one of the cases and a senior researcher at Ibis Reproductive Health. “But we haven’t heard anything from the NIH about our grants being reinstated, and we don’t have a window into what that process looks like.”
Back in March, Moseson received a letter from the agency terminating her grant, which was aimed at improving the accuracy of data collected in sexual and reproductive health research for all people, including those who identify as transgender and gender diverse. The award “no longer effectuates agency priorities,” the letter said. “Research programs based on gender identity are often unscientific, have little identifiable return on investment, and do nothing to enhance the health of many Americans.”
The NIH did not respond to Inside Higher Ed’s request for comment on its specific plans for restoring the terminated grants.
Appeal Anxiety
Moseson said each week that goes by with the grant on pause “is another week where people are not being appropriately screened into clinical care and research that would be relevant for their bodies, leading to missed preventative care or, conversely, unnecessary preventive care.”
While her team is ready to resume their research as soon as the NIH restores the funding in accordance with the judge’s ruling, she’s bracing for further disruptions ahead, depending on what happens with the appeals process.
On Monday, the NIH filed a notice of appeal with the U.S. Court of Appeals for the First Circuit. It also filed a motion to stay the judge’s order to restore the grants pending the appeal, but Young denied that motion on Tuesday, noting that a stay “would cause irreparable harm to the plaintiffs.”
“This is a case in equity concerning health research already bought and paid for by the Congress of the United States through funds appropriated for expenditure and properly allocated during this fiscal year,” the judge wrote. “Even a day’s delay further destroys the unmistakable legislative purpose from its accomplishment.”
The following day, Michelle Bulls, a senior NIH official who oversees extramural funding, told staffers in an email that the agency must restore funding for the hundreds of projects identified by the plaintiffs, Science reported. “Please proceed with taking action on this request as part of the first phase of our compliance with the court’s judgment,” Bulls wrote, noting that “additional information is forthcoming.”
Noam Ross, executive director at rOpenSci, a nonprofit that supports reproducible open research, and co-founder of the website Grant Watch, which is tracking grant terminations, put out a call for information on LinkedIn Wednesday about any grants the NIH has restored. But he told Inside Higher Ed Thursday afternoon that he has yet to receive any verified reports of restored NIH grants.
Shalini Goel Agarwal, counsel for Protect Democracy, a nonprofit focused on combating perceived authoritarian threats, and one of the lawyers representing the plaintiffs, said Thursday morning that she also had not yet heard of any researchers getting grant money the NIH previously terminated.
Though it’s not clear what could come of the government’s effort to appeal Young’s ruling, “at this moment the judge’s order is in effect and the NIH should be returning money to the researchers whose grants were terminated,” she said. “NIH should right now be undoing the effects of its directives.”
‘Cautiously Optimistic’
Katie Edwards, a social work professor at the University of Michigan and a plaintiff in one of the cases, said that as of Thursday afternoon, she had yet to receive any communication from the NIH about its plans to restore her numerous multiyear grants.
Edwards, whose research focuses on Indigenous and LGBTQ+ youth, said that delaying the grants much longer will undermine the research she’s already started, to the detriment of public health research.
“For some of our studies, it’s just a matter of weeks before they’ll be really hard if not impossible to restart. I’m feeling a lot of anxiety,” she said. “We’re in a waiting phase, but I’m trying to be cautiously optimistic.”
Despite the uncertainty of what’s ahead, she did get some reassuring news from the NIH on Thursday. The agency notified her that it approved her bid for a new three-year, $710,000 grant to develop and evaluate a self-defense program for adult women survivors of sexual violence. Like many other applications for new grants, the application had been in limbo for months. “So something (good??) is going on there!” she said in an email.
Other cases moving through the courts also look promising for federally funded researchers eager to get their grants restored.
On Monday, U.S. District Court Judge Rita Lin ruled that the Environmental Protection Agency, the National Science Foundation and the National Endowment for the Humanities had also unlawfully terminated grants that had already been awarded to researchers in the University of California’s 10-campus system. The judge, a Biden appointee, ordered the government to restore them, adding that she is weighing extending the order to 13 other federal agencies, including the NIH.
“Many of the cases that are making their way through the courts share claims that are being made about the illegality of the federal government’s actions,” said Olga Akselrod, counsel for the American Civil Liberties Union and a lawyer representing the plaintiffs in one of the suits against the NIH. “Any time we have a win in one of these cases it’s an important statement of the applicable law, and that’s relevant for all of the cases that are proceeding.”
The courts have pushed back against much of President Donald Trump’s agenda, but he did win a small victory this week in a dispute with education researchers.
On June 3, a federal judge in Washington, D.C., denied a request by four education research trade associations for a preliminary injunction, which means that the Education Department doesn’t have to temporarily reinstate fired employees and canceled contracts within its research and data arm, the Institute of Education Sciences.
Researchers had hoped to return the research division to its pre-Trump status while the court takes time to decide the overall issue in the case, which is whether the Trump administration exceeded its executive authority in these mass firings and contract terminations. Now, the cuts in the research arm of the department will remain while the case proceeds.
Four education research groups, the Association for Education Finance and Policy (AEFP), the Institute for Higher Education Policy (IHEP), the National Academy of Education (NAEd) and the National Council on Measurement in Education (NCME), are suing the Education Department because their federally funded studies, evaluations and surveys have been slashed and their access to data is slated to be curtailed. They also contend that historical data archives are at risk, along with future data quality. Their legal argument is that the cuts were arbitrary and capricious, and they say that the Trump administration eliminated many activities that Congress requires by law.
U.S. District Judge Trevor McFadden acknowledged that the “upheaval” at the Institute of Education Sciences is “understandably jarring for those who rely on studies and data produced by the Institute.” However, McFadden explained in a written opinion that the law that the researchers are using to sue the executive branch, the Administrative Procedure Act, was “never meant to be a bureaucratic windbreak insulating agencies from political gales.”
“It is not this Court’s place to breathe life back into wide swathes of the Institute’s cancelled programs and then monitor the agency’s day-to-day statutory compliance,” McFadden wrote.
In the opinion, McFadden noted that some of the researchers’ complaints, such as losing remote access to student data for research purposes, may be “ripe for standalone challenges,” but bundling all of their grievances together is a “losing gambit.”
The ruling not only denied researchers the short-term remedy they sought but also cast doubt on the prospects of their overall case. “We are disappointed with and disagree with the Court’s decision, and are evaluating our next steps,” said Adam Pulver, an attorney at Public Citizen, a nonprofit advocacy organization representing two of the research organizations.
A federal judge in Maryland is still considering a similar request to temporarily restore research-related cuts at the Education Department by two other education research groups. That suit, which also accuses the Trump administration of exceeding its executive power, was brought by the American Educational Research Association (AERA) and the Society for Research on Educational Effectiveness (SREE).
Educators fighting the cuts have had one victory so far, in a separate case filed in federal district court in Boston. On May 22, U.S. District Judge Myong Joun ordered the Trump administration to reinstate 1,300 Education Department employees terminated in March. The Trump administration is challenging the decision, but the court said on June 4 that the Education Department couldn’t postpone rehiring everyone while the appeal works its way through the courts. This case was brought by two Massachusetts school districts, a teachers union and 21 Democratic attorneys general.
This story about education researchers suing the Trump administration was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.
The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.
The Royal Society has announced a $40 million fund designed to attract global research talent to the U.K.
The Faraday Fellowship “accelerated international route” will provide up to $5.4 million per academic or group willing to relocate to British universities and research institutes, over a period of five to 10 years. The society said that it would be willing to consider larger awards “in exceptional circumstances.”
Adrian Smith, president of the Royal Society, said that international science was “in a state of flux with some of the certainties of the postwar era now under question.
“With funding streams and academic freedom coming under threat, the best scientific talent will be looking for stability. The U.K. can be at the front of the queue in attracting that talent,” Smith said.
“Our new opportunity, combined with schemes from [UK Research and Innovation] and the Royal Academy of Engineering, is a step in the right direction.”
UPDATE: The hearing scheduled for May 9 has been postponed until May 16 at the U.S. District Court for the District of Columbia. The court will hear two similar motions at the same time and consider whether to temporarily restore the cuts to research and data collections and bring back fired federal workers at the Education Department. More details on the underlying cases in the article below.
Some of the biggest names in education research — who often oppose each other in scholarly and policy debates — are now united in their desire to fight the cuts to data and scientific studies at the U.S. Department of Education.
The roster includes both Grover J. “Russ” Whitehurst, the first head of the Institute of Education Sciences (IES) who initiated studies for private school vouchers, and Sean Reardon, a Stanford University sociologist who studies inequity in education. They are just two of the dozens of scholars who have submitted declarations to the courts against the department and Secretary Linda McMahon. They describe how their work has been harmed and argue that the cuts will devastate education research.
Professional organizations representing the scholars are asking the courts to restore terminated research and data and reverse mass firings at the Institute of Education Sciences, the division that collects data on students and schools, awards research grants, highlights effective practices and measures student achievement.
Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.
Three major suits were filed last month in U.S. federal courts, each brought by two different professional organizations. The six groups are the Association for Education Finance and Policy (AEFP), Institute for Higher Education Policy (IHEP), American Educational Research Association (AERA), Society for Research on Educational Effectiveness (SREE), National Academy of Education (NAEd) and the National Council on Measurement in Education (NCME). The American Educational Research Association alone represents 25,000 researchers and there is considerable overlap in membership among the professional associations.
Prominent left-wing and progressive legal organizations spearheaded the suits and are representing the associations. They are Public Citizen, Democracy Forward and the Legal Defense Fund, which was originally founded by the National Association for the Advancement of Colored People (NAACP) but is an independent legal organization. Allison Scharfstein, an attorney for the Legal Defense Fund, said education data is critical to documenting educational disparities and improving education for Black and Hispanic students. “We know that the data is needed for educational equity,” Scharfstein said.
Officers at the research associations described the complex calculations in suing the government, mindful that many of them work at universities that are under attack by the Trump administration and that its members are worried about retaliation.
“A situation like this requires a bit of a leap of faith,” said Elizabeth Tipton, president of the Society for Research on Educational Effectiveness and a statistician at Northwestern University. “We were reminded that we are the Society for Research on Educational Effectiveness, and that this is an existential threat. If the destruction that we see continues, we won’t exist, and our members won’t exist. This kind of research won’t exist. And so the board ultimately decided that the tradeoffs were in our favor, in the sense that whether we won or we lost, that we had to stand up for this.”
The three suits are similar in that they all contend that the Trump administration exceeded its executive authority by eliminating activities Congress requires by law. Private citizens or organizations are generally barred from suing the federal government, which enjoys legal protection known as “sovereign immunity.” But under the Administrative Procedure Act of 1946, private organizations can ask the courts to intervene when executive agencies have acted arbitrarily, capriciously and not in accordance with the law. The suits point out, for example, that the Education Sciences Reform Act of 2002 specifically requires the Education Department to operate Regional Education Laboratories and conduct longitudinal and special data collections, activities that the Education Department eliminated in February among a mass cancellation of projects.
The suits argue that it is impossible for the Education Department to carry out its congressionally required duties, such as the awarding of grants to study and identify effective teaching practices, after the March firing of almost 90 percent of the IES staff and the suspension of panels to review grant proposals. The research organizations argue that their members and the field of education research will be irreparably harmed.
Of immediate concern are two June deadlines. Beginning June 1, researchers are scheduled to lose remote access to restricted datasets, which can include personally identifiable information about students. The suits contend that this loss harms the ability of researchers to finish projects in progress and plan future studies. The researchers say they are also unable to publish or present studies that use this data because there is no one remaining inside the Education Department to review their papers for any inadvertent disclosure of student data.
The second concern is that the termination of more than 1,300 Education Department employees will become final by June 10. Technically, these employees have been on administrative leave since March, and lawyers for the education associations are concerned that it will be impossible to rehire these veteran statisticians and research experts for congressionally required tasks.
The suits describe additional worries. Outside contractors are responsible for storing historical datasets because the Education Department doesn’t have its own data warehouse, and researchers are worried about who will maintain this critical data in the months and years ahead now that the contracts have been canceled. Another concern is that the terminated contracts for research and surveys include clauses that will force researchers to delete data about their subjects. “Years of work have gone into these studies,” said Dan McGrath, an attorney at Democracy Forward, who is involved in one of the three suits. “At some point it won’t be possible to put Humpty Dumpty back together again.”
In all three of the suits, lawyers have asked the courts for a preliminary injunction to reverse the cuts and firings, temporarily restoring the studies and bringing federal employees back to the Education Department to continue their work while the judges take more time to decide whether the Trump administration exceeded its authority. A first hearing on a temporary injunction is scheduled for Friday in federal district court in Washington.*
A lot of people have been waiting for this. In February, when DOGE first started cutting non-ideological studies and data collections at the Education Department, I wondered why Congress wasn’t protesting that its laws were being ignored. And I was wondering where the research community was. It was so hard to get anyone to talk on the record. Now these suits, combined with Harvard University’s resistance to the Trump administration, show that higher education is finally finding its voice and fighting what it sees as existential threats.
The three suits:
Public Citizen suit
Plaintiffs: Association for Education Finance and Policy (AEFP) and the Institute for Higher Education Policy (IHEP)
Attorneys: Public Citizen Litigation Group
Defendants: Secretary of Education Linda McMahon and the U.S. Department of Education
Date filed: April 4
Where: U.S. District Court for the District of Columbia
A concern: Data infrastructure. “We want to do all that we can to protect essential data and research infrastructure,” said Michal Kurlaender, president of AEFP and a professor at the University of California, Davis.
Status: Public Citizen filed a request for a temporary injunction on April 17 that was accompanied by declarations from researchers on how they and the field of education have been harmed. The Education Department filed a response on April 30. A hearing is scheduled for May 9.
Democracy Forward suit
Plaintiffs: American Educational Research Association (AERA) and the Society for Research on Educational Effectiveness (SREE)
Attorneys: Democracy Forward
Defendants: U.S. Department of Education, Institute of Education Sciences, Secretary of Education Linda McMahon and Acting Director of the Institute of Education Sciences Matthew Soldner
Date filed: April 14
Where: U.S. District Court for the District of Maryland, Southern Division
A concern: Future research. “IES has been critical to fostering research on what works, and what does not work, and for providing this information to schools so they can best prepare students for their future,” said Ellen Weiss, executive director of SREE. “Our graduate students are stalled in their work and upended in their progress toward a degree. Practitioners and policymakers also suffer great harm as they are left to drive decisions without the benefit of empirical data and high-quality research,” said Felice Levine, executive director of AERA.
Status: A request for a temporary injunction was filed April 29, accompanied by declarations from researchers on how their work is harmed.
Legal Defense Fund suit
Plaintiffs: National Academy of Education (NAEd) and the National Council on Measurement in Education (NCME)
Attorneys: Legal Defense Fund
Defendants: The U.S. Department of Education and Secretary of Education Linda McMahon
Date filed: April 24
Where: U.S. District Court for the District of Columbia
A concern: Data quality. “The law requires not only data access but data quality,” said Andrew Ho, a Harvard University professor of education and former president of the National Council on Measurement in Education. “For 88 years, our organization has upheld standards for valid measurements and the research that depends on these measurements. We do so again today.”
Status: A request for a temporary injunction was filed May 2.*
* Correction: This paragraph was corrected to make clear that lawyers in all three suits have asked the courts to temporarily reverse the research and data cuts and personnel firings. Also, May 9th is a Friday, not a Thursday. We regret the error.
The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers.
We know that the use of generative AI in research is now ubiquitous. But universities have limited understanding of who is using large language models in their research, how they are doing so, and what opportunities and risks this throws up.
The University of Edinburgh hosts the UK’s first and largest group of AI expertise – so naturally, we wanted to find out how AI is being used. We asked our three colleges to survey how their researchers were using generative AI, to inform what support we provide, and how.
Using AI in research
The most widespread use, as we would expect, was to support communication: editing, summarising and translating texts or multimedia. AI is helping many of our researchers to correct language, improve clarity and succinctness, and transpose text to new mediums including visualisations.
Our researchers are increasingly using generative AI for retrieval: identifying, sourcing and classifying data of different kinds. This may involve using large language models to identify and compile datasets, bibliographies, or to carry out preliminary evidence syntheses or literature reviews.
Many are also using AI to conduct data analysis for research. Often this involves developing protocols to analyse large data sets. It can also involve more open searches, with large language models detecting new correlations between variables, and using machine learning to refine their own protocols. AI can also test complex models or simulations (digital twins), or produce synthetic data. And it can produce new models or hypotheses for testing.
AI is of course evolving fast, and we are seeing the emergence of more niche and discipline-specific tools. For example, self-taught reasoning models (STaRs) can generate rationales that can be fine-tuned to answer a range of research questions. Or retrieval-augmented generation (RAG) can enable large language models to access external data that enhances the breadth and accuracy of their outputs.
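To make the RAG idea concrete, here is a toy sketch of the retrieval step: pick the stored passages most similar to a query and prepend them to the prompt before it reaches a model. Everything here – the bag-of-words “embedding”, the sample corpus, the prompt template – is illustrative only, not any production retrieval system.

```python
from collections import Counter
import math

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, corpus, k=2):
    """Return the k passages most similar to the query."""
    q = embed(query)
    return sorted(corpus, key=lambda p: cosine(q, embed(p)), reverse=True)[:k]

def build_prompt(query, corpus, k=2):
    """Augment the prompt with retrieved context before sending it to a model."""
    context = "\n".join(retrieve(query, corpus, k))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "RAG lets language models consult external documents.",
    "Digital twins simulate complex physical systems.",
    "Synthetic data can substitute for sensitive records.",
]
print(build_prompt("How do models use external documents", corpus, k=1))
```

Real deployments replace the bag-of-words scoring with dense vector embeddings and a vector index, but the shape of the pipeline – retrieve, then augment the prompt – is the same.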
Across these types of use, AI can improve communication and significantly save time. But it also poses significant risks, which our researchers were generally alert to. These involve well-known problems with accuracy, bias and confabulation – especially where researchers use AI to identify new (rather than test existing) patterns, to extrapolate, or to underpin decision-making. There are also clear risks around sharing of intellectual property with large language models. And not least, researchers need to clearly attribute the use of AI in their research outputs.
The regulatory environment is also complex. While the UK does not as yet have formal AI legislation, many UK and international funders have adopted guidelines and rules. For example, the European Union has a new AI Act, and EU funded projects need to comply with European Commission guidelines on AI.
Supporting responsible AI
Our survey has given us a steer on how best to support and manage the use of AI in research – leading us to double down on four areas that require particular support:
Training. Not surprisingly, the use of generative AI is far more prevalent among early-career researchers. This raises issues around training, supervision and oversight. Our early-career researchers need mentoring and peer support. But more senior researchers don’t necessarily have the capacity to keep pace with the rapid evolution of AI applications.
This suggests the need for flexible training opportunities. We have rolled out a range of courses, including three new basic AI courses to get researchers started in the responsible use of AI in research, and online courses on ethics of AI.
We are also ensuring our researchers can share peer support. We have set up an AI Adoption Hub, and are developing communities of practice in key areas of AI research – notably research in AI and Health, which is one of the most active areas. A similar initiative is being developed for AI and Sustainability.
Data safety. Our researchers are rightly concerned about feeding their data into large language models, given complex challenges around copyright and attribution. For this reason, the university has established its own interface with the main commercial large language models, including ChatGPT – the Edinburgh Language Model (ELM). ELM provides safer access to large language models, operating under a “zero data retention” agreement so that data is not retained by OpenAI. We are encouraging our researchers to develop their own application programming interfaces (APIs), which allow them to provide more specific instructions to enhance their results.
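As a rough illustration of what “more specific instructions” via an API can look like, the sketch below assembles a request payload in the common chat-completions style. The model name, the system prompt and the payload shape are all hypothetical stand-ins, not the actual ELM interface; a real service defines its own parameters.

```python
import json

def build_request(task, text, model="elm-hosted-model"):
    """Assemble a chat-style request with explicit research-oriented instructions.

    The payload shape and model name are illustrative placeholders; a real
    service (ELM, OpenAI, etc.) defines its own parameters.
    """
    return {
        "model": model,
        "temperature": 0,  # deterministic output aids reproducibility
        "messages": [
            {"role": "system",
             "content": ("You are assisting academic research. "
                         "Flag uncertainty explicitly and never invent sources.")},
            {"role": "user", "content": f"{task}\n\n{text}"},
        ],
    }

payload = build_request("Summarise the key findings in two sentences:",
                        "…pasted abstract…")
print(json.dumps(payload, indent=2))
```

The point is the system message: programmatic access lets a researcher pin down instructions (and settings such as temperature) per request, rather than relying on a chat window’s defaults.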
Ethics. AI in research throws up a range of challenges around ethics and integrity. Our major project on responsible AI, BRAID, and ethics training by the Institute for Academic Development, provide expertise on how we adapt and apply our ethics processes to address the challenges. We also provide an AI Impact Assessment tool to help researchers work through the potential ethical and safety risks in using AI.
Research culture. The use of AI is ushering in a major shift in how we conduct research, raising fundamental questions about research integrity. When used well, generative AI can make researchers more productive and effective, freeing time to focus on those aspects of research that require critical thinking and creativity. But it also creates incentives to take short cuts that can compromise the rigour, accuracy and quality of research. For this reason, we need a laser focus on quality over quantity.
Groundbreaking research is not done quickly, and the most successful researchers do not churn out large volumes of papers – the key is to take time to produce robust, rigorous and innovative research. This is a message that will be strongly built into our renewed 2026 Research Cultures Action Plan.
AI is helping our researchers drive important advances that will benefit society and the environment. It is imperative that we tap the opportunities of AI, while avoiding some of the often imperceptible risks in its misuse. To this end, we have decided to make AI a core part of our Research and Innovation Strategy – ensuring we have the right training, safety and ethical standards, and research culture to harness the opportunities of this exciting technology in an enabling and responsible way.
Additionally, the Trump administration has variously moved to cancel or suspend research contracts and grants at Columbia University, the University of Pennsylvania and most recently Princeton University as part of punitive actions tied to investigations of campus antisemitism or, in Penn’s case, the decision to allow a trans woman to compete on the women’s swim team three years ago. The administration also briefly froze (and then unfroze) United States Department of Agriculture funds for the University of Maine system after the state’s governor engaged in a tense exchange with President Trump at the White House.
Below, 15 researchers across nine different research areas who have had their federal grants terminated since the start of the Trump administration share just a few of the thousands of stories behind these cuts.
—Elizabeth Redden, opinion editor
Preventing Intimate Partner Violence
Prostock-Studio/iStock/Getty Images Plus
By Rebecca Fielding-Miller, Nicholas Metheny and Sarah Peitzmeier
Each year, more than 3,000 American women are murdered by their partners. Pregnancy and the postpartum period are high-risk periods for intimate partner violence (IPV), which is linked to negative maternal outcomes such as miscarriage, hemorrhage and postpartum depression. Perinatal IPV is also linked to worse infant health outcomes, such as preterm birth and low birth weight, and to adverse childhood experiences. This makes prevention of perinatal IPV crucial not just for the survivor but for the entire family.
Perinatal IPV and its cascade of negative outcomes are preventable—but only if we study the epidemiology and prevention of IPV as rigorously as we study hypertension or any other perinatal complication. A grant rescinded last month by the NIH would have trained a cohort of 12 early-career clinicians and researchers to learn how to study IPV as part of their ongoing research on pregnancy, birth and the postpartum period. We proposed training investigators working in diverse communities across the spectrum of America, with a commitment to including communities disproportionately impacted by IPV and maternal mortality, including Black and LGBTQ+ communities. To solve a problem with constrained resources, it is efficient to focus efforts on where the problem is most severe. While the termination letter named this targeting of training resources an “amorphous equity objective,” we call it a data-driven approach to rigorous science.
Training grants like this one help shift an entire field by giving young investigators the skills and knowledge to add a focus on IPV to their research for the next several decades. In addition to training these 12 young researchers, the grant would have also supported turning the mentorship curriculum we developed into an open-access online training for clinicians and researchers to access in perpetuity, multiplying the impact of the work to train even more investigators in the field. As with the approximately 700 other terminated NIH grants, cutting this work before our aims are realized but after significant costs have been incurred to establish the mentorship team and design the curriculum is the definition of government inefficiency and waste.
With this grant rescinded, none of the promised training will occur. Pregnant people and their babies from every community across America will continue to suffer, without the benefit of advances in the science of how we prevent these violence exposures. Our termination notice claims that the proposed trainings are “antithetical to the scientific inquiry, do nothing to expand our knowledge of living systems, provide low returns on investment, and ultimately do not enhance health, lengthen life, or reduce illness.” We could not disagree more. Anyone who has cared for a child or for the person who gave birth to them knows that preventing maternal and infant death and abuse should be a nonpartisan issue. The current administration is intent on making even this issue into “us” versus “them.” When it comes to public health, there is no such thing. American families deserve better.
Rebecca Fielding Miller is an associate professor of public health at the University of California, San Diego. Her research focuses on health disparities in infectious disease and gender-based violence.
Nicholas Metheny is an Atlanta-area scientist and registered nurse with clinical and research experience in the post-violence care of women and sexual and gender minority communities.
Sarah Peitzmeier is an assistant professor at the University of Maryland School of Public Health who develops and tests interventions to prevent gender-based violence. She is also a practicing birth doula and victim advocate.
Is Work-Study Working?
Okrasyuk/iStock/Getty Images Plus
By Judith Scott-Clayton
On March 7, at 9:49 a.m., I received an email with “GRANT AWARD TERMINATION” in all caps in the subject line. Attached to the email was a letter, addressed to me as project director and referring to our Department of Education grant by its award number. The letter was generic, virtually identical to three other termination letters received that day at the Community College Research Center at Columbia University’s Teachers College, where I am affiliated. It did not mention our project title or provide any project-specific details to explain why our project, as the email states, “is now inconsistent with, and no longer effectuates, the Department’s priorities.” A few hours later, I received a formal notification that the grant end date was that day: March 7, 2025.
The project—a collaboration with Adela Soliz of Vanderbilt University and Tom Brock of CCRC—was titled “Does Federal Work-Study Work for Students? Evidence From a Randomized Controlled Trial.” The Federal Work-Study (FWS) program was created in 1964 as part of the Economic Opportunity Act and covers up to 75 percent of the wages of college students working part-time in mostly on-campus jobs, with colleges paying the rest. In a typical year, the program provides more than $1 billion in support to more than 450,000 college students with financial need at more than 3,000 institutions all across the country. Several states also have their own similar programs.
Our study would be the first to rigorously evaluate the causal impact of the program on students’ enrollment, employment, persistence and degree completion. We were also conducting interviews, focus groups and surveys to understand how students find FWS jobs, what kinds of work they do, what resources institutions devote to running the program and how much it all costs to operate, all with the goal of ensuring the program is delivering the maximum impact for every single student that participates and for every dollar spent.
At the time of its cancellation, we were about four and a half years into a six-year project. We were right in the middle of randomizing what would be the final cohort of our study sample and fielding the final round of a student survey. This final year is especially important, because the early cohorts were heavily impacted by the pandemic. For the past three weeks, we have been scrambling to pull together any other resources we could find to preserve our options and avoid losing this final cohort of participants. We have also been scrambling to figure out how to continue to pay critical staff and doctoral students involved in the project until we can figure out the next steps.
As for the broader impact of the termination: The Federal Work-Study program itself will keep on going, at least for now; we just won’t know whether it works or not. We hypothesize that it may provide valuable work-based learning opportunities that keep students engaged and give them advantages in the labor market after college, but it’s possible that it distracts students from their studies and hurts their academic performance. We may think that it helps students to afford college, but perhaps the complexity of finding a specific job and navigating all the necessary paperwork reduces its value for the students who need help the most. The next time the program is up for debate, policymakers will be flying blind: Without actual evidence, all we can do is speculate.
Since 1964, the FWS program has disbursed more than $95 billion in awards. In comparison, our grant was less than three-thousandths of 1 percent of that amount, and the amount remaining to finish our work and share our findings with the public was just a fraction of that. Our project was motivated by a desire to help policymakers ensure that every dollar invested in financial aid has the maximum possible impact for low-income students. So it is discouraging to learn, so close to the finish line, that this first-of-its-kind evaluation of a major federal program is “now inconsistent with, and no longer effectuates, the Department’s priorities.”
Judith Scott-Clayton is a professor of economics and education at Teachers College, Columbia University, in the Department of Education Policy and Social Analysis, where she directs the Economics and Education Program and teaches courses on the economics of education, labor economics and causal inference.
Democracy Research
AlexeyPushkin/iStock/Getty Images Plus
By Rob Blair, Jessica Gottlieb, Laura Paler and Julie Anne Weaver
We lost funding for the Democratic Erosion Consortium (DEC) as part of the federal government’s recent cancellation of foreign assistance grants. Directed by scholars at Brown University, the University of Houston and American University, DEC works to make academic research on democratic backsliding accessible to policymakers and practitioners seeking evidence-based strategies to defend democracy around the globe.
Originally launched in 2017 on a shoestring budget, DEC began as an effort to improve pedagogy on a troubling trend observable both abroad and at home: the strategic dismantling of democratic norms and institutions by elected leaders with autocratic ambitions. In 2022, in line with the U.S. government’s dual interests in democratic resilience and evidence-based policymaking, we received a grant from the State Department to expand DEC’s work.
The State Department’s investment enabled us to grow our reach beyond the classroom and into the policy arena. We drew on an expanding network of scholars to synthesize evidence on urgent questions—such as how to reduce the spread of misinformation and measure democratic decline. We also built out a novel event data set on democratic erosion and trained partners around the world to use it in their own work.
The immediate consequences are clear: several full- and part-time staff lost funding for their jobs. But the long-term damage is hard to quantify. It’s difficult to argue for the value of evidence-based policymaking in foreign aid when the entire category of foreign assistance has effectively been gutted. More than that, the partnerships we built between academics, practitioners and policymakers were yielding real-time insights and responses—a rare example of successful research-policy collaboration. That infrastructure is now gone.
And at a moment when democratic backsliding is accelerating in many parts of the world, the U.S. government is stepping away from efforts to understand and counter it. Ending this grant not only weakens the ability to monitor democratic erosion globally, it also reduces public awareness and understanding of a phenomenon that is increasingly visible in the U.S. itself.
With the federal policy audience for our work largely gone, we are refocusing our efforts on our other two core constituencies: students and academics. We continue to support instructors engaged in teaching our democratic erosion course and to improve the Democratic Erosion Event Dataset. And in response to growing concern about democratic backsliding in the U.S., we’re developing a more robust domestic data-collection effort, paired with public engagement.
Given intense partisan disagreement around what even constitutes democratic erosion, we are seeking to increase the credibility of new evidence by capturing partisan-diverse perspectives and applying our established comparative framework to U.S. events. We are hoping to continue this work, despite the loss of our federal grant, because the political reality in the U.S. and around the world tells us we need to be worried about democratic erosion now more than ever.
Rob Blair is the Arkadij Eisler Goldman Sachs Associate Professor of Political Science and International and Public Affairs at Brown University.
Jessica Gottlieb is an associate professor at the University of Houston’s Hobby School of Public Affairs.
Laura Paler is an associate professor in the Department of Government in the School of Public Affairs at American University.
Julie Anne Weaver is the research director of the Democratic Erosion Consortium and a lecturer on government at Harvard University.
COVID-19 and Related Immunology Research
peterschreiber.media/iStock/Getty Images Plus
By Matthew Woodruff
On March 24, 2020, I stood in a Biosafety Level 2+ facility at Emory University with six colleagues being taught best practices for working with the largely unknown pathogen, SARS-CoV-2. Other unknowns included where we would get masks (N95s were unavailable), risks of infection to our young kids at home and who would pay for the experiments needed to gain insight into the deadly new virus sweeping across the nation.
That last question was answered relatively quickly. Rapid investment by the first Trump administration’s NIH launched SeroNet, a five-year effort across 25 institutions to “expand the nation’s capacity for SARS-CoV-2 serologic testing on a population-level and advance research on the immune response to SARS-CoV-2 infection and COVID-19 vaccination among diverse and vulnerable populations.” We did just that. Over the coming years, taxpayer dollars funded more than 600 peer-reviewed publications, reflecting significant advances in disease pathology, treatment strategies, disease impact in immunocompromised patients, vaccine testing and more.
Our team at Emory led projects dedicated to understanding the balance between productive and pathogenic immunity in hopes of alleviating disease. We discovered why your immune system sometimes turns on itself in the throes of severe infection, uncovered similarities between the immune responses of chronically autoimmune patients and those who were seriously ill with COVID-19, and documented continued disturbances in patients with long COVID. Importantly, we learned that these responses weren’t unique to COVID-19 and were broadly relevant to human health.
In 2022, I started my own lab founded on those concepts. We have been optimistic that the work we are doing will ultimately serve the American people in our shared desire to live longer, healthier lives.
But over the past months, that optimism has dissipated. Ham-handed targeting of “DEI” awards leaves us unable to understand how diverse human populations might respond differently to infection or develop different kinds of chronic diseases. Mistrust of the same vaccine programs that have halted the spread of measles globally has left us unable to test next-generation vaccines that might provide broad protection against emerging viral strains. And then, on March 24, it was announced that the five-year commitment that the first Trump administration made to our work would no longer be honored. Our COVID-related funding through SeroNet would be halted, effective immediately.
Our fledgling program, extremely promising just a few months ago, is now on life support. My lab has invested its time and limited resources, which are now running thin, in promising new areas of clinically relevant immunology that suddenly look like financial dead ends. The decision to halt entire fields of study in what was previously highly fertile scientific space is as damaging as it is unprecedented, and our lab is left with a business model that is now fundamentally broken.
Matthew Woodruff is an assistant professor of immunology at the Emory University Lowance Center for Human Immunology. His lab studies antibody responses in the context of infection, vaccination and autoimmune disease.
Training Tomorrow’s Biomedical Workforce
Unaihuiziphotography/iStock/Getty Images Plus
By Samantha Meenach and Ryan Poling-Skutvik
On March 21, the NIH terminated our training grant award, which supported the Enhancing Science, Technology, Engineering, and Math Education Diversity (ESTEEMED) program at the University of Rhode Island. The mission of URI ESTEEMED was to better prepare undergraduate students—freshmen and sophomores—to conduct biomedical research, enabling them to succeed in advanced research as preparation for a Ph.D. in STEM. Our ultimate goals were to provide students who were from groups underrepresented in STEM or from disadvantaged economic backgrounds with academic enrichment, research and soft skills development, and a sense of community. NIH claims that our award “no longer effectuates agency priorities” and that it involves “amorphous equity objectives, [that] are antithetical to the scientific inquiry.”
While the language in the termination email itself was derisive and political, the fallout from the loss of this award will be felt for years to come. The state of Rhode Island immediately lost $1.2 million in direct economic activity, and an important workforce development initiative will end, significantly reducing state and regional competitiveness in a growing technological field. Like many other states, Rhode Island has a pressing need for professionals trained in biotechnology, and recruiting people to Rhode Island has often proven to be challenging. This challenge is exemplified by the recent establishment of the Rhode Island Life Sciences Hub with a specific mandate to grow the biotechnology sector in the state.
By contrast, there is a large untapped pool of talent within Rhode Island: students whose access to education and training is limited, in large part, by the financial pressures their families face. Our URI ESTEEMED program recruited talented students who likely would not have had the resources necessary to enter these careers. While NIH would like to argue that ESTEEMED was used to “support unlawful discrimination on the basis of race and other protected characteristics,” ESTEEMED trainees were selected through a rigorous and competitive application process, making these awards merit-based. Without the financial support of this program, many of our trainees would not have been able to attend URI or would not have had the opportunity to focus on research.
URI ESTEEMED in its current form will cease to exist at the end of this semester. We are still figuring out to what capacity we can continue to recruit and train students, but without NIH funds, training programs such as ESTEEMED will not be able to alleviate the many pressures these students face. The political decision to terminate this grant inflicts direct financial pain on some of the most promising students, and these effects will reverberate for years to come.
Samantha Meenach is a professor in the Department of Chemical, Biomolecular, and Materials Engineering at the University of Rhode Island.
Ryan Poling-Skutvik is an assistant professor in the Department of Chemical, Biomolecular, and Materials Engineering and the Department of Physics at the University of Rhode Island.
Alzheimer’s and Dementia Research for Diverse Populations
By Jason D. Flatt
Research funding for diverse populations impacted by Alzheimer’s disease and related dementias (ADRD) is currently being terminated by the U.S. federal government, on the stated premise that the research is incompatible with agency priorities. For instance, funding for studies including older transgender individuals, as well as lesbian, gay, bisexual, queer, intersex and other LGBTQIA+ people, has been terminated. In addition, funding decisions have been rescinded, and grants have been pulled from scientific review. The National Institutes of Health has stated, “Research programs based on gender identity are often unscientific, have little identifiable return on investment, and do nothing to enhance the health of many Americans. Many such studies ignore, rather than seriously examine, biological realities. It is the policy of NIH not to prioritize these research programs.”
To date, around 700 NIH grants have been terminated, including many important studies on HIV/AIDS, cancer, COVID-19 and ADRD. Of these, about 25 have focused on ADRD. Personally, I have lost nearly $5 million in research funding from the NIH and the Department of Defense because my ADRD research includes transgender people. My research focuses on the needs of LGBTQIA+ and non-LGBTQIA+ older adults, particularly those affected by ADRD and Parkinson’s disease, as well as their caregivers and health-care providers. Some have suggested that we remove or rephrase “forbidden” language in future grants and/or exclude transgender people from our studies, but I will not do that. It is not pro-science and will not ensure that all people benefit from our research. The current and future termination of grants and contracts will have a significant impact on the health of older Americans, slow our innovation, limit our ability to provide care and impede progress in finding a cure.
I am working to raise awareness about these terminations and find ways to either reverse the decisions or secure alternative funding for this vital research. This includes speaking with the press, informing policymakers, generating visibility on social media alongside colleagues and peers, consulting with legal experts, and engaging with community members. I am also deeply concerned about the future of early-career scientists, who are essential in leading efforts to find cures for diseases affecting our communities, especially as the baby boomer generation ages. Many of the grants that have been terminated were early-career awards for newly minted doctoral researchers and faculty, diversity supplements for doctoral students, and competitive NIH predoctoral and postdoctoral fellowships.
In light of today’s sociopolitical climate, it is more important than ever for our civic, academic and research communities to unite in advocating for inclusion, standing up for diverse groups, including LGBTQIA+ communities, and ensuring that early-career scholars and the broader aging population have opportunities for potential cures, treatments and health care.
Jason D. Flatt is an associate professor at the University of Nevada, Las Vegas, School of Public Health, in the Department of Social and Behavioral Health.
I have spent the past year and a half as a postdoc researching the effects of Virginia’s Get a Skill, Get a Job, Get Ahead (G3) initiative, a tuition-free community college program implemented in 2021. Similar to most statewide free college programs, G3 is a last-dollar scholarship program for state residents attending one of Virginia’s 23 community colleges, though students who already receive the maximum Pell Grant and enroll full-time are eligible for an additional living stipend to support the costs of books, transit and other expenses frequently incurred while enrolled. Virginia implemented the program as a bipartisan pandemic-recovery strategy to reverse steep enrollment declines in community colleges and boost credential completion in five high-demand workforce areas: early childhood education, health care, information technology, manufacturing and skilled trades, and public safety.
Like so many other critical research projects in education, our Institute of Education Sciences funding was terminated as part of the Trump administration’s ongoing efforts to gut the Department of Education and publicly funded research at large. The abrupt termination of the grant, which supports researchers at both the University of Pennsylvania and the Community College Research Center at Columbia University’s Teachers College, is a depressing way to finish out my postdoc. The project is part of a larger IES grant that established the Accelerating Recovery in Community Colleges network, a group of research teams focused on strategies to improve community college enrollment and student success. The loss of funding means canceled conference presentations and convenings; it means planned collaborations with other research teams in the network will not happen. We simply cannot accomplish all the things we set out to do without the resources provided by the grant.
The grant termination is demoralizing on multiple levels. It funded my postdoc, which has been an invaluable experience in developing my skills as an education policy researcher. While my position was nearing its end regardless, the ongoing forced austerity on public-facing research portends a future where these types of opportunities are not available to later generations of scholars. And on a less personal note, canceling education research, especially toward the end of its life cycle, is extremely wasteful and inefficient. It hinders the completion of projects that public money has already been invested in and limits dissemination efforts that help to drive the overwhelmingly positive return on investment from these types of research projects.
This is a real shame in the case of our work on G3. Our findings and planned future research on the policy hold critical implications for policymakers and institutions in Virginia and across the US. States like Arkansas, Indiana and Kentucky have similarly implemented workforce-targeted free college initiatives. And given the heightened attention from policymakers on career and technical education in recent years, it is reasonable to think more states will follow suit. Our work on G3 is in service of improving community college student outcomes so that more students have the resources and opportunities to pursue meaningful careers and life trajectories. Without any federal funding, it will only be more difficult to uncover the best ways to go about achieving these ends.
Daniel Sparks is a postdoctoral researcher in economics and education at the University of Pennsylvania’s Graduate School of Education.
Training Pediatric Physician-Scientists
By Sallie Permar
The NIH made the abrupt decision last month to terminate the Pediatric Scientist Development Program (PSDP), a long-standing initiative that has trained generations of physician-scientists dedicated to advancing child health. This decision was made without an opportunity for resubmission or revision, and it appears to be linked to diversity, equity and inclusion requirements in our renewal application, components we were previously required to include and encouraged to expand by our reviewers, and that were later weaponized as justification for defunding.
For more than 40 years, the PSDP has served as a critical pipeline for training pediatric physician-scientists. Through rigorous mentorship, research training and career development, the PSDP has trained more than 270 pediatric physician-scientists, helping launch the careers of child health researchers who have made groundbreaking discoveries in areas such as childhood cancer, genetic disorders, autoimmunity and infectious diseases. At a time when pediatric research faces increasing challenges, this decision further weakens an already fragile infrastructure. It is not merely an administrative setback; it has immediate and far-reaching consequences that will be felt across academic institutions and the future of the health of children and the adults they become. Pediatric research is the highest yield of all medical research, providing lifetimes of health.
Without federal funding, our health as Americans faces several dire immediate and long-term impacts:
Loss of training opportunities and career uncertainty for pediatric researchers: The PSDP was on track to expand through deepening of our public-private institutional partnership funding model, due to increasing interest across states and pediatric specialties. We received a record high number of talented applicants this year. Now we are forced to determine how many, if any, new trainees can be supported. Additionally, the program serves as the critical bridge between physician-scientists’ clinical training and their ability to secure independent research grants. With NIH funding cut, current trainees will face financial instability, and prospective trainees might be forced to abandon their research, and their career aspirations, altogether.
Weakening of the pediatric research pipeline: The PSDP has been a key factor in addressing the national shortage of pediatric physician-scientists. Without it, fewer pediatricians will enter research careers, exacerbating an already urgent pediatric workforce crisis at a time when children are presenting with more complex health needs.
Children’s health in jeopardy: Cutting PSDP funding halts critical research on chronic childhood diseases like genetic conditions, asthma and obesity, leaving millions of children without hope for better treatments or cures, directly reducing their chance for health and quality of life.
The PSDP’s termination is not just a loss for academic medicine, it is a direct threat to the future of pediatric research and children’s health. Pediatricians pursuing research careers already face significant challenges, including limited funding opportunities and lower salaries compared to other medical specialties. By eliminating the PSDP, the NIH has removed one of the most effective mechanisms for supporting these researchers at a critical stage in their careers.
We call on academic leaders, policymakers and child health advocates to take immediate action. The future of children’s health research depends on our ability to reverse this decision and ensure that pediatric physician-scientists continue to receive the training and support they need to advance medical discoveries for the next generation.
Sallie Permar is the Nancy C. Paduano Professor and Chair at Weill Cornell Medicine and pediatrician in chief at New York–Presbyterian/Weill Cornell Medical Center.
Global Development and Women’s Empowerment
By Denise L. Baer
On Monday, Jan. 27, I received an email from local project staff in Guatemala canceling that day’s key informant interview due to the “review of cooperation projects by the United States government” and the request to “suspend activities” until further notice. This was the first notice that the evaluation of the Legal Reform Fund (LRF) project that I was conducting had been paused—and, in effect, permanently canceled. After checking in with the project implementer, the American Bar Association’s Rule of Law Initiative (ABA-ROLI), I received formal notification of the pause later that same day.
LRF provided contextualized expert legal technical assistance and training to partnering government agencies, parliamentarians, judges, court staff and women entrepreneurs to improve women’s access to land, property rights and credit in Guatemala, Indonesia, Mexico and Timor-Leste. I had been working on the evaluation for about two months, with the intent to complete all initial staff interviews before the end of January and then move on to field data collection. The evaluation had been approved last December by the Department of State, with approval of the inception report coming from the department’s Office of Global Women’s Issues just a week earlier. While I’d been tracking the flurry of executive orders, I doubted that this project would violate the new “two-gender” policy—after all, it was funded through the Women’s Global Development and Prosperity (W-GDP) Initiative created by President Trump himself during his first administration in 2019 and championed by his daughter Ivanka with great fanfare. The initiative aimed to help 50 million women in developing countries realize their economic potential by 2025; the LRF project was only one of many funded by W-GDP initially and later continued by the Biden administration.
The LRF project ended December 2024. Was it effective and efficient? Were the planned outcomes achieved? We will never know. Since I was paid by ABA-ROLI for the work conducted to date before the pause, the primary cost of this discontinuance is not to me personally, but to the American people, who funded this project. The call for this evaluation and the approval of my proposal was born of the government’s desire for efficiency and to ensure funded initiatives were going according to plan. Indeed, the Government Accountability Office had identified a less-than-robust implementation framework in many early W-GDP projects, and this evaluation was intended to provide critical evidence of whether processes had improved.
Now we will never know how strong the evidence base is for supporting women entrepreneurs through this initiative. It is profoundly stunning that not only would the Trump administration stop work midstream for so many projects, but they would also stop evaluations of project work already completed—even for programs they themselves created and supported. How does funding a project and then shutting down the work of determining how effective that project was fight waste, fraud and abuse?
Denise L. Baer is a scholar-practitioner fellow at the Graduate School of Political Management at George Washington University.
Dive Brief:
Researchers, unions and others sued the National Institutes of Health on Wednesday over the agency’s purge of diversity, equity and inclusion-related research activity that has resulted in lost grant funding and career opportunities.
Plaintiffs, including dozens of academic scientists, alleged that the agency’s leaders, starting in February, “upended NIH’s enviable track record of rigor and excellence, launching a reckless and illegal purge to stamp out NIH-funded research that addresses topics and populations that they disfavor.”
They are asking a federal court to block NIH from enforcing its anti-DEI directives both in the short term and permanently and to restore grants to researchers that the agency has cut under the Trump administration.
Dive Insight:
The complaint counts at least 678 research projects that have been terminated by NIH, some of them potentially by the Elon Musk-led Department of Government Efficiency rather than NIH staff.
The recently cut grants amount to over $2.4 billion, the lawsuit noted. Of that, $1.3 billion was already spent on projects “stopped midstream that is now wasted,” and $1.1 billion has been revoked.
Plaintiffs argue that grant terminations “cut across diverse topics that NIH is statutorily required to research,” many of which involve life-threatening diseases. Specifically, they argue that NIH’s actions violate the Administrative Procedure Act and constitutional limits on executive branch authority, and are unconstitutionally vague.
In the lawsuit, filed in U.S. district court in Massachusetts, plaintiffs detailed how their lives, careers and potentially life-saving research have been thrown into turmoil by the NIH’s attack on DEI under President Donald Trump.
Among them is a postdoctoral fellow at the University of New Mexico’s medical school who studies alcohol’s impact on Alzheimer’s risk. The researcher, the first in her family to graduate college, sought a grant created to help promising researchers from underrepresented backgrounds transition to tenure-track faculty positions.
According to the lawsuit, the researcher “satisfies the eligibility criteria for the program and invested months into assembling her application,” but NIH refused to consider it “solely because the program is designed to help diversify the profession.”
Another plaintiff, a Ph.D. candidate at a private California university, had received a high score on a research funding application for a dissertation proposal that would have studied suicide prevention among LGBTQ+ youth experiencing homelessness.
But the candidate learned that new restrictions on LGBTQ-related research meant the NIH would not likely fund the project. The turn of events will harm the researcher’s “ability to progress through their PhD program,” the complaint said.
Others include a University of Michigan social work professor whose research focuses on sexual violence in minority communities. The NIH has cut at least six grants supporting her research because the agency said it “no longer effectuates agency priorities,” according to the complaint.
Setting the various cuts in motion was internal NIH guidance, most of it revealed by the news media and cited in the complaint, that directed agency staff to terminate and deny DEI-related grant proposals. One memo instructed NIH officials to “completely excise all DEI activities.”
Staff guidance included research topics for grant terminations. One document forbade three research activity topics: China, DEI and transgender issues. A later document, the complaint alleges, effectively banned research grants around vaccine hesitancy and COVID-19.
NIH did not immediately respond to a request for comment Thursday.
The scale of impact of both the DEI cuts and other funding chaos at NIH is broad, cutting across much of the higher ed world. The United Auto Workers, one of the plaintiffs, counts tens of thousands of members who depend on NIH grants for their work and training, according to the lawsuit. It also noted 18,000 full-time graduate students who received their primary federal funding support through NIH in 2022.
Additionally, the Trump administration has variously moved to cancel or suspend research contracts and grants at Columbia University, the University of Pennsylvania and most recently Princeton University as part of punitive actions tied to investigations of campus antisemitism or, in Penn’s case, the decision to allow a trans woman to compete on the women’s swim team three years ago. The administration also briefly froze (and then unfroze) United States Department of Agriculture funds for the University of Maine system after the state’s governor engaged in a tense exchange with President Trump at the White House.
Below, 16 researchers across nine different research areas who have had their federal grants terminated since the start of the Trump administration share just a few of the thousands of stories behind these cuts.
—Elizabeth Redden, opinion editor
Preventing Intimate Partner Violence
By Rebecca Fielding-Miller, Nicholas Metheny, Abigail Hatcher and Sarah Peitzmeier
Each year, more than 3,000 American women are murdered by their partners. Pregnancy and the postpartum period are high-risk periods for intimate partner violence (IPV), which is linked to negative maternal outcomes such as miscarriage, hemorrhage and postpartum depression. Perinatal IPV is also linked to worse infant health outcomes, such as preterm birth and low birth weight, and to adverse childhood experiences. This makes prevention of perinatal IPV crucial not just for the survivor but for the entire family.
Perinatal IPV and its cascade of negative outcomes are preventable—but only if we study the epidemiology and prevention of IPV as rigorously as we study hypertension or any other perinatal complication. A grant rescinded last month by the NIH would have trained a cohort of 12 early-career clinicians and researchers to learn how to study IPV as part of their ongoing research on pregnancy, birth and the postpartum period. We proposed training investigators working in diverse communities across the spectrum of America, with a commitment to including communities disproportionately impacted by IPV and maternal mortality, including Black and LGBTQ+ communities. To solve a problem with constrained resources, it is efficient to focus efforts on where the problem is most severe. While the termination letter named this targeting of training resources an “amorphous equity objective,” we call it a data-driven approach to rigorous science.
Training grants like this one help shift an entire field by giving young investigators the skills and knowledge to add a focus on IPV to their research for the next several decades. In addition to training these 12 young researchers, the grant would have also supported turning the mentorship curriculum we developed into an open-access online training for clinicians and researchers to access in perpetuity, multiplying the impact of the work to train even more investigators in the field. As with the approximately 700 other terminated NIH grants, cutting this work before our aims are realized but after significant costs have been incurred to establish the mentorship team and design the curriculum is the definition of government inefficiency and waste.
With this grant rescinded, none of the promised training will occur. Pregnant people and their babies from every community across America will continue to suffer, without the benefit of advances in the science of how we prevent these violence exposures. Our termination notice claims that the proposed trainings are “antithetical to the scientific inquiry, do nothing to expand our knowledge of living systems, provide low returns on investment, and ultimately do not enhance health, lengthen life, or reduce illness.” We could not disagree more. Anyone who has cared for a child or for the person who gave birth to them knows that preventing maternal and infant death and abuse should be a nonpartisan issue. The current administration is intent on making even this issue into “us” versus “them.” When it comes to public health, there is no such thing. American families deserve better.
Rebecca Fielding-Miller is an associate professor of public health at the University of California, San Diego. Her research focuses on health disparities in infectious disease and gender-based violence.
Nicholas Metheny is an Atlanta-area scientist and registered nurse with clinical and research experience in the post-violence care of women and sexual and gender minority communities.
Abigail Hatcher is an associate professor at the University of North Carolina and University of the Witwatersrand, where she develops and tests health sector models for preventing violence in pregnancy.
Sarah Peitzmeier is an assistant professor at the University of Maryland School of Public Health who develops and tests interventions to prevent gender-based violence. She is also a practicing birth doula and victim advocate.
Is Work-Study Working?
By Judith Scott-Clayton
On March 7, at 9:49 a.m., I received an email with “GRANT AWARD TERMINATION” in all caps in the subject line. Attached to the email was a letter, addressed to me as project director and referring to our Department of Education grant by its award number. The letter was generic, virtually identical to three other termination letters received that day at the Community College Research Center at Columbia University’s Teachers College, where I am affiliated. It did not mention our project title nor provide any project-specific details to explain why our project, as the email states, “is now inconsistent with, and no longer effectuates, the Department’s priorities.” A few hours later, I received a formal notification that the grant end date was that day: March 7, 2025.
The project—a collaboration with Adela Soliz of Vanderbilt University and Tom Brock of CCRC—was titled “Does Federal Work-Study Work for Students? Evidence From a Randomized Controlled Trial.” The Federal Work-Study (FWS) program was created in 1964 as part of the Economic Opportunity Act and covers up to 75 percent of the wages of college students working part-time in mostly on-campus jobs, with colleges paying the rest. In a typical year, the program provides more than $1 billion in support to more than 450,000 college students with financial need at more than 3,000 institutions all across the country. Several states also have their own similar programs.
Our study would be the first to rigorously evaluate the causal impact of the program on students’ enrollment, employment, persistence and degree completion. We were also conducting interviews, focus groups and surveys to understand how students find FWS jobs, what kinds of work they do, what resources institutions devote to running the program and how much it all costs to operate, all with the goal of ensuring the program is delivering the maximum impact for every single student that participates and for every dollar spent.
At the time of its cancellation, we were about four and a half years into a six-year project. We were right in the middle of randomizing what would be the final cohort of our study sample and fielding the final round of a student survey. This final year is especially important, because the early cohorts were heavily impacted by the pandemic. For the past three weeks, we have been scrambling to pull together any other resources we could find to preserve our options and avoid losing this final cohort of participants. We have also been scrambling to figure out how to continue to pay critical staff and doctoral students involved in the project until we can figure out the next steps.
As for the broader impact of the termination: The Federal Work-Study program itself will keep on going, at least for now; we just won’t know whether it works or not. We hypothesize that it may provide valuable work-based learning opportunities that keep students engaged and give them advantages in the labor market after college, but it’s possible that it distracts students from their studies and hurts their academic performance. We may think that it helps students to afford college, but perhaps the complexity of finding a specific job and navigating all the necessary paperwork reduces its value for the students who need help the most. The next time the program is up for debate, policymakers will be flying blind: Without actual evidence, all we can do is speculate.
Since 1964, the FWS program has disbursed more than $95 billion in awards. In comparison, our grant was less than three-thousandths of 1 percent of that amount, and the amount remaining to finish our work and share our findings with the public was just a fraction of that. Our project was motivated by a desire to help policymakers ensure that every dollar invested in financial aid has the maximum possible impact for low-income students. So it is discouraging to learn, so close to the finish line, that this first-of-its-kind evaluation of a major federal program is “now inconsistent with, and no longer effectuates, the Department’s priorities.”
Judith Scott-Clayton is a professor of economics and education at Teachers College, Columbia University, in the Department of Education Policy and Social Analysis, where she directs the Economics and Education Program and teaches courses on the economics of education, labor economics and causal inference.
Democracy Research
By Rob Blair, Jessica Gottlieb, Laura Paler and Julie Anne Weaver
We lost funding for the Democratic Erosion Consortium (DEC) as part of the federal government’s recent cancellation of foreign assistance grants. Directed by scholars at Brown University, the University of Houston and American University, DEC works to make academic research on democratic backsliding accessible to policymakers and practitioners seeking evidence-based strategies to defend democracy around the globe.
Originally launched in 2017 on a shoestring budget, DEC began as an effort to improve pedagogy on a troubling trend observable both abroad and at home: the strategic dismantling of democratic norms and institutions by elected leaders with autocratic ambitions. In 2022, in line with the U.S. government’s dual interests in democratic resilience and evidence-based policymaking, we received a grant from the State Department to expand DEC’s work.
The State Department’s investment enabled us to grow our reach beyond the classroom and into the policy arena. We drew on an expanding network of scholars to synthesize evidence on urgent questions—such as how to reduce the spread of misinformation and measure democratic decline. We also built out a novel event data set on democratic erosion and trained partners around the world to use it in their own work.
The immediate consequences are clear: several full- and part-time staff lost funding for their jobs. But the long-term damage is hard to quantify. It’s difficult to argue for the value of evidence-based policymaking in foreign aid when the entire category of foreign assistance has effectively been gutted. More than that, the partnerships we built between academics, practitioners and policymakers were yielding real-time insights and responses—a rare example of successful research-policy collaboration. That infrastructure is now gone.
And at a moment when democratic backsliding is accelerating in many parts of the world, the U.S. government is stepping away from efforts to understand and counter it. Ending this grant not only weakens the ability to monitor democratic erosion globally, it also reduces public awareness and understanding of a phenomenon that is increasingly visible in the U.S. itself.
With the federal policy audience for our work largely gone, we are refocusing our efforts on our other two core constituencies: students and academics. We continue to support instructors engaged in teaching our democratic erosion course and to improve the Democratic Erosion Event Dataset. And in response to growing concern about democratic backsliding in the U.S., we’re developing a more robust domestic data-collection effort, paired with public engagement.
Given intense partisan disagreement around what even constitutes democratic erosion, we are seeking to increase the credibility of new evidence by capturing partisan-diverse perspectives and applying our established comparative framework to U.S. events. We are hoping to continue this work, despite the loss of our federal grant, because the political reality in the U.S. and around the world tells us we need to be worried about democratic erosion now more than ever.
Rob Blair is the Arkadij Eisler Goldman Sachs Associate Professor of Political Science and International and Public Affairs at Brown University.
Jessica Gottlieb is an associate professor at the University of Houston’s Hobby School of Public Affairs.
Laura Paler is an associate professor in the Department of Government in the School of Public Affairs at American University.
Julie Anne Weaver is the research director of the Democratic Erosion Consortium and a lecturer on government at Harvard University.
COVID-19 and Related Immunology Research
By Matthew Woodruff
On March 24, 2020, I stood in a Biosafety Level 2+ facility at Emory University with six colleagues being taught best practices for working with the largely unknown pathogen, SARS-CoV-2. Other unknowns included where we would get masks (N95s were unavailable), risks of infection to our young kids at home and who would pay for the experiments needed to gain insight into the deadly new virus sweeping across the nation.
That last question was answered relatively quickly. Rapid investment by the first Trump administration’s NIH launched SeroNet, a five-year effort across 25 institutions to “expand the nation’s capacity for SARS-CoV-2 serologic testing on a population-level and advance research on the immune response to SARS-CoV-2 infection and COVID-19 vaccination among diverse and vulnerable populations.” We did just that. Over the coming years, taxpayer dollars funded more than 600 peer-reviewed publications, reflecting significant advances in disease pathology, treatment strategies, disease impact in immunocompromised patients, vaccine testing and more.
Our team at Emory led projects dedicated to understanding the balance between productive and pathogenic immunity in hopes of alleviating disease. We discovered why your immune system sometimes turns on itself in the throes of severe infection, uncovered similarities between the immune responses of chronically autoimmune patients and those who were seriously ill with COVID-19, and documented continued disturbances in patients with long COVID. Importantly, we learned that these responses weren’t unique to COVID-19 and were broadly relevant to human health.
In 2022, I started my own lab founded on those concepts. We have been optimistic that the work we are doing will ultimately serve the American people in our shared desire to live longer, healthier lives.
But over the past months, that optimism has dissipated. Ham-handed targeting of “DEI” awards leaves us unable to understand how diverse human populations might respond differently to infection or develop different kinds of chronic diseases. Mistrust of the same vaccine programs that have halted the spread of measles globally has left us unable to test next-generation vaccines that might provide broad protection against emerging viral strains. And then, on March 24, it was announced that the five-year commitment that the first Trump administration made to our work would no longer be honored. Our COVID-related funding through SeroNet would be halted, effective immediately.
Our fledgling program, a few months ago extremely promising, is now on life support. My lab has invested heavily with our time and limited resources, which are now running thin, into promising new areas of clinically relevant immunology that suddenly look like financial dead ends. The decision to halt entire fields of study in what was previously highly fertile scientific space is as damaging as it is unprecedented, and our lab is left with a business model that is now fundamentally broken.
Matthew Woodruff is an assistant professor of immunology at the Emory University Lowance Center for Human Immunology. His lab studies antibody responses in the context of infection, vaccination and autoimmune disease.
Training Tomorrow’s Biomedical Workforce
By Samantha Meenach and Ryan Poling-Skutvik
On March 21, the NIH terminated our training grant award, which supported the Enhancing Science, Technology, Engineering, and Math Education Diversity (ESTEEMED) program at the University of Rhode Island. The mission of URI ESTEEMED was to increase the preparation of undergraduate students—freshmen and sophomores—to conduct biomedical research, enabling them to succeed in advanced research in preparation to pursue a Ph.D. in STEM. Our ultimate goals were to provide students who were from groups underrepresented in STEM or from disadvantaged economic backgrounds with academic enrichment, research and soft skills development, and a sense of community. NIH claims that our award “no longer effectuates agency priorities” and that it involves “amorphous equity objectives, [that] are antithetical to the scientific inquiry.”
While the language in the termination email itself was derisive and political, the fallout from the loss of this award will be felt for years to come. The state of Rhode Island immediately lost $1.2 million in direct economic activity, and an important workforce development initiative will end, significantly reducing state and regional competitiveness in a growing technological field. Like many other states, Rhode Island has a pressing need for professionals trained in biotechnology, and recruiting people to Rhode Island has often proven to be challenging. This challenge is exemplified by the recent establishment of the Rhode Island Life Sciences Hub with a specific mandate to grow the biotechnology sector in the state.
By contrast, there is a large untapped pool of talent within Rhode Island whose access to education and training is limited, in large part due to the financial pressures families face. Our URI ESTEEMED program recruited talented students who likely would not have had the resources necessary to enter these careers. While NIH would like to argue that ESTEEMED was used to “support unlawful discrimination on the basis of race and other protected characteristics,” ESTEEMED trainees were selected through a rigorous and competitive application process, making these awards merit-based. Without the financial support of this program, many of our trainees would not have been able to attend URI or would not have had the opportunity to focus on research.
URI ESTEEMED in its current form will cease to exist at the end of this semester. We are still figuring out to what capacity we can continue to recruit and train students, but without NIH funds, training programs such as ESTEEMED will not be able to alleviate the many pressures these students face. The political decision to terminate this grant inflicts direct financial pain on some of the most promising students, and these effects will reverberate for years to come.
Samantha Meenach is a professor in the Department of Chemical, Biomolecular, and Materials Engineering at the University of Rhode Island.
Ryan Poling-Skutvik is an assistant professor in the Department of Chemical, Biomolecular, and Materials Engineering and the Department of Physics at the University of Rhode Island.
Alzheimer’s and Dementia Research for Diverse Populations
By Jason D. Flatt
Research funding for diverse populations impacted by Alzheimer’s disease and related dementias (ADRD) is currently being terminated by the U.S. federal government. The stated justification for these terminations is that the research is incompatible with agency priorities. For instance, funding for studies including older transgender individuals, as well as lesbian, gay, bisexual, queer, intersex and other LGBTQIA+ identities, has been terminated. In addition, funding decisions have been rescinded, and grants have been pulled from scientific review. The National Institutes of Health has stated, “Research programs based on gender identity are often unscientific, have little identifiable return on investment, and do nothing to enhance the health of many Americans. Many such studies ignore, rather than seriously examine, biological realities. It is the policy of NIH not to prioritize these research programs.”
To date, around 700 NIH grants have been terminated, including many important studies on HIV/AIDS, cancer, COVID-19 and ADRD. Of these, about 25 have focused on ADRD. Personally, I have lost nearly $5 million in research funding from the NIH and the Department of Defense because my ADRD research includes transgender people. My research focuses on the needs of LGBTQIA+ and non-LGBTQIA+ older adults, particularly those affected by ADRD and Parkinson’s disease, as well as their caregivers and health-care providers. Some have suggested that we remove or rephrase “forbidden” language in future grants, or exclude transgender people from our studies, but I will not do that. Doing so is not pro-science and will not ensure that all people benefit from our research. The current and future termination of grants and contracts will have a significant impact on the health of older Americans, slow our innovation, limit our ability to provide care and impede progress in finding a cure.
I am working to raise awareness about these terminations and find ways to either reverse the decisions or secure alternative funding for this vital research. This includes speaking with the press, informing policymakers, generating visibility on social media alongside colleagues and peers, consulting with legal experts, and engaging with community members. I am also deeply concerned about the future of early-career scientists, who are essential in leading efforts to find cures for diseases affecting our communities, especially as the baby boomer generation ages. Many of the grants that have been terminated were early-career awards for newly minted doctoral researchers and faculty, diversity supplements for doctoral students, and competitive NIH predoctoral and postdoctoral fellowships.
In light of today’s sociopolitical climate, it is more important than ever for our civic, academic and research communities to unite in advocating for inclusion, standing up for diverse groups, including LGBTQIA+ communities, and ensuring that early-career scholars and the broader aging population have opportunities for potential cures, treatments and health care.
Jason D. Flatt is an associate professor at the University of Nevada, Las Vegas, School of Public Health, in the Department of Social and Behavioral Health.
I have spent the past year and a half as a postdoc researching the effects of Virginia’s Get a Skill, Get a Job, Get Ahead (G3) initiative, a tuition-free community college program implemented in 2021. Similar to most statewide free college programs, G3 is a last-dollar scholarship program for state residents attending one of Virginia’s 23 community colleges, though students who already receive the maximum Pell Grant and enroll full-time are eligible for an additional living stipend to support the costs of books, transit and other expenses frequently incurred while enrolled. Virginia implemented the program as a bipartisan pandemic-recovery strategy to reverse steep enrollment declines in community colleges and boost credential completion in five high-demand workforce areas: early childhood education, health care, information technology, manufacturing and skilled trades, and public safety.
Like so many other critical research projects in education, our Institute of Education Sciences funding was terminated as part of the Trump administration’s ongoing efforts to gut the Department of Education and publicly funded research at large. The abrupt termination of the grant, which supports researchers at both the University of Pennsylvania and the Community College Research Center at Columbia University’s Teachers College, is a depressing way to finish out my postdoc. The project is part of a larger IES grant that established the Accelerating Recovery in Community Colleges network, a group of research teams focused on strategies to improve community college enrollment and student success. The loss of funding means canceled conference presentations and convenings; it means planned collaborations with other research teams in the network will not happen. We simply cannot accomplish all the things we set out to do without the resources provided by the grant.
The grant termination is demoralizing on multiple levels. It funded my postdoc, which has been an invaluable experience in developing my skills as an education policy researcher. While my position was nearing its end regardless, the ongoing forced austerity on public-facing research portends a future where these types of opportunities are not available to later generations of scholars. And on a less personal note, canceling education research, especially toward the end of its life cycle, is extremely wasteful and inefficient. It hinders the completion of projects that public money has already been invested in and limits dissemination efforts that help to drive the overwhelmingly positive return on investment from these types of research projects.
This is a real shame in the case of our work on G3. Our findings and planned future research on the policy hold critical implications for policymakers and institutions in Virginia and across the US. States like Arkansas, Indiana and Kentucky have similarly implemented workforce-targeted free college initiatives. And given the heightened attention from policymakers on career and technical education in recent years, it is reasonable to think more states will follow suit. Our work on G3 is in service of improving community college student outcomes so that more students have the resources and opportunities to pursue meaningful careers and life trajectories. Without any federal funding, it will only be more difficult to uncover the best ways to go about achieving these ends.
Daniel Sparks is a postdoctoral researcher in economics and education at the University of Pennsylvania’s Graduate School of Education.
Training Pediatric Physician-Scientists
By Sallie Permar
The NIH made the abrupt decision last month to terminate the Pediatric Scientist Development Program (PSDP), a long-standing initiative that has trained generations of physician-scientists dedicated to advancing child health. This decision was made without an opportunity for resubmission or revision, and it appears to be linked to diversity, equity and inclusion requirements in our renewal application: components our reviewers previously required us to include and encouraged us to expand, and that were later weaponized as justification for defunding.
For more than 40 years, the PSDP has served as a critical pipeline for training pediatric physician-scientists. Through rigorous mentorship, research training and career development, the PSDP has trained more than 270 pediatric physician-scientists, helping launch the careers of child health researchers who have made groundbreaking discoveries in areas such as childhood cancer, genetic disorders, autoimmunity and infectious diseases. At a time when pediatric research faces increasing challenges, this decision further weakens an already fragile infrastructure. It is not merely an administrative setback; it has immediate and far-reaching consequences that will be felt across academic institutions and the future health of children and the adults they become. Pediatric research is the highest-yield of all medical research, providing lifetimes of health.
Without federal funding, our health as Americans faces several dire immediate and long-term impacts:
Loss of training opportunities and career uncertainty for pediatric researchers: The PSDP was on track to expand through a deepening of our public-private institutional partnership funding model, owing to increasing interest across states and pediatric specialties. We received a record-high number of talented applicants this year. Now we are forced to determine how many new trainees, if any, can be supported. Additionally, the program serves as the critical bridge between physician-scientists’ clinical training and their ability to secure independent research grants. With NIH funding cut, current trainees will face financial instability, and prospective trainees might be forced to abandon their research, and their career aspirations, altogether.
Weakening of the pediatric research pipeline: The PSDP has been a key factor in addressing the national shortage of pediatric physician-scientists. Without it, fewer pediatricians will enter research careers, exacerbating an already urgent pediatric workforce crisis at a time when children are presenting with more complex health needs.
Children’s health in jeopardy: Cutting PSDP funding halts critical research on chronic childhood conditions such as genetic disorders, asthma and obesity, leaving millions of children without hope for better treatments or cures and directly reducing their chances for health and quality of life.
The PSDP’s termination is not just a loss for academic medicine; it is a direct threat to the future of pediatric research and children’s health. Pediatricians pursuing research careers already face significant challenges, including limited funding opportunities and lower salaries compared to other medical specialties. By eliminating the PSDP, the NIH has removed one of the most effective mechanisms for supporting these researchers at a critical stage in their careers.
We call on academic leaders, policymakers and child health advocates to take immediate action. The future of children’s health research depends on our ability to reverse this decision and ensure that pediatric physician-scientists continue to receive the training and support they need to advance medical discoveries for the next generation.
Sallie Permar is the Nancy C. Paduano Professor and Chair at Weill Cornell Medicine and pediatrician in chief at New York–Presbyterian/Weill Cornell Medical Center.
Global Development and Women’s Empowerment
By Denise L. Baer
On Monday, Jan. 27, I received an email from local project staff in Guatemala canceling that day’s key informant interview due to the “review of cooperation projects by the United States government” and the request to “suspend activities” until further notice. This was the first notice that the evaluation of the Legal Reform Fund (LRF) project that I was conducting had been paused—and, in effect, permanently canceled. After checking in with the project implementer, the American Bar Association’s Rule of Law Initiative (ABA-ROLI), I received formal notification of the pause later that same day.
LRF provided contextualized expert legal technical assistance and training to partnering government agencies, parliamentarians, judges, court staff and women entrepreneurs to improve women’s access to land, property rights and credit in Guatemala, Indonesia, Mexico and Timor-Leste. I had been working on the evaluation for about two months, with the intent to complete all initial staff interviews before the end of January and then move on to field data collection. The evaluation had been approved last December by the Department of State, with approval of the inception report coming from the department’s Office of Global Women’s Issues just a week earlier. While I’d been tracking the flurry of executive orders, I doubted that this project would violate the new “two-gender” policy—after all, it was funded through the Women’s Global Development and Prosperity (W-GDP) Initiative created by President Trump himself during his first administration in 2019 and championed by his daughter Ivanka with great fanfare. The initiative aimed to help 50 million women in developing countries realize their economic potential by 2025; the LRF project was only one of many funded by W-GDP initially and later continued by the Biden administration.
The LRF project ended in December 2024. Was it effective and efficient? Were the planned outcomes achieved? We will never know. Since I was paid by ABA-ROLI for the work conducted before the pause, the primary cost of this discontinuance is not to me personally, but to the American people, who funded this project. The call for this evaluation and the approval of my proposal were born of the government’s desire for efficiency and for assurance that funded initiatives were going according to plan. Indeed, the Government Accountability Office had identified a less-than-robust implementation framework in many early W-GDP projects, and this evaluation was intended to provide critical evidence of whether processes had improved.
Now we will never know how strong the evidence base is for supporting women entrepreneurs through this initiative. It is profoundly stunning that the Trump administration would not only stop work midstream on so many projects, but would also halt evaluations of project work already completed, even for programs it itself created and supported. How does funding a project and then shutting down the work of determining how effective that project was fight waste, fraud and abuse?
Denise L. Baer is a scholar-practitioner fellow at the Graduate School of Political Management at George Washington University.