Tag: Researchers

  • Program stops early-career researchers quitting – Campus Review

    Universities need workers with strong analytical and strategic skills, but funding cuts and career-progression barriers have created retention problems, with early-career researchers leaving universities in droves.

  • U.K. Weighs Streamlining Visa Process for Researchers

    The U.K. government has been urged to remove barriers in the visa process for researchers in order to capitalize on new U.S. restrictions imposed by Donald Trump.

    The U.S. president last weekend announced a $100,000 fee for applicants to the H-1B visa program, making a vital visa route used by skilled foreign workers in the U.S. inaccessible to many.

    The U.K. is reportedly considering removing fees for its global talent visa in response. The Campaign for Science and Engineering (CaSE) warned that high visa costs are already a significant barrier, but said removing them is not the only change that needs to be made.

    In a new report, CaSE highlights the obstacles presented by the current system, including concerns raised by professionals who handle visa and immigration issues at U.K. research institutions.

    It warns that information about who is eligible for the visa route is often ambiguous and hard to navigate. According to the Wellcome Sanger Institute, which contributed to the report, the language around “exceptional talent” can be intimidating for talented applicants, although many institutions also receive a large number of low-quality applications.

    “These examples point to a wider issue of confusion and unclear messaging about who is eligible, resulting in missed opportunities and cost inefficiencies,” says the report.

    Visa policy is also increasingly complex and can put a significant strain on organizations, according to CaSE.

    The Sainsbury Laboratory (TSL), a research organization that specializes in molecular plant-microbe interactions, said visa support now demands a full-time employee in human resources as well as external support costing more than $21,000 per year in legal fees.

    “The U.K. visa system is becoming increasingly complex, unclear and time-consuming—especially for research institutes like TSL that depend on international talent.

    “Policy changes are poorly communicated, portals outdated and guidance inconsistent, requiring our HR to spend extensive time interpreting information.”

    TSL said that without a fair and functional visa system, the U.K. risks reaching a “breaking point in our ability to attract global talent and sustain world-leading research.”

    Alicia Greated, executive director of CaSE, said U.K. research faces “major challenges” under the current system. She wants the government to take action that improves conditions for skilled workers and those who employ them.

    Greated welcomed reports that the Labour administration was considering reducing visa fees for highly skilled researchers, adding, “If these changes happen, they will put the U.K. in a strong position to compete on the global skills market, especially given the changes in the opposite direction in the U.S.”

    However, she said that the removal of indefinite leave to remain, or permanent residency, from individuals already settled in the U.K.—as Reform UK is advocating—would be extremely damaging to U.K. R&D and the wider economy, as well as individuals and their families.

    “Policy proposals like this also have a negative impact on the attractiveness of the U.K. as a destination for the world’s brightest and best researchers because people may worry their right to be in the country could be taken away.”

  • A researcher’s view on using AI to become a better writer

    Writing can be hard, equal parts heavy lifting and drudgery. No wonder so many students are turning to the time-saving allure of ChatGPT, which can crank out entire papers in seconds. It rescues them from procrastination jams and dreaded all-nighters, magically freeing up more time for other pursuits, like, say … doomscrolling.

    Of course, no one learns to be a better writer when someone else (or some AI bot) is doing the work for them. The question is whether chatbots can morph into decent writing teachers or coaches that students actually want to consult to improve their writing, and not just use for shortcuts.

    Maybe.

    Jennifer Meyer, an assistant professor at the University of Vienna in Austria, has been studying how AI bots can be used to improve student writing for several years. In an interview, she explained why she is cautious about the ability of AI to make us better writers and is still testing how to use the new technology effectively.

    All in the timing 

    Meyer says that just because ChatGPT is available 24/7 doesn’t mean students should consult it at the start of the writing process. Instead, Meyer believes that students would generally learn more if they wrote a first draft on their own. 

    That’s when AI could be most helpful, she thinks. With some prompting, a chatbot could provide immediate writing feedback targeted to each student’s needs. One student might need to practice writing shorter sentences. Another might be struggling with story structure and outlining. AI could theoretically meet an entire classroom’s individual needs faster than a human teacher. 

    In Meyer’s experiments, she inserted AI only after the first draft was done as part of the revision process. In a study published in 2024, she randomly assigned 200 German high school students to receive AI feedback after writing a draft of an essay in English. Their revised essays were stronger than those of 250 students who were also told to revise, but didn’t get help from AI. 

    In surveys, those with AI feedback also said they felt more motivated to rewrite than those who didn’t get feedback. That motivation is critical. Often students aren’t in the mood to rewrite, and without revisions, students can’t become better writers.

    Meyer doesn’t consider her experiment proof that AI is a great writing teacher. She didn’t compare it with how student writing improved after human feedback. Her experiment compared only AI feedback with no feedback. 

    Most importantly, one dose of AI writing feedback wasn’t enough to elevate students’ writing skills. On a second, fresh essay topic, the students who had previously received AI feedback didn’t write any better than the students who hadn’t been helped by AI.

    It’s unclear how many rounds of AI feedback it would take to boost a student’s writing skills more permanently, not just help revise the essay at hand. 

    And Meyer doesn’t know whether a student would want to keep discussing writing with an AI bot over and over again. Maybe students were willing to engage with it in this experiment because it was a novelty, but could soon tire of it. That’s next on Meyer’s research agenda.

    A viral MIT study

    A much smaller MIT study published earlier this year echoes Meyer’s theory. “Your Brain on ChatGPT” went viral because it seemed to say that using ChatGPT to help write an essay made students’ brains less engaged. Researchers found that students who wrote an essay without any online tools had stronger brain connectivity and activity than students who used AI or consulted Google to search for source materials. (Using Google while writing wasn’t nearly as bad for the brain as AI.) 

    Although those results made headlines, there was more to the experiment. The students who initially wrote an essay on their own were later given ChatGPT to help improve their essays. That switch to ChatGPT boosted brain activity, in contrast to what the neuroscientists found during the initial writing process. 

    These studies add to the evidence that delaying AI a bit, after some initial thinking and drafting, could be a sweet spot in learning. That’s something researchers need to test more. 

    Still, Meyer remains concerned about giving AI tools to very weak writers and to young children who haven’t developed basic writing skills. “This could be a real problem,” said Meyer. “It could be detrimental to use these tools too early.”

    Cheating your way to learning?

    Meyer doesn’t think it’s always a bad idea for students to ask ChatGPT to do the writing for them. 

    Just as young artists learn to paint by copying masterpieces in museums, students might learn to write better by copying good writing. (The late great New Yorker editor John Bennet taught Jill to write this way. He called it “copy work” and he encouraged his journalism students to do it every week by copying longhand the words of legendary writers, not AI.)

    Meyer suggests that students ask ChatGPT to write a sample essay that meets their teacher’s assignment and grading criteria. The next step is key. If students pretend it’s their own piece and submit it, that’s cheating. They’ve also offloaded cognitive work to technology and haven’t learned anything.

    But the AI essay can be an effective teaching tool, in theory, if students study the arguments, organizational structure, sentence construction and vocabulary before writing a new draft in their own words. Ideally, the next assignment should be better if students have learned through that analysis and internalized the style and techniques of the model essay, Meyer said. 

    “My hypothesis would be as long as there’s cognitive effort with it, as long as there’s a lot of time on task and like critical thinking about the output, then it should be fine,” said Meyer.

    Reconsidering praise

    Everyone likes a compliment. But too much praise can drown learning just as too much water can keep flowers from blooming.  

    ChatGPT has a tendency to pour the praise on thick and often begins with banal flattery, like “Great job!” even when a student’s writing needs a lot of work. In Meyer’s test of whether AI feedback can improve students’ writing, she intentionally told ChatGPT not to start with praise and instead go straight to constructive criticism.

    Her parsimonious approach to praise was inspired by a 2023 writing study about what motivates students to revise. The study found that when teachers started off with general praise, students were left with the false impression that their work was already good enough so they didn’t put in the extra effort to rewrite.

    In Meyer’s experiment, the praise-free feedback was effective in getting students to revise and improve their essays. But she didn’t set up a direct competition between the two approaches — praise-free vs. praise-full — so we don’t know for sure which is more effective when students are interacting with AI.

    Being stingy with praise rubs real teachers the wrong way. After Meyer removed praise from the feedback, teachers told her they wanted to restore it. “They wondered about why the feedback was so negative,” Meyer said. “That’s not how they would do it.”

    Meyer and other researchers may one day solve the puzzle of how to turn AI chatbots into great writing coaches. But whether students will have the willpower or desire to forgo an instantly written essay is another matter. As long as ChatGPT continues to allow students to take the easy way out, it’s human nature to do so. 

    Shirley Liu is a graduate student in education at Northwestern University. Liu reported and wrote this story along with The Hechinger Report’s Jill Barshay.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about using AI to become a better writer was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

  • Nation’s Report Card at risk, researchers say

    This story was reported by and originally published by APM Reports in connection with its podcast Sold a Story: How Teaching Kids to Read Went So Wrong.

    When voters elected Donald Trump in November, most people who worked at the U.S. Department of Education weren’t scared for their jobs. They had been through a Trump presidency before, and they hadn’t seen big changes in their department then. They saw their work as essential, mandated by law, nonpartisan and, as a result, insulated from politics.

    Then, in early February, the Department of Government Efficiency showed up. Led at the time by billionaire CEO Elon Musk, and known by the cheeky acronym DOGE, it gutted the Department of Education’s Institute of Education Sciences, posting on X that the effort would ferret out “waste, fraud and abuse.”

    A post from the Department of Government Efficiency.

    When it was done, DOGE had cut approximately $900 million in research contracts and more than 90 percent of the institute’s workforce had been laid off. (The current value of the contracts was closer to $820 million, data compiled by APM Reports shows, and the actual savings to the government was substantially less, because in some cases large amounts of money had been spent already.)

    Among staff cast aside were those who worked on the National Assessment of Educational Progress — also known as the Nation’s Report Card — which is one of the few federal education initiatives the Trump administration says it sees as valuable and wants to preserve.

    The assessment is a series of tests administered nearly every year to a national sample of more than 10,000 students in grades 4, 8 and 12. The tests regularly measure what students across the country know in reading, math and other subjects. They allow the government to track how well America’s students are learning overall. Researchers can also combine the national data with the results of tests administered by states to draw comparisons between schools and districts in different states.

    The assessment is “something we absolutely need to keep,” Education Secretary Linda McMahon said at an education and technology summit in San Diego earlier this year. “If we don’t, states can be a little manipulative with their own results and their own testing. I think it’s a way that we keep everybody honest.”

    But researchers and former Department of Education employees say they worry that the test will become less and less reliable over time, because the deep cuts will cause its quality to slip — and some already see signs of trouble.

    “The main indication is that there just aren’t the staff,” said Sean Reardon, a Stanford University professor who uses the testing data to research gaps in learning between students of different income levels.

    All but one of the experts who make sure the questions in the assessment are fair and accurate — called psychometricians — have been laid off from the National Center for Education Statistics. These specialists play a key role in updating the test and making sure it accurately measures what students know.

    “These are extremely sophisticated test assessments that required a team of researchers to make them as good as they are,” said Mark Seidenberg, a researcher known for his significant contributions to the science of reading. Seidenberg added that “a half-baked” assessment would undermine public confidence in the results, which he described as “essentially another way of killing” the assessment.

    The Department of Education defended its management of the assessment in an email: “Every member of the team is working toward the same goal of maintaining NAEP’s gold-standard status,” it read in part.

    The National Assessment Governing Board, which sets policies for the national test, said in a statement that it had temporarily assigned “five staff members who have appropriate technical expertise (in psychometrics, assessment operations, and statistics) and federal contract management experience” to work at the National Center for Education Statistics. No one from DOGE responded to a request for comment.

    Harvard education professor Andrew Ho, a former member of the governing board, said the remaining staff are capable, but he’s concerned that there aren’t enough of them to prevent errors.

    “In order to put a good product up, you need a certain number of person-hours, and a certain amount of continuity and experience doing exactly this kind of job, and that’s what we lost,” Ho said.

    The Trump administration has already delayed the release of some testing data following the cutbacks. The Department of Education had previously planned to announce the results of the tests for 8th grade science, 12th grade math and 12th grade reading this summer; now that won’t happen until September. The board voted earlier this year to eliminate more than a dozen tests over the next seven years, including fourth grade science in 2028 and U.S. history for 12th graders in 2030. The governing board has also asked Congress to postpone the 2028 tests to 2029, citing a desire to avoid releasing test results in an election year. 

    “Today’s actions reflect what assessments the Governing Board believes are most valuable to stakeholders and can be best assessed by NAEP at this time, given the imperative for cost efficiencies,” board chair and former North Carolina Gov. Bev Perdue said earlier this year in a press release.

    The National Assessment Governing Board canceled more than a dozen tests when it revised the schedule for the National Assessment of Educational Progress in April. This annotated version of the previous schedule, adopted in 2023, shows which tests were canceled. Topics shown in all caps were scheduled for a potential overhaul; those annotated with a red star are no longer scheduled for such a revision.

    Recent estimates peg the cost of keeping the national assessment running at about $190 million per year, a fraction of the department’s 2025 budget of approximately $195 billion.

    Adam Gamoran, president of the William T. Grant Foundation, said multiple contracts with private firms — overseen by Department of Education staff with “substantial expertise” — are the backbone of the national test.

    “You need a staff,” said Gamoran, who was nominated last year to lead the Institute of Education Sciences. He was never confirmed by the Senate. “The fact that NCES now only has three employees indicates that they can’t possibly implement NAEP at a high level of quality, because they lack the in-house expertise to oversee that work. So that is deeply troubling.”

    The cutbacks were widespread — and far outside of what most former employees had expected under the new administration.

    “I don’t think any of us imagined this in our worst nightmares,” said a former Education Department employee, who spoke on condition of anonymity for fear of retaliation by the Trump administration. “We weren’t concerned about the utter destruction of this national resource of data.”

    “At what point does it break?” the former employee asked.

    Every state has its own test for reading, math and other subjects. But state tests vary in difficulty and content, which makes it tricky to compare results in Minnesota with those in Mississippi or Montana.

    “They’re totally different tests with different scales,” Reardon said. “So NAEP is the Rosetta stone that lets them all be connected.”
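
    The “Rosetta stone” role Reardon describes, putting scores from incompatible state tests on the common NAEP scale, can be illustrated with a minimal sketch of one simple linking technique (linear linking via matched means and standard deviations). This is not Reardon’s actual methodology, and every number and name below is invented for illustration:

```python
# Hypothetical sketch of linear linking: scores from two state tests
# with different scales are mapped onto a shared (NAEP-like) scale by
# matching each state's test mean and standard deviation to that
# state's NAEP mean and standard deviation. All figures are invented.

def link_to_naep(state_scores, state_mean, state_sd, naep_mean, naep_sd):
    """Map raw state-test scores onto the NAEP scale via z-scores."""
    return [naep_mean + naep_sd * (score - state_mean) / state_sd
            for score in state_scores]

# Two states whose tests report on very different scales:
state_a = link_to_naep([350, 360, 370], state_mean=360, state_sd=10,
                       naep_mean=240, naep_sd=30)   # -> [210.0, 240.0, 270.0]
state_b = link_to_naep([82, 85, 88], state_mean=85, state_sd=3,
                       naep_mean=230, naep_sd=28)

# After linking, both lists are on the same scale and can be compared.
print(state_a, state_b)
```

    Real linking work is far more involved (sampling weights, measurement error, proficiency-threshold matching), but the core idea of re-expressing each state’s scores relative to a common anchor is what makes cross-state comparisons possible.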

    Reardon and his team at Stanford used statistical techniques to combine the federal assessment results with state test scores and other data sets to create the Educational Opportunity Project. The project, first released in 2016 and updated periodically in the years that followed, shows which schools and districts are getting the best results — especially for kids from poor families. Since the project’s release, Reardon said, the data has been downloaded 50,000 times and is used by researchers, teachers, parents, school boards and state education leaders to inform their decisions.

    For instance, the U.S. military used the data to measure school quality when weighing base closures, and superintendents used it to find demographically similar but higher-performing districts to learn from, Reardon said.

    If the quality of the data slips, those comparisons will be more difficult to make.

    “My worry is we just have less-good information on which to base educational decisions at the district, state and school level,” Reardon said. “We would be in the position of trying to improve the education system with no information. Sort of like, ‘Well, let’s hope this works. We won’t know, but it sounds like a good idea.’”

    Seidenberg, the reading researcher, said the national assessment “provided extraordinarily important, reliable information about how we’re doing in terms of teaching kids to read and how literacy is faring in the culture at large.”

    Producing a test without keeping the quality up, Seidenberg said, “would be almost as bad as not collecting the data at all.”

  • Law Firm Threatens Brown Climate Researchers

    A law firm representing anti–wind energy groups is demanding that Brown University researchers retract findings linking those groups to the fossil fuel industry, The New York Times reported Monday. 

    The move comes weeks after Brown reached an agreement with the Trump administration. The government restored $510 million in frozen federal research grants after the university agreed to certain demands, including adopting the Trump administration’s definitions of male and female and turning over admissions data. 

    The Trump administration has halted or canceled thousands of other research grants across the country, including many focused on climate change.

    Marzulla Law LLC characterized the research published by Brown’s Climate and Development Lab as “false and injurious” in an Aug. 11 letter to Brown’s general counsel. It threatened to file complaints with Brown’s public and private funders, including the Energy Department, the National Science Foundation and the Mellon Foundation. 

    A university spokesperson did not comment specifically on the law firm’s demands but told the Times that it’s committed to maintaining academic freedom. 

    Brown researchers who authored a case study about Marzulla Law have written that the firm has “a history of advancing anti-environmental lawsuits and significant ties with the fossil fuel industry.” Researchers have also published findings accusing one of the firm’s clients—the nonprofit Green Oceans, which is trying to shut down the construction of a nearly complete $4 billion wind farm off the coast of Rhode Island—of being part of “a fossil-fuel-funded disinformation network.”

    On Friday, the Trump administration, which opposes the wind energy industry, halted the wind farm project without citing specific reasons. 

  • SCOTUS Ruling Has “Bleak Implications” for Researchers

    Photo illustration by Justin Morrison/Inside Higher Ed | SDI Productions/E+/Getty Images

    Hope is fading that federally funded researchers whose grants were terminated by the National Institutes of Health earlier this year will be able to resume their work as planned.

    On Thursday, the United States Supreme Court ruled 5 to 4 that any legal challenges to the grant terminations should be litigated in the Court of Federal Claims, not the federal district court system they’ve been moving through for months.

    It’s the latest twist in federally funded researchers’ legal fight to claw back nearly $800 million in medical research grants that the NIH terminated for running afoul of the Trump administration’s ideological priorities (accounting for the multiyear grants the NIH is refusing to fulfill puts that figure closer to $2 billion). Many of the grants funded programs that advanced diversity, equity and inclusion initiatives and research projects focused on topics such as LGBTQ+ health, vaccine hesitancy and racial disparities.

    Researchers sued the NIH in April and got a win in June when a federal district court judge in Massachusetts ordered the agency to reinstate the grants immediately. Although the NIH has since reinstated many of those grants, Scott Delaney, an epidemiologist at Harvard University and former lawyer who’s been tracking grant cancellations, told Inside Higher Ed that after Thursday’s ruling those reinstated grants will “almost certainly” be re-terminated. If that happens, “I don’t think they’ll get their money back.”

    That’s in part because the Supreme Court said researchers will have to re-file their lawsuits in federal claims court, which generally doesn’t have the power to issue injunctive relief that could keep grant money flowing during the litigation process. And it could take months or even years for the claims court to decide if researchers are owed damages.

    “Nobody has that kind of time. The nature of research is that you can’t just stop and restart it many months later,” said Delaney. “Folks have already had to do that once and many aren’t able to—they’ve had to lay off staff and lost contact with study participants. This additional delay probably renders the research unviable going forward.”

    Trump ‘Always Wins’

    Delaney is among numerous experts and advocates who say the decision is both a blow to the scientific research enterprise and the latest evidence that the Supreme Court is inclined to interpret the law to favor the Trump administration’s whims.

    “Make no mistake: This was a decision critical to the future of the nation, and the Supreme Court made the wrong choice,” the Association of American Medical Colleges said in a statement. “History will look upon these mass NIH research grant terminations with shame. The Court has turned a blind eye to this grievous attack on science and medicine, and we call upon Congress to take action to restore the rule of law at NIH.”

    Jeremy Berg, who served as director of the National Institute of General Medical Sciences from 2003 to 2011, said in an email to Inside Higher Ed that while “many (but not all) grants from the lawsuits that had been terminated have been reinstated at this point,” the big question the Supreme Court’s ruling raises now “is whether NIH will start to re-terminate them.”

    Although a 5-4 majority did agree on Thursday that the district court can review the NIH’s reasoning for the terminations, and kept in place a court order blocking the guidance that prompted the cancellations, Berg said the mixed ruling is “potentially very damaging” because redirecting the case to a different court means “the stay blocking the required reinstatements could go into effect.”

    He added that Justice Ketanji Brown Jackson’s dissent sums up his interpretation of the ruling’s implications. “This is Calvinball jurisprudence with a twist,” Jackson wrote. “Calvinball has only one rule: There are no fixed rules. We seem to have two: That one, and this Administration always wins.”

    That’s how Samuel Bagenstos, a professor of law and public policy at the University of Michigan and former general counsel to the Department of Health and Human Services, interpreted the decision, too.

    “The message the courts sent yesterday is very strong that they are going to let the Trump administration shut down the grants right now and remit grantees to the really uncertain process of going to the Court of Federal Claims and potentially getting damages in the future,” he said in an interview with Inside Higher Ed Friday.

    “But that’s really cold comfort for the grantees,” Bagenstos added. “If they can’t get the grants restarted right now, they probably can’t continue their research projects, and the prospect of maybe getting damages in the future doesn’t keep those research projects alive. It’s a bad sign for the entire research community.”

    The NIH is far from the only federal agency that has canceled federal research grants that don’t align with the Trump administration’s ideologies. The National Science Foundation, the Education Department and the National Endowment for the Humanities are all facing legal challenges in federal district courts after freezing or canceling grants.

    And the Supreme Court’s ruling on the NIH’s terminations has implications for those cases, as well.

    “The message seems to be pretty clear that if you have an ongoing grant that’s been terminated and you want to go to court to keep the money flowing, you’re out of luck,” Bagenstos said. “It’s got very bleak implications for all researchers who are depending on continuing the flow of federal grants.”

  • Racial bias affects early math education. Researchers are trying to stop that

    Racial bias affects early math education. Researchers are trying to stop that

    The early years are a critical time to teach the foundations of math. That’s when children learn to count, start identifying shapes and gain an early understanding of concepts like size and measurement. These years can also be a time when children are confronted with preconceived notions of their abilities in math, often based on their race, which can negatively affect their math success and contribute to long-standing racial gaps in scores. 

    These are some of the motivating factors behind the Racial Justice in Early Math project, a collaboration between the Erikson Institute, a private graduate school focused on child development, and the University of Illinois Chicago. The project aims to educate teachers and provide resources including books, teacher tips and classroom activities that help educators combat racial bias in math instruction.  

    I sat down with Danny Bernard Martin, professor of education and mathematics at the University of Illinois Chicago, project director Priscila Pereira and Jennifer McCray, a research professor at the Erikson Institute, to learn more about their work. This conversation has been edited for length and clarity.

    What are some of the key examples of racial injustice that you see in early math education?

    Martin: If I say to you, ‘Asians are good at math,’ that’s something that you’ve heard, we know that’s out there. When does that kind of belief start? Well, there’s something called ‘racial-mathematical socialization’ that we take seriously in this project, that we know happens in the home before children come to school. Parents and caregivers are generating messages around math that they transmit to children, and then those messages may get reinforced in schools.

    Even at the early math level, there are research projects beginning to construct Black children in particular ways, comparing Black children to white children as the norm. That is a racial justice issue, because that narrative about white children, Black children, Asian American children, Latinx children, then filters out. It becomes part of the accepted truth, and then it impacts what teachers do and what principals and school leaders believe about children.  

    What does this look like in schools?

    McCray: Perhaps the math curriculum doesn’t represent them or their experience. We all know that often schools for children of color are under-resourced. What often happens in under-resourced schools is that the curriculum and the teaching tends to focus on the basics. There might be an overemphasis on drilling or doing timed tests. We also have those situations where people are doing ability grouping in math. And we know what the research says about that, it’s basically ‘good education for you, and poor education for you.’ It’s almost impossible to do any of that without doing harm. 

    One line of research has been to watch teachers interact with children and videotape or study them. And in diverse classrooms with white teachers … often it is observed that children who are Black or Latina aren’t called on as often, or aren’t listened to as much, or don’t have the same kind of opportunity to be a leader in the classroom.  

    What should teacher prep programs, administrators and families do to address racial justice issues in early math? 

    McCray: Maybe the white teacher is reflecting on themselves, on their own biases … trying to connect with families or communities in some way that’s meaningful. We want teachers to have that balance of knowing that sometimes you do want to teach a procedure, but you never want to be shutting down ideas for creative ways to solve a math problem, or culturally distinct ways to solve a math problem that might come from your students.

    It might be something like, you’re working on sorting in an early childhood classroom. And what if a child is thinking about a special craft that their parent does that’s like the [papel picado], or papers that get cut in very elaborate designs in Mexico. … If the teacher doesn’t have space to listen, it could be a shutdown moment, instead of a moment of connection, where the child is actually bringing something … that is associated with their own identity.

    Pereira: I do feel that sometimes the conversations about racial justice really put the weight on teachers and teachers alone. Teaching is part of a larger structure. Maybe your school will not allow you to do the work that is needed. I’m thinking about [a teacher] who was required to follow a scripted curriculum that did not promote a positive math identity for Black children. It needs to be a whole community effort.

    How is your initiative changing this?

    Pereira: There are resources in terms of opportunities that we offer to teachers to engage with our content and ideas: webinars, a fellowship and an immersive learning experience in the summer of 2026. These spaces are moments in which educators, researchers and people who are engaged in the education of young learners can come together … and disrupt mainstream notions of what racial justice is and how one achieves it in the classroom.

    Right now, research and initiatives zeroing in on race are under scrutiny, especially at the college level. Do you foresee any additional challenges to this work?

    Pereira: There was a National Science Foundation grant program focused on racial equity in STEM and we had been planning to apply for funds to do something there. … It’s gone. … The only place we’re welcome is where there’s a governor who is willing to take on Trump. We just have to keep doing the work, because we know what’s right. But it is challenging, for sure.

    Contact staff writer Jackie Mader at 212-678-3562 or [email protected]

    This story about racial justice in math was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for the Early Childhood newsletter.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

  • KU researchers publish guidelines to help responsibly implement AI in education

    This story originally appeared on KU News and is republished with permission.

    Researchers at the University of Kansas have produced a set of guidelines to help educators from preschool through higher education responsibly implement artificial intelligence in a way that empowers teachers, parents, students and communities alike.

    The Center for Innovation, Design & Digital Learning at KU has published “Framework for Responsible AI Integration in PreK-20 Education: Empowering All Learners and Educators with AI-Ready Solutions.” The document, developed under a cooperative agreement with the U.S. Department of Education, is intended to provide guidance on how schools can incorporate AI into their daily operations and curriculum.

    Earlier this year, President Donald Trump issued an executive order instructing schools to incorporate AI into their operations. The framework is intended to help all schools and educational facilities do so in a manner that fits their unique communities and missions.

    “We see this framework as a foundation,” said James Basham, director of CIDDL and professor of special education at KU. “As schools consider forming an AI task force, for example, they’ll likely have questions on how to do that, or how to conduct an audit and risk analysis. The framework can help guide them through that, and we’ll continue to build on this.”

    The framework features four primary recommendations.

    • Establish a stable, human-centered foundation.
    • Implement future-focused strategic planning for AI integration.
    • Ensure AI educational opportunities for every student.
    • Conduct ongoing evaluation, professional learning and community development.

    First, the framework urges schools to keep humans at the forefront of AI plans, prioritizing educator judgment, student relationships and family input on AI-enabled processes, and not relying on automation for decisions that affect people. Transparency is also key: schools should communicate how AI tools work and how decisions are made, and ensure compliance with student protection laws such as the Individuals with Disabilities Education Act and the Family Educational Rights and Privacy Act, the report authors write.

    The document also outlines recommendations for how educational facilities can implement the technology. Key among them is establishing an AI integration task force that includes educators, administrators, families, legal advisers and specialists in instructional technology and special education. The document also shares tips on how to conduct an audit and risk analysis before adoption, how tools can affect student placement and identification, and how to watch for possible algorithmic error patterns. Because the technologies are trained on human data, they run the risk of repeating the mistakes and biases humans have made, Basham said.

    That idea is also reflected in the framework’s third recommendation. The document encourages educators to commit to learner-centered AI implementation that considers all students, from those in gifted programs to students with cognitive disabilities. AI tools should be prohibited from making final decisions on IEP eligibility, disciplinary actions and student progress, and mechanisms should be in place that allow students, teachers and parents to give feedback on their AI educational experiences, the authors wrote.

    Finally, the framework urges ongoing evaluation, professional learning and community development. As the technology evolves, schools should regularly re-evaluate it for unintended consequences and gather feedback from those who use it. Training, both at implementation and in ongoing installments, will be necessary to address overuse or misuse, clarify who is responsible for monitoring AI use, and ensure both the school and community stay informed about the technology.

    The framework was written by Basham; Trey Vasquez, co-principal investigator at CIDDL, operating officer at KU’s Achievement & Assessment Institute and professor of special education at KU; and Angelica Fulchini Scruggs, research associate and operations director for CIDDL.

    Educators interested in learning more about the framework or use of AI in education are invited to connect with CIDDL. The center’s site includes data on emergent themes in AI guidance at the state level and information on how it supports educational technology in K-12 and higher education. As artificial intelligence finds new uses and educators are expected to implement the technology in schools, the center’s researchers said they plan to continue helping educators implement it in ways that benefit schools, students of all abilities and communities.

    “The priority at CIDDL is to share transparent resources for educators on topics that are trending and in a way that is easy to digest,” Fulchini Scruggs said. “We want people to join the community and help them know where to start. We also know this will evolve and change, and we want to help educators stay up to date with those changes to use AI responsibly in their schools.”

  • Storytelling for Scientists and the Researchers’ Writing Podcast with Dr. Anna Clemens

    Are you thinking about starting a podcast? I invited Dr. Anna Clemens to share her podcasting journey. We talk about how social media and online presence have changed for researchers in 2025, and how storytelling can help people connect with your research in meaningful ways.

    Dr. Anna Clemens is an academic writing coach who specializes in scientific research papers. She runs the Researchers’ Writing Academy, an online course where she helps researchers get published in high-ranking journals through a structured writing process.

    Before we get started, join Anna for a 3-day Online Writing Retreat, 16-18 July 2025, and make significant progress on your summer writing project in just a few days. Get your ticket now before registration ends on 10 July! 🚀

    Jennifer van Alstyne: Hi everyone, this is Jennifer Van Alstyne. Welcome to the Social Academic Podcast. I’m here with Dr. Anna Clemens of the Researchers’ Writing Academy. Anna, I’m so happy to have you here today. First, because you’re my friend and we’ve been trying to do this for multiple years now. I’m so happy! And second because I want to share the program that you’ve created for scientists to help them write better. It’s actually something I’ve recommended to clients of mine, something clients of mine have participated in. So I wanted to share you with everyone who listens to the podcast. Would you please introduce yourself?

    Dr. Anna Clemens: Yeah, of course. Thank you so much for having me. And I’m super excited. And it’s been such a joy having some of your clients in the program.

    I run a program called the Researchers’ Writing Academy, where we help researchers, well, kind of develop a really structured writing process so they can get published in the journals they want to get published in. We kind of look a bit more toward top-tier journals, high-impact journals. But honestly, what we teach kind of helps you wherever you want to go.

    I have a background in chemistry. So my PhD’s in chemistry and I transitioned into writing after that. So it’s a really fun way to be able to combine kind of my scientific knowledge with writing and helping folks to get published and make that all really time efficient.

    Jennifer: Gosh, that’s amazing. I think that I did not have a lot of writing support when I was in grad school. And I really felt like even though I’m an excellent writer, like I’m a creative writer, like that’s what I went to school for. 

    Anna: You write poetry. 

    Jennifer: I write poetry and I think I’m a good academic writer, but I feel like I had to teach myself all of that. And it was a lot of correction after something was already submitted in order to bring it closer to what was actually publishable. 

    Anna: Right.

    Jennifer: I lost so much time by not knowing things. So I love that you created a program to support people who maybe aren’t getting the training that they need to publish in those high impact journals.

    Anna: Yeah, because that’s so common. Like, honestly, who gets good academic writing training? That’s really almost nobody.

    I often see even people who do take some kind of course at their university, if it offers one. Those courses are often not really focused on the things that I’m teaching, which is a lot of storytelling and being efficient with your writing, like kind of the step by step. You often just learn academic English, how do I sound good? And I think honestly, this is less important than knowing how to really tell a story in your paper, having that story be consistent, and not losing time on all the edits and rewrites, etc., that are so frustrating to do.

    Jennifer: Hmm, you brought up storytelling. That’s really insightful.

    As a creative writer, story is so important to the words that we create and how people can connect with them. Why is storytelling important for researchers?

    Anna: Well, I think it’s because we’re all humans, right? So we just as humans, really need storytelling to be able to access information in the best way and to connect to that information and to kind of put it into the kind of frameworks that we have already in our minds.

    This is something a lot of researchers really overestimate: how much other people know. Your research is so incredibly specific, right? When you’re doing it, of course you know every detail about it. And you just forget how little other people know, even if they’re in the same field, because we always think, “Oh, no, everyone knows what I know.” It’s also called the expert’s curse, I think: when you are an expert in something, you don’t realize how little other people know, and you kind of undervalue what you know.

    So anyway, if you really want your papers to be read, if you want to get published, you need to be able to, to make it accessible to like the journal editor, right? The peer reviewers, but also the readers later, they need to be able to understand the data in a way that makes sense to them. And I think that’s where storytelling comes in. Also, it really helps with structuring the writing process. Like honestly, if you think about storytelling first, the really nice side effect is your writing process will be a lot easier because you don’t have to go back and edit quite so many times.

    Jennifer: Oh, that’s fascinating. So not only does it improve how the research is being communicated, it improves the process of writing it too.

    Anna: I think so. Yeah, because when you’re clear on the story, everything is clear in your head from the start. And you don’t need to kind of . . . I mean, when you write a paper for the first time, or even people who’ve written a few papers, they still sometimes start writing with the introduction. And it’s such a waste of time. Like they just start at the start, right? And then they end up like deleting all those paragraphs and all those words after when they actually have written so much that they then after a while understand the story that they want to tell. And instead, what I’m suggesting is like, define the story first. And I like guide people through how to do that.

    Because I think the problem is you don’t really know how to do it when you don’t have like a framework for it. You have kind of the framework there from the start. So you know what the story is and you don’t have to kind of figure out the story while you’re writing. Instead, you know what the story is and the way I’m teaching it, I’m like giving people prompts so that it’s really easy to define the story because also story is really elusive, I think. Or we use it in this elusive way often when we like we kind of use it as like a throwaway term. Oh, yeah, you you should tell a story in your paper. And you go like, “Yeah, I guess. But what does that mean?” I’m trying to like give a definition for that. So that is like really clear. Okay?

    Jennifer: I appreciate that. I think so many people aren’t sure what it means. And even if they think they know what it means, they don’t necessarily know how it applies to their scientific writing. So that’s really interesting.

    Jennifer: I want to talk about podcasts, but actually, since we’re already talking about program stuff right now, I’m curious about the format of your program because people who are listening to this may not be familiar with your work. And I want to make sure that they get to hear about all the cool things that they get if they join.

    Anna: Yeah, the Researchers’ Writing Academy is very comprehensive. 

    Jennifer: Yeah, in a good way. 

    Anna: It’s almost hard to tell people about it because there’s so much in there. What people get is this: there’s an online course, we call it the journal publication formula, and that’s the step-by-step system. It walks you through super short, digestible online lessons that you can watch, so you can just write your paper alongside the lessons.

    And then because we noticed that you really may want some help actually writing in your day to day work, right? Because we’re also incredibly busy. And then it’s just helpful to have some kind of accountability, some community, and that’s what we offer as well. So we do a lot of things around accountability and we have like, cowriting sessions, for example, where we meet, we have six now, six per week across time zones. 

    Jennifer: Wow, that’s amazing! So if you’re anywhere in the world, there’s a chance that one of those six times during the day will work for you. Oh my gosh, that’s so cool.

    Anna: Yeah. I mean, they should work. For Europe and the US, most of them will work, though it depends where in the US you are, etc. And even in Australia, there’s at least one per week that will work for you, depending on how long you want to stay up. We have one client who likes to do writing after his kids are in bed, so he loves nine to 10pm, you know. So yeah, there’s a lot. And we do writing retreats every now and again, and writing sprints. So we offer a lot of support around that. And we have a really lovely community that is so supportive. Actually, I just talked to one member today, and she just got promoted to full professor.

    Jennifer: Exciting

    Anna: And she was like, “I couldn’t have done it without this community.” This was so like, valuable, not only getting the feedback on her article, but also, just knowing that like, there’s the support. And that’s really, I mean, that’s so lovely for me to hear, because this is honestly what I dreamed of. This is what I wanted to build. And it’s really nice knowing that people do, you know, really, not only reach career goals, but have a supportive community because academia can be a little toxic.

    Jennifer: Yeah, yeah, there’s so many reports that have come out and said, mental health struggles, toxicity, it’s consistent. Yeah. 

    Anna: And honestly, writing plays a big part in that, because of the way we normally don’t talk about writing. You sometimes see more seasoned academics who are really good at writing and then act as if they have it all figured out, but don’t share their process. So you as a novice writer think, “Shit, I should have figured this out. Like, why do I not know how this works?”

    Jennifer: This is easy for them. 

    Anna: Yeah, exactly. The other day, someone said to me, “Yeah, I know this professor and he just writes his paper while I’m talking to him at a conference.” And I’m like, “Oh, okay, this is an interesting process.” 

    Jennifer: Wow. Like, it’s so clear in his brain that he can focus on that and a conversation at the same time. Fascinating. 

    Anna: Fascinating. And honestly, you don’t have to do that. But she kind of thought like, “This is who I have to be. This is how I have to do it.” That creates so much pressure. And yeah, writing just hits like, emotionally, it’s really hard, right? When we feel like we are procrastinating, when we have really low confidence in our writing and just feel really disappointed in ourselves because we’re like overly perfectionistic, can’t send stuff off, keep like, you know, refining sentences. It’s just really, really hard.

    This is really why a community is so beautiful, when we can all just open up about how hard it is and also give each other tips. Like, I just love when people share what’s working for them, down to little techniques. The other day, someone was sharing in the community about how they started keeping their Friday afternoons as like a margin in their calendar. So if they didn’t get to all the things they had planned, if there was any derailing event, they still had that time on Friday. A little hack like that, right?

    That just like makes you more productive, makes you just honestly feel better about your work. Because we’re really tough on ourselves often. Like we’re really harsh and just, you know, having like a community that has this kind of spirit of being kind to yourself and working with your brain and not against it. Yeah, that’s really, really . . . that’s a really lovely place. Really supportive.

    Jennifer: That sounds amazing. I’m curious about who should join your program because it sounds like it’s so supportive. It sounds like there’s community and accountability and training. So, I love all of that, but there’s probably some people who the program’s not right for. So, like, maybe who shouldn’t join and who should definitely join? 

    Anna: Yeah, that’s a good question. I mean, in terms of career stage, it’s pretty open, from PhD student up to professor. And we have all of those career stages in the program. The biggest group is assistant professors, just so you know who you can expect to be in the program. And the PhD students who are in there are often older. It’s really interesting. They’re often second-career students who maybe have chosen that path a little later in life. Just a little side note. It’s kind of interesting.

    Jennifer: I think that makes so much sense because if I’m going back for like a PhD later on, I’m like, “I’m going to get all the support that I can to make the most of this time.” And joining a program like yours would make so much sense to me.

    Anna: Yeah, they’re probably also busier most of the time because they’re parents or have other stuff going on in their lives already.

    Jennifer: Yeah, that’s what makes it easier to have time for like the life and the people that you care about because you already have these processes in place. 

    Anna: Yeah, yeah. So as to who shouldn’t join, or who this wouldn’t be a good fit for: we don’t actually serve researchers in the humanities. The program is really science-based, social sciences included. And you know, physical sciences, life science, earth science, all the sciences we are super happy to have inside the program, just because the journal publication formula is super focused on that type of research, and honestly quite focused on original research papers, even though we have members who write review papers using it, because honestly the process isn’t very different. But just the examples, everything is from original research papers. So just FYI.

    Otherwise, I would say like we’re really super supportive and we don’t have like a lot of this like hustle culture, you know. This is all about, we don’t believe in like, having to wake up at 4am to have your whole three hour morning routine, including writing done, because a lot of us like have kids or have other kinds of commitments. So there is a lot of like kind of understanding that, you know, all of this has to work for real life. And not just for, I don’t know, people who have, yeah, men I guess who have a lot of support in the background traditionally, right? This is how research has been done. And yeah, even though we do have really lovely men in the program as well. So it’s not just women, but I guess this is kind of the approach that, yeah, we have in the community, in the academy.

    Jennifer: I love that. So not hustle culture. More let’s learn these processes and have accountability together so that we can move towards this goal of publishing with kindness. 

    Anna: Yeah. It’s so funny, like this being kind. I mean, we often say like, “Be kind to yourself,” because sometimes we don’t achieve the goals we set, often we don’t achieve the goals we set ourselves, right? And what I always say is it’s a data point. Like, this was a really good data point this week, because just reflect on what happened. Oh, did your child get sick? Oh, there you go. So maybe you now need to have a process, what happens if my child gets sick? Because then, you can’t plan that, right? So you have to have, or it’s good to have in your kind of system, in your writing system, in your writing practice, that you account for that. Some kind of strategy, what you do when that happens. Or like, this took me a lot longer to complete, like, I thought I would get my introduction section done this week, but actually, I didn’t. Well, really good data point. Actually, maybe it takes you longer.

    Look at where you spent the time doing this section. This is really good to know for next time. Actually, maybe schedule one or two days more for this. So that’s kind of the approach, the vibe that is in there. It’s not harsh.

    Jennifer: Yeah, I like that vibe. That’s my kind of vibe. 

    Anna: Mine too. Yeah, mine too. And it really crystallized for me because I once was in a business coaching program where the vibe was really different. You probably remember me talking about this because I did tell you at the time, and it was so awful for me. And I really. . .  but until then, it was really a bummer because I spent a lot of money on it.

    Jennifer: And you’re like, “My community needs kindness and support for each other.”

    Anna: This was my big learning. Apparently, I needed to spend a lot of money to have it really, really clear that this is not for me. The bro-y culture is not for me. I need the kindness. Because otherwise, it doesn’t work. I don’t work like that, if someone tells me I have to, I don’t know, have all these non-negotiables every day.

    Jennifer: Yeah, like change who you are.

    Anna: Yeah, like you just have to do it. Like it’s just about the discipline. You know, I don’t think that works. I honestly don’t think it works in the long term. Like maybe you can force yourself for like a few months or years and then you’re burning out or something. Like, I just don’t see how this is a sensible approach.

    Jennifer: No. And I remember at the time you mentioned that you felt burned out. Like you were being affected by the culture that you were experiencing. So creating a warm culture for people inside your program, the Researchers’ Writing Academy is wonderful. Everyone gets to benefit from your research.

    Anna: Right? Yeah!

    Jennifer: So I want to chat a little bit about online presence because I mean, we met online, we mostly communicate online, but also like you have taken some actions this year in particular to have a stronger online presence through a new avenue, which is podcasting. I’m curious because when I started my podcast, it was like not very intentional. It was like, “Oh, I just better record this thing and like, it’s going to make it like a little more accessible than if it was just in writing.” And the podcast kind of evolved into a regular series after I had already decided to start it. Whereas you came in more with a plan, you had purpose, you had drive to do more episodes than I could imagine. And so what was it like to kind of get that spark of an idea that like, I want a podcast?

    Anna: Yeah, I’ve had this, I mean, I had this desire for a long time. Many, many years. I always wanted to have a podcast. 

    Jennifer: Really?

    Anna: Really, because I listen to podcasts a lot. Like I’m really into them. And years ago, someone told me you would have such a good voice for podcasts. I was like, really? I don’t know, because when you listen to your own voice, you’re like, “No, I don’t think so.” And I still don’t know whether this is really true, but I wanted to be more online. Like kind of, I wanted to have an online presence that wasn’t just social media.

    Because honestly, I have such a weird relationship to social media, myself. It does like cognitively do something to my brain that isn’t always good, you know. Like hanging out there too much or getting sucked in, especially back on Twitter, now on Bluesky it’s a little bit like that too. There’s sometimes a lot of negativity. And I feel like people are too harsh, coming back to the being too harsh. I just can’t take it. Like, it’s not for me, but also just the fact that there’s just a lot going on there.

    I wanted to be available to people somewhere else, and a podcast made sense. I did actually launch my podcast on YouTube simultaneously as well, so it’s a video podcast. That just made sense to me. That just felt really aligned with what I like to consume, what I think my ideal clients like to consume, and where I also felt like I can express myself in a really good way. I mean, I do love writing; I do actually have a blog too. But it’s almost like when you have a blog, unless you’re really, really good at SEO, which is a little hard in my niche, to be honest, nobody reads it, right? Unless you amplify it through social media.

    Jennifer: Actively sharing it. It’s its own marketing.

    Anna: Yeah, yeah. So it’s still like social media connected. And I kind of wanted to have another avenue. Anyway, yeah. Talking also, I also like talking. So podcast made sense.

    Jennifer: That’s amazing. When I started my podcast, it was kind of just like, you know, going on zoom and hitting record. What is your process like? Are there other people involved? What is the kind of behind the scenes for your podcast?

    Anna: Yes, I have solo episodes. And I also have episodes with former clients or current clients actually, like members of the Researchers’ Writing Academy or alumni. And I also had one with one of my team members, our kind of client experience manager, Yvonne, where we talked about community. And I also had you on, right, as a guest expert. I think you’re the only guest expert actually we’ve had so far.

    Jennifer: I feel so special. That’s amazing.

    Anna: So yeah. The process for interviews, I would think of questions ahead of time. And we, for example, then chatted about the questions. This is also what I did with Yvonne. Just have a quick chat. I think both times it was written, like through Slack, just like, “Hey, does this make sense? Where do we want to go with this? Okay, maybe this should be a different discussion. Let’s focus on that.” And similar, actually, with the clients I interviewed. I would just send them a list of questions and be like, “Hey, you don’t need to prepare anything, but if you want to, do,” and then basically hop on and have a conversation and it’d be quite natural. And like this one where, you know, you don’t necessarily have to follow a script, you just go where it takes you.

    For my solo episodes, it’s a little bit different where I do write an outline. And honestly, like, what surprised me was this took a lot of time. Even when I knew what I wanted to say, and maybe this is me being too perfect, too much of a perfectionist, because I would go back. So I’d write the outline, I would go back the next day or the day after I read it again and have more ideas. I’d be like, “No, no, this should be like this.” So, it took me a lot of time. But then also, I think the outlines got better and better and better. And then I was really, you know, proud of the episodes. I was like, “Yeah, I really expressed this, I think, in a good way.” Because what I did afterwards then is I took this transcript from that episode and turned those into a blog post. 

    Then with the blog post, I’m like, “Yeah, they’re really meaty. There’s so much in there.” Like, they’re so much longer than my other blog posts that were just blog posts without podcast episodes. So that was really interesting to me. Just like, you know, understanding I guess a little bit more about the process of writing or synthesizing ideas and concepts. And yeah, after the outline, I would record on my own, I would record the episodes with that outline like in front of me. So kind of a bullet point outline.

    Jennifer: It sounds like your brain really likes the outlining process. And when you come back the second time, you have ideas to flesh it out and tell the story even better. That’s really cool.

    Anna: Yeah, it was honestly really fun writing those outlines. Because recording sometimes, especially in the beginning, was a little more stressful than I expected. It was shockingly stressful because I’m on video a lot. I thought it would be rather easy to record cause of my experience. And I think it would have been pretty easy if I just had done audio, but because I was also doing video, it felt a lot harder because it’s really hard to read an outline and look in the camera at the same time. 

    Jennifer: Oh yeah. 

    Anna: Like really, really hard. And I also couldn’t spend even more time like rehearsing the outline to the point where I didn’t need to look at it anymore. Like I didn’t feel like that made sense. And I was really struggling with that. And I was just like, being a little unhappy about it. Because when I talk, like when I’m like, I’m on a lot of calls, you know, inside the Academy, for example, or like interviews like this. And I find, for me it’s quite natural already to look at the camera. Like, I look at the camera a lot. But when I have an outline, you know, it’s like you do look at it. It was so hard. And actually, you helped me a lot with that.

    Yeah, because I was sharing this, that I was really unhappy with my recordings because I wasn’t looking at the camera. And you said, “Well, look, so many people aren’t even recording video for that exact reason. You’re putting something out that is less perfect than you hoped, but it will still be so useful to the people watching it. Honestly, that doesn’t matter.” And then I was like, “Yeah, this is like perfectionism.” It was all right. I just wanted to have it perfect. And I had a different standard for myself. But I didn’t need to be there. Like I was just not there. And that was totally fine. It didn’t need to be quite as polished as I thought maybe it should be.

    Jennifer: Yeah, and I think that we don’t give ourselves enough grace for like our first things, right? Like the first episodes, like the first launch of something new. Like, we want it to be really great because it’s new and because it represents us. But sometimes like, we’re just not there in terms of our own practice or our own skills, like something may need to build or improve for us to get to where we dream about being. And that’s okay. I really didn’t think, I didn’t have those negative feelings when I started my podcast, but so many of my clients and so many of the people that I’ve met along the way have talked about the first maybe five or six episodes being just such a struggle.

    Looking at themselves on video, listening to themselves speak, doing the editing themselves. It brought up all of those feelings about like watching themselves and what it would be like for other people to watch them. But the truth is that like you are watching yourself and doing all of those things more than anyone else is. Like, if someone else is watching it, they may not even listen to or watch the entire thing. And if they are, maybe they’re doing something else, like cleaning up their room. You know, if it’s a podcast, it’s not something that people will always sit there and like stare at your face and look at everything you did that was wrong. That’s what we’re doing.

    Anna: Yeah, yeah. Yeah. You’re so right. 

    Jennifer: For me, this year I have Sir Nic who does all of this kind of sound editing for me and he’s here in the virtual studio with us making sound levels all good. And then my husband Matthew does the video editing. So I don’t have to look at myself anymore or listen to myself. And it is so nice! It’s, oh my goodness, it’s such a relief for me to have those things off my plate. Do you have support on your team for podcast things or is it just the people who are working on, you know, the different kind of accountability coaching and things that are in the program?

    Anna: Yeah, I did have support. So I outsource the editing, video and audio editing. 

    Jennifer: Love that. 

    Anna: I couldn’t have done it myself, honestly, like not so much. I mean, it takes a lot of time. I think people often underestimate just how much time this takes. And especially if you want the audio to be kind of good, you do want someone, an audio engineer I think. This was important to me to have like a decent microphone, decent audio. So I actually invested quite a lot in this space. I started recording in my former office. I’m not in there now anymore, but it had really high ceilings. So I put all these sound panels up, these like boards and I bought curtains that I now brought into this room as well to like reduce the echo. And that was just worth it to me. But yeah, I did have support. And then in-house, like on my team, my operations manager, she also helped me with the podcast. Like she would do a lot of like even reviewing episodes and suggesting maybe further edits. So I didn’t have to watch myself very much. 

    Jennifer: Oh, that’s great. 

    Anna: She would also take out little like clips from the episode that we then put on social media. Like as YouTube shorts, for example.

    Jennifer: Yeah. 

    Anna: Yeah, so it was a really, really smooth process with a lot of support.

    Jennifer: Yeah, getting support was something that I didn’t think my podcast deserved in the beginning, but now I feel like my listeners do. My listeners deserve that. If I can keep doing it for them, I’m going to. So I’m glad we got to chat about that because a lot of people are like, “Oh, I’m just going to go on Zoom and record.” And then maybe they’re surprised when the editing process is a lot longer. But also the first few episodes, if you’re starting something new like editing, like audio stuff, like even just being on video, it’s going to be hard. And it might not be as good as you want it to be at first, but it’s going to get better. It’s going to get better. Oh, before we… Oh, sorry. Go ahead. 

    Anna: No, no, no. I just said so true. 

    Social media for academics post-Elon

    Jennifer: Well, I wanted to chat about the social media landscape and how things have been changing since Elon took over Twitter. I know you are on Bluesky now. I would love to hear a little bit about your experience of that platform.

    Anna: Yeah, I’m on Bluesky now and I’m not on X or Twitter anymore. I mean, I do still have the account, but I don’t check it anymore. Some people are still finding me through there, though. That’s kind of interesting. I see it in my data, but I haven’t logged in in like months. Bluesky is very similar to Twitter, honestly, in the sense of the type of conversations that are happening there. But at least for me, there’s a lot less engagement than there was. And I’m actually wondering whether a lot of academics gave up on social media after Twitter went downhill, because there was this like really great academic community on Twitter through which I guess we met. 

    Jennifer: Yeah.

    Anna: Back in the day. And I don’t see that happening on Bluesky. Bluesky does have a few other features, like additional features though that I really like. Like the way you can customize your feed a lot better. You can create those lists. So if you’re new to Bluesky, you can just like, there’s probably a list for researchers in your field.

    “I struggle with writing a compelling story that is interesting outside of my field, yet doesn’t oversell my data.” ✍️

    How to use storytelling ethically: https://annaclemens.com/blog/story-telling-scientific-paper/

    #AcademicSky #AcademicWriting #ScienceSky

    — Anna Clemens, PhD (@annaclemens.com) July 6, 2025 at 4:09 AM

    Jennifer: Yeah, like the starter packs and the different lists you could put together. 

    Anna: Exactly, starter packs. That’s what it’s called. Yeah. So you can just like hit follow all and you already have a feed full of people you want to have in your feed. And getting started is kind of really cool on Bluesky. I do think, I don’t know, something is different about the algorithm over there, but I’m not an expert. I don’t really know, but it feels like not as much things are like going viral per se. 

    Jennifer: Yeah.

    Anna: Maybe a little more one to one.

    Jennifer: Yeah. Oh, that’s really interesting. When I first joined Bluesky, which was much later than everyone else, really just last month, I found that it was very quiet. I connected with the people that were like the most talkative on Twitter. I hadn’t run Sky Follower Bridge or any of the tools to help me get connected yet because I wanted to see what the platform was like naturally. Like if someone was just signing up for the first time without having been on Twitter. And I was able to find people pretty easily. Like the people that I most often talked to or connected with, guests on The Social Academic, those kinds of things. But I wasn’t finding conversations. Like the people who I knew from social media weren’t talking all that much. They weren’t posting original content the way that they had on other platforms.

    And when I did run Sky Follower Bridge and found all of the people from Threads, from X, etc. I realized that like so many people had accounts that they just hadn’t connected with people yet. Like they, you know, maybe started their account during the big X exodus and then they connected with 12 people because that’s who they found when they first got there. And when they didn’t find their community, it’s like maybe they stopped logging in. And I think that’s really normal for people. Like you’re going to look for the warmth in the conversations or just like the people talking and watching it, being able to see it without even participating in it. Like if you don’t see when you get there, it’s kind of like, “Well, why am I going to spend time in this space?” I had to do a lot more work than I expected in order to find the conversations. And I had to connect with a lot more people without knowing that they were going to follow me back. Like without that anticipation in order for me to feel connected. But once I did that, once I was following, like I follow like over a thousand people now, once I did that, it started to feel like old Twitter to me. Like the community and conversation. Yeah, there’s a lot of people who aren’t talking there, but I was just surprised how much effort it took to get to that feeling. More than other platforms for me.

    Anna: Do you enjoy it now? Like the way you liked Twitter?

    Jennifer: You know, I don’t think I really enjoy any one social media platform over another anymore. I feel like my relationship with creating content has changed a lot in that I found more ease and I found less pressure and I found like good processes that work for me. And because of that, I don’t spend a lot of time on social media. Like I’m not on there browsing for conversations the way that I think I did when I was on X. Like old Twitter, I liked spending time there and jumping into conversations. And now social media is more, I don’t intentionally put in my day as much anymore. That’s what it is. And I like that. I like how my relationship with social media has changed. But no, I haven’t gone back to how I engaged in old Twitter, I think. What about you?

    Anna: That makes sense. Yeah, it’s similar for me, actually. I have to say I go through phases with it. So I do put out like content on several platforms like Threads, Bluesky and LinkedIn and then like YouTube as shorts. And I do go in and kind of check, does anyone comment? Like is anyone starting a conversation? I do this several times a week. But I don’t get sucked in as much anymore, if ever. Yeah, and I’m like super intentional about the time I spend there, I guess.

    Jennifer: How are you intentional?

    Anna: Well, I kind of set myself a timer as well. 

    Jennifer: Oh, like a literal timer.

    Anna: So I don’t let myself like do more than, I don’t know, five minutes per platform. 

    Jennifer: Really?!

    Anna: If there is like, of course, if there is comments, like actual, interesting conversations to join, I will, you know, override, but I’m really trying not to, not to get sucked in because it’s so easy for me. I don’t know. My brain is really- 

    Jennifer: That is really smart. I’ve never set a timer for that short amount of time. I’ll be like 30 minutes, you know, 30 minutes a day. Like if I’m going to have a timer maybe that’s what I would set it for. But five minutes is so much more specific, direct. That would wake my brain up. I should try something like that if I get sucked in again.

    Anna: Yeah, I like it. I do like it. And because now I feel like the social media landscape for academics has changed in a way. There used to be, or for me there used to be, just Twitter. I was basically just on Twitter and I didn’t really do anything on any other platform, whereas now it’s a lot more spread out. And, I don’t know, there’s good and bad things about that. But now I feel like, “Okay, I need to spend time on LinkedIn. I need to spend time on Bluesky and on Threads.” So, you know, I just can’t spend like that much time anymore on just one platform. So it has to be kind of a bit more time efficient.

    Jennifer: Okay, so you’re on Bluesky, Mastodon, YouTube, LinkedIn- 

    Anna: I’m not on Mastodon. Threads.

    Jennifer: Not on Mastodon. Threads, LinkedIn and YouTube.

    Where can people find your blog and your podcast? I want people to be able to get connected with you after this.

    Jump to website and social media links in Anna’s bio below

    Anna: Thank you so much for that lovely conversation. And it was so fun finally being a guest on your show.

    Jennifer: I’m so happy. Anna, I am so happy to have shared the Researchers’ Writing Academy with people because I really believe in your program. I believe in the process. And I know that you’re someone who goes in and updates things and improves them. And so I’ve always recommended the Researchers’ Writing Academy to professors. And I really encourage you if you’re listening to this to check it out.

    Jennifer receives no money or gifts when you sign up for the Researchers’ Writing Academy or any of the other recommendations she shares on The Social Academic.

    My name is Jennifer Van Alstyne. Thank you for checking out this episode of The Social Academic Podcast. Subscribe to the podcast on Spotify or on our YouTube channel.

    Want to hear more of Anna’s story? Check out her episode of The Bold PhD from Dr. Gertrude Nonterah (a former guest here on The Social Academic).

    Subscribe to The Social Academic blog.

    Want emails from Jennifer about building your online presence? Subscribe to her email list.
    Looking for the podcast? Subscribe on Spotify.
    Prefer to watch videos? Subscribe on YouTube.

    Dr Anna Clemens is an academic writing coach who specializes in scientific research papers. She runs the Researchers’ Writing Academy, an online course where she helps researchers get published in high-ranking journals by following a structured writing process.

    Sign up for Anna’s free training on how to develop a structured writing process to get published in top-tier journals efficiently.

    Anna Clemens



    Source link

  • Researchers “Cautiously Optimistic” NIH Will Restore Grants

    Researchers “Cautiously Optimistic” NIH Will Restore Grants

    Months after individual researchers, advocacy groups and a coalition of Democratic state attorneys general filed two lawsuits against the National Institutes of Health for terminating hundreds of active research grants it deemed misaligned with the Trump administration’s ideologies, some scientists are hopeful that the agency will soon restore the grants and allow them to resume their research.

    Last week, a federal judge in Massachusetts ordered the NIH to restore the roughly 900 grants named in the lawsuits, including many focused on studying vaccine hesitancy, LGBTQ+ health and diversity, equity and inclusion in the medical field. U.S. District Judge William Young, who was appointed by President Ronald Reagan, ruled the terminations void and unlawful, stating during a hearing that in all his years on the bench he’d “never seen” discrimination by the government to this extent.

    Although Science reported Thursday morning that the NIH has internally communicated plans to restore those grants “as soon as practicable”—and also cease further grant terminations—researchers say they still don’t know when they can expect to get the money they were promised.

    “Since the ruling, we are really encouraged,” said Heidi Moseson, a plaintiff in one of the cases and a senior researcher at Ibis Reproductive Health. “But we haven’t heard anything from the NIH about our grants being reinstated, and we don’t have a window into what that process looks like.”

    Back in March, Moseson received a letter from the agency terminating her grant, which was aimed at improving the accuracy of data collected in sexual and reproductive health research for all people, including those who identify as transgender and gender diverse. The award “no longer effectuates agency priorities,” the letter said. “Research programs based on gender identity are often unscientific, have little identifiable return on investment, and do nothing to enhance the health of many Americans.”

    The NIH did not respond to Inside Higher Ed’s request for comment on its specific plans for restoring the terminated grants.

    Appeal Anxiety

    Moseson said each week that goes by with the grant on pause “is another week where people are not being appropriately screened into clinical care and research that would be relevant for their bodies, leading to missed preventative care or, conversely, unnecessary preventive care.”

    While her team is ready to resume their research as soon as the NIH restores the funding in accordance with the judge’s ruling, she’s bracing for further disruptions ahead, depending on what happens with the appeals process.

    On Monday, the NIH filed a notice of appeal with the U.S. Court of Appeals for the First Circuit. It also filed a motion to stay the judge’s order to restore the grants pending the appeal, but Young denied that motion on Tuesday, noting that a stay “would cause irreparable harm to the plaintiffs.”

    “This is a case in equity concerning health research already bought and paid for by the Congress of the United States through funds appropriated for expenditure and properly allocated during this fiscal year,” the judge wrote. “Even a day’s delay further destroys the unmistakable legislative purpose from its accomplishment.”

    The following day, Michelle Bulls, a senior NIH official who oversees extramural funding, told staffers in an email that the agency must restore funding for the hundreds of projects identified by the plaintiffs, Science reported. “Please proceed with taking action on this request as part of the first phase of our compliance with the court’s judgment,” Bulls wrote, noting that “additional information is forthcoming.”

    Noam Ross, executive director at rOpenSci, a nonprofit that supports reproducible open research, and co-founder of the website Grant Watch, which is tracking grant terminations, put out a call for information on LinkedIn Wednesday about any grants the NIH has restored. But he told Inside Higher Ed Thursday afternoon that he has yet to receive any verified reports of restored NIH grants.

    Shalini Goel Agarwal, counsel for Protect Democracy, a nonprofit focused on combating perceived authoritarian threats, and one of the lawyers representing the plaintiffs, said Thursday morning that she also had not yet heard of any researchers getting grant money the NIH previously terminated.

    Though it’s not clear what could come of the government’s effort to appeal Young’s ruling, “at this moment the judge’s order is in effect and the NIH should be returning money to the researchers whose grants were terminated,” she said. “NIH should right now be undoing the effects of its directives.”

    ‘Cautiously Optimistic’

    Katie Edwards, a social work professor at the University of Michigan and a plaintiff in one of the cases, said that as of Thursday afternoon, she had yet to receive any communication from the NIH about its plans to restore her numerous multiyear grants.

    Edwards, whose research focuses on Indigenous and LGBTQ+ youth, said that delaying the grants much longer will undermine the research she’s already started, to the detriment of public health research.

    “For some of our studies, it’s just a matter of weeks before they’ll be really hard if not impossible to restart. I’m feeling a lot of anxiety,” she said. “We’re in a waiting phase, but I’m trying to be cautiously optimistic.”

    Despite the uncertainty of what’s ahead, she did get some reassuring news from the NIH on Thursday. The agency notified her that it approved her bid for a new three-year, $710,000 grant to develop and evaluate a self-defense program for adult women survivors of sexual violence. Like many other applications for new grants, the application had been in limbo for months. “So something (good??) is going on there!” she said in an email.

    Other cases moving through the courts also look promising for federally funded researchers eager to get their grants restored.

    On Monday, U.S. District Court Judge Rita Lin ruled that the Environmental Protection Agency, the National Science Foundation and the National Endowment for the Humanities had also unlawfully terminated grants that had already been awarded to researchers in the University of California’s 10-campus system. The judge, a Biden appointee, ordered the government to restore them, adding that she is weighing extending the order to 13 other federal agencies, including the NIH.

    “Many of the cases that are making their way through the courts share claims that are being made about the illegality of the federal government’s actions,” said Olga Akselrod, counsel for the American Civil Liberties Union and a lawyer representing the plaintiffs in one of the suits against the NIH. “Any time we have a win in one of these cases it’s an important statement of the applicable law, and that’s relevant for all of the cases that are proceeding.”

    Source link