The American Association of University Professors chapter at Columbia University is urging officials there to reject the Trump administration’s demands, which include putting an academic department under receivership, abolishing the University Judicial Board and giving security employees arrest authority.
“Compliance would make Columbia complicit in its own destruction, stripping shared control of academic and student affairs from the faculty and administration and replacing the deliberative practices and structures of the university with peremptory fiats from outside the institution,” the AAUP chapter said in a statement Tuesday. “We see no evidence that compliance would assuage the hostility of the White House.”
The Trump administration announced March 7 it was canceling about $400 million in federal grants and contracts for Columbia due to what it claims is the university’s “continued inaction in the face of persistent harassment of Jewish students.” Then, in a letter last week, federal officials listed “next steps that we regard as a precondition for formal negotiations regarding Columbia University’s continued financial relationship with the United States government.” They set a March 20 deadline for complying with the demands, which also include a mask ban, a plan for changing admissions and more.
The Columbia AAUP’s statement said, “The government’s demands read like a ransom letter, dictating to the university what principles it must sacrifice and what ideological positions it must adopt to restore research funding.” As for the justification of fighting antisemitism, the AAUP chapter said the university took “many actions over the last year to accommodate its Jewish students, sometimes at the expense of the grievances of other campus groups.”
The AAUP chapter said this “assault on Columbia will serve as a model for attacks on other universities across the nation” and urged colleagues to speak out and “march in the streets.”
The White House didn’t respond to a request for comment Tuesday.
Moody’s Ratings on Tuesday downgraded its outlook for the higher education sector from stable to negative due to recent and potential federal policy changes.
The revised outlook comes as the Trump administration has gutted the Education Department via mass layoffs and sought to aggressively overhaul higher education with a flurry of executive orders that have destabilized certain funding streams.
“Actions and potential changes include cuts to research funding, enforcement actions against diversity programs, staff reductions at the US Department of Education, uncertainty over federal student aid, and possible expanded taxes on endowments,” Moody’s analysts wrote in the report released Tuesday. “These factors are causing institutions to pause capital investments, freeze hiring, and cut spending.”
In December, Moody’s projected a stable 2025 with anticipated revenue growth of 4 percent—the most optimistic outlook for the sector among a trio of predictions from key financial organizations. Now the ratings agency notes federal policy changes could prompt revenue shortfalls, particularly at research universities, due to a proposed cap on National Institutes of Health reimbursements for research-related costs. That cap, which is currently blocked by a court order, would mean about $100 million in cuts annually for research universities that spend at least $50 million on research and award 70 research doctorates a year, according to Moody’s.
In addition to the NIH rate cut, an increase to the endowment tax would hit wealthy, private universities and likely drive cuts to financial aid or other spending categories, the report found. The current endowment tax is 1.4 percent for institutions with at least 500 students and $500,000 in assets per student, but recent Republican proposals have floated raising that tax significantly. One proposal has called for a 10 percent tax and changing the per-student endowment threshold from $500,000 to $200,000. Another GOP proposal would set the tax at 21 percent.
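The thresholds and rates above can be sketched in code. This is a deliberately simplified model, assuming (as the report describes) that eligibility turns on student count and assets per student, and that the excise tax is a flat rate on net investment income; the actual statute has further conditions, and the institution figures below are hypothetical.

```python
# Simplified sketch of the endowment excise tax described above.
# The real tax has additional statutory conditions; numbers are illustrative.

def subject_to_endowment_tax(students: int, endowment: float,
                             per_student_threshold: float = 500_000,
                             min_students: int = 500) -> bool:
    """True if an institution meets the size thresholds for the excise tax."""
    return students >= min_students and endowment / students >= per_student_threshold

def excise_tax(net_investment_income: float, rate: float = 0.014) -> float:
    """Tax owed on net investment income at the given rate."""
    return net_investment_income * rate

# A hypothetical university: 10,000 students, $8B endowment,
# $400M in net investment income for the year.
if subject_to_endowment_tax(10_000, 8_000_000_000):
    current = excise_tax(400_000_000)          # at the current 1.4% rate
    proposed = excise_tax(400_000_000, 0.21)   # at the 21% rate one proposal floats
```

Lowering the per-student threshold to $200,000, as another proposal would, is just a different `per_student_threshold` argument, which is what pulls more institutions into the tax's reach.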
Potential disruptions to federal financial aid disbursement, however, would impact all colleges and universities. Moody’s noted that “only a select group of wealthy institutions have the financial flexibility to manage such a scenario without likely seeing steep enrollment decline.” Given steep cuts to the Education Department, Moody’s expressed concern that the Federal Student Aid office could be affected, particularly after last year’s overhaul of the Free Application for Federal Student Aid, which was beset by multiple technical challenges.
“The administration has said the reductions will not affect the department’s statutorily mandated functions such as administering Title IV financial aid and providing assistance to federal student loan borrowers, but the extent to which that will be the case is uncertain,” the report noted.
Federal enforcement actions against diversity, equity and inclusion initiatives—which the Trump administration has targeted—also pose a financial risk to the sector, according to Moody’s. The report cited the potential for “a wide array of funding cuts, including Title IV funding suspension, if [universities] do not comply” with Trump’s executive orders clamping down on DEI offerings.
Moody’s also flagged potential losses due to the possible reduction in visas for foreign students. Colleges and universities that would be hit the hardest, according to the report, are those that are “reliant on STEM master’s programs, or more niche offerings like art and design programs.”
The report concluded that the outlook could revert to stable “if many of the federal policies and proposals are reversed or halted by judicial intervention or do not come to pass. Stronger-than-expected investment market returns and operating revenue growth could also lead to a revision of the outlook to stable.”
A federal judge in Maryland this week ordered the U.S. Department of Education to reinstate numerous grants that support teacher-preparation programs.
The department canceled the $600 million in grants last month as part of a wider effort to slash federal funding and eliminate programs that promote diversity, equity and inclusion. In response, the American Association of Colleges for Teacher Education, the National Center for Teacher Residencies and the Maryland Association of Colleges for Teacher Education challenged the cuts, arguing in a lawsuit that the grant terminations were illegal.
On Monday, U.S. District Judge Julie Rubin ordered the department to restore funding for the Supporting Effective Educator Development program, the Teacher Quality Partnership program and the Teacher and School Leader Incentive program within five business days. That order comes after a federal judge last week directed the department to reinstate canceled grants in eight states.
“We are thrilled that the court has ruled in favor of preserving funding for TQP, SEED, and TSL grants, which have a transformative impact on our nation’s education system,” AACTE president and CEO Cheryl Holcomb-McCoy said in a news release.
The order also blocks the department from terminating any other TQP, SEED or TSL grant awards “in a manner this court has determined is likely unlawful as violative of the Administrative Procedure Act,” which instructs courts to “hold unlawful and set aside final agency actions” deemed “arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.”
The judge asked both the department and the plaintiffs to file a status report within seven business days showing compliance with the order.
March Madness officially begins Thursday with a match-up between the University of Louisville and Creighton University.
Michael Allio/Icon Sportswire/Getty Images
No shame if you forgot the National Collegiate Athletic Association’s Division I basketball championships were coming up—after all, this March has been filled with more than enough madness in higher ed, even without paying attention to basketball.
Nonetheless, the biggest event in college sports kicks off this week. If you’ve been a little too concerned with the news cycle to fill out your bracket, we’re here to help. Every year since 2006, Inside Higher Ed has determined which teams would win in the men’s and women’s tournaments if the results were based on academic, rather than athletic, performance.
To determine the winners, we used the NCAA’s key academic performance metric, known as the academic progress rate, for the 2022–23 academic year, the most recent data available. The academic progress rate measures student athlete retention and academic eligibility, though some outside experts have said the metric paints an imperfect picture of a program’s academic performance.
(Full disclosure: we did use this metric to determine the winners of the First Four matchups, even though two of the four games will be decided before publication Wednesday morning.)
If two colleges had the same APR, we used the 2023–24 graduation success rate, the proportion of athletes who graduated within six years of entering an institution, as a tiebreaker. If teams tied again, we turned to their six-year federal graduation rates, a more inclusive metric.
Luckily, none of the teams tied in all three categories. Still, there were a handful of nail-biting victories. For instance, the Clemson University Tigers tied the Liberty University Flames on both the academic progress and graduation success rates. But when looking at the overall graduation rate, Clemson won by one point. After besting the Flames in the Final Four, the Tigers beat out the University of Louisville to win the whole thing.
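The tiebreak order described above amounts to a lexicographic comparison: APR first, then graduation success rate, then federal graduation rate. A minimal sketch, using hypothetical numbers rather than the actual NCAA figures:

```python
# Tiebreak order: APR, then graduation success rate (GSR),
# then six-year federal graduation rate. Higher wins at each step.

def academic_winner(team_a, team_b):
    """Each team is (name, apr, gsr, fed_grad_rate); tuples compare in order."""
    return max(team_a, team_b, key=lambda t: (t[1], t[2], t[3]))[0]

# Illustrative values only: tied on the first two metrics,
# one point apart on the federal graduation rate.
clemson = ("Clemson", 995, 93, 85)
liberty = ("Liberty", 995, 93, 84)

winner = academic_winner(clemson, liberty)
```

Python's tuple comparison handles the fallthrough automatically: the third metric is consulted only when the first two are tied.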
Now, the Inside Higher Ed bracket likely won’t win you any money. But there’s no bad time to celebrate the academic achievements of student athletes alongside their athletic prowess.
The AI Opportunities Action Plan, led by Matt Clifford CBE and announced in January, documents recommendations for the government to grow the UK’s AI sector to ‘position the UK to be an AI maker, not an AI taker’ in the field and help achieve economic growth.
The UK’s AI Action Plan highlights the critical need to harness international talent and expand the workforce with AI expertise. However, this ambition is at odds with recent moves by the British government to limit international student numbers through stricter visa regulations, leading universities to make difficult decisions—cutting courses, slashing budgets, and exploring alternative strategies to maintain financial stability and global relevance.
The AI Action Plan: A policy contradiction
Despite a well-documented skills gap in the UK’s AI sector, the Government’s actions have forced universities to pivot toward establishing global campuses in a bid to preserve financial stability and sustain international collaboration. This trend is exemplified by universities like Coventry University, which opened a campus in Delhi last year, and Lancaster University’s partnership with Deakin University in Indonesia. Today, UK universities operate 38 campuses across 18 countries, educating more than 67,750 students abroad.
While these international campuses help extend the UK’s academic reach, the UK’s immigration policies are creating significant barriers to attracting top-tier AI talent to work domestically. Many international graduates, trained to UK standards, are struggling to secure postgraduate visas for themselves and their families, preventing them from contributing their skills to the UK economy.
Visa barriers for graduates
One of the main visa routes intended to help international talent integrate into the UK workforce is the High Potential Individual (HPI) visa. The HPI visa is a UK immigration pathway designed for recent graduates from 40 top global universities, allowing them to live and work in the UK for several years. However, this scheme remains restrictive. To qualify, applicants must have a qualification from one of the eligible global universities in the last five years. Of the universities included, 47.62% are from the US, and there is just one institution from the entire southern hemisphere on the list.
The AI Action Plan recommended the government consider reforming the HPI pathway, noting that ‘graduates from some leading AI institutions, such as the Indian Institutes of Technology and (since 2020) Carnegie Mellon University in the US, are not currently included in the High Potential Individual visa eligibility list’.
The AI Action Plan itself highlights the need for a rethink of the UK’s immigration system to attract graduates from top AI institutions worldwide. However, the government has only ‘partially agreed’ with this recommendation, pointing to existing visa schemes that it believes meet the needs of skilled workers, including AI graduates. Yet the UK visa process is often expensive, and existing routes fail to account for the challenges that international graduates face when trying to secure long-term employment, especially in industries with rapidly evolving skills like AI. Even if the HPI eligibility list were expanded, our existing visa pathways are too restrictive to support a rapid influx of skilled graduates.
Government and university collaboration
The AI Action Plan calls on the government to ‘support Higher Education institutions in increasing the number of AI graduates and teaching industry-relevant skills.’ The reality is that many UK universities have already adjusted their strategies to cope with both domestic financial pressures and the measures introduced to curb international student numbers through restricted immigration pathways.
The question remains whether universities will be expected to reverse course, intensifying efforts to recruit and retain AI talent domestically to meet the government’s urgent targets. Without a targeted and affordable visa system to support these efforts, the AI Action Plan’s goals risk falling short of their potential.
This is not about asking universities to ensure that their international students have clear career pathways post-graduation or to provide AI-specific courses. The government must create an AI-specific visa that allows graduates from top global institutions to work in the UK.
The real need lies in fostering closer collaboration between higher education institutions and government policymakers, particularly when it comes to visas. The government must take responsibility for creating a new visa pathway if it wants to meet the aims of the AI Action Plan. Universities cannot be expected to U-turn, developing new courses in the face of financial constraints and restrictive visa policies.
Mauve Group is a global HR, Employer of Record and business consultancy provider. Mauve specialises in supporting organisations of all sizes to expand overseas, helping companies navigate the complexities of employing workers across borders.
SHINGLETOWN, Calif. — On a cold morning in October, the sun shone weakly through tall sugar pines and cedars in Shingletown, a small Northern California outpost whose name is a reminder of its history as a logging camp in the 1800s. Up a gravel road banked with iron-rich red soil, Dylan Knight took a break from stacking logs.
Knight is one of 10 student loggers at Shasta College training to operate the heavy equipment required for modern-day logging: processors to remove limbs from logs that have just been cut, skidders to pull logs out of the cutting site, loaders to stack and sort the logs by species and masticators to mulch up debris.
For centuries, logging was a seasonal, learn-on-the-job trade passed down from father to son. But as climate change and innovations in the industry have changed logging into a year-round business, there aren’t always enough workers to fill jobs.
“Our workforce was dying,” said Delbert Gannon, owner of Creekside Logging. “You couldn’t even pick from the bottom of the barrel. It was affecting our production and our ability to haul logs. We felt we had to do something.”
Around the country, community colleges are stepping in to run apprenticeship programs for heritage industries, such as logging and aquaculture, that are too small to run such programs themselves. These partnerships help colleges expand the workforce development programs central to their mission. The partnerships also help keep small businesses in small industries alive by managing state and federal grants and providing the equipment, courses and staff to train workers.
As industries go, logging is small, and it’s struggling. In 2023 there were only about 50,000 logging jobs in the U.S., and the number of logging companies has been declining for several years. Most loggers are over 50, according to industry data, and older generations are retiring, contributing to more than 6,000 vacant positions every year on average. The median annual salary for loggers is about $50,000.
Student logger Bryce Shannon operates a wood chipper at a logging site as part of his instruction at Shasta College in Redding, Calif. Credit: Minh Connors for The Hechinger Report
Retirements have hit Creekside Logging hard. In 2018 Gannon’s company had jobs to do, and the machines to do them, but nobody to do the work. He reached out to Shasta College, which offers certificates and degrees in forestry and heavy equipment operation, to see if there might be a student who could help.
That conversation led to a formal partnership between the college and 19 timber companies to create a pre-apprenticeship course in Heavy Equipment Logging Operations. Soon after, they formed the California Registered Apprenticeship Forest Training program. Shasta College used $3.5 million in grant funds to buy the equipment pre-apprentices use.
Logging instruction takes place on land owned by Sierra Pacific Industries lumber company — which does not employ its own loggers and so relies on companies like Creekside Logging to fell and transport logs to mills.
Each semester, 10 student loggers like Knight take the pre-apprenticeship course at Shasta College. Nearly all are hired upon completion. Once employed, they continue their work as apprentices in the forest training program, which Shasta College runs in partnership with employers like Gannon. State apprenticeship funds help employers offset the cost of training new workers, as well as the lost productivity of on-the-job mentors.
For Creekside Logging — a 22-person company — working with Shasta College makes participation in the apprenticeship program possible. Gannon’s company often trained new loggers, only to have them back out of the job months later. It can cost tens of thousands of dollars to train a new worker, and Creekside couldn’t afford to keep taking the financial risk. Now Gannon has a steady flow of committed employees, trained at the college rather than on his payroll. Workers who complete the pre-apprenticeship know what they’re getting into — working outdoors in the cold all day, driving big machines and cutting down trees.
Workers who complete the apprenticeship, Gannon said, are generally looking for a career and not just a seasonal job.
Talon Gramps-Green, a student logger at Shasta College in Redding, Calif., shows off stickers on his safety helmet. Credit: Minh Connors for The Hechinger Report
“You get folks that are going to show up every day,” Gannon said. “They got to test drive the career and know they like heavy equipment. They want to work in the woods. The college has solved that for us.”
Apprentices benefit too. Workers who didn’t grow up around a trade can try it out, which for some means tracking down an elusive pathway into the work. Kyra Lierly grew up in Redding, about 30 miles west of Shingletown, and previously worked for the California Department of Forestry as a firefighter. She’s used to hard work, but when she looked into getting a job as a logger she couldn’t find a way in. Some companies had no office phone or website, she says. Jobs were given out casually, by word of mouth.
“A lot of logging outfits are sketchy, and I wanted to work somewhere safe,” said Lierly, 25. She worked as an apprentice with Creekside Logging but is taking a break while she completes an internship at Sierra Pacific Industries, a lumber producer, and gets a certificate in natural resources at Shasta College.
“The apprenticeship made forestry less intimidating because the college isn’t going to partner with any company that isn’t reputable,” Lierly said.
Apprenticeships, with their combination of hands-on and classroom learning, are found in many union halls but, until now, have not been common practice at the forested work sites of logging crews.
State and federally registered apprenticeships have gained popularity in recent years as training tools in health care, cybersecurity and telecommunications.
Federal funding grew steadily from $145 million in 2018 to more than $244 million during the last years of the Biden administration. That money was used to support apprenticeships in traditional building trades as well as industries that don’t traditionally offer registered apprenticeships, including teaching and nursing.
The investment aims to address the shortage of skilled workers. The number of working adults in the U.S. doesn’t align with the number of skilled jobs, a disparity that has been slow to narrow since the pandemic.
Labor shortages hit especially hard in rural areas, where trades like logging have an outsized impact on their local economies. For regional heritage trades like logging, just a few apprentices can make the difference between staying in business and shutting down.
Lucas Licea, a student logger at Shasta College in Redding, Calif., operates a loader. Credit: Minh Connors for The Hechinger Report
“There’s a common misconception of registered apprentices that they’re only in the building trades when most are in a variety of sectors,” said Manny Lamarre, who served as deputy assistant secretary for employment and training with the Labor Department during the Biden administration. More than 5,000 new occupations have registered with the department to offer apprenticeships since 2021, he said. “We can specifically support unique small occupations in rural communities where a lot of people are retiring.”
Education Secretary Linda McMahon, who was confirmed earlier this month, said in her confirmation hearing that she supports apprenticeships. But ongoing cuts make it unclear what the new federal role will be in supporting such programs.
However, “sharing the capacity has been an important way to get apprenticeships into rural and small employers,” said Vanessa Bennett, director at the Center for Apprenticeship and Work-Based Learning at the nonprofit Jobs for the Future. It’s helpful when employers partner with a nonprofit or community college that can sponsor an apprenticeship program, as Shasta College does, Bennett said.
Once Knight, the student logger, completes the heavy equipment pre-apprenticeship, he plans to return to his hometown of Oroville, about 100 miles south of Shingletown. His tribe — the Berry Creek Rancheria of Tyme Maidu Indians — is starting its own logging crew, and Knight will be one of only two members trained to use some of the most challenging pieces of logging equipment.
“This program is awesome,” said Knight, 24. “It’s really hands-on. You learn as you go and it helps to have a great instructor.”
Student logger Dylan Knight drives a masticator, which grinds wood into chips, as Shasta College instructor Chris Hockenberry looks on. Credit: Minh Connors for The Hechinger Report
Across the country in Maine, a community college is helping to train apprentices for jobs at heritage oyster, mussel and kelp farms that have struggled to find enough workers to meet the growing demand for shellfish. Often classified as seasonal work, aquaculture jobs can become year-round careers for workers trained in both harvesting shellfish and planning for future seasons.
“I love the farm work and I feel confident that I will be able to make a full-length career out of this,” said Gabe Chlebowski, who completed a year-long apprenticeship with Muscongus Bay Aquaculture, which harvests in Damariscotta, Maine. A farm boy from rural Pennsylvania, Chlebowski worked in construction and stone masonry after high school. When his parents moved to Maine, he realized that he wanted a job on the water. With no prior experience, he applied for an oyster farming apprenticeship and was accepted.
“I was the youngest by five years and the only person who’d never worked on water,” said Chlebowski, 22. “I grew up in a landlocked state surrounded by corn fields. I had the work ethic and no idea what I was doing in boats.”
The apprenticeship program was launched in 2023 by the Gulf of Maine Research Institute, which joined with the Maine Aquaculture Association and Educate Maine to create a yearlong apprenticeship with Southern Maine Community College. Apprentices take classes in shellfish biology, water safety, skiff driving and basic boat maintenance. Grants helped pay for the boots, jackets and fishing bibs apprentices needed.
“The workforce here was a bottleneck,” said Carissa Maurin, aquaculture program manager for GMRI. New workers with degrees in marine biology were changing their minds after starting training at aquaculture farms. “Farms were wasting time and money on employees that didn’t want to be there.”
Chlebowski completed the apprenticeship at Muscongus Bay in September. He learned how to repair a Yamaha outboard motor, how to grade oysters and how to work on a 24-foot, flat-bottom skiff. He stayed on as an employee, working at the farm on the Damariscotta River — the oyster capital of New England. The company is known for two varieties of oysters: Dodge Cove Pemaquid and Wawenauk.
Oyster farming generates local pride, Chlebowski said. The Shuck Station in downtown Damariscotta gives oyster farmers a free drink when they come in and there’s an annual summer shucking festival. But the company is trying to provide careers, Chlebowski said, not just high-season jobs.
“It can be hard to make a career out of farming, but it’s like any trade,” he said, adding that there is work to do year-round. “Welding and HVAC have trade schools and apprenticeships. Why shouldn’t aquaculture?”
Chlebowski’s apprenticeship turned into a career. Back in Shingletown, students in the logging program hope for the same result when they finish.
Until then, they spend Mondays, Wednesdays and Fridays in the woods learning how to operate and maintain equipment. Tuesdays and Thursdays are spent on Shasta College’s Redding campus, where the apprentices take three classes: construction equipment operation, introduction to forestry and wood products and milling.
At the end of the semester, students demonstrate their skills at a showcase in the Shingletown woods, where logging company representatives scout for workers. Students typically get offers at the showcase. So far, 50 students have completed the pre-apprenticeship program and most transitioned into full apprenticeships. Fifteen people have completed the full apprenticeship program and now earn from $40,000 to $90,000 a year as loggers.
Mentorship is at the heart of apprenticeships. On the job, new workers are paired with more experienced loggers who pass on knowledge and supervise the rookies as they complete tasks. Pre-apprentices at Shasta College learn from Jonas Lindblom, the program’s heavy equipment and logging operations instructor.
At the logging site, Lindblom watches as a tall sugar pine slowly falls and thuds to the ground. Lindblom’s father, grandfathers and great-grandfather all drove trucks for logging companies in Northern California.
An axe sticks out of a freshly cut tree at a logging site used to train student loggers enrolled at Shasta College in Redding, Calif. Credit: Minh Connors for The Hechinger Report
This is a good area for apprentices to “just be able to learn at their pace,” he said. “They’re not pushed and they can get comfortable in the machines without developing bad habits along the way.”
Lindblom, who studied agriculture education at Chico State University, spent all his breaks during college working as a logger. He works closely with the logging companies that partner with the program to make sure he’s teaching up-to-date practices. It’s better for new loggers to learn in this outdoor classroom, he said, than on the job.
“The majority of these students did not grow up in logging families,” he said. “This is a great opportunity to pass on this knowledge and share where the industry is going.”
Contact editor Christina A. Samuels at 212-678-3635 or [email protected].
While many of our conversations have focused on what generative AI means for student assignments and learning outcomes, there’s another question faculty are asking—often individually and quietly: How can we leverage AI in our own academic and administrative work? And more importantly, should we?
The answer, I believe, lies in using AI to help clear space for the work only we can do—the collaboration, connection, and critical guidance that makes education transformative.
That doesn’t mean that we simply use AI as a crutch for answering emails or summarizing meetings. In fact, I believe the true promise of AI comes from using it, in Ethan Mollick’s words, as a “genuine intellectual partner,” one that can enhance classroom discussions, assist with creating engaging instructional materials, and even help develop sophisticated problem sets or simulations that previously required extensive preparation time. As Mollick says, “the focus needs to move from task automation to capability augmentation.”
AI offers many potential applications for faculty work. While faculty should continue to prioritize the importance of maintaining human connection, empathy, and support in our teaching practice, we need to consider other ways AI can augment our work. Perhaps one way is in the design of our courses, the assignments and activities that chart student progress across content and outcomes. But rather than asking AI to develop prompts or notes for us, we can use AI as a tool to help develop our work in surprising ways.
Works in Theory, Wobbles in Practice
We’ve all fallen in love with that one key discussion question or written assignment prompt that just fizzles in the classroom. Despite our best intentions, we may not provide enough information, or we fail to anticipate a blind spot that leads students down fruitless paths. One of the challenges of course design is that all our work can seem perfectly clear and effective when we are knee-deep in the design process, but everything somehow falls apart when deployed in the wild. From simple misunderstandings to complex misconceptions, these issues typically don’t reveal themselves until we see actual student work—often when it’s too late to prevent frustration.
Bridging this gap requires iterative refinement—recognizing that what works in theory or in controlled conditions needs real-world testing, adaptation, and continuous improvement. It’s not just about designing something that works in the lab but ensuring our designs are resilient, adaptable, and responsive enough to thrive in the wild.
While there’s no substitute for real-world testing, I began wondering if AI could help with this iterative refinement. I didn’t want AI to refine or tweak my prompts. I wanted to see if I could task AI with modeling hundreds of student responses to my prompts in the hope that this process might yield the kind of insight I was too close to see.
The Process: AI-Assisted Assignment Stress Testing
After experimenting with systems like Claude and ChatGPT, I’ve discovered they can effectively analyze and refine writing prompts through the creation of simulated student responses. The basic approach works like this. First, provide the AI with information about your course and key characteristics of your student population. Then, share the assignment prompt. The AI internally generates multiple simulated student responses across different skill levels. After, it provides a comprehensive analysis identifying potential issues and opportunities.
You might specify that the analysis include common misinterpretations students might make or any structural or organizational challenges in the prompt. But the AI can also identify content development patterns and potential issues as well as population-specific concerns based on your student demographics. Finally, the AI can even suggest refinements to the prompt.
Seeing What You’re Not Seeing
To test this approach, I uploaded a personal narrative prompt that asks students to connect their life experiences to their academic goals—a common assignment in first-year writing courses.
The AI analysis revealed several blind spots in my prompt design. For instance, I hadn’t considered how non-traditional students might struggle with “choice of major” language, since many are career-changers. The AI modeled responses also revealed that students might have difficulty transitioning between personal narrative and academic analysis sections. Most valuable was seeing how different student populations might interpret the same instructions. Career-changers might focus too heavily on work experiences, while others might struggle with how much personal information to share. These insights allowed me to add clarifying language and support materials before any real students encountered these challenges.
The entire process took about 30 minutes but potentially saved hours of student confusion and faculty clarification emails. Of course, AI responses aren’t identical to human student responses, and we should be cautious about viewing AI as an infallible expert or source of absolute truth. But used as an additional lens when developing assignments, this approach can grant course designers a different perspective, one that triggers valuable insights and potentially reduces workload.
This process allowed me to develop targeted support materials for predicted problem areas before students struggle, building proactive scaffolding into course design from the beginning. And by sharing insights gained through AI analysis, departments could collectively improve assignment design practices—particularly valuable for multi-section courses where consistency matters. Over time, we could build a practical library of “what works” that faculty could draw from, including analyses explaining why certain assignments succeed with particular student populations and learning objectives.
AI-assisted assignment analysis offers a promising tool that respects our expertise while expanding our ability to anticipate student needs. While the technology isn’t perfect and will never replace insights gained from direct student interaction, it provides a valuable perspective that helps identify blind spots before students encounter them. This represents just one way thoughtfully implemented AI can help us do more of what matters: creating meaningful learning experiences. By using AI for the predictive work of assignment design, we free more time and energy for the deeply human work of guiding and connecting with our students—the work that only we can do.
Dr. Nathan Pritts is a leader in higher education, specializing in faculty development, instructional innovation, and the integration of emerging technologies in teaching and learning. As Professor and Program Chair for First Year Writing at the University of Arizona Global Campus, he has spearheaded initiatives in strategic implementation of online learning technologies, comprehensive faculty training programs, and the creation of scalable interventions to support both faculty and students in online environments. As author and researcher, Dr. Pritts has published widely on topics including digital pedagogy, AI-enhanced curriculum design, assessment strategies, and the future of higher education.
While many of our conversations have focused on what generative AI means for student assignments and learning outcomes, there’s another question faculty are asking—often individually and quietly: How can we leverage AI in our own academic and administrative work? And more importantly, should we?
The answer, I believe, lies in using AI to help clear space for the work only we can do—the collaboration, connection, and critical guidance that makes education transformative.
That doesn’t mean that we simply use AI as a crutch for answering emails or summarizing meetings. In fact, I believe the true promise of AI comes from using it, in Ethan Mollick’s words, as a “genuine intellectual partner,” one that can enhance classroom discussions, assist with creating engaging instructional materials, and even help develop sophisticated problem sets or simulations that previously required extensive preparation time. As Mollick says, “the focus needs to move from task automation to capability augmentation.”
AI offers many potential applications for faculty work. While faculty should continue to prioritize the importance of maintaining human connection, empathy, and support in our teaching practice, we need to consider other ways AI can augment our work. Perhaps one way is in the design of our courses, the assignments and activities that chart student progress across content and outcomes. But rather than asking AI to develop prompts or notes for us, we can use AI as a tool to help develop our work in surprising ways.
Works in Theory, Wobbles in Practice
We’ve all fallen in love with that one key discussion question or written assignment prompt that just fizzles in the classroom. Despite our best intentions, we may not provide enough information, or we fail to anticipate a blind spot that leads students down fruitless paths. One of the challenges of course design is that all our work can seem perfectly clear and effective when we are knee-deep in the design process, but everything somehow falls apart when deployed in the wild. From simple misunderstandings to complex misconceptions, these issues typically don’t reveal themselves until we see actual student work—often when it’s too late to prevent frustration.
Bridging this gap requires iterative refinement—recognizing that what works in theory or in controlled conditions needs real-world testing, adaptation, and continuous improvement. It’s not just about designing something that works in the lab but ensuring our designs are resilient, adaptable, and responsive enough to thrive in the wild.
While there’s no substitute for real-world testing, I began wondering if AI could help with this iterative refinement. I didn’t want AI to refine or tweak my prompts. I wanted to see if I could task AI with modelling hundreds of student responses to my prompts in the hope that this process might yield the kind of insight I was too close to see.
The Process: AI-Assisted Assignment Stress Testing
After experimenting with systems like Claude and ChatGPT, I’ve discovered they can effectively analyze and refine writing prompts through the creation of simulated student responses. The basic approach works like this. First, provide the AI with information about your course and key characteristics of your student population. Then, share the assignment prompt. The AI internally generates multiple simulated student responses across different skill levels. After, it provides a comprehensive analysis identifying potential issues and opportunities.
You might specify that the analysis include common misinterpretations students might make or any structural or organizational challenges in the prompt. But the AI can also identify content development patterns and potential issues as well as population-specific concerns based on your student demographics. Finally, the AI can even suggest refinements to the prompt.
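The workflow above is, at its core, a structured prompt. As a rough sketch (the function name, course details, and population notes here are invented placeholders for illustration, not a fixed recipe or a specific tool's API), the setup might be scripted like this:

```python
def build_stress_test_prompt(course_info: str, population: str,
                             assignment: str, n_responses: int = 20) -> str:
    """Assemble a single stress-test request for a chat-based AI system.

    The AI is asked to (1) simulate student responses internally across
    skill levels, then (2) report only the analysis of likely problems.
    """
    return (
        f"Course context:\n{course_info}\n\n"
        f"Student population:\n{population}\n\n"
        f"Assignment prompt:\n{assignment}\n\n"
        f"Internally draft {n_responses} simulated student responses "
        "spanning weak, average, and strong skill levels. Do not show the "
        "responses. Instead, report: (a) common misinterpretations, "
        "(b) structural or organizational challenges in the prompt, "
        "(c) population-specific concerns, and (d) suggested refinements."
    )

# Example use, with invented course details:
request = build_stress_test_prompt(
    course_info="First-year writing, fully online, 8-week terms",
    population="Many non-traditional students and career-changers",
    assignment="Write a personal narrative connecting your life "
               "experiences to your choice of major.",
)
```

The key design choice is asking the AI to keep the simulated responses internal and surface only the analysis, which keeps the output focused on actionable revision points rather than pages of synthetic student writing.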
Seeing What You’re Not Seeing
To test this approach, I uploaded a personal narrative prompt that asks students to connect their life experiences to their academic goals—a common assignment in first-year writing courses.
The AI analysis revealed several blind spots in my prompt design. For instance, I hadn’t considered how non-traditional students might struggle with “choice of major” language, since many are career-changers. The AI modeled responses also revealed that students might have difficulty transitioning between personal narrative and academic analysis sections. Most valuable was seeing how different student populations might interpret the same instructions. Career-changers might focus too heavily on work experiences, while others might struggle with how much personal information to share. These insights allowed me to add clarifying language and support materials before any real students encountered these challenges.
The entire process took about 30 minutes but potentially saved hours of student confusion and faculty clarification emails. Of course, AI responses aren’t identical to human student responses, and we should be cautious about viewing AI as an infallible expert or source of absolute truth. But used as an additional lens when developing assignments, this approach can grant course designers a different perspective, one that triggers valuable insights and potentially reduces workload.
This process allowed me to develop targeted support materials for predicted problem areas before students struggle, building proactive scaffolding into course design from the beginning. And by sharing insights gained through AI analysis, departments could collectively improve assignment design practices—particularly valuable for multi-section courses where consistency matters. Over time, we could build a practical library of “what works” that faculty could draw from, including analyses explaining why certain assignments succeed with particular student populations and learning objectives.
AI-assisted assignment analysis offers a promising tool that respects our expertise while expanding our ability to anticipate student needs. While the technology isn’t perfect and will never replace insights gained from direct student interaction, it provides a valuable perspective that helps identify blind spots before students encounter them. This represents just one way thoughtfully implemented AI can help us do more of what matters: creating meaningful learning experiences. By using AI for the predictive work of assignment design, we free more time and energy for the deeply human work of guiding and connecting with our students—the work that only we can do.
Dr. Nathan Pritts is a leader in higher education, specializing in faculty development, instructional innovation, and the integration of emerging technologies in teaching and learning. As Professor and Program Chair for First Year Writing at the University of Arizona Global Campus, he has spearheaded initiatives in strategic implementation of online learning technologies, comprehensive faculty training programs, and the creation of scalable interventions to support both faculty and students in online environments. As author and researcher, Dr. Pritts has published widely on topics including digital pedagogy, AI-enhanced curriculum design, assessment strategies, and the future of higher education.
On the whole, research funding is not configured to be sensitive to place.
Redistribution
It does good things in regions, but this is different to funding being configured to do so. For example, universities in the North East performed strongly in the REF and as a consequence received an uplift in QR funding. This will allow them to invest in their research capacity, bringing agglomeration benefits to the North East and going some small way toward rebalancing the UK’s research ecosystem away from London.
REF isn’t designed to do that. It has absolutely no interest in where research takes place, just that the research that takes place is excellent. But the UK isn’t a very big place and it has a large number of universities: if you fund enough things in enough places you will eventually help support regional clusters of excellence.
There are of course some specific place-based funds, but being regionally focussed does not mean they are redistributive. The Higher Education Innovation Fund (HEIF) is focussed on regional capacity, but it amounts to £260m of a total annual Research England funding distribution of £2.8bn. HEIF is calculated using providers’ knowledge exchange work with businesses, the public and third sectors, and the wider public. A large portion of the data is gathered through the HE-BCI Survey.
The result is that place-based funding exists, but institutions with larger research capacities inevitably receive larger amounts of it. Of the providers that received the maximum HEIF funding in 2024/25, five were within the golden triangle, one was in the West Midlands, one was in the East Midlands, two were in Yorkshire and the Humber, one was in the North West, and one was in the South East but outside the golden triangle. It is regional but it is not redistributive.
The Strength in Places Fund (SIPF) is a £312.5 million competitive funding scheme that takes a place-based approach to research and innovation (R&I) funding. SIPF is a UK Research and Innovation (UKRI) strategic fund managed by the SIPF delivery team based at Innovate UK and Research England. The aim of the Fund is to help areas of the UK build on existing strengths in R&I to deliver benefits for their local economy.
This fund has been more successful in achieving a more regionally distributed spread of funding. For example, it has delivered £47m to Wales compared to only £18m in South East England. Although quality was a key factor, and there are some challenges to how aligned projects are with wider regional priorities, it seems that a focus on a balanced portfolio made a difference. As RAND Europe note:
[…]steps were taken to ensure a balanced portfolio in terms of geographical spread and sectors; however, quality was the primary factor influencing panel recommendations (INTXX). Panel members considered the projects that had been funded in Wave 1 and the bids submitted in Wave 2, and were keen on ensuring no one region was overrepresented. One interviewee mentioned that geographical variation of awards contributed to the credibility of a place-based funding system[…].
The Regional Innovation Fund, which aimed to support local innovation capacity, was allocated with a specific modifier to account for where there had historically been less research investment. SIPF has been a different approach to solving the same conundrum of how best to support research potential in every region of the UK.
It’s within this context that it is interesting to arrive at UKRI’s most recent analysis of the geographical distribution of its funding in 2022/23 and 2023/24. There are two key messages. The first is that
All regions and nations received an increase in UKRI investment between the financial years 2021 to 2022 and 2023 to 2024. The greatest absolute increases in investment were seen in the North West, West Midlands and East Midlands. The greatest proportional increases were seen in Northern Ireland, the East Midlands and North West.
And the second is that
The percentage of UKRI funding invested outside London, the South East and East of England, collectively known as the ‘Greater South East’, rose to 50% in 2023 to 2024. This is up from 49% in the 2022 to 2023 financial year and 47% in the 2021 to 2022 financial year. This represents a cumulative additional £1.4 billion invested outside the Greater South East since the 2021 to 2022 financial year.
Waterloo sunset?
In the most literal sense, the funding between the Greater South East and the rest of the country could not be more finely balanced. In flat cash terms the rest of the UK has overtaken the Greater South East for the first time, while investment per capita in the Greater South East still outstrips the rest of the country by a significant amount.
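It is worth pausing on how flat cash and per-capita measures can point in opposite directions. A toy calculation with invented round numbers (not UKRI’s actual figures) shows how the rest of the UK can lead in total funding while still trailing per head:

```python
# Invented illustrative figures, not actual UKRI data.
greater_south_east = {"funding_m": 4_000, "population_m": 24.0}
rest_of_uk = {"funding_m": 4_100, "population_m": 44.0}

def per_capita(region: dict) -> float:
    """Funding per person in pounds (funding in £m over population in m)."""
    return region["funding_m"] / region["population_m"]

# The rest of the UK leads in flat cash terms...
assert rest_of_uk["funding_m"] > greater_south_east["funding_m"]

# ...yet the Greater South East still receives far more per head.
gse_per_head = round(per_capita(greater_south_east), 1)  # 166.7
rest_per_head = round(per_capita(rest_of_uk), 1)         # 93.2
```

Because the Greater South East has a much smaller population, even an even split of funding leaves it well ahead per head, which is why both measures are needed to describe the balance.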
The reason for this shift is greater investment in the North West, West Midlands, and East Midlands, which cumulatively saw an increase of £550m in funding over the past three years. The regions with the highest absolute levels of funding saw some of the smallest proportional increases in investment.
The evaluations and UKRI’s dataset present an interesting picture. There is nothing unusual about the way funding is distributed: it follows where the highest numbers of researchers and providers, and the most economic activity, are located. It would be an entirely arbitrary mechanism that penalised the South East for having research strengths.
Simultaneously, with constrained resources, there are lots of latent assets outside of the golden triangle that will not get funding. The UK is unusually reliant on its capital as an economic contributor, and research funding follows this. The only way to rebalance is to make deliberate efforts, as with SIPF, to lean toward a more balanced portfolio of funding.
This isn’t a plea to completely rip up the rule book, and a plea for more money in an era of fiscal constraint will not be listened to, but it does bring into sharp relief a choice. Either research policy is about bolstering the UK’s economic centre or it is about strengthening the potential of research where it receives less funding. There simply is not enough money to do both.
As Wonkhe’s commuter student series has demonstrated, there are many students, scholars, policy makers and practitioners doing some excellent work to highlight the challenges faced by commuter students in higher education.
Whilst the narrative might be changing, what is less discussed are the benefits of commuting for students.
In my PhD research, whilst students spoke of long and difficult journeys, struggling to park on campus, or counterintuitive academic timetabling, they also spoke about the positives of their experiences. It’s these that I think are an important starting point for those looking to know more about this student group.
Challenging the narrative
Simply put, not all students want to live in student accommodation. Some prefer to live with family. Mature students don’t necessarily want to live with younger students or uproot their family units into family accommodation options.
For those who had been in full-time employment prior to starting their studies, the commute was just part of a working day.
One of the students in my study specifically told me that they didn’t expect their institution to do anything special for them as a commuter because commuting was just “what you do” in order to get to class.
For the wider student experience, a number of students in my research participated in multiple extra-curricular activities and had numerous friends they’d met through these, as well as on their commute or on their course. One commuter even elected to get a rail replacement bus at a weekend to participate in their sports club. Anyone who has ever taken a rail replacement bus will know that this isn’t a decision taken lightly.
In all these instances, commuter students demonstrated either the positive benefits of commuting, or that they were able to participate in university life irrespective of their commute.
I’m not trying to say that all commuter student experiences are positive. Unless you have an unusual amount of luck, you’re going to be late to class at some point due to some kind of transport issue. We’ve all been there, stood waiting for a bus that doesn’t turn up or a train that’s delayed. It’s frustrating and sometimes (read: often) it makes you late to class.
What I am saying is that this deficit narrative of commuter students often does them a bit of a disservice and ends up homogenising their experiences in a way that isn’t always that useful if we’re trying to think about how to improve their experience.
After all, when we’re framing commuter student experiences what we’re really doing is talking about how they fit in with the institutional structures that frame their university life.
For one of the commuters I spent time with, delays to their journey were frequent even when allowing plenty of time for their journey. What is of interest though was what happened next.
One of their class tutors routinely enforced the institution’s academic attendance policy. The policy stated that if a student arrived more than 15 minutes after the class had started, they’d be marked as absent. This meant that a couple of times, whilst they’d informed this staff member via email in advance that they’d be late and did physically make it to their class, they were still marked absent because they arrived 15 minutes late.
In contrast another tutor, acknowledging that this student had emailed them warning of their late attendance, allowed them to sign the register when they arrived.
Institutions can have one attendance policy but have staff enact it in different ways. If you’re a commuter student who’s had an awful journey to get to class, it’s not ideal when you’re not sure how your class tutor might respond.
This particular student had previously been advised by their tutor to factor more time into their journey in order to arrive on time. This student, along with many others in my study, was already factoring extra time into their journeys.
You can also see why there might be some disgruntlement between students here. If you’ve done the same journey as someone else on your course who has a different module tutor, you’d be annoyed to find out that their tutor let them sign the register yet yours marked you as absent. Experiences like these can lead to differing expectations between students and the institution, which could develop into some pretty bad feeling.
Great expectations
I suggest the question here is what the institution deems a reasonable expectation of its students, and by extension whether or not its structures are flexible enough to accommodate fluctuations within these.
If the above example is anything to go by, it’s likely that what’s considered a reasonable expectation of commuter students will differ between students, institutional policies and, ultimately, the staff that enact them.
Where students spoke to me about commuting to university in more positive terms, it was often relating to how their expectations of university had matched their reality. This could be down to the individual themselves. For example, where students had researched their commute or done a commute to work prior to starting their degree they took things like disruptions to travel as simply part of life as a commuter student.
But where the expectations being asked of students changed, for example being taught by different academic staff with different stances on attendance, or where expectations were not made clear between the institution and the student to begin with, this negative narrative could often arise.
Being a commuter student does not have to mean having a poor student experience. But if we want to hear more about the positive experiences of commuter students, we need to think about why they’re positive and consider how our institutions can enhance these experiences further.