Amy Wax was suspended and had her pay cut for the 2025–26 academic year after repeatedly making disparaging remarks.
Jumping Rocks/Universal Images Group/Getty Images
A federal district judge in Pennsylvania dismissed a lawsuit Thursday against the University of Pennsylvania filed by Amy Wax, a tenured law professor who was suspended for the 2025–26 academic year on half pay as punishment for years of flagrantly racist, sexist, xenophobic and homophobic remarks.
In the suit filed in January, Wax claimed that the university discriminated against her by punishing her—a white Jewish woman—for speech about Black students but not punishing pro-Palestinian faculty members for speech that allegedly endorsed violence against Jews.
“As much as Wax would like otherwise, this case is not a First Amendment case. It is a discrimination case brought under federal antidiscrimination laws,” senior U.S. district judge Timothy Savage wrote in a 16-page opinion. “We conclude Wax has failed to allege facts that show that her race was a factor in the disciplinary process and there is no cause of action under federal anti-discrimination statutes based on the content of her speech.”
Savage also rejected Wax’s argument that the court should view “her comments disparaging Black students as a statement on behalf of a protected class.”
“Nothing in the disciplinary process or her comments leads to the conclusion that she was penalized for associating with a protected class. Her comments were not advocacy for protected classes,” he wrote. “They were negative and directed at protected classes. Criticizing minorities does not equate to advocacy for them or for white people. Her claim that criticism of minorities was a form of advocating for them is implausible.”
Wax was sanctioned in September 2024 after a years-long disciplinary battle over a laundry list of offensive statements she made during her tenure at the law school, including that “gay couples are not fit to raise children,” “Mexican men are more likely to assault women” and that it is “rational to be afraid of Black men in elevators.” Wax has worked at the law school since 2001.
In addition to a one-year suspension on half pay, the school eliminated her summer pay in perpetuity, publicly reprimanded her and took away her named chair. In 2018, she was removed from teaching required courses after commenting on the “academic performance and grade distributions of the Black students in her required first-year courses,” according to former dean of the law school Theodore W. Ruger.
Funds from the sale of the campus will pay off a portion of MCNY’s $67.4 million outstanding debt.
Cash-strapped Metropolitan College of New York is planning to sell its Manhattan campus to the City University of New York for $40 million, a regulatory filing first reported by Bloomberg shows.
The two institutions signed a letter of intent on Monday, according to the regulatory filing, which notes that proceeds will be used to pay off a portion of MCNY’s $67.4 million outstanding debt.
MCNY agreed to sell the site last year as part of a forbearance agreement with bondholders.
Metropolitan College of New York has struggled to keep up with debt in recent years and failed to maintain the agreed-upon ratio of liquid assets, according to a regulatory filing from July. The small college enrolled fewer than 500 students, according to the latest state data, and posted a deficit of more than $7 million in fiscal year 2023, publicly available financial data shows.
CUNY is purchasing 101,542 square feet across three floors in the shared building, which officials told Bloomberg they intend to use as a temporary site for the Hunter-Bellevue School of Nursing amid ongoing construction projects. The sale will require approval from bondholders as well as Metropolitan College’s accreditor, the Middle States Commission on Higher Education.
Lake-Sumter State College named GOP lawmaker John R. Temple as its president Thursday, making him the latest politician to helm a state institution, the Orlando Business Journal reported.
Temple, an ally of Republican governor Ron DeSantis, differs from many of his fellow politicians-turned-college presidents in that he has administrative experience in higher education. He was hired as the college’s associate vice president for workforce in 2023; previously he was a teacher and administrator in K–12 schools.
Other recent political hires include former lieutenant governor Jeanette Nuñez at Florida International University, lobbyist and DeSantis ally Marva Johnson at Florida A&M University, and former education commissioner Manny Diaz Jr. in an interim role at the University of West Florida.
A recent analysis by Inside Higher Ed found at least a dozen executive hires across the state college system with ties to the Republican Party or DeSantis since 2022; multiple others have donated thousands of dollars to GOP candidates and causes.
Another state institution, North Florida College, is also considering a political candidate for its next president. Mike Prendergast, former Citrus County sheriff and chief of staff for Rick Scott, the Republican governor–turned–U.S. senator, is one of several finalists for the North Florida job.
The University of Florida also hired an interim president last week, tapping for the job Donald Landry, a former Columbia University Medical School administrator with ties to conservative academic organizations. Landry was hired after the Florida Board of Governors rejected former University of Michigan president Santa Ono for the UF job over his past support of diversity, equity and inclusion initiatives, which he sought to distance himself from during his candidacy.
Faculty and administrators’ opinions about generative artificial intelligence abound. But students—path breakers in their own right in this new era of learning and teaching—have opinions, too. That’s why Inside Higher Ed is dedicating the second installment of its 2025–26 Student Voice survey series to generative AI.
About the Survey
Student Voice is an ongoing survey and reporting series that seeks to elevate the student perspective in institutional student success efforts and in broader conversations about college.
Some 1,047 students from 166 two- and four-year institutions, public and private nonprofit, responded to this flash survey about generative artificial intelligence and higher education, conducted in July. The data were captured by our survey partner Generation Lab. The margin of error is plus or minus three percentage points.
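The stated margin of error is consistent with the standard formula for a simple random sample at 95 percent confidence. A quick check, as a sketch assuming the conventional z = 1.96 and the worst-case proportion p = 0.5 (the survey’s actual weighting may differ):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a simple random sample.

    p = 0.5 is the worst case, since it maximizes p * (1 - p).
    """
    return z * math.sqrt(p * (1 - p) / n)

# 1,047 respondents -> roughly 0.03, i.e. plus or minus 3 percentage points
print(f"{margin_of_error(1047) * 100:.1f}")  # 3.0
```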
See what students have to say about trust in colleges and universities here, and look out for future student polls and reporting from our 2025–26 survey cycle, Student Voice: Amplified.
Some of the results are perhaps surprising: Relatively few students say that generative AI has diminished the value of college, in their view, and nearly all of them want their institutions to address academic integrity concerns—albeit via a proactive approach rather than a punitive one. Another standout: Half of students who use AI for coursework say it’s having mixed effects on their critical thinking abilities, while a quarter report it’s helping them learn better.
Here are seven things to know from the survey, plus some expert takes on what it all means, as higher education enters its fourth year of this new era and continues to struggle to lead on AI.
Most students are using generative AI for coursework, but many are doing so in ways that can support, not outsource, their learning.
The majority of students, some 85 percent, indicate they’ve used generative AI for coursework in the last year. The top three uses from a long list of options are: brainstorming ideas (55 percent), asking it questions like a tutor (50 percent) and studying for exams or quizzes (46 percent). Treating it like an advanced search engine also ranks high. Some other options present more of a gray area for supporting authentic learning, such as editing work and generating summaries. (Questions for educators include: Did the student first read what was summarized? How substantial were the edits?)
Fewer students report using generative AI to complete assignments for them (25 percent) or write full essays (19 percent). But elsewhere in the survey, students who report using AI to write essays are somewhat more likely than those using it to study to say AI has negatively impacted their critical thinking (12 percent versus 6 percent, respectively). Still, the responses taken as a whole add nuance to ongoing discussions about the potential rewards, not just risks, of AI. One difference: Community college students are less likely to report using AI for coursework, for specific use cases and over all. Twenty-one percent of two-year students say they haven’t used it in the last year, compared to 14 percent of four-year students.
Performance pressures, among other factors, are driving cheating.
The top reason students say some of their peers use generative AI in ways that violate academic integrity policies is pressure to get good grades (37 percent over all). Being pressed for time (27 percent) and not really caring about academic integrity policies (26 percent) are other reasons students chose. There are some differences across student subgroups, including by age: Adult learners over 25 are more likely than younger peers to cite lack of time due to work, family or other obligations, as well as lack of confidence in their abilities, for example. Younger students, meanwhile, are more likely to say that peers don’t really care about such policies, or don’t connect with course content. Despite the patchwork of academic integrity policies within and across institutions, few students—just 6 percent over all—blame unclear policies or expectations from professors about what constitutes cheating with AI.
Nearly all students want action on academic integrity, but most reject policing.
Some 97 percent believe that institutions should respond to academic integrity threats in the age of generative AI. Yet approaches such as AI-detection software and limiting technology use in classrooms are relatively unpopular options, selected by 21 percent and 18 percent of students, respectively. Instead, more students want education on ethical AI use (53 percent) and—somewhat contradicting the prior set of responses about what’s driving cheating—clearer, standardized policies on when and how AI tools can be used. Transparency seems to be a value: Nearly half of students want their institutions to allow more flexibility in using AI tools, as long as students are transparent about it.
Fewer support a return to handwritten tests or bluebooks for some courses, though this option is more popular among students at private nonprofit institutions than among their public institution peers, at 33 percent versus 22 percent. Those at private nonprofit institutions are also much more in favor of assessments that are generally harder to complete with AI, such as oral exams and in-class essays.
Students have mixed views on faculty use of generative AI for teaching.
A slight plurality of students (29 percent) are somewhat positive about faculty use of AI for creating assignments and other tasks, as long as it’s used thoughtfully and transparently. This of course parallels the stance many students want from their institutions on student AI use: flexibility underpinned by transparency.
Another 14 percent are very positive about faculty use of AI, saying it could make instruction more relevant or efficient. But 39 percent of students feel somewhat or very negatively about it, raising concerns about quality and overreliance—the same concerns faculty members and administrators tend to have about student use. The remainder, 15 percent, are neutral on this point.
Generative AI is influencing students’ learning and critical thinking abilities.
More than half of students (55 percent) who have used AI for coursework in the last year say it’s had mixed effects on their learning and critical thinking skills: It helps sometimes but can also make them think less deeply. Another 27 percent say that the effects have actually been positive. Fewer, 7 percent, estimate that the net effect has been negative, and they’re concerned about overreliance. Men—who also report using generative AI for things like brainstorming ideas and completing assignments at higher rates than their women and nonbinary peers—are also more likely to indicate that the net effect has been positive: More than a third of men say generative AI is improving their thinking, compared to closer to one in five women.
Students want information and support in preparing for a world shaped by AI.
When thinking about their futures, not just academic integrity in the present, students again say they want their institutions to offer—but not necessarily require—training on how to use AI tools professionally and ethically, and to provide clearer guidance on ethical versus misuse of AI tools. Many students also say they want space to openly discuss AI’s risks and benefits. Just 16 percent say preparing them for a future shaped by generative AI should be left up to individual professors or departments, underscoring the importance of an institutional response. And just 5 percent say colleges don’t need to take any specific action at all here. Adult students—many of whom are already working—are most likely to say that institutions should offer training on how to use AI tools professionally and ethically, at 57 percent.
Less popular options from the full list:
Integrate AI-related content into courses across majors: 18 percent
Leave it up to individual professors or departments: 16 percent
Create new majors or academic programs focused on AI: 11 percent
Connect students with employers or internships that involve AI: 9 percent
Colleges don’t need to take any specific actions around AI: 5 percent
On the whole, generative AI isn’t devaluing college for students—and it’s increasing its value for some.
Students have mixed views on whether generative AI has influenced how they think of the value of college. But 35 percent say there’s been no change, and 23 percent say it’s more valuable now. Fewer, 18 percent, say they now question the value of college more than they used to. Roughly another quarter of students say it has changed how they think about college value; they’re just not sure in what way. So college value hasn’t plummeted in students’ eyes due to generative AI—but the technology is influencing how they think about it.
‘There Is No Instruction Manual’
Student Voice poll respondent Daisy Partey, 22, agreed with her peers that institutions should take action on student use of generative AI—and said that faculty members and other leaders need to understand how accessible and potent it is.
Daisy Partey
“I’d stress that it’s super easy to use,” she said in an interview. “It’s just so simple to get what you need from it.”
Partey, who graduated from the University of Nevada at Reno in May with a major in communications and minor in public health, said using generative AI became the default for some peers—even for something as simple as a personal introduction statement. That dynamic, coupled with fear of false positives from AI-detection tools, generally chilled her own use of AI throughout college.
She did sometimes use ChatGPT as a study partner or search tool, but tried to limit her use: “Sometimes I’d find myself thinking, ‘Well, I could just ChatGPT it.’ But in reality, figuring it out on my own or talking to another physical human being—that’s good for you,” she said.
As for how institutions should address generative AI, Partey—like many Student Voice respondents—advocated a consistent, education-based approach, versus contradictory policies from class to class and policing student use. Similarly, Partey said, students need to know how and when to use AI responsibly for work, even as it’s still unknown how the technology will impact fields she’s interested in, such as social media marketing. (As for AI’s impact on the job market for new graduates, the picture is starting to form.)
“Provide training so that students know what they’re going into and the expectations for AI use in the workplace,” she emphasized.
Another Student Voice respondent at a community college in Texas, who asked to remain anonymous to speak about AI, said she uses generative AI to stay organized with tasks, create flash cards for tests and exams, and come up with new ideas.
“AI isn’t just about cheating,” she said. “For some students, it’s like having a 24-7 tutor.”
Jason Gulya, a professor of English and media communications at Berkeley College who reviewed the survey results, said they challenge what he called the “AI is going to kill college and democratize all knowledge” messaging pervading social media.
That the majority of students say AI has made their degree equally or more valuable means that this topic is “extremely nuanced” and “AI might not change the perceived value of a college degree in the ways we expect,” he added.
Relatedly, Gulya called the link between pressure to get good grades and overreliance on AI “essential.” AI tools that have been “marketed to students as quick and efficient ways to get the highest grades” play into a “model of education that places point-getting and grade-earning over learning,” he said. One possible implication for faculty? Using alternative assessment practices “that take pressure away from earning a grade and that instead recenter learning.”
Jill Abney, associate director of the Center for the Enhancement of Learning and Teaching at the University of Kentucky, said it makes “total sense” that students also report that time constraints are fueling academic dishonesty, since many are “stretched to the limits with jobs and other responsibilities on top of schoolwork.” To this point, one of the main interventions she and colleagues recommend to concerned instructors is “scaffolding assignments so students are making gradual progress and not waiting until the last minute.”
On clarity of guidelines around AI use, Abney said that most instructors she works with have, in fact, “put a lot of time into crafting clear AI policies.” Some have even moved beyond course-level policies toward an assignment-by-assignment labeling approach, “to ensure clear communication with students.” Tools to this end include the university’s own Student AI Use Scale.
Mark Watkins, assistant director of academic innovation and lecturer of writing and rhetoric at the University of Mississippi, underscored that both faculty-set policies for student use of AI and expectations for faculty use of AI have implications for faculty academic freedom, which “should be respected.”
At the same time, he said, “there needs to be leadership and a sense of direction from institutions about AI integration that is guided. To me, that means institutions should invest in consensus-building around what use cases are appropriate and publish frameworks for all stakeholders,” including faculty, staff and administrators. Watkins has proposed his own “VALUES” framework for faculty use of AI in education, which addresses such topics as validating and assessing student learning.
Ultimately, Abney said, it’s a good thing students are thinking about how AI is impacting their cognition—a developing area of research—adding that students tend to “crave shared spaces of conversation where they can have open dialogues about AI with their instructors and peers.”
That’s what learning about generative AI and establishing effective approaches requires, she said, “since there is no instruction manual.”
This independent editorial project is produced with the Generation Lab and supported by the Gates Foundation.
As of 2024, 55 percent of law schools offered courses on AI.
Photo illustration by Justin Morrison/Inside Higher Ed | Maxxa_Satori and PhonlamaiPhoto/iStock/Getty Images
As more and more law firms integrate generative artificial intelligence into their practices, a growing number of law schools are preparing future lawyers to adapt.
Nearly three years after OpenAI’s ChatGPT went mainstream—followed by Anthropic’s Claude, Google’s Gemini and a host of other similar platforms—some 30 percent of law offices are using AI-based technology tools, according to data published by the American Bar Association this past spring. While ChatGPT is the most widely used, legal research–specific tools, such as Thomson Reuters’ CoCounsel, Lexis+ AI and Westlaw AI, are also catching on in the sector.
At the same time, 62 percent of law schools have incorporated formal opportunities to learn about or use AI into their first-year curriculum; 93 percent are considering updating their curriculum to incorporate AI education. In practice, however, many of those offerings may not be adequate, said Daniel W. Linna Jr., director of law and technology initiatives at Northwestern University’s Pritzker School of Law.
“Law firms are starting to expect more and more that students will be exposed to this in law school,” he said. “But they also understand that the current reality is that not many law schools are doing much more than basic training. And some may not even be doing that.”
AI-Savvy Will Have ‘Leg Up’
At its best, experts believe AI has the power to make lawyers more efficient and accurate, as well as the potential to expand public access to legal services. But as fake citations and misquotes appearing in AI-generated legal filings have already shown, lawyers need more than access to these tools to get the most out of using them. They need to know how they work and recognize their limitations.
“Law schools have to prepare students to be intentional users of this technology, which will require them to have foundational knowledge and understanding in the first place,” said Caitlin Moon, a professor and founding co-director of Vanderbilt Law School’s AI Law Lab. “We have to preserve that core learning process so that they remain the human expert and this technology complements and supports their expertise.”
It’s not clear yet the extent to which AI will reshape the legal job market over the next several years, especially for new lawyers whose first jobs after law school have historically involved reviewing documents and conducting legal research—two areas where AI tools excel. According to one interpretation of a new report from Goldman Sachs on how AI could affect the workforce, 17 percent of jobs in the legal sector may be at risk.
“Law firms on the cutting edge of innovation are certainly trying to figure out how leveraging this technology improves their bottom line,” Moon said. “For recent graduates, those who are coming into firms with an understanding and familiarity with AI have a leg up.”
Pressure on Law Schools
Regardless of what’s to come, all this uncertainty is putting pressure on law schools across the country to meet the moment, said Gary Marchant, faculty director of the Center for Law, Science and Innovation at Arizona State University’s Sandra Day O’Connor College of Law, which began offering an AI specialization last year.
“It creates a requirement for law schools and law firms to train future lawyers differently, so that they learn some of the third- and fourth-year associate skills while they’re still in law school,” Marchant said. “Even if AI doesn’t advance any further, it’s already come so far that it’s transforming the practice of law, and it could change even more. Right now, the conclusion is that lawyers who know AI will replace lawyers who don’t know AI.”
Recognition of that reality drove the University of San Francisco School of Law to become the first in the country to integrate generative AI education throughout its curriculum. Those efforts will be aided through partnerships with Accordance and Anthropic, the school announced last week.
“AI is something every student needs to understand, no matter what kind of law they want to do,” said Johanna Kalb, dean of USF’s law school. “Given how quickly these AI tools are improving and becoming more specialized, each of these innovations is going to change what lawyers are being asked to do and what skills they really need.”
While USF may be one of the few law schools with an AI curriculum mandate, 55 percent of programs offered specialized courses designed to teach students about AI in 2024, according to the most recent available ABA data.
That percentage has likely increased over the past year, said Andrew Perlman, dean of Suffolk University Law School and a member of the ABA’s Task Force on Law and Artificial Intelligence.
This fall Suffolk’s law school, which launched one of the country’s first legal technology programs nearly a decade ago, is requiring all first-year students to complete a custom generative AI learning track as part of its course on legal practice skills.
“There was a lot of hesitation early on about how useful AI may be inside law practices, but there is now an increasingly widespread recognition that hiring lawyers who understand both the traditional methods of practicing law and have the ability to embrace AI is a useful combination,” Perlman said. “Training students with that new skill set is going to put our graduates in a better position to succeed in the long run.”
Jacob Levine, a second-year student at Harvard Law School, got a taste of the demand for that balance during an internship at a law firm this summer.
“AI was a tool that was present and using it was permitted, but there was a lot of emphasis on gauging the ability of the individual to be able to do the analytical work that’s expected of a young attorney,” he said. “It’s important to know how to use AI but not purely rely on it and use it blindly. A big part of being able to do that is knowing how to do everything yourself.”
At ISTE this summer, I lost count of how many times I heard “AI” as the answer to every educational challenge imaginable. Student engagement? AI-powered personalization! Teacher burnout? AI lesson planning! Parent communication? AI-generated newsletters! Chronic absenteeism? AI predictive models! But after moderating a panel on improving the high school experience, which focused squarely on human-centered approaches, one district administrator approached us with gratitude: “Thank you for NOT saying AI is the solution.”
That moment crystallized something important that’s getting lost in our rush toward technological fixes: While we’re automating attendance tracking and building predictive models, we’re missing the fundamental truth that showing up to school is a human decision driven by authentic relationships.
The real problem: Students going through the motions
The scope of student disengagement is staggering. Challenge Success, affiliated with Stanford’s Graduate School of Education, analyzed data from over 270,000 high school students across 13 years and found that only 13 percent are fully engaged in their learning. Meanwhile, 45 percent are what researchers call “doing school,” going through the motions behaviorally but finding little joy or meaning in their education.
This isn’t a post-pandemic problem; it’s been consistent for over a decade. And it directly connects to attendance issues. The California Safe and Supportive Schools initiative has identified school connectedness as fundamental to attendance. When high schoolers have even one strong connection with a teacher or staff member who understands their life beyond academics, attendance improves dramatically.
The districts that are addressing this are using data to enable more meaningful adult connections, not just adding more tech. One California district saw 32 percent of at-risk students improve attendance after implementing targeted, relationship-based outreach. The key isn’t automated messages, but using data to help educators identify disengaged students early and reach out with genuine support.
This isn’t to discount the impact of technology. AI tools can make project-based learning incredibly meaningful and exciting, exactly the kind of authentic engagement that might tempt chronically absent high schoolers to return. But AI works best when it amplifies personal bonds rather than seeking to replace them.
Mapping student connections
Instead of starting with AI, start with relationship mapping. Harvard’s Making Caring Common project emphasizes that “there may be nothing more important in a child’s life than a positive and trusting relationship with a caring adult.” Rather than leave these connections to chance, relationship mapping helps districts systematically identify which students lack that crucial adult bond at school.
The process is straightforward: Staff identify students who don’t have positive relationships with any school adults, then volunteers commit to building stronger connections with those students throughout the year. This combines the best of both worlds: Technology provides the insights about who needs support, and authentic relationships provide the motivation to show up.
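In code, that identification step is trivial; the heavy lifting is in the human follow-through. A minimal sketch of how the data side might work (all names and the `connections` data here are invented for illustration):

```python
# Hypothetical relationship-mapping data: each staff member lists the
# students they report having a positive, trusting relationship with.
connections = {
    "Ms. Rivera": {"Aiden", "Bella"},
    "Mr. Okafor": {"Bella", "Chris"},
}

all_students = {"Aiden", "Bella", "Chris", "Dana"}

# Students named by at least one adult on staff.
connected = set().union(*connections.values())

# Students with no positive adult connection become the outreach list
# that volunteers commit to for the year.
outreach_list = sorted(all_students - connected)
print(outreach_list)  # ['Dana']
```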
True school-family partnerships to combat chronic absenteeism need structures that prioritize student consent and agency, provide scaffolding for underrepresented students, and feature a wide range of experiences. That requires seeing students as whole people with complex lives, not just data points in an attendance algorithm.
The choice ahead
As we head into another school year, we face a choice. We can continue chasing the shiny startups, building ever more sophisticated systems to track and predict student disengagement. Or we can remember that attendance is ultimately about whether a young person feels connected to something meaningful at school.
The most effective districts aren’t choosing between high-tech and high-touch; they’re using technology to enable more meaningful personal connections. They’re using AI to identify students who need support, then deploying caring adults to provide it. They’re automating the logistics so teachers can focus on relationships.
That ISTE administrator was right to be grateful for a non-AI solution. Because while artificial intelligence can optimize many things, it can’t replace the fundamental human need to belong, to feel seen, and to believe that showing up matters.
The solution to chronic absenteeism is in our relationships, not our servers. It’s time we started measuring and investing in both.
Dr. Kara Stern, SchoolStatus
Dr. Kara Stern is Director of Education for SchoolStatus, a portfolio of data-driven solutions that help K-12 districts improve attendance, strengthen family communication, support teacher growth, and simplify daily operations. A former teacher, principal, and head of school, she holds a Ph.D. in Teaching & Learning from NYU.
Many institutions, including the SUNY system, have invested in Title VI coordinators in recent months.
Photo illustration by Justin Morrison/Inside Higher Ed | howtogoto/iStock/Getty Images
New York is mandating that all colleges in the state designate a coordinator to oversee investigations into discrimination on the basis of race, color, national origin and shared ancestry, which is prohibited under Title VI of the Civil Rights Act of 1964, Gov. Kathy Hochul’s office announced Wednesday.
According to Hochul, the state is the first in the country to pass such a law.
“By placing Title VI coordinators on all college campuses, New York is combating antisemitism and all forms of discrimination head-on,” she said in the press release. “No one should fear for their safety while trying to get an education. It’s my top priority to ensure every New York student feels safe at school, and I will continue to take action against campus discrimination and use every tool at my disposal to eliminate hate and bias from our school communities.”
Many colleges have begun hiring for Title VI coordinator roles in the past several months in response to the surge in reports of antisemitism and Islamophobia following Hamas’s fatal Oct. 7, 2023, attack on Israeli civilians. In some cases, the Department of Education’s Office for Civil Rights required institutions to add these roles after finding that they had failed to adequately address complaints of discrimination on their campuses.
The State University of New York system had already required each of its campuses to bring on a Title VI coordinator by the fall 2025 semester.
Earlier this month, higher education policy leaders from all 50 states gathered in Minneapolis for the 2025 State Higher Education Executive Officers Higher Education Policy Conference. During a plenary session on the future of learning and work and its implications for higher education, Aneesh Raman, chief economic opportunity officer at LinkedIn, reflected on the growing need for people to be able to easily build and showcase their skills.
In response to this need, the avenues for learning have expanded, with high numbers of Americans now completing career-relevant training and skill-building through MOOCs, microcredentials and short-term certificates, as well as a growing number of students completing postsecondary coursework while in high school through dual enrollment.
The time for pontificating about the implications for higher education is past; what’s needed now is a pragmatic examination of our long-standing practices to ask, how do we evolve to keep up? We find it prudent and compelling to begin at the beginning—that is, with the learning-evaluation process (aka credit-evaluation process), as it stands to either help integrate more Americans into higher education or serve to push them out.
A 2024 survey of adult Americans conducted by Public Agenda for Sova and the Beyond Transfer Policy Advisory Board found, for example, that nearly four in 10 respondents attempted to transfer some type of credit toward a college credential. This included credit earned through traditional college enrollment and from nontraditional avenues, such as from trade/vocational school, from industry certification and from work or military experience. Of those who tried to transfer credit, 65 percent reported one or more negative experiences, including having to repeat prior courses, feeling limited in where they could enroll based on how their prior learning was counted and running out of financial aid when their prior learning was not counted. Worse, 16 percent gave up on earning a college credential altogether because the process of transferring credit was too difficult.
What if that process were drastically improved? The Council for Adult and Experiential Learning’s research on adult learners finds that 84 percent of likely enrollees and 55 percent of those less likely to enroll agree that the ability to receive credit for their work and life experience would have a strong influence on their college enrollment plans. Recognizing the untapped potential for both learners and institutions, we are working with a distinguished group of college and university leaders, accreditors, policy researchers and advocates who form the Learning Evaluation and Recognition for the Next Generation (LEARN) Commission to identify ways to improve learning mobility and promote credential completion.
With support from the American Association of Collegiate Registrars and Admissions Officers and Sova, the LEARN Commission has been analyzing the available research to better understand the limitations of and challenges within current learning evaluation approaches, finding that:
Learning-evaluation decision-making is a highly manual and time-intensive process that involves many campus professionals, including back-office staff such as registrars and transcript evaluators and academic personnel such as deans and faculty.
Across institutions, there is high variability in who performs reviews; what information and criteria are used in decision-making; how decisions are communicated, recorded and analyzed; and how long the process takes.
Along with this variability, most evaluation decisions are opaque, with little data used, criteria established or transparency baked in to help campus stakeholders understand how these decisions are working for learners.
While there have been substantial efforts to identify course equivalencies, develop articulation agreements and create frameworks for credit for prior learning to make learning evaluation more transparent and consistent, the data and technology infrastructure to support the work remain woefully underdeveloped. Without adequate data documenting date of assessment and aligned learning outcomes, credit for prior learning is often dismissed in the transfer process; for example, a 2024 survey by AACRAO found that 54 percent of its member institutions do not accept credit for prior learning awarded at a prior institution.
Qualitative research examining credit-evaluation processes across public two- and four-year institutions in California found that these factors create many pain points for learners. For one, students can experience unacceptable wait times—in some cases as long as 24 weeks—before receiving evaluation decisions. When decisions are not finalized prior to registration deadlines, students can end up in the wrong classes, take classes out of sequence or end up extending their time to graduation.
In addition to adverse impacts on students, MDRC research illuminates challenges that faculty and staff experience due to the highly manual nature of current processes. As colleges face dwindling dollars and real personnel capacity constraints, the status quo becomes unsustainable and untenable. Yet, we are hopeful that the thoughtful application of technology—including AI—can help slingshot institutions forward.
For example, institutions like Arizona State University and the City University of New York are leading the way in integrating technology to improve the student experience. The ASU Transfer Guide and CUNY’s Transfer Explorer democratize course equivalency information, “making it easy to see how course credits and prior learning experiences will transfer and count.” Further, researchers at UC Berkeley are studying how to leverage the plethora of data available—including course catalog descriptions, course articulation agreements and student enrollment data—to analyze existing course equivalencies and provide recommendations for additional courses that could be deemed equivalent. Such advances stand to reduce the staff burden for institutions while preserving academic quality.
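The Berkeley approach described above—mining catalog descriptions to suggest new course equivalencies—can be sketched in a few lines. This is a hypothetical illustration, not the researchers’ actual method: the course IDs and descriptions are invented, and a real system would also weigh articulation agreements and student enrollment data rather than relying on description similarity alone.

```python
# Hypothetical sketch: flag candidate course equivalencies by comparing
# catalog descriptions with a simple bag-of-words cosine similarity.
# All course IDs, descriptions and the threshold are invented examples.
from collections import Counter
import math


def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two course descriptions."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in set(va) & set(vb))
    norm = math.sqrt(sum(c * c for c in va.values())) * math.sqrt(
        sum(c * c for c in vb.values())
    )
    return dot / norm if norm else 0.0


def suggest_equivalencies(description: str, catalog: dict, threshold: float = 0.5):
    """Return courses in another institution's catalog whose descriptions
    are similar enough to be reviewed as possible equivalents."""
    return [
        course
        for course, desc in catalog.items()
        if cosine_similarity(description, desc) >= threshold
    ]


# Sending institution's course vs. a receiving institution's catalog.
sending_desc = "introduction to academic writing and composition"
receiving_catalog = {
    "WRT 100": "academic writing and composition for first year students",
    "BIO 110": "cell biology and genetics laboratory",
}

print(suggest_equivalencies(sending_desc, receiving_catalog))  # → ['WRT 100']
```

In practice the output would be a review queue for faculty, not an automatic ruling—consistent with the article’s point that such tools reduce staff burden while preserving academic quality.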
While such solutions are not yet widely implemented, there is strong interest due to their high value proposition. A recent AACRAO survey on AI in credit mobility found that while just 15 percent of respondents report currently using AI for credit mobility, 94 percent of respondents acknowledge the technology’s potential to positively transform credit-evaluation processes. And just this year, a cohort of institutions across the country came together to pioneer new AI-enabled credit mobility technology under the AI Transfer and Articulation Infrastructure Network.
As the LEARN Commission continues to assess how institutions, systems of higher education and policymakers can improve learning evaluation, we believe that increased attention to improving course data and technology infrastructure is warranted and that a set of principles can guide a new approach to credit evaluation. Based on our emerging sense of the needs and opportunities in the field, we offer some guiding principles below:
Shift away from interrogating course minutiae to center learning outcomes in learning evaluation. Rather than fixating on factors like mode of instruction or grading basis, we must focus on the learning outcomes. To do so, we must improve course data in a number of ways, including adding learning outcomes to course syllabi and catalog descriptions and capturing existing equivalencies in databases where they can be easily referenced and applied.
Provide students with reliable, timely information on the degree applicability of their courses and prior learning, including a rationale when prior learning is not accepted or applied. Institutions can leverage available technology to automate existing articulation rules, recommend new equivalencies and generate timely evaluation reports for students. This can create more efficient advising workflows, empower learners with reliable information and refocus faculty time to other essential work (see No. 3).
Use student outcomes data to improve the learning evaluation process. Right now, the default is that all prior learning is manually vetted against existing courses. But what if we shifted that focus to analyzing student outcomes data to understand whether students can be successful in subsequent learning if their credits are transferred and applied? In addition, institutions should regularly review course transfer, applicability and student success data at the department and institution level to identify areas for improvement—including in the design of curricular pathways, student supports and classroom pedagogy.
Overhaul how learning is transcripted and how transcripts are shared. We can shorten the time involved on the front end of credit-evaluation processes by shifting away from manual transcript review to machine-readable transcripts and electronic transcript transmittal. When accepting and applying prior learning—be it high school dual-enrollment credit, credit for prior learning or a course transferred from another institution—document that learning in the transcript as a course (or, as a competency for competency-based programs) to promote its future transferability.
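A machine-readable transcript entry of the kind described above might look like the following. This is an illustrative sketch only—the field names are invented for this example and do not reflect any existing transcript standard; the point is that prior learning is recorded as a course, with an assessment date and learning outcomes attached so a receiving institution can evaluate it programmatically.

```python
# Hypothetical machine-readable transcript entry documenting credit for
# prior learning as a course. Field names are illustrative, not a standard.
import json

entry = {
    "record_type": "course",
    "course_id": "MATH 101",          # invented example course
    "title": "College Algebra",
    "credits": 3.0,
    "source": "credit_for_prior_learning",
    "assessment_date": "2024-05-15",  # supports the data gap noted above
    "learning_outcomes": [
        "Solve linear and quadratic equations",
        "Model real-world problems with functions",
    ],
}

# Electronic transmittal: the record serializes cleanly for exchange
# between institutions instead of requiring manual transcript review.
print(json.dumps(entry, indent=2))
```

Because the entry carries its own assessment date and outcomes, a receiving institution could apply it without the prior learning being dismissed for lack of documentation—the failure mode the AACRAO survey data point highlights.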
Leverage available technology to help learners and workers make informed decisions to reach their end goals. In the realm of learning evaluation, this can be facilitated by integrating course data and equivalency systems with degree-modeling software to enable learners and advisers to identify the best path to a credential that minimizes the amount of learning that’s left on the table.
In these ways, we can redesign learning evaluation processes to accelerate students’ pathways and generate meaningful value in the changing landscape of learning and work. Through the LEARN Commission, we will continue to refine this vision and identify clear actionable steps. Stay tuned for the release of our full set of recommendations this fall and join the conversation at #BeyondTransfer.
Beth Doyle is chief of strategy at the Council for Adult and Experiential Learning and is a member of the LEARN Commission.
Carolyn Gentle-Genitty is the inaugural dean of Founder’s College at Butler University and is a member of the LEARN Commission.
Jamienne S. Studley is the immediate past president of the WASC Senior College and University Commission and is a member of the LEARN Commission.
Over the past decade, local newsrooms have been disappearing from the U.S., leaving communities without a trusted information source for happenings in their region. But a recently established initiative from the State University of New York aims to deploy student reporters to bolster the state’s independent and public news organizations.
Last year SUNY launched the Institute for Local News, engaging a dozen student reporting programs at colleges across the state—including Stony Brook University, the University at Buffalo and the University at Albany—to produce local news content. Faculty direct and edit content produced by student journalists for local media partners.
This summer, the Institute sent its first cohort of journalism interns out into the field, offering 20 undergraduates paid roles in established newsrooms. After a successful first year, SUNY leaders plan to scale offerings to include even more student interns in 2026.
The background: The Institute for Local News has a few goals, SUNY chancellor John B. King told Inside Higher Ed: to mobilize students to engage in local news reporting in places that otherwise may not be covered, to instill students with a sense of civic service and to provide meaningful experiential learning opportunities.
News deserts, or areas that lack news sources, can impact community members’ ability to stay informed about their region. New York saw a 40 percent decrease in newspaper publications from 2004 to 2019, according to data from the University of North Carolina.
Research from the University of Vermont’s Center for Community News found that over 1,300 colleges and universities are located in or near counties defined as news deserts, but last year nearly 3,000 student journalists in university-led programs helped those communities by publishing tens of thousands of stories in local news outlets.
A 2024 study from the Business–Higher Education Forum found that the supply of high-quality internships falls short of the number of college students who want to participate in them. Research also shows students believe internships are a must-have to launch their careers, but not everyone can participate, often due to competing priorities or financial constraints.
To combat these challenges, SUNY, aided by $14.5 million in support from the New York State budget, is working to expand internship offerings—including in journalism—by providing pay and funds for transportation and housing as needed.
“We think having those hands-on learning opportunities enriches students’ academic experience and better prepares them for postgraduation success,” King said.
The Institute for Local News is backed by funding from the Lumina Foundation and is part of the Press Forward movement.
On the ground: Grace Tran, a rising senior at SUNY Oneonta majoring in media studies, was one of the first 20 students selected to participate in an internship with a local news organization this summer.
Tran and her cohort spent three days at Governors Island learning about journalism, climate issues and water quality in New York City before starting their assignments for the summer. Tran worked at Capital Region Independent Media in Clifton Park as a video editor and producer, cutting interviews, filming on-site and interviewing news sources.
“I wasn’t a journalism buff but more [focused on] video production,” Tran said. “But having this internship got me into that outlet, and it taught me so much and now I feel like a journalism buff.”
In addition to exploring new parts of the region and digging deeper into news principles, Tran built a professional network and learned how to work alongside career professionals.
“It’s my first-ever media job and there were no other interns there; it was just me with everyone else who’s been in this industry for such a long time,” Tran said. “It built a lot of [my] communication skills—how you should act, professionalism, you know, you can’t go to a site in jeans or with a bad attitude.”
Meeting the other SUNY journalism interns before starting full-time was important, Tran said, because it gave her peers for feedback and support.
What’s next: SUNY hopes to match this year’s numbers (160 students publishing work and 20 summer interns through the Institute for Local News) and to expand internships in the near future, King said.
The Institute for Local News is just one avenue for students to get hands-on work experience, King said. SUNY is building out partnerships with the Brooklyn and New York Public Library systems for internships, as well as opportunities to place interns with the Department of Environmental Conservation to focus on climate action.
“We have a ways to go to get to our goal for every SUNY undergraduate to have that meaningful internship experience,” King said. “But we really want to make sure every student has that opportunity.”
Do you have a career-focused intervention that might help others promote student success? Tell us about it.
I’ll admit a pet peeve: writers who set out two extreme views, attribute them vaguely to others, and then position themselves in the squishy middle as the embodiment of the golden mean. It seems too easy and feeds the cultural myth that the center is always correct.
So, at the risk of annoying myself, I’ve been frustrated with the discourse recently around whether students’ choice of majors matters. It both does and doesn’t, though that may be more obvious from a community college perspective than from other places.
“Comprehensive” community colleges, such as my own, are called that because they embrace both a transfer mission (“junior college”) and a vocational mission (“trade school”). The meaning of a major can be very different across that divide.
For example, students who major in nursing have the inside track at becoming nurses in a way that students who major in, say, English don’t. Welding is a specific skill. HVAC repair is a skill set aimed squarely at certain kinds of jobs. In each case, the goal is a program—sometimes a degree, sometimes a diploma or certificate—that can lead a student directly into employment that pays a living wage. In some cases, such as nursing, it’s fairly normal to go on to higher degrees; in others, such as welding, it’s less common. Either way, though, the content of what’s taught is necessary to get into the field.
In many transfer-focused programs, the opposite is true. A student with the eventual goal of, say, law school can take all sorts of liberal arts classes here, then transfer and take even more. Even if they want to stop at the bachelor’s level, the first two years of many bachelor’s programs in liberal arts fields are as much about breadth as about depth. Distribution requirements are called what they’re called because the courses are distributed across the curriculum.
At the level of a community college, you might not be able to distinguish the future English major from the future poli sci major by looking at their transcripts. They’ll take basic writing, some humanities, some social science, some math, some science and a few electives. And many receiving institutions prefer that students don’t take too many classes in their intended major in the first two years. Whether that’s because of a concern for student well-roundedness or an economic concern among departments about giving away too many credits is another question.
Of course, sometimes the boundary gets murky. Fields like social work straddle the divide between vocational and transfer, since the field often requires a bachelor’s degree. Similarly, a field like criminal justice can be understood as police training, but it also branches into criminology and sociology. And business, a perennially popular major, often leads to transfer despite defining itself as being all about the market.
The high-minded defense of the view that majors don’t matter is that student interest is actually much more important than choice of major. I agree strongly with that. I’d much rather see a student who loves literature study that than force herself to slog through an HVAC program, hating every moment of it. The recent travails of computer science graduates in the job market should remind us that there are no guaranteed occupations. Students who love what they study, or who just can’t stop thinking about it, get the most out of it. And after a few years, most adults with degrees are working in fields unrelated to their degrees anyway. To me, that’s a strong argument for the more evergreen skills of communication, analysis, synthesis, research and teamwork: No matter what the next hot technology is, people who have those skills are much more likely to thrive than people who don’t. A candidate’s tech skill may get them the first job, but their soft skills—not a fan of the term—get them promoted.
I want our students to be able to support themselves in the world that actually exists. I also want them to be able to support themselves in the world that will exist 20 years from now. Technological trends can be hard to get right. Remember when MOOCs were going to change everything? Or the Segway? In my more optimistic moments, I like to think that bridging the divide between the liberal arts and the vocational fields is one of the best things community colleges can do. Even if that feels squishy and centrist.