The Trump administration wants colleges to make a number of changes to their policies in order to get an edge in grant funding.
After four universities rejected the Trump administration’s compact for higher education, the White House met Friday with some universities about the proposal.
A White House official confirmed plans for the meeting to Inside Higher Ed but didn’t say what the purpose of the gathering was or which universities would attend. Nine universities were asked to give feedback on the wide-ranging proposal by Oct. 20.
The virtual meeting was expected to include May Mailman, a White House adviser, and Vincent Haley, director of the White House’s Domestic Policy Council, according to a source with knowledge of the White House’s plans. Mailman, Haley and Education Secretary Linda McMahon signed the letter sent to the initial nine about the compact.
So far, the Massachusetts Institute of Technology, Brown University, the University of Pennsylvania and the University of Southern California have publicly rejected the deal. Dartmouth College, the University of Arizona, the University of Texas at Austin and Vanderbilt University haven’t said whether they’ll agree to the compact. The University of Virginia said late Friday afternoon that it wouldn’t agree to the proposal.
The Wall Street Journal reported that Arizona State University, the University of Kansas and Washington University in St. Louis were also invited. According to the Journal, the goal of the meeting was to answer questions about the proposal and to find common ground with the institutions.
Inside Higher Ed reached out to the universities, but none confirmed whether they attended the meeting.
The nine-page document would require universities to make a number of far-reaching changes, from abolishing academic departments or programs that “purposefully punish, belittle, and even spark violence against conservative ideas” to capping international undergraduate enrollment at 15 percent. Institutions also would have to agree to freeze their tuition and require standardized tests for admissions, among other provisions.
Trump officials have said that the signatories could get access to more grant funding and threatened the funding of those that don’t agree. The Justice Department would enforce the terms of the agreement, which are vague and in some cases undefined.
After USC released its letter rejecting the proposal, Liz Huston, a White House spokesperson, told the Los Angeles Times that “as long as they are not begging for federal funding, universities are free to implement any lawful policies they would like.”
Following the first rejection from MIT last Friday, President Trump posted on Truth Social that all colleges could now sign on. The White House has said that some institutions have already reached out to do so.
The source with knowledge of the White House’s plans said that the meeting “appears to be an effort to regain momentum by threatening institutions to sign even though it’s obviously not in the schools’ interest to do so.”
Former senator Lamar Alexander, a Tennessee Republican and trustee at Vanderbilt, wrote in a Wall Street Journal op-ed that the compact was an example of federal overreach akin to previous efforts to impose uniform national standards on K–12 schools.
“Mr. Trump’s proposed higher education compact may provoke some useful dialogue around reform,” he wrote. “But the federal government shouldn’t try to manage the nation’s 6,000 colleges and universities.”
A Joint Warning
The American Council on Education and 35 other organizations warned in a joint statement released Friday that “the compact’s prescriptions threaten to undermine the very qualities that make our system exceptional.”
The organizations that signed requested the administration withdraw the compact and noted that “higher education has room for improvement.”
But “the compact is a step in the wrong direction,” the letter states. “The dictates set by it are harmful for higher education and our entire nation, no matter your politics.”
The letter is just the latest sign of a growing resistance in higher ed to the compact. Faculty and students at the initial group of universities rallied Friday to urge their administrators to reject the compact. According to the American Association of University Professors, which organized the national day of action, more than 1,000 people attended the UVA event.
And earlier this month, the American Association of Colleges and Universities released a statement that sharply criticized the compact. The statement said in part that college and university presidents “cannot trade academic freedom for federal funding” and that institutions shouldn’t be subject “to the changing priorities of successive administrations.” Nearly 150 college presidents and associations have endorsed that statement.
The joint statement from ACE and others, including AAC&U, was a way to show that the associations, which the letter says “span the breadth of the American higher education community and the full spectrum of colleges and universities nationwide,” are united in their opposition.
“The compact offers nothing less than government control of a university’s basic and necessary freedoms—the freedoms to decide who we teach, what we teach, and who teaches,” the statement reads. “Now more than ever, we must unite to protect the values and principles that have made American higher education the global standard.”
But not everyone in the sector signed on.
Key groups that were absent from the list of signatories include the Association of Public and Land-grant Universities, the Association of American Universities, the American Association of State Colleges and Universities, the National Association of Independent Colleges and Universities, Career Education Colleges and Universities, and the American Association of Community Colleges.
Inside Higher Ed reached out to each of those groups, asking whether they were invited to sign and, if so, why they chose not to do so. Responses varied.
AAU noted that it had already issued its own statement Oct. 10. AASCU said it was also invited to sign on and had “significant concerns” about the compact but chose other ways to speak out.
“We are communicating in multiple ways with our member institutions and policymakers about the administration’s request and any impact it might have on regional public universities,” Charles Welch, the association’s president, said in an email.
Other organizations had not responded by the time this story was published.
by Mehreen Ashraf, Eimear Nolan, Manuel F Ramirez, Gazi Islam and Dirk Lindebaum
Walk into almost any university today, and you can be sure to encounter the topic of AI and how it affects higher education (HE). AI applications, especially large language models (LLMs), have become part of everyday academic life, used for drafting outlines, summarising readings, and even helping students to ‘think’. For some, the emergence of LLMs is a revolution that makes learning more efficient and accessible. For others, it signals something far more unsettling: a shift in how and by whom knowledge is controlled. This latter point is the focus of our new article published in Organization Studies.
At the heart of our article is a shift in what is referred to as epistemic (or knowledge) governance: the way in which knowledge is created, organised, and legitimised in HE. In plain terms, epistemic governance is about who gets to decide what counts as credible, whose voices are heard, and how the rules of knowing are set. Universities have historically been central to epistemic governance through peer review, academic freedom, teaching, and the public mission of scholarship. But as AI tools become deeply embedded in teaching and research, those rules are being rewritten not by educators or policymakers, but by the companies that own the technology.
From epistemic agents to epistemic consumers
Universities, academics, and students have traditionally been epistemic agents: active producers and interpreters of knowledge. They ask questions, test ideas, and challenge assumptions. But when we rely on AI systems to generate or validate content, we risk shifting from being agents of knowledge to consumers of knowledge. Technology takes on the heavy cognitive work: it finds sources, summarises arguments, and even produces prose that sounds academic. However, this efficiency comes at the cost of profound changes in the nature of intellectual work.
Students who rely on AI to tidy up their essays or generate references will learn less about the process of critically evaluating sources, connecting ideas and constructing arguments, which are essential for reasoning through complex problems. Academics who let AI draft research sections, or who feed decision letters and reviewer reports into AI with the request that it produce a ‘revision strategy’, might save time but lose the slow, reflective process that leads to original thought, while undercutting their own agency in the process. And institutions that embed AI into learning systems hand part of their epistemic governance – their authority to define what knowledge is and how it is judged – to private corporations.
This is not about individual laziness; it is structural. As Shoshana Zuboff argued in The age of surveillance capitalism, digital infrastructures do not just collect information, they reorganise how we value and act upon it. When universities become dependent on tools owned by big tech, they enter an ecosystem where the incentives are commercial, not educational.
Big tech and the politics of knowing
The idea that universities might lose control of knowledge sounds abstract, but it is already visible. Jisc’s 2024 framework on AI in tertiary education warns that institutions must not ‘outsource their intellectual labour to unaccountable systems,’ yet that outsourcing is happening quietly. Many UK universities, including the University of Oxford, have signed up to corporate AI platforms to be used by staff and students alike. This, in turn, facilitates the collection of data on learning behaviours that can be fed back into proprietary models.
This data loop gives big tech enormous influence over what is known and how it is known. A company’s algorithm can shape how research is accessed, which papers surface first, or which ‘learning outcomes’ appear most efficient to achieve. That’s epistemic governance in action: the invisible scaffolding that structures knowledge behind the scenes. At the same time, it is easy to see why AI technologies appeal to universities under pressure. AI tools promise speed, standardisation, lower costs, and measurable performance, all seductive in a sector struggling with staff shortages and audit culture. But those same features risk hollowing out the human side of scholarship: interpretation, dissent, and moral reasoning. The risk is not that AI will replace academics but that it will change them, turning universities from communities of inquiry into systems of verification.
The Humboldtian ideal and why it is still relevant
The modern research university was shaped by the 19th-century thinker Wilhelm von Humboldt, who imagined higher education as a public good, a space where teaching and research were united in the pursuit of understanding. The goal was not efficiency: it was freedom. Freedom to think, to question, to fail, and to imagine differently.
That ideal has never been perfectly achieved, but it remains a vital counterweight to market-driven logics that render AI a natural way forward in HE. When HE serves as a place of critical inquiry, it nourishes democracy itself. When it becomes a service industry optimised by algorithms, it risks producing what Žižek once called ‘humans who talk like chatbots’: fluent, but shallow.
The drift toward organised immaturity
Scholars like Andreas Scherer and colleagues describe this shift as organised immaturity: a condition in which sociotechnical systems prompt us to stop thinking for ourselves. While AI tools appear to liberate us from labour, they are in fact narrowing the space for judgment and doubt.
In HE, that immaturity shows up when students skip the reading because ‘ChatGPT can summarise it’, or when lecturers rely on AI slides rather than designing lessons for their own cohort. Each act seems harmless, but collectively they erode our epistemic agency. The more we delegate cognition to systems optimised for efficiency, the less we cultivate the messy, reflective habits that sustain democratic thinking. Immanuel Kant once defined immaturity as ‘the inability to use one’s understanding without guidance from another.’ In the age of AI, that ‘other’ may well be an algorithm trained on millions of data points, but answerable to no one.
Reclaiming epistemic agency
So how can higher education reclaim its epistemic agency? The answer lies not only in rejecting AI but also in rethinking our possible relationships with it. Universities need to treat generative tools as objects of inquiry, not as invisible infrastructure. That means embedding critical digital literacy across curricula: not simply training students to use AI responsibly, but teaching them to question how it works, whose knowledge it privileges, and whose it leaves out.
In classrooms, educators could experiment with comparative exercises: have students write an essay on their own, then analyse an AI version of the same task. What’s missing? What assumptions are built in? How were students changed when the AI wrote the essay for them and when they wrote it themselves? As the Russell Group’s 2024 AI principles note, ‘critical engagement must remain at the heart of learning.’
In research, academics too must realise that their unique perspectives, disciplinary judgement, and interpretive voices matter, perhaps now more than ever, in a system where AI’s homogenisation of knowledge looms. We need to understand that the more we subscribe to optimisation and efficiency as the preferred values of academic work, the more naturally AI’s penetration of HE will unfold.
Institutionally, universities might consider building open, transparent AI systems through consortia, rather than depending entirely on proprietary tools. This isn’t just about ethics; it’s about governance and ensuring that epistemic authority remains a public, democratic responsibility.
Why this matters to you
Epistemic governance and epistemic agency may sound like abstract academic terms, but they refer to something fundamental: the ability of societies and citizens (not just ‘workers’) to think for themselves. If universities lose control over how knowledge is created, validated and shared, we risk not just changing education but weakening democracy. As journalist George Monbiot recently wrote, ‘you cannot speak truth to power if power controls your words.’ The same is true for HE. We cannot speak truth to power if power now writes our essays, marks our assignments, and curates our reading lists.
Mehreen Ashraf is an Assistant Professor at Cardiff Business School, Cardiff University, United Kingdom.
Eimear Nolan is an Associate Professor in International Business at Trinity Business School, Trinity College Dublin, Ireland.
Manuel F Ramirez is Lecturer in Organisation Studies at the University of Liverpool Management School, UK.
Gazi Islam is Professor of People, Organizations and Society at Grenoble Ecole de Management, France.
Dirk Lindebaum is Professor of Management and Organisation at the School of Management, University of Bath.
While generative artificial intelligence tools have proliferated in education and workplace settings, not all tools are free or accessible to students and staff, which can create equity gaps regarding who is able to participate and learn new skills. To address this gap, San Diego State University leaders created an equitable AI alliance in partnership with the University of California, San Diego, and the San Diego Community College District. Together, the institutions work to address affordability and accessibility concerns for AI solutions, as well as share best practices, resources and expertise.
In the latest episode of Voices of Student Success, host Ashley Mowreader speaks with James Frazee, San Diego State University’s chief information officer, about the alliance and SDSU’s approach to teaching AI skills to students.
An edited version of the podcast appears below.
Q: Can you give us the high-level overview: What is the Equitable AI Alliance? What does it mean to be equitable in AI spaces?
A: Our goal is simple but ambitious: to make AI literacy and access available as opportunities to all of our students, and I mean every student, whether they started at a community college, a California State University like ours or at a University of California school. We want to make sure they all have that same foundation to understand and apply AI responsibly in their lives, in their careers and during their academic journey.
Through this alliance, we’re trying to align resources and expand access to institutionally supported AI tools. So when people are using the free tools, they’re not free, right? They’re paying for them with their privacy, with their intellectual property. We want to make sure that they have access, not only to the training they need to use these tools responsibly, but also to the high-quality tools that are more accurate and that have commercial data protection so that they can rest assured that their intellectual property isn’t being used to train the underlying large language models.
Q: The alliance strives to work across institutions, which is atypical in many cases in higher ed. Can you talk about that partnership and why this is important for your students?
A: The Equitable AI Alliance emerged from survey results. We have this listening infrastructure we’ve created here at San Diego State—we launched an AI survey in 2023, within months of ChatGPT going public. We really wanted to establish a baseline and determine what tools our students were using, what opinions they had about AI and, maybe most importantly, what they expected from us institutionally in order to help them meet the moment.
During the analysis of those survey findings, we discovered evidence of a growing digital divide. For instance, we asked students about how many devices they had. If you have a smartphone, a tablet, a desktop and a laptop, you would have four smart devices.
What we found was that more devices meant people were more likely to say that AI had positively affected their education, and more likely to be paying for the paid versions of these tools. We also saw in the open-ended responses … people being concerned about fee increases as a result of AI, people being concerned about students who didn’t have access to these tools or fluency with these tools being disadvantaged.
People were saying, “The people who are using these have an unfair advantage,” right? Students were asking questions about, is everybody going to be able to afford what they need in order to keep up with AI? So that really was a key driver in forming this alliance.
Q: When it comes to consolidating those resources or making sure that students have access, what does that look like? And how do you all share?
A: The Equitable AI Alliance is really two things. First, it’s a consortium that’s all about saving time and saving money and having universities and colleges come together to really look at ways to form these partnerships to democratize access to these high-quality tools. And also to provide the training that people need. So that’s kind of the first part of it, and that’s much larger than the regional consortium.
But we have a regional consortium between our San Diego Community College District, San Diego State University and the University of California at San Diego, which is also dubbed the Equitable AI Alliance. And the mission there is to ensure that every student, no matter where they begin their journey, has access to AI literacy, to those high-quality tools and opportunities to leverage those to help them succeed, both inside and outside of the classroom.
It’s really, ultimately about responding to the workforce needs that we’re seeing. Employers today are demanding students come to them with fluency using these tools, and if they don’t have that fluency, they’re not going to get that internship or that job interview. So it’s really important. That’s where those microcredentials that we’re sharing across our institutions are really powerful, because they can put that badge on their LinkedIn profile, which may make the difference between them getting the interview or not, just having that little artifact there that demonstrates that they have some skills and knowledge can really make an impact.
Q: What is the microcredential? How are students engaging with that?
A: The microcredentials themselves are really powerful because they’re basically mini courses in our learning management system. We try and make them bite-size enough that people actually get through them.
There are five modules. The first module is really kind of demystifying AI—this is not some dark art. We try to explain, at a high level, how AI works.
The second module, which is arguably the most important one, is all about responsible use: the fact that these models are built on information from human beings, which is inherently biased; how to be critical consumers of that information; the environmental costs; the human costs; and how to cite the use of these tools in your work, both academically and professionally.
Then there’s a module on what AI can do for you. And so we have different microcredentials: a microcredential for faculty and microcredentials for students. For instance, the microcredential for students focuses on using AI to find jobs, prepare for jobs, tailor your résumé for a particular job or internship, and do role-playing—to practice for an interview, let’s say.
And then there’s finding apps, finding generative AI tools, how to do that, because there’s different AI tools you might want to use for certain things, like maybe you want to create some sort of graphic—you might want to use Midjourney or DALL-E, or whatever it might be.
And then there’s the activities. Part of the idea with the activities, which they have to do in order to earn the badge, is that we’re designing activities that try and keep the microcredential evergreen. So for instance, when we first rolled out the microcredential, nobody had heard of DeepSeek, because it didn’t exist. So now we have an activity that has people going out and looking for the latest large language models that are emerging. Every day, there’s some new model, it seems—that is something to be aware of.
And then we bring it back, again, to why it’s important for them to be in the loop, pointing out the fact that these models are often very sycophantic, right? They want to tell you what they think you want to hear. And so you really have to go back and forth and ideate with the tools, which requires a little practice, a little coaching, and you have to fact-check everything. And so that’s a really big part of this idea of, what does it mean to be literate when it comes to using these tools?
Q: When it came to developing the microcredential, who were the stakeholders at the table?
A: We have a long history of engaging with faculty and providing fellowships to faculty. That’s a way for us to incentivize engagement with faculty.
That manifests itself in the form of course release. So, in other words, we provide them with reassigned time, buy them out of teaching a course, so that they can come and work with us and consult with us. We have a long history of doing that, and this goes back decades, first helping us with faculty development around moving courses online.
We wanted that to be done by faculty for faculty. Yes, we have instructional designers who are staff, but we really wanted the faculty to be driving that. We identified in 2023 our first AI faculty fellows, and we got a faculty member from information systems and a faculty member from anthropology—very different in terms of their skill sets and their orientation to research. One was a qualitative ethnographic researcher; the other had more of a quantitative machine learning focus. They were very complementary in terms of balancing each other out.
Twenty twenty-three was the first time we had ever provided fellowships to students. We provided fellowships to two students. One was an engineering student and another was an Africana studies student. Again, very different in terms of the academic domain and the discipline they were in, but again, very balanced.
So those two AI student fellows and the two AI faculty fellows helped us design the survey instrument, get the IRB [institutional review board] approvals, launch the survey, promote the survey. I really want to give credit where credit is due: We got an incredible response rate. We’re lucky if we usually get like a 3 percent response rate from a student survey. We got a 21 percent response rate in 2023; 7,811 students responded to that survey.
The credit for that goes to Associated Students, our student government. The president of Associated Students that year ran on a platform of getting students high-paying jobs, and he knew for students to get high-paying jobs, they needed to be conversant with AI. So he helped us promote that survey, and the whole campaign was around “your voice matters.” So thanks to his help and the help of these AI student fellows, we got this incredible response from our students.
So anyway, the students and the faculty fellows helped us analyze those results and then use that data to build these microcredentials. So very much involving faculty and students and our University Senate, our library. I mean, the library knows a thing or two about information literacy, right? They absolutely have to be at the table. Our Center for Teaching and Learning, which is responsible for providing faculty with professional development on campus, they were also very involved from the very outset, so very much of a collaborative effort.
Q: I wanted to ask about culture and creating a campus culture that embraces AI. How are you all thinking about engaging stakeholders in these hard conversations and bringing different disciplines to the playing field?
A: I think it’s really important. That’s what the data has done for us. It’s really created space for these conversations, because faculty will respond to evidence. If you have data that is from their students, who they care about deeply, that creates space for these conversations.
For instance, one of the things that emerged from the survey findings was inconsistency. In the same course, maybe taught by different instructors, there would be different expectations and policies with regard to AI.
In multiple sections of Psychology 101—and that’s not a real example, I’m just using that as a fictitious example—one instructor might completely forbid the use of AI and another one might require it, and that’s stressful for students because they didn’t know what to expect.
In fact, one of the comments that really resonated with me from the survey was, and this is a verbatim quote, “Just tell us what you expect and be clear about it.” Students were getting mixed messages.
So that led to conversations with our University Senate about the need to be clear with our students. I’m happy to report, just this past May, our University Senate unanimously passed a policy that requires an AI … statement in every syllabus. That was an important step in the right direction.
The University Senate also created guidelines for the use of generative AI in assessments and deliverables. You know, it’s important that you not be prescriptive with your faculty. You need to provide them with lots of examples of language that they can use or tweak, because they own the curriculum, and they need to know that they don’t have to take a one-size-fits-all approach.
Maybe one assignment, it’s restricted; in another assignment, it’s unrestricted, right? You can do that. And they’re like, “Oh yeah, I can do that.” Giving them examples of language they can use, and also encouraging them to use this as an opportunity to have a conversation with their students.
The students want more direction on how to use these tools appropriately. And I think if you race to a policy that’s all about academic misconduct, it’s frankly insulting to the students, to just assume everybody’s cheating, and then when they leave here and go into their place of business, they’re going to be expected to use these tools. So, really powerful conversations.
That’s been key here—just talking about [AI]. I mean, it’s this seismic kind of epistemic shift for our faculty and how knowledge is created, how we acquire knowledge, how we represent knowledge, how we assess knowledge. It’s a stressful time for our faculty—they need to be able to process that with other faculty, and that’s super important.
Q: It’s also important that you’re having that conversation collegewide, because if this is a career competency and students do need AI skills, it needs to happen in every classroom, or at least be addressed in every classroom.
A: That’s a really good point, Ashley. In fact, we’re launching a program this year that we’re calling the AI-ready course design workshop, and the idea for that is that we’re identifying a faculty member from every major and we are paying them—and this is super important, too: It’s really a sign of respect, in terms of acknowledging the labor required to reimagine an assignment, to weave AI into the fabric of that assignment.
The goal is to have a faculty member from every major who teaches a required course in that major participate at least two times. We want to make sure that they have an opportunity to do this and then refine it and do it again. They’re being paid over break this winter to reimagine an assignment that leverages AI, and it is a deliverable. They will produce a three- to five-minute introspective video where they reflect on what they did, why they did it and what the learning outcomes were, both for them and for their students.
That is great because we will have an example from every major of how you can use AI in the fabric of your teaching. And I think that’s what faculty need right now. Again, they need lots of examples, and we’re incentivizing that through this program. We already have something we call the “AI in action” video series, so we already have some examples, but we don’t have examples from every major.
For us right now, I think you’re seeing a lot of engagement from faculty in engineering and sciences. We’re concerned that our humanities faculty need to engage; we need to engage the political scientists. We need to engage the philosophers and the historians. They can’t just sit this out. They’re really going to be key players in moving this forward, to prepare our students, regardless of major, for this AI-augmented world that we’re living in.
Q: What are some of the lessons that you’ve learned that you hope higher education can learn from? How do you all hope to be a model to your peers across the sector?
A: I think the key is the importance of data and using data to inform the choices you’re making, whether it’s in the classroom or in the cabinet. I report to the president, and we use data to really drive those conversations and to make sure that we’re engaging all of those stakeholders.
For instance, we’re looking at the survey data. That survey that we did in 2023 and repeated in 2024, we’ve now scaled up to the entire California State University system, and that is underway right now. In fact, I was just looking at the latest response rates. We have had, as of this morning, 77,714 people responding to the survey … which is about a 15 percent response rate. We’ve got half a million students in the CSU, so it’s a big number.
I was looking at [the data] with the council of vice presidents and my colleague … the provost, and I said, “When you look at the numbers for San Diego State, we’ve had 10,682 responses from students. We’ve had 406 responses from faculty and 556 responses from staff. But relative to the students, the response rate from faculty is pretty low.” So I talked with [the provost] about sending a message out to our academic leaders—the deans and the department chairs and the school directors—encouraging their faculty to respond to the survey, so that we have a balanced perspective.
Everybody has a voice. That is certainly something that I want to encourage; this whole idea of incentivizing faculty engagement, I think, is important. I think you really need to provide that encouragement for faculty to experiment, to show off, and then to really use that as an opportunity to recognize those faculty and celebrate them. That does a couple things. One, it honors them for taking the risk to do this work. Then it might inspire another faculty member to build on that work, or to have coffee with that person and talk about what they wish they had known, the kind of advice that someone who is early career would appreciate. I think that idea of incentivizing faculty engagement is another thing that I would encourage the audience to consider.
Q: What’s next for you all? Are there other cool interventions or programs that are coming out?
A: That survey data is going to do quite a few things for us. It’s going to help us to not only refine the microcredentials and the work we’re doing with the microcredentials, but it’s also going to allow us to scaffold conversations with industry and our industry partners in terms of being responsive to the competencies they’re going to need in their industry.
I think it’s something like 35 out of the top 50 AI companies are housed here in California, but they can’t find the talent they need in California, let alone the United States, so they’re having to go abroad to get the people they need to continue to innovate. So using this as an opportunity to work with our industry partners to make sure we’re preparing this workforce that they need to continue to innovate, that’s a key element of it, and then using this data also to help us get additional resources and use that data to say, “Hey, here’s a gap we’ve identified. We need to fill this gap,” and using that data to make the case for that investment.
One of the great ironies and great frustrations of my career teaching first-year college writing was having students enter our class armed with a whole host of writing strategies which they had been explicitly told they needed to know “for college,” and yet those strategies—primarily the following of prescriptive templates—were entirely unsuited to the experience students were going to have over the next 15 weeks of our course (and beyond).
I explored and diagnosed these frustrations in Why They Can’t Write, and many other writing teachers in both high school and college shared that they’d seen, and been equally (or more) frustrated by, the same things. In the intervening years, there’s been some progress, but frankly, not enough, primarily because the structural factors that distorted how writing is taught precollege have not been addressed.
As long as writing is primarily framed as workforce preparation to be tested through standardization and quantification, students will struggle when invited into a more nuanced conversation that requires them to mine their own thoughts and experiences of the world and put those thoughts and experiences in juxtaposition with the ideas of others. The good news, in my experience, is that once invited into this struggle, many students are enthusiastic to engage, at least once they genuinely believe that you are interested in the contours of their minds and their experiences.
Clark calls for a “higher ed and secondary ed alliance” based in the values we all at least claim to share: free inquiry, self-determination and an appreciation for lives that are more than the “skills” we’re supposed to bring to our employers.
Something I can’t help but note is that the challenges college instructors are having getting students to steer clear of outsourcing their thinking to large language models would be significantly lessened if students had a greater familiarity with thinking during their secondary education years. Unfortunately, the system of indefinite future reward that has been reduced to pure transactions in exchange for grades and credentials has signaled that the outputs of the homework machine are satisfactory, so why not just give in?
When I go to campuses and schools and have the opportunity to speak to students, I try to list all kinds of reasons why they shouldn’t just give in, reasons which, in the end, boil down to the fact that being a big dumb-dumb who doesn’t know anything and can’t do anything without the aid of a predictive text-generation machine is simply an unfulfilling and unpleasant way to go through life.
In short, they will not be happy, even if they find ways to navigate their “work” with the aid of AI, because humans simply need more than this from our existences.
In a world where machines can handle the technical knowledge, the only differentiator is being human.
This is not news to those of us with liberal arts degrees, like my sister-in-law, who took hers from Denison University all the way to a general counsel job at a Fortune 300 company, or someone else with a far humbler résumé … me.
As I wrote in 2013 in this very space, the key to my success as an adult who has had to repeatedly adapt to a changing world is my liberal arts degrees, degrees that armed me with foundational and enduring skills that have served me quite well.
But, of course, it is about more than these skills. My pursuit of these degrees also allowed me to consider what a good life should be. That knowledge has put me in a position where—knock wood—I wake up just about every morning looking forward to what I have to do that day.
This is true even as the things I most care about—education, reading/writing, uh … democracy—appear to be inexorably crumbling around me. Perhaps this is because my knowledge of the value of humanistic study as something more than a route to a good job makes me more willing to fight for its continuation.
Sometimes when I encounter some hand-wringing about the inevitability of AI and the uncertainty of the future, I want to remind the fretful that we actually have a very sound idea of what we should be emphasizing, the same stuff we always should have been emphasizing—teaching, learning, living, being human.
We have clear notions of what this looks like. The main question now is if we have the collective will to move toward that future, or if we will give in to something much darker, much less satisfying and much less human.
Philanthropist MacKenzie Scott has gifted Morgan State University $63 million in unrestricted funds, the largest gift in the university’s history.
In 2020, Scott awarded the historically Black university in Baltimore $40 million, which went toward multiple research centers and endowed faculty positions, among other advancements.
Morgan State leaders announced that the new funding will help build the university’s endowment, expand student supports and advance its research.
David K. Wilson, president of Morgan State, called the gift “a resounding testament to the work we’ve done to drive transformation, not only within our campus but throughout the communities we serve.”
“To receive one historic gift from Ms. Scott was an incredible honor; to receive two speaks volumes about the confidence she and her team have in our institution’s stewardship, leadership, and trajectory,” Wilson said in the announcement. “This is more than philanthropy—it’s a partnership in progress.”
A recent report from the Community College Research Center at Columbia University’s Teachers College found that high school students graduate college at higher rates and earn more after college if they’ve taken a combination of dual-enrollment and Advanced Placement courses.
The report, released Tuesday, drew on administrative data from Texas on students expected to graduate high school in 2015–16 and 2016–17, as well as some data from students expected to complete in 2019–20 and 2022–23. It explored how different kinds of accelerated coursework, and different combinations of such work, affected student outcomes.
Researchers found that students who combined Advanced Placement or International Baccalaureate courses with dual-enrollment courses boasted higher completion rates and earnings than their peers. Of these students, 92 percent enrolled in or completed a credential a year after high school, and 71 percent earned a credential by year six.
These students also showed the strongest earnings outcomes in their early 20s. They earned $10,306 per quarter on average at age 24, compared to $9,746 per quarter among students who took only dual enrollment and $8,934 per quarter for students who took only AP/IB courses. However, students taking both dual-enrollment and AP/IB courses tended to be less racially and socioeconomically diverse than students taking AP/IB courses alone, the report found.
Students who combined dual enrollment with career and technical education—who made up just 5 percent of students in the study—also reaped positive outcomes later in life. These students earned $9,746 per quarter on average by age 24, compared to $8,097 per quarter on average for students with only a CTE focus.
“Most dual-enrollment students in Texas also take other accelerated courses, and those who do tend to have stronger college and earnings trajectories,” CCRC senior research associate Tatiana Velasco said in a press release. “It’s a pattern we hadn’t fully appreciated before, which offers clues for how to expand the benefits of dual enrollment to more students.”
The share of Americans who believe higher education has lost its way is on the rise, according to a new survey the Pew Research Center published Wednesday.
Of the 3,445 people who responded to the survey last month, 70 percent said higher education is generally “going in the wrong direction,” up from 56 percent in 2020. They cited high costs, poor preparation for the job market and lackluster development of students’ critical thinking and problem-solving skills.
The survey results come amid turmoil for the higher education sector, which was already facing rising public skepticism about the value of a college degree before Donald Trump took office earlier this year. But over the past nine months, the Trump administration has terminated billions in federal research grants and withheld even more money from several selective institutions.
Another survey published this week found that most Americans oppose the government’s cuts to higher education.
Earlier this month, Trump asked universities to sign a compact that would give them preference in federal funding decisions if they agree to make sweeping operational changes, including suppressing criticism of conservative views on campus.
But the state of campus free speech is already one factor driving the public’s overall negative views about higher education, according to the survey.
Forty-five percent of respondents said colleges and universities are doing a fair or poor job of exposing students to a wide range of opinions and viewpoints; 46 percent said institutions are doing an inadequate job of providing students opportunities to express their own opinions and viewpoints.
Political leanings also influenced perceptions of higher education, though the gap between Republicans and Democrats has narrowed in recent years.
According to the survey, 77 percent of Republicans and Republican-leaning respondents said higher education is moving in the wrong direction, compared to 65 percent of Democrats and Democratic-leaning respondents.
Republicans were more likely than Democrats to say that universities are doing a poor or fair job of preparing students for well-paying jobs, developing students’ critical thinking and problem-solving skills, exposing students to a wide range of opinions and viewpoints, and providing opportunities for students to express their own opinions and views.
A new report analyzes earnings for college graduates from different majors.
Despite mounting public skepticism about the value of a college degree, the data is still clear: Over all, college graduates have much higher earning potential than their peers without a bachelor’s degree. But the limits of those boosted earnings are often decided by a student’s major.
American workers with a four-year degree ages 25 to 54 earn a median annual salary of $81,000—70 percent more than their peers with a high school diploma alone, according to a new report that Georgetown University’s Center on Education and the Workforce published Thursday. However, the salary range for workers with a bachelor’s degree can span anywhere from $45,000 a year for graduates of education and public service to $141,000 for STEM majors.
And even within those fields, salary levels have a big range. Humanities majors in the prime of their careers earn between $48,000 and $105,000 a year, with a median salary of $69,000. Meanwhile, business and communications majors earn between $58,000 and $129,000 a year, with a median salary of $86,000.
“Choosing a major has long been one of the most consequential decisions that college students make—and this is particularly true now, when recent college graduates are facing an unusually rocky labor market,” said Catherine Morris, senior editor and writer at CEW and lead author of the report, “The Major Payoff: Evaluating Earnings and Employment Outcomes Across Bachelor’s Degrees.”
“Students need to weigh their options carefully.”
The report, which analyzed earnings and unemployment data collected by the U.S. Census Bureau’s American Community Survey from 2009 to 2023, also documented rising unemployment for recent college graduates. In 2008, recent graduates had lower unemployment rates relative to all workers (6.8 percent versus 9.8 percent). But that gap has narrowed over the past 15 years; since 2022, recent college graduates have faced higher levels of unemployment relative to all workers.
Morris attributed rising unemployment for recent college graduates to a mix of factors, including increased layoffs in white-collar fields, the rise of artificial intelligence and general economic uncertainty. At the same time, climbing tuition prices and the student debt crisis have heightened consumer concern about a degree’s return on investment.
“Over the past 15 years, there’s been more and more of a shift toward students wanting to get degrees in majors that they perceive as lucrative or high-paying,” said Morris, who noted that STEM degrees, especially computer science, have become increasingly popular. Meanwhile, the popularity of humanities degrees has declined.
But just because a degree has higher earning potential doesn’t mean it’s immune to job instability. In 2022, 6.8 percent of recent graduates with computer science degrees were unemployed, while just 2.2 percent of education majors—who typically earn some of the lowest salaries—were unemployed.
“The more specific the major, the more sensitive it is to sectoral shocks,” said Jeff Strohl, director of the center at Georgetown. “More general majors actually have a lot more flexibility in the labor market. I would expect to see some of the softer majors that start with higher unemployment than the STEM majors be a little more stable.”
And earning a graduate degree can also substantially boost earnings for workers with a bachelor’s degree in a more general field, such as multidisciplinary studies, social sciences or education and public service. Meanwhile, the graduate earnings premium for more career-specific fields isn’t as high.
“About 25 percent of bachelor of arts majors don’t by themselves have a positive return on investment,” Strohl said. “But we need to look at the graduate earnings premium, because many B.A. majors don’t stand by themselves.”
Although salaries for college graduates are one metric that can help college students decide on a major, Morris said it shouldn’t be the only consideration.
“Don’t just chase the money,” she said. “The job market can be very unpredictable. Students need to be aware of their own intrinsic interests and find ways to differentiate themselves.”
The report, released Thursday, shows the majority of students see a positive return on investment within 10 years, but Strada says it isn’t enough.
Seventy percent of the country’s college graduates see their investment pay off within 10 years, but that outcome correlates strongly with the state where a student obtains their degree, according to the Strada Foundation’s latest State Opportunity Index.
The report, released Thursday, shows that states such as California and Delaware surpass the average at 76 percent and 75 percent, respectively, while North Dakota, for example, falls significantly short at 53 percent.
Across the board, the nation still has a ways to go before it can ensure all graduates see a positive return on investment, according to the report.
“Too many learners invest substantial time and money without achieving strong career and earnings outcomes,” it says. “Meanwhile, many employers struggle to find the skilled talent they need to fill high-wage jobs.”
Strada hopes that the index and the five categories it highlights—outcomes, coaching, affordability, work-based learning and employer alignment—will provide a framework for policymakers to “strengthen the link between education and opportunity.”
“The State Opportunity Index reinforces our belief at Strada Education Foundation that we as a nation can’t just focus on college access and completion and assume that a college degree will consistently deliver for all on the promise of postsecondary education as a pathway to opportunity,” Strada president Stephen Moret said in a news release. “We must look at success beyond completion, with a sharper focus on helping people land jobs that pay well and offer growth opportunities.”