The University of Melbourne was the highest-ranked Australian university.
Australia’s universities have charged up the global leaderboard in a year in which many of their international peers lost ground, according to a world-renowned tertiary rankings list.
The 2026 rankings overall show that “East Asian nations, led by China, continue to thrive,” said THE’s chief global affairs officer, Phil Baty.
Even before U.S. universities lost billions of dollars in federal research funding and international students struggled to obtain visas, America’s dominance in research impact and global reputation was waning. According to the latest rankings from Times Higher Education, the U.S. has continued to cede influence to universities in Asia.
For several years, the rankings from Inside Higher Ed’s parent company have documented a steady decline in the U.S.’s leadership in global higher education. The 2026 World University Rankings reflect that ongoing trend: Just 102 universities from the United States cracked the top 500—the lowest figure on record, down from a high of 125 in 2018. (The rankings started in 2004.)
The downward trend is less apparent in the overall top 10, where seven U.S. institutions appear. The Massachusetts Institute of Technology is the highest-ranking American institution, coming in at No. 2, just behind the top-ranked University of Oxford. According to THE, Princeton University recorded an institutional best score to tie for third.
But institutions farther down the list have slipped. Twenty-five colleges logged their worst-ever scores while 62 dropped in the rankings, which use 18 measures to judge institutions on five areas including teaching, research quality and international outlook.
The 2026 rankings of more than 2,100 institutions are based on data collected from 2022 and 2023 and don’t reflect the Trump administration’s push to reshape American higher education. They don’t show what impact cuts to research funding and the crackdown on international students might have on U.S. institutions’ position on the global stage. Those changes could lead to a further decline for U.S. institutions in the rankings, though Ellie Bothwell, THE rankings editor, said what future rankings might show is hard to predict.
“Any country that cuts research funding, that limits internationalization of higher education, would be in danger of declining in the ranking,” she said. “Those are key things that we measure. Those are important things that universities do. There’s always going to be a risk if you cut those. There’ll be a decline, but it’s all relative, so it does also depend what goes on elsewhere.”
Looking at the overall 2026 rankings, Bothwell called the decline for the U.S. “striking,” adding that the drop reflects increased global competition. American institutions on average received lower scores on measures related to research, such as citation impact, as well as research strength and reputation.
Meanwhile, Asian universities continue to climb the rankings. Five universities from China are now in the top 40, and 18 achieved their highest ranks ever, according to THE.
THE’s chief global affairs officer, Phil Baty, said in a statement that the latest data suggests higher ed is moving toward a new world order with an Eastern center of gravity.
“This year’s rankings highlight a dramatic and accelerating trend—the shift in the balance of power in research and higher education excellence from the long-established, dominant institutions of the West to rising stars of the East,” Baty said.
He predicted that U.S. institutions and those in Western Europe would continue to lose ground to their East Asian counterparts in the rankings. “This clear trend is set to persist as research funding and international talent attraction continue to be stymied in the West,” he added.
More than two years into a conservative takeover of New College of Florida, spending has soared and rankings have plummeted, raising questions about the efficacy of the overhaul.
While state officials, including Republican governor Ron DeSantis, have celebrated the death of what they have described as “woke indoctrination” at the small liberal arts college, student outcomes are trending downward across the board: Both graduation and retention rates have fallen since the takeover in 2023.
Those metrics are down even as New College spends more than 10 times per student what the other 11 members of the State University System spend, on average. While one estimate last year put the annual cost per student at about $10,000 per member institution, New College is an outlier, with a head count under 900 and a $118.5 million budget, which adds up to roughly $134,000 per student.
Now critics are raising new questions about NCF’s reputation, its worth and its future prospects as a public liberal arts college.
A Spending Spree
To support the overhaul, the state has largely issued a blank check for New College, with little pushback from officials.
While some—like Florida Board of Governors member Eric Silagy—have questioned the spending and the state’s return on investment, money keeps flowing. Some critics say that’s because the college is essentially a personal project of the governor.
“With DeSantis, I think his motivation for the takeover was that he was running for president and he needed some educational showcase. And he picked us because we were an easy target,” one New College of Florida faculty member said, speaking on the condition of anonymity.
But now, two-plus years and one failed presidential run later, money continues to flow to the college to help establish new athletics programs and recruit larger classes each year. Part of the push behind such recruiting efforts, the faculty member said, stems from retention issues.
“It’s kind of like a Ponzi scheme: Students keep leaving, so they have to recruit bigger and bigger cohorts of students, and then they say, ‘Biggest class ever’ because they have to backfill all the students who have left,” they said.
Nathan Allen, a New College alum who served as vice president of strategy at NCF for almost a year and a half after the takeover but has since stepped down, echoed that sentiment, arguing that administrators are spending heavily with little return on investment and have failed to stabilize the institution. He also said they’ve lost favor with lawmakers, who have expressed skepticism in conversations—even though New College is led by former Speaker of the Florida House Richard Corcoran, a Republican.
“I think that the Senate and the House are increasingly sensitive to the costs and the outcomes,” Allen said. “Academically, Richard’s running a Motel 6 on a Ritz-Carlton budget, and it makes no sense.”
While New College’s critics have plenty to say, supporters are harder to find.
Inside Higher Ed contacted three NCF trustees (one of whom is also a faculty member), New College’s communications office, two members of the Florida Board of Governors (including Silagy) and the governor’s press team for this article. None responded to requests for comment.
A Rankings Spiral
Since the takeover, NCF has dropped nearly 60 spots among national liberal arts colleges in the U.S. News & World Report Best Colleges rankings, from 76th in 2022 to 135th this year.
Though critics have long argued that such rankings are flawed and various institutions have stopped providing data to U.S. News, the state of Florida has embraced the measurement. Officials, including DeSantis, regularly tout Florida’s decade-long streak as the top state for higher education, and some public universities have built rankings into their strategic plans. But as most other universities in the state are climbing in the rankings, New College is sliding, a fact unmentioned at a Monday press conference featuring DeSantis and multiple campus leaders.
Corcoran, the former Republican lawmaker hired as president shortly after the takeover, did not directly address the rankings slide when he spoke at the briefing at the University of Florida. But in his short remarks, Corcoran quibbled over ranking metrics.
“The criteria is not fair,” he said.
Specifically, he took aim at peer assessment, which makes up 20 percent of the rankings criteria. Corcoran argued that Florida’s institutions, broadly, suffer from a negative reputation among their peers, whose leaders take issue with the conservative agenda DeSantis has imposed on colleges and universities.
“This guy has changed the ideology of higher education to say, ‘We’re teaching how to think, not what to think,’ and we’re being peer reviewed by people who think that’s absolutely horrendous,” Corcoran said.
An Uncertain Future
As New College’s cost to the state continues to rise and rankings and student outcomes decline, some faculty members and alumni have expressed worry about what the future holds. While some believe DeSantis is happy to keep pumping money into New College, the governor is term-limited.
“It’s important to keep in mind that New College is not a House or Senate project; it’s not a GOP project. It’s a Ron DeSantis project. Richard Corcoran has a constituency of one, and that’s Ron,” Allen said.
Critics also argue that changes driven by the college’s administration and the State University System—such as reinstating grades instead of relying on the narrative evaluations NCF has historically used and limiting course offerings, among other initiatives—are stripping away what makes New College special. They argue that as it loses traditions, it’s also losing differentiation.
Rodrigo Diaz, a 1991 New College graduate, said that the Sarasota campus had long attracted quirky students who felt stifled by more rigid academic environments. Now the administration and state are imposing “uniformity,” he said, which he argued will be “the death of New College.”
And some critics worry that death is exactly what lies ahead for NCF. The anonymous faculty member said they feel “an impending sense of doom” at New College and fear that it could close within the next two years. Allen said he has heard a similar timeline from lawmakers.
Even Corcoran referenced possible closure at a recent Board of Governors meeting.
In his remarks, the president emphasized that a liberal arts college should “produce something different.” And “if it doesn’t produce something different, then we should be closed down. But if we are closed down, I say this very respectfully, Chair—then this Board of Governors should be shut down, too,” Corcoran said, noting that many of its members have liberal arts degrees.
To Allen, that remark was an unforced error suggesting that conversations about closure are likely happening behind closed doors.
“I think Richard made the mistake of not realizing those conversations haven’t been public. He made them public, but the Board of Governors is very clearly talking to him about that,” he said.
But Allen has floated an alternative to closure: privatization.
Founded in 1960, New College was private until it was absorbed by the state in 1975. Allen envisions “the same deal in reverse” in a process that would be driven by the State Legislature.
“I think that the option set here is not whether it goes private or stays public, I think it’s whether it goes private or closes,” Allen said. “And I think that that is increasingly an open conversation.”
(Though NCF did not respond to media inquiries, Corcoran has voiced opposition to such a plan.)
Allen has largely pushed his plan privately, meeting with lawmakers, faculty, alumni and others. Reactions are mixed, but the idea seems to be a growing topic of conversation on campus. The anonymous faculty member said they are increasingly warming to the idea as the only viable solution, given that they believe the other option is closure within the next one to three years.
“I’m totally convinced this is the path forward, if there is a path forward at all,” they said.
Diaz said the idea is also gaining momentum in conversations with fellow alumni. He called himself “skeptical but respectful” of the privatization plan and said he has “a lot of doubt and questions.” But Diaz said that he and other alumni should follow the lead of faculty members.
“Now, if the faculty were to jump on board with the privatization plan, then I think that people like myself—alumni like myself, who are concerned for the future of the college—should support the faculty,” Diaz said. “But the contrary is also true. If the faculty sent up a signal that ‘We don’t like this, we have doubts about this,’ then, in good conscience, I don’t think I could back the plan.”
The Growing Movement to Reform Research Assessment and Rankings
By Dean Hoke, September 22, 2025: For the past fifteen years, I have been closely observing what can only be described as a worldwide fascination—if not obsession—with university rankings, whether produced by Times Higher Education, QS, or U.S. News & World Report. In countless conversations with university officials, a recurring theme emerges: while most acknowledge that rankings are often overused by students, parents, and even funders when making critical decisions, few deny their influence. Nearly everyone agrees that rankings are a “necessary evil”—flawed, yet unavoidable—and many institutions still direct significant marketing resources toward leveraging rankings as part of their recruitment strategies.
It is against this backdrop of reliance and ambivalence that recent developments, such as Sorbonne University’s decision to withdraw from THE rankings, deserve closer attention.
In a move that signals a potential paradigm shift in how universities position themselves globally, Sorbonne University recently announced it will withdraw from the Times Higher Education (THE) World University Rankings starting in 2026. This decision isn’t an isolated act of defiance—Utrecht University had already left THE in 2023, and the Coalition for Advancing Research Assessment (CoARA), founded in 2022, had grown to 767 members by September 2025. Together, these milestones reflect a growing international movement that questions the very foundations of how we evaluate academic excellence.
The Sorbonne Statement: Quality Over Competition
Sorbonne’s withdrawal from THE rankings isn’t merely about rejecting a single ranking system. It appears to be a philosophical statement about what universities should stand for in the 21st century. The institution has made it clear that it refuses to be defined by its position in what it sees as commercial ranking matrices that reduce complex academic institutions to simple numerical scores.
Understanding CoARA: The Quiet Revolution
The Coalition for Advancing Research Assessment represents one of the most significant challenges to traditional academic evaluation methods in decades. Established in 2022, CoARA has grown rapidly to include 767 member organizations as of September 2025. This isn’t just a European phenomenon—though European institutions have been early and enthusiastic adopters.
The Four Pillars of Reform
CoARA’s approach centers on four key commitments that directly challenge the status quo:
1. Abandoning Inappropriate Metrics: The agreement explicitly calls for abandoning “inappropriate uses of journal- and publication-based metrics, in particular inappropriate uses of Journal Impact Factor (JIF) and h-index.” This represents a direct assault on the quantitative measures that have dominated academic assessment for decades.
2. Avoiding Institutional Rankings: Perhaps most relevant to the Sorbonne’s decision, CoARA commits signatories to “avoid the use of rankings of research organisations in research assessment.” This doesn’t explicitly require withdrawal from ranking systems, but it does commit institutions to not using these rankings in their own evaluation processes.
3. Emphasizing Qualitative Assessment: The coalition promotes qualitative assessment methods, including peer review and expert judgment, over purely quantitative metrics. This represents a return to more traditional forms of academic evaluation, albeit updated for modern needs.
4. Responsible Use of Indicators: Rather than eliminating all quantitative measures, CoARA advocates for the responsible use of indicators that truly reflect research quality and impact, rather than simply output volume or citation counts.
European Leadership
[Chart: Top 10 Countries by CoARA Membership]
The geographic distribution of CoARA members tells a compelling story about where resistance to traditional ranking systems is concentrated. As the chart shows, European countries dominate participation, led by Spain and Italy, with strong engagement also from Poland, France, and several Nordic countries. This European dominance isn’t accidental—the region’s research ecosystem has long been concerned about the Anglo-American dominance of global university rankings and the way these systems can distort institutional priorities.
Prestigious European universities like ETH Zurich, the University of Zurich, Politecnico di Milano, and the University of Manchester are among the members, lending credibility to the movement. However, the data reveals that the majority of CoARA members (84.4%) are not ranked in major global systems like QS, which adds weight to critics’ arguments about institutional motivations.
[Chart: CoARA Members Ranked vs Not Ranked in QS]
The Regional Divide: Participation Patterns Across the Globe
What’s particularly striking about the CoARA movement is the relative absence of U.S. institutions. While European universities have flocked to join the coalition, American participation remains limited. This disparity reflects fundamental differences in how higher education systems operate across regions.
American Participation: The clearest data we have on institutional cooperation with ranking systems comes from the United States. Despite some opposition to rankings, 78.1% of the nearly 1,500 ranked institutions returned their statistical information to U.S. News in 2024, showing that the vast majority of American institutions remain committed to these systems. However, there have been some notable American defections. Columbia University is among the latest institutions to withdraw from U.S. News & World Report college rankings, joining a small but growing list of American institutions questioning these systems. Yet these remain exceptions rather than the rule.
European Engagement: While we don’t have equivalent participation rate statistics for European institutions, we can observe their engagement patterns differently. 688 universities appear in the QS Europe ranking for 2024, and 162 institutions from Northern Europe alone appear in the QS World University Rankings: Europe 2025. However, European institutions have simultaneously embraced the CoARA movement in large numbers, suggesting a more complex relationship with ranking systems—continued participation alongside philosophical opposition.
Global Participation Challenges: For other regions, comprehensive participation data is harder to come by. The Arab region has 115 entries across five broad areas of study in QS rankings, but these numbers reflect institutional inclusion rather than active cooperation rates. It’s important to note that some ranking systems use publicly available data regardless of whether institutions actively participate or cooperate with the ranking organizations.
This data limitation itself is significant—the fact that we have detailed participation statistics for American institutions but not for other regions may reflect the more formalized and transparent nature of ranking participation in the U.S. system versus other global regions.
American universities, particularly those in the top tiers, have largely benefited from existing ranking systems. The global prestige and financial advantages that come with high rankings create powerful incentives to maintain the status quo. For many American institutions, rankings aren’t just about prestige—they’re about attracting international students, faculty, and research partnerships that are crucial to their business models.
Beyond Sorbonne: Other Institutional Departures
Sorbonne isn’t alone in taking action. Utrecht University withdrew from THE rankings earlier, citing concerns about the emphasis on scoring and competition. These moves suggest that some institutions are willing to sacrifice prestige benefits to align with their values. Interestingly, the Sorbonne has embraced alternative ranking systems such as the Leiden Open Rankings, which highlight its impact.
The Skeptics’ View: Sour Grapes or Principled Stand?
Not everyone sees moves like Sorbonne’s withdrawal as a principled stand. Critics argue that institutions often raise philosophical objections only after slipping in the rankings. As one university administrator put it: “If the Sorbonne were doing well in the rankings, they wouldn’t want to leave. We all know why self-assessment is preferred. ‘Stop the world, we want to get off’ is petulance, not policy.”
This critique resonates because many CoARA members are not major players in global rankings, which fuels suspicion that reform may be as much about strategic positioning as about values. For skeptics, the call for qualitative peer review and expert judgment risks becoming little more than institutions grading themselves or turning to sympathetic peers.
The Stakes: Prestige vs. Principle
At the heart of this debate is a fundamental tension: Should universities prioritize visibility and prestige in global markets, or focus on measures of excellence that reflect their mission and impact? For institutions like the Sorbonne, stepping away from THE rankings is a bet that long-term reputation will rest more on substance than on league table positions. But in a globalized higher education market, the risk is real—rankings remain influential signals to students, faculty, and research partners. Rankings also exert practical influence in ways that reformers cannot ignore. Governments frequently use global league tables as benchmarks for research funding allocations or as part of national excellence initiatives. International students, particularly those traveling across continents, often rely on rankings to identify credible destinations, and faculty recruitment decisions are shaped by institutional prestige. In short, rankings remain a form of currency in the global higher education market.
This is why the decision to step away from them carries risk. Institutions like the Sorbonne and Utrecht may gain credibility among reform-minded peers, but they could also face disadvantages in attracting international talent or demonstrating competitiveness to funders. Whether the gamble pays off will depend on whether alternative measures like CoARA or ROI rankings achieve sufficient recognition to guide these critical decisions.
The Future of Academic Assessment
The CoARA movement and actions like Sorbonne’s withdrawal represent more than dissatisfaction with current ranking systems—they highlight deeper questions about what higher education values in the 21st century.
Yet rankings are unlikely to disappear. For students, employers, and funders, they remain a convenient—if imperfect—way to compare institutions across borders. The practical reality is that rankings will continue to coexist with newer approaches, even as reform efforts reshape how universities evaluate themselves internally.
Alternative Rankings: The Rise of Outcome-Based Assessment
While CoARA challenges traditional rankings, a parallel trend focuses on outcome-based measures such as return on investment (ROI) and career impact. Georgetown University’s Center on Education and the Workforce, for example, ranks more than 4,000 colleges on the long-term earnings of their graduates. Its findings tell a very different story than research-heavy rankings—Harvey Mudd College, which rarely appears at the top of global research lists, leads ROI tables with graduates projected to earn $4.5 million over 40 years.
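The mechanics behind such ROI figures are easy to illustrate. Below is a minimal Python sketch that discounts a flat stream of hypothetical graduate earnings against an upfront net cost of attendance; the figures, the discount rate, and the flat-earnings assumption are all illustrative, not Georgetown’s actual model.

```python
# Simplified sketch of an earnings-based ROI (net present value) figure.
# All inputs are hypothetical; this is not Georgetown CEW's actual model.

def net_present_value(annual_earnings, years, net_cost, discount_rate=0.02):
    """Discounted value of a flat earnings stream, minus upfront net cost."""
    npv = -net_cost
    for t in range(1, years + 1):
        npv += annual_earnings / (1 + discount_rate) ** t
    return npv

# Hypothetical graduate: $110,000 a year over a 40-year horizon,
# against a $120,000 net cost of attendance.
print(f"40-year NPV: ${net_present_value(110_000, 40, 120_000):,.0f}")
```

On inputs like these, undiscounted lifetime earnings alone come to $4.4 million over 40 years, which is the kind of headline figure that tops ROI tables.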
Other outcome-oriented systems, such as The Princeton Review’s “Best Value” rankings, take a similar approach, combining measures of financial aid, academic rigor, and post-graduation outcomes to highlight institutions that deliver strong returns for students relative to their costs. Public universities often rise in these rankings, as do specialized colleges that may not feature prominently in global research tables. Together, these approaches represent a pragmatic counterbalance to CoARA’s reform agenda, showing that students and employers increasingly want measures of institutional value beyond research metrics alone.
Institutions like the Albany College of Pharmacy and Health Sciences illustrate this point. Although virtually invisible in global rankings, Albany graduates report median salaries of $124,700 just ten years after graduation, placing the college among the best in the nation on ROI measures. For students and families making education decisions, data like this often carries more weight than a university’s position in QS or THE.
Together with Georgetown’s ROI rankings and the example of Harvey Mudd College, these cases suggest that outcome-based rankings are not marginal alternatives—they are becoming essential tools for understanding institutional value in ways that matter directly to students and employers.
Rankings as Necessary Evil: The Practical Reality
If the reform movement gains further momentum, we could see:
Diversification of evaluation methods, with different regions and institution types developing assessment approaches that align with their specific values and goals
Reduced emphasis on competition between institutions in favor of collaboration and shared improvement
Greater focus on societal impact rather than purely academic metrics
More transparent and open assessment processes that allow for a better understanding of institutional strengths and contributions
Conclusion: Evolution, Not Revolution
The Coalition for Advancing Research Assessment and decisions like Sorbonne’s withdrawal from THE rankings represent important challenges to how we evaluate universities, but they signal evolution rather than revolution. Instead of the end of rankings, we are witnessing their diversification. ROI-based rankings, outcome-focused measures, and reform initiatives like CoARA now coexist alongside traditional global league tables, each serving different audiences.
Skeptics may dismiss reform as “sour grapes,” yet the concerns CoARA raises about distorted incentives and narrow metrics are legitimate. At the same time, American resistance reflects both philosophical differences and the pragmatic advantages U.S. institutions enjoy under current systems.
The most likely future is a pluralistic landscape: research universities adopting CoARA principles internally while maintaining a presence in global rankings for visibility; career-focused institutions highlighting ROI and student outcomes; and students, faculty, and employers learning to navigate multiple sources of information rather than relying on a single hierarchy.
In an era when universities must demonstrate their value to society, conversations about how we measure excellence are timely and necessary. Whether change comes gradually or accelerates, the one-size-fits-all approach is fading. A more complex mix of measures is emerging—and that may ultimately serve students, institutions, and society better than the systems we are leaving behind. In the end, what many once described to me as a “necessary evil” may persist—but in a more balanced landscape where rankings are just one measure among many, rather than the single obsession that has dominated higher education for so long.
Dean Hoke is Managing Partner of Edu Alliance Group, a higher education consultancy. He formerly served as President/CEO of the American Association of University Administrators (AAUA). Dean has worked with higher education institutions worldwide. With decades of experience in higher education leadership, consulting, and institutional strategy, he brings a wealth of knowledge on colleges’ challenges and opportunities. Dean is the Executive Producer and co-host for the podcast series Small College America.
Claremont McKenna takes the top spot, while Barnard College, Columbia University, and Indiana University come in last.
166 of the 257 schools surveyed got an F for their speech climate.
For the first time ever, a majority of students would prevent speakers from both the left and right who express controversial views, ranging from abortion to transgender issues, from setting foot on campus.
WASHINGTON, D.C., Sept. 9, 2025—If America’s colleges could earn report cards for free speech friendliness, most would deserve an “F”—and conservative students are increasingly joining their liberal peers in supporting censorship.
The sixth annual College Free Speech Rankings, released today by the Foundation for Individual Rights and Expression and survey partner College Pulse, show a continued decline in support for free speech among all students, but particularly conservatives. Students of every political persuasion show a deep unwillingness to encounter controversial ideas. The survey, which is the most comprehensive look at campus expression in the country, ranked 257 schools based on 68,510 student responses to a wide array of free speech-related questions.
The rankings come at a notable moment for free speech on college campuses: clashes over the Israeli-Palestinian conflict, a vigorous and aggressive culture of student activism, and the Trump administration’s persistent scrutiny of higher education.
“This year, students largely opposed allowing any controversial campus speaker, no matter that speaker’s politics,” said FIRE President and CEO Greg Lukianoff. “Rather than hearing out and then responding to an ideological opponent, both liberal and conservative college students are retreating from the encounter entirely. This will only harm students’ ability to think critically and create rifts between them. We must champion free speech on campus as a remedy to our culture’s deep polarization.”
For the second time, Claremont McKenna has claimed the top spot in the rankings. Speech controversies at the highest-rated schools are rare, and their administrations are more likely to support free speech. The schools that improved their score the most, including Dartmouth College and Vanderbilt University, worked to reform their policies and recently implemented new programs that support free speech and encourage open discourse.
“Even one egregious anti-free speech incident can destroy students’ trust in their administration and cause a school to plummet in the rankings,” said FIRE Vice President of Research Angela C. Erickson. “If campus administrators, faculty, and students want to enjoy an atmosphere of trust on campus, they can start by protecting each other’s rights.”
Other key findings from the report include:
166 of the 257 schools surveyed got an F for their speech climate, while only 11 schools received a speech climate grade of C or higher.
Only 36% of students said that it was “extremely” or “very” clear that their administration protects free speech on campus.
A record 1 in 3 students now holds some level of acceptance—even if only “rarely”—for resorting to violence to stop a campus speech.
53% of students say that the Israeli-Palestinian conflict is a difficult topic to discuss openly on campus. On 21 of the campuses surveyed, at least 75% of students said this — including 90% of students at Barnard.
For the first time ever, a majority of students oppose their school allowing any of the six controversial speakers they were asked about onto campus — three controversial conservative speakers and three controversial liberal ones.
“More students than ever think violence and chaos are acceptable alternatives to peaceful protest,” said FIRE Chief Research Advisor Sean Stevens. “This finding cuts across partisan lines. It is not a liberal or conservative problem — it’s an American problem. Students see speech that they oppose as threatening, and their overblown response contributes to a volatile political climate.”
The Foundation for Individual Rights and Expression (FIRE) is a nonpartisan, nonprofit organization dedicated to defending and sustaining the individual rights of all Americans to free speech and free thought—the most essential qualities of liberty. FIRE recognizes that colleges and universities play a vital role in preserving free thought within a free society. To this end, we place a special emphasis on defending the individual rights of students and faculty members on our nation’s campuses, including freedom of speech, freedom of association, due process, legal equality, religious liberty, and sanctity of conscience.
It’s college rankings season again, a time of congratulations, criticism and, occasionally, corrections for institutions and the organizations that rate them.
Typically, U.S. News & World Report, the giant of the college rankings world, unranks some institutions months after its results are published, usually over data discrepancies that are the result of honest mistakes. But in rare instances, the erroneous data isn’t a mistake but outright fraud. And when that happens, it can result in soul-searching and, ideally, redemption for those involved.
That’s what happened at Temple University, which was rocked by a rankings scandal in 2018, when it became clear that Moshe Porat, the dean of Temple’s Richard J. Fox School of Business and Management, had knowingly provided false data to U.S. News for years in a successful effort to climb the rankings. Temple’s online master of business administration soared to No. 1—until the scheme was exposed. U.S. News temporarily unranked the program, the U.S. Department of Education hit Temple with a $700,000 fine and Porat was convicted of fraud.
Since then, Temple has worked hard to restore its reputation. In the aftermath of the scandal, officials imposed universitywide changes to how it handles facts and figures, establishing a Data Verification Unit within the Ethics and Compliance Office. Now any data produced by the university goes through a phalanx of dedicated fact-checkers, whether it’s for a rankings evaluation or an admissions brochure.
A Culture Shift
Temple’s Data Verification Unit was introduced in 2019 amid the fallout of the rankings scandal.
At first, it gave rise to “friction points,” as university officials were required to go through new processes to verify data before it was disseminated, said Susan Smith, Temple’s chief compliance officer. But now she believes the unit has won the trust of colleagues on campus who have bought in to more rigorous fact-checking measures.
“It’s been an incredibly positive thing for Temple and I think for data integrity over all,” Smith said.
Initially, Temple partnered with an outside law firm to verify data and lay the groundwork for the unit. Now that is all handled in-house by a small team that works across the university.
While Smith said “the vast majority of mistakes” she sees “are innocent,” her team is there “to act as a sort of backstop” and to “verify that the data is accurate, that there’s integrity in the data.”
The Data Verification Unit also provides training on best practices for data use and dissemination.
University officials believe placing the Data Verification Unit under the centralized Ethics and Compliance Office—which reports directly to Temple’s Board of Trustees—is unique. And some say the process has created a bit of a culture shift as they run numbers by the unit.
Temple spokesperson Stephen Orbanek, who joined the university after the rankings scandal, said running news releases by the Data Verification Unit represented a “total change” from the way he was accustomed to operating. And while it can sometimes slow down the release of certain data points or responses to media requests, he said he’s been able to give reporters more robust data.
He also noted times when Temple has had to pull back on marketing claims and use “less impressive” statistics after the Data Verification Unit flagged issues with materials. As an example, he cited a fact sheet put out by the university in which officials wanted to refer to Temple as a top producer of Fulbright scholars. But the Data Verification Unit insisted that a caveat was needed: The statistic pertained only to the 2022–23 academic year.
Ultimately, Orbanek sees the Data Verification Unit as a boon for a more transparent campus culture.
“The culture has just kind of shifted, and you get on board,” Orbanek said.
Other Rankings Scandals
Other universities have been less forthcoming about fixing their own data issues.
In 2022, a professor called out his employer, Columbia University, for submitting inaccurate data to U.S. News, which responded by unranking the institution for a short time. Following the scandal and accusations of fraud by some critics, Columbia announced the university would no longer submit data to U.S. News. Officials argued that the rankings have outsize influence on prospective students but don’t adequately measure institutional quality.
Yet Columbia still publishes large swaths of data, such as its Common Data Set. Asked how the university has acted to verify data in the aftermath of the rankings scandal, a spokesperson wrote by email that data is “reviewed by a well-established, independent advisory firm to ensure reporting accuracy” but did not respond to a request for more details on the verification processes.
The University of Southern California also navigated a rankings scandal in 2022. USC provided faulty data to U.S. News for its Rossier School of Education, omitting certain metrics, which helped it rise in the rankings, according to a third-party report that largely blamed a former dean.
U.S. News temporarily unranked Rossier; graduate students sued the university, accusing officials of falsely advertising rankings based on fraudulent data. That legal battle is ongoing, and earlier this year a judge ruled that the case can proceed as a class action suit.
Officials did not respond to a request from Inside Higher Ed for comment on whether or how USC has changed the way it verifies data for use in rankings or for other purposes.
U.S. News also did not respond to specific questions about if or how it verifies that information submitted by institutions to be used for ranking purposes is accurate. A spokesperson told Inside Higher Ed, “U.S. News believes that data transparency and internal accountability practices by educational institutions are good for those institutions and good for consumers.”
Universities across the MENA region have made significant strides in the latest 2026 QS World University Rankings (WUR), reflecting a sustained push to attract international institutions and students.
From a previous list of 88 institutions featured in the rankings last year, the number increased to a total of 115 in 2026, with the region’s most notable climb being that of King Fahd University of Petroleum and Minerals in Saudi Arabia, which entered the global top 100 at rank 67, a historic high for institutions in the region.
The 16-country MENA region also added 27 new entries from nine countries, second as a region only to Asia, which added 54 new institutions from 19 countries.
Among these, the University of Tripoli marked Libya’s debut in the QS WUR. Apart from Libya, only two other countries, Guatemala and Honduras, entered the rankings for the first time this year, each with one institution.
Year on year, some 53% of institutions in the MENA region either maintained or improved their global ranking, while only 23% saw a decline.
This is the lowest proportion of declining institutions among all global regions, outperforming Europe, where the maintain/improve versus decline rate stands at 52% to 44%, and Australia and New Zealand (AUNZ), where the rate is 36% to 61%.
The Gulf Cooperation Council (GCC) countries – Bahrain, Kuwait, Oman, Qatar, Saudi Arabia, and the United Arab Emirates – all share a common approach: making significant investments in research and education, aligned with bold national visions.
Collectively, GCC countries outperform the MENA region average across all nine QS World University Rankings indicators. Their institutions particularly excel under the global engagement lens, which looks at internationalisation indicators such as international faculty ratio (IFR), international student ratio (ISR), and international research network (IRN). This reflects their strong global appeal in attracting international talent and fostering cross-border academic collaboration.
Saudi Arabia leads MENA region
Among the top 25 countries by number of ranked institutions, Saudi Arabia leads the MENA region, with 22 universities featured in the QS WUR 2026 – six more than in 2024. The overall average score of Saudi institutions increased by 38%, from 20.7 to 28.5, over the past two editions.
These advancements are arguably a result of Saudi Arabia’s Vision 2030, under which the country has pledged to place at least five of its universities among the top 200 in international rankings, budgeting substantial funding for research, university-industry collaboration, and global partnerships.
The rankings come as Dubai expands its international branch campus ecosystem, aiming to host 50% international students by 2030 as part of its Education 33 strategy, positioning itself as an international education hub.
Qatar also finds itself in a similar position, as Qatar University moved up 10 places to reach 112 globally. The country’s investment in research infrastructure and faculty recruitment has improved its performance in citations per faculty – a key QS metric.
The Qatar National Vision 2030 aims to establish a world-class education system aligned with labour market needs, offering high-quality, accessible learning for all stages of life. It emphasises the development of independent and accountable institutions, robust public-private research funding, and active global engagement in cultural and scientific domains.
Meanwhile, outside the GCC, four other countries have shown particularly impressive performances: Egypt, Jordan, Iraq, and Lebanon. These countries rank among the top six in the MENA region in terms of ranked institutions, sharing the spotlight with Saudi Arabia and the United Arab Emirates.
According to QS’s Best Student Cities rankings, Jordan’s capital, Amman, is now the best student city in the Middle East. Additionally, Jordan saw multiple universities ranked in the WUR this year, with the University of Jordan, Jordan University of Science and Technology, and the German Jordanian University all improving on their previous positions.
While none have yet reached the global top 400, the country is investing in STEM-focused faculty and expanding regional collaborations, especially with the Gulf.
Meanwhile, Egypt now has 13 institutions featured in QS rankings, with Cairo University, Ain Shams University, and The American University in Cairo (AUC) leading the way.
And in Lebanon, the American University of Beirut remains the top Lebanese institution and one of the top institutions in the MENA region.
Despite geopolitical tensions in Lebanon, the Lebanese University (LU) delivered a surprise improvement, climbing from 577 globally in 2024 to 515 in the WUR 2026. And after placing around 701–710 globally in 2025, the Lebanese American University jumped to 535 on the 2026 list.
What’s next?
Stakeholders discussed the potential reasons why universities from the MENA region have shown such a marked jump in the rankings year on year.
“From my perspective, key drivers include stronger institutional strategies around internationalisation, improved research output, and increasing collaborations with global partners,” Salaheldin Mostafa Khalifa, academic quality assurance and institutional effectiveness specialist at Gulf Medical University, told The PIE News.
“We can expect continued upward momentum for MENA universities in global rankings. Many institutions are investing heavily in research infrastructure, international collaborations, and faculty development,” he added.
Meanwhile, QS broke down the “sustained progress” that universities in the region have seen over the past year.
“There are clear signs of upward momentum,” said Wesley Siquera, product and research advisor at QS, noting that the number of ranked MENA institutions had jumped from 84 to 115 between the QS WUR 2024 and 2026 editions.
“Finally, national development strategies provide strong indicators of where future progress may come from,” he added. “Several of the regional ‘visions’ explicitly set goals for placing domestic universities among the world’s top institutions. If these targets are met, we could see by 2030: three Omani universities in the top 500, five Saudi universities in the top 200, and seven Egyptian universities in the top 500.”
Columbia University has reached a $9 million settlement agreement with undergraduate students who alleged the institution deliberately submitted false information to U.S. News & World Report to artificially boost its college rankings position.
The preliminary settlement, filed last Monday in Manhattan federal court and pending judicial approval, resolves claims that Columbia misrepresented key data points to enhance its standing in the influential annual rankings. The university reached as high as No. 2 in the undergraduate rankings in 2022 before the alleged misconduct came to light.
Students alleged that Columbia consistently provided inaccurate data to U.S. News, including the false claim that 83% of its classes contained fewer than 20 students. The lawsuit argued these misrepresentations were designed to improve the university’s ranking position and, consequently, attract more students willing to pay premium tuition rates.
The settlement covers approximately 22,000 undergraduate students who attended Columbia College, the Fu Foundation School of Engineering and Applied Science, and the School of General Studies between fall 2016 and spring 2022.
The controversy began in July 2022 when Columbia mathematics professor Michael Thaddeus published a detailed analysis questioning the accuracy of data underlying the university’s No. 2 ranking. His report alleged that much of the information Columbia provided to U.S. News was either inaccurate or misleading.
Following the publication of Thaddeus’s findings, Columbia’s ranking plummeted to No. 18 in September 2022. The dramatic drop highlighted the significant impact that data accuracy has on institutional rankings and reputation.
In response to the allegations, Columbia announced in June 2023 that its undergraduate programs would withdraw from participating in U.S. News rankings altogether. The university cited concerns about the “outsized influence” these rankings have on prospective students’ decision-making processes.
“Much is lost when we attempt to distill the quality and nuance of an education from a series of data points,” Columbia stated in explaining its decision to withdraw from the rankings process.
While denying wrongdoing in the settlement agreement, Columbia acknowledged past deficiencies in its reporting practices. The university stated it “deeply regrets deficiencies in prior reporting” and has implemented new measures to ensure data accuracy.
Columbia now provides prospective students with information that has been reviewed by an independent advisory firm, demonstrating the institution’s commitment to transparency and accurate representation of its educational offerings.
Columbia’s decision to withdraw from U.S. News rankings reflects a growing skepticism among elite institutions about the value and impact of college ranking systems. Harvard and Yale have also stopped submitting data to U.S. News for various programs, signaling a potential shift in how prestigious universities approach rankings participation.
Under the terms of the agreement, student attorneys plan to seek up to one-third of the settlement amount for legal fees, which would leave approximately $6 million available for distribution among affected students. The settlement requires approval from a federal judge before taking effect.
Student lawyers characterized the accord as “fair, reasonable and adequate” given the circumstances of the case and the challenges inherent in proving damages from ranking manipulation.
By Somayeh Aghnia, Co-Founder and Chair of the Board of Governors at the London School of Innovation.
University rankings have long been a trusted, if controversial, proxy for quality. Students use them to decide where to study. Policymakers use them to shape funding. Universities use them to benchmark themselves against competitors. But in an AI-powered world, are these rankings still measuring what matters?
If we’ve learned anything from the world of business over the last decade, it’s this: measuring the wrong things can lead even the most successful organisations astray. The tech industry, in particular, has seen numerous examples of companies optimising for vanity metrics (likes, downloads, growth at all costs) only to realise too late that these metrics didn’t align with real value creation.
The metrics we choose to measure today will shape the universities we get tomorrow.
The Problem with Today’s Rankings
Current university ranking systems, whether national or global, tend to rely on a familiar set of indicators:
Research volume and citations
Academic and employer reputation surveys
Faculty-student ratios
International staff and student presence
Graduate salary data
Student satisfaction and completion rates
While these factors offer a snapshot of institutional performance, they often fail to reflect the complex reality of the world. A university may rise in the rankings even as it fails to respond to student needs, workforce realities, or societal challenges.
For example, graduate salary data may tell us something about economic outcomes, but very little about the long-term adaptability or purpose-driven success of those graduates or their impact on improving society. Research citations measure academic influence, but not whether the research is solving real-world problems. Reputation surveys tend to reward legacy and visibility, not innovation or inclusivity.
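To make the critique concrete, here is a minimal sketch of how a generic league-table composite is typically assembled: each indicator is normalized across institutions, then combined with fixed weights. The indicator names, weights, and data below are hypothetical, not any publisher’s actual methodology.

```python
# Illustrative sketch of a generic league-table composite score.
# Indicator names, weights, and data are hypothetical, not any
# publisher's actual methodology.

WEIGHTS = {
    "citations_per_paper": 0.30,
    "reputation_survey": 0.40,
    "faculty_student_ratio": 0.15,
    "international_mix": 0.15,
}

def min_max_normalize(values):
    """Rescale raw indicator values to 0-100 across all institutions."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [100.0] * len(values)
    return [100.0 * (v - lo) / (hi - lo) for v in values]

def composite_scores(institutions):
    """Return a weighted sum of normalized indicators per institution."""
    names = list(institutions)
    scores = dict.fromkeys(names, 0.0)
    for indicator, weight in WEIGHTS.items():
        raw = [institutions[n][indicator] for n in names]
        for n, norm in zip(names, min_max_normalize(raw)):
            scores[n] += weight * norm
    return scores

# Toy data: three invented institutions.
data = {
    "Alpha U": {"citations_per_paper": 12.1, "reputation_survey": 88,
                "faculty_student_ratio": 0.09, "international_mix": 0.35},
    "Beta U":  {"citations_per_paper": 9.4,  "reputation_survey": 95,
                "faculty_student_ratio": 0.12, "international_mix": 0.22},
    "Gamma U": {"citations_per_paper": 15.3, "reputation_survey": 61,
                "faculty_student_ratio": 0.07, "international_mix": 0.41},
}

for name, score in sorted(composite_scores(data).items(),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.1f}")
```

Whatever the exact weights, a fixed formula like this can only reward what its inputs encode; give reputation the largest weight, as many league tables do, and legacy and visibility compound year after year.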
In short, they anchor universities to a model optimised for the industrial era, not the intelligence era.
Ready for the AI Paradigm?
Artificial intelligence is a paradigm shift that is changing what we value in all aspects of life, and especially in higher education: how we define learning, what we want as an outcome, and how we measure success.
In a world where knowledge is increasingly accessible, and where intelligent systems can generate information, summarise research, and tutor students, the role of a university shifts from delivering knowledge or developing skills to curating learning experiences that develop human adaptability and prepare students, and society, for uncertainty.
This means the university of the future must focus less on scale, tradition, and prestige, and more on relevance, adaptability, and ethical leadership. These are harder to measure, but far more important. This demands a new value system. And with that, a new approach to how we assess institutional success.
What Should We Be Measuring?
As we rethink what universities are for, we must also rethink how we assess their impact. Inspired by the “measure what matters” philosophy from business strategy, we need new metrics that reflect AI-era priorities. These could include:
1. Adaptability: How quickly and responsibly does an institution respond to societal, technological, and labour market shifts? This could be measured by:
Curriculum renewal cycle: Time between major curriculum updates in response to new tech or societal trends.
New programme launches: Number and relevance of AI-, climate-, or digital economy-related courses introduced in the last 3 years.
Agility audits: Internal audits of response times to regulatory or industry change (e.g., how quickly AI ethics is integrated into professional courses).
Employer co-designed modules: % of programmes co-developed with industry or public sector partners.
2. Student agency: Are students empowered to shape their own learning paths, engage with interdisciplinary challenges, and co-create knowledge? This could be measured by:
Interdisciplinary enrolment: % of students engaged in flexible, cross-departmental study pathways.
Student-designed modules/projects: Number of modules that allow student-led curriculum or research projects.
Participation in governance: % of students involved in academic boards, curriculum design panels, or innovation hubs.
Satisfaction with personalisation: Student survey responses (e.g., NSS, internal pulse surveys) on flexibility and autonomy in learning.
3. AI and digital literacy: To what extent are institutions preparing their staff and their graduates for a world where AI is embedded in every profession? This could be measured by:
Curriculum integration: % of degree programmes with AI/digital fluency embedded as a learning outcome.
Staff development: Hours or participation rates in AI-focused CPD for academic and support staff.
AI usage in teaching and assessment: Extent of AI-enabled platforms, feedback systems, or tutors in active use.
Graduate outcomes: Employer feedback or destination data reflecting readiness for digital-first/AI-ready roles.
4. Contribution to local and global challenges: Are research efforts aligned with pressing societal needs that AI is amplifying, such as social justice or the AI divide? This could be measured by:
UN SDG alignment: % of research/publications mapped to UN Sustainable Development Goals.
AI-for-good projects: Number of AI projects tackling societal or environmental issues.
Community partnerships: Active partnerships with local authorities, civic groups, or NGOs on social challenges.
Policy influence: Instances where university research or expertise shapes public policy (e.g. citations in white papers or select committees).
5. Wellbeing and belonging: How well are staff and students supported to thrive, not just perform, within the institution? This could be measured by:
Staff/student wellbeing index: Use of validated tools like the WEMWBS (Warwick-Edinburgh Mental Wellbeing Scale) in internal surveys.
Use of support services: Uptake and satisfaction rates for mental health, EDI, and financial support services.
Sense of belonging scores: Survey data on inclusion, psychological safety, and campus climate.
Staff retention and engagement: Turnover data, satisfaction from staff pulse surveys, or exit interviews.
These are not soft metrics. They are foundational indicators of whether a university is truly fit for purpose in a volatile and AI-transformed world. You could call this a “University Fitness for Future Index”: a system that doesn’t rank but reveals how well an institution is evolving and, as a result, how well its academics, staff and students are adapting to a rapidly changing world.
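As a thought experiment, here is a minimal sketch of how such an index could report a per-dimension profile rather than a single ordinal rank. The five dimensions mirror the areas above; the 0–100 scores and the institution are invented for illustration.

```python
# Sketch of a "University Fitness for Future" profile: per-dimension
# scores are reported side by side rather than collapsed into one rank.
# Dimensions mirror the five areas above; all scores are invented.

from dataclasses import dataclass

DIMENSIONS = ("adaptability", "student_agency", "ai_digital_literacy",
              "societal_contribution", "wellbeing_belonging")

@dataclass
class FitnessProfile:
    institution: str
    scores: dict  # dimension name -> 0-100 score from its indicators

    def report(self):
        print(self.institution)
        for dim in DIMENSIONS:
            score = self.scores.get(dim, 0)
            print(f"  {dim:<22} {score:>3}  {'#' * (score // 10)}")

profile = FitnessProfile("Example University", {
    "adaptability": 72, "student_agency": 58, "ai_digital_literacy": 81,
    "societal_contribution": 64, "wellbeing_belonging": 69,
})
profile.report()
```

The design choice is the point: because no composite is computed, there is no single number to climb, only dimensions to improve.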
From Status to Substance
Universities must now face the uncomfortable but necessary task of redefining their identity and purpose. Those who focus solely on preserving status will struggle. Those who embrace the opportunity to lead with substance – authenticity, impact, innovation – have a chance to thrive.
AI will not wait for the sector to catch up. Students, staff, employers, and communities are already asking deeper questions: Does this university prepare me for an unpredictable future? Does it care about the society I will enter after graduation? Is it equipping us to lead with courage and ethics in an AI-powered world?
These are the questions that matter. And increasingly, they should be the ones that will shape how institutions are evaluated, regardless of their position in the league tables.
It’s time we evolve our frameworks to reflect what really counts, which will increasingly be defined by usefulness, purpose, and trust.
A Call for Courage
We are not simply in an era of change. We are in a change of era.
If we are serious about preparing our learners, and our society, for a world defined by intelligent systems, we must also be serious about redesigning the system that educates them.
That means shifting from prestige to purpose. From competition to contribution. From reputation to relevance.
Because the institutions that will lead the future are not necessarily those that top today’s rankings.
They are the ones willing to ask: what truly matters now, and are we brave enough to measure it?