Higher education has reached its canary-in-a-coal-mine moment: The recent resignation of the University of Virginia president under intense political pressure isn't just another leadership transition but an indicator of hazards ahead, with similar pressures mounting at George Mason University. Higher education governing boards cannot ignore these urgent warning signs, which signal peril for the governance structures that have supported our universities and colleges for centuries.
U.S. higher education is built on a unique model of governance in which independent citizen trustees exercise fiduciary oversight, set policy, safeguard institutional autonomy, support fulfillment of the mission and act in the best interests of the university or college as stewards of the public trust. This model of self-governance has preserved the academic freedom and driven the innovation that are hallmarks of U.S. higher education and that form the foundation of the profound societal impact and global prominence of the sector.
Today, this governance model faces significant disruption. At both public and private institutions, trustees are being encouraged by policy-driven think tanks to serve as ideological agents and interfere with management rather than act as true fiduciaries. This violation of institutional autonomy is destabilizing and harmful to governance, yielding fractured boards, diminished presidential authority, politicized decision-making, academic censorship and loss of public trust.
Boards must heed this warning and take a hard look at whether their decisions reflect independent judgment aligned with the institution's mission or, instead, the influence of external agendas. If governance fails, academic freedom is compromised, academic quality is weakened, public trust is eroded and the promise of U.S. higher education and its role in a democracy will disappear.
To guide boards in upholding institutional autonomy and mission stewardship, the Association of Governing Boards of Universities and Colleges recently launched the Govern NOW initiative with support from a Mellon Foundation grant. As part of this initiative, AGB developed a comparison of governance models and a checklist for governance integrity. These tools help board members distinguish between effective governance and ideologically driven overreach and provide a framework to assess their practices and recommit to their fiduciary responsibilities.
This is especially critical due to growing misinformation about the role of trustees. Without a true understanding of their responsibilities, they might act independently of board consensus, undermine governance norms, overstep management boundaries and pursue ideological agendas. These actions not only weaken governance by harming board cohesion and culture but also threaten the institutional stability and mission that trustees are charged to uphold.
This moment is not about partisan politics. It is about leadership and whether we will allow institutional governance to be hijacked by ideological conflict. At stake is the integrity of the governance system that has been the foundation on which the strength and distinction of U.S. higher education have been built.
To every trustee, I implore you to look inward. Ask whether your board is governing with independence and as stewards of mission and public trust. Use the tools AGB developed to evaluate your culture and boundaries. Engage in real dialogue with your president. Lead together with courage and clarity to secure higher education’s promise.
Ross Mugler is board chair and acting president and CEO of the Association of Governing Boards of Universities and Colleges.
By Rob Carthy, Director of International Development, Northumbria University.
Attending Duolingo’s inaugural DETcon London, I anticipated a day focused on the evolving landscape of language testing. What I experienced was a candid, and at times controversial, conversation about the geopolitical, political, and technological pressures facing UK higher education. Duolingo may have been the convenor, but the themes of the day went to the heart of the sector’s future.
Here are ten of my key takeaways from a thought-provoking day.
1. Crisis is the New Business-as-Usual
“Since I started, we’ve faced crisis after crisis.” These words from Katja Lamping, Director of Student Recruitment at UCL, resonated deeply. From the pandemic to the fall of Kabul, Ukraine, and Gaza, the past five years have demanded a level of institutional reactivity we’ve never seen before. The clear message was that this isn’t a blip. As former Home and Foreign Secretary Jack Straw bluntly put it, the first rule of preparing for uncertainty is to “expect that the unexpected is going to happen.” For university leaders, strategic planning now looks less like a road map and more like training for a cross-country race in the dark.
2. Agility is Now a Core Competency
The flip side of constant crisis is the need for agile solutions. We heard how the Duolingo English Test (DET) became a vital tool during the pandemic because it was accessible when physical test centres were not. This story is symbolic of a wider truth: our traditional processes and partnerships can be brittle. To keep our doors open to global talent, we must build resilience and responsiveness into our operations, from admissions to student support.
3. The Political Headwinds are Strengthening
Jack Straw's portrayal of the government's immigration white paper was sobering. He spoke of a view that some university business models are "not sustainable" and highlighted the belief in Westminster that the student visa route was being used as a "racket" for asylum claims. I may have disagreed with him – many in the room did – but this is a view held by many and a reality we must face.
Straw, and later in the day Rory Stewart, said the mood in Westminster is hardening regardless of who is in power. This government wants to bring down immigration. It's often said that when Brits call for this they don't mean students, but the stark reality is that the government wants to get the numbers down, and students are one of the easiest levers it can pull. We must be on the front foot, demonstrating our commitment to robust compliance and ethical recruitment, and articulating the immense value international students bring – a value HEPI's own research has quantified at a net £37.4 billion for the 2021/22 cohort alone.
4. Technology Can Deliver Both Access and Integrity
A powerful message came from Duolingo’s CEO and co-founder, Luis Von Ahn, who shared his personal story of growing up in Guatemala and seeing how English proficiency could transform a person’s life, yet how prohibitive the cost of testing was. His core argument was that technology should be a democratising force. But most compellingly, he tackled the security question head-on. He argued that far from being less secure, an AI-powered test can offer greater integrity than a traditional test centre. The ability to use AI to monitor hundreds of behaviours simultaneously—from eye-gaze to keystroke patterns—in addition to human proctors, presented a powerful case that a digital-first approach doesn’t have to mean a compromise on security; it could, in fact, mean the opposite.
5. The Biggest Risk of AI Might Be Inaction
In a fascinating session on technology, Dr. Laura Gilbert OBE of the Tony Blair Institute offered a powerful counter-narrative to the usual fears around AI. She argued that the biggest risk might be “not doing it at all.” While we worry about academic integrity, we risk missing the opportunity to use AI to solve our biggest challenges, from relieving the administrative burden in admissions to revolutionising personalised learning. Her point that technology like AI is essential to sustaining public services like the NHS has direct parallels with the financial challenges in our own sector.
6. Trust in Technology Must Be Earned
Dr. Gilbert was clear: you cannot just demand trust in AI. It must be earned through what she called "radical transparency." For universities adopting tools like the DET for high-stakes admissions, this is a critical lesson. We must demand evidence from our tech partners that their tools are secure, equitable, and have been rigorously evaluated for bias. Publishing that evidence, as Duolingo was highlighted for doing, should become the industry standard.
A crucial warning from Dr. Gilbert was that if left to market forces, AI will inevitably make the advantaged more advantaged, worsening societal inequality. For a sector committed to widening participation, this is a profound challenge. As we adopt AI, we must actively and consciously steer it towards closing, not widening, access gaps. The goal of using technology to reach students from previously untapped regions is a noble one, but it requires a constant and active focus on equity.
7. AI Isn’t Just A Buzzword
It’s a transformative force in assessment, personalisation and inclusion. I was struck by Duolingo’s mission-led approach, especially its ability to deliver high-quality, low-cost English testing to learners across the globe, including refugee and displaced students. Innovations like adaptive testing, AI-driven speaking practice, and real-time fraud detection redefine what “secure” and “authentic” assessment looks like. The session challenged my assumptions about test integrity and proved that democratisation doesn’t mean compromise. The balance between rigour and compassion resonated strongly—proof that access and excellence can coexist. At Northumbria, we’re increasingly mindful of our role in enabling fairer pathways into UK higher education. The Duolingo English Test is no longer a disruptor—it’s fast becoming a vital enabler that we should all be paying attention to.
8. Is it a Brave New World?
Rory Stewart’s session offered a powerful analysis of our turbulent times, contrasting the post-1989 era of liberal democracy and globalisation with our current “shadow world” of populism, protectionism, and a retreat from a rules-based international order. Stewart highlighted key shifts, including China’s rise without democratisation, the 2008 financial crisis, and the chaotic impact of social media and failed interventions in Iraq and Afghanistan. Stewart warned that global trends, like Trump’s attacks on US universities, could easily manifest in the UK, emphasizing that “what happens in the US can come here”. These attacks include significant funding cuts and threats to academic freedom over perceived ideological biases. This serves as a stark reminder for UK higher education to remain vigilant against similar political pressures.
9. A Little Can Go A Long Way
The session on the carbon cost of testing made me sit up. John Crick from the International Education Sustainability Group (IESG) revealed that switching from test centre-based exams to online alternatives can cut carbon emissions by up to 98% – the equivalent of planting a Sherwood Forest of trees every year. The analysis showed that the biggest environmental impact comes from travel to test centres, especially in regions without local provision. What struck me was how easily we overlook this area in our sustainability strategies. IESG's meticulous modelling gives us a much-needed baseline to challenge assumptions and examine the unintended carbon consequences of our English language policies. It's a conversation starter – but one we in international education need to have now if we're serious about meeting climate goals.
10. Our Soft Power is Precious, But Not Guaranteed
The conference ended with a discussion on the UK's soft power and the launch of Duolingo's Welcome Project, which seeks to provide students displaced by the turmoil in the US with opportunities in the UK.
While our leading universities remain beacons of global influence, the day’s discussions made it clear this cannot be taken for granted. A domestic political narrative focused on clamping down on immigration, combined with financial models that are visibly creaking, risks tarnishing one of the UK’s greatest exports. We must collectively find a way to reconcile the need for control and sustainability with the projection of being an open, welcoming, and world-leading destination for education.
Rob Carthy is the Director of International Development at Northumbria University, where he leads the university’s international student recruitment strategies. He is focused on developing a sustainable and diverse international community in Newcastle upon Tyne. Find him on LinkedIn.
For the last 15 years, science teacher Jeff Grant has used information on climate change from the federal website Climate.gov to create lesson plans, prepare students for Advanced Placement tests and educate fellow teachers. Now, Grant says, he is “grabbing what [he] can” from the site run by the National Oceanic and Atmospheric Administration’s Climate Program Office, amid concerns that the Trump administration is mothballing it as part of a broader effort to undermine climate science and education.
“It’s just one more thing stifling science education,” said Grant, who teaches at Downers Grove North High School in the Chicago suburbs.
Since early May, all 10 editorial contributors to Climate.gov have lost their jobs, and the organization that produces its education resources will soon run out of money. On June 24, the site's homepage was redirected to NOAA.gov, a change NOAA said was made to comply with an earlier executive order on "restoring gold standard science." Those steps follow many others the president has taken to dismantle federal efforts to fight climate change, which his administration refers to as the "new green scam."
Former employees of Climate.gov and other educators say they fear that the site, which will no longer produce new content, could be transformed into a platform for disinformation.
“It will make it harder for teachers to do a good job in educating their students about climate change,” said Glenn Branch, deputy director of the nonprofit National Center for Science Education. “Previously, they could rely on the federal government to provide free, up-to-date, accurate resources on climate change that were aimed at helping educators in particular, and they won’t be able to do so if some of these more dire predictions come to pass.”
Such concerns have some foundation. For example, Covid.gov, which during the Biden administration offered health information and access to Covid-19 tests, has been revamped to promote the controversial theory that the coronavirus was created in a lab. The administration has also moved aggressively to delete from government sites other terms that are currently out of favor, such as references to transgender people that were once on the National Park Service website of the Stonewall National Memorial, honoring a major milestone in the fight for LGBTQ+ rights.
Kim Doster, director of NOAA’s office of communications, declined to answer specific questions but shared a version of the statement posted on the NOAA website when Climate.gov was transferred. “In compliance with Executive Order 14303, Restoring Gold Standard Science, NOAA is relocating all research products from Climate.gov to NOAA.gov in an effort to centralize and consolidate resources,” it says.
Related: Want to read more about how climate change is shaping education? Subscribe to our free newsletter.
Climate.gov, founded in 2010 to support earth science instruction in schools, had become a go-to site for educators and the general public for news and information about temperature, sea level rise and other indicators of global warming.
For many educators, it has served a particularly key role. Because its materials are free, they are vital in schools that lack resources and funding, teachers and experts say.
Rebecca Lindsey, Climate.gov’s lead editor and writer, was one of several hundred NOAA probationary employees fired in February, then rehired and put on administrative leave, before being terminated again in March. The rest of the content production team — which included a meteorologist, a graphic artist and data visualizers — lost their jobs in mid-May. Only the site’s two web developers still have their jobs.
A screenshot of the Sea Level Rise Viewer, an interactive NOAA tool that's listed as a resource on Climate.gov, a government climate website. Credit: NOAA Office for Coastal Management
Lindsey said she worries that the government “intended to keep the site up and use it to spread climate misinformation, because they were keeping the web developers and getting rid of the content team.”
In addition, the Climate Literacy and Energy Awareness Network, the official content provider for the education section of the site, has not received the latest installment of its three-year grant and expects its funds to run out in August.
“We won’t have funding to provide updates, fix hyperlinks and make sure that new resources are being added, or help teachers manage or address or use the resources,” said Anne Gold, CLEAN’s principal investigator. “It’s going to start deteriorating in quality.”
CLEAN, whose website is hosted by Carleton College, is now searching for other sources of money to continue its work, Gold said.
With the June 24 change redirecting visitors from Climate.gov to NOAA.gov/climate, the website for the first time falls under the purview of a political appointee: Doster. Its previous leader, David Herring, is a science writer and educator.
Melissa Lau, an AP environmental science teacher in Piedmont, Oklahoma, said the relocated site was “really difficult to navigate.”
As someone who lives in Tornado Alley, Lau said, she frequented CLEAN and NOAA sites to show her students localized, real-time data on storm seasons. She said she is concerned that teachers won’t have time to track down information that was shifted in the website’s move and, as a result, may opt not to teach climate change.
The executive order on “restoring gold standard science” that appears to have triggered the shift gives political appointees the authority to decide what science information needs to be modified to align with its tenets.
While the disclaimer posted to NOAA.gov seems to imply that Climate.gov did not meet this requirement, educators and researchers said that the site and its CLEAN education resources were the epitome of a gold standard.
“I want to stress that the reason why CLEAN is considered the gold standard is because we have such high standards for scientific accuracy, classroom readiness and maintenance,” Gold said. “We all know that knowledge is power, and power gives hope. … [Losing funding] is going to be a huge loss to classrooms and to students and the next generation.”
This is only the latest attack by the Trump administration on education around climate change. This month, the U.S. Global Change Research Program’s website, GlobalChange.gov, was shut down by the administration, after the program was defunded in April. The website once hosted an extensive climate literacy guide, along with all five iterations of the National Climate Assessment — a congressionally required report that informed the public about the effects and risks of climate change, along with local, actionable responses.
The Department of Commerce, which oversees NOAA, has cut other federal funding for climate research, including at Princeton University, arguing that these climate grant awards promoted “exaggerated and implausible climate threats, contributing to a phenomenon known as ‘climate anxiety,’ which has increased significantly among America’s youth.”
“The more you know [about climate change], the more it’s not a scary monster in the closet,” said Lauren Madden, professor of elementary science education at the College of New Jersey. “It’s a thing you can react to.” She added, “We’re going to have more storms, we’re going to have more fires, we’re going to have more droughts. There are things we can do to help slow this. … I think that quells anxiety, that doesn’t spark it.”
And climate education has broad public support — about 3 in 4 registered voters say schools should teach children about global warming, according to a 2024 report from the Yale Program on Climate Change Communication. Similarly, 77 percent of Americans regard it as very or somewhat important for elementary and secondary school students to learn about climate change, according to a 2019 study. And all but five states have adopted science standards that incorporate at least some instruction on climate change.
As a result, many science teachers rely on federal tools and embed them in their curriculum. They are worried that the information will no longer be relevant, or disappear entirely, according to Lori Henrickson, former climate integration specialist for Washington state’s education department. Henrickson, who lost her job this June as the result of state budget cuts, was in charge of integrating climate education across content areas in the state, from language arts to physical education.
The .gov top-level domain connotes credibility and accessibility, according to Branch: “It is also easier for teachers facing or fearing climate change denial backlash to cite a reliable, free source from the federal government.”
With Climate.gov’s future uncertain, educators are looking to other resources, like university websites and tools from other countries.
“I’m sure there will continue to be tools, and there will be enough people who will be willing to pay to access them,” Madden said. But, she added, “they probably won’t be as comprehensive, and it won’t feel like it’s a democratic process. It’ll feel like: If you or your employer are willing to chip in for it, then you’ll have access.”
“I feel like with all the federal websites, I’m constantly checking to see what’s still up and what’s not,” Madden said.
Bertha Vazquez, education director for the Center for Inquiry, an organization that works to preserve science and critical thinking, said she worried that the disappearance of climate information could leave U.S. students behind.
“The future of the American economy is not in oil, the future of the American economy is in solar and wind and geothermal. And if we’re going to keep up with the international economy, we need to go in that direction,” she said. But while the U.S. should be leading the way in scientific discovery, Vazquez said, such work will now be left to other countries.
Lau said she felt helpless and frustrated about Climate.gov’s shutdown and about the “attack on American science in general.”
“I don’t know what to do. I can contact my legislators, but my legislators from my state are not going to be really open to my concerns,” she said. “If students next year are asking me questions about [science research and funding], I have to tell them, ‘I do not know,’ and just have to leave it at that.”
Institutions need to optimize their website content for AI-powered search results.
Search is dramatically evolving – and fast. Generative AI (Gen AI), and especially Large Language Models (LLMs), is completely reshaping how information is processed, synthesized, and delivered. This changes how prospective students are influenced and impacts your institution's visibility.
For today’s prospective college students, “search” is far more than a simple tool to find the best university; it’s what they do first to find the information they need during all stages of their college journey. In a world overflowing with options, your university’s visibility and prominence in these evolving search results can be the deciding factor in whether you’re even on a student’s radar.
I’ve recently had several conversations with university leaders, and one thing is clear: maximizing discoverability in this new, AI-powered era is top of mind. This blog is a direct result of those conversations and aims to cut through the noise to explain the why, what, and how of AI-driven student search.
The WHY: Generative AI-Powered Search
Generative AI (Gen AI) is the powerful application of machine learning that is transforming how information is created, compiled, and presented. Gen AI’s ability to create, summarize, and discover new information is precisely why it has become so crucial to modern online searching.
At the core are LLMs like ChatGPT and Gemini that are trained on massive amounts of data to understand and generate human-like language. These models enable students to ask complex, conversational questions like: “Which MBA program is best if I’m working full-time and want to study online?”
LLMs understand the intent behind that question – not just the words. That's a huge leap from traditional keyword-based search. And Gen AI pulls from a vast range of sources, summarizing information and delivering fast, context-rich answers with relevant links.
The WHAT: AI is Transforming Student Search
1. The rise of conversational search
Search is no longer just about typing in a few keywords and scrolling through results. Today's prospective students are asking real questions, using full sentences, and expecting immediate, tailored answers, whether they're asking Gemini, Siri, or ChatGPT.
In 2025, over 20% of the global population is already using voice assistants like Siri and Alexa to search. Many of these searches read like natural conversations – they're specific, urgent, and detailed. That means your website content needs to be structured to answer these questions directly and naturally.
Your content needs to do more than just match keywords; it needs to thoroughly and thoughtfully answer the actual questions behind what students ask the Gen AI tool. Otherwise, your university could remain hidden from the Gen AI tools students use most.
2. Google AI overviews and AI mode: A new front page
Google's AI Overviews (AIOs) are fundamentally changing the Search Engine Results Page (SERP) and content visibility. Instead of showing multiple blue links, AIOs serve up AI-generated summaries right at the top, pulling from multiple sources and citing them directly. If your content is cited in the summary, your visibility increases. If not, you might be left out entirely.
Soon, these summaries will also carry paid ads. As announced at Google Marketing Live 2025, ads will appear directly within AI Overviews, creating new high-visibility placements that are essential for maintaining paid visibility. You need to start planning to include AIO ads as part of your paid media strategy. Visibility is no longer just about bidding on keywords – it's about being where the AI puts attention.
3. The accuracy challenge with LLMs
LLMs, which are also the technology powering AI Overviews, are powerful, but not perfect. They generate answers quickly, but if they lack real-time data, they can “hallucinate” or produce outdated info. Think of it this way—while your institutional content can become part of an AI’s knowledge base, the accuracy and strength of the AI’s responses are heavily dependent on your website’s structure for AI discoverability and the completeness and timeliness of your content on key pages like academic programs, faculty profiles, research archives, financial aid and student success stories.
However, students do not just take a summary at face value. They want to dive deeper. Interestingly, AI assistants often pull from forums like Reddit or Quora. That's a signal: clarity, authenticity, and helpfulness now compete with traditional authority. If your content sounds genuinely human and directly answers real student questions, it's more likely to be cited by these tools and trusted by prospective students.
We are firmly in the age of Search + Chat. For universities' content creators and marketing teams, this means adapting your strategy to a hybrid model where optimizing for both traditional search engines and AI citations is crucial. It's no longer only about ranking high on Google; it's also about being part of the conversation students are having with AI.
Just as prompt engineers craft inputs for LLMs, your content needs to “prompt” search AI effectively. This means creating well-structured, meaningful content that makes it easy for AI to understand and cite your information. This adds a layer of sophistication to content optimization, moving us toward what some call Generative Engine Optimization (GEO). Think of it as SEO, reimagined for an AI-first search environment.
Top 5 GEO Strategies You Can Focus on Now
1. Topic-focused content
Move beyond a narrow focus on program names to cover broader topics comprehensively, addressing full student intent. For example, instead of just "Best Online MBA," create content around "Which MBA program is the best while balancing a full-time job?" or "career paths in business analytics" or "balancing graduate studies with work." This helps AI understand the full context, making your university's degree program content relevant for diverse student queries.
2. Answer-focused structure
Use short, digestible sections with clear, question-based headings. For example, “When are the application deadlines for fall 2025?” or “How do I schedule a campus tour?” Include plain-text facts and data-driven claims (e.g., graduate employment rates, program rankings, faculty research impact). Content with specific data is 40% more likely to appear in LLM responses.
3. Build authority (E-E-A-T)
AI models favor content that signals Experience, Expertise, Authoritativeness, and Trust. For universities, this means transparently displaying faculty qualifications, publishing original research and program rankings, and highlighting alumni success through testimonials. Strong E-E-A-T signals trustworthiness to AI, which is crucial for students making significant educational decisions. This isn't just for humans; it's how AI decides your credibility.
4. Structured data and schema markup to speak AI’s language
Think of schema markup as a universal translator for your website. It’s code you add to your pages that tells AI models and search engines what specific pieces of information mean, not just what they say. For example:
You can mark up your academic programs as “Courses,” detailing credit hours, learning outcomes, and faculty.
Your events (like campus tours or info sessions) can be identified as “Events” with dates, times, and locations.
Faculty profiles can be marked as “Persons,” highlighting their name, title, department, and research interests.
Testimonials can be flagged as “Reviews,” complete with star ratings and reviewer names.
Why this matters: When AI understands the precise context of your content, it can extract accurate information more effectively. This dramatically boosts your visibility in AI Overviews, rich snippets, and voice search.
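To make this concrete, here is a minimal sketch of what Course markup could look like, written in Python purely for illustration. The program details, institution name, and URL are hypothetical placeholders, and the exact properties you use should be checked against schema.org's Course documentation rather than copied from this example.

```python
import json

# Hypothetical example: describe an online MBA as a schema.org "Course" so
# crawlers can parse key facts about the program. All values are placeholders.
course_markup = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "Online MBA",
    "description": "A part-time, fully online MBA designed for working professionals.",
    "provider": {
        "@type": "CollegeOrUniversity",
        "name": "Example University",
        "url": "https://www.example.edu",
    },
}

# The resulting JSON-LD string would be embedded in the program page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(course_markup, indent=2))
```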
5. AI crawler accessibility
For AI models to learn from your website, they first need to be able to “read” it. This means ensuring your university’s websites and program pages are fully accessible to AI crawlers.
Check your robots.txt file: This file tells web crawlers (including those used by AI) which parts of your site they can and cannot access. Make sure it’s not inadvertently blocking important academic programs, admissions details, or faculty research sections.
Handle JavaScript-heavy elements: Many modern university sites use JavaScript for interactive elements like program finders, application portals, or dynamic course catalogs. If not set up correctly, AI crawlers might not “see” the content generated by this JavaScript. Consider Server-Side Rendering (SSR) or Static Site Generation (SSG) to ensure this critical content is visible to crawlers.
If AI crawlers can't access your academic program content, it won't be discoverable by AI-powered search.
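As a quick illustration of that robots.txt check, the sketch below uses Python's standard-library robots.txt parser to test whether a given crawler may fetch a program page. The site URLs and crawler tokens here are assumptions for the example; verify the user-agent names your AI partners actually publish before relying on them.

```python
from urllib import robotparser

# Hypothetical URLs; replace with your institution's own.
ROBOTS_URL = "https://www.example.edu/robots.txt"
PAGE_URL = "https://www.example.edu/programs/online-mba"

# Example crawler tokens ("*" is the generic fallback rule). These names are
# illustrative assumptions; confirm the exact tokens each AI crawler publishes.
CRAWLERS = ["*", "GPTBot", "Google-Extended"]

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetch and parse the live robots.txt

for agent in CRAWLERS:
    status = "allowed" if parser.can_fetch(agent, PAGE_URL) else "blocked"
    print(f"{agent}: {status} for {PAGE_URL}")
```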
Final Thoughts: Show Up Where It Counts
The AI-driven evolution of student search isn't a distant prediction – it's happening now. My conversations with campus partners consistently confirm this: AI isn't replacing traditional student search, but profoundly reshaping how students search, find, trust, and act on information.
The smartest path forward isn’t choosing between Google and AI chat tools. It’s using both. This is a powerful convergence where AI assistants deliver fast, personalized insights, while Google Search provides foundational depth, structure and authority.
Ask yourself: Is your content part of that journey? Is it fresh, factual, and findable in both AI and traditional search? For higher ed marketing and enrollment management professionals seeking to make a lasting impact, the answer is clear: Be the answer in both places.
At RNL, we’re committed to helping universities stay discoverable throughout the entire funnel—from awareness to inquiry to application and enrollment. We care deeply about the student journey too, and we know how critical it is for students to find the right-fit institutions at the right time. That’s why we stay agile—continuously evolving our strategies to meet students where they are and help institutions show up early, stay relevant, and convert when it counts.
Talk with our digital marketing and enrollment experts
RNL works with colleges and universities across the country to ensure their digital marketing is optimized and filling their academic programs. Reach out today for a complimentary consultation to discuss:
Search engine optimization
Digital marketing
Lead generation
Digital engagement throughout the enrollment funnel
The UCAS 30 June application deadline is the last point at which an applicant can apply outside of Clearing.
Though most applications (particularly from UK 18 year olds) happen by the January deadline, the June figures allow for a complete analysis of application behaviour in the UCAS main scheme.
The number of 18 year old UK applicants has reached a record high of 328,390 (up 2.2 per cent on last year) – with the total number of applicants at 665,070 (up 1.3 per cent on last year).
Application rates
As always it is salutary to compare the often-pushed narrative that young people are being tempted away from expensive/poor-value/woke (delete as per your personal preference) higher education with the actuality that numbers are rising. You could even be tempted to imagine what the application rates might be like in a sector with a realistic student maintenance offer.
I mention application rates because this is what declinist commentators will seize on. For UK domiciled 18 year old applicants, the application rate is 41.20 per cent – down from 41.80 per cent last year. This fall is visible across most measures of deprivation: in England, for example, every IMD quintile but quintile 5 (the least disadvantaged) sees a falling application rate.
In part, this could be a function of another year where the dominance of higher tariff providers in driving applications has increased: higher tariff providers disproportionately inspire applications from (and recruit) better off young people.
This chart shows the number of applications to each of three tariff groups. For UK 18 year olds the default is fast becoming an application to a high tariff provider. We don’t (unfortunately) get application numbers by deprivation and tariff group.
One innovation in this year’s release is information on offer rates – the proportion of applications that result in an offer being made. We get three years of data, which demonstrate that offer rates are rising across the sector – and that (as you may expect) high tariff providers are less likely to make offers than lower tariff providers. The growth among high tariff providers is driven both by rising application numbers and a rising offer rate.
For believers in the other recruitment myth (that universities load up on international students and are less keen to take even very able home students) we get a timely corrective. It turns out that 98.5 per cent of UK 18 year old applicants have an offer, compared with 89.7 per cent of international applicants.
Finally, it's always fascinating to look at applications by subject area – a plot by CAH1 groups shows a sharp rise in the popularity of business, subjects allied to medicine, engineering, and law, with an intriguing drop in applications to computing subjects. There may be a generative AI effect on computing applications – the rise of "vibe coding" and other uses of agents in software development may mean that the attraction of learning to program computers properly is waning.
That’s the best explanation I have – and it is curious that law (a domain where predictions of AI tools eating entry level roles are ten a penny) doesn’t appear to be experiencing a similar phenomenon.
Less than a decade after their introduction, degree apprenticeships have become a significant feature of higher education provision across the United Kingdom. Despite this shared initiative, institutions in England, Scotland, Wales, and Northern Ireland continue to operate largely independently.
This separation has resulted in a fragmented landscape that undermines opportunities for mutual learning and improvement. The absence of sustained dialogue means each nation continues to trial and refine its own approach in relative isolation, an approach that leaves apprentices short-changed.
If we want better outcomes for everyone involved, we need to stop running four parallel experiments and start talking to one another.
As a consortium of educational leaders committed to work-based higher education across the UK, we've collectively observed a concerning trend during our extensive engagement with employers, universities, and apprentices: the persistent siloing of knowledge and practice between our four nations. While Scotland has established its graduate apprenticeship programme, England has developed its degree apprenticeships framework, and both Wales and Northern Ireland have implemented their own distinct approaches. Despite facing remarkably similar implementation challenges, there remains a troubling lack of systematic knowledge-sharing and collaborative learning across these national boundaries.
Enhanced cross-border collaboration could lead to better outcomes for institutions, apprentices, and employers alike, preventing duplication of efforts and fostering collective improvements based on shared experiences.
Diverse approaches
Each UK nation has developed its distinct approach to integrating apprenticeships within higher education, despite common policy objectives and implementation challenges.
In 2024, the Labour government announced the Institute for Apprenticeships and Technical Education (Transfer of Functions etc) Bill, paving the way for the establishment of Skills England. Previously, employers defined apprenticeship standards, with apprentices required to dedicate at least 20 per cent of their training time away from the workplace, concluding with an end-point assessment. The new legislation gives the government powers to bypass employer groups in designing and approving standards and apprenticeship assessment plans, a move argued to make the skills system more "agile" to employer needs and to allow Skills England to become central to Labour's five missions.
In Scotland, graduate apprenticeships managed by Skills Development Scotland similarly prioritise employer involvement. However, Scotland employs a more centrally controlled skills system, directly influencing university offerings through funded apprenticeship places. This approach is further reinforced by the Tertiary Education and Training (Funding and Governance) (Scotland) Bill – introduced in February 2025 – which centralises responsibility for the delivery and funding of apprenticeships within the Scottish Funding Council. By consolidating these responsibilities, the bill aims to enhance system efficiency, transparency, and alignment with the Scottish labour market, thereby facilitating improved outcomes for learners and employers.
Wales introduced a novel structure by establishing the Commission for Tertiary Education and Research (Medr), a single governing body overseeing the entire tertiary education sector, including apprenticeships. This model represents a significant structural departure from other nations.
Northern Ireland’s strategy aligns apprenticeships with broader economic ambitions, specifically targeting a transformation to a “10X economy” by 2030. Apprenticeships play a pivotal role in this ambitious economic development strategy, not merely seen as educational pathways, but as strategic instruments for workforce development and sectoral transformation.
Shared challenges, isolated solutions
Despite the distinct policy approaches, institutions in each nation encounter remarkably similar operational difficulties. Institutions consistently face challenges integrating workplace experiences within academic curricula, navigating multiple regulatory frameworks, and establishing comprehensive support mechanisms for apprentices. These recurring issues highlight a fundamental inefficiency: duplicated efforts across borders without coordinated learning.
For instance, Middlesex University’s Sustainable Degree Apprenticeships report identifies common struggles across the UK, particularly with managing supernumerary positions for nursing apprentices and reconciling workplace assessments with academic expectations.
The widespread nature of these issues emphasises the potential value of a collective, cross-border approach to sharing effective strategies and solutions.
Exemplifying untapped collaborative potential is the University of the West of Scotland’s (UWS) approach to graduate apprenticeships. UWS’ graduate apprenticeship business management programme has introduced dedicated “link tutors” who act as a consistent point of contact for both apprentices and employers. These tutors navigate the complex relationship between universities and employers, support apprentices in managing the demands of full-time work alongside academic study and help ensure alignment between on-the-job experience and academic outcomes. For apprentices who have struggled in more traditional learning environments, this targeted, consistent support has been especially impactful.
The UWS example points to a broader truth – that the success of degree apprenticeships depends not just on academic content or employer engagement, but on the quality of the relationships built around the apprentice. UWS link tutors demonstrate what is possible when those relationships are given structure and sustained attention. However, without mechanisms for knowledge-sharing across the UK, such practices risk becoming isolated successes rather than the foundation for a more consistent and effective system.
Barriers to effective collaboration
The persistence of fragmentation across the UK is not accidental but reinforced by several systemic barriers. Firstly, the varied regulatory and quality assurance frameworks across each nation create natural divisions. These distinct regulations complicate collaborative efforts and reinforce separation.
Competition among institutions for apprenticeships and employer partnerships further discourages cooperation. Institutions often perceive cross-border collaboration as potentially undermining competitive advantage, despite the potential long-term benefits of shared knowledge. Divergent policy frameworks across the four nations intensify these tensions. Employers operating across England, Scotland, Wales and Northern Ireland face significant challenges navigating inconsistent apprenticeship standards, funding mechanisms, and regulatory requirements, thereby limiting the scale and effectiveness of apprenticeship programmes and potentially undermining broader national objectives of skills development and economic growth.
Additionally, frequent policy shifts undermine the stability required for effective collaborative planning. Institutions, wary of unpredictable policy changes, prefer short-term, autonomous strategies rather than investing in potentially unstable cross-border collaborations.
And the absence of structured platforms for meaningful cross-border exchange remains a significant barrier. Resource constraints, particularly staff workloads and budgetary limitations, frequently hinder the capacity of institutions to engage in sustained, meaningful dialogue with counterparts in other UK regions. This lack of institutional infrastructure and resourcing limits the development of collaborative practices essential for a cohesive UK-wide degree apprenticeship ecosystem.
The imperative for collaborative platforms
Addressing these barriers requires deliberate action to create structured, cross-border collaborative forums. Recent informal discussions among apprenticeship providers across the UK indicate widespread acknowledgment of these missed collaborative opportunities. Academics frequently express frustration about facing common challenges without access to shared resources or systematic opportunities to learn from peers in other parts of the UK. This is despite frequent calls from the sector.
What is lacking is a coordinated infrastructure that supports regular exchange of pedagogical models, assessment strategies, and institutional policies. Cross-nation working groups, joint practitioner networks, and shared digital platforms could help bridge this divide. These would not only allow for the exchange of effective practice but also aid in the development of more consistent approaches that benefit apprentices and employers alike.
The challenge is not a lack of innovation, but a lack of connection. Many institutions already possess effective, well-tested solutions to the very problems others are still grappling with. Without formal channels to communicate these solutions, valuable knowledge remains isolated and difficult to access. If higher education institutions across the UK are to realise the full potential of degree apprenticeships, they must find ways to turn informal acknowledgement into formal collaboration.
The benefits of greater cross-border collaboration are substantial. Institutions could significantly improve the quality of apprenticeship programmes by collectively addressing shared challenges. Enhanced efficiency could reduce duplication of effort, allowing institutions to focus resources more strategically and effectively.
Moreover, apprentices themselves stand to gain significantly. Improved programme coherence, stemming from collective learning, could ensure apprentices receive uniformly high-quality education and training, irrespective of their geographic location.
Employers – essential stakeholders in apprenticeship programmes – would similarly benefit from improved programme consistency and quality. Collaborative cross-border dialogue could help standardise employer expectations and streamline their participation across multiple jurisdictions.
A collective future
Degree apprenticeships represent a substantial collective investment aimed at reshaping higher education and addressing key skills shortages within the UK economy. Apprentices at the heart of this initiative deserve integrated, high-quality experiences informed by the best practices and shared knowledge of institutions across the entire UK.
Institutions and policymakers must therefore commit to overcoming existing fragmentation by prioritising structured cross-border collaboration. This approach not only maximises the effectiveness of the significant resources already committed but also establishes a more coherent, effective educational framework for future apprentices.
Ultimately, collaboration among UK higher education institutions represents not only good educational practice but a strategic imperative, ensuring that apprenticeships fully realise their potential as transformative educational opportunities.
Our apprentices deserve better than four parallel experiments. They deserve the best of what all four nations have learned. It’s time we started talking to each other.
As we pass the 30 June deadline for this year’s undergraduate admissions cycle, UCAS’ data offers an early view of applicant and provider behaviour as we head into Confirmation and Clearing. It also marks a personal milestone for me, as it’s my first deadline release since rejoining UCAS. I wanted to take a deeper look at the data to reflect on how much things have changed since I worked here 10 years ago.
Applicant demand has always been shaped by two key elements: the size of the potential applicant pool, and their propensity to apply. Since I last worked at UCAS in 2016, these two factors have continuously interchanged over the better part of the past decade – sometimes increasing or decreasing independently but often counterbalancing each other. Let’s take a look at how things are shaping up this year.
Overall, by the 30 June deadline there have been 665,070 applicants (all ages, all domiciles) this year, compared to 656,760 (+1.3%) in 2024. This is an increase of over 64,000 applicants since UCAS last reported in January, although the profile of these additional applicants is very different. At the January Equal Consideration Deadline (ECD), over half of the total number of applicants were UK 18-year-olds, who are the most likely group to have applied by that stage in the cycle. They represent just 8% of the additional applicants since January, which include a much larger proportion of UK mature and international students.
As we saw at January, the differences in demand for places between young people from the most advantaged (POLAR4 Quintile 5) and most disadvantaged (POLAR4 Quintile 1) areas at June remain broadly the same as last year – with the most advantaged 2.15 times more likely to apply to HE than those from the least advantaged backgrounds, compared to 2.17 last year.
UK 18-year-old demand
Demand for UK higher education (HE) has long been shaped by the 18-year-old population – the largest pool of applicants. Despite the well-known challenges facing the HE sector at present, at the 30 June deadline we see record numbers of UK 18-year-old applicants, with 328,390 applicants this year – up from 321,410 (+2%) in 2024. This trend was almost entirely locked in by the January deadline, given the vast majority of UK 18-year-old applicants have applied at this stage in the cycle.
During my previous tenure at UCAS, the size of the UK 18-year-old population had been falling year on year but from 2020, it began to increase. This continued growth drives the increase in UK 18-year-old applicant numbers we have observed in recent cycles. But when we look at their overall application rate to understand the strength of demand among this group, the data shows a marginal decline again this year – down to 41.2% from 41.9% in 2024. The historically strong growth in the propensity of UK 18-year-olds to apply for HE, which we’ve observed across the last decade, has clearly plateaued.
This could be due to a range of factors, such as young people choosing to take up work or an apprenticeship, or financial barriers. We know that cost of living is increasingly influencing young people’s decisions this year, with pre-applicants telling us that financial support – such as scholarships or bursaries – ranks as the second most important consideration for them (46%), followed closely by universities’ specific cost-of-living support (34%).
Interesting to note is the number of UK 19-year-old applicants. When separating the data to distinguish 19-year-olds applying for the first time (as opposed to those reapplying), there has been a decent increase – from 46,680 last year to 48,890 this year (+4.7%). For many years, the number of first-time UK 19-year-old applicants had been falling year on year, but since 2023 this trend has started to reverse. This suggests that demand among young people may be holding up as they decide to take a year out before applying to university or college.
Mature students
For UK mature students (aged 21+), the picture looks very different. The number of mature students applying to university or college ebbs and flows depending on the strength of the job market, so since I was last at UCAS, we have typically seen applications decrease when employment opportunities are strong and vice versa. Alongside fluctuations linked to the employment market, rising participation at age 18 means there is a smaller pool of potential older applicants who have not already entered HE. The falling demand from mature students continues in 2025, although in recent years there have been small but significant increases in the volume of mature applicants applying after the 30 June deadline and directly into Clearing.
As of this year's 30 June deadline there have been 86,310 UK mature (21+) applicants, compared to 89,690 (-3.8%) in 2024, meaning a fall in demand compared to the previous year at this point in the cycle for the fourth year in a row. However, whereas at the January deadline mature applicants were down 6.4% compared to the same point last year, at June the figure is only 3.8% down, showing some recovery in the numbers. This is another indication that mature students are applying later in the cycle. While it remains too early to say whether we will see continued growth in mature direct to Clearing applicants in 2025, last year 9,390 UK mature students who applied direct to Clearing were accepted at university or college, an increase of 7.4% on 2023 and 22.7% higher than 2022.
International students
When looking at the UCAS data through the lens of international students, the landscape has changed significantly since 2016. Brexit led to a sharp decline in EU applicants, offset by strong growth elsewhere; the pandemic disrupted international student mobility; and we've seen intensified global competition, shifting market dynamics and geopolitics increasingly influencing where students choose to study. This year we're seeing growth once more, with 138,460 international applicants compared to 135,460 in 2024 (+2.2%) – although this stood at +2.7% at January. It should be noted that UCAS sees only a partial view of undergraduate international admissions (we tend to get a more complete picture by the end of the cycle) and we don't capture data on postgraduate taught and research pathways.
Interest among Chinese students in UK education has held firm since my time at UCAS, and this year we’re seeing a record number of applicants from China – 33,870, up from 30,860 (+10%) in 2024. This year’s data also shows increases in applicants from Ireland (6,060 applicants, +15%), Nigeria (3,170 applicants, +23%) and the USA (7,930 applicants, +14%).
Offer-making
We are releasing a separate report on offer-making this year, alongside the usual data dashboard for applications. This additional data covers offers and offer rates over the past three years, from the perspective of applicants according to their age and where they live, and from the perspective of providers by UK nation and tariff group.
What we're seeing as the natural consequence of increased applications this year is an uplift in offers. Universities have made more offers than ever before this year, with 2.0 million main scheme offers to January deadline applicants overall, largely driven by the rise in UK 18-year-old applicants (who are the most likely to use their full five choices when applying). This record high surpasses the previous peak of 1.9 million offers set last year (+3.8%).
While the main scheme offer rate has increased across all provider tariff groups, the most notable uplift is for higher tariff providers – up 3.2 percentage points to 64.4% this year. Despite the increase, higher tariff providers' offer rates still remain the lowest, partly because they are the most selective institutions. Offer rates at medium and lower tariff providers have also increased, by 0.9 percentage points to 77.0% among medium tariff providers, and by 1.5 percentage points to 81.7% among lower tariff providers. This means that, among those who applied by the Equal Consideration Deadline in January, 72.5% of main scheme applications received an offer this year, also a record high, and 1.8 percentage points higher than in 2024.
It’s worth noting that we’ll be updating our provider tariff groupings in time for the 2026 cycle, to reflect changes in the higher education landscape.
Looking ahead
For students who are intent on going to university or college, this is a very good year, with more opportunities than ever before. A record 94.5% of students who applied by the January deadline will be approaching the critical summer period having received at least one offer. High levels of offer-making by universities and colleges typically translate into more acceptances, which should give applicants plenty of confidence heading into results day.
I’m delighted to be back at UCAS, and my team will continue to dig further into the data as Confirmation and Clearing draws nearer to see how demand translates into accepted places come results day.
UCAS
UCAS, the Universities and Colleges Admissions Service, is an independent charity, and the UK’s shared admissions service for higher education.
UCAS’ services support young people making post-18 choices, as well as mature learners, by providing information, advice, and guidance to inspire and facilitate educational progression to university, college, or an apprenticeship.
UCAS manages almost three million applications, from around 700,000 people each year, for full-time undergraduate courses at over 380 universities and colleges across the UK.
UCAS is committed to delivering a first-class service to all our beneficiaries — they’re at the heart of everything we do.
Today’s prospective students live in a world shaped by search queries, digital experiences, instant communication, and high expectations. They demand speed, personalization, and clarity from their first interaction. For institutions that want to grow, scale, and compete, relying on spreadsheets or legacy databases is no longer sustainable.
You need a system that works as hard as your team does. One that doesn’t just manage applicants, but empowers strategy, fosters connection, and drives retention.
That’s the promise of a modern Enrollment Management System (EMS), but only if you choose the right one.
What Is an Enrollment Management System?
An Enrollment Management System is more than a tool for admissions; it’s a digital backbone for your recruitment, application, and onboarding processes. Think of it as an intelligent, data-powered engine that drives student acquisition and supports institutional growth goals.
While many systems include basic applicant tracking and form building, a true EMS integrates across departments, touching admissions, marketing, student services, financial aid, and beyond. It’s designed to give your team a real-time view of the applicant pipeline while also enabling automation, analytics, and multichannel communication.
Example: Mautic by HEM is a dedicated, all-in-one CRM and marketing automation platform for education, built on the open-source Mautic tool. It facilitates thorough applicant tracking by letting schools define custom stages and funnels for the enrollment journey: admissions teams can monitor each contact’s progress through stages (inquiry, application, accepted, etc.) and even apply lead scoring to prioritize the most engaged prospects.
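As a rough sketch of what stage tracking and lead scoring mean in practice, consider the toy Python below. The stage names and point values are assumptions for illustration, not Mautic by HEM’s actual data model or API.

# Illustrative sketch; stage names and point values are assumptions.
FUNNEL_STAGES = ["inquiry", "application_started", "application_submitted",
                 "accepted", "enrolled"]

ENGAGEMENT_POINTS = {  # hypothetical scoring weights
    "email_opened": 1,
    "email_clicked": 3,
    "campus_event_attended": 10,
    "application_started": 15,
}

def lead_score(events):
    """Sum the points for each recorded engagement event."""
    return sum(ENGAGEMENT_POINTS.get(event, 0) for event in events)

contact = {
    "name": "Sample Prospect",
    "stage": "application_started",
    "events": ["email_opened", "email_clicked", "campus_event_attended",
               "application_started"],
}
assert contact["stage"] in FUNNEL_STAGES

contact["score"] = lead_score(contact["events"])
print(contact["stage"], contact["score"])  # higher scores get prioritized follow-up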
The best platforms don’t just move information. They orchestrate outcomes.
A modern EMS aligns your people, data, and processes so that your team spends less time chasing forms and more time building relationships. It adapts to your enrollment strategy, whether that’s growing international reach, increasing diversity, boosting conversion, or all of the above.
What Does an Enrollment System Do?
While “enrollment management” is often associated with software, it’s fundamentally a strategic function, and the right EMS becomes a catalyst for that function to succeed.
Here’s how:
It streamlines student recruitment and admissions, enabling your team to launch campaigns, collect inquiries, and track applicant engagement without toggling between multiple platforms. From inquiry to enrollment, every stage is logged, measured, and improved.
Example: Tools like TargetX make it easy to launch campaigns, track lead engagement, and move prospects from inquiry to enrollment. TargetX is built on Salesforce and tailored for higher education, especially career colleges that need efficient outreach.
It enables marketing and communications teams to segment audiences, trigger campaigns, and personalize outreach across email, text, and student web portals, all with full visibility into what converts.
Example: EMS platforms such as Finalsite Enrollment combine CRM and marketing automation to segment audiences and personalize outreach across email, SMS, and web. Designed for independent K–12 schools, Finalsite ensures your message resonates from the first click.
It supports financial aid and yield strategy by syncing with your student information system (SIS) or CRM. That means your staff can track aid packages, award statuses, and net tuition impact, all within the same ecosystem.
Example: Integrated EMS like Anthology allows institutions to view aid packages, tuition forecasts, and academic data in one place. Anthology is especially powerful for institutions with complex admissions models or rolling start dates.
It strengthens student retention by providing advisors with access to academic history, risk indicators, and automated nudges that support at-risk students from the very start of their academic journey.
Example: By giving advisors access to risk flags and real-time data, platforms like Salesforce Education Cloud enable timely interventions that support students long after they’ve enrolled.
And most importantly, it delivers data analysis and forecasting that lets institutional leaders plan with precision. From demographic breakdowns to conversion rates, it provides insight into not just who applied but why they enrolled.
Example: With advanced analytics, tools like Technolutions Slate offer actionable insights into yield, demographics, and conversion rates, helping you refine your enrollment strategy over time.
What is the point of strategic enrollment management? Strategic enrollment management (SEM) aligns an institution’s recruitment, admissions, retention, and graduation strategies with its long-term goals, using data and coordinated planning to optimize student success and institutional sustainability. An effective EMS ensures that your strategic enrollment plan becomes an operational reality: daily, seamlessly, and at scale.
Core Features to Look For in an EMS
1. Centralized Database and CRM
A unified database helps you keep track of every applicant and their journey, from first interactions and form submissions to document uploads and communication history. Look for systems that include robust CRM tools with inquiry tracking, source attribution, and segmentation capabilities.
Example: TargetX (Liaison) provides a single dashboard with a 360° view of each student, consolidating everything from event registrations and communication touchpoints to financial aid information in one place. This unified database supports data-driven decision-making in recruitment and admissions.
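A minimal sketch, in Python with hypothetical field names, of what a single record per applicant makes possible: source attribution (which channels produce applicants) and segmentation (building a targeted list from one query).

from collections import Counter

# Hypothetical unified applicant records; field names are illustrative.
applicants = [
    {"id": 1, "source": "school_fair", "program": "Nursing",  "stage": "inquiry"},
    {"id": 2, "source": "paid_search", "program": "Business", "stage": "applied"},
    {"id": 3, "source": "paid_search", "program": "Nursing",  "stage": "applied"},
]

# Source attribution: which channels are producing applicants?
by_source = Counter(a["source"] for a in applicants)

# Segmentation: e.g. Nursing prospects still at the inquiry stage.
nursing_inquiries = [a for a in applicants
                     if a["program"] == "Nursing" and a["stage"] == "inquiry"]

print(by_source)          # Counter({'paid_search': 2, 'school_fair': 1})
print(nursing_inquiries)  # candidates for a targeted follow-up campaign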
2. Online Applications and Applicant Portals
Choose a system with customizable forms, document upload functionality, e-signature support, and user-friendly applicant portals. Features like drag-and-drop form builders and application status tracking can greatly improve the experience for both students and staff.
Example: Classe365 supports paperless admissions with custom online application forms. Students can easily apply from home, and submitted form data is automatically mapped into the school’s SIS to avoid manual re-entry. This makes the entire application-to-enrollment workflow smooth and efficient for both applicants and staff.
3. Automated, Personalized Communication
A strong student enrollment management system allows you to send personalized, automated messages via email, SMS, or in-app notifications. You should be able to build workflows: for example, a welcome message on inquiry, a reminder to complete an application, or an invitation to an open house. Some systems even offer AI chatbots for 24/7 engagement.
Example: Mautic by HEM features built-in email and text messaging automation, enabling schools to send personalized emails or SMS updates triggered by prospect behavior.
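The kind of behavior-triggered workflow described above can be reduced to a handful of rules. A minimal sketch in Python; the triggers and template names are assumptions, not any vendor’s actual workflow engine.

# Illustrative rules: trigger -> message template (names are assumptions).
WORKFLOW_RULES = [
    {"when": "inquiry_received",       "send": "welcome_email"},
    {"when": "application_incomplete", "send": "completion_reminder_sms"},
    {"when": "open_house_announced",   "send": "open_house_invitation"},
]

def messages_for(event, prospect_name):
    """Return the personalized messages a single event should trigger."""
    return [f"Send '{rule['send']}' to {prospect_name}"
            for rule in WORKFLOW_RULES if rule["when"] == event]

print(messages_for("application_incomplete", "Sample Prospect"))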
4. Workflow Automation and Task Management
Look for features that reduce manual work: automatic task assignment, follow-up reminders, and to-do lists. These help your admissions team stay on top of deadlines and reduce errors.
Example: Blackbaud Enrollment Management allows schools to tailor their admissions process with configurable workflows and checklists in one centralized system. Staff have personalized task dashboards, and the system automatically triggers next steps, sending follow-up reminders, updating statuses, or notifying counselors based on defined rules. This saves time and keeps the team on schedule.
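As a sketch of what automated task assignment looks like under the hood, the Python below creates a review task from a defined rule. The assignment table and the three-day deadline are assumptions for illustration, not Blackbaud’s configuration.

from datetime import date, timedelta

# Hypothetical rule: every submitted application gets a review task,
# assigned by program and due within three days.
ASSIGNMENTS = {"Nursing": "counselor_a", "Business": "counselor_b"}

def create_review_task(application):
    return {
        "applicant_id": application["id"],
        "assigned_to": ASSIGNMENTS.get(application["program"], "admissions_team"),
        "due": date.today() + timedelta(days=3),
        "action": "review application and update status",
    }

task = create_review_task({"id": 42, "program": "Nursing"})
print(task)  # an EMS would surface this on the counselor's task dashboard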
5. Integration with Your Existing Systems
Your EMS should integrate with your SIS, LMS, financial software, and marketing tools. Data should flow without duplication. Look for open APIs or pre-built integrations with platforms you already use.
Example: Slate supports bi-directional data exchange with campus systems. It can push and pull data to external SIS, LMS, financial aid systems, content management systems, and more via its Integration Center. This means application data or status updates in Slate can automatically appear in the SIS, and vice versa, ensuring consistency across all systems.
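In practice, bi-directional exchange usually means systems calling each other’s web APIs. The sketch below, in Python with the requests library, shows the general shape of pushing a status update to an SIS; the endpoint, token, and payload fields are entirely hypothetical, not Slate’s real Integration Center interface.

import requests

# Hypothetical SIS endpoint and credentials; placeholders only.
SIS_API_URL = "https://sis.example.edu/api/v1/applicants"
API_TOKEN = "replace-with-a-real-token"

def push_status_update(applicant_id, new_status):
    """Send an applicant status change to the SIS so both systems agree."""
    response = requests.patch(
        f"{SIS_API_URL}/{applicant_id}",
        json={"admission_status": new_status},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()  # fail loudly if the sync did not succeed
    return response.json()

# Usage (would require a real endpoint): push_status_update("A-1024", "accepted")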
6. Reporting and Predictive Analytics
Analytics tools allow you to track conversion rates, demographic trends, and recruitment performance. Some EMS platforms even offer predictive analytics to identify at-risk applicants or forecast yield.
Example: TargetX goes beyond basic reporting by incorporating predictive analytics features. It includes a Prospect Scoring tool that lets schools apply tailored scoring models to their applicant pool. This means the system can automatically evaluate and rank prospective students based on likelihood to enroll (or other success indicators), helping admissions teams focus their efforts on the best-fit leads. Of course, standard reports and real-time dashboards are also available in TargetX for monitoring application trends and campaign performance at a glance.
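As a toy illustration of what yield forecasting involves, the Python below ranks prospects by an estimated probability of enrolling. The features and weights are invented for the example and are not TargetX’s Prospect Scoring model.

import math

# Invented coefficients for a toy logistic model of enrollment likelihood.
WEIGHTS = {"visited_campus": 1.2, "completed_fafsa": 0.9, "in_state": 0.5}
BIAS = -1.5

def enroll_probability(prospect):
    z = BIAS + sum(w for feature, w in WEIGHTS.items() if prospect.get(feature))
    return 1 / (1 + math.exp(-z))

prospects = [
    {"name": "P1", "visited_campus": True,  "completed_fafsa": True, "in_state": True},
    {"name": "P2", "visited_campus": False, "completed_fafsa": True, "in_state": False},
]

ranked = sorted(prospects, key=enroll_probability, reverse=True)
for p in ranked:
    print(p["name"], round(enroll_probability(p), 2))

# Expected yield for the pool = sum of individual probabilities.
print("expected enrollments:", round(sum(enroll_probability(p) for p in prospects), 1))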
7. Customization and Scalability
No two schools are the same. Ensure your EMS allows you to customize application workflows, add custom fields, configure user roles, and scale as your institution grows.
Example: Slate, a cloud-based SaaS platform, is designed to “scale seamlessly” with an institution’s growth. All technical infrastructure is managed in modern, secure data centers, and Slate regularly updates with new features at no extra cost. This means an organization can start small and trust that Slate will accommodate more applicants, programs, or campuses over time without needing a major system overhaul. In short, EMS vendors focus on both customization (to meet unique local needs) and scalability (to support more users, records, and features as needed).
8. Ease of Use
Adoption hinges on usability. During demos, pay attention to how intuitive the interface is for both staff and applicants. If the system is difficult to use, your team simply won’t use it to its full potential.
Example: User experience drives adoption. During evaluations, platforms like Classe365 and Class by Infospeed regularly earn praise for intuitive interfaces, which is important when your team has limited tech support.
9. Mobile Accessibility
Modern students (and parents) expect mobile-friendly platforms. Responsive design or dedicated mobile apps improve application completion rates and accessibility.
Example: Slate is entirely web-based and built with responsive HTML5 design, so all end-user interfaces are mobile-ready by default. Admissions officers and applicants can access Slate “anytime, anywhere,” and the system is compatible with iOS, Android, and other modern smartphones without any special app required.
10. Security and Compliance
Data privacy is critical. Look for FERPA, GDPR, or other compliance features, along with role-based access controls, encryption, and regular security audits.
Example: Slate emphasizes that security is an “absolute commitment.” Slate encrypts all data in transit and at rest, and is fully compliant with regulations including PCI-DSS, NACHA, FERPA, GDPR, ADA Section 508, and more. Each client institution’s data is siloed in its own private database, and features like single sign-on integration and multi-factor authentication are supported, all to protect sensitive student information.
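Role-based access control, one of the safeguards listed above, comes down to checking a user’s role before exposing a record or action. A minimal sketch in Python; the roles and permissions are assumptions for illustration.

# Hypothetical role -> permission mapping.
PERMISSIONS = {
    "admissions_officer": {"view_application", "update_status"},
    "financial_aid":      {"view_application", "view_aid_package"},
    "student_worker":     {"view_application"},
}

def can(role, action):
    """Return True only if the role has been granted the action."""
    return action in PERMISSIONS.get(role, set())

assert can("financial_aid", "view_aid_package")
assert not can("student_worker", "update_status")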
How to Choose the Right System: The Smart Institution’s Guide
Too often, institutions jump into vendor demos before clearly understanding their own needs. But choosing an EMS isn’t like buying a software license. It’s like hiring a new department, one that will touch nearly every part of your student journey.
Too many schools choose an EMS the way they might buy a printer—look at features, pick the cheapest, hope for the best.
That’s a mistake.
Here’s how to do it right:
1. Audit Your Current Process
Bring your admissions, marketing, IT, and registrar teams together. Map the journey from first touch to enrolled student. Identify bottlenecks, duplicate data entry, communication gaps, and missed opportunities.
Ask:
Where are we losing leads?
What’s manual that should be automated?
What data are we not capturing?
Example: EMS tools like LeadSquared often shine here by centralizing fragmented workflows.
Example: If your institution works with international agents, Class by Infospeed is built for managing agent relationships and complex course offerings, a crucial feature for language schools and ESL programs.
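To make the “where are we losing leads?” question concrete, a simple funnel calculation on stage counts shows where drop-off is steepest. The Python below uses made-up numbers purely for illustration.

# Hypothetical counts from one recruitment cycle.
funnel = [("inquiry", 5000), ("application_started", 2100),
          ("application_submitted", 1400), ("accepted", 900), ("enrolled", 430)]

for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    drop = 100 * (1 - next_count / count)
    print(f"{stage} -> {next_stage}: {next_count}/{count} carried over "
          f"({drop:.0f}% drop-off)")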
3. Involve the People Who Will Use It
Admissions staff. Recruiters. Advisors. These are the people who will live in the system every day. Their input is gold. Make them part of demos. Let them ask tough questions. Choosing a solution like SchoolMint, praised for its intuitive design, becomes easier when usability is prioritized.
4. Research Vendors Strategically
Not all systems serve all markets equally. Some are better for K-12. Others shine in graduate admissions. Some are strong in portfolio management; others in agent tracking.
Look for:
Reviews from schools like yours
Live or recorded demos
Transparent pricing models
Implementation timelines
Shortlist 3–5 vendors. Your shortlist should reflect your institution’s specific context. For graduate schools, Liaison CAS platforms are especially effective. For community colleges, TargetX offers a powerful combination of CRM and enrollment tools without requiring heavy configuration.
5. Evaluate Integration and Migration
Ask each vendor:
How do you integrate with our SIS, LMS, and payment gateways?
Can you support our CRM, or replace it?
How will you handle data migration?
Do you offer API access?
A disconnected EMS is a ticking time bomb. Ask vendors like Technolutions Slate or Salesforce Education Cloud about APIs and migration support—they’re known for smooth onboarding and flexibility.
6. Test the User Experience
Never buy blind. Ask for a sandbox account or personalized demo. Simulate key tasks: submitting an application, assigning leads, pulling a report. Include both staff and mock student journeys.
What feels intuitive? What’s clunky? What’s fast?
Your system is only as good as the people who use it.
7. Scrutinize Support and Training
Great technology without support is useless. Ask:
Who handles onboarding?
Is training included or extra?
What’s your support SLA?
Can we talk to a current client?
Look for a partner, not just a vendor. Look to vendors like Anthology, which are known for offering detailed implementation timelines, role-based training, and strong post-launch support.
8. Evaluate Total Cost and ROI
Look beyond license fees. Consider:
Implementation and training costs
User seat pricing
Support packages
Future upgrade fees
Opportunity cost of inefficiency
For example, Classe365 offers bundled modules that can be more cost-effective for institutions seeking an all-in-one platform.
Then flip the question:
How much time, enrollment yield, and data quality could we gain?
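One way to flip the question is a back-of-the-envelope comparison of total annual cost against estimated gains. Every figure in the Python below is invented purely to show the arithmetic; substitute your own numbers.

# All numbers are hypothetical, for illustration only.
annual_costs = {
    "license": 30_000, "implementation_amortized": 10_000,
    "training": 5_000, "support_package": 6_000,
}
total_cost = sum(annual_costs.values())

extra_enrollments = 25            # assumed lift in yield from better follow-up
net_tuition_per_student = 8_000   # assumed average net tuition revenue
staff_hours_saved = 1_200         # assumed automation savings
hourly_cost = 35

gain = extra_enrollments * net_tuition_per_student + staff_hours_saved * hourly_cost
roi = (gain - total_cost) / total_cost
print(f"annual cost {total_cost:,} vs estimated gain {gain:,} -> ROI {roi:.0%}")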
What to Avoid: Mistakes That Derail Enrollment Success
Let’s be clear: choosing the wrong EMS won’t just slow you down; it can undermine your enrollment goals for years.
Common mistakes include:
Prioritizing brand over fit. The best-known system is not always the best match for your institution’s size, staff capacity, or audience.
Skipping the discovery phase. Without understanding your real process needs, you risk choosing a tool that solves the wrong problems.
Overcomplicating the solution. Feature-rich platforms are great—if your team has the time and training to use them. Don’t choose complexity over usability.
Neglecting integration. A system that doesn’t talk to your CRM or SIS will create data silos and extra work.
Ignoring security and compliance. Your EMS will hold sensitive student data. Ensure it meets regulatory requirements like FERPA or GDPR, and ask vendors for proof of their data protection protocols.
Leaving end-users out of the process. If admissions and marketing staff don’t weigh in, you may end up with a system that leadership likes, but staff resents.
Rushing implementation. A fast deployment might sound appealing, but skipping onboarding, testing, and training will lead to low adoption and missed ROI.
A better approach? Take your time. Do the homework. Involve your people. And choose a system that solves your real problems, not just your imagined ones.
A Strategic Investment, Not Just a Tech Upgrade
The right Enrollment Management System is more than a technology purchase. It’s a strategic accelerator. When implemented well, it becomes the operating system for your admissions engine, fueling smarter campaigns, stronger applicant engagement, faster decision-making, and ultimately, better student outcomes.
Institutions that invest intentionally in their EMS see tangible results: higher yield rates, improved retention, deeper applicant insights, and more efficient operations. They don’t just fill classes, they shape them.
But none of this happens by chance. It requires a clear vision, a methodical evaluation, and a commitment to ongoing improvement.
Partnering for Enrollment Success
Choosing an EMS is just the beginning. Implementing it well and aligning it with your enrollment strategy require experience, insight, and a steady hand.
That’s where Higher Education Marketing (HEM) comes in. We’ve helped institutions across Canada and beyond design, implement, and optimize enrollment solutions that work. Whether you need a student-facing CRM portal, a smarter communication strategy, or guidance on vendor selection, our team can help.
Book a free consultation with HEM today, and let’s build an enrollment strategy that’s as forward-thinking as your institution. Because better tools don’t just make your job easier, they make your goals achievable.
Need help sorting through the multitude of enrollment management systems to find the right one for your school? Contact HEM today for more information.
Frequently Asked Questions
Question: What is an enrollment management system?
Answer: An Enrollment Management System is more than a tool for admissions; it’s a digital backbone for your recruitment, application, and onboarding processes.
Question: What does an enrollment system do?
Answer: It streamlines student recruitment and admissions, enabling your team to launch campaigns, collect inquiries, and track applicant engagement without toggling between multiple platforms.
Question: What is the point of strategic enrollment management?
Answer: The point of strategic enrollment management (SEM) is to align an institution’s recruitment, admissions, retention, and graduation strategies with its long-term goals, using data and coordinated planning to optimize student success and institutional sustainability.
Depending on the era in which we began learning, formally or informally, we each bring different, valuable definitions of and perspectives on artificial intelligence as it pertains to teaching and learning.
Teaching and learning with AI
When I was an elementary school student in the 1980s, “AI” meant using a calculator for arithmetic rather than longhand or a traditional adding machine. Additionally, the entire school had only six computers for student use, which were housed in the library. Only students acting responsibly earned time on these devices to play The Oregon Trail, “an educational game that simulates the hardships of [1848] …”
With all of this in mind, AI has been teaching us, and we’ve been learning from it, for quite some time, in and out of school.
However, with the advancement of generative AI, the implications for teaching and learning now center on academic integrity, and academic dishonesty policies about original work vs. AI in education have entered the conversation. This is where MindTap features like Turnitin can be applied to help monitor students’ acceptable and ethical use of AI in English composition courses.
My conversation with students
My students may engage in conversations about acceptable, ethical uses of GenAI and academic integrity before they even enroll in my courses, because I post the policies in my syllabus. Students learn that there is a monitoring system in place in MindTap for English, powered by Turnitin. Once students are enrolled, we discuss these policies at length in both online and face-to-face modalities. The policies are also copied into each of the writing assignments in MindTap. Our focus is on ethics, or academic integrity, to ensure students’ coursework is original. The goal is to give students valuable feedback, information and resources so they learn and progress rather than simply chase a grade.
Since students cannot demonstrate learning and mastery of learning outcomes unless their work is original, I discuss with them, and state in each assignment, that they should not submit words that are not their own. MindTap provides me with access to Turnitin to monitor academic integrity.
Suggestions for monitoring
To help monitor students’ use of AI, parameters should be set in MindTap for English with Turnitin. For example, students need to submit more than 300 words for the detector to run. Once students submit work, the detector generates an originality report, which can be downloaded to give the instructor and the learner feedback on the percentage of the submission flagged for AI use or plagiarism.
[Screenshot: the inbox, where the similarity percentage can be viewed and clicked for expanded, detailed information.]
The report highlights, directly on the student’s document, the passages where originality is in question. Some instructors also set percentage parameters, instructing students that no more than 15% of a submission can be flagged by the detector in MindTap. Clicking on highlighted text shows the possible source the information may have been taken from, or indicates more generally that AI has been used. Note: this is only a monitoring system, so please be mindful that the report is a tool instructors can use to open conversations with their students. We cannot accuse a student of academic dishonesty based on a report alone.
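To make those parameters concrete, here is a small illustrative sketch in Python. The thresholds mirror the examples above, but the function itself is mine, not a MindTap or Turnitin feature, and it only flags submissions for a conversation, never an accusation.

MIN_WORDS = 300          # the detector needs at least this many words to run
MAX_SIMILARITY = 15      # example course policy: review reports above 15%

def review_submission(word_count, similarity_percent):
    """Return a note for the instructor; a flag means 'talk with the student'."""
    if word_count < MIN_WORDS:
        return "Too short for the detector; ask the student to expand the draft."
    if similarity_percent > MAX_SIMILARITY:
        return (f"{similarity_percent}% flagged: review the highlighted matches "
                "and have a conversation; the report alone is not proof.")
    return "Within policy; no follow-up needed."

print(review_submission(word_count=850, similarity_percent=22))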
[Screenshot: the match overview with all flagged passages, which may reflect plagiarism or AI use; each match can be clicked and expanded to show the original source.]
MindTap’s monitoring system has always proved accurate for me, but conversations are still beneficial for assurance. I review the report for every submission in MindTap.
The big picture to consider
AI can be used ethically as a tool for teaching and learning, bridging student learning gaps and strengthening their mastery of skills. When it comes to academic integrity, however, the concern is that GenAI is being used not as an aid, but as a substitute that strips away the values of teaching and learning. According to Cengage’s recent research, 82% of instructors expressed concern specifically about AI and academic integrity. Setting policies and parameters with clear definitions, and having conversations with students, are essential to my ability to monitor my students’ acceptable use of AI.
Do you use AI in your English composition classroom? Reach out to discuss the ways you’re utilizing AI as an ethical tool to advance teaching and learning.
Written by Faye Pelosi, Professor in the Communications Department at Palm Beach State College and Cengage Faculty Partner.
Stay tuned for Professor Pelosi’s upcoming video demo of how she uses the MindTap Turnitin feature in her English course.