Category: Featured

  • Why Australian education needs SaaS+ – Campus Review

    Digital transformation has become essential for educational institutions navigating budget pressures, evolving compliance demands, and rising expectations from students and staff. But for many universities and TAFEs, ERP projects have been slow, costly, and difficult to deliver.

    This article explores how the SaaS+ delivery model from TechnologyOne is helping education providers unlock faster and better results with a delivery approach designed for the sector, not just the software.

    The sector needs change and certainty

    Why education providers can’t afford risk 

    Australian universities and TAFEs face growing pressure to modernise outdated systems while maintaining tight budgets and resource control. Finance, HR, and administrative teams are expected to do more with less, managing complex funding models and ensuring seamless student and staff experiences, all while staying compliant with evolving regulations.

    Yet, many ERP projects still fall short. Long timelines, shifting scopes, and other recurring challenges have led to cost blowouts, underwhelming outcomes, and internal fatigue among staff caught in the crossfire.

    In a sector where every dollar and every hour matters, uncertainty isn’t just inconvenient. It’s unsustainable.

    For transformation to succeed, education providers need more than a product. They need a clear path to results: one that simplifies complexity and removes unnecessary risk from the equation.

    Enter SaaS+: One platform. One price. One trusted partner.

    What is SaaS+ and why is it different? 

    SaaS+ is TechnologyOne’s delivery model for enterprise software, and it turns the traditional ERP experience on its head.

    Instead of relying on multiple vendors, consultants, and unpredictable timelines, SaaS+ delivers everything under one roof: software, implementation, support, and ongoing success – all covered by a single annual fee.

    It’s a complete, end-to-end model that takes the risk out of transformation and puts control back in the hands of the institution.

    SaaS+ is also underpinned by preconfigured solutions built specifically for education. That means less time spent reinventing the wheel and more time focusing on the outcomes that matter – better student experiences, smarter financial decisions, and more efficient operations across the board.

    Education outcomes, not IT projects

    Proven success from sector leaders 

    For many institutions, traditional ERP projects have become more about navigating implementation than achieving real change. SaaS+ shifts the focus back to what matters: delivering better outcomes for students, staff, and the broader education community.

    Institutions like Victoria University and TasTAFE, for example, have recently embraced TechnologyOne’s SaaS+ model to modernise systems, streamline administration, and refocus their resources on delivering better student outcomes.

    These institutions aren’t just upgrading software. They’re improving how they operate, how they serve their students, and how they plan for what’s next.

    With SaaS+, success isn’t measured by go-live dates. It’s measured by the outcomes it enables.

    Why buying Australian matters

    When it comes to ERP, local knowledge isn’t a nice-to-have – it’s essential. From sector-specific compliance to the nuances of funding models, education providers in Australia operate in a unique regulatory and operational environment. Generic, global systems often fall short.

    TechnologyOne is Australia’s only homegrown ERP provider, with more than 37 years of experience working alongside the country’s universities, TAFEs, and education departments. Our solutions are built, hosted, and supported locally, with a deep understanding of the sector’s needs embedded from day one.

    Today, our software supports more than 6.5 million students across 150+ education institutions in Australia, New Zealand and the UK. That reach is built on local trust, not on a global scale.

    Beyond the product itself, this local presence means faster support, tighter alignment with government and education standards, and a genuine partnership model. It also means every dollar invested stays in the region, supporting local jobs, innovation, and long-term capability in the sector.

    Time to value, time to lead

    A smarter path forward for transformation 

    SaaS+ is designed to accelerate results. With a preconfigured approach and a single point of accountability, it removes the friction and uncertainty that often slow traditional ERP rollouts. Faster implementations mean faster benefits, and more time to focus on the strategic goals that matter.

    Whether it’s enabling more responsive finance and HR teams, supporting hybrid workforces, or improving how students interact with institutional services, SaaS+ helps education providers act with confidence and clarity.

    Because when your systems work better, your people can too.

    Explore how TechnologyOne’s OneEducation SaaS+ model is helping institutions across Australia lead the future of education.

  • Coaching can be a strategic act of research culture

    In higher education institutions, we often speak of “developing talent,” “building capacity,” or “supporting our people.” But what do those phrases really mean when you’re a researcher navigating uncertainty, precarity, or a system that too often assumes resilience, but offers limited resources?

    With the renewed focus of REF 2029 on people, culture and environment, and the momentum of the Concordat to Support the Career Development of Researchers, there’s a growing imperative to evidence not only what support is offered, but how it’s experienced.

    That’s where I believe coaching comes in – as a strategic, systemic tool for transforming research culture from the inside out.

    At a time when UK higher education is facing significant financial pressures, widespread restructuring, and the real threat of job losses across institutions, it may seem counterintuitive to invest in individuals’ development. But it is precisely because of this instability that our commitment to people must be more visible and deliberate than ever. In moments of systemic strain, the values we choose to protect speak volumes. Coaching offers one way to show – through action, not just intention – that our researchers matter, that their growth is not optional, and that culture isn’t a casualty of crisis, but a lever for recovery.

    By coaching, I mean a structured, confidential, and non-directive process that empowers individuals to reflect, identify goals and navigate challenges. Unlike mentoring, which often involves sharing advice or experience, coaching creates a thinking space led by the individual, where the coach supports them to surface their own insights, unpick the unspoken dynamics of academia, build confidence in their agency, and cultivate their personal narrative of progress.

    Coaching is not just development – it’s disruption

    We tend to associate coaching with senior leadership, performance management, or executive transition. But over the last seven years, I’ve championed coaching for researchers – especially early career researchers – as a means of shifting the developmental paradigm from “this is what you need to know” to “what do you need, and how can we co-create that space?”

    When coaching is designed well – carefully matched, intentionally scaffolded, and thoughtfully led – it becomes a quiet form of disruption. It gives researchers the confidence to think through difficult questions. And it models a research culture where vulnerability is not weakness but wisdom.

    This is especially powerful for those who feel marginalised in academic environments – whether due to career stage, background, identity or circumstance. One early career researcher recently told me that coaching “helped me stop asking whether I belonged in academia and start asking how I could shape it. For the first time, I felt like I didn’t have to shrink myself to fit in.” That’s the kind of feedback you won’t find in most institutional KPIs – but it says a lot about the culture we’re building.

    Why coaching belongs in your research strategy

    Coaching still suffers from being seen as peripheral – a nice-to-have, often under-resourced and siloed from mainstream provision. Worse, it’s sometimes positioned as remedial, offered only when things go wrong.

    As someone who assesses UK institutions for the European Commission-recognised HR Excellence in Research Award, I’ve seen first-hand how embedding coaching as a core element of researcher support isn’t just the right thing to do – it’s strategically smart. Coaching complements and strengthens the implementation of institutional actions for the Concordat to Support the Career Development of Researchers, by centring the individual researcher experience – not just a tick-box approach to the principles.

    What’s striking is how coaching aligns with the broader institutional goals we often hear in strategy documents: autonomy, impact, innovation, wellbeing, inclusion. These are not incidental outcomes; they’re the foundations of a healthy research pipeline, and coaching delivers on these – but only if we treat it as a central thread of our culture, not a side offer.

    Crucially, coaching is evidence of how we live our values. It offers a clear, intentional method for demonstrating how people and culture are not just statements but structures – designed, delivered, and experienced.

    In REF 2029 institutions will be asked to evidence the kind of environment where research happens. Coaching offers one of the most meaningful, tangible ways to demonstrate that such an environment exists through the lived experiences of the people working within it.

    Culture is personal – and coaching recognises that

    In higher education, we often talk about culture as though it’s something we can declare or design. But real culture – the kind that shapes whether researchers thrive or withdraw – is co-created, day by day, through dialogue, trust, and reflection.

    Culture lives in the everyday, unrecorded interactions: the invisible labour of masking uncertainty while trying to appear “resilient enough” to succeed; the internal negotiation before speaking up in a lab meeting; or the emotional weight carried by researchers who feel like they don’t belong.

    Coaching transforms those invisible moments into deliberate acts of empowerment. It creates intentional, reflective spaces where researchers – regardless of role or background – are supported to define their own path, voice their challenges, and recognise their value. It’s in these conversations that inclusion stops being an aspiration and becomes a lived reality.

    This is especially needed in environments where pressure to perform is high, and space to reflect is minimal. Coaching doesn’t remove the pressures of academia. But it builds capacity to navigate them with intention – and that’s culture work at its core.

    Embedding a coaching culture as part of researcher development shouldn’t be a fringe benefit or pilot project – it should be an institutional expectation. We need more trained internal coaches who understand the realities of academic life and more visibly supported coaching opportunities aligned with the Researcher Development Concordat. The latter encourages a minimum of ten days’ (pro rata) professional development for research staff per year. Coaching is one of the most impactful ways those days can be used – not just to develop researchers, but to transform the culture they inhabit.

    A call to embed – not bolt on

    If we’re serious about inclusive, people-centred research environments, then coaching should be treated as core business. It should not be underfunded, siloed, or left to goodwill. It must be valued, supported, and embedded – reflected in institutional KPIs, Researcher Development Concordat and Research Culture Action Plans, and REF narratives alike.

    And in a sector currently under intense financial pressure, we should double down on culture as a lived commitment to those we ask to do difficult, meaningful work during difficult, uncertain times. Coaching is a strategic lever for equity, integrity, and excellence.

  • Higher education postcard: Queen Alexandra’s House

    Greetings from South Kensington!

    I’ve told elsewhere the story of how the Imperial Institute was founded following the Great Exhibition of 1851, and how the South Kensington site became a hub for colleges, museums and culture. And naturally, where there are students, there is a need to house students.

    And one group of students, in particular, exercised the Victorian imagination: women. Let’s take a look at The Era, of July 5, 1884:

    It’s clearly no use training the girls to be high class governesses, if you can’t keep them safe from the predations of that London.

    Step forward, Francis Cook. He was head of Cook, Son and Co, traders in fabric and clothes, and became one of Britain’s richest men. He gave £40,000 to fund the construction of a hall of residence for women studying in South Kensington, which meant, at that time, at the Royal College of Art, the Royal College of Music, or the Royal College of Science. (It’s also worth noting another fact or two relating to Cook. His second wife, Tennessee Celeste Claflin, was an American suffragist, clairvoyant and medium, who with her sister was one of the first women to open a Wall Street brokerage firm. The sister – Victoria Woodhull – was the first woman to run for the presidency of the United States, in 1872.)

    The hall was to provide 100 bedrooms, each pair connected by a shared sitting room. Plans included a concert hall, gymnasium, library and common room. The concert hall would be used by the Royal College of Music, and there were music practice rooms and art studios too. A truly magnificent residence. There are images on the Queen Alexandra’s House website.

    It was named for Alexandra of Denmark, then Princess of Wales, who had taken a keen interest in the project. After the death of her husband King Edward VII, Alexandra became the Queen Mother, and suggested in 1914 that Alexandra House be renamed Queen Alexandra’s House.

    Also in 1914, a little scandal took place. Here’s a clipping from the Daily Chronicle of February 6 that year:

    The Ulster Volunteers were a paramilitary force, established in 1912, dedicated to resisting Home Rule for Ireland. (And not to be confused with the unionist Ulster Volunteer Force which was active between 1966 and 2007, although they clearly shared a lot of aims and values!)

    As “Imperial Student” wrote, “I have known Irish women, Roman Catholics, Jewesses, Non-conformists there, and can safely say that all shades of opinion have been sheltered there. Are they expected to support such an entertainment as is to be held next Monday?” (To be clear, the scandal was the support for the Ulster Volunteers, not for the Student Christian Movement.) The correspondent continued:

    One feels sure that Queen Alexandra has no knowledge of the fact that an entertainment is to be held there in support of a hospital for volunteers armed to fight the forces of the Crown. It is to be hoped that this may be called to her Majesty’s attention and that she may intimate her disapproval of such a proceeding.

    I am sure you will be relieved to know that the Bucks Advertiser and Aylesbury News reported on 14 February that “the unfortunate incident at Queen Alexandra’s House has passed without causing trouble in Court or other circles.”

    Queen Alexandra’s House continues to serve today as it did when it was founded; it is an independent charity, still providing residential accommodation for female students, in a very desirable part of London.

    Its royal connection continues, as shown in this February 1963 photograph in the Illustrated London News. I think that the Princess Alexandra in the photograph is the great-granddaughter of the Alexandra after whom the House is named.

    The postcard was sent on 13 September 1914 – not long after the outbreak of World War I – to Miss Bates in Horsted Keynes, Sussex.

    Dear Winnie, Just a card of our house – no such houses at Horsted Keynes. Write soon, love from Gladys.

    And here’s a jigsaw.

  • The Complicity of Higher Education in Slavery

    New Jersey’s legacy as a “slave state of the North” is often overlooked, especially in the sanitized histories of its most prestigious universities. Yet a closer examination reveals that the state’s institutions of higher education—particularly Princeton University and Rutgers University—were not only complicit in slavery, but were active beneficiaries of racial exploitation. Their histories are deeply intertwined with a system that built wealth and social power through the bondage of Black people.

    This article is based on the findings of For Such a Time as This: The Nowness of Reparations for Black People in New Jersey, a landmark report from the New Jersey Reparations Council. The report is an urgent call for transformative change through reparative justice. It draws a direct throughline from New Jersey’s foundational embrace of slavery, through its Jim Crow era and more recent forms of structural racism, to today’s reality of “Two New Jerseys”—one Black, one white, separated by a staggering $643,000 gap between median Black and white family wealth.

    Princeton University: Built by the Enslaved, for the Elite

    Founded in 1746 as the College of New Jersey, Princeton University’s early leadership reads like a roll call of slaveholders. Its first nine presidents all enslaved Black people. At least five brought enslaved individuals to live and labor on campus—including Aaron Burr Sr., who in 1756 purchased a man named Caesar to work in the newly built President’s House. Another, John Witherspoon, signer of the Declaration of Independence and president from 1768 to 1794, kept two people in bondage and spoke out against emancipation, claiming that freeing enslaved people would bring “ruin.”

    Financially and culturally, Princeton thrived on slavery. Many of its trustees, donors, and faculty enriched themselves through plantation economies and the transatlantic slave trade. Historian Craig Steven Wilder has shown that the university’s enrollment strategy was deliberately skewed toward elite southern families who owned enslaved people. From 1768 to 1794, the proportion of southern students doubled, while the number of students from New Jersey declined. Princeton became a finishing school for the sons of America’s racial aristocracy.

    Slavery was not just in the background—it was present in the daily life of the institution. Enslaved Black people worked in kitchens, cleaned dormitories, and served food at official university events. Human beings were bought and sold in full view of Nassau Hall. These men and women, their names often lost to history, were the invisible labor force that built the foundation for one of the wealthiest universities in the world.

    The results of this complicity are measurable. Princeton graduates shaped the American Republic—including President James Madison, three U.S. Supreme Court justices, 13 governors, 20 senators, and 23 congressmen. Many of them carried forward the ideologies of white supremacy and anti-Black violence they absorbed in their youth.

    Rutgers University: Queen’s College and the Profits of Enslavement

    Rutgers University, originally established as Queen’s College in 1766, shares a similarly grim legacy. The college’s early survival depended on donations and labor directly tied to slavery. Prominent among its early trustees was Philip Livingston, a signer of the Declaration of Independence who made his fortune by trading enslaved people and operating Caribbean plantations.

    Enslaved labor helped build Rutgers, too. A man named Will, enslaved by the family of a college trustee, is among the few individuals whose name has survived. His work helped construct the early physical campus, though his story, like so many others, is only briefly mentioned in account books and correspondence.

    The intellectual environment of Queen’s College mirrored the dominant racial attitudes of the time. While some students and faculty opposed slavery, their voices were overwhelmed by an institution that upheld the social, political, and economic status quo. Rutgers, like Princeton, prepared white elites to rule a society built on racial exclusion.

    Toward Reparative Justice

    The For Such a Time as This report from the New Jersey Reparations Council underscores that the legacy of slavery is not a relic of the past—it is embedded in the material realities of today. New Jersey’s racial wealth gap—$643,000 between Black and white families—is not accidental. It is the result of centuries of dispossession, disinvestment, and discrimination.

    The state’s leading universities played a formative role in that history. Acknowledgment of this fact is only a first step. True reckoning means meaningful reparative action. It means directing resources and power toward the communities that have been systematically denied them. It means funding education, housing, healthcare, and business development in Black communities, and making structural changes to how wealth and opportunity are distributed.

    Princeton and Rutgers are not just relics of the past; they are major economic and political actors in the present. As institutions with billion-dollar endowments and vast influence, they have both the means and the moral obligation to contribute to a just future.

    The question now is whether they will answer the call. 

  • AI in Higher Education Marketing

    An Argument With Myself

    Reaping the benefits of AI also means addressing the concerns and challenges of using it.

    Artificial intelligence (AI) has already made significant inroads into higher education, transforming various aspects of campus life and academic processes. Since becoming part of the mainstream lexicon two years ago, AI has rapidly evolved from a subject of concern regarding academic integrity to an integral tool for enhancing educational experiences. Today, AI is influencing everything from recruitment strategies to long-term student success, with institutions using advanced analytics to predict outcomes, optimize operations, and improve decision-making. Our 2025 Marketing and Recruitment Practices for Undergraduate Students Report details some of the ways colleges and universities have incorporated AI in higher education marketing and enrollment operations.

    However, the integration of AI in higher education is not without its challenges and ethical considerations. As we examine the pros and cons of utilizing AI in higher education marketing, it’s crucial to understand that this technology is no longer a future prospect but a present reality shaping the landscape of colleges and universities across the nation.

    The pros of AI in higher education marketing

    AI offers transformative benefits for higher education marketing by enabling personalized and data-driven strategies. Key advantages include:

    • Personalized outreach: AI analyzes vast datasets to tailor content and communication for prospective students, increasing engagement and conversion rates. For example, predictive analytics can identify high-value leads and anticipate drop-off points in the enrollment process. And since Ann Taylor, Target, Netflix and a host of other brands are utilizing AI to serve me content that is specifically tailored to my tastes, my buying behaviors, and my blood sugar level/impulse control, it is imperative that higher ed keep up with the rest of the consumer-driven content market.
    • Automation: AI automates repetitive tasks like email campaigns, social media posts, and chatbot interactions, freeing up staff to focus on strategy and relationship-building. This reduces costs and improves operational efficiency. Higher ed leaders continue to lament the talent/staff crisis on campus, particularly in smaller cities and rural areas where the available talent pool may be shallow and work-from-home opportunities are not widespread. Instead, we must maximize the time of the staff we have and utilize them for the activities and outcomes that are truly reliant on human interaction, while automating, outsourcing, or eliminating the rest.
    • Real-time support: AI-powered chatbots provide 24/7 support, answering student inquiries instantly and improving the overall student experience. Digital assistants engage with your prospective students, parents, alumni, and supporters when it’s best for THEM, rather than best for you. International student populations may not be in your time zone and may be unable to connect during U.S. business hours. Parents and prospective parents may be researching during off-hours. The RNL Compass digital assistant provides that round-the-clock engagement that directly integrates and feeds data to your CRM while also protecting your data in a closed environment.
    • Scalability: Institutions can scale their marketing efforts across diverse demographics and platforms without requiring proportional increases in resources, helping smaller teams achieve broader reach.

    Potential cons with AI in higher education marketing

    Despite its advantages, AI in higher education marketing could pose significant risk or create unforeseen challenges if not managed with care:

    • Data privacy issues: The use of AI requires collecting and analyzing large amounts of personal data, raising concerns about compliance with privacy regulations such as GDPR or FERPA. Data security, privacy, and management are top concerns on campuses. It is incredibly important not only that you use tools that secure your data, but also that you manage that data ethically. AI governance requires thoughtful planning and ongoing management. RNL works closely with partners to devise a governance framework, whether or not they are implementing AI tools.
    • Bias in algorithms: AI systems may inadvertently perpetuate biases present in training data, leading to unfair targeting or exclusion of certain student groups.
    • Round peg, square hole syndrome: Many AI solutions are not created for higher ed and do not account for the specific, complex needs that colleges and universities have compared to other consumer or B2B industries.
    • Loss of human touch: Over-reliance on AI can make interactions feel impersonal, potentially alienating prospective students who value human connection. Working with your team to talk about appropriate uses for AI, proper proofreading, and quality control is key. My colleague Dr. Raquel Bermejo discussed the need to balance technology and human connection with students.
    • Implementation costs: While AI promises cost savings over time, initial setup costs for advanced tools and training staff can be prohibitive for some institutions. Work closely with a trusted partner/vendor to ensure you are getting the best bang for your buck. Embracing AI may require investment, but it should yield so much more in return.

    Be aware of all the pros and cons as you evaluate your AI options

    In summary, while AI enhances efficiency and personalization in higher education marketing, institutions must navigate ethical challenges, potential biases, and implementation hurdles to maximize its benefits responsibly.

    We cannot, however, let the possible risks prevent our institutions from maximizing this tremendous capacity-building tool. As a 50+ year veteran in higher education, RNL has a unique understanding of your campus environment, the likely trepidation, the potential hurdles to adoption, and the risk of inaction. That is why we are investing in AI development that is built just for you, your students, and your campus needs. Coupled with RNL’s renowned consulting expertise, governance support, strict attention to data privacy, and industry-leading marketing and enrollment solutions, we can help you and your campus use AI to advance your mission and achieve your goals while minimizing risk and campus pushback.

    Discover RNL Edge, the AI solution for higher education

    RNL Edge is a comprehensive suite of higher education AI solutions that will help you engage constituents, optimize operations, and analyze data instantly—all in a highly secure environment that keeps your institutional data safe. With limitless uses for enrollment and fundraising, RNL Edge is truly the AI solution built for the entire campus.

    Ask for a Discovery Session

  • Why Assess Your Students: The Path to Better Retention and Graduation Rates

    As an enrollment manager, a vice president of academic affairs, or even a leader in student affairs, you might think, “Why should I care about gathering data from our current student population? That’s Institutional Research’s job.” But if you care about the health of your institution, if you care about keeping your students enrolled through to graduation, and if you care about showing your students you care about them as individuals, then regularly assessing student motivation and student satisfaction should be on your radar. Intentionally using that data to improve the lives of your students and to identify key challenges for the college should be a priority for every member of the institutional leadership team.

    You may know that assessing student satisfaction is important, but you need to get others on board on campus.

    “If the WHY is powerful, the HOW is easy.” – Jim Rohn

    Student-level data: Motivational assessments

    Understanding what students need to be successful as they first enter your institution is a powerful way to begin building connections and showing students you care about them. Providing them with the services that they say they want and need to be successful will put you in the best position to serve students in the way they want to be served. In the recently published 2025 National First-Year Students and Their Motivation to Complete College Report, we identified the top 10 requests for support by incoming first-year students, based on the nearly 62,000 responses to the College Student Inventory in the fall of 2024:

    2025 National First-Year Students and Their Motivations for Completing College: Top 10 requests for assistance

    Source: 2025 National First-Year Students and Their Motivation to Complete College Report

    Among first-year students’ top ten requests for assistance, we found themes of connection and belonging, career assistance, academic support, and financial guidance. These top 10 have remained fairly consistent over the last few years.

    When campuses are aware of what incoming students need in the aggregate, institutional resources can be targeted to support these services. And when campuses, specifically advisors, know what individual students have self-identified as desired areas of support, guidance can be provided directly to the students most in need of and most receptive to receiving assistance.

    While campuses can see a 1% improvement in student retention within the first year of implementing a motivational assessment, we have found that campuses that are assessing student motivation on a consistent basis over multiple years are most likely to see retention levels improve. (We recognize that motivation data alone doesn’t lead to improved retention, but the student-level data is an important component of institutional retention efforts.) The impact of consistently assessing student motivation with the RNL Retention Management System (RMS) is shown below:

    Chart: Institutions using retention assessments show higher graduation rates (2025 National First-Year Students and Their Motivations for Completing College Report).
    Data based on a February 2025 RNL review of reported retention rates 2015-2024 in IPEDS for client institutions using one or more of the instruments in the RNL Retention Management System.

    The bottom line on why you should care about assessing individual student motivation

    Asking students as they enter your institution what they need shows that you care about their experience. Using that data to build relationships between advisors and students lays the foundation of one of the most important connections students can have with your institution. Guiding students to the specific service or support they seek puts you in the best position to engage your students in meaningful ways. Ultimately, serving your students in the ways they need will make your institution more likely to retain those students.

    Learn more about the national student motivation data and how it supports campus retention efforts by joining live or listening to the on-demand session First Year Focus: Understanding Student Motivations, Recognizing Opportunities, and Taking Action.

    Download the First-Year Student Motivation Report

    2025 National First-Year Students and Their Motivation to Complete College Report

    What are the needs, challenges, and priorities for first-year college students? Find out in the National First-Year Students and Their Motivation to Complete College Report. You will learn their attitudes on finishing college, top areas of assistance, desire for career assistance, and more.

    Read Now

    Institution-level data: Student satisfaction assessments

    Knowing what students value across all class levels at your institution can provide the student voice in your data-informed decision-making efforts. Assessing student satisfaction is another way to show students you care about them, their experience with you, and what matters to them. Aligning your resources with student-identified priorities will reflect a student-centered environment where individuals may be more likely to want to stay.

    Student satisfaction data from across your student population can inform and guide your institutional efforts in multiple ways:

    • Student success and retention activities: Identify your top priorities for response so you are working on high-importance, low-satisfaction areas from the student perspective.
    • Strategic planning: Incorporate the student voice into your long-term planning efforts to stay aligned with where they want to see you make investments.
    • Accreditation: Document your progress year over year as part of a continuous improvement process to show your regional accreditor that you are paying attention and responding to students (and not just when it is time for re-affirmation!).
    • Recruitment: Highlight your high-importance, high-satisfaction strengths to attract students who will care about what you can offer.

    To assist institutions with building the case for student satisfaction assessment on their campuses, we have developed two brief videos (under two minutes each): one on why to assess satisfaction and one on why to work with RNL specifically. My colleague Shannon Cook also hosted a 30-minute webinar that is available on demand to dive deeper into the why and how of assessing student satisfaction.

    Satisfaction data provides valuable perspectives for every department on campus, identifying areas to celebrate and areas to invest more time, energy, and resources. Campuses that respond to what their students care about have reported seeing satisfaction levels increase and graduation rates improve. Most institutions we work with assess student satisfaction at least once every two or three years and then use the intervening months to explore the data through demographic subpopulations and conversations on campus, take action in high-priority areas, and communicate back with students about what has been done based on the student feedback. These ongoing cycles put institutions in the best position to create a culture of institutional improvement based on the student voice.

    Student motivation and satisfaction assessments are effective practices

    According to the results of the 2025 Effective Practices for Student Success, Retention and Completion Report, assessing student motivation and student satisfaction are methods used by high percentages of institutions and are considered to be highly effective.

    Chart: Two-thirds of four-year institutions assess incoming students, while only half of two-year institutions do (2025 Effective Practices for Student Success Report).

    Source: 2025 Effective Practices for Student Success, Retention, and Completion

    The impact of assessing student motivation and student satisfaction on institutional graduation rates has been documented with numerous studies over the years.

    It is important to be aware that just gathering the data will not magically help you retain students. It is the first step in the process, following these ABCs:

    1. Assess the needs with student and institutional level data collection
    2. Build a high impact completion plan to engage students from pre-enrollment to retention to graduation, taking action based on what students say
    3. Connect students to campus resources that best match their needs and will increase their likelihood to persist and complete, and Communicate what you are doing and why as improvements are made.

    Contact me if you would like to learn more about assessing student motivation and student satisfaction on your campus.

  • The College Planning Playbook: What Works According to Students

    What Works (and What Gets Ignored) According to Real Students

    If you work in enrollment or financial aid, you’ve probably asked yourself: What actually helps students figure out college, and what just adds to the pile? For the 2025 E-Expectations survey, we went straight to the source—nearly 1,600 high school students themselves—and the answers are refreshingly straightforward. Spoiler: it’s not about the fanciest new tech, and it’s also not about drowning them in glossy brochures. When it comes to their “college planning playbook,” teenagers are looking for clear, actionable guidance that helps them make a huge life decision without losing their sanity (or their savings).

    Here’s what we learned from our latest survey, and how you can use it to actually move the needle.

    Students aren’t just window shopping

    Forget the idea that students are passively leafing through mailers. Today’s applicants are strategic: they use whatever gets them closer to a decision and tune out the rest. When we asked, “Which resources have you used and how helpful were they?” the results were clear.

    The top five: What really works

    1. School emails still rule: Those emails you labor over? They’re not just spam fodder. Nearly 90% of students say they’re helpful, and just as many actually read them. The catch? Short, relevant, and timely messages work best. If you’re still sending email blasts that sound like a commercial, rethink your approach.

    2. The official college website remains the king: When in doubt, students go straight to the source. Nine out of ten use college websites to research schools, making them the most-used tool, and 88 percent find them genuinely helpful. Students want the facts—what programs exist, what dorms look like, what deadlines are looming. If your website buries the basics, you’re losing them.

    3. Nothing beats boots on the ground: Visiting campus is still the gold standard for gut checks. Eighty-eight percent say in-person visits are helpful, but only 80% manage to take one (travel and cost are real barriers). When they do, it’s a game-changer.

    4. College planning websites make life easier: Think of these as digital guidance counselors. They’re used by 82% of students, and 85% say they’re helpful. The draw? Easy side-by-side comparisons and less spreadsheet chaos.

    5. College fairs still pack a punch: They may be old school, but they are effective: 80% of students attend college fairs, and 85% get helpful info they couldn’t find online. Sometimes, a face-to-face conversation is what tips the scale.

    Mind the gap: Underused but powerful

    There are plenty of tools out there, but some of the most helpful ones are flying under the radar. Here’s where colleges can do better:

    Virtual tours and VR experiences: Students who use them love them (84% helpful), but only 77% have tried them. Virtual can’t replace a campus tour, but it’s the next best thing—especially for out-of-state or lower-income students.

    Online student communities: Authentic peer advice matters, but only 77% know about these platforms (even though 84% find them helpful).

    Financial aid calculators: Nothing is scarier than the price tag, but only 81% use these tools, even though 85% say they’re helpful.

    Live chats and chatbots: Quick answers, real-time help, yet only about 70% of students use them. Visibility is the issue, not usefulness.

    And let’s talk about personalized texts and live messages from admissions counselors: students crave direct, real-time communication, but only 77% have gotten it, even though 84% rate it as helpful.

    What enrollment pros should actually do

    So what’s the actionable playbook? Here’s what our data says:

    • Promote your virtual stuff: Highlight virtual tours, student communities, and interactive platforms, especially for students who can’t visit in person.
    • Show the path to a job: Put career outcomes front and center. Students want to see how your programs connect to real-world gigs.
    • Make digital tools impossible to miss: If you have a chatbot or live chat, make it obvious. Don’t bury these features on your website.
    • Lead with affordability: Share scholarship calculators and cost tools early and often. Don’t make families hunt for them.
    • Invest in personal touch: The more tailored your outreach (think texts, quick emails, not just form letters), the better.
    • Make campus visits happen: Subsidize travel, host regional visit days, or beef up your virtual experiences for those who can’t make the trip.

    The bottom line

    Students don’t want a firehose of information. They want a GPS. The best colleges aren’t the ones with the flashiest websites or the most emails—they’re the ones who help students navigate from “I have no clue” to “I’ve got this.” Our job isn’t just to provide facts. It’s to be the trusted co-pilot on a student’s most important road trip.

    Want the full breakdown, including more data and actionable insights?

    Read the 2025 E-Expectations Trend Report to get a comprehensive experience of what students expect and experience when searching for colleges. If you’re serious about helping students (and your own enrollment goals), you’ll want to see everything we uncovered!

  • Tracking the Trump administration’s moves to cap indirect research funding

    Status: Temporarily blocked

    What happened? On May 14, U.S. Defense Secretary Pete Hegseth issued a memo declaring that the Defense Department would move to cap reimbursement for indirect research costs at 15% for all new grants to colleges. Hegseth also ordered officials to renegotiate rates on existing awards. If colleges do not agree, DOD officials should terminate previously awarded grants and reissue them under the “revised terms,” he said.

    Overall, Hegseth estimated the move would save the agency $900 million annually. 

    A group of higher education associations and research universities sued on June 16, arguing that the Defense Department overstepped its authority and noting that other courts had blocked the Trump administration’s caps at other agencies. 

    “As with those policies, if DOD’s policy is allowed to stand, it will stop critical research in its tracks, lead to layoffs and cutbacks at universities across the country, badly undermine scientific research at United States universities, and erode our nation’s enviable status as a global leader in scientific research and innovation,” they wrote in court documents.

    The next day, U.S. District Judge Brian Murphy granted a temporary restraining order blocking the Defense Department from implementing its policy until further ordered. 

    What’s next? Murphy has scheduled a July 2 hearing on the temporary restraining order.

  • UK’s rankings lead under threat from global peers in QS World University Rankings 2026

    By Viggo Stacey, International Education & Policy Writer at QS Quacquarelli Symonds.

    As UK education minister Bridget Phillipson has rightly acknowledged, the UK is home to many world-class universities. 

    And the country’s excellence in higher education is yet again on display in the QS World University Rankings 2026.  

    Imperial College London, University of Oxford, University of Cambridge and UCL all maintain their places in the global top 10, and 17 of the 90 UK universities ranked this year are in the top 100, two more than last year.

    The University of Sheffield and The University of Nottingham have returned to the global top 100 for the first time since 2023 and 2024 respectively. 

    But despite improvements at the top end of the QS ranking, some 61% of ranked UK universities have dropped this year. 

    Overall, the 2026 ranking paints a picture of heightening global competition. A number of markets have been emerging as higher education hubs in recent decades – and the increased investment, attention and ambition in various places is apparent in this year’s iteration. 

    Saudi Arabia – whose government had set a target to have five institutions in the top 200 by 2030 – has seen its first entry into the top 100, with King Fahd University of Petroleum & Minerals soaring 34 places to rank 67th globally.

    Vietnam, a country that is aiming for five of its universities to feature in the top 500 by the end of the decade, has seen its representation in the rankings leap from six last year to 10 in 2026. 

    China is still the third most represented location in the world in the QS World University Rankings with 72 institutions, behind only the US with 192 and the UK with 90. And yet, close to 80 institutions that are part of the Chinese Double First Class Universities initiative to build world-class universities still do not feature in the overall WUR. 

    Saudi Arabia currently has three institutions in the top 200, while Vietnam has one in the top 500. If these countries succeed in their ambitions, which universities will lose out among the globe’s top in five years’ time? 

    The financial pressure UK higher education is facing is well documented. Universities UK (UUK) recently calculated that government policy decisions will result in a £1.4 billion reduction in funding to higher education providers in England in 2025/26. The Office for Students’ warning that 43% of England’s higher education institutions will be in deficit this academic year is often cited.

    Some 19% of UK university leaders say they have cut back on investment in research given the current financial climate, and a further 79% are considering future reductions.

    On a global scale, cuts like this will more than likely have a detrimental impact on the UK’s performance in the QS World University Ranking – the world’s most-consulted international university ranking and leading higher education benchmarking tool. 

    The 2026 QS World University Rankings already identify areas where UK universities are behind global competitors. 

    With a 39.2 average score in the Citations per Faculty area, measuring the intensity of research at universities, the UK is already far behind places such as Singapore, the Netherlands, Hong Kong, Australia and Mainland China, all of which have average scores of at least 70. 

    In Faculty Student Ratio, analysing the number of lecturers compared to students, the UK (average score of 26.7) is behind the best performing locations such as Norway (73.7), Switzerland (63.8) and Sweden (61.8). 

    While Oxford, Cambridge and LSE all feature in the global top 15 in Employment Outcomes and 13 UK universities feature in the top 100 for reputation among employers, other universities across the world are improving at a faster rate than many UK universities. 

    And, despite the UK’s historical dominance in global education, global competitors are catching up with UK higher education in international student ratio and international faculty.

    While 74% of UK universities improved on the international student ratio indicator in 2022, the last few years have shown a weakening among UK institutions: 54% of UK universities fell in this area in 2023, 56% dropped in 2024, 74% declined in 2025, and 73% dropped in 2026.

    The government in Westminster is already aware that every £1 it spends on R&D delivers £7 of economic benefit in the long term and, for that reason, it has prioritised spending, which is set to rise from £20.4bn in 2025-26 to £22.6bn in 2029-30.

    But with the financial stability of higher education institutions in question, universities will need support that goes beyond their research capabilities. Their role in developing graduates with the skills to propel the UK forward is being overlooked. The QS 2026 World University Ranking is already showing that global peers are forging ahead. UK universities will need the right backing to maintain their world-leading position.

  • Machine learning technology is transforming how institutions make sense of student feedback

    Institutions spend a lot of time surveying students for their feedback on their learning experience, but once you have crunched the numbers the hard bit is working out the “why.”

    The qualitative information institutions collect is a goldmine of insight about the sentiments and specific experiences that are driving the headline feedback numbers. When students are especially positive, it helps to know why, to spread that good practice and apply it in different learning contexts. When students score some aspect of their experience negatively, it’s critical to know the exact nature of the perceived gap, omission or injustice so that it can be fixed.

    Any conscientious module leader will run their eye down the student comments in a module feedback survey – but once you start looking across modules to programme or cohort level, or to large-scale surveys like NSS, PRES or PTES, the scale of the qualitative data becomes overwhelming for the naked eye. Even the most conscientious reader will find that bias sets in, as comments that are interesting or unexpected tend to be foregrounded as having greater explanatory power over those that seem run of the mill.

    Traditional coding methods for qualitative data require someone – or ideally more than one person – to manually break down comments into clauses or statements that can be coded for theme and sentiment. It’s robust, but incredibly laborious. For student survey work, where the goal might be to respond to feedback and make improvements at pace, institutions openly admit that this kind of robust analysis is rarely, if ever, standard practice. Especially as resources become more constrained, devoting hours to this kind of detailed methodological work is rarely a priority.

    Let me blow your mind

    That is where machine learning technology can genuinely change the game. Student Voice AI was founded by Stuart Grey, an academic at the University of Strathclyde (now working at the University of Glasgow), initially to help analyse student comments for large engineering courses. Working with Advance HE, he was able to train the machine learning model on national PTES and PRES datasets. Now, having further trained the algorithm on NSS data, Student Voice AI offers literally same-day analysis of student comments on NSS results for subscribing institutions.

    Put the words “AI” and “student feedback” in the same sentence and some people’s hackles will immediately rise. So Stuart spends quite a lot of time explaining how the analysis works. The word he uses to describe the version of machine learning Student Voice AI deploys is “supervised learning” – humans manually label categories in datasets and “teach” the machine about sentiment and topic. The larger the available dataset, the more examples the machine is exposed to and the more sophisticated it becomes. Through this process Student Voice AI has landed on a discrete set of comment themes and categories for taught students, and a corresponding set for postgraduate research students, into which the majority of student comments consistently fall – trained on and distinctive to UK higher education student data. Stuart adds that the categories can and do evolve:

    “The categories are based on what students are saying, not what we think they might be talking about – or what we’d like them to be talking about. There could be more categories if we wanted them, but it’s about what’s digestible for a normal person.”
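
    To make the supervised-learning idea concrete, here is a minimal, hypothetical sketch – not Student Voice AI’s actual pipeline – of how human-labelled comments can teach a pair of classifiers to tag theme and sentiment. The comments, labels and model choice (TF-IDF features with logistic regression, via scikit-learn) are illustrative assumptions only.

    ```python
    # Hypothetical sketch of supervised comment classification.
    # Assumes scikit-learn; comments and labels are invented for illustration.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hand-labelled training examples (in practice: thousands of survey comments)
    comments = [
        "Feedback on my essays was detailed and arrived quickly",
        "I never received any comments on my coursework",
        "The library opening hours are far too short",
        "The library staff were incredibly helpful",
        "Lectures were engaging and well structured",
        "The lecture recordings were often missing",
    ]
    themes = ["assessment_feedback", "assessment_feedback", "library",
              "library", "teaching", "teaching"]
    sentiments = ["positive", "negative", "negative",
                  "positive", "positive", "negative"]

    # One pipeline per label type: bag-of-words features + a linear classifier
    theme_clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    theme_clf.fit(comments, themes)

    sentiment_clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    sentiment_clf.fit(comments, sentiments)

    # Both classifiers applied to an unseen comment
    new_comment = ["Marking took weeks and the comments were vague"]
    print(theme_clf.predict(new_comment), sentiment_clf.predict(new_comment))
    ```

    The key property, as described above, is that the categories come from human judgement: more labelled examples make the model more reliable, and evolving the category set is a matter of labelling new data, not re-engineering the system.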

    In practice that means that institutions can see a quantitative representation of their student comments, sorted by category and sentiment. You can look at student views of feedback, for example, and see the balance of positive, neutral and negative sentiment overall, segment it by department, subject area or year of study, then click through to the relevant comments to see what’s driving that feedback. That’s significantly different from, say, dumping your student comments into a third party generative AI platform (sharing confidential data with a third party while you’re at it) and asking it to summarise. There’s value in the time and effort saved, but also in the removal of individual personal bias, and the potential for aggregation and segmentation for different stakeholders in the system. And it also becomes possible to compare student qualitative feedback across institutions.
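
    A small, equally hypothetical illustration of that aggregation and segmentation step: once every comment carries a theme and a sentiment label, reporting reduces to counting and slicing. The column names and figures here are invented for the example.

    ```python
    # Hypothetical aggregation of labelled comments; assumes pandas.
    import pandas as pd

    labelled = pd.DataFrame({
        "department": ["Physics", "Physics", "History", "History", "History"],
        "theme":      ["assessment_feedback"] * 5,
        "sentiment":  ["positive", "negative", "negative", "negative", "positive"],
    })

    # Institution-wide sentiment balance for one theme
    print(labelled["sentiment"].value_counts(normalize=True))

    # Segmented by department: the "click-through" view described above
    by_department = (labelled
                     .groupby(["department", "sentiment"])
                     .size()
                     .unstack(fill_value=0))
    print(by_department)
    ```

    From a table like this, the same underlying data can be re-cut for a module leader, a head of department, or a planning team.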

    Now, Student Voice AI is partnering with student insight platform evasys to bring machine learning technology to qualitative data collected via the evasys platform. And evasys and Student Voice AI have been commissioned by Advance HE to code and analyse open comments from the 2025 PRES and PTES surveys – creating opportunities to drill down into a national dataset that can be segmented by subject discipline and theme as well as by institution.

    Bruce Johnson, managing director at evasys, is enthused about the potential for the technology to drive culture change in how student feedback is used to inform insight and action across institutions:

    “When you’re thinking about how to create actionable insight from survey data the key question is, to whom? Is it to a module leader? Is it to a programme director of a collection of modules? Is it to a head of department or a pro vice chancellor or the planning or quality teams? All of these are completely different stakeholders who need different ways of looking at the data. And it’s also about how the data is presented – most of my customers want, not only quality of insight, but the ability to harvest that in a visually engaging way.”

    “Coming from higher education it seems obvious to me that different stakeholders have very different uses for student feedback data,” says Stuart Grey. “Those teaching at the coalface are interested in student engagement; at the strategic level the interest is in trends and sentiment analysis; and there are also various stakeholder groups in professional services who never get to see this stuff normally, but we can generate the reports that show them what students are saying about their area. Frequently the data tells them something they knew anyway but it gives them the ammunition to be able to make change.”

    The results are in

    Duncan Berryman, student surveys officer at Queen’s University Belfast, sums up the value of AI analysis for his small team: “It makes our life a lot easier, and the schools get the data and trends quicker.” Previously schools had been supplied with Excel spreadsheets – and his team were spending a lot of time explaining and working through with colleagues how to make sense of the data on those spreadsheets. Being able to see a straightforward visualisation of student sentiment on the various themes means that, as Duncan observes rather wryly, “if change isn’t happening it’s not just because people don’t know what student surveys are saying.”

    Parama Chaudhury, professor of economics and pro vice provost education (student academic experience) at University College London, explains where qualitative data analysis sits in the wider ecosystem for quality enhancement of teaching and learning. In her view, for enhancement purposes, comparing your quantitative student feedback scores to those of another department is not particularly useful – essentially it’s comparing apples with oranges. Yet the apparent ease of comparability of quantitative data, compared with the sense of overwhelm at the volume and complexity of student comments, can mean that people spend time trying to explain the numerical differences, rather than mining the qualitative data for more robust and actionable explanations that can give context to your own scores.

    It’s not that people weren’t working hard on enhancement, in other words, but they didn’t always have the best possible information to guide that work. “When I came into this role quite a lot of people were saying ‘we don’t understand why the qualitative data is telling us this, we’ve done all these things,’” says Parama. “I’ve been in the sector a long time and have received my share of summaries of module evaluations and have always questioned those summaries because it’s just someone’s ‘read.’ Having that really objective view, from a well-trained algorithm makes a difference.”

    UCL has tested two-page summaries of student comments for specific departments this academic year, and plans to roll out a version for every department this summer. The data is not assessed in a vacuum; it forms part of the wider institutional quality assurance and enhancement processes, which include data on a range of different perspectives on areas for development. Encouragingly, so far the data from students is consistent with what has emerged from internal reviews, giving the departments that have had the opportunity to engage with it greater confidence in their processes and action plans.

    None of this stops anyone from going and looking at specific student comments, sense-checking the algorithm’s analysis and/or triangulating against other data. At the University of Edinburgh, head of academic planning Marianne Brown says that the value of the AI analysis is in the speed of turnaround – the institution carries out a manual reviewing process to be sure that any unexpected comments are picked up. But being able to share the headline insight at pace (in this case via a PowerBI interface) means that leaders receive the feedback while the information is still fresh, and the lead time to effect change is longer than if time had been lost to manual coding.

    The University of Edinburgh is known for its cutting-edge AI research, and boasts the Edinburgh (access to) Language Models (ELM), a platform that gives staff and students access to generative AI tools without sharing data with third parties, keeping all user data onsite and secured. Marianne is clear that even a closed system like ELM is not appropriate for unfettered student comment analysis. Generative AI platforms offer the illusion of a thematic analysis but it is far from robust, because generative AI operates through sophisticated guesswork rather than analysis of the implications of actual data. “Being able to put responses from NSS or our internal student survey into ELM to give summaries was great, until you started to interrogate those summaries. Robust validation of any output is still required,” says Marianne. Similarly Duncan Berryman observes: “If you asked a gen-AI tool to show you the comments related to the themes it had picked out, it would not refer back to actual comments. Or it would have pulled this supposed common theme from just one comment.”

    The holy grail of student survey practice is creating a virtuous circle: student engagement in feedback creates actionable data, which leads to education enhancement, and students gain confidence that the process is authentic and are further motivated to share their feedback. In that quest, AI, deployed appropriately, can be an institutional ally and resource-multiplier, giving fast and robust access to aggregated student views and opinions. “The end result should be to make teaching and learning better,” says Stuart Grey. “And hopefully what we’re doing is saving time on the manual boring part, and freeing up time to make real change.”
