Blog

  • Higher education postcard: Queen Alexandra’s House

    Greetings from South Kensington!

    I’ve told elsewhere the story of how the Imperial Institute was founded following the Great Exhibition of 1851, and how the South Kensington site became a hub for colleges, museums and culture. And naturally, where there are students, there is a need to house them.

    And one group of students, in particular, exercised the Victorian imagination: women. Let’s take a look at The Era of July 5, 1884:

    It’s clearly no use training the girls to be high-class governesses if you can’t keep them safe from the predations of that London.

    Step forward, Francis Cook. Head of Cook, Son and Co, traders in fabric and clothes, he became one of Britain’s richest men. He gave £40,000 to fund the construction of a hall of residence for women studying in South Kensington, which meant, at that time, at the Royal College of Art, the Royal College of Music, or the Royal College of Science. (It’s also worth noting another fact or two relating to Cook. His second wife, Tennessee Celeste Claflin, was an American suffragist, clairvoyant and medium, who with her sister was one of the first women to open a Wall Street brokerage firm. The sister – Victoria Woodhull – was the first woman to run for the presidency of the United States, in 1872.)

    The hall was to provide 100 bedrooms, with each pair connected by a shared sitting room. Plans included a concert hall, gymnasium, library and common room. The concert hall would be used by the Royal College of Music, and there were music practice rooms and art studios too. A truly magnificent residence. There are images on the Queen Alexandra’s House website.

    It was named for Alexandra of Denmark, then Princess of Wales, who had taken a keen interest in the project. After the death of her husband, King Edward VII, Alexandra became the Queen Mother, and suggested in 1914 that Alexandra House be renamed Queen Alexandra’s House.

    Also in 1914, a little scandal took place. Here’s a clipping from the Daily Chronicle of February 6 that year:

    The Ulster Volunteers were a paramilitary force, established in 1912, dedicated to resisting Home Rule for Ireland. (And not to be confused with the unionist Ulster Volunteer Force which was active between 1966 and 2007, although they clearly shared a lot of aims and values!)

    As “Imperial Student” wrote, “I have known Irish women, Roman Catholics, Jewesses, Non-conformists there, and can safely say that all shades of opinion have been sheltered there. Are they expected to support such an entertainment as is to be held next Monday?” (To be clear, the scandal was the support for the Ulster Volunteers, not for the Student Christian Movement.) The correspondent continued:

    One feels sure that Queen Alexandra has no knowledge of the fact that an entertainment is to be held there in support of a hospital for volunteers armed to fight the forces of the Crown. It is to be hoped that this may be called to her Majesty’s attention and that she may intimate her disapproval of such a proceeding.

    I am sure you will be relieved to know that the Bucks Advertiser and Aylesbury News reported on 14 February that “the unfortunate incident at Queen Alexandra’s House has passed without causing trouble in Court or other circles.”

    Queen Alexandra’s House continues to serve today as it did when it was founded; it is an independent charity, still providing residential accommodation for female students, in a very desirable part of London.

    Its royal connection continues, as shown in this February 1963 photograph in the Illustrated London News. I think that the Princess Alexandra in the photograph is the great-granddaughter of the Alexandra after whom the House is named.

    The postcard was sent on 13 September 1914 – not long after the outbreak of World War I – to Miss Bates in Horsted Keynes, Sussex.

    Dear Winnie, Just a card of our house – no such houses at Horsted Keynes. Write soon, love from Gladys.

    And here’s a jigsaw.

  • The Complicity of Higher Education in Slavery

    New Jersey’s legacy as a “slave state of the North” is often overlooked, especially in the sanitized histories of its most prestigious universities. Yet a closer examination reveals that the state’s institutions of higher education—particularly Princeton University and Rutgers University—were not only complicit in slavery, but were active beneficiaries of racial exploitation. Their histories are deeply intertwined with a system that built wealth and social power through the bondage of Black people.

    This article is based on the findings of For Such a Time as This: The Nowness of Reparations for Black People in New Jersey, a landmark report from the New Jersey Reparations Council. The report is an urgent call for transformative change through reparative justice. It draws a direct throughline from New Jersey’s foundational embrace of slavery, through its Jim Crow era and more recent forms of structural racism, to today’s reality of “Two New Jerseys”—one Black, one white, separated by a staggering $643,000 racial wealth gap between median Black and white family wealth.

    Princeton University: Built by the Enslaved, for the Elite

    Founded in 1746 as the College of New Jersey, Princeton University’s early leadership reads like a roll call of slaveholders. Nine of its first presidents enslaved Black people. At least five brought enslaved individuals to live and labor on campus—including Aaron Burr Sr., who in 1756 purchased a man named Caesar to work in the newly built President’s House. Another, John Witherspoon, signer of the Declaration of Independence and president from 1768 to 1794, kept two people in bondage and spoke out against emancipation, claiming that freeing enslaved people would bring “ruin.”

    Financially and culturally, Princeton thrived on slavery. Many of its trustees, donors, and faculty enriched themselves through plantation economies and the transatlantic slave trade. Historian Craig Steven Wilder has shown that the university’s enrollment strategy was deliberately skewed toward elite southern families who owned enslaved people. From 1768 to 1794, the proportion of southern students doubled, while the number of students from New Jersey declined. Princeton became a finishing school for the sons of America’s racial aristocracy.

    Slavery was not just in the background—it was present in the daily life of the institution. Enslaved Black people worked in kitchens, cleaned dormitories, and served food at official university events. Human beings were bought and sold in full view of Nassau Hall. These men and women, their names often lost to history, were the invisible labor force that built the foundation for one of the wealthiest universities in the world.

    The results of this complicity are measurable. Princeton graduates shaped the American Republic—including President James Madison, three U.S. Supreme Court justices, 13 governors, 20 senators, and 23 congressmen. Many of them carried forward the ideologies of white supremacy and anti-Black violence they absorbed in their youth.

    Rutgers University: Queen’s College and the Profits of Enslavement

    Rutgers University, originally established as Queen’s College in 1766, shares a similarly grim legacy. The college’s early survival depended on donations and labor directly tied to slavery. Prominent among its early trustees was Philip Livingston, a signer of the Declaration of Independence who made his fortune by trading enslaved people and operating Caribbean plantations.

    Enslaved labor helped build Rutgers, too. A man named Will, enslaved by the family of a college trustee, is among the few individuals whose name has survived. His work helped construct the early physical campus, though his story, like so many others, is only briefly mentioned in account books and correspondence.

    The intellectual environment of Queen’s College mirrored the dominant racial attitudes of the time. While some students and faculty opposed slavery, their voices were overwhelmed by an institution that upheld the social, political, and economic status quo. Rutgers, like Princeton, prepared white elites to rule a society built on racial exclusion.

    Toward Reparative Justice

    The For Such a Time as This report from the New Jersey Reparations Council underscores that the legacy of slavery is not a relic of the past—it is embedded in the material realities of today. New Jersey’s racial wealth gap—$643,000 between Black and white families—is not accidental. It is the result of centuries of dispossession, disinvestment, and discrimination.

    The state’s leading universities played a formative role in that history. Acknowledgment of this fact is only a first step. True reckoning means meaningful reparative action. It means directing resources and power toward the communities that have been systematically denied them. It means funding education, housing, healthcare, and business development in Black communities, and making structural changes to how wealth and opportunity are distributed.

    Princeton and Rutgers are not just relics of the past; they are major economic and political actors in the present. As institutions with billion-dollar endowments and vast influence, they have both the means and the moral obligation to contribute to a just future.

    The question now is whether they will answer the call. 

  • AI in Higher Education Marketing

    An Argument With Myself

    Reaping the benefits of AI also means addressing the concerns and challenges of using it.

    Artificial intelligence (AI) has already made significant inroads into higher education, transforming various aspects of campus life and academic processes. Since becoming part of the mainstream lexicon two years ago, AI has rapidly evolved from a subject of concern regarding academic integrity to an integral tool for enhancing educational experiences. Today, AI is influencing everything from recruitment strategies to long-term student success, with institutions using advanced analytics to predict outcomes, optimize operations, and improve decision-making. Our 2025 Marketing and Recruitment Practices for Undergraduate Students Report details some of the ways colleges and universities have incorporated AI in higher education marketing and enrollment operations.

    However, the integration of AI in higher education is not without its challenges and ethical considerations. As we examine the pros and cons of utilizing AI in higher education marketing, it’s crucial to understand that this technology is no longer a future prospect but a present reality shaping the landscape of colleges and universities across the nation.

    The pros of AI in higher education marketing

    AI offers transformative benefits for higher education marketing by enabling personalized and data-driven strategies. Key advantages include:

    • Personalized outreach: AI analyzes vast datasets to tailor content and communication for prospective students, increasing engagement and conversion rates. For example, predictive analytics can identify high-value leads and anticipate drop-off points in the enrollment process. And since Ann Taylor, Target, Netflix and a host of other brands are utilizing AI to serve me content that is specifically tailored to my tastes, my buying behaviors, and my blood sugar level/impulse control, it is imperative that higher ed keep up with the rest of the consumer-driven content market.
    • Automation: AI automates repetitive tasks like email campaigns, social media posts, and chatbot interactions, freeing up staff to focus on strategy and relationship-building. This reduces costs and improves operational efficiency. Higher ed leaders continue to lament the talent/staffing crisis on campus, particularly in smaller cities and rural areas where the talent pool may be shallow and work-from-home opportunities are not widespread. Instead, we must maximize the time of the staff we have and utilize them for the activities and outcomes that are truly reliant on human interaction, while automating, outsourcing, or eliminating the rest.
    • Real-time support: AI-powered chatbots provide 24/7 support, answering student inquiries instantly and improving the overall student experience. Digital assistants engage with your prospective students, parents, alumni, and supporters when it’s best for THEM, rather than best for you. International student populations may not be in your time zone and may be unable to connect during U.S. business hours. Parents and prospective parents may be researching during off-hours. The RNL Compass digital assistant provides that round-the-clock engagement that directly integrates and feeds data to your CRM while also protecting your data in a closed environment.
    • Scalability: Institutions can scale their marketing efforts across diverse demographics and platforms without requiring proportional increases in resources, helping smaller teams achieve broader reach.

    Potential cons with AI in higher education marketing

    Despite its advantages, AI in higher education marketing could pose significant risk or create unforeseen challenges if not managed with care:

    • Data privacy issues: The use of AI requires collecting and analyzing large amounts of personal data, raising concerns about compliance with privacy regulations such as GDPR or FERPA. Data security, privacy, and management are top concerns on campuses. It is incredibly important not only that you use tools that secure your data, but also that you manage that data ethically. AI governance requires thoughtful planning and ongoing management. RNL works closely with partners who wish to devise a governance framework, whether or not they are implementing AI tools.
    • Bias in algorithms: AI systems may inadvertently perpetuate biases present in training data, leading to unfair targeting or exclusion of certain student groups.
    • Round peg, square hole syndrome: Many AI solutions are not created for higher ed and do not account for the specific, complex needs that colleges and universities have compared to other consumer or B2B industries.
    • Loss of human touch: Over-reliance on AI can make interactions feel impersonal, potentially alienating prospective students who value human connection. Working with your team to talk about appropriate uses for AI, proper proofreading, and quality control is key. My colleague Dr. Raquel Bermejo discussed the need to balance technology and human connection with students.
    • Implementation costs: While AI promises cost savings over time, initial setup costs for advanced tools and training staff can be prohibitive for some institutions. Work closely with a trusted partner/vendor to ensure you are getting the best bang for your buck. Embracing AI may require investment, but it should yield so much more in return.

    Be aware of all the pros and cons as you evaluate your AI options

    In summary, while AI enhances efficiency and personalization in higher education marketing, institutions must navigate ethical challenges, potential biases, and implementation hurdles to maximize its benefits responsibly.

    We cannot, however, let the possible risks prevent our institutions from maximizing this tremendous capacity-building tool. As a 50+ year veteran in higher education, RNL has a unique understanding of your campus environment, the likely trepidation, the potential hurdles to adoption, and the risk of inaction. That is why we are investing in AI development that is built just for you, your students, and your campus needs. Coupled with RNL’s renowned consulting expertise, governance support, strict attention to data privacy, and industry-leading marketing and enrollment solutions, we can help you and your campus use AI to advance your mission and achieve your goals while minimizing risk and campus pushback.

    Discover RNL Edge, the AI solution for higher education

    RNL Edge is a comprehensive suite of higher education AI solutions that will help you engage constituents, optimize operations, and analyze data instantly—all in a highly secure environment that keeps your institutional data safe. With limitless uses for enrollment and fundraising, RNL Edge is truly the AI solution built for the entire campus.

    Ask for a Discovery Session

  • Why Assess Your Students: The Path to Better Retention and Graduation Rates

    As an enrollment manager or a vice president of academic affairs, or even a leader in student affairs, you might think, “Why should I care about gathering data from our current student population? That’s Institutional Research’s job.” But if you care about the health of your institution, if you care about keeping your students enrolled through to graduation, and if you care about showing your students you care about them as individuals, then regularly assessing student motivation and student satisfaction is an activity that should be on your radar. Intentionally using that data to improve the lives of your students and to identify key challenges for the college should be a priority for every member of the institutional leadership team.

    You may know that assessing student satisfaction is important, but you need to get others on board on campus.

    “If the WHY is powerful, the HOW is easy.” – Jim Rohn

    Student-level data: Motivational assessments

    Understanding what students need to be successful as they first enter your institution is a powerful way to begin building connections and showing students you care about them. Providing them with the services that they say they want and need to be successful will put you in the best position to serve students in the way they want to be served. In the recently published 2025 National First-Year Students and Their Motivation to Complete College Report, we identified the top 10 requests for support by incoming first-year students, based on the nearly 62,000 responses to the College Student Inventory in the fall of 2024:

    2025 National First-Year Students and Their Motivations for Completing College: Top 10 requests for assistance

    Source: 2025 National First-Year Students and Their Motivation to Complete College Report

    Among first-year students’ top ten requests for assistance, we found themes of connection and belonging, career assistance, academic support, and financial guidance. These top 10 have remained fairly consistent over the last few years.

    When campuses are aware of what incoming students need in the aggregate, institutional resources can be targeted to support these services. And when campuses, specifically advisors, know what individual students have self-identified as desired areas of support, guidance can be provided directly to the students most in need of and most receptive to receiving assistance.

    While campuses can see a 1% improvement in student retention within the first year of implementing a motivational assessment, we have found that campuses that are assessing student motivation on a consistent basis over multiple years are most likely to see retention levels improve. (We recognize that motivation data alone doesn’t lead to improved retention, but the student-level data is an important component of institutional retention efforts.) The impact of consistently assessing student motivation with the RNL Retention Management System (RMS):

    2025 National First-Year Students and Their Motivations for Completing College: Chart showing higher graduation rates for institutions using retention assessments
    Data based on a February 2025 RNL review of reported retention rates 2015-2024 in IPEDS for client institutions using one or more of the instruments in the RNL Retention Management System.

    The bottom line on why you should care about assessing individual student motivation

    Asking students as they enter your institution what they need shows that you care about their experience. Using that data to build relationships between advisors and students lays the foundation of one of the most important connections students can have with your institution. Guiding students to the specific service or support they seek puts you in the best position to engage your students in meaningful ways. Ultimately, serving your students in the ways they need will make your institution more likely to retain those students.

    Learn more about the national student motivation data and how it supports campus retention efforts by joining live or listening to the on-demand session First Year Focus: Understanding Student Motivations, Recognizing Opportunities, and Taking Action.

    Download the First-Year Student Motivation Report

    2025 National First-Year Students and Their Motivation to Complete College Report: What are the needs, challenges, and priorities for first-year college students? Find out in the National First-Year Students and Their Motivation to Complete College Report. You will learn their attitudes on finishing college, top areas of assistance, desire for career assistance, and more.

    Read Now

    Institution-level data: Student satisfaction assessments

    Knowing what students value across all class levels at your institution can provide the student voice in your data-informed decision-making efforts. Assessing student satisfaction is another way to show students you care about them, their experience with you, and what matters to them. Aligning your resources with student-identified priorities will reflect a student-centered environment where individuals may be more likely to want to stay.

    Student satisfaction data from across your student population can inform and guide your institutional efforts in multiple ways:

    • Student success and retention activities: Identifying your top priorities for response so you are working on high-importance, low-satisfaction areas from the student perspective.
    • Strategic planning: Incorporate the student voice into your long-term planning efforts to stay aligned with where they want to see you make investments.
    • Accreditation: Document your progress year over year as part of a continuous improvement process to show your regional accreditor that you are paying attention and responding to students (and not just when it is time for re-affirmation!).
    • Recruitment: Highlight your high-importance, high-satisfaction strengths to attract students who will care about what you can offer.

    To assist institutions with building the case for student satisfaction assessment on their campuses, we have developed two brief videos (under two minutes each), one on why to assess satisfaction and one on why to work with RNL specifically. My colleague Shannon Cook also hosted a 30-minute webinar that is available on demand to dive deeper into the why and how of assessing student satisfaction.

    Satisfaction data provides valuable perspectives for every department on campus, identifying areas to celebrate and areas to invest more time, energy, and resources. Campuses that respond to what their students care about have reported seeing satisfaction levels increase and graduation rates improve. Most institutions we work with assess student satisfaction at least once every two or three years and then use the intervening months to explore the data through demographic subpopulations and conversations on campus, take action in high-priority areas, and communicate back with students about what has been done based on the student feedback. These ongoing cycles put institutions in the best position to create a culture of institutional improvement based on the student voice.

    Student motivation and satisfaction assessments are effective practices

    According to the results of the 2025 Effective Practices for Student Success, Retention and Completion Report, assessing student motivation and student satisfaction are methods used by high percentages of institutions and are considered to be highly effective.

    2025 Effective Practices for Student Success Report: Chart showing that two-thirds of four-year institutions assess incoming students while only half of two-year institutions do

    Source: 2025 Effective Practices for Student Success, Retention, and Completion

    The impact of assessing student motivation and student satisfaction on institutional graduation rates has been documented with numerous studies over the years.

    It is important to be aware that just gathering the data will not magically help you retain students. It is the first step in the process, following these ABCs:

    1. Assess the needs with student and institutional level data collection
    2. Build a high impact completion plan to engage students from pre-enrollment to retention to graduation, taking action based on what students say
    3. Connect students to campus resources that best match their needs and will increase their likelihood to persist and complete, and Communicate about what you are doing and why as improvements are made.

    Contact me if you would like to learn more about assessing student motivation and student satisfaction on your campus.

  • The College Planning Playbook: What Works According to Students

    What Works (and What Gets Ignored) According to Real Students

    If you work in enrollment or financial aid, you’ve probably asked yourself: What actually helps students figure out college, and what just adds to the pile? For the 2025 E-Expectations survey, we went straight to the source—nearly 1,600 high school students themselves—and the answers are refreshingly straightforward. Spoiler: it’s not about the fanciest new tech, and it’s also not about drowning them in glossy brochures. When it comes to their “college planning playbook,” teenagers are looking for clear, actionable guidance that helps them make a huge life decision without losing their sanity (or their savings).

    Here’s what we learned from our latest survey, and how you can use it to actually move the needle.

    Students aren’t just window shopping

    Forget the idea that students are passively leafing through mailers. Today’s applicants are strategic: they use whatever gets them closer to a decision and tune out the rest. When we asked, “Which resources have you used and how helpful were they?” the results were clear.

    The top five: What really works

    1. School emails still rule: Those emails you labor over? They’re not just spam fodder. Nearly 90% of students say they’re helpful, and just as many actually read them. The catch? Short, relevant, and timely messages work best. If you’re still sending email blasts that sound like a commercial, rethink your approach.

    2. The official college website remains the king: When in doubt, students go straight to the source. Nine out of ten use college websites to research schools, making them the most-used tool, and 88 percent find them genuinely helpful. Students want the facts—what programs exist, what dorms look like, what deadlines are looming. If your website buries the basics, you’re losing them.

    3. Nothing beats boots on the ground: Visiting campus is still the gold standard for gut checks. Eighty-eight percent say in-person visits are helpful, but only 80% manage to take one (travel and cost are real barriers). When they do, it’s a game-changer.

    4. College planning websites make life easier: Think of these as digital guidance counselors. They’re used by 82% of students, and 85% say they’re helpful. The draw? Easy side-by-side comparisons and less spreadsheet chaos.

    5. College fairs still pack a punch: They may be old school but they are effective: 80% of students attend college fairs, and 85% get helpful info they couldn’t find online. Sometimes, a face-to-face conversation is what tips the scale.

    Mind the gap: Underused but powerful

    There are plenty of tools out there, but some of the most helpful ones are flying under the radar. Here’s where colleges can do better:

    Virtual tours and VR experiences: Students who use them love them (84% helpful), but only 77% have tried them. Virtual can’t replace a campus tour, but it’s the next best thing—especially for out-of-state or lower-income students.

    Online student communities: Authentic peer advice matters, but only 77% know about these platforms (even though 84% find them helpful).

    Financial aid calculators: Nothing is scarier than the price tag, but only 81% use these tools, even though 85% say they’re helpful.

    Live chats and chatbots: Quick answers, real-time help, yet only about 70% of students use them. Visibility is the issue, not usefulness.

    And let’s talk about personalized texts and live messages from admissions counselors: students crave direct, real-time communication, but only 77% have gotten it, even though 84% rate it as helpful.

    What enrollment pros should actually do

    So what’s the actionable playbook? Here’s what our data says:

    • Promote your virtual stuff: Highlight virtual tours, student communities, and interactive platforms, especially for students who can’t visit in person.
    • Show the path to a job: Put career outcomes front and center. Students want to see how your programs connect to real-world gigs.
    • Make digital tools impossible to miss: If you have a chatbot or live chat, make it obvious. Don’t bury these features on your website.
    • Lead with affordability: Share scholarship calculators and cost tools early and often. Don’t make families hunt for them.
    • Invest in personal touch: The more tailored your outreach (think texts, quick emails, not just form letters), the better.
    • Make campus visits happen: Subsidize travel, host regional visit days, or beef up your virtual experiences for those who can’t make the trip.

    The bottom line

    Students don’t want a firehose of information. They want a GPS. The best colleges aren’t the ones with the flashiest websites or the most emails—they’re the ones who help students navigate from “I have no clue” to “I’ve got this.” Our job isn’t just to provide facts. It’s to be the trusted co-pilot on a student’s most important road trip.

    Want the full breakdown, including more data and actionable insights?

    Read the 2025 E-Expectations Trend Report to get a comprehensive experience of what students expect and experience when searching for colleges. If you’re serious about helping students (and your own enrollment goals), you’ll want to see everything we uncovered!

  • Tracking the Trump administration’s moves to cap indirect research funding

    Status: Temporarily blocked

    What happened? On May 14, U.S. Defense Secretary Pete Hegseth issued a memo declaring that the Defense Department would move to cap reimbursement for indirect research costs at 15% for all new grants to colleges. Hegseth also ordered officials to renegotiate rates on existing awards. If colleges do not agree, DOD officials should terminate previously awarded grants and reissue them under the “revised terms,” he said.

    Overall, Hegseth estimated the move would save the agency $900 million annually. 

    A group of higher education associations and research universities sued on June 16, arguing that the Defense Department overstepped its authority and noting that other courts had blocked the Trump administration’s caps at other agencies. 

    “As with those policies, if DOD’s policy is allowed to stand, it will stop critical research in its tracks, lead to layoffs and cutbacks at universities across the country, badly undermine scientific research at United States universities, and erode our nation’s enviable status as a global leader in scientific research and innovation,” they wrote in court documents.

    The next day, U.S. District Judge Brian Murphy granted a temporary restraining order blocking the Defense Department from implementing its policy until further ordered. 

    What’s next? Murphy has scheduled a July 2 hearing on the temporary restraining order.

  • UK’s rankings lead under threat from global peers in QS World University Rankings 2026

    • By Viggo Stacey, International Education & Policy Writer at QS Quacquarelli Symonds.

    As UK education minister Bridget Phillipson has rightly acknowledged, the UK is home to many world-class universities. 

    And the country’s excellence in higher education is yet again on display in the QS World University Rankings 2026.  

    Imperial College London, University of Oxford, University of Cambridge and UCL all maintain their places in the global top 10 and 17 of the total 90 UK universities ranked this year are in the top 100, two more than last year. 

    The University of Sheffield and The University of Nottingham have returned to the global top 100 for the first time since 2023 and 2024 respectively. 

    But despite improvements at the top end of the QS ranking, some 61% of ranked UK universities have dropped this year. 

    Overall, the 2026 ranking paints a picture of heightening global competition. A number of markets have been emerging as higher education hubs in recent decades – and the increased investment, attention and ambition in various places is apparent in this year’s iteration. 

    Saudi Arabia – whose government had set a target to have five institutions in the top 200 by 2030 – has seen its first entry into the top 100, with King Fahd University of Petroleum & Minerals soaring 34 places to rank 67th globally.

    Vietnam, a country that is aiming for five of its universities to feature in the top 500 by the end of the decade, has seen its representation in the rankings leap from six last year to 10 in 2026. 

    China is still the third most represented location in the world in the QS World University Rankings with 72 institutions, behind only the US with 192 and the UK with 90. And yet, close to 80 institutions that are part of the Chinese Double First Class Universities initiative to build world-class universities still do not feature in the overall WUR. 

    Saudi Arabia currently has three institutions in the top 200, while Vietnam has one in the top 500. If these countries succeed in their ambitions, which of the globe’s top universities will lose out in five years’ time?

    The financial pressure UK higher education is facing is well documented. Universities UK (UUK) recently calculated that government policy decisions will result in a £1.4 billion reduction in funding to higher education providers in England in 2025/26. The Office for Students’ warning that 43% of England’s higher education institutions will be in deficit this academic year is often cited.

    Some 19% of UK university leaders say they have cut back on investment in research given the current financial climate, and a further 79% are considering future reductions.

    On a global scale, cuts like this will more than likely have a detrimental impact on the UK’s performance in the QS World University Ranking – the world’s most-consulted international university ranking and leading higher education benchmarking tool. 

    The 2026 QS World University Rankings already identify areas where UK universities are behind global competitors. 

    With a 39.2 average score in the Citations per Faculty area, measuring the intensity of research at universities, the UK is already far behind places such as Singapore, the Netherlands, Hong Kong, Australia and Mainland China, all of which have average scores of at least 70. 

    In Faculty Student Ratio, analysing the number of lecturers compared to students, the UK (average score of 26.7) is behind the best performing locations such as Norway (73.7), Switzerland (63.8) and Sweden (61.8). 

    While Oxford, Cambridge and LSE all feature in the global top 15 in Employment Outcomes and 13 UK universities feature in the top 100 for reputation among employers, other universities across the world are improving at a faster rate than many UK universities. 

    And, despite the UK’s historical dominance in global education, competitors are catching up with its universities on international student ratio and international faculty.

    While 74% of UK universities improved in the international student ratio indicator in 2022, the last few years have identified a weakening among UK institutions. In 2023, 54% of UK universities fell in this area, in 2024, 56% dropped and in 2025, 74% declined. And in 2026, 73% dropped.  

    The government in Westminster is already aware that every £1 it spends on R&D delivers £7 of economic benefits in the long term and, for that reason, it has committed to raising R&D spending from £20.4bn in 2025-26 to £22.6bn in 2029-30.

    But with the financial stability of higher education institutions in question, universities will need support that goes beyond their research capabilities. Their role in developing graduates with the skills to propel the UK forward is being overlooked. The QS World University Rankings 2026 already show that global peers are forging ahead. UK universities will need the right backing to maintain their world-leading position.

  • Machine learning technology is transforming how institutions make sense of student feedback

    Institutions spend a lot of time surveying students for their feedback on their learning experience, but once you have crunched the numbers the hard bit is working out the “why.”

    The qualitative information institutions collect is a goldmine of insight about the sentiments and specific experiences that are driving the headline feedback numbers. When students are especially positive, it helps to know why, to spread that good practice and apply it in different learning contexts. When students score some aspect of their experience negatively, it’s critical to know the exact nature of the perceived gap, omission or injustice so that it can be fixed.

    Any conscientious module leader will run their eye down the student comments in a module feedback survey – but once you start looking across modules to programme or cohort level, or to large-scale surveys like NSS, PRES or PTES, the scale of the qualitative data becomes overwhelming for the naked eye. Even the most conscientious reader will find that bias sets in, as comments that are interesting or unexpected tend to be foregrounded as having greater explanatory power over those that seem run of the mill.

    Traditional coding methods for qualitative data require someone – or ideally more than one person – to manually break down comments into clauses or statements that can be coded for theme and sentiment. It’s robust, but incredibly laborious. For student survey work, where the goal might be to respond to feedback and make improvements at pace, institutions are open about the fact that this kind of robust analysis is rarely, if ever, standard practice. Especially as resources become more constrained, devoting hours to this kind of detailed methodological work is rarely a priority.

    Let me blow your mind

    That is where machine learning technology can genuinely change the game. Student Voice AI was founded by Stuart Grey, an academic at the University of Strathclyde (now working at the University of Glasgow), initially to help analyse student comments for large engineering courses. Working with Advance HE he was able to train the machine learning model on national PTES and PRES datasets. Now, further training the algorithm on NSS data, Student Voice AI offers literally same-day analysis of student comments for NSS results for subscribing institutions.

    Put the words “AI” and “student feedback” in the same sentence and some people’s hackles will immediately rise. So Stuart spends quite a lot of time explaining how the analysis works. The word he uses to describe the version of machine learning Student Voice AI deploys is “supervised learning” – humans manually label categories in datasets and “teach” the machine about sentiment and topic. The larger the available dataset, the more examples the machine is exposed to and the more sophisticated it becomes. Through this process Student Voice AI has landed on a discrete set of comment themes and categories for taught students, and another for postgraduate research students, into which the majority of student comments consistently fall – trained on, and distinctive to, UK higher education student data. Stuart adds that the categories can and do evolve:

    “The categories are based on what students are saying, not what we think they might be talking about – or what we’d like them to be talking about. There could be more categories if we wanted them, but it’s about what’s digestible for a normal person.”

    In practice that means that institutions can see a quantitative representation of their student comments, sorted by category and sentiment. You can look at student views of feedback, for example, and see the balance of positive, neutral and negative sentiment overall, segment it by department, subject area or year of study, then click through to the relevant comments to see what’s driving that feedback. That’s significantly different from, say, dumping your student comments into a third-party generative AI platform (sharing confidential data with a third party while you’re at it) and asking it to summarise. There’s value in the time and effort saved, but also in the removal of individual personal bias, and the potential for aggregation and segmentation for different stakeholders in the system. And it also becomes possible to compare student qualitative feedback across institutions.
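
    To make “supervised learning” concrete, here is a minimal sketch in Python of the general technique described above: humans hand-label example comments with a theme and a sentiment, classifiers learn from those labels, and new comments are then coded and aggregated. Everything here – the categories, the toy comments, the scikit-learn pipeline – is an illustrative assumption for this article, not Student Voice AI’s actual model, categories or data.

        # Illustrative sketch only -- not Student Voice AI's actual pipeline.
        from collections import Counter

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Hand-labelled training examples: (comment, theme, sentiment).
        labelled = [
            ("Feedback on my essays was detailed and quick", "feedback", "positive"),
            ("I never received comments on my coursework", "feedback", "negative"),
            ("The lecturer explained concepts clearly", "teaching", "positive"),
            ("Lectures felt rushed and hard to follow", "teaching", "negative"),
            ("The library is open late, which really helps", "resources", "positive"),
            ("Not enough study spaces during exam season", "resources", "negative"),
        ]
        comments = [c for c, _, _ in labelled]

        # One classifier per label type; both "learn" from the human-coded examples.
        theme_clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        theme_clf.fit(comments, [t for _, t, _ in labelled])
        sentiment_clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        sentiment_clf.fit(comments, [s for _, _, s in labelled])

        # New, uncoded survey comments are classified automatically...
        new_comments = [
            "Marking took months and the comments were vague",
            "Seminars were engaging and well structured",
        ]
        coded = zip(theme_clf.predict(new_comments), sentiment_clf.predict(new_comments))

        # ...and rolled up into the category-by-sentiment counts described above.
        print(Counter(coded))

    The real systems described here are trained on national datasets of labelled comments rather than half a dozen toy examples, but the shape of the approach – human-taught categories, then automatic coding and aggregation – is the same.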

    Now, Student Voice AI is partnering with student insight platform evasys to bring machine learning technology to qualitative data collected via the evasys platform. And evasys and Student Voice AI have been commissioned by Advance HE to code and analyse open comments from the 2025 PRES and PTES surveys – creating opportunities to drill down into a national dataset that can be segmented by subject discipline and theme as well as by institution.

    Bruce Johnson, managing director at evasys is enthused about the potential for the technology to drive culture change both in how student feedback is used to inform insight and action across institutions:

    “When you’re thinking about how to create actionable insight from survey data the key question is, to whom? Is it to a module leader? Is it to a programme director of a collection of modules? Is it to a head of department or a pro vice chancellor or the planning or quality teams? All of these are completely different stakeholders who need different ways of looking at the data. And it’s also about how the data is presented – most of my customers want, not only quality of insight, but the ability to harvest that in a visually engaging way.”

    “Coming from higher education it seems obvious to me that different stakeholders have very different uses for student feedback data,” says Stuart Grey. “Those teaching at the coalface are interested in student engagement; at the strategic level the interest is in trends and sentiment analysis; and there are also various stakeholder groups in professional services who never get to see this stuff normally, but we can generate the reports that show them what students are saying about their area. Frequently the data tells them something they knew anyway but it gives them the ammunition to be able to make change.”

    The results are in

    Duncan Berryman, student surveys officer at Queen’s University Belfast, sums up the value of AI analysis for his small team: “It makes our life a lot easier, and the schools get the data and trends quicker.” Previously schools had been supplied with Excel spreadsheets – and his team were spending a lot of time explaining and working through with colleagues how to make sense of the data on those spreadsheets. Being able to see a straightforward visualisation of student sentiment on the various themes means that, as Duncan observes rather wryly, “if change isn’t happening it’s not just because people don’t know what student surveys are saying.”

    Parama Chaudhury, professor of economics and pro vice provost education (student academic experience) at University College London explains where qualitative data analysis sits in the wider ecosystem for quality enhancement of teaching and learning. In her view, for enhancement purposes, comparing your quantitative student feedback scores to those of another department is not particularly useful – essentially it’s comparing apples with oranges. Yet the apparent ease of comparability of quantitative data, compared with the sense of overwhelm at the volume and complexity of student comments, can mean that people spend time trying to explain the numerical differences, rather than mining the qualitative data for more robust and actionable explanations that can give context to your own scores.

    It’s not that people weren’t working hard on enhancement, in other words, but they didn’t always have the best possible information to guide that work. “When I came into this role quite a lot of people were saying ‘we don’t understand why the qualitative data is telling us this, we’ve done all these things,’” says Parama. “I’ve been in the sector a long time and have received my share of summaries of module evaluations and have always questioned those summaries because it’s just someone’s ‘read.’ Having that really objective view, from a well-trained algorithm makes a difference.”

    UCL has tested two-page summaries of student comments to specific departments this academic year, and plans to roll out a version for every department this summer. The data is not assessed in a vacuum; it forms part of the wider institutional quality assurance and enhancement processes which includes data on a range of different perspectives on areas for development. Encouragingly, so far the data from students is consistent with what has emerged from internal reviews, giving the departments that have had the opportunity to engage with it greater confidence in their processes and action plans.

    None of this stops anyone from going and looking at specific student comments, sense-checking the algorithm’s analysis and/or triangulating against other data. At the University of Edinburgh, head of academic planning Marianne Brown says that the value of the AI analysis is in the speed of turnaround – the institution carries out a manual reviewing process to be sure that any unexpected comments are picked up. But being able to share the headline insight at pace (in this case via a Power BI interface) means that leaders receive the feedback while the information is still fresh, and the lead time to effect change is longer than if time had been lost to manual coding.

    The University of Edinburgh is known for its cutting-edge AI research, and boasts the Edinburgh (access to) Language Models (ELM), a platform that gives staff and students access to generative AI tools without sharing data with third parties, keeping all user data onsite and secured. Marianne is clear that even a closed system like ELM is not appropriate for unfettered student comment analysis. Generative AI platforms offer the illusion of a thematic analysis, but it is far from robust because generative AI operates through sophisticated guesswork rather than analysis of the actual data. “Being able to put responses from NSS or our internal student survey into ELM to give summaries was great, until you started to interrogate those summaries. Robust validation of any output is still required,” says Marianne. Similarly, Duncan Berryman observes: “If you asked a gen-AI tool to show you the comments related to the themes it had picked out, it would not refer back to actual comments. Or it would have pulled this supposed common theme from just one comment.”

    The holy grail of student survey practice is creating a virtuous circle: student engagement in feedback creates actionable data, which leads to education enhancement, and students gain confidence that the process is authentic and are further motivated to share their feedback. In that quest, AI, deployed appropriately, can be an institutional ally and resource-multiplier, giving fast and robust access to aggregated student views and opinions. “The end result should be to make teaching and learning better,” says Stuart Grey. “And hopefully what we’re doing is saving time on the manual boring part, and freeing up time to make real change.”

  • Will guidance on freedom of speech help the staff who fear physical attack for expressing their views?

    Just 44 days before duties on it go live, but some 389 days since it closed a consultation on it, the Office for Students (OfS) has finally published Regulatory advice 24 – its guidance to universities and colleges in England on freedom of speech that flows from the Higher Education (Freedom of Speech) Act (HEFoSA).

    The timings matter partly because it’s mid-June, there won’t be many (if any) big committee meetings left (let alone processes designed to engage with people on policy development ahead of approval), and it was OfS itself that fined the University of Sussex partly over the proper approval of some of its policies.

    And it’s not as if there are only minor drafting changes. An 11,773-word draft has become a 23,526-word final version, and the list of 30 illustrative examples has grown to 52 – despite the fact that this new version omits all the duties on students’ unions (which the government announced last year it intends to repeal), and is now also silent on the free speech complaints scheme.

    All the detailed and prescriptive expectations in the original draft over how that should be promoted have gone – largely because we’re all waiting for Parliament to debate (sensible) changes that will cause students to have to use the Office of the Independent Adjudicator (OIA), rather than OfS, to resolve any complaints in this area.

    Alongside, there’s surely a record-breaking 788-paragraph analysis of responses and decisions off the back of the eleven-question consultation, some alarming-sounding polling that will likely be making the news, and some short guides for students and staff.

    A lot of the new version of the guidance adds more detail into the examples – many are now more realistic, plenty are better at signalling the differences between “good ideas” and minimum expectations, and a whole host of them are now more accurately qualified with reference to key legal principles or tests, many of which have been emerging in case law since OfS started its consultation.

    That said, some are still so preposterous as to be useless. If there really is a college somewhere that requires students to seek written permission a month in advance to hand out leaflets or post flyers, where those flyers must be posted on a single designated noticeboard which is both small and on a campus where flyers may not be posted anywhere else, I’ll eat my hat – or maybe my pudding at the formal dinner at whichever Oxbridge college authors were reminiscing about when Example 38 was drafted.

    As there are 52 of them, this initial article doesn’t dive into all of the vignettes comprehensively – although doubtless a number of them (not least because of the judicious use of qualifiers like “depending on the facts of the case”) will continue to cause readers to cry “yeah but what about…” – which is presumably why OfS initially attempted to let lessons unfurl from the casework rather than publish guidance. And we may well end up looking at some of them in more detail in the coming days and weeks.

    What I have tried to do here is look at the major ways in which the guidance has developed, how it’s handling some of the bigger questions that both universities and their SUs were raising in responses during the process, and what this all tells us about OfS’ intended approach to regulation in this area as of August.

    As a reminder, we’re talking here about the duty to “secure” freedom of speech on campus (A1 in HEFoSA), and the expectations that OfS has around the requirements for a souped up Code of Practice (A2) for each provider. There’s no guidance (yet) over the “promote” duty (A3), and to the extent to which the previous version strayed into those areas, they’ve largely been removed.

    The sandbags are coming

    If we were to identify one theme that has dominated discussion and debate over the Free Speech Bill ever since then universities minister Michelle Donelan stumbled, live on Radio 4, into an apparent contradiction, it would be where free speech (to be protected and promoted) crosses the line into harassment – which of course, under a separate heavy new duty as of August 1st, is something to be actively prevented and prosecuted by universities. Middle grounds are no longer available.

    The good news is that the section on reconciling free speech duties with equality law, anti-harassment provisions, and other legal requirements is better than anything else OfS has published to date on the interactions and fine lines. So detailed, for example, are many of the sections that deal with harassment on campus that at times, it’s a lot more helpful than the material in the actual guidance on registration condition E5 (Harassment and Sexual Misconduct).

    People often, for example, find others’ conduct to be unpleasant or disagreeable – Para 47 reminds us that the concept of harassment in the Protection from Harassment Act 1997 is linked to a course of conduct which amounts to it, that a course of conduct must comprise two or more occasions, that the conduct must be “oppressive and unacceptable” rather than just “unattractive or unreasonable”, and must be of sufficient seriousness to also amount to a criminal offence.

    Similarly, the judgement of harassment isn’t purely subjective – it applies an objective test based on what a reasonable person would think, which helps provide a consistent standard rather than relying solely on individual perceptions.

    Hence in Example 1, a student publishes repeated comments on social media attacking another student based on lawful views, including “tagging” them in posts and encouraging others to “pile on”. The student’s speech is so “extreme, oppressive and distressing” that their course of conduct may amount to harassment – and so carrying out an investigation into the student based on a policy that bans harassment would not breach the “secure” duty.

    Much of that flows from a newly reworked version of what counts as free speech within the law that translates some of the case law and principles set by the ECHR and the UK High Court in cases like Higgs v Farmor’s School. As such, while there’s still lines in there like “The Act protects free speech within the law – it does not protect unlawful speech”, there’s now much more helpful material on the different ways in which free speech might be curtailed or interfered with given other duties.

    To get there it outlines a three step test (with some wild flowchart graphics):

    • Step 1: Is the speech “within the law”? If yes, go to step 2. If no, the duty to “secure” speech does not apply.
    • Step 2: Are there any “reasonably practicable steps” to secure the speech? If yes, take those steps. Do not restrict the speech. If no, go to step 3.
    • Step 3: Are any restrictions “prescribed by law” and proportionate under the European Convention on Human Rights?
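
    Reduced to its bones, the flowchart is a sequential decision function. Here’s a minimal sketch in Python of how the three steps chain together – the function name, parameters and outcome strings are my own illustrative labels, not OfS’, and each input stands in for what is really a contested legal judgement rather than a boolean:

        def assess_speech(within_the_law, practicable_steps, prescribed_by_law, proportionate):
            # Step 1: the "secure" duty only attaches to speech within the law
            if not within_the_law:
                return "duty does not apply: the speech is not within the law"

            # Step 2: if reasonably practicable steps exist, take them
            # and do not restrict the speech
            if practicable_steps:
                return "take these steps, do not restrict: " + ", ".join(practicable_steps)

            # Step 3: any restriction must be prescribed by law and
            # proportionate under the ECHR
            if prescribed_by_law and proportionate:
                return "a restriction may be permissible"
            return "a restriction is not permissible"

        # Illustrative run: lawful speech where moving the event would secure it
        print(assess_speech(True, ["offer a larger or alternative venue"], False, False))

    Writing it out like this makes the obvious point: the apparent precision evaporates at the inputs, because every branch turns on a judgement call.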

    There’s no doubt that it’s a more nuanced and balanced reflection of the legal position than we saw in the draft – albeit that it switches between “what to do in practice” and “what to say to students and staff in theory” in ways that are sometimes unhelpful.

    The problem is that the closer it gets to necessary complexity, the further away it gets from something that can easily be understood by the very staff and students whose day-to-day conduct and confidence (what we might call the “culture” on campus) the new duties are supposed to influence.

    More importantly, as the examples unfurl, it’s both possible to spot numerous ways in which “it’s a balance” turns into Kafka’s cake and eat it, and to see how the “reasonably practicable steps” duty turns into something genuinely hard to understand in practice.

    Someone should do something

    One thing that’s not gone is a tendency in the examples to signal to the outside world that the new rules will tackle the things they’ve read about in the Times and the Telegraph – until you realise that they won’t.

    Example 1, discussed above (and highlighted in the accompanying press release), is a classic of the genre. On the surface it looks like OfS is tackling “mobbing”. But in reality, the whole point about pile-ons is that they’re almost never about one big evil ringleader engaging in conduct that is so “extreme, oppressive and distressing” that their course of conduct may amount to harassment.

    It’s more often than not a hundred micro-oppressions having the cumulative effect of making the target feel terrible. Even if you argue that aspects of social media culture are within the influence (if not control) of a provider, in other parts of the guidance OfS seems to be saying that because each micro-act isn’t harassment, you shouldn’t be trying to meddle in the culture of the campus.

    That problem becomes amplified in the section on microaggressions. In 2019, the Equality and Human Rights Commission (EHRC) found microaggressive acts to be a key component of a culture of racism on campus – and both argued that they could have an impact on equality of opportunity and good relations between different groups, and that universities must not ignore microaggressions that do not meet the definition of harassment in the Equality Act 2010 because of the cumulative impacts of repetition.

    But as soon as universities started to tackle microaggressions by, for example, encouraging their reporting, various anti-EDI culture warriors started to raise concerns. Discussing a scheme launched by Sheffield SU to have their halls reps understand the concept, Spiked’s Joanna Williams argued:

    They will need an odd combination of extreme sensitivity to offence – alongside a high degree of insensitivity to interrupting conversations – to point out exactly where the speakers went wrong. Presumably, candidates will also have to sit some kind of test to prove their own thought purity on all matters concerned with race and ethnicity.

    The Command Paper that led to HEFoSA was also worried:

    Schemes have been established in which students are paid to report others for perceived offences.

    And as Report+Support tools started to open up avenues for students to raise issues such that universities could spot patterns, academics – among them a fairly obscure Cambridge philosopher called Arif Ahmed – started to complain:

    The encouragement to report ‘inappropriate’ or ‘offensive’ behaviour amounts to a snitches’ charter. Any risk-averse white person will simply not engage with anyone from an ethnic minority, in case an innocent or well-meaning remark is overheard, misunderstood and reported. Whatever Downing College may think, being offensive is not an offence.

    Several years on, Arif Ahmed is OfS’ Director for Freedom of Speech and Academic Freedom, asserting that his appointment and approach isn’t “political”, and launching actual regulation (Example 39) that says this:

    University A promotes an anonymous reporting process. Students are encouraged to use a portal to submit anonymous reports to senior staff of “microaggressions”, which is not further defined. The portal includes free text boxes in which reporters may name or otherwise identify the individuals being accused. University A says that it may take action against named (or identifiable) individuals on the basis of any anonymous report that it receives.

    …Depending on the circumstances, the existence of the reporting mechanism and portal may discourage open and lawful discussion of controversial topics, including political topics and matters of public interest.

    …Reasonably practicable steps that A could now take may include remov[ing] the free text boxes from the anonymous reporting portal, to be replaced with radio buttons that do not permit submission of any identifying data.

    There is a legitimate, if contested, political view that structural racism is fictional, harmful or both – and that what flows from it is division via concepts like microaggressions. There’s another view that to tackle racism you need to interrogate and tackle not just skinheads hurling abuse and painting graffiti, but the insidious yet often unintended impact of stuff like this (EHRC again):

    A recurring theme in our evidence was students and staff being dismissed as “oversensitive” and their experiences of microaggressions viewed as isolated incidents rather than a cumulative and alienating pattern of repeated slights and insults.

    Many staff and students reported that racial harassment doesn’t only happen overtly. All too often, offensive comments were justified by perpetrators as “jokes” or “banter”. The damaging effect of repeated microaggressions is often made worse by a lack of empathy and understanding when individuals decide to speak up about their treatment.

    In that “debate”, OfS has picked the side that we might have expected Arif Ahmed to pick. Whether he’s legally justified in doing so is one question – but let’s not pretend that the agenda is somehow apolitical.

    And for my next trick

    All of this is possible because of a central conceit in the guidance that relates back to a long-running theme in the rhetoric surrounding culture on campus – what we might call a “maximalist” approach to describing free speech, and a “minimalist” (specific, legal thresholds) approach to harm and harassment.

    Anything goes unless it breaks a specific law, and if you pretend otherwise you might end up “chilling” free speech.

    You might. But while insisting on an objective test to determine whether harassment has happened is a central feature, no such test of objectivity is then applied to whether a chilling effect has occurred – it becomes, in effect, about “potential” and feelings. Hence in its Sussex investigation, OfS said:

    …a chilling effect arose as a result of the Trans and Non-Binary Equality Policy Statement and the resulting breach of condition E1. By “chilling effect”, the OfS means the potential for staff and students to self-censor and not speak about or express certain lawful views. Staff and students may have self-censored as a result of the policy because they were concerned about being in breach of the policy and potentially facing disciplinary action for expressing lawful views.

    So having established that “harassment” has to amount to something objectively criminal, while “chilling” is in the eye of the Director, OfS is able to get away with railing against another newspaper favourite – by all but outlawing requirements for academic staff to issue trigger warnings. Example 50:

    Depending on the facts, issuing a “content note” (informing attendees about sensitive material) in advance of this event may not be a reasonably practicable step for A to take. A standing requirement to use content notes may encourage more intrusive investigation of the content of seminars, readings or speaker events. An expectation of content notes may also discourage academics from exposing students to new controversial material (so as not to risk wrongly including no, or the wrong type of, content note).

    You could of course just as easily argue that failing to issue “content notes” could have a chilling effect on some students’ active participation. Alternatively, you could double down and chuck in a minimalist little qualifier for cover:

    However, there may be occasions when the use of specific content notes may be helpful to enable students to access material, if there is evidence that they are in fact helpful.

    The point isn’t to debate whether they work or not – the point is that OfS suddenly gets to pick and choose what it thinks could chill, while demanding that rules reflect specificity and extremity over individual conduct for harassment. It’s culture war politics shoehorned into regulation, with the law lingering around in the background.

    Is the process the punishment?

    You might remember a major news story in 2021 when a student at Abertay was investigated after other students complained that she made “hateful, discriminatory, sexist, racist and transphobic” remarks during an online seminar on gender politics.

    Following an investigation, it was determined that Lisa Keogh had a case to answer in relation to “making inappropriate comments” which “could be construed as discriminatory” – but after a panel reviewed recordings made available from the seminar, it found no evidence of discrimination:

    As a result, the board found there was insufficient evidence to support the allegations made against you on your behaviour in class and, therefore, decided to not uphold the charge of misconduct.

    Keogh’s argument was that she should never have been subject to formal processes in the first place – and so sued.

    Her case was basically that the university acted in breach of the Equality Act 2010 by pursuing her for “expressing her gender critical beliefs” and caused “stress at the most crucial part of my university career” – but Dundee Sheriff Court dismissed her case, with Sheriff Gregor Murray saying that the university was entitled to take steps to investigate complaints:

    The number, nature and timing of the allegations, and the involvement of at least three final year students who were about to sit examinations, all placed the university in exactly the type of “tricky territory” that entitled it to investigate immediately.

    The defender was entitled to take steps to investigate complaints. It could not be guilty of discrimination simply because it did so. Following investigation in this case, the complaint against the pursuer was not upheld.

    Cases like that then get mangled into examples like Example 40 in the guidance. In the vignette, a professor expresses views that upset some students – they bring a complaint, there is a lengthy investigation process, and at the end of the process the university finds that there is no case to answer.

    This should have been clear to investigators at the outset, but the university was concerned that closing the investigation quickly would further offend the students who complained. The prospect of a lengthy investigation with an uncertain outcome may deter students and staff from putting forward unpopular views on controversial topics.

    Again, you can just as easily argue that rapidly dismissing students’ genuinely held concerns would have a chilling effect on their confidence to complain, and that students making formal complaints of this sort is so rare that a university would be wise to carefully investigate whether there’s an underlying fire accompanying the smoke.

    But as above, OfS seems to be saying “if students weren’t describing specific behaviours that would meet the harassment test, don’t even investigate” – applying a specific and objective test to harassment while being speculative and partial over its chilling test.

    A useful tool, but not that useful

    The original draft was fairly silent on antisemitism – an obvious issue given the high-profile nature of the coverage and political commentary on it, not least in the context of protests surrounding the war in Gaza.

    Notwithstanding the specific stuff on “time, place and manner” (see below and here) and what OfS might be counting as an “essential function” of a university (again, see below), what I would say is that if there’s a debate about whether action A, protest B or leaflet C amounts to antisemitism, it’s pretty obvious that those advocating the adoption of the IHRA definition are seeking to have it used when making a judgement.

    Some will argue (like Arif Ahmed once did) that universities should not adopt the definition:

    This “definition” is nothing of the kind; adopting it obstructs perfectly legitimate defence of Palestinian rights. As such it chills free speech on a matter of the first importance. I hope the Secretary of State reconsiders the need for it; but these new free speech duties ought to rule it out in any case.

    We’ve covered his mysterious conversion before – and wondered how that might manifest in any final guidance. It doesn’t, at all – but what we do get in the consultation commentary is this astonishing paragraph:

    We do not comment in this guidance on the IHRA definition of antisemitism or on any other proposed non-legally binding definition that a provider or constituent institution may wish to adopt. Nonetheless, we have adopted the IHRA definition because we believe that it is a useful tool for understanding how antisemitism manifests itself in the 21st century. The IHRA definition does not affect the legal definition of racial discrimination, so does not change our approach to implementing our regulatory duties, including our regulatory expectations of registered providers. A provider that adopts any definition (of anything) must do so in a way that has particular regard to, and places significant weight on, the importance of freedom of speech within the law, academic freedom and tolerance for controversial views in an educational context or environment.

    Some will argue that adoption – either by OfS or providers – has precisely the kind of chilling effects that are railed against at length throughout the guidance. Others will argue that adoption as a kind of interesting window dressing without using it to make judgements about things is pointless, raises expectations that can’t later be met, and allows antisemitism to go unchecked.

    I’d argue that this is another classic case of Kafka’s cake and eat it – which dumps a deep set of contradictions on universities and requires attention and leadership from regulators and politicians. We are still not there.

    Practicably reasonable

    As well as that central thread, there are various other issues in the guidance worthy of initial note.

    A major concern from mission groups was the way in which the new duty might play out over transnational branch campuses – especially those with rather more oppressive legal regimes over expression than here.

    We might have expected OfS to use some sort of “what’s practicable relates to the law in the country you’re delivering in” qualifier, but it has somehow managed to square the circle by simply stating, with no further qualification (P13) that:

    HERA does not require providers or constituent institutions to take steps to secure freedom of speech in respect of their activities outside England.

    It’s an… interesting reading, which is maybe related to the usual territorial extent qualifiers in legislation – the consultation commentary is similarly (and uncharacteristically) silent – but what it does appear to do is contradict the usual prescription that it’s where the main base of the provider is, not where its provision is, that sets the duties.

    Even if some legal workaround has been found, it does start to call into question how or why OfS can regulate the quality of your provision in Dubai while not worrying about freedom of speech.

    Another section with a mysteriously short sentence is one on the original Donelan conundrum:

    The OfS will not protect Holocaust denial (by visiting speakers or anyone else).

    That’s a carefully worded sentence which seems to be more about OfS making choices about its time than an explanatory legal position. Unlike in many other countries, Holocaust denial is not in and of itself illegal in the UK – although, in the weigh-up, Article 17 of the ECHR removes protection from speech that is contrary to fundamental Convention values, and cases in the UK have tended to be prosecuted under other legislation, such as section 127 of the Communications Act 2003, when the content is deemed “grossly offensive”.

    Quite why OfS has not chosen to explain that is unclear – unless it’s worried about revealing that all sorts of other types of grossly offensive stuff might fall under the balancing provision. And more to the point, as I’ve often said on the site, most Holocaust deniers don’t announce that the title of their talk in Room 4b on Tuesday evening will be “the Holocaust is a fiction” – which opens up the question of whether or not it’s OK to outlaw Holocaust deniers who may or may not engage in actual Holocaust denial when they turn up.

    The sole example in the guidance on the weigh-ups over external speakers and extremism is one where the proposed speaker is a self-professed member of a proscribed group. It’s easy to say “well it’s fine to ban them” – what we don’t have here is anything meaningfully helpful on the real cases being handled every year.

    And some of the media’s hardy perennials – universities doing things like signing up to charters with contested “values” or engaging in contested work like decolonisation – are also either carefully contorted or preposterous.

    Hence Example 51 describes a university that [overtly] requires that all teaching materials on British history will represent Britain in a positive light – one of the many not-as-clever-as-the-authors-think-they-are inversions of the allegations often thrown at woke, UK-history-hating academics.

    Meanwhile Example 52 nudges and winks at the Stonewall Charter by describing a department of a university that applies for accreditation to a charter body with links to the fossil fuel industry, where the accreditation process requires it to sign up to a set of principles that include:

    Fossil fuel exploration is the best way to meet our future energy needs.

    The text underneath is fascinating. Once you’ve got the “depending on the circumstances” qualifier out of the way, we learn that “institutional endorsement of this principle may discourage expression of legally expressible views”. That’s your “chilling” allegation again.

    But rather than warning against signing it, we merely get:

    …not implementing the provisions of any accreditation that risks undermining free speech and academic freedom is likely to be a reasonably practicable step that university B should now take.

    Replace that with the International Holocaust Remembrance Alliance (IHRA) definition of antisemitism, and you can see why the fudge above will satisfy no-one.

    I’ve read the para in the guidance several times now, and each time I come away with a different reading. Either the university can take a position on contested ideas as long as these aren’t imposed on staff, or it can’t, because taking a position on contested ideas would chill staff. Flip a coin.

    It’s that sort of thing that makes the otherwise helpful section clarifying that you can have a code of conduct for staff and students look so silly. Codes of conduct are fine as long as any restrictions on speech reference a legal rule or regime which authorises the interference, the student, member, member of staff or visiting speaker affected by the interference has adequate access to the rule, and the rule is:

    …formulated with sufficient precision to enable the student, member of staff or visiting speaker to foresee the circumstances in which the law would or might be applied, and the likely consequences that might follow.

    I’d tentatively suggest that while that makes sense, OfS’ own guidance represents a set of rules where foreseeing how it might respond to a scenario, and the likely consequences that might follow, is clear as mud.

    To clear up protest and disruption rights, OfS stresses viewpoint neutrality, uses its “time, place and manner” confection we first saw last year, and also has a new oft-repeated “essential functions” of higher education qualifier of:

    …learning, teaching, research and the administrative functions and the provider’s or constituent institution’s resources necessary for the above.

    I can’t really call whether OfS thinks the sports hall counts, or whether it thinks the encampment is OK there, but not in a seminar room. Either way, it’s another of those vague definitions that feels open to abuse and interpretation by all sides of a dispute and by OfS itself.

    Another allegation thrown at universities is often about EDI training – Example 53 sets up the idea that an online EDI induction asks if white people are complicit in the structural racism pervading British society, where the only answer marked correct is “True” – a candidate who ticks “False” is required to re-take the test until they have explicitly assented to “True”.

    Maybe I’m being naive, but if that’s grounded in a real example I’d be more worried about that provider’s wider approaches to teaching and assessment than its approach to free speech.

    This university is a vile hell-hole

    A few other fun bits. Fans of reputation management will be disappointed to learn at Example 22 that a social media policy requiring staff not to post material that is “unnecessarily critical”, coupled with a strong but lawful pop at the provider’s employment practices in a public post on social media, would represent a “protect” policy breach and a “protect” practice breach if the staff member ends up with a warning.

    Meanwhile, notwithstanding the silence over whether full-time SU officers are members or students of a provider, Example 23 has a student representative posting unfavourable commentary on university management on the SU’s website, along with some student testimonials describing students’ experiences of accommodation:

    University Z requires the student to remove this post on the grounds that if the post is reported more widely in the media, this would threaten University Z’s recruitment plans.

    That this would be a breach may feel like a problem only for the small number of universities whose senior managers have directly threatened SU officers over TEF student submission drafts.

    But more broadly, like so many other examples in the guidance, neither the staff example nor the student example gets at broader culture issues.

    You might argue that “reasonably practicable steps” in both cases might involve specific commitments to enable dissent, or more explicit encouragement of public discussion over controversial issues.

    You could certainly argue that much of the committee discussion marked “confidential” should be nothing of the sort, and that non-disclosure agreements imposed on settled-with complainants outside of the specific ban on those in sexual misconduct cases should be outlawed.

    You could also argue that in both cases, fears over future funding – your salary for the staff member, your block grant for the SU officer – are classic chillers that need specific steps to be taken. Alas, none of that sort of “why” stuff appears.

    There’s also still a whole bunch of headscratchers. What happens when three different providers have three different sets of policies and codes and all franchise their provision to a fourth provider? Should providers be inspecting the reputation rules in the employment contracts of their degree apprentices or other credit-based work-based learning? Now that the requirement to tell all new students about all this has been softened, isn’t there still a need to include a lot of FoS material in the still-compulsory training to be offered as per E5? And so on.

    In the complaints scheme consultation, there was some controversy over the definition of visiting speakers – including when an invitation manifested as an actual invitation and who was capable of extending one. On this, OfS has actually decided to expand its definition – but neatly sidesteps the Amber Rudd dilemma, namely that while it’s easy to expect people in power to not cancel things because some object, it’s a lot harder to make a volunteer student society run an event that it changes its mind about, regardless of the reason.

    And when the guidance says that OfS would “generally” expect providers to reject public campaigns to punish a student or member of staff for lawful expression of an idea or viewpoint that does not violate any lawful internal regulations – even where those campaigns take the form of “organised petitions or open letters, an accumulation of spontaneous or organised social media posts, or long-running, focused media campaigns” – we are still stuck in a situation where some basic principles of democracy for anyone elected on campus (staff, but more often than not, students) come into direct conflict with that expectation.

    Changing the culture

    There may well be plenty more to spot in here – legal eagles will certainly be poring over the document, expectations on all sides may need to be reset, and all in a context of very tight timescales – not least because much of the material implies a need for a much wider review of related policies than just “write a compliant Code”.

    Everyone should also think carefully about the YouGov polling. There are some important caveats to be attached to the results and to some of the splits, based on wording, assumptions and whether it’s even reasonable to expect someone teaching something highly technical to be wading into the sex and gender debate. And whether you’re teaching, researching or otherwise supporting, it must be the case that not all subject areas include as much scope for controversy and “debate” as others.

    But even if you quibble over the N equalling 184, when 24 per cent of those who do not feel free in their teaching cite fear of physical attack, there is a problem that needs urgent interrogation and resolution.


    (Thanks as ever to DK for the visualisation of the YouGov polling – sample size 1234 adults and weighted for teaching staff in England, by age, gender, region, and contract type)

    We also still have the debate over the partial repeal of the Act to come too, some additional complexity over complaints to resolve, and as I note above, huge questions like “so can we adopt the International Holocaust Remembrance Alliance (IHRA) definition of antisemitism or not” remain unanswered – as well as a set of inevitable conflicts to come over the practical application of the Supreme Court ruling on the meaning of “woman” in EA2010.

    I should also say that I’ve not had time to properly interrogate the research aspects in the guidance – but we’ll get to that with my colleague James Coe in the coming days.

    What I’m mainly struck by – other than the ways in which a particular set of (contested) views on campus culture have been represented as apolitical – is the way in which, ultimately, much of the material comes down to the regulatory realities of expecting authority to behave.

    In some senses, that’s not unreasonable – governors and leaders hold considerable influence and power over students and staff, and what they ban, or punish, or encourage or celebrate can have important impacts that can be positive for some, and negative for others.

    But to the extent to which there really is a problem with free speech (and academic freedom) on campus, much of it feels much wider and organic than the hermetically sealed campus community assumptions at play in documents of this sort.

    I won’t repeat so many of the things I’ve said on the site over the past few years about confidence being key to a lot of this – suffice to say that the freedom ideal at play in here feels like something that is easier to experience when steps have been taken to improve people’s security, give them time and space to interact meaningfully with each other, and specifically boost their bravery.

    Some of the solutions should be about resolving conflicts and integrating these concerns into a more stable definition of what it is to be a member of staff or a student. But of all the agendas in higher education, it strikes me that this remains one where solutions, sticks and games of blame abound, while causal analysis feels hopelessly weak.

    In the absence of alternative guidance on the “promote” duty, if I were high up in a university, I’d be resolving to interrogate more carefully and listen more closely before pretending that my shiny new Code of Practice would do anything other than tick the boxes while making matters worse.


  • Flexible Learning and Policy Challenges

    Flexible Learning and Policy Challenges

    What impact is flexible learning having on learners from K-12 through to professional development?

    New Zealand has remarkably high levels of digital access across the population. Why aren’t we outperforming other countries in educational measurements?

    This piece serves to introduce a series of six challenges faced by policy makers around flexible learning.

    These six challenges are:

    1. Unequal Access to Technology and Connectivity
    2. Socioeconomic Disparities
    3. Digital Literacy and Skills Gaps
    4. Quality Assurance and Consistent Experience
    5. Teacher Preparedness and Support
    6. Policy and Funding Models

    In this first piece I want to establish what I mean by ‘flexible learning’.

    Like many, I struggle to have a single, concise, and consistent “definition” of flexible learning. I would say that flexible learning is a model of delivery that offers learners agency and control over various aspects of their learning experience. Flexible learning is a spectrum. Formal learning courses exist on a continuum between “rigid” and “flexible” delivery. The more control and choice given to the learner, the more flexible the learning experience.

    Flexible learning aims to “empower the student to choose what learning should be studied face-to-face and that which should be studied online, and how to go about engaging with that learning” (2022a). This means empowering the learner to make choices regarding:

    • When: synchronous or asynchronous learning, pace-mandated or self-paced progression.
    • Where: Learning in different locations (home, campus, workplace, etc.).
    • How: Different modes of engagement (online, in-person, blended, hybrid, hyflex).
    • What: Some degree of choice over content or learning pathways, though this is often more associated with “open learning.” Indeed, in a world where students are overwhelmed with choices, there are strong arguments that having a prescriptive programme serves students well.

    In my article “Definitions of the Terms Open, Distance, and Flexible in the Context of Formal and Non-Formal Learning” (2023), I argued that flexible learning is a model of delivery, rather than a fundamental mode of learning. I posit that there are only two core modes of learning: in-person (or face-to-face) and distance learning. Flexible learning then emerges from various combinations and approaches to curriculum design that empower learners to choose between these two modes.

    As education has a habit of inventing new terms for marginally different practices, it might be worth just pointing out the relationship I think exists between flexible learning and forms of blended, hybrid, and HyFlex learning. I perceive blended, hybrid, and HyFlex learning as specific models of delivery that fall under the umbrella of flexible learning. They all aim to give agency to the learner regarding how they engage with the material, combining elements of in-person and distance learning.

    I believe that designing for flexible learning means considering the learner’s context and perspective, and creating learning experiences that are relevant, meaningful, motivating, realistic, and feasible within an agreed timeframe. This also involves careful consideration of learning outcomes and assessment in diverse delivery contexts. This means course creators need clarity about learning design principles in relation to flexible approaches, such as working with Notional Study Hours (2020a) and the importance of Learning Outcomes (2020b).

    Based on my broad definition – that flexible learning refers to educational approaches and models of delivery that provide learners with a significant degree of choice and control over the time, place, pace, and mode of their learning, leveraging combinations of in-person and distance learning to enhance accessibility and cater to diverse learner needs – how do we face those six policy challenges?

    Watch this space…

    Atkinson, S. P. (2020a, April 14). Working with Notional Study Hours (NSH) or “How much is enough?” Simon Paul Atkinson. https://sijen.com/2020/04/14/working-with-notional-study-hours-nsh-or-how-much-is-enough/

    Atkinson, S. P. (2020b, April 4). Designing Courses: Importance of Learning Outcomes. Simon Paul Atkinson. https://sijen.com/2020/04/04/designing-courses-importance-of-learning-outcomes/

    Atkinson, S. P. (2022a, July 15). How do you define hybrid, or hyflex, learning? Simon Paul Atkinson. https://sijen.com/2022/07/15/how-do-you-define-hybrid-or-hyflex-learning/

    Atkinson, S. P. (2023). Definitions of the Terms Open, Distance, and Flexible in the Context of Formal and Non-Formal Learning. Journal of Open, Flexible, and Distance Learning, 26(2). https://jofdl.nz/index.php/JOFDL/article/view/521
