Tag: Data

  • What the latest HESA data tells us about university finances

    The headlines from the 2023-24 annual financial returns were already pretty well known back in January.

    Even if you didn’t see Wonkhe’s analysis at the time (or the very similar Telegraph analysis in early May), you’d have been well aware that things have not been looking great for the UK’s universities and other higher education providers for a while now, and that a disquieting number of these are running deficits and/or making swingeing cuts.

    What the release of the full HESA Finance open data allows us to do is to peer even deeper into what was going on last academic year, and start making sense of the way in which providers are responding to these ongoing and worsening pressures. In particular, I want to focus on expenditure in this analysis – it has become more expensive to do just about everything in higher education, and although the point about the inadequacy of fee and research income has been well and frequently made, there has been less focus on just how much more it costs to do anything.

    Not all universities

    The analysis is necessarily incomplete. The May release deals with providers who have a conventional (for higher education) financial year – one that matches the traditional academic year and runs through to the end of August. As the sector has become more diverse the variety of financial years in operation has grown. Traditional large universities have stayed with the status quo – but the variation means that we can’t talk about the entire sector in the same way as we used to, and you should bear this in mind when looking at aggregate 2023-24 data.

    A large number of providers did not manage to make a submission on time. Delays in getting auditor sign off (either because there was an audit capacity problem due to large numbers of local authorities having complex financial problems, or because universities themselves were having said complex financial problems) mean that we are down 18 sets of accounts. A glance down the list shows a few names known to be struggling (including one that has closed and one that has very publicly received a state bailout).

    So full data for the Dartington Hall Trust, PHBS-UK, Coventry University, Leeds Trinity University, Middlesex University, Spurgeon’s College, the University of West London, The University of Kent, University of Sussex, the Royal Central School of Speech and Drama, The Salvation Army, The London School of Jewish Studies, Plymouth Marjon University, the British Academy of Jewellery Limited, Multiverse Group Limited, the London School of Architecture, The Engineering and Design Institute London (TEDI) and the University of Dundee will be following some time in autumn 2025.

    Bad and basic

    HESA’s Key Financial Indicators (KFIs) are familiar and well-documented, and would usually be the first place you would go to get a sense of the overall financial health of a particular university.

    I’m a fan of net liquidity days (a measure showing the number of days a university could run for in the absence of any further income). Anything below a month (31 days) makes me sit up and take notice – when you exclude the pension adjustment (basically money that a university never had and would never need to find – it’s an actuarial nicety linked to the unique way USS is configured) there are 10 large-ish universities in that boat, including some fairly well known names.
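    For readers who want to reproduce the indicator, the arithmetic is simple. Here is a minimal sketch – my own illustration, not HESA’s exact KFI formula, and the provider figures are hypothetical:

    ```python
    def net_liquidity_days(liquid_assets, annual_expenditure, pension_adjustment=0.0):
        """Days a provider could keep running on liquid assets alone.

        liquid_assets: cash plus investments readily convertible to cash.
        annual_expenditure: total recurrent expenditure for the year.
        pension_adjustment: non-cash actuarial pension cost to exclude
        (money the provider never actually held or spent).
        """
        daily_spend = (annual_expenditure - pension_adjustment) / 365
        return liquid_assets / daily_spend

    # A hypothetical provider: £40m of liquid assets against £520m of annual
    # expenditure, £30m of which is the non-cash pension adjustment.
    days = net_liquidity_days(40e6, 520e6, 30e6)
    print(round(days))  # 30 – just under the 31-day threshold flagged above
    ```

    Stripping out the adjustment matters for comparability: leaving it in would inflate apparent daily spend and shave a day or two off the figure for USS institutions, which is why the comparison above excludes it.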

    [Full screen]

    Just choose your indicator of interest in the KFI box and mouse over a mark in the chart to see a time series for the provider of your choice. You can find a provider using the highlighter – and if you want to look at an earlier year on the top chart there’s a filter to let you do that. I’ve filtered out some smaller providers by default as the KFIs are less applicable there, but you can add them back in using the “group” filter.

    I’d also recommend a look at external borrowing as a percentage of total (annual) income – there are some providers in the sector that are very highly leveraged who would both struggle to borrow additional funds at a reasonable rate and are likely to have substantial repayments and stringent covenants that severely constrain the strategic choices they can make.

    Balance board

    This next chart lets you see the fundamentals of your university’s balance sheet – with a ranking by overall surplus and deficit at the top. There are 29 large-ish providers who reported a deficit (excluding the pension adjustments again) in 2023-24, with the majority being the kind of smaller modern providers that train large parts of our public sector workforce. These are the kind of universities that are unlikely to have substantial income beyond tuition fees, but will still have a significant cost base to sustain (usually staffing costs and the wider estates and overheads that make the university work).

    [Full screen]

    This one works in a pretty similar way to the chart above – mousing over a provider mark on the main surplus/deficit ranking lets you see a simplified balance sheet. The colours show the headline categories, but these are split into more useful indications of what income or expenditure relates to. Again, by default and for ease of reading I have filtered out smaller providers, but you can add them in using the “group” filter. For definitions of the terms used, HESA has a very useful set of notes below table 1 (from which this visualisation is derived).

    There’s very little discretionary spend within the year – everything pretty much relates to actually paying staff, actually staying in regulatory compliance, and actually keeping the lights on and the campus standing: all things with a direct link to the student experience. For this reason, universities have in the past been more keen to maximise income than bear down on costs, although the severity and scope of the current pressure means that cuts that students will notice are becoming a lot more common.

    What universities spend money on

    As a rule of thumb, about half of university expenditure is on staff costs (salaries, pensions, overheads). These costs rise slowly but relatively predictably over time, which is why the increase in National Insurance contributions (which we will see reflected in next year’s accounts) came as such an unwelcome surprise.

    But the real pressure so far has been on the non-staff, non-finance costs – which have risen from below 40 per cent of total expenditure a decade ago to rapidly approaching 50 per cent this year (note that these figures are not directly comparable, but the year to date includes most larger providers, and the addition of the smaller providers in the regular totals for other years will not change things much).

    What are “other costs”? Put all thoughts of shiny new buildings from your mind (as we will see, these are paid for with capital, and only show up in recurrent budgets as finance costs) – once again, we are talking about the niceties of there being power, sewage, wifi, printer paper, and properly maintained buildings and equipment. The combination of inflationary increases and a rise in the cost of raw materials and logistics – a result of the absolute state of the world right now – has pushed these costs up sharply.

    [Full screen]

    Though this first chart defaults to overall expenditure you can use it to drill down as far as individual academic cost centres using the “cc group” and “cc” filters. Select your provider of interest (“All providers” shows the entire sector up to 2022-23, “All providers (year to date)” shows everything we know about for 2023-24). It’s worth being aware that these are original, not restated, accounts so there may be some minor discrepancies with the balance sheets (which are based on restated numbers).

    The other thing we can learn from table 8 is how university spending is and has been split proportionally between cost centres. Among academic subject areas, one big story has been the rise in spending in business and management – these don’t map cleanly to departments on the ground, but the intention to ready your business school for the hoped-for boom in MBA provision is very apparent.

    [Full screen]

    That’s capital

    I promised I’d get back to new builds (and large refurbishment/maintenance projects) and here we are. Spending is categorised as capital expenditure when it contributes to the development of an asset that will realise value over multiple financial years. In the world of universities spend is generally either on buildings (the estate more generally) or equipment (all the fancy kit you need to do teaching and research).

    What’s interesting about the HESA data here is that we can learn a lot about the source of this capital – it’s fairly clear for instance that the big boom in borrowing when OfS deregulated everything in 2019-20 has long since passed. “Other external sources” (which includes things like donations and bequests) are playing an increasingly big part in some university capital programmes, but the main source remains “internal funds” drawn from surpluses realised in the recurrent budget. These now constitute more than 60 per cent of all capital spend – by contrast external borrowing is less than ten per cent (a record low in the OfS era).

    [Full screen]

    What’s next?

    As my colleague Debbie McVitty has already outlined on the site, the Office for Students chose the same day to publish their own analysis of this crop of financial statements plus an interim update giving a clearer picture of the current year alongside projections for the next few.

    Rather than sharing any real attempt to understand what is going on around the campuses of England, the OfS generally uses these occasions to complain that actors within a complex and competitive market are unable to spontaneously generate a plausible aggregate recruitment prediction. It’s almost as if everyone believes that the expansion plans they have very carefully made using the best available data and committed money to will actually work.

    The pattern with these tends to be that next year (the one people know most about) will be terrible, but future years will gradually improve as awesome plans (see above) start to pay off. And this iteration, even with the extra in-year data which contributes to a particularly bad 2025-26 picture, is no exception.

    While the HESA data allows for an analysis of individual provider circumstances, the release from OfS covers large groups of providers – mixing in both successful and struggling versions of a “large research intensive” or “medium” provider in a generally unhelpful way.

    [Full screen]

    To be clear, the regulator understands that different providers (though outwardly similar) may have different financial pressures. It just doesn’t want to talk in public about which problems are where, and how it intends to help.

  • Otus Wins Gold Stevie® Award for Customer Service Department of the Year

    CHICAGO, IL (GLOBE NEWSWIRE) — Otus, a leading provider of K-12 student data and assessment solutions, has been awarded a prestigious Gold Stevie® Award in the category of Customer Service Department of the Year at the 2025 American Business Awards®. This recognition celebrates the company’s unwavering commitment to supporting educators, students, and families through exceptional service and innovation.

    In addition to the Gold award, Otus also earned two Silver Stevie® Awards: one for Company of the Year – Computer Software – Medium Size, and another honoring Co-founder and President Chris Hull as Technology Executive of the Year.

    “It is an incredible honor to be recognized, but the real win is knowing our work is making a difference for educators and students,” said Hull. “As a former teacher, I know how difficult it can be to juggle everything that is asked of you. At Otus, we focus on building tools that save time, surface meaningful insights, and make student data easier to use—so teachers can focus on what matters most: helping kids grow.”

    The American Business Awards®, now in their 23rd year, are the premier business awards program in the United States, honoring outstanding performances in the workplace across a wide range of industries. The competition receives more than 12,000 nominations every year. Judges selected Otus for its outstanding 98.7% customer satisfaction with chat interactions, and exceptional 89% gross retention in 2024. They also praised the company’s unique blend of technology and human touch, noting its strong focus on educator-led support, onboarding, data-driven product evolution, and professional development.

    “We believe great support starts with understanding the realities educators face every day. Our Client Success team is largely made up of former teachers and school leaders, so we speak the same language. Whether it’s during onboarding, training, or day-to-day communication, we’re here to help districts feel confident and supported. This recognition is a reflection of how seriously we take that responsibility and energizes us to keep raising the bar,” said Phil Collins, Ed.D., Chief Customer Officer at Otus.

    Otus continues to make significant strides in simplifying teaching and learning by offering a unified platform that integrates assessment, data, and instruction—all in one place. Otus has supported over 1 million students nationwide by helping educators make data-informed decisions, monitor progress, and personalize learning. These honors reflect the company’s growth, innovation, and steadfast commitment to helping school communities succeed.

    About Otus

    Otus, an award-winning edtech company, empowers educators to maximize student performance with a comprehensive K-12 assessment, data, and insights solution. Committed to student achievement and educational equity, Otus combines student data with powerful tools that provide educators, administrators, and families with the insights they need to make a difference. Built by teachers for teachers, Otus creates efficiencies in data management, assessment, and progress monitoring to help educators focus on what matters most—student success. Today, Otus partners with school districts nationwide to create informed, data-driven learning environments. Learn more at Otus.com.

    Stay connected with Otus on LinkedIn, Facebook, X, and Instagram.


  • How universities can turn QILT data into action – Campus Review

    Universities today have access to more data than ever before to assess student success and graduate outcomes. But having data is only part of the equation. The real challenge is turning insights into action.  

  • Govt. data error sparks doubt over US international enrolments

    The reliability of federal datasets is under scrutiny after an error was identified on the Student and Exchange Visitor Information System (SEVIS) website that appeared to show stagnating international student numbers from August 2024 to the present.  

    The error, brought to The PIE News’s attention by EnglishUSA, casts doubt on recent headlines and media reports about declining international student enrolments in the US, with SEVIS data appearing to show an enrolment decline of 11% between March 2024 and March 2025.  

    “Starting in August 2024, the data appears to be duplicated month after month, with flatlined totals for students on F and M visas. These figures show virtually no fluctuation during a period when natural enrolment shifts would be expected,” explained EnglishUSA executive director, Cheryl Delk-Le Good.  

    “This irregularity comes at a time of heightened concern within the field, particularly as educators and administrators manage the fallout from widespread SEVIS terminations and the resulting confusion around visa status for international students,” added Delk-Le Good.  

    The US Department of Homeland Security (DHS), which runs SEVIS, was alerted to the error on April 14 and said it was “working to resolve the issue”.  

    As of April 25, the dataset has not been updated, and DHS has not responded to The PIE’s request for comment.  

    [Figure source: US International Trade Administration, Market Diversification Tool for International Education, 2023. Retrieved April 11, 2025.]

    Notably, the inaccuracies begin in August 2024 and span both US administrations, suggesting “a computer glitch rather than an intentional act,” said Mark Algren – interim director of the Applied English Center at the University of Kansas and a contributor to EnglishUSA’s data initiatives – who noticed the anomaly.  

    However, Algren added that he had “no idea why someone didn’t catch it,” with the considerable timeframe of the glitch likely to hamper confidence in federal datasets that are relied on by institutions and that ensure transparency in the system.  

    Total F&M visa holders in the US: 

    Month  Total F&M  Change from previous month 
    August 24   1,091,134  -59,822 
    September 24   1,091,137  +3 
    October 24  1,091,141  +4 
    November 24  1,091,144  +3 
    January 25  1,091,142  -2 
    February 25  1,091,155  +13 
    March 25  1,091,161  +11 
    Source: SEVIS
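    A pattern like this is easy to flag programmatically. The following is a rough sketch of my own – not the check EnglishUSA actually ran – that flags consecutive months whose totals barely move, using the figures from the table above:

    ```python
    # SEVIS monthly F&M totals from the table above (December is absent
    # from the published data).
    sevis_totals = {
        "Aug 24": 1_091_134, "Sep 24": 1_091_137, "Oct 24": 1_091_141,
        "Nov 24": 1_091_144, "Jan 25": 1_091_142, "Feb 25": 1_091_155,
        "Mar 25": 1_091_161,
    }

    def flatlined(totals, threshold=1000):
        """Return consecutive month pairs whose absolute change falls below
        `threshold` – far smaller than normal seasonal enrolment swings."""
        months = list(totals)
        return [(a, b) for a, b in zip(months, months[1:])
                if abs(totals[b] - totals[a]) < threshold]

    # All six month-to-month steps are flagged: changes of a handful of
    # students against a population of over a million.
    print(flatlined(sevis_totals))
    ```

    The threshold here is arbitrary; the point is that single-digit monthly movements in a dataset of 1.09 million records are exactly the kind of anomaly a routine automated check would catch.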

    It is important to note that each monthly dataset recorded by SEVIS is a snapshot of a given day that month, and the drop recorded in August 2024 (which is considered the last accurate figure) could have been taken before many students arrived for the fall academic term.  

    For this reason, “it’s hard to say that an August report is representative of the following fall term,” said Algren, with the true figures yet to be seen.  

    At the start of the 2024/25 academic year, IIE’s fall snapshot reported a 3% rise in international student enrolment, building on sustained growth over the last three years. 

    Despite recent uncertainty in the US caused by the current administration’s attacks on higher education, the period of SEVIS’ misreporting represents an earlier timeframe, before the impact of Trump’s policies came into effect.

  • AI Runs on Data — And Higher Ed Is Running on Empty

    Let’s cut to it: Higher ed is sprinting toward the AI revolution with its shoelaces untied.

    Presidents are in boardrooms making bold declarations. Provosts are throwing out buzzwords like “machine learning” and “predictive modeling.” Enrollment and marketing teams are eager to automate personalization, deploy chatbots, and rewrite campaigns using tools like ChatGPT.

    The energy is real. The urgency is understandable. But there’s an uncomfortable truth institutions need to face: You’re not ready.

    Not because you’re not visionary. Not because your teams aren’t capable. But because your data is a disaster.

    AI is not an easy button

    Somewhere along the way, higher ed started treating AI like a miracle shortcut — a shiny object that could revolutionize enrollment, retention, and student services overnight.

    But AI isn’t a magic wand. It’s more like a magnifying glass, exposing what’s underneath.

    If your systems are fragmented, your records are outdated, and your departments are still hoarding spreadsheets like it’s 1999, AI will only scale the chaos. It won’t save you – it’ll just amplify your problems.

    When AI goes sideways

    Take the California State University system. They announced their ambition to become the nation’s first AI-powered public university system. But after the headlines faded, faculty across the system were left with more questions than answers. Where was the strategy? Who was in charge? What’s the plan?

    The disconnect between vision and infrastructure was glaring.

    Elsewhere, institutions have already bolted AI tools onto outdated systems, without first doing the foundational work. The result? Predictive models that misidentify which students are at risk. Dashboards that contradict themselves. Chatbots that confuse students more than they support them.

    This isn’t an AI failure. It’s a data hygiene failure.

    You don’t need hype — You need hygiene

    Before your institution invests another dollar in AI, ask these real questions:

    • Do we trust the accuracy of our enrollment, academic, and financial data?
    • Are we still manually wrangling CSVs each month just to build reports?
    • Do our systems speak the same language, or are they siloed and outdated?
    • Is our data governance robust enough to ensure privacy, security, and usefulness?
    • Have we invested in the unglamorous but essential work (e.g., integration pipelines, metadata management, and cross-functional alignment)?

    If the answer is “not yet,” then congratulations — you’ve found your starting point. That’s your AI strategy.

    Because institutions that are succeeding with AI, like Ivy Tech Community College, didn’t chase the trend. They built the infrastructure. They did the work. They cleaned up first.

    What true AI readiness looks like (a not-so-subtle sales pitch)

    Let’s be honest: there’s no shortage of vendors selling the AI dream right now. Slick demos, lofty promises, flashy outcomes. But most of them are missing the part that actually matters — a real, proven plan to get from vision to execution.

    This is where Collegis is different. We don’t just sell transformation. We deliver it. Our approach is grounded in decades of experience, built for higher ed, and designed to scale.

    Here’s how we help institutions clean up the mess and build a foundation that makes AI actually work:

    Connected Core®: Your data’s new best friend

    Our proprietary Connected Core solution connects systems, eliminates silos, and creates a single source of truth. It’s the backbone of innovation — powering everything from recruitment to reporting with real-time, reliable data.

    Strategy + AI alignment: Tech that knows where it’s going

    We don’t just implement tools. We align technology to your mission, operational goals, and student success strategy. And we help you implement AI ethically, with governance frameworks that prioritize transparency and accountability.

    Analytics that drive action

    We transform raw data into real insights. From integration and warehousing to dashboards and predictive models, we help institutions interpret what’s really happening — and act on it with confidence.

    Smarter resource utilization

    We help you reimagine how your institution operates. By identifying inefficiencies and eliminating redundancies, we create more agile, collaborative workflows that maximize impact across departments.

    Boosted conversion and retention

    Our solutions enable personalized student engagement, supporting the full lifecycle from inquiry to graduation. That means better conversion rates, stronger persistence, and improved outcomes.

    AI wins when the infrastructure works

    Clean data isn’t a project — it’s a prerequisite. It’s the thing that makes AI more than a buzzword. More than a dashboard. It’s what turns hype into help.

    And when you get it right, the impact is transformational.

    “The level of data mastery and internal talent at Collegis is some of the best-in-class we’ve seen in the EdTech market. When you pair that with Google Cloud’s cutting-edge AI innovation and application development, you get a partnership that can enable transformation not only at the institutional level but within the higher education category at large.”

    — Brad Hoffman, Director, State & Local Government and Higher Education, Google

    There are no shortcuts to smart AI

    AI can only be as effective as the foundation it’s built on. Until your systems are aligned and your data is trustworthy, you’re not ready to scale innovation.

    If you want AI to work for your institution — really work — it starts with getting your data house in order. Let’s build something that lasts. Something that works. Something that’s ready.

    Curious what that looks like? Let’s talk. We’ll help you map out a real, achievable foundation for AI in higher ed.

    You stuck with me to the end? I like you already! Let’s keep the momentum going. If your wheels are turning and you’re wondering where to start, our Napkin Sketch session might be the perfect next step. It’s a fast, collaborative way to map out your biggest data and tech challenges—no pressure, no sales pitch, just a conversation. Check it out!

    Innovation Starts Here

    Higher ed is evolving — don’t get left behind. Explore how Collegis can help your institution thrive.

  • Data shows growing GenAI adoption in K-12

    Key points:

    • K-12 GenAI adoption rates have grown–but so have concerns 
    • A new era for teachers as AI disrupts instruction
    • With AI coaching, a math platform helps students tackle tough concepts
    • For more news on GenAI, visit eSN’s AI in Education hub

    More than half of K-12 educators (55 percent) have positive perceptions about GenAI, despite concerns and perceived risks in its adoption, according to updated data from Cengage Group’s “AI in Education” research series, which regularly evaluates AI’s impact on education.

  • Sexual misconduct data is coming – here’s what universities should do to prepare

    In 2024, the Office for Students (OfS) launched a pilot survey asking UK students about sexual misconduct during their time in higher education.

    For the first time, there is now a national attempt to capture data on how widespread such incidents are, and how effectively students are supported when they come forward.

    The release of the survey’s results will be a moment that reflects a growing reckoning within the sector: one in which the old tools and quiet handling of disclosures are no longer fit for purpose, and the need for culture change is undeniable.

    This new initiative – known as the Sexual Misconduct Survey (SMS) – ran as a supplement to the National Student Survey (NSS), which since 2005 has become a familiar, if evolving, feature of the higher education calendar.

    While the NSS focuses on broad measures of the student experience, the SMS attempts to delve into one of its most difficult and often under-reported aspects – sexual harassment, violence, and misconduct.

    Its arrival comes against the backdrop of high-profile criticisms of university handling of disclosures, including the misuse of non-disclosure agreements (NDAs), and a new OfS regulatory condition (E6) requiring institutions to take meaningful steps to tackle harassment.

    Understanding the SMS

    The Sexual Misconduct Survey collects both qualitative and quantitative data on students’ experiences. It examines the prevalence of misconduct, the extent to which students are aware of reporting mechanisms, and whether they feel able to use them. Its core aim is clear – to ensure students’ experiences are not just heard, but systematically understood.

    Previous, disparate studies — many led by the National Union of Students and grassroots campaigners — have long indicated that sexual misconduct in higher education is significantly under-reported. This is especially true for marginalised groups, including LGBTQ+ students, Black and disabled students, and students engaged in sex work. The SMS marks an attempt to reach further, with standardised questions asked at scale, across providers.

    Despite its intention, the SMS is not without issues. A key concern raised by student support professionals is the opt-out design. Students were automatically enrolled in the survey unless they actively declined – a move which risks retraumatising victim-survivors who may not have realised the nature of the questions until too late.

    Timing has also drawn criticism. Coming immediately after the exhaustive NSS — with its 26 questions and optional free-text fields — the SMS may suffer from survey fatigue, especially during an already intense period in the academic calendar. Low response rates could undermine the richness or representativeness of the data gathered.

    There are also complex ethical questions about the language used in the survey. In striving for clarity and precision, the SMS employs explicitly descriptive terminology. This can potentially open up difficult experiences unrelated to higher education itself, including childhood abuse or incidents beyond university campuses. Anonymous surveys, by nature, can surface trauma but cannot respond to it — and without parallel safeguarding or signposting mechanisms, the risk of harm increases.

    Lastly, the handling of disclosures matters. While survey responses are anonymous, students need to trust that institutions — and regulators — will treat the findings with sensitivity and respect. Transparency about how data will be used, how institutions will be supported to act on it, and how students will see change as a result is essential to building that trust.

    What to do next?

    The data from the pilot survey will be shared with institutions where response rates and anonymity thresholds allow. But even before the results arrive, universities have an opportunity — and arguably a duty — to prepare.

    Universities should start by preparing leadership and staff to anticipate that the results may reveal patterns or prevalence of sexual misconduct that are difficult to read or acknowledge. Institutional leaders must ensure they are ready to respond with compassion and commitment, not defensiveness or denial.

Universities should be prepared to review support systems and communication now. Are reporting tools easy to find, accessible, and trauma-informed? Is the student community confident that disclosures will be taken seriously? These questions matter, and the survey can act as a prompt to review what is already in place as well as what might need urgent attention.

    Universities should also engage students meaningfully. Institutions must commit to involving students — especially survivor advocates and representative bodies — in analysing findings and shaping the response. The worst outcome would be seeing the SMS as a tick-box exercise. The best would be for it to spark co-produced action plans.

When the data is released, institutions should avoid the urge to benchmark or downplay. Instead, they should be ready to own the story the data tells and act on the issues it raises. A lower prevalence rate does not necessarily mean a safer campus; it may reflect barriers to disclosure or fear of speaking out. Each result will be different, and a patchwork of responses is no bad thing.

    Finally, it is important to look beyond the numbers and see the person. Qualitative insights from the SMS will be just as important as the statistics. Stories of why students did not report, or how they were treated when they did, offer vital direction for reform and should be something which university leaders and policy makers take time to think about.

    This is only the first year of the SMS, and it is not yet clear whether it will become a permanent feature alongside the NSS. That said, whether the pilot continues or evolves into something new, the challenge it presents is real and overdue.

    The sector cannot afford to wait passively for data. If the SMS is to be more than a compliance exercise, it must be the beginning of a broader culture shift – one that faces up to what students have long known, listens without defensiveness, and builds environments where safety, dignity, and justice are non-negotiable.

    Lasting change will not come from surveys alone. Asking the right questions — and acting with purpose on the answers — is a critical start.


  • Data breach affects 10,000 Western Sydney University students – Campus Review


    Students from Western Sydney University (WSU) have had their data accessed and likely posted to the dark web in a data breach event.



  • Effective academic support requires good data transparency


    Academic support for students is an essential component of their academic success. At a time when resources are stretched, it is critical that academic support structures operate as a well-oiled machine, where each component has a clearly defined purpose and operates effectively as a whole.

    We previously discussed how personal and pastoral tutoring, provided by academic staff, needs to be supplemented by specialist academic support. A natural next step is to consider what that specialist support could look like.

    A nested model

We’ve identified four core facets of effective academic support: personal tutoring (advising, coaching, mentoring and so on), the development of academic skills, the development of graduate competencies, and the student engagement data that supports the other three. The nested model below displays this framework.

    We also suggest two prerequisites to the provision of academic support.

Firstly, a student must have access to information about what academic support entails and how to access it. Secondly, a student’s wellbeing must be such that they can physically, mentally, emotionally and financially engage with their studies, including academic support opportunities.

    Figure 1: Academic support aspects within a student success nested model

    Focusing on academic support

    Personal tutoring has a central role to play within the curriculum and within academic provision more broadly in enabling student success.

    That said, “academic support” comprises much more than a personal tutoring system where students go for generic advice and support.

    Rather, academic support is an interconnected system with multiple moving parts tailored within each institution and comprising different academic, professional and third-space stakeholders.

Yet academics remain fundamental to the provision of academic support given their subject matter expertise, industry knowledge and their proximity to students. This is why academics are traditionally personal tutors – and historically, this is where the academic support model would have ended. Changes in student needs mean that personal tutoring increasingly needs to be complemented by other forms of academic support.

    Skills and competencies

    Academic skills practitioners can offer rich insights in terms of how best to shape and deliver academic support.

    A broad conception of academic skills that is inclusive of academic literacies, maths, numeracy and stats, study skills, research and information literacy and digital literacy is a key aspect of student academic success. Student acquisition of these skills is complemented by integrated and purposeful involvement of academic skills practitioners across curriculum design, delivery and evaluation.

    Given regulatory focus on graduate outcomes, universities are increasingly expected to ensure that academic support prepares students for graduate-level employability or further study upon graduation. Much like academic skills practitioners, this emphasises the need to include careers and employability consultants in the design and delivery of integrated academic support aligned to the development of both transferable and subject-specific graduate competencies.

    Engaging data

    Data on how students are participating in their learning provides key insights for personal tutors, academic skills practitioners and colleagues working to support the development of graduate competencies.

    Platforms such as StREAM by Kortext enable a data-informed approach to working with students to optimise the provision of academic support. This holistic approach to the sharing of data alongside actionable insights further enables successful transition between support teams.

Knowing where the support need is situated means that these limited human and financial resources can be directed to where support is most required – whether delivered on an individual or cohort basis. Moreover, targeted provision can be concentrated at relevant points over the academic year. Using engagement data also supports efficiency drives, balancing universal information and guidance for all students against targeted interventions for those who most need them. The evidence shows such targeting is both required and likely to prove effective.

    Academic support is increasingly complicated in terms of how different aspects overlap and interplay within a university’s student success ecosystem. Therefore, when adopting a systems-thinking approach to the design and delivery of academic support, universities must engage key stakeholders, primarily students, academic skills practitioners and personal tutors themselves.

    A priority should be ensuring varied roles of academic support providers are clearly defined both individually and in relation to each other.

    Similarly, facilitating the sharing of data at the individual student level about the provision of academic support should be prioritised to ensure that communication loops are closed and no students fall between service gaps.

    Given that academic support is evolving, we would welcome readers’ views of what additional aspects of academic support are necessary to student success.

    To find out more about how StREAM by Kortext can enable data-informed academic support at your institution, why not arrange a StREAM demonstration.


  • HESA has published full student data for 2023–24


    The 2023-24 HESA student data release was delayed by three rather than six months this year.

We’re clearly getting better at dealing with the output (and the associated errors) of data collected through the HESA Data Platform. While there are not as many identified issues as in last year’s cycle, the list is long and occasionally unnerving.

Some represent a winding back from last year’s errors (we found 665 distance learning students at the University of Buckingham that should have been counted last year), some are surprisingly big (the University of East London should have marked an extra 2,550 students as active), and some are just perplexing: the University of South Wales apparently doesn’t know where 1,570 students came from, and the University of Portsmouth doesn’t actually have 470 students from the tiny Caribbean island of St Barthélemy.
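Checks that surface this kind of error are straightforward to sketch. Assuming a simple mapping of provider to headcount for two successive years (the field layout and the 50 per cent threshold are illustrative, not HESA’s actual validation rules), flagging implausible year-on-year jumps might look like:

```python
def flag_jumps(prev, curr, threshold=0.5):
    """Return providers whose headcount moved by more than `threshold`
    (as a fraction of the previous year's figure)."""
    flagged = []
    for provider, old in prev.items():
        new = curr.get(provider)
        if new is None or old == 0:
            continue  # new provider or no baseline: nothing to compare
        if abs(new - old) / old > threshold:
            flagged.append(provider)
    return flagged

# Made-up figures: Provider B more than doubles year on year
prev = {"Provider A": 10_000, "Provider B": 2_000}
curr = {"Provider A": 10_400, "Provider B": 4_800}
print(flag_jumps(prev, curr))  # ['Provider B']
```

A real pipeline would run checks like this at finer grain (by mode, level and domicile), which is presumably how oddities like the St Barthélemy cohort get spotted.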

    Access and participation

It is surprising how little access and participation data, in the traditional sense, HESA now publishes. We get an overview at sector level (the steady growth of undergraduate enrolments from the most deprived parts of England is notable and welcome), but there is nothing at provider level.

    [Full screen]

One useful proxy for non-traditional students is entry qualifications. We get these by subject area – we learn that in 2023-24 a quarter of full-time undergraduate business students do not have any formal qualifications at all.

    [Full screen]

With the UK’s four regulators far from agreement as to what should be monitored to ensure that participation reflects potential rather than privilege, it’s not really worth HESA publishing a separate set of UK-wide statistics. The closest we get is SEISA, which now has official statistics status. I look forward to seeing SEISA applied to UK-domiciled students at individual providers, and published by HESA.

    Student characteristics

We get data on disability by subject area (at the very general “marker” level – whether a student has a known disability), which I have plotted on a year-by-year basis. The axis here is the proportion of students with a known disability – the colours show the total number of students. For me the story here is that creative subjects appear to attract students who disclose disabilities – art and creative writing in particular.

    [Full screen]

    I’ve also plotted ethnicity by provider, allowing you to see the ways in which the ethnic make up of providers has changed over time.

    [Full screen]

    Student domicile (UK)

    UK higher education includes four regulated systems – one in each of the four nations. Although in the main students domiciled in a given nation study at providers based in that same nation, there is a small amount of cross-border recruitment.

    [Full screen]

Notably, nearly three in ten Welsh students study at English providers, including more than a third of Welsh postgraduates. And two in ten Northern Irish students study in England. The concern is that if you move away to study, you are less likely to move home – so governments in Wales and Northern Ireland (where there are student number controls) will be thinking carefully about the attractiveness and capacity of their respective systems to avoid a “brain drain”.
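The cross-border shares quoted above are simple proportions of each domicile’s total. A sketch of the calculation, using made-up figures rather than the real HESA counts:

```python
from collections import defaultdict

def cross_border_shares(records):
    """Share of students from each domicile studying outside their home
    nation. `records` is (domicile, provider_nation, students)."""
    totals = defaultdict(int)
    cross = defaultdict(int)
    for domicile, nation, n in records:
        totals[domicile] += n
        if nation != domicile:
            cross[domicile] += n
    return {d: cross[d] / totals[d] for d in totals}

# Made-up figures, not the real HESA counts
records = [
    ("Wales", "Wales", 70_000),
    ("Wales", "England", 29_000),
    ("Northern Ireland", "Northern Ireland", 40_000),
    ("Northern Ireland", "England", 10_000),
]
for domicile, share in cross_border_shares(records).items():
    print(f"{domicile}: {share:.0%} study outside their home nation")
```

The same grouping, run on the published open data, is all that is needed to reproduce the headline shares by nation or, at finer grain, by provider.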

Within these trends, some providers have proven particularly effective at cross-border recruitment. If you are a Northern Irish student studying in England, chances are you are at Liverpool John Moores University, Liverpool, or Northumbria. If you are Welsh and studying in England your destination may well be UWE, Bristol, Chester – or again, Liverpool John Moores.

    There is a proximity effect – where students are crossing the border, they are likely to stay close to it – but also (if we look at Northern Ireland domiciled students looking at Glasgow or Newcastle) evidence of wider historic cultural links.

    [Full screen]

    Student domicile (international)

    Thinking about providers recruiting from Northern Ireland made me wonder about students from the Republic of Ireland – do we see similar links? As you might expect, the two larger providers in Northern Ireland recruit a significant share, but other winners include the University of Edinburgh and St Margaret’s. UCL has the biggest population among English providers.

    You can use this chart to look at where students from any country in the world end up when they study in the UK (I do insist you look at those St Barthélemy students – literally all at the University of Portsmouth apparently).

    [Full screen]

An alternative view lets you look at the international population of your institution – the established pattern (China in the Russell Group, India elsewhere) still holds up.

    [Full screen]

What’s of interest to nervous institutional managers is the way international recruitment is changing over time. This is a more complicated dashboard that helps you see trends at individual providers for a given country, alongside how your recruitment sits within the sector (mouse over an institution to activate the time series at the bottom).

    [Full screen]
