Category: Data

  • International enrolments at UK business schools on the mend

    UK business schools continue to be buffeted by hostile immigration policies, with some institutions noting two consecutive years of declining overseas enrolments, according to the 2025 Chartered Association of Business Schools (Chartered ABS) annual membership survey, which gathered 2025/26 results from 48 members.

    But the picture seems to be improving. Almost half of the schools surveyed (46%) reported an increase in international enrolments, up from just 11% the previous year. At undergraduate level, 45% reported rising numbers, compared with 64% at postgraduate level.

    Nevertheless, the association has pointed to policies affecting international students in the UK as continuing causes for concern for business schools as promises made in Keir Starmer’s immigration white paper become a reality.

    While 14% of respondents reported that undergraduate international enrolments were down on 2024/25, this is far lower than the 39% who reported a year-on-year decline in 2024/25.

    Similarly, while a sizeable chunk of respondents (39%) said overseas postgraduate enrolments were down year on year, this is still a noticeable improvement on the more than three quarters of respondents who said the same the year before.

    But the Chartered ABS noted that international enrolments remain below pre-2024/25 levels, with some schools reporting two consecutive years of decline.

    The Chartered ABS pointed to hostile policies in the UK as a potential reason for declining international enrolments. The UK government’s decision to reduce the Graduate Route by six months is already having an effect, it said, with 60% of survey respondents saying the incoming policy has had a negative impact.

    “The shortening of the Graduate Route, the ban on student dependants, and the proposals for the international student levy will continue to have a damaging impact on business school finances, and by extension, their parent institutions,” warned Stewart Robinson, chair of the Chartered ABS and dean of Newcastle University Business School.

    “These results reveal that while some institutions are seeing student numbers grow and finances stabilise, many institutions continue to face significant challenges. Budget cuts, restructuring, and redundancies will continue, and many business schools will face another year of declining student numbers and income,” he added. 

    The survey revealed that many UK business schools are feeling the pinch, with an increasing number (48%) reporting a drop in year-on-year income in 2025/26 compared to 36% in 2024/25.

    Budget cuts, restructuring, and redundancies will continue, and many business schools will face another year of declining student numbers and income
    Stewart Robinson, Chartered ABS and Newcastle University Business School

    However, more than half of the schools surveyed (58%) said they expected income to increase in 2025/26 – an improvement on the previous year, when more than half expected further decline.

    A slew of policies affecting the international education sector were announced as part of the immigration white paper, with stakeholders concerned that each could have a serious impact on overseas enrolments.

    The government has decided to cut the Graduate Route from two years to just 18 months, shaving six months off the visa route for international graduates from UK institutions.

    A levy on the income institutions make from international student fees was also announced as part of the changes, with a later decision to ringfence this cash to spend on maintenance grants for domestic students. Critics have warned that the move could decimate international enrolments if students are put off by the higher fees many institutions will have to set to cover the cost of the tax.

    An earlier decision banned almost all international students from bringing their dependants to the country with them on a student visa. Since 2024, when the policy was announced, net migration numbers in the UK have seen a steep decline.

    Source link

  • Is it time to change the rules on NSS publication?

    If we cast our minds back to 2005, the four UK higher education funding bodies ran the first ever compulsory survey of students’ views on the education they receive – the National Student Survey (NSS).

    Back then the very idea of a survey was controversial: we were worried about the impact on the sector's reputation, the potential for response bias, and the risk that students would be fearful of responding negatively in case their university downgraded their degree.

    Initial safeguards

    These fears led us to make three important decisions, all of which are now well past their sell-by date. These were:

    • Setting a response rate threshold of 50 per cent
    • Restricting publication to subject areas with more than 22 respondents
    • Only providing aggregate data to universities.

    At the time all of these were very sensible decisions designed to build confidence in what was a controversial survey. Twenty years on, it’s time to look at these with fresh eyes to assure ourselves they remain appropriate – and to these eyes they need to change.

    Embarrassment of riches

    One of these rules has already changed: responses are now published where 10 or more students respond. Personally, I think this represents a very low bar, determined as it is by privacy more than statistical reasoning, but I can live with it, especially as research has shown that “no data” can be viewed negatively.

    Of the other two, first let me turn to the response rate. Fifty per cent is a very high response rate for any survey, and the fact the NSS achieves a 70 per cent response rate is astonishing. While I don’t think we should be aiming to get fewer responses, drawing a hard line at 50 per cent creates a cliff edge in data that we don’t need.

    There is nothing magical about 50 per cent – it’s simply a number that sounds convincing because it means that at least half your students contributed. A 50 per cent response rate does not ensure that the results are free from bias: if, for example, propensity to respond were in some way correlated with a positive experience, the results would still be flawed.
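    To see why, here is a toy illustration in Python – the cohort size, satisfaction level and response propensities are entirely invented for the sake of the arithmetic, not drawn from any real NSS data:

    ```python
    # Toy illustration (all numbers invented): a survey can clear a 50 per cent
    # response threshold and still overstate satisfaction if satisfied students
    # are more likely to respond than dissatisfied ones.

    cohort = 2000              # hypothetical final-year cohort
    true_satisfied = 0.70      # assume 70% are genuinely satisfied

    # Assumed response propensities: satisfied students 60%, dissatisfied 40%
    satisfied_responders = cohort * true_satisfied * 0.60            # 840
    dissatisfied_responders = cohort * (1 - true_satisfied) * 0.40   # 240

    response_rate = (satisfied_responders + dissatisfied_responders) / cohort
    observed = satisfied_responders / (satisfied_responders + dissatisfied_responders)

    print(f"Response rate: {response_rate:.0%}")        # 54% – clears the 50% bar
    print(f"Observed satisfaction: {observed:.0%}")     # ~78%, against a true 70%
    ```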

    I would note that the limited evidence that there is suggests that propensity to respond is not correlated with a positive experience, but it’s an under-researched area and one the Office for Students (OfS) should publish some work on.

    Panel beating

    This cliff edge is even more problematic when the data is used in regulation, as the OfS proposes to do as part of the new TEF. Under OfS proposals, providers that don’t have NSS data – either due to small cohorts or a “low” response rate – would have NSS evidence replaced with focus groups or other types of student interaction. This makes sense when the reason is a low absolute number of responses, but not when it’s due to not hitting an exceptionally high response rate, as Oxford and Cambridge failed to do for many years.

    While focus groups can offer valuable insights, and usefully sit alongside large-scale survey work, it is utterly absurd to ignore evidence from a survey because an arbitrary and very high threshold is not met. Most universities will have several thousand final year students, so even if only 30 per cent of them respond you will have responses from hundreds if not thousands of individuals – which must provide a much stronger evidence base than some focus groups. Furthermore, that evidence base will be consistent with every other university creating one less headache for assessors in comparing diverse evidence.

    The 50 per cent response rate threshold also looks irrational when set against a 30 per cent threshold for the Graduate Outcomes survey. While any response rate threshold is somewhat arbitrary, applying two different thresholds needs rather more justification than the fact that the surveys are able to achieve different response rates. Indeed, I might argue that the risk of response bias is higher with GO for a variety of reasons.

    NSS to GO

    In the absence of evidence in support of any different threshold I would align the NSS and GO publication thresholds at 30 per cent and make the response rates more prominent. I would also share NSS and GO data with TEF panels irrespective of the response rate, and allow them to rely on their expert judgement supported by the excellent analytical team at the OfS. And the TEF panel may then choose to seek additional evidence if they consider it necessary.

    In terms of sharing data with providers, 2025 is really very different to 2005. Social media has arguably exploded and is now contracting, but in any case attitudes to sharing have changed and it is unlikely the concerns that existed in 2005 will be the same as the concerns of the current crop of students.

    For those who don’t follow the detail, NSS data is provided back to universities via a bespoke portal that offers a number of pre-defined cuts of the data and comments, together with the ability to create your own cross-tabs. This data, while very rich, does not have the analytical power of individualised data and is still subject to suppression for small numbers.
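    As a rough sketch of what that suppression looks like in practice – the threshold and field names here are illustrative assumptions, not the actual portal rules:

    ```python
    from collections import Counter

    MIN_CELL_SIZE = 10  # assumed suppression threshold for this sketch

    def suppressed_crosstab(responses, row_key, col_key):
        """Cross-tab respondent counts, hiding any cell below the threshold.

        `responses` is a list of dicts, e.g. {"subject": "Physics", "mode": "Full-time"}.
        Suppressed cells come back as None – exactly the partial picture providers
        are left to work with when numbers are small.
        """
        counts = Counter((r[row_key], r[col_key]) for r in responses)
        return {cell: (n if n >= MIN_CELL_SIZE else None) for cell, n in counts.items()}
    ```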

    What this means is that if we want to understand the areas we need to improve, we’re forced to deduce them from a partial picture rather than being laser-focussed on exactly where the issues are – and this applies to both the Likert-scale questions and the free text.

    It also means that providers cannot form a longitudinal view of the student experience by linking to other data and survey responses they hold at an individual level – something that could generate a much richer understanding of how to improve the student experience.

    Source link

  • Policy uncertainty emerges as top barrier to student mobility 

    While affordability remains the greatest obstacle for students, IDP Education’s new Emerging Futures survey has revealed the growing impact of sudden and unclear policy changes shaping students’ international study decisions.  

    “Students and families are prepared to make sacrifices to afford their international education dreams. They can adjust budgets, seek scholarships and rely on part-time work. But they cannot plan for uncertainty,” said IDP chief partnerships officer Simon Emmett.  

    “When the rules change, without warning or clarity, trust falls away. Students hesitate, delay, or choose to study elsewhere.” 

    Drawing on the views of nearly 8,000 international students from 134 countries between July and August 2025, the results highlighted the critical importance of study destinations communicating policy changes to sustain trust among students.  

    The US and UK were rated the lowest for providing clear guidance on visas and arrivals processes, while New Zealand was identified as the top communicator in this respect.  

    What’s more, the UK saw the steepest rise in students withdrawing from plans to study there, indicating that recent policy changes – including plans to shorten the Graduate Route and increase compliance metrics for universities – are creating uncertainty among international students.

    Of the students who said they were pivoting away from major study destinations, over half (51%) indicated tuition fees had become unaffordable and one in five said it was too difficult to obtain a visa.  

    In markets such as Malaysia, the Philippines and the UAE, students reported delaying or redirecting applications almost immediately after unclear announcements by major destinations, the report said. 

    Meanwhile Canada’s share of withdrawals was shown to have eased, indicating messaging is helping to rebuild stability, the authors suggested, though Canadian study permit issuance has fallen dramatically in 2025.

    Without that stability, even the most attractive destinations risk losing trust

    Simon Emmett, IDP

    Despite policy disruptions in Australia over recent years, the country remained the most popular first-choice destination globally, ranked highly for value for money, graduate employment opportunities and post-study work pathways.  

    At the same time, many respondents flagged sensitivities to recent visa and enrolment changes, highlighting the need for consistent and transparent messaging to maintain Australia’s competitiveness, according to IDP.  

    The US saw the largest decline in popularity, dropping to third place behind Australia and the UK. 

    NAFSA CEO Fanta Aw said the findings should serve as a “wake-up call” that policy uncertainty has real human and economic costs, emphasising the need for “clear and consistent” communication from institutions and policymakers.  

    “Students are paying close attention to how the US administration handles student visas and post-study experiential learning opportunities like Optional Practical Training,” said Aw. 

    Visa restrictions and policy hostility have rocked the US under Trump’s second presidency, with global visa appointments suspended for nearly a month this summer, as well as thousands of student visa revocations and travel restrictions on 12 nations.  

    Post-study work opportunities are increasingly fragile in the US with government plans to overhaul the H-1B skilled worker visa to favour better paid jobs and OPT coming under increased scrutiny from policymakers. 

    Emmett highlighted the knock-on effect of these policy shocks, with student journeys being disrupted “not by ambition, but by uncertainty”. 

    “Countries that provide predictability will win the confidence of students and their families. Without that stability, even the most attractive destinations risk losing trust,” he said. 

    Despite financial and political challenges, demand for global study remained strong, with half of all prospective students intending to apply within six months, and a further 29% within a year. 

    South Asia emerged as the main driver of intent, with more than 60% of students surveyed from India, Pakistan and Bangladesh preparing near-term applications, though this region was also the most sensitive to abrupt or confusing policy shifts.  

    Source link

  • Higher education data explains why digital ID is a good idea

    Just before the excitement of conference season, your local Facebook group lost its collective mind. And it shows no sign of calming down.

    Given everything else that is going on, you’d think that reinforcing the joins between key government data sources and giving more visibility to the subjects of public data would be the kind of nerdy thing that the likes of me write about.

    But no. Somebody used the secret code word. ID Cards.

    Who is she and what is she to you?

    I’ve written before about the problems our government faces in reliably identifying people. Any entitlement- or permission-based system needs a clear and unambiguous way of assuring the state that a person is indeed who they claim to be, and has the attributes or documentation they claim to have.

    As a nation, we are astonishingly bad at this. Any moderately serious interaction with the state requires a parade of paperwork – your passport, driving licence, birth certificate, bank statement, bank card, degree certificate, and two recent utility bills showing your name and address. Just witness the furore over voter ID – to be clear, a pointless idea aimed at solving a problem that the UK has never faced – and the wild collection of things that you might be allowed to pull out of your voting-day pocket, a collection that does not include a student ID.

    We are not immune from this problem in higher education. I’ve been asking for years why you need to apply to a university via UCAS and apply for funding via the Student Loans Company through two separate systems. It has also never been clear to me why you then need to submit largely similar information to your university when you enrol.

    Sun sign

    Given that organs of the state hold this much of your personal information, it is alarming that the only way the government can work out what you earn after graduating is either by asking you directly (Graduate Outcomes) or by seeing whether anyone with your name, domicile, and date of birth turns up in the Inland Revenue database.

    That latter one – administrative matching – is illustrative of the government’s current approach to identity. If it can find enough likely matches of personal information in multiple government databases it can decide (with a high degree of confidence) that records refer to the same person.

    That’s how they make LEO data. They look for National Insurance Number (NINO), forename, surname, date of birth, postcode, and sex in both HESA student records and the Department for Work and Pensions’ Customer Information System (which itself links to the tax database). Keen Wonkhe readers will have spotted that NINO isn’t returned to HESA – to get it they use “fuzzy matching” with personal data from the Student Loans Company, which does hold it. The surname thing is even wilder – they use a sound-based algorithm (SOUNDEX) to allow for flexibility on spellings.

    This kind of nonsense actually has a match rate of more than 90 per cent (though this is lower for ethnically Chinese graduates because sometimes forenames and surnames can switch depending on the cultural knowledge of whoever prepared the data).
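    For the curious, here is a minimal sketch of what SOUNDEX-style surname matching looks like in Python. The match rule and field names are illustrative assumptions for this post, not the actual LEO linkage specification, which is considerably more involved:

    ```python
    def soundex(name: str) -> str:
        """Classic four-character SOUNDEX code, so 'Smith' and 'Smyth' both map to S530."""
        digits = {}
        for letters, d in [("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                           ("l", "4"), ("mn", "5"), ("r", "6")]:
            for ch in letters:
                digits[ch] = d
        name = "".join(c for c in name.lower() if c.isalpha())
        if not name:
            return "0000"
        code, prev = name[0].upper(), digits.get(name[0], "")
        for c in name[1:]:
            d = digits.get(c, "")
            if d and d != prev:
                code += d
            if c not in "hw":   # h and w are "transparent"; vowels reset the previous code
                prev = d
        return (code + "000")[:4]

    def likely_same_person(a: dict, b: dict) -> bool:
        """Illustrative rule: exact DOB, sex and postcode; fuzzy surname; same forename initial."""
        return (a["dob"] == b["dob"] and a["sex"] == b["sex"]
                and a["postcode"] == b["postcode"]
                and soundex(a["surname"]) == soundex(b["surname"])
                and a["forename"][:1].lower() == b["forename"][:1].lower())

    hesa_record = {"forename": "Cathryn", "surname": "Smyth", "dob": "1999-04-02",
                   "sex": "F", "postcode": "LS2 9JT"}
    dwp_record = {"forename": "Catherine", "surname": "Smith", "dob": "1999-04-02",
                  "sex": "F", "postcode": "LS2 9JT"}
    print(likely_same_person(hesa_record, dwp_record))  # True – the records would be linked
    ```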

    It’s impressive as a piece of data engineering. But given that all of this information was collected and stored by arms of the same government it is really quite poor.

    The tale of the student ID

    Another higher education example. If you were ever a student you had a student ID. It was printed on your student card, and may have turned up on various official documents too. Perhaps you imagined that every student in the UK had a student number, and that there was some kind of logic to the way that they were created, and that there was a canonical national list. You would be wrong.

    Back in the day, this would have been a HESA ID, itself created from your UCAS number and your year of entry (or your year of entry, HESA provider ID, and an internal reference number if you applied directly). Until just a few years ago, the non-UCAS alternative was in use for all students – even including the use of the old HESA provider ID rather than the more commonly used UKPRN. Why the move away from UCAS? Well, UCAS had changed how it did identifiers and HESA’s systems couldn’t cope.

    You’re expecting me to say that things are far more sensible now, but no. They are not. HESA has finally fixed the UKPRN issue within a new student ID field (SID). This otherwise replicates the old system but with one important difference: it is not persistent.

    Under the old approach, the idea was you had one student number for life – if you did an undergraduate degree at Liverpool, a masters at Manchester Met, and a PhD at Royal Holloway, these were all mapped to the same ID. There was even a lookup service for new providers if the student didn’t have their old number. I probably don’t even need to tell you why this is a good idea if you are interested – in policy terms – in the paths that students take through their careers in higher education. These days we just administratively match if we need to. Or – as in LEO – assume that the last thing a student studied was the key to or cause of their glittering (or otherwise) career.

    The case of the LLE

    Now I hear what you might be thinking. These are pretty terrible examples, but they are just bodges – workarounds for bad decisions made in the distant past. But we have the chance to get it right in the next couple of years.

    The design of the Lifelong Learning Entitlement means that the government needs tight and reliable information about who does what bit of learning in order that funds can be appropriately allocated. So you’d think that there would be a rock-solid, portable, unique learner number underpinning everything.

    There is not. Instead, we appear to be standardising on the Student Loans Company customer reference number. This is supposed to be portable for life, but it doesn’t appear in any other sector datasets (the “student support number” is in HESA, but that is somehow different – you get two identifiers from SLC, lucky you). SLC also holds your NINO (you need one to get funding!), and has capacity to hold another additional number of an institution’s choice, but not (routinely) your HESA student ID or your UCAS identifier.

    There’s also space to add a Unique Learner Number (ULN) but at this stage I’m too depressed to go into what a missed opportunity that is.

    Why is standardising on a customer reference number not a good idea? Well, think of all the data SLC doesn’t hold but HESA does. Think about being able to refer easily back to a school career and forward into working life on various government data. Think about how it is HESA data and not SLC data that underpins LEO. Think about the palaver I have described above and ask yourself why you wouldn’t fix it when you had the opportunity.

    Learning to love Big Brother

    I’ll be frank: I’m not crazy about how much the government knows about me – but honestly, compared to the likes of Google, Meta, or – yikes – X (formerly Twitter), it doesn’t hugely worry me.

    I’ve been a No2ID zealot in my past (any employee of those three companies could tell you that) but these days I am resigned to the fact that people need to know who I am, and I’d rather be more than 95 per cent confident that they could get it right.

    I’m no fan of filling in forms, but I am a fan of streamlined and intelligent administration.

    So why do we need ID cards? Simply because in proper countries we don’t need to go through stuff like this every time we want to know if a person that pays tax and a person that went to university are the same person. Because the current state of the art is a mess.

    Source link

  • Canada: 47k int’l students flagged for potential visa non-compliance

    Aiesha Zafar, assistant deputy minister for migration integrity at IRCC, told the House of Commons Standing Committee on Citizenship and Immigration that 8% of international students reviewed were potentially “non-compliant”, meaning they were not attending classes as required by the terms of their study visa.

    “In terms of the total number of students we asked for compliance information from, that results in potentially 47,175. We have not yet determined whether they are fully non-compliant, these are initial results provided to us by institutions,” stated Zafar, who was questioned by Conservative MP Michelle Rempel Garner about where these students are currently, if they are not complying with their visa terms.

    Determining full non-compliance of the international students, however, is not straightforward, as institutions report data at varying intervals, and students may change schools, graduate, or take authorized leaves.

    Zafar noted that IRCC shares all the data it continually collects with the Canada Border Services Agency (CBSA), which is responsible for locating and removing non-compliant visa holders.

    “Any foreign national in Canada would be under the purview of the CBSA, so they have an inland investigation team,” Zafar told the committee when Garner questioned how the IRCC is able to track and remove students who are in violation of their visas.

    The 47,000 non-compliance cases are a backlog, evidence that fraud detection is strengthening, not weakening, Canadian standards
    Maria Mathai, M.M Advisory Services

    According to Maria Mathai, founder of M.M Advisory Services, which supports Canadian universities in the South Asian market, the figure of over 47,000 students who could be non-compliant being portrayed as a “crisis” misses the real story — that Canada’s immigration system is actively adapting.

    “Front-end Provincial Attestation Letter (PAL) screening now blocks thousands who would have entered before, and ongoing oversight is catching legacy issues. The 47,000 non-compliance cases are a backlog, evidence that fraud detection is strengthening, not weakening, Canadian standards,” Mathai told The PIE News.

    Mathai acknowledged that past PAL allocations contributed to compliance challenges, with regions like Ontario, which hosts the largest share of international students, directing most of its PALs to colleges with higher default rates.

    However, the situation is expected to change with IRCC now imposing strict provincial caps on the number of study permits each province can issue.

    “By surfacing these imbalances now, the new framework is encouraging provinces and institutions to adapt entry practices based on evidence and learning,” stated Mathai.

    Canada’s international student compliance regime, in effect since 2014, was established to identify potentially non-genuine students.

    It includes twice-yearly compliance reporting conducted in partnership with Designated Learning Institutions (DLIs) – the Canadian colleges, institutes, and universities authorised to host international students.

    While IRCC’s 2024 report noted no recourse against non-reporting DLIs, new rules now allow such institutions to be suspended for up to a year.

    Moreover, Canada’s struggle with international students not showing up for classes is not new, with reports earlier this year indicating nearly 50,000 instances of “no-shows”, international students who failed to enrol at their institutions, in the spring of 2024.

    While the “no-show” cohort included 4,279 Chinese students, 3,902 Nigerian students, and 2,712 Ghanaian students, Indian students accounted for the largest share at 19,582. It highlights a broader issue of immigration fraud originating from India, which Zafar identified as one of the top countries for such cases during her September 23 committee testimony.

    Over a quarter of international students seeking asylum in Canada also came from India and Nigeria.

    According to Pranav Rathi, associate director of international recruitment at Fanshawe College, which hosts one of the largest numbers of Indian students in Ontario, a “rigorous approach” has led to about 20% of Indian applications being declined to ensure only qualified candidates proceed.

    “Each application is carefully reviewed, and checked for aggregate scores, backlogs, and authenticity of mark sheets. We keep ourselves updated with the recognised institution list published by UGC,” stated Rathi.

    “It is mandatory for a student to provide English language tests approved by IRCC and we also verify English proficiency through IELTS or equivalent test reports to confirm readiness for study in Canada.”

    Rathi suggested that one reason Indian students often appear among potentially non-compliant or “no-show” cases is a systemic issue that previously allowed them to change institutions after receiving a study permit.

    He added that schools now need to take a more active role, particularly when students apply through education agents.

    “Institutions should ensure that their representatives are transparent, well-trained, and follow ethical recruitment practices that align with institutional and regulatory standards,” stated Rathi.

    “Ongoing collaboration between institutions and government bodies to monitor market trends and share insights can help build a more transparent and sustainable international education system.”

    Many Canadian institutions are now facing headwinds, with course offerings and research funding being cut as Canada’s study permit refusal rate has climbed to its highest level in over a decade.

    Canadian politicians have also intensified scrutiny of institutions across the country.

    Just days after the IRCC testimony on non-compliant students, a federal committee hearing led by MP Garner saw Conestoga College president John Tibbits questioned on issues ranging from his $600,000 salary to allegations of “juicing foreign student permits” amid growing concerns that healthcare, housing, and jobs “don’t have capacity” in Ontario.

    “Colleges, including Conestoga, have been subject to scrutiny about the role international [students] play in housing, affordability and community pressures. I welcome the opportunity to reaffirm that Conestoga’s approach has always been about service. Our mission has always been to ensure the communities we serve have access to the skilled labour force they need to survive,” stated Tibbits, while addressing the committee on Thursday.

    “Looking ahead, we believe this is the time to stabilize the system to build an international student program that is sustainable, fair, globally competitive and focused on Canada’s economic priorities,” he added, as reported by CTV News.

    Source link

  • K-12 districts are fighting ransomware, but IT teams pay the price

    The education sector is making measurable progress in defending against ransomware, with fewer ransom payments, dramatically reduced costs, and faster recovery rates, according to Sophos’ fifth annual State of Ransomware in Education report.

    Still, these gains are accompanied by mounting pressures on IT teams, who report widespread stress, burnout, and career disruptions following attacks–nearly 40 percent of the 441 IT and cybersecurity leaders surveyed reported dealing with anxiety.

    Over the past five years, ransomware has emerged as one of the most pressing threats to education–with attacks becoming a daily occurrence. Primary and secondary institutions are seen by cybercriminals as “soft targets”–often underfunded, understaffed, and holding highly sensitive data. The consequences are severe: disrupted learning, strained budgets, and growing fears over student and staff privacy. Without stronger defenses, schools risk not only losing vital resources but also the trust of the communities they serve.

    Indicators of success against ransomware

    The new study demonstrates that the education sector is getting better at reacting and responding to ransomware, forcing cybercriminals to evolve their approach. Trending data from the study reveals an increase in attacks where adversaries attempt to extort money without encrypting data. Unfortunately, paying the ransom remains part of the solution for about half of all victims. However, the payment values are dropping significantly, and for those who have experienced data encryption in ransomware attacks, 97 percent were able to recover data in some way. The study found several key indicators of success against ransomware in education:

    • Stopping more attacks: When it comes to blocking attacks before files can be encrypted, both K-12 and higher education institutions reported their highest success rate in four years (67 percent and 38 percent of attacks, respectively).
    • Following the money: In the last year, ransom demands fell 73 percent (an average drop of $2.83M), while average payments dropped from $6M to $800K in lower education and from $4M to $463K in higher education.
    • Plummeting cost of recovery: Outside of ransom payments, average recovery costs dropped 77 percent in higher education and 39 percent in K-12 education. Despite this success, K-12 education reported the highest recovery bill across all industries surveyed.

    Gaps still need to be addressed

    While the education sector has made progress in limiting the impact of ransomware, serious gaps remain. In the Sophos study, 64 percent of victims reported missing or ineffective protection solutions; 66 percent cited a lack of people (either expertise or capacity) to stop attacks; and 67 percent admitted to having security gaps. These risks highlight the critical need for schools to focus on prevention, as cybercriminals develop new techniques, including AI-powered attacks.

    Highlights from the study that shed light on the gaps that still need to be addressed include:

    • AI-powered threats: K-12 education institutions reported that 22 percent of ransomware attacks had origins in phishing. With AI enabling more convincing emails, voice scams, and even deepfakes, schools risk becoming test grounds for emerging tactics.
    • High-value data: Higher education institutions, custodians of AI research and large language model datasets, remain a prime target, with exploited vulnerabilities (35 percent) and security gaps the provider was not aware of (45 percent) as leading weaknesses that were exploited by adversaries.
    • Human toll: Every institution with encrypted data reported impacts on IT staff. Over one in four staff members took leave after an attack, nearly 40 percent reported heightened stress, and more than one-third felt guilt they could not prevent the breach.

    “Ransomware attacks in education don’t just disrupt classrooms, they disrupt communities of students, families, and educators,” said Alexandra Rose, director of CTU Threat Research at Sophos. “While it’s encouraging to see schools strengthening their ability to respond, the real priority must be preventing these attacks in the first place. That requires strong planning and close collaboration with trusted partners, especially as adversaries adopt new tactics, including AI-driven threats.”

    Holding on to the gains

    Based on its work protecting thousands of educational institutions, Sophos experts recommend several steps to maintain momentum and prepare for evolving threats:

    • Focus on prevention: The dramatic success of lower education in stopping ransomware attacks before encryption offers a blueprint for broader public sector organizations. Organizations need to couple their detection and response efforts with preventing attacks before they compromise the organization.
    • Secure funding: Explore new avenues such as the U.S. Federal Communications Commission’s E-Rate subsidies to strengthen networks and firewalls, and the UK’s National Cyber Security Centre initiatives, including its free cyber defense service for schools, to boost overall protection. These resources help schools both prevent and withstand attacks.
    • Unify strategies: Educational institutions should adopt coordinated approaches across sprawling IT estates to close visibility gaps and reduce risks before adversaries can exploit them.
    • Relieve staff burden: Ransomware takes a heavy toll on IT teams. Schools can reduce pressure and extend their capabilities by partnering with trusted providers for managed detection and response (MDR) and other around-the-clock expertise.
    • Strengthen response: Even with stronger prevention, schools must be prepared to respond when incidents occur. They can recover more quickly by building robust incident response plans, running simulations to prepare for real-world scenarios, and enhancing readiness with 24/7/365 services like MDR.

    Data for the State of Ransomware in Education 2025 report comes from a vendor-agnostic survey of 441 IT and cybersecurity leaders – 243 from K-12 education and 198 from higher education institutions hit by ransomware in the past year. The organizations surveyed ranged in size from 100 to 5,000 employees across 17 countries. The survey was conducted between January and March 2025, and respondents were asked about their experience of ransomware over the previous 12 months.

    This press release originally appeared online.

    Source link

  • Three Notable StatsCan Papers | HESA

    Over the summer, Statistics Canada put out a few papers on higher education and immigration which got zero press but nevertheless are interesting enough that I thought you might all want to hear about them. Below are my précis:

    The first paper, Recent trends in immigration from Canada to the United States by Feng Hou, Milly Yang and Yao Lu, is a very general look at outbound migration to the United States, looking specifically at the characteristics of Canadian citizens who applied for labour certification in the United States in 2015 and in 2024. I found the three top-line results all somewhat surprising.

    • The number of US certification applicants declined by just over 25% between 2015 and 2024.
    • Outbound migration to the US by Canadians is predominantly a “new” Canadian thing. In 2015, Canadian citizens born outside Canada made up 54% of those seeking certification, and by 2024 that proportion had increased to nearly 60%.
    • Among Canadians seeking US certification in 2015, 41% had a master’s or doctoral degree.  In 2024, that proportion had fallen to 31%.

    In other words, brain drain to the US changed significantly over the space of a decade: fewer Canadians headed south, and among those who did, declining proportions were Canadian-born or held advanced degrees. All somewhat surprising.

    The second paper, Fields of study and occupations of immigrants who were international students in Canada before immigration by Youjin Choi and Li Xu, divides out two recent cohorts (2011-15 and 2016-21) of immigrants and starts to tease out various aspects of their current status in Canada.  Here the key findings were:

    • In the 2011-15 period, 13% of all immigrants were former international students. By the 2016-21 period, that number had risen to 23%.
    • About a third of immigrants who were students in Canada say their highest degree was taken outside Canada. It’s a bit difficult to parse this. It may mean, for instance, that they obtained a bachelor’s degree in Canada, went to another country for their master’s degree and came back; it may also mean that they took a master’s degree abroad and took some kind of short post-graduate certificate here.
    • A little over a third of all immigrants who studied in Canada have a STEM degree, a proportion that increased a tiny bit over time. This is higher than for the Canadian-born population, but not hugely different from that of immigrants who did not study here.
    • A little under half of all former international STEM students in the immigrant pool were working in a STEM field, but this is strongly correlated with the level of education. Among sub-Bachelor’s graduates this proportion was a little over 20%, while among those with a Master’s degree or higher it was over 50%. This is significantly higher than it is for Canadian-born post-secondary graduates. In non-STEM fields, the relationship is reversed (i.e. Canadian-born graduates are more likely to be working in an aligned field).

    In other words, former international students are a rising proportion of all immigrants, a high proportion are STEM graduates, and a high proportion of them go on to work in STEM fields. All signs that policy is pushing results in the intended direction.

    The final paper, Retention of science, technology, engineering, mathematics and computer science graduates in Canada by Youjin Choi and Feng Hou, follows three cohorts of both domestic and international student graduates to see whether they stayed in the country (technically, it measures the proportion of graduates who file tax returns in Canada, which is a pretty good proxy for residency). The results are summed up in one incredibly ugly chart (seriously, why is StatsCan dataviz so awful?), which I reproduce below:

    So, in the chart the Y-axis is the percentage of STEM graduates who stay in Canada (measured by the proxy of tax filing) and the X-axis is years since graduation. Since they are following three different cohorts of graduates, the lines don’t all extend to the same length (the earliest cohort could be followed for ten years, the middle for seven and the most recent for just three).  The red set of lines represents outcomes for Canadian-born students and the blue set of lines does the same for international students.

    So, the trivial things this graph shows are that: i) both Canadian and international students leave Canada but ii) international students do so more frequently and iii) leaving the country is something that happens gradually over time. The interesting thing it shows, though, is that the most recent cohort (class of 2018) of STEM graduates are more likely to stay than earlier ones, and that this is especially true for international students: the retention rate of international graduates from the class of 2018 was almost fifteen percentage points higher than for the class of 2015.

    Was it a more welcoming economy? Maybe. But you’d have to think that our system of offering international students a path to citizenship had something to do with it too.

    Two other nuggets in the paper:

    • Canadian-born STEM graduates are slightly more likely to leave than non-STEM graduates (it’s not a huge difference, just a percentage point or two) while among international student graduates, those from STEM programs are substantially less likely to leave than those from non-STEM fields (a fifteen-point gap or more).
    • Regardless of where they are from, and regardless of what they studied, graduates from “highly-ranked” universities (no definition given, unfortunately) were more likely to leave Canada, presumably because degree prestige confers a certain degree of mobility.

    You are now fully up to date on the latest data on domestic and international graduates and their immigration pathways. Enjoy your day.

    Source link

  • Global demand for US master’s degrees plunges by 60%

    The data, collected from January 6 to September 28, aligns closely with the start of Donald Trump’s second presidential term and the ensuing uncertainty around student visas and post-graduation work opportunities. It is based on the search behaviour of over 50 million prospective students on Studyportals.  

    “Prospective international students and their families weigh not only academic reputation but also regulatory stability and post-graduation prospects,” said Studyportals CEO Edwin van Rest: “Right now, those factors are working against institutions.”  

    Studyportals said the steep decline – dropping more than 60% in less than nine months – corresponds to proposed and enacted policy changes impacting student visa duration, Optional Practical Training (OPT) and H-1B work authorisation in the US. 

    Last week, the Trump administration shocked businesses and prospective employees by hiking the H-1B visa fee to $100,000 – over 20 times what employers previously paid. Days later, the government announced proposals to overhaul the visa system in favour of higher-paid workers.  

    Sector leaders have warned that OPT could be the administration’s next target, after a senior US senator called on the homeland security secretary Kristi Noem to stop issuing work authorisations such as OPT to international students.  

    Such a move would have a detrimental impact on student interest in the US, with a recent NAFSA survey suggesting that losing OPT reduces enrolment likelihood from 67% to 48%.  

    Meanwhile, roughly half of current students planning to stay in the US after graduation would abandon those plans if H-1B visas prioritised higher wage earners, the survey indicated.  

    “Prospective students are making go/no-go enrolment decisions, while current students are making stay/leave retention decisions,” said van Rest. 

    “Policy changes ripple through both ends of the pipeline, reducing new inflow and pushing out existing talent already contributing to US research, innovation and competitiveness,” he added.  

    Data: Studyportals

    The search data revealed a spike in interest at the beginning of July, primarily from Vietnam and Bangladesh, and to a lesser extent India and Pakistan. Experts have suggested the new Jardine-Fulbright Scholarship aimed at empowering future Vietnam leaders could have contributed to the rise.  

    Meanwhile, Iran, Nepal and India have seen the steepest drops in master’s demand, declining more than 60% this year to date compared to last.  

    While federal SEVIS data recorded a 0.8% rise in international student levels this semester, plummeting visa arrivals and anecdotal reports of fewer students on campus suggest the rise was in part due to OPT extensions – individuals who are counted in student totals but who are not enrolled on US campuses or paying tuition fees.  

    Beyond the immediate financial concerns of declining international enrolments for some schools, van Rest warned: “The policies we adopt today will echo for years in global talent flows.”

    The UK and Ireland have gained the most relative market share of international interest on Studyportals – both up 16% compared to the same period in 2024. Australia, Austria, Sweden and Spain all experienced a 12% increase on the previous year.  

    In the US, international students make up over half of all students enrolled in STEM fields and 70% of all full-time graduate enrolments in AI-related disciplines, according to Institute of International Education (IIE) data.  

    The policies we adopt today will echo for years in global talent flows

    Edwin van Rest, Studyportals

    What’s more, universities with higher rates of international enrolment have been found to produce more domestic STEM graduates, likely due to greater investment in these disciplines, National Foundation for American Policy (NFAP) research has shown.  

    Last year, graduate students made up 45% of the overall international student cohort (including OPT), compared to undergraduates, who comprised roughly 30%, according to IIE Open Doors data.  

    The news of plummeting international demand comes as domestic enrolments are declining, with fewer high school graduates entering college and an overall demographic decline in university-age students.  

    In a recent survey by the American Council on Education (ACE), nearly three quarters of college leaders said they were concerned about enrolment levels this semester, with 65% moderately or extremely worried about immigration restrictions and visa revocations.  

    Source link

  • Why critical data literacy belongs in every K–12 classroom

    An unexpected group of presenters–11th graders from Whitney M. Young Magnet High School in Chicago–made a splash at this year’s ACM Conference on Fairness, Accountability, and Transparency (FAccT). These students captivated seasoned researchers and professionals with their insights on how school environments shape students’ views of AI. “I wanted our project to serve as a window into the eyes of high school students,” said Autumn Moon, one of the student researchers.

    What enabled these students to contribute meaningfully to a conference dominated by PhDs and industry veterans was their critical data literacy–the ability to understand, question, and evaluate the ethics of complex systems like AI using data. They developed these skills through their school’s Data is Power program.

    Launched last year, Data is Power is a collaboration among K-12 educators, AI ethics researchers, and the Young Data Scientists League. The program includes four pilot modules that are aligned to K-12 standards and cover underexplored but essential topics in AI ethics, including labor and environmental impacts. The goal is to teach AI ethics by focusing on community-relevant topics chosen by our educators with input from students, all while fostering critical data literacy. For example, Autumn’s class in Chicago used AI ethics as a lens to help students distinguish between evidence-based research and AI propaganda. Students in Phoenix explored how conversational AI affects different neighborhoods in their city.

    Why does the Data is Power program focus on critical data literacy? In my former role leading a diverse AI team at Amazon, I saw that technical skills alone weren’t enough. We needed people who could navigate cultural nuance, question assumptions, and collaborate across disciplines. Some of the most technically proficient candidates struggled to apply their knowledge to real-world problems. In contrast, team members trained in critical data literacy–those who understood both the math and the societal context of the models–were better equipped to build responsible, practical tools. They also knew when not to build something.

    As AI becomes more embedded in our lives, and many students feel anxious about AI supplanting their job prospects, critical data literacy is a skill that is not just future-proof–it is future-necessary. Students (and all of us) need the ability to grapple with and think critically about AI and data in their lives and careers, no matter what they choose to pursue. As Milton Johnson, a physics and engineering teacher at Bioscience High School in Phoenix, told me: “AI is going to be one of those things where, as a society, we have a responsibility to make sure everyone has access in multiple ways.”

    Critical data literacy is as much about the humanities as it is about STEM. “AI is not just for computer scientists,” said Karren Boatner, who taught Autumn in her English literature class at Whitney M. Young Magnet High School. For Karren, who hadn’t considered herself a “math person” previously, one of the most surprising parts of the program was how much she and her students enjoyed a game-based module that used middle school math to explain how AI “learns.” Connecting math and literature to culturally relevant, real-world issues helps students see both subjects in a new light.

    As AI continues to reshape our world, schools must rethink how to teach about it. Critical data literacy helps students see the relevance of what they’re learning, empowering them to ask better questions and make more informed decisions. It also helps educators connect classroom content to students’ lived experiences.

    If education leaders want to prepare students for the future–not just as workers, but as informed citizens–they must invest in critical data literacy now. As Angela Nguyen, one of our undergraduate scholars from Stanford, said in her Data is Power talk: “Data is power–especially youth and data. All of us, whether qualitative or quantitative, can be great collectors of meaningful data that helps educate our own communities.”

    Source link

  • Education at a Glance 2025, Part 2

    Three weeks ago, the Organization for Economic Co-operation and Development (OECD) released its annual stat fest, Education at a Glance (see last week’s blog for more on this year’s higher education and financing data). The most interesting thing about this edition is that the OECD chose to release some new data from the recent Programme for International Assessment of Adult Competencies (PIAAC) relating to literacy and numeracy levels that were included in the PIAAC 2013 release (see also here), but not in the December 2024 release.   

    (If you need a refresher: PIAAC is kind of like the Programme for International Student Assessment (PISA) but for adults and is carried out once a decade so countries can see for themselves how skilled their workforces are in terms of literacy, numeracy, and problem-solving).

    The specific details of interest that were missing in the earlier data release were on skill level by level of education (or more specifically, highest level of education achieved). OECD for some reason cuts the data into three – below upper secondary, upper secondary and post-secondary non-tertiary, and tertiary. Canada has a lot of post-secondary non-tertiary programming (a good chunk of community colleges are described this way) but for a variety of reasons lumps all college diplomas in with university degrees as “tertiary”, which makes analysis and comparison a bit difficult. But we can only work with the data the OECD gives us, so…

    Figures 1, 2 and 3 show PIAAC results for a number of OECD countries, comparing averages for just the Upper Secondary/Post-Secondary Non-Tertiary (which I am inelegantly going to label “US/PSNT”) and Tertiary educational attainment. They largely tell similar stories. Japan and Finland tend to be ranked towards the top of the table on all measures, while Korea, Poland and Chile tend to be ranked towards the bottom. Canada tends to be ahead of the OECD average at both levels of education, but not by much. The gap between US/PSNT and Tertiary results is significantly smaller on the “problem-solving” measure than on the others (which is interesting and arguably does not say very nice things about the state of tertiary education, but that’s maybe for another day). Maybe the most spectacular single result is that Finns with only US/PSNT education have literacy scores higher than university graduates in all but four other countries, including Canada.

    Figure 1: PIAAC Average Literacy Scores by Highest Level of Education Attained, Population Aged 25-64, Selected OECD Countries

    Figure 2: PIAAC Average Numeracy Scores by Highest Level of Education Attained, Population Aged 25-64, Selected OECD Countries

    Figure 3: PIAAC Average Problem-Solving Scores by Highest Level of Education Attained, Population Aged 25-64, Selected OECD Countries

    Another thing that stands out across all of these graphs is that the gap between US/PSNT and tertiary graduates is far from uniform. In some countries the gap is quite small (e.g. Sweden) and in other countries it is quite large (e.g. Chile, France, Germany). What’s going on here, and does it suggest something about the effectiveness of tertiary education systems in different countries (i.e. most effective where the gaps are high, least effective where they are low)?

    Well, not necessarily. First, remember that the sample population is aged 25-64, and education systems undergo a lot of change in 40 years (for one thing, Poland, Chile and Korea were all dictatorships 40 years ago). Also, since we know that scores on these kinds of tests decline with age, demographic patterns matter too. Second, the relative size of systems matters. Imagine two countries whose secondary and tertiary systems had the same “quality”, but one tertiary system took in half of all high school graduates and the other only took in 10%. Chances are the latter would have better “results” at the tertiary level, but it would be entirely due to selection effects rather than to treatment effects.
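    A quick simulation makes the point – everything here is synthetic, with both “countries” drawn from the same ability distribution and differing only in how selective tertiary entry is:

    ```python
    import random

    random.seed(1)

    # Synthetic population of school leavers with PIAAC-style scores; identical
    # "teaching quality" everywhere, so any tertiary advantage is pure selection.
    population = sorted(random.gauss(270, 40) for _ in range(100_000))

    def tertiary_gap(entry_share: float) -> float:
        """Admit the top `entry_share` of the distribution to tertiary; compare group means."""
        cut = int(len(population) * (1 - entry_share))
        rest, tertiary = population[:cut], population[cut:]
        return sum(tertiary) / len(tertiary) - sum(rest) / len(rest)

    print(f"Selective system (top 10%): gap of about {tertiary_gap(0.10):.0f} points")
    print(f"Open-access system (top 50%): gap of about {tertiary_gap(0.50):.0f} points")
    # The selective system shows the bigger tertiary "advantage" despite identical quality.
    ```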

    Can we control for these things? A bit. We can certainly control for the wide age-range because OECD breaks down the data by age. Re-doing Figures 1-3, but restricting the age range to 25-34, would at least get rid of the “legacy” part of the problem. This I do below in Figures 4-6. Surprisingly little changes as a result. The absolute scores are all higher, but you’d expect that given what we know about skill loss over time. Across the board, Canada remains just slightly ahead of the OECD average. Korea does a bit better in general and Italy does a little bit worse, but otherwise the rank-order of results is pretty similar to what we saw for the general population (which I think is a pretty interesting finding when you think of how much effort countries put into messing around with their education systems… does any of it matter?)

    Figure 4: PIAAC Average Literacy Scores by Highest Level of Education Attained, Population Aged 25-34, Selected OECD Countries

    Figure 5: PIAAC Average Numeracy Scores by Highest Level of Education Attained, Population Aged 25-34, Selected OECD Countries

    Figure 6: PIAAC Average Problem-Solving Scores by Highest Level of Education Attained, Population Aged 25-34, Selected OECD Countries

    Now, let’s turn to the question of whether or not we can control for selectivity. Back in 2013, I tried doing something like that, but it was only possible because OECD released PIAAC scores not just as averages but also in terms of quartile thresholds, and that isn’t the case this time. But what we can do is look a bit at the relationship between i) the size of the tertiary system relative to the size of the US/PSNT system (a measure of selectivity, basically) and ii) the degree to which results for tertiary students are higher than those for US/PSNT. 

    Which is what I do in Figure 7. The X-axis here is selectivity [tertiary attainment rate ÷ US/PSNT attainment rate] for 25-34 year olds (the further right on the graph, the more open-access the system), and the Y-axis is the PIAAC gap Σ [tertiary score – US/PSNT score] summed across the literacy, numeracy and problem-solving measures (the higher the value, the bigger the gap between tertiary and US/PSNT scores). It shows that countries like Germany, Chile and Italy are both more highly selective and have greater score gaps than countries like Canada and Korea, which are the reverse. It therefore provides what I would call light support for the theory that the less open/more selective a system of tertiary education is, the bigger the gap between Tertiary and US/PSNT scores on literacy, numeracy and problem-solving. Meaning, basically, beware of interpreting these gaps as evidence of relative system quality: they may well be effects of selection rather than treatment.
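    For clarity, here is how the two axes are computed for a single country – the attainment rates and scores below are invented placeholders, not OECD figures:

    ```python
    # Invented, illustrative numbers for one hypothetical country – not OECD data.
    country = {
        "tertiary_attainment": 0.60,   # share of 25-34 year-olds with tertiary education
        "us_psnt_attainment": 0.30,    # share with upper secondary / post-secondary non-tertiary
        "scores": {                    # (tertiary, US/PSNT) PIAAC averages
            "literacy":        (295, 270),
            "numeracy":        (290, 262),
            "problem_solving": (268, 255),
        },
    }

    # X-axis: selectivity proxy – tertiary attainment divided by US/PSNT attainment
    selectivity = country["tertiary_attainment"] / country["us_psnt_attainment"]

    # Y-axis: summed tertiary-minus-US/PSNT gap across the three PIAAC measures
    gap = sum(tertiary - us_psnt for tertiary, us_psnt in country["scores"].values())

    print(f"Selectivity (x): {selectivity:.2f}")   # higher ratio = more open-access system
    print(f"Score gap (y): {gap} points")          # 66 points summed across the three measures
    ```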

    Figure 7: Tertiary Attainment vs. PIAAC Score Gap, 25-34 year-olds

    That’s enough PIAAC fun for one Monday.  See you tomorrow.

    Source link