Category: privacy

  • K-12 districts are fighting ransomware, but IT teams pay the price

    Key points:

The education sector is making measurable progress in defending against ransomware, with fewer ransom payments, dramatically reduced costs, and faster recovery rates, according to Sophos' fifth annual State of Ransomware in Education report.

Still, these gains are accompanied by mounting pressure on IT teams, who report widespread stress, burnout, and career disruption following attacks: nearly 40 percent of the 441 IT and cybersecurity leaders surveyed reported dealing with anxiety.

Over the past five years, ransomware has emerged as one of the most pressing threats to education, with attacks becoming a daily occurrence. Primary and secondary institutions are seen by cybercriminals as “soft targets”: often underfunded, understaffed, and holding highly sensitive data. The consequences are severe: disrupted learning, strained budgets, and growing fears over student and staff privacy. Without stronger defenses, schools risk not only losing vital resources but also the trust of the communities they serve.

    Indicators of success against ransomware

The new study demonstrates that the education sector is getting better at detecting and responding to ransomware, forcing cybercriminals to evolve their approach. Trending data from the study reveals an increase in attacks where adversaries attempt to extort money without encrypting data. Unfortunately, paying the ransom remains part of the solution for about half of all victims. However, payment values are dropping significantly, and of those whose data was encrypted in a ransomware attack, 97 percent were able to recover it in some way. The study found several key indicators of success against ransomware in education:

    • Stopping more attacks: When it comes to blocking attacks before files can be encrypted, both K-12 and higher education institutions reported their highest success rate in four years (67 percent and 38 percent of attacks, respectively).
    • Following the money: In the last year, ransom demands fell 73 percent (an average drop of $2.83M), while average payments dropped from $6M to $800K in lower education and from $4M to $463K in higher education.
    • Plummeting cost of recovery: Outside of ransom payments, average recovery costs dropped 77 percent in higher education and 39 percent in K-12 education. Despite this success, K-12 education reported the highest recovery bill across all industries surveyed.

    Gaps still need to be addressed

    While the education sector has made progress in limiting the impact of ransomware, serious gaps remain. In the Sophos study, 64 percent of victims reported missing or ineffective protection solutions; 66 percent cited a lack of people (either expertise or capacity) to stop attacks; and 67 percent admitted to having security gaps. These risks highlight the critical need for schools to focus on prevention, as cybercriminals develop new techniques, including AI-powered attacks.

    Highlights from the study that shed light on the gaps that still need to be addressed include:

    • AI-powered threats: K-12 education institutions reported that 22 percent of ransomware attacks had origins in phishing. With AI enabling more convincing emails, voice scams, and even deepfakes, schools risk becoming test grounds for emerging tactics.
    • High-value data: Higher education institutions, custodians of AI research and large language model datasets, remain a prime target, with exploited vulnerabilities (35 percent) and security gaps the provider was not aware of (45 percent) as leading weaknesses that were exploited by adversaries.
    • Human toll: Every institution with encrypted data reported impacts on IT staff. Over one in four staff members took leave after an attack, nearly 40 percent reported heightened stress, and more than one-third felt guilty that they could not prevent the breach.

    “Ransomware attacks in education don’t just disrupt classrooms, they disrupt communities of students, families, and educators,” said Alexandra Rose, director of CTU Threat Research at Sophos. “While it’s encouraging to see schools strengthening their ability to respond, the real priority must be preventing these attacks in the first place. That requires strong planning and close collaboration with trusted partners, especially as adversaries adopt new tactics, including AI-driven threats.”

    Holding on to the gains

    Based on its work protecting thousands of educational institutions, Sophos experts recommend several steps to maintain momentum and prepare for evolving threats:

    • Focus on prevention: The dramatic success of lower education in stopping ransomware attacks before encryption offers a blueprint for the broader public sector. Organizations need to couple their detection and response efforts with measures that stop attacks before systems are compromised.
    • Secure funding: Explore new avenues such as the U.S. Federal Communications Commission’s E-Rate subsidies to strengthen networks and firewalls, and the UK’s National Cyber Security Centre initiatives, including its free cyber defense service for schools, to boost overall protection. These resources help schools both prevent and withstand attacks.
    • Unify strategies: Educational institutions should adopt coordinated approaches across sprawling IT estates to close visibility gaps and reduce risks before adversaries can exploit them.
    • Relieve staff burden: Ransomware takes a heavy toll on IT teams. Schools can reduce pressure and extend their capabilities by partnering with trusted providers for managed detection and response (MDR) and other around-the-clock expertise.
    • Strengthen response: Even with stronger prevention, schools must be prepared to respond when incidents occur. They can recover more quickly by building robust incident response plans, running simulations to prepare for real-world scenarios, and enhancing readiness with 24/7/365 services like MDR.

Data for the State of Ransomware in Education 2025 report comes from a vendor-agnostic survey of 441 IT and cybersecurity leaders – 243 from K-12 education and 198 from higher education institutions hit by ransomware in the past year. The organizations surveyed ranged in size from 100 to 5,000 employees and spanned 17 countries. The survey was conducted between January and March 2025, and respondents were asked about their experience of ransomware over the previous 12 months.

    This press release originally appeared online.


  • Data, privacy, and cybersecurity in schools: A 2025 wake-up call

    Key points:

    In 2025, schools are sitting on more data than ever before. Student records, attendance, health information, behavioral logs, and digital footprints generated by edtech tools have turned K-12 institutions into data-rich environments. As artificial intelligence becomes a central part of the learning experience, these data streams are being processed in increasingly complex ways. But with this complexity comes a critical question: Are schools doing enough to protect that data?

    The answer, in many cases, is no.

    The rise of shadow AI

    According to CoSN’s May 2025 State of EdTech District Leadership report, a significant portion of districts, specifically 43 percent, lack formal policies or guidance for AI use. While 80 percent of districts have generative AI initiatives underway, this policy gap is a major concern. At the same time, Common Sense Media’s Teens, Trust and Technology in the Age of AI highlights that many teens have been misled by fake content and struggle to discern truth from misinformation, underscoring the broad adoption and potential risks of generative AI.

    This lack of visibility and control has led to the rise of what many experts call “shadow AI”: unapproved apps and browser extensions that process student inputs, store them indefinitely, or reuse them to train commercial models. These tools are often free, widely adopted, and nearly invisible to IT teams. Shadow AI expands the district’s digital footprint in ways that often escape policy enforcement, opening the door to data leakage and compliance violations. CoSN’s 2025 report specifically notes that “free tools that are downloaded in an ad hoc manner put district data at risk.”

    Data protection: The first pillar under pressure

The U.S. Department of Education’s AI Toolkit for Schools urges districts to treat student data with the same care as medical or financial records. However, many AI tools used in classrooms today are not inherently FERPA-compliant and do not always disclose where or how student data is stored. Teachers experimenting with AI-generated lesson plans or feedback may unknowingly input student work into platforms that retain or share that data. In the absence of vendor transparency, there is no way to verify how long data is stored, whether it is shared with third parties, or how it might be reused. FERPA requires that third-party vendors handling student data on behalf of an institution comply with its provisions, including ensuring data is not used for unintended purposes or retained for AI training.

    Some tools, marketed as “free classroom assistants,” require login credentials tied to student emails or learning platforms. This creates additional risks if authentication mechanisms are not protected or monitored. Even widely used generative tools may include language in their privacy policies allowing them to use uploaded content for system training or performance optimization.

    Data processing and the consent gap

    Generative AI models are trained on large datasets, and many free tools continue learning from user prompts. If a student pastes an essay or a teacher includes student identifiers in a prompt, that information could enter a commercial model’s training loop. This creates a scenario where data is being processed without explicit consent, potentially in violation of COPPA (Children’s Online Privacy Protection Act) and FERPA. While the FTC’s December 2023 update to the COPPA Rule did not codify school consent provisions, existing guidance still allows schools to consent to technology use on behalf of parents in educational contexts. However, the onus remains on schools to understand and manage these consent implications, especially with the rule’s new amendments becoming effective June 21, 2025, which strengthen protections and require separate parental consent for third-party disclosures for targeted advertising.

    Moreover, many educators and students are unaware of what constitutes “personally identifiable information” (PII) in these contexts. A name combined with a school ID number, disability status, or even a writing sample could easily identify a student, especially in small districts. Without proper training, well-intentioned AI use can cross legal lines unknowingly.
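One way to make this concrete: before any prompt leaves the district, likely PII can be masked with placeholders. The sketch below is purely illustrative, assuming naive regex patterns and a hypothetical `redact_prompt` helper; a real deployment would rely on a vetted data-loss-prevention tool, not hand-rolled regexes.

```python
import re

# Illustrative sketch only: naive patterns for a few common PII fields.
# These regexes and the redact_prompt helper are hypothetical examples,
# not a complete or production-ready solution.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "student_id": re.compile(r"\b\d{6,9}\b"),          # e.g., school ID numbers
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact_prompt(text: str) -> str:
    """Replace likely PII with placeholders before text leaves the district."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text
```

Even a crude filter like this illustrates the point of the paragraph above: a name, an ID number, or a writing sample on its own may look harmless, but combinations of such fields are what make a student identifiable.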

    Cybersecurity risks multiply

    AI tools have also increased the attack surface of K-12 networks. According to ThreatDown’s 2024 State of Ransomware in Education report, ransomware attacks on K-12 schools increased by 92 percent between 2022 and 2023, with 98 total attacks in 2023. This trend is projected to continue as cybercriminals use AI to create more targeted phishing campaigns and detect system vulnerabilities faster. AI-assisted attacks can mimic human language and tone, making them harder to detect. Some attackers now use large language models to craft personalized emails that appear to come from school administrators.

    Many schools lack endpoint protection for student devices, and third-party integrations often bypass internal firewalls. Free AI browser extensions may collect keystrokes or enable unauthorized access to browser sessions. The more tools that are introduced without IT oversight, the harder it becomes to isolate and contain incidents when they occur. CoSN’s 2025 report indicates that 60 percent of edtech leaders are “very concerned about AI-enabled cyberattacks,” yet 61 percent still rely on general funds for cybersecurity efforts, not dedicated funding.

    Building a responsible framework

    To mitigate these risks, school leaders need to:

    • Audit tool usage with platforms like Lightspeed Digital Insight (vetted by 1EdTech for data privacy) to identify AI tools being accessed without approval, and maintain a living inventory of all digital tools.
    • Develop and publish AI use policies that clarify acceptable practices, define data handling expectations, and outline consequences for misuse. Policies should distinguish between tools approved for instructional use and those requiring further evaluation.
    • Train educators and students to understand how AI tools collect and process data, how to interpret AI outputs critically, and how to avoid inputting sensitive information. AI literacy should be embedded in digital citizenship curricula, with resources available from organizations like Common Sense Media and aiEDU.
    • Vet all third-party apps through standards like the 1EdTech TrustEd Apps program. Contracts should specify data deletion timelines and limit secondary data use. The TrustEd Apps program has vetted over 12,000 products, providing a valuable resource for districts.
    • Simulate phishing attacks and test breach response protocols regularly. Cybersecurity training should be required for staff, and recovery plans must be reviewed annually.
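The audit and vetting steps above boil down to one recurring check: compare what is actually running on the network against the district's approved inventory. A minimal sketch, assuming a hypothetical approved list and invented tool names:

```python
# Hypothetical sketch: flag tools observed on the network that are not in the
# district's approved inventory. The tool names here are invented examples,
# not an endorsement of any product.
APPROVED_TOOLS = {"Google Classroom", "Canvas", "Khanmigo"}

def flag_unapproved(observed_tools: list[str]) -> list[str]:
    """Return the tools that need review before students may use them."""
    return sorted(t for t in set(observed_tools) if t not in APPROVED_TOOLS)
```

Running such a check on monitoring exports at a regular cadence is what turns a one-time audit into the "living inventory" the recommendations call for.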

    Trust starts with transparency

    In the rush to embrace AI, schools must not lose sight of their responsibility to protect students’ data and privacy. Transparency with parents, clarity for educators, and secure digital infrastructure are not optional. They are the baseline for trust in the age of algorithmic learning.

    AI can support personalized learning, but only if we put safety and privacy first. The time to act is now. Districts that move early to build policies, offer training, and coordinate oversight will be better prepared to lead AI adoption with confidence and care.
