NCAN said mixed-status families must make their own decision regarding whether they feel comfortable filling out the FAFSA.
College access organizations are raising concerns about students from mixed-status families—families with members who hold different immigration statuses—who are filling out the Free Application for Federal Student Aid amid the Trump administration’s mass deportation campaign.
“Although the Higher Education Act prohibits the use of data for any purpose other than determining and awarding federal financial assistance, [the National College Attainment Network] cannot assure mixed-status students and families that data submitted to [the] US Department of Education (ED), as part of the FAFSA process, will continue to be protected,” NCAN, which represents college access organizations across the nation, wrote in new guidance.
The organization added that the Office of Federal Student Aid has said the Education Department won’t share information that breaks the law.
But “we understand many families’ confidence in this statement may not be as certain under the current administration,” the organization continued. The post advised families to consider whether to submit a FAFSA on a “case-by-case basis.”
The organization had previously published similar guidance before President Donald Trump even took office but updated it after the 2026–27 FAFSA opened late last month. Zenia Henderson, chief program officer for NCAN, said the organization has received a slew of questions about the security of the personal information entered into the FAFSA, and many of its member organizations are reporting that some of the families they work with are forgoing the FAFSA out of fear.
The Trump administration has also attacked programs and initiatives that help undocumented students themselves access higher education. The administration has demanded states stop offering in-state tuition to undocumented students and has attempted to eliminate the Deferred Action for Childhood Arrivals program, which protects from deportation certain undocumented individuals who were brought to the country as children and has opened the door to higher education for this group.
Other experts and advocacy groups agreed that there is cause for concern among mixed-status families.
“Concerns are very much warranted in light of how cross-agency collaboration has been weaponized against immigrant families in recent months—including but not limited to the ostensible collusion between the Departments of Justice and Homeland Security to vacate active asylum cases when parents and children are lawfully appearing in immigration court, so that they can be apprehended on the premises by immigration enforcement and placed in detention,” wrote Faisal Al-Juburi, chief external affairs officer for RAICES, a nonprofit immigrant law center in Texas, in an email to Inside Higher Ed. “There is simply no indication that the Trump administration will adhere to legal precedent.”
Will Davies, director of policy and research for Breakthrough Central Texas, a college access organization, noted in an email to Inside Higher Ed that, even though the Trump administration’s immigration attacks have been especially worrying for mixed-status families, such families have long had to make difficult decisions about when to submit personal information to the government.
He also noted that FAFSA data is protected by the Privacy Act of 1974 and the Family Educational Rights and Privacy Act and said that, to his knowledge, no undocumented parent has ever been targeted using FAFSA data.
Cutting Off Access
For many families, the choice is not as clear-cut as simply not filling out the FAFSA. Most institutions and states calculate their financial aid offerings using the FAFSA’s formula and require students to fill out the FAFSA to take advantage of that aid. If mixed-status families do not complete the FAFSA, they are essentially cutting themselves off from almost all sources of assistance in paying for college.
“It has the potential to close a lot of doors in terms of accessing aid that’s needed, from last-dollar scholarships to merit-based scholarships,” Henderson said. “There are so many folks that ask for FAFSA information and that [the] application be completed in order to check eligibility, because they may not have their own systems or processes in place. FAFSA really is the default way to prove need.”
Three states—California, New York and Washington—have developed their own financial need calculation tools for individuals who want to be considered only for state and local aid. All three address privacy concerns, stating specifically that the data will not be provided to the federal government without a court order.
“The opportunity to pursue an education is highly valued, and financial aid is the only way many students can afford college or training,” the Washington Student Achievement Council wrote in a message, released days after Trump entered office, about aid applicant privacy. WSAC administers Washington’s state aid calculator.
“We sympathize deeply with anyone concerned about their privacy in applying for financial aid, and we support students and families in making decisions that best fit their educational goals and risk considerations. While WSAC cannot provide guidance on what a family should do in a specific situation, we do encourage students, families, educators, and advocates to review the following resources that may provide helpful information.”
Alison De Lucca, executive director of the Southern California College Attainment Network, told Inside Higher Ed in an email that her organization is working with several families who are uncertain if they will fill out the FAFSA this year; an estimated one in every five individuals under the age of 18 in California comes from a mixed-status family.
One SoCal CAN student opted to fill out just California’s state aid form, the California Dream Act Application, this year in order to protect her mother—even though she thought she might have benefited from federal aid.
The education sector is making measurable progress in defending against ransomware, with fewer ransom payments, dramatically reduced costs, and faster recovery rates, according to Sophos’s fifth annual State of Ransomware in Education report.
Still, these gains are accompanied by mounting pressures on IT teams, who report widespread stress, burnout, and career disruptions following attacks: nearly 40 percent of the 441 IT and cybersecurity leaders surveyed reported dealing with anxiety.
Over the past five years, ransomware has emerged as one of the most pressing threats to education, with attacks becoming a daily occurrence. Primary and secondary institutions are seen by cybercriminals as “soft targets”: often underfunded, understaffed, and holding highly sensitive data. The consequences are severe: disrupted learning, strained budgets, and growing fears over student and staff privacy. Without stronger defenses, schools risk losing not only vital resources but also the trust of the communities they serve.
Indicators of success against ransomware
The new study demonstrates that the education sector is getting better at reacting and responding to ransomware, forcing cybercriminals to evolve their approach. Trending data from the study reveals an increase in attacks where adversaries attempt to extort money without encrypting data. Unfortunately, paying the ransom remains part of the recovery for about half of all victims. However, payment values are dropping significantly, and among those whose data was encrypted in an attack, 97 percent were able to recover it in some way. The study found several key indicators of success against ransomware in education:
Stopping more attacks: When it comes to blocking attacks before files can be encrypted, both K-12 and higher education institutions reported their highest success rate in four years (67 percent and 38 percent of attacks, respectively).
Following the money: In the last year, ransom demands fell 73 percent (an average drop of $2.83M), while average payments dropped from $6M to $800K in lower education and from $4M to $463K in higher education.
Plummeting cost of recovery: Outside of ransom payments, average recovery costs dropped 77 percent in higher education and 39 percent in K-12 education. Despite this success, K-12 education reported the highest recovery bill across all industries surveyed.
Gaps still need to be addressed
While the education sector has made progress in limiting the impact of ransomware, serious gaps remain. In the Sophos study, 64 percent of victims reported missing or ineffective protection solutions; 66 percent cited a lack of people (either expertise or capacity) to stop attacks; and 67 percent admitted to having security gaps. These risks highlight the critical need for schools to focus on prevention, as cybercriminals develop new techniques, including AI-powered attacks.
Highlights from the study that shed light on the gaps that still need to be addressed include:
AI-powered threats: K-12 education institutions reported that 22 percent of ransomware attacks had origins in phishing. With AI enabling more convincing emails, voice scams, and even deepfakes, schools risk becoming test grounds for emerging tactics.
High-value data: Higher education institutions, custodians of AI research and large language model datasets, remain a prime target, with exploited vulnerabilities (35 percent) and security gaps the provider was not aware of (45 percent) as leading weaknesses that were exploited by adversaries.
Human toll: Every institution with encrypted data reported impacts on IT staff. Over one in four staff members took leave after an attack, nearly 40 percent reported heightened stress, and more than one-third felt guilt that they could not prevent the breach.
“Ransomware attacks in education don’t just disrupt classrooms, they disrupt communities of students, families, and educators,” said Alexandra Rose, director of CTU Threat Research at Sophos. “While it’s encouraging to see schools strengthening their ability to respond, the real priority must be preventing these attacks in the first place. That requires strong planning and close collaboration with trusted partners, especially as adversaries adopt new tactics, including AI-driven threats.”
Holding on to the gains
Based on its work protecting thousands of educational institutions, Sophos experts recommend several steps to maintain momentum and prepare for evolving threats:
Focus on prevention: The dramatic success of lower education in stopping ransomware attacks before encryption offers a blueprint for broader public sector organizations. Organizations need to couple their detection and response efforts with preventing attacks before they compromise the organization.
Secure funding: Explore new avenues such as the U.S. Federal Communications Commission’s E-Rate subsidies to strengthen networks and firewalls, and the UK’s National Cyber Security Centre initiatives, including its free cyber defense service for schools, to boost overall protection. These resources help schools both prevent and withstand attacks.
Unify strategies: Educational institutions should adopt coordinated approaches across sprawling IT estates to close visibility gaps and reduce risks before adversaries can exploit them.
Relieve staff burden: Ransomware takes a heavy toll on IT teams. Schools can reduce pressure and extend their capabilities by partnering with trusted providers for managed detection and response (MDR) and other around-the-clock expertise.
Strengthen response: Even with stronger prevention, schools must be prepared to respond when incidents occur. They can recover more quickly by building robust incident response plans, running simulations to prepare for real-world scenarios, and enhancing readiness with 24/7/365 services like MDR.
Data for the State of Ransomware in Education 2025 report comes from a vendor-agnostic survey of 441 IT and cybersecurity leaders (243 from K-12 education and 198 from higher education) whose institutions were hit by ransomware in the past year. The organizations surveyed ranged in size from 100 to 5,000 employees and spanned 17 countries. The survey was conducted between January and March 2025, and respondents were asked about their experience of ransomware over the previous 12 months.
In 2025, schools are sitting on more data than ever before. Student records, attendance, health information, behavioral logs, and digital footprints generated by edtech tools have turned K-12 institutions into data-rich environments. As artificial intelligence becomes a central part of the learning experience, these data streams are being processed in increasingly complex ways. But with this complexity comes a critical question: Are schools doing enough to protect that data?
The answer, in many cases, is no.
The rise of shadow AI
According to CoSN’s May 2025 State of EdTech District Leadership report, 43 percent of districts lack formal policies or guidance for AI use. While 80 percent of districts have generative AI initiatives underway, this policy gap is a major concern. At the same time, Common Sense Media’s Teens, Trust and Technology in the Age of AI highlights that many teens have been misled by fake content and struggle to discern truth from misinformation, underscoring both the broad adoption and the potential risks of generative AI.
This lack of visibility and control has led to the rise of what many experts call “shadow AI”: unapproved apps and browser extensions that process student inputs, store them indefinitely, or reuse them to train commercial models. These tools are often free, widely adopted, and nearly invisible to IT teams. Shadow AI expands the district’s digital footprint in ways that often escape policy enforcement, opening the door to data leakage and compliance violations. CoSN’s 2025 report specifically notes that “free tools that are downloaded in an ad hoc manner put district data at risk.”
Data protection: The first pillar under pressure
The U.S. Department of Education’s AI Toolkit for Schools urges districts to treat student data with the same care as medical or financial records. However, many AI tools used in classrooms today are not inherently FERPA-compliant and do not always disclose where or how student data is stored. Teachers experimenting with AI-generated lesson plans or feedback may unknowingly input student work into platforms that retain or share that data. In the absence of vendor transparency, there is no way to verify how long data is stored, whether it is shared with third parties, or how it might be reused. FERPA requires that third-party vendors who handle student data on behalf of an institution comply with FERPA, including ensuring data is not used for unintended purposes or retained for AI training.
Some tools, marketed as “free classroom assistants,” require login credentials tied to student emails or learning platforms. This creates additional risks if authentication mechanisms are not protected or monitored. Even widely used generative tools may include language in their privacy policies allowing them to use uploaded content for system training or performance optimization.
Data processing and the consent gap
Generative AI models are trained on large datasets, and many free tools continue learning from user prompts. If a student pastes an essay or a teacher includes student identifiers in a prompt, that information could enter a commercial model’s training loop. This creates a scenario where data is being processed without explicit consent, potentially in violation of COPPA (the Children’s Online Privacy Protection Act) and FERPA. While the FTC’s December 2023 update to the COPPA Rule did not codify school consent provisions, existing guidance still allows schools to consent to technology use on behalf of parents in educational contexts. The onus remains on schools to understand and manage these consent implications, however, especially as the rule’s new amendments, effective June 21, 2025, strengthen protections and require separate parental consent for third-party disclosures for targeted advertising.
Moreover, many educators and students are unaware of what constitutes “personally identifiable information” (PII) in these contexts. A name combined with a school ID number, disability status, or even a writing sample could easily identify a student, especially in small districts. Without proper training, well-intentioned AI use can cross legal lines unknowingly.
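The kind of screening a district might apply before a prompt leaves its network can be sketched in a few lines. This is a minimal illustration only: the patterns and field names below are hypothetical, and real PII detection would need far broader coverage (names, addresses, disability status, writing samples) than simple regular expressions can provide.

```python
import re

# Hypothetical patterns for illustration; real PII detection needs
# much broader coverage than regexes alone can offer.
PII_PATTERNS = {
    "student_id": re.compile(r"\b\d{6,9}\b"),            # bare ID-like numbers
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def screen_prompt(prompt: str) -> tuple[str, list[str]]:
    """Redact likely PII from a prompt and report which kinds were found."""
    hits = []
    for label, pattern in PII_PATTERNS.items():
        if pattern.search(prompt):
            hits.append(label)
            prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt, hits
```

A filter like this would run before any text reaches an external AI tool, flagging the prompt for human review rather than silently passing it through.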
Cybersecurity risks multiply
AI tools have also increased the attack surface of K-12 networks. According to ThreatDown’s 2024 State of Ransomware in Education report, ransomware attacks on K-12 schools increased by 92 percent between 2022 and 2023, with 98 total attacks in 2023. This trend is projected to continue as cybercriminals use AI to create more targeted phishing campaigns and detect system vulnerabilities faster. AI-assisted attacks can mimic human language and tone, making them harder to detect. Some attackers now use large language models to craft personalized emails that appear to come from school administrators.
Many schools lack endpoint protection for student devices, and third-party integrations often bypass internal firewalls. Free AI browser extensions may collect keystrokes or enable unauthorized access to browser sessions. The more tools that are introduced without IT oversight, the harder it becomes to isolate and contain incidents when they occur. CoSN’s 2025 report indicates that 60 percent of edtech leaders are “very concerned about AI-enabled cyberattacks,” yet 61 percent still rely on general funds for cybersecurity efforts, not dedicated funding.
Building a responsible framework
To mitigate these risks, school leaders need to:
Audit tool usage with platforms like Lightspeed Digital Insight to identify AI tools being accessed without approval. Districts should maintain a living inventory of all digital tools. Lightspeed Digital Insight, for example, is vetted by 1EdTech for data privacy.
Develop and publish AI use policies that clarify acceptable practices, define data handling expectations, and outline consequences for misuse. Policies should distinguish between tools approved for instructional use and those requiring further evaluation.
Train educators and students to understand how AI tools collect and process data, how to interpret AI outputs critically, and how to avoid inputting sensitive information. AI literacy should be embedded in digital citizenship curricula, with resources available from organizations like Common Sense Media and aiEDU.
Vet all third-party apps through standards like the1EdTech TrustEd Apps program. Contracts should specify data deletion timelines and limit secondary data use. The TrustEd Apps program has vetted over 12,000 products, providing a valuable resource for districts.
Simulate phishing attacks and test breach response protocols regularly. Cybersecurity training should be required for staff, and recovery plans must be reviewed annually.
Trust starts with transparency
In the rush to embrace AI, schools must not lose sight of their responsibility to protect students’ data and privacy. Transparency with parents, clarity for educators, and secure digital infrastructure are not optional. They are the baseline for trust in the age of algorithmic learning.
AI can support personalized learning, but only if we put safety and privacy first. The time to act is now. Districts that move early to build policies, offer training, and coordinate oversight will be better prepared to lead AI adoption with confidence and care.
Rishi Raj Gera, Magic Edtech
Rishi Raj Gera is the Chief Solutions Officer at Magic Edtech. Rishi brings over two decades of experience in designing digital learning systems that sit at the intersection of accessibility, personalization, and emerging technology. His work is driven by a consistent focus on building educational systems that adapt to individual learner needs while maintaining ethical boundaries and equity in design. Rishi continues to advocate for learning environments that are as human-aware as they are data-smart, especially in a time when technology is shaping how students engage with knowledge and one another.
Washington, D.C. – CoSN today awarded Delaware Area Career Center in Delaware, Ohio, the Trusted Learning Environment (TLE) Mini Seal in the Business Practice. The CoSN TLE Seal is a national distinction awarded to school districts implementing rigorous privacy policies and practices to help protect student information. Delaware Area Career Center is the sixth school district in Ohio to earn a TLE Seal or TLE Mini Seal. To date, TLE Seal recipients have improved privacy protections for over 1.2 million students.
The CoSN TLE Seal program requires that school systems uphold high standards for protecting student data privacy across five key practice areas: Leadership, Business, Data Security, Professional Development and Classroom. The TLE Mini Seal program enables school districts nationwide to build toward earning the full TLE Seal by addressing privacy requirements in one or more practice areas at a time. All TLE Seal and Mini Seal applicants receive feedback and guidance to help them improve their student data privacy programs.
“CoSN is committed to supporting districts as they address the complex demands of student data privacy. We’re proud to see Delaware Area Career Center take meaningful steps to strengthen its privacy practices and to see the continued growth of the TLE Seal program in Ohio,” said Keith Krueger, CEO, CoSN.
“Earning the TLE Mini Seal is a tremendous acknowledgement of the work we’ve done to uphold high standards in safeguarding student data. This achievement inspires confidence in our community and connects us through a shared commitment to privacy, transparency and security at every level,” said Rory Gaydos, Director of Information Technology, Delaware Area Career Center.
The CoSN TLE Seal is the only privacy framework designed specifically for school systems. Earning the TLE Seal requires that school systems have taken measurable steps to implement, maintain and improve organization-wide student data privacy practices. All TLE Seal recipients are required to demonstrate that improvement through a reapplication process every two years.
About CoSN
CoSN, the world-class professional association for K-12 EdTech leaders, stands at the forefront of education innovation. We are driven by a mission to equip current and aspiring K-12 education technology leaders, their teams, and school districts with the community, knowledge, and professional development they need to cultivate engaging learning environments. Our vision is rooted in a future where every learner reaches their unique potential, guided by our community. CoSN represents over 13 million students and continues to grow as a powerful and influential voice in K-12 education. www.cosn.org
About the CoSN Trusted Learning Environment Seal Program
The CoSN Trusted Learning Environment (TLE) Seal Program is the nation’s only data privacy framework for school systems, focused on building a culture of trust and transparency. The TLE Seal was developed by CoSN in collaboration with a diverse group of 28 school system leaders nationwide and with support from AASA, The School Superintendents Association, the Association of School Business Officials International (ASBO) and ASCD. School systems that meet the program requirements will earn the TLE Seal, signifying their commitment to student data privacy to their community. TLE Seal recipients also commit to continuous examination and demonstrable future advancement of their privacy practices. www.cosn.org/trusted
About Delaware Area Career Center
Delaware Area Career Center provides unique elective courses to high school students in Delaware County and surrounding areas. We work in partnership with partner high schools to enhance academic education with hands-on instruction that is focused on each individual student’s area of interest. DACC students still graduate from their home high school, but they do so with additional college credits, industry credentials, and valuable experiences. www.delawareareacc.org
eSchool Media staff cover education technology in all its aspects–from legislation and litigation, to best practices, to lessons learned and new products. First published in March of 1998 as a monthly print and digital newspaper, eSchool Media provides the news and information necessary to help K-20 decision-makers successfully use technology and innovation to transform schools and colleges and achieve their educational goals.
The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.
One student asked a search engine, “Why does my boyfriend hit me?” Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves.
In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state.
Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings.
The goal is to keep children safe, but these tools raise serious questions about privacy and security – as proven when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district’s surveillance technology.
The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives.
Tim Reiland, 42, center, the parent of daughter Zoe Reiland, 17, right, and Anakin Reiland, 15, photographed in Clinton, Miss., Monday, March 10, 2025, said he had no idea their previous schools, in Oklahoma, were using surveillance technology to monitor the students. (AP Photo/Rogelio V. Solis)
Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots.
Vancouver school staff and anyone else with links to the files could read everything. Firewalls or passwords didn’t protect the documents, and student names were not redacted, which cybersecurity experts warned was a massive security risk.
The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology’s unintended consequences in American schools.
In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe.
Gaggle, the company that developed the software that tracks Vancouver schools students’ online activity, believes not monitoring children is like letting them loose on “a digital playground without fences or recess monitors,” CEO and founder Jeff Patterson said.
Roughly 1,500 school districts nationwide use Gaggle’s software to track the online activity of approximately 6 million students. It’s one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance.
Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students’ well-being.
“I don’t think we could ever put a price on protecting students,” said Andy Meyer, principal of Vancouver’s Skyview High School. “Anytime we learn of something like that and we can intervene, we feel that is very positive.”
Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations.
“That’s not good at all,” Foster said after learning the district inadvertently released the records. “But what are my options? What do I do? Pull my kid out of school?”
Foster says she’d be upset if her daughter’s private information was compromised.
“At the same time,” she said, “I would like to avoid a school shooting or suicide.”
Gaggle uses a machine learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years – approximately the cost of employing one extra counselor.
The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check.
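The escalation path described above can be modeled roughly as follows. This is an illustrative sketch of the reported workflow only, not Gaggle’s actual implementation; the function and step names are invented for the example.

```python
def triage_alert(severity: str, school_answered: bool = True) -> list[str]:
    """Model the reported escalation path for AI monitoring alerts.

    severity: "low" (human reviewer dismisses), "serious" (the company
    alerts the school), or "imminent" (school officials are called
    directly; if no one answers, law enforcement may be contacted
    for a welfare check).
    """
    steps = ["algorithm flags content", "screenshot sent to human reviewer"]
    if severity == "low":
        steps.append("reviewer dismisses as false alarm")
    elif severity == "serious":
        steps.append("company alerts the school")
    elif severity == "imminent":
        steps.append("company calls school officials directly")
        if not school_answered:
            steps.append("law enforcement contacted for welfare check")
    return steps
```

The key design point, as described, is that a human reviewer sits between the algorithm and any notification: the machine only surfaces candidates, and people decide whether and how far to escalate.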
A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately.
“A lot of times, families don’t know. We open that door for that help,” the counselor said. Gaggle is “good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged.”
Seattle Times and AP reporters saw what kind of writing set off Gaggle’s alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren’t protected by a password.
After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer.
The company says the links must be accessible without a login during those 72 hours so emergency contacts—who often receive these alerts late at night on their phones—can respond quickly.
In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, many others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends.
Foster’s daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal’s office after writing a short story featuring a scene with mildly violent imagery.
“I’m glad they’re being safe about it, but I also think it can be a bit much,” Bryn said.
School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly.
“It allows me the opportunity to meet with a student I maybe haven’t met before and build that relationship,” said Chele Pierce, a Skyview High School counselor.
Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district’s enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert.
While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There’s no independent research showing it measurably lowers student suicide rates or reduces violence.
A 2023 RAND study found only “scant evidence” of either benefits or risks from AI surveillance, concluding: “No research to date has comprehensively examined how these programs affect youth suicide prevention.”
“If you don’t have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,” said report co-author Benjamin Boudreaux, an AI ethics researcher.
In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, trans or struggling with gender dysphoria.
LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and more likely to turn to the internet for support.
“We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,” said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.
In one screenshot, a Vancouver high schooler wrote in a Google survey form that they’d been subjected to trans slurs and racist bullying. It’s unclear who created the survey, but the person behind it had falsely promised confidentiality: “I am not a mandated reporter, please tell me the whole truth.”
When North Carolina’s Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful.
But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive.
Glenn Thompson, a Durham School of the Arts graduate, poses in front of the school in Durham, N.C., Monday, March 10, 2025. (AP Photo/Karl DeBlaker)
Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers promised a student confidentiality for an assignment related to mental health. A classmate was then “blindsided” when Gaggle alerted school officials about something private they’d disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle.
“You can’t just (surveil) people and not tell them. That’s a horrible breach of security and trust,” said Thompson, now a college student, in an interview.
After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults.
The debate over privacy and security is complicated, and parents are often unaware it’s even an issue. Pearce, the University of Washington professor, doesn’t remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district’s responsible use form before her son received a school laptop.
Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class.
For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking if his daughter could bring her personal laptop to school instead of being forced to use a district one because of privacy concerns.
The district refused Reiland’s request.
When his daughter, Zoe, found out about Gaggle, she says she felt so “freaked out” that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn’t want to get called into the office for “searching up lady parts.”
“I was too scared to be curious,” she said.
School officials say they don’t track metrics measuring the technology’s efficacy but believe it has saved lives.
Yet technology alone doesn’t create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with “deliberate indifference” to some families’ reports of sexual harassment, mainly in the form of homophobic bullying.
During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide.
When asked why bullying remained a problem despite surveillance, Russell Thornton, the district’s executive director of technology, responded: “This is one tool used by administrators. Obviously, one tool is not going to solve the world’s problems and bullying.”
Despite the risks, surveillance technology can help teachers intervene before a tragedy.
A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former superintendent Susan Enfield.
“They knew that the staff member was reading what they were writing,” Enfield said. “It was, in essence, that student’s way of asking for help.”
Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support.
“The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,” said Boudreaux, the AI ethics researcher.
Gaggle’s Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, “the school’s going to be held liable,” he said. “If you’re looking for that open free expression, it really can’t happen on the school system’s computers.”
Claire Bryan is an education reporter for The Seattle Times. Sharon Lurye is an education data reporter for The Associated Press.
This story about AI-powered surveillance at schools was produced by the Education Reporting Collaborative, a coalition of eight newsrooms that includes AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.