Tag: suicide

  • Podcast: Cuts, student suicide, widening access

    This week on the podcast we examine the government’s brutal funding cuts to universities.

    What does the £108m reduction in the Strategic Priorities Grant mean for higher education, and why are media studies and journalism courses losing their high-cost subject funding?

    Plus we discuss the independent review of student suicides, and explore new research on widening participation and regional disparities.

    With Shân Wareing, Vice Chancellor at Middlesex University, Richard Brabner, Executive Chair at the UPP Foundation, Debbie McVitty, Editor at Wonkhe and presented by Jim Dickinson, Associate Editor at Wonkhe.

  • Schools are surveilling kids to prevent gun violence or suicide. The lack of privacy comes at a cost

    The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.

    One student asked a search engine, “Why does my boyfriend hit me?” Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves.

    In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state.

    Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings.

The goal is to keep children safe, but these tools raise serious questions about privacy and security – as was made clear when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district’s surveillance technology.

    The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives.

Tim Reiland, 42, center, with his children Zoe Reiland, 17, right, and Anakin Reiland, 15, photographed in Clinton, Miss., Monday, March 10, 2025. Reiland said he had no idea their previous schools, in Oklahoma, were using surveillance technology to monitor students. (AP Photo/Rogelio V. Solis)

    Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots.

    Vancouver school staff and anyone else with links to the files could read everything. Firewalls or passwords didn’t protect the documents, and student names were not redacted, which cybersecurity experts warned was a massive security risk.

    The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology’s unintended consequences in American schools.

    In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe.

Gaggle, the company whose software tracks Vancouver students’ online activity, believes not monitoring children is like letting them loose on “a digital playground without fences or recess monitors,” CEO and founder Jeff Patterson said.

    Roughly 1,500 school districts nationwide use Gaggle’s software to track the online activity of approximately 6 million students. It’s one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance.

    The technology has been in high demand since the pandemic, when nearly every child received a school-issued tablet or laptop. According to a U.S. Senate investigation, over 7,000 schools or districts used GoGuardian’s surveillance products in 2021.

    Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students’ well-being.

    “I don’t think we could ever put a price on protecting students,” said Andy Meyer, principal of Vancouver’s Skyview High School. “Anytime we learn of something like that and we can intervene, we feel that is very positive.”

    Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations.

    “That’s not good at all,” Foster said after learning the district inadvertently released the records. “But what are my options? What do I do? Pull my kid out of school?”

    Foster says she’d be upset if her daughter’s private information was compromised.

    “At the same time,” she said, “I would like to avoid a school shooting or suicide.”

    Gaggle uses a machine learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years – approximately the cost of employing one extra counselor.

    The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check.

    A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately.

    “A lot of times, families don’t know. We open that door for that help,” the counselor said. Gaggle is “good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged.”

    Seattle Times and AP reporters saw what kind of writing set off Gaggle’s alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren’t protected by a password.

    After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer.

    The company says the links must be accessible without a login during those 72 hours so emergency contacts—who often receive these alerts late at night on their phones—can respond quickly.

In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends.

    Foster’s daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal’s office after writing a short story featuring a scene with mildly violent imagery.

    “I’m glad they’re being safe about it, but I also think it can be a bit much,” Bryn said.

    School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly.

    “It allows me the opportunity to meet with a student I maybe haven’t met before and build that relationship,” said Chele Pierce, a Skyview High School counselor.

    Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district’s enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert.

    While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There’s no independent research showing it measurably lowers student suicide rates or reduces violence.

    A 2023 RAND study found only “scant evidence” of either benefits or risks from AI surveillance, concluding: “No research to date has comprehensively examined how these programs affect youth suicide prevention.”

    “If you don’t have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,” said report co-author Benjamin Boudreaux, an AI ethics researcher.

    In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, trans or struggling with gender dysphoria.

    LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and turn to the internet for support.

    “We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,” said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.

    In one screenshot, a Vancouver high schooler wrote in a Google survey form they’d been subject to trans slurs and racist bullying. Who created this survey is unclear, but the person behind it had falsely promised confidentiality: “I am not a mandated reporter, please tell me the whole truth.”

    When North Carolina’s Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful.

    But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive.

    Glenn Thompson, a Durham School of the Arts graduate, poses in front of the school in Durham, N.C., Monday, March 10, 2025. (AP Photo/Karl DeBlaker)

    Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers promised a student confidentiality for an assignment related to mental health. A classmate was then “blindsided” when Gaggle alerted school officials about something private they’d disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle.

    “You can’t just (surveil) people and not tell them. That’s a horrible breach of security and trust,” said Thompson, now a college student, in an interview.

    After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults.

    The debate over privacy and security is complicated, and parents are often unaware it’s even an issue. Pearce, the University of Washington professor, doesn’t remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district’s responsible use form before her son received a school laptop.

    Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class.

    For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking if his daughter could bring her personal laptop to school instead of being forced to use a district one because of privacy concerns.

    The district refused Reiland’s request.

    When his daughter, Zoe, found out about Gaggle, she says she felt so “freaked out” that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn’t want to get called into the office for “searching up lady parts.”

    “I was too scared to be curious,” she said.

    School officials say they don’t track metrics measuring the technology’s efficacy but believe it has saved lives.

    Yet technology alone doesn’t create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with “deliberate indifference” to some families’ reports of sexual harassment, mainly in the form of homophobic bullying.

    During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide.

When asked why bullying remained a problem despite surveillance, Russell Thornton, the district’s executive director of technology, responded: “This is one tool used by administrators. Obviously, one tool is not going to solve the world’s problems and bullying.”

    Despite the risks, surveillance technology can help teachers intervene before a tragedy.

    A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former superintendent Susan Enfield.

    “They knew that the staff member was reading what they were writing,” Enfield said. “It was, in essence, that student’s way of asking for help.”

    Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support.

    “The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,” said Boudreaux, the AI ethics researcher.

    Gaggle’s Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, “the school’s going to be held liable,” he said. “If you’re looking for that open free expression, it really can’t happen on the school system’s computers.”

    Claire Bryan is an education reporter for The Seattle Times. Sharon Lurye is an education data reporter for The Associated Press.

    Contact Hechinger managing editor Caroline Preston at 212-870-8965, on Signal at CarolineP.83 or via email at preston@hechingerreport.org.

  • A legislative solution to student suicide prevention: advocating for opt-out consent in response to student welfare concerns

    Authored by Dr Emma Roberts, Head of Law at the University of Salford.

    The loss of a student to suicide is a profound and heartbreaking tragedy, leaving families and loved ones devastated, while exposing critical gaps in the support systems within higher education. Each death is not only a personal tragedy but also a systemic failure, underscoring the urgent need for higher education institutions to strengthen their safeguarding frameworks.

    Recent government data revealed that 5.7% of home students disclosed a mental health condition to their university in 2021/22, a significant rise from under 1% in 2010/11. Despite this growing awareness of mental health challenges, the higher education sector is grappling with the alarming persistence of student suicides.

    The Office for National Statistics (ONS) reported a rate of 3.0 deaths per 100,000 students in England and Wales in the academic year ending 2020, equating to 64 lives lost. Behind each statistic lies a grieving family, unanswered questions and the haunting possibility that more could have been done. These statistics force universities to confront uncomfortable truths about their ability to support vulnerable students.

    The time for piecemeal solutions has passed. To confront this crisis, bold and systemic reforms are required. One such reform – the introduction of an opt-out consent system for welfare contact – has the potential to transform how universities respond to students in crisis.

    An opt-out consent model

At present, universities typically rely on opt-in systems, where students are asked to nominate a contact to be informed in emergencies. This has come to be known as the Bristol consent model. Where such systems exist, they are not always invoked when students face severe mental health challenges. The reluctance often stems from concerns about breaching confidentiality laws and the fear of legal repercussions. This hesitancy can result in critical delays in involving a student’s support network at the time when their wellbeing may be most at risk, leaving universities unable to provide timely, life-saving interventions. Moreover, evidence suggests that many students, particularly those experiencing mental health challenges, fail to engage with these systems, leaving institutions unable to notify loved ones when serious concerns arise.

Not all universities have such a system in place. And some universities, while they may have a ‘nominated person’ process, lack the infrastructure to actually contact that person when it is most needed.

    An opt-out consent model would reverse this default, automatically enrolling students into a system where a trusted individual – such as a parent, guardian or chosen contact – can be notified if their wellbeing raises grave concerns. Inspired by England and Wales’ opt-out system for organ donation, this approach would prioritise safeguarding without undermining student autonomy.

    Confidentiality must be balanced with the need to protect life. An opt-out model offers precisely this balance, creating a proactive safety net that supports students while respecting their independence.

    Legislative provision

    For such a system to succeed, it must be underpinned by robust legislation and practical safeguards. Key measures would include:

    1. Comprehensive communication: universities must clearly explain the purpose and operation of the opt-out system during student onboarding, ensuring that individuals are fully informed of their rights and options.
    2. Defined triggers: criteria for invoking welfare contact must be transparent and consistently applied. This might include extended absences, concerning behavioural patterns or explicit threats of harm.
    3. Regular reviews: students should have opportunities to update or withdraw their consent throughout their studies, ensuring the system remains flexible and respectful of changing personal circumstances.
    4. Privacy protections: institutions must share only essential information with the nominated contact, ensuring the student’s broader confidentiality is preserved.
    5. Staff training: university staff, including academic and professional services personnel, must receive regular training on recognising signs of mental health crises, navigating confidentiality boundaries and ensuring compliance with the opt-out system’s requirements. This training would help ensure interventions are timely, appropriate and aligned with legal and institutional standards.
    6. Reporting and auditing: universities should implement robust reporting and auditing mechanisms to assess the effectiveness of the opt-out system. This should include maintaining records of instances where welfare contact was invoked, monitoring outcomes and conducting periodic audits to identify gaps or areas for improvement. Transparent reporting would not only enhance accountability but also foster trust among stakeholders.

    Lessons from the organ donation model

    The opt-out system for organ donation introduced in both Wales and England demonstrates the effectiveness of reframing consent to drive societal benefit. Following its implementation, public trust was maintained and the number of registered organ donors increased. A similar approach in higher education could establish a proactive baseline for safeguarding without coercing students into participation.

    Addressing legal and cultural barriers

A common barrier to implementing such reforms is the fear of overstepping legal boundaries. Currently, universities are hesitant to share information, even in critical situations, for fear of breaching trust and privacy and prompting litigation. Enshrining the opt-out system in law to include the key measures listed above would provide institutions with the clarity and confidence to act decisively, ensuring consistency across the sector. Culturally, universities must address potential scepticism by engaging students, staff and families in dialogue about the system’s goals and safeguards.

    The need for legislative action

    To ensure the successful implementation of an opt-out consent system, decisive actions are required from both the government and higher education institutions. The government must take the lead by legislating the introduction of this system, creating a consistent, sector-wide approach to safeguarding student wellbeing. Without legislative action, universities will remain hesitant, lacking the legal clarity and confidence needed to adopt such a bold model.

    Legislation is the only way to ensure every student, regardless of where they study, receives the same high standard of protection, ending the current postcode lottery in safeguarding practices across the sector.

    A call for collective action

    Universities, however, must not wait idly for legislation to take shape. They have a moral obligation to begin addressing the gaps in their welfare notification systems now. By expanding or introducing opt-in systems as an interim measure, institutions can begin closing these gaps, gathering critical data and refining their practices in readiness for a sector-wide transition.

    Universities should unite under sector bodies to lobby the government for legislative reform, demonstrating their collective commitment to safeguarding students. Furthermore, institutions must engage their communities – students, staff and families – in a transparent dialogue about the benefits and safeguards of the opt-out model, ensuring a broad base of understanding and support for its eventual implementation.

    This dual approach of immediate institutional action paired with long-term legislative reform represents a pragmatic and proactive path forward. Universities can begin saving lives today while laying the groundwork for a robust, consistent and legally supported safeguarding framework for the future.

Setting a new standard for student safeguarding

    The rising mental health crisis among students demands more than institutional goodwill – it requires systemic change. While the suicide rate among higher education students is lower than in the general population, this should not be a cause for complacency. Each loss is a profound tragedy and a clear signal that systemic improvements are urgently needed to save lives. Higher education institutions have a duty to prioritise student wellbeing and must ensure that their environments offer the highest standards of safety and support. An opt-out consent system for welfare contact is not a panacea, but it represents a critical step towards creating safer and more supportive university environments.

    The higher education sector has long recognised the importance of student wellbeing, yet its current frameworks remain fragmented and reactive. This proposal is both bold and achievable. It aligns with societal trends towards proactive safeguarding, reflects a compassionate approach to student welfare and offers a legally sound mechanism to prevent future tragedies.

    The loss of 64 students to suicide in a single academic year is a stark reminder that the status quo is failing. By adopting an opt-out consent system, universities can create a culture of care that saves lives, supports grieving families and fulfils their duty to protect students.

    The time to act is now. With legislative backing and sector-wide commitment, this reform could become a cornerstone of a more compassionate and effective national response to student suicide prevention.
