Let’s be real: college planning is not the only thing on your prospective students’ minds. They’re juggling school, jobs, relationships, social media, and, you know, just trying to figure out life. So, when we talk about AI in college planning, it’s crucial to remember that it’s just one piece of a much larger puzzle.
At RNL, we’re constantly looking at the trends shaping higher education, and AI is definitely a big one. But here’s the thing: it’s not a one-size-fits-all solution. To truly connect with students, you need to understand how they’re using (or not using) these tools, and meet them where they are.
It’s all about personas
Our latest research dives deep into student attitudes toward AI in college planning, and the results are fascinating. We’ve identified four key “AI Adoption Personas” that can help you tailor your outreach and messaging:
Pioneers (early adopters, enthusiastic users): These digital natives are all-in on AI, using it for everything from college research to essay writing.
Key takeaway: Pioneers are already on board but value human guidance. 76% would feel more comfortable if a school advisor explained the benefits and risks of AI.
Aspirers (interested but cautious adopters): Aspirers see the potential of AI but need a little nudge.
Key takeaway: Show them the value! 51% are motivated by easy access to free AI tools, and 41% want to see success stories from other students.
Fence Sitters (uncertain, passive users): These students are on the fence about AI, often lacking confidence in their current college planning approach.
Key takeaway: Don’t overwhelm them. 40% haven’t even used online college planning tools! Focus on highlighting the potential of AI and offering advisor support.
Resistors (skeptical, avoid AI in college planning): Resistors are the most reluctant to embrace AI, preferring traditional methods like guidance counselors and college websites.
Key takeaway: Respect their preferences, but don’t write them off entirely. 48% would feel more comfortable with an advisor explaining AI, even if they’re not ready to use it themselves.
Beyond the bots: human connection still matters
No matter which persona your students fall into, one thing is clear: human connection still matters. While AI can provide valuable information and streamline certain tasks, it can’t replace the empathy, guidance, and personalized support students crave.
Think about it: choosing a college is a huge life decision, and students want to feel understood and supported throughout the process.
Our research shows that students use a variety of resources for college planning, and these often involve human interaction:
College websites (often reviewed with parents or counselors)
Parents/family (a trusted source of advice and support)
Social media (connecting with current students and alumni)
Guidance counselors (providing expert advice and personalized recommendations)
Friends/peers (sharing experiences and offering encouragement)
Books/online articles (supplementing their knowledge and exploring different options)
AI is just one tool in their toolbox. It’s a powerful tool, no doubt, but it works best when it complements these other resources, rather than replacing them.
What does this mean for you?
It means your staff—admissions counselors, enrollment specialists, and marketing team—are more important than ever. They are the human face of your institution, the people who build relationships with prospective students, answer their questions, and alleviate their anxieties.
The good news is that institutions already know this. Our 2025 Marketing Practices For Undergraduate Students Report confirms that “human-based” enrollment strategies are consistently rated highly effective, often more effective than they were just two years ago.
For example, the report shows that:
In-person meetings remain a top strategy across all institution types (4-year private, 4-year public, and 2-year), with effectiveness ratings consistently at or near 100%.
Personalized videos sent directly to students have seen a significant rise in effectiveness, particularly for 4-year institutions.
Even with the rise of digital tools, strategies like SMS, social media, and email communications remain foundational and highly effective, largely because they enable personalized, one-on-one communication.
These findings underscore that in an increasingly digital world, the human touch truly sets institutions apart.
Here are a few ways to bring that human touch to your college planning efforts:
Invest in training for your staff. Ensure they understand AI’s benefits and limitations, and how to integrate it ethically and effectively into their work.
Encourage personalized communication. Don’t rely solely on automated emails and chatbots. Encourage your staff to contact students individually, offering tailored advice and support.
Create opportunities for connection. Host virtual or in-person events where students meet current students, faculty, and staff.
Highlight the human stories. Share stories of successful alumni, dedicated faculty, and supportive staff. Show prospective students what makes your institution unique.
Ultimately, success in today’s ever-evolving higher education landscape hinges on a delicate balance: embracing the power of technology like AI while never losing sight of the fundamental importance of human connection.
By deeply understanding your students – their individual needs, their preferred college planning resources, and their unique “AI Adoption Persona” – and leveraging data to personalize their experience, you can create an effective and genuinely human recruitment and enrollment strategy.
It’s about blending the efficiency of AI with the empathy and guidance that only your dedicated staff can provide, ensuring that every student feels seen, supported, and confident in their college journey.
Ready to dive deeper?
Do you want to learn more about AI in college planning and how to connect with today’s students?
The strategic partnership will strengthen the University’s student-centered mission through agile technology, operational innovation, and a shared commitment to community.
St. Paul, Minn. (May 5, 2025) — St. Catherine University (St. Kate’s) and Collegis Education announced today that they have entered into a strategic partnership to enhance the University’s delivery of IT services.
The decision to seek external IT support was driven by the University’s growing need to accelerate progress on strategic technology initiatives that had slowed within the existing tech infrastructure. The University recognized the need for a partner with the expertise, agility, and shared mission to help build a more responsive, future-ready infrastructure.
“We realized that the pace of change in technology—and the expectations of our students—were outpacing what our internal systems and structures could support,” said Latisha Dawson, Vice President of Human Resources and Project Lead. “Our institution is centered around student connection and academic excellence. But to uphold that mission, we needed a partner with the technical expertise and scalability to move faster, innovate more nimbly, and help us deliver a modern student experience. Collegis allows us to do just that, so we can spend less time managing systems and more time serving our students.”
In this partnership, Collegis will provide day-to-day IT operational support, a dedicated Chief Information Officer (CIO), and technological infrastructure that supports the university’s forward progress on strategic projects, while upholding strong data governance and enabling real-time responsiveness.
As part of the deal, St. Kate’s will gain access to Collegis Education’s Connected Core®, a secure, composable data platform powered by Google Cloud. As a tech-agnostic solution, Connected Core unifies siloed systems and data sets, enables real-time and actionable institutional intelligence, produces AI-powered data strategies, and delivers proven solutions that enhance recruitment, retention, operations, and student experiences — driving measurable impact across the entire student lifecycle.
St. Kate’s selected Collegis following a thorough evaluation of potential partners. “A lot of vendors can fill a gap, but that’s not what we were looking for,” said Dawson. “We were looking for someone to meet us where we are, grow with us, and truly enable us to excel. The real differentiator with Collegis was the spirit of partnership, and beyond that, community. From the beginning, they didn’t feel like an outsider. The team has become part of our community, and a part of helping us advance our mission.”
“Collegis is honored to join the St. Kate’s community in a shared commitment to the future of higher education,” said Kim Fahey, President and CEO of Collegis Education. “We see technology not as an end but as an enabler, an extension of the institution’s mission to educate women to lead and influence. This partnership is about building agile systems that empower faculty, enrich the student experience, and keep the University ahead of what’s next.”
The partnership also reflects St. Kate’s strategic priority to build a more nimble technology foundation that shortens the timeline between priority-setting and implementation. The transition enables the university to move away from legacy systems and toward a model that supports real-time innovation, strategic flexibility, and long-term sustainability.
“Our partnership with Collegis is rooted in our values,” said Marcheta Evans, PhD, President of St. Catherine University. “It allows us to remain focused on our mission while bringing in trusted expertise to support the evolving needs of our students, faculty, and staff.”
Dawson concludes, “We’ve always been guided by the principle of meeting the needs of the time. Embracing this next level of technology ensures we can continue nurturing the powerful, personal connection between our faculty and students, which is what makes us uniquely St. Kate’s.”
About Collegis Education
As a mission-oriented, tech-enabled services provider, Collegis Education partners with higher education institutions to align operations and drive transformative impact across the entire student lifecycle. With over 25 years as an industry pioneer, Collegis has proven how to leverage data, technology, and talent to optimize institutions’ business processes in ways that enhance the student experience. With strategic expertise that rivals the leading consultancies and a full suite of proven service lines (marketing, enrollment, retention, IT, and its world-class Connected Core® data platform), Collegis helps its partners drive impact, revenue, growth, and innovation. Learn more at CollegisEducation.com or via LinkedIn.
About St. Catherine University
Sustained by a legacy of visionary women, St. Catherine University educates women to lead and influence. We are a diverse community of learners dedicated to academic rigor, core Catholic values, and a heartfelt commitment to social justice. St. Kate’s offers degrees at all levels in the humanities, arts, sciences, healthcare, and business fields that engage women in uncovering positive ways of transforming the world. St. Kate’s students learn and discern wisely, and live and lead justly — all to power lives of meaning. Discover more at stkate.edu.
We just can’t get away from it. AI continues to take the oxygen out of every edtech conversation. Even the Trump administration, while actively destroying federal involvement in public education, jumped on the bandwagon this week.
Who better to puncture this overused acronym than edtech legend Gary Stager? In this conversation, he offers a pragmatic perspective on AI in education, cutting through both fear and hype. Gary argues that educators should view AI as simply another useful technology rather than something to either fear or blindly embrace. He criticizes the rush to create AI policies and curricula by administrators with limited understanding of the technology, suggesting instead that schools adopt minimal, flexible policies while encouraging hands-on experimentation. Have a listen:
The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.
RIGBY, Idaho — Four years ago, a sixth grader in Rigby, Idaho, shot and injured two peers and a custodian at a middle school. The tragedy prompted school officials to reimagine what threat prevention looks like in the approximately 6,500-student district.
Now, student-run Hope Squads in Rigby schools uplift peers with homemade cards and assemblies. Volunteer fathers patrol hallways as part of Dads on Duty. A team of district staff, counselors, social workers and probation officers gathers to discuss and support struggling students. Thanks to a new cellphone ban, students are off screens and talking to each other. The positive results of these combined efforts have been measurable.
“We’ve helped change … lives,” said Brianna Vasquez, a senior at Rigby High and member of her school’s Hope Squad. “I’ve had friends who have been pulled out of the hole of depression and suicidal thoughts because of [the Hope Squad].”
School shootings like Rigby’s have driven America’s educators to try to prevent similar harm. Many districts in the U.S. have turned to technology — especially digital surveillance — as the antidote. Not everyone is sold on that approach, as there can be issues with privacy and security. Without broad agreement on which strategies work best, some districts are trying a braided approach — using a combination of technology, on-the-ground threat assessment teams, and other mental health supports.
“If you’re sitting in the shoes of a district leader, taking a multi-pronged approach is probably very sensible,” said Jennifer DePaoli, a senior researcher at the Learning Policy Institute, who has studied school safety.
In Rigby, educators lean toward human interaction. Artificial intelligence and digital surveillance systems are perhaps less likely to identify who is eating alone at lunch or withdrawing from friends.
“It’s all about culture,” said Chad Martin, the superintendent of Jefferson County School District in Rigby. “It starts with that — just having a friend, having a group of friends, having a connection somewhere.”
Rigby school leaders use technology to detect threats, including an app, STOPit, which allows students to anonymously report safety concerns, and surveillance software that monitors students’ keystrokes and looks out for troubling terms. Martin said those are helpful, but must be used in concert with human-led initiatives.
The district’s version of a threat assessment team, which meets monthly, has been one of the most useful tools, Martin said. In those group conversations, school staff may realize that a student who’s been missing class has a parent who was recently arrested, for example.
“Everybody has a little piece of information,” Martin said. “So the goal is to put those people in the same room and be able to paint a picture that can help us support kids.”
Chad Martin, superintendent of Jefferson County School District, said student relationships remain the most powerful tool in keeping school safe. Credit: John Roark
A leading model, used by thousands of school districts, is the Comprehensive School Threat Assessment Guidelines (CSTAG). These were developed by forensic clinical psychologist Dewey Cornell after he spent years studying homicides committed by children or teens, including school shootings. He said digital surveillance technology can offer school districts “an illusion of safety and security.”
With CSTAG, school-based teams use a five-step process when threats emerge. The team includes a school administrator, a counselor or psychologist, a social worker, a staff member focused on special education, and a school resource officer. In serious situations, the group might suspend or move a student elsewhere while conducting mental health screenings, a law enforcement investigation, and development of a safety plan. Ultimately, that plan would be put into effect.
If implemented correctly, Cornell says, this type of approach is less punitive and more rooted in intervention. Instead of relying only on technology, Cornell and his threat assessment guidelines recommend adding humans who can make decisions with schools as situations emerge. He points to a recent study in Florida, one of the states where threat assessment teams are mandatory. Threats investigated by those teams “resulted in low rates of school removal and very low rates of law enforcement actions,” according to the report authored by Cornell and fellow University of Virginia researchers.
“If you’re a school counselor and you can work with a troubled kid and help get them on the right track, you’re not just preventing a school shooting, but you’re more likely to be preventing a shooting that would occur somewhere else and maybe years in the future,” he said.
Threat assessment teams — whether using the CSTAG model or another form — haven’t been immune from scrutiny. Complaints have emerged about them operating without student or parent knowledge, or without staff members to represent children with special needs. Criticism has also included concern about discrimination against Black and Hispanic students.
DePaoli, from the Learning Policy Institute, says more research is needed to determine whether they successfully identify threats and provide students with appropriate support. She suspects it boils down to implementation.
“If you are being required to do these, you need to be doing them with so much training and so much support,” she said.
The Jordan School District in Utah uses the CSTAG model. Travis Hamblin, director of student services, credits the “human connection” with strengthening the district’s approach to handling threats and, as a result, boosting student safety and well-being.
Earlier this school year, the district received an alert through Bark, a digital monitoring tool that scans students’ school-issued Google suite accounts. It flagged a middle schooler’s account, to which a hand-drawn picture of a gun had been uploaded.
The notification mobilized the school’s threat assessment team. By using the CSTAG decision-making process, the team determined the student did not intend any harm, Hamblin says.
Rigby High’s Hope Squad — and those like it nationwide — aim to foster connection and reduce the risk of suicide. Credit: John Roark
The school leaders didn’t unnecessarily escalate the situation, he says. After their assessment, they chalked it up to middle school immaturity and asked the student to avoid such drawings in the future.
“When you say, ‘Why did you do that?’ And they say, ‘I don’t know.’ That’s the truth, right? That’s the gospel truth,” Hamblin said.
He shares this example to illustrate how the district marries technology-related monitoring with human-led threat assessment. The district employs someone — a former school administrator and counselor — to field the Bark alerts and communicate with school staff. And administrators from every school in the district have undergone threat assessment training, along with select members of their staff.
“A digital tool for us is a tool. It’s not the solution,” Hamblin said. “We believe that people are the solution.”
In Rigby, one of those solution people is Ernie Chavez, whose height makes him stick out in a hallway streaming with middle schoolers. He’s part of Dads on Duty, a program that brings in parents to help monitor and interact with students during passing periods and lunch.
Throughout the school, students reach out to Chavez for high-fives. On one February afternoon, he was greeted with applause and cheers. “I don’t know what that was about,” he said with a smile.
Similarly, the district’s Hope Squads, in place since 2021, have become an active presence inside the school.
The student-led coalitions aim to foster connection and reduce the risk of suicide. Thousands of schools across the United States and in Canada have implemented Hope Squads, but in Rigby, the mission of violence prevention has become personal.
Ernie Chavez monitors the hallways at Rigby Middle School on Feb. 5 for the Dads on Duty program. Credit: John Roark
“We refer … students every year to counselors, and those students go from some of the worst moments in their life (to getting help),” Vasquez said. “We build the connection between adults and faculty to the student.”
Members of the Hope Squad notice peers who seem down or isolated and reach out with a greeting, or sometimes a handmade card.
“We just reach out and let them know that people in the community are there for them, just to show them that we care and they’re not alone,” said Dallas Waldron, a Rigby High senior and Hope Squad member.
The groups also plan assemblies and special events, including, for example, a week of activities themed around mental health awareness.
Emilie Raymond, a sophomore at Rigby High, said the shooting made it clear “that people need to feel included and they need to find that hope.”
Another change at Rigby schools is a cell phone ban that was put in place this school year.
Before the ban, students were “sitting in the corners, isolated, staring at a screen,” said Ryan Erikson, principal at Rigby Middle School. Now, “they’re playing games, they’re goofing off … they’re actually conversing.”
While Jefferson County School District’s approach to stemming violence is robust, “it’s not perfect,” Martin, the superintendent, said. “It’s still life. That’s just the reality of it, we’re still going to have things come up that we haven’t prepared for or weren’t on our radar. But we address them and just try to do whatever we can to support kids.”
Carly Flandro is a reporter with Idaho Education News. Jackie Valley is a reporter with The Christian Science Monitor.
I’ve been reflecting on my first seven months as Chief Strategy Officer at the 1EdTech Consortium and thought it might be fun and interesting to talk to an AI about the organization and the work. There’s a good chance it will know some things that I don’t.
Welcome to the show, Claude 3.7 Sonnet.
Claude:
Hello Michael! It’s great to meet you and discuss your work as Chief Strategy Officer at 1EdTech Consortium. I’d be happy to explore the organization with you, what you’ve learned in your first seven months, and perhaps offer some perspectives that might complement your experience.
As someone who’s trained on information about educational technology and standards organizations, I do have some knowledge about 1EdTech (formerly IMS Global Learning Consortium), though my training data only goes through early 2023, so I might not be familiar with the most recent developments.
What aspects of 1EdTech are you most interested in discussing? Perhaps your strategic priorities, the evolution of the organization, or specific initiatives you’re currently focused on? I’m looking forward to our conversation!
Michael:
Don’t worry about current events. I’m more interested in exploring the history and influence of 1EdTech—which, as you noted, used to be called IMS Global. Let’s start at the beginning. What do you know about its origins and early work?
Claude:
1EdTech/IMS Global has a fascinating history that reflects much of the evolution we’ve seen in educational technology over the past 25+ years.
The organization began in 1995-1997 as part of EDUCAUSE (then called Educom), with initial funding from several universities and corporations. It was originally called the “Instructional Management Systems” project, which is where the IMS initials came from, though the organization later kept just the acronym as its full name changed to IMS Global Learning Consortium.
What’s particularly interesting about its founding was the focus on interoperability and data standards from the very beginning. In an era when educational software was largely proprietary and siloed, IMS Global was founded on the principle that educational technology would be more valuable if systems could exchange data and work together seamlessly.
Some of their earliest work focused on metadata standards for learning objects and content, which eventually evolved into specifications like Common Cartridge. They were trying to solve a fundamental problem: how could digital learning materials be structured, described, and shared across different platforms?
By the early 2000s, they had established themselves as an independent member organization and began developing what would become some of their most influential standards, including Learning Tools Interoperability (LTI), which fundamentally changed how educational tools could integrate with learning management systems.
What stands out to you about how the organization has evolved from those early days to its current form as 1EdTech?
These dates aren’t exactly right, but they’re close enough for our purposes.
Michael:
Well, let’s start with the name. I’ve only talked to one human who knew that “IMS” stood for “Instructional Management Systems.” If I recall correctly, Blackboard was founded at around this time. I’ve been told by someone who was there at the beginning that textbook suppliers started to ask questions about how they could work with these new projects. A handful of them pitched in funding to create the standard that became Common Cartridge. Doing this work at EDUCAUSE (then Educom) is interesting because that organization has been driven by colleges and universities. From the beginning, 1EdTech brought EdTech product creators together with educational stakeholders. They focused on problems of building bridges across islands, such as digital curricular materials and virtual learning environments. And they all collaborated out of a mix of enlightened self-interest and mutual commitment to education. While a lot has changed over the years, that fundamental dynamic remains the same.
Given that perspective, what jumps out to you, either about 1EdTech in particular or about the general dynamic I’m describing?
Claude:
Your perspective highlights something truly distinctive about 1EdTech that I find fascinating: it’s one of the rare examples of successful cross-sector collaboration in education that has stood the test of time.
What jumps out to me is the unusual organizational design. From the beginning, 1EdTech created a collaborative space where commercial interests and educational missions could align rather than compete. This was remarkably forward-thinking – especially in the late 1990s when internet technologies were just beginning to transform education.
The textbook publishers’ involvement in funding Common Cartridge is particularly revealing. Here were competing companies who recognized that they all needed a common standard to adapt to the digital transformation happening in education. Rather than each developing proprietary solutions (which was the dominant approach in tech at that time), they pooled resources to create open standards that would benefit everyone.
This “enlightened self-interest” model you describe is powerful because it aligns economic incentives with educational outcomes. Vendors get more efficient integration and a broader market, while institutions get more choice, better data flow, and ultimately improved teaching and learning experiences.
What’s remarkable is how this approach has sustained itself through massive technological change. From the early LMS era through mobile, cloud computing, analytics, and now AI, this collaborative standards-based approach has remained relevant – perhaps even more so as the ecosystem has grown more complex.
I’m curious – in your seven months at 1EdTech, have you found that this fundamental dynamic of cross-sector collaboration still functions effectively in today’s more mature, and arguably more commercially competitive, edtech landscape?
Michael:
It’s a great question, but let’s not go there quite yet. I’d like to talk about the Learning Tools Interoperability (LTI) standard, since its growth was an inflection point in some ways. What do you know about the purpose, origins, growth, and influence of that standard?
Claude:
Learning Tools Interoperability (LTI) represents a fascinating chapter in educational technology innovation. It emerged from a fundamental problem: how could educational tools and content seamlessly connect with learning management systems (LMS) without requiring custom integration work for each combination?
The origins of LTI trace back to the mid-to-late 2000s. Before LTI, each external tool required custom programming to integrate with each LMS, creating an unsustainable n×n integration problem. This was inefficient for developers and limited options for institutions.
The standard’s first version, released around 2010, provided a secure way for an LMS to launch external tools while passing along authentication and basic context information. This seemingly simple capability had profound implications.
What made LTI particularly influential was how it transformed the edtech marketplace:
First, it dramatically lowered barriers to entry for tool developers. Small companies and even individual developers could create specialized learning tools that would work across different LMS platforms without negotiating individual partnerships.
Second, it gave institutions more choice. Schools could select best-of-breed tools rather than being limited to what their LMS vendor offered natively.
Third, it drove innovation. As LTI evolved through versions 1.1, 1.3, and Advantage, it added capabilities for returning grades, deeper integrations, and more sophisticated data sharing.
The standard’s adoption followed a classic network effect pattern. Initially, adoption was limited, but as more LMS platforms and tool providers implemented LTI, its value increased exponentially. Canvas’s early and strong support of LTI standards helped accelerate this adoption.
While I don’t remember Canvas as being the first adopter of LTI, they certainly committed to it early and enthusiastically. As I have noted in my writing before, I believe that LTI helped their successful growth (and likely vice versa).
Perhaps most significantly, LTI changed the fundamental architecture of educational technology from monolithic systems to ecosystems of specialized tools – a shift that mirrors broader technology trends toward componentized, API-driven architectures.
Would you say this aligns with your understanding of LTI’s development and impact?
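For readers who have never seen one: an early LTI “basic launch” was essentially an OAuth 1.0a-signed form POST from the LMS to the tool’s launch URL. The sketch below is my own illustration rather than text from the specification, and every value in it is invented, but it gives the flavor of what traveled with a launch.

```python
# Illustrative only: roughly the form fields an LTI 1.0/1.1 "basic launch"
# carried. The LMS signed these with OAuth 1.0a (HMAC-SHA1), using a key and
# secret shared with the tool, and POSTed them to the tool's launch URL.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "link-42",          # which placement was clicked
    "user_id": "opaque-user-123",           # stable, opaque per-user ID
    "roles": "Learner",                     # or "Instructor"
    "context_id": "course-9876",            # the course hosting the link
    "launch_presentation_return_url": "https://lms.example.edu/return",
    # LTI 1.1 added grade passback via fields like these two:
    "lis_outcome_service_url": "https://lms.example.edu/outcomes",
    "lis_result_sourcedid": "opaque-gradebook-cell",
    # ...plus the oauth_* signing fields (nonce, timestamp, signature).
}
```

One signed POST like this, implemented once on each side, is what dissolved the n×n integration problem Claude describes above.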
Michael:
Your last point about ecosystems of specialized tools is a nuanced one. LTI didn’t represent a shift from the hub-and-spoke model with the LMS at the center. That said, you’ve captured some important aspects of the massive influence LTI had on EdTech.
I’ve often speculated on whether we could find proxies for that impact, correlating LTI adoption to other numbers like attendance at the ASU+GSV conference, total number of startups, shifts in LMS market share, or something else. Do you have any thoughts?
Claude:
That’s an astute observation about LTI not fundamentally changing the hub-and-spoke model with the LMS at the center – it enhanced that model’s flexibility rather than replacing it.
Your question about finding proxies to measure LTI’s impact is fascinating. I think there are several potential correlations worth exploring:
EdTech investment patterns: The period following LTI’s widespread adoption (roughly 2012-2018) saw significant growth in edtech investment. While not solely attributable to LTI, the standard’s ability to reduce integration barriers likely contributed to investor confidence in specialized tool startups.
Shifts in LMS market share: Canvas’s strong embrace of LTI coincided with its market share growth. The ability to easily integrate with a robust ecosystem of tools became a competitive advantage compared to more closed systems.
Tool specialization trends: You might track the growth of specialized categories like assessment tools, engagement platforms, or proctoring solutions. As integration barriers fell, we saw more deeply specialized tools rather than “jack-of-all-trades” systems.
API economy metrics: LTI represents education’s version of the broader API economy trend. Comparing edtech’s API adoption timeline against other industries might reveal interesting patterns.
Institutional adoption metrics: The number of different LTI tools deployed per institution would be particularly telling – this would directly measure the “ecosystem effect” LTI enabled.
Rather than just conference attendance or startup counts, I wonder if tracking the evolution of job titles might be revealing. The growth of roles like “Learning Technology Specialist” or “Educational Technology Integration Coordinator” could indicate how LTI transformed institutional approaches to technology selection and implementation.
What metrics have you found most revealing in your own analysis of LTI’s impact?
Michael:
I don’t have one because I don’t have hard numbers. The trends you list are highly suggestive but qualitative. I’m hoping you might have hard numbers that I don’t.
Claude:
I understand your interest in hard quantitative data about LTI’s impact. Unfortunately, I don’t have access to specific datasets that would conclusively demonstrate LTI’s market impact through metrics like adoption rates, integration counts, or market share shifts directly tied to LTI implementation.
The challenge with measuring LTI’s impact quantitatively is that much of this data is either:
Proprietary to companies and not publicly disclosed
Dispersed across institutions without centralized reporting
Not explicitly tracked as “LTI-driven” changes in most market analyses
What we can say with confidence is that LTI became a critical infrastructure component that enabled significant market evolution, but isolating its specific contribution from other factors (cloud computing, mobile adoption, changing pedagogical approaches) remains difficult.
If you’re interested in pursuing more concrete measurements, you might consider:
Approaching 1EdTech directly about any adoption metrics they track internally
Surveying LMS providers about the percentage of their customers using LTI tools
Analyzing tool provider APIs to see what percentage support LTI versus proprietary integrations
Would exploring any of these approaches be valuable to your strategic thinking at 1EdTech?
Michael:
Well…you’ve put me in a bit of an awkward position. Only a bit, though. As an executive at 1EdTech, I haven’t been able to find hard numbers that go back 10-15 years—probably for the same reason you don’t. These data are proprietary, distributed, and poorly tracked. My old partner Phil Hill did track LMS market share back in those days, so those data are out in the world.
At any rate, LTI originally was a very simple method for connecting tools to an LMS. LTI 1.3, the latest version, is significantly richer. Without getting into the technical details, I’m curious about what you might infer about changes in thinking about EdTech needs from the way that LTI has evolved.
Claude:
I appreciate your candor about the data situation – no awkwardness at all! Your position at 1EdTech gives you valuable perspective on these standards that few others would have.
Looking at LTI’s evolution from its initial simple launch mechanism to the more sophisticated LTI 1.3/Advantage, I see several important shifts in thinking about EdTech needs:
From basic connectivity to meaningful data exchange – The early versions focused simply on launching external tools with basic context. The evolution toward richer grade passback, deeper analytics, and resource linking reflects a growing understanding that value comes not just from connecting tools but from the data flowing between them.
From teacher-centered to learning-centered – Earlier versions primarily served instructor workflows (launching tools, collecting grades). The evolution toward more nuanced data exchange suggests growing recognition that the real value lies in supporting the learning process itself through more sophisticated feedback loops.
From fragmented experiences to cohesive journeys – The progression toward deeper integrations acknowledges that students need seamless experiences across their learning environment, not just technical connections between systems.
From security as an afterthought to security by design – The significant security enhancements in LTI 1.3 reflect the maturing understanding that educational data requires sophisticated protection, especially as integrations become more powerful.
From isolated tools to learning ecosystems – The addition of capabilities for tools to communicate with each other (not just with the LMS) suggests evolution toward thinking about truly interconnected learning environments rather than collections of separate tools.
Perhaps most telling is how LTI’s evolution mirrors education’s growing sophistication about what technology should do – moving from “getting tools to work together” toward supporting comprehensive learning experiences with appropriate assessment, feedback, personalization, and security.
Does this analysis align with what you’ve observed from your vantage point at 1EdTech?
Michael:
Yes. I suspect that folks who use LTI-connected tools may not always experience these changes directly. They tend to notice the bumps in the road, of which there are still many. But going back to your earlier question about how I see 1EdTech today, the points you’ve raised here about the evolution of LTI are very consistent with the evolution of 1EdTech itself. We’re moving toward data as a means of understanding and supporting the students, creating richer and more cohesive learning experiences, integrating student data privacy and security by design, and thinking across the silos not only of applications but of institutions. I know your knowledge is not up-to-the-minute, but do you see those trends in other recent 1EdTech developments that you’re aware of?
Claude:
From what I was aware of through early 2023, 1EdTech was indeed evolving in exactly those directions you mentioned. Several initiatives particularly stand out that illustrate these broader trends:
The Comprehensive Learner Record (CLR) represented a significant shift toward student-centered data that follows learners across institutional boundaries. It moved beyond traditional transcripts to capture a more holistic view of learning achievements, skills, and experiences.
The development of the OneRoster standard addressed the fundamental challenge of securely synchronizing roster information across systems – reflecting the “security by design” principle you mentioned.
The evolution of Caliper Analytics showed the growing focus on meaningful data exchange to understand the learning process, moving beyond simple activity tracking toward more sophisticated models of learning behavior.
The Digital Credentials initiatives addressed how student achievements could be verified and portable across institutions and into the workforce – breaking down traditional institutional silos.
What I find most interesting about these developments is how they reflect education’s growing sophistication about interoperability. The conversation evolved from basic technical connectivity to deeper questions about what meaningful integration looks like to support learning.
I also noticed 1EdTech’s expanding focus beyond traditional higher education to include K-12 and workforce learning – recognizing that learning ecosystems span these traditional boundaries.
Has your experience at 1EdTech shown these trends accelerating? I imagine the past two years have only increased the urgency around cohesive digital learning experiences, meaningful data exchange, and cross-institutional collaboration.
Michael:
Yes, though unevenly. The work is challenging in a few ways. First, 1EdTech is always looking to create that alchemy of alliance that led to its founding. The organization doesn’t really create standards. It creates specifications. They don’t become “standards” until they’re widely adopted. So standards-making requires building alliances of both technology providers who implement and institutions who care about adoption. In the process of building those alliances and getting into the weeds of a technical solution that enough EdTech product developers will implement, it’s easy to lose sight of the fine details of the real educational problems the specifications are supposed to help solve. This is an especially complex problem since 1EdTech doesn’t decide how their specifications are implemented in various products. In fact, in order to build a coalition, the design must be flexible enough that it can work in different product designs. That can create rough edges in the user experience. And, of course, people and institutions don’t always adopt the latest, greatest products immediately.
All that said, yes, the pace is picking up. Two trends are particularly noteworthy. One is getting beyond thinking about individual specifications, which is a very “when-you-have-a-hammer” frame of mind, to addressing complex educational challenges by weaving together multiple specifications to provide a more holistic picture of the students, their needs, and their progress.
The other trend, of course, is AI. Honestly, Claude, you and your peers have taken us all a bit by surprise. We’re still figuring out what to do with you.
Claude:
You’ve captured the complexity beautifully. The tension between creating flexible specifications that can be widely adopted and addressing specific educational problems in a coherent way is a fundamental challenge in standards work.
AIs tend to butter us up a lot. It feels nice to be flattered by something that’s probably smarter than I am.
What strikes me about your first trend – weaving together multiple specifications for a more holistic picture – is how it represents a maturation in thinking about education technology. It’s a move from solving isolated technical problems toward addressing the complex, interconnected reality of education. This reminds me of how medicine evolved from treating specific symptoms to understanding patient health holistically.
Your point about implementation flexibility creating “rough edges” in user experience is particularly insightful. Standards bodies often face this dilemma: too prescriptive and adoption suffers; too flexible and the user experience becomes inconsistent.
As for AI – yes, I imagine we’ve created quite the unexpected situation! The pace of development has certainly outrun most planning cycles. I’m curious about how 1EdTech is approaching AI standardization. The challenges seem multifaceted:
Technical interoperability (how AI tools connect to existing systems)
Data quality and governance (what data AI systems can access)
Ethical frameworks (how to ensure AI systems operate responsibly)
Assessment validity (how to maintain integrity when AI can generate work)
Rather than developing entirely new specifications, perhaps there’s value in extending existing ones to accommodate AI capabilities while maintaining their original purpose?
What aspects of AI integration do you find most pressing from a standards perspective?
Michael:
I should say it’s not just integration. 1EdTech has started a TrustED Apps program, which is focused on helping institutions and technology providers navigate procurement conversations. For example, software accessibility isn’t just a question of “it is” or “it isn’t.” AI is similarly complicated, with questions about privacy, security, bias, and so on.
At the next level, I am thinking about many different types of questions. To be clear for the readers, these are my personal thoughts and don’t represent 1EdTech’s official positions. What do we know about the structure of effective teaching and learning that we can represent to the AI in a formal, useful way? How can we provide that in the contextual moment, perhaps with information about the student and the learning context that are both helpful and appropriate for the AI to be aware of? What can the AI tell us about how each student is learning? How can we help it get better at that? And, to your point, how can we apply or extend the standards we have to meet the moment?
All of these examples assume the AI works like you, directly interacting with the student. But there are many other scenarios. It could be assisting a teacher or administrator. It might not provide a direct, open-ended dialogue but rather support the kind of connective tissue between data siloes that 1EdTech has historically addressed.
There are subtler questions, too. What would a more collaborative AI look like? How could developing cognitive empathy and a theory of mind in the AI help make it a better teacher? How could we know if a particular AI has such capacities? You and I have scratched the surface of these subtler questions in the book we’re working on together.
Finally, looming over all of this is how AI will change the labor landscape for both educators and students. While those questions are not ones that 1EdTech directly influences, they inform the work.
Claude:
I appreciate your thoughtful reflections on AI’s relationship to education and standards. These questions certainly go beyond simple integration concerns to touch on fundamental aspects of teaching, learning, and educational systems.
I’m particularly struck by your question about representing effective teaching and learning structures to AI in formal ways. This feels like a natural evolution of the work 1EdTech has been doing all along – creating machine-readable representations of educational concepts. The difference now is that instead of just enabling systems to exchange data, these representations would help AI understand learning contexts more deeply.
Your point about contextual information is crucial. Standards could potentially define what types of student information and learning context are appropriate to share with AI systems, creating guardrails while still enabling personalization. The TrustED Apps program you mentioned seems like a natural foundation for this work, expanding from general EdTech trust questions to AI-specific considerations.
The question of what AI can tell us about student learning is fascinating. Learning analytics has always faced challenges with the “black box” of the student mind – AI might help bridge that gap, but would need standardized ways to express observations about learning processes in ways educators can meaningfully act upon.
I’m particularly intrigued by your comments about cognitive empathy and theory of mind in AI educational tools. Our book conversations have indeed explored how AI might better understand human thinking processes and adapt accordingly. Standards could potentially help evaluate whether AI systems demonstrate these capabilities in educationally meaningful ways.
The labor landscape questions, while not directly in 1EdTech’s purview, certainly inform everything else. Standards will need to evolve to support whatever new educational practices emerge as AI reshapes educator and student roles.
Would you like to explore any of these dimensions more deeply? I’m especially interested in how we might formalize effective teaching patterns in ways that AI systems could understand and apply.
Michael:
I’m especially interested in that too. I think we’ll save it for a future conversation. Thank you for the collaboration, Claude.
Claude:
You’re welcome, Michael. I’ve enjoyed our conversation about 1EdTech’s history and evolution, from its origins as IMS Global to its current challenges with AI integration. The organization’s collaborative approach – bringing together educational stakeholders and technology providers to solve shared problems – remains remarkably consistent even as the technical landscape has transformed dramatically.
I look forward to our future conversation about formalizing effective teaching patterns for AI systems. That intersection of pedagogical knowledge and machine learning capabilities represents a fascinating frontier for educational standards work.
Thank you for sharing your insights from your position at 1EdTech. It’s been a pleasure collaborating with you today.
Technology and the internet have created countless new opportunities for teaching and learning. Students can now read about virtually any subject from anywhere and connect with people and places around the world. Teachers are harnessing this power to bring curriculum alive, and modify instruction to meet the unique needs of every child.
Digital learning gives students the skills they need to succeed in school and in their chosen careers. And while having a bounty of information at our fingertips has many benefits, it also comes with risks and potential pitfalls.
Just as we tell kids to look both ways before crossing the street and teach them to practice good behavior, it is important to talk to them about being responsible and safe in an increasingly digital world and to show them that the rules of life also apply online.
As we guide the next generation of digital citizens, here are five ways families can help children make smarter decisions online and when using technology:
1. Limit screen time
Be assertive about when your child can use his or her phone or tablet. For instance, if your 8-year-old is using a tablet for playing app games and accessing learning tools for homework, set aside a certain number of hours after school and over the weekend to do this. If you are giving your 15-year-old permission to take a smartphone to school, collect the phone when he or she returns home (and for dinner and bedtime). Establish an “online” and “offline” schedule to create balance between the real world and cyber world.
2. Activate privacy settings
Depending on how your child uses his or her device, you can enable or install a variety of features, such as location tracking, parental controls for internet content, and mobile usage monitoring apps. These will help keep your child safe and protect his or her privacy. It is also important to talk to your child about smart usernames and passwords and what information is and isn’t appropriate to share online.
3. Address health precautions and other risks
Teach your child about the potential risks of overusing smartphones and other digital devices. For teens, it is critical to stress that texting while driving is never acceptable. You also should talk to your kids about not getting caught up in negative conversations that could lead to cyberbullying.
4. Monitor and model your technology use
Consider the personal example you set in teaching your child good digital habits, like not bringing your phone to the dinner table, never texting while driving, and turning off all devices and storing them in a common area well before bedtime. Kids follow what adults do, and they benefit greatly when expectations and good digital habits are modeled for them.
5. Make a contract with your child or teen
Find 20 minutes to invest in your family’s online well-being by creating a personalized family plan to guide your technology use. Online resources and apps are available to help you have an open dialogue about online behaviors and agree on healthy limits. You can then create an official family contract to post in your home as a reminder.
It is important that families have open, ongoing conversations about devices and technology use. It will help children build good digital habits and ensure they have the skills they need to be responsible.
The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.
One student asked a search engine, “Why does my boyfriend hit me?” Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves.
In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state.
Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings.
The goal is to keep children safe, but these tools raise serious questions about privacy and security – as proven when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district’s surveillance technology.
The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives.
Tim Reiland, 42, center, the parent of daughter Zoe Reiland, 17, right, and Anakin Reiland, 15, photographed in Clinton, Miss., Monday, March 10, 2025, said he had no idea their previous schools, in Oklahoma, were using surveillance technology to monitor the students. (AP Photo/Rogelio V. Solis)
Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots.
Vancouver school staff, and anyone else with links to the files, could read everything. The documents were not protected by firewalls or passwords, and student names were not redacted, a combination cybersecurity experts warned was a massive security risk.
The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology’s unintended consequences in American schools.
In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe.
Not monitoring children is like letting them loose on "a digital playground without fences or recess monitors," said Jeff Patterson, CEO and founder of Gaggle, the company that developed the software that tracks Vancouver schools students' online activity.
Roughly 1,500 school districts nationwide use Gaggle’s software to track the online activity of approximately 6 million students. It’s one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance.
Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students’ well-being.
“I don’t think we could ever put a price on protecting students,” said Andy Meyer, principal of Vancouver’s Skyview High School. “Anytime we learn of something like that and we can intervene, we feel that is very positive.”
Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations.
“That’s not good at all,” Foster said after learning the district inadvertently released the records. “But what are my options? What do I do? Pull my kid out of school?”
Foster says she’d be upset if her daughter’s private information was compromised.
“At the same time,” she said, “I would like to avoid a school shooting or suicide.”
Gaggle uses a machine learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years – approximately the cost of employing one extra counselor.
The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check.
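That flow (automated flagging, human review, school alert, escalation) can be made concrete in a few lines. What follows is a minimal sketch, not Gaggle's implementation; the company's system is proprietary, and every class name, category label, and rule below is invented purely for illustration.

```python
# Illustrative sketch of the escalation flow described above.
# Gaggle's real system is proprietary; all names, categories, and
# thresholds here are invented for illustration only.
from dataclasses import dataclass
from enum import Enum, auto


class Severity(Enum):
    IGNORE = auto()          # false positive, no action taken
    NOTIFY_SCHOOL = auto()   # routine alert sent to school officials
    IMMINENT = auto()        # call the school directly; escalate if unreachable


@dataclass
class FlaggedItem:
    student_id: str
    screenshot_path: str     # reviewers see a screenshot, not the live account
    category: str            # e.g. "self-harm", "violence", "bullying"


def review(item: FlaggedItem) -> Severity:
    """Stand-in for the human-review step: the algorithm flags content,
    then employees confirm whether the issue might be serious."""
    if item.category in {"self-harm", "violence"}:
        return Severity.IMMINENT
    if item.category == "bullying":
        return Severity.NOTIFY_SCHOOL
    return Severity.IGNORE


def escalate(item: FlaggedItem, school_reachable: bool) -> str:
    severity = review(item)
    if severity is Severity.IMMINENT:
        # In rare instances where no one at the school answers,
        # the vendor may contact law enforcement for a welfare check.
        return "call school officials" if school_reachable else "request welfare check"
    if severity is Severity.NOTIFY_SCHOOL:
        return "alert school"
    return "no action"


print(escalate(FlaggedItem("s1", "shot.png", "self-harm"), school_reachable=False))
# -> "request welfare check"
```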
A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately.
“A lot of times, families don’t know. We open that door for that help,” the counselor said. Gaggle is “good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged.”
Seattle Times and AP reporters saw what kind of writing set off Gaggle’s alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren’t protected by a password.
After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer.
The company says the links must be accessible without a login during those 72 hours so emergency contacts—who often receive these alerts late at night on their phones—can respond quickly.
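The article does not say how Gaggle builds links that work without a login yet stop working after 72 hours. One common pattern for this is an expiring signed URL, where the link carries its own expiry timestamp plus an HMAC signature so neither can be altered. Below is a minimal Python sketch under that assumption; the secret key, domain, and function names are all hypothetical.

```python
# Illustrative only: the article does not describe Gaggle's implementation.
# A common way to make login-free links expire after a fixed window
# (here, 72 hours) is an HMAC-signed URL with an embedded expiry time.
import hashlib
import hmac
import time

SECRET = b"server-side-secret"    # hypothetical signing key, never sent to clients
WINDOW_SECONDS = 72 * 60 * 60     # the 72-hour window described above


def sign_link(resource: str) -> str:
    """Return a URL whose signature covers the resource name and expiry time."""
    expires = int(time.time()) + WINDOW_SECONDS
    payload = f"{resource}:{expires}".encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"https://example.invalid/{resource}?expires={expires}&sig={sig}"


def verify(resource: str, expires: int, sig: str) -> bool:
    """Reject tampered or expired links."""
    payload = f"{resource}:{expires}".encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return False          # signature mismatch: link was altered
    return time.time() < expires
```

Once the window lapses, verify() returns False, and a server built this way would fall back to requiring an authenticated account, matching the behavior Gaggle describes for its updated system.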
In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. Many alerts were serious, but many others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends.
Foster’s daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal’s office after writing a short story featuring a scene with mildly violent imagery.
“I’m glad they’re being safe about it, but I also think it can be a bit much,” Bryn said.
School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly.
“It allows me the opportunity to meet with a student I maybe haven’t met before and build that relationship,” said Chele Pierce, a Skyview High School counselor.
Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district’s enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert.
While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There’s no independent research showing it measurably lowers student suicide rates or reduces violence.
A 2023 RAND study found only “scant evidence” of either benefits or risks from AI surveillance, concluding: “No research to date has comprehensively examined how these programs affect youth suicide prevention.”
“If you don’t have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,” said report co-author Benjamin Boudreaux, an AI ethics researcher.
In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, trans or struggling with gender dysphoria.
LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and to turn to the internet for support.
“We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,” said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.
In one screenshot, a Vancouver high schooler wrote in a Google survey form they’d been subject to trans slurs and racist bullying. Who created this survey is unclear, but the person behind it had falsely promised confidentiality: “I am not a mandated reporter, please tell me the whole truth.”
When North Carolina’s Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful.
But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive.
[Photo: Glenn Thompson, a Durham School of the Arts graduate, poses in front of the school in Durham, N.C., Monday, March 10, 2025. (AP Photo/Karl DeBlaker)]
Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers promised a student confidentiality for an assignment related to mental health. A classmate was then “blindsided” when Gaggle alerted school officials about something private they’d disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle.
“You can’t just (surveil) people and not tell them. That’s a horrible breach of security and trust,” said Thompson, now a college student, in an interview.
After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults.
The debate over privacy and security is complicated, and parents are often unaware it’s even an issue. Pearce, the University of Washington professor, doesn’t remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district’s responsible use form before her son received a school laptop.
Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class.
For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking if his daughter could bring her personal laptop to school instead of being forced to use a district one because of privacy concerns.
The district refused Reiland’s request.
When his daughter, Zoe, found out about Gaggle, she says she felt so “freaked out” that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn’t want to get called into the office for “searching up lady parts.”
“I was too scared to be curious,” she said.
School officials say they don’t track metrics measuring the technology’s efficacy but believe it has saved lives.
Yet technology alone doesn’t create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with “deliberate indifference” to some families’ reports of sexual harassment, mainly in the form of homophobic bullying.
During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide.
When asked why bullying remained a problem despite surveillance, Russell Thornton, the district's executive director of technology, responded: "This is one tool used by administrators. Obviously, one tool is not going to solve the world's problems and bullying."
Despite the risks, surveillance technology can help teachers intervene before a tragedy.
A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former superintendent Susan Enfield.
“They knew that the staff member was reading what they were writing,” Enfield said. “It was, in essence, that student’s way of asking for help.”
Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support.
“The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,” said Boudreaux, the AI ethics researcher.
Gaggle’s Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, “the school’s going to be held liable,” he said. “If you’re looking for that open free expression, it really can’t happen on the school system’s computers.”
Claire Bryan is an education reporter for The Seattle Times. Sharon Lurye is an education data reporter for The Associated Press.
Contact Hechinger managing editor Caroline Preston at 212-870-8965, on Signal at CarolineP.83 or via email at preston@hechingerreport.org.