Tag: personalized learning

  • In training educators to use AI, we must not outsource the foundational work of teaching

    This story was originally published by Chalkbeat. Sign up for their newsletters at ckbe.at/newsletters.

    I was conferencing with a group of students when I heard the excitement building across my third grade classroom. A boy at the back table had been working on his catapult project for over an hour: through our science lesson, into recess, and now during personalized learning time. I watched him adjust the wooden arm for what felt like the 20th time, measure another launch distance, and scribble numbers on his increasingly messy data sheet.

    “The longer arm launches farther!” he announced to no one in particular, his voice carrying the matter-of-fact tone of someone who had just uncovered a truth about the universe. I felt that familiar teacher thrill, not because I had successfully delivered a physics lesson, but because I hadn’t taught him anything at all.

    Last year, all of my students chose a topic they wanted to explore and pursued a personal learning project about it. This particular student had discovered the relationship between lever arm length and projectile distance entirely through his own experiments, which involved mathematics, physics, history, and data visualization.

    Other students drifted over to try his longer-armed design, and soon, a cluster of 8-year-olds were debating trajectory angles and comparing medieval siege engines to ancient Chinese catapults.

    They were doing exactly what I dream of as an educator: learning because they wanted to know, not because they had to perform.

    Then, just recently, I read about the American Federation of Teachers’ new $23 million partnership with Microsoft, OpenAI, and Anthropic to train educators how to use AI “wisely, safely and ethically.” The training sessions would teach them how to generate lesson plans and “microwave” routine communications with artificial intelligence.

    My heart sank.

    As an elementary teacher who also conducts independent research on the intersection of AI and education, and writes the ‘Algorithmic Mind’ column about it for Psychology Today, I live in the uncomfortable space between what technology promises and what children actually need. Yes, I use AI, but only for administrative work like drafting parent newsletters, organizing student data, and filling out required curriculum planning documents. It saves me hours on repetitive tasks that have nothing to do with teaching.

    I’m all for showing educators how to use AI to cut down on rote work. But I fear the AFT’s $23 million initiative isn’t about administrative efficiency. According to their press release, they’re training teachers to use AI for “instructional planning” and as a “thought partner” for teaching decisions. One featured teacher describes using AI tools to help her communicate “in the right voice” when she’s burned out. Another says AI can assist with “late-night lesson planning.”

    That sounds more like outsourcing the foundational work of teaching.

    Watching my student discover physics principles through intrinsic curiosity reminded me why this matters so much. When we start relying on AI to plan our lessons and find our teaching voice, we’re replacing human judgment with algorithmic thinking at the very moment students need us most. We’re prioritizing the product of teaching over the process of learning.

    Most teachers I talk to share similar concerns about AI, centered on cheating and plagiarism. They worry about students outsourcing their thinking, and about how to assess learning when they can’t tell whether students actually understand anything. The uncomfortable truth is that students have always found ways to avoid genuine thinking when we value products over process. I used SparkNotes. Others used Google. Now, students use ChatGPT.

    The problem is not technology; it’s that we continue prioritizing finished products over messy learning processes. And as long as education rewards predetermined answers over curiosity, students will find shortcuts.

    That’s why teachers need professional development that moves in the opposite direction. They need PD that helps them facilitate genuine inquiry and human connection; foster classrooms where confusion is valued as a precursor to understanding; and nurture students’ intrinsic motivation.

    When I think about that boy measuring launch distances with handmade tools, I realize he was demonstrating the distinctly human capacity to ask questions that only he wanted to address. He didn’t need me to structure his investigation or discovery. He needed the freedom to explore, materials to experiment with, and time to pursue his curiosity wherever it led.

    The learning happened not because I efficiently delivered content, but because I stepped back and trusted his natural drive to understand.

    Children don’t need teachers who can generate lesson plans faster or give AI-generated feedback; they need educators who can inspire questions, model intellectual courage, and create communities where wonder thrives and real-world problems are solved.

    The future belongs to those who can combine computational tools with human wisdom, ethics, and creativity. But this requires us to maintain the cognitive independence to guide AI systems rather than becoming dependent on them.

    Every time I watch my students make unexpected connections, I’m reminded that the most important learning happens in the spaces between subjects, in the questions that emerge from genuine curiosity, in the collaborative thinking that builds knowledge through relationships. We can’t microwave that. And we shouldn’t try.

    Chalkbeat is a nonprofit news site covering educational change in public schools.


  • Data, privacy, and cybersecurity in schools: A 2025 wake-up call

    In 2025, schools are sitting on more data than ever before. Student records, attendance, health information, behavioral logs, and digital footprints generated by edtech tools have turned K-12 institutions into data-rich environments. As artificial intelligence becomes a central part of the learning experience, these data streams are being processed in increasingly complex ways. But with this complexity comes a critical question: Are schools doing enough to protect that data?

    The answer, in many cases, is no.

    The rise of shadow AI

    According to CoSN’s May 2025 State of EdTech District Leadership report, 43 percent of districts lack formal policies or guidance for AI use, even though 80 percent have generative AI initiatives underway; that policy gap is a major concern. At the same time, Common Sense Media’s Teens, Trust and Technology in the Age of AI reports that many teens have been misled by fake content and struggle to discern truth from misinformation, underscoring the risks that accompany generative AI’s broad adoption.

    This lack of visibility and control has led to the rise of what many experts call “shadow AI”: unapproved apps and browser extensions that process student inputs, store them indefinitely, or reuse them to train commercial models. These tools are often free, widely adopted, and nearly invisible to IT teams. Shadow AI expands the district’s digital footprint in ways that often escape policy enforcement, opening the door to data leakage and compliance violations. CoSN’s 2025 report specifically notes that “free tools that are downloaded in an ad hoc manner put district data at risk.”

    Data protection: The first pillar under pressure

    The U.S. Department of Education’s AI Toolkit for Schools urges districts to treat student data with the same care as medical or financial records. However, many AI tools used in classrooms today are not inherently FERPA-compliant and do not always disclose where or how student data is stored. Teachers experimenting with AI-generated lesson plans or feedback may unknowingly input student work into platforms that retain or share that data. In the absence of vendor transparency, there is no way to verify how long data is stored, whether it is shared with third parties, or how it might be reused. Under FERPA, third-party vendors that handle student data on a school’s behalf must meet the same obligations as the school itself, including ensuring data is not used for unintended purposes or retained for AI training.

    Some tools, marketed as “free classroom assistants,” require login credentials tied to student emails or learning platforms. This creates additional risks if authentication mechanisms are not protected or monitored. Even widely used generative tools may include language in their privacy policies allowing them to use uploaded content for system training or performance optimization.

    Data processing and the consent gap

    Generative AI models are trained on large datasets, and many free tools continue learning from user prompts. If a student pastes an essay or a teacher includes student identifiers in a prompt, that information could enter a commercial model’s training loop. Data would then be processed without explicit consent, potentially in violation of COPPA (Children’s Online Privacy Protection Act) and FERPA. While the FTC’s December 2023 update to the COPPA Rule did not codify school consent provisions, existing guidance still allows schools to consent to technology use on behalf of parents in educational contexts. However, the onus remains on schools to understand and manage these consent implications, especially now that the rule’s amendments, effective June 21, 2025, strengthen protections and require separate parental consent for third-party disclosures for targeted advertising.

    Moreover, many educators and students are unaware of what constitutes “personally identifiable information” (PII) in these contexts. A name combined with a school ID number, disability status, or even a writing sample could easily identify a student, especially in small districts. Without proper training, well-intentioned AI use can cross legal lines unknowingly.
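
    Some districts experimenting with AI pre-filters add a lightweight redaction step before any prompt leaves the network. Below is a minimal Python sketch of that idea, not a complete PII solution: the `STUDENT_ID_RE` pattern, the email pattern, and the roster-name list are illustrative assumptions that a real deployment would replace with district-specific rules or a vetted redaction library.

    ```python
    import re

    # Hypothetical patterns for illustration; a real district would tailor these
    # to its own ID formats and rosters, or use a vetted redaction library.
    STUDENT_ID_RE = re.compile(r"\b\d{6,9}\b")             # e.g., a 6- to 9-digit school ID
    EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")

    def redact(prompt, known_names):
        """Scrub obvious identifiers from text before it leaves the district network."""
        scrubbed = STUDENT_ID_RE.sub("[ID]", prompt)
        scrubbed = EMAIL_RE.sub("[EMAIL]", scrubbed)
        for name in known_names:
            # Replace roster names case-insensitively.
            scrubbed = re.sub(re.escape(name), "[NAME]", scrubbed, flags=re.IGNORECASE)
        return scrubbed

    # Example: a teacher pastes feedback that includes a name, an ID, and an email.
    print(redact("Jordan Lee (ID 4821937) reads below grade level; email jlee@district.org.",
                 ["Jordan Lee"]))
    # -> "[NAME] (ID [ID]) reads below grade level; email [EMAIL]."
    ```

    Even a filter like this catches only the obvious identifiers; as noted above, a writing sample alone can identify a student, which is why training matters as much as tooling.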

    Cybersecurity risks multiply

    AI tools have also increased the attack surface of K-12 networks. According to ThreatDown’s 2024 State of Ransomware in Education report, ransomware attacks on K-12 schools increased by 92 percent between 2022 and 2023, with 98 total attacks in 2023. This trend is projected to continue as cybercriminals use AI to create more targeted phishing campaigns and detect system vulnerabilities faster. AI-assisted attacks can mimic human language and tone, making them harder to detect. Some attackers now use large language models to craft personalized emails that appear to come from school administrators.

    Many schools lack endpoint protection for student devices, and third-party integrations often bypass internal firewalls. Free AI browser extensions may collect keystrokes or enable unauthorized access to browser sessions. The more tools that are introduced without IT oversight, the harder it becomes to isolate and contain incidents when they occur. CoSN’s 2025 report indicates that 60 percent of edtech leaders are “very concerned about AI-enabled cyberattacks,” yet 61 percent still rely on general funds for cybersecurity efforts, not dedicated funding.

    Building a responsible framework

    To mitigate these risks, school leaders need to:

    • Audit tool usage with platforms like Lightspeed Digital Insight to identify AI tools being accessed without approval. Districts should maintain a living inventory of all digital tools (a minimal audit sketch follows this list). Lightspeed Digital Insight, for example, is vetted by 1EdTech for data privacy.
    • Develop and publish AI use policies that clarify acceptable practices, define data handling expectations, and outline consequences for misuse. Policies should distinguish between tools approved for instructional use and those requiring further evaluation.
    • Train educators and students to understand how AI tools collect and process data, how to interpret AI outputs critically, and how to avoid inputting sensitive information. AI literacy should be embedded in digital citizenship curricula, with resources available from organizations like Common Sense Media and aiEDU.
    • Vet all third-party apps through standards like the 1EdTech TrustEd Apps program. Contracts should specify data deletion timelines and limit secondary data use. The TrustEd Apps program has vetted over 12,000 products, providing a valuable resource for districts.
    • Simulate phishing attacks and test breach response protocols regularly. Cybersecurity training should be required for staff, and recovery plans must be reviewed annually.
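
    To make the audit and inventory recommendations concrete, here is a minimal sketch of the kind of allowlist check an IT team might run against device traffic logs. The approved-domain set and log format are assumptions for illustration; in practice, the inventory would come from a monitoring platform such as the ones named above.

    ```python
    # Minimal sketch: flag domains seen in device traffic logs that are not on
    # the district's approved inventory. Domains and log format are illustrative.
    APPROVED = {"approved-lms.example.com", "vetted-ai-tutor.example.com"}

    def flag_unapproved(log_lines):
        """Yield each unapproved domain the first time it appears in the logs."""
        seen = set()
        for line in log_lines:
            # Assume each log line ends with the contacted domain.
            domain = line.strip().split()[-1].lower()
            if domain not in APPROVED and domain not in seen:
                seen.add(domain)
                yield domain

    logs = [
        "2025-05-01T09:14:03 device-117 approved-lms.example.com",
        "2025-05-01T09:15:41 device-204 free-essay-helper.example.net",
    ]
    for domain in flag_unapproved(logs):
        print(f"Unapproved tool detected: {domain}")
    # -> Unapproved tool detected: free-essay-helper.example.net
    ```

    The point of the sketch is the workflow, not the code: maintain one authoritative inventory, compare observed usage against it routinely, and treat anything unlisted as a prompt for review rather than an automatic block.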

    Trust starts with transparency

    In the rush to embrace AI, schools must not lose sight of their responsibility to protect students’ data and privacy. Transparency with parents, clarity for educators, and secure digital infrastructure are not optional. They are the baseline for trust in the age of algorithmic learning.

    AI can support personalized learning, but only if we put safety and privacy first. The time to act is now. Districts that move early to build policies, offer training, and coordinate oversight will be better prepared to lead AI adoption with confidence and care.


  • K12 Earns High Marks for Excellence in Online Public Education

    RESTON, Va. (GLOBE NEWSWIRE) — K12, a portfolio brand of Stride, Inc., has been recognized for its steadfast commitment to quality education. In a recent review by Cognia, a global nonprofit that accredits schools, K12 earned an impressive Index of Education Quality (IEQ) score of 327, well above the global average of 296. Cognia praised K12 for creating supportive environments where students are encouraged to learn and grow in ways that work best for them.

    For over 25 years, K12 has been a pioneer in online public education, delivering flexible, high-quality learning experiences to families across the country. Having served more than 3 million students, K12 has helped shape the future of personalized learning. This long-standing presence in the field reflects a deep understanding of what families need from a modern education partner. The recent Cognia review further validates K12’s role as a trusted provider, recognizing the strength of its learning environments and its commitment to serving all students. 

    “What stood out in this review is how clearly our learning environments are working for students,” said Niyoka McCoy, Chief Learning Officer at Stride, Inc. “From personalized graduation plans to real-time feedback tools and expanded course options, the Cognia team saw what we see every day, which is students being supported in ways that help them grow, stay engaged, and take ownership of their learning.” 

    K12’s impact extends well beyond the virtual classroom. In 2025, the organization was honored with two Gold Stevie® Awards for Innovation in Education and recognized at the Digital Education Awards for its excellence in digital learning. These awards highlight K12’s continued leadership in delivering meaningful, future-focused education. What sets K12-powered online public schools apart is a curriculum that goes beyond the basics, offering students access to STEM courses, Advanced Placement and dual-credit options, industry certifications, and gamified learning experiences. K12’s program is designed to spark curiosity, build confidence, and help students thrive in college, careers, and life.

    Through student-centered instruction and personalized support, K12 is leading the way in modern education. As the learning landscape evolves, K12 adapts alongside it, meeting the needs of today’s students while shaping the future of education. 

    To learn more about K12 and its accredited programs, visit k12.com.

    About Stride, Inc.  

    Stride Inc. (LRN) is redefining lifelong learning with innovative, high-quality education solutions. Serving learners in primary, secondary, and postsecondary settings, Stride provides a wide range of services including K-12 education, career learning, professional skills training, and talent development. Stride reaches learners in all 50 states and over 100 countries. Learn more at Stridelearning.com.
