Tag: edtech

  • Do screens help or hurt K-8 learning? Lessons from the UK’s OPAL program

    When our leadership team at Firthmoor Primary met with an OPAL (Outdoor Play and Learning) representative, one message came through clearly: “Play isn’t a break from learning, it is learning.”

    As she flipped through slides, we saw examples from other schools where playgrounds were transformed into hubs of creativity. There were “play stations” where children could build, imagine, and collaborate. One that stood out for me was the simple addition of a music station, where children could dance to songs during break time, turning recess into an outlet for joy, self-expression, and community.

    The OPAL program is not about giving children “more time off.” It’s about making play purposeful, inclusive, and developmental. At Firthmoor, our head teacher has made OPAL part of the long-term school plan, ensuring that playtime builds creativity, resilience, and social skills just as much as lessons in the classroom.

    After seeing these OPAL examples, I couldn’t help but think about how different this vision is from what dominates the conversation in so many schools: technology. While OPAL emphasizes unstructured play, movement, and creativity, most education systems, both in the UK and abroad, are under pressure to adopt more edtech. The argument is that early access to screens helps children personalize their learning, build digital fluency, and prepare for a future where tech skills are essential.

    But what happens when those two philosophies collide?

    On one side, programs like OPAL remind us that children need hands-on experiences, imagination, and social connection–skills that can’t be replaced by a tablet. On the other, schools around the world are racing to keep pace with the digital age.

    Even in Silicon Valley, where tech innovation is born, schools like the Waldorf School of the Peninsula have chosen to go screen-free in early years. Their reasoning echoes OPAL’s ethos: Creativity and deep human interaction lay stronger cognitive and emotional foundations than any app can provide.

    Research supports this caution. The Royal College of Paediatrics and Child Health advises parents and schools to carefully balance screen use with physical activity, sleep, and family interaction. And in 2023, UNESCO warned that “not all edtech improves learning outcomes, and some displace play and social interaction.” Similarly, the OECD’s 2021 report found that heavy screen use among 10-year-olds correlated with lower well-being scores, highlighting the risks of relying too heavily on devices in the early years.

    As a governor, I see both sides: the enthusiasm for digital tools that promise engagement and efficiency, and the concern for children’s well-being and readiness for lifelong learning. OPAL has made me think about what kind of foundations we want to lay before layering on technology.

    So where does this leave us? For me, the OPAL initiative at Firthmoor is a powerful reminder that education doesn’t have to be an either/or choice between tech and tradition. The real challenge is balance.

    This raises important questions for all of us in education:

    • When is the right time to introduce technology?
    • How do we balance digital fluency with the need for deep, human-centered learning?
    • Where do we draw the line between screens and play, and who gets to decide?

    This is a conversation not just for educators, but for parents, policymakers, and communities. How do we want the next generation to learn, play, and thrive?


  • K-12 districts are fighting ransomware, but IT teams pay the price

    The education sector is making measurable progress in defending against ransomware, with fewer ransom payments, dramatically reduced costs, and faster recovery rates, according to Sophos’ fifth annual State of Ransomware in Education report.

    Still, these gains are accompanied by mounting pressures on IT teams, who report widespread stress, burnout, and career disruptions following attacks–nearly 40 percent of the 441 IT and cybersecurity leaders surveyed reported dealing with anxiety.

    Over the past five years, ransomware has emerged as one of the most pressing threats to education–with attacks becoming a daily occurrence. Primary and secondary institutions are seen by cybercriminals as “soft targets”–often underfunded, understaffed, and holding highly sensitive data. The consequences are severe: disrupted learning, strained budgets, and growing fears over student and staff privacy. Without stronger defenses, schools risk not only losing vital resources but also the trust of the communities they serve.

    Indicators of success against ransomware

    The new study demonstrates that the education sector is getting better at responding to ransomware, forcing cybercriminals to evolve their approach. Trending data from the study reveals an increase in attacks where adversaries attempt to extort money without encrypting data. Unfortunately, paying the ransom remains part of the solution for about half of all victims. However, payment values are dropping significantly, and among victims whose data was encrypted, 97 percent were able to recover it in some way. The study found several key indicators of success against ransomware in education:

    • Stopping more attacks: When it comes to blocking attacks before files can be encrypted, both K-12 and higher education institutions reported their highest success rate in four years (67 percent and 38 percent of attacks, respectively).
    • Following the money: In the last year, ransom demands fell 73 percent (an average drop of $2.83M), while average payments dropped from $6M to $800K in lower education and from $4M to $463K in higher education.
    • Plummeting cost of recovery: Outside of ransom payments, average recovery costs dropped 77 percent in higher education and 39 percent in K-12 education. Despite this success, K-12 education reported the highest recovery bill across all industries surveyed.

    Gaps still need to be addressed

    While the education sector has made progress in limiting the impact of ransomware, serious gaps remain. In the Sophos study, 64 percent of victims reported missing or ineffective protection solutions; 66 percent cited a lack of people (either expertise or capacity) to stop attacks; and 67 percent admitted to having security gaps. These risks highlight the critical need for schools to focus on prevention, as cybercriminals develop new techniques, including AI-powered attacks.

    Highlights from the study that shed light on these gaps include:

    • AI-powered threats: K-12 education institutions reported that 22 percent of ransomware attacks had origins in phishing. With AI enabling more convincing emails, voice scams, and even deepfakes, schools risk becoming test grounds for emerging tactics.
    • High-value data: Higher education institutions, custodians of AI research and large language model datasets, remain a prime target, with exploited vulnerabilities (35 percent) and previously unknown security gaps (45 percent) as the leading weaknesses exploited by adversaries.
    • Human toll: Every institution with encrypted data reported impacts on IT staff. Over one in four staff members took leave after an attack, nearly 40 percent reported heightened stress, and more than one-third felt guilty that they could not prevent the breach.

    “Ransomware attacks in education don’t just disrupt classrooms, they disrupt communities of students, families, and educators,” said Alexandra Rose, director of CTU Threat Research at Sophos. “While it’s encouraging to see schools strengthening their ability to respond, the real priority must be preventing these attacks in the first place. That requires strong planning and close collaboration with trusted partners, especially as adversaries adopt new tactics, including AI-driven threats.”

    Holding on to the gains

    Based on its work protecting thousands of educational institutions, Sophos experts recommend several steps to maintain momentum and prepare for evolving threats:

    • Focus on prevention: The dramatic success of lower education in stopping ransomware attacks before encryption offers a blueprint for broader public sector organizations. Detection and response efforts should be coupled with prevention that stops attacks before they can compromise systems.
    • Secure funding: Explore new avenues such as the U.S. Federal Communications Commission’s E-Rate subsidies to strengthen networks and firewalls, and the UK’s National Cyber Security Centre initiatives, including its free cyber defense service for schools, to boost overall protection. These resources help schools both prevent and withstand attacks.
    • Unify strategies: Educational institutions should adopt coordinated approaches across sprawling IT estates to close visibility gaps and reduce risks before adversaries can exploit them.
    • Relieve staff burden: Ransomware takes a heavy toll on IT teams. Schools can reduce pressure and extend their capabilities by partnering with trusted providers for managed detection and response (MDR) and other around-the-clock expertise.
    • Strengthen response: Even with stronger prevention, schools must be prepared to respond when incidents occur. They can recover more quickly by building robust incident response plans, running simulations to prepare for real-world scenarios, and enhancing readiness with 24/7/365 services like MDR.

    Data for the State of Ransomware in Education 2025 report comes from a vendor-agnostic survey of 441 IT and cybersecurity leaders – 243 from K-12 education and 198 from higher education institutions hit by ransomware in the past year. The organizations surveyed ranged from 100 to 5,000 employees across 17 countries. The survey was conducted between January and March 2025, and respondents were asked about their experience of ransomware over the previous 12 months.

    This press release originally appeared online.


  • What educators need to know

    Literacy has always been the foundation of learning, but for middle school students, the stakes are especially high. These years mark the critical shift from learning to read to reading to learn.

    When students enter sixth, seventh, or eighth grade still struggling with foundational skills, every subject becomes harder–science labs, social studies texts, even math word problems require reading proficiency. For educators, the challenge is not just addressing gaps but also building the confidence that helps adolescents believe they can succeed.

    The confidence gap

    By middle school, many students are keenly aware when they’re behind their peers in reading. Interventions that feel too elementary can undermine motivation. As Dr. Michelle D. Barrett, Senior Vice President of Research, Policy, and Impact at Edmentum, explained:

    “If you have a student who’s in the middle grades and still has gaps in foundational reading skills, they need to be provided with age-appropriate curriculum and instruction. You can’t give them something that feels babyish–that only discourages them.”

    Designing for engagement

    Research shows that engagement is just as important as instruction, particularly for adolescents. “If students aren’t engaged, if they’re not showing up to school, then you have a real problem,” Barrett said. “It’s about making sure that even if students have gaps, they’re still being supported with curriculum that feels relevant and engaging.”

    To meet that need, digital programs like Edmentum’s Exact Path tailor both design and content to the learner’s age. “A middle schooler doesn’t want the cartoony things our first graders get,” Barrett noted. “That kind of thing really does matter–not just for engagement, but also for their confidence and willingness to keep going.”

    Measuring what works

    Educators also need strong data to target interventions. “It’s all about how you’re differentiating for those students,” Barrett said. “You’ve got to have great assessments, engaging content that’s evidence-based, and a way for students to feel and understand success.”

    Exact Path begins with universal screening, then builds personalized learning paths grounded in research-based reading progressions. More than 60 studies in the past two years have shown consistent results. “When students complete eight skills per semester, we see significant growth across grade levels–whether measured by NWEA MAP, STAR, or state assessments,” Barrett added.

    That growth extends across diverse groups. “In one large urban district, we found the effect sizes for students receiving special education services were twice that of their peers,” Barrett said. “That tells us the program can be a really effective literacy intervention for students most at risk.”

    Layering supports for greater impact

    Barrett emphasized that literacy progress is strongest when multiple supports are combined. “With digital curriculum, students do better. But with a teacher on top of that digital curriculum, they do even better. Add intensive tutoring, and outcomes improve again,” she said.

    Progress monitoring and recognition also help build confidence. “Students are going to persist when they can experience success,” Barrett added. “Celebrating growth, even in small increments, matters for motivation.”

    A shared mission

    While tools like Exact Path provide research-backed support, Barrett stressed that literacy improvement is ultimately a shared responsibility. “District leaders should be asking: How is this program serving students across different backgrounds? Is it working for multilingual learners, students with IEPs, students who are at risk?” she said.

    The broader goal, she emphasized, is preparing students for lifelong learning. “Middle school is such an important time. If we can help students build literacy and confidence there, we’re not just improving test scores–we’re giving them the skills to succeed in every subject, and in life.”

    Laura Ascione

  • The Right-Wing Roots of EdTech

    The modern EdTech industry is often portrayed as a neutral, innovative force, but its origins are deeply political. Its growth has been fueled by a fusion of neoliberal economics, right-wing techno-utopianism, patriarchy, and classism, reinforced by racialized inequality. One of the key intellectual architects of this vision was George Gilder, a conservative supply-side evangelist whose work glorified technology and markets as liberating forces. His influence helped pave the way for the “Gilder Effect”: a reshaping of education into a market where technology, finance, and ideology collide, often at the expense of marginalized students and workers.

    The for-profit college boom provides the clearest demonstration of how the Gilder Effect operates. John Sperling’s University of Phoenix, later run by executives like Todd Nelson, was engineered as a credential factory, funded by federal student aid and Wall Street. Its model was then exported across the sector, including Risepoint (formerly Academic Partnerships), a company that sold universities on revenue-sharing deals for online programs. These ventures disproportionately targeted working-class women, single mothers, military veterans, and Black and Latino students. The model was not accidental—it was designed to exploit populations with the least generational wealth and the most limited alternatives. Here, patriarchy, classism, and racism intersected: students from marginalized backgrounds were marketed promises of upward mobility but instead left with debt, unstable credentials, and limited job prospects.

    Clayton Christensen and Michael Horn of Harvard Business School popularized the concept of “disruption,” providing a respectable academic justification for dismantling public higher education. Their theory of disruptive innovation framed traditional universities as outdated and made way for venture-capital-backed intermediaries. Yet this rhetoric concealed a brutal truth: disruption worked not by empowering the disadvantaged but by extracting value from them, often reinforcing existing inequalities of race, gender, and class.

    The rise and collapse of 2U shows how this ideology plays out. Founded in 2008, 2U promised to bring elite universities online, selling the dream of access to graduate degrees for working professionals. Its “flywheel effect” growth strategy relied on massive enrollment expansion and unsustainable spending. Despite raising billions, the company never turned a profit. Its high-profile acquisition of edX from Harvard and MIT only deepened its financial instability. When 2U filed for bankruptcy, it was not simply a corporate failure—it was a symptom of an entire system built on hype and dispossession.

    2U also became notorious for its workplace practices. In 2015, it faced a pregnancy discrimination lawsuit after firing an enrollment director who disclosed her pregnancy. Women workers, especially mothers, were treated as expendable, a reflection of patriarchal corporate norms. Meanwhile, many front-line employees—disproportionately women and people of color—faced surveillance, low wages, and impossible sales quotas. Here the intersections of race, gender, and class were not incidental but central to the business model. The company extracted labor from marginalized workers while selling an educational dream to marginalized students, creating a cycle of exploitation at both ends of the pipeline.

    Financialization extended these dynamics. Lenders like Sallie Mae and Navient, and servicers like Maximus, turned students into streams of revenue, with Student Loan Asset-Backed Securities (SLABS) trading debt obligations on Wall Street. Universities, including Purdue Global and University of Arizona Global, rebranded failing for-profits as “public” ventures, but their revenue-driven practices remained intact. These arrangements consistently offloaded risk onto working-class students, especially women and students of color, while enriching executives and investors.

    The Gilder Effect, then, is not just about technology or efficiency. It is about reshaping higher education into a site of extraction, where the burdens of debt and labor fall hardest on those already disadvantaged by patriarchy, classism, and racism. Intersectionality reveals what the industry’s boosters obscure: EdTech has not democratized education but has deepened inequality. The failure of 2U and the persistence of predatory for-profit models are not accidents—they are the logical outcome of an ideological project rooted in conservative economics and systemic oppression.



  • Weaving digital citizenship into edtech innovation

    What happens when over 100 passionate educators converge in Chicago to celebrate two decades of educational innovation? A few weeks ago, I had the thrilling opportunity to immerse myself in the 20th anniversary of the Discovery Educator Network (the DEN), a week-long journey that reignited my passion for transforming classrooms.

    From sunrise to past sunset, my days at Loyola University were a whirlwind of learning, laughter, and relentless exploration. Living the dorm life, forging new connections, and rekindling old friendships, we collectively dove deep into the future of learning, creating experiences that went far beyond the typical professional development.

    As an inaugural member of the DEN, the professional learning community supported by Discovery Education, I was incredibly excited to return 20 years after its founding to guide a small group of educators through the bountiful innovations of the DEN Summer Institute (DENSI). Think scavenger hunts, enlightening workshops, and collaborative creations–every moment was packed with cutting-edge ideas and practical strategies for weaving technology seamlessly into our teaching, ensuring our students are truly future-ready.

    During my time at DENSI, I learned a lot of new tips and tricks that I will pass on to the educators I collaborate with. From AI’s potential to the various new ways to work together online, participants in this unique event learned a number of ways to weave digital citizenship into edtech innovation. I’ve narrowed them down to five core concepts, each a powerful step toward building future-ready classrooms and fostering truly responsible digital citizens.

    Use of artificial intelligence

    Technology integration: When modeling responsible AI use, key technology tools could include generative platforms like Gemini, NotebookLM, Magic School AI, and Brisk, acting as ‘thought partners’ for brainstorming, summarizing, and drafting. Integration also covers AI grammar/spell-checkers, data visualization tools, and feedback tools for refining writing, presenting information, and self-assessment, enhancing digital content interaction and production.

    Learning & application: Teaching students to ethically use AI is key. This involves modeling critical evaluation of AI content for bias and inaccuracies. For instance, providing students with an AI summary of a historical event to fact-check with credible sources. Students learn to apply AI as a thought partner, boosting creativity and collaboration, not replacing their own thinking. Fact-checking and integrating their unique voices are essential. An English class could use AI to brainstorm plot ideas, but students develop characters and write the narrative. Application includes using AI for writing refinement and data exploration, fostering understanding of AI’s academic capabilities and limitations.

    Connection to digital citizenship: This example predominantly connects to digital citizenship. Teaching responsible AI use promotes intellectual honesty and information literacy. Students can grasp ethical considerations like plagiarism and proper attribution. The “red, yellow, green” stoplight method provides a framework for AI use, teaching students when to use AI as a collaborator, editor, or thought partner–or not at all. This approach cultivates critical thinking and empowers students to navigate the digital landscape with integrity, preparing them as responsible digital citizens who understand AI’s implications.

    Digital communication

    Technology integration: Creating digital communication norms should focus on clarity, using visuals like infographics, screenshots, and video clips. Canva is a key tool for building a visual “Digital Communication Agreement” that defines online interaction expectations. Include student voice by integrating pictures and graphics that illustrate expected behaviors, and potentially collaborative presentation or polling tools that involve students in norm-setting.

    Learning & application: Establishing clear online interaction norms is the focus of digital communication. Applying clear principles teaches the importance of visuals and setting communication goals. Creating a visual “Digital Communication Agreement” with Canva is a practical application where students define respectful online language and netiquette. An elementary class might design a virtual classroom rules poster, showing chat emojis and explaining “think before you post.” Using screenshots and “SMART goals” for online discussions reinforces learning, teaching constructive feedback and respectful debate. In a middle school science discussion board, the teacher could model a respectful response like “I understand your point, but I’m wondering if…” This helps students apply effective digital communication principles.

    Connection to digital citizenship: This example fosters respectful communication, empathy, and understanding of online social norms. By creating and adhering to a “Digital Communication Agreement,” students develop responsibility for online interactions. Emphasizing respectful language and netiquette cultivates empathy and awareness of their words’ impact. This prepares them as considerate digital citizens, contributing positively to inclusive online communities.

    Content curation

    Technology integration: For understanding digital footprints, one primary tool is Google Drive, used as a digital folder to curate students’ content. The “Tech Toolbox” concept implies interaction with various digital platforms where an online presence exists. Curating content across many tools shows students how the traces they leave on a range of technologies form their collective digital footprint.

    Learning & application: This centers on educating students about their online presence’s permanence and nature. Teaching them to curate digital content in a structured way, like using a Google Drive folder, is key. A student could create a “Digital Portfolio” in Google Drive with online projects, proud social media posts, and reflections on their public identity. By collecting and reviewing online artifacts, students visualize their current “digital footprint.” The classroom “listening tour” encourages critical self-reflection, prompting students to think about why they share online and how to be intentional about their online identity. This might involve students reviewing anonymized social media profiles, discussing the impression given to future employers.

    Connection to digital citizenship: This example cultivates awareness of online permanence, privacy, responsible self-presentation, and reputation management. Understanding lasting digital traces empowers students to make informed decisions. The reflection process encourages the consideration of their footprint’s impact, fostering ownership and accountability for online behavior. This helps them become mindful, capable digital citizens.

    Promoting media literacy

    Technology integration: One way to promote media literacy is by using “Paperslides” for engaging content creation, leveraging cameras and simple video recording. This concept gained popularity in the DEN’s early days through Dr. Lodge McCammon. Dr. Lodge’s popular 1-Take Paperslide Video strategy–“hit record, present your material, then hit stop, and your product is done”–is something anyone can start using tomorrow. Integration uses real-life examples (likely digital media) to share a variety of topics with any audience. Additionally, applying “Pay Full Attention” in a digital context implies using online viewing platforms and communication tools to model digital eye contact and verbal cues.

    Learning & application: Integrating critical media consumption with engaging content creation is the focus. Students learn to leverage “Paperslides” or another video creation method to explain topics or present research, moving beyond passive consumption. For a history project, students could create “Paperslides” explaining World War II causes, sourcing information and depicting events. Learning involves using real-life examples to discern credible online sources, understanding misinformation and bias. A lesson might show a satirical news article, guiding students to verify sources and claims through their storyboard portion. Applying “Pay Full Attention” teaches active, critical viewing, minimizing distractions. During a class viewing of an educational video, students could pause to discuss presenter credentials or unsupported claims, mimicking active listening. This fosters practical media literacy in creating and consuming digital content.

    Connection to digital citizenship: This example enhances media literacy, critical online information evaluation, and understanding persuasive techniques. Learning to create and critically consume content makes students informed, responsible digital participants. They identify and question sources, essential for navigating a digital information-saturated world. This empowers them as discerning digital citizens, contributing thoughtfully to online content.

    Collaborative problem-solving

    Technology integration: For practicing digital empathy and support, key tools are collaborative online documents like Google Docs and Google Slides. Integration extends to online discussion forums (Google Classroom, Flip) for empathetic dialogue, and project management tools (Trello, Asana) for transparent organization. 

    Learning & application: This focuses on developing effective collaborative skills and empathetic communication in digital spaces. Students learn to work together on shared documents, applying a “Co-Teacher or Model Lessons” approach where they “co-teach” each other new tools or concepts. In a group science experiment, students might use a shared Google Doc to plan methodology, with one “co-teaching” data table insertion from Google Sheets. They practice constructive feedback and model active listening in digital settings, using chat for clarification or emojis for feelings. The “red, yellow, green” policy provides a clear framework for online group work, teaching when to seek help, proceed cautiously, or move forward confidently. For a research project, “red” means needing a group huddle, “yellow” is proceeding with caution, and “green” is ready for review.

    Connection to digital citizenship: This example is central to digital citizenship, developing empathy, respectful collaboration, and responsible problem-solving in digital environments. Structured online group work teaches how to navigate disagreements and offers supportive feedback. Emphasis on active listening and empathetic responses helps internalize civility, preparing students as considerate digital citizens contributing positively to online communities.

    These examples offer a powerful roadmap for cultivating essential digital citizenship skills and preparing all learners to be future-ready. The collective impact of thoughtfully using these or similar approaches, or even grab-and-go resources from programs such as Discovery Education’s Digital Citizenship Initiative, can provide the foundation for a strong academic and empathetic school year. It can empower educators and students alike to navigate the digital world with confidence, integrity, and a deep understanding of their role as responsible digital citizens.

    In addition, this event reminded me of the power of professional learning communities. Every educator needs and deserves a supportive community that will share ideas, push their thinking, and support their professional development. One of my long-standing communities is the Discovery Educator Network (which is currently accepting applications for membership).


  • 4 tips to support the literacy needs of middle and high school students

    Today’s middle schoolers continue to struggle post-pandemic to read and write at the level needed to successfully navigate more complex academic content in the upper grades and beyond, according to a new report from NWEA, a K-12 assessment and research organization.

    Based on NWEA’s research, current 8th graders would need close to a full academic year of additional instruction to catch up to their pre-pandemic peers in reading. This trend was reiterated in recent assessment results from the National Assessment of Educational Progress (NAEP), with only 30 percent of eighth-grade students performing at or above the NAEP proficient level.

    While early literacy initiatives have garnered attention in recent years, the fact remains that many students struggle to read and are not prepared for the rigors of middle school. Students quickly struggle to keep up: they no longer receive explicit, structured reading instruction, even as they are expected to comprehend increasingly complex materials across subjects like science, history, and English language arts.

    The report, Policy recommendations for addressing the middle school reading crisis, is co-authored by Miah Daughtery, EdD, NWEA VP of Academic Advocacy at HMH (NWEA’s parent company), and Chad Aldeman, founder of Read Not Guess.

    “Our current middle and high schoolers were just starting their literacy journey when the pandemic hit, and we cannot lessen the urgency to support them. But, middle school literacy is complex even for students who are reading on grade level. This demands intentional, well-funded, and focused policy leadership that includes support across the K-12 spectrum,” said Daughtery. “Simply put, learning to read is not done when a student exits elementary school; support cannot stop there either.”

    Policymakers and district leaders must adopt a systems-level approach that supports both early learners and the unique literacy needs of middle and high school students.

    The new report provides four components that can be leveraged to make this happen:

    1. Use high-quality, grade-appropriate assessments that provide specific data on the literacy needs of middle schoolers.
    2. Look at flexible scheduling and policies that promote literacy development throughout the entire school day and help districts more effectively use instructional time.
    3. Understand and support the unique literacy needs of middle schoolers across subjects and disciplines from a systems perspective and invest in teacher professional learning in all disciplines, including at the upper grades, within state and district literacy plans.
    4. Cultivate relationships with external partners, such as community organizations and nonprofits, that share the goal of improving literacy outcomes and can support and reinforce literacy development beyond the school’s hours and resources.
    eSchool News Staff

  • The PIE meets Taylor Shead

    The PIE meets Taylor Shead

    “Who am I? I’m one of the people that can see the future well before it’s created.”

    Meet Taylor Shead, the athlete-turned tech entrepreneur who is on a mission to change the way students access and absorb education in the 21st century.

    A former college basketball scholar, her original goal was to train as a reconstructive plastic surgeon alongside her sporting career.

    But like many students, while sports held her attention, she found STEM subjects inaccessible due to the dense language of mathematical equations and chemical symbols.

    “Frankly, I was a little annoyed,” Shead explains. “I was in the best private schools in Texas, and I thought: if I’m in this privileged position where I’m going to college level and I don’t feel prepared, then what about everybody else from all kinds of backgrounds?

    “As an athlete, you have tutors [to help you succeed academically] and so I had a moment when I realised that the education system isn’t working.”

    The statistics back up her hypothesis. In the US, approximately 86% of kids graduate from high school, but only about 37% of them graduate from college. Only 66% of US students reach Level 2 proficiency in mathematics and fewer than 30% of high school students feel prepared to pursue a postsecondary pathway.

    “It was like, this isn’t a problem that’s black or white, it’s not male or female, it’s not rich or poor. This is a problem that impacts everybody,” says Shead.

    “There’s a problem with the current system, the way schooling and college prepares you for each next step, even when it’s the best of the best – so what’s the solution?”

    Building on a three-year stint as an Apple mentor and volunteering in inner city schools in Dallas and Fort Worth, Shead took the leap and founded Stemuli in 2016 as a platform to support kids in STEM subjects.

    Shortly after, the pandemic hit and the world pivoted to online learning. The moment catapulted the business forward, and Shead became only the 94th Black woman ever to raise over a million dollars in venture capital.

    The company raised over US$10 million overall and won the prestigious United Nations AI for Good competition in 2024.

    The Stemuli mission is to gamify the curriculum to engage a generation of learners who have grown up on video games. This isn’t online learning for the sake of it; the aim is to create learning opportunities in the co-creative worlds that exist in games.

    “There are 3.3 billion gamers around the world playing right now,” Shead explains. “Yet all the kids I meet in classrooms are bored. Games like Roblox and Minecraft have set the example of STEM learning crossing over to where kids want to be.”

    Stemuli is currently beta testing the third iteration of the platform, a one-world gaming environment where there are infinite possibilities to explore and learn.

    “We used to produce a lot of work simulation games but now nobody knows what the future jobs are going to be. Technology is moving so fast,” explains Shead.

    “So we’ve created a much more entrepreneurial gaming experience where, together with an AI prompt assistant, you can test and learn all sorts of ideas in a safe environment. We’ve created a game for entrepreneurship.”

    Shead is keen to stress that there is a misconception that entrepreneurship means that you must aspire to be the boss of your own company. She equates entrepreneurship to a curiosity skillset that builds problem solving and resilience in a fast-changing world.

    “We are a Walton family-funded organisation and they partnered with us at Stemuli to scale Stemuli across 20 states in the heartland in order to make sure people in rural America have access to AI literacy skills through our video game,” she says.

    “I am obsessed about the idea of a little boy or girl sitting in a rural, remote town that’s seeing with their own eyes the problems that need to be solved in their community. They’re going to create the best technology because they understand the problem, whereas somebody on the coast or Silicon Valley, they’re not even thinking about it.”

    It is also significant that Shead has achieved so much success in the edtech field, despite coming largely from an athletic background rather than a tech education.

    “Most people think athletes are dumb, but maybe we’re stubborn and hardworking and relentless enough to be the ones that actually can endure the pressure to make something like this happen, right?

    “I like to flip the narrative on its head to say it might take an athlete to go up against established systems and to believe that, in a world that is so structured, that education can actually change for the better. They don’t call athletes game-changers for nothing.”

    There will be many people who feel the status quo in education should be preserved, but the great promise of technology is the potential for companies like Stemuli to open up access for the majority rather than the privileged few.

    “It’s going to be hard, but there are people like me out there who feel inspired by this mission and that means it’s the best time to be alive,” says Shead.

    Having seen Shead in action at The PIE Live Asia Pacific, we are inclined to believe her.

    Taylor Shead was interviewed by The PIE’s Nicholas Cuthbert and took part in our conference debate – Will AI improve or damage higher education? – at The PIE Live Asia Pacific. Watch Taylor explain why it’s the best time to be alive below.


  • A practical guide for sourcing edtech

    A practical guide for sourcing edtech

    Key points:

    Virtual reality field trips now enable students to explore the Great Wall of China, the International Space Station, and ancient Rome without leaving the classroom. Gamified online learning platforms can turn lessons into interactive challenges that boost engagement and motivation. Generative AI tutors are providing real-time feedback on writing and math assignments, helping students sharpen their skills with personalized support in minutes.

    Education technology is accelerating at a rapid pace–and teachers are eager to bring these digital tools to the classroom. But with pandemic relief funds running out, districts are having to make tougher decisions around what edtech they can afford, which vendors will offer the greatest value, and, crucially, which tools come with robust cybersecurity protections.

    Although educators are excited to innovate, school leaders must weigh every new app or online platform against cybersecurity risks and the responsibility of protecting student data. Unfortunately, those risks remain very real: 6 in 10 K-12 schools were targeted by ransomware in 2024.

    Cybersecurity is harder for some districts than others

    The reality is that school districts widely vary when it comes to their internal resources, cybersecurity expertise, and digital maturity.

    A massive urban system may have a dedicated legal department, CISO, and rigid procurement processes. In a small rural district, the IT lead might also coach soccer or direct the school play.

    These discrepancies leave wide gaps that security threats can exploit. Districts often improvise vetting processes that vary wildly in rigor, and even the best-prepared system struggles to know what “good enough” looks like as new tools proliferate and threats evolve just as fast.

    Whether it’s apps for math enrichment, platforms for grading, or new generative AI tools that promise differentiated learning at scale, educators are using more technology than ever. And while these digital tools are bringing immense benefits to the classroom, they also bring more threat exposure. Every new tool is another addition to the attack surface, and most school districts are struggling to keep up.

    Districts are now facing these critical challenges with even fewer resources. With the U.S. Department of Education closing its Office of EdTech, schools have lost a vital guidepost for evaluating technology tools safely. That means less clarity and support, even as the influx of new tech tools is at an all-time high.

    But innovation and protection don’t have to be in conflict. Schools can move forward with digital tools while still making smart, secure choices. Their decision-making can be supported by some simple best practices to help guide the way.

    5 green flags for evaluating technology tools


    With so many tools entering classrooms, knowing how to assess their safety and reliability is essential. But what does safe and trustworthy edtech actually look like?

    You don’t need legal credentials or a cybersecurity certification to answer that question. You simply need to know what to look for–and what questions to ask. Here are five green flags that can guide your decisions and boost confidence in the tools you bring into your classrooms.

    1. Clear and transparent privacy policies

    A strong privacy policy should be more than a formality; it should serve as a clear window into how a tool handles data. The best ones lay out exactly what information is collected, why it’s needed, how it’s used, and who it’s shared with, in plain, straightforward language.

    You shouldn’t need legal training to make sense of it. Look for policies that avoid vague, catch-all phrases and instead offer specific details, like a list of subprocessors, third-party services involved, or direct contact information for the vendor’s privacy officer. If you can’t quickly understand how student data is being handled, or if the vendor seems evasive when you ask, that’s cause for concern.

    2. Separation between student and adult data

    Student data is highly personal, extremely sensitive, and must be treated with extra care. Strong vendors explicitly separate student data from educator, administrator, and parent data in their systems, policies, and user experiences.

    Ask how student data is accessed internally and what safeguards are in place. Does the vendor have different privacy policies for students versus adults? If they’ve engineered that distinction into their platform, it’s a sign they’ve thought deeply about your responsibilities under FERPA and COPPA.

    3. Third-party audits and certifications

    Trust, but verify. Look for tools that have been independently evaluated through certifications like the Common Sense Privacy Seal, iKeepSafe, or the 1EdTech Trusted App program. These external audits validate that privacy claims and company practices are tested against meaningful standards and backed up by third-party validation.

    Alignment with broader security frameworks like NIST Cybersecurity Framework (CSF), ISO 27001, or SOC 2 can add another layer of assurance, especially in states where district policies lean heavily on these benchmarks. These technical frameworks should complement radical transparency. The most trustworthy vendors combine certification with transparency: They’ll show you exactly what they collect, how they store it, and how they protect it. That openness–and a willingness to be held accountable–is the real marker of a privacy-first partner.

    4. Long-term commitment to security and privacy

    Cybersecurity shouldn’t be a one-and-done checklist. It’s a continual practice. Ask vendors how they approach ongoing risks: Do they conduct regular penetration testing? Is a formal incident response plan in place? How are teams trained on phishing threats and secure coding?

    If they follow a framework like the NIST CSF, that’s great. But also dig into how they apply it: What’s their track record for patching vulnerabilities or communicating breaches? A real commitment shows up in action, not just alignment.

    5. Data minimization and purpose limitations

    Trustworthy technology tools collect only what’s essential–and vendors can explain why they need it. If you ask, “Why do you collect this data point?” they should have a direct answer that ties back to functionality, not future marketing.

    Look for platforms that commit to never repurposing student data for behavioral ad targeting. Also, ask about deletion protocols: Can data be purged quickly and completely if requested? If not, it’s time to ask why.
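    The five green flags above can double as a simple vetting checklist. The sketch below is one minimal way a district might record and score a vendor questionnaire — the field names are illustrative assumptions, not part of any real procurement standard:

    ```python
    # Minimal sketch of a vendor-vetting checklist based on the five green flags.
    # All field names are hypothetical, chosen only to illustrate the idea.

    GREEN_FLAGS = [
        "clear_privacy_policy",          # 1. plain-language, specific data practices
        "student_adult_data_separated",  # 2. FERPA/COPPA-aware data separation
        "third_party_audited",           # 3. e.g. Common Sense Seal, 1EdTech, SOC 2
        "ongoing_security_program",      # 4. pen tests, incident response, training
        "data_minimization",             # 5. collects only what's essential
    ]

    def vet_vendor(answers: dict) -> tuple[int, list[str]]:
        """Return (score, list of missing green flags) for a vendor questionnaire."""
        missing = [flag for flag in GREEN_FLAGS if not answers.get(flag, False)]
        return len(GREEN_FLAGS) - len(missing), missing

    score, gaps = vet_vendor({
        "clear_privacy_policy": True,
        "student_adult_data_separated": True,
        "third_party_audited": False,
        "ongoing_security_program": True,
        "data_minimization": True,
    })
    print(f"{score}/5 green flags; follow up on: {gaps}")
    ```

    Even a lightweight record like this gives a district a consistent paper trail: the same questions get asked of every vendor, and any missing flag becomes a concrete follow-up item rather than a vague hesitation.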

    Laying the groundwork for a safer school year

    Cybersecurity doesn’t require a 10-person IT team or a massive budget. Every district, no matter the size, can take meaningful, manageable steps to reduce risk, establish guardrails, and build trust.

    Simple, actionable steps go a long way: Choose tools that are transparent about data use, use trusted frameworks and certifications as guideposts, and make cybersecurity training a regular part of staff development. Even small efforts, like a five-minute refresher on phishing during back-to-school sessions, can have an outsized impact on your district’s overall security posture.

    For schools operating without deep resources or internal expertise, this work is especially urgent–and entirely possible. It just requires knowing where to start.


  • Data, privacy, and cybersecurity in schools: A 2025 wake-up call

    Data, privacy, and cybersecurity in schools: A 2025 wake-up call

    Key points:

    In 2025, schools are sitting on more data than ever before. Student records, attendance, health information, behavioral logs, and digital footprints generated by edtech tools have turned K-12 institutions into data-rich environments. As artificial intelligence becomes a central part of the learning experience, these data streams are being processed in increasingly complex ways. But with this complexity comes a critical question: Are schools doing enough to protect that data?

    The answer, in many cases, is no.

    The rise of shadow AI

    According to CoSN’s May 2025 State of EdTech District Leadership report, 43 percent of districts lack formal policies or guidance for AI use, even as 80 percent have generative AI initiatives underway — a major policy gap. At the same time, Common Sense Media’s Teens, Trust and Technology in the Age of AI highlights that many teens have been misled by fake content and struggle to discern truth from misinformation, underscoring both the broad adoption and the potential risks of generative AI.

    This lack of visibility and control has led to the rise of what many experts call “shadow AI”: unapproved apps and browser extensions that process student inputs, store them indefinitely, or reuse them to train commercial models. These tools are often free, widely adopted, and nearly invisible to IT teams. Shadow AI expands the district’s digital footprint in ways that often escape policy enforcement, opening the door to data leakage and compliance violations. CoSN’s 2025 report specifically notes that “free tools that are downloaded in an ad hoc manner put district data at risk.”

    Data protection: The first pillar under pressure

    The U.S. Department of Education’s AI Toolkit for Schools urges districts to treat student data with the same care as medical or financial records. However, many AI tools used in classrooms today are not inherently FERPA-compliant and do not always disclose where or how student data is stored. Teachers experimenting with AI-generated lesson plans or feedback may unknowingly input student work into platforms that retain or share that data. In the absence of vendor transparency, there is no way to verify how long data is stored, whether it is shared with third parties, or how it might be reused. FERPA requires that if third-party vendors handle student data on behalf of the institution, they must comply with FERPA. This includes ensuring data is not used for unintended purposes or retained for AI training.

    Some tools, marketed as “free classroom assistants,” require login credentials tied to student emails or learning platforms. This creates additional risks if authentication mechanisms are not protected or monitored. Even widely used generative tools may include language in their privacy policies allowing them to use uploaded content for system training or performance optimization.


    Data processing and the consent gap

    Generative AI models are trained on large datasets, and many free tools continue learning from user prompts. If a student pastes an essay or a teacher includes student identifiers in a prompt, that information could enter a commercial model’s training loop. This creates a scenario where data is being processed without explicit consent, potentially in violation of COPPA (Children’s Online Privacy Protection Act) and FERPA. While the FTC’s December 2023 update to the COPPA Rule did not codify school consent provisions, existing guidance still allows schools to consent to technology use on behalf of parents in educational contexts. However, the onus remains on schools to understand and manage these consent implications, especially with the rule’s new amendments becoming effective June 21, 2025, which strengthen protections and require separate parental consent for third-party disclosures for targeted advertising.

    Moreover, many educators and students are unaware of what constitutes “personally identifiable information” (PII) in these contexts. A name combined with a school ID number, disability status, or even a writing sample could easily identify a student, especially in small districts. Without proper training, well-intentioned AI use can cross legal lines unknowingly.

    Cybersecurity risks multiply

    AI tools have also increased the attack surface of K-12 networks. According to ThreatDown’s 2024 State of Ransomware in Education report, ransomware attacks on K-12 schools increased by 92 percent between 2022 and 2023, with 98 total attacks in 2023. This trend is projected to continue as cybercriminals use AI to create more targeted phishing campaigns and detect system vulnerabilities faster. AI-assisted attacks can mimic human language and tone, making them harder to detect. Some attackers now use large language models to craft personalized emails that appear to come from school administrators.

    Many schools lack endpoint protection for student devices, and third-party integrations often bypass internal firewalls. Free AI browser extensions may collect keystrokes or enable unauthorized access to browser sessions. The more tools that are introduced without IT oversight, the harder it becomes to isolate and contain incidents when they occur. CoSN’s 2025 report indicates that 60 percent of edtech leaders are “very concerned about AI-enabled cyberattacks,” yet 61 percent still rely on general funds for cybersecurity efforts, not dedicated funding.

    Building a responsible framework

    To mitigate these risks, school leaders need to:

    • Audit tool usage using platforms like Lightspeed Digital Insight to identify AI tools being accessed without approval. Districts should maintain a living inventory of all digital tools. Lightspeed Digital Insight, for example, is vetted by 1EdTech for data privacy.
    • Develop and publish AI use policies that clarify acceptable practices, define data handling expectations, and outline consequences for misuse. Policies should distinguish between tools approved for instructional use and those requiring further evaluation.
    • Train educators and students to understand how AI tools collect and process data, how to interpret AI outputs critically, and how to avoid inputting sensitive information. AI literacy should be embedded in digital citizenship curricula, with resources available from organizations like Common Sense Media and aiEDU.
    • Vet all third-party apps through standards like the 1EdTech TrustEd Apps program. Contracts should specify data deletion timelines and limit secondary data use. The TrustEd Apps program has vetted over 12,000 products, providing a valuable resource for districts.
    • Simulate phishing attacks and test breach response protocols regularly. Cybersecurity training should be required for staff, and recovery plans must be reviewed annually.
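    The first of the steps above — maintaining a living inventory and surfacing shadow AI — can be sketched as a simple comparison between tools observed in use and the district’s approved list. The tool names and the shape of the data here are hypothetical examples, not output from any real monitoring platform:

    ```python
    # Illustrative sketch of a "living inventory" check: compare tools observed
    # in use against the district's approved list to surface shadow AI.
    # Tool names and the data source are made-up examples for illustration.

    APPROVED = {"Canvas", "Lightspeed Digital Insight", "Khan Academy"}

    def find_shadow_tools(observed: set[str]) -> set[str]:
        """Return tools in use that never went through the vetting process."""
        return observed - APPROVED

    # In practice this set would come from network or browser-extension telemetry.
    observed_on_network = {"Canvas", "FreeEssayBot", "Khan Academy", "QuickSummarizerAI"}
    unvetted = find_shadow_tools(observed_on_network)
    print("Needs review:", sorted(unvetted))
    ```

    The point is not the code but the discipline: whatever platform supplies the “observed” list, the approved list has to live somewhere explicit, be kept current, and be checked against actual usage on a regular cadence.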

    Trust starts with transparency

    In the rush to embrace AI, schools must not lose sight of their responsibility to protect students’ data and privacy. Transparency with parents, clarity for educators, and secure digital infrastructure are not optional. They are the baseline for trust in the age of algorithmic learning.

    AI can support personalized learning, but only if we put safety and privacy first. The time to act is now. Districts that move early to build policies, offer training, and coordinate oversight will be better prepared to lead AI adoption with confidence and care.
