Category: EdTech

  • Weaving digital citizenship into edtech innovation

    What happens when over 100 passionate educators converge in Chicago to celebrate two decades of educational innovation? A few weeks ago, I had the thrilling opportunity to immerse myself in the 20th anniversary of the Discovery Educator Network (the DEN), a week-long journey that reignited my passion for transforming classrooms.

    From sunrise to past sunset, my days at Loyola University were a whirlwind of learning, laughter, and relentless exploration. Living the dorm life, forging new connections, and rekindling old friendships, we collectively dove deep into the future of learning, creating experiences that went far beyond the typical professional development.

    As an inaugural member of the DEN, the professional learning community supported by Discovery Education, I was incredibly excited to return 20 years after its founding to guide a small group of educators through the bountiful innovations of the DEN Summer Institute (DENSI). Think scavenger hunts, enlightening workshops, and collaborative creations–every moment was packed with cutting-edge ideas and practical strategies for weaving technology seamlessly into our teaching, ensuring our students are truly future-ready.

    During my time at DENSI, I learned a lot of new tips and tricks that I will pass on to the educators I collaborate with. From AI’s potential to the various new ways to work together online, participants in this unique event learned a number of ways to weave digital citizenship into edtech innovation. I’ve narrowed them down to five core concepts, each a powerful step toward building future-ready classrooms and fostering truly responsible digital citizens.

    Use of artificial intelligence

    Technology integration: When modeling responsible AI use, key technology tools could include generative platforms like Gemini, NotebookLM, Magic School AI, and Brisk, acting as ‘thought partners’ for brainstorming, summarizing, and drafting. Integration also covers AI grammar/spell-checkers, data visualization tools, and feedback tools for refining writing, presenting information, and self-assessment, enhancing digital content interaction and production.

    Learning & application: Teaching students to ethically use AI is key. This involves modeling critical evaluation of AI content for bias and inaccuracies. For instance, providing students with an AI summary of a historical event to fact-check with credible sources. Students learn to apply AI as a thought partner, boosting creativity and collaboration, not replacing their own thinking. Fact-checking and integrating their unique voices are essential. An English class could use AI to brainstorm plot ideas, but students develop characters and write the narrative. Application includes using AI for writing refinement and data exploration, fostering understanding of AI’s academic capabilities and limitations.

    Connection to digital citizenship: This example predominantly connects to digital citizenship. Teaching responsible AI use promotes intellectual honesty and information literacy. Students can grasp ethical considerations like plagiarism and proper attribution. The “red, yellow, green” stoplight method provides a framework for AI use, teaching students when to use AI as a collaborator, editor, or thought partner–or not at all. This approach cultivates critical thinking and empowers students to navigate the digital landscape with integrity, preparing them as responsible digital citizens who understand AI’s implications.

    Digital communication

    Technology integration: Creating digital communication norms should focus on clarity with visuals like infographics, screenshots, and video clips. Canva is a key tool for a visual “Digital Communication Agreement” defining online interaction expectations. Include student voice by integrating pictures and graphics that illustrate expected behaviors, and consider collaborative presentation or polling tools so students can help set the norms.

    Learning & application: Establishing clear online interaction norms is the focus of digital communication. Applying clear principles teaches the importance of visuals and setting communication goals. Creating a visual “Digital Communication Agreement” with Canva is a practical application where students define respectful online language and netiquette. An elementary class might design a virtual classroom rules poster, showing chat emojis and explaining “think before you post.” Using screenshots and “SMART goals” for online discussions reinforces learning, teaching constructive feedback and respectful debate. In a middle school science discussion board, the teacher could model a respectful response like “I understand your point, but I’m wondering if…” This helps students apply effective digital communication principles.

    Connection to digital citizenship: This example fosters respectful communication, empathy, and understanding of online social norms. By creating and adhering to a “Digital Communication Agreement,” students develop responsibility for online interactions. Emphasizing respectful language and netiquette cultivates empathy and awareness of their words’ impact. This prepares them as considerate digital citizens, contributing positively to inclusive online communities.

    Content curation

    Technology integration: For understanding digital footprints, one primary tool is Google Drive when used as a digital folder to curate students’ content. The “Tech Toolbox” concept implies interaction with various digital platforms where online presence exists. Using many tools to curate content means students leave traces across a range of technologies, and those traces together form their collective digital footprint.

    Learning & application: This centers on educating students about their online presence’s permanence and nature. Teaching them to curate digital content in a structured way, like using a Google Drive folder, is key. A student could create a “Digital Portfolio” in Google Drive with online projects, proud social media posts, and reflections on their public identity. By collecting and reviewing online artifacts, students visualize their current “digital footprint.” The classroom “listening tour” encourages critical self-reflection, prompting students to think about why they share online and how to be intentional about their online identity. This might involve students reviewing anonymized social media profiles, discussing the impression given to future employers.

    Connection to digital citizenship: This example cultivates awareness of online permanence, privacy, responsible self-presentation, and reputation management. Understanding lasting digital traces empowers students to make informed decisions. The reflection process encourages the consideration of their footprint’s impact, fostering ownership and accountability for online behavior. This helps them become mindful, capable digital citizens.

    Promoting media literacy

    Technology integration: One way to promote media literacy is by using “Paperslides” for engaging content creation, leveraging cameras and simple video recording. This concept gained popularity at the beginning of the DEN through Dr. Lodge McCammon. Dr. Lodge’s popular 1-Take Paperslide Video strategy, a “hit record, present your material, then hit stop, and your product is done” style of video creation, is something anyone can start using tomorrow. Integration uses real-life examples (likely digital media) to share a variety of topics for any audience. Additionally, applying “Pay Full Attention” in a digital context implies using online viewing platforms and communication tools to model digital eye contact and verbal cues.

    Learning & application: Integrating critical media consumption with engaging content creation is the focus. Students learn to leverage “Paperslides” or another video creation method to explain topics or present research, moving beyond passive consumption. For a history project, students could create “Paperslides” explaining World War II causes, sourcing information and depicting events. Learning involves using real-life examples to discern credible online sources, understanding misinformation and bias. A lesson might show a satirical news article, guiding students to verify sources and claims through their storyboard portion. Applying “Pay Full Attention” teaches active, critical viewing, minimizing distractions. During a class viewing of an educational video, students could pause to discuss presenter credentials or unsupported claims, mimicking active listening. This fosters practical media literacy in creating and consuming digital content.

    Connection to digital citizenship: This example enhances media literacy, critical online information evaluation, and understanding persuasive techniques. Learning to create and critically consume content makes students informed, responsible digital participants. They identify and question sources, essential for navigating a digital information-saturated world. This empowers them as discerning digital citizens, contributing thoughtfully to online content.

    Collaborative problem-solving

    Technology integration: For practicing digital empathy and support, key tools are collaborative online documents like Google Docs and Google Slides. Integration extends to online discussion forums (Google Classroom, Flip) for empathetic dialogue, and project management tools (Trello, Asana) for transparent organization. 

    Learning & application: This focuses on developing effective collaborative skills and empathetic communication in digital spaces. Students learn to work together on shared documents, applying a “Co-Teacher or Model Lessons” approach where they “co-teach” each other new tools or concepts. In a group science experiment, students might use a shared Google Doc to plan methodology, with one “co-teaching” data table insertion from Google Sheets. They practice constructive feedback and model active listening in digital settings, using chat for clarification or emojis for feelings. The “red, yellow, green” policy provides a clear framework for online group work, teaching when to seek help, proceed cautiously, or move forward confidently. For a research project, “red” means needing a group huddle, “yellow” is proceeding with caution, and “green” is ready for review.

    Connection to digital citizenship: This example is central to digital citizenship, developing empathy, respectful collaboration, and responsible problem-solving in digital environments. Structured online group work teaches how to navigate disagreements and offers supportive feedback. Emphasis on active listening and empathetic responses helps internalize civility, preparing students as considerate digital citizens contributing positively to online communities.

    These examples offer a powerful roadmap for cultivating essential digital citizenship skills and preparing all learners to be future-ready. The collective impact of thoughtfully utilizing these or similar approaches, or even grab-and-go resources from programs such as Discovery Education’s Digital Citizenship Initiative, can provide the foundation for a strong academic and empathetic school year, empowering educators and students alike to navigate the digital world with confidence, integrity, and a deep understanding of their role as responsible digital citizens.

    In addition, this event reminded me of the power of professional learning communities.  Every educator needs and deserves a supportive community that will share ideas, push their thinking, and support their professional development. One of my long-standing communities is the Discovery Educator Network (which is currently accepting applications for membership). 


  • 4 tips to support the literacy needs of middle and high school students

    Today’s middle schoolers continue to struggle post-pandemic to read and write at the level needed to successfully navigate more complex academic content in the upper grades and beyond, according to a new report from NWEA, a K-12 assessment and research organization.

    Based on NWEA’s research, current 8th graders would need close to a full academic year of additional instruction to catch up to their pre-pandemic peers in reading. This trend was reiterated in recent assessment results from the National Assessment of Educational Progress (NAEP), with only 30 percent of eighth-grade students performing at or above the NAEP proficient level.

    While early literacy initiatives have garnered attention in recent years, the fact remains that many students struggle to read and are not prepared for the rigors of middle school. Students quickly find themselves challenged to keep up as they no longer receive explicit, structured reading instruction, even as they are expected to comprehend increasingly complex materials across subjects, like science, history, or English Language Arts.

    The report, Policy recommendations for addressing the middle school reading crisis, is co-authored by Miah Daughtery, EdD, NWEA VP of Academic Advocacy at HMH (NWEA’s parent company), and Chad Aldeman, founder of Read Not Guess.

    “Our current middle and high schoolers were just starting their literacy journey when the pandemic hit, and we cannot lessen the urgency to support them. But, middle school literacy is complex even for students who are reading on grade level. This demands intentional, well-funded, and focused policy leadership that includes support across the K-12 spectrum,” said Daughtery. “Simply put, learning to read is not done when a student exits elementary school; support cannot stop there either.”

    Policymakers and district leaders must adopt a systems-level approach that supports both early learners and the unique literacy needs of middle and high school students.

    The new report provides four components that can be leveraged to make this happen:

    1. Use high-quality, grade-appropriate assessments that provide specific data on the literacy needs of middle schoolers.
    2. Look at flexible scheduling and policies that promote literacy development throughout the entire school day and help districts more effectively use instructional time.
    3. Understand and support the unique literacy needs of middle schoolers across subjects and disciplines from a systems perspective and invest in teacher professional learning in all disciplines, including at the upper grades, within state and district literacy plans.
    4. Curate relationships with external partners, like community organizations and nonprofits, who share similar goals in improving literacy outcomes, and can both support and reinforce literacy development, stretching beyond the school’s hours and resources.

  • The PIE meets Taylor Shead

    “Who am I? I’m one of the people that can see the future well before it’s created.”

    Meet Taylor Shead, the athlete-turned tech entrepreneur who is on a mission to change the way students access and absorb education in the 21st century.

    A former college basketball scholar, her original goal was to train as a reconstructive plastic surgeon alongside her sporting career.

    But like many students, while sports held her attention, she found STEM subjects inaccessible due to the dense language of mathematical equations and chemical symbols.

    “Frankly, I was a little annoyed,” Shead explains. “I was in the best private schools in Texas, and I thought: if I’m in this privileged position where I’m going to college level and I don’t feel prepared, then what about everybody else from all kinds of backgrounds?

    “As an athlete, you have tutors [to help you succeed academically] and so I had a moment when I realised that the education system isn’t working.”

    The statistics back up her hypothesis. In the US, approximately 86% of kids graduate from high school, but only about 37% of them graduate from college. Only 66% of US students reach Level 2 proficiency in mathematics and fewer than 30% of high school students feel prepared to pursue a postsecondary pathway.

    “It was like, this isn’t a problem that’s black or white, it’s not male or female, it’s not rich or poor. This is a problem that impacts everybody,” says Shead.

    “There’s a problem with the current system, the way schooling and college prepares you for each next step, even when it’s the best of the best – so what’s the solution?”

    Building on a three-year stint as an Apple mentor and volunteering in inner city schools in Dallas and Fort Worth, Shead took the leap and founded Stemuli in 2016 as a platform to support kids in STEM subjects.

    Shortly after, the pandemic hit and the world pivoted to online learning. The moment catapulted the business forward and Shead became only the 94th black woman in history to raise over a million dollars in venture capital.

    The company raised over US$10 million overall and won the prestigious United Nations AI for Good competition in 2024.

    The Stemuli mission is to gamify the curriculum to engage a generation of learners who have grown up on video games. This isn’t online learning for the sake of it; the aim is to create learning opportunities in the co-creative worlds that exist in games.

    “There are 3.3 billion gamers around the world playing right now,” Shead explains. “Yet all the kids I meet in classrooms are bored. Games like Roblox and Minecraft have set the example of STEM learning crossing over to where kids want to be.”

    Stemuli is currently beta testing the third iteration of the platform, a one-world gaming environment where there are infinite possibilities to explore and learn.

    “We used to produce a lot of work simulation games but now nobody knows what the future jobs are going to be. Technology is moving so fast,” explains Shead.

    “So we’ve created a much more entrepreneurial gaming experience where, together with an AI prompt assistant, you can test and learn all sorts of ideas in a safe environment. We’ve created a game for entrepreneurship.”

    Shead is keen to stress that there is a misconception that entrepreneurship means that you must aspire to be the boss of your own company. She equates entrepreneurship to a curiosity skillset that builds problem solving and resilience in a fast-changing world.

    “We are a Walton family-funded organisation and they partnered with us at Stemuli to scale Stemuli across 20 states in the heartland in order to make sure people in rural America have access to AI literacy skills through our video game,” she says.

    “I am obsessed about the idea of a little boy or girl sitting in a rural, remote town that’s seeing with their own eyes the problems that need to be solved in their community. They’re going to create the best technology because they understand the problem, whereas somebody on the coast or Silicon Valley, they’re not even thinking about it.”

    It is also significant that Shead has achieved so much success in the edtech field, despite coming largely from an athletic background rather than a tech education.

    “Most people think athletes are dumb, but maybe we’re stubborn and hardworking and relentless enough to be the ones that actually can endure the pressure to make something like this happen, right?

    “I like to flip the narrative on its head to say it might take an athlete to go up against established systems and to believe that, in a world that is so structured, that education can actually change for the better. They don’t call athletes game-changers for nothing.”

    There will be many people who feel the status quo in education should be preserved, but the great promise of technology is the potential for companies like Stemuli to open access up for the majority rather than the privileged few.

    “It’s going to be hard, but there are people like me out there who feel inspired by this mission and that means it’s the best time to be alive,” says Shead.

    Having seen Shead in action at The PIE Live Asia Pacific, we are inclined to believe her.

    Taylor Shead was interviewed by The PIE’s Nicholas Cuthbert and took part in our conference debate – Will AI improve or damage higher education? – at The PIE Live Asia Pacific.


  • A practical guide for sourcing edtech

    Virtual reality field trips now enable students to explore the Great Wall of China, the International Space Station, and ancient Rome without leaving the classroom.  Gamified online learning platforms can turn lessons into interactive challenges that boost engagement and motivation. Generative AI tutors are providing real-time feedback on writing and math assignments, helping students sharpen their skills with personalized support in minutes.

    Education technology is accelerating at a rapid pace–and teachers are eager to bring these digital tools to the classroom. But with pandemic relief funds running out, districts are having to make tougher decisions around what edtech they can afford, which vendors will offer the greatest value, and, crucially, which tools come with robust cybersecurity protections.

    Although educators are excited to innovate, school leaders must weigh every new app or online platform against cybersecurity risks and the responsibility of protecting student data. Unfortunately, those risks remain very real: 6 in 10 K-12 schools were targeted by ransomware in 2024.

    Cybersecurity is harder for some districts than others

    The reality is that school districts widely vary when it comes to their internal resources, cybersecurity expertise, and digital maturity.

    A massive urban system may have a dedicated legal department, CISO, and rigid procurement processes. In a small rural district, the IT lead might also coach soccer or direct the school play.

    These discrepancies leave wide gaps that can be exploited by security threats. Districts are often improvising vetting processes that vary wildly in rigor, and even the best-prepared system struggles to know what “good enough” looks like as technology tools rapidly accelerate and threats evolve just as fast.

    Whether it’s apps for math enrichment, platforms for grading, or new generative AI tools that promise differentiated learning at scale, educators are using more technology than ever. And while these digital tools are bringing immense benefits to the classroom, they also bring more threat exposure. Every new tool is another addition to the attack surface, and most school districts are struggling to keep up.

    Districts are now facing these critical challenges with even fewer resources. With the U.S. Department of Education closing its Office of EdTech, schools have lost a vital guidepost for evaluating technology tools safely. That means less clarity and support, even as the influx of new tech tools is at an all-time high.

    But innovation and protection don’t have to be in conflict. Schools can move forward with digital tools while still making smart, secure choices. Their decision-making can be supported by some simple best practices to help guide the way.

    5 green flags for evaluating technology tools

    With so many tools entering classrooms, knowing how to assess their safety and reliability is essential. But what does safe and trustworthy edtech actually look like?

    You don’t need legal credentials or a cybersecurity certification to answer that question. You simply need to know what to look for–and what questions to ask. Here are five green flags that can guide your decisions and boost confidence in the tools you bring into your classrooms.

    1. Clear and transparent privacy policies

    A strong privacy policy should be more than a formality; it should serve as a clear window into how a tool handles data. The best ones lay out exactly what information is collected, why it’s needed, how it’s used, and who it’s shared with, in plain, straightforward language.

    You shouldn’t need legal training to make sense of it. Look for policies that avoid vague, catch-all phrases and instead offer specific details, like a list of subprocessors, third-party services involved, or direct contact information for the vendor’s privacy officer. If you can’t quickly understand how student data is being handled, or if the vendor seems evasive when you ask, that’s cause for concern.

    2. Separation between student and adult data

    Student data is highly personal, extremely sensitive, and must be treated with extra care. Strong vendors explicitly separate student data from educator, administrator, and parent data in their systems, policies, and user experiences.

    Ask how student data is accessed internally and what safeguards are in place. Does the vendor have different privacy policies for students versus adults? If they’ve engineered that distinction into their platform, it’s a sign they’ve thought deeply about your responsibilities under FERPA and COPPA.

    3. Third-party audits and certifications

    Trust, but verify. Look for tools that have been independently evaluated through certifications like the Common Sense Privacy Seal, iKeepSafe, or the 1EdTech Trusted App program. These external audits validate that privacy claims and company practices are tested against meaningful standards and backed up by third-party validation.

    Alignment with broader security frameworks like NIST Cybersecurity Framework (CSF), ISO 27001, or SOC 2 can add another layer of assurance, especially in states where district policies lean heavily on these benchmarks. These technical frameworks should complement radical transparency. The most trustworthy vendors combine certification with transparency: They’ll show you exactly what they collect, how they store it, and how they protect it. That openness–and a willingness to be held accountable–is the real marker of a privacy-first partner.

    4. Long-term commitment to security and privacy

    Cybersecurity shouldn’t be a one-and-done checklist. It’s a continual practice. Ask vendors how they approach ongoing risks: Do they conduct regular penetration testing? Is a formal incident response plan in place? How are teams trained on phishing threats and secure coding?

    If they follow a framework like the NIST CSF, that’s great. But also dig into how they apply it: What’s their track record for patching vulnerabilities or communicating breaches? A real commitment shows up in action, not just alignment.

    5. Data minimization and purpose limitations

    Trustworthy technology tools collect only what’s essential–and vendors can explain why they need it. If you ask, “Why do you collect this data point?” they should have a direct answer that ties back to functionality, not future marketing.

    Look for platforms that commit to never repurposing student data for behavioral ad targeting. Also, ask about deletion protocols: Can data be purged quickly and completely if requested? If not, it’s time to ask why.

    Laying the groundwork for a safer school year

    Cybersecurity doesn’t require a 10-person IT team or a massive budget. Every district, no matter the size, can take meaningful, manageable steps to reduce risk, establish guardrails, and build trust.

    Simple, actionable steps go a long way: Choose tools that are transparent about data use, use trusted frameworks and certifications as guideposts, and make cybersecurity training a regular part of staff development. Even small efforts, like a five-minute refresher on phishing during back-to-school sessions, can have an outsized impact on your district’s overall security posture.

    For schools operating without deep resources or internal expertise, this work is especially urgent–and entirely possible. It just requires knowing where to start.


  • Data, privacy, and cybersecurity in schools: A 2025 wake-up call

    In 2025, schools are sitting on more data than ever before. Student records, attendance, health information, behavioral logs, and digital footprints generated by edtech tools have turned K-12 institutions into data-rich environments. As artificial intelligence becomes a central part of the learning experience, these data streams are being processed in increasingly complex ways. But with this complexity comes a critical question: Are schools doing enough to protect that data?

    The answer, in many cases, is no.

    The rise of shadow AI

    According to CoSN’s May 2025 State of EdTech District Leadership report, a significant portion of districts, specifically 43 percent, lack formal policies or guidance for AI use. While 80 percent of districts have generative AI initiatives underway, this policy gap is a major concern. At the same time, Common Sense Media’s Teens, Trust and Technology in the Age of AI highlights that many teens have been misled by fake content and struggle to discern truth from misinformation, underscoring the broad adoption and potential risks of generative AI.

    This lack of visibility and control has led to the rise of what many experts call “shadow AI”: unapproved apps and browser extensions that process student inputs, store them indefinitely, or reuse them to train commercial models. These tools are often free, widely adopted, and nearly invisible to IT teams. Shadow AI expands the district’s digital footprint in ways that often escape policy enforcement, opening the door to data leakage and compliance violations. CoSN’s 2025 report specifically notes that “free tools that are downloaded in an ad hoc manner put district data at risk.”

    Data protection: The first pillar under pressure

    The U.S. Department of Education’s AI Toolkit for Schools urges districts to treat student data with the same care as medical or financial records. However, many AI tools used in classrooms today are not inherently FERPA-compliant and do not always disclose where or how student data is stored. Teachers experimenting with AI-generated lesson plans or feedback may unknowingly input student work into platforms that retain or share that data. In the absence of vendor transparency, there is no way to verify how long data is stored, whether it is shared with third parties, or how it might be reused. FERPA requires that if third-party vendors handle student data on behalf of the institution, they must comply with FERPA. This includes ensuring data is not used for unintended purposes or retained for AI training.

    Some tools, marketed as “free classroom assistants,” require login credentials tied to student emails or learning platforms. This creates additional risks if authentication mechanisms are not protected or monitored. Even widely-used generative tools may include language in their privacy policies allowing them to use uploaded content for system training or performance optimization.

    Data processing and the consent gap

    Generative AI models are trained on large datasets, and many free tools continue learning from user prompts. If a student pastes an essay or a teacher includes student identifiers in a prompt, that information could enter a commercial model’s training loop. This creates a scenario where data is being processed without explicit consent, potentially in violation of COPPA (Children’s Online Privacy Protection Act) and FERPA. While the FTC’s December 2023 update to the COPPA Rule did not codify school consent provisions, existing guidance still allows schools to consent to technology use on behalf of parents in educational contexts. However, the onus remains on schools to understand and manage these consent implications, especially with the rule’s new amendments becoming effective June 21, 2025, which strengthen protections and require separate parental consent for third-party disclosures for targeted advertising.

    Moreover, many educators and students are unaware of what constitutes “personally identifiable information” (PII) in these contexts. A name combined with a school ID number, disability status, or even a writing sample could easily identify a student, especially in small districts. Without proper training, well-intentioned AI use can cross legal lines unknowingly.

    Cybersecurity risks multiply

    AI tools have also increased the attack surface of K-12 networks. According to ThreatDown’s 2024 State of Ransomware in Education report, ransomware attacks on K-12 schools increased by 92 percent between 2022 and 2023, with 98 total attacks in 2023. This trend is projected to continue as cybercriminals use AI to create more targeted phishing campaigns and detect system vulnerabilities faster. AI-assisted attacks can mimic human language and tone, making them harder to detect. Some attackers now use large language models to craft personalized emails that appear to come from school administrators.

    Many schools lack endpoint protection for student devices, and third-party integrations often bypass internal firewalls. Free AI browser extensions may collect keystrokes or enable unauthorized access to browser sessions. The more tools that are introduced without IT oversight, the harder it becomes to isolate and contain incidents when they occur. CoSN’s 2025 report indicates that 60 percent of edtech leaders are “very concerned about AI-enabled cyberattacks,” yet 61 percent still rely on general funds for cybersecurity efforts, not dedicated funding.

    Building a responsible framework

    To mitigate these risks, school leaders need to:

    • Audit tool usage using platforms like Lightspeed Digital Insight to identify AI tools being accessed without approval. Districts should maintain a living inventory of all digital tools. Lightspeed Digital Insight, for example, is vetted by 1EdTech for data privacy.
    • Develop and publish AI use policies that clarify acceptable practices, define data handling expectations, and outline consequences for misuse. Policies should distinguish between tools approved for instructional use and those requiring further evaluation.
    • Train educators and students to understand how AI tools collect and process data, how to interpret AI outputs critically, and how to avoid inputting sensitive information. AI literacy should be embedded in digital citizenship curricula, with resources available from organizations like Common Sense Media and aiEDU.
    • Vet all third-party apps through standards like the 1EdTech TrustEd Apps program. Contracts should specify data deletion timelines and limit secondary data use. The TrustEd Apps program has vetted over 12,000 products, providing a valuable resource for districts.
    • Simulate phishing attacks and test breach response protocols regularly. Cybersecurity training should be required for staff, and recovery plans must be reviewed annually.

    Trust starts with transparency

    In the rush to embrace AI, schools must not lose sight of their responsibility to protect students’ data and privacy. Transparency with parents, clarity for educators, and secure digital infrastructure are not optional. They are the baseline for trust in the age of algorithmic learning.

    AI can support personalized learning, but only if we put safety and privacy first. The time to act is now. Districts that move early to build policies, offer training, and coordinate oversight will be better prepared to lead AI adoption with confidence and care.


  • Digital learning is different

    In the animated film Up, the character Dug is a talking dog with an interesting mannerism. Each time he sees a movement off to the side, he stops whatever he is doing, stares off in that direction and shouts, “Squirrel!” I feel that this is a perfect representation of how schools often deal with new and emerging technologies. They can be working hard to provide the best instruction for their students but become immediately distracted anytime a new technology is introduced.

    From the internet and computers to cell phones and artificial intelligence, schools continue to invest a lot of time and money into figuring out how best to use these new technologies. Overall, schools have done a good job adapting to the numerous digital tools introduced in classrooms and offices–and often, these tools are introduced as standalone initiatives. Why do school districts feel the need to ‘reinvent the wheel’ every time a new technology is released? Instead of looking at each new technology as a tool that must be integrated in the curriculum, why not determine what is missing from current instruction and identify what prevents integration from occurring naturally?

    Schools need to recognize that it is not just learning how to use these new digital tools that is important. They must learn how to interpret and use the incredible variety of resources that accompany these tools–resources that provide perspectives that students would never have access to when using physical resources.

    Digital is different

    For centuries, learning material has come from a variety of physical resources. These include human-made items (e.g., textbooks, documents, paintings, audio recordings, and movies) as well as one of the most commonly used physical resources: teachers. In traditional instruction, teachers spend a great deal of class time teaching students information from these physical resources. But the physical nature of these resources limits their availability to students. To ensure that students have long-term access to the information provided by these physical resources, most traditional instruction emphasizes memorization, summarizing, and note-taking.

    With digital resources, students can access information at any time from anywhere, which means learning how to retain information is less important than learning how to effectively find credible information. The authenticity of the information is important because the same tools that are used to access digital resources can just as easily be used to create new digital resources. This means there is a lot of misinformation available online, often consisting of nothing more than personal opinions. Students need to not only be able to search for information online, but they also need to be able to verify the authenticity of online information. The ability to identify misleading or false information is a skill that will benefit them in their personal and academic lives.

    Learning

    While it is fairly easy to find information online, especially with the inclusion of AI in search engines, there are some search techniques that will reduce the amount of misinformation found in simple search requests. By teaching students how to refine their searches and discussing the impact of these search skills, students will be more discerning when it comes to reviewing search results. They need to be aware that the most helpful sites do not always appear at the top of the search list. Some sites are sponsored and thus automatically placed at the beginning of the search list. Other sites will tweak their web search parameters to ensure a higher priority in the search list.  A better understanding of how online searching works will result in more effective searches. 

    Once information is found, the authenticity of the resource and the information itself needs to be established. Fortunately, there are standard practices that can be utilized to teach verification. In the early 2000s, a popular checklist method called CRAAP (Currency [timeliness], Relevance, Authority, Accuracy, Purpose) emerged. While this method was effective in evaluating the authenticity of a website, it did not ensure the accuracy of the information on the website. In 2019, the SIFT (Stop, Investigate the source, Find better coverage, Trace claims to the original context) methodology was introduced. This methodology focuses on determining whether online content is credible. These are not the only tools available to teachers. Librarians and media specialists are a good place to start when determining age-appropriate lessons and material to teach verification.

    Students need to have access to some high-quality digital resources starting in elementary school. Teaching website verification at an early age will help students understand, from the beginning, that there is a lot of misinformation available online. At the same time, schools need to ensure that they provide access to digital resources that are age appropriate. Today’s network technology provides many ways for schools to monitor and control what information or sites are available to students at different grade levels. While these network tools are effective, they should be used in conjunction with well-trained teachers who understand how to safely navigate digital resources and students who are expected to practice responsible internet behavior. Introducing a select number of digital resources in elementary classes is the first step toward creating discerning researchers who will gain the ability to effectively judge a website’s appropriateness and usefulness.

    Teaching

    In order to create opportunities for students to experience learning with digital resources, instructional practices need to be less reliant on teacher-directed instruction. The use of physical resources requires the teacher to be the primary distributor of the information. Typically, this is done through lecture or whole-class presentations. With digital resources, students have direct access to the information, so whole-class distribution is not necessary. Instead, instructional practices need to provide lessons that emphasize finding and verifying information, which can be done by shifting to a learner-centered instructional model. In a learner-centered lesson, the onus falls on the student to determine what information is needed, and if the found information is credible for a given task. The class time that previously would have been spent on lecture becomes time for students to practice finding and authenticating online information. Initially, these learning experiences would be designed as guided practice for finding specific information. As students become more proficient with their search skills, the lesson can shift toward project-based lessons.

    Project-based lessons will help students learn how to apply the information they find, as well as determine what unknown information they need to complete the work. Unlike lesson design for practicing information searching and verification, project-based lessons provide opportunities for students to decide what information is needed and how best to use it. Instead of directing the student’s information-gathering, the teacher provides guidance to ensure they are accessing information that will allow the students to complete the project.

    This shift in instruction does not necessarily mean there will be a significant curricular change. The curricular content will remain the same, but the resources could be different. Because students control what resources they use, it is possible that they could find resources different from the ones specified in the curriculum. Teachers will need to be aware of the resources students are using and may have to spend time checking the credibility of the resource. Given the varying formats (text, audio, video, graphic) available with digital resources, students will be able to determine which format(s) best supports their learning style. Because most digital tools utilize the same digital resources and formats, teaching students how to learn with digital resources will prepare them for adapting to the next new digital tool. It is simply a matter of learning how to use the tool–after all, they already know how to use the resource.

    When creating units of study, teachers should consider the type of resources students will be using. To simplify matters, some units should be designed to utilize digital resources only and include lessons that teach students how to find and verify information. Students still need to develop skills to work with physical resources as well. It may be helpful to start off with units that utilize only physical or digital resources. That way teachers can focus on the specific skills needed for each type of resource. As students gain proficiency with these skills, they will learn to use the appropriate skills for the given resources.

    The amount of information available to the public today is staggering. Unfortunately, too much of it is unverified and even purposely misleading. Trying to stop misinformation from being created and distributed is not realistic. But teaching students how to validate online information can make the distribution of and exposure to misinformation much less impactful. The open nature of the internet allows for many divergent opinions and perspectives. We need to ensure that when students graduate, they have the skills necessary to determine the authenticity of online information and to be able to determine its merit.

    Teaching and learning with digital resources is different, and traditional instruction does not meet the learning needs of today’s students. Giving students the opportunity to master learning with digital resources will prepare them for the next technology “squirrel” and will enable them to determine how best to use it on their own.


  • 8 under-the-radar digital learning resources

    Digital learning resources are transforming classrooms, and educators are always on the lookout for tools that go beyond the standard platforms. There are numerous lesser-known digital platforms that offer unique, high-quality learning experiences tailored to students’ and teachers’ needs.

    Here are eight standout resources that can enhance instruction, boost engagement, and support deeper learning.

    1. CurrikiStudio

    Subject areas: All subjects
    Best for: Interactive learning content creation

    CurrikiStudio is a free, open-source platform that allows teachers to design interactive learning experiences without needing coding skills. Educators can create multimedia lessons, games, and assessments tailored to their curriculum. It’s ideal for flipped classrooms, project-based learning, or blended learning environments.

    2. InqITS (Inquiry Intelligent Tutoring System)

    Subject areas: Science
    Best for: Developing scientific inquiry skills

    InqITS offers virtual science labs where students can conduct experiments, analyze results, and receive real-time feedback. The platform uses AI to assess student performance and provide just-in-time support, making it a great tool for teaching scientific practices and critical thinking aligned with NGSS.

    3. Parlay

    Subject areas: ELA, Social Studies, Science
    Best for: Structured online and in-class discussions

    Parlay enables educators to facilitate student discussions in a more inclusive and data-informed way. With written and live discussion formats, students can express their ideas while teachers track participation, collaboration, and the quality of responses. It’s an excellent tool for fostering critical thinking, debate, and reflective dialogue.

    4. Geoguessr EDU

    Subject areas: Geography, History, Global Studies
    Best for: Geospatial learning and global awareness

    Geoguessr EDU is an educational version of the popular game that drops players into a random location via Google Street View. Students use context clues to determine where they are, building skills in geography, culture, and critical observation. The EDU version allows teachers to control content and track student progress.

    5. Mosa Mack Science

    Subject areas: Science
    Best for: Middle school science with an inquiry-based approach

    Mosa Mack offers animated science mysteries that prompt students to explore real-world problems through investigation and collaboration. With built-in differentiation, hands-on labs, and assessments, it’s a rich resource for schools seeking engaging science content that supports NGSS-aligned inquiry and critical thinking.

    6. Listenwise

    Subject areas: ELA, Social Studies, Science
    Best for: Listening comprehension and current events

    Listenwise curates high-quality audio stories from public radio and other reputable sources, paired with interactive transcripts and comprehension questions. It helps students build listening skills while learning about current events, science topics, and historical moments. It’s especially helpful for English learners and auditory learners.

    7. Mind Over Media

    Subject areas: Media Literacy, Social Studies
    Best for: Analyzing propaganda and media messages

    Created by media literacy expert Renee Hobbs, Mind Over Media teaches students to critically analyze modern propaganda in advertising, news, social media, and political content. Through guided analysis and opportunities to submit their own examples, students build essential digital citizenship and media literacy skills.

    8. Brilliant

    Subject areas: Math, Science, Computer Science
    Best for: Problem-solving and conceptual learning

    Brilliant.org offers interactive lessons and puzzles that teach students how to think logically and apply concepts rather than simply memorize formulas. With content tailored for advanced middle schoolers and high school students, it’s ideal for enrichment, gifted learners, or students seeking challenge and depth in STEM topics.

    Each of these digital learning tools brings something unique to the table–whether it’s fostering deeper discussion, building scientific inquiry skills, or promoting digital literacy.

    As schools look to personalize learning and prepare students for a complex, fast-evolving world, these lesser-known platforms provide meaningful ways to deepen engagement and understanding across subjects.

    By incorporating these tools into your classroom, you not only diversify your digital toolkit but also give students access to a wider range of learning modalities and real-world applications. Whether you’re looking for curriculum support, project-based tools, or enrichment resources, there’s a good chance one of these platforms can help meet your goals.


  • 10 (and counting…) Google goodies for your classroom

    Google enthusiasts, unite.

    During an ISTELive 25 session, Dr. Wanda Terral, chief of technology for Tennessee’s Lakeland School System, took attendees through a growing list of Google tools, along with some non-Google resources, to boost classroom creativity, productivity, and collaboration.

    Here are just 10 of the resources Terral covered–explore the full list for more ideas and resources to increase your Google knowledge.

  • Unibuddy launches AI tool to boost student engagement

    Unibuddy, a higher education peer-to-peer engagement platform, has officially launched Assistant – an AI tool designed to support large-scale, authentic student-led conversations.

    Following a successful beta phase, the tool is now fully live with 30 institutions worldwide and delivering impressive results: tripling student engagement, cutting staff workload significantly, and maintaining over 95% accuracy.

    As universities face increasing pressure from tighter budgets and rising student expectations, Unibuddy said its Assistant tool offers a powerful solution to scale meaningful engagement efficiently, combining the speed of AI with the authenticity of real student voices.

    • 65,000 unique students have used Assistant
    • 100,000+ student questions answered automatically without requiring manual intervention
    • 125% increase in students having conversations
    • 60% increase in lead capture
    • 5 hours saved per day for university staff

    “Today’s students demand instant, authentic and trustworthy communication,” said Diego Fanara, CEO at Unibuddy. “Unibuddy Assistant is the first and only solution that fuses the speed of AI with the credibility of peer-to-peer guidance – giving institutions a scalable way to meet expectations without sacrificing quality or trust.”

    Unibuddy has partnered with more than 600 institutions globally and has supported over 3,000,000 prospective students through the platform. As part of this extensive network, it regularly conducts surveys to uncover fresh insights. Although chatbots are now common in higher education, survey findings highlight key limitations in their effectiveness:

    • 84% of students said that university responses were too slow (Unibuddy Survey, 2025)
    • 79% of students said it was important that universities balance AI automation (for speed) and human interaction (for depth) while supporting them as they navigate the decision-making process (Unibuddy Survey, 2025)
    • 51% of students say they wouldn’t trust a chatbot to answer questions about the student experience (Unibuddy Survey, 2024)
    • 78% say talking to a current student is helpful — making them 3.5x more likely to trust a peer than a bot (Unibuddy Survey, 2025)
    • Only 14% of students felt engaged by the universities they applied to (Unibuddy Survey, 2025)

    Unibuddy says these findings have shaped its offering: using AI to handle routine questions and highlight valuable information, while smoothly handing off to peer or staff conversations when a personal, human connection is needed.

    Buckinghamshire New University used Unibuddy Assistant to transform early-stage engagement – generating 800,000 impressions, 30,000 clickthroughs, and 10,000+ student conversations in just six months. The university saved over 2,000 staff hours and saw 3,000 referrals to students or staff. 

    Meanwhile the University of South Florida Muma College of Business reported over 30 staff hours saved per month, with a 59% click-to-conversation rate and over a third of chats in Assistant resulting in referrals to student ambassador conversations. 

    And the University of East Anglia deployed Assistant across more than 100 web pages as part of its deployment of the full Unibuddy product suite for peer-to-peer chat, with student-led content contributing to a 62% offer-to-student conversion rate, compared with 34% for those who didn’t engage with Unibuddy.
