Category: AI in Education

  • Why agentic AI matters now more than ever

    For years, the promise of AI in education has centered on efficiency: grading faster, recommending better content, or predicting where a student might struggle.

    But at a moment when learners face disconnection, systems are strained, and expectations for personalization are growing, task automation feels…insufficient.

    What if we started thinking less about what AI can do and more about how it can relate?

    That’s where agentic AI comes in. These systems don’t just answer questions. They recognize emotion, learn from context, and respond in ways that feel more thoughtful than transactional. Less machine, more mentor.

    So, what’s the problem with what we have now?

    It’s not that existing AI tools are bad. They’re just incomplete.

    Here’s where traditional AI systems tend to fall short:

    • NLP fine-tuning
       Improves the form of communication but doesn’t understand intent or depth.
    • Feedback loops
       Built to correct errors, not guide growth.
    • Static knowledge bases
       Easy to search but often outdated or contextually off.
    • Ethics and accessibility policies
       Written down but rarely embedded in daily workflows.
    • Multilingual expansion
       Translates words, not nuance or meaning across cultures.

    These systems might help learners stay afloat. They don’t help them go deeper.

    What would a more intelligent system look like?

    It wouldn’t just deliver facts or correct mistakes. A truly intelligent learning system would:

    • Understand when a student is confused or disengaged
    • Ask guiding questions instead of giving quick answers
    • Retrieve current, relevant knowledge instead of relying on a static script
    • Honor a learner’s pace, background, and context
    • Operate with ethical boundaries and accessibility in mind–not as an add-on, but as a foundation

    In short, it would feel less like a tool and more like a companion. That may sound idealistic, but maybe idealism is what we need.

    The tools that might get us there

    There’s no shortage of frameworks being built right now–some for developers, others for educators and designers. They’re not perfect. But they’re good places to start.

    Framework           Type       Use
    LangChain           Code       Modular agent workflows, RAG pipelines
    Auto-GPT            Code       Task execution with memory and recursion
    CrewAI              Code       Multi-agent orchestration
    SPADE               Code       Agent messaging and task scheduling
    Zapier + OpenAI     No-code    Automated workflows with language models
    Flowise AI          No-code    Visual builder for agent chains
    Power Automate AI   Low-code   AI in business process automation
    Bubble + OpenAI     No-code    Build custom web apps with LLMs

    These tools are modular, experimental, and still evolving. But they open a door to building systems that learn and adjust–without needing a PhD in AI to use them.
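
    To make that concrete, here is a minimal sketch of an agentic tutoring loop in plain Python: retrieve a bit of context, then respond with a guiding question rather than a direct answer. It is not tied to any framework in the table above; the llm() function and the tiny in-memory knowledge base are placeholders standing in for a real model call and a real retrieval pipeline.

        # Minimal sketch of an agentic tutoring loop: retrieve context, then respond
        # with a guiding question rather than a direct answer. llm() and the tiny
        # "knowledge base" are placeholders, not a real framework or API.

        from dataclasses import dataclass, field

        KNOWLEDGE_BASE = {
            "photosynthesis": "Plants convert light, water, and CO2 into glucose and oxygen.",
            "fractions": "A fraction names part of a whole, written as numerator/denominator.",
        }

        def retrieve(question: str) -> str:
            """Naive keyword lookup standing in for a vector store or RAG pipeline."""
            for topic, fact in KNOWLEDGE_BASE.items():
                if topic in question.lower():
                    return fact
            return ""

        def llm(prompt: str) -> str:
            """Placeholder for a real language-model call (e.g., an API request)."""
            return f"[model response to: {prompt[:60]}...]"

        @dataclass
        class TutorAgent:
            memory: list = field(default_factory=list)  # running conversation history

            def respond(self, question: str) -> str:
                context = retrieve(question)
                prompt = (
                    "You are a tutor. Using the context below, ask ONE guiding question "
                    "that helps the student reason toward the answer.\n"
                    f"Context: {context}\nStudent: {question}"
                )
                reply = llm(prompt)
                self.memory.append((question, reply))  # remember the exchange for pacing
                return reply

        agent = TutorAgent()
        print(agent.respond("I don't get photosynthesis at all."))

    In practice, a framework like LangChain or CrewAI would supply the memory, retrieval, and orchestration pieces that are stubbed out here.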

    A better system starts with a better architecture

    Here’s one way to think about an intelligent system’s structure:

    Learning experience layer

    • Where students interact, ask questions, get feedback
    • Ideally supports multilingual input, emotional cues, and accessible design

    Agentic AI core

    • The “thinking” layer that plans, remembers, retrieves, and reasons
    • Coordinates multiple agents (e.g., retrieval, planning, feedback, sentiment)

    Enterprise systems layer

    • Connects with existing infrastructure: SIS, LMS, content repositories, analytics systems

    This isn’t futuristic. It’s already possible to prototype parts of this model with today’s tools, especially in contained or pilot environments.
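
    As a rough illustration of that layering, the sketch below shows an agentic core coordinating a few specialized agents behind a single entry point. Every class, rule, and message here is an invented placeholder rather than part of any product; a real sentiment agent, for example, would use a trained classifier instead of a keyword check.

        # Rough sketch of the three-layer structure: the learning experience layer
        # calls into an agentic core, which coordinates specialized agents.
        # All names and rules are illustrative placeholders.

        class SentimentAgent:
            def assess(self, message: str) -> str:
                # A real system would use a trained classifier; this keys off one cue.
                return "frustrated" if "stuck" in message.lower() else "neutral"

        class RetrievalAgent:
            def fetch(self, message: str) -> str:
                # Stand-in for querying an LMS content repository or vector store.
                return "relevant course material for: " + message

        class FeedbackAgent:
            def compose(self, material: str, mood: str) -> str:
                tone = "encouraging" if mood == "frustrated" else "direct"
                return f"({tone}) Let's look at this together: {material}"

        class AgenticCore:
            """The 'thinking' layer: decides which agents to invoke and in what order."""
            def __init__(self):
                self.sentiment = SentimentAgent()
                self.retrieval = RetrievalAgent()
                self.feedback = FeedbackAgent()

            def handle(self, student_message: str) -> str:
                mood = self.sentiment.assess(student_message)     # emotional cue
                material = self.retrieval.fetch(student_message)  # current knowledge
                return self.feedback.compose(material, mood)      # tailored response

        # The learning experience layer is the only part the student ever sees.
        core = AgenticCore()
        print(core.handle("I'm stuck on quadratic equations."))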

    So, what would it actually do for people?

    For students:

    • Offer guidance in moments of uncertainty
    • Help pace learning, not just accelerate it
    • Present relevant content, not just more content

    For teachers:

    • Offer insight into where learners are emotionally and cognitively
    • Surface patterns or blind spots without extra grading load

    For administrators:

    • Enable guardrails around AI behavior
    • Support personalization at scale without losing oversight

    None of this replaces people. It just gives them better support systems.

    Final thoughts: Less control panel, more compass

    There’s something timely about rethinking what we mean by intelligence in our learning systems.

    It’s not just about logic or retrieval speed. It’s about how systems make learners feel–and whether those systems help learners grow, question, and persist.

    Agentic AI is one way to design with those goals in mind. It’s not the only way. But it’s a start.

    And right now, a thoughtful start might be exactly what we need.

  • Empowering neurodiverse learners with AI-driven solutions

    A traditional classroom is like a symphony, where every student is handed the same sheet music and expected to play in perfect unison. But neurodiverse learners are not able to hear the same rhythm–or even the same notes. For them, learning can feel like trying to play an instrument that was never built for them. This is where AI-powered educational tools step in, not as a replacement for the teacher, but as a skilled accompanist, tuning into each learner’s individual tempo and helping them find their own melody.

    At its best, education should recognize and support the unique ways students absorb, process, and respond to information. For neurodiverse students–those with ADHD, dyslexia, autism spectrum disorder (ASD), and other learning differences–this need is especially acute. Traditional approaches often fail to accommodate their varied needs, leading to frustration, disengagement, and lost potential. But with advances in AI, we have the opportunity to reshape learning environments into inclusive spaces where all students can thrive.

    Crafting personalized learning paths

    AI’s strength lies in pattern recognition and personalization at scale. In education, this means AI can adapt content and delivery in real time based on how a student is interacting with a lesson. For neurodiverse learners who may need more repetition, multi-sensory engagement, or pacing adjustments, this adaptability is a game changer.

    For example, a child with ADHD may benefit from shorter, interactive modules that reward progress quickly, while a learner with dyslexia might receive visual and audio cues alongside text to reinforce comprehension. AI can dynamically adjust these elements based on observed learning patterns, making the experience feel intuitive rather than corrective.
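
    One way to picture that adaptability is a simple rule-based adjuster that shortens modules, raises reward frequency, or adds audio support based on observed interaction signals. The sketch below is deliberately simplified; the signal names and thresholds are invented for illustration, and real adaptive platforms rely on far richer models.

        # Simplified sketch of rule-based adaptation: adjust module length and
        # presentation from observed interaction signals. Field names and
        # thresholds are invented for illustration, not from any real product.

        from dataclasses import dataclass

        @dataclass
        class LearnerSignals:
            avg_time_on_task_sec: float   # how long the student stays engaged per item
            reread_rate: float            # fraction of passages reopened
            correct_streak: int           # consecutive correct answers

        def adapt_lesson(signals: LearnerSignals) -> dict:
            plan = {"module_length_min": 15, "audio_support": False, "reward_frequency": "normal"}

            # Short time-on-task: break content into smaller, more frequently rewarded chunks.
            if signals.avg_time_on_task_sec < 90:
                plan["module_length_min"] = 5
                plan["reward_frequency"] = "high"

            # Heavy rereading can signal decoding difficulty: pair audio with the text.
            if signals.reread_rate > 0.4:
                plan["audio_support"] = True

            # A long correct streak suggests the pace can pick up slightly.
            if signals.correct_streak >= 5:
                plan["module_length_min"] += 5

            return plan

        print(adapt_lesson(LearnerSignals(avg_time_on_task_sec=60, reread_rate=0.5, correct_streak=2)))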

    This level of personalization is difficult to achieve in traditional classrooms, where one teacher may be responsible for 20 or more students with diverse needs. AI doesn’t replace that teacher; it augments their ability to reach each student more effectively.

    Recent research supports this approach–a 2025 systematic review published in the EPRA International Journal of Multidisciplinary Research found that AI-powered adaptive learning systems significantly enhance accessibility and social-emotional development for students with conditions like autism, ADHD, and dyslexia.

    Equipping educators with real-time insights

    One of the most significant benefits of AI tools for neurodiverse learners is the data they generate–not just for students, but for educators. These systems can provide real-time dashboards indicating which students are struggling, where they’re excelling, and how their engagement levels fluctuate over time. For a teacher managing multiple neurodiverse learners, these insights are crucial. Rather than relying on periodic assessments or observations, educators can intervene early, adjusting lesson plans, offering additional resources, or simply recognizing when a student needs a break.

    Imagine a teacher noticing that a student with ASD consistently disengages during word problems but thrives in visual storytelling tasks. AI can surface these patterns quickly and suggest alternatives that align with the student’s strengths, enabling faster, more informed decisions that support learning continuity.

    Success stories from the classroom

    Across the U.S., school districts are beginning to see the tangible benefits of AI-powered tools for neurodiverse learners. For instance, Humble Independent School District in Texas adopted an AI-driven tool called Ucnlearn to manage its expanding dyslexia intervention programs. The platform streamlines progress monitoring and generates detailed reports using AI, helping interventionists provide timely, personalized support to students. Since its rollout, educators have been able to handle growing caseloads more efficiently, with improved tracking of student outcomes.

    Meanwhile, Houston Independent School District partnered with an AI company to develop reading passages tailored to individual student levels and classroom goals. These passages are algorithmically aligned to Texas curriculum standards, offering engaging and relevant reading material to students, including those with dyslexia and other learning differences, at just the right level of challenge.

    The future of neurodiverse education

    The promise of AI in education goes beyond improved test scores or sleek digital interfaces; it’s about advancing equity. True inclusion means providing every student with tools that align with how they best learn. This could be gamified lessons that minimize cognitive overload, voice-assisted content to reduce reading anxiety, or real-time emotional feedback to help manage frustration. Looking ahead, AI-driven platforms could even support early identification of undiagnosed learning differences by detecting subtle patterns in student interactions, offering a new frontier for timely and personalized intervention.

    Still, AI is not a silver bullet. Its impact depends on thoughtful integration into curricula, alignment with proven pedagogical goals, and ongoing evaluation of its effectiveness. To be truly inclusive, these tools must be co-designed with input from both neurodiverse learners and the educators who work with them. The score is not yet finished; we are still composing. Technology’s real legacy in education will not be in algorithms or interfaces, but in the meaningful opportunities it creates for every student to thrive.

  • Students learn the basics of AI as they weigh its use in their future careers

    This story was originally published by Chalkbeat. Sign up for their newsletters at ckbe.at/newsletters.

    On a recent Thursday morning, Michael Taubman asked his class of seniors at North Star Academy’s Washington Park High School: “What do you think AI’s role should be in your future career?”

    “In school, like how we use AI as a tool and we don’t use it to cheat on our work … that’s how it should be, like an assistant,” said Amirah Falana, a 17-year-old interested in a career in real estate law.

    Fernando Infante, an aspiring software developer, agreed that AI should be a tool to “provide suggestions” and inform the work.

    “It’s like having AI as a partner rather than it doing the work,” said Infante during class.

    Falana and Infante are students in Taubman’s class, The Summit, a yearlong program offered to 93 seniors this year and expanding to juniors next year. The program includes a 10-week AI course developed by Taubman and Stanford University.

    As part of the course, students use artificial intelligence tools – often viewed in a negative light due to privacy and other technical concerns – to explore their career interests and better understand how technology could shape the workforce. The class is also timely, as 92% of companies plan to invest in more AI over the next three years, according to a report by global consulting firm McKinsey and Company.

    The lessons provide students with hands-on exercises to better understand how AI works and how they can use it in their daily lives. They are also designed so teachers across subject areas can include them as part of their courses and help high school students earn a Google Career Certificate for AI Essentials, which introduces AI and teaches the basics of using AI tools.

    Students like Infante have used the AI and coding skills they learned in class to create their own apps, while others have used them to create school surveys and spark new thoughts about their future careers. Taubman says the goal is to also give students agency over AI so they can embrace technological changes and remain competitive in the workforce.

    “One of the key things for young people right now is to make sure they understand that this technology is not inevitable,” Taubman told Chalkbeat last month. “People made this, people are making decisions about it, and there are pros and cons like with everything people make and we should be talking about this.”

    Students need to know the basics of AI, experts say

    As Generation Z, those born between 1997 and 2012, graduate high school and enter a workforce where AI is new, many are wondering how the technology will be used and to what extent.

    Nearly half of Gen Z students polled by The Walton Family Foundation and Gallup said they use AI weekly, according to the newly released survey exploring how youth view AI. (The Walton Family Foundation is a supporter of Chalkbeat. See our funders list here.) The same poll found that over 4 in 10 Gen Z students believe they will need to know AI in their future careers, and over half believe schools should be required to teach them how to use it.

    This school year, Newark Public Schools students began using Khan Academy’s AI chatbot tutor called Khanmigo, which the district launched as a pilot program last year. Some Newark teachers reported that the tutoring tool was helpful in the classroom, but the district has not released data on whether it helped raise student performance and test scores. The district in 2024 also launched its multimillion-dollar project to install AI cameras across school buildings in an attempt to keep students safe.

    But more than just using AI in school, students want to feel prepared to use it after graduating high school. Nearly 3 in 4 college students said their colleges or universities should be preparing them for AI in the workplace, according to a survey from Inside Higher Ed and College Pulse’s Student Voice series.

    Many of the challenges of using AI in education center on the type of learning approach used, accuracy, and building trust with the technology, said Nhon Ma, CEO of Numerade – an online learning assistant that uses AI and educators to help students learn STEM concepts. But that’s why it’s important to immerse students in AI to help them understand the ways it could be used and when to spot issues, Ma added.

    “We want to prepare our youth for this competitive world stage, especially on the technological front so they can build their own competence and confidence in their future paths. That could potentially lead towards higher earnings for them too,” Ma said.

    For Infante, the senior in Taubman’s class, AI has helped spark a love for computer science and deepened his understanding of coding. He used it to create an app that tracks personal milestones and goals and awards users with badges once they reach them. As an aspiring software developer, he feels he has an advantage over other students because he’s learning about AI in high school.

    Taubman also says it’s important for students to understand how quickly the technology is advancing, especially for those like Infante looking toward a career in technology.

    “I think it’s really important to help young people grapple with how this is new, but unlike other big new things, the pace is very fast, and the implications for career are almost immediate in a lot of cases,” Taubman added.

    Students learn that human emotions are important as AI grows

    It’s also important to remember the limitations of AI, Taubman said, noting that students need the basic understanding of how AI works in order to question it, identify any mistakes, and use it accordingly in their careers.

    “I don’t want students to lose out on an internship or job because someone else knows how to use AI better than they do, but what I really want is for students to get the internship or the job because they’re skillful with AI,” Taubman said.

    Through Taubman’s class, students are also identifying how AI increases the demand for skills that require human emotion, such as empathy and ethics.

    Daniel Akinyele, a 17-year-old senior, said he was interested in a career in industrial and organizational psychology, which focuses on human behavior in the workplace.

    During Taubman’s class, he used a custom AI tool on his laptop to explore different scenarios where he could use AI in his career. Many involved talking to someone about their feelings or listening to vocal cues that might indicate a person is sad or angry. Ultimately, psychology is a career about human connection and “that’s where I come into play,” Akinyele said.

    “I’m human, so I would understand how people are feeling, like the emotion that AI doesn’t see in people’s faces, I would see it and understand it,” Akinyele added.

    Falana, the aspiring real estate attorney, also used the custom AI tool to consider how much she should rely on AI when writing legal documents. Similar to writing essays in schools, Falana said professionals should use their original writing in their work but AI could serve as a launching pad.

    “I feel like the legal field should definitely put regulations on AI use, like we shouldn’t be able to draw up our entire case using AI,” Falana said.

    During Taubman’s class, students also discussed fake images and videos created by AI. Infante, who wants to be a software developer, added that he plans to use AI regularly on the job but believes it should also be regulated to limit disinformation online.

    Taubman says it’s important for students to have a healthy level of skepticism when it comes to new technologies. He encourages students to think about how AI generates images, the larger questions around copyright infringement, and how these systems are trained.

    “We really want them to feel like they have agency in this world, both their capacity to use these systems,” Taubman said, “but also to ask these broader questions about how they were designed.”

    Chalkbeat is a nonprofit news site covering educational change in public schools.

  • Helping students evaluate AI-generated content

    Finding accurate information has long been a cornerstone skill of librarianship and classroom research instruction. When cleaning up some materials on a backup drive, I came across an article I wrote for the September/October 1997 issue of Book Report, a journal directed to secondary school librarians. A generation ago, “asking the librarian” was a typical and often necessary part of a student’s research process. The digital tide has swept in new tools, habits, and expectations. Today’s students rarely line up at the reference desk. Instead, they consult their phones, generative AI bots, and smart search engines that promise answers in seconds. However, educators still need to teach students to be critical consumers of information, whether produced by humans or generated by AI tools.

    Teachers haven’t stopped assigning projects on wolves, genetic engineering, drug abuse, or the Harlem Renaissance, but the way students approach those assignments has changed dramatically. They no longer just “surf the web.” Now, they engage with systems that summarize, synthesize, and even generate research responses in real time.

    In 1997, a keyword search might yield a quirky mix of werewolves, punk bands, and obscure town names alongside academic content. Today, a student may receive a paragraph-long summary, complete with citations, created by a generative AI tool trained on billions of documents. To an eighth grader, if the answer looks polished and is labeled “AI-generated,” it must be true. Students must be taught how AI can hallucinate or simply be wrong at times.

    This presents new challenges, and opportunities, for K-12 educators and librarians in helping students evaluate the validity, purpose, and ethics of the information they encounter. The stakes are higher. The tools are smarter. The educator’s role is more important than ever.

    Teaching the new core four

    To help students become critical consumers of information, educators must still emphasize four essential evaluative criteria, but these must now be framed in the context of AI-generated content and advanced search systems.

    1. The purpose of the information (and the algorithm behind it)

    Students must learn to question not just why a source was created, but why it was shown to them. Is the site, snippet, or AI summary trying to inform, sell, persuade, or entertain? Was it prioritized by an algorithm tuned for clicks or accuracy?

    A modern extension of this conversation includes:

    • Was the response written or summarized by a generative AI tool?
    • Was the site boosted due to paid promotion or engagement metrics?
    • Does the tool used (e.g., ChatGPT, Claude, Perplexity, or Google’s Gemini) cite sources, and can those be verified?

    Understanding both the purpose of the content and the function of the tool retrieving it is now a dual responsibility.

    2. The credibility of the author (and the credibility of the model)

    Students still need to ask: Who created this content? Are they an expert? Do they cite reliable sources? They must also ask:

    • Is this original content or AI-generated text?
    • If it’s from an AI, what sources was it trained on?
    • What biases may be embedded in the model itself?

    Today’s research often begins with a chatbot that cannot cite its sources or verify the truth of its outputs. That makes teaching students to trace information to original sources even more essential.

    3. The currency of the information (and its training data)

    Students still need to check when something was written or last updated. However, in the AI era, students must understand the cutoff dates of training datasets and whether search tools are connected to real-time information. For example:

    • ChatGPT’s free version (as of early 2025) may only contain information up to mid-2023.
    • A deep search tool might include academic preprints from 2024, but not peer-reviewed journal articles published yesterday.
    • Most tools do not include historical material that survives only in manuscript form–even when it has been scanned into a digital format, it may not yet be usable as searchable data.

    This time gap matters, especially for fast-changing topics like public health, technology, or current events.
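
    As a toy illustration of that gap, the short sketch below flags topics whose subject matter postdates an assumed training cutoff. The cutoff date is an assumption chosen for illustration only; actual cutoffs vary by tool and version, and some tools can search the live web.

        # Toy illustration of the "currency" check: flag questions whose subject
        # matter postdates an assumed model training cutoff. The cutoff date is an
        # assumption for illustration only; real cutoffs vary by tool and version.

        from datetime import date

        ASSUMED_TRAINING_CUTOFF = date(2023, 6, 30)

        def needs_fresh_sources(topic_date: date) -> bool:
            """Return True when a topic likely postdates the model's training data."""
            return topic_date > ASSUMED_TRAINING_CUTOFF

        for topic, when in [("2022 midterm elections", date(2022, 11, 8)),
                            ("a 2025 public-health advisory", date(2025, 3, 1))]:
            if needs_fresh_sources(when):
                print(f"'{topic}': verify with current databases or news coverage.")
            else:
                print(f"'{topic}': likely within training data, but still worth cross-checking.")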

    4. The wording and framing of results

    The title of a website or academic article still matters, but now we must attend to the framing of AI summaries and search result snippets. Are search terms being refined, biased, or manipulated by algorithms to match popular phrasing? Is an AI paraphrasing a source in a way that distorts its meaning? Students must be taught to:

    • Compare summaries to full texts
    • Use advanced search features to control for relevance
    • Recognize tone, bias, and framing in both AI-generated and human-authored materials

    Beyond the internet: Print, databases, and librarians still matter

    It is more tempting than ever to rely solely on the internet, or now, on an AI chatbot, for answers. Just as in 1997, the best sources are not always the fastest or easiest to use.

    Finding the capital of India on ChatGPT may feel efficient, but cross-checking it in an almanac or reliable encyclopedia reinforces source triangulation. Similarly, viewing a photo of the first atomic bomb on a curated database like the National Archives provides more reliable context than pulling it from a random search result. With deepfake photographs proliferating across the internet, using a reputable image database is essential, and students must be taught how and where to find such resources.

    Additionally, teachers can encourage students to seek balance by using:

    • Print sources
    • Subscription-based academic databases
    • Digital repositories curated by librarians
    • Expert-verified AI research assistants like Elicit or Consensus

    One effective strategy is the continued use of research pathfinders that list sources across multiple formats: books, journals, curated websites, and trusted AI tools. Encouraging assignments that require diverse sources and source types helps to build research resilience.

    Internet-only assignments: Still a trap

    Then as now, it’s unwise to require students to use only specific sources, or only generative AI, for research. A well-rounded approach promotes information gathering from all potentially useful and reliable sources, as well as information fluency.

    Students must be taught to move beyond the first AI response or web result, so they build the essential skills in:

    • Deep reading
    • Source evaluation
    • Contextual comparison
    • Critical synthesis

    Teachers should avoid giving assignments that limit students to a single source type, especially AI. Instead, they should prompt students to explain why they selected a particular source, how they verified its claims, and what alternative viewpoints they encountered.

    Ethical AI use and academic integrity

    Generative AI tools introduce powerful possibilities, but also a new frontier of plagiarism and uncritical thinking. If a student submits a summary produced by ChatGPT without review or citation, have they truly learned anything? Do they even understand the content?

    To combat this, schools must:

    • Update academic integrity policies to address the use of generative AI, including clear direction to students on when and when not to use such tools
    • Teach citation standards for AI-generated content
    • Encourage original analysis and synthesis, not just copying and pasting answers

    A responsible prompt might be: “Use a generative AI tool to locate sources, but summarize their arguments in your own words, and cite them directly.”

    In closing: The librarian’s role is more critical than ever

    Today’s information landscape is more complex and powerful than ever, but more prone to automation errors, biases, and superficiality. Students need more than access; they need guidance. That is where the school librarian, media specialist, and digitally literate teacher must collaborate to ensure students are fully prepared for our data-rich world.

    While the tools have evolved, from card catalogs to Google searches to AI copilots, the fundamental need remains to teach students to ask good questions, evaluate what they find, and think deeply about what they believe. Some things haven’t changed–just like in 1997, the best advice to conclude a lesson on research remains, “And if you need help, ask a librarian.”

    Steven M. Baule, Ed.D., Ph.D.

  • White House order prioritizes AI in schools

    Key points:

    • The Trump administration is elevating AI programs in K-12 education

    A new executive order signed by President Trump takes aim at AI policies in K-12 education by “fostering interest and expertise in artificial intelligence (AI) technology from an early age to maintain America’s global dominance in this technological revolution for future generations.”