Tag: digital learning

  • Helping students evaluate AI-generated content

    Finding accurate information has long been a cornerstone skill of librarianship and classroom research instruction. When cleaning up some materials on a backup drive, I came across an article I wrote for the September/October 1997 issue of Book Report, a journal for secondary school librarians. A generation ago, “asking the librarian” was a typical and often necessary part of a student’s research process. Since then, the digital tide has swept in new tools, habits, and expectations. Today’s students rarely line up at the reference desk. Instead, they consult their phones, generative AI bots, and smart search engines that promise answers in seconds. Educators, however, still need to teach students to be critical consumers of information, whether it is produced by humans or generated by AI tools.

    Teachers haven’t stopped assigning projects on wolves, genetic engineering, drug abuse, or the Harlem Renaissance, but the way students approach those assignments has changed dramatically. They no longer just “surf the web.” Now, they engage with systems that summarize, synthesize, and even generate research responses in real time.

    In 1997, a keyword search might yield a quirky mix of werewolves, punk bands, and obscure town names alongside academic content. Today, a student may receive a paragraph-long summary, complete with citations, created by a generative AI tool trained on billions of documents. To an eighth grader, an answer that looks polished and is labeled “AI-generated” can seem authoritative, and therefore true. Students must be taught that AI can hallucinate or simply be wrong.

    This presents new challenges and opportunities for K-12 educators and librarians as they help students evaluate the validity, purpose, and ethics of the information they encounter. The stakes are higher. The tools are smarter. The educator’s role is more important than ever.

    Teaching the new core four

    To help students become critical consumers of information, educators must still emphasize four essential evaluative criteria, but these must now be framed in the context of AI-generated content and advanced search systems.

    1. The purpose of the information (and the algorithm behind it)

    Students must learn to question not just why a source was created, but why it was shown to them. Is the site, snippet, or AI summary trying to inform, sell, persuade, or entertain? Was it prioritized by an algorithm tuned for clicks or accuracy?

    A modern extension of this conversation includes:

    • Was the response written or summarized by a generative AI tool?
    • Was the site boosted due to paid promotion or engagement metrics?
    • Does the tool used (e.g., ChatGPT, Claude, Perplexity, or Google’s Gemini) cite sources, and can those be verified?

    Understanding both the purpose of the content and the function of the tool retrieving it is now a dual responsibility.

    2. The credibility of the author (and the credibility of the model)

    Students still need to ask: Who created this content? Are they an expert? Do they cite reliable sources? They must also ask:

    • Is this original content or AI-generated text?
    • If it’s from an AI, what sources was it trained on?
    • What biases may be embedded in the model itself?

    Today’s research often begins with a chatbot that cannot cite its sources or verify the truth of its outputs. That makes teaching students to trace information to original sources even more essential.

    3. The currency of the information (and its training data)

    Students still need to check when something was written or last updated. In the AI era, however, they must also understand the cutoff dates of training datasets and whether search tools are connected to real-time information. For example:

    • ChatGPT’s free version (as of early 2025) may only contain information up to mid-2023.
    • A deep search tool might include academic preprints from 2024, but not peer-reviewed journal articles published yesterday.
    • Most tools do not include historical material that has been digitized only as scanned manuscripts; it is available in a digital format, but not yet as fully usable data.

    This time gap matters, especially for fast-changing topics like public health, technology, or current events.

    4. The wording and framing of results

    The title of a website or academic article still matters, but now we must attend to the framing of AI summaries and search result snippets. Are search terms being refined, biased, or manipulated by algorithms to match popular phrasing? Is an AI paraphrasing a source in a way that distorts its meaning? Students must be taught to:

    • Compare summaries to full texts
    • Use advanced search features to control for relevance
    • Recognize tone, bias, and framing in both AI-generated and human-authored materials

    Beyond the internet: Print, databases, and librarians still matter

    It is more tempting than ever to rely solely on the internet, or now an AI chatbot, for answers. But just as in 1997, the best sources are not always the fastest or easiest to use.

    Finding the capital of India on ChatGPT may feel efficient, but cross-checking it in an almanac or reliable encyclopedia reinforces source triangulation. Similarly, viewing a photo of the first atomic bomb in a curated database like the National Archives provides more reliable context than pulling it from a random search result. With deepfake photographs proliferating across the internet, using a reputable image database is essential, and students must be taught how and where to find such resources.

    Additionally, teachers can encourage students to seek balance by using:

    • Print sources
    • Subscription-based academic databases
    • Digital repositories curated by librarians
    • Expert-verified AI research assistants like Elicit or Consensus

    One effective strategy is the continued use of research pathfinders that list sources across multiple formats: books, journals, curated websites, and trusted AI tools. Encouraging assignments that require diverse sources and source types helps to build research resilience.

    Internet-only assignments: Still a trap

    Then as now, it’s unwise to require students to use only specific sources, or only generative AI, for research. A well-rounded approach promotes information fluency and the gathering of information from all potentially useful and reliable sources.

    Students must be taught to move beyond the first AI response or web result, so they build the essential skills in:

    • Deep reading
    • Source evaluation
    • Contextual comparison
    • Critical synthesis

    Teachers should avoid giving assignments that limit students to a single source type, especially AI. Instead, they should prompt students to explain why they selected a particular source, how they verified its claims, and what alternative viewpoints they encountered.

    Ethical AI use and academic integrity

    Generative AI tools introduce powerful possibilities, including significant reductions in educator workload, as well as a new frontier of plagiarism and uncritical thinking. If a student submits a summary produced by ChatGPT without review or citation, have they truly learned anything? Do they even understand the content?

    To combat this, schools must:

    • Update academic integrity policies to address generative AI, including clear direction to students on when such tools may and may not be used
    • Teach citation standards for AI-generated content
    • Encourage original analysis and synthesis, not just copying and pasting answers

    A responsible prompt might be: “Use a generative AI tool to locate sources, but summarize their arguments in your own words, and cite them directly.”

    In closing: The librarian’s role is more critical than ever

    Today’s information landscape is more complex and powerful than ever, but it is also more prone to automation errors, biases, and superficiality. Students need more than access; they need guidance. That is where the school librarian, media specialist, and digitally literate teacher must collaborate to ensure students are fully prepared for our data-rich world.

    While the tools have evolved, from card catalogs to Google searches to AI copilots, the fundamental need remains the same: to teach students to ask good questions, evaluate what they find, and think deeply about what they believe. Some things haven’t changed. Just as in 1997, the best advice to conclude a lesson on research remains, “And if you need help, ask a librarian.”

    Steven M. Baule, Ed.D., Ph.D.

  • 5 AI tools for classroom creativity

    Key points:

    • AI tools enhance K-12 creativity and innovation through interactive projects

    As AI becomes more commonplace in classrooms, it gives students access to creative tools that enhance learning, exploration, and innovation. K-12 students can use AI tools in various ways to boost creativity through art, storytelling, music, coding, and more.


  • Report details uneven AI use among teachers, principals

    English/language arts and science teachers were almost twice as likely to say they use AI tools compared to math teachers or elementary teachers of all subjects, according to a February 2025 survey from the RAND Corporation that delves into uneven AI adoption in schools.

    “As AI tools and products for educational purposes become more prevalent, studies should track their use among educators. Researchers could identify the particular needs AI is addressing in schools and–potentially–guide the development of AI products that better meet those needs. In addition, data on educator use of AI could help policymakers and practitioners consider disparities in that use and implications for equitable, high-quality instruction across the United States,” note authors Julia H. Kaufman, Ashley Woo, Joshua Eagan, Sabrina Lee, and Emma B. Kassan.

    One-quarter of ELA, math, and science teachers used AI tools for instructional planning or teaching in the 2023–2024 school year. Nearly 60 percent of surveyed principals also reported using AI tools for their work in 2023–2024.

    Among the one-quarter of teachers nationally who reported using AI tools, 64 percent said that they used them for instructional planning only, whether for their ELA, math, or science instruction; only 11 percent said that they introduced them to students but did not do instructional planning with them; and 25 percent said that they did both.

    Although one-quarter of teachers overall reported using AI tools, the report’s authors observed differences in AI use by subject taught and some school characteristics. For instance, close to 40 percent of ELA or science teachers said they use AI, compared to 20 percent of general elementary education or math teachers. Teachers and principals in higher-poverty schools were less likely to report using AI tools relative to those in lower-poverty schools.

    Eighteen percent of principals reported that their schools or districts provided guidance on the use of AI by staff, teachers, or students. Yet, principals in the highest-poverty schools were about half as likely as principals in the lowest-poverty schools to report that guidance was provided (13 percent and 25 percent, respectively).

    Principals cited a lack of professional development for using AI tools or products (72 percent), concerns about data privacy (70 percent), and uncertainty about how AI can be used for their jobs (70 percent) as factors having a major or minor influence on their AI use.

    The report also offers recommendations for education stakeholders:

    1. All districts and schools should craft intentional strategies to support teachers’ AI use in ways that will most improve the quality of instruction and student learning.

    2. AI developers and decision-makers should consider what useful AI applications have the greatest potential to improve teaching and learning and how to make those applications available in high-poverty contexts.

    3. Researchers should work hand-in-hand with AI developers to study use cases and develop a body of evidence on effective AI applications for school leadership, teaching, and learning.

    Laura Ascione

  • 6 recommendations for AI in classrooms

    As states move forward with efforts to adopt artificial intelligence, the nonprofit Southern Regional Education Board’s Commission on AI in Education has released its first six recommendations for schools and postsecondary institutions.

    Because of the commission’s broad membership, regional breadth, early creation, and size, SREB President Stephen L. Pruitt said it is poised to produce critical recommendations that will inform not only Southern education decision makers but those throughout the nation.

    “AI is fundamentally changing the classroom and workplace,” Pruitt said. “With that in mind, this commission is working to ensure they make recommendations that are strategic, practical and thoughtful.”

    The commission is set to meet for another year and plans to release a second set of recommendations soon. Here are the first six:

    Policy recommendation #1: Establish state AI networks
    States should establish statewide artificial intelligence networks so people, groups and agencies can connect, communicate, collaborate and coordinate AI efforts across each state. These statewide networks could eventually form a regional group of statewide AI network representatives who could gather regularly to share challenges and successes.

    Policy recommendation #2: Develop targeted AI guidance
    States should develop and maintain targeted guidance for distinct groups using, integrating or supporting the use of AI in education. These groups should include, for example, elementary students, middle school students, high school students, postsecondary students, teachers, administrators, postsecondary faculty and administrators, and parents.

    Policy recommendation #3: Provide high-quality professional development
    State K-12 and postsecondary agencies should provide leadership by working with local districts and institutions to develop plans to provide and incentivize high-quality professional development for AI. The plans should aim to enhance student learning.

    Policy recommendation #4: Integrate into standards & curricula
    States should integrate into statewide K-12 standards and curricula the AI knowledge and skills students need to prepare them for success in the workforce.

    Policy recommendation #5: Assess local capacity and needs
    States should develop and conduct AI needs assessments across their states to determine the capacity of local districts, schools and postsecondary institutions to integrate AI successfully. These should be designed to help states determine which institution, district or school needs state support, what type of support and at what level. 

    Policy recommendation #6: Develop resource allocation plans
    States should develop detailed resource allocation plans for AI implementation in schools, school districts and institutions of postsecondary education to ensure that the implementation of AI is successful and sustainable.
    These plans should inform state fiscal notes related to education and AI.

    The 60-plus member commission was established in February of 2024. Members include policymakers and education and business leaders throughout the 16-state SREB region.


  • AI in K-12 instruction: Insights from instructional coaches

    As artificial intelligence (AI) becomes an integral part of modern education, instructional coaches play a pivotal role in guiding teachers on its implementation, bridging the gap between emerging educational technologies and effective classroom practices.

    As trusted mentors and professional development leaders, they guide teachers in implementing AI tools thoughtfully, ensuring that technology enhances student learning while aligning with pedagogical best practices. This article briefly synthesizes responses from instructional coaches regarding their experiences, challenges, and recommendations for integrating AI into K-12 education.  

    Ten instructional coaches, all with advanced degrees, shared the following insights into the instructional use of AI in K-12 education. All have more than 10 years of experience in education and work across all three types of school environments: urban, suburban, and rural.

    The coaches reported that AI is used for various instructional purposes. The most-cited applications included providing feedback on student work, creating professional development materials, supporting writing and content generation, creating course content, and enhancing accessibility for students with special needs. Many coaches noted that AI tools assist in grading assignments, offering real-time feedback, and supporting differentiated instruction. AI-powered feedback helps teachers provide more personalized responses without increasing their workload. Regarding professional development, AI is being used to generate training content for teachers, ensuring they stay updated on educational trends. Coaches are leveraging AI to curate research, synthesize best practices, and develop instructional strategies tailored to their schools. They also encourage teachers and students to use AI for brainstorming, outlining essays, and improving writing mechanics.

    Perceived impact of AI on instruction 

    The vast majority of instructional coaches expressed positive expectations about AI’s potential to reduce educator workload, create personalized learning experiences, and improve access for students with disabilities. However, perspectives on AI’s overall impact on education varied. While most believe AI has positively influenced instruction, a few remain cautious about its potential risks. One coach suggested that allowing students to work with the tools in a structured setting, and teaching them to treat AI as a tool, represents one of generative AI’s biggest opportunities in education. About three-fourths of the coaches feel that AI will reduce teacher workload by automating repetitive tasks such as grading and data analysis.

    Concerns about AI in education 

    While AI presents numerous benefits, instructional coaches also raised concerns about its potential drawbacks, including ethical dilemmas, student engagement challenges, and equity issues. They worry that some students will use AI tools without critically engaging with the material, leading to passive learning and an overreliance on generative tools. Some expressed concern that AI-generated content could reduce the need for creativity and independent thought. Coaches also worry that AI makes it easier for students to plagiarize or rely on generated answers without truly understanding concepts, which can undermine academic integrity. They cite technical challenges as well: educators face issues with AI tool reliability, compatibility with existing learning management systems (LMS), and steep learning curves. The coaches mentioned that some schools lack the infrastructure to support meaningful, widespread AI integration.

    Several ethical and privacy concerns were mentioned. AI tools collect and store student data, raising concerns about data privacy and security, particularly with younger students, who may be less aware of the risks of revealing personally identifiable information (PII). Coaches also mention the need for clear guidelines on responsible AI use to prevent bias and misinformation.

    Coaches emphasize the importance of verifying AI-generated materials for accuracy and suggest that teachers cross-check AI-produced responses before using them in instruction. They recommend integrating robust discussions of digital literacy, AI biases, and the ethical implications of generative AI into classroom conversations, and they stress that schools need to train educators and students on responsible AI usage. Some schools restrict AI use for creative writing, critical thinking exercises, and certain assessments to ensure students develop their own ideas, a practice the coaches endorse. They also suggest embedding AI literacy into existing courses, ensuring students understand how AI works, its limitations, and its ethical implications.

    Equity concerns are a serious issue for instructional coaches. Schools should ensure all students have equal access to AI tools; AI should be leveraged to bridge learning gaps, not widen them. Making sure all students have access to the same suite of tools is essential to creating a level playing field for all learners. Instructional coaches generally agree that AI is not just a passing trend but an integral part of the future of education. Still, there is concern that generative AI tools will reduce human interaction in the teaching and learning process. For instance, interpersonal relationships are not developed with AI-based tutoring systems in the same way they can be developed and encouraged through traditional tutoring.

    The integration of AI in K-12 education presents both opportunities and challenges. Instructional coaches largely recognize AI’s potential to enhance learning and improve efficiency, while stressing the need to protect academic integrity and maintain human-centered learning experiences. As AI continues to evolve, educators must be proactive in shaping how it is used, ensuring it serves as a tool for empowerment rather than dependency. Future efforts should focus on professional development for educators, AI literacy training for students, and policies ensuring equitable AI access across diverse school settings.
