Tag: Tools

  • Measuring student global competency learning using direct peer connections


    Our students are coming of age in a world that demands global competency. From economic interdependence to the accelerating effects of climate change and mass migration, students need to develop the knowledge and skills to engage and succeed in this diverse and interconnected world. Consequently, the need for global competency education is more important than ever.

    “Being born into a global world does not make people global citizens,” Andreas Schleicher of the Organization for Economic Cooperation and Development (OECD) has said. “We must deliberately and systematically educate our children in global competence.” 

    Here at Global Cities, we regularly talk with educators eager to bring global competency into their classrooms in ways that engage and excite students to learn. Educators recognize the need, but ask a vital question: How do we teach something we can’t measure?

    It’s clear that in today’s competitive and data-driven education environment, we need to expand and evaluate what students need to know to be globally competent adults. Global competency education requires evaluation tools to determine what and whether students are learning.

    The good news is that two recent independent research studies found that educators can use a new tool, the Global Cities’ Codebook for Global Student Learning Outcomes, to identify what global competency learning looks like and to assess whether students are learning by examining student writing. The research successfully used the evaluation tool for global competency programs with different models and curricula and across different student populations.

    Global Cities developed the Codebook to help researchers, program designers, and educators identify, teach, and measure global competency in their own classrooms. Created in partnership with Harvard Graduate School of Education’s The Open Canopy, the Codebook captures 55 observable indicators across four core global learning outcomes: Appreciation for Diversity, Cultural Understanding, Global Knowledge, and Global Engagement. The Codebook was developed using data from our own Global Scholars virtual exchange program, which since 2014 has connected more than 139,000 students in 126 cities worldwide to teach global competency.

    In Global Scholars, we’ve seen firsthand the excitement of directly connecting students with their international peers and sparking meaningful discussions about culture, community, and shared challenges. We know how teachers can effectively use the Codebook and how Global Cities workshops extend the reach of this approach to a larger audience of K-12 teachers. This research was designed to determine whether the same tool could be used to assess global competency learning in other virtual exchange programs, not only Global Cities’ Global Scholars program.

    These studies make clear that the Codebook can reliably identify global learning in diverse contexts and help educators see where and how their students are developing global competency skills in virtual exchange curricula. You can examine the tool (the Codebook) here. You can explore the full research findings here.

    The first study looked at two AFS Intercultural Programs curricula, Global You Changemaker and Global Up Teen. The second study analyzed student work from The Open Canopy’s Planetary Health and Remembering the Past learning journeys.

    In the AFS Intercultural Programs data, researchers found clear examples of students from across the globe showing Appreciation for Diversity and Cultural Understanding. In these AFS online discussion boards, students showed evidence they were learning about their own and other cultures, expressed positive attitudes about one another’s cultures, and demonstrated tolerance for different backgrounds and points of view. Additionally, the discussion boards offered opportunities for students to interact with each other virtually, and there were many examples of students from different parts of the world listening to one another and interacting in positive and respectful ways. When the curriculum invited students to design projects addressing community or global issues, they demonstrated strong evidence of Global Engagement as well.

    Students in The Open Canopy program demonstrated the three most prevalent indicators of global learning that reflect core skills essential to effective virtual exchange: listening to others and discussing issues in a respectful and unbiased way; interacting with people of different backgrounds positively and respectfully; and using digital tools to learn from and communicate with peers around the world. Many of the Remembering the Past posts were especially rich and coded for multiple indicators of global learning.

    Together, these studies show that global competency can be taught–and measured. They also highlight simple, but powerful strategies educators everywhere can use:

    • Structured opportunities for exchange help students listen and interact respectfully with one another
    • Virtual exchange prompts students to share their cultures and experiences across lines of difference in positive, curious ways
    • Assignments that include reflection questions–why something matters, not just what it is–help students think critically about culture and global issues
    • Opportunities for students to give their opinion and decide to take action, even hypothetically, build their sense of agency in addressing global challenges

    The Codebook is available free to all educators, along with hands-on professional development workshops that guide teachers in using the tool to design curriculum, teach intentionally, and assess learning. Its comprehensive set of indicators gives educators and curriculum designers a menu of options–some they might not have initially considered–that can enrich students’ global learning experiences.

    Our message to educators is simple: A community of educators (Global Ed Lab), a research-supported framework, and practical tools can help you teach students global competency and evaluate their work.

    The question is no longer whether we need more global competency education. We clearly do. Now with the Codebook and the Global Ed Lab, teachers can learn how to teach this subject matter effectively and use tools to assess student learning.


    Source link

  • Using generative tools to deepen, not replace, human connection in schools


    For the last two years, conversations about AI in education have tended to fall into two camps: excitement about efficiency or fear of replacement. Teachers worry they’ll lose authenticity. Leaders worry about academic integrity. And across the country, schools are trying to make sense of a technology that feels both promising and overwhelming.

    But there’s a quieter, more human-centered opportunity emerging–one that rarely makes the headlines: AI can actually strengthen empathy and improve the quality of our interactions with students and staff.

    Not by automating relationships, but by helping us become more reflective, intentional, and attuned to the people we serve.

    As a middle school assistant principal and a higher education instructor, I’ve found that AI is most valuable not as a productivity tool, but as a perspective-taking tool. When used thoughtfully, it supports the emotional labor of teaching and leadership–the part of our work that cannot be automated.

    From efficiency to empathy

    Schools do not thrive because we write faster emails or generate quicker lesson plans. They thrive because students feel known. Teachers feel supported. Families feel included.

    AI can assist with the operational tasks, but the real potential lies in the way it can help us:

    • Reflect on tone before hitting “send” on a difficult email
    • Understand how a message may land for someone under stress
    • Role-play sensitive conversations with students or staff
    • Anticipate barriers that multilingual families might face
    • Rehearse a restorative response rather than reacting in the moment

    These are human actions–ones that require situational awareness and empathy. AI can’t perform them for us, but it can help us practice and prepare for them.

    A middle school use case: Preparing for the hard conversations

    Middle school is an emotional ecosystem. Students are forming identity, navigating social pressures, and learning how to advocate for themselves. Staff are juggling instructional demands while building trust with young adolescents whose needs shift by the week.

    Some days, the work feels like equal parts counselor, coach, and crisis navigator.

    One of the ways I’ve leveraged AI is by simulating difficult conversations before they happen. For example:

    • A student is anxious about returning to class after an incident
    • A teacher feels unsupported and frustrated
    • A family is confused about a schedule change or intervention plan

    By giving the AI a brief description and asking it to take on the perspective of the other person, I can rehearse responses that center calm, clarity, and compassion.
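
    To make that concrete, a rehearsal prompt along these lines (a hypothetical example, not an exact script) might read: “Act as a seventh grader who is anxious about returning to class after an incident with a friend. I am the assistant principal checking in before first period. Respond as that student would, and after each of my replies, point out anywhere my wording could sound dismissive or punitive.”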

    This has made me more intentional in real interactions–I’m less reactive, more prepared, and more attuned to the emotions beneath the surface.

    Empathy improves when we get to “practice” it.

    Supporting newcomers and multilingual learners

    Schools like mine welcome dozens of newcomers each year, many with interrupted formal education. They bring extraordinary resilience–and significant emotional and linguistic needs.

    AI tools can support staff in ways that deepen connection, not diminish it:

    • Drafting bilingual communication with a softer, more culturally responsive tone
    • Helping teachers anticipate trauma triggers based on student histories
    • Rewriting classroom expectations in family-friendly language
    • Generating gentle scripts for welcoming a student experiencing culture shock

    The technology is not a substitute for bilingual staff or cultural competence. But it can serve as a bridge–helping educators reach families and students with more warmth, clarity, and accuracy.

    When language becomes more accessible, relationships strengthen.

    AI as a mirror for leadership

    One unexpected benefit of AI is that it acts as a mirror. When I ask it to review the clarity of a communication, or identify potential ambiguities, it often highlights blind spots:

    • “This sentence may sound punitive.”
    • “This may be interpreted as dismissing the student’s perspective.”
    • “Consider acknowledging the parent’s concern earlier in the message.”

    These are the kinds of insights reflective leaders try to surface–but in the rush of a school day, they are easy to miss.

    AI doesn’t remove responsibility; it enhances accountability. It helps us lead with more emotional intelligence, not less.

    What this looks like in teacher practice

    For teachers, AI can support empathy in similarly grounded ways:

    1. Building more inclusive lessons

    Teachers can ask AI to scan a lesson for hidden barriers–assumptions about background knowledge, vocabulary loads, or unclear steps that could frustrate students.

    2. Rewriting directions for struggling learners

    A slight shift in wording can make all the difference for a student with anxiety or processing challenges.

    3. Anticipating misconceptions before they happen

    AI can run through multiple “student responses” so teachers can see where confusion might arise.

    4. Practicing restorative language

    Teachers can try out scripts for responding to behavioral issues in ways that preserve dignity and connection.

    These aren’t shortcuts. They’re tools that elevate the craft.

    Human connection is the point

    The heart of education is human. AI doesn’t change that–in fact, it makes it more obvious.

    When we reduce the cognitive load of planning, we free up space for attunement.
    When we rehearse hard conversations, we show up with more steadiness.
    When we write in more inclusive language, more families feel seen.
    When we reflect on our tone, we build trust.

    The goal isn’t to create AI-enhanced classrooms. It’s to create relationship-centered classrooms where AI quietly supports the skills that matter most: empathy, clarity, and connection.

    Schools don’t need more automation.

    They need more humanity–and AI, used wisely, can help us get there.


    Source link

  • Teaching in the age of generative AI: why strategy matters more than tools



    This blog was kindly authored by Wioletta Nawrot, Associate Professor and Teaching & Learning Lead at ESCP Business School, London Campus.

    Generative AI has entered higher education faster than most institutions can respond. The question is no longer whether students and staff will use it, but whether universities can ensure it strengthens learning rather than weakens it. Used well, AI can support personalised feedback, stimulate creativity, and free academic time for deeper dialogue. Used poorly, it can erode critical thinking, distort assessment, and undermine trust.

    The difference lies not in the tools themselves but in how institutions guide their use through pedagogy, governance, and culture.

    AI is a cultural and pedagogical shift, not a software upgrade

    Across higher education, early responses to AI have often focused on tools. Yet treating AI as a bolt-on risks missing the real transformation: a shift in how academic communities think, learn, and make judgements.

    Some universities began with communities of practice rather than software procurement. At ESCP Business School, stakeholders, including staff and students, were invited to experiment with AI in teaching, assessment, and student support. These experiences demonstrated that experimentation is essential but only when it contributes to a coherent framework with shared principles and staff development.

    Three lessons have emerged from these early rollouts:

    • Humans keep the final say. Staff report using AI to draft feedback or generate case study variations, but final decisions and marking remain human.
    • Students learn more when they critique AI, not copy it. Exercises where students compare AI responses to academic sources or highlight errors can strengthen critical thinking.
    • Governance matters more than enthusiasm. Clarity around data privacy, authorship, assessment and acceptable use is essential to protect trust.

    Assessment: the hardest and most urgent area of reform

    Once students can generate fluent essays or code in seconds, traditional take-home assignments are no longer reliable indicators of learning. At ESCP we have responded by: 

    • Introducing oral assessments, in-class writing, and step-by-step submissions to verify individual understanding.
    • Asking students to reference class materials and discussions, or unique datasets that AI tools cannot access.
    • Updating assessment rubrics to prioritise analytical depth, originality, transparency of process, and intellectual engagement.

    Students should be encouraged to state whether AI was used, how it contributed, and where its outputs were adapted or rejected. This mirrors professional practice by acknowledging assistance without outsourcing judgement. It also shifts universities from policing misconduct to encouraging and teaching responsible use.

    AI literacy and academic inequality

    AI does not benefit all students equally. Those with strong subject knowledge are better able to question AI’s inaccuracies; others may accept outputs uncritically. 

    Generic workshops alone are insufficient. AI literacy must be embedded within disciplines, for example, in law through case analysis; in business via ethical decision-making; and in science through data validation. Students can be taught not just how to use AI, but how to test it, challenge it, and cite it appropriately.

    Staff development is equally important. Not all academics feel confident incorporating AI into feedback, supervision or assessments. Models such as AI champions, peer-led workshops, and campus coordinators can increase confidence and avoid digital divides between departments.

    Policy implications for UK higher education

    If AI adoption remains fragmented, the UK’s higher education sector risks inconsistency, inequity, and reputational damage. A strategic approach is needed at an institutional and a national level. 

    Universities should define the educational purpose of AI before adopting tools, and consider reforming assessments to remain robust. Structured professional development, opportunities for peer exchange, and open dialogue with students about what constitutes legitimate and responsible use will also support the effective integration of AI into the sector.

    However, it’s not only institutions that need to take action. Policymakers and sector bodies should develop shared reference points for transparency and academic integrity. As a nation, we must invest in research into AI’s impact on learning outcomes and ensure quality frameworks reflect AI’s role in higher education processes, such as assessment and skills development.

    The European Union Artificial Intelligence Act (Regulation (EU) 2024/1689) sets a prescriptive model for compliance in education. The UK’s principles-based approach gives universities flexibility, but this comes with accountability. Without shared standards, the sector risks inconsistent practice and erosion of public trust. A reduction in employability may also follow if students are not taught how to use AI ethically while continuing to develop their critical thinking and analytical skills.

    Implications for the sector

    The experience of institutions like ESCP Business School shows that the quality of teaching with AI depends less on the technology itself than on the judgement and educational purpose guiding its use. 

    Generative AI is already an integral part of students’ academic lives; higher education must now decide how to shape that reality. Institutions that approach AI through strategy, integrity, and shared responsibility will not only protect learning, but renew it, strengthening the human dimension that gives teaching its meaning.

    Source link

  • Transform Your Classroom with Google Workspace AI Tools


    The 2025-2026 school year brought a wave of powerful AI-enhanced tools to Google Workspace for Education. These aren’t just shiny new features—they’re practical classroom tools designed to save you time, personalize learning, and unlock student creativity. Best of all? Most are free for educators and students. Now that 2026 is upon us, I am excited to share with you some of my favorite new features that can be used in your classroom with your students. If you are already using these, I’d love to hear from you and learn how you are exploring AI and Google Workspace in your classrooms.

    Let’s walk through the standout Google features you should try with your students this year.

    Google Gemini for Education: Your AI Teaching Assistant

    Google Gemini isn’t just another chatbot. It’s an AI assistant built directly into the Google apps you already use—Docs, Slides, Sheets, Gmail, and Classroom. No more copying and pasting between tabs.

    • Why it matters: Gemini 2.5 Pro incorporates LearnLM, making it the world’s leading model for learning. It’s purpose-built for education with enterprise-grade data protection. Your data isn’t reviewed or used to train AI models.
    • Try this on Monday: Ask Gemini to “Create a lesson plan on photosynthesis aligned to NGSS standards” or “Generate a 25-question multiple choice practice exam from this syllabus.”

    Key Features for K-12 Classrooms:

    Deep Research — Students can research complex topics and receive synthesized reports with sources and citations in minutes. Instead of spending hours searching, they get a comprehensive report they can then explore further.

    Gemini Canvas — Create quizzes, practice tests, study guides, and visual timelines in one interactive space. Go from blank slate to dynamic preview in minutes. Students can build interactive prototypes and code snippets without knowing how to code.

    Gemini Live — Students can talk through complex concepts, get real-time help, and even share their screen or camera for personalized feedback on problem sets.

    What Are Google Gems?

    Think of a Gem as a specialized AI assistant you create for a specific purpose. Instead of writing the same prompt over and over in Gemini, you build a Gem once with custom instructions, and it becomes your go-to expert for that task.

    The difference: Regular Gemini is a generalist. A Gem is a specialist.

    For example, instead of typing “Create a Jeopardy game about the water cycle for 5th grade” every time you need a review game, you create a “Jeopardy Game” Gem that already knows your grade level, subject area, and preferred format. Then you just give it the topic.
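
    As a rough sketch (hypothetical wording, not an official Google template), the custom instructions behind that Gem might look like: “You are a review-game builder for my 5th-grade science class. Whenever I give you a topic, create a Jeopardy-style board with five categories and five questions each, written at a 5th-grade reading level and ordered from easiest to hardest, with an answer key at the end.”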

    Creating Custom Gems: Build Your Own AI Experts

    Once you’re comfortable with Gemini, Google Gems let you create custom AI assistants tailored to your classroom needs.

    How it works: Give Gemini instructions, examples, and resources so it behaves exactly how you need it to. Upload unit plans, pacing guides, rubrics, or anchor texts so your Gem can reference them when creating content.

    Teacher-facing Gems:

    • Lesson Plan Generator — Aligned to your specific standards and teaching style
    • Parent Communicator — Drafts emails that match your tone and school policies
    • Emergency Sub Plan — Creates ready-to-go activities when you’re out sick
    • Standards Unpacker — Breaks down complex standards into teachable chunks

    Student-facing Gems: Create a Gem and share it with your class through Google Classroom. Students interact with your custom AI expert independently.

    • AI Tutor — Provides step-by-step help without giving away answers
    • Writing Coach — Gives feedback on essays and helps students revise
    • Study Partner — Creates practice questions from their notes
    • Career Explorer — Helps students research potential career paths

    EduGems: Pre-Made Gems by Eric Curts

    Don’t want to build Gems from scratch? Eric Curts (Control Alt Achieve) created EduGems—a growing library of ready-to-use Gems organized by category.

    How to use EduGems:

    1. Visit edugems.ai
    2. Browse by category or search for what you need
    3. Click any Gem to see details
    4. Click “Use” to open it in Gemini, or “Copy” to customize it
    • 🧑‍🏫 AI Tutor — Guides students through problems with questions, not answers. Great for homework help and independent practice.
    • 🎭 Reader’s Theater — Converts stories or historical events into scripts students can perform. Brings content to life through drama.
    • ❓ Jeopardy Game — Creates Jeopardy-style review games on any topic. Perfect for test prep and engagement.
    • 🤔 Student Brainstorming — Helps students generate and organize ideas for projects and writing assignments.
    • 💼 Career Explorer — Students explore career paths, learn about required education, and discover related occupations.
    • 📋 Lesson Plan — Generates complete lesson plans with objectives, activities, and assessments.
    • 📦 Standards Unpacker — Takes complex standards and breaks them into clear learning targets.
    • 🚨 Emergency Sub Plan — Creates complete sub plans with activities, materials, and instructions.
    • 🔀 Re-level Text — Adjusts reading level of any text for differentiation.
    • 📊 Assessment Data Analyzer — Analyzes assessment results and suggests targeted interventions.

    EduGems Categories:

    • Curriculum & Lesson Design (13 Gems) — Lesson plans, unit plans, choice boards, station rotations
    • Student Activities (11 Gems) — Games, simulations, debates, interviews
    • Assessment (15 Gems) — Quizzes, rubrics, test prep, data analysis
    • Support (14 Gems) — Accommodations, scaffolds, behavior plans, social stories
    • Literacy & Language (6 Gems) — Decodable texts, discussion prompts, sentence starters
    • Professional Tasks (11 Gems) — Newsletters, recommendation letters, PD plans

    Pro tip: Start with EduGems to see how effective Gems work, then customize them for your specific needs. You can also submit your own Gems to be added to the collection.

    Learn more: Watch Eric Curts’ complete Gems tutorial video or explore his AI resources at controlaltachieve.com.

    NotebookLM: Your AI Research Assistant

    Teachers and students work with overwhelming amounts of information. NotebookLM becomes an instant expert on whatever documents you upload.

    What makes it special: It grounds all responses in the specific documents you provide—no hallucinations, no random internet sources.

    Features you’ll use:

    • Audio Overviews — Turn lecture recordings, textbook chapters, or research papers into podcast-style audio summaries. Students can study anywhere—on the bus, at practice, during their commute.
    • Document synthesis — Upload PDFs, articles, unit plans, and curriculum resources. Ask questions and get answers pulled directly from your materials. Create summaries, study guides, and student-friendly resources instantly.
    • Student independence — Help students understand complex texts without constant teacher intervention. They can ask clarifying questions and get explanations grounded in their assigned readings.

    Google Vids: Create Professional Video Content in Minutes

    Student attention spans are shrinking, and teachers need tools to deliver content that sticks. Google Vids is Google’s answer: an AI-powered video creation tool that lives right in your Google Workspace.

    What Makes Google Vids Different?

    Think Google Slides turned 90 degrees—instead of slides arranged vertically, you work with scenes arranged horizontally. If you can use Google Slides, you can use Google Vids. But here’s the game-changer: it’s powered by Gemini AI.

    The “Help me create” feature: Type what you want to create (“Make a 3-minute tutorial on the water cycle for 5th grade”), and Google Vids generates a complete first draft in under 60 seconds—script, visuals, timing, transitions, and all. You customize from there instead of starting from scratch.

    Key Features Teachers Love:

    • AI-Powered Creation — Describe your video in a sentence, and Gemini builds the first draft for you. Add your own screenshots, adjust the timing, choose AI voice or record your own.
    • Convert Slides to Videos — Already have a Google Slides presentation? Import it into Vids and transform it into an engaging video with music, transitions, and narration in minutes.
    • Stock Media Library — Access thousands of royalty-free videos, images, music tracks, sound effects, GIFs, and stickers without leaving the platform.
    • Professional Templates — Start with beautifully designed templates for tutorials, announcements, student projects, and more.
    • Real-Time Collaboration — Work together on video projects just like you would in Google Docs. Perfect for group projects or co-planning with colleagues.
    • Seamless Google Classroom Integration — Assign videos as templates so each student gets their own copy. Review student work directly in Classroom and see their progress in real-time.

    For Teachers: Scale Your Impact

    Create professional development videos, flipped classroom content, and instructional materials in 20-30 minutes instead of 2-3 hours.

    Practical use cases:

    • Tool tutorials — Record once, share forever. Every new teacher gets instant access to training.
    • Flipped lessons — Create micro-lectures students watch at home, freeing up class time for hands-on work.
    • Lab procedures — Record safety demos and complex procedures students can review anytime.
    • Personalized feedback — Send quick video messages instead of lengthy written comments.
    • Professional development — Build a library of PD resources teachers can access on-demand.

    For Students: Voice, Choice, and Creativity

    Google Vids gives students an accessible way to demonstrate understanding without needing advanced tech skills.

    Student projects:

    • Video essays — Students explain their thinking, cite sources, and present arguments visually.
    • Book reports — Create “movie trailers” for novels or informational texts.
    • Science demonstrations — Record experiments with narration explaining the process.
    • Digital portfolios — Showcase learning growth throughout the year.
    • Public service announcements — Combine research with persuasive communication skills.

    Scaffolding tip: Start simple. Have students brainstorm in Google Keep, create a 3-slide presentation in Slides, import those slides into Vids, replace slides with video B-roll, add music and transitions. This progression teaches cross-tool workflows while building video literacy skills.

    Getting Started is Simple

    Access Google Vids at vids.google.com or vids.new. No software to download, no complicated setup.

    Three ways to start:

    1. Record — Easiest for screencasts and quick tutorials on Chromebooks
    2. Use templates — Start with professional designs for various purposes
    3. “Help me create” — Describe what you want and let AI build the first draft

    Videos save automatically to Google Drive. Share through Classroom, Drive links, or export as MP4 files.

    Why It Matters for K-12

    Google Vids democratizes video creation. Students and teachers without technical expertise or expensive software can now create professional-looking content. This levels the playing field and opens doors for creativity that were previously closed.


    Getting Started: Your Action Plan

    This week:

    1. Visit gemini.google.com with your school Google account
    2. Ask it to create one lesson plan or assessment
    3. Try Deep Research on a topic you’re teaching next week

    This month:

    1. Create your first custom Gem for a unit you teach frequently
    2. Have students upload their notes to NotebookLM and create an Audio Overview
    3. Record one instructional video in Google Vids

    This semester:

    1. Share the college student offer with your seniors
    2. Build a library of custom Gems for different units
    3. Let students create their own Gems as study partners
    4. Assign a Google Vids project—have students create a 2-minute video explaining a concept, book report trailer, or science demonstration

    One Important Reminder

    With all these powerful AI tools at our fingertips, don’t forget that the most meaningful learning still happens through conversation, hands-on exploration, and human connection. Technology should enhance—not replace—the relationships and dialogue that make your classroom special.

    Use these tools to reclaim your time and energy so you can focus on what matters most: your students.


    Want to Learn More?

    Take a free course: Getting Started with Google AI from Google for Education

    Explore use cases: 100+ ways to use Gemini in education

    Deep dive: Teaching Channel’s course 5381: Teaching with Google’s AI Tools covers Gemini, NotebookLM, Google Vids, and image creation


    Ready to try one of these features? Pick just one from this list and test it this week. Reply and let me know which one you chose and how it went.

    • Jeff Bradbury, your digital learning coach 🎸

    Don’t Miss the Next EdTech Breakthrough

    Google isn’t done innovating, and neither are dozens of other EdTech companies building tools specifically for K-12 educators. New features drop every month—some game-changers, some duds.

    I test them all so you don’t have to.

    Join 20,000+ educators who get my weekly newsletter with:

    ✅ Early access to tutorials on new classroom tech

    ✅ Honest reviews (I’ll tell you when something isn’t worth your time)

    ✅ Ready-to-steal lesson ideas and project templates

    ✅ Time-saving workflows that actually work in real classrooms

    No fluff. No vendor pitches. Just practical strategies from a teacher who’s actually using these tools with students.

    Subscribe to the TeacherCast Newsletter →

    Upgrade Your Teaching Toolkit Today

    Get weekly EdTech tips, tool tutorials, and podcast highlights delivered to your inbox. Plus, receive a free chapter from my book Impact Standards when you join.



    Source link

  • What we lose when AI replaces teachers


    eSchool News is counting down the 10 most-read stories of 2025. Story #8 focuses on the debate around teachers vs. AI.


    A colleague of ours recently attended an AI training where the opening slide featured a list of all the ways AI can revolutionize our classrooms. Grading was listed at the top. Sure, AI can grade papers in mere seconds, but should it?

    As one of our students, Jane, stated: “It has a rubric and can quantify it. It has benchmarks. But that is not what actually goes into writing.” Our students recognize that AI cannot replace the empathy and deep understanding that recognizes the growth, effort, and development of their voice. What concerns us most about grading our students’ written work with AI is the transformation of their audience from human to robot.

    If we teach our students throughout their writing lives that what the grading robot says matters most, then we are teaching them that their audience doesn’t matter. As Wyatt, another student, put it: “If you can use AI to grade me, I can use AI to write.” NCTE, in its position statements for Generative AI, reminds us that writing is a human act, not a mechanical one. Reducing it to automated scores undermines its value and teaches students, like Wyatt and Jane, that the only time we write is for a grade. That is a future of teaching writing we hope to never see.

    We need to pause when tech companies tout AI as the grader of student writing. This isn’t a question of capability. AI can score essays. It can be calibrated to rubrics. It can, as Jane said, provide students with encouragement and feedback specific to their developing skills. And we have no doubt it has the potential to make a teacher’s grading life easier. But just because we can outsource some educational functions to technology doesn’t mean we should.

    It is bad enough how many students already see their teacher as their only audience. Or worse, when students are writing for teachers who see their written work strictly through the lens of a rubric, their audience is limited to the rubric. Even those options are better than writing for a bot. Instead, let’s question how often our students write to a broader audience of their peers, parents, community, or a panel of judges for a writing contest. We need to reengage with writing as a process and implement AI as a guide or aide rather than a judge with the last word on an essay score.

    Our best foot forward is to put AI in its place. AI is best used in the developing stages of the writing process. It is excellent as a guide for brainstorming. It can help in a variety of ways when a student is struggling and looking for five alternatives to their current ending or an idea for a metaphor. And if you or your students like AI’s grading feature, students can paste their work into a bot for feedback before handing in a final draft.

    We need to recognize that there are grave consequences if we let a bot do all the grading. As teachers, we should recognize bot grading for what it is: automated education. We can and should leave the promises of hundreds of essays graded in an hour for the standardized test providers. Our classrooms are alive with people who have stories to tell, arguments to make, and research to conduct. We see our students beyond the raw data of their work. We recognize that the poem our student has written for their sick grandparent might be a little flawed, but it matters a whole lot to the person writing it and to the person they are writing it for. We see the excitement or determination in our students’ eyes when they’ve chosen a research topic that is important to them. They want their cause to be known and understood by others, not processed and graded by a bot.

    The adoption of AI into education should be conducted with caution. Many educators are experimenting with using AI tools in thoughtful and student-centered ways. In a recent article, David Cutler describes his experience using an AI-assisted platform to provide feedback on his students’ essays. While Cutler found the tool surprisingly accurate and helpful, the true value lies in the feedback being used as part of the revision process. As this article reinforces, the role of a teacher is not just to grade, but to support and guide learning. When used intentionally (and we emphasize, as in-process feedback) AI can enhance that learning, but the final word, and the relationship behind it, must still come from a human being.

    When we hand over grading to AI, we risk handing over something much bigger–our students’ belief that their words matter and deserve an audience. Our students don’t write to impress a rubric, they write to be heard. And when we replace the reader with a robot, we risk teaching our students that their voices only matter to the machine. We need to let AI support the writing process, not define the product. Let it offer ideas, not deliver grades. When we use it at the right moments and for the right reasons, it can make us better teachers and help our students grow. But let’s never confuse efficiency with empathy. Or algorithms with understanding.


    Source link

  • Texas Universities Deploy AI Tools to Review How Courses Discuss Race and Gender – The 74




    A senior Texas A&M University System official testing a new artificial intelligence tool this fall asked it to find how many courses discuss feminism at one of its regional universities. Each time she asked in a slightly different way, she got a different number.

    “Either the tool is learning from my previous queries,” Texas A&M system’s chief strategy officer Korry Castillo told colleagues in an email, “or we need to fine tune our requests to get the best results.”

    It was Sept. 25, and Castillo was trying to deliver on a promise Chancellor Glenn Hegar and the Board of Regents had already made: to audit courses across all of the system’s 12 universities after conservative outrage over a gender-identity lesson at the flagship campus intensified earlier that month, leading to the professor’s firing and the university president’s resignation.

    Texas A&M officials said the controversy stemmed from the course’s content not aligning with its description in the university’s course catalog and framed the audit as a way to ensure students knew what they were signing up for. As other public universities came under similar scrutiny and began preparing to comply with a new state law that gives governor-appointed regents more authority over curricula, they, too, announced audits.

    Records obtained by The Texas Tribune offer a first look at how Texas universities are experimenting with AI to conduct those reviews. 

    At Texas A&M, internal emails show staff are using AI software to search syllabi and course descriptions for words that could raise concerns under new system policies restricting how faculty teach about race and gender. 

    At Texas State, memos show administrators are suggesting faculty use an AI writing assistant to revise course descriptions. They urged professors to drop words such as “challenging,” “dismantling” and “decolonizing” and to rename courses with titles like “Combating Racism in Healthcare” to something university officials consider more neutral like “Race and Public Health in America.”

    Read Texas State University’s guide to faculty on how to review their curriculum with AI

    While school officials describe the efforts as an innovative approach that fosters transparency and accountability, AI experts say these systems do not actually analyze or understand course content, instead generating answers that sound right based on patterns in their training data.

    That means small changes in how a question is phrased can lead to different results, they said, making the systems unreliable for deciding whether a class matches its official description. They warned that using AI this way could lead to courses being flagged over isolated words and further shift control of teaching away from faculty and toward administrators.

    “I’m not convinced this is about serving students or cleaning up syllabi,” said Chris Gilliard, co-director of the Critical Internet Studies Institute. “This looks like a project to control education and remove it from professors and put it into the hands of administrators and legislatures.”

    Setting up the tool

    During a board of regents meeting last month, Texas A&M System leaders described the new processes they were developing to audit courses as a repeatable enforcement mechanism. 

    Vice Chancellor for Academic Affairs James Hallmark said the system would use “AI-assisted tools” to examine course data under “consistent, evidence-based criteria,” which would guide future board action on courses. Regent Sam Torn praised it as “real governance,” saying Texas A&M was “stepping up first, setting the model that others will follow.” 

    That same day, the board approved new rules requiring presidents to sign off on any course that could be seen as advocating for “race and gender ideology” and prohibiting professors from teaching material not on the approved syllabus for a course.

    In a statement to the Tribune, Chris Bryan, the system’s vice chancellor for marketing and communications, said Texas A&M is using OpenAI services through an existing subscription to aid the system’s course audit and that the tool is still being tested as universities finish sharing their course data. He said “any decisions about appropriateness, alignment with degree programs, or student outcomes will be made by people, not software.”

    In records obtained by the Tribune, Castillo, the system’s chief strategy officer, told colleagues to prepare for about 20 system employees to use the tool to make hundreds of queries each semester. 

    The records also show some of the concerns that arose from early tests of the tool.  

    When Castillo told colleagues about the varying results she obtained when searching for classes that discuss feminism, deputy chief information officer Mark Schultz cautioned that the tool came with “an inherent risk of inaccuracy.”

    “Some of that can be mitigated with training,” he said, “but it probably can’t be fully eliminated.”

    Schultz did not specify what kinds of inaccuracies he meant. When asked if the potential inaccuracies had been resolved, Bryan said, “We are testing baseline conversations with the AI tool to validate the accuracy, relevance and repeatability of the prompts.” He said this includes seeing how the tool responds to invalid or misleading prompts and having humans review the results.

    Experts said the different answers Castillo received when she rephrased her question reflect how these systems operate. They explained that these kinds of AI tools generate their responses by predicting patterns and generating strings of text.

    “These systems are fundamentally systems for repeatedly answering the question ‘what is the likely next word’ and that’s it,” said Emily Bender, a computational linguist at the University of Washington. “The sequence of words that comes out looks like the kind of thing you would expect in that context, but it is not based on reason or understanding or looking at information.”

    Because of that, small changes to how a question is phrased can produce different results. Experts also said users can nudge the model toward the answer they want. Gilliard said that is because these systems are also prone to what developers call “sycophancy,” meaning they try to agree with or please the user. 

    “Very often, a thing that happens when people use this technology is if you chide or correct the machine, it will say, ‘Oh, I’m sorry’ or like ‘you’re right,’ so you can often goad these systems into getting the answer you desire,” he said.

    T. Philip Nichols, a Baylor University professor who studies how technology influences teaching and learning in schools, said keyword searches also provide little insight into how a topic is actually taught. He called the tool “a blunt instrument” that isn’t capable of understanding how certain discussions that the software might flag as unrelated to the course tie into broader class themes. 

    “Those pedagogical choices of an instructor might not be present in a syllabus, so to just feed that into a chatbot and say, ‘Is this topic mentioned?’ tells you nothing about how it’s talked about or in what way,” Nichols said. 

    Castillo’s description of her experience testing the AI tool was the only time in the records reviewed by the Tribune when Texas A&M administrators discussed specific search terms being used to inspect course content. In another email, Castillo said she would share search terms with staff in person or by phone rather than email. 

    System officials did not provide the list of search terms the system plans to use in the audit.

    Martin Peterson, a Texas A&M philosophy professor who studies the ethics of technology, said faculty have not been asked to weigh in on the tool, including members of the university’s AI council. He noted that the council’s ethics and governance committee is charged with helping set standards for responsible AI use.

    While Peterson generally opposes the push to audit the university system’s courses, he said he is “a little more open to the idea that some such tool could perhaps be used.”

    “It is just that we have to do our homework before we start using the tool,” Peterson said.

    AI-assisted revisions

    At Texas State University, officials ordered faculty to rewrite their syllabi and suggested they use AI to do it.

    In October, administrators flagged 280 courses for review and told faculty to revise titles, descriptions and learning outcomes to remove wording the university said was not neutral. Records indicate that dozens of courses set to be offered by the College of Liberal Arts in the Spring 2026 semester were singled out for neutrality concerns. They included courses such as Intro to Diversity, Social Inequality, Freedom in America, Southwest in Film and Chinese-English Translation.

    Faculty were given until Dec. 10 to complete the rewrites, with a second-level review scheduled in January and the entire catalog to be evaluated by June. 

    Administrators shared with faculty a guide outlining wording they said signaled advocacy. It discouraged learning outcomes that “measure or require belief, attitude or activism (e.g., value diversity, embrace activism, commit to change).”

    Administrators also provided a prompt for faculty to paste into an AI writing assistant alongside their materials. The prompt instructs the chatbot to “identify any language that signals advocacy, prescriptive conclusions, affective outcomes or ideological commitments” and generate three alternative versions that remove those elements. 

    Jayme Blaschke, assistant director of media relations at Texas State, described the internal review as “thorough” and “deliberative,” but would not say whether any classes have already been revised or removed, only that “measures are in place to guide students through any adjustments and keep their academic progress on track.” He also declined to explain how courses were initially flagged and who wrote the neutrality expectations.

    Faculty say the changes have reshaped how curriculum decisions are made on campus.

    Aimee Villarreal, an assistant professor of anthropology and president of Texas State’s American Association of University Professors chapter, said the process is usually faculty-driven and unfolds over a longer period of time. She believes the structure of this audit allows administrators to more closely monitor how faculty describe their disciplines and steer how that material must be presented.

    She said the requirement to revise courses quickly or risk having them removed from the spring schedule has created pressure to comply, which may have pushed some faculty toward using the AI writing assistant.

    Villarreal said the process reflects a lack of trust in faculty and their field expertise when deciding what to teach.

    “I love what I do,” Villarreal said, “and it’s very sad to see the core of what I do being undermined in this way.”

    Nichols warned the trend of using AI in this way represents a larger threat. 

    “This is a kind of de-professionalizing of what we do in classrooms, where we’re narrowing the horizon of what’s possible,” he said. “And I think once we give that up, that’s like giving up the whole game. That’s the whole purpose of why universities exist.”

    The Texas Tribune partners with Open Campus on higher education coverage.

    Disclosure: Baylor University, Texas A&M University and Texas A&M University System have been financial supporters of The Texas Tribune, a nonprofit, nonpartisan news organization that is funded in part by donations from members, foundations and corporate sponsors. Financial supporters play no role in the Tribune’s journalism. Find a complete list of them here.

    This article first appeared on The Texas Tribune.



    Source link

  • An educator’s top tips to integrate AI into the classroom


    eSchool News is counting down the 10 most-read stories of 2025. Story #10 focuses on teaching strategies around AI.


    In the last year, we’ve seen an extraordinary push toward integrating artificial intelligence in classrooms. Among educators, that trend has evoked responses from optimism to opposition. “Will AI replace educators?” “Can it really help kids?” “Is it safe?” Just a few years ago, these questions were unthinkable, and now they’re in every K-12 school, hanging in the air.

    Given the pace at which AI technologies are changing, there’s a lot still to be determined, and I won’t pretend to have all the answers. But as a school counselor in Kansas who has been using SchoolAI to support students for years, I’ve seen that AI absolutely can help kids and is safe when supervised. At this point, I think it’s much more likely to help us do our jobs better than to produce any other outcome. I’ve discovered that if you implement AI thoughtfully, it empowers students to explore their futures, stay on track for graduation, learn new skills, and even improve their mental health.

    Full disclosure: I have something adjacent to a tech background. I worked for a web development marketing firm before moving into education. However, I want to emphasize that you don’t have to be an expert to use AI effectively. Success is rooted in curiosity, trial and error, and commitment to student well-being. Above all, I would urge educators to remember that AI isn’t about replacing us. It allows us to extend our reach to students and our capacity to cater to individual needs, especially when shorthanded.

    Let me show you what that looks like.

    Building emotional resilience

    Students today face enormous emotional pressures. And with national student-to-counselor ratios at nearly double the recommended 250-to-1, school staff can’t always be there right when students need us.

    That’s why I created a chatbot named Pickles (based on my dog at home, whom the kids love but who is too rambunctious to come to school with me). This emotional support bot gives my students a way to process small problems like feeling left out at recess or arguing with a friend. It doesn’t replace my role, but it does help triage students so I can give immediate attention to those facing the most urgent challenges.

    Speaking of which, AI has revealed some issues I might’ve otherwise missed. One fourth grader, who didn’t want to talk to me directly, opened up to the chatbot about her parents’ divorce. Because I was able to review her conversation, I knew to follow up with her. In another case, a shy fifth grader who struggled to maintain conversations learned to initiate dialogue with her peers using chatbot-guided social scripts. After practicing over spring break, she returned more confident and socially fluent.

    Aside from giving students real-time assistance, these tools offer me critical visibility and failsafes while I’m running around trying to do 10 things at once.

    Personalized career exploration and academic support

    One of my core responsibilities as a counselor is helping students think about their futures. Often, the goals they bring to me are undeveloped (as you would expect—they’re in elementary school, after all): They say, “I’m going to be a lawyer,” or “I’m going to be a doctor.” In the past, I would point them toward resources I thought would help, and that was usually the end of it. But I always wanted them to reflect more deeply about their options.

    So, I started using an AI chatbot to open up that conversation. Instead of jumping to a job title, students are prompted to answer what they’re interested in and why. The results have been fascinating—and inspiring. In a discussion with one student recently, I was trying to help her find careers that would suit her love of travel. After we plugged in her strengths and interests, the chatbot suggested cultural journalism, which she was instantly excited about. She started journaling and blogging that same night. She’s in sixth grade.

    What makes this process especially powerful is that it challenges biases. By the end of elementary school, many kids have already internalized what careers they think they can or can’t pursue–often based on race, gender, or socioeconomic status. AI can disrupt that. It doesn’t know what a student looks like or where they’re from. It just responds to their curiosity. These tools surface career options for kids–like esports management or environmental engineering–that I might not be able to come up with in the moment. It’s making me a better counselor and keeping me apprised of workforce trends, all while encouraging my students to dream bigger and in more detail.

    Along with career decisions, AI helps students make better academic decisions, especially in virtual school environments where requirements vary district to district. I recently worked with a virtual school to create an AI-powered tool that helps students identify which classes they need for graduation. It even links them to district-specific resources and state education departments to guide their planning. These kinds of tools lighten the load of general advising questions for school counselors and allow us to spend more time supporting students one on one.

    My advice to educators: Try it

    We tell our students that failure is part of learning. So why should we be afraid to try something new? When I started using AI, I made mistakes. But AI doesn’t have to be perfect to be powerful. Around the globe, AI school assistants are already springing up and serving an ever-wider range of use cases.

    I recommend educators start small. Use a trusted platform. And most importantly, stay human. AI should never replace the relationships at the heart of education. But if used wisely, it can extend your reach, personalize your impact, and unlock your students’ potential.

    We have to prepare our students for a world that’s changing fast–maybe faster than ever. I, for one, am glad I have AI by my side to help them get there.


  • Digital Tools for Note Taking and PKM – Teaching in Higher Ed

    Digital Tools for Note Taking and PKM – Teaching in Higher Ed

    My friend Kerry left me one of her infamous voice messages today. These are the fancy kind that go beyond voicemail: they show up in my text messages app, but I still get to hear her voice. Apple nicely transcribes these messages for me, too, though it cracks me up what it sometimes thinks Kerry is saying. This time, it thought she called me “Fran,” when she was actually calling me “friend.”

    She’s going to be on sabbatical next semester, so she wants to get going with a note-taking application. In my more than two decades in higher education, I’ve never had a sabbatical, but I imagine that if that time were to come, I would really want to get a jump on the organizational side of things, as well. I’ve enjoyed following Robert Talbert’s transparency about his sabbatical as he seeks to be intentional with that time, even subtitling one of his blog posts: Or, how my inherent laziness has made me productive on a big project. He also suggests that we regularly carve out time to reflect on whether where we spend our time and attention is aligned with the things that matter most to us.

    I like reading Robert’s blogs in which he geeks out about the tools that he uses. Like me, he’s evolved what applications he uses, most recently documenting the digital tools he is using for his own sabbatical project (part 1 and part 2).

    Even though Kerry asked me for note-taking tool suggestions, I can’t help but zoom back out and make sure we both understand the bigger picture. I can’t really give advice about taking notes unless I’m sure she has the other vital pieces in place that she will need to make the most of her time. Not to mention giving herself permission to wander and be entirely “unproductive” for at least some portions of this time away.

    The Tools

    For any sabbatical, I’m making an assumption that at least some portion of it will involve doing research and some writing.

    Reference Manager

    There are many good reference managers out there. I haven’t really changed mine since landing on Zotero many years ago. I didn’t have a reference manager when doing my master’s or doctorate, so when I talk about the power of one, I tend to sound like an old person talking about having to walk uphill to school both ways, with a bit of “get off my lawn” sentiment throughout.

    Hands down, if you’re going to do research, or plan on doing some academic writing, it makes zero sense not to capture your sources in a reference manager. Off the top of my head, be sure you know how to:

    1. Add sources using the Zotero extension installed in your preferred browser. Zotero must be running in the background as an application, at least for how I have things configured on my Mac, but it will nudge you if you forget.
    2. Check each source as you add it (I do this, though it isn’t strictly necessary). Zotero is great because much of the time it will grab the metadata associated with the item you’ve saved, including the author’s name, date of publication, URL, etc. However, sometimes a website’s information isn’t set up well and some of it gets missed. I would much rather fill in the gaps manually while I’m already on that page; others figure they’ll wait to see whether they actually wind up citing that source.
    3. Cite sources within your word processor, which for me is Microsoft Word. I use the Zotero toolbar when I need to cite a source as I’m writing: I search for it, press enter, and away I go.
    4. Create a bibliography using Zotero. This would have been a game changer had I had this tool when I was in school. Some years back, they made this auto-update, so each time you add a new source, your references list updates as you go. If you delete a sentence containing a citation, it is removed from your references. So cool.

    Digital Bookmarks

    For any other type of digital resource (ones I doubt I’ll wind up citing in formal academic writing), I save them to my preferred digital bookmarking tool: Raindrop.io. I can’t even imagine doing any computing, in any context, without a bookmarking tool available to save things to…

    I’ve got collections (folders) for Teaching in Higher Ed, AI (this one is publicly viewable as a page, and as an RSS feed), Teaching, Technology, and ones for specific classes, just as an example. Take a look at my Raindrop blog post, which talks more about why I recommend it and how I have it set up to support my ongoing learning.

    Note-Taking

    Now we’re finally getting around to Kerry’s original question. I had to first talk about a reference manager and digital bookmarks, since I wanted to ensure that she will have at least Zotero (or a similar tool) for formal academic writing, including citing sources and doing the necessary sense-making.

    Chicken Scratch (Quick Capture) Notes

    There’s a place in many people’s lives for quick-capture notes. You’re talking to someone and they mention something you want to remember. You don’t first want to figure out where to put that information; you just want to grab it, like you might a sticky note in an analog world.

    Hands down, for me, that app is Drafts.

    At this exact moment, I would consider myself a “bad” Drafts user. I’ve got 172 “chicken scratch” notes sitting there, unorganized. That said, I don’t put anything there that would be terrible to lose track of for a while. These past three months, I was a keynote speaker at a conference in Michigan and did a pre-conference workshop for the POD Conference in San Diego. Being on the road means lots of opportunities to hear about something, or have an idea, that I just want to capture quickly in the moment and get back to later.

    I submitted grades late last night, so today means getting back to a more regular GTD weekly review, at which point I’ll be emptying my inboxes, including my Drafts inbox. If you’re curious about the process I use to accomplish this, I can’t recommend another of Robert Talbert’s posts highly enough: How and why to achieve inbox zero.

    One other thing I’ll mention about Drafts is that it is incredibly easy to get started with… and once you’re up and running, there are a gazillion bells and whistles you could discover, should you want to get even more benefit out of it.

    One fun thing I enjoy is using an app on my iPhone and Apple Watch (via a complication) called Whisper Memos, which lets me record a voice memo and then receive an email with my “ramblings turned into paragraphed articles.” However, instead of cluttering up my email inbox, I have it set up to send an email to my special Drafts email, which then sends the transcription (broken into paragraphs, which I find super handy) to my Drafts inbox, for later use.

    I also keep a Drafts workspace (not in my inbox) dedicated just to my various checklists, such as packing lists, a school departure checklist (which we haven’t had to use in a long while, since our kids keep getting older and more independent), password reset checklist (where are all of the different apps and services I need to visit, anytime I get forced to reset my password for work), and a checklist for all the places I have to change my profile photo, anytime in the future I get new headshots or otherwise want a change.

    Primary Note Taking Tool

    Now we’re finally at the real question Kerry was asking: What app should she use to take notes? Well, as I mentioned, I actually use a fair number of them, but since I’m at least attempting to stay focused on her sabbatical needs, I had better get back to it now.

    My primary note-taking tool these days is Obsidian. Robert Talbert again does a great job of articulating how and why he uses Obsidian. A big driver for me is that if I ever want to switch things up down the road, I don’t have to worry about how to get my stuff out of Obsidian, since it is just a “wrapper,” or a “view,” of plain text files sitting on my computer. If they ever decided to jack their users around with significant price increases that didn’t come with the added value one might expect, I wouldn’t be locked in at all. Plenty of other note-taking apps would know how to “talk” to the plain text files on my computer and display them in a similar fashion to Obsidian.

    That said, some people might be intimidated by learning to write in Markdown, the formatting used in plain text files. Since the text is “plain,” you make something bold by surrounding it with characters that indicate the word or phrase is meant to be bold, rather than by seeing the rich formatting itself. However, I find you can get up and running with the vast majority of Markdown in less than five minutes, so this isn’t as big a barrier as it might seem.

    As an example, I don’t have to type the formatting for bold; I can just highlight those words and press command-B on my keyboard, the same as I would in any other writing context. Headings are indicated by typing pound signs at the start of a line: The heading for this section of the post required four of them, because it is a heading 4 (H4), followed by a space and then the heading text, like normal. If you have never seen Markdown before, the short sketch below shows roughly what this looks like.
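    For anyone new to Markdown, here is a minimal sketch of the handful of marks that cover most everyday note taking. The heading text, link, and list items are invented placeholders for illustration, not lines from my actual notes:

    ```markdown
    #### Primary Note-Taking Tool   <!-- four pound signs + a space = a level-4 heading -->

    This sentence has **two bold words**, some *italics*, and a [link](https://obsidian.md).

    - Bulleted lists are just hyphens (or asterisks) at the start of each line
    - [ ] Obsidian also renders a line like this one as a checkbox in a task list
    ```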

    That said, you couldn’t go wrong with Bear or Craft if you aren’t as concerned about being able to get your stuff out easily, should you ever change note-taking tools in the future.

    Getting Started

    The tool we select is important, yes. But more important is how we set it up to achieve the purpose that made us want a note-taking tool in the first place.

    Daily notes. I am not as disciplined about this as I once was, but hope to get back to doing daily notes. Carl Pullein talks about the history of the “daily note” and how to use them to keep yourself organized and focused.

    Meeting notes. I am close to 100% disciplined about taking notes during meetings (it really helps me stay focused, as otherwise my mind can wander quite a bit), and when attending conferences or webinars. I keep a consistent naming convention for these notes (yyyy-mm-dd-meeting-name) and then move each note to a dedicated folder in Obsidian. I only move a note into that folder after I have reviewed it for any “open loops” and captured those in my task manager; a sketch of what one of these notes might look like follows.
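    To make the naming convention concrete, here is a minimal sketch of a meeting note organized this way. The meeting name, date, attendees, and open loop are all invented placeholders rather than real entries from my vault:

    ```markdown
    <!-- filename: 2025-03-14-curriculum-committee.md -->
    #### 2025-03-14 Curriculum Committee

    **Attendees:** (names go here)

    - Discussed the draft reading list for the fall course
    - Agreed to revisit the assessment plan at the next meeting
    - [ ] Open loop: send the committee the updated syllabus   <!-- captured in my task manager during review, then the note moves to its folder -->
    ```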

    Other writing. I’ve got folders for the other types of writing I do, as well. To me, the key is having a “home” where things belong and being super disciplined about consistent naming conventions, so I don’t get overwhelmed by the messiness of the creative process.

    That said, Kerry will first want to play around with any note-taking tool she is considering at the level of individual notes, before she worries about how she will organize things. Otherwise, it is way too easy to get overwhelmed and never make it across the finish line of actually using a note-taking tool consistently.

    The University of Virginia Library offers ideas for how to organize research data across all disciplines. Don’t miss the part where they say to write down your organization system before you start; in my experience, it is otherwise too easy to forget how I set things up in the first place.


  • 4 ways AI can make your PD more effective

    4 ways AI can make your PD more effective

    Key points:

    If you lead professional learning, whether as a school leader or PD facilitator, your goal is to make each session relevant, engaging, and lasting. AI can help you get there by streamlining prep, differentiating for diverse learners, creating follow-ups that also serve those who miss a session, and turning feedback into actionable improvements.

    1. Streamline prep

    Preparing PD can take hours as you move between drafting agendas, building slides, writing handouts, and finding the right examples. For many facilitators, the preparation phase becomes a race against time, leaving less room for creativity and interaction. The challenge is not only to create materials, but to design them so they are relevant to the audience and aligned with clear learning goals.

    AI can help by taking the raw information you provide–your session objectives, focus area, and audience details–and producing a solid first draft of your session materials. This may include a structured agenda, a concise session description, refined learning objectives, a curated resource list, and even a presentation deck with placeholder slides and talking points. Instead of starting from scratch, you begin with a framework that you can adapt for tone, style, and participant needs.

    AI quick start:

    • Fine-tune your PD session objectives or description so they align with learning goals and audience needs.
    • Design engaging PD slides that support active learning and discussion.
    • Create custom visuals to illustrate key concepts and examples for your PD session.

    2. Differentiate adult learning

    Educators bring different levels of expertise, roles, and learning preferences to PD. AI can go beyond sorting participants into groups; it can analyze pre-session survey data to identify common challenges, preferred formats, and specific areas of curiosity. With this insight, you can design activities that meet everyone’s needs while keeping the group moving forward together.

    For instance, an AI analysis of survey results might reveal that one group wants practical, ready-to-use classroom strategies while another is interested in deepening their understanding of instructional frameworks. You can then create choice-based sessions or breakout activities that address both needs, allowing participants to select the format that works best for them. This targeted approach makes PD more relevant and increases engagement because participants see their own goals reflected in the design.

    AI quick start:

    • Create a pre-session survey form to collect participant goals, roles, and preferences.
    • Analyze survey responses qualitatively to identify trends or themes.
    • Develop differentiated activities and resources for each participant group.

    3. Make PD accessible for those who miss it

    Even the most engaging PD can lose its impact without reinforcement, and some participants will inevitably miss the live session. Illness, scheduling conflicts, and urgent school needs happen. Without intentional follow-up, these absences can create gaps in knowledge and skills that affect team performance.

    AI can help close these gaps by turning your agenda, notes, or recordings into follow-up materials that recap key ideas, highlight next steps, and provide easy access to resources. This ensures that all educators, regardless of whether they attended, can engage with the same content and apply it in their work.

    Imagine hosting a PD session on integrating literacy strategies across the curriculum. Several teachers cannot attend due to testing responsibilities. By using AI to transcribe the recording, produce a well-organized summary, and embed links to articles and templates, you give absent staff members a clear path to catch up. You can also create a short bridge-to-practice activity that both attendees and absentees complete, so everyone comes to the next session prepared.

    This approach not only supports ongoing learning but also reinforces a culture of equity in professional development, where everyone has access to the same high-quality materials and expectations. Over time, storing these AI-generated summaries and resources in a shared space can create an accessible PD archive that benefits the entire organization.

    AI quick start:

    • Transcribe your PD session recording for a complete text record.
    • Summarize the content into a clear, concise recap with next steps.
    • Integrate links to resources and bridge-to-practice activities so all participants can act on the learning.

    4. Turn participant feedback into action

    Open-ended survey responses are valuable, but analyzing them can be time-consuming. AI can code and group feedback so you can quickly identify trends and make informed changes before your next session.

    For example, AI might cluster dozens of survey comments into themes such as “more classroom examples,” “more time for practice,” or “deeper technology integration.” Instead of reading through each comment manually, you receive a concise report that highlights key priorities. You can then use this information to adjust your content, pacing, or format to better meet participants’ needs.

    By integrating this kind of rapid analysis into your PD process, you create a feedback loop that keeps your sessions evolving and responsive. Over time, this builds trust among participants, who see that their input is valued and acted upon.

    AI quick start:

    • Compile and organize participant feedback into a single dataset.
    • Categorize comments into clear, actionable themes.
    • Summarize insights to highlight priority areas for improvement.

    Final word

    AI will not replace your skill as a facilitator, but it can strengthen the entire PD cycle from planning and delivery to post-session coaching, accessibility, and data analysis. By taking on repetitive, time-intensive tasks, AI allows you to focus on creating experiences that are engaging, relevant, and equitable.


  • Rethinking icebreakers in professional learning

    Rethinking icebreakers in professional learning

    Key points:

    I was once asked during an icebreaker in a professional learning session to share a story about my last name. What I thought would be a light moment quickly became emotional. My grandfather borrowed another family’s name to come to America; his attempt was not successful, yet our family has kept the name ever since. Being asked to share that story on the spot caught me off guard. It was personal, it was heavy, and it was rushed into the open by an activity intended to be lighthearted.

    That highlights the problem with many icebreakers. Facilitators often ask for vulnerability without context, pushing people into performances disconnected from the session’s purpose. For some educators, especially those from historically marginalized backgrounds, being asked to disclose personal details without trust can feel unsafe. I have both delivered and received professional learning where icebreakers were the first order of business, and they often felt irrelevant. I have had to supply “fun facts” I had not thought about in years or invent something just to move the activity along.

    And inevitably, somewhere later in the day, the facilitator says, “We are running out of time” or “We do not have time to discuss this in depth.” The irony is sharp: Meaningful discussion gets cut short while minutes are spent on activities that add little value.

    Why icebreakers persist

    Why do icebreakers persist despite their limitations? Part of it is tradition. They are familiar, and many facilitators replicate what they have experienced in their own professional learning. Another reason is belief in their power to foster collaboration or energize a room. Research suggests there is some basis for this. Chlup and Collins (2010) found that icebreakers and “re-energizers” can, when used thoughtfully, improve motivation, encourage interaction, and create a sense of safety for adult learners. These potential benefits help explain why facilitators continue to use them.

    But the promise is rarely matched by practice. Too often, icebreakers are poorly designed fillers, disconnected from learning goals, or stretched too long, leaving participants disengaged rather than energized.

    The costs of misuse

    Even outside education, icebreakers have a negative reputation. As Kirsch (2025) noted in The New York Times, many professionals “hate them,” questioning their relevance and treating them with suspicion. Leaders in other fields rarely tolerate activities that feel disconnected from their core work, and teachers should not be expected to, either.

    Research on professional development supports this skepticism. Guskey (2003) found that professional learning only matters when it is carefully structured and purposefully directed. Simply gathering people together does not guarantee effectiveness. The most valued feature of professional development is deepening educators’ content and pedagogical knowledge in ways that improve student learning–something icebreakers rarely achieve.

    School leaders are raising the same concerns. Jared Lamb, head of BASIS Baton Rouge Mattera Charter School in Louisiana and known for his viral leadership videos on social media, argues that principals and teachers have better uses for their time. “We do not ask surgeons to play two truths and a lie before surgery,” he remarked, “so why subject our educators to the same?” His critique may sound extreme, but it reflects a broader frustration with how professional learning time is spent.

    I would not go that far. While I agree with Lamb that educators’ time must be honored, the solution is not to eliminate icebreakers entirely, but to plan them with intention. When designed thoughtfully, they can help establish norms, foster trust, and build connection. The key is ensuring they are tied to the goals of the session and respect the professionalism of participants.

    Toward more authentic connection

    The most effective way to build community in professional learning is through purposeful engagement. Facilitators can co-create norms, clarify shared goals, or invite participants to reflect on meaningful moments from their teaching or leadership journeys. Aguilar (2022), in Arise, reminds us that authentic connections and peer groups sustain teachers far more effectively than manufactured activities. Professional trust grows not from gimmicks but from structures that honor educators’ humanity and expertise.

    Practical alternatives to icebreakers include:

    • Norm setting with purpose: Co-create group norms or commitments that establish shared expectations and respect.
    • Instructional entry points: Use a short analysis of student work, a case study, or a data snapshot to ground the session in instructional practice immediately.
    • Structured reflection: Invite participants to share a meaningful moment from their teaching or leadership journey using protocols like the Four A’s. These provide choice and safety while deepening professional dialogue.
    • Collaborative problem-solving: Begin with a design challenge or pressing instructional issue that requires participants to work together immediately.

    These approaches avoid the pitfalls of forced vulnerability. They also account for equity by ensuring participation is based on professional engagement, not personal disclosures.

    Closing reflections

    Professional learning should honor educators’ time and expertise. Under the right conditions, icebreakers can enhance learning, but more often, they create discomfort, waste minutes, and fail to build trust.

    I still remember being asked to tell my last name story. What emerged was a family history rooted in migration, struggle, and survival, not a “fun fact.” That moment reminds me: when we ask educators to share, we must do so with care, with planning, and with purpose.

    If we model superficial activities for teachers, we risk signaling that superficial activities are acceptable for students. School leaders and facilitators must design professional learning that is purposeful, respectful, and relevant. When every activity ties to practice and trust, participants leave not only connected but also better equipped to serve their students. That is the kind of professional learning worth everyone’s time.

    References

    Aguilar, E. (2022). Arise: The art of transformative leadership in schools. Jossey-Bass.

    Chlup, D. T., & Collins, T. E. (2010). Breaking the ice: Using ice-breakers and re-energizers with adult learners. Adult Learning, 21(3–4), 34–39. https://doi.org/10.1177/104515951002100305

    Guskey, T. R. (2003). What makes professional development effective? Phi Delta Kappan, 84(10), 748–750.

    Kirsch, M. (2025, March 29). Breaking through. The New York Times. https://www.nytimes.com/2025/03/29/briefing/breaking-through.html

