It’s truly incredible how much new technology has made its way into the classroom. Where once teaching consisted primarily of whiteboards and textbooks, you can now find tablets, smart screens, AI assistants, and a trove of learning apps designed to foster inquiry and maximize student growth.
While these new tools are certainly helpful, the flood of options means that educators can struggle to discern truly useful resources from one-time gimmicks. As a result, some of the best tools for sparking curiosity, creativity, and critical thinking often go overlooked.
Personally, I believe 3D printing is one such tool that doesn’t get nearly enough consideration for the way it transforms a classroom.
3D printing is the process of making a physical object from a three-dimensional digital model, typically by laying down many thin layers of material using a specialized printer. Using 3D printing, a teacher could make a model of a fossil to share with students, trophies for inter-class competitions, or even supplies for construction activities.
At first glance, this might not seem all that revolutionary. However, 3D printing offers three distinct educational advantages that have the potential to transform K–12 learning:
It develops success skills: 3D printing encourages students to build a variety of success skills that prepare them for challenges outside the classroom. For starters, it creates opportunities for students to practice communication, collaboration, and other social-emotional skills. The process of moving from an idea to a physical, printed prototype fosters creativity, and every print, successful or not, builds perseverance and problem-solving confidence. This is the type of hands-on, inquiry-based learning that students remember.
It creates cross-curricular connections: 3D printing is intrinsically cross-curricular. Professional scientists, engineers, and technicians often use 3D printing to create product models or build prototypes for testing their hypotheses. That work draws on documentation, symbolism, color theory, narrative, and skills from many other disciplines, and it doesn’t take much imagination to see how those same skills could benefit classroom learning. Students can observe for themselves how subjects connect, while teachers transform abstract concepts into tangible points of understanding.
It’s aligned with engineering and NGSS: 3D printing aligns naturally with the Next Generation Science Standards (NGSS). By focusing on the engineering design process (define, imagine, plan, create, improve), students learn to think and act like real scientists and engineers to overcome obstacles. This approach also emphasizes iteration and evidence-based conclusions. What better way to facilitate student engagement, hands-on inquiry, and creative expression?
3D printing might not be the flashiest educational tool, but its potential is undeniable. This flexible resource can give students something tangible to work with while sparking wonder and pushing them to explore new horizons.
So, take a moment to familiarize yourself with the technology. Maybe try running a few experiments of your own. When used with purpose, 3D printing transforms from a common classroom tool into a launchpad for student discovery.
Jon Oosterman, Van Andel Institute for Education
Jon Oosterman is a Learning Specialist at Van Andel Institute for Education, a Michigan-based education nonprofit dedicated to creating classrooms where curiosity, creativity, and critical thinking thrive.
Many years ago, around 2010, I attended a professional development program in Houston called Literacy Through Photography, at a time when I was searching for practical ways to strengthen comprehension, discussion, and reading fluency, particularly for students who found traditional print-based tasks challenging. As part of the program, artists visited my classroom and shared their work with students. Much of that work was abstract. There were no obvious answers and no single “correct” interpretation.
Instead, students were invited to look closely, talk together, and explain what they noticed.
What struck me was how quickly students, including those who struggled with traditional reading tasks, began to engage. They learned to slow down, describe what they saw, make inferences, and justify their thinking. They weren’t just looking at images; they were reading them. And in doing so, they were rehearsing many of the same strategies we expect when reading written texts.
At the time, this felt innovative. But it also felt deeply intuitive.
Fast forward to today.
Students are surrounded by images and videos, from photographs and diagrams to memes, screenshots, and, increasingly, AI-generated visuals. These images appear everywhere: in learning materials, on social media, and inside the tools students use daily. Many look polished, realistic, and authoritative.
At the same time, AI has made faking easier than ever.
As educators and school leaders, we now face urgent questions around misinformation, academic integrity, and critical thinking. The issue is no longer just whether students can use AI tools, but whether they can interpret, evaluate, and question what they see.
This is where visual literacy becomes a frontline defence.
Teaching students to read images critically, to see them as constructed texts rather than neutral data, strengthens the same skills we rely on for strong reading comprehension: inference, evidence-based reasoning, and metacognitive awareness.
From photography to AI: A conversation grounded in practice
Recently, I found myself returning to those early classroom experiences through ongoing professional dialogue with a former college lecturer and professional photographer, as we explored what it really means to read images in the age of AI.
A conversation that grew out of practice
Nesreen: When I shared the draft with you, you immediately focused on the language, whether I was treating images as data or as signs. Is this important?
Photographer: Yes, because signs belong to reading. Data is output. Signs are meaning. When we talk about reading media texts, we’re talking about how meaning is constructed, not just what information appears.
Nesreen: That distinction feels crucial right now. Students are surrounded by images and videos, but they’re rarely taught to read them with the same care as written texts.
Photographer: Exactly. Once students understand that photographs and AI images are made up of signs (color, framing, scale, and viewpoint), they stop treating images as neutral or factual.
Nesreen: You also asked whether the lesson would lean more towards evaluative assessment or summarizing. That made me realize the reflection mattered just as much as the image itself.
Photographer: Reflection is key. When students explain why a composition works, or what they would change next time, they’re already engaging in higher-level reading skills.
Nesreen: And whether students are analyzing a photograph, generating an AI image, or reading a paragraph, they’re practicing the same habits: slowing down, noticing, justifying, and revising their thinking.
Photographer: And once they see that connection, reading becomes less about the right answer and more about understanding how meaning is made.
Reading images is reading
One common misconception is that visual literacy sits outside “real” literacy. In practice, the opposite is true.
When students read images carefully, they:
identify what matters most
follow structure and sequence
infer meaning from clues
justify interpretations with evidence
revise first impressions
These are the habits of skilled readers.
For emerging readers, multilingual learners, and students who struggle with print, images lower the barrier to participation, without lowering the cognitive demand. Thinking comes first. Language follows.
From composition to comprehension: Mapping image reading to reading strategies
Photography offers a practical way to name what students are already doing intuitively. When teachers explicitly teach compositional elements, familiar reading strategies become visible and transferable.
What students notice in an image | What they are doing cognitively | Reading strategy practiced
Where the eye goes first | Deciding importance | Identifying main ideas
How the eye moves | Tracking structure | Understanding sequence
What is included or excluded | Considering intention | Analyzing author’s choices
Foreground and background | Sorting information | Main vs supporting details
Light and shadow | Interpreting mood | Making inferences
Symbols and colour | Reading beyond the literal | Figurative language
Scale and angle | Judging power | Perspective and viewpoint
Repetition or pattern | Spotting themes | Theme identification
Contextual clues | Using surrounding detail | Context clues
Ambiguity | Holding multiple meanings | Critical reading
Evidence from the image | Justifying interpretation | Evidence-based responses
Once students recognise these moves, teachers can say explicitly:
“You’re doing the same thing you do when you read a paragraph.”
That moment of transfer is powerful.
Making AI image generation teachable (and safe)
In my classroom work pack, students use Perchance AI to generate images. I chose this tool deliberately: It is accessible, age-appropriate, and allows students to iterate, refining prompts based on compositional choices rather than chasing novelty.
Students don’t just generate an image once. They plan, revise, and evaluate.
This shifts AI use away from shortcut behavior and toward intentional design and reflection, supporting academic integrity rather than undermining it.
The progression of a prompt: From surface to depth (WAGOLL)
One of the most effective elements of the work pack is a WAGOLL (What A Good One Looks Like) progression, which shows students how thinking improves with precision.
Simple: A photorealistic image of a dog sitting in a park.
Secure: A photorealistic image of a dog positioned using the rule of thirds, warm colour palette, soft natural lighting, blurred background.
Greater Depth: A photorealistic image of a dog positioned using the rule of thirds, framed by tree branches, low-angle view, strong contrast, sharp focus on the subject, blurred background.
Students can see and explain how photographic language turns an image from output into meaningful signs. That explanation is where literacy lives.
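For classes that want to compare the tiers explicitly, the short sketch below lays the prompts side by side and lists the compositional language each level adds. The tier wording is taken from the WAGOLL progression above; the Python comparison itself is purely illustrative and not tied to any particular image tool.

```python
# Illustrative sketch: lay the three WAGOLL prompt tiers side by side and
# show which compositional terms each level adds, so students can see
# precision growing. The tier wording comes from the progression above.
BASE_SUBJECT = "A photorealistic image of a dog"

tiers = {
    "Simple": ["sitting in a park"],
    "Secure": ["positioned using the rule of thirds", "warm colour palette",
               "soft natural lighting", "blurred background"],
    "Greater Depth": ["positioned using the rule of thirds", "framed by tree branches",
                      "low-angle view", "strong contrast",
                      "sharp focus on the subject", "blurred background"],
}

seen: set[str] = set()
for name, terms in tiers.items():
    added = [t for t in terms if t not in seen]
    print(f"{name}: {BASE_SUBJECT}, {', '.join(terms)}")
    print(f"  New compositional language: {', '.join(added) if added else 'none'}")
    seen.update(terms)
```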
When classroom talk begins to change
Over time, classroom conversations shift.
Instead of “I like it” or “It looks real,” students begin to say:
“The creator wants us to notice…”
“This detail suggests…”
“At first I thought…, but now I think…”
These are reading sentences.
Because images feel accessible, more students participate. The classroom becomes slower, quieter, and more thoughtful–exactly the conditions we want for deep comprehension.
Visual literacy as a bridge, not an add-on
Visual literacy is not an extra subject competing for time. It is a bridge, especially in the age of AI.
By teaching students how to read images, schools strengthen:
reading comprehension
inference and evaluation
evidence-based reasoning
metacognitive awareness
Most importantly, students learn that literacy is not about rushing to answers, but about noticing, questioning, and constructing meaning.
In a world saturated with AI-generated images, teaching students how to read visually is no longer optional.
It is literacy.
Author’s note: This article grew out of classroom practice and professional dialogue with a former college lecturer and professional photographer. Their contribution informed the discussion of visual composition, semiotics, and reflective image-reading, without any involvement in publication or authorship.
Nesreen El-Baz, Bloomsbury Education Author & School Governor
Nesreen El-Baz is an ESL educator with over 20 years of experience and a certified bilingual teacher who holds a Master’s degree in Curriculum and Instruction from Houston Christian University. Currently based in the UK, she specializes in developing innovative strategies for English Learners and bilingual education.
For the last two years, conversations about AI in education have tended to fall into two camps: excitement about efficiency or fear of replacement. Teachers worry they’ll lose authenticity. Leaders worry about academic integrity. And across the country, schools are trying to make sense of a technology that feels both promising and overwhelming.
But there’s a quieter, more human-centered opportunity emerging–one that rarely makes the headlines: AI can actually strengthen empathy and improve the quality of our interactions with students and staff.
Not by automating relationships, but by helping us become more reflective, intentional, and attuned to the people we serve.
As a middle school assistant principal and a higher education instructor, I’ve found that AI is most valuable not as a productivity tool, but as a perspective-taking tool. When used thoughtfully, it supports the emotional labor of teaching and leadership–the part of our work that cannot be automated.
From efficiency to empathy
Schools do not thrive because we write faster emails or generate quicker lesson plans. They thrive because students feel known. Teachers feel supported. Families feel included.
AI can assist with the operational tasks, but the real potential lies in the way it can help us:
Reflect on tone before hitting “send” on a difficult email
Understand how a message may land for someone under stress
Role-play sensitive conversations with students or staff
Anticipate barriers that multilingual families might face
Rehearse a restorative response rather than reacting in the moment
These are human actions–ones that require situational awareness and empathy. AI can’t perform them for us, but it can help us practice and prepare for them.
A middle school use case: Preparing for the hard conversations
Middle school is an emotional ecosystem. Students are forming identity, navigating social pressures, and learning how to advocate for themselves. Staff are juggling instructional demands while building trust with young adolescents whose needs shift by the week.
Some days, the work feels like equal parts counselor, coach, and crisis navigator.
One of the ways I’ve leveraged AI is by simulating difficult conversations before they happen. For example:
A student is anxious about returning to class after an incident
A teacher feels unsupported and frustrated
A family is confused about a schedule change or intervention plan
By giving the AI a brief description and asking it to take on the perspective of the other person, I can rehearse responses that center calm, clarity, and compassion.
This has made me more intentional in real interactions–I’m less reactive, more prepared, and more attuned to the emotions beneath the surface.
Empathy improves when we get to “practice” it.
Supporting newcomers and multilingual learners
Schools like mine welcome dozens of newcomers each year, many with interrupted formal education. They bring extraordinary resilience–and significant emotional and linguistic needs.
AI tools can support staff in ways that deepen connection, not diminish it:
Drafting bilingual communication with a softer, more culturally responsive tone
Helping teachers anticipate trauma triggers based on student histories
Rewriting classroom expectations in family-friendly language
Generating gentle scripts for welcoming a student experiencing culture shock
The technology is not a substitute for bilingual staff or cultural competence. But it can serve as a bridge–helping educators reach families and students with more warmth, clarity, and accuracy.
When language becomes more accessible, relationships strengthen.
AI as a mirror for leadership
One unexpected benefit of AI is that it acts as a mirror. When I ask it to review the clarity of a communication, or identify potential ambiguities, it often highlights blind spots:
“This sentence may sound punitive.”
“This may be interpreted as dismissing the student’s perspective.”
“Consider acknowledging the parent’s concern earlier in the message.”
These are the kinds of insights reflective leaders try to surface–but in the rush of a school day, they are easy to miss.
AI doesn’t remove responsibility; it enhances accountability. It helps us lead with more emotional intelligence, not less.
What this looks like in teacher practice
For teachers, AI can support empathy in similarly grounded ways:
1. Building more inclusive lessons
Teachers can ask AI to scan a lesson for hidden barriers–assumptions about background knowledge, vocabulary loads, or unclear steps that could frustrate students.
2. Rewriting directions for struggling learners
A slight shift in wording can make all the difference for a student with anxiety or processing challenges.
3. Anticipating misconceptions before they happen
AI can run through multiple “student responses” so teachers can see where confusion might arise.
4. Practicing restorative language
Teachers can try out scripts for responding to behavioral issues in ways that preserve dignity and connection.
These aren’t shortcuts. They’re tools that elevate the craft.
Human connection is the point
The heart of education is human. AI doesn’t change that–in fact, it makes it more obvious.
When we reduce the cognitive load of planning, we free up space for attunement. When we rehearse hard conversations, we show up with more steadiness. When we write in more inclusive language, more families feel seen. When we reflect on our tone, we build trust.
The goal isn’t to create AI-enhanced classrooms. It’s to create relationship-centered classrooms where AI quietly supports the skills that matter most: empathy, clarity, and connection.
Schools don’t need more automation.
They need more humanity–and AI, used wisely, can help us get there.
Timothy Montalvo, Iona University & the College of Westchester
Timothy Montalvo is a middle school educator and leader passionate about leveraging technology to enhance student learning. He serves as Assistant Principal at Fox Lane Middle School in Westchester, NY, and teaches education courses as an adjunct professor at Iona University and the College of Westchester. Montalvo focuses on preparing students to be informed, active citizens in a digital world and shares insights on Twitter/X @MrMontalvoEDU or on BlueSky @montalvoedu.bsky.social.
The rapid rise of generative AI has turned classrooms into a real-time experiment in technology use. Students are using AI to complete assignments, while teachers are leveraging it to design lessons, streamline grading, and manage administrative tasks.
According to new national survey data from RAND, AI use among both students and educators has grown sharply–by more than 15 percentage points in just the past one to two years. Yet, training and policy have not kept pace. Schools and districts are still developing professional development, student guidance, and clear usage policies to manage this shift.
As a result, educators, students, and parents are navigating both opportunities and concerns. Students worry about being falsely accused of cheating, and many families fear that increased reliance on AI could undermine students’ critical thinking skills.
Key findings:
AI use grew rapidly during the 2024-2025 school year.
AI use in schools surged during the 2024-2025 academic year. By 2025, more than half of students (54 percent) and core subject teachers (53 percent) were using AI for schoolwork or instruction–up more than 15 points from just a year or two earlier. High school students were the most frequent users, and AI adoption among teachers climbed steadily from elementary to high school.
While students and parents express significant concern about the potential downsides of AI, school district leaders are far less worried.
Sixty-one percent of parents, 48 percent of middle school students, and 55 percent of high school students believe that increased use of AI could harm students’ critical-thinking skills, compared with just 22 percent of district leaders. Additionally, half of students said they worry about being falsely accused of using AI to cheat.
Training and policy development have not kept pace with AI use in schools.
By spring 2025, only 35 percent of district leaders said their schools provide students with training on how to use AI. Meanwhile, more than 80 percent of students reported that their teachers had not explicitly taught them how to use AI for schoolwork. Policy guidance also remains limited–just 45 percent of principals said their schools or districts have policies on AI use, and only 34 percent of teachers reported policies specifically addressing academic integrity and AI.
The report offers recommendations around AI use and guidance:
As AI technology continues to evolve, trusted sources–particularly state education agencies–should provide consistent, regularly updated guidance on effective AI policies and training. This guidance should help educators and students understand how to use AI as a complement to learning, not a replacement for it.
District and school leaders should clearly define what constitutes responsible AI use versus academic dishonesty and communicate these expectations to both teachers and students. In the near term, educators and students urgently need clarity on what qualifies as cheating with AI.
Elementary schools should also be included in this effort. Nearly half of elementary teachers are already experimenting with AI, and these early years are when students build foundational skills and habits. Providing age-appropriate, coherent instruction about AI at this stage can reduce misuse and confusion as students progress through school and as AI capabilities expand.
Ultimately, district leaders should develop comprehensive AI policies and training programs that equip teachers and students to use AI productively and ethically across grade levels.
Laura Ascione is the Editorial Director at eSchool Media. She is a graduate of the University of Maryland’s prestigious Philip Merrill College of Journalism.
If you’ve attended a professional show or musical recently, chances are you’ve seen virtual set design in action. This approach to stage production has gained so much traction it’s now a staple in the industry. After gaining momentum in professional theater, it has made its way into collegiate performing arts programs and is now emerging in K-12 productions as well.
Virtual set design offers a modern alternative to traditional physical stage sets, using technology and software to create immersive backdrops and environments. This approach unlocks endless creative possibilities for schools while also providing practical advantages.
Here, I’ll delve into three key benefits: increasing student engagement and participation, improving efficiency and flexibility in productions, and expanding educational opportunities.
Increasing student engagement and participation
Incorporating virtual set design into productions gets students excited about learning new skills while enhancing the storytelling of a show. When I first joined Churchill High School in Livonia, Michigan as the performing arts manager, the first show we did was Shrek the Musical, and I knew it would require an elaborate set. While students usually work together to paint the various backdrops that bring the show to life, I wanted to introduce them to collaborating on virtual set design.
We set up Epson projectors on the fly rail and used them to project images as the show’s backdrops. Positioned at a short angle, the projectors avoided any shadowing on stage. To create a seamless image with both projectors, we utilized edge-blending and projection mapping techniques using just a Mac® laptop and QLab software. Throughout the performance, the projectors transformed the stage with a dozen dynamic backdrops, shifting from a swamp to a castle to a dungeon.
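For crews that want to script those backdrop changes rather than trigger each one by hand at the console, QLab can also be driven over the network. The sketch below is a minimal, hypothetical example that uses Python with the python-osc library to fire cues remotely; the host address and cue numbers are placeholders, and it assumes OSC is enabled in QLab on its default port, so treat it as a starting point rather than a recipe.

```python
# Minimal sketch: trigger QLab backdrop cues over OSC from Python.
# Assumes QLab is reachable on the network with OSC enabled on its default
# port (53000) and that each backdrop is a numbered cue in the workspace.
# The host address and cue numbers below are placeholders.
from pythonosc.udp_client import SimpleUDPClient

QLAB_HOST = "192.168.1.50"   # hypothetical address of the show Mac
QLAB_PORT = 53000            # QLab's default incoming OSC port

client = SimpleUDPClient(QLAB_HOST, QLAB_PORT)

def show_backdrop(cue_number: str) -> None:
    """Start the cue that holds a given projected backdrop."""
    client.send_message(f"/cue/{cue_number}/start", [])

def next_backdrop() -> None:
    """Advance to the next cue, like pressing GO in QLab."""
    client.send_message("/go", [])

if __name__ == "__main__":
    show_backdrop("3")   # e.g., jump straight to the swamp backdrop
    next_backdrop()      # then step to the following scene
```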
Students were amazed by the technology and very excited to learn how to integrate it into the set design process. Their enthusiasm created a real buzz around the production, and the community’s feedback on the final results was overwhelmingly positive.
Improving efficiency and flexibility
During Shrek the Musical, there were immediate benefits that made it so much easier to put together a show. To start, we saved money by eliminating the need to build multiple physical sets. While we were cutting costs on lumber and materials, we were also solving design challenges and expanding what was possible on stage.
This approach also saved us valuable time. Preparing the sets in the weeks leading up to the show was faster, and transitions during performances became seamless. Instead of moving bulky scenery between scenes or acts, the stage crew simply switched out projected images, making scene changes far more efficient.
We saw even more advantages in our spring production of She Kills Monsters. Some battle scenes called for 20 or 30 actors to be on stage at once, which would have been difficult to manage with a traditional set. Using virtual production, we broke the stage up with spaced-apart panels and projected our designs onto them, creating more room for performers. We saved physical space while creating a design that helped with stage blocking and made it easier for students to find their spots.
Since using virtual sets, our productions have become smoother, more efficient, and more creative.
Expanding educational opportunities
Beyond the practical benefits, virtual set design also creates valuable learning opportunities for students. Students involved in productions gain exposure to industry-level technology and learn about careers in the arts, audio, and video technology fields. Introducing students to these opportunities before graduating high school can really help prepare them for future success.
Additionally, in our school’s technical theater courses, students are learning lessons on virtual design and gaining hands-on experience. As they explore potential career paths, they are developing collaboration and other transferable skills that directly connect to college and career readiness.
Looking ahead with virtual set design
Whether students are interested in graphic design, sound engineering, or visual technology, virtual production brings them countless opportunities to explore. It allows them to experiment with tools and concepts that connect directly to potential college majors or future careers.
For schools, incorporating virtual production into high school theater offers more than just impressive shows. It provides a cost-effective, flexible, and innovative approach to storytelling. It is a powerful tool that benefits productions, enriches student learning, and prepares the next generation of artists and innovators.
When I first started experimenting with AI in my classroom, I saw the same thing repeatedly from students. They treated it like Google. Ask a question, get an answer, move on. It didn’t take long to realize that if my students only engage with AI this way, they miss the bigger opportunity to use AI as a partner in thinking. AI isn’t a magic answer machine. It’s a tool for creativity and problem-solving. The challenge for us as educators is to rethink how we prepare students for the world they’re entering and to use AI with curiosity and fidelity.
Moving from curiosity to fluency
In my district, I wear two hats: history teacher and instructional coach. That combination gives me the space to test ideas in the classroom and support colleagues as they try new tools. What I’ve learned is that AI fluency requires far more than knowing how to log into a platform. Students need to learn how to question outputs, verify information and use results as a springboard for deeper inquiry.
I often remind them, “You never trust your source. You always verify and compare.” If students accept every AI response at face value, they’re not building the critical habits they’ll need in college or in the workforce.
To make this concrete, I teach my students the RISEN framework: Role, Instructions, Steps, Examples, Narrowing. It helps them craft better prompts and think about the kind of response they want. Instead of typing “explain photosynthesis,” they might ask, “Act as a biologist explaining photosynthesis to a tenth grader. Use three steps with an analogy, then provide a short quiz at the end.” Suddenly, the interaction becomes purposeful, structured and reflective of real learning.
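To make the structure visible, here is a small, purely illustrative Python sketch that assembles a RISEN-style prompt from its five parts. The field names simply mirror Role, Instructions, Steps, Examples, and Narrowing; nothing here is tied to a specific AI platform, and the assembled text can be pasted into whatever tool a class is using.

```python
# Illustrative sketch: assemble a RISEN-style prompt from its five parts
# (Role, Instructions, Steps, Examples, Narrowing). Platform-agnostic; the
# built string can be pasted into any AI tool.
from dataclasses import dataclass

@dataclass
class RisenPrompt:
    role: str          # who the AI should act as
    instructions: str  # the core task
    steps: str         # how the response should be structured
    examples: str      # analogies or models to include
    narrowing: str     # constraints on audience, length, or format

    def build(self) -> str:
        return (f"Act as {self.role}. {self.instructions} "
                f"{self.steps} {self.examples} {self.narrowing}")

# The photosynthesis example from the paragraph above, expressed as RISEN parts.
prompt = RisenPrompt(
    role="a biologist explaining photosynthesis to a tenth grader",
    instructions="Explain the process clearly.",
    steps="Use three steps.",
    examples="Include an analogy.",
    narrowing="Then provide a short quiz at the end.",
)
print(prompt.build())
```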
AI as a catalyst for equity and personalization
Growing up, I was lucky. My mom was college educated and sat with me to go over almost every paper I wrote. She gave me feedback that helped to sharpen my writing and build my confidence. Many of my students don’t have that luxury. For these learners, AI can be the academic coach they might not otherwise have.
That doesn’t mean AI replaces human connection. Nothing can. But it can offer feedback, ask guiding questions, and provide examples, giving students a sounding board and thought partner. It’s one more way to move closer to personalized support for learners based on need.
Of course, equity cuts both ways. If only some students have access to AI or if we use it without considering its bias, we risk widening the very gaps we hope to close. That’s why it’s our job as educators to model ethical and critical use, not just the mechanics.
Shifting how we assess learning
One of the biggest shifts I’ve made is rethinking how I assess students. If I only grade the final product, I’m essentially inviting them to use AI as a shortcut. Instead, I focus on the process: How did they engage with the tool? How did they verify and cross-reference results? How did they revise their work based on what they learned? What framework guided their inquiry? In this way, AI becomes part of their learning journey rather than just an endpoint.
I’ve asked students to run the same question through multiple AI platforms and then compare the outputs. What were the differences? Which response feels most accurate or useful? What assumptions might be at play? These conversations push students to defend their thinking and use AI critically, not passively.
Navigating privacy and policy
Another responsibility we carry as educators is protecting our students. Data privacy is a serious concern. In my school, we use a “walled garden” version of AI so that student data doesn’t get used for training. Even with those safeguards in place, I remind colleagues never to enter identifiable student information into a tool.
Policies will continue to evolve, but for day-to-day activities and planning, teachers need to model caution and responsibility. Students are taking our lead.
Professional growth for a changing profession
The truth of the matter is most of us have not been professionally trained to do this. My teacher preparation program certainly did not include modules on prompt engineering or data ethics. That means professional development in this space is a must.
I’ve grown the most in my AI fluency by working alongside other educators who are experimenting, sharing stories, and comparing notes. AI is moving fast. No one has all the answers. But we can build confidence together by trying, reflecting, and adjusting through shared experience and lessons learned. That’s exactly what we’re doing in the Lead for Learners network. It’s a space where educators from across the country connect, learn and support one another in navigating change.
For educators who feel hesitant, I’d say this: You don’t need to be an expert to start. Pick one tool, test it in one lesson, and talk openly with your students about what you’re learning. They’ll respect your honesty and join you in the process.
Preparing students for what’s next
AI is not going away. Whether we’re ready or not, it’s going to shape how our students live and work. That gives us a responsibility not just to keep pace with technology but to prepare young people for what’s ahead. The latest futures forecast reminds us that imagining possibilities is just as important as responding to immediate shifts.
We need to understand both how AI is already reshaping education delivery and how new waves of change will remain on the horizon as tools grow more sophisticated and widespread.
I want my students to leave my classroom with the ability to question, create, and collaborate using AI. I want them to see it not as a shortcut but as a tool for thinking more deeply and expressing themselves more fully. And I want them to watch me modeling those same habits: curiosity, caution, creativity, and ethical decision-making. Because if we don’t show them what responsible use looks like, who will?
The future of education won’t be defined by whether we allow AI into our classrooms. It will be defined by how we teach with it, how we teach about it, and how we prepare our students to thrive in a world where it’s everywhere.
Ian McDougall, Yuma Union High School District
Ian McDougall is a history teacher and edtech coach at Yuma Union High School District in Arizona. He also facilitates the Lead for Learners Community, an online hub for learner-centered educators nationwide. With extensive experience in K–12 education and technology integration, Ian supports schools in adopting innovative practices through professional development and instructional coaching. He holds a master’s degree in United States history from Adams State University, further strengthening his expertise as both a teacher and coach.
AI is now at the center of almost every conversation in education technology. It is reshaping how we create content, build assessments, and support learners. The opportunities are enormous. But one quiet risk keeps growing in the background: losing our habit of critical thinking.
I see this risk not as a theory but as something I have felt myself.
The moment I almost outsourced my judgment
A few months ago, I was working on a complex proposal for a client. Pressed for time, I asked an AI tool to draft an analysis of their competitive landscape. The output looked polished and convincing. It was tempting to accept it and move on.
Then I forced myself to pause. I began questioning the sources behind the statements and found a key market shift the model had missed entirely. If I had skipped that short pause, the proposal would have gone out with a blind spot that mattered to the client.
That moment reminded me that AI is fast and useful, but the responsibility for real thinking is still mine. It also showed me how easily convenience can chip away at judgment.
AI as a thinking partner
The most powerful way to use AI is to treat it as a partner that widens the field of ideas while leaving the final call to us. AI can collect data in seconds, sketch multiple paths forward, and expose us to perspectives we might never consider on our own.
In my own work at Magic EdTech, for example, our teams have used AI to quickly analyze thousands of pages of curriculum to flag accessibility issues. The model surfaces patterns and anomalies that would take a human team weeks to find. Yet the real insight comes when we bring educators and designers together to ask why those patterns matter and how they affect real classrooms. AI sets the table, but we still cook the meal.
There is a subtle but critical difference between using AI to replace thinking and using it to stretch thinking. Replacement narrows our skills over time. Stretching builds new mental flexibility. The partner model forces us to ask better questions, weigh trade-offs, and make calls that only human judgment can resolve.
Habits to keep your edge
Protecting critical thinking is not about avoiding AI. It is about building habits that keep our minds active when AI is everywhere.
Here are three I find valuable:
1. Name the fragile assumption
Each time you receive AI output, ask: What is one assumption here that could be wrong? Spend a few minutes digging into that. It forces you to reenter the problem space instead of just editing machine text.
2. Run the reverse test
Before you adopt an AI-generated idea, imagine the opposite. If the model suggests that adaptive learning is the key to engagement, ask: What if it is not? Exploring the counter-argument often reveals gaps and deeper insights.
3. Slow the first draft
It is tempting to let AI draft emails, reports, or code and just sign off. Instead, start with a rough human outline first. Even if it is just bullet points, you anchor the work in your own reasoning and use the model to enrich–not originate–your thinking.
These small practices keep the human at the center of the process and turn AI into a gym for the mind rather than a crutch.
Why this matters for education
For those of us in education technology, the stakes are unusually high. The tools we build help shape how students learn and how teachers teach. If we let critical thinking atrophy inside our companies, we risk passing that weakness to the very people we serve.
Students will increasingly use AI for research, writing, and even tutoring. If the adults designing their digital classrooms accept machine answers without question, we send the message that surface-level synthesis is enough. We would be teaching efficiency at the cost of depth.
By contrast, if we model careful reasoning and thoughtful use of AI, we can help the next generation see these tools for what they are: accelerators of understanding, not replacements for it. AI can help us scale accessibility, personalize instruction, and analyze learning data in ways that were impossible before. But its highest value appears only when it meets human curiosity and judgment.
Building a culture of shared judgment
This is not just an individual challenge. Teams need to build rituals that honor slow thinking in a fast AI environment. One practice is rotating the role of “critical friend” in meetings. One person’s task is to challenge the group’s AI-assisted conclusions and ask what could go wrong. This simple habit trains everyone to keep their reasoning sharp.
Next time you lean on AI for a key piece of work, pause before you accept the answer. Write down two decisions in that task that only a human can make. It might be about context, ethics, or simple gut judgment. Then share those reflections with your team. Over time this will create a culture where AI supports wisdom rather than diluting it.
The real promise of AI is not that it will think for us, but that it will free us to think at a higher level.
The danger is that we may forget to climb.
The future of education and the integrity of our own work depend on remaining climbers. Let the machines speed the climb, but never let them choose the summit.
K-12 IT leaders are under pressure from all sides–rising cyberattacks, the end of Windows 10 support, and the need for powerful new learning tools.
The good news: Windows 11 on Lenovo devices delivers more than an upgrade–it’s a smarter, safer foundation for digital learning in the age of AI.
Delaying the move means greater risk, higher costs, and missed opportunities. With proven ROI, cutting-edge protection, and tools that empower both teachers and students, the case for Windows 11 is clear.
1. Harness AI-powered educational innovation with Copilot
Windows 11 integrates Microsoft Copilot AI capabilities that transform teaching and learning. Teachers can leverage AI for lesson planning, content creation, and administrative tasks, while students benefit from enhanced collaboration tools and accessibility features.
2. Combat the explosive rise in school cyberattacks
The statistics are alarming: K-12 ransomware attacks increased 92 percent between 2022 and 2023, with human-operated ransomware attacks surging over 200 percent globally, according to the 2024 State of Ransomware in Education.
3. Start migration planning now
Time is critically short. Windows 10 support ended in October 2025, leaving schools running unsupported systems vulnerable to attacks and compliance violations. Starting migration planning immediately ensures adequate time for device inventory, compatibility testing, and smooth district-wide deployment.
In the growing conversation around AI in education, speed and efficiency often take center stage, but that focus can tempt busy educators to use what’s fast rather than what’s best. To truly serve teachers–and above all, students–AI must be built with intention and clear constraints that prioritize instructional quality, ensuring efficiency never comes at the expense of what learners need most.
AI doesn’t inherently understand fairness, instructional nuance, or educational standards. It mirrors its training and guidance, usually as a capable generalist rather than a specialist. Without deliberate design, AI can produce content that’s misaligned or confusing. In education, fairness means an assessment measures only the intended skill and does so comparably for students from different backgrounds, languages, and abilities–without hidden barriers unrelated to what’s being assessed. Effective AI systems in schools need embedded controls to avoid construct‑irrelevant content: elements that distract from what’s actually being measured.
For example, a math question shouldn’t hinge on dense prose, niche sports knowledge, or culturally specific idioms unless those are part of the goal; visuals shouldn’t rely on low-contrast colors that are hard to see; audio shouldn’t assume a single accent; and timing shouldn’t penalize students if speed isn’t the construct.
To improve fairness and accuracy in assessments:
Avoid construct-irrelevant content: Ensure test questions focus only on the skills and knowledge being assessed.
Use AI tools with built-in fairness controls: Generic AI models may not inherently understand fairness; choose tools designed specifically for educational contexts.
Train AI on expert-authored content: AI is only as fair and accurate as the data and expertise it’s trained on. Use models built with input from experienced educators and psychometricians.
These subtleties matter. General-purpose AI tools, left untuned, often miss them.
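To make the idea of an embedded control concrete, here is a deliberately simple, hypothetical Python sketch that screens a draft math item for two construct-irrelevant signals: heavy reading load and culturally specific idioms. Real fairness review relies on psychometric analysis, bias panels, and accessibility audits, so the threshold and word list below are stand-ins meant only to illustrate the concept.

```python
# Toy, hypothetical screen for construct-irrelevant features in a draft math
# item. The threshold and idiom list are illustrative stand-ins; real review
# involves psychometricians, bias panels, and accessibility audits.
MAX_WORDS_FOR_MATH_ITEM = 40   # assumed reading-load threshold
IDIOMS = {"piece of cake", "home run", "ballpark figure", "back to square one"}

def flag_construct_irrelevant(item_text: str) -> list[str]:
    """Return warnings about features unrelated to the math skill being measured."""
    text = item_text.lower()
    warnings = []
    if len(text.split()) > MAX_WORDS_FOR_MATH_ITEM:
        warnings.append("Dense prose: reading load may exceed what the item intends to measure.")
    for idiom in IDIOMS:
        if idiom in text:
            warnings.append(f"Culturally specific idiom found: '{idiom}'.")
    return warnings

draft = ("If hitting a home run scores 4 points and a team hits 3 of them, "
         "how many points does the team score?")
for warning in flag_construct_irrelevant(draft):
    print(warning)
```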
The risk of relying on convenience
Educators face immense time pressures. It’s tempting to use AI to quickly generate assessments or learning materials. But speed can obscure deeper issues. A question might look fine on the surface but fail to meet cognitive complexity standards or align with curriculum goals. These aren’t always easy problems to spot, but they can impact student learning.
To choose the right AI tools:
Select domain-specific AI over general models: Tools tailored for education are more likely to produce pedagogically sound and standards-aligned content that empowers students to succeed. In a 2024 University of Pennsylvania study, students using a customized AI tutor scored 127 percent higher on practice problems than those without.
Be cautious with out-of-the-box AI: Without expertise, educators may struggle to critique or validate AI-generated content, risking poor-quality assessments.
Understand the limitations of general AI: While capable of generating content, general models may lack depth in educational theory and assessment design.
General AI tools can get you 60 percent of the way there. But that last 40 percent is the part that ensures quality, fairness, and educational value. This requires expertise to get right. That’s where structured, guided AI becomes essential.
Building AI that thinks like an educator
Developing AI for education requires close collaboration with psychometricians and subject matter experts to shape how the system behaves. This helps ensure it produces content that’s not just technically correct, but pedagogically sound.
To ensure quality in AI-generated content:
Involve experts in the development process: Psychometricians and educators should review AI outputs to ensure alignment with learning goals and standards.
Use manual review cycles: Unlike benchmark-driven models, educational AI requires human evaluation to validate quality and relevance.
Focus on cognitive complexity: Design assessments with varied difficulty levels and ensure they measure intended constructs.
This process is iterative and manual. It’s grounded in real-world educational standards, not just benchmark scores.
Personalization needs structure
AI’s ability to personalize learning is promising. But without structure, personalization can lead students off track. AI might guide learners toward content that’s irrelevant or misaligned with their goals. That’s why personalization must be paired with oversight and intentional design.
To harness personalization responsibly:
Let experts set goals and guardrails: Define standards, scope and sequence, and success criteria; AI adapts within those boundaries.
Use AI for diagnostics and drafting, not decisions: Have it flag gaps, suggest resources, and generate practice, while educators curate and approve.
Preserve curricular coherence: Keep prerequisites, spacing, and transfer in view so learners don’t drift into content that’s engaging but misaligned.
Support educator literacy in AI: Professional development is key to helping teachers use AI effectively and responsibly.
It’s not enough to adapt–the adaptation must be meaningful and educationally coherent.
AI can accelerate content creation and internal workflows. But speed alone isn’t a virtue. Without scrutiny, fast outputs can compromise quality.
To maintain efficiency and innovation:
Use AI to streamline internal processes: Beyond student-facing tools, AI can help educators and institutions build resources faster and more efficiently.
Maintain high standards despite automation: Even as AI accelerates content creation, human oversight is essential to uphold educational quality.
Responsible use of AI requires processes that ensure every AI-generated item is part of a system designed to uphold educational integrity.
An effective approach to AI in education is driven by concern–not fear, but responsibility. Educators are doing their best under challenging conditions, and the goal should be building AI tools that support their work.
When frameworks and safeguards are built-in, what reaches students is more likely to be accurate, fair, and aligned with learning goals.
In education, trust is foundational. And trust in AI starts with thoughtful design, expert oversight, and a deep respect for the work educators do every day.
Nick Koprowicz, Prometric
Nick Koprowicz is an applied AI scientist at Prometric, a global leader in credentialing and skills development.
In my classroom, students increasingly ask for relevant content. Students want to know how what they are learning in school relates to the world beyond the classroom. They want to be engaged in their learning.
In fact, the 2025-2026 Education Insights Report vividly proves that students need and want engaging learning experiences. And it’s not just students who see engagement as important. Engagement is broadly recognized as a key driver of learning and success, with 93 percent of educators agreeing that student engagement is a critical metric for understanding overall achievement. What is more, 99 percent of superintendents believe student engagement is one of the top predictors of success at school.
Creating highly engaging lesson plans that will immerse today’s tech-savvy students in learning can be a challenge, but here are two easy-to-find resources I turn to when I want to turbo-charge the engagement quotient of my lessons:
Virtual field trips
Virtual field trips empower educators to introduce students to amazing places, new people and ideas, and remarkable experiences–without ever leaving the classroom. There are so many virtual field trips out there, but I always love the ones that Discovery Education creates with partners.
I also love the virtual tours of the Smithsonian National Museum of Natural History. Together as a class or individually, students can dive into self-guided, room-by-room tours of several exhibits and areas within the museum from a desktop or smart device. The tours also include special collections and research areas, like ancient Egypt and the deep ocean, which makes it fun and easy for teachers like me to pick and choose which tour is most relevant to a lesson.
Immersive learning resources
Immersive learning content offers another way to take students to new places and connect the wider world, and universe, to the classroom. Immersive learning can be easily woven into the curriculum to enhance lessons and provide context.
One immersive learning solution I really like is TimePod Adventures from Verizon. It features free time-traveling episodes designed to engage students in places like Mars and prehistoric Earth. Now accessible directly through a web browser on a laptop, Chromebook, or mobile device, students need only internet access and audio output to begin the journey. Guided by an AI-powered assistant and featuring grade-band specific lesson plans, these missions across time and space encourage students to take control, explore incredible environments, and solve complex challenges.
Immersive learning content can be overwhelming at first, but professional development resources are available to help educators build confidence while earning microcredentials. These resources let educators quickly dive into new and innovative techniques and teaching strategies that help increase student engagement.
Taken together, engaging learning opportunities are ones that show students how classroom learning directly connects to their real lives. With resources like virtual field trips and immersive learning content, students can dive into school topics in ways that are fun, fresh, and sometimes otherworldly.
Leia J. DePalo, Northport-East Northport Union Free School District
Leia J. (LJ) DePalo is an Elementary STEM and Future Forward Teacher (FFT) in the Northport-East Northport School District with over 20 years of experience in education. LJ holds a Master of Science in Literacy and permanent New York State teaching certifications in Elementary Education, Speech, and Computer Science. A dedicated innovator, she collaborates with teachers to design technology-infused lessons, leads professional development, and choreographs award-winning school musicals. In recognition of her creativity and impact, DePalo was named a 2025 Innovator Grant recipient.