  • Using generative tools to deepen, not replace, human connection in schools

    For the last two years, conversations about AI in education have tended to fall into two camps: excitement about efficiency or fear of replacement. Teachers worry they’ll lose authenticity. Leaders worry about academic integrity. And across the country, schools are trying to make sense of a technology that feels both promising and overwhelming.

    But there’s a quieter, more human-centered opportunity emerging–one that rarely makes the headlines: AI can actually strengthen empathy and improve the quality of our interactions with students and staff.

    Not by automating relationships, but by helping us become more reflective, intentional, and attuned to the people we serve.

    As a middle school assistant principal and a higher education instructor, I’ve found that AI is most valuable not as a productivity tool, but as a perspective-taking tool. When used thoughtfully, it supports the emotional labor of teaching and leadership–the part of our work that cannot be automated.

    From efficiency to empathy

    Schools do not thrive because we write faster emails or generate quicker lesson plans. They thrive because students feel known, teachers feel supported, and families feel included.

    AI can assist with the operational tasks, but the real potential lies in the way it can help us:

    • Reflect on tone before hitting “send” on a difficult email
    • Understand how a message may land for someone under stress
    • Role-play sensitive conversations with students or staff
    • Anticipate barriers that multilingual families might face
    • Rehearse a restorative response rather than reacting in the moment

    These are human actions–ones that require situational awareness and empathy. AI can’t perform them for us, but it can help us practice and prepare for them.

    A middle school use case: Preparing for the hard conversations

    Middle school is an emotional ecosystem. Students are forming identity, navigating social pressures, and learning how to advocate for themselves. Staff are juggling instructional demands while building trust with young adolescents whose needs shift by the week.

    Some days, the work feels like equal parts counselor, coach, and crisis navigator.

    One of the ways I’ve leveraged AI is by simulating difficult conversations before they happen. For example:

    • A student is anxious about returning to class after an incident
    • A teacher feels unsupported and frustrated
    • A family is confused about a schedule change or intervention plan

    By giving the AI a brief description and asking it to take on the perspective of the other person, I can rehearse responses that center calm, clarity, and compassion.

    This has made me more intentional in real interactions–I’m less reactive, more prepared, and more attuned to the emotions beneath the surface.

    Empathy improves when we get to “practice” it.

    Supporting newcomers and multilingual learners

    Schools like mine welcome dozens of newcomers each year, many with interrupted formal education. They bring extraordinary resilience–and significant emotional and linguistic needs.

    AI tools can support staff in ways that deepen connection, not diminish it:

    • Drafting bilingual communication with a softer, more culturally responsive tone
    • Helping teachers anticipate trauma triggers based on student histories
    • Rewriting classroom expectations in family-friendly language
    • Generating gentle scripts for welcoming a student experiencing culture shock

    The technology is not a substitute for bilingual staff or cultural competence. But it can serve as a bridge–helping educators reach families and students with more warmth, clarity, and accuracy.

    When language becomes more accessible, relationships strengthen.

    AI as a mirror for leadership

    One unexpected benefit of AI is that it acts as a mirror. When I ask it to review a communication for clarity or to identify potential ambiguities, it often highlights blind spots:

    • “This sentence may sound punitive.”
    • “This may be interpreted as dismissing the student’s perspective.”
    • “Consider acknowledging the parent’s concern earlier in the message.”

    These are the kinds of insights reflective leaders try to surface–but in the rush of a school day, they are easy to miss.

    AI doesn’t remove responsibility; it enhances accountability. It helps us lead with more emotional intelligence, not less.

    What this looks like in teacher practice

    For teachers, AI can support empathy in similarly grounded ways:

    1. Building more inclusive lessons

    Teachers can ask AI to scan a lesson for hidden barriers–assumptions about background knowledge, vocabulary loads, or unclear steps that could frustrate students.

    2. Rewriting directions for struggling learners

    A slight shift in wording can make all the difference for a student with anxiety or processing challenges.

    3. Anticipating misconceptions before they happen

    AI can run through multiple “student responses” so teachers can see where confusion might arise.

    4. Practicing restorative language

    Teachers can try out scripts for responding to behavioral issues in ways that preserve dignity and connection.

    These aren’t shortcuts. They’re tools that elevate the craft.

    Human connection is the point

    The heart of education is human. AI doesn’t change that–in fact, it makes it more obvious.

    When we reduce the cognitive load of planning, we free up space for attunement.
    When we rehearse hard conversations, we show up with more steadiness.
    When we write in more inclusive language, more families feel seen.
    When we reflect on our tone, we build trust.

    The goal isn’t to create AI-enhanced classrooms. It’s to create relationship-centered classrooms where AI quietly supports the skills that matter most: empathy, clarity, and connection.

    Schools don’t need more automation.

    They need more humanity–and AI, used wisely, can help us get there.


  • The new AI tools are fast but can’t replace the judgment, care and cultural knowledge teachers bring to the table

    by Tanishia Lavette Williams, The Hechinger Report
    November 4, 2025

    The year I co-taught world history and English language arts with two colleagues, we were tasked with telling the story of the world in 180 days to about 120 ninth graders. We invited students to consider how texts and histories speak to one another: “The Analects” as imperial governance, “Sundiata” as Mali’s political memory, “Julius Caesar” as a window into the unraveling of a republic. 

    By winter, our students had given us nicknames. Some days, we were a triumvirate. Some days, we were Cerberus, the three-headed hound of Hades. It was a joke, but it held a deeper meaning. Our students were learning to make connections by weaving us into the histories they studied. They were building a worldview, and they saw themselves in it. 

    Designed to foster critical thinking, this teaching was deeply human. It involved combing through texts for missing voices, adapting lessons to reflect the interests of the students in front of us and trusting that learning, like understanding, unfolds slowly. That labor can’t be optimized for efficiency. 

    Yet, today, there’s a growing push to teach faster. Thousands of New York teachers are being trained to use AI tools for lesson planning, part of a $23 million initiative backed by OpenAI, Microsoft and Anthropic. The program promises to reduce teacher burnout and streamline planning. At the same time, a new private school in Manhattan is touting an AI-driven model that “speed-teaches” core subjects in just two hours of instruction each day while deliberately avoiding politically controversial issues. 

    Marketed as innovation, this stripped-down vision of education treats learning as a technical output rather than as a human process in which students ask hard questions and teachers cultivate the critical thinking that fuels curiosity. A recent analysis of AI-generated civics lesson plans found that they consistently lacked multicultural content and prompts for critical thinking. These AI tools are fast, but shallow. They fail to capture the nuance, care and complexity that deep learning demands. 


    When I was a teacher, I often reviewed lesson plans to help colleagues refine their teaching practices. Later, as a principal in Washington, D.C., and New York City, I came to understand that lesson plans, the documents connecting curriculum and achievement, were among the few steady examples of classroom practice. Despite their importance, lesson plans were rarely evaluated for their effectiveness.  

    When I wrote my dissertation, after 20 years of working in schools, lesson plan analysis was a core part of my research. Analyzing plans across multiple schools, I found that the activities and tasks included in lesson plans were reliable indicators of the depth of knowledge teachers required and, by extension, the limits of what students were asked to learn. 

    Reviewing hundreds of plans made clear that most lessons rarely offered more than a single dominant voice — and thus confined both what counted as knowledge and what qualified as achievement. Shifting plans toward deeper, more inclusive student learning required deliberate effort to incorporate primary sources, weave together multiple narratives and design tasks that push students beyond mere recall. 

     I also found that creating the conditions for such learning takes time. There is no substitute for that. Where this work took hold, students were making meaning, seeing patterns, asking why and finding themselves in the story. 

    That’s the transformation AI can’t deliver. When curriculum tools are trained on the same data that has long omitted perspectives, they don’t correct bias; they reproduce it. The developers of ChatGPT acknowledge that the model is “skewed toward Western views and performs best in English” and warn educators to review its content carefully for stereotypes and bias. Those same distortions appear at the systems level — a 2025 study in the World Journal of Advanced Research and Reviews found that biased educational algorithms can shape students’ educational paths and create new structural barriers. 

    Ask an AI tool for a lesson on westward expansion, and you’ll get a tidy narrative about pioneers and Manifest Destiny. Request a unit on the Civil Rights Movement and you may get a few lines on Martin Luther King Jr., but hardly a word about Ella Baker, Fannie Lou Hamer or the grassroots organizers who made the movement possible. Native nations, meanwhile, are reduced to footnotes or omitted altogether. 

    Curriculum redlining — the systematic exclusion or downplaying of entire histories, perspectives and communities — has already been embedded in educational materials for generations. So what happens when “efficiency” becomes the goal? Whose histories are deemed too complex, too political or too inconvenient to make the cut? 


    None of this is theoretical. It’s already happening in classrooms across the country. Educators are under pressure to teach more with less: less time, fewer resources, narrower guardrails. AI promises relief but overlooks profound ethical questions. 

    Students don’t benefit from autogenerated worksheets. They benefit from lessons that challenge them, invite them to wrestle with complexity and help them connect learning to the world around them. That requires deliberate planning and professional judgment from a human who views education as a mechanism to spark inquiry. 

    Recently, I asked my students at Brandeis University to use AI to generate a list of individuals who embody concepts such as beauty, knowledge and leadership. The results, overwhelmingly white, male and Western, mirrored what is pervasive in textbooks.  

    My students responded with sharp analysis. One student created color palettes to demonstrate the narrow scope of skin tones generated by AI. Another student developed a “Missing Gender” summary to highlight omissions. It was a clear reminder that students are ready to think critically but require opportunities to do so.  

    AI can only do what it’s programmed to do, which means it draws from existing, stratified information and lags behind new paradigms. That makes it both backward-looking and vulnerable to reproducing bias.  

    Teaching with humanity, by contrast, requires judgment, care and cultural knowledge. These are qualities no algorithm can automate. When we surrender lesson planning to AI, we don’t just lose stories; we also lose the opportunity to engage with them. We lose the critical habits of inquiry and connection that teaching is meant to foster. 

    Tanishia Lavette Williams is the inaugural education stratification postdoctoral fellow at the Institute on Race, Power and Political Economy, a Kay fellow at Brandeis University and a visiting scholar at Harvard University. 

    Contact the opinion editor at [email protected].  

    This story about AI and teaching was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/opinion-the-new-ai-tools-are-fast-but-cant-replace-the-judgment-care-and-cultural-knowledge-teachers-bring-to-the-table/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).



  • College grad unemployment surges as employers replace new hires with AI (CBS News)

    The unemployment rate for new college graduates has recently surged. Economists say businesses are now replacing entry-level jobs with artificial intelligence.