Tag: Age

  • Teaching Students Agency in the Age of AI

Students have little opportunity to practice agency when an LMS tracks their assignments for them, when they're not encouraged to explore different majors, and when colleges shrink general education requirements, according to writer and educator John Warner.

    In the latest episode of The Key, Inside Higher Ed’s news and analysis podcast, Warner tells IHE’s editor in chief, Sara Custer, that colleges should refocus on teaching students how to learn and grow.

    “Agency writ large is the thing we need to survive as people … but it’s also a fundamental part of learning, particularly writing.”

    Warner argues that with the arrival of AI, helping students develop agency is even more of an imperative for higher education institutions.

    “AI is a homework machine … Our response cannot be ‘you’re just going to make this thing using AI now,’” Warner said. “More importantly than this is not learning anything, it is a failure to confront [the question]: What do we, as humans, do now with this technology?”

    Warner also shares what he’s learned from consulting and speaking about teaching and AI at campuses across the country. Ultimately, he says, faculty can work with AI in a way that still aligns with their institutional values.

    Listen to the full episode.

  • Teaching visual literacy as a core reading strategy in the age of AI

    Many years ago, around 2010, I attended a professional development program in Houston called Literacy Through Photography, at a time when I was searching for practical ways to strengthen comprehension, discussion, and reading fluency, particularly for students who found traditional print-based tasks challenging. As part of the program, artists visited my classroom and shared their work with students. Much of that work was abstract. There were no obvious answers and no single “correct” interpretation.

    Instead, students were invited to look closely, talk together, and explain what they noticed.

    What struck me was how quickly students, including those who struggled with traditional reading tasks, began to engage. They learned to slow down, describe what they saw, make inferences, and justify their thinking. They weren’t just looking at images; they were reading them. And in doing so, they were rehearsing many of the same strategies we expect when reading written texts.

    At the time, this felt innovative. But it also felt deeply intuitive.

    Fast forward to today.

    Students are surrounded by images and videos, from photographs and diagrams to memes, screenshots, and, increasingly, AI-generated visuals. These images appear everywhere: in learning materials, on social media, and inside the tools students use daily. Many look polished, realistic, and authoritative.

    At the same time, AI has made faking easier than ever.

    As educators and school leaders, we now face urgent questions around misinformation, academic integrity, and critical thinking. The issue is no longer just whether students can use AI tools, but whether they can interpret, evaluate, and question what they see.

    This is where visual literacy becomes a frontline defence.

    Teaching students to read images critically, to see them as constructed texts rather than neutral data, strengthens the same skills we rely on for strong reading comprehension: inference, evidence-based reasoning, and metacognitive awareness.

    From photography to AI: A conversation grounded in practice

    Recently, I found myself returning to those early classroom experiences through ongoing professional dialogue with a former college lecturer and professional photographer, as we explored what it really means to read images in the age of AI.

    A conversation that grew out of practice

    Nesreen: When I shared the draft with you, you immediately focused on the language, whether I was treating images as data or as signs. Is this important?

    Photographer: Yes, because signs belong to reading. Data is output. Signs are meaning. When we talk about reading media texts, we’re talking about how meaning is constructed, not just what information appears.

    Nesreen: That distinction feels crucial right now. Students are surrounded by images and videos, but they’re rarely taught to read them with the same care as written texts.

Photographer: Exactly. Once students understand that photographs and AI images are made up of signs (color, framing, scale, and viewpoint), they stop treating images as neutral or factual.

    Nesreen: You also asked whether the lesson would lean more towards evaluative assessment or summarizing. That made me realize the reflection mattered just as much as the image itself.

    Photographer: Reflection is key. When students explain why a composition works, or what they would change next time, they’re already engaging in higher-level reading skills.

    Nesreen: And whether students are analyzing a photograph, generating an AI image, or reading a paragraph, they’re practicing the same habits: slowing down, noticing, justifying, and revising their thinking.

    Photographer: And once they see that connection, reading becomes less about the right answer and more about understanding how meaning is made.

    Reading images is reading

    One common misconception is that visual literacy sits outside “real” literacy. In practice, the opposite is true.

    When students read images carefully, they:

    • identify what matters most
    • follow structure and sequence
    • infer meaning from clues
    • justify interpretations with evidence
    • revise first impressions

    These are the habits of skilled readers.

    For emerging readers, multilingual learners, and students who struggle with print, images lower the barrier to participation, without lowering the cognitive demand. Thinking comes first. Language follows.

    From composition to comprehension: Mapping image reading to reading strategies

    Photography offers a practical way to name what students are already doing intuitively. When teachers explicitly teach compositional elements, familiar reading strategies become visible and transferable.

| What students notice in an image | What they are doing cognitively | Reading strategy practiced |
| --- | --- | --- |
| Where the eye goes first | Deciding importance | Identifying main ideas |
| How the eye moves | Tracking structure | Understanding sequence |
| What is included or excluded | Considering intention | Analyzing author's choices |
| Foreground and background | Sorting information | Main vs. supporting details |
| Light and shadow | Interpreting mood | Making inferences |
| Symbols and colour | Reading beyond the literal | Figurative language |
| Scale and angle | Judging power | Perspective and viewpoint |
| Repetition or pattern | Spotting themes | Theme identification |
| Contextual clues | Using surrounding detail | Context clues |
| Ambiguity | Holding multiple meanings | Critical reading |
| Evidence from the image | Justifying interpretation | Evidence-based responses |

    Once students recognise these moves, teachers can say explicitly:

    “You’re doing the same thing you do when you read a paragraph.”

    That moment of transfer is powerful.

    Making AI image generation teachable (and safe)

    In my classroom work pack, students use Perchance AI to generate images. I chose this tool deliberately: It is accessible, age-appropriate, and allows students to iterate, refining prompts based on compositional choices rather than chasing novelty.

    Students don’t just generate an image once. They plan, revise, and evaluate.

    This shifts AI use away from shortcut behavior and toward intentional design and reflection, supporting academic integrity rather than undermining it.

    The progression of a prompt: From surface to depth (WAGOLL)

    One of the most effective elements of the work pack is a WAGOLL (What A Good One Looks Like) progression, which shows students how thinking improves with precision.

    • Simple: A photorealistic image of a dog sitting in a park.
    • Secure: A photorealistic image of a dog positioned using the rule of thirds, warm colour palette, soft natural lighting, blurred background.
    • Greater Depth: A photorealistic image of a dog positioned using the rule of thirds, framed by tree branches, low-angle view, strong contrast, sharp focus on the subject, blurred background.

    Students can see and explain how photographic language turns an image from output into meaningful signs. That explanation is where literacy lives.

    When classroom talk begins to change

    Over time, classroom conversations shift.

    Instead of “I like it” or “It looks real,” students begin to say:

    • “The creator wants us to notice…”
    • “This detail suggests…”
    • “At first I thought…, but now I think…”

    These are reading sentences.

Because images feel accessible, more students participate. The classroom becomes slower, quieter, and more thoughtful: exactly the conditions we want for deep comprehension.

    Visual literacy as a bridge, not an add-on

    Visual literacy is not an extra subject competing for time. It is a bridge, especially in the age of AI.

    By teaching students how to read images, schools strengthen:

    • reading comprehension
    • inference and evaluation
    • evidence-based reasoning
• metacognitive awareness

    Most importantly, students learn that literacy is not about rushing to answers, but about noticing, questioning, and constructing meaning.

    In a world saturated with AI-generated images, teaching students how to read visually is no longer optional.

    It is literacy.

    Author’s note: This article grew out of classroom practice and professional dialogue with a former college lecturer and professional photographer. Their contribution informed the discussion of visual composition, semiotics, and reflective image-reading, without any involvement in publication or authorship.

  • Teaching in the age of generative AI: why strategy matters more than tools

    Join HEPI and Advance HE for a webinar today (Tuesday, 13 January 2026) from 11am to 12pm, exploring what higher education can learn from leadership approaches in other sectors. Sign up here to hear this and more from our speakers.

    This blog was kindly authored by Wioletta Nawrot, Associate Professor and Teaching & Learning Lead at ESCP Business School, London Campus.

    Generative AI has entered higher education faster than most institutions can respond. The question is no longer whether students and staff will use it, but whether universities can ensure it strengthens learning rather than weakens it. Used well, AI can support personalised feedback, stimulate creativity, and free academic time for deeper dialogue. Used poorly, it can erode critical thinking, distort assessment, and undermine trust.

    The difference lies not in the tools themselves but in how institutions guide their use through pedagogy, governance, and culture.

    AI is a cultural and pedagogical shift, not a software upgrade

    Across higher education, early responses to AI have often focused on tools. Yet treating AI as a bolt-on risks missing the real transformation: a shift in how academic communities think, learn, and make judgements.

    Some universities began with communities of practice rather than software procurement. At ESCP Business School, stakeholders, including staff and students, were invited to experiment with AI in teaching, assessment, and student support. These experiences demonstrated that experimentation is essential but only when it contributes to a coherent framework with shared principles and staff development.

Three lessons have emerged from these rollouts. First, staff report using AI to draft feedback or generate case-study variations, but final decisions and marking remain human. Second, students learn more when they critique AI rather than copy it: exercises where students compare AI responses to academic sources or highlight errors can strengthen critical thinking. Third, governance matters more than enthusiasm; clarity around data privacy, authorship, assessment, and acceptable use is essential to protect trust.

    Assessment: the hardest and most urgent area of reform

    Once students can generate fluent essays or code in seconds, traditional take-home assignments are no longer reliable indicators of learning. At ESCP we have responded by: 

    • Introducing oral assessments, in-class writing, and step-by-step submissions to verify individual understanding.
    • Asking students to reference class materials and discussions, or unique datasets that AI tools cannot access.
    • Updating assessment rubrics to prioritise analytical depth, originality, transparency of process, and intellectual engagement.

Students should be encouraged to state whether AI was used, how it contributed, and where its outputs were adapted or rejected. This mirrors professional practice by acknowledging assistance without outsourcing judgement. It also shifts universities from merely policing misconduct toward teaching responsible use.

    AI literacy and academic inequality

    AI does not benefit all students equally. Those with strong subject knowledge are better able to question AI’s inaccuracies; others may accept outputs uncritically. 

    Generic workshops alone are insufficient. AI literacy must be embedded within disciplines, for example, in law through case analysis; in business via ethical decision-making; and in science through data validation. Students can be taught not just how to use AI, but how to test it, challenge it, and cite it appropriately.

    Staff development is equally important. Not all academics feel confident incorporating AI into feedback, supervision or assessments. Models such as AI champions, peer-led workshops, and campus coordinators can increase confidence and avoid digital divides between departments.

    Policy implications for UK higher education

    If AI adoption remains fragmented, the UK’s higher education sector risks inconsistency, inequity, and reputational damage. A strategic approach is needed at an institutional and a national level. 

    Universities should define the educational purpose of AI before adopting tools, and consider reforming assessments to remain robust. Structured professional development, opportunities for peer exchange, and open dialogue with students about what constitutes legitimate and responsible use will also support the effective integration of AI into the sector.

    However, it’s not only institutions that need to take action. Policymakers and sector bodies should develop shared reference points for transparency and academic integrity. As a nation, we must invest in research into AI’s impact on learning outcomes and ensure quality frameworks reflect AI’s role in higher education processes, such as assessment and skills development.

    The European Union Artificial Intelligence Act (Regulation (EU) 2024/1689) sets a prescriptive model for compliance in education. The UK’s principles-based approach gives universities flexibility, but this comes with accountability. Without shared standards, the sector risks inconsistent practice and erosion of public trust. A reduction in employability may also follow if students are not taught how to use AI ethically while continuing to develop their critical thinking and analytical skills.

    Implications for the sector

    The experience of institutions like ESCP Business School shows that the quality of teaching with AI depends less on the technology itself than on the judgement and educational purpose guiding its use. 

    Generative AI is already an integral part of students’ academic lives; higher education must now decide how to shape that reality. Institutions that approach AI through strategy, integrity, and shared responsibility will not only protect learning, but renew it, strengthening the human dimension that gives teaching its meaning.

  • Protecting Schools in the Digital Age: Beyond Firewalls and Filters

    In today’s connected world, school safety extends far beyond hallways. Experts highlight how to protect students through cybersecurity, digital literacy, and trust-centered policies.

    Safety starts with digital literacy

    For schools today, safety means more than locked doors. In an era where student data is currency and misinformation spreads at viral speed, digital security has become just as critical as physical protection.

    Megan Derrick, Ph.D. candidate at the University of South Florida and instructional designer at Hillsborough College, identifies “two big red flags: data privacy and misinformation. Hackers love student data, and AI makes fake news spread faster than a viral TikTok.” For her, protecting schools requires both strong cybersecurity systems and teaching students to be critical consumers of information.

    But safety isn’t only technical. “True protection is both technical and human,” says Yanbei Chen, a doctoral researcher at Syracuse University. Her work emphasizes combining infrastructure with education in digital citizenship, so students and teachers feel safe engaging with technology.

    Both Derrick and Chen agree that digital literacy should be integrated across subjects, not siloed into a single workshop. “Students should know how to fact-check a source and avoid clicking on emails that promise free AirPods,” Derrick says. Chen adds that administrators and teachers can model responsible online behavior, weave discussions of privacy and bias into lessons, and provide opportunities for students to practice safe decision-making.

    Safety ensures trust and resilience

    Balancing safety with openness remains a key challenge. Derrick emphasizes the role of transparency: “Policies should not feel like surveillance. They should feel supportive.” When students and teachers understand the reasons behind safeguards, collaborative and creative learning can thrive within secure boundaries.

    Looking ahead, emerging technologies and stronger policies offer hope. Transparent data practices, inclusive design, and human-centered AI can help schools build environments that are both innovative and resilient.

    As Chen puts it, “Digital literacy and cybersecurity are not just technical skills — they’re part of preparing students to be thoughtful, ethical participants in a digital society.”

    In short, protecting schools in the digital age means equipping students and educators not only to avoid risks but to thrive. That requires blending strong safeguards with a culture of trust, transparency, and resilience.

  • Dreamery Speaker Series: Designing flexible learning in the digital age

Penn State University Libraries' Teaching and Learning with Technology (TLT) Dreamery Speaker Series will host guest speaker Melody J Buckner, associate vice provost of digital learning and online initiatives at the University of Arizona, Dec. 3 and 4. She will lead sessions focused on designing courses that foster adaptability and flexibility in the digital age. Learn more via the PSU News article.

  • The Nature of Expertise in the Age of AI

For several years, I've been providing content and student support for the University of Kentucky's Changemakers program, designed and managed by the Center for Next Generation Leadership. It's an online, one-year continuing-education option through which Kentucky educators can earn a rank change upon successful completion. I appreciate that Next Gen believes in "parallel pedagogy": while the program provides valuable resources and materials to be read, viewed, and reflected on, it also requires its students to complete meaningful transfer tasks, pursue an action research project, and participate in a final defense of learning that demonstrates how transformative practices are happening in the Changemaker's own classroom.

This professional learning pathway to rank change involves mostly asynchronous work through online modules focused on the awareness and implementation of what Kentucky calls "vibrant learning" in the classroom, with module topics such as Learner Agency and Inquiry-Based Learning. The content below originally began as my contribution to the latter module, but I've expanded it and added more detail for this blog entry.

    Inquiry-based learning is a powerful pedagogy.  For students, it can be as extensive as working on a multi-week project-based learning unit, or as simple as asking more high-quality questions in class.  Inquiry comes from curiosity, and the attempt to answer challenging questions and solve problems that have no obvious solution.

Complicated problems require help. Two heads are better than one, after all. With this in mind, seeking community partners can make perfect sense. (As an aside, this teacher guide can help shape your conversations when you attempt to bring the community into your classroom; while it mentions PBL, the strategies can help for any scale of project or problem you want your students to tackle.)

These community partners or "outside experts" can turn what may seem abstract into authentic, real-world problems, and even motivate students to "dig in" when the work gets difficult, to echo the title of this excellent Next Generation Learning Challenges article. But before we consider how bringing in experts from outside of your classroom can increase vibrant learning, let's first discuss inside experts, and even the idea of "expertise."

    Keep in mind that traditionally, and for decades (centuries!), you have been considered to be the expert in the room – of your content, of your pedagogy, of your ability to manage your classroom.  The professionalism required of the vocation, much less the idea of professional standards boards that grant, review, and in some cases revoke certification to teach, adds to the foundational belief that a teacher has earned their well-deserved “expert” credentials.

But you are usually one human in a room of thirty. Leaning into the expertise of your students can be, at its most basic level, a strategy of smartly leveraging your numbers. Viewing your classroom through an asset mindset, we can see students as learners who bring their own powerful POVs, which can enrich your culture and community. For example, with the right scaffolding, structures, and practice, your students are capable of providing peer-to-peer feedback.

However, some of our stumbling blocks in education are self-induced, born out of a desire to remain humble. For example, calling yourself or anyone else an "expert" can sound or feel lofty and divisive. Educators are sometimes their own worst critics, and may wonder aloud what right they have to declare themselves the expert on such-and-such. As for students, they may view their own bountiful and beautiful knowledge with a shrug of their shoulders. If someone in middle school knows how to disassemble and reassemble a car engine, it simply reflects their personal interests, or the fact that their mother loves hot rods. They are told early and often in traditional school that such knowledge isn't "book learnin'." Loving hot rods or diesel mechanics doesn't matter, thinks the student, because it's not a part of my third period class, and it won't show up on my multiple choice test on Friday.

    Therefore, let’s consider a broadening of our definition of “expert,” and look more at the first five letters of the word.  What we really hope to provide, increase, articulate and bring into a classroom is experience.  From another person’s POV, your experience may be long and traveled (which can make you “more experienced”), or simply a road I’ve never traveled upon (which makes your experience a novel one, compared to mine).   Viewing expertise in this kind of inclusive light opens up what an “expert” is.  We can see an expert as simply (but powerfully!) a person with a different, valued perspective.   The key word is “valued.”  You may have a different POV, you may have twelve degrees on the wall, but if I don’t care about you and especially if you don’t care about me, your “expertise” won’t matter much.  We can also see an expert as a person who is recognized as skillfully applying knowledge.  The key word is “applying.” Remember that old chestnut that answers the question, “What’s the difference between intelligence and wisdom?”  Intelligence is knowing that a tomato is a fruit, but wisdom is knowing you would never put a tomato in a fruit salad.  Expertise that feels too detached and theoretical, or a bunch of random facts you can Google anyway, won’t personally matter very much to your learners.

    With our new, more expansive definition of “expert” behind us, how do these experts from outside of a classroom still have potential to help?  Vibrant learning is memorable and authentic, and community partners can be both.  A parent who is a car mechanic might come in to demonstrate the torque caused by automobile engines.  Not only does that make abstract laws of motion and physics seem more relevant to students, it also has a far greater chance of making tonight’s dinner conversation when the student is asked “What did you do at school today?”  By permitting alternative voices into your learning space, you open up different perspectives and bring the outside community inside of your classroom community. Outside partners could also provide feedback to students as they ideate and prototype a solution in a PBL, or serve as a panel audience for defenses of learning. Of course, in a world full of wondrous technology, we are not limited to in-person guest speakers.  Someone from a European museum might Zoom in for a mini-lecture and a Q & A.  There are over twenty billion uploaded YouTube videos, so with the right discernment and curation skills, an expert is just a click away.

You might have noticed that artificial intelligence wasn't mentioned above as a potential "outside expert." Going back to our expanded definition, it certainly can seem to check the same boxes. AI can offer a different perspective, powered by code and fueled by billions of artifacts from our culture and knowledge. Is that perspective valued, or valuable? It might be, although AI is not always accurate, unbiased, or trustworthy; however, the same can be said of Wikipedia entries created by humans, or of a theory from a popular scientist of the past that has been discredited in the present. Discernment and critical thinking are key, particularly from the teacher, who should be monitoring, filtering, and observing the AI usage (and teaching students to be critical AI users as well). AI can also certainly apply its knowledge, scraped across the terabytes of the Internet, within (milli)seconds of being prompted. Is that knowledge skillfully applied? Based on the uploaded rubric of a teacher alongside the first draft of a student's essay (being mindful of your platform's privacy protections, of course), or the public domain text of an author, AI could provide nuanced feedback on student writing or pretend to be a character in a book for a fascinating interactive conversation. But some of the proficiency of AI's application will depend on qualitative measures: the rigor of the rubric you uploaded, the veracity and bias of the knowledge it grabbed from its database, and the depth of skills the AI has been taught to emulate. And again, AI hallucinations can happen.

    What will hopefully emerge, as we become more skilled and critical users of AI, is that our ethical priorities will shape the machines instead of letting the machines shape us.  A promising example is the “Dimensions in Testimony” website, a partnership between the University of Southern California and the Shoah Foundation.  The site began by digitizing recorded interviews of actual survivors of the Holocaust and the Nanjing Massacre.   Next, an interviewee has a separate page where, via a looping video, they seem to sit and wait for your questions.  

When prompted, a short video plays in which the interviewee "answers" your question, creating a virtual conversation. You can do this via your microphone or by typing. What may seem miraculous is really just clever programming: the interviews were transcribed and time-coded, so the AI simply takes your prompt, scans the text, finds the clip that seems to best answer your question, and plays from that particular time-stamped portion of the interview. Still, you can see the power of providing such "expertise" to students, giving them a chance to practice empathy as well as their questioning and prompting skills. (Note, too, the dignity and care given to the subject matter by USC and the Shoah Foundation. The interviews were real, conducted with genuine survivors of genocide and the Holocaust, not actors. While you technically could have AI "pretend" to be a survivor of a war crime via a customized chatbot, or have students interact with some kind of fictionalized digital Holocaust-survivor avatar, there are many reasons why this would be an unethical and inappropriate use of such technology.)
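The matching described above, taking a question, scanning time-coded transcripts, and seeking to the best clip, can be sketched in a few lines. This is a hypothetical keyword-overlap toy for illustration only, not the actual Dimensions in Testimony implementation; all the data and names below are invented.

```python
import re
from dataclasses import dataclass

@dataclass
class Clip:
    start_seconds: int   # time-code where this answer begins in the interview video
    transcript: str      # transcribed text of the answer

def words(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def best_clip(question: str, clips: list[Clip]) -> Clip:
    """Pick the clip whose transcript shares the most words with the question."""
    q = words(question)
    return max(clips, key=lambda clip: len(q & words(clip.transcript)))

# Toy transcript segments (hypothetical, for illustration only)
clips = [
    Clip(120, "I was born in a small village, and we moved to the city in 1938"),
    Clip(455, "After the war I emigrated and started a new life abroad"),
]

match = best_clip("Where were you born?", clips)
print(match.start_seconds)  # the player would seek to this time-stamp
```

A production system would use far richer semantic matching than word overlap, but the pipeline is the same: question in, best time-stamped clip out.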

    As you ponder ways to increase and improve inquiry, reflect on the nature of “expertise,” both inside and outside of your own four walls.  As you do so, you can cautiously consider how AI can be one of many types of “outside experts” you can bring into your classroom.

    For more information on Changemakers, be sure to check out this page for the latest links to sign up for updates and apply to join the next cohort.

  • The Importance of Connection in the Age of AI – Faculty Focus


  • Universities in the age of intelligent machines – Campus Review

    When Geoffrey Hinton, the so-called ‘godfather of AI’, declared in 2016 that ‘we should stop training radiologists now,’ his warning caused a stir.

  • Strategies for Personalized Learning in AI Age (opinion)

    How do we teach effectively—and humanly—in this age of AI?

New advances in artificial intelligence break news at such a rapid pace that many of us have difficulty keeping up. Dinuka Gunaratne gave a detailed summary of many different AI tools in his "Carpe Careers" article published in July, yet more tools will likely appear in the months and years ahead in an exponential explosion. How do we, as educators (new and established Ph.D.s alike), design curricula and classes with these new AI tools being released every few weeks? How do we design effective assignments that teach critical analysis and logical thought while knowing that our students, too, have access to these tools?

    Many existing AI tools can be used to assist with course design. However, I will provide some insight on methods of pedagogy that emphasize personalized learning regardless of what new technology becomes available.

    Some questions educators are now thinking about include:

    1. How do I design an assignment so the student cannot just prompt an AI tool to complete it?
    2. How do I design the course so that the student can choose whether or not to use AI tools—and how do I assess these two groups of students?

    Below, I outline some wise teaching practices with an eye toward helping students develop core skills including critical thinking, problem-solving, teamwork, creativity and—the most essential skill of all—curiosity.

    Making the Most Out of Class Time

    An effective course utilizes a combination of teaching strategies. I outline three here.

    1. Make your class generative, so that when you give an assignment, it reaches as far back as day one. A generative learning model is one in which each week builds upon the previous one, and in which a student is assessed on the knowledge they have cumulatively accrued.
    2. Hold interactive in-person activities in each class, building upon the previous assignments and content.
    3. Flip the classroom so that class time is used for discussion rather than a monologue presentation from you. If you assign videos or readings for students to view before class, you can use class time to discuss the content or reinforce the learning with group activities.

    Here is an example of combining these tools in the buildup to a presentation from one of my classes.

    • Week 1: Each student writes and brings a one-page summary of their research so that peers can provide feedback. I provide feedback training in class before the peer readings take place.
    • Week 2: Using the peer feedback of the summary, each student creates one slide summarizing their research for a three-minute thesis (3MT) and brings the slide to class to receive peer feedback.
    • Week 3: Students practice presentation skills through an activity called “slide karaoke,” in which a student has one minute to present a simple slide they have never seen before. They are then given feedback by peers and the instructor on general presentation skills. I provide peer feedback training before the presentations.
    • Week 4: Students implement the general feedback from slide karaoke and give practice 3MT presentations to receive specific peer and mentor feedback on the content. These mentors are usually students from the year before who revisit the class.
    • Week 5: Students give the final 3MT in front of judges and peers for evaluation.
    • Week 6: Students write a summary of what was learned from the entire generative experience.

    This sequence of assignments is personalized so that the final report can only be about the student’s individual experience. While students might want to use AI tools to edit or organize their ideas, ChatGPT or other AI tools cannot possibly know what happened in the classroom—only the student can write about it.

    For larger classes in which a presentation from each student may not be possible, here is another example.

    • Week 1: Before class, students watch a video or complete a reading covering the basics of DNA and inheritance. The in-class activity is a group discussion of Mendelian inheritance problem sets.
    • Week 2: Before class, students read an article on how DNA is packaged; the in-class discussion focuses on the molecules involved in chromatin structure.
    • The following classes all have prereads or videos that students discuss in class, with content building toward a more complex genetic mechanism, such as elucidating the gene for a disease. The final report could be: “Summarize how one could find a gene responsible for a certain disease using the discussion points we had in class.” In this scenario as well, the student takes the personalized class experience and incorporates those ideas into the final report, something that cannot be wholly outsourced to any AI tool.

    If you decide to embrace AI tools in the classroom, you can still teach critical thinking and creativity: ask students to use AI to write a report on a topic discussed in class, then, in part two of the assignment, ask them to assess the AI-generated report, cite the proper references, and correct any mistakes in content or grammar.

    I sometimes show an example of this in class to demonstrate that AI makes mistakes, rather than giving it as an assignment. But you might try offering it as an optional approach to an assignment, with students declaring on their submission whether or not they used AI. As an instructor, you will need to design two rubrics for these different groups. Group one’s rubric is based on content, grammar, references, logical thought and organization, and clarity. Group two’s (those who used AI) consists of the same components, plus an evaluation of how well the student found the AI’s mistakes.

    Applying for Teaching Positions

    If you are applying for a teaching position, you should address AI in your teaching dossier, explaining how you may or may not incorporate it, or at the very least discussing its effects on higher education. Many articles and books on this topic exist, including Teaching with AI: A Practical Guide to a New Era of Human Learning, by José Antonio Bowen and C. Edward Watson (Johns Hopkins University Press, 2024); Robot-Proof: Higher Education in the Age of Artificial Intelligence, by Joseph E. Aoun (MIT Press, 2017); and Generative AI in Higher Education, by Cecilia Ka Yuk Chan and Tom Colloton (Routledge, 2024).

    Yet even as we consider how to integrate AI in our teaching, we must not forget the human experience at work in all that we do. We can emphasize things like 1) encouraging students to meet with us in person or even for a walk as opposed to a virtual meeting and 2) assessing what emotions students bring to the meeting or class and how that may affect the dynamics. We as educators should harness the human side of teaching, including the classroom experience and the in-class group work, so that the “final” assessments build directly out of these personalized learnings.

    For those venturing into a career that involves teaching or mentoring, develop teaching strategies and tools that center the human experience and include them in your teaching dossier. Your application will shine.

    Nana Lee is the director of professional development and mentorship, special adviser to the dean of medicine for graduate education, and associate professor, teaching stream, at the University of Toronto. She is also a member and regional director of the Graduate Career Consortium—an organization providing an international voice for graduate-level career and professional development leaders.

    Source link