Category: Artificial Intelligence

  • HESA’s AI Observatory: What’s new in higher education (March 16, 2025)


    International Frameworks

    With the right opportunities we can become AI makers, not takers
    Michael Webb.  FE Week. February 21, 2025.

    The article reflects on the UK’s AI Opportunities Action Plan, which aims to position the country as a leader in AI development rather than merely a consumer. It highlights the crucial role of education in addressing AI skills shortages and emphasizes the importance of meeting immediate needs around AI literacy while keeping a clear eye on the future, as the balance shifts toward AI automation and a stronger demand for uniquely human skills.

    Living guidelines on the responsible use of generative AI in research: ERA Forum stakeholders’ document
    European Commission, Directorate-General for Research and Innovation. March 2024.

    These guidelines include recommendations for researchers, recommendations for research organisations, as well as recommendations for research funding organisations. The key recommendations are summarized here.

    Industry Collaborations

    OpenAI Announces ‘NextGenAI’ Higher-Ed Consortium
    Kim Kozlowski. Government Technology. March 4, 2025.

    OpenAI has launched the ‘NextGenAI’ consortium, committing $50M to support AI research and technology across 15 institutions, including the University of Michigan, the California State University system, Harvard University, MIT and the University of Oxford. This initiative aims to accelerate AI advancements by providing research grants, computing resources, and collaborative opportunities to address complex societal challenges.

    AI Literacy

    A President’s Journey to AI Adoption
    Cruz Rivera, J. L. Inside Higher Ed. March 13, 2025.

    José Luis Cruz Rivera, President of Northern Arizona University, shares his AI exploration journey. “As a university president, I’ve learned that responsible leadership sometimes means […] testing things out myself before asking others to dive in.” After first using it to draft emails, he went on to analyze student performance data, create tailored learning materials, navigate conflicting viewpoints and write his speeches – in addition to now using it for daily tasks.

    Teaching and Learning

    AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking
    Gerlich, M. SSRN. January 14, 2025.

    This study investigates the relationship between AI tool usage and critical thinking skills, focusing on cognitive offloading as a mediating factor. The findings revealed a significant negative correlation between frequent AI tool usage and critical thinking abilities, mediated by increased cognitive offloading. Younger participants exhibited higher dependence on AI tools and lower critical thinking scores compared to older participants. Furthermore, higher educational attainment was associated with better critical thinking skills, regardless of AI usage. These results highlight the potential cognitive costs of AI tool reliance, emphasising the need for educational strategies that promote critical engagement with AI technologies.

    California went big on AI in universities. Canada should go smart instead
    Bates, S. University Affairs. March 12, 2025.

    In this opinion piece, Simon Bates, Vice-Provost and Associate Vice-President for Teaching and Learning at UBC, reflects on how the ‘frictionless efficiency’ promised by AI tools comes at a cost. “Learning is not frictionless. It requires struggle, persistence, iteration and deep focus. The risk of a too-hasty full scale AI adoption in universities is that it offers students a way around that struggle, replacing the hard cognitive labour of learning with quick, polished outputs that do little to build real understanding. […] The biggest danger of AI in education is not that students will cheat. It’s that they will miss the opportunity to build the skills that higher education is meant to cultivate. The ability to persist through complexity, to work through uncertainty, to engage in deep analytical thought — these are the foundations of expertise. They cannot be skipped over.”

    We shouldn’t sleepwalk into a “tech knows best” approach to university teaching
    Mace, R. et al. Times Higher Education. March 14, 2025.

    The article discusses the increasing use of generative AI tools among university students, with usage rising from 53% in 2023-24 to 88% in 2024-25. It argues that instead of banning these tools, instructors should focus on rethinking assessment strategies to integrate AI as a collaborative tool in academic work. The authors share a list of activities, grounded in the constructivist approach to education, that they have successfully used in their lectures to leverage AI in support of teaching and learning.

    Accessibility & Digital Divide

    AI Will Not Be ‘the Great Leveler’ for Student Outcomes
    Richardson, S. and Redford, P. Inside Higher Ed. March 12, 2025.

    The authors share three reasons why AI tools are only deepening existing divides: 1) student overreliance on AI tools; 2) the post-pandemic social skills deficit; and 3) business pivots. “If we hope to continue leveling the playing field for students who face barriers to entry, we must tackle AI head-on by teaching students to use tools responsibly and critically, not in a general sense, but specifically to improve their career readiness. Equally, career plans could be forward-thinking and linked to the careers created by AI, using market data to focus on which industries will grow. By evaluating student need on our campuses and responding to the movements of the current job market, we can create tailored training that allows students to successfully transition from higher education into a graduate-level career.”


  • How AI is Changing the Way I Teach Business Law



    AI has taken the world by storm, and the education field is no exception. After over two decades teaching at The Paul Merage School of Business at the University of California, Irvine, I have seen lots of changes related to curriculum, teaching resources and students. However, I’ve seen nothing quite like the wave of AI tools popping up in classrooms. It’s exciting, a little daunting and definitely something we all need to talk about.

    So, here’s the deal: I’m not an AI expert. But I have spent a lot of time experimenting with AI, learning from my mistakes and figuring out what works and what doesn’t. I’d like to share some of these experiences with you.

    AI in education: What’s the big deal?

    AI is already here, whether we’re ready for it or not. According to Cengage research, use of AI has nearly doubled among instructors, from 24% in 2023, to 45% in 2024. Many of us are using AI to create lectures, craft assignments and even grade assessments. The challenge is not whether we adopt AI. Rather, it’s doing so in a way that enhances our students’ learning outcomes, while maintaining academic integrity in our courses.

    In my online undergraduate business law course, I have always required my students to take written assessments, where they analyze a set of facts to reach a legal conclusion. Not only am I trying to teach them the principles of law, but I want them to improve their writing skills.

    A shift in focus

    A few years ago, I noticed a subtle increase in the overall scores for these written assessments. I have taught this course for over 20 years, so I knew what the historical scores were. Looking into it further, I started hearing about how some students were using ChatGPT in their courses. This got me wondering whether some of my students had already been using it to take my written assessments. Quick answer: yes, they were. This now presented a problem: what do I do about it? In an online course, how can I prohibit the use of AI tools on a written assessment while effectively enforcing that ban?  I shifted my focus from policing and enforcing a ban on the use of AI in my courses to teaching my students how to use AI responsibly.

    Teaching students to use AI responsibly

    In my course, I developed assignments called “Written ApprAIsals.” These three-part writing assignments combine traditional learning with AI-assisted refinement. These teach students how to use AI responsibly while improving their critical thinking and writing skills. Here’s how it works:

    Step 1: Write a first draft without AI

    Students are given a law and related news article about a current legal issue. They must write a memo which analyzes the constitutionality of this law. I also provide them with guidance on utilizing the standard legal memo format, known as IRAC (Issue, Rule, Analysis, Conclusion), to help organize their thoughts and writing.

    Students are permitted to use whatever materials they have, including eBooks, my lecture videos and outlines, Cengage’s online learning platform, MindTap and its resources, and any other information they ethically obtain online. But, they’re not permitted to use AI.

    The purpose of this first draft is for them to demonstrate the foundational knowledge they should have already learned. Students must attest to completing this first draft without using AI, and it’s worth 30% of the total “Written ApprAIsal” grade.

    Step 2: Integrate AI to resolve deficiencies

    Once I have given them feedback on their first drafts, students are required to use AI to improve their first draft. Students must submit the URL to their AI queries and responses (“AI log”). Or less ideally, they can submit a PDF or screenshot of them. I can assess the effort they put in, evaluate their queries, and provide guidance on how to more effectively use AI. This part is worth 40% of the total “Written ApprAIsal” grade.

    Step 3: Use AI to help write a final draft

    Using what they’ve obtained from AI, along with my feedback, students must transform their first draft into an improved final draft. Students are permitted to continue using AI as well.  They must turn on track changes in their document so I can see what changes they’ve made to the first draft.

    Why has this approach worked in my course?

    1. It makes students aware of my familiarity with AI and how it’s used. Students now know I am on the lookout for improper usage of AI in our course.
    2. It encourages their acquisition of foundational knowledge. Students quickly figure out that they must know the basic legal principles. Without them, they will have no idea if AI is providing them with inaccurate information, which can sometimes happen, especially when it comes to legal cases.
    3. This approach promotes academic integrity. Students recognize their first drafts must reflect their genuine understanding. There is no benefit to using AI for the first draft. Because the remaining parts are based on their use of AI to improve the first draft, there will not be much room for improvement if the first draft is too good. And because students must submit their AI logs, I can easily ascertain if they actually did the work.
    4. Students build necessary skills for their future careers. They can improve their writing and analysis skills in a low-stakes way, while receiving useful feedback.
    5. It helps me focus my efforts on helping them understand the law, rather than having to enforce a ban on the use of AI.

    Issues related to this approach

    It takes a lot of effort to find the right law and related news article to use. Not only does the law have to be current, but it also must be interesting and relevant to the students. Legal issues must be presented in a way that is factually neutral to avoid bias. And the news articles must be factual and not cluttered with distracting commentary or opinions.

    Additionally, rapid feedback is required. With up to 150 students in my course, I only have a little more than 24 hours to turn around written feedback and comments on their first drafts and AI logs. Frankly, it can be overwhelming.

    Tips on integrating AI into your course

    I have learned a few things along the way about integrating AI into my courses.

    Establish clear rules: Be upfront and clear about when, and how, AI can be used. Stick to those rules and enforce them.

    Consider accessibility: Not every student has easy or affordable access to AI tools. Make sure you have alternatives available for these students.

    Teach foundational knowledge first: Students need to know the core concepts so they can critically evaluate any AI-generated content.

    Require transparency: Students must show how they used AI. It is a great way to keep them honest.

    Most importantly, be flexible and open to experimentation: Mistakes are inevitable. There will be times when something you thought would work just doesn’t. That’s ok. Adjust and keep innovating.

    Final Thoughts

    AI is here to stay, and that’s not necessarily a bad thing. AI is a tool that can help students learn. But, it’s up to us to show our students how to use AI responsibly. Whether it’s helping them improve their writing skills, gain foundational knowledge or develop critical thinking skills, AI has so much potential in our courses. Let’s embrace it and figure out how to make it work for each of us.

    Got ideas or experiences with AI in your courses? Let’s connect. I would love to hear how you are using it!

    Machiavelli (Max) Chao is a full-time Senior Continuing Lecturer at the Paul Merage School of Business at the University of California, Irvine and Cengage Faculty Partner. 


  • Three Ways Faculty Are Using AI to Lighten Their Professional Load



    Our most recent research into the working lives of faculty gave us some interesting takeaways about higher education’s relationship with AI. While every faculty member’s thoughts about AI differ and no two experiences are the same, the general trend we’ve seen is that faculty have moved from fear to acceptance. A good deal of faculty were initially concerned about AI’s arrival on campus. This concern was amplified by a perceived rise in AI-enabled cheating and plagiarism among students. Despite that, many faculty have come to accept that AI is here to stay. Some have developed working strategies to ensure that they and their students know the boundaries of AI usage in the classroom.

    Early-adopting educators aren’t just navigating around AI. They have embraced and integrated it into their working lives. Some have learned to use AI tools to save time and make their working lives easier. In fact, over half of instructors reported that they wanted to use AI for administrative tasks, and 10% were already doing so. (Find the highlights here.) As more faculty see the potential in AI, that number has likely risen. So, in what ways are faculty already using AI to lighten the load of professional life? Here are three use cases we learned about from education professionals:

    1. AI to jumpstart ideas and conversations

    “Give me a list of 10 German pop songs that contain irregular verbs.”

    “Summarize the five most contentious legal battles happening in U.S. media law today.”

    “Create a set of flashcards that review the diagnostic procedure and standard treatment protocol for asthma.”

    The possibilities (and the prompts!) are endless. AI is well-placed to assist with idea generation, conversation-starters and lesson materials for educators on any topic. It’s worth noting that AI tends to prove most helpful as a starting point for teaching and learning fodder, rather than for providing fully-baked responses and ideas. Those who expect the latter may be disappointed, as the quality of AI results can vary widely depending on the topic. Educators can and should, of course, always be the final determinants and reviewers of the accuracy of anything shared in class.

    2. AI to differentiate instruction

    Faculty have told us that they spend a hefty proportion (around 28%) of their time on course preparation. Differentiating instruction for the various learning styles and levels in any given class constitutes a big part of that prep work. A particular lesson may land well with a struggling student, but might feel monotonous for an advanced student who has already mastered the material. To that end, some faculty are using AI to readily differentiate lesson plans. For example, an English literature instructor might enter a prompt like, “I need two versions of a lesson plan about ‘The Canterbury Tales;’ one for fluent English speakers and one for emergent English speakers.” This simple step can save faculty hours of manual lesson plan differentiation.

    An instructor in Kansas shared with Cengage their plans to let AI help in this area, “I plan to use AI to evaluate students’ knowledge levels and learning abilities and create personalized training content. For example, AI will assess all the students at the beginning of the semester and divide them into ‘math-strong’ and ‘math-weak’ groups based on their mathematical aptitude, and then automatically assign math-related materials, readings and lecture notes to help the ‘math-weak’ students.”

    When used in this way, AI can be a powerful tool that gives students of all backgrounds an equal edge in understanding and retaining difficult information.

    3. AI to provide feedback

    Reviewing the work of dozens or hundreds of students and finding common threads and weak spots is tedious work, and seems an obvious area for a little algorithmic assistance.

    Again, faculty should remain in control of the feedback they provide to students. After all, students fully expect faculty members to review and critique their work authentically. However, using AI to more deeply understand areas where a student’s logic may be consistently flawed, or types of work on which they repeatedly make mistakes, can be a game-changer, both for educators and students.

    An instructor in Iowa told Cengage, “I don’t want to automate my feedback completely, but having AI suggest areas of exigence in students’ work, or supply me with feedback options based on my own past feedback, could be useful.”

    Some faculty may even choose to have students ask AI for feedback themselves as part of a critical thinking or review exercise. Ethan and Lilach Mollick of the Wharton School of the University of Pennsylvania share in a Harvard Business Publishing Education article, “Though AI-generated feedback cannot replicate the grounded knowledge that teachers have about their students, it can be given quickly and at scale and it can help students consider their work from an outside perspective. Students can then evaluate the feedback, decide what they want to incorporate, and continue to iterate on their drafts.”

    AI is not a “fix-all” for the administrative side of higher education. However, many faculty members are gaining an advantage and getting some time back by using it as something of a virtual assistant.


    Are you using AI in the classroom?

    In a future piece, we’ll share 3 more ways in which faculty are using AI to make their working lives easier. In the meantime, you can fully explore our research here:



  • Simulations and AI: Critical Thinking Improvement



    As an educator teaching undergraduates and graduates, both online and face-to-face, it’s always a challenge to find meaningful ways to engage students. Now that artificial intelligence has come into play, that challenge has become even greater. This has resulted in a need to address ways to create “AI-proof” assignments and content.

    Simulations in different types of courses

    According to Boston College, simulations are designed to engage students “directly with the information or the skills being learned in a simulated authentic challenge.” In my teaching over the past decade plus, I have gone from using simulations in one primary operations management course to using them in almost every course I teach. And I don’t necessarily use them in a stand-alone assignment, although they can be used as such. How I use a simulation is course dependent.

    Face-to-face

    In some face-to-face courses, I will run the simulation in class with everyone participating. Sometimes I will have teams work in a “department,” or have true, open discussions. Sometimes I will run the room, ensuring every single student is paying attention and contributing. Using simulations in this fashion gives flexibility in the classroom. It shows me who truly gets the concepts and who is going through the motions. The dynamic of the class itself can dictate how I run the simulation.

    Online

    In online courses, I typically assign simulation work. This can be one simulation assignment or a progressive unit of simulations. It’s a great way to see students improve as they move through various concepts, ideas, and applications of the topics covered. Creating assignments that are both tied to the simulation and comparable to the work environment makes them AI-proof. Students must think about what they have actually done in class and relate it to their workplace environment and/or position.

    Why simulations work for all levels

    There are many simulations that can be used and incorporated in both undergraduate and graduate level courses. As much as we don’t think of graduate students relying on AI to complete work, I have seen this happen multiple times. The results aren’t always ideal. Using simulations at the graduate level, and ensuring your assignments reflect both the simulation and real-world comparisons, can help your students use AI to gather thoughts, but not rely on it for the answers.

    Student benefits

    Using simulations will have many benefits for your students. I have gotten feedback from many students over the years regarding their ability to make decisions and see the results that simulations give. My capstone students often want to continue running the simulation, just to see how well they can do with their “business.” I have had students in lower-level management courses ask me how they can get full access to run these when I have them as “in-class only” options. The majority of feedback includes:

    1. Anything is better than lecture!
    2. Being able to see how their decisions impact other areas can be very helpful for students. They actually remember it, reinforcing learning more than reading or watching can.
    3. Students want more simulations throughout their courses, rather than just one or two. They will have the ability to make those decisions and see those impacts. And they feel it will prepare them even more for the workforce.

    As a retention and engagement tool, simulations seem to be one of the best I have found. Are there students that don’t like them? Yes, there always are. Even so, they’re forced to think through solutions and determine a best course of action to get that optimal result. From an instructor’s perspective, there’s nothing better than seeing those wheels turn. Students are guided on how to recover from an issue, and are advised on what may happen if different solutions were attempted. The questions gained are often better than the results.

    Instructor benefits

    For instructors, there are many benefits. As I stated earlier, you can see improvements in student behavior. They ask questions and have a defined interest in the results of their actions. In classes when you have teams, it can become friendly competition. If they are individual assignments, you get more questions, which is something we always want to see. More questions show interest.

    Ease of use

    Although I usually include recorded instructions and tips for simulations in my online courses, I prefer my personal recordings, since I also give examples relevant to student majors and interests. For example, in an entrepreneurial class, I would go through a simulation piece and include how this might affect the new business in the market vs. how it might impact an established business.

    Auto-grading

    Simulations are usually auto-graded when assigned, which can drastically lighten our workload. I personally have around 150-200 students each term, so being able to streamline the grading function is a huge benefit. However, there are trade-offs. Since I also create simulation-based questions and assignments, there are no textbook answers to refer to. You must know the simulations and be the content expert, so you can effectively guide your students.

    Thoughtful responses

    AI can be a great tool when used productively. But seeing the tool overused is what led me to lean more heavily on simulations. This adjustment on my end has resulted in students presenting me with more thoughtful, accurate, and relevant responses. Feedback from students has been positive.

    Sims for all industries

    An additional benefit of simulations is that there are basically sims for all industries. Pilot and healthcare sims have existed for a very long time. But even if you only have access to one or two, you have the ability to make it relatable to any field. If you’re like me and teach a variety of classes, you can use one simulation for almost any class.

    Overall success

    I was using simulations before AI became so influential. The extensive and current use of AI has driven me to use more simulations in all of my courses. By adjusting what tools I use, I have been able to encourage more thorough problem solving, active listening and reasoning. Plus, I get strategic and effective questions from my students. The overall results include intense engagement, better critical thinking skills, and content retention.


    Written by Therese Gedemer, Adjunct Instructor and Workforce Development Trainer, Marian University, Moraine Park Tech College and Bryant & Stratton College



  • HEDx Podcast: Head of AI at Macquarie uni – Episode 158


    Phil Laufenberg is the Head of Artificial Intelligence (AI) at Macquarie University. His experience ranges from tech startups to executive roles in public universities across three continents.

    His vision is for AI-enabled universities that accelerate accessible education for all, and he sees universities partnering with tech companies as one way to get there.



  • AI-Powered Teaching: Personalizing Online Courses – The Cengage Blog



    Let’s face it: education is changing with technology. But hasn’t it always? Imagine the calligraphy teacher’s grimace at the typewriter. Math teachers and calculators, English teachers and spellcheck, history teachers and Google — instructors quickly adopted all of these tools for their own usage. The same opportunity arises with the explosion of artificial intelligence.

    Personalizing asynchronous courses

    Having been an online student and now leading online courses, I empathize with both sets of stakeholders. Online courses have grown with the availability of the internet and lowered home computer costs. The flexibility asynchronous courses offer is what makes them desirable. Neither party must be in a specific classroom at a particular time. This allows both to work a more convenient schedule.

    The most obvious challenge for instructors is bringing value to the students in a format that lacks the personalization of the classroom setting. Emails and discussion boards don’t communicate with the same personal touch. Recording classroom lectures for a face-to-face class certainly has some merit. The online student gets to hear and watch lectures and discussions. Yet, this is not a feasible solution for instructors who don’t teach in-person and online sections of the same subject. Also, recorded lectures may give the online sections less time to consume the content than their in-person peers.

    Recorded lecture: the challenges

    Until recently, my modus operandi was recording lectures for online students. I did this to replicate what they would get in the classroom, albeit passively and without discussion. Unless these videos are reused for different semesters and classes, it still seems inefficient and strangely impersonal. The inefficiency comes from mistakes that I would have laughed off in a live course. However, they certainly became points of frustration when I watched myself stumble through a word or phrase that had rolled off the tongue effortlessly during the dry run. Sometimes, I didn’t realize my mic was not toggled on, resulting in a very uneventful silent film. Or someone would interrupt. I don’t think I’ve scratched the surface of all the things that disrupted my attempts. So, I looked for alternative sources of help.

    The power of AI avatars for lecture delivery

    I spent some time dabbling with AI avatars and seeing the potential to adopt the technology. The avatars cross the personalization hurdle by offering lifelike renditions with mannerisms and voice. While the technology is not quite as precise as recorded video, it’s good and getting better. The students have given it positive reviews. It is undoubtedly better than some of the textbook videos I had the unfortunate task of watching in a couple of my online courses as a student.

    Avatars also clear the hurdle of efficiency and frustration. Using an avatar, I no longer have to fret over interruptions or mistakes. The editing is all done in its script. I load what I want it to say, and the avatar says it. No “ums.” No coughs or sneezes to apologize for. No triple takes on the word, “anthropomorphic.” If I’m interrupted, I can save it and return to it later. This enables me to scale my efforts.

    Using Google’s NotebookLM to create AI-generated podcasts

    Depending on your social media algorithm, you were probably privy to people’s Spotify top stats or other creative memes of the phenomenon in early December 2024. Spotify created personal “Wrapped AI podcasts” based on AI’s interpretation of users’ listening habits throughout the year. From a marketing perspective, this is great cobranding for both Google and Spotify, but the instructor’s perspective is why I’m writing. I learned about NotebookLM at a recent conference. The real beauty is that, currently, it’s free with a Google account.

    Evaluating anecdotal evidence from my courses again, I found that students enjoyed the podcast version of the content. Instructors can add content that they have created and own the rights to, like lecture notes, and two AI “podcasters” will discuss it.

    Because it’s only audio content, students can listen to it anywhere they are with their phones. Some comments that I noted were, “Listening to it felt less like studying” and “It was easy to listen to driving in my car.” This adds another layer of content consumption for students.

    Balancing AI and instructor presence

    Though I offer these two technologies to deliver content to students, I do so as supplements to recorded lectures and web meetings. Indeed, in this era of AI, it is easy to become enamored with or apprehensive of the technology. Our students live very digitalized lives. Immersing yourself in emerging technologies while still interacting with online students in more “traditional” formats can help you keep up with the times. You can still lean on tried-and-true education delivery. I think the key is to be willing to try a new technology and ask the students what they think of it. So many educators are worried about replacement, but at this stage, we need to treat AI as an enhancement. So many digital platforms are using it. Why not use it in online classes responsibly?

    Written by Britton Leggett, Assistant Professor of Marketing at McNeese State University and Cengage Faculty Partner.


    Source link

  • Engaging Students in Collaborative Research and Writing Through Positive Psychology, Student Wellness, and Generative AI Integration – Faculty Focus

    Engaging Students in Collaborative Research and Writing Through Positive Psychology, Student Wellness, and Generative AI Integration – Faculty Focus

    Source link

  • 25 (Mostly AI) Sessions to Enjoy in 2025 – The 74

    25 (Mostly AI) Sessions to Enjoy in 2025 – The 74



    South by Southwest Edu returns to Austin, Texas, running March 3-6. As always, it’ll offer a huge number of panels, discussions, film screenings, musical performances and workshops exploring education, innovation and the future of schooling.

    Keynote speakers this year include neuroscientist Anne-Laure Le Cunff, founder of Ness Labs, an online educational platform for knowledge workers; astronaut, author and TV host Emily Calandrelli; and Shamil Idriss, CEO of Search for Common Ground, an international non-profit. Idriss will speak about what it means to be strong in the face of opposition — and how to turn conflict into cooperation. Also featured: indie musical artist Jill Sobule, singing selections from her musical F*ck 7th Grade.

    As in 2024, artificial intelligence remains a major focus, with dozens of sessions exploring AI’s potential and pitfalls. But other topics are on tap as well, including sessions on playful learning, book bans and the benefits of prison journalism. 

    To help guide the way, we’ve scoured the schedule to highlight 25 of the most significant presenters, topics and panels: 

    Monday, March 3:

    11 a.m. — Ultimate Citizens Film Screening: A new independent film features a Seattle school counselor who builds a world-class Ultimate Frisbee team with a group of immigrant children at Hazel Wolf K-8 School. 

    11:30 a.m. — AI & the Skills-First Economy: Navigating Hype & Reality: Generative AI is accelerating the adoption of a skills-based economy, but many are skeptical about its value, impact and the pace of growth. Will AI spark meaningful change and a new economic order, or is it just another overhyped trend? Meena Naik of Jobs for the Future leads a discussion with Colorado Community College System Associate Vice Chancellor Michael Macklin, Nick Moore, an education advisor to Alabama Gov. Kay Ivey, and Best Buy’s Ryan Hanson.

    11:30 a.m. — Navigation & Guidance in the Age of AI: The Clayton Christensen Institute’s Julia Freeland Fisher headlines a panel that looks at how generative AI can help students access 24/7 help in navigating pathways to college. As new models take root, the panel will explore what entrepreneurs are learning about what students want from these systems. Will AI level the playing field or perpetuate inequality? 

    12:30 p.m. — Boosting Student Engagement Means Getting Serious About Play: New research shows students who are engaged in schoolwork not only do better in school but are happier and more confident in life. And educators say they’d be happier at work and less likely to leave the profession if students engaged more deeply. In this session, LEGO Education’s Bo Stjerne Thomsen will explore the science behind playful learning and how it can get students and teachers excited again.

    1:30 p.m. — The AI Sandbox: Building Your Own Future of Learning: Mike Yates of The Reinvention Lab at Teach for America leads an interactive session offering participants the chance to build their own AI tools to solve real problems they face at work, school or home. The session is for AI novices as well as those simply curious about how the technology works. Participants will get free access to Playlab.AI.

    2:30 p.m. — Journalism Training in Prison Teaches More Than Headlines: Join Charlotte West of Open Campus, Lawrence Bartley of The Marshall Project and Yukari Kane of the Prison Journalism Project to explore real-life stories from behind bars. Journalism training is transforming the lives of a few of the more than 1.9 million people incarcerated in the U.S., teaching skills from time management to communication and allowing inmates to feel connected to society while building job skills. 

    Tuesday, March 4:

    11:30 a.m. — Enough Talk! Let’s Play with AI: Amid the hand-wringing about what AI means for the future of education, there’s been little conversation about how a few smart educators are already employing it to shift possibilities for student engagement and classroom instruction. In this workshop, attendees will learn how to leverage promising practices emerging from research with real educators using AI in writing, creating their own chatbots and differentiating support plans. 

    12:30 p.m. — How Much is Too Much? Navigating AI Usage in the Classroom: AI-enabled tools can be helpful for students conducting research, outlining written work, or proofing and editing submissions. But there’s a fine line between using AI appropriately and taking advantage of it, leaving many students wondering, “How much AI is too much?” This session, led by Turnitin’s Annie Chechitelli, will discuss the rise of GenAI, its intersection with academia and academic integrity, and how to determine appropriate usage.  

    1 p.m. — AI & Edu: Sharing Real Classroom Successes & Challenges: Explore the real-world impact of AI in education during this interactive session hosted by Zhuo Chen, a text analysis instructor at the nonprofit education startup Constellate, and Dylan Ruediger of the research and consulting group Ithaka S+R. Chen and Ruediger will share successes and challenges in using AI to advance student learning, engagement and skills. 

    1 p.m. — Defending the Right to Read: Working Together: In 2025, authors face unprecedented challenges. This session, which features Scholastic editor and young adult novelist David Levithan, as well as Emily Kirkpatrick, executive director of the National Council of Teachers of English, will explore the battle for freedom of expression and the importance of defending reading in the face of censorship attempts and book bans.

    1 p.m. — Million Dollar Advice: Navigating the Workplace with Amy Poehler’s Top Execs: Kate Arend and Kim Lessing, the co-presidents of Amy Poehler’s production company Paper Kite Productions, will be live to record their workplace and career advice podcast “Million Dollar Advice.” The pair will tackle topics such as setting and maintaining boundaries, learning from Gen Z, dealing with complicated work dynamics, and more. They will also take live audience questions.

    4 p.m. — Community-Driven Approaches to Inclusive AI Education: With rising recognition of neurodivergent students, advocates say AI can revolutionize how schools support them by streamlining tasks, optimizing resources and enhancing personalized learning. In the process, schools can overcome challenges in mainstreaming students with learning differences. This panel features educators and advocates as well as Alex Kotran, co-founder and CEO of The AI Education Project.

    4 p.m. — How AI Makes Assessment More Actionable in Instruction: Assessments are often disruptive, cumbersome or disconnected from classroom learning. But a few advocates and developers say AI-powered assessment tools offer an easier, more streamlined way for students to demonstrate learning — and for educators to adapt instruction to meet their needs. This session, moderated by The 74’s Greg Toppo, features Khan Academy’s Kristen DiCerbo, Curriculum Associates’ Kristen Huff and Akisha Osei Sarfo, director of research at the Council of the Great City Schools.

    Wednesday, March 5:

    11 a.m. — Run, Hide, Fight: Growing Up Under the Gun Screening & Q&A: Gun violence is now the leading cause of death for American children and teens, according to the federal Centers for Disease Control and Prevention, yet coverage of gun violence’s impact on youth is usually reported by adults. Run, Hide, Fight: Growing Up Under the Gun is a 30-minute documentary by student journalists about how gun violence affects young Americans. Produced by PBS News Student Reporting Labs in collaboration with 14 student journalists in five cities, it centers the perspectives of young people who live their lives in the shadow of this threat. 

    11:30 a.m. — AI, Education & Real Classrooms: Educators are at the forefront of testing, using artificial intelligence and teaching their communities about it. In this interactive session, participants will hear from educators and ed tech specialists on the ground working to support the use of AI to improve learning. The session includes Stacie Johnson, director of professional learning at Khan Academy, and Dina Neyman, Khan Academy’s director of district success. 

    11:30 a.m. — The Future of Teaching in an Age of AI: As AI becomes increasingly present in the classroom, educators are understandably concerned about how it might disrupt their teaching. An expert panel featuring Jake Baskin, executive director of the Computer Science Teachers Association, and Karim Meghji of Code.org, will look at how teaching will change in an age of AI, exploring frameworks for teaching AI skills and sharing best practices for integrating AI literacy across disciplines.

    2:30 p.m. — AI in Education: Preparing Gen A as the Creators of Tomorrow: Generation Alpha is the first to experience generative artificial intelligence from the start of their educational journeys. Thriving in a world shaped by AI requires educators to help these students tap into their natural creativity while navigating unique opportunities and challenges. In this session, a cross-industry panel of experts discusses strategies to integrate AI into learning, allowing critical thinking and curiosity to flourish while enabling early learners to become architects of AI, not just users.

    2:30 p.m. — The Ethical Use of AI in the Education of Black Children: Join a panel of educators, tech leaders and nonprofit officials as they discuss AI’s ethical complexities and its impact on the education of Black children. This panel will address historical disparities, biases in technology, and the critical need for ethical AI in education. It will also offer unique perspectives into the benefits and challenges of AI in Black children’s education, sharing best practices to promote the safe, ethical and legal use of AI in classrooms.

    2:30 p.m. — Exploring Teacher Morale State by State: Is teacher morale shaped by where teachers work? Find out as Education Week releases its annual State of Teaching survey. States and school districts drive how teachers are prepared, paid and promoted, and the findings will raise new questions about what leaders and policymakers should consider as they work to support an essential profession. The session features Holly Kurtz, director of EdWeek Research Center, Stephen Sawchuk, EdWeek assistant managing editor, and assistant editor Sarah D. Sparks.

    2:30 p.m. — From White Folks Who Teach in the Hood: Is This Conversation Against the Law Now? While most students in U.S. public schools are now young people of color, more than 80% of their teachers are white. How do white educators understand and address these dynamics? Join a live recording of a podcast that brings together white educators with Christopher Emdin and sam seidel, co-editors of From White Folks Who Teach in the Hood: Reflections on Race, Culture, and Identity (Beacon, 2024).

    3:30 p.m. — How Youth Use GenAI: Time to Rethink Plagiarism: Schools are locked in a battle with students over fears they’re using generative artificial intelligence to plagiarize existing work. In this session, join Elliott Hedman, a “customer obsession engineer” with mPath, who with colleagues and students co-designed a GenAI writing tool to reframe AI use. Hedman will share three strategies that not only prevent plagiarism but also teach students how to use GenAI more productively.  

    Thursday, March 6:

    10 a.m. — AI & the Future of Education: Join futurists Sinead Bovell and Natalie Monbiot for a fireside discussion about how we prepare kids for a future we cannot yet see but know will be radically transformed by technology. Bovell and Monbiot will discuss the impact of artificial intelligence on our world and the workforce, as well as its implications for education. 

    10 a.m. — Reimagining Everyday Places as Early Learning Hubs: Young children spend 80% of their time outside of school, but too many lack access to experiences that encourage learning through hands-on activities and play. While these opportunities exist in middle-class and upper-income neighborhoods, they’re often inaccessible to families in low-income communities. In this session, a panel of designers and educators featuring Sarah Lytle, who leads the Playful Learning Landscapes Action Network, will look at how communities are transforming overlooked spaces such as sidewalks, shelters and even jails into nurturing learning environments accessible to all kids.

    11 a.m. — Build-a-Bot Workshop: Make Your Own AI to Make Sense of AI: In this session, participants will build an AI chatbot alongside designers and engineers from Stanford University and Stanford’s d.school, getting to the core of how AI works. Participants will conceptualize, outline and create conversation flows for their own AI assistant and explore methods that technical teams use to infuse warmth and adaptability into interactions and develop reliable chatbots.  

    11:30 a.m. — Responsible AI: Balancing Innovation, Impact, & Ethics: In this session, participants will learn how educators, technologists and policymakers work to develop AI responsibly. Panelists include Isabelle Hau of the Stanford Accelerator for Learning, Amelia Kelly, chief technology officer of the Irish AI startup SoapBox Labs, and Latha Ramanan of the AI developer Merlyn Mind. They’ll talk about how policymakers and educators can work with developers to ensure transparency and accuracy of AI tools. 



    Source link

  • Earning Our AI Literacy License – Faculty Focus

    Earning Our AI Literacy License – Faculty Focus

    Source link

  • The Student Assistant Supports Learning and Teaching

    The Student Assistant Supports Learning and Teaching

    Reading Time: 3 minutes

    AI is becoming a bigger part of our daily lives, and students are already using it to support their learning. In fact, our studies show that 90% of faculty feel GenAI will play an increasingly important role in higher ed.

    Embracing AI responsibly, with thoughtful innovation, can help students take charge of their educational journey. So we draw on the insights and expertise of you and your students to develop AI tools that support and empower learners, while maintaining ethical practices, accuracy and a focus on the human side of education.

    Training the Student Assistant together

    Since introducing the Student Assistant in August 2024, we have ensured that faculty, alongside students, play a central role in training it.

    Students work directly with the tool through conversation. Instructors review these exchanges to ensure the Student Assistant guides students through a collaborative, critical-thinking process — helping them find answers on their own, rather than providing them directly.

    “I was extremely impressed with the training and evaluation process. The onboarding process was great, and the efforts taken by Cengage to ensure parity in the evaluation process was a good-faith sign of the quality and accuracy of the Student Assistant.” — Dr. Loretta S. Smith, Professor of Management, Arkansas Tech University

    Supporting students through our trusted sources

    The Student Assistant uses only Cengage-authored course materials — it does not search the web.

    By leveraging content aligned directly with the instructor’s chosen textbook, the Student Assistant provides reliable, real-time guidance that helps students bridge knowledge gaps — without ever relying on external sources that may lack credibility.

    Unlike tools that rely on potentially unreliable web sources, the Student Assistant ensures that every piece of guidance aligns with course objectives and instructor expectations.

    Here’s how:

    • It uses assigned Cengage textbooks, eBooks and resources, ensuring accuracy and relevance for every interaction
    • The Student Assistant avoids pulling content from the web, eliminating the risks of misinformation or content misalignment
    • It does not store or share student responses, keeping information private and secure

    By staying within our ecosystem, the Student Assistant fosters academic integrity and ensures students are empowered to learn with autonomy and confidence.

    “The Student Assistant is user friendly and adaptive. The bot responded appropriately and in ways that prompt students to deepen their understanding without giving away the answer.” – Lois McWhorter, Department Chair for the Hutton School of Business at the University of the Cumberlands

    Personalizing the learning journey

    56% of faculty cited personalization as a top use case for GenAI to help enhance the learning experience.

    The Student Assistant enhances student outcomes by offering a personalized educational experience. It provides students with tailored resources that meet their unique learning needs right when they need them. With personalized, encouraging feedback and opportunities to connect with key concepts in new ways, students gain a deeper understanding of their coursework. This helps them close learning gaps independently and find the answers on their own, empowering them to take ownership of their education.

    “What surprised me most about using the Student Assistant was how quickly it adapted and adjusted to feedback. While the Student Assistant helped support students with their specific questions or tasks, it did so in a way that allowed for a connection. It was not simply a bot that pointed you to the correct answer in the textbook; it assisted students similar to how a professor or instructor would help a student.” — Dr. Stephanie Thacker, Associate Professor of Business for the Hutton School of Business at the University of the Cumberlands

    Helping students work through the challenges

    The Student Assistant is available 24/7 to help students practice concepts without the need to wait for feedback, enabling independent learning before seeking instructor support.

    With just-in-time feedback, students can receive guidance tailored to their course, helping them work through challenges on their own schedule. By guiding students to discover answers on their own, rather than providing them outright, the Student Assistant encourages critical thinking and deeper engagement.

    “Often students will come to me because they are confused, but they don’t necessarily know what they are confused about. I have been incredibly impressed with the Student Assistant’s ability to help guide students to better understand where they are struggling. This will not only benefit the student but has the potential to help me be a better teacher, enable more critical thinking and foster more engaging classroom discussion.” — Professor Noreen Templin, Department Chair and Professor of Economics at Butler Community College

    Want to start using the Student Assistant for your courses?

    The Student Assistant, embedded in MindTap, is available in beta with select titles, such as “Management,” “Human Psychology” and “Principles of Economics” — with even more coming this fall. Find the full list of titles that currently feature the Student Assistant, plus learn more about the tool and AI at Cengage right here.

    Source link