Category: ChatGPT

  • Regurgitative AI: Why ChatGPT Won’t Kill Original Thought – Faculty Focus



  • Students Increasingly Rely on Chatbots, but at What Cost? – The 74




    Students don’t have the same incentives to talk to their professors — or even their classmates — anymore. Chatbots like ChatGPT, Gemini and Claude have given them a new path to self-sufficiency. Instead of asking a professor for help on a paper topic, students can go to a chatbot. Instead of forming a study group, students can ask AI for help. These chatbots give them quick responses, on their own timeline.

    For students juggling school, work and family responsibilities, that ease can seem like a lifesaver. And maybe turning to a chatbot for homework help here and there isn’t such a big deal in isolation. But every time a student decides to ask a question of a chatbot instead of a professor or peer or tutor, that’s one fewer opportunity to build or strengthen a relationship, and the human connections students make on campus are among the most important benefits of college.

    Julia Freeland-Fisher studies how technology can help or hinder student success at the Clayton Christensen Institute. She said the consequences of turning to chatbots for help can compound.

    “Over time, that means students have fewer and fewer people in their corner who can help them in other moments of struggle, who can help them in ways a bot might not be capable of,” she said.

    As colleges further embed ChatGPT and other chatbots into campus life, Freeland-Fisher warns lost relationships may become a devastating unintended consequence.

    Asking for help

    Christian Alba said he has never turned in an AI-written assignment. Alba, 20, attends College of the Canyons, a large community college north of Los Angeles, where he is studying business and history. And while he hasn’t asked ChatGPT to write any papers for him, he has turned to the technology when a blank page and a blinking cursor seemed overwhelming. He has asked for an outline. He has asked for ideas to get him started on an introduction. He has asked for advice about what to prioritize first.

    “It’s kind of hard to just start something fresh off your mind,” Alba said. “I won’t lie. It’s a helpful tool.” Alba has wondered, though, whether turning to ChatGPT with these sorts of questions represents an overreliance on AI. But Alba, like many others in higher education, worries primarily about AI use as it relates to academic integrity, not social capital. And that’s a problem.

    Jean Rhodes, a psychology professor at the University of Massachusetts Boston, has spent decades studying the way college students seek help on campus and how the relationships formed during those interactions end up benefitting the students long-term. Rhodes doesn’t begrudge students integrating chatbots into their workflows, as many of their professors have, but she worries that students will get inferior answers to even simple-sounding questions, like, “how do I change my major?”

    A chatbot might point a student to the registrar’s office, Rhodes said, but had a student asked the question of an advisor, that person may have asked important follow-up questions — why the student wants the change, for example, which could lead to a deeper conversation about a student’s goals and roadblocks.

    “We understand the broader context of students’ lives,” Rhodes said. “They’re smart but they’re not wise, these tools.”

    Rhodes and one of her former doctoral students, Sarah Schwartz, created a program called Connected Scholars to help students understand why it’s valuable to talk to professors and have mentors. The program helped them hone their networking skills and understand what people get out of their networks over the course of their lives — namely, social capital.

    Connected Scholars is offered as a semester-long course at UMass Boston, and a forthcoming paper examines outcomes over the last decade, finding that students who take the course are three times more likely to graduate. Over time, Rhodes and her colleagues discovered that the key to the program’s success is getting students past an aversion to asking others for help.

    Students will make a plethora of excuses to avoid asking for help, Rhodes said, ticking off a list of them: “‘I don’t want to stand out,’ ‘I don’t want people to realize I don’t fit in here,’ ‘My culture values independence,’ ‘I shouldn’t reach out,’ ‘I’ll get anxious,’ ‘This person won’t respond.’ If you can get past that and get them to recognize the value of reaching out, it’s pretty amazing what happens.”

    Connections are key

    Seeking human help doesn’t only leave students with the resolution to a single problem; it gives them a connection to another person. That person could, down the line, become a friend, a mentor or a business partner — a “strong tie,” as social scientists call the people central to one’s network. Or they could become a “weak tie” whom a student may not see often but who could, importantly, still offer a job lead or crucial social support one day.

    Daniel Chambliss, a retired sociologist from Hamilton College, emphasized the value of relationships in his 2014 book, “How College Works,” co-authored with Christopher Takacs. Over the course of their research, the pair found that the key to a successful college experience boiled down to relationships, specifically two or three close friends and one or two trusted adults. Hamilton College goes out of its way to make sure students can form those relationships, structuring work-study to get students into campus offices and around faculty and staff, making room for students of varying athletic abilities on sports teams, and more.

    Chambliss worries that AI-driven chatbots make it too easy to avoid interactions that can lead to important relationships. “We’re suffering epidemic levels of loneliness in America,” he said. “It’s a really major problem, historically speaking. It’s very unusual, and it’s profoundly bad for people.”

    As students increasingly turn to artificial intelligence for help and even casual conversation, Chambliss predicted it will make people even more isolated: “It’s one more place where they won’t have a personal relationship.”

    In fact, a recent study by researchers at the MIT Media Lab and OpenAI found that the most frequent users of ChatGPT — power users — were more likely to be lonely and isolated from human interaction.

    “What scares me about that is that Big Tech would like all of us to be power users,” said Freeland-Fisher. “That’s in the fabric of the business model of a technology company.”

    Yesenia Pacheco is preparing to re-enroll at Long Beach City College for her final semester after more than a year off. Last time she was on campus, ChatGPT existed, but it wasn’t widely used. Now she knows she’s returning to a college where ChatGPT is deeply embedded in the lives of students, faculty and staff, but Pacheco expects she’ll go back to her old habits — going to her professors’ office hours and sticking around after class to ask questions. She sees the value.

    She understands why others might not. Today’s high schoolers, she has noticed, are not used to talking to adults or building mentor-style relationships. At 24, she knows why they matter.

    “A chatbot,” she said, “isn’t going to give you a letter of recommendation.”

    This article was originally published on CalMatters and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.




  • AI and Art Collide in This Engineering Course That Puts Human Creativity First – The 74


    I see many students viewing artificial intelligence as humanlike simply because it can write essays, do complex math or answer questions. AI can mimic human behavior but lacks meaningful engagement with the world.

    This disconnect inspired my course “Art and Generative AI,” which was shaped by the ideas of 20th-century German philosopher Martin Heidegger. His work highlights how we are deeply connected and present in the world. We find meaning through action, care and relationships. Human creativity and mastery come from this intuitive connection with the world. Modern AI, by contrast, simulates intelligence by processing symbols and patterns without understanding or care.

    In this course, we reject the illusion that machines fully master everything and put student expression first. In doing so, we value uncertainty, mistakes and imperfection as essential to the creative process.

    This vision expands beyond the classroom. In the 2025-26 academic year, the course will include a new community-based learning collaboration with Atlanta’s art communities. Local artists will co-teach with me to integrate artistic practice and AI.

    The course builds on my 2018 class, Art and Geometry, which I co-taught with local artists. That course explored Picasso’s cubism, which depicted reality as fractured from multiple perspectives; it also looked at Einstein’s relativity, the idea that time and space are not absolute and distinct but part of the same fabric.

    What does the course explore?

    We begin by exploring the perceptron, the first mathematical model of a neuron. Then, we study the Hopfield network, which mimics how our brain can remember a song from hearing just a few notes by filling in the rest. Next, we look at Hinton’s Boltzmann machine, a generative model that can also imagine and create new, similar songs. Finally, we study today’s deep neural networks and transformers, AI models that mimic how the brain learns to recognize images, speech or text. Transformers are especially well suited to understanding sentences and conversations, and they power technologies such as ChatGPT.
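    To make the course’s starting point concrete, here is a minimal perceptron sketch in Python. The AND-gate data, learning rate and epoch count are illustrative choices, not taken from the course materials:

```python
import numpy as np

# Minimal perceptron sketch: weighted inputs, a bias, and a step
# activation, trained with the classic perceptron learning rule.
# The AND-gate data, learning rate and epoch count are illustrative.

def step(x):
    """Heaviside step activation: fire (1) if the weighted sum is >= 0."""
    return 1 if x >= 0 else 0

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Learn weights w and bias b so that step(w @ x + b) matches y."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            err = target - step(w @ xi + b)
            w += lr * err * xi   # nudge weights toward the target
            b += lr * err
    return w, b

# Logical AND is linearly separable, so the perceptron can learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [step(w @ xi + b) for xi in X]   # [0, 0, 0, 1]
```

    The later models in the sequence, from Hopfield networks to transformers, build on this same idea of adjusting weights from examples, at vastly larger scale.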

    In addition to AI, we integrate artistic practice into the coursework. This approach broadens students’ perspectives on science and engineering through the lens of an artist. The first offering of the course in spring 2025 was co-taught with Mark Leibert, an artist and professor of the practice at Georgia Tech. His expertise is in art, AI and digital technologies. He taught students fundamentals of various artistic media, including charcoal drawing and oil painting. Students used these principles to create art using AI ethically and creatively. They critically examined the source of training data and ensured that their work respects authorship and originality.

    Students also learn to record brain activity using electroencephalography – EEG – headsets. Through AI models, they then learn to transform neural signals into music, images and storytelling. This work inspired performances where dancers improvised in response to AI-generated music.

    The Improv AI performance at Georgia Institute of Technology on April 15, 2025. Dancers improvised to music generated by AI from brain waves and sonified black hole data.

    Why is this course relevant now?

    AI entered our lives so rapidly that many people don’t fully grasp how it works, why it works, when it fails or what its mission is.

    In creating this course, my aim was to empower students by filling that gap. Whether they are new to AI or not, the goal is to make its inner algorithms clear, approachable and honest. We focus on what these tools actually do and how they can go wrong.

    We place students and their creativity first. We reject the illusion of a perfect machine; instead, we provoke the AI model to become confused and hallucinate, generating inaccurate or nonsensical responses. To do so, we deliberately use a small dataset, reduce the model size or limit training. It’s in these flawed states of AI that students step in as conscious co-creators. The students are the missing algorithm that takes back control of the creative process. Their creations do not obey the AI but reimagine its output by the human hand. The artwork is rescued from automation.

    What’s a critical lesson from the course?

    Students learn to recognize AI’s limitations and harness its failures to reclaim creative authorship. The artwork isn’t merely generated by AI; it’s reimagined by students.

    Students learn that chatbot queries have an environmental cost, because large AI models use a lot of power. They avoid unnecessary iterations when designing prompts or using AI, which helps reduce carbon emissions.

    The Improv AI performance on April 15, 2025, featured dancer Bekah Crosby responding to AI-generated music from brain waves.

    The course prepares students to think like artists. Through abstraction and imagination they gain the confidence to tackle the engineering challenges of the 21st century. These include protecting the environment, building resilient cities and improving health.

    Students also realize that while AI has vast engineering and scientific applications, ethical implementation is crucial. Understanding the type and quality of training data that AI uses is essential. Without it, AI systems risk producing biased or flawed predictions.

    Uncommon Courses is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.

    This article is republished from The Conversation under a Creative Commons license. Read the original article.


  • Reviving Engagement in the Spanish Classroom: A Musical Challenge with ChatGPT – Faculty Focus



  • AI in Practice: Using ChatGPT to Create a Training Program


    by Julie Burrell | September 24, 2024

    Like many HR professionals, Colorado Community College System’s Jennifer Parker was grappling with an increase in incivility on campus. She set about creating a civility training program that would be convenient and interactive. However, she faced the considerable challenges of creating a virtual training program from scratch, solo. Parker’s creative answer to one of these challenges — writing scripts for her under-10-minute videos — was to put ChatGPT to work for her.

    How did she do it? This excerpt from her article, A Kinder Campus: Building an AI-Powered, Repeatable and Fun Civility Training Program, offers several tips.

    Using ChatGPT for Training and Professional Development

    I love using ChatGPT. It is such a great tool. Let me say that again: it’s such a great tool. I look at ChatGPT as a brainstorming partner. I don’t use it to write my scripts, but I do use it to get me started or to fix what I’ve written. I ask questions that I already know the answer to. I’m not using it for technical guidance in any way.

    What should you consider when you use ChatGPT for scriptwriting and training sessions?

    1. Make ChatGPT an expert. In my prompts, I often use the phrase, “Act like a subject matter expert on [a topic].” This helps define both the need and the audience for the information. If I’m looking for a list of reasons why people are uncivil on college campuses, I might prompt with, “Act like an HR director of a college campus and give me a list of ways employees are acting uncivil in the workplace.” Using the phrase above sets parameters on the types of answers ChatGPT will offer and shapes the perspective of the answers as for and about higher ed HR.
    2. Be specific about what you’re looking for. “I’m creating a training on active listening. This is for employees on a college campus. Create three scenarios in a classroom or office setting of employees acting unkind to each other. Also provide two solutions to those scenarios using active listening. Then, create a list of action steps I can use to teach employees how to actively listen based on these scenarios.” Being as specific as possible can help get you where you want to go. Once I get answers from ChatGPT, I can then decide if I need to change direction, start over or just get more ideas. There is no wrong step. It’s just you and your partner figuring things out.
    3. Sometimes ChatGPT can get stuck in a rut. It will start giving you the same or similar answers no matter how you reword things. My solution is to start a new conversation. I also change the prompt. Don’t be afraid to play around, to ask a million questions, or even tell ChatGPT it’s wrong. I often type something like, “That’s not what I’m looking for. You gave me a list of______, but what I need is ______. Please try again.” This helps the system to reset.
    4. Once I get close to what I want, I paste it all into another document, rewrite, and cite my sources. I use this document as an outline to rewrite it all in my own voice. I make sure it sounds like how I talk and write. This is key. No one wants to listen to ChatGPT’s voice. And I guarantee that people will know if you’re using its voice — it has a very conspicuous style. Once I’ve honed my script, I find relevant sources to back the information up and cite them at the end of my documents, in case I need to refer to them.

    What you’ll see here is an example of how I used ChatGPT to help me write the scripts for the micro-session on conflict. It’s an iterative but replicable process. I knew what the session would cover, but I wanted to brainstorm with ChatGPT.

    Once I’ve had multiple conversations with the chatbot, I go back through the entire script and pick out what I want to use. I make sure it’s in my own voice and then I’m ready to record. I also used ChatGPT to help with creating the activities and discussion questions in the rest of the micro-session.

    I know using ChatGPT can feel overwhelming but rest assured that you can’t really make a mistake. (And if you’re worried the machines are going to take over, throw in a “Thank you!” or “You’re awesome!” occasionally for appeasement’s sake.)

    About the author: Jennifer Parker is assistant director of HR operations at the Colorado Community College System.

    More Resources

    • Read Parker’s full article on creating a civility training program with help from AI.
    • Learn more about ChatGPT and other chatbots.
    • Explore CUPA-HR’s Civility in the Workplace Toolkit.




  • Embracing the Future of HR: Your AI Questions Answered – CUPA-HR


    by Julie Burrell | April 16, 2024

    In his recent webinar for CUPA-HR, Rahul Thadani, senior executive director of HR information systems at the University of Alabama at Birmingham, answered some of the most frequently raised questions about AI in HR. He also spoke to the most prevalent worries, including concerns about data privacy and whether AI will compete with humans for jobs.

    In addition to covering the basics on AI and how it works, Thadani addressed questions about the risks and rewards of using AI in HR, including:

    • How can AI speed up productivity now?
    • What AI tools should HR be using?
    • How well is AI integrated into enterprise software?
    • What are the risks and downsides of using AI?
    • What role will AI play in the future of HR?

    Thadani also put to rest a common fear about AI: that it will replace human jobs. He believes that HR is too complex, too fundamentally human a role to be automated. AI only simulates human intelligence; it can’t make human decisions. Thadani reminded HR pros, “you all know how complex humans are, how complex decision-making is for humans.” AI can’t understand “the many components that go into hiring somebody,” for example, or how to measure employee engagement.

    AI won’t replace skilled HR professionals, but HR can’t afford to ignore AI. Thadani and other AI leaders stress that HR has a critical role to play in how AI is used on campuses. As the people experts, HR must have a seat at the table in AI discussions, partnering with IT and leadership on decisions such as how employees’ data are used and which AI software to test and purchase.

    Take the First Step

    Most people are just getting started on their AI journey. As a first step for those new to AI, Thadani recommends signing up for a ChatGPT account or another chatbot, like Google’s Gemini. He suggests using your private email account in case you need to sign a privacy agreement that doesn’t align with your institution’s policies. Test out what these chatbots are capable of by using this quick guide to chatbots.

    For leaders and supervisors, Thadani proposes having ongoing conversations within your department, on your campus and with your leadership. Some questions to consider in these conversations: Does your campus have an AI governance council? If so, is HR taking part? Do you have internal AI guidelines in place to protect data and privacy, in your department or for your campus? If not, do you have a plan to develop them? (As a leader in the AI space, the University of Michigan has AI guidelines that provide a good model, and are broken down into staff, faculty and student guidance categories.) Have you identified thought leaders in AI in your office or on your campus who can spur discussions and recommend best practices?

    In HR, “there’s definitely an eagerness to be ready and be ahead of the curve” when it comes to AI, Thadani noted. AI will undoubtedly be central to the future of work, and it’s up to HR to proactively guide how AI can be leveraged in ethical and responsible ways.





  • Three Essential AI Tools and Practical Tips for Automating HR Tasks – CUPA-HR


    by Julie Burrell | March 27, 2024

    During his recent keynote at CUPA-HR’s Higher Ed HR Accelerator, Commissioner Keith Sonderling of the Equal Employment Opportunity Commission observed, “now, AI exists in HR in every single stage of employment,” from writing job descriptions, to sourcing candidates and scheduling interviews, and well into the career lifecycle of employees.

    At some colleges and universities, AI is now a routine part of the HR workflow. At the University of North Texas at Dallas, for example, AI has significantly sped up the recruitment and hiring timeline. “It helped me staff a unit in an aggressive time frame,” says Tony Sanchez, chief human resources officer, who stresses that they use AI software with privacy protections. “AI parsed resumes, prescreened applicants, and allowed scheduling directly to the hiring manager’s calendar.”

    Even as AI literacy is becoming a critical skill, many institutions of higher education have not yet adopted AI as a part of their daily operations. But even if you don’t have your own custom AI tool like the University of Michigan’s, free AI tools can still be a powerful daily assistant. With some common-sense guardrails in place, AI can help you automate repetitive tasks, make software like Excel easier to use, analyze information and polish your writing.

    Three Free Chatbots to Use Now

    AI development is moving at a breakneck pace, which means that even the freely available tools below are more useful than they were just a few months ago. Try experimenting with multiple AI chatbots by having different browser windows open and asking each chatbot to do the same task. Just don’t pick a favorite yet. With AI companies constantly trying to outperform each other, one might work better depending on the day or the task. And before you start, be sure to read the section on AI guardrails below — you never want to input proprietary or private information into a public chatbot.

    ChatGPT, the AI trailblazer. The free version allows unlimited chats after signing up for an account. Right now, ChatGPT is text-based, which means it can help you with emails and communications, or even draft longer materials like reports. It can also solve math problems and answer questions (but beware of fabricated answers).

    You can customize ChatGPT to make it work better for you by clicking on your username in the bottom lefthand corner. For example, you can tell it that you’re an HR professional working in higher education, and it will tailor its responses to what it knows about your job.

    Google’s powerful AI chatbot, Gemini (formerly known as Bard). You’ll need to have or sign up for a free Google account, and it’s well worth it. Gemini can understand and interact with text just like ChatGPT does, but it’s also multimodal. You can drag and drop images and it will be able to interpret them. Gemini can also make tables, which can be exported to Google Sheets. And it generates images for free. For example, if you have an image you want your marketing team to design, you can get started by asking Gemini to create what you have in mind. But for now, Gemini won’t create images of people.

    Claude, often considered the best AI writer. Take Claude for a spin by asking it to write a job description or memo for you. Be warned that the free version of Claude has a daily usage limit, and you won’t know you’ve hit it until you hit it. According to Claude, your daily limit depends on demand, and your quota resets every morning.

    These free AI tools aren’t as powerful as their paid counterparts — all about $20 per month — but they do offer a sense of what AI can do.

    Practical Tips for Using AI in HR 

    For a recent Higher Ed HR Magazine article, I asked higher education HR professionals how they used AI to increase efficiency. Rhonda Beassie, associate vice president for people and procurement operations at Sam Houston State University, shared that she and her team are using AI for both increased productivity and upskilling, such as:

    • Creating first drafts of and benchmarking job descriptions.
    • Making flyers, announcements and other employee communications.
    • Designing training presentations, including images, text, flow and timing.
    • Training employees for deeper use of common software applications.
    • Providing instructions on developing and troubleshooting questions for macros and VLOOKUP in Microsoft Excel.
    • Troubleshooting software. Beassie noted that employees “can simply say to the AI, ‘I received an error message of X. How do I need to change the script to correct this?’ and options are provided.”
    • Creating reports pulled from their enterprise system.

    AI chatbots are also great at:

    • Being a thought partner. Ask a chatbot to help you respond to a tricky email, to find the flaws in your argument or to point out things you’ve missed in a piece of writing.
    • Revising the tone, formality or length of writing. You can ask chatbots to make something more or less formal or friendly (or whatever tone you’re trying to strike), remove the jargon from a piece of writing, or lengthen or shorten something.
    • Summarizing webpages, articles or book chapters. You can cut and paste a URL into a chatbot and ask it to summarize the page for you. You can also cut and paste a fairly large amount of text into chatbots and ask it for a summary. Try using parameters, such as “Summarize this into one sentence,” or “Please give me a bulleted list of the main takeaways.” The summaries aren’t always perfect, but will usually do in a pinch.
    • Summarizing YouTube videos. (Currently, the only free tool that can do this is Gemini.) Just cut and paste in the URL and ask it to summarize a video for you. Likewise, these summaries aren’t always exactly accurate.
    • Writing in your voice. Ask a chatbot to learn your voice and style by entering in things you’ve written. Ask it to compose a communication, like a memo or email you need to write, in your voice. This takes some time up front to train the AI, and it may not remember your voice from day-to-day or task-to-task.

    Practice Your Prompts

    Just 10 minutes a day can take you far in getting comfortable with these tools if you’re new to them. Learning prompting, which may take an upfront investment of more time, can unlock powerful capabilities in AI tools. The more complex the task you ask AI to do, the more time you need to spend crafting a prompt.

    The best prompts will ask a chatbot to assume a role and perform an action, using specific context. For example, “You are a human resources professional at a small, liberal arts college. You are writing a job description for an HR generalist. The position’s responsibilities include leading safety and compliance training; assisting with payroll; conducting background checks; troubleshooting employee questions in person and virtually. The qualifications for the job are one to two years in an HR office, preferably in higher education, and a BA.”
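    As an illustration only, the role-plus-action-plus-context pattern above can be captured in a small helper. The function name and wording are hypothetical, and the assembled text can just as easily be pasted straight into any chatbot:

```python
def build_prompt(role: str, action: str, context: str) -> str:
    """Compose a prompt from the three recommended pieces:
    a role to assume, an action to perform, and specific context."""
    return f"You are {role}. {action} {context}"

# The example prompt from the paragraph above, assembled piece by piece.
prompt = build_prompt(
    role="a human resources professional at a small, liberal arts college",
    action="Write a job description for an HR generalist.",
    context="The position's responsibilities include leading safety and "
            "compliance training; assisting with payroll; conducting "
            "background checks; troubleshooting employee questions in "
            "person and virtually. The qualifications for the job are one "
            "to two years in an HR office, preferably in higher education, "
            "and a BA.",
)
```

    Keeping the three pieces separate makes it easy to reuse the same role and context while swapping in a new action for each task.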

    Anthropic has provided a very helpful prompt library for Claude, which will also work with most AI chatbots.

    AI Guardrails

    There are real risks to using AI, especially the free tools listed above. You can read about them in detail here, or even ask AI to tell you, but the major dangers are:

    • Freely available AI will not protect your data privacy. Unless you have internal or enterprise software with a privacy agreement at your institution, assume everything you share with AI is public. Protected or confidential information should not be entered into a prompt.
    • AI fabricates, or hallucinates, as it’s sometimes called. It will make up facts that sound deceptively plausible. If you need accurate information, it’s best to consult an expert or trusted sources.
    • You don’t own copyright on AI-created work. In the United States, only human-produced work can be copyrighted.
    • Most of these tools are trained only up to a certain date, often a year or more ago for free chatbots. If you need up-to-the-minute information, use your favorite web browser.





  • Empowering Instructors & Students – Sovorel


    This is my new book, which I am very excited about because I think it can really help a lot of people, especially instructors (at all levels) and students. It is vital, truly an imperative, that we in academia help develop AI literacy. Available on Amazon as an ebook, paperback, or hardback: https://www.amazon.com/AI-Literacy-Imperative-Empowering-Instructors/dp/B0C51RLPCG

    BOOK DESCRIPTION:

    “The AI Literacy Imperative: Empowering Instructors & Students” is a seminal work that delves into the critical need for everyone to have AI literacy in modern society, especially in academia. The book explains how educators must have a deep understanding of the key aspects of AI literacy (Awareness, Capability, Knowledge, and Critical Thinking) to effectively teach this vital skill to students.

    Drawing upon extensive research and practical experience, author Brent A. Anders, PhD, presents a comprehensive guide for instructors to integrate AI literacy into their curriculum. By exploring the fundamental concepts and applications of AI, this book empowers educators to equip their students with the skills necessary for success in both their professional and personal lives in our new AI-integrated society.

    Throughout the book, the author demonstrates a deep understanding of the complexities of AI and its implications for society. Through a rigorous exploration of the latest research and pedagogical considerations, the book provides educators with a clear roadmap for teaching AI literacy in a way that is understandable, manageable and motivational, and that upholds academic integrity.

    “The AI Literacy Imperative: Empowering Instructors & Students” is a must-read for educators, students, instructional designers, librarians, researchers, and everyone else. By providing a comprehensive and easy-to-understand guide to the main components of AI literacy, covering overreliance, writing assignments with AI, deepfakes, ethical considerations, future possibilities, and much more in between, this book helps everyone better understand AI, use it more effectively in education, and create a better AI-integrated society.


  • Writing Assignments in the Age of AI – Sovorel


    I put this infographic together to help the many instructors who are struggling with this issue as they teach and try to keep students from using AI when they are not supposed to. Be sure to take every opportunity to help students learn about AI literacy, even when telling them that for a given assignment or evaluation they won’t be able to use it.

    You, as the instructor, are the subject matter expert and must be the one deciding how AI will be used in your classroom and for your assignments and evaluations. For some assignments, AI may not be the right answer because you are trying to help students develop skills mastery, so they can properly learn what “right” looks like. Be sure to fully explain this to them so that they understand why they can or cannot use AI.


  • How ChatGPT Can Help Prevent Violations of Academic Integrity – Sovorel


    A full article (including a video) describing each aspect of how ChatGPT can help with preventing violations of academic integrity (cheating) is provided in an article I wrote located here: https://brentaanders.medium.com/how-chatgpt-can-help-prevent-violations-of-academic-integrity-99ada37b52dd

    What are your thoughts on this or other aspects of ChatGPT and other AI in education? Leave a comment below.
