
  • Ohio District Awarded CoSN Trusted Learning Environment Mini Seal for Student Data Privacy Practices

    Washington, D.C. – CoSN today awarded Delaware Area Career Center in Delaware, Ohio, the Trusted Learning Environment (TLE) Mini Seal in the Business practice area. The CoSN TLE Seal is a national distinction awarded to school districts implementing rigorous privacy policies and practices to help protect student information. Delaware Area Career Center is the sixth school district in Ohio to earn a TLE Seal or TLE Mini Seal. To date, TLE Seal recipients have improved privacy protections for over 1.2 million students.

    The CoSN TLE Seal program requires that school systems uphold high standards for protecting student data privacy across five key practice areas: Leadership, Business, Data Security, Professional Development and Classroom. The TLE Mini Seal program enables school districts nationwide to build toward earning the full TLE Seal by addressing privacy requirements in one or more practice areas at a time. All TLE Seal and Mini Seal applicants receive feedback and guidance to help them improve their student data privacy programs.

    “CoSN is committed to supporting districts as they address the complex demands of student data privacy. We’re proud to see Delaware Area Career Center take meaningful steps to strengthen its privacy practices and to see the continued growth of the TLE Seal program in Ohio,” said Keith Krueger, CEO, CoSN.

    “Earning the TLE Mini Seal is a tremendous acknowledgement of the work we’ve done to uphold high standards in safeguarding student data. This achievement inspires confidence in our community and connects us through a shared commitment to privacy, transparency and security at every level,” said Rory Gaydos, Director of Information Technology, Delaware Area Career Center.

    The CoSN TLE Seal is the only privacy framework designed specifically for school systems. Earning the TLE Seal requires that school systems have taken measurable steps to implement, maintain and improve organization-wide student data privacy practices. All TLE Seal recipients are required to demonstrate that improvement through a reapplication process every two years.

    To learn more about the TLE Seal program, visit www.cosn.org/trusted.

    About CoSN

    CoSN, the world-class professional association for K-12 EdTech leaders, stands at the forefront of education innovation. We are driven by a mission to equip current and aspiring K-12 education technology leaders, their teams, and school districts with the community, knowledge, and professional development they need to cultivate engaging learning environments. Our vision is rooted in a future where every learner reaches their unique potential, guided by our community. CoSN represents over 13 million students and continues to grow as a powerful and influential voice in K-12 education. www.cosn.org

    About the CoSN Trusted Learning Environment Seal Program

    The CoSN Trusted Learning Environment (TLE) Seal Program is the nation’s only data privacy framework for school systems, focused on building a culture of trust and transparency. The TLE Seal was developed by CoSN in collaboration with a diverse group of 28 school system leaders nationwide and with support from AASA, The School Superintendents Association, the Association of School Business Officials International (ASBO) and ASCD. School systems that meet the program requirements will earn the TLE Seal, signifying their commitment to student data privacy to their community. TLE Seal recipients also commit to continuous examination and demonstrable future advancement of their privacy practices. www.cosn.org/trusted

    About Delaware Area Career Center

    Delaware Area Career Center provides unique elective courses to high school students in Delaware County and surrounding areas. We work with partner high schools to enhance academic education with hands-on instruction focused on each individual student’s area of interest. DACC students still graduate from their home high school, but they do so with additional college credits, industry credentials, and valuable experiences. www.delawareareacc.org


    eSchool News Staff

    Source link

  • Unibuddy launches AI tool to boost student engagement

    Unibuddy, a higher education peer-to-peer engagement platform, has officially launched Assistant – an AI tool designed to support large-scale, authentic student-led conversations.

    Following a successful beta phase, the tool is now fully live with 30 institutions worldwide and delivering impressive results: tripling student engagement, cutting staff workload significantly, and maintaining over 95% accuracy.

    As universities face increasing pressure from tighter budgets and rising student expectations, Unibuddy said its Assistant tool offers a powerful solution to scale meaningful engagement efficiently, combining the speed of AI with the authenticity of real student voices.

    • 65,000 unique students have used Assistant
    • 100,000+ student questions answered automatically without requiring manual intervention
    • 125% increase in students having conversations
    • 60% increase in lead capture
    • 5 hours saved per day for university staff

    “Today’s students demand instant, authentic and trustworthy communication,” said Diego Fanara, CEO at Unibuddy. “Unibuddy Assistant is the first and only solution that fuses the speed of AI with the credibility of peer-to-peer guidance – giving institutions a scalable way to meet expectations without sacrificing quality or trust.”

    Unibuddy has partnered with more than 600 institutions globally and has supported over 3,000,000 prospective students through the platform. As part of this extensive network, it regularly conducts surveys to uncover fresh insights. Although chatbots are now common in higher education, survey findings highlight key limitations in their effectiveness:

    • 84% of students said that university responses were too slow (Unibuddy Survey, 2025)
    • 79% of students said it was important that universities balance AI automation (for speed) and human interaction (for depth) while supporting them as they navigate the decision-making process (Unibuddy Survey, 2025)
    • 51% of students say they wouldn’t trust a chatbot to answer questions about the student experience (Unibuddy Survey, 2024)
    • 78% say talking to a current student is helpful — making them 3.5x more likely to trust a peer than a bot (Unibuddy Survey, 2025)
    • Only 14% of students felt engaged by the universities they applied to (Unibuddy Survey, 2025)

    Unibuddy says these findings have shaped its offering: using AI to handle routine questions and highlight valuable information, while smoothly handing off to peer or staff conversations when a personal, human connection is needed.
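    A minimal sketch of that routing pattern may help make it concrete: answer routine questions automatically, but refer the student to a peer or staff conversation when the question calls for a personal response. The function names, topic list, and confidence threshold below are illustrative assumptions, not Unibuddy's actual API or rules.

```python
# Hypothetical sketch of an AI-with-handoff flow (not Unibuddy's real implementation).
from dataclasses import dataclass

# Topics assumed (for illustration) to need a human, peer-led answer.
PERSONAL_TOPICS = ("campus life", "accommodation", "wellbeing")

@dataclass
class Reply:
    text: str
    handled_by: str  # "assistant", "peer", or "staff"

def route_question(question: str, ai_answer: str, ai_confidence: float) -> Reply:
    """Answer automatically only when the question is routine and the model is
    confident; otherwise hand off to a peer ambassador or staff member."""
    if any(topic in question.lower() for topic in PERSONAL_TOPICS):
        return Reply("Let me connect you with a current student ambassador.", "peer")
    if ai_confidence >= 0.9:  # assumed threshold for auto-answering
        return Reply(ai_answer, "assistant")
    return Reply("I'll pass this to the admissions team for a full answer.", "staff")

# Example usage
print(route_question("What are the entry requirements?", "A typical offer is ABB.", 0.95))
print(route_question("What is campus life really like?", "", 0.4))
```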

    Buckinghamshire New University used Unibuddy Assistant to transform early-stage engagement – generating 800,000 impressions, 30,000 clickthroughs, and 10,000+ student conversations in just six months. The university saved over 2,000 staff hours and saw 3,000 referrals to students or staff. 


    Meanwhile the University of South Florida Muma College of Business reported over 30 staff hours saved per month, with a 59% click-to-conversation rate and over a third of chats in Assistant resulting in referrals to student ambassador conversations. 

    And the University of East Anglia deployed Assistant across more than 100 web pages as part of a full Unibuddy product suite deployment of peer-to-peer chat, with student-led content contributing to a 62% offer-to-student conversion rate, compared with 34% for those who didn’t engage with Unibuddy.

    Source link

  • FIRE and Cosmos Institute launch $1 million grant program for AI that advances truth-seeking

    AUSTIN, Texas, May 16, 2025 — The Foundation for Individual Rights and Expression (FIRE) and the Cosmos Institute today announced the Truth-Seeking AI Grants Program, a new $1 million initiative to fund open-source projects that build freedom into the foundations of AI, rather than censorship or control.

    Truth-seeking AI: Why it matters

    Truth-seeking AI is artificial intelligence built to expand the marketplace of ideas and sharpen human inquiry — not replace it.

    AI already drafts our sentences, sorts our inbox, and cues our next song. But the technology is advancing rapidly. Soon, it could determine which ideas ever reach our minds — or form within them. Two futures lie ahead, and the stakes couldn’t be higher.

    In one, AI becomes a shadow censor. Hidden ranking rules throttle dissent, liability fears chill speech, and flattering prompts dull judgment until people stop asking “why.” That is algorithmic tyranny.

    In the other, AI works as a partner in truth-seeking: it surfaces counter-arguments, flags open questions, and prompts us to check the evidence and our biases. Errors are chipped away, knowledge grows, and our freedom — and habit — to question not only survives but thrives. 

    To ensure we build AI tools and platforms for freedom, not control, Cosmos and FIRE are putting $1 million in grants on the table to keep the future of AI free.

    “AI guides a fifth of our waking hours. The builders of these systems now hold the future of free thought and expression in their hands. We’re giving them the capital, computing resources, and community they need to seize that opportunity,” said Brendan McCord, founder and chair of Cosmos Institute.

    “The First Amendment restrains governments, but the principles of free speech must also be translated into code. We’re challenging builders to do exactly that and prioritize freedom over control,” said Greg Lukianoff, president and CEO of FIRE.

    “AI can already steer our thoughts. The future is AI that expands them, not controls them,” added Philipp Koralus, founding director, Oxford HAI Lab and Senior Research Fellow at Cosmos Institute.

    To read more about why we need to bake principles of free thought and expression into AI code, check out Brendan McCord, Greg Lukianoff, and Philipp Koralus’s piece at Reason.

    How it works

    • Grant pool: $1 million (cash + compute); compute credits are from Prime Intellect, a platform for open, decentralized AI development
    • Typical award: $1k – $10k fast grants; larger amounts considered for standout ideas
    • Rolling review: decisions in ~3 weeks; applications open May 16 at CosmosGrants.org/truth
    • Sprint timeline: 90 days to ship a working prototype
    • Community: access to a vetted network of builders, mentors, and advisors at the AI and philosophy frontier
    • Showcase: Top projects funded by Nov 1, 2025 will be invited to demo at the Austin AI x Free Speech Symposium in December 2025; selection is competitive and at the program’s discretion

    What we’re funding

    • Marketplace of Ideas — projects that preserve viewpoint diversity and open debate.
    • Promoting Inquiry — systems that actively provoke new questions, surfacing counter-arguments and open issues that require more study.
    • Bold New Concepts — any approach that pushes AI toward the role of truth-seeking partner.

    Illustrative projects:

    We’re focused on prototypes that translate philosophy to code — embedding truth-seeking principles like Mill’s Trident and Socratic inquiry directly into open-source software.

    Possible projects could include:

    • AI challenger that pokes holes in your assumptions and coaches you forward
    • An open debate arena where swappable models argue under a live crowd score
    • A tamper-proof logbook that records every answer on a public ledger.

    About the Foundation for Individual Rights and Expression (FIRE)

    The Foundation for Individual Rights and Expression (FIRE) is a nonpartisan, nonprofit organization dedicated to defending and sustaining the individual rights of all Americans to free speech and free thought — the most essential qualities of liberty. FIRE educates Americans about the importance of these inalienable rights, promotes a culture of respect for them, and provides the means to preserve them. Learn more at www.thefire.org.

    About Cosmos Institute

    Cosmos Institute is a 501(c)(3) academy for philosopher-builders — technologists who unite deep reflection with practical engineering. Through research, fellowships, grants, and education, Cosmos advances human flourishing by translating philosophy to code across three pillars: truth-seeking, decentralization, and human autonomy. The Institute supported the creation of the new Human-Centered AI Lab at the University of Oxford, the first lab dedicated to embedding flourishing principles in open-source AI. Learn more at www.cosmos-institute.org.

    Media Contact
    Karl de Vries, Director of Media Relations, FIRE
    karl.de.vries@thefire.org | +1 215-717-3473

    Source link

  • How educators can use Gen AI to promote inclusion and widen access

    by Eleni Meletiadou

    Introduction

    Higher education faces a pivotal moment as Generative AI becomes increasingly embedded within academic practice. While AI technologies offer the potential to personalize learning, streamline processes, and expand access, they also risk exacerbating existing inequalities if not intentionally aligned with inclusive values. Building on our QAA-funded project outputs, this blog outlines a strategic framework for deploying AI to foster inclusion, equity, and ethical responsibility in higher education.

    The digital divide and GenAI

    Extensive research shows that students from marginalized backgrounds often face barriers in accessing digital tools, digital literacy training, and peer networks essential for technological confidence. GenAI exacerbates this divide, demanding not only infrastructure (devices, subscriptions, internet access) but also critical AI literacy. According to previous research, students with higher AI competence outperform peers academically, deepening outcome disparities.

    However, the challenge is not merely technological; it is social and structural. WP (Widening Participation) students often remain outside informal digital learning communities where GenAI tools are introduced and shared. Without intervention, GenAI risks becoming a “hidden curriculum” advantage for already-privileged groups.

    A framework for inclusive GenAI adoption

    Our QAA-funded “Framework for Educators” proposes five interrelated principles to guide ethical, inclusive AI integration:

    • Understanding and Awareness: Foundational AI literacy must be prioritized. Awareness campaigns showcasing real-world inclusive uses of AI (eg Otter.ai for students with hearing impairments) and tiered learning tracks from beginner to advanced levels ensure all students can access, understand, and critically engage with GenAI tools.
    • Inclusive Collaboration: GenAI should be used to foster diverse collaboration, not reinforce existing hierarchies. Tools like Miro and DeepL can support multilingual and neurodiverse team interactions, while AI-powered task management (eg Notion AI) ensures equitable participation. Embedding AI-driven teamwork protocols into coursework can normalize inclusive digital collaboration.
    • Skill Development: Higher-order cognitive skills must remain at the heart of AI use. Assignments that require evaluating AI outputs for bias, simulating ethical dilemmas, and creatively applying AI for social good nurture critical thinking, problem-solving, and ethical awareness.
    • Access to Resources: Infrastructure equity is critical. Universities must provide free or subsidized access to key AI tools (eg Grammarly, ReadSpeaker), establish Digital Accessibility Centers, and proactively support economically disadvantaged students.
    • Ethical Responsibility: Critical AI literacy must include an ethical dimension. Courses on AI ethics, student-led policy drafting workshops, and institutional AI Ethics Committees empower students to engage responsibly with AI technologies.

    Implementation strategies

    To operationalize the framework, a phased implementation plan is recommended:

    • Phase 1: Needs assessment and foundational AI workshops (0–3 months).
    • Phase 2: Pilot inclusive collaboration models and adaptive learning environments (3–9 months).
    • Phase 3: Scale successful practices, establish Ethics and Accessibility Hubs (9–24 months).

    Key success metrics include increased AI literacy rates, participation from underrepresented groups, enhanced group project equity, and demonstrated critical thinking skill growth.

    Discussion: opportunities and risks

    Without inclusive design, GenAI could deepen educational inequalities, as recent research warns. Students without access to GenAI resources or social capital will be disadvantaged both academically and professionally. Furthermore, impersonal AI-driven learning environments may weaken students’ sense of belonging, exacerbating mental health challenges.

    Conversely, intentional GenAI integration offers powerful opportunities. AI can personalize support for students with diverse learning needs, extend access to remote or rural learners, and reduce administrative burdens on staff – freeing them to focus on high-impact, relational work such as mentoring.

    Conclusion

    The future of inclusive higher education depends on whether GenAI is adopted with a clear commitment to equity and social justice. As our QAA project outputs demonstrate, the challenge is not merely technological but ethical and pedagogical. Institutions must move beyond access alone, embedding critical AI literacy, equitable resource distribution, community-building, and ethical responsibility into every stage of AI adoption.

    Generative AI will not close the digital divide on its own. It is our pedagogical choices, strategic designs, and values-driven implementations that will determine whether the AI-driven university of the future is one of exclusion – or transformation.

    This blog is based on the recent outputs from our QAA-funded project entitled “Using AI to promote education for sustainable development and widen access to digital skills”.

    Dr Eleni Meletiadou is an Associate Professor (Teaching) at London Metropolitan University  specialising in Equity, Diversity, and Inclusion (EDI), AI, inclusive digital pedagogy, and multilingual education. She leads the Education for Social Justice and Sustainable Learning and Development (RILEAS) and the Gender Equity, Diversity, and Inclusion (GEDI) Research Groups. Dr Meletiadou’s work, recognised with the British Academy of Management Education Practice Award (2023), focuses on transforming higher education curricula to promote equitable access, sustainability, and wellbeing. With over 15 years of international experience across 35 countries, she has led numerous projects in inclusive assessment and AI-enhanced learning. She is a Principal Fellow of the Higher Education Academy and serves on several editorial boards. Her research interests include organisational change, intercultural communication, gender equity, and Education for Sustainable Development (ESD). She actively contributes to global efforts in making education more inclusive and future-ready. LinkedIn: https://www.linkedin.com/in/dr-eleni-meletiadou/

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education

    Source link

  • Empowering school staff with emergency response protocols

    Key points:

    • Safety response protocols are foundational to creating a culture of safety in schools.
    • District leaders should adopt and implement response protocols that cover all types of emergencies.
    • Schools should have building-level response protocols and protocols for incidents when first responders are needed.
    • These practices are critical to keeping the community safe during emergencies.

    When staff members are empowered to participate in emergency planning and response, their sense of safety is improved. Unfortunately, many staff members do not feel safe at school.

    Thirty percent of K-12 staff think about their physical safety when at work every day, and 74 percent of K-12 staff said they do not feel supported by their employer to handle emergency situations at work.

    Staff disempowerment is a “central problem” when it comes to district emergency planning, said Dr. Gabriella Durán Blakey, superintendent of Albuquerque Public Schools: “What does safety mean for educators to really be able to feel safe in their classroom, to impact student achievement, the well-being of students? And how does that anxiety play with how the students feel in the classroom?”

    School leaders should implement response protocols that empower staff to understand and participate in emergency response using a two-tiered system of emergency response:

    • A building-level emergency planning and response team should develop an Emergency Operations Plan, which includes an emergency response protocol
    • Administrators should adopt protocols to follow when they need first responders to intervene

    For guidance on crafting emergency response protocols and plans, click here.

    Laura Ascione

    Source link

  • Is social media turning our hearts to stone?

    As global digital participation grows, our ability to connect emotionally may be shifting. Social media has connected people across continents, but it also reshapes how we perceive and respond to others’ emotions, especially among youth. 

    Empathy is the ability to understand and share another’s feelings, helping to build connections and support. It’s about stepping into someone else’s shoes, listening and making them feel understood.

    While platforms like Instagram, TikTok and X offer tools for global connection, they may also be changing the way we experience empathy.

    Social media’s strength lies in its speed and reach. Instant sharing allows users to engage with people from different backgrounds, participate in global conversations and discover social causes. But it also comes with downsides. 

    “People aren’t doing research for themselves,” says Marc Scott, the diversity, equity and community coordinator at the Tatnall School, the private high school that I attend in the U.S. state of Delaware. “They see one thing and take it for fact.”

    Communicating in a two-dimensional world

    That kind of surface-level engagement can harm emotional understanding. The lack of facial expressions, body language and tone — key elements of in-person conversation — makes it harder to gauge emotion online. This often leads to misunderstandings, or worse, emotional detachment.

    In a world where users often post only curated highlights, online personas may appear more polished than real life. “Someone can have a large following,” Scott said. “But that’s just one person. They don’t represent the whole group.” 

    Tijen Pyle teaches advanced placement psychology at the Tatnall School. He pointed out how social media can amplify global polarization. 

    “When you’re in a group with similar ideas, you tend to feel stronger about those opinions,” he said. “Social media algorithms cater your content to your interests and you only see what you agree with.” 

    This selective exposure limits empathy by reducing understanding of differing perspectives. The disconnect can reinforce stereotypes and limit meaningful emotional connection.

    Overexposure to media

    Compounding the problem is “compassion fatigue” — when constant exposure to suffering online dulls our emotional response. Videos of crisis after crisis can overwhelm users, turning tragedy into background noise in an endless scroll.

    A widely cited study published in the journal Psychological Science in 2013 examined the effects of exposure to media related to the 9/11 attacks and the Iraq War. The study, led by Roxane Cohen Silver, found that vicariously experienced events, such as watching graphic media images, can lead to collective trauma.

    Yet not all emotional connection is lost. Online spaces have also created powerful support systems — from mental health communities to social justice movements. These spaces offer users a chance to share personal stories, uplift one another and build solidarity across borders. “It depends on how you use it,” Scott said.

    Many experts agree that digital empathy must be cultivated intentionally. According to a 2025 Pew Research Center study, nearly half of U.S. teens believe that social media platforms have a mostly negative effect on people their age, a significant increase from 32% in 2022. This growing concern underscores the complex nature of online interactions, where the potential for connection coexists with the risk of unkindness and emotional detachment.

    So how do we preserve empathy in a digital world? It starts with awareness. Engaging critically with content, seeking out diverse viewpoints and taking breaks from the algorithm can help. “Social media can expand your perspectives — but it can also trap you in a single mindset,” Scott said. 

    I initially started thinking about this topic when I was having the same conversations with different people and feeling a sense of ignorance. It wasn’t that they didn’t care — it was like they didn’t know how to care. 

    The way they responded to serious topics felt cold or disconnected, almost like they were watching a video instead of talking to a real person. 

    That made me wonder: has social media changed the way we understand and react to emotions?

    Ultimately, social media isn’t inherently good or bad for empathy. It’s a tool. And like any tool, its impact depends on how we use it. If we use it thoughtfully, we can ensure empathy continues to grow, even in a world dominated by screens.


    Questions to consider:

    1. What is empathy and why is it important?

    2. How can too much time spent on social media dull our emotional response?

    3. How do you know if you have spent too much time on social media?


     

    Source link

  • Harvard University devotes $250M to sustain research hit by federal cuts

    Dive Brief:

    • Harvard University will put $250 million of its own funds toward research affected by the ongoing wave of federal cuts, according to a Wednesday announcement.
    • Since last week, Harvard has received “a large number of grant terminations from the federal government,” President Alan Garber and Provost John Manning said in a campuswide message. The funding disruptions are halting “lifesaving research and, in some cases, losing years of important work,” they said.
    • Harvard is taking the same tack as Northwestern and Johns Hopkins universities, which announced in April they would use institutional dollars to cover the cost of ongoing research hit by cuts.

    Dive Insight:

    Northwestern and Johns Hopkins began self-funding some of their own research after hundreds of millions of dollars in federal funding were lost or frozen under the Trump administration.

    Since Trump retook office, several federal agencies have abruptly changed their funding policies, cutting off billions in grants and contracts with little to no warning. The National Institutes of Health alone slashed $1.8 billion in a little over a month, according to findings published in JAMA last week. 

    Harvard is now similarly self-funding affected research. But the federal government’s attacks against it outpace those directed at many of its peers. 

    Last month, the Trump administration canceled over $2.2 billion in federal funds to Harvard after the Ivy League institution publicly rebuked its ultimatums, arguing they overstepped the federal government’s authority. Among the demands, the administration sought a third-party audit of the viewpoints of university employees and students and wanted Harvard to selectively curtail the power of certain employees based on their activism.

    The university is now bracing for even more cuts and mounting a legal battle against the Trump administration to regain its federal funding. 

    The university intends to fight the government’s “unlawful freeze and termination” of many of its grants and is doing what it can in the interim, Garber and Manning said Wednesday.

    “Although we cannot absorb the entire cost of the suspended or canceled federal funds, we will mobilize financial resources to support critical research activity for a transitional period as we continue to work with our researchers to identify alternative funding sources,” they said.

    They added that the university will advocate for “the productive partnership between the federal government and research universities” that has existed for over eight decades.

    Over 50 higher ed organizations, led by the American Council on Education, made a similar plea in a joint statement Wednesday.

    “The entire country benefits when policymakers and higher education leaders respect a common understanding of the vital role colleges and universities play in advancing the social, cultural, and economic well-being of the United States,” the organizations said.

    They argued that the release of research funds should not be contingent on which students colleges enroll, what programs they offer or how they oversee their instructors. The signatories also include the American Association of Colleges and Universities and the New England Commission of Higher Education, Harvard’s accreditor.

    Prior to its announcement Wednesday, Harvard had already implemented a hiring freeze for the spring semester. And dozens of faculty members have pledged 10% of their salaries to shore up against the “severe financial damage” the university faces as it takes the Trump administration to court.

    Garber recently made a similar pledge. He will take a voluntary 25% pay cut beginning in July, a university spokesperson said Thursday. 

    Harvard has not yet publicly disclosed the new president’s salary. But his predecessors have made north of $1 million annually, meaning his voluntary pay cut in fiscal 2026 would likely net the university six-figure savings.

    Garber, a longstanding Harvard employee, has taken a pay reduction during turbulent financial times before. As provost, Garber took a 25% cut in 2020 in response to the pandemic, as did the university’s then-president and executive vice president.

    Source link

  • Waivers Let Some N.C. Majors Keep “DEI” Requirements

    Shortly after the second Trump administration began attacking higher education diversity initiatives, the University of North Carolina system ordered its 16 public universities to immediately stop requiring “course credits related to diversity, equity and inclusion.” The system was targeting DEI even before Trump retook office—the UNC Board of Governors repealed the system’s DEI policy a year ago—and its general counsel pointed to the federal government’s newly threatened funding cuts to justify this further step.

    But the system’s kibosh on DEI requirements came with caveats. Majors could continue requiring courses with diversity themes if university chancellors provided waivers. The system gave chancellors the final say on which major-specific courses would continue to be mandated, and chancellors said waivers were needed because state or national accreditation and licensure criteria require diversity education.

    “Approximately 95 percent of the programs identified for waivers at the chancellor level had accreditation and licensure requirements attached to them,” said David J. English, the system’s senior vice president for academic affairs, during a UNC board committee meeting this week. English said these programs are in counseling, education, nursing, psychology and social work.

    According to documents attached to this week’s board meeting agenda and previously reported by Raleigh’s News & Observer, dozens of courses will remain necessary for certain majors in the UNC system, which includes all four-year public universities in the state. Among them are Feminist Theory at UNC Asheville; Multicultural Counseling at UNC Charlotte; Social Work Policy and Restorative Justice at UNC Greensboro; Teaching Reading to Culturally Diverse Children at Fayetteville State University; Inclusion, Diversity and Equity in Agriculture at North Carolina A&T State University; and Diversity in Higher Education at North Carolina Central University.

    The UNC system is one example of universities across the country being asked to comply with vague statewide and national demands to excise DEI. Lacking detailed guidance, they’ve had to define that term for themselves as they seek to show compliance.

    The UNC system never defined for its component institutions what it meant by the verboten “course credits related to DEI.” The universities were left to determine for themselves what they should stop requiring; some administrations used keyword searches of course descriptions, looking for terms such as “cultural” to choose which courses to review.
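    As a rough illustration of that keyword screening, the short sketch below flags course descriptions containing review terms such as “cultural” for manual follow-up. The term list, sample courses, and function name are invented for demonstration and do not represent any university’s actual review process.

```python
# Hypothetical sketch of keyword screening of course descriptions (illustrative only).
import re

REVIEW_TERMS = ["cultural", "diversity", "equity", "inclusion"]  # assumed term list

courses = {
    "ENG 101": "Introduction to academic writing and argument.",
    "EDU 230": "Teaching reading to culturally diverse children.",
}

def flag_for_review(description: str, terms=REVIEW_TERMS) -> list[str]:
    """Return the review terms that appear (as word prefixes) in a description."""
    return [t for t in terms if re.search(rf"\b{re.escape(t)}", description, re.IGNORECASE)]

for code, description in courses.items():
    hits = flag_for_review(description)
    if hits:
        print(f"{code}: flagged for manual review (matched: {', '.join(hits)})")
```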

    The Feb. 5 order from the system said universities’ general education requirements couldn’t include mandates for DEI-related courses at all. A few institutions, such as East Carolina University and UNC Asheville, responded by jettisoning broad diversity categories from their gen ed requirements. At UNC Chapel Hill, College of Arts and Sciences dean Jim White wrote that “Power, Difference, and Inequality”—a category within the gen ed curriculum there—“could be incorrectly read or understood to be ‘related’ to DEI,” so it was “streamlined” and is now called “Power and Society.”

    But when it came to specific majors’ mandates for DEI-related credits, the system let chancellors grant what it called “tailored waivers” to allow these requirements to continue.

    Appalachian State University’s acting provost initially asked the national Council on Social Work Education, which accredits social work programs, to waive accreditation standards that are specifically called “Anti-Racism, Diversity, Equity, and Inclusion.” But when the council refused, Appalachian State chancellor Heather Norris gave her university’s social work program a waiver to continue the education requirements.

    Halaevalu Fonongava’inga Ofahengaue Vakalahi, the Council on Social Work Education’s president and chief executive officer, told Inside Higher Ed in an email, “We do not issue waivers except in very limited circumstances as defined by our Educational Policy and Accreditation Standards. Those circumstances are not applicable in this case.

    “What we have done, and are continuing to do, is work with programs and institutions to ensure they are both meeting the appropriate standards for accreditation while also staying within the boundaries dictated by law,” Vakalahi wrote. “Social work is about healthy individuals, healthy families, and healthy communities. We value inclusion because we believe that social work is for everyone—no exceptions.”

    While the waiver documents released this week show course requirements that survived, they don’t specify whether universities dropped major-specific requirements—and if so, which ones—instead of having their chancellors grant waivers. Universities didn’t provide interviews this week to Inside Higher Ed for this article. But the fact that the system ordered statewide changes to curriculum rather than have the faculties of individual universities propose them has raised academic freedom and shared governance concerns.

    ‘Faculty Were Not Pleased’

    Wade Maki, chair of the UNC Faculty Assembly, said the faculty senates or councils of all 16 universities, plus the one specialized high school in the system, ratified a resolution calling the order to end DEI course requirements “an unnecessary and intolerable breach of the principle of academic freedom” that “deeply undermines” the system’s mission “to serve the people of our state.”

    Defending the place of faculty in setting curriculum, the resolution says, “Faculty, who are trained at the highest level of our disciplines, collaborate within our departments, universities and communities to design and lead programs—including defining the core curriculum and graduation requirements—to ensure our students’ growth and success.”

    While “faculty were not pleased” about the order, they stepped up to take part in the course review, Maki said. Because the system “did not prescribe how” to comply, “it was up to faculty and administrators to work together to determine what to do,” he said.

    “Each major had to look and say, ‘Do we think we’re at risk of being out of compliance here, and what’s the best course of action?’” he said.

    Herle McGowan, chair of North Carolina State University’s Faculty Senate, said her university dropped a general ed requirement in response to the system’s order—a requirement that had been created by faculty with student input.

    “The fact that it changed without consultation from faculty is definitely concerning to me,” McGowan said.

    She said she personally believes academic freedom rights should cover “the broader curriculum,” not just individual faculty teaching and research. Within majors, she said experts should agree on what students need to learn, and when it comes to gen ed, there should be “collaboration from faculty experts all across the university” in determining what students need to be good, well-rounded citizens prepared for life and work.

    Regardless of her own views on academic freedom, McGowan said, the fact that the system handed down such an order points to a need for constituents—from faculty to board members—to come to a consensus on what academic freedom means.

    Source link

  • Misinformation Course Teaches Ethics for Engineering Students

    Nearly three in four college students say they have somewhat high or very high media literacy skills (72 percent), according to a 2025 Student Voice survey by Inside Higher Ed and Generation Lab. Students are less likely to consider their peers media literate; three in five respondents said they have at least somewhat high levels of concern about the spread of misinformation among their classmates.

    When asked how colleges and universities could help improve students’ media literacy skills, a majority of Student Voice respondents indicated they want digital resources on increasing media literacy or media literacy–related content and training embedded into the curriculum.

    A recently developed course at the University of Southern California’s Viterbi School of Engineering teaches students information literacy principles to help them develop tools to mitigate the harms of online misinformation.

    The background: USC offers an interdisciplinary teaching grant that incentivizes cross-campus collaboration and innovative teaching practices. To be eligible for the grant, applications must include at least one full-time faculty member and faculty from more than one school or division. Each grantee receives up to $20,000 to compensate for applicants’ time and work.

    In 2023, Helen Choi, a faculty member at USC Viterbi, won the interdisciplinary teaching grant in collaboration with Cari Kaurloto, head of the science and engineering library at USC Libraries, to create a media literacy course specifically for engineering students.

    “By focusing on engineering students, we were able to integrate a component of the course that addresses a social issue from an engineering perspective in terms of technical know-how and the professional ethics,” Choi said, which helps students see the relevance of course content to their personal and professional lives.

    What’s the need: Students tend to receive most of their news and information on online platforms; Student Voice data found a majority of learners rely on social media for news content (72 percent), and about one in four engage with news apps or news aggregator websites (27 percent).

    Choi and Kaurloto’s course, titled Information Literacy: Navigating Digital Misinformation, builds academic research skills, teaches information literacy principles and breaks down the social issue of online misinformation.

    “Students examine ways they can navigate online information using their research skills, and then extend that knowledge by considering how they, as prospective engineers, can build technologies that mitigate the harms of online misinformation while enhancing the information literacy of users,” Choi explained.

    USC faculty aren’t the only ones noticing a need for more education around engagement with digital information; a growing number of colleges and universities are making students complete a digital literacy course as a graduation requirement.

    In the classroom: Choi and Kaurloto co-teach the course, which was first offered this spring to a class of 25 students.

    The students learned to develop effective search strategies and critically examine sources, as well as ethical engineering principles and how to apply them in designing social media platforms, Kaurloto said. Choi and Kaurloto employed active learning pedagogies to give students hands-on and real-life applications including writing, speaking and collaborative coursework.

    One assignment the students completed was conducting library research to develop a thesis paragraph on an information literacy topic with a short, annotated bibliography. Students also presented their research to their peers, Kaurloto said.

    Learners also engaged in a group digital literacy project, designing a public service campaign that included helpful, research-backed ways to identify misinformation, Choi said. “They then had to launch that campaign on a social media platform, measure its impact, and present on their findings.” Projects ranged from infographics on Reddit to short-form videos on spotting AI-generated misinformation and images on TikTok and Instagram.

    The impact: In their feedback, students said they found the course helpful, with many upper-level learners saying they wished they had taken it sooner in their academic careers because of the library research skills they gained. They also indicated the course content was applicable in daily life, such as when supporting family members “who students say have fallen down a few internet rabbit holes or who tend to believe everything they see online,” Choi said.

    Other librarians have taken note of the course as a model of how to teach information literacy, Choi said.

    “We’ve found that linking information literacy with specific disciplines like engineering can be helpful both in terms of building curricula that resonate with students but also for building professional partnerships among faculty,” Choi said. “Many faculty don’t know that university librarians are also experts in information literacy—but they should!”

    This fall, Choi and Kaurloto plan to offer two sections of the course with a cap of 24 students per section. Choi hopes to see more first- and second-year engineering students in the course so they can apply these principles to their program.


    Source link