Tag: Engineering

  • Carnegie Mellon lays off 75 employees at engineering institute amid federal funding shifts



    Dive Brief:

    • Carnegie Mellon University has laid off 75 employees in its Software Engineering Institute as it wrestles with disruptions to federal funding, according to a community message Wednesday from Vice President for Research Theresa Mayer.
    • Mayer tied the cuts — which amount to 10% of SEI’s workforce — to the engineering institute’s “unique financial structure as a federally funded research and development center as well as the shifting federal funding priorities that are shaping the research landscape.”
    • Carnegie Mellon as a whole is in a “strong financial position” for fiscal 2026, university President Farnam Jahanian said in August, noting that the Pittsburgh institution cut its expenses by $33 million.

    Dive Insight:

    Jahanian said in an August community message that Carnegie Mellon is poised to get through the current fiscal year without a deficit, which is more than some of its peer institutions can say. 

    But the university faces stiff financial headwinds — and what its president described as “existential challenges” — from the Trump administration’s disinvestment in scientific and academic research. 

    To tighten its budget, Carnegie Mellon has paused merit raises, reduced nonessential expenditures, limited new staff and faculty hiring, and reduced staff in certain units through voluntary retirements and employee reductions.

    In the August message, Jahanian described “signs of a marked decline in the pipeline of new federal research awards nationally and at Carnegie Mellon.” He added that university officials expect more cutbacks in federal agencies’ research budgets under a Republican-led Congress. 

    The university’s Software Engineering Institute, which Mayer described as integral to Carnegie Mellon’s overall research enterprise, is one of the institution’s biggest recipients of federal research funding. Sponsored by the U.S. Department of Defense, SEI develops new technologies and studies complex software engineering, cybersecurity and AI engineering problems, in large part to advance the strategic goals of federal agencies. 

    The institute took in $148.8 million in grants and contracts revenue in fiscal 2024. 

    Prior to this month’s job cuts, officials at the institute took “extensive steps to avoid this outcome, including implementing cost-saving measures in recent months,” Mayer said Wednesday. “Despite these efforts, SEI was unable to reallocate or absorb costs, so staff reductions were unavoidable.”

    Along with a slackening grant pipeline, Jahanian’s August message pointed to the possibility of reduced funding for research overhead costs. 

    The Trump administration has sought to unilaterally cap reimbursement rates for indirect research costs at 15% across multiple agencies, though these policies have been blocked by courts.

    Carnegie Mellon is a plaintiff in one of the lawsuits that led to the 15% cap being permanently blocked at the National Institutes of Health, though the Trump administration has appealed the ruling. The university is also represented in lawsuits against other agencies through its membership in the Association of American Universities. 

    If a 15% cap were implemented on research overhead, that would create an additional $40 million annual shortfall for Carnegie Mellon, according to Jahanian. Indirect research costs include overhead expenses such as laboratories and support staff. 

    Beyond federal funding woes, Jahanian also noted in August that Carnegie Mellon’s projected $365 million in graduate tuition revenue for the current fiscal year is about $20 million short of initial estimates due to “lower-than-expected enrollment.”

    While Jahanian didn’t offer reasons for the shortfall, he did note that going forward Carnegie Mellon was examining its balance of undergraduate to graduate and international to domestic students to “ensure long-term stability.”

    Other universities have experienced major declines in their international enrollment amid the Trump administration’s disruptions to the visa approval process and aggressive immigration policies. 

    Officials at DePaul University, in Chicago, said recently that new international graduate student enrollment fell by 62% year over year this fall, contributing heavily to a budget crunch at the institution. 

    One group has predicted that international enrollment could drop by as much as 150,000 students this fall. 

    In recent years, Carnegie Mellon’s enrollment has grown, as has its graduate student ranks. Between 2018 and 2023, overall enrollment increased 11.2% to 15,596 students and graduate enrollment grew 11.7% to 8,307 students.


  • When AI Meets Engineering Education: Rethinking the University 


    This HEPI blog was kindly authored by James Atuonwu, Assistant Professor at the New Model Institute for Technology and Engineering (NMITE). 

    Where machines of the past multiplied the strength of our hands, AI multiplies the power of our minds – drawing on the knowledge of all history, bounded only by its training data. 

    We are living through a moment of profound transition. The steam engine redefined labour, the computer redefined calculation, and now AI is redefining thought itself. Unlike earlier technologies that multiplied individual workers’ power, AI, particularly large language models (LLMs), multiplies the collective intelligence of humanity. 

    For engineering practice and universities alike, this shift is existential. 

    AI as Servant, Not Master 

    The old adage is apt: AI is a very good servant, but a very bad master:

    • As a servant, AI supports engineers in simulation, design exploration, and predictive maintenance. For students, it provides on-demand access to resources, enables rapid testing of ideas, and helps them reframe problems.  
    • As a master, AI risks entrenching bias, undermining judgment, and reshaping educational systems around efficiency rather than values. 

    The challenge is not whether AI will change engineering education, but whether we can train engineers who command AI wisely, rather than being commanded by it. 

    This logic resonates with the emerging vision of Industry 5.0: a paradigm where technology is designed not to replace humans, but to collaborate with them, enhance their creativity and serve societal needs. If Industry 4.0 was about automation and efficiency, Industry 5.0 is about restoring human agency, ethics, and resilience at the heart of engineering practice. In this sense, AI in engineering education is not just a technical challenge, but a cultural one: how do we prepare engineers to thrive as co-creators with intelligent systems, rather than their servants? 

    Beyond ‘AI Will Take Your Job’ 

    The phrase ‘AI won’t take your job, but a person using AI will’ has become a cliché. It captures the competitive edge of AI literacy but misses the deeper truth: AI reshapes the jobs themselves.  

    In engineering practice, repetitive calculations, drafting, and coding are already being automated. What remains – and grows in importance – are those tasks requiring creativity, ethical judgment, interdisciplinary reasoning, and decision-making under uncertainty. Engineering workflows are being reorganised around AI-enabled systems, rather than human bottlenecks.

    Universities, therefore, face a central question: Are we preparing students merely to compete with each other using AI, or to thrive in a world where the very structure of engineering work has changed? 

    Rethinking Assessment 

    This question leads directly to assessment – perhaps the most urgent pressure point for universities in the age of AI. 

    If LLMs can generate essays, solve textbook problems, and produce ‘good enough’ designs, then traditional forms of assessment risk becoming obsolete. Yet, this is an opportunity, not just a threat:

    • Assessment must shift from recalling knowledge to demonstrating judgment. 
    • Students should be evaluated on their ability to frame problems, critique AI-generated answers, work with incomplete data, and integrate ethical, social, and environmental perspectives. 

    A further challenge lies in the generational difference in how AI is encountered. Mature scholars and professionals, who developed their intellectual depth before AI, can often lead AI, using it as a servant, because they already possess the breadth and critical capacity to judge its outputs. But students entering higher education today face a different reality: they arrive at a time when the horse has already bolted. Without prior habits of deep engagement and cognitive struggle, there is a danger that learners will be led by AI rather than leading it. 

    This is why universities cannot afford to treat AI as a mere technical add-on. They must actively design curricula and assessments that force students to wrestle with complexity, ambiguity, and values – to cultivate the intellectual independence required to keep AI in its rightful place: a servant, not a master. 

    Rediscovering Values and Ethics 

    AI forces a rediscovery of what makes us human. If algorithms can generate correct answers, then the distinctive contribution of engineers lies not only in technical mastery but in judgment grounded in values, ethics, and social responsibility.

    Here the liberal arts are not a luxury, but a necessity:

    • Literature and history develop narrative imagination, allowing engineers to consider the human stories behind data. 
    • Philosophy and ethics cultivate moral reasoning, helping engineers weigh competing goods. 
    • Social sciences illuminate the systems in which technologies operate, from environmental feedback loops to economic inequities. 

    In this light, AI does not diminish the need for a broad education – it intensifies it. 

    Reimagining the University 

    Yet, values alone are not enough. If universities are to remain relevant in the AI era, they must reimagine their structures of teaching, learning, and assessment. Several approaches stand out as particularly future-proof: 

    • Challenge-based learning, replacing rote lectures with inquiry-driven engagement in authentic problems. 
    • Industry and community co-designed projects, giving students opportunities to apply knowledge in practical contexts. 
    • Interdisciplinary integration across engineering, business, and social perspectives. 
    • Block learning, enabling sustained immersion in complex challenges – a counterbalance to the fragmenting tendencies of AI-enabled multitasking. 
    • Professional skills and civic engagement, preparing graduates to collaborate effectively with both people and intelligent systems. 
    • Assessment through projects and portfolios, rather than traditional exams, pushing learners to demonstrate the judgment, creativity, teamwork and contextual awareness that AI can only imitate but not authentically embody. 

    These approaches anticipate what the AI era now demands of universities: to become sites of creation, collaboration, and critique, not simply repositories of content that AI can reproduce at scale. Some newer institutions, such as NMITE, have already experimented with many of these practices, offering a glimpse of how higher education can be reimagined for an AI-enabled world. 

    Closing Reflection 

    AI may be the greatest machine humanity has ever built – not because it moves steel, but because it moves minds. Yet, with that power comes a reckoning. 

    Do we let AI master our universities, eroding integrity?  
    Or do we make it serve as a co-creator, multiplier of human intelligence, and a tool for cultivating wise, ethical, creative engineers? 

    The answer will define not just the future of engineering training and practice, but the very shape of university education itself. 


  • Getting it ‘right’ – a reflection on integrating Service Learning at scale into a large Faculty of Science and Engineering


    This blog was kindly authored by Professor Lynne Bianchi, Vice Dean for Social Responsibility & Equality, Diversity, Inclusion and Accessibility, at the University of Manchester.

    I recently had the fortune to be part of a panel discussing the place of Service Learning in higher education, chaired by HEPI. My reflections before and since may inspire you to take time to think about your perspective on the nature and role of Service Learning in fast-changing university and civic landscapes. In its simplest sense, Service Learning is an educational approach that combines academic study with community service.

    In my role within a large science and engineering faculty, I have rallied our staff and students to think seriously about the features, advantages and benefits of Service Learning in science and engineering contexts. For our university, this teaching and learning approach isn’t new, with expertise in the biomedical sciences and humanities teaching us much about the way in which undergraduate students can create benefit for our local communities whilst enriching their own academic experiences.

    In this blog, I build on my own background as a teacher and higher education academic and draw on my experience in curriculum design when focusing on how we can provide authentic and impactful Service Learning experiences for our undergraduates.

    What do we mean by the ‘right’ learning experiences?

    It doesn’t take long working in this area to unearth a wide range of terms that are used interchangeably – from place-based learning, real-world learning, community-engaged learning, practice-based learning, critical urban pedagogy, industry-inspired learning and more. A unifying feature is that for Service Learning to work well, there must be an authentic benefit to each party involved. The students should develop skills and understanding directly required within their degree, and the partner should have a problem explored, solved, or informed. In essence, the experience must lead to a ‘win-win’ outcome to be genuine.

    In our context in science and engineering, we have envisioned what Service Learning working well looks like, and consider this to include the following:

    For students:

    • Learning has relevance: work on a project, individually or in groups, is contextualised by a problem, issue or challenge that is authentic (as opposed to hypothetical).
    • Learning has resonance: developing and applying skills and knowledge to inform the problem, issue or project that dovetails with existing course specifications and requirements.

    For partners:

    • They are engaged: partners are involved in the design and delivery of the project to some extent. This may vary in the depth or level of engagement and requires both sides to appreciate the needs of each other.
    • They are enriching: partners identify real issues that matter and expose elements of the work environment that enrich students’ awareness of the workplace and career pathways.

    When is the right time for students to engage in service learning?

    I am still pondering this question, as there are so many variables and options that influence the choice. Which year group should Service Learning drop into? Or does a developmental, over-time approach suit better? Is Service Learning more impactful in the later undergraduate years, or should it be an integral part of each year of their experience with us? Realistically, there won’t be a one-size-fits-all model, and there are benefits and challenges to each. What will need to underpin whichever approach we take is a focused effort to elicit the starting points of our students, our staff and our partners in each context.

    Going from ‘zero to hero’ in Service Learning will require training and support for all parties. My experience working across the STEM sector for nearly three decades has taught me that no one partner is the same as another – what is a big deal to one can mean nothing to another. My thinking is that we need to see each person involved in the Service Learning experience as a core ‘partner’ and each has learning starting points, aspirations and apprehensions. Our role as programme leaders is to identify a progression model that appreciates that this is ‘learning’ and that scaffolds and key training will be required at different times – even within the process itself.

    What support will be required to mobilise this model at scale?

    In my early career at this university, I spent time within the Teaching & Learning Student Experience Professional Support teams, where I saw firsthand how integrally any university programme relies on expertise in taking theoretical ideas into practice. The interplay between project management, planning, timetabling, eLearning, marketing and communications and student experience support teams, to name a few, will play critical roles in achieving excellence in Service Learning. Working at scale in our faculty, across 10 different discipline areas, will require integrated work with other faculties to harness the power of interdisciplinary projects and digital support for course delivery and assessment that can embrace an internal-external interface.

    Support for scaling up will also require a culture of risk-taking to be valued and championed. Over the introductory years, we need to provide a sense of supported exploration, a culture of learning and reflection, and an ethos where failure is rarely a negative, but an opportunity. Of course, science and engineering disciplines bring with them our obligations to accrediting bodies, and a close dialogue with them about ambition, relevance and need for this enriching approach needs to be clearly articulated and agreed so that any course alteration becomes a course invigoration rather than a compromise.

    Faculty culture and the way the university and the sector view and review Service Learning will have significant implications for practice and for people feeling safe to innovate. As the university forges and launches its 2035 strategy, the spaces for innovation and development are increasingly championed, and the months and years ahead will be ones to watch in terms of establishing a refreshed version of teaching and learning for our students.

    In closing this short exploration of Service Learning, I can feel a positive tension in the air – the excitement to work together to further invigorate our student experience whilst supporting our staff and partners to embrace varied new opportunities. The ‘getting it right’ story will have many chapters and many endings, as the genres, characters and plots are there for us all to create – or, more pertinently, ‘co-create’! What drives me most to remain in this space of uncertainty for a while longer is the anticipation of creating experiences that truly make a difference for good. As our universities transform themselves over the coming years, I invite you to join us in the dialogue and development, as we have so much to learn through collaboration.


  • AI and Art Collide in This Engineering Course That Puts Human Creativity First – The 74


    I see many students viewing artificial intelligence as humanlike simply because it can write essays, do complex math or answer questions. AI can mimic human behavior but lacks meaningful engagement with the world.

    This disconnect inspired my course “Art and Generative AI,” which was shaped by the ideas of 20th-century German philosopher Martin Heidegger. His work highlights how we are deeply connected and present in the world. We find meaning through action, care and relationships. Human creativity and mastery come from this intuitive connection with the world. Modern AI, by contrast, simulates intelligence by processing symbols and patterns without understanding or care.

    In this course, we reject the illusion that machines fully master everything and put student expression first. In doing so, we value uncertainty, mistakes and imperfection as essential to the creative process.

    This vision expands beyond the classroom. In the 2025-26 academic year, the course will include a new community-based learning collaboration with Atlanta’s art communities. Local artists will co-teach with me to integrate artistic practice and AI.

    The course builds on my 2018 class, Art and Geometry, which I co-taught with local artists. The course explored Picasso’s cubism, which depicted reality as fractured from multiple perspectives; it also looked at Einstein’s relativity, the idea that time and space are not absolute and distinct but part of the same fabric.

    What does the course explore?

    We begin by exploring the first mathematical model of a neuron, the perceptron. Then, we study the Hopfield network, which mimics how our brain can remember a song from just listening to a few notes by filling in the rest. Next, we look at Hinton’s Boltzmann Machine, a generative model that can also imagine and create new, similar songs. Finally, we study today’s deep neural networks and transformers, AI models that mimic how the brain learns to recognize images, speech or text. Transformers are especially well suited for understanding sentences and conversations, and they power technologies such as ChatGPT.
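    To give a sense of the starting point of that progression, here is a minimal, illustrative sketch of the perceptron idea (not course material, and the dataset and parameter choices are my own for illustration): a weighted sum of inputs passed through a step function, trained with the classic perceptron learning rule.

```python
import numpy as np

def step(x):
    # Threshold activation: fire (1) if the weighted sum is non-negative.
    return 1 if x >= 0 else 0

def train_perceptron(X, y, lr=0.1, epochs=20):
    # Start from zero weights and bias; nudge them toward each target
    # whenever the prediction is wrong (the perceptron learning rule).
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            error = target - step(np.dot(w, xi) + b)
            w = w + lr * error * xi
            b = b + lr * error
    return w, b

# Learn the AND function: output 1 only when both inputs are 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [step(np.dot(w, xi) + b) for xi in X]
print(preds)  # [0, 0, 0, 1]
```

    Because AND is linearly separable, the rule converges; famously, a single perceptron cannot learn XOR, which is part of what motivated the multi-layer networks studied later in the course.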

    In addition to AI, we integrate artistic practice into the coursework. This approach broadens students’ perspectives on science and engineering through the lens of an artist. The first offering of the course in spring 2025 was co-taught with Mark Leibert, an artist and professor of the practice at Georgia Tech. His expertise is in art, AI and digital technologies. He taught students fundamentals of various artistic media, including charcoal drawing and oil painting. Students used these principles to create art using AI ethically and creatively. They critically examined the source of training data and ensured that their work respects authorship and originality.

    Students also learn to record brain activity using electroencephalography – EEG – headsets. Through AI models, they then learn to transform neural signals into music, images and storytelling. This work inspired performances where dancers improvised in response to AI-generated music.

    The Improv AI performance at Georgia Institute of Technology on April 15, 2025. Dancers improvised to music generated by AI from brain waves and sonified black hole data.

    Why is this course relevant now?

    AI entered our lives so rapidly that many people don’t fully grasp how it works, why it works, when it fails or what its mission is.

    In creating this course, the aim is to empower students by filling that gap. Whether they are new to AI or not, the goal is to make its inner algorithms clear, approachable and honest. We focus on what these tools actually do and how they can go wrong.

    We place students and their creativity first. We reject the illusion of a perfect machine; instead, we deliberately provoke the AI algorithm to become confused and hallucinate, generating inaccurate or nonsensical responses. To do so, we deliberately use a small dataset, reduce the model size or limit training. It’s in these flawed states of AI that students step in as conscious co-creators. The students are the missing algorithm that takes back control of the creative process. Their creations do not obey AI but reimagine it by the human hand. The artwork is rescued from automation.

    What’s a critical lesson from the course?

    Students learn to recognize AI’s limitations and harness its failures to reclaim creative authorship. The artwork isn’t generated by AI, but it’s reimagined by students.

    Students learn that chatbot queries have an environmental cost because large AI models use a lot of power. They avoid unnecessary iterations when designing prompts or using AI, which helps reduce carbon emissions.

    The Improv AI performance on April 15, 2025, featured dancer Bekah Crosby responding to AI-generated music from brain waves.

    The course prepares students to think like artists. Through abstraction and imagination they gain the confidence to tackle the engineering challenges of the 21st century. These include protecting the environment, building resilient cities and improving health.

    Students also realize that while AI has vast engineering and scientific applications, ethical implementation is crucial. Understanding the type and quality of training data that AI uses is essential. Without it, AI systems risk producing biased or flawed predictions.

    Uncommon Courses is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.

    This article is republished from The Conversation under a Creative Commons license. Read the original article.


  • Common Sense Media releases AI toolkit for school districts


    Key points:

    Common Sense Media has released its first AI Toolkit for School Districts, which gives districts of all sizes a structured, action-oriented guide for implementing AI safely, responsibly, and effectively.

    Common Sense Media research shows that 7 in 10 teens have used AI. As kids and teens increasingly use the technology for schoolwork, teachers and school district leaders have made it clear that they need practical, easy-to-use tools that support thoughtful AI planning, decision-making, and implementation.

    Common Sense Media developed the AI Toolkit, which is available to educators free of charge, in direct response to district needs.

    “As more and more kids use AI for everything from math homework to essays, they’re often doing so without clear expectations, safeguards, or support from educators,” said Yvette Renteria, Chief Program Officer of Common Sense Media.

    “Our research shows that schools are struggling to keep up with the rise of AI–6 in 10 kids say their schools either lack clear AI rules or are unsure what those rules are. But schools shouldn’t have to navigate the AI paradigm shift on their own. Our AI Toolkit for School Districts will make sure every district has the guidance it needs to implement AI in a way that works best for its schools.”

    The toolkit emphasizes practical tools, including templates, implementation guides, and customizable resources to support districts at various stages of AI exploration and adoption. These resources are designed to be flexible to ensure that each district can develop AI strategies that align with their unique missions, visions, and priorities.

    In addition, the toolkit stresses the importance of a community-driven approach, recognizing that AI exploration and decision-making require input from all of the stakeholders in a school community.

    By encouraging districts to give teachers, students, parents, and more a seat at the table, Common Sense Media’s new resources ensure that schools’ AI plans meet the needs of families and educators alike.

    This press release originally appeared online.

    eSchool News Staff


  • Raising the Bar: A Graduate Design Engineer’s Path in Engineering


    • By Professor Lisa-Dionne Morris, Professor of Public & Industry Understanding of Capability Driven Design in the School of Mechanical Engineering, and Engagement Champion for the EPSRC EDI Hub+ at the University of Leeds.

    International Women in Engineering Day, Monday 23 June 2025, provides an essential platform to celebrate the contributions of women designers and engineers while also highlighting persistent gender disparities in the profession. In 2021, only 16.5% of engineers in the UK were women, a figure that underscores the continued need for structural reform and targeted support for women pursuing careers in STEM disciplines.

    Preparing the next generation of female international design engineers requires more than the delivery of technical content. It necessitates a systemic, institution-wide approach that equips graduates with the attributes, knowledge, resources, skills, and confidence to navigate a professional landscape that is rapidly changing and, in many cases, still being defined. The increasing global demand for roles in areas such as sustainable product design, AI-integrated manufacturing, inclusive user interface systems, and human-centred engineering makes the empowerment of women designers and engineers crucial for driving innovation and achieving sustainable development goals. These emerging sectors demand not only technical competence but also the blend of creativity, emotional intelligence, and social awareness that women in STEM demonstrate.

    Holistic Support: Design Engineering as Ecosystem

    The development of a graduate designer and engineer can be likened to nurturing a tree within a complex ecosystem. While academic performance remains important, the capacity to thrive in uncertain, transdisciplinary, and innovation-driven contexts depends upon institutional ecosystems that foster global awareness, adaptability, collaboration, and resilience.

    Universities play a vital role as critical enablers and resources. This extends beyond curricula to the people, processes, and environments that scaffold student growth, from technical staff and personal tutors to administrative teams and peer mentors. The university must therefore shift its conceptualisation of employability from curriculum-contained instruction to community-wide responsibility.

    Barriers and Micro-inequities

    For female design and engineering graduates, these ecosystems are even more consequential. While overt discrimination may be declining, micro-barriers, such as imposter syndrome, limited visibility of role models, cultural dissonance and inaccessible resources, continue to affect women disproportionately. The intersectionality of race, disability, and socioeconomic status further compounds these challenges.

    Support mechanisms such as inclusive wellbeing services, financial assistance schemes, mentoring networks, and accessible technical environments serve as critical interventions. These do not merely reduce dropout risk; they transform educational experiences and enhance graduate outcomes.

    Beyond KSA: Towards the ACRES Model

    Traditional employability frameworks such as the KSA model (Knowledge, Skills, Abilities) focus primarily on individual traits. While helpful, such models risk overlooking the social, ethical, and emotional dimensions necessary for future engineering practice. In response, I propose the ACRES framework — a holistic model centred on:

    • A – Adaptability: Developing the capacity to respond flexibly to change.
    • C – Collaboration: Cultivating skills in teamwork and interdisciplinary cooperation.
    • R – Resilience: Building psychological robustness through reflective learning.
    • E – Empathy: Encouraging emotional intelligence through inclusive design challenges.
    • S – Social Responsibility: Engaging students with ethical, civic, and sustainability issues.

    These attributes are more than ideals; they represent the design specifications for the modern engineer.

    Educational Practice in Action

    Design engineering programmes across the UK are embedding these competencies through interdisciplinary projects, challenge-based learning, studio-based learning, sustainability modules, and community-based partnerships. At the University of Leeds, in the Faculty of Engineering and Physical Sciences, for example, students engage in industry-informed design briefs, receive feedback from career mentors, and co-produce portfolios that reflect both technical ability and human-centred thinking.

    Such practices are not incidental; they are fundamental. The preparation of women designers and engineers is a collective act; it is the result of intentional, inclusive, and collaborative university cultures that nurture talent through both “seen and unseen” interventions.

    The university must function not only as a centre of instruction but as a dynamic support system, enabling students at the intersections of identity, including first-generation, disabled, and otherwise underrepresented women, to flourish in STEM and graduate. When we invest in raising future-ready women designers and engineers, we are not merely producing graduates; we are shaping leaders, changemakers, and innovators for careers that, in many cases, are yet to be invented.


  • Chat Bot Passes College Engineering Class With Minimal Effort

    Chat Bot Passes College Engineering Class With Minimal Effort

    Since the release of ChatGPT in 2022, instructors have worried that students might circumvent learning by using the chat bot to complete homework and other assignments. In the years since, the large language model has expanded its knowledge base and its ability to answer more complex questions, but can it replace a student’s efforts entirely?

    Graduate students at the University of Illinois at Urbana-Champaign’s college of engineering integrated a large language model into an undergraduate aerospace engineering course to evaluate its performance compared to the average student’s work.

    The researchers, Gokul Puthumanaillam and Melkior Ornik, found that ChatGPT earned a passing grade in the course without much prompt engineering, but the chat bot didn’t demonstrate understanding or comprehension of high-level concepts. Their work illustrating its capabilities and limitations was published on the open-access platform arXiv, operated by Cornell Tech.

    The background: LLMs can tackle a variety of tasks, including creative writing and technical analysis, prompting concerns over students’ academic integrity in higher education.

    A significant number of students admit to using generative artificial intelligence to complete their course assignments (and professors admit to using generative AI to give feedback, create course materials and grade academic work). According to a 2024 survey from Wiley, most students say it’s become easier to cheat, thanks to AI.

    Researchers sought to understand how a student investing minimal effort would perform in a course by offloading work to ChatGPT.

    The evaluated class, Aerospace Control Systems, which was offered in fall 2024, is a required junior-level course for aerospace engineering students. During the term, students submit approximately 115 deliverables, including homework problems, two midterm exams and three programming projects.

    “The course structure emphasizes progressive complexity in both theoretical understanding and practical application,” the research authors wrote in their paper.

    They copied and pasted questions or uploaded screenshots of questions into a free version of the chat bot without additional guidance, mimicking a student who is investing minimal time in their coursework.

    The results: At the end of the term, ChatGPT achieved a B grade (82.2 percent), slightly below the class average of 85 percent. But it didn’t excel at all assignment types.

    On practice problems, the LLM earned a 90.4 percent average (compared to the class average of 91.4 percent), performing the best on multiple-choice questions. ChatGPT received a higher exam average (89.7 percent) compared to the class (84.8 percent), but it faltered much more on the written sections than on the autograded components.

    ChatGPT demonstrated its worst performance on the programming projects. While it showed sound mathematical reasoning on theoretical questions, its explanations were rigid and template-like, not adapting to the specific nuances of each problem, researchers wrote. It also produced inefficient or overly complex programming solutions, lacking “the optimization and robustness of considerations that characterize high-quality student submissions,” according to the article.

    The findings demonstrate that AI is capable of passing a rigorous undergraduate course, but that LLM systems rely on pattern recognition rather than deep understanding. The results also indicated to researchers that well-designed coursework can still meaningfully evaluate students’ capabilities in engineering.

    So what? Based on their findings, researchers recommend faculty members integrate project work and open-ended design challenges to evaluate students’ understanding and technical capabilities, particularly in synthesizing information and making practical judgments.

    In the same vein, they suggested that faculty should design questions that evaluate human expertise by requiring students to explain their rationale or justify their response, rather than just arrive at the correct answer.

    ChatGPT was also unable to grasp system integration, robustness and optimization over basic implementation, so focusing on these requirements would provide better evaluation metrics.

    Researchers also noted that because ChatGPT is capable of answering practice problems, instruction should focus less on routine technical work and more on higher-level engineering concepts and problem-solving skills. “The challenge ahead lies not in preventing AI use, but in developing educational approaches that leverage these tools while continuing to cultivate genuine engineering expertise,” researchers wrote.


  • Misinformation Course Teaches Ethics for Engineering Students

    Misinformation Course Teaches Ethics for Engineering Students

    Nearly three in four college students say they have somewhat high or very high media literacy skills (72 percent), according to a 2025 Student Voice survey by Inside Higher Ed and Generation Lab. Students are less likely to consider their peers media literate; three in five respondents said they have at least somewhat high levels of concern about the spread of misinformation among their classmates.

    When asked how colleges and universities could help improve students’ media literacy skills, a majority of Student Voice respondents indicated they want digital resources on increasing media literacy or media literacy–related content and training embedded into the curriculum.

    A recently developed course at the University of Southern California’s Viterbi School of Engineering teaches students information literacy principles to help them develop tools to mitigate the harms of online misinformation.

    The background: USC offers an interdisciplinary teaching grant that incentivizes cross-campus collaboration and innovative teaching practices. To be eligible for the grant, applications must include at least one full-time faculty member and faculty from more than one school or division. Each grantee receives up to $20,000 to compensate applicants for their time and work.

    In 2023, Helen Choi, a faculty member at USC Viterbi, won the interdisciplinary teaching grant in collaboration with Cari Kaurloto, head of the science and engineering library at USC Libraries, to create a media literacy course specifically for engineering students.

    “By focusing on engineering students, we were able to integrate a component of the course that addresses a social issue from an engineering perspective in terms of technical know-how and the professional ethics,” Choi said, which helps students see the relevance of course content to their personal and professional lives.

    What’s the need: Students tend to receive most of their news and information on online platforms; Student Voice data found a majority of learners rely on social media for news content (72 percent), and about one in four engage with news apps or news aggregator websites (27 percent).

    Choi and Kaurloto’s course, titled Information Literacy: Navigating Digital Misinformation, builds academic research skills, teaches information literacy principles and breaks down the social issue of online misinformation.

    “Students examine ways they can navigate online information using their research skills, and then extend that knowledge by considering how they, as prospective engineers, can build technologies that mitigate the harms of online misinformation while enhancing the information literacy of users,” Choi explained.

    USC faculty aren’t the only ones noticing a need for more education around engagement with digital information; a growing number of colleges and universities are making students complete a digital literacy course as a graduation requirement.

    In the classroom: Choi and Kaurloto co-teach the course, which was first offered this spring to a class of 25 students.

    The students learned to develop effective search strategies and critically examine sources, as well as ethical engineering principles and how to apply them in designing social media platforms, Kaurloto said. Choi and Kaurloto employed active learning pedagogies to give students hands-on and real-life applications including writing, speaking and collaborative coursework.

    One assignment the students completed was conducting library research to develop a thesis paragraph on an information literacy topic with a short, annotated bibliography. Students also presented their research to their peers, Kaurloto said.

    Learners also engaged in a group digital literacy project, designing a public service campaign that included helpful, research-backed ways to identify misinformation, Choi said. “They then had to launch that campaign on a social media platform, measure its impact, and present on their findings.” Projects ranged from infographics on Reddit to short-form videos on spotting AI-generated misinformation and images on TikTok and Instagram.

    The impact: In their feedback, students said they found the course helpful, with many upper-level learners wishing they had taken it earlier in their academic careers because of the library research skills they gained. They also indicated the course content was applicable in daily life, such as when supporting family members “who students say have fallen down a few internet rabbit holes or who tend to believe everything they see online,” Choi said.

    Other librarians have taken note of the course as a model of how to teach information literacy, Choi said.

    “We’ve found that linking information literacy with specific disciplines like engineering can be helpful both in terms of building curricula that resonate with students but also for building professional partnerships among faculty,” Choi said. “Many faculty don’t know that university librarians are also experts in information literacy—but they should!”

    This fall, Choi and Kaurloto plan to offer two sections of the course with a cap of 24 students per section. Choi hopes to see more first- and second-year engineering students in the course so they can apply these principles to their program.



  • Marine, geoscience, engineering students get hands-on experience aboard CSIRO ship

    Marine, geoscience, engineering students get hands-on experience aboard CSIRO ship

    CSIRO staff Dr Ben Arthur, Ian McRobert and Matt Kimber in front of the RV Investigator. Picture: Richard Jupe

    Students from 16 Australian universities set sail from Hobart on Saturday for a unique scientific adventure aimed at developing the country’s next generation of marine experts.



  • Students Explore STEM with Engineers

    Students Explore STEM with Engineers

    Middletown, PA – Phoenix Contact engineers head back into the classroom this week to teach sixth-grade science class at Middletown Area Middle School in Middletown, Pa. The classes are part of Phoenix Contact’s National Engineers Week celebration.

    Phoenix Contact has worked with the school every February since 2007. The engineers lead hands-on lessons that make science fun. The goal is to inspire young people to consider careers in science, technology, engineering, and math (STEM).

    The lessons include:

    • Building catapults
    • Racing cookie tins down ramps
    • Building an electric motor
    • Learning about static electricity with the Van de Graaff generator

    “Our engineering team created this outreach program many years ago, and the partnership with Middletown Area School District has stood the test of time,” said Patty Marrero, interim vice president of human relations at Phoenix Contact. “National Engineers Week is a special time for them to share their passion for technology with students. It’s also our chance to thank our engineers for the creativity and innovations that drive our company forward.”

    About Phoenix Contact

    Phoenix Contact is a global market leader based in Germany. Since 1923, Phoenix Contact has created products to connect, distribute, and control power and data flows. Our products are found in nearly all industrial settings, but we have a strong focus on the energy, infrastructure, process, factory automation, and e-mobility markets. Sustainability and responsibility guide every action we take, and we’re proud to work with our customers to empower a smart and sustainable world for future generations. Our global network includes 22,000 employees in 100+ countries. Phoenix Contact USA has headquarters near Harrisburg, Pa., and employs more than 1,100 people across the U.S.

    For more information about Phoenix Contact or its products, visit www.phoenixcontact.com, call technical service at 800-322-3225, or email [email protected].

    eSchool News Staff
