Tag: AI

  • The promise and challenge of AI in building a sustainable future

    It is tempting to regard AI as a panacea for addressing our most urgent global challenges, from climate change to resource scarcity. Yet the truth is more complex: unless we pair innovation with responsibility, the very tools designed to accelerate sustainability may exacerbate its contradictions.

    A transformative potential

    Let us first acknowledge how AI is already reshaping sustainable development. By mapping patterns in vast datasets, AI enables us to anticipate environmental risks, optimise resource flows and strengthen supply chains. Evidence suggests that by 2030, AI systems will touch the lives of more than 8.5 billion people and influence the health of both human and natural ecosystems in ways we have never seen before. Research published in Nature indicates that AI could act as an enabler for 134 specific targets, or 79% of those set under the Sustainable Development Goals (SDGs). Yet the same research also cautions that AI may inhibit 59 of those targets if deployed without care or control.

    In practice, this means smarter energy grids that balance load and demand, precision agriculture that reduces fertiliser waste and environmental monitoring systems that detect deforestation or pollution in real time. For a planet under pressure, these scenarios offer hope to do less harm and build more resilience.

    The hidden costs

    Even so, we must confront the shadows cast by AI’s advancements. An investigation published earlier this year warns that AI systems could account for nearly half of global data-centre power consumption before the decade’s end. Consider the sheer scale: vast server arrays, intensive cooling systems, rare-earth mining and water-consuming infrastructure all underpin generative AI’s ubiquity. Worse still, indirect carbon emissions tied to major AI-capable firms reportedly rose by 150% between 2020 and 2023. In short, innovation meant to serve sustainability imposes a growing ecological burden.

    Navigating trade-offs

    This tension presents an essential question: how can we reconcile AI’s promise with its cost? Scholars warn that we must move beyond the assumption that ‘AI for good’ is always good enough. The moment demands a new discipline of ‘sustainable AI’: a framework that treats resource use, algorithmic bias, lifecycle impact and societal equity as first-order concerns.

    Practitioners must ask not only what AI can do, but how it is built, powered, governed and retired. Efficiency gains that drive consumption higher will not deliver sustainability; they may merely escalate resource demands in disguise.

    A moral and strategic imperative

    For educators, policymakers and business leaders, this is more than a technical issue; it is a moral and strategic one. To realise AI’s true potential in advancing sustainable development, we must commit to three priorities:

    Energy and resource transparency: Organisations must measure and report the footprint of their AI models, including data-centre use, water cooling, e-waste and supply-chain impacts. Transparency is foundational to accountability.

    Ethical alignment and fairness: AI must be trained and deployed with due regard to bias, social impact and inclusivity. Its benefits must not reinforce inequality or externalise environmental harms onto vulnerable communities.

    Integrative education and collaboration: We need multidisciplinary expertise: engineers fluent in ecology, ethicists fluent in algorithms and managers fluent in sustainability. Institutions must upskill young learners and working professionals to orient AI within the broader context of planetary boundaries and human flourishing.

    MLA College’s focus and contribution

    At MLA College, we recognise our role in equipping professionals at this exact intersection. Our programmes emphasise the interrelationship between technology, sustainability and leadership. Graduates of our distance-learning and part-time formats engage with the complexities of AI, maritime operations, global sustainable development and marine engineering, bringing insight to sectors vital to the planet’s future.

    When responsibly guided, AI becomes an amplifier of purpose rather than a contraption of risk. Our challenge is to ensure that every algorithm, model and deployment contributes to regenerative systems, not extractive ones.

    The promise of AI is compelling: more accurate climate modelling, smarter cities, adaptive infrastructure and just-in-time supply chains. But the challenge is equally formidable: rising energy demands, resource-intensive infrastructures and ungoverned expansion.

    Our collective role, as educators and practitioners, is to shape the ethical architecture of this era. We must ask whether our technologies will serve humanity and the environment or simply accelerate old dynamics under new wrappers.

    The verdict will not be written in lines of code or boardroom decisions alone. It will be inscribed in the fields that fail to regenerate, in the communities excluded from progress, in the data centres humming with waste and in the next generation seeking meaning in technology’s promise.

    About the author: Professor Mohammad Dastbaz is the principal and CEO of MLA College, an international leader in distance and sustainability-focused higher education. With over three decades in academia, he has held senior positions including deputy vice-chancellor at the University of Suffolk and pro vice-chancellor at Leeds Beckett University.

    A Fellow of the British Computer Society, the Higher Education Academy, and the Royal Society of Arts, Professor Dastbaz is a prominent researcher and author in the fields of sustainable development, smart cities, and digital innovation in education.

    His latest publication, Decarbonization or Demise – Sustainable Solutions for Resilient Communities (Springer, 2025), brings together cutting-edge global research on sustainability, climate resilience, and the urgent need for decarbonisation. The book builds on his ongoing commitment to advancing the UN Sustainable Development Goals through education and research.

    At MLA College, Professor Dastbaz continues to lead transformative learning initiatives that combine academic excellence with real-world impact, empowering students to shape a sustainable future.

  • Can AI Keep Students Motivated, Or Does it Do the Opposite? – The 74

    Imagine a student using a writing assistant powered by a generative AI chatbot. As the bot serves up practical suggestions and encouragement, insights come more easily, drafts polish up quickly and feedback loops feel immediate. It can be energizing. But when that AI support is removed, some students report feeling less confident or less willing to engage.

    These outcomes raise the question: Can AI tools genuinely boost student motivation? And what conditions can make or break that boost?

    As AI tools become more common in classroom settings, the answers to these questions matter a lot. While general-use tools such as ChatGPT or Claude remain popular, more and more students are encountering AI tools that are purpose-built to support learning, such as Khan Academy’s Khanmigo, which personalizes lessons. Others, such as ALEKS, provide adaptive feedback. Both tools adjust to a learner’s level and highlight progress over time, which helps students feel capable and see improvement. But there are still many unknowns about the long-term effects of these tools on learners’ progress, an issue I continue to study as an educational psychologist.

    What the evidence shows so far

    Recent studies indicate that AI can boost motivation, at least for certain groups, when deployed under the right conditions. A 2025 experiment with university students showed that when AI tools delivered a high-quality performance and allowed meaningful interaction, students’ motivation and their confidence in being able to complete a task – known as self-efficacy – increased.

    For foreign language learners, a 2025 study found that university students using AI-driven personalized systems took more pleasure in learning and had less anxiety and more self-efficacy compared with those using traditional methods. A recent cross-cultural analysis with participants from Egypt, Saudi Arabia, Spain and Poland who were studying diverse majors suggested that positive motivational effects are strongest when tools prioritize autonomy, self-direction and critical thinking. These individual findings align with a broader, systematic review of generative AI tools that found positive effects on student motivation and engagement across cognitive, emotional and behavioral dimensions.

    A forthcoming meta-analysis from my team at the University of Alabama, which synthesized 71 studies, echoed these patterns. We found that generative AI tools on average produce moderate positive effects on motivation and engagement. The impact is larger when tools are used consistently over time rather than in one-off trials. Positive effects were also seen when teachers provide scaffolding, when students maintain agency in how they use the tool, and when the output quality is reliable.

    But there are caveats. More than 50 of the studies we reviewed did not draw on a clear theoretical framework of motivation, and some used methods that we found were weak or inappropriate. This raises concerns about the quality of the evidence and underscores how much more careful research is needed before one can say with confidence that AI nurtures students’ intrinsic motivation rather than just making tasks easier in the moment.

    When AI backfires

    There is also research that paints a more sobering picture. A large study of more than 3,500 participants found that while human–AI collaboration improved task performance, it reduced intrinsic motivation once the AI was removed. Students reported more boredom and less satisfaction, suggesting that overreliance on AI can erode confidence in their own abilities.

    Another study suggested that while learning achievement often rises with the use of AI tools, increases in motivation are smaller, inconsistent or short-lived. Quality matters as much as quantity. When AI delivers inaccurate results, or when students feel they have little control over how it is used, motivation quickly erodes. Confidence drops, engagement fades and students can begin to see the tool as a crutch rather than a support. And because there are not many long-term studies in this field, we still do not know whether AI can truly sustain motivation over time, or whether its benefits fade once the novelty wears off.

    Not all AI tools work the same way

    The impact of AI on student motivation is not one-size-fits-all. Our team’s meta-analysis shows that, on average, AI tools do have a positive effect, but the size of that effect depends on how and where they are used. When students work with AI regularly over time, when teachers guide them in using it thoughtfully, and when students feel in control of the process, the motivational benefits are much stronger.

    We also saw differences across settings. College students seemed to gain more than younger learners, STEM and writing courses tended to benefit more than other subjects, and tools designed to give feedback or tutoring support outperformed those that simply generated content.

    There is also evidence that general-use tools like ChatGPT or Claude do not reliably promote intrinsic motivation or deeper engagement with content, compared to learning-specific platforms such as ALEKS and Khanmigo, which are more effective at supporting persistence and self-efficacy. However, these tools often come with subscription or licensing costs. This raises questions of equity, since the students who could benefit most from motivational support may also be the least likely to afford it.

    These and other recent findings should be seen as only a starting point. Because AI is so new and is changing so quickly, what we know today may not hold true tomorrow. In a paper titled “The Death and Rebirth of Research in Education in the Age of AI,” the authors argue that the speed of technological change makes traditional studies outdated before they are even published. At the same time, AI opens the door to new ways of studying learning that are more participatory, flexible and imaginative. Taken together, the data and the critiques point to the same lesson: Context, quality and agency matter just as much as the technology itself.

    Why it matters for all of us

    The lessons from this growing body of research are straightforward. The presence of AI does not guarantee higher motivation, but it can make a difference if tools are designed and used with care and understanding of students’ needs. When it is used thoughtfully, in ways that strengthen students’ sense of competence, autonomy and connection to others, it can be a powerful ally in learning.

    But without those safeguards, the short-term boost in performance could come at a steep cost. Over time, there is the risk of weakening the very qualities that matter most – motivation, persistence, critical thinking and the uniquely human capacities that no machine can replace.

    For teachers, this means that while AI may prove a useful partner in learning, it should never serve as a stand-in for genuine instruction. For parents, it means paying attention to how children use AI at home, noticing whether they are exploring, practicing and building skills or simply leaning on it to finish tasks. For policymakers and technology developers, it means creating systems that support student agency, provide reliable feedback and avoid encouraging overreliance. And for students themselves, it is a reminder that AI can be a tool for growth, but only when paired with their own effort and curiosity.

    Regardless of technology, students need to feel capable, autonomous and connected. Without these basic psychological needs in place, their sense of motivation will falter – with or without AI.

    This article is republished from The Conversation under a Creative Commons license. Read the original article.

  • More Than Half the States Have Issued AI Guidance for Schools – The 74

    Agencies in at least 28 states and the District of Columbia have issued guidance on the use of artificial intelligence in K-12 schools.

    More than half of the states have created school policies to define artificial intelligence, develop best practices for using AI systems and more, according to a report from AI for Education, an advocacy group that provides AI literacy training for educators.

    Despite efforts by the Trump administration to loosen federal and state AI rules in hopes of boosting innovation, teachers and students need a lot of state-level guidance for navigating the fast-moving technology, said Amanda Bickerstaff, the CEO and co-founder of AI for Education.

    “What most people think about when it comes to AI adoption in the schools is academic integrity,” she said. “One of the biggest concerns that we’ve seen — and one of the reasons why there’s been a push towards AI guidance, both at the district and state level — is to provide some safety guidelines around responsible use and to create opportunities for people to know what is appropriate.”

    North Carolina, which last year became one of the first states to issue AI guidance for schools, set out to study and define generative artificial intelligence for potential uses in the classroom. The policy also includes resources for students and teachers interested in learning how to interact with AI models successfully.

    In addition to classroom guidance, some states emphasize ethical considerations for certain AI models. Following Georgia’s initial framework in January, the state shared additional guidance in June outlining ethical principles educators should consider before adopting the technology.

    This year, Maine, Missouri, Nevada and New Mexico also released guidelines for AI in schools.

    In the absence of regulations at the federal level, states are filling a critical gap, said Maddy Dwyer, a policy analyst for the Equity in Civic Technology team at the Center for Democracy & Technology, a nonprofit working to advance civil rights in the digital age.

    While most state AI guidance for schools focuses on the potential benefits, risks and need for human oversight, Dwyer wrote in a recent blog post that many of the frameworks are missing out on critical AI topics, such as community engagement and deepfakes, or manipulated photos and videos.

    “I think that states being able to fill the gap that is currently there is a critical piece to making sure that the use of AI is serving kids and their needs, and enhancing their educational experiences rather than detracting from them,” she said.

    Stateline is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Stateline maintains editorial independence. Contact Editor Scott S. Greenberger for questions: [email protected].


  • AI and Art Collide in This Engineering Course That Puts Human Creativity First – The 74

    I see many students viewing artificial intelligence as humanlike simply because it can write essays, do complex math or answer questions. AI can mimic human behavior but lacks meaningful engagement with the world.

    This disconnect inspired my course “Art and Generative AI,” which was shaped by the ideas of 20th-century German philosopher Martin Heidegger. His work highlights how we are deeply connected and present in the world. We find meaning through action, care and relationships. Human creativity and mastery come from this intuitive connection with the world. Modern AI, by contrast, simulates intelligence by processing symbols and patterns without understanding or care.

    In this course, we reject the illusion that machines fully master everything and put student expression first. In doing so, we value uncertainty, mistakes and imperfection as essential to the creative process.

    This vision expands beyond the classroom. In the 2025-26 academic year, the course will include a new community-based learning collaboration with Atlanta’s art communities. Local artists will co-teach with me to integrate artistic practice and AI.

    The course builds on my 2018 class, Art and Geometry, which I co-taught with local artists. The course explored Picasso’s cubism, which depicted reality as fractured from multiple perspectives; it also looked at Einstein’s relativity, the idea that time and space are not absolute and distinct but part of the same fabric.

    What does the course explore?

    We begin with exploring the first mathematical model of a neuron, the perceptron. Then, we study the Hopfield network, which mimics how our brain can remember a song from just listening to a few notes by filling in the rest. Next, we look at Hinton’s Boltzmann Machine, a generative model that can also imagine and create new, similar songs. Finally, we study today’s deep neural networks and transformers, AI models that mimic how the brain learns to recognize images, speech or text. Transformers are especially well suited for understanding sentences and conversations, and they power technologies such as ChatGPT.
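    The Hopfield network’s “fill in the rest” behaviour described above can be sketched in a few lines of Python. This is a toy illustration under my own assumptions (a single stored eight-unit pattern, invented function names and a simple synchronous update rule), not material from the course:

```python
import numpy as np

# Toy Hopfield network: store a +/-1 pattern with the Hebbian rule,
# then recover it from a corrupted cue, the way a few notes of a song
# can call up the whole melody.

def train(patterns):
    """Build the weight matrix from rows of +/-1 patterns (Hebbian rule)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / len(patterns)

def recall(W, state, steps=10):
    """Synchronously update all units until the state stops changing."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train(pattern[None, :])

cue = pattern.copy()
cue[:2] *= -1  # corrupt the first two units
print(np.array_equal(recall(W, cue), pattern))  # prints: True
```

    Even with a quarter of the units flipped, the network settles back to the stored pattern, which is the associative-memory behaviour the course uses as a stepping stone toward modern generative models.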

    In addition to AI, we integrate artistic practice into the coursework. This approach broadens students’ perspectives on science and engineering through the lens of an artist. The first offering of the course in spring 2025 was co-taught with Mark Leibert, an artist and professor of the practice at Georgia Tech. His expertise is in art, AI and digital technologies. He taught students fundamentals of various artistic media, including charcoal drawing and oil painting. Students used these principles to create art using AI ethically and creatively. They critically examined the source of training data and ensured that their work respects authorship and originality.

    Students also learn to record brain activity using electroencephalography – EEG – headsets. Through AI models, they then learn to transform neural signals into music, images and storytelling. This work inspired performances where dancers improvised in response to AI-generated music.

    The Improv AI performance at Georgia Institute of Technology on April 15, 2025. Dancers improvised to music generated by AI from brain waves and sonified black hole data.

    Why is this course relevant now?

    AI entered our lives so rapidly that many people don’t fully grasp how it works, why it works, when it fails or what its mission is.

    In creating this course, the aim is to empower students by filling that gap. Whether they are new to AI or not, the goal is to make its inner algorithms clear, approachable and honest. We focus on what these tools actually do and how they can go wrong.

    We place students and their creativity first. We reject the illusion of a perfect machine; instead, we deliberately provoke the AI algorithm into confusion and hallucination, the states in which it generates inaccurate or nonsensical responses. To do so, we deliberately use a small dataset, reduce the model size or limit training. It is in these flawed states of AI that students step in as conscious co-creators. The students are the missing algorithm that takes back control of the creative process. Their creations do not obey AI but reimagine it by the human hand. The artwork is rescued from automation.

    What’s a critical lesson from the course?

    Students learn to recognize AI’s limitations and harness its failures to reclaim creative authorship. The artwork isn’t generated by AI, but it’s reimagined by students.

    Students learn that chatbot queries have an environmental cost because large AI models use a lot of power. They avoid unnecessary iterations when designing prompts or using AI, which helps reduce carbon emissions.

    The Improv AI performance on April 15, 2025, featured dancer Bekah Crosby responding to AI-generated music from brain waves.

    The course prepares students to think like artists. Through abstraction and imagination they gain the confidence to tackle the engineering challenges of the 21st century. These include protecting the environment, building resilient cities and improving health.

    Students also realize that while AI has vast engineering and scientific applications, ethical implementation is crucial. Understanding the type and quality of training data that AI uses is essential. Without it, AI systems risk producing biased or flawed predictions.

    Uncommon Courses is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.

    This article is republished from The Conversation under a Creative Commons license. Read the original article.

  • 60% of Teachers Used AI This Year and Saved up to 6 Hours of Work a Week – The 74

    Nearly two-thirds of teachers utilized artificial intelligence this past school year, and weekly users saved almost six hours of work per week, according to a recently released Gallup survey. But 28% of teachers still oppose AI tools in the classroom.

    The poll, published by the research firm and the Walton Family Foundation, includes perspectives from 2,232 U.S. public school teachers.

    “[The results] reflect a keen understanding on the part of teachers that this is a technology that is here, and it’s here to stay,” said Zach Hrynowski, a Gallup research director. “It’s never going to mean that students are always going to be taught by artificial intelligence and teachers are going to take a backseat. But I do like that they’re testing the waters and seeing how they can start integrating it and augmenting their teaching activities rather than replacing them.”

    At least once a month, 37% of educators take advantage of tools to prepare to teach, including creating worksheets, modifying materials to meet student needs, doing administrative work and making assessments, the survey found. Less common uses include grading, providing one-on-one instruction and analyzing student data.

    A 2023 study from the RAND Corp. found that the most common AI tools used by teachers include virtual learning platforms, like Google Classroom, and adaptive learning systems, like i-Ready or Khan Academy. Educators also used chatbots, automated grading tools and lesson plan generators.

    Most teachers who use AI tools say they help improve the quality of their work, according to the Gallup survey. About 61% said they receive better insights about student learning or achievement data, while 57% said the tools help improve their grading and student feedback.

    Nearly 60% of teachers agreed that AI improves the accessibility of learning materials for students with disabilities. For example, some kids use text-to-speech devices or translators.

    More teachers in the Gallup survey agreed on AI’s risks for students than on its opportunities. Roughly a third said students using AI tools weekly would increase their grades, motivation, preparation for jobs in the future and engagement in class. But 57% said it would decrease students’ independent thinking, and 52% said it would decrease critical thinking. Nearly half said it would decrease student persistence in solving problems, ability to build meaningful relationships and resilience for overcoming challenges.

    In 2023, the U.S. Department of Education published a report recommending the creation of standards to govern the use of AI.

    “Educators recognize that AI can automatically produce output that is inappropriate or wrong. They are well-aware of ‘teachable moments’ that a human teacher can address but are undetected or misunderstood by AI models,” the report said. “Everyone in education has a responsibility to harness the good to serve educational priorities while also protecting against the dangers that may arise as a result of AI being integrated in ed tech.”

    Researchers have found that AI education tools can be incorrect and biased — even scoring academic assignments lower for Asian students than for classmates of any other race.

    Hrynowski said teachers are seeking guidance from their schools about how they can use AI. While many are getting used to setting boundaries for their students, they don’t know in what capacity they can use AI tools to improve their jobs.

    The survey found that 19% of teachers are employed at schools with an AI policy. During the 2024-25 school year, 68% of those surveyed said they didn’t receive training on how to use AI tools. Roughly half of them taught themselves how to use it.

    “There aren’t very many buildings or districts that are giving really clear instructions, and we kind of see that hindering the adoption and use among both students and teachers,” Hrynowski said. “We probably need to start looking at having a more systematic approach to laying down the ground rules and establishing where you can, can’t, should or should not use AI in the classroom.”

    Disclosure: Walton Family Foundation provides financial support to The 74.


  • No thank you, AI, I am not interested. You don’t get my data. #shorts

  • SMART Technologies Launches AI Assist in Lumio to Save Teachers Time

    Lumio by SMART Technologies, a cloud-based learning platform that enhances engagement on student devices, recently announced a new feature for its Spark plan. This new offering integrates AI Assist, an advanced tool designed to save teachers time and elevate student engagement through AI-generated quiz-based activities and assessments.

    Designing effective quizzes takes time—especially when crafting well-balanced multiple-choice questions with plausible wrong answers to encourage critical thinking. AI Assist streamlines this process, generating high-quality quiz questions at defined levels in seconds so teachers can focus on engaging their students rather than spending time on quiz creation.

  • Innovation Without Borders: Galileo’s Networked Approach to Better Higher Education System

    One of the biggest but least remarked-upon trends in European higher education in recent years is the growth of private, for-profit higher education. Even in countries where tuition is free, there are hundreds of thousands of students who now prefer to take courses at private for-profit institutions.

    To me, the question is, why? What sort of institutions are these anyway? Interestingly, the answer to that second question is one which might confuse my mostly North American audience. Turns out a lot of these private institutions are relatively small, bespoke institutions with very narrow academic specializations. And yet they’re owned by a few very large international conglomerate universities. That’s very different from North America, where institutions tend to be either small and bespoke, or part of a large corporation, but not both.

    Today my guest is Nicolas Badré. He’s the Chief Operating Officer of the Galileo Group, which operates a number of universities across Europe. I met him a few months ago at an OECD event in Jakarta. When I heard about some of Galileo’s initiatives, I knew I’d have to have him on the show. 

    There are three things which I think are most important about this interview. First is the discussion about Galileo’s business model and how it achieves economies of scale across such different types of institutions. Second, there’s how the network goes about collectively learning across all its various institutions. And third, specifically how it’s choosing to experiment with AI across a number of institutions and apply the lessons more globally. 

    Overall, it’s a fascinating chat. I hope you enjoy it too. But now, let’s turn things over to Nicolas.


    The World of Higher Education Podcast
    Episode 3.27 | Innovation Without Borders: Galileo’s Networked Approach to Better Higher Education System

    Transcript

    Alex Usher (AU): Nicolas, Galileo Global Education has grown significantly over the years. I think the group is, if I’m not mistaken, 13 or 14 years old now. Some of the universities it owns might be a bit older, but can you walk us through the origins of the group? How did you grow to be as big as you are? I think you’ve got dozens of institutions in dozens of countries—how did that growth happen so quickly?

    Nicolas Badré (NB): Thank you, Alex, for the question. It’s an interesting story. And yes, to your point, the group was created 13 and a half years ago, with an investment by Providence Equity Partners into Istituto Marangoni, a fashion school in Italy. That dates back to 2011. Since then, we’ve made 30 acquisitions.

    The growth started primarily in Europe, especially in France and Germany. Then, in 2014, we took our first steps outside of Europe with the acquisition of IEU in Mexico. Significant moves followed in 2018 and 2019, particularly into the online learning space with Studi in France and AKAD in Germany.

    There’s been a very rapid acceleration over the past five years. For context, I joined the group at the end of 2019. At that time, Galileo had 67,000 students across nine countries. Today, we have 300,000 students in 20 countries.

    Back then, the group was primarily focused on arts and creative schools, as well as business and management schools. Now, we’ve expanded into tech and health, and even into some professional training areas—like truck driving, for instance.

    What does this reflect? Two things. First, very strong organic growth from our existing schools and brands. Take ESG in France as an example. It’s been around for 40 years and is a well-known entry-level business school. Over the past five years, it’s diversified considerably, creating ESG Luxury, ESG Tourism, you name it. It’s also expanded its physical presence from just a few cities to now being in 15 or 16 cities across France.

    So it’s really been a combination of strong organic growth and selective acquisitions that have helped us more than quadruple our student numbers in just five years.

    AU: It’s interesting—I think a lot of our listeners and viewers might be surprised to hear about such a strong for-profit institution coming out of France. When you think of French higher education, you think of the Grandes Écoles, you think of free education. So why would so many people choose to pay for education when they don’t have to? It’s a pretty strong trend in France now. I think over 26% of all students in France are in some form of private higher education. What do you offer that makes people willing to give up “free”?

    NB: It’s a good question, and you’re right—it’s not just about France. In many places across Europe, including Germany, the Nordics, and others, you see similar dynamics.

    That said, yes, in France in particular, there’s been a growing share of private players in higher education over the past few years. That probably reflects the private sector’s greater ability to adapt to new environments.

    I’d highlight three main factors that help explain why we’ve been successful in this space.

    First, we’re obsessed with employability and skills-based education. And that’s true across all levels and backgrounds. When we worked on our group mission statement, everyone agreed that our mission is to “unleash the potential of everyone for better employability.” 

    Because of that focus, we maintain very strong ties with industry. That gives us the ability to adapt, create, and update our programs very quickly in response to emerging demands. We know competencies become obsolete faster now, so staying aligned with job market needs is critical. That’s probably the strongest unifying driver across all of Galileo.

    Beyond that, we also offer very unique programs. Take Noroff, for example—a tech school in Norway, where the tuition-free tradition is even stronger than in France. It’s one of the very few fee-paying institutions in the country. But the program is so strong that students are willing to pay around 15,000 euros a year because they know they’ll get a top-tier, hands-on experience—something that might be slower to evolve in the public system.

    So that’s the first point: employability and unique, high-impact programs.

    Second, we put a strong emphasis on the student experience. How do we transform their education beyond just delivering content? That’s an area we continue to invest in—never enough, but always pushing. We’re focused on hybridizing disciplines, geographies, and pedagogical approaches.

    And we’ve systematized student feedback—not just asking for opinions, but making sure we translate that feedback into tangible improvements in the student experience.

    And third, I’d say there’s a values-based dimension to all of this. We focus heavily on innovation, entrepreneurship, and high standards. Those are the core values that we’re driven by. You could say they’re our obsessions—and I think that kind of vision and energy resonates with our students. Those are the three main things I’d point to.

    AU: I have a question about how you make things work across such a diverse set of institutions. I mean, you’ve got design schools, drama schools, law schools, medical schools. When people think about private education, there’s often an assumption that there’s some kind of economies of scale in terms of curriculum. The idea that you can reuse curriculum across different places. But my impression is that you can’t do that very much. It seems like you’re managing all these different institutions, each of them like their own boutique operation, with their own specific costs. How do you make it work across a system as large and diverse as yours? Where are the economies of scale?

    NB: Well, that’s also a very good point—and you’re absolutely right. We have a very diverse network of schools. We have a culinary arts school in Bordeaux, France, with maybe 400 students, and we have universities with more than 10,000 students, whether in medical or business education.

    So yes, you might wonder: why put these institutions together?

    The answer is that we really built the group’s development around the entrepreneurial DNA of our school directors. They’re responsible for their own development—for their growth, diversification, and how they respond to the job market.

    We’re not obsessed with economies of scale. What we really value is the network itself. What we focus on is shared methodology—in areas like sales and marketing, finance, HR, and student experience.

    There are also some opportunities for synergies in systems. In some cases, for instance, yes—we use a similar CRM across several countries. But I think the real value of the network lies in its ability to share experiences and experiment with innovation throughout, and then scale up those innovations appropriately across the other schools.

    So I’d say it’s more about shared practices than about forcing economies of scale across borders—because that doesn’t always make sense.

    AU: Am I correct in thinking that you don’t necessarily present yourself as a chain of institutions to students? That each institution actually has a pretty strong identity in and of itself—is that right? Is there a fair bit of autonomy and ability to adapt things locally at each of your schools?

    NB: Yes, I think that’s true. In terms of branding, we believe that each of our schools generally has a stronger brand than Galileo itself. And that’s how it should be, because each school has its own experience, its own DNA, its own momentum and development.

    So, we see ourselves more as a platform that supports the development of all these schools, rather than a chain imposing the same standards and practices across the board.

    Of course, we do have certain methodologies—for example, how to run a commercial campaign. We provide guidance, but it’s ultimately up to each school to manage that process and use the methodology in a way that works best for their own development.

    That doesn’t mean there’s no value in having the Galileo name—there is. But the value is in being a platform that supports the schools, rather than overshadowing them.

    AU: Nicolas, I know Galileo is testing a lot of AI-driven approaches across its various institutions. What I found interesting in a discussion we had offline a few weeks ago is that you’re experimenting with AI in different parts of the institution—some of it around curriculum, some around administration, and some around student services. Can you give us an overview? What exactly are you testing, and what are the goals of these experiments?

    NB: I think we first need to frame how we’re using AI, and it’s important to look at our strategy globally. We believe there are three major trends shaping higher education.

    First, student expectations are evolving quickly—they’re demanding more flexibility and personalization. Second, there’s a rapid emergence of new competencies, which challenges our ability to adapt and update programs quickly. And third, we need to go beyond boundaries and be agile in how we approach topics, address new skills, and serve diverse learners. These are the three starting points we see as opportunities for Galileo to differentiate itself.

    Now, we’re not trying to become a leading AI company. Our goal remains to be a recognized leader in education—improving employability and lives. That’s our benchmark.

    With that in mind, our AI vision is focused on four areas:

    1. How do we deliver a unique experience to our students?
    2. How do we connect educators globally who are trained in and comfortable with AI?
    3. How do we develop content that can be adapted, localized, translated, and personalized?
    4. And how do we improve operational productivity?

    AI is clearly a powerful tool in all four areas. Let me walk through some of the things we’re doing. 

    The first area we call AI for Content. We’re using AI to more quickly identify the competencies required by the job market. We use tools that give us a more immediate connection to the market to understand what skills are in demand. Based on that, we design programs that better align with those needs.

    Then the next step is about course and content creation. Once we’ve defined the competencies, how do we design the courses, the pedagogical materials? How do we make it easier to localize and adapt that content?

    Take Studi, an online university in France with 67,000 students and around 150 different programs. A year ago, it would take them about four months to design a bachelor’s or master’s program. Now, it takes one to two months, depending on the specifics. The cost has been cut in half, and development speed has increased by a factor of two, three, even four in some cases. This also opens up opportunities to make programs more personalized because we can update them much faster. 

    The second area is AI for Experience. How do we use AI to enhance the student experience?

    We’ve embedded AI features in our LMS to personalize quizzes, generate mind maps, and create interactive sessions during classes. We’ve also adapted assessments. For example, in Germany, for the past two years, our online university AKAD has let students choose their own exam dates. That’s based on an AI approach that generates personalized assessments while staying within the requirements of German accreditation bodies. This wouldn’t be possible without AI. The result is higher engagement, faster feedback, and a more personalized learning experience.

    Lastly, beyond content and experience, we’re seeing real gains in AI for Operations. In sales and marketing, for example, we now use bots in Italy and Latin America to re-engage “dead” leads—contacting them again, setting up meetings, and redirecting them through the admissions funnel. It’s proven quite efficient, and we’re looking to expand that approach to other schools.

    We’re also seeing strong results in tutoring. Take Corndel, a large UK-based school focused on apprenticeships. They’re using AI tools extensively to improve student tracking, tutoring, and weekly progress monitoring.

    So, we’re seeing a lot of momentum across all these dimensions—and it’s really picked up speed over the last 18 months.

    AU: So, you’ve got a network of institutions, which gives you a lot of little laboratories to experiment with—to try different things. How do you identify best practices? And then how do you scale them across your network?

    NB: Well, first of all, we have lots of different pilots. As you’ve understood, we’re quite decentralized, so we don’t have a central innovation team of 50 people imposing innovations across all our schools.

    It’s more about scouting and sharing experiences from one school to another. It’s a combination of networks where people share what they’re learning.

    Just to name a few, we have a Digital Learning Community—that’s made up of all the people involved in LMS design across our schools. They exchange a lot of insights and experiences.

    We also hold regular touchpoints to present what’s happening in AI for content, AI for experience, and AI for operations. We’ve created some shared training paths for schools as well. So there are a lot of initiatives aimed at maximizing sharing, rather than imposing anything top-down. Again, as you pointed out, the schools are extremely diverse—in terms of regulations, size, content, and disciplines. So there’s no universal recipe.

    That said, in some cases it’s more about developing a methodology. For example, how do you design and implement a pedagogical chatbot? The experiments we’re running now are very promising for future scale-up, because we’re learning a lot from these developments.

    AU: I know that, in a sense, you’ve institutionalized the notion of innovation within the system. I think you’ve recently launched a new master’s program specifically focused on this question—on how to innovate in education systems. Can you tell us a little bit about that?

    NB: Yeah, I’m super excited to talk about this, because it’s where I’m focusing most of my energy these days.

    We’ve been working on this project for a year with four Galileo institutions. It’s called Copernia, and the name, like Galileo, is intentional—these are people who changed perspectives. That’s exactly what we want to do: change the perspective on education and truly put the student at the center.

    Copernicus started the revolution, and Galileo confirmed it—so it’s no coincidence that we’re pursuing this work under these two names.

    The first program we’re launching under Copernia is a Master of Innovation and Technology for Education. The idea is to bring together and leverage expertise from several fields: neurocognitive science, tech, AI and data, educational sciences, innovation, design, and management. The goal is to offer students a unique experience where they not only learn about innovation—but also learn to develop and apply it.

    One of the major assets we want to leverage is the Galileo network. With over 120 campuses, we can offer students real, hands-on opportunities to experiment and innovate. So the value proposition is: if you want to design and test educational innovation, we’ll give you the tools, the foundational knowledge, and, most importantly, the chance to apply that in practice—within our network, with our partners, and with other institutions.

    The goal is to help the whole ecosystem benefit—not just from Galileo’s environment, but also from the contributions of tech partners, academic collaborators, and business partners around the world. I’m convinced this will be a major tool to develop, share, and scale practical, applied innovation.

    And importantly, this isn’t meant to be just an internal initiative for Galileo. It’s designed to be open. We want to train people who can help transform education—not only in higher education, but also in K–12 and lifelong learning. Because we believe this kind of cross-disciplinary expertise and hands-on innovation experience is valuable across the entire education sector.

    AU: I’m really impressed with the scale and speed at which you’re able to experiment. But it did make me wonder—why can’t public higher education systems do the same? I mean, if I think about French universities, there are 70 or 80 in the public system—though it’s hard to keep track because they keep merging. But theoretically, they could do this too, couldn’t they? It’s a moderately centralized system, and there’s no reason institutions couldn’t collaborate in ways that let them identify useful innovations—rolling them out at different speeds in different areas, depending on what works. Why can’t the public sector innovate like that?

    NB: First of all, I wouldn’t make a sweeping judgment on this. I think there is innovation happening everywhere—including within public institutions. So I wouldn’t describe it in black-and-white terms.

    That said, it’s true that as a private organization, we face a certain kind of pressure. We need to prove that we operate a sustainable model—and we need to prove that every month. In other words, we rely on ourselves to develop, to test, and to optimize how we grow. 

    The second factor is that we have an asset in being able to test and learn in very different environments. Take the example I mentioned earlier, about Germany and the anytime online assessments. We were able to implement that model there because it was online and because the regulatory environment allowed it.

    Now, when we approach accreditation bodies in other countries, we can say: “Look, it works. It’s already accepted elsewhere. Why not consider it here?” That ability to move between different contexts—academic and professional, vocational and executive—is really valuable. It allows us to promote solutions that cross traditional boundaries.

    That’s not something all public universities can do—and frankly, not something all universities can do, period. But it’s an advantage we’ve built over the past several years by creating this large field for experimentation.

    AU: Nicolas, thank you so much for being with us today.

    NB: Alex, thank you very much. It’s been a pleasure.

    AU: It just remains for me to thank our excellent producers, Tiffany MacLennan and Sam Pufek, and to thank you—our viewers, listeners, and readers—for joining us. If you have any questions about today’s podcast, please don’t hesitate to get in touch at [email protected]. And don’t forget—never miss an episode of The World of Higher Education Podcast. Head over to YouTube and subscribe to our channel. Join us next week when our guest will be Noel Baldwin, CEO of the Future Skills Centre here in Canada. He’ll be joining us to talk about the Programme for the International Assessment of Adult Competencies. See you then.

    *This podcast transcript was generated using an AI transcription service with limited editing. Please forgive any errors made through this service. Please note, the views and opinions expressed in each episode are those of the individual contributors, and do not necessarily reflect those of the podcast host and team, or our sponsors.

    This episode is sponsored by Studiosity. Student success, at scale—with an evidence-based ROI of 4.4x return for universities and colleges. Because Studiosity is AI for Learning, not corrections: developing critical thinking, agency, and retention, and empowering educators with learning insight. For future-ready graduates, and for future-ready institutions. Learn more at studiosity.com.
