Tag: Embedding

  • Global lessons for the UK: how Singapore and India are embedding AI in education

    This blog was kindly authored by Dr Karryl Kim Sagun Trajano (Research Fellow, S. Rajaratnam School of International Studies (RSIS)), Dr Gayatri Devi Pillai (Assistant Professor, HHMSPB NSS College for Women, Trivandrum), Professor Mohanan Pillai (Pondicherry University), Dr Hillary Briffa (Senior Lecturer, Department of War Studies, KCL), Dr Anna Plunkett (Lecturer, Department of War Studies, KCL), Dr Ksenia Kirkham (Senior Lecturer, Department of War Studies, KCL), Dr Özge Söylemez (Lecturer, Defence Studies Department, KCL), Dr Lucas Knotter (Lecturer, Department of Politics, Languages, and International Studies, University of Bath), and Dr Chris Featherstone (Associate Lecturer, Department of Politics and International Relations, University of York).

    This blog draws on insights from the 2025 BISA-ISA joint Workshop on AI Pedagogies: Practice, Prompts and Problems in Contemporary Higher Education, sponsored by the ASPIRE (Academic Scholarship in Politics and International Relations Education) Network.

    As the UK continues to work out how best to regulate and support the use of AI in higher education, other countries have already begun to put their ideas into practice. Singapore and India, in particular, offer useful contrasts. Both link technological innovation to questions of social inclusion, though they do so in different ways: Singapore focuses on resilience and lifelong learning, while India emphasises access and the use of vernacular languages. Taken together, their experiences show how education policy can harness AI to advance both innovation and inclusion, making technological progress a driver of social cohesion. British tertiary education institutions have long drawn international lessons mainly from their close western neighbours, but it would be wise to broaden their horizons.

    Singapore: AI for resilience and lifelong learning

    Singapore’s approach to AI in education is rooted in its Smart Nation 2.0 vision, which emphasises the three goals of “Growth, Community and Trust”. The government aims to develop a digitally skilled workforce of 15,000 AI practitioners by 2027, linking education reform to national capability-building. Within this framework, AI pedagogy is closely tied to the idea of social resilience, which is understood in Singaporean policy as the capacity of society to remain cohesive, adaptable, and functional in the face of disruption.

    This vision is implemented through a coordinated ecosystem connecting local universities, AI Singapore (AISG), and the SkillsFuture programme. SkillsFuture uses AI-driven analytics to personalise re-skilling courses, design decision-making simulations, and encourage collaboration between government, industry, and academia. The Centre for Strategic Futures extends this agenda by promoting “AI for personal resilience”, framing digital competence as part of civic participation and collective preparedness.

    Even so, workshop discussions highlighted persistent challenges. Access to elite universities remains uneven, and foreign workers are largely excluded from many lifelong-learning initiatives. Participants also noted that AI training tends to focus on technical ability, leaving less room for ethical debate or critical reflection. To some extent, the drive to innovate has moved faster than efforts to make AI education fully inclusive or reflective.

    Singapore’s experience nonetheless illustrates how AI can be built into the wider social purpose of education. For the UK, it offers a reminder that digital innovation and civic responsibility can reinforce one another when universities treat learning as a public good. Graduates who understand both the capabilities and the limits of AI are better equipped to navigate complex socio-political and technological environments. When built into lifelong-learning systems, AI education helps create the networks of knowledge and trust that make societies more adaptable and resilient.

    India: AI for inclusivity and vernacular access

    If Singapore shows what is possible through tight coordination in a small, centralised system, India demonstrates how the same principles are tested when applied across a country of continental scale and diversity. India’s National Education Policy (NEP) 2020 sets out a comprehensive vision for transforming the education system to meet the demands of a rapidly changing global economy. It aims to raise the higher education gross enrolment ratio to 50% by 2035 and introduces flexible, learner-centred degree structures designed to encourage creativity and critical thinking. Artificial intelligence is central to this reform, “catalysing” both curricular innovation and system-wide modernisation.

    The National Digital Education Architecture (NDEAR) and the AI for All initiative embed AI within educational design and delivery. The University of Kerala’s Four-Year Undergraduate Programme (FYUGP), implemented under the NEP in 2024-25, demonstrates how these reforms are taking shape. AI tools now support continuous assessment, enabling educators to tailor material to individual learning needs and to use more diverse assessment methods. These developments signal a wider shift in pedagogy, from one-off examinations toward continuous and formative evaluation that prioritises understanding and reflection.

    At the heart of the strategy lies India’s focus on linguistic and cultural inclusion. NEP 2020 mandates the use of regional languages in instruction and assessment, aligning with government programmes that promote vernacular content and accessible digital platforms. This multilingual approach helps extend higher education to students previously marginalised by linguistic barriers, while AI-assisted translation and adaptive interfaces further improve access for learners with disabilities.

    As with Singapore’s efforts, however, India’s reform agenda is not without its shortcomings. The NEP reflects the aspirations of a growing middle class and the logic of global competitiveness, raising concerns about commercialisation and uneven implementation, particularly at scale. Still, it represents one of the most ambitious efforts worldwide to connect digital innovation with social justice through deliberate policy design. For the UK, the lesson is clear: technological efficiency must be matched by cultural understanding and genuine inclusion, ensuring that advances in AI expand participation in higher education rather than deepen existing divides.

    Comparative insights for the UK

    Singapore and India approach AI in education from very different starting points, and each offers lessons worth considering. Singapore demonstrates the impact of close coordination between government and universities, supported by steady investment in applied research. India, meanwhile, is emblematic of how digital inclusion can extend beyond elite institutions when policy design takes account of linguistic diversity and regional inequality.

    For the UK, these examples point to a shared message: progress depends on coherence. Many initiatives already exist, from Jisc’s digital capabilities framework to Advance HE’s support for preparing for an AI-enabled future and the Russell Group’s guidance on generative AI, but to date they remain largely disconnected.

    Learning from Singapore and India could help the UK move towards a more consistent approach. That might involve:

    • developing a national framework for AI in higher education that sets clear expectations around ethics and inclusion;
    • funding staff training and digital literacy programmes inspired by Singapore’s emphasis on lifelong learning;
    • supporting multilingual and accessible AI tools that mirror India’s focus on linguistic and regional diversity;
    • building evaluation mechanisms to understand how AI adoption affects equality of opportunity.

    In the end, the challenge is less about technology and more about governance. The UK has the capacity to lead in responsible AI education if policy connects local innovation to a national vision grounded in fairness and public trust.

  • From detection to development: how universities are ethically embedding AI-for-learning

    Author:
    Mike Larsen

    • Today’s blog was kindly authored by Mike Larsen, Chief Executive Officer at Studiosity, a HEPI Partner.

    The future of UK higher education rests upon the assurance of student learning outcomes. While GenAI presents immense opportunities for advancement and efficiency, the sector is constrained by an anachronistic model of plagiarism detection rooted in adversarialism. I believe the ‘Police and Punish’ model must now be replaced by ‘Support and Validate’.

    A reliance upon detection was perhaps once a necessary evil, but it has never aligned with the fundamental values of higher education. The assumption that policing student behaviour is the only way to safeguard standards no longer applies.

    Such a punitive policy model has become increasingly untenable, consuming valuable university resources in unviable investigations and distracting from universities’ core mission. I believe there is a compelling alternative.

    As assessment methods undergo necessary change, higher education institutions must consciously evaluate the risks inherent in abandoning proven means of developing durable critical thinking and communication skills, such as academic writing. New learning and assessment methodologies are required but must be embraced via evidence and concurrently protect the core promise of higher education.

    An emerging policy framework for consideration and research is ‘support and validate’, which pairs timely, evidence-based academic support with student self-validation of authorship and learning.

    Building capability, confidence and competence provides the ideal preparation for graduates to embrace current and future technology in both the workplace and society.

    The combination of established and immediate academic writing feedback systems with advanced authorship and learning validation capabilities creates a robust and multi-layered solution capable of ensuring quality at scale.

    This is an approach built upon detecting learning, not cheating. Higher education leaders may recognise this integrated approach empowers learners and unburdens educators, without compromising quality. It ensures the capabilities uniquely developed by higher education, now needed more than ever, are extended and amplified rather than replaced by techno-solutionism.

    We must build a future where assessment security explicitly prioritises learning, not policing. For UK higher education, a pivot from punishment to capability-building and validation may be the only sustainable way to safeguard the value of the degree qualification.

    Studiosity’s AI-for-Learning platform scales student success at hundreds of universities across five continents, with research-backed evidence of impact. Studiosity has recently acquired Norvalid, a world leader in tech-enabled student self-validation of authorship and authentic learning, shifting how higher education approaches assessment security and learning.

  • Embedding AI, not bolting it on

    Today’s blog was kindly authored by Janice Kay, Director at Higher Futures, and former Provost and Senior Deputy Vice-Chancellor at the University of Exeter.

    We must become AI-first institutions.

    The Office for Students recently made a welcome intervention, encouraging universities to make “bold changes to adapt in this increasingly challenging environment.” And they’re right.

    But it raises the question: why aren’t we being bold? Why, as a sector, do we tend to squeeze AI tools into what we already do, instead of using this moment to completely rethink how we teach, how we support students, and how we assess?

    In short: how do we give our educators the confidence and the skills to think differently?

    Deliberate, purposeful workforce planning

    My argument is that we need to move from this slow, piecemeal adaptation towards something much more deliberate: an AI-first approach to workforce planning across the whole institution.

    Every role should have a clear expectation of AI competence, and every member of staff should be supported to reach it. That means embedding AI capability as a core institutional priority, not an afterthought. And yes, that also means some traditional job roles will change dramatically, and some will disappear altogether through automation.

    Where do we start? We start by understanding where we are. AI competency isn’t about everyone becoming a data scientist; it’s about understanding the basics: what AI and large language models actually are, what they can and can’t do, how AI-driven analytics work, and how to use prompts effectively – from simple to sophisticated requests.

    Embedding digital skills into professional growth

    There are already some great examples of this kind of thinking. Take the University of Exeter. As part of its new Digital Strategy, it’s been assessing staff confidence and motivation in using digital tools. Over 41% of staff have engaged so far, with 778 self-assessments completed, a great base for building digital confidence across the organisation. But this also shows the need to be specific: the skills an educator needs are not the same as those of a programme administrator, or a student welfare advisor.

    Once we’ve established those levels of competency, the next step must be a serious, well-supported development programme. For example, educators might want to learn how to use AI tools that generate automated feedback, analyse discussion forums, or predict student engagement and dropout risk. Institutions can and should create incentives for staff to develop these skills. That might be through micro-credentials, workload allocation, and even promotion criteria. And, crucially, people need time – time to experiment, play, fail and learn. AI proficiency shouldn’t be an optional extra. It should be part of the job.

    We also need to be intentional about developing AI leaders. We can’t just leave it to chance. We should be identifying and empowering the people, both academics and professional staff, who can critically evaluate new technologies and embed them in ways that are ethical, pedagogically sound, and discipline-specific. These are the people who can bring real meaning to personalisation in learning. And AI fluency shouldn’t just mean technical know-how. It needs to sit alongside learning science, assessment integrity and data ethics. As the recent Skills England report put it, we need technical, non-technical and responsibility skills.

    AI as a foundation, not a feature

    Ultimately, this is about structural change. We need to transform the AI competence of the higher education workforce, but that transformation must go together with how our institutions use AI and digital technologies themselves.

    AI systems should be built into academic and student workflows, not bolted on.

    The Kortext–Saïd partnership is a great example of this. It’s helping academics reimagine learning so that it becomes genuinely personalised. Embedding an AI assistant right into the virtual learning environment is reshaping how modules, materials and assessments are designed.

    As Mark Bramwell, CDIO of Saïd Business School, put it, the partnership is:

    empowering our faculty and learning designers to create smarter, data-driven courses and giving our students a more adaptive, hyper-personalised and engaging learning environment.

    That’s exactly the kind of bold partnership we need more of: projects that not only enhance teaching and learning, but also build the AI skills and confidence our workforce needs to thrive in the future. What I want to do is move past the broad debate about whether we should adopt AI technologies. The question isn’t just whether we adopt AI in higher education, but how, especially when it comes to our workforce.

    Join Janice Kay, Mark Bramwell and other key sector voices at Kortext LIVE on 11 February 2026 to discuss ‘Leading the next chapter of digital innovation’. Find out more and secure your seat.

  • Skills and employability: embedding to uncovering

    Author:
    Claire Toogood

    • The blog below was kindly authored for HEPI by Claire Toogood, Research and Strategic Projects Manager at AGCAS.

    In recent years, UK higher education has made significant strides in embedding employability into the curriculum. From frameworks and toolkits to strategic initiatives, the sector has embraced the idea that employability is not an add-on, but a core element and outcome of any academic course. Yet, as the new Uncovering Skills report reveals, embedding is only part of the story. A future challenge, and significant opportunity for impact, lies in helping students uncover, recognise, and articulate the skills they are already developing.

    This project, led by AGCAS and supported by Symplicity, draws on three national datasets: focus groups, event survey data, and applications to the inaugural Academic Employability Awards. It builds on foundational work such as Kate Daubney’s concept of “Extracted Employability”, which reframed employability as something inherent in academic learning, not externally imposed. Alongside sector-wide efforts like the AGCAS Integrating Employability Toolkit and Advance HE’s embedding employability framework, and institutional contributions like Surfacing Skills and Curriculum Redefined at the University of Leeds, Daubney’s work lays the groundwork for a more inclusive and intentional approach to employability.

    The latest findings from the Uncovering Skills project suggest that visibility, confidence, and perceived relevance remain major barriers. Students often struggle to recognise the value of their informal experiences (such as part-time work, caring responsibilities, and volunteering) and they can lack the language to describe these in employer-relevant terms. As one focus group participant noted, ‘Students often think if it’s not linked to their degree then it is not relevant.’ Another added, ‘They disregard skills gained from everyday life – like being a parent or managing during Covid.’ Reflection is critical to resolving this, but it is inconsistently supported across higher education. Time-poor students tend to engage only when prompted by immediate needs, such as job applications. ‘Reflection from the student perspective doesn’t become a need until they’ve got an interview,’ said one participant. Others highlighted that ‘self-reflection and deeper knowledge of skills is where students fall down… poor preparation in earlier education is a factor.’

    The report also highlights that some student cohorts face compounded challenges. International students, disabled students, and those from widening participation backgrounds require tailored and targeted support to uncover and express their strengths. Institutional collaboration with career development experts is essential, yet reflections from careers professionals involved in the project show that they are not always included in curriculum design, and staff who champion employability often lack recognition, no matter where they are employed within their institution.

    Technology, including AI, offers new possibilities, but also risks encouraging superficial engagement if not used intentionally. ‘Rather than learning what these skills are and having to articulate them, they just abdicate that responsibility to AI,’ warned one contributor. Another observed that students are ‘superficially surfing through university – not as connected to skills development’. The Uncovering Skills report includes a series of case studies that explain how careers professionals and academic staff at AGCAS member institutions are tackling these multiple challenges.

    So, what needs to change?

    The report makes six recommendations:

    1. Make skills visible and recognisable: Use discipline-relevant language and real-world examples to help students connect academic learning to transferable skills.
    2. Support students to uncover skills across contexts: Validate informal and non-traditional experiences as legitimate sources of skill development. Embed reflection opportunities throughout.
    3. Equip staff to facilitate skills recognition: Provide training, shared frameworks, and recognition for staff supporting students in uncovering and articulating their skills.
    4. Use technology to enhance, not replace, reflection: Promote ethical, intentional use of AI and digital tools to support self-awareness and skill articulation.
    5. Tailor support to diverse student needs: Design inclusive, flexible support that reflects the lived experiences and barriers faced by different student cohorts.
    6. Foster a culture of skill recognition across the institution: Embed uncovering skills into institutional strategy, quality processes, and cross-functional collaboration.

    The report includes a call to action, stressing that it is time to build on excellent work to embed and integrate employability by fully supporting students to uncover and articulate their skills. This includes ensuring that all students can equitably access the tools, language, and support they need to succeed. It means creating environments where students feel confident recognising and expressing their skills, whether gained in academic settings, extra-curricular spaces, or lived experience, and championing equity by validating all forms of learning. It also means investing in staff development and cross-functional collaboration.

    Uncovering skills is a shared responsibility, and a powerful opportunity, to transform how students understand themselves, their experiences and learning, and their future.

  • From Detection to Development: How Universities Are Ethically Embedding AI for Learning 

    This HEPI blog was authored by Isabelle Bristow, Managing Director UK and Europe at Studiosity, a HEPI Partner.  

    The Universities UK Annual Conference always serves as a vital barometer for the higher education sector, and this year, few topics were as prominent as the role of Generative Artificial Intelligence (GenAI). A packed session, Ethical AI in Higher Education for improving learning outcomes: A policy and leadership discussion, provided a refreshing and pragmatic perspective, moving the conversation beyond academic integrity fears and towards genuine educational innovation. 

    Based on early findings from new independent research commissioned by Studiosity, the session’s panellists offered crucial insights and a clear path forward. 

    A new focus: from policing to pedagogy 

    For months, the discussion around GenAI has been dominated by concerns over academic misconduct and the development of detection tools. However, as HEPI Director Nick Hillman OBE highlighted, this new report takes a different tack. Its unique focus is on how AI can support active learning, rather than just how students are using it.

    The findings, presented by independent researcher Rebecca Mace, show a direct correlation between the ethical use of AI for learning and improved student attainment and retention. Crucially, these positive effects were particularly noticeable among students often described as ‘non-traditional’. This reframes the conversation, positioning AI not as a threat to learning but as a powerful tool to enhance it, especially for those who need it most. 

    The analogy that works 

    The ferocious pace of AI’s introduction to the sector has undoubtedly caught many off guard. Professor Marc Griffiths, Pro-Vice Chancellor for Regional Partnerships, Engagement & Innovation at UWE Bristol, acknowledged this head-on, advocating for a dual approach of governance and ‘sandboxing’ new technologies (the security practice of isolating and testing an application, system or platform to make sure it is safe). Instead of simply denying access, he argued, we must test new tools and develop clear guardrails for their use.

    In a welcome departure from the widely used but ultimately flawed calculator analogy (see Generative AI is not a ‘calculator for words’: 5 reasons why this idea is misleading), Professor Griffiths offered a more fitting one: the overhead projector. Like PowerPoint today, the projector was a new technology that served as a conduit for content, but it never replaced the core act of teaching and learning itself. AI, he posited, is simply another conduit. It is what we put into it, and what we get out of it, that matters.

    Evidenced insights and reframing the conversation 

    The panel also grappled with the core questions leaders must ask themselves. Stephanie Harris, Director of Policy at Universities UK, posed two fundamental challenges:

    • How can I safeguard my key product that I am offering to students? 
    • How can I prepare my students for the workforce if I don’t yet know how AI will be used in the future? 

    She stressed the importance of protecting the integrity of the educational experience to prevent an ‘erosion of trust’ between students and institutions. In response to the second question, both Steph and Marc emphasised that the answer lies not in specific tech skills, but in timeless critical thinking skills that will prepare students not just for the next three years, but for the next 15. The conversation also touched upon the need for universities to consider students under 16 as the future pipeline, ensuring our policies and frameworks are future-proof. Steph mentioned further prompts for leaders to think about, as listed in the UUK-authored OfS blog Embracing innovation in higher education: our approach to artificial intelligence, which she summed up with the commonsense shorthand ‘have fun, don’t be stupid!’.

    The session drove home the importance of evidence-based insights. Dr David Pike, Head of Digital Learning at the University of Bedfordshire, shared key findings from his own research comparing student outcomes for Studiosity users with those of non-users, stating that the results were ‘very clear’ that students did improve at scale. He provided powerful data showing significant measurable academic progress, along with a large positive correlational impact on retention and progression. Dr Pike concluded that, given this demonstrated positive impact, we should be calling the technology ‘Assisted Intelligence’, because when used correctly, that is exactly what it is.

    A guiding framework of values 

    To navigate this new landscape, Professor Griffiths laid out seven core values that must underpin institutional policy on AI: 

    1. Academic integrity: Supporting learning, not replacing it.
    2. Equity of access: Addressing the real challenge of paywalls.
    3. Transparency: Clearly communicating how students will be supported.
    4. Ethical responsibility
    5. Empowerment and capability building
    6. Resilience
    7. Adaptability

    These values offer a robust framework for leaders looking to create policies that are both consistent and fair, ensuring that AI use aligns with a university’s mission. 

    The policy challenge of digital inequality 

    The issue of equity of access was explored in greater detail by Nick Hillman, who connected the digital divide to the broader student funding landscape. He pointed out that no government has commissioned a proper review of the actual cost of being a student since 1958, even though modern student life can cost upwards of £20,000 annually for a student who wants to involve themselves fully. He made a powerful case for increased maintenance support to match an increased tuition fee, which would also help prevent further disparity between those who can afford premium tech tools and those who cannot. This highlights that addressing digital inequality is not just a technical challenge; it is a fundamental policy one too.

    In closing 

    The session’s core message was clear: while the rise of AI has been rapid, the sector’s response does not have to be only reactive. By embracing a proactive, values-led approach that prioritises ethical development, equity and human-centric learning, universities can turn what was once seen as a threat into a powerful catalyst for positive change. 

    Studiosity is AI-for-Learning, not corrections: built to scale student success, empower educators, and improve retention, while ensuring integrity and reducing institutional risk.
