Tag: Practice

  • Generative AI and the REF: closing the gap between policy and practice


    This blog was kindly authored by Liam Earney, Managing Director, HE and Research, Jisc.

    The REF-AI report, funded by Research England and co-authored by Jisc and the Centre for Higher Education Transformations (CHET), was designed to provide evidence to help the sector prepare for the next REF. Its findings show that Generative AI is already shaping the approaches that universities adopt. Some approaches are cautious and exploratory, some are inventive and innovative, and most of this activity is happening quietly in the background. GenAI in research practice is no longer theoretical; it is part of the day-to-day reality of research, and of research assessment.

    For Jisc, some of the findings in the report are unsurprising. We see every day how digital capability is uneven across the sector, and how new tools arrive before governance has had a chance to catch up. The report highlights an important gap between emerging practice and policy – a gap that the sector can now work collaboratively to close. UKRI has already issued guidance on generative AI use in funding applications and assessment, emphasising honesty, rigour, transparency, and confidentiality. Yet the REF context still lacks equivalent clarity, leaving institutions to interpret best practice alone. This work was funded by Research England to inform future guidance and support, ensuring that the sector has the evidence it needs to navigate GenAI responsibly.

    The REF-AI report rightly places integrity at the heart of its recommendations. Recommendation 1 is critical to support transparency and avoid misunderstandings: every university should publish a clear policy on using Generative AI in research, and specifically in REF work. That policy should outline what is acceptable and require staff to disclose when AI has helped shape a submission.

    This is about trust and about laying the groundwork for a fair assessment system. At present, too much GenAI use is happening under the radar, without shared language or common expectations. Clarity and consistency will help maintain trust in an exercise that underpins the distribution of public research funding.

    Unpicking a patchwork of inconsistencies

    We now have insight into real practice across UK universities. Some are already using GenAI to trawl for impact evidence, to help shape narratives, and even to review or score outputs. Others are experimenting with bespoke tools or home-grown systems designed to streamline their internal processes.

    This kind of activity is usually driven by good intentions. Teams are trying to cope with rising workloads and the increased complexity that comes with each REF cycle. But when different institutions use different tools in different ways, the result is not greater clarity. It is a patchwork of inconsistent practices and a risk that those involved do not clearly understand the role GenAI has played.

    The report notes that most universities still lack formal guidance and that internal policy discussions are only just beginning. In fact, practice has moved so far ahead of governance that many colleagues are unaware of how much GenAI is already embedded in their own institution’s REF preparation, or for professional services, how much GenAI is already being used by their researchers.

    The sector digital divide

    This is where the sector can work together, with support from Jisc and others, to help narrow the divide that exists. The survey results tell us that many academics are deeply sceptical of GenAI in almost every part of the REF. Strong disagreement is common and, in some areas, reaches seventy per cent or more. Only a small minority sees value in GenAI for developing impact case studies.

    In contrast, interviews with senior leaders reveal a growing sense that institutions cannot afford to ignore this technology. Several Pro Vice Chancellors told us that GenAI is here to stay and that the sector has a responsibility to work out how to use it safely and responsibly.

    This tension is familiar to Jisc. GenAI literacy is uneven, as is confidence, and even general digital capability. Our role is to help universities navigate that unevenness. In learning and teaching, this need is well understood, with our AI literacy programme for teaching staff well established. The REF AI findings make clear that similar support will be needed for research staff.

    Why national action matters

    If we leave GenAI use entirely to local experimentation, we will widen the digital divide between those who can invest in bespoke tools and those who cannot. The extent to which institutions can benefit from GenAI is tightly bound to their resources and existing expertise. A national research assessment exercise cannot afford to leave that unaddressed.

    We also need to address research integrity, and that should be the foundation for anything we do next. If the sector wants a safe and fair path forward, then transparency must come first. That is why Recommendation 1 matters. The report suggests universities should consider steps such as:

    • define where GenAI can and cannot be used
    • require disclosure of GenAI involvement in REF related work
    • embed these decisions into their broader research integrity and ethics frameworks

    As the report notes, current thinking about GenAI rarely connects with responsible research assessment initiatives such as DORA or CoARA. That gap has to close.

    Creating the conditions for innovation

    These steps do not limit innovation; they make innovation possible in a responsible way. At Jisc we already hear from institutions looking for advice on secure, trustworthy GenAI environments. They want support that will enable experimentation without compromising data protection, confidentiality or research ethics. They want clarity on how to balance efficiency gains with academic oversight. And they want to avoid replicating the mistakes of early digital adoption, where local solutions grew faster than shared standards.

    The REF AI report gives the sector the evidence it needs to move from informal practice to a clear, managed approach.

    The next REF will arrive at a time of major financial strain and major technological change. GenAI can help reduce burden and improve consistency, but only if it is used transparently and with a shared commitment to integrity. With the right safeguards, GenAI could support fairness in the assessment of UK research.

    From Jisc’s perspective, this is the moment to work together. Universities need policies. Panels need guidance. And the sector will need shared infrastructure that levels the field rather than widening existing gaps.

    Source link

  • Reflections From Six Weeks of Practice – Teaching in Higher Ed


    This post is one of many related to my participation in Harold Jarche’s Personal Knowledge Mastery workshop.

    I love to walk. Sometimes I do it alone (almost always listening to either music or podcasts), though most often walks these days are facilitated by an invitation from one of our kids to go for an evening walk. I’m at the POD25 conference, so have been missing my night time walks. Right now, I’m holed up in my hotel room, doing some reflecting, writing, and a bit of grading.

    Instead of feeling guilty, I’m overwhelmed with supportive messages about how healthy this is. First, let’s start with walking. Rebecca Solnit writes in Wanderlust: A History of Walking about this practice:

    Thinking is generally thought of as doing nothing in a production-oriented society — and doing nothing is hard to do. It’s best done by disguising it as doing something, and the something closest to doing nothing is walking.

    The pull to keep producing and soaking in every bit of ROI from my university paying for this trip is strong (not because of them, I should say, but because of my own sense of needing to “get the most out of limited budget dollars”). Yet, learning cannot be perfectly quantified in terms of financial metrics, despite corporations’ and governments’ strong desire to do so. Jarche reminds us of the importance of leaving room for time and context to enrich our learning.

    We cannot tap into our innovative capacities without being open to radical departures from the predictable, planned path (an example of which might be the typical professional conference schedule). And yes, sometimes that means not engaging in every planned session at a conference, like the one I’m participating in this week.

    Jarche writes:

    Creative work is not routine work done faster. It’s a whole different way of work, and a critical part is letting the brain do what it does best — come up with ideas. Without time for reflection, most of those ideas will get buried in the detritus of modern workplace busyness.

    As we wrap up our time together, Jarche invites those of us participating in his Personal Knowledge Mastery workshop to reflect on our experience these past six weeks. Here I go, in responding to his questions:

    Q. What was the most useful concept I learned from this workshop?

    A. It wasn’t really a concept, rather a practice. I benefitted by committing to a regular writing practice throughout the workshop, which provided opportunities for rich reflection and deepened learning. The structure of the workshop allowed for that to take place (plus me being a person who is a bit of a completist and wanting to blog through all 18 of the opportunities for reflection and activity that Harold provided).

    Q. What was the most surprising concept that has changed my thinking about PKM?

    A. I had seen Jarche write about McLuhan’s media tetrad in the past, but didn’t slow myself down enough to absorb much of anything, at the time. However, given my commitment to practice PKM throughout this experience, I wrote about the concept for the first time, and even shared the framework as a part of a keynote I gave a month or so ago.

    A diamond-shaped diagram illustrating McLuhan’s media tetrad. The center diamond is labeled “Medium.” Four surrounding diamonds describe its effects: the top says “Obsolesces — a previous medium,” the right says “Retrieves — a much older medium,” the bottom says “Reverses — its properties when extended to its limits,” and the left says “Extends — a human property.” The image is adapted from jarche.com

    During the keynote, I couldn’t remember the word “tetrad,” when the idea came up later in the talk (as in after the slide had long since disappeared). I had attempted to come up with a word association on the plane ride out to Michigan, but it had failed me, in that moment.

    “Think of the old arcade game, Tetris, plus something being ‘rad’ (like in the 80s),” I told myself. I was definitely learning out loud and performing retrieval practice in real time, as I eventually cobbled together audience participation input and finally got myself there.

    A few things I’ve learned about myself, cognitive science, and other human beings remind me of these principles. For starters, my embarrassment at not knowing, but still struggling through to the side of knowing, means I’m unlikely to forget the word in the future. Plus, people aren’t looking for other humans to be perfect. It is through our vulnerability and relatability that we might most often have an opportunity to make an impact on others. At least I believe that may be the case for me, as being the expert isn’t my primary role in this world, I don’t think. I would rather be known as someone who is curious, which I’ve heard enough times to start to believe is true.

    Q. What will be the most challenging aspect of PKM for me?

    A. I still need to learn more about the concepts and frameworks for navigating complexity, including one I’ve come across in the past but never got much further than confusion with: cynefin. Jim Luke (who I met a gazillion years ago at an OpenEd conference) has offered to share his wisdom about cynefin with Kate Bowles and me sometime in the next couple of months. He replied to me on Mastodon about cynefin:

    I find it a very useful heuristic in thinking about community, higher ed, any activities that are organized and care-centered, etc.

    This exchange wouldn’t have occurred, had it not been for Harold structuring the PKM workshop around engaging on Mastodon, by the way. This is going to be a gift that keeps on giving, I believe. While my connections there are still small in number, they are strong with competence, care, and creativity.

    I’m glad that I can now pronounce cynefin without first locating an audio clip of someone else saying it. I’m useless at phonetic spelling, so that stuff doesn’t often help me in the slightest. I do still have to look up how to spell it each time. My brain feels slower with the learning when a word is pronounced differently than it is spelled. I still have to occasionally slow myself way down when spelling my own last name, so I won’t let myself feel too bad about still not being able to spell cynefin without help.

    Q. Where do I hope to be with my PKM practice one year from now?

    A. I would like to be in a more regular practice of blogging a year from now. I tend to save up blog post ideas that are super laborious for me (at least the way I approach the task, in those cases). I like doing posts for Jane Hart’s Top Tools 4 Learning votes (like my top ten votes from 2025). But given how extensively I write and link in those posts, they take many hours to complete. I also have enjoyed doing top podcast posts, drawing inspiration from Bryan Alexander’s wonderful posts, like this one about the podcasts he was listening to in late 2024.

    My post from late 2024 about what Overcast told me I had listened to the most that year was less time-consuming to write than ones I had done in the past. But I felt weird using total minutes listened as my only barometer, when I think other podcasts are far more worthy of acknowledgement than some of the ones I wound up listening to the most that year. This 2021 Podcast Favorites post took forever to write and curate, but is more emblematic of the ways I would most like to celebrate all the incredible podcasts that are out there (or at least were publishing, at the time I wrote it).

    If I put some creative constraints on myself, in terms of the time I would allow myself to commit to any individual post, I suspect I would have a lot more success with this aspect of PKM. I so appreciate the way that Alan Levine, Maha Bali, and Kate Bowles write in more reflective, informal ways. I’ve been pushing myself throughout this workshop to just get the ideas I’m having in the moment out there, to tell stories that are snapshots of my sensemaking processes, and to be human and allow myself to show up in the messiness that is indicative of the learning process.

    Gratitude

    My deepest gratitude goes to Harold Jarche for such a well-designed, impactful learning experience through his Personal Knowledge Mastery workshop. I had been telling myself for years that I would do it at some point, and finally realized that there wasn’t ever going to be a “good time” with six weeks free of something big happening (conferences, speaking gigs, etc.). So Harold has traveled with me on airplanes, sat with me in airports, and is currently in my hotel room in San Diego at the POD 2025 conference. Metaphorically speaking, of course. As far as I know, he is in Canada right now, though I am not surveilling him, and he does seem to travel a lot, at least compared to me.

    I’m also feeling thanks for those people who allow themselves to learn out loud and take the risks of being openly curious and worrying less about being “right” or “perfect” all the time.

    Source link

  • Making creative practice research visible


    I still remember walking into my first Association of Media Practice Educators conference, sometime around the turn of the millennium.

    I was a very junior academic, wide-eyed and slightly overwhelmed. Until that point, I’d assumed research lived only in books and journals.

    My degree had trained me to write about creative work, not to make it.

    That event was a revelation. Here were filmmakers, designers, artists, and teachers talking about the doing as research – not as illustration or reflection, but as knowledge in its own right. There was a sense of solidarity, even mischief, in the air. We were building something together: a new language for what universities could call research.

    When AMPE eventually merged with MeCCSA – the Media, Communication and Cultural Studies Association – some of us worried that the fragile culture of practice would be swallowed by traditional academic habits. I remember standing in a crowded coffee queue at that first joint conference, wondering aloud whether practice would survive.

    It did. But it’s taken twenty-five years to get here.

    From justification to circulation

    In the early days, the fight was about legitimacy. We were learning to write short contextual statements that translated installations, performances, and films into assessable outputs. The real gatekeeper was always the Research Excellence Framework. Creative practice researchers learned to speak REF – to evidence, contextualise, and theorise the mess of creative making.

    Now that argument is largely won. REF 2021 explicitly recognised practice research. Most universities have templates, repositories, and internal mentors to support it. There are still a few sceptics muttering about rigour, but they’re the exception, not the rule.

    If creative practice makes knowledge, the challenge today is not justification. It’s circulation.

    Creative practice is inherently cross-disciplinary. It doesn’t sit neatly in the subject silos that shape our academic infrastructure. Each university has built its own version of a practice research framework – its own forms, repositories, and metadata – but the systems don’t talk to one another. Knowledge that begins in the studio too often ends up locked inside an institutional database, invisible to the rest of the world.

    A decade of blueprints

    Over the past few years, a string of national projects has tried to fix that.

    PRAG-UK, funded by Research England in 2021, mapped the field and called for a national repository, metadata standards, and a permanent advisory body. It was an ambitious vision that recognised practice research as mature and ready to stand alongside other forms of knowledge production.

    Next came Practice Research Voices and SPARKLE in 2023 – both AHRC-funded, both community-driven. PR Voices, led by the University of Westminster, tested a prototype repository built on the Cayuse platform. It introduced the idea of the practice research portfolio – a living collection that links artefacts, documentation, and narrative. SPARKLE, based at Leeds with the British Library and EDINA, developed a technical roadmap for a national infrastructure, outlining how such a system might actually work.

    And now we have ENACT – the Practice Research Data Service, funded through UKRI’s Digital Research Infrastructure programme and again led by Westminster. ENACT’s job is to turn all those reports into something real: a national, interoperable, open data service that makes creative research findable, accessible, and reusable. For the first time, practice research is being treated as part of the UK’s research infrastructure, not a quirky sideshow to it.

    A glimpse of community

    In June 2025, Manchester Metropolitan University hosted The Future of Practice Research. For once, everyone was in the same room – the PRAG-UK authors, the SPARKLE developers, the ENACT team, funders, librarians, and plenty of curious researchers. We swapped notes, compared schemas, and argued cheerfully about persistent identifiers.

    It felt significant – a moment of coherence after years of fragmentation. For a day, it felt like we might actually build a network that could connect all these efforts.

    A few weeks later, I found myself giving a talk for Loughborough University’s Capturing Creativity webinar series. Preparing for that presentation meant gathering up a decade of my own work on creative practice research – the workshops I’ve designed, the projects I’ve evaluated, the writing I’ve done to help colleagues articulate their practice as research. In pulling all that together, I realised how cyclical this story is.

    Back at that first AMPE conference, we were building a community from scratch. Today, we’re trying to build one again – only this time across digital platforms, data standards, and research infrastructure.

    The policy challenge

    If you work in research management, this is your problem too. Practice research now sits comfortably inside the REF, but not inside the systems that sustain the rest of academia. We have no shared metadata standards, no persistent identifiers for creative outputs, and no national repository.

    Every university has built its own mini-ecosystem. None of them connect.

    The sector needs collective leadership – from UKRI, the AHRC, Jisc, and Universities UK – to treat creative practice research as shared infrastructure. That means long-term funding, coordination across institutions, and skills investment for researchers, librarians, and digital curators.

    Without that, we’ll keep reinventing the same wheel in different corners of the country.

    Coming full circle

    Pulling together that presentation for Capturing Creativity reminded me how far we’ve come – and how much remains undone. We no longer need to justify creative practice as research. But we still need to build the systems, the culture, and the networks that let it circulate.

    Because practice research isn’t just another output type. It’s the imagination of the academy made visible.

    And if the academy can’t imagine an infrastructure worthy of its own imagination, then we really haven’t learned much from the last twenty-five years.

    Source link

  • New three-tier visa processing practice starts – Campus Review


    The federal government will reward universities that enrol international students in line with allocated numbers under a new visa processing practice to begin in November.


    Source link

  • Tertiary Collaborations in Practice: what partnerships between colleges and universities actually do and where to go next



    This blog was kindly authored by Josh Patel (@joshpatel.bsky.social), Senior Researcher at the Edge Foundation.

    The Prime Minister’s new target is for two-thirds of young people to participate in higher-level learning by age 25. This encompasses not only undergraduate degrees but also higher technical education and apprenticeships, all delivered under a single funding model for all Level 4-6 courses. Some have described this as England’s turn to tertiary, six years after the Augar Review called for a more ‘joined-up system’.

    Since at least the 1990s, English post-secondary education has been characterised by market-based regulatory apparatus and fragmentation. Further education is associated with technical and vocational education, and training and entry to the labour market; higher education with professions, leadership, and research. Oversight of both is dispersed across multiple agencies and further disconnected from adult and lifelong learning. Critics have argued that, consequently, market logics have sustained wasteful competition and produced a homogenised system that privileges higher education over further education, to the detriment of equity and national skills needs.

    If Augar exposed the limits of market-driven differentiation between further education and higher education, tertiary approaches in the devolved nations illustrate how greater collaboration and integrated oversight offer a potential corrective. Wales and Scotland have advanced considerably in a ‘tertiary’ direction and developed governance modes that exercise holistic stewardship over funding and quality regimes. They are justified on grounds of efficiency, concertedness, and the capacity to advance the common good. In Wales, Medr uses its statutory powers under the Well-being of Future Generations Act to guide institutions in meeting duties on equality, sustainability, and civic mission. In Scotland, the Scottish Funding Council leads the Outcome Agreement process, through which colleges and universities set out activities in return for funding. Even in England, partnerships at a regional level, such as those in the North East or through Institutes of Technology, aim to facilitate partnerships to align lifelong learning with local economic needs. In 2021, the last time a representative survey of the scale of collaboration took place, 80% of colleges and 50% of universities in the UK had formal programme links (and it is likely that collaboration has grown since then).

    Despite this prevalence and enthusiasm, research on how the benefits arising from tertiary collaboration manifest at the level of institutions and students is limited. In a short exploratory study with the Edge Foundation, I examined one facet of tertiary systems in Scotland and England: the creation of formal student transition ‘pathways’ between colleges and universities. The aim was not a comprehensive survey, but to sample something of the nature of collaboration in existing systems, to gather evidence to think with and about the concept of tertiary and the place of collaboration and competition.

    Collaboration as an adaptive strategy

    Existing collaborations are, perhaps surprisingly, not foremost concerned with any given common good. Instead, collaboration often emerges as an adaptive strategy within conditions of resource scarcity. Local ‘natural alliances’ in shared specialisms, mutual values, and commitments to widening participation were important in establishing trust necessary to sustain joint work. Yet, as the study found, institutional precarity is the principal driver.

    One Scottish interviewee put it plainly:

    ‘If I’m sitting there and I’ve got 500 applications, like 10 applications for any place, I’ve got good, strong applications. I’m not going to be going, right, how am I going to look at different ways to bring in students?’

    Well-resourced institutions do not collaborate out of necessity: those under pressure do. Partnerships often take the form of a ‘grow your own’ recruitment pipeline, guaranteeing transitions between partner institutions. Universities could ask ‘some tough questions’ of colleges if progression was lower than anticipated. In some cases, institutions agree to partition markets to avoid directly competing for the same students.

    Collaboration could also be used as an instrument of competition. In Scotland, articulation agreements (under which universities recognise vocational qualifications such as HNDs and HNCs and admit students with advanced standing) are commonplace. Colleges in this research reported ‘some bad behaviour’ where partner universities would use these agreements to siphon off students from colleges to secure enrolment numbers. This was contrary to the wishes of colleges, which argued that many such students might benefit from the more intimate and supportive college environment for an additional year, better preparing them to enter the more independent learning environment of university.

    What collaboration offers students

    Where collaboration was stable, tangible benefits followed for learners. Partnerships combine colleges’ attentive pedagogies and flexible resources with university accreditation and facilities. This enables smaller class sizes, greater pastoral attention, and sometimes improved retention and progression, particularly in educational cold spots. Colleges bring local specialisms and staff expertise, often linked to industry, which enrich university courses through co-design and joint delivery.

    This lends cautious support to the claims of tertiary advocates: that collaboration can widen access and enhance provision. Yet formal, longitudinal evidence of graduate outcomes remains rare. The value of such partnerships, their distinctiveness, public benefit, and contribution to regional prosperity need to be more readily championed.

    From expedient to strategic collaboration

    As an instrument, collaboration is worth understanding. The capacity to facilitate collaboration as a strategic good is an important policy lever where market mechanisms are unable to respond immediately or efficiently to the imperatives of national need and public finance. The study suggests four priorities for policymakers:

    Clarify national priorities and reform incentives

    Collaboration has greater utility than as an institutional survival tool. With funding for further education and higher education brought together, there is an opportunity to create stability. Together with the clear articulation of long-term educational goals, strategic cooperation in pursuit of these ends could be sustained.

    Strengthen regional governance

    Where regional stewardship exists, through articulation hubs or in Scotland’s Outcome Agreements, collaboration is more systematic. England’s existing fragmented oversight and policy churn undermine this. Regional coherence enables institutions to collectively make strategic planning decisions.

    Value colleges’ distinct niche

    Colleges’ localism, technical capacity, and pedagogical expertise are distinctive assets. Policy should promote these specialisms and encourage co-design and co-delivery rather than hierarchical franchising. Partnerships should foreground each institution’s unique contribution, not replicate the same provision in different guises.

    Improve data sharing and evaluation

    The absence of mechanisms to track students’ journeys and long-term outcomes, including ‘distance travelled’ evaluations, makes claims about distinctiveness and public benefit harder to substantiate.

    Tertiary turns in resource scarcity

    Policy discourse has tended to over-dichotomise competition and collaboration. The question is to what extent each strategy helps achieve agreed social ends. Where partnership is an appropriate mechanism, it requires a policy architecture with clarity of purpose and stability. The ends to which collaboration is put must be settled through democratic means – a more complicated question altogether.

    The full report can be read here.

    Source link

  • From Half-Baked to Well-Done: Building a Sensemaking Practice


    This post is one of many related to my participation in Harold Jarche’s Personal Knowledge Mastery workshop.

    Making Sense through Sensemaking

    Sensemaking is an essential part of one’s personal knowledge mastery, so vital that it ought to be a regular practice for any human, particularly those who desire to be taken seriously and be able to add value in workplaces, communities, and societies. Sensemaking centers on a desire to solve problems and gets fueled by curiosity.

    Jarche shares that there’s a whole spectrum of potential sensemaking approaches, everything from filtering information (making a list) to contributing new information (writing a thesis). Sensemaking requires practice and vulnerability. We aren’t always going to get things right the first time we come to a conclusion.

    Half-Baked Ideas

    In introducing the idea of “half-baked ideas,” Jarche writes:

    If you don’t make sense of the world for yourself, then you’re stuck with someone else’s world view.

    As I reflect on my own ability to come up with half-baked ideas, whether I’m inclined to share one in a social space depends on how controversial the idea is. I find myself thinking about what hashtags, or even words, might attract people looking for an internet fight or wanting to troll a stranger.

    If a half-baked idea is related to teaching and learning, I am less concerned about who may want to publicly disagree with it. But if it is about politics, I just don’t see the value in “thinking aloud,” given what internet riff-raff may decide to throw at me, metaphorically speaking. Part of that is that I’m not an expert; another part of this resistance is that I would rather do this kind of sensemaking offline – at least when trying out ideas about various policies, political candidates, and issues of the day.

    Committing to Practice

    I just launched a sensemaking practice involving books about teaching and learning. I read upwards of 95% of the books written by the authors I interview for the Teaching in Higher Ed podcast. However, I would like both to find other ways to surface my own learning from all that reading and to cultivate a set of skills to get better at video.

    The series is called Between the Lines: Books that Shape Teaching and Learning, and I anticipate eventually working up to an average of one video per week. I won’t hold myself to quite as high expectations as I do for the podcast – I’ve aired an episode every single week since June 2014 – and I don’t want that kind of self-imposed pressure for this experimentation, skill-building, and sensemaking practice.

    The first video is about how small shifts in our teaching make college more equitable and explores three key ideas from David Gooblar’s book, One Classroom at a Time: How Better Teaching Can Make College More Equitable. I hope you’ll consider watching it and giving me some encouragement to keep going or suggestions for how to make them more effective.

    Source link

  • Science of Reading Training, Practice Vary, New Research Finds – The 74

    Science of Reading Training, Practice Vary, New Research Finds – The 74


    Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter

    North Carolina is one of several states that have passed legislation in recent years to align classroom reading instruction with the research on how children learn to read. But ensuring all students have access to research-backed instruction is a marathon, not a sprint, said education leaders and researchers from across the country on a webinar from the Hunt Institute last Wednesday.

    Though implementation of the state’s reading legislation has been ongoing since 2021, more resources and comprehensive support are needed to ensure teaching practice and reading proficiency are improved, webinar panelists said.

    “The goal should be to transition from the science of reading into the science of teaching reading,” said Paola Pilonieta, professor at the University of North Carolina at Charlotte who was part of a team that studied North Carolina’s implementation of its 2021 Excellent Public Schools Act.

    That legislation mandates instruction to be aligned with “the science of reading,” the research that says learning to read involves “the acquisition of language (phonology, syntax, semantics, morphology, and pragmatics), and skills of phonemic awareness, accurate and efficient word identification (fluency), spelling, vocabulary, and comprehension.”

    The legislature allocated more than $114 million to train pre-K to fifth grade teachers and other educators in the science of reading through a professional development tool called the Language Essentials for Teachers of Reading and Spelling (LETRS). More than 44,000 teachers had completed the training as of June 2024.

    Third graders saw a two-point drop, from 49% to 47%, in reading proficiency from the 2023-24 to 2024-25 school year on literacy assessments. It was the first decline in this measure since LETRS training began. First graders’ results on formative assessments held steady at 70% proficiency and second graders saw a small increase, from 65% to 66%.

    “LETRS was the first step in transforming teacher practice and improving student outcomes,” Pilonieta said. “To continue to make growth in reading, teachers need targeted ongoing support in the form of coaching, for example, to ensure effective implementation of evidence-based literacy instruction.”

    Teachers’ feelings on the training

    Pilonieta was part of a team at UNC-Charlotte and the Education Policy Initiative at Carolina (EPIC) at UNC-Chapel Hill that studied teachers’ perception of the LETRS training and districts’ implementation of that training. The team also studied teachers’ knowledge of research-backed literacy practices and how they implemented those practices in small-group settings after the training.

    They asked about these experiences through a survey completed by 4,035 teachers across the state from spring 2023 to winter 2024, and 51 hour-long focus groups with 113 participants.

    Requiring training on top of an already stressful job can be a heavy lift, Pilonieta said. LETRS training looked different across districts, the research team found. Some teachers received stipends to complete the training or were compensated with time off, and some were not. Some had opportunities to collaborate with fellow educators during the training; some did not.

    “These differences in support influenced whether teachers felt supported during the training, overwhelmed, or ignored,” Pilonieta said.

    Teachers did perceive the content of the LETRS training to be helpful in some ways and had concerns in others, according to survey respondents.

    Teachers holding various roles found the content valuable in learning about how the brain works, phonics, and comprehension.

    They cited issues, however, with the training’s applicability to varied roles, limited differentiation based on teachers’ background knowledge and experience, redundancy, and a general limited amount of time to engage with the training’s content.

    Varied support from administrators, coaches

    When asking teachers about how implementation worked at their schools, the researchers found that support from administrators and instructional coaches varied widely.

    Teachers reported that classroom visits from administrators with a focus on science of reading occurred infrequently. The main support administrators provided, according to the research, was planning time.

    “Many teachers felt that higher levels of support from coaches would be valuable to help them implement these reading practices,” Pilonieta said.

    Teachers did report shifts in their teaching practice after the training and felt those tweaks had positive outcomes on students.

    The team found other conditions impacted teachers’ implementation: schools’ use of curriculum that aligned to the concepts covered in the training, access to materials and resources, and having sufficient planning time.

    Some improvement in knowledge and practice

    Teachers performed well on assessments after completing the training, but had lower scores on a survey given later by the research team. Pilonieta said this suggests an issue with knowledge retention.

    Teachers scored between 95% and 98% on the LETRS post-training assessment. But in the research team’s survey, scores ranged from 48% to 78%.

    Teachers with a reading license scored higher on all knowledge areas addressed in LETRS than teachers without one.

    When the team analyzed teachers’ recorded small-group reading lessons, 73% were considered high-quality. They found consistent use of explicit instruction, which is a key component of the science of reading, as well as evidence-backed strategies related to phonemic awareness and phonics. They found limited implementation of practices on vocabulary and comprehension.

    Among the low-quality lessons, more than half were for students reading below grade level. Some “problematic practices” persisted in 17% of analyzed lessons.

    What’s next?

    The research team formed several recommendations on how to improve reading instruction and reading proficiency.

    They said ongoing professional development through education preparation programs and teacher leaders can help teachers translate knowledge to instructional change. Funding is also needed for instructional coaches to help teachers make that jump.

    Guides differentiated by grade levels would help different teachers with different needs when it comes to implementing evidence-backed strategies. And the state should incentivize teachers to pursue specialized credentials in reading instruction, the researchers said.

    Moving forward, the legislation might need more clarity on mechanisms for sustaining the implementation of the science of reading. The research team suggests a structured evaluation framework that tracks implementation, student impact, and resource distribution to inform the state’s future literacy initiatives.

    This article first appeared on EdNC and is republished here under a Creative Commons Attribution-NoDerivatives 4.0 International License.



    Source link

  • Selecting and Supporting New Vice Chancellors: Reflections on Process & Practice – PART 2 

    Selecting and Supporting New Vice Chancellors: Reflections on Process & Practice – PART 2 

    Author:
    Dr Tom Kennie

    This HEPI blog was kindly authored by Dr Tom Kennie, Director of Ranmore.

    Introduction 

    In the first blog post, I focused on the process of appointing new Vice Chancellors, with some thoughts on and challenges to current practice. In this second contribution, I focus more on support and on how to ensure that the leadership transition receives as much attention as candidate selection.

    The process of leadership transition increasingly starts well before the successful candidate has been appointed. Depending on the circumstances which led to the need for a new leader, the process may involve a short or extended period with an Interim Leader. This can be an internal senior leader or someone external appointed for a short, fixed-term period. This is itself a topic for another day. It does, however, require careful consideration as part of the successful transition of a new leader (assuming the interim is not appointed to the permanent role).

    Reflections to consider when on-boarding Vice Chancellors 

    Rules of engagement with the Interim or Existing post-holder  

    Clear rules of engagement must be agreed with the appointed Interim. Among those rules are those relating to the engagement with the Board. Often these can feel quite implicit and unspoken. I’d encourage both parties to be much more explicit and document their mutual expectations to share with each other.     

    Incoming Vice Chancellor transition plan (individual and team-based) 

    Moving on to the post-appointment, pre-arrival period: this is an important phase in ensuring a successful outcome. How can the incoming leader prepare (whilst often still doing another big job)? How might the team prepare the way for the incoming leader? And how might the existing or interim leader hold things together during this period? This is often a period of heightened anxiety within the senior leadership team (although rarely surfaced and discussed). Working with the team during this phase can help to reduce the danger of siloed working and prepare the team for the arrival of the new leader.

    Outgoing Vice Chancellor transition plan  

    Frequently overlooked is the importance of ensuring a successful transition for the current post-holder (assuming it has not been a forced exit). Beware of placing too much focus on the new person. Often, as indicated earlier, the current post holder may have many months to go before the new person can start. They also require support and encouragement. And, of course, recognition for their period in office.  

    Day 1 and week 1 

    The lead-up to day 1 requires significant consideration by the new Vice Chancellor. Meeting the new ‘inner office’, and considering how the new Vice Chancellor differs in style and expectations from the outgoing leader, are important early steps. Induction processes will, no doubt, feature heavily in the first few weeks, but a new Vice Chancellor should ensure that they control the transition process. This requires careful, coordinated communication and choreography.

    First x days (what’s the right number?) 

    Every new Vice Chancellor should be wary of being persuaded to work towards delivering a plan by some (often arbitrary) date, typically 90-100 days after their arrival. Understanding the context of the institution, and working with this, is more important. 

    Potential surprises & dilemmas  

    A new Vice Chancellor should expect a few surprises when they start. Context and culture are different and these will have an impact on the interpretation of events. To ensure success, these should be soaked up and immediate responses should be avoided. In time, it will be much easier to work out how to respond and what needs to change. 

    Match and ideally exceed expectations  

    Whilst clearly important and easy to say, it is vital to ensure the Vice Chancellor priorities are clarified with the Chair. Having done this, the senior team should be invited to similarly clarify their priorities. Lastly, these should be shared across the team. This, by itself, is likely to signal a new way of working. 

    A final proposal  

    The process of appointing Vice Chancellors is clearly an important matter for Chairs of Governing Boards. Whilst guidance is provided by the Committee of University Chairs (CUC), the latest edition of the document Recruiting a Vice Chancellor was published in 2017. Much has changed in the past eight years and it feels timely for a fresh look given the very different context and shifts in practice. 

    To close, it is worth remembering that nobody comes fully ready for any senior leadership role. Gaps exist and context and culture are different from the new perspective even if the candidate has had a prior role in a different place. You might wish to consider offering some independent support for your new Vice Chancellor. This could be through being a member of a peer-group and/or individual transition coaching. Being in charge is a lonely place and it can be constructive to be able to talk through dilemmas, issues and opportunities in a safe space. Sometimes this can’t be with one’s Chair or Senior Team.  

    Lastly, don’t be too judgemental, and try to give any new Vice Chancellor the benefit of the doubt – well, at least for a short while!

    Source link

  • Selecting and Supporting New Vice Chancellors: Reflections on Process & Practice – PART 1 

    Selecting and Supporting New Vice Chancellors: Reflections on Process & Practice – PART 1 

    • This HEPI blog was kindly authored by Dr Tom Kennie, Director of Ranmore.
    • Over the weekend, HEPI director Nick Hillman blogged about the forthcoming party conferences and the start of the new academic year. Read more here.

    Introduction 

    Over the last few months, a number of well-informed commentators have focused on understanding the past, present and, to some extent, future context associated with the appointment of Vice Chancellors in the UK. See Tessa Harrison and Josh Freeman of Gatensby Sanderson, Jamie Cumming-Wesley of WittKieffer, and Paul Greatrix.

    In this and a subsequent blog post, I want to complement these works with some practice-informed reflections from my work with many senior higher education leaders. I also aim to open a debate about optimising the selection and support for new Vice Chancellors by challenging some current practices. 

    Reflections to consider when recruiting Vice Chancellors 

    Adopt a different team-based approach 

    Clearly, all appointment processes are team-based – undertaken by a selection committee. For this type of appointment, however, we need a different approach which takes collective responsibility as a ‘Selection and Transition Team’. What’s the difference? In this second approach, the team take a wider remit with responsibility for the full life cycle of the process from search to selection to handover and transition into role. The team also oversee any interim arrangements if a gap in time exists between the existing leader leaving and the successor arriving. This is often overlooked.  

    The Six Keys to a Successful Presidential Transition is an interesting overview of this approach in Canada. 

    Pre-search diagnosis  

    Pre-search diagnosis (whether involving a search and selection firm or not) is often underestimated in its importance, or is under-resourced. Before you start to search for a candidate to lead a university, you need to ensure those involved are all ‘on the same page’. Sometimes they are, but in other cases they fail to recognise that they are on the same, but wrong, page. Classically, the mistake is to seek someone to lead the organisation of today, failing to consider the place it seeks to be in 10 years. Before appointing a search firm, part of the solution is to ensure you have a shared understanding of the type of university you are seeking someone to lead.

    • Role balance and capabilities 

    A further diagnostic issue, linked to the former point, is to be very clear about the balance of capabilities required in your selected candidate. One way of framing this is to assess the candidate balance across a number of dimensions, including:  

    • The Chief Academic Officer (CAO) capabilities: more operational and internally focussed;
    • The Chief Executive Officer (CEO) capabilities: more strategic and internally focussed;
    • The Chief Civic Officer (CCO) capabilities: more strategic and externally focussed; and
    • The Chief Stakeholder Relationship Officer (CSRO) capabilities: more operational and externally focussed.

    All four matter. One astute Vice Chancellor suggested a fifth to me: Chief Storytelling Officer (CSO).

    Search firm or not?   

    The decision as to whether to use a search firm is rarely considered today – it is assumed you will use one. It is, however, worth pausing to reflect on this issue, if only to be very clear about what you are seeking from a search firm. What criteria should you use to select one? Are you going with one who you already use, or have used, or are you open to new players (both to you and to the higher education market)? The latter might be relevant if you are seeking to extend your search to candidates who have a career trajectory beyond higher education.  

    ‘Listing’ – how and by whom?   

    Searching should lead to many potential candidates. Selecting whom to consider is typically undertaken through a long-listing process, from which a short-list is created. Make sure you understand how this will be undertaken and who will be doing it. When was the last time you asked to review the larger list from which the long list was taken?

    Psychometrics – why, which and how? 

    A related matter involves the use of any psychometric instruments proposed to form part of the selection process. They are often included – yet the rationale is often unclear, as is the question of how the data will be used. Equally importantly, if the judgment is that they should be included, who should undertake the process? Whichever route you take, you would be wise to read Andrew Munro’s recent book on the topic, Personality Testing In Employee Selection: Challenges, Controversies and Future Directions.

    Balance questions with scenarios and dilemmas 

    Given the complexity of the role of the Vice Chancellor, it is clearly important to assess candidates across a wide range of criteria. Whilst a question-and-answer process can elicit some evidence, we should all be aware of the limitations of such a process. Complementing it with a well-considered scenario-based process, involving a series of dilemmas which candidates are invited to consider, is less common than it should be.

    Rehearse final decision scenarios  

    If you are fortunate as a selection panel, after having considered many different sources of evidence, you will reach a collective, unanimous decision about the candidate you wish to offer the position. Job almost done. More likely, however, you will have more than one preferred candidate – each appointable, albeit with evidence of gaps in some areas. Occasionally, you may also reach an impasse where strong cases are made to appoint two equally appointable candidates. Prepare for these situations by considering them in advance; in some cases, the first time they are considered is during the final stage of the selection exercise.

    In part 2 I’ll focus more on support and how to ensure the leadership transition is given as much attention as candidate selection. 

    Source link

  • Universities need to reckon with how AI is being used in professional practice

    Universities need to reckon with how AI is being used in professional practice

    One of the significant themes in higher education over the last couple of decades has been employability – preparing students for the world of work into which they will be released on graduation.

    And one of the key contemporary issues for the sector is the attempt to come to grips with the changes to education in an AI-(dis)empowered world.

    The next focus, I would argue, will involve a combination of the two – are universities (and regulators) ready to prepare students for the AI-equipped workplaces in which they will be working?

    The robotics of law

    Large, international law firms have been using AI alongside humans for some time, and there are examples of its use for the drafting of non-disclosure agreements and contracts, for example.

    In April 2025, the Solicitors Regulation Authority authorised Garfield Law, a small firm specialising in small-claims debt recovery. This was remarkable only in that Garfield Law is the first law firm in the world to deliver services entirely through artificial intelligence.

    Though small and specialised, the approval of Garfield Law was a significant milestone – and a moment of reckoning – for both the legal professional and legal education. If a law firm can be a law firm without humans, what is the future for legal education?

    Indeed, I would argue that the HE sector as a whole is largely unprepared for a near-future in which the efficient application of professional knowledge is no longer the sole purview of humans.

    Professional subjects such as law, medicine, engineering and accountancy have tended to think of themselves as relatively “technology-proof” – where technology was broadly regarded as useful, rather than a usurper. Master of the Rolls Sir Geoffrey Vos said in March that AI tools

    may be scary for lawyers, but they will not actually replace them, in my view at least… Persuading people to accept legal advice is a peculiarly human activity.

    The success or otherwise of Garfield Law will show how the public react, and whether Vos is correct. This vision of these subjects as high-skill, human-centric domains needing empathy, judgement, ethics and reasoning is not the bastion it once was.

    In the same speech, Vos also said that, in terms of using AI in dispute resolution, “I remember, even a year ago, I was frightened even to suggest such things, but now they are commonplace ideas”. Such is the pace at which AI is developing.

    Generative AI tools can be, and are being, used in contract drafting, judgement summaries, case law identification, medical scanning, operations, market analysis, and a raft of other activities. Garfield Law represents a world view in which routine, and once billable, tasks performed by trainees and paralegals will most likely be automated. AI is challenging the traditional boundaries of what it means to be a professional and, in concert with this, challenging conceptions of what it is to teach, assess and accredit future professionals.

    Feeling absorbed

    Across the HE sector, the first reaction to the emergence of generative AI was largely (and predictably) defensive. Dire warnings to students (and colleagues) about “cheating” and using generative AI inappropriately were followed by hastily-constructed policies and guidelines, and the unironic and ineffective deployment of AI-powered AI detectors.

    The hole in the dyke duly plugged, the sector then set about wondering what to do next about this new threat. “Assessments” came the cry, “we must make them AI-proof. Back to the exam hall!”

    Notwithstanding my personal pedagogic aversion to closed-book, memory-recall examinations, such a move was only ever going to be a stopgap. There is a deeper pedagogic issue in learning and teaching: we focus on students’ absorption, recall and application of information – which, to be frank, is instantly available via AI. Admittedly, it has been instantly available since the arrival of the Internet, but we’ve largely been pretending it hasn’t for three decades.

    A significant amount of traditional legal education focuses on black-letter law, case law, analysis and doctrinal reasoning. There are AI tools which can already do this and provide “reasonably accurate legal advice” (Vos again), so the question arises: what is our end goal in preparing students? The answer, surely, is skills – critical judgement, contextual understanding, creative problem solving and ethical reasoning – areas where (for the moment, at least) AI still struggles.

    Fit for purpose

    And yet, and yet. In professional courses like law, we still very often design courses around subject knowledge, and often try to “embed” the skills elements afterwards. We too often resort to tried and tested assessments which reward memory (closed-book exams), formulaic answers (problem questions) and performance under time pressure (time-constrained assessments). These are the very areas in which AI performs well, and in which it is increasingly able to match, or outperform, humans.

    At the heart of educating students to enter professional jobs there is an inherent conflict. On the one hand, we are preparing students for careers which either do not yet exist, or may be fundamentally changed – or displaced – by AI. On the other, the regulatory bodies are often still locked into twentieth century assumptions about demonstrating competence.

    Take the Solicitors Qualifying Examination (SQE), for example. Relatively recently introduced, the SQE was intended to bring consistency and accessibility to the legal profession. The assessment is nonetheless still based on multiple choice questions and unseen problem questions – areas where AI can outperform many students. There are already tools out there to help SQE students practise (Chat SQE, Kinnu Law), though no AI tool has yet completed the SQE itself. But in the USA, the American Uniform Bar Exam was passed by GPT-4 in 2023, outperforming some human candidates.

    If a chatbot can ace your professional qualifying exam, is that exam fit for purpose? In other disciplines, the same question arises. Should medical students be assessed on their recall of rare diseases? Should business students be tested on their SWOT analyses? Should accounting students analyse corporate accounts? Should engineers calculate stress tolerances manually? All of these things can be completed by AI.

    Moonshots

    Regulatory bodies, universities and employers need to come together more than ever to seriously engage with what AI competency might look like – both in the workplace and the lecture theatre. Taking the approach of some regulators and insisting on in-person exams to prepare students for an industry entirely lacking in exams probably is not it. What does it mean to be an ethical, educated and adaptable professional in the age of AI?

    The HE sector urgently needs to move beyond discussions about whether or not students should be allowed to use AI. It is here, it is getting more powerful, and it is never leaving. Instead, we need to focus on how we assess in a world where AI is always on tap. If we cannot tell the difference between AI-generated work and student-generated work (and increasingly we cannot) then we need to shift our focus towards the process of learning rather than the outputs. Many institutions have made strides in this direction, using reflective journals, project-based learning and assessments which reward students for their ability to question, think, explain and justify their answers.

    This is likely to mean increased emphasis on live assessments – advocacy, negotiations, client interviews or real-world clinical experience. In other disciplines too: simulations, inter- and multi-disciplinary challenges, or industry-related authentic assessments. These are nothing revolutionary; they are pedagogically sound and all have been successfully implemented. They do, however, demand more of us as academics: more time, more support, more creativity. Scaling up from smaller modules to large cohorts is no easy feat. It is much easier to keep doubling down on what we already do, hiding behind regulatory frameworks. However, we need to do these things (to quote JFK)

    not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone.

    In law schools, how many of us teach students how to use legal technology, how to understand algorithmic biases, or how to critically assess AI-generated legal advice? How many business schools teach students how to work alongside AI? How many medical schools give students the opportunity to learn how to critically interpret AI-generated diagnostics? The concept of “digital professionalism” – the ability to effectively and ethically use AI in a professional setting – is becoming a core graduate-level skill.

    If universities fail to take the lead on this, then private providers will be eager, and quick, to fill the void. We already have short courses, boot camps, and employer-led schemes which offer industry-tailored AI literacy programmes – and if universities start to look outdated and slow to adapt, students will vote with their feet.

    Invention and reinvention

    However, AI is not necessarily the enemy. Like all technological advances, it is essentially nothing more than a tool. As with all tools – the stone axe, the printing press, the internet – it brings with it threats to some and opportunities for others. We have identified some of the threats, but also the opportunities that, with proper use, AI can bring: enhanced learning, deeper engagement, and the democratisation of access to knowledge. Like the printing press, the real threat faced by HE is not the tool, but a failure to adapt to it. Nonetheless, a surprising number of academics are dusting off their metaphorical sabots to try and stop the development of AI.

    We should be working with the relevant sector and regulator and asking ourselves how we can adapt our courses and use AI to support, rather than substitute, genuine learning. We have an opportunity to teach students how to move away from being consumers of AI outputs, and how to become critical users, questioners and collaborators. We need to stop being reactive to AI – after all, it is developing faster than we can ever do.

    Instead, we need to move towards reinvention. This could mean: embedding AI literacy in all disciplines; refocusing assessments to require more creative, empathetic, adaptable and ethical skills; preparing students and staff to work alongside AI, not to fear it; and closer collaboration with professional regulators.

    AI is being used in many professions, and the use will inevitably grow significantly over the next few years. Educators, regulators and employers need to work even more closely together to prepare students for this new world. Garfield Law is (currently) a one-off, and while it might be tempting to dismiss the development as tokenistic gimmickry, it is more than that.

    Professional courses are standing on the top of a diving board. We can choose obsolescence: climb back down, cling to outdated practices and condemn ourselves to irrelevance. Or we can choose opportunity and dive into a more dynamic, responsive and human vision of professional learning.

    We just have to be brave enough to take the plunge.

    Source link