Category: research impact

  • Impact can’t be gamed or systematised

    In a recent Wonkhe blog, Joe Mintz discussed the challenges of policy impact in social sciences and humanities research.

    He highlighted the growing importance of research impact for government (and therefore institutions) but noted significant barriers. These included a disconnect where academics prioritise research quality over early policy engagement and a mutual mistrust that limits research influence on decision-making.

    We have recently published a book exploring the challenges of research impact, and these views chime strongly with our own. Our motivation for starting the book project was personal. We have carried out a great deal of impactful research and provided support and training for others wishing to engage in impact. Yet we wondered why impact seems to be so poorly understood across the sector, and we had observed a clear fracture between those who wanted to do impactful research and institutions that wanted to control the process without really understanding it.

    Agenda opposition

    There continues to be understandable opposition from some to what has been referred to as “the impact agenda”. One criticism is that impact is imposed by government and management in a way that is at odds with the ideology of research. The argument runs that research impact is a market-driven mechanism that pressures academics to demonstrate immediate societal benefits from their research, often at the expense of intellectual freedom and critical inquiry, and that metric-driven measurement of research impact may not fully capture the complexity or long-term value of research.

    We can certainly empathise with this perspective, but would suggest it stems, in large part, from how impact has manifested in a sector that does not really understand what it can be. In our own careers we have received management “advice” that first said do not waste your time doing impact, then, once performance-based research funding became attached to it, insisted it was very important and that our impact needed to be “four star”. Indifference was replaced with interference and attempts to control, to make sure we were doing it “properly” and that it could be monitored.

    In trying to develop our understanding, we spoke to 25 “impactful” academics, who had objectively demonstrated that their research has high-value impact, and a range of research professionals across the sector. It soon became very clear that our own observations were not outliers among those doing impactful research.

    Impact success for those we spoke to came from a personal belief that saw it ingrained in their own research practice – this was something they did because they felt it was important, not because they had been told to. The stakeholders and networks they had, and often spent considerable time building, were their own, not their institution’s, and many protected these contacts and networks from institutional interference.

    In most cases, interview subjects said that there was little support from their institutions, they just did the work because they felt it was an important part of their research, and this symbiotic work with stakeholders provided further research opportunities. They could see the value of doing impactful research and felt personally rewarded as a result.

    And many talked of institutional interference, where there was opposition to what they were doing (“you’re not doing impact properly”) and advice given from positions of seniority, although perhaps not of knowledge or, in some cases, integrity. They were instructed to do things more in line with university systems, regardless of how poor those might be. There was a clear dissonance between academic identity and management culture, often informed by an “impact industry” in which PowerPoints from webinars are disseminated across institutions with little opportunity for deep knowledge to become embedded.

    Secret sauce

    And many spoke of the research management machine insisting that they engage with central systems so their work could be “monitored”, with many people around them telling them what to do but offering no support. What support there was could be as basic as “do more impact” and “give us the evidence now”. In some cases, threats were made not to submit their case studies should they not follow the “correct process”, even when their work was clearly highly impactful. An odd flex for a senior leader, given that QR funding goes to the institution, not the academic.

    While the research that went into this book probably threw up as many questions as answers, one thing was very clear from this work. If impact is to be successful, it cannot be imposed upon academics or centrally controlled; it must originate from the academic’s community and their own identity as a researcher. Telling someone to “do some impact because we need another case study” a year before a REF submission is not good practice; management needs to take time to understand the research academics are doing and explore together how best to support it.

    We are reminded of a comment from one interviewee, who does incredibly interesting and impactful research, and has done for many years. When asked why they do it, they simply said “because I enjoy it and I’m good at it”.

    High-quality impact case studies come from high-quality research and high-quality impact. This is not something that can be gamed or systematised. Academics need to own impact for it to be successful, and institutions need to respect this.

  • What gets misunderstood in the quest for policy impact

    Academics are obsessed with impact. We want our research to be read, to be cited by other academics – and particularly in the social sciences and humanities, to have an impact on government policy.

    Partly this is because internationally over the last forty years, governments have increasingly imposed an impact agenda on universities, using financial and other levers to encourage them to focus on the real-world impact of what goes on in the ivory tower. But it’s not just that. Most academics are really passionate about the work they do, see it as important, and want it to make a difference to the public and society.

    Yet it seems that a lot of the time, that desire to have impact is much more of an aspiration than a reality. When I had just started in academia (at another institution), after working in the IT industry and then as a school teacher, I remember going to a meeting about the department’s research strategy. There were lots of speeches from academics about all the amazing work they were doing (or thought they were doing) – and then one brave colleague spoke up and said that research was a waste of time, as it just meant spending lots of energy on something that maybe ten people around the world would read. He was much more interested in teaching, and the real direct impact he could have on his students right there and then. Quite.

    Tracing impact

    Of course research does have impact, although often it’s much easier to see it in the hard sciences and medicine. The revolutionary impact of the work of Samuel Broder at the National Cancer Institute in the US, and his collaborators, that led to the introduction of antiretroviral treatments for HIV, comes to mind as one example. I worked as a technical analyst on an HIV/AIDS unit in London in the 1990s and I saw the miraculous impact of this on people’s lives.

    But in the social sciences tracing the path of impact is often much less clear. However, often this is not because the potential for impact is not there, but due to other factors, particularly a lack of understanding between government and academia about how research can usefully intercalate with policy development. Because I was interested in the relationship between research and policy, I undertook a secondment in the insights and research team of the Office for Standards in Education (Ofsted), for 15 months up to October 2023.

    Since then, I have been involved in an ongoing series of conversations, initiated by Ofsted, involving academic and government colleagues on the topic of how to facilitate better communications between government and academia about the role and impact of research. Most recently we held a very well attended symposium at the British Educational Research Association conference in Manchester in September 2024, and are planning other publications and events.

    What’s getting misunderstood

    So far, based on these discussions, we have identified a number of factors that tend not to be given enough weight in the relationship between government and academia.

    First – and this is something I saw first-hand at Ofsted – it is important to realise that government does value evidence arising from academic research. Although many academics are unaware of this, each government department has members of what is known as the Government Social Research Profession, whose role is to champion social research evidence and support implementation and evaluation of government policy.

    Another thing I came to understand at Ofsted is that the culture is quite different to academia. The role of research in the civil service is to support the aims of democratically elected ministers. Research evidence is valued in government – but it is one factor among others when decisions are made.

    Linked to this, such evidence has to be provided at the right time and in the right way so that it can influence those decisions. This is something academics often lack awareness of. Academic research projects typically focus first on making sure their findings are high quality and robust, and only then think about pathways to dissemination, hoping that someone in government will take notice. All too often that can mean, as my old colleague said, that the work ends up being read only by other academics. Academics need to be scanning the horizon for pathways to engagement from much earlier in their research, for example in the context of public consultations or political debates.

    Other areas we have identified include, on occasion, misconceptions and mistrust between academia and government, present on both sides. Civil servants often juggle competing priorities and demands, which can hinder opening up lines of communication with the research community about how government uses research and engaging in honest conversations about political priorities.

    Although things are changing for the better in England in this regard, as evidenced by the collaboration between Ofsted and academic colleagues, there is much more to do. We have adopted the concept of a “third space”: opportunities for engagement where we can find new ways of working across sectors that promote mutual understanding, in the case of Ofsted, to better promote outcomes for children and young people. This is of course the shared aim of both the academic research community and government.

    However, this is something that is needed not just at Ofsted, but across government and across academia. The impact agenda is not going anywhere anytime soon, and perhaps we would be foolish to want it to, but making it work better has got to be a priority.

  • A new mission for higher education policy reviews

    by Ellen Hazelkorn, Hamish Coates, Hans de Wit & Tessa DeLaquil

    Making research relevant to policy

    In recent years heightened attention has been given to the importance of scholarly endeavour making a real impact on and for society. Yet, despite a five-fold increase in journal articles published on higher education in the last twenty years, the OECD warns of a serious “disconnect between education policy, research and practice”.

    As higher education systems have grown and diversified, it appears with ever increasing frequency that policy is made on the slow, on the run, or not at all. Even in the most regulated systems, gone is the decades-long approach of lifetime civil servants advancing copperplate notes on papyrus through governmental machines designed to sustain flow and augment harmony. In the era of 24-hour deliberation, reporting and muddling through, it may seem that conceptually rooted analysis of policy and policymaking is on the nose or has been replaced by political expediency.

    Nothing could be further from the truth. There has never been a more important time to analyse, design, evaluate, critique, integrate, compare and innovate higher education policy. Fast policy invokes a swift need for imaginative reflection. Light policy demands counterbalancing shovel loads of intellectual backfilling. Comparative analysis is solvent for parochial policy. Policy stasis, when it stalks, must be cured by ingenious, ironic, and incisive admonition.

    Governments worldwide expect research to provide leaders and policymakers with evidence that will improve the quality of teaching and education, learning outcomes and skills development, regional innovation and knowledge diffusion, and help solve society’s problems. Yet, efforts to enhance the research-policy-practice nexus fall far short of this ambition.

    Policy influencers are more likely to be ministerial advisory boards and commissioned reports than journal articles and monographs, exactly the opposite of what incentivizes academics. Rankings haven’t helped, measuring ‘impact’ in terms of discredited citation scores despite considerable research and effort to the contrary.

    Academics continue to argue that the purpose of academic research is to produce ‘pure’ fundamental research, rather than to undertake publicly funded applied research. And despite universities promoting impactful research of public value, scholars complain of many barriers to entry.

    The policy reviews solution

    Policy Reviews in Higher Education (PRiHE) aims to push out the boundaries and encourage scholars to explore a wide range of policy themes. Despite higher education sitting within a complex knowledge-research-innovation ecosystem, touching on all elements from macro-economic to foreign policy to environmental policy, our research lens and interests are far too narrow. We seem to be asking the same questions. But the policy and public lens is changing.

    Concerns are less about elites and building ‘world-class universities’ for a tiny minority, and much more about pressing social issues such as: regional disparities and ‘left-behind communities’, technical and vocational education and training, non-university pathways, skills and skills mismatch, flexible learning opportunities given new demographies, sustainable regional development, funding and efficiency, and technological capability and artificial intelligence. Of course, all of this carries implications for governance and system design, an area in which much more evidence-based research is required.

    As joint editors we are especially keen to encourage submissions which can help address such issues, and to draw on research to produce solutions rather than simply critique. We encourage potential authors to ask questions outside the box, and explore how these different issues play out in different countries, and accordingly discuss the experiences, the lessons, and the implications from which others can learn.

    Solutions for policy reviews

    Coming into its ninth year, PRiHE is a platform for people in and around government to learn about the sector they govern, for professionals in the sector to keep abreast of genuinely relevant developments, and for interested people around the world to learn about what is often (including for insiders!) a genuinely opaque and complex and certainly sui generis environment.

    As our above remarks contend, the nature of contemporary higher education politics, policy and practice cannot be simplified or taken for granted. Journal topics, contributions, and interlocutors must also change and keep pace. Indeed, the very idea of an ‘academic journal’ must itself be reconsidered within a truly global and fully online education and research environment. Rightly, therefore, PRiHE keeps moving.

    With renewed vim and vigour, the Society for Research into Higher Education (SRHE) has refreshed the Editorial Office and Editorial Board, and charged PRiHE to grow even more into a world-leading journal of mark and impact. Many further improvements have been made. For instance, the Editorial Office has worked with SRHE and the publisher Taylor and Francis to make several enhancements to editorial and journal processes and content.

    We encourage people to submit research articles or proposals for an article – which will be reviewed by the Editors and feedback provided in return. We also encourage people to submit commentary and book reviews – where the authors have sought to interrogate and discuss a key issue through a policy-oriented lens. See the ‘instructions for authors’ for details.

    Read, engage, and contribute

    This second bumper 2024 issue provides six intellectual slices into ideas, data and practices relevant to higher education policy. We smartly and optimistically advise that you download and perhaps even print out all papers, power off computers and phones, and spend a few hours reading these wonderful contributions. We particularly recommend this to aspiring policy researchers, researchers and consultants in the midst of their careers, and perhaps most especially to civil servants and related experts embedded in the world of policy itself.

    SRHE and the Editorial Office are looking ahead to a vibrant and strong future period of growth for PRiHE. A raft of direct and public promotion activities are planned. PRiHE is a journal designed to make a difference to policy and practice. The most important forms of academic engagement, of course, include reading, writing and reviewing. We welcome your contribution in these and other ways to the global PRiHE community.

    This blog is based on the editorial published in Policy Reviews in Higher Education (online 16 November 2024), A new mission for higher education policy reviews.

    Professor Ellen Hazelkorn is Joint Managing Partner, BH Associates. She is Professor Emeritus, Technological University Dublin.

    Hamish Coates is professor of public policy, director of the Higher Education Futures Lab, and global tertiary education expert.

    Hans de Wit is Professor Emeritus and Distinguished Fellow of the Boston College Center for International Higher Education, and Senior Fellow of the International Association of Universities.

    Tessa DeLaquil is a postdoctoral research fellow at the School of Education at University College Dublin.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education
