Category: Quality

  • Moving beyond the quality wars

    Moving beyond the quality wars

    A decade since his passing, David Watson’s work remains a touchpoint of UK higher education analysis.

    This reflects the depth and acuity of his analysis, but also his ability as a phrasemaker.

    One of his phrases that has stood the test of time is the “quality wars” – his label for the convulsions in UK higher education in the 1990s and early 2000s over the assurance of academic quality and standards.

    Watson coined this phrase in 2006, shortly after the 2001 settlement that brought the quality wars to an end. A peace that lasted, with a few small border skirmishes, until HEFCE’s launch of its review of quality assessment in 2015.

    War never changes

I wasn’t there, but someone who was has described to me a meeting at that time involving heads of university administration and HEFCE’s chief executive. As told to me, at one point a registrar of a large and successful university effectively called out HEFCE’s moves on quality assessment, urging HEFCE not to reopen the quality wars. I’ve no idea if the phrase Pandora’s box was used, but it would fit the tenor of the exchange as it was relayed to me.

    Of course this warning was ignored. And of course (as is usually the case) the registrar was right. The peace was broken, and the quality wars returned to England.

    The staging posts of the revived conflict are clear.

HEFCE’s Revised operating model for quality assessment was introduced in 2016. OfS was established two years later, leading to the B conditions mark I, followed later the same year by a wholesale re-write of the UK quality code that was reportedly largely prompted and/or driven by OfS. Only for OfS to decide by 2020 that it wasn’t content with this, repudiate the UK quality code, and implement from 2022 the B conditions mark II (new and improved; well, maybe not the latter, but definitely longer).

    And a second front in the quality wars opened up in 2016, with the birth of the Teaching Excellence Framework (TEF). Not quite quality assessment in the by then traditional UK sense, but still driven by a desire to sort the sheep from the goats – identifying both the pinnacles of excellence and depths of… well, that was never entirely clear. And as with quality assessment, TEF was a very moveable feast.

There were three iterations of Old TEF between 2016 and 2018. Repeated insistence that subject-level TEF was a done deal led to huge amounts of time and effort on preparations in universities between 2017 and early 2020, only for subject-level TEF to be scrapped in 2021. At which point New TEF emerged from the ashes, embraced by the sector with an enthusiasm that was perhaps to be expected – particularly after the ravages of the Covid pandemic.

And through New TEF the two fronts allegedly became a united force. To quote OfS’s regulatory advice, the B conditions and New TEF formed part of an “overall approach” where “conditions of registration are designed to ensure a minimum level” and OfS sought “to incentivise providers to pursue excellence in their own chosen way … in a number of ways, including through the TEF”.

    Turn and face the strange

So in less than a decade English higher education experienced: three iterations of quality assessment; three versions of TEF (one ultimately not implemented, but still hugely disruptive to the sector); and a rationalisation of the links between the two that required a lot of imagination, and a leap of faith, to accept the claims being made.

    Pandora’s box indeed.

No wonder that David Behan’s independent review of OfS recommended “that the OfS’s quality assessment methodologies and activity be brought together to form a more integrated assessment of quality.” Last week we had the first indications from OfS of how it will address this recommendation, and there are two obvious questions: can we see a new truce emerging in the quality wars; and given where we look likely to end up on this issue, was this round of the quality wars worth fighting?

    Any assessment of where we are following the last decade of repeated and rapid change has to recognise that there have been some gains. The outcomes data used in TEF, particularly the approach to benchmarking at institutional and subject levels, is and always has been incredibly interesting and, if used wisely, useful data. The construction of a national assessment process leading to crude overall judgments just didn’t constitute wise use of the data.

    And while many in the sector continue to express concern at the way such data was subsequently brought into the approach to national quality assessment by OfS, this has addressed the most significant lacuna of the pre-2016 approach to quality assurance. The ability to use this to identify specific areas and issues of potential concern for further, targeted investigation also addresses a problematic gap in previous approaches that were almost entirely focused on cyclical review of entire institutions.

    It’s difficult though to conclude that these advances, important elements of which it appears will be maintained in the new quality assessment approach being developed by OfS, were worth the costs of the turbulence of the last 10 years.

    Integration

What appears to be emerging from OfS’s development of a new integrated approach to quality assessment essentially feels like a move back towards central elements of the pre-2016 system, with regular cyclical reviews of all providers (with or without visits, to be decided) against a single reference point (albeit the B conditions rather than the UK Quality Code). Of course it’s implicit rather than explicit, but it feels like an acknowledgment that the baby was thrown out with the bathwater in 2016.

There are of course multiple reasons for this, but a crucial one has been the march away from the concept of co-regulation between the regulator and higher education providers. This was a conscious and deliberate decision, and one that has always been slightly mystifying. As a sector we recognise and promote the concept of co-creation of academic provision by staff and students, while being able to maintain robust assessment of the latter by the former. The same can and should be true of providers and regulators in relation to quality assurance and assessment, and last week’s OfS blog gives some hope that OfS is belatedly moving in this direction.

    It’s essential that they do.

Another of David Watson’s memorable phrases was “controlled reputational range”: the way in which the standing of UK higher education was maintained by a combination of internal and external approaches. It is clear from recent provider failures and the instances of unacceptable practice in relation to some franchised provision that this controlled reputational range is increasingly at risk. And while this is down to developments and events in England, it jeopardises this reputation for universities across the UK.

    A large part of the responsibility for this must sit with OfS and its approach to date to regulating academic quality and standards. There have also been significant failings on the part of awarding bodies, both universities and private providers. The answer must therefore lie in partnership working between regulators and universities, moving closer to a co-regulatory approach based on a final critical element of UK higher education identified by Watson – its “collaborative gene”.

    OfS’s blog post on its developing approach to quality assessments holds out hope of moves in this direction. And if this is followed through, perhaps we’re on the verge of a new settlement in the quality wars.

    Source link

  • The world is sorting out the quality of transnational education, but where is England?

    The world is sorting out the quality of transnational education, but where is England?

    If you believe – as many do – that English higher education is among the best in the world, it can come as an unwelcome surprise to learn that in many ways it is not.

    As a nation that likes to promote the idea that our universities are globally excellent, it feels very odd to realise that the rest of the world is doing things rather better when it comes to quality assurance.

    And what’s particularly alarming about this is that the new state of the art is based on the systems and processes set up in England around two decades ago.

    Further afield

The main bone of contention between OfS and the rest of the quality assurance world – the reason why England is coloured yellow rather than green on the infamous EQAR map, and the reason why QAA had to demit from England’s statutory Designated Quality Body role – is that the European Standards and Guidelines (ESG) require a cyclical review of institutional quality processes and the involvement of students’ opinions, while OfS wants things to be risk-based and feels quality assurance is far too important to get actual students involved.

Harsh? Perhaps. In the design of its regulatory framework the OfS was aiming to reduce burden by focusing mainly on where there were clear issues with quality – with the enhancement end handled by the TEF and the student aspect handled by actual data on how they get on academically (the B3 measures of continuation, completion, and progression) and more generally (the National Student Survey). It has even been argued (unsuccessfully) in the past that, as TEF is kind of cyclical if you squint a bit and does sort of involve students, England is in fact ESG compliant.

It’s not that OfS was deliberately setting out to ignore international norms – more that it was trying to address English HE’s historic dislike of lengthy external reviews of quality as it established a radically new system of regulation, and cyclical reviews with detailed requirements on student involvement were getting in the way of this. Obviously this was completely successful, as now nobody complains about regulatory burden and there are no concerns about the quality of education in any part of English higher education among students or other stakeholders.

Those ESG international standards were first published in 2005, with the (most recent) 2015 revision adopted by ministers from 47 countries (including the UK). There is a revision underway led by the E4 group: the European Association for Quality Assurance in Higher Education (ENQA), ESU, EUA and EURASHE – fascinatingly, the directors of three out of four of these organisations are British. The ESG are the agreed official standards for higher education quality assurance within the Bologna process (remember that?) but are also influential further afield, as a reference point for similar standards in Africa, South East Asia, and Latin America. The pandemic knocked the process off kilter a bit, but a new ESG is coming in 2027, with a final text likely to be available in 2026.

    A lot of the work has already been done, not least via the ENQA-led and EU-funded QA-FIT project. The final report, from 2024, set out key considerations for a new ESG – it’s very much going to be a minor review of the standards themselves, but there is some interesting thinking about flexibility in quality assurance methodologies.

    The UK is not England

    International standards are reflected more clearly in other parts of the UK.

    Britain’s newest higher education regulator, Medr, continues to base higher education quality assurance on independent cyclical reviews involving peer review and student input, which reads across to widely accepted international standards (such as the ESG). Every registered provider will be assessed at least every five years, and new entrants will be assessed on entry. This sits alongside a parallel focus on teaching enhancement and a focus on student needs and student outcomes – plus a programme of triennial visits and annual returns to examine the state of provider governance.

    Over at the Scottish Funding Council the Tertiary Quality Enhancement Framework (TQEF) builds on the success of the enhancement themes that have underpinned Scottish higher education quality for the past 20 years. The TQEF again involves ESG-compliant cyclical independent review alongside annual quality assurance engagements with the regulator and an intelligent use of data. As in Wales, there are links across to the assessment of the quality of governance – but what sets TQEF apart is the continued focus on enhancement, looking not just for evidence of quality but evidence of a culture of improvement.

    Teaching quality and governance are also currently assessed by cyclical engagements in Northern Ireland. The (primarily desk-based) Annual Performance Review draws on existing data and peer review, alongside a governance return and engagement throughout the year, to give a single rating to each provider in the system. Where there are serious concerns an independent investigation (including a visit) is put in place. A consultation process to develop a new quality model for Northern Ireland is underway – the current approach simply continues the 2016 HEFCE approach (which was, ironically, originally hoped to cover England, Wales, and Northern Ireland while aligning to ESG).

    The case of TNE

You could see this as a dull, doctrinal dispute of the sort that higher education is riven with – you could, indeed, respond in the traditional way that English universities do in these kinds of discussions by putting your fingers in your ears and repeating the word “autonomy” in a silly voice. But the ESG is a big deal: it is near essential to demonstrate compliance if you want to get stuck into any transnational education or set up an international academic partnership.

As more parts of the world are now demanding access to high quality higher education, it seems fair to assume that much of this will be delivered – in-country or online – by providers elsewhere. In England, we still have no meaningful way of assuring the quality of transnational education (something we appear to be among the best in the world at expanding). Indeed, we can’t even collect individualised student data about TNE.

Almost by definition, regulation of TNE requires international cooperation and international knowledge – the quasi-colonial idea that if the originating university is in good standing then everything it does overseas is going to be fine is simply not an option. National systems of quality need to be receptive to collaboration and co-regulation as more and more cross-border provision is developed – in terms of rigour, comparability (to avoid unnecessary burden) and flexibility to meet local needs and concerns.

    Of course, concerns about the quality of transnational education are not unique to England. ENQA has been discussing the issue as a part of conversations around ESG – and there are plans to develop an international framework, with a specific project to develop this already underway (which involves our very own QAA). Beyond Europe, the International Network for Quality Assurance Agencies in Higher Education (INQAAHE – readers may recall that at great expense OfS is an associated member, and that the current chair is none other than the QAA’s Vicki Stott) works in partnership with UNESCO on cross-border provision.

And it will be well worth keeping an eye on the forthcoming UNESCO second intergovernmental conference of state parties to the Global Convention on Higher Education later this month in Paris, which looks set to adopt provisions and guidance on TNE with a mind to developing a draft subsidiary text for adoption. The UK government ratified the original convention, which at heart deals with the global recognition of qualifications, in 2022. That seems to be the limit of UK involvement – there have been no signs that the UK government will even attend this meeting.

    TNE, of course, is just one example. There’s ongoing work about credit transfer, microcredentials, online learning, and all the other stuff that is on the English to-do pile. They’re all global problems and they will all need global (or at the very least, cross system) solutions.

    Plucky little England going it alone

    The mood music at OfS – as per some questions to Susan Lapworth at a recent conference – is that the quality regime is “nicely up and running”, with the various arms of activity (threshold assessment for degree awarding powers, registration, and university titles; the B conditions and associated investigations; and the Teaching Excellence Framework) finally and smoothly “coming together”.

    A blog post earlier this month from Head of Student Outcomes Graeme Rosenberg outlined more general thinking about bringing these strands into better alignment, while taking the opportunity to fix a few glaring issues (yes, our system of quality assurance probably should cover taught postgraduate provision – yes, we might need to think about actually visiting providers a bit more as the B3 investigations have demonstrated). On the inclusion of transnational education within this system, the regulator has “heard reservations” – which does not sound like the issue will be top of the list of priorities.

To be clear, any movement at all on quality assurance is encouraging – the Industry and Regulators Committee report was scathing on the then-current state of affairs, and even though the Behan review solidified the sense that OfS would do this work itself, it was not at all happy with the current fragmentary, poorly understood, and internationally isolated system.

    But this still keeps England a long way off the international pace. The ESG standards and the TNE guidance UNESCO eventually adopts won’t be perfect, but they will be the state of the art. And England – despite historic strengths – doesn’t even really have a seat at the table.

    Source link

  • Subject-level insights on graduate activity

    Subject-level insights on graduate activity

    We know a lot about what graduates earn.

    Earnings data—especially at subject level—has become key to debates about the value of higher education.

    But we know far less about how graduates themselves experience their early careers. Until now, subject-level data on graduate job quality—how meaningful their work is, how well it aligns with their goals, and whether it uses their university-acquired skills—has been missing from the policy debate.

    My new study (co-authored with Fiona Christie and Tracy Scurry and published in Studies in Higher Education) aims to fill this gap. Drawing on responses from the 2018-19 graduation cohort in the national Graduate Outcomes survey, we provide the first nationally representative, subject-level analysis of these subjective graduate outcomes.

    What we find has important implications for how we define successful outcomes from higher education—and how we support students in making informed choices about what subject to study.

    What graduates tell us

    The Graduate Outcomes survey includes a set of questions—introduced by HESA in 2017—designed to capture core dimensions of graduate job quality. Respondents are asked (around 15 months after graduation) whether they:

    • find their work meaningful
    • feel it aligns with their future plans
    • believe they are using the skills acquired at university

    These indicators were developed in part to address the over-reliance on income as a measure of graduate success. They reflect a growing international awareness that economic outcomes alone offer a limited picture of the value of education—in line with the OECD’s Beyond GDP agenda, the ILO’s emphasis on decent work, and the UK’s Taylor Review focus on job quality.

    Subject-level insights

    Our analysis shows that most UK graduates report positive early-career experiences, regardless of subject. Across the sample, 86 per cent said their work felt meaningful, 78 per cent felt on track with their careers, and 66 per cent reported using their degree-level skills.

    These patterns generally hold across disciplines, though clear differences emerge. The chart below shows the raw, unadjusted proportion of graduates who report positive outcomes. Graduates from vocational fields—such as medicine, subjects allied to medicine, veterinary science, and education—tend to report particularly strong outcomes. For instance, medicine and dentistry graduates were 12 percentage points more likely than average to say their work was meaningful, and over 30 points more likely to report using the skills they acquired at university.

    However, the results also challenge the narrative that generalist or academic degrees are inherently low value. As you can see, most subject areas—including history, languages, and the creative arts, often targeted in these debates—show strong subjective outcomes across the three dimensions. Only one field, history and philosophy, fell slightly below the 50 per cent threshold on the skills utilisation measure. But even here, graduates still reported relatively high levels of meaningful work and career alignment.

    Once we adjusted for background characteristics—such as social class, gender, prior attainment, and institutional differences—many of the remaining gaps between vocational and generalist subjects narrowed and were no longer statistically significant.

    This chart shows the raw proportion of 2018-19 graduates who agree or strongly agree that their current work is meaningful, on track and using skills, by field of study (N = 67,722)

    Employment in a highly skilled occupation—used by the Office for Students (OfS) as a key regulatory benchmark—was not a reliable predictor of positive outcomes. This finding aligns with previous HESA research and raises important questions about the appropriateness of using occupational classification as a proxy for graduate success at the subject level.
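For readers curious about what the adjustment for background characteristics described above involves in practice, the minimal sketch below shows one standard way of doing it: a logistic regression of a binary subjective outcome on field of study plus controls. This is an illustration only, not the authors’ actual analysis code, and the dataset, library choice (pandas and statsmodels) and variable names (meaningful_work, subject_area, social_class, sex, prior_attainment, provider) are all assumed.

```python
# Hedged sketch of subject-level adjustment for background characteristics.
# Not the authors' code; all file and variable names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

# One row per survey respondent; the subjective outcome is coded 1 for
# "agree / strongly agree" and 0 otherwise.
df = pd.read_csv("graduate_outcomes_extract.csv")

# Unadjusted: raw proportion reporting meaningful work, by field of study.
raw = df.groupby("subject_area")["meaningful_work"].mean().sort_values()

# Adjusted: logistic regression with subject area plus background controls,
# so the subject-area coefficients reflect gaps net of those characteristics.
model = smf.logit(
    "meaningful_work ~ C(subject_area) + C(social_class) + C(sex)"
    " + prior_attainment + C(provider)",
    data=df,
).fit()

print(raw)
print(model.summary())
```

In a sketch like this, a subject-area gap that is large in the raw proportions but small and statistically insignificant in the adjusted model is the pattern the study reports for many vocational versus generalist comparisons.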

    Rethinking what we measure and value

    These insights arrive at a time when the OfS is placing greater emphasis on regulating equality of opportunity and ensuring the provision of “full, frank, and fair information” to students. If students are to make informed choices, they need access to subject-level data that reflects more than salary, occupational status, or postgraduate progression. Our findings suggest that subjective outcomes—how graduates feel about their work—should be part of that conversation.

    For policymakers, our findings highlight the risks of relying on blunt outcome metrics—particularly earnings and occupational classifications—as indicators of course value. Our data show that graduates from a wide range of subjects—including those often labelled as “low value”—frequently go on to report meaningful work shortly after graduation that aligns with their future plans and makes use of the skills they developed at university.

    And while job quality matters, universities should not be held solely accountable for outcomes shaped by employers and labour market structures. Metrics and league tables that tie institutional performance too closely to job quality risk misrepresenting what higher education can influence. A more productive step would be to expand the Graduate Outcomes survey to include a wider range of job quality indicators—such as autonomy, flexibility, and progression—offering a fuller picture of early career graduate success.

    A richer understanding

    Our work offers the first nationally representative, subject-level insight into how UK graduates evaluate job quality in the early stages of their careers. In doing so, it adds a missing piece to the value debate—one grounded not just in earnings or employment status, but in graduates’ own sense of meaning, purpose, and skill use.

    If we are serious about understanding what graduates take from their university experience, it’s time to move beyond salary alone—and to listen more carefully to what graduates themselves are telling us.

    DK notes: Though the analysis that Brophy et al have done (employing listwise deletion, examining UK domiciled first degree graduates only) enhances our understanding of undergraduate progression and goes beyond what is publicly available, I couldn’t resist plotting the HESA public data in a similar way, as it may be of interest to readers:

[Interactive chart: HESA public Graduate Outcomes data plotted by subject area]

    Source link

  • Risk-based quality regulation – drivers and dynamics in Australian higher education

    Risk-based quality regulation – drivers and dynamics in Australian higher education

    by Joseph David Blacklock, Jeanette Baird and Bjørn Stensaker

‘Risk-based’ models for higher education quality regulation have become increasingly popular globally. At the same time there is limited knowledge of how risk-based regulation can be implemented effectively.

Australia’s Tertiary Education Quality and Standards Agency (TEQSA) started to implement risk-based regulation in 2011, aiming at an approach balancing regulatory necessity, risk and proportionate regulation. Our recently published study analyses TEQSA’s evolution between 2011 and 2024 to contribute to an emerging body of research on the practice of risk-based regulation in higher education.

    The challenges of risk-based regulation

Risk-based approaches are seen as a way to create more effective and efficient regulation, targeting resources to the areas or institutions of greatest risk. However, it is widely acknowledged that sector-specificities, political economy and social context exert a significant influence on the practice of risk-based regulation (Black and Baldwin, 2010). Choices made by the regulator also affect its stakeholders and its perceived effectiveness – consider, for example, whose ideas about risk are privileged. Balancing the expectations of these stakeholders, along with its federal mandate, has required much compromise on TEQSA’s part.

    The evolution of TEQSA’s approaches

Our study uses a conceptual framework suggested by Hood et al (2001) for comparative analyses of regimes of risk regulation, which charts aspects of context and content respectively. With this as a starting point we end up with two theoretical constructs, ‘hyper-regulation’ and ‘dynamic regulation’, as a way to analyse the development of TEQSA over time. These opposing concepts of regulatory approach represent both theoretical and empirical executions of the risk-based model within higher education.

    From extensive document analysis, independent third-party analysis, and Delphi interviews, we identify three phases to TEQSA’s approach:

    • 2011-2013, marked by practices similar to ‘hyper-regulation’, including suspicion of institutions, burdensome requests for information and a perception that there was little ‘risk-based’ discrimination in use
    • 2014-2018, marked by the use of more indicators of ‘dynamic regulation’, including reduced evidence requirements for low-risk providers, sensitivity to the motivational postures of providers (Braithwaite et al. 1994), and more provider self-assurance
    • 2019-2024, marked by a broader approach to the identification of risks, greater attention to systemic risks, and more visible engagement with Federal Government policy, as well as the disruption of the pandemic.

    Across these three periods, we map a series of contextual and content factors to chart those that have remained more constant and those that have varied more widely over time.

    Of course, we do not suggest that TEQSA’s actions fit precisely into these timeframes, nor do we suggest that its actions have been guided by a wholly consistent regulatory philosophy in each phase. After the early and very visible adjustment of TEQSA’s approach, there has been an ongoing series of smaller changes, influenced also by the available resources, the views of successive TEQSA commissioners and the wider higher education landscape as a whole.

    Lessons learned

Our analysis, building on ideas and perspectives from Hood, Rothstein and Baldwin, offers a comparatively simple yet informative taxonomy for future empirical research.

TEQSA’s start-up phase, in which a hyper-regulatory approach was used, can be linked to a contextual need of the Federal Government at the time to support Australia’s international education industry, leading to the rather dominant judicial framing of its role. However, TEQSA’s initial regulatory stance failed to take account of the largely compliant regulatory posture of the universities that enrol around 90% of higher education students in Australia, and of the strength of this interest group. The new agency was understandably nervous about Government perceptions of its performance; however, a broader initial charting of stakeholder risk perspectives could have provided better guardrails. Similarly, a wider questioning of the sources of risk in TEQSA’s first and second phases could have highlighted more systemic risks.

    A further lesson for new risk-based regulators is to ensure that the regulator itself has a strong understanding of risks in the sector, to guide its analyses, and can readily obtain the data to generate robust risk assessments.

    Our study illustrates that risk-based regulation in practice is as negotiable as any other regulatory instrument. The ebb and flow of TEQSA’s engagement with the Federal Government and other stakeholders provides the context. As predicted by various authors, constant vigilance and regular recalibration are needed by the regulator as the external risk landscape changes and the wider interests of government and stakeholders dictate. The extent to which there is political tolerance for any ‘failure’ of a risk-based regulator is often unstated and always variable.

Joseph David Blacklock is a graduate of the University of Oslo’s Master’s degree in Higher Education, with a special interest in risk-based regulation and government instruments for managing quality within higher education.

    Jeanette Baird consults on tertiary education quality assurance and strategy in Australia and internationally. She is Adjunct Professor of Higher Education at Divine Word University in Papua New Guinea and an Honorary Senior Fellow of the Centre for the Study of Higher Education at the University of Melbourne.

    Bjørn Stensaker is a professor of higher education at University of Oslo, specializing in studies of policy, reform and change in higher education. He has published widely on these issues in a range of academic journals and other outlets.

    This blog is based on our article in Policy Reviews in Higher Education (online 29 April 2025):

    Blacklock, JD, Baird, J & Stensaker, B (2025) ‘Evolutionary stages in risk-based quality regulation in Australian higher education 2011–2024’ Policy Reviews in Higher Education, 1–23.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education

    Source link

  • What the experience of neurodivergent PhD students teaches us, and why it makes me angry

    What the experience of neurodivergent PhD students teaches us, and why it makes me angry

    by Inger Mewburn

    Recently, some colleagues and I released a paper about the experiences of neurodivergent PhD students. It’s a systematic review of the literature to date, which is currently under review, but available via pre-print here.

Doing this paper was an exercise in mixed feelings. It was an absolute joy to work with my colleagues, who knew far more about this topic than me and taught me (finally!) how to do a proper systematic review using Covidence. Thanks Dr Diana Tan, Dr Chris Edwards, Associate Professor Kate Simpson, Associate Professor Amanda A Webster and Professor Charlotte Brownlow (who got the band together in the first place).

But reading each and every paper published about neurodivergent PhD students provoked strong feelings of rage and frustration. (These feelings only increased, with a tinge of fear added in, when I read of plans for the US health department to make a ‘list’ of autistic people?! Reading what is going on there is frankly terrifying – solidarity to all.) We all know what needs to be done to make research degrees more accessible. Make expectations explicit. Create flexible policies. Value diverse thinking styles. Implement Universal Design Principles… These suggestions appear in report after report – I’ve ranted on the blog here and here – yet real change remains frustratingly elusive. So why don’t these great ideas become reality? Here are some thoughts on the barriers that keep neurodivergent-friendly changes from taking hold.

    The myth of meritocracy

    Academia clings to the fiction that the current system rewards pure intellectual merit. Acknowledging the need for accessibility requires admitting that the playing field isn’t level. Many senior academics succeeded in the current system and genuinely believe “if I could do it, anyone can… if they work hard enough”. They are either 1) failing to recognise their neurotypical privilege, or 2) not acknowledging the cost of masking their own neurodivergence (I’ll get to this in a moment).

    I’ve talked to many academics about things we could do – like getting rid of the dissertation – but too many of us are secretly proud of our own trauma. The harshness of the PhD has been compared to a badge of honour that we wear proudly – and expect others to earn.

    Resource scarcity (real and perceived)

    Universities often respond to suggestions about increased accessibility measures with budget concerns. The vibe is often: “We’d love to offer more support, but who will pay for it?”. However, many accommodations (like flexible deadlines or allowing students to work remotely) cost little, or even nothing. Frequently, the real issue isn’t resources but priorities of the powerful. There’s no denying universities (in Australia, and elsewhere) are often cash strapped. The academic hunger games are real. However, in the fight for resources, power dynamics dictate who gets fed and who goes without.

    I wish we would just be honest about our choices – some people in universities still have huge travel budgets. The catering at some events is still pretty good. Some people seem to avoid every hiring freeze. There are consistent patterns in how resources are distributed. It’s the gaslighting that makes me angry. If we really want to, we can do most things. We have to want to do something about this.

    Administrative inertia

    Changing established processes in a university is like turning a battleship with a canoe paddle. Approval pathways are long and winding. For example, altering a single line in the research award rules at ANU requires approval from parliament (yes – the politicians actually have to get together and vote. Luckily we are not as dysfunctional in Australia as other places… yet). By the time a solution is implemented, the student who needed it has likely graduated – or dropped out. This creates a vicious cycle where the support staff, who see multiple generations of students suffer the same way, can get burned out and stop pushing for change.

    The individualisation of disability

    Universities tend to treat neurodivergence as an individual problem requiring individual accommodations rather than recognising systemic barriers. This puts the burden on students to disclose, request support, and advocate for themselves – precisely the executive function and communication challenges many neurodivergent students struggle with.

    It’s akin to building a university with only stairs, then offering individual students a piggyback ride instead of installing ramps. I’ve met plenty of people who simply get so exhausted they don’t bother applying for the accommodations they desperately need, and then end up dropping out anyway.

    Fear of lowering ‘standards’

    Perhaps the most insidious barrier is the mistaken belief that accommodations somehow “lower standards.” I’ve heard academics worrying that flexible deadlines will “give some students an unfair advantage” or that making expectations explicit somehow “spoon-feeds” students.

    The fear of “lowering standards” becomes even more puzzling when you look at how PhD requirements have inflated over time. Anyone who’s spent time in university archives knows that doctoral standards aren’t fixed – they’re constantly evolving. Pull a dissertation from the 1950s or 60s off the shelf and you’ll likely find something remarkably slim compared to today’s tomes. Many were essentially extended literature reviews with modest empirical components. Today, we expect multiple studies, theoretical innovations, methodological sophistication, and immediate publishability – all while completing within strict time limits on ever-shrinking funding.

    The standards haven’t just increased; they’ve multiplied. So when universities resist accommodations that might “compromise standards,” we should ask: which era’s standards are we protecting? Certainly not the ones under which most people supervising today had to meet. The irony is that by making the PhD more accessible to neurodivergent thinkers, we might actually be raising standards – allowing truly innovative minds to contribute rather than filtering them out through irrelevant barriers like arbitrary deadlines or neurotypical communication expectations. The real threat to academic standards isn’t accommodation – it’s the loss of brilliant, unconventional thinkers who could push knowledge boundaries in ways we haven’t yet imagined.

    Unexamined neurodiversity among supervisors

    Perhaps one of the most overlooked barriers is that many supervisors are themselves neurodivergent but don’t recognise it or acknowledge what’s going on with them! In fact, since starting this research, I’ve formed a private view that you almost can’t succeed in this profession without at least a little neurospicey.

    Academia tends to attract deep thinkers with intense focus on specific topics – traits often associated with autism (‘special interests’ anyone?). The contemporary university is constantly in crisis, which some people with ADHD can find provides the stimulation they need to get things done! Yet many supervisors have succeeded through decades of masking and compensating, often at great personal cost.

    The problem is not the neurodivergence or the supervisor – it’s how the unexamined neurodivergence becomes embedded in practice, underpinned by an expectation that their students should function exactly as they do, complete with the same struggles they’ve internalised as “normal.”

    I want to hold on to this idea for a moment, because maybe you recognise some of these supervisors:

    • The Hyperfocuser: Expects students to match their pattern of intense, extended work sessions. This supervisor regularly works through weekends on research “when inspiration strikes,” sending emails at 2am and expecting quick responses. They struggle to understand when students need breaks or maintain strict work boundaries, viewing it as “lack of passion.” Conveniently, they have ignored those couple of episodes of burn out, never considering their own work pattern might reflect ADHD or autistic hyper-focus, rather than superior work ethic.
    • The Process Pedant: Requires students to submit written work in highly specific formats with rigid attachment to particular reference styles, document formatting, and organisational structures. Gets disproportionately distressed by minor variations from their preferred system, focusing on these details over content, such that their feedback primarily addresses structural issues rather than ideas. I get more complaints about this than almost any other kind of supervision style – it’s so demoralising to be constantly corrected and not have someone genuinely engage with your work.
    • The Talker: Excels in spontaneous verbal feedback but rarely provides written comments. Expects students to take notes during rapid-fire conversational feedback, remembering all key points. They tend to tell you to do the same thing over and over, or forget what they have said and recommend something completely different next time. Can get mad when questioned over inconsistencies – suggesting you have a problem with listening. This supervisor never considers that their preference for verbal communication might reflect their own neurodivergent processing style, which isn’t universal. Couple this with a poor memory and the frustration of students reaches critical. (I confess, being a Talker is definitely my weakness as a supervisor – I warn my students in advance and make an effort to be open to criticism about it!).
    • The Context-Switching Avoider: Schedules all student meetings on a single day of the week, keeping other days “sacred” for uninterrupted research. Becomes noticeably agitated when asked to accommodate a meeting outside this structure, even for urgent matters. Instead of recognising their own need for predictable routines and difficulty with transitions (common in many forms of neurodivergence), they frame this as “proper time management” that students should always emulate. Students who have caring responsibilities suffer the most with this kind of inflexible relationship.
    • The Novelty-Chaser: Constantly introduces new theories, methodologies, or research directions in supervision meetings. Gets visibly excited about fresh perspectives and encourages students to incorporate them into already-developed projects. May send students a stream of articles or ideas completely tangential to their core research, expecting them to pivot accordingly. Never recognises that their difficulty maintaining focus on a single pathway to completion might reflect ADHD-related novelty-seeking. Students learn either 1) to chase butterflies and make little progress or 2) to nod politely at new suggestions while quietly continuing on their original track. The first kind of reaction can lead to a dangerous lack of progress, the second reaction can lead to real friction because, from the supervisor’s point of view, the student ‘never listens’. NO one is happy in these set ups, believe me.
    • The Theoretical Purist: Has devoted their career to a particular theoretical framework or methodology and expects all their students to work strictly within these boundaries. Dismisses alternative approaches as “methodologically unsound” or “lacking theoretical rigour” without substantive engagement. Becomes noticeably uncomfortable when students bring in cross-disciplinary perspectives, responding with increasingly rigid defences of their preferred approach. Fails to recognise their intense attachment to specific knowledge systems and resistance to integrating new perspectives may reflect autistic patterns of specialised interests, or even difficulty with cognitive flexibility. Students learn to frame all their ideas within the supervisor’s preferred language, even when doing so limits their research potential.

    Now that I know what I am looking for, I see these supervisory dynamics ALL THE TIME. Add in whatever dash of neuro-spiciness is going on with you and all kinds of misunderstandings and hurt feelings result … Again – the problem is not the neurodivergence of any one person – it’s the lack of self reflection, coupled with the power dynamics that can make things toxic.

    These barriers aren’t insurmountable, but honestly, after decades in this profession, I’m not holding my breath for institutional enlightenment. Universities move at the pace of bureaucracy after all.

    So what do we do? If you’re neurodivergent, find your people – that informal network who “get it” will save your sanity more than any official university policy. If you’re a supervisor, maybe take a good hard look at your own quirky work habits before deciding your student is “difficult.” And if you’re in university management, please, for the love of research, let’s work on not making neurodivergent students jump through flaming bureaucratic hoops to get basic support.

    The PhD doesn’t need to be a traumatic hazing ritual we inflict because “that’s how it was in my day.” It’s 2025. Time to admit that diverse brains make for better research. And for goodness sake, don’t put anyone on a damn list, ok?

    AI disclaimer: This post was developed with Claude from Anthropic because I’m so busy with the burning trash fire that is 2025 it would not have happened otherwise. I provided the concept, core ideas, detailed content, and personal viewpoint while Claude helped organise and refine the text. We iteratively revised the content together to ensure it maintained my voice and perspective. The final post represents my authentic thoughts and experiences, with Claude serving as an editorial assistant and sounding board.

    This blog was first published on Inger Mewburn’s  legendary website The Thesis Whisperer on 1 May 2025. It is reproduced with permission here.

    Professor Inger Mewburn is the Director of Researcher Development at The Australian National University where she oversees professional development workshops and programs for all ANU researchers. Aside from creating new posts on the Thesis Whisperer blog (www.thesiswhisperer.com), she writes scholarly papers and books about research education, with a special interest in post PhD employability, research communications and neurodivergence.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education

    Source link

  • Higher education can cut through the immigration debate with a focus on quality

    Higher education can cut through the immigration debate with a focus on quality

    The surge for Reform in the recent local elections in England has increased fears in the higher education sector that Labour may feel compelled to focus on driving down immigration at the expense of its other priorities and missions – James Coe has set out the risks of this approach on Wonkhe.

    Vice chancellors are understandably frustrated with the public debate on immigration and do not relish the prospect of rehearsing the same political cycle in the wake of the forthcoming white paper on legal migration. All can reel off data point after data point demonstrating the value of international student recruitment to their regions and communities, which according to the most recent London Economics calculations for the academic year 2022–23 brought £41.9bn a year in economic returns to the UK. That data is well supported by polling that suggests the public is generally pretty unfussed about international students compared to other forms of legal migration. The latest insight from British Future on the public’s attitudes to international students found:

    International students are seen to boost the UK economy, fill skills gaps, improve local economies and create job opportunities for locals and make cities and towns more vibrant and culturally diverse.

    Heads of institution also add that of all the many and varied problems and complaints that arise from engagement with their local communities and regions, international students have never once featured. The problem, they say, is not policy, it is politics. And when politics tilts towards finding any means to drive down overall migration, higher education inevitably finds itself in the position of being collateral damage, despite the economic and reputational harm done – because it’s much easier to reduce student numbers than to tackle some of the more complex and intransigent issues with immigration.

    Standing the heat

To give the government its due, the signal it wants to send on student visas is not currently about eroding the UK’s international competitiveness as a destination for study, but much more about reducing the use of that system for purposes for which it was never designed, particularly as a route to claiming asylum. Measures proposed are likely to include additional scrutiny of those entering from Nigeria, Pakistan, and Sri Lanka – an approach that may sit uncomfortably in making broad assumptions about a whole cohort of applicants, but at least has the benefit of being risk-based. That nuance may be lost, however, in how the public conversation plays out both within the UK and in the countries where prospective international students and their governments and media pay close attention to the UK international policy landscape and associated mood music.

    The political challenge is not limited to higher education. Recognising the derailing effect of constant short-term reactive announcements in immigration policy, a number of influential think tanks including the Institute for Government, the Institute for Public Policy Research, the Centre for Policy Studies, Onward, and British Future have called on the government to create an annual migration plan. The Institute for Government’s explanation of how it envisages an annual migration plan would work sets out benefits including clarity on overall objectives for the system with the ability to plan ahead, the segmentation of analysis and objectives by route, and the integration of wider government agendas such as those on skills, or foreign policy.

    For the higher education sector, an annual planning approach could make a big difference, creating space for differentiated objectives, policy measures and monitoring of student and graduate visas – something that in many ways would be much more meaningful than removing student numbers from overall published net migration figures, or presenting them separately. It could open up a sensible discussion about what data represents a meaningful measure, what should be adopted as a target and what should be monitored. It could also open up space for a more productive conversation between higher education representatives and policymakers focused on making the most of the connections between international education, regional and national skills needs, and workforce planning.

    In the weeks and months ahead the government is also expected to publish a refreshed international education strategy, which should give the sector a strong steer about what the government wants to see from international higher education. But it will be critical for that strategy to have a clear line of sight to other government priorities on both the economy and the wider immigration picture, to prevent it being siloed and becoming dispensable.

The fate of the last government’s international education strategy tells an instructive tale about what happens when government is not joined up in its agenda. Three years ago the sector and its champions in Westminster celebrated the achievement of a core objective of that strategy – attracting 600,000 students to the UK – eight years earlier than planned. But that rapid growth proved both unsustainable, as numbers dropped again in response to external shocks, and politically problematic, as students bringing dependants drove up overall numbers and the government responded with another shift in policy. The credibility and longevity of the refreshed strategy will depend on the government’s willingness to back it when the political heat is turned up in other parts of the immigration system.

    Quality is our watchword

    The higher education sector is justifiably proud of its international offer and keen to work with government on developing a shared plan to make the most of opportunities afforded by bringing students to the UK to study. The focus has to be on quality: attracting well-qualified and capable applicants; offering high-quality courses focused on developing career-relevant skills, particularly where there is strategic alignment with the government’s industrial strategy; and further enhancing the global employability of UK international graduates, whether it’s through securing a good job via the Graduate route, or elsewhere.

The value of international recruitment – in terms of valuable skills and cultural capital – is not always very tangible to people living in communities, and addressing that comes down to telling stories in ways that people can connect with. As one Labour Member of Parliament suggested to us, many parts of Britain are in the process of reimagining their collective identities, and part of the job is building a compelling identity connection with the new economy rather than harking back to an imagined past. That is work that sits somewhat apart from simply explaining the value of international students, but may also turn out to be intimately connected to it.

    Higher education institutions can work with employers, the regional and national policymakers concerned with skills, innovation and growth, and in local communities, to further that agenda, but they need the breathing space afforded by policy stability and a clear plan from government they can trust will be sustainable. To create that space, the sector will need to demonstrate that it has a high standard of practice and will not tolerate abuse of the system. “Abuse” is a loaded word; many of the practices that raise alarm are technically legal, but they put the system as a whole in jeopardy. The sector has a great track record on developing a shared standard of practice through instruments like the Agent Quality Framework, but it may also need to collectively think through whose job it is to call out those who fall short of those standards, to avoid the whole sector being tarred with the brush of irresponsible practice.

    While the landscape is complicated and at times disheartening, UK higher education can cut through the noise by sticking like glue to its quality message. Many universities are bigger and longer standing than Premier League football clubs – but those bastions of community pride have also had to work through challenges with their places and update their practice as the landscape has shifted. There is an opportunity with the forthcoming white paper and international education strategy to get the government and the sector on the same side when it comes to international higher education. Both parties will need to show willing to hear where the other is coming from to avoid another five years of frustration.

    This article is published in association with IDP Education. It draws on a private discussion held with policymakers and heads of institution on the theme of international higher education’s contribution to regional economic growth. The authors would like to thank all those who took part in that discussion.

    Source link

  • Programs like tutoring in jeopardy after Linda McMahon terminates COVID aid spending extensions

    Programs like tutoring in jeopardy after Linda McMahon terminates COVID aid spending extensions

    This story was originally published by Chalkbeat. Sign up for their newsletters at ckbe.at/newsletters.

    HVAC projects to improve indoor air quality. Tutoring programs for struggling students. Tuition support for young people who want to become teachers in their home communities.


    Source link

  • SMART Technologies Launches AI Assist in Lumio to Save Teachers Time

    SMART Technologies Launches AI Assist in Lumio to Save Teachers Time

    Lumio by SMART Technologies, a cloud-based learning platform that enhances engagement on student devices, recently announced a new feature for its Spark plan. This new offering integrates AI Assist, an advanced tool designed to save teachers time and elevate student engagement through AI-generated quiz-based activities and assessments.

    Designing effective quizzes takes time—especially when crafting well-balanced multiple-choice questions with plausible wrong answers to encourage critical thinking. AI Assist streamlines this process, generating high-quality quiz questions at defined levels in seconds so teachers can focus on engaging their students rather than spending time on quiz creation.

    Source link

  • Universities that expand access have graduates who take longer to repay their loans

    Universities that expand access have graduates who take longer to repay their loans

    I’ll admit that the Neil O’Brien-powered analysis of graduate repayments in The Times recently annoyed me a little.

    There’s nothing worse than somebody attempting to answer a fascinating question with inappropriate data (if you want to see just how bad it is, I did a quick piece at the time). But it occurred to me that there is a way to address whether graduate repayments of student loans really do differ meaningfully by provider, and to think about what may be causing this.

    What I present here is the kind of thing that you could probably refine a little if you were, say, shadow education minister with access to some numerate researchers to support you. I want to be clear up front that, with public data and a cavalier use of averages and medians, this can only be described as indicative and should be used appropriately and with care (yes, this means you, Neil).

    My findings

    There is a difference in full time undergraduate loan repayment rates over the first five years after graduation by provider in England when you look at the cohort that graduated in 2016-17 (the most recent cohort for which public data over five years is available).

    This has a notable and visible relationship with the proportion of former students in that cohort from POLAR4 quintile 1 (the 20 per cent of areas with the lowest rates of young participation in higher education).

    Though it is not possible to draw a direct conclusion, it appears that subject of study and gender also have an impact on repayments.

    There is also a relationship between the average amount borrowed per student and the proportion of the cohort at a provider from POLAR4 Q1.

    The combination of higher average borrowing and lower average earnings makes remaining loan balances (before interest) after five years look worse at providers with a higher proportion of students from disadvantaged backgrounds.

    On the face of it, these are not new findings. We know that pre-application background has an impact on post-graduation success – it is a phenomenon that has been documented numerous times, and it is the main basis for complaints about the use of progression data as a proxy for the quality of education available at a provider. Likewise, we know that salaries differ by gender and by industry (which has a close, but not direct, link to subject of study).

    Methodology

    The Longitudinal Education Outcomes (LEO) dataset currently offers a choice of three cohorts where median salaries are available one, three, and five years after graduation. I’ve chosen to look at the most recent available cohort, which graduated in 2016-17.

    Thinking about the five years between graduation and the last available data point, I’ve assumed that median salaries for year 2 are the same as year 1, and that salaries for year 4 are the same as year 3. I can then take 9 per cent of earnings above the relevant threshold as the average repayment – taking two year ones, two year threes, and a year five gives me an average total repayment over five years.

    The relevant threshold is whatever the Department for Education says was the Plan 1 repayment threshold (all of these loans would have been linked to Plan 1 repayments) for the year in question.
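
    To make the arithmetic concrete, here is a minimal sketch of that calculation in Python. The thresholds and salaries are placeholder figures rather than the published DfE numbers, so swap in the real Plan 1 thresholds for the relevant tax years before reading anything into the output.

    ```python
    # Minimal sketch of the five year repayment estimate described above.
    # The thresholds are illustrative placeholders - substitute the published
    # DfE Plan 1 repayment thresholds for the relevant years.
    PLAN1_THRESHOLDS = {1: 17_500, 2: 17_800, 3: 18_300, 4: 18_900, 5: 19_400}


    def yearly_repayment(median_salary: float, threshold: float) -> float:
        """9 per cent of earnings above the Plan 1 threshold (zero if below it)."""
        return max(0.0, median_salary - threshold) * 0.09


    def five_year_repayment(salary_y1: float, salary_y3: float, salary_y5: float) -> float:
        """Estimated total repayments over the first five years after graduation.

        LEO only publishes median salaries one, three and five years out, so
        year 2 is assumed equal to year 1 and year 4 equal to year 3.
        """
        salaries = {1: salary_y1, 2: salary_y1, 3: salary_y3, 4: salary_y3, 5: salary_y5}
        return sum(yearly_repayment(salaries[y], PLAN1_THRESHOLDS[y]) for y in range(1, 6))


    # Example: a provider cohort with median salaries of £21k, £24k and £27k
    print(round(five_year_repayment(21_000, 24_000, 27_000)))
    ```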

    How much do students borrow? There is variation by provider – here we turn to the Student Loans Company 2016 cycle release of Support for Students in Higher Education (England). This provides details of all the full time undergraduate fee and maintenance loans provided to students that year, by provider – we can divide the total value of loans by the total number of students to get the average loan amount per student. There are two problems with this: I want to look at a single cohort, and this gives me an average for all students at the provider that year. In the interests of speed I’ve just multiplied this average by three (for a three year full time undergraduate course) and assumed that the year of study differentials net out somehow. It’s not ideal, but there’s not really another straightforward way of doing it.
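
    The same shortcut on the borrowing side looks something like the sketch below – the provider names, figures, and column names are invented for illustration, not taken from the SLC release.

    ```python
    # Sketch of the borrowing estimate - figures and column names are invented
    # for illustration; the real numbers come from the SLC 2016 cycle release.
    import pandas as pd

    slc = pd.DataFrame({
        "provider": ["Provider A", "Provider B"],
        "total_loans_paid": [150_000_000, 90_000_000],  # fee plus maintenance loans, £
        "students_supported": [12_000, 9_000],
    })

    # Average loan per student in the year, scaled up to a three year course
    slc["avg_loan_per_student"] = slc["total_loans_paid"] / slc["students_supported"]
    slc["three_year_borrowing"] = slc["avg_loan_per_student"] * 3
    print(slc[["provider", "three_year_borrowing"]])
    ```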

    We’ve not plotted all of the available data – the focus is on English providers, specifically English higher education institutions (filtering out smaller providers, where averages are less reliable). And we don’t show the University of Plymouth (yet) – there is a problem with the SLC data somewhere.

    Data

    This first visualisation gives you a choice of X and Y axis as follows:

    • POLAR % – the proportion of students in the cohort from POLAR4 Q1
    • Three year borrowing – the average total borrowing per student, assuming a three year course
    • Repayment 5YAG – the average total amount repaid, five years after graduation
    • Balance 5YAG – the average amount borrowed minus the average total repayments over five years

    You can highlight providers of interest using the highlighter box – the size of the blobs represents the size of the cohort.

    [Interactive visualisation: provider-level repayments, borrowing, and POLAR4 Q1 proportion]

    Of course, we don’t get data on student borrowing by provider and subject – but we can still calculate repayments on that basis. Here’s a look at average repayments over five years by CAH2 subject (box on the top right to choose) – I’ve plotted against the proportion of the cohort from POLAR4 Q1 because that curve is impressively persistent.

    [Interactive visualisation: average five year repayments by CAH2 subject against POLAR4 Q1 proportion]

    For all of the reasons – and short cuts! – above I want to emphasise again that this is indicative data – there are loads of assumptions here. I’m comfortable with this analysis being used to talk about general trends, but you should not use this for any form of regulation or parliamentary question.

    The question it prompts, for me, is whether it is fair to assume that providers with a bigger proportion of non-traditional students will be less effective at teaching. Graduate outcome measures may offer some clues, but there are a lot of caveats to any analysis that relies solely on that aspect.

    Source link

  • Becoming a professional services researcher in HE – making the train tracks converge

    Becoming a professional services researcher in HE – making the train tracks converge

    by Charlotte Verney

    This blog builds on my presentation at the BERA ECR Conference 2024: at crossroads of becoming. It represents my personal reflections on working in UK higher education (HE) professional services roles while simultaneously gaining research experience through a Masters and a Professional Doctorate in Education (EdD).

    Professional service roles within UK HE include recognised professionals from other industries (eg human resources, finance, IT) and HE-specific roles such as academic quality, research support and student administration. Unlike academic staff, professional services staff are not typically required, or expected, to undertake research, yet many do. My own experience spans roles within six universities over 18 years delivering administration and policy that supports learning, teaching and students.

    Traversing two tracks

    In 2016, at an SRHE Newer Researchers event, I was asked to identify a metaphor to reflect my experience as a practitioner researcher. I chose this image of two train tracks as I have often felt that I have been on two development tracks simultaneously –  one building professional experience and expertise, the other developing research skills and experience. These tracks ran in parallel, but never at the same pace, occasionally meeting on a shared project or assignment, and then continuing on their separate routes. I use this metaphor to share my experiences, and three phases, of becoming a professional services researcher.

    Becoming research-informed: accelerating and expanding my professional track

    The first phase was filled with opportunities; on my professional track I gained a breadth of experience, a toolkit of management and leadership skills, a portfolio of successful projects and built a strong network through professional associations (eg AHEP). After three years, I started my research track with a masters in international higher education. Studying felt separate to my day job in academic quality and policy, but the assignments gave me opportunities to bring the tracks together, using research and theory to inform my practice – for example, exploring theoretical literature underpinning approaches to assessment whilst my institution was revising its own approach to assessing resits. I felt like a research-informed professional, and this positively impacted my professional work, accelerating and expanding my experience.

    Becoming a doctoral researcher: long distance, slow speed

    The second phase was more challenging. My doctoral journey was long, taking 9 years with two breaks. Like many part-time doctoral students, I struggled with balance and support, with unexpected personal and professional pressures, and I found it unsettling to simultaneously be an expert in my professional context yet a novice in research. I feared failure, and damaging my professional credibility as I found my voice in a research space.

    What kept me going, balancing the two tracks, was building my own research support network and my researcher identity. Some of the ways I did this were through Zoom calls with EdD peers for moral support, joining the Society for Research into Higher Education to find my place in the research field, and joining the editorial team of a practitioner journal to build my confidence in academic writing.

    Becoming a professional services researcher: making the tracks converge

    Having completed my doctorate in 2022, I’m now actively trying to bring my professional and research tracks together. Without a roadmap, I’ve started in my comfort zone, sharing my doctoral research in ‘safe’ policy and practitioner spaces, where I thought my findings could have the biggest impact. I collaborated with EdD peers to tackle the daunting task of publishing my first article. I’ve drawn on my existing professional networks (ARC, JISC, QAA) to establish new research initiatives related to my current practice in managing assessment. I’ve made connections with fellow professional services researchers along my journey, and have established an online network to bring us together.

    Key takeaways for professional services researchers

    Bringing my professional experience and research tracks together has not been without challenges, but I am really positive about my journey so far, and for the potential impact professional services researchers could have on policy and practice in higher education. If you are on your own journey of becoming a professional services researcher, my advice is:

    • Make time for activities that build your research identity
    • Find collaborators and a community
    • Use your professional experience and networks
    • It’s challenging, but rewarding, so keep going!

    Charlotte Verney is Head of Assessment at the University of Bristol. Charlotte is an early career researcher in higher education research and a leader within higher education professional services. Her primary research interests are in the changing nature of administrative work within universities, using research approaches to solve professional problems in higher education management, and using creative and collaborative approaches to research. Charlotte advocates for making the academic research space more inclusive for early career and professional services researchers. She is co-convenor of the SRHE Newer Researchers Network and has established an online network for higher education professional services staff engaged with research.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education

    Source link