  • Scotland orders sector-wide assessment review after Glasgow QAA findings

    Ethan Brown was a 23-year-old geography student who died by suicide on what should have been his graduation day.

    Three months prior, the University of Glasgow had “wrongly informed him that he did not have the necessary credits to graduate.”

    The case has prompted the first Targeted Peer Review report to be published under Scotland’s Quality Concerns Scheme. The QAA’s report makes for difficult reading – and goes beyond framings that emphasise the individual tragedy in the case.

    The review did not cover the individual circumstances of Ethan’s death – that’s a matter for the Crown Office and Procurator Fiscal Service, to whom QAA has passed its report.

    Instead, the four-person peer review team, including a student reviewer, spent September to November 2025 examining whether the errors that led to a student being given the wrong outcome were isolated or systemic.

    Their conclusion is unambiguous – the University of Glasgow’s assessment framework poses “systemic risks” to both academic standards and the quality of the student experience.

    That phrase – “systemic risk to academic standards” – appears at least a dozen times across the report’s 27 pages. For an agency that operates within Scotland’s enhancement-led model, that is unusually direct language.

    Quality in Scotland tends toward enhancement – developmental recommendations and collaborative improvement. This report reads differently.

    The university has fully accepted the recommendations made in the review and says it will implement them through a comprehensive plan that builds on current change projects.

    But the Scottish Funding Council’s response – commissioning QAA to conduct a national review of assessment policies across all Scottish institutions – suggests the regulator shares the concern that this may not be a Glasgow-specific problem.

    Systemic

    The story begins in February 2025, when Glasgow’s own internal investigation into the School of Geographical and Earth Sciences identified what it explicitly called a “systemic problem” in following the university’s assessment regulations.

    Concerns were raised relating to the consistency and operation of assessment regulations, and an internal investigation found issues with the complexity and application of those regulations – prompting a self-referral to the Scottish Funding Council.

    The internal investigation’s findings, as summarised in the QAA report, describe fragmented practice across multiple fronts.

    Exam boards were maladministered, with a lack of clarity in minute-taking. Communication with students at risk of not graduating was poor or unclear. Multiple methods existed for handling extension requests, including the Good Cause Policy, with little consistency between them.

    Individual professional services staff were carrying too much of the load on extensions and student communication, with inadequate backup. Record-keeping was weak, with no coherent system for managing individual student cases. And a perception had developed – whether accurate or not – that students were responsible for chasing up their own cases if they hadn’t heard back.

    The error that affected Ethan Brown was not spotted by any university staff, nor by two internal exam boards, nor by an external exam board. He should have graduated with a 2:1 Honours degree. The layers of oversight designed, in part, to catch exactly this kind of mistake failed to do so.

    For this review, then, the implications go wider than the tragedy – and into wider questions surrounding processes, standards and academic governance.

    Eighteen spreadsheets

    If there’s one detail that captures the governance failure, it’s that at the time of the QAA visit, 18 different spreadsheet formats were in operation across the university for calculating programme-level degree outcomes.

    These weren’t variations on a theme – they were 18 distinct calculation platforms, locally owned and maintained by individual schools, with no central oversight of what was being used where.

    A course aggregation tool has been in development since September 2024, intended to automate calculations and replace this patchwork of local spreadsheets. The target was 86 per cent adoption by semester 1 of 2025-26. The reality at the time of the TPR visit? 17.6 per cent – roughly 800 of the university’s approximately 4,500 courses.

    Programme-level aggregation – the calculation that determines final degree classification – remains entirely manual, relying on those varied local spreadsheets. The QAA team was told about a standardised template called U-PAS (Universal Programme Aggregation Spreadsheet) that had supposedly been adopted across two colleges.

    But the story shifted during the review. First it was standard in two colleges. Then it was standard in one college and one school in another. Then, following the visit, it emerged that U-PAS is actually in operation in six schools across two colleges. A different but standardised spreadsheet is used across another college’s eight schools. The rest? Various local arrangements.

    Paragraph 67 explains:

    The TPR team considers the fact that the University struggled to establish exactly which spreadsheets were in use and where over the period of this TPR to be indicative of the lack of institutional oversight and awareness previously taken in this area.

    The institution did not know its own assessment infrastructure.

    75 per cent

    Glasgow’s assessment regulations include what’s known as the “75 per cent rule” – a provision allowing credit to be awarded when a student has completed 75 per cent of summative assessment.

    The mechanics are complex and differ between honours and non-honours programmes, but the problem is simple enough – there is no formal mechanism guaranteeing that all intended learning outcomes have been demonstrated at the point that credit is awarded.

    The QAA team found:

    …no evidence of sampled traces that follow outcomes from specification to assessed work and then to exam board decisions.

    Where the 75 per cent rule is applied, a student can progress or receive an award without demonstrating performance across the full set of assessed learning outcomes. The report concludes that this “signifies a systemic risk to academic standards” and endorses the university’s plan to remove the rule as part of its regulation simplification programme.

    The rule presumably exists because flexibility was thought necessary – perhaps for students with legitimate reasons for non-completion, or because some disciplines argued their assessment patterns required it.

    But without a mechanism to ensure ILOs are actually met, flexibility becomes a gap in assurance. How many other institutions have similar disconnects between what degree classification is supposed to certify and what it actually guarantees?

    Meanwhile, the previous “Good Cause” policy – the process for students to report extenuating circumstances – had been under review since 2021. Students who met with the QAA team reported that several student representatives had sought reform as part of their manifestos, “signifying this as a key priority for the student body.”

    The challenges were known – inconsistent application across schools, each programme operating its own Good Cause Committee, fragmented digital infrastructure, potential for single points of failure, and complex evidence requirements.

    The new Extenuating Circumstances Policy, which went live on 15 September 2025, takes a fundamentally different approach. Stage 1 is centrally managed with a wellbeing focus – Student Support Officers triage claims within 24 hours, assess seriousness, and connect students with support or escalate to safeguarding. Stage 2 – due from semester 2 of 2025-26 – involves locally managed academic decisions on outcomes like resits or extensions.

    Implementation was accelerated following the internal investigation. Staff who met with the QAA team expressed concern about the rapid pace, with some issues arising as a result, though they expressed confidence these would be addressed. The report notes, with characteristic understatement, that:

    …the implementation timeline may have been longer had the internal investigation not occurred.

    Policy lag is not unique to Glasgow, of course. The Enhancement-Led Institutional Reviews in 2014 and 2019 both made recommendations about consistency in exam boards around the use of discretion. The university removed discretion in 2021, and Good Cause had been under active review since 2021 – but some will now argue that it took a death to accelerate the timeline. How many other long-standing issues are sitting in committee cycles across the sector, awaiting a crisis to force action?

    On the job

    Assessment Officers and external examiners who met with the QAA team reported “limited or no formal training” on the Code of Assessment, and described reliance on local briefing and practice. Student-facing and support staff said that familiarity with the Code of Assessment is often “on-the-job rather than through mandatory training.”

    Glasgow confirmed that there’s no institutional record of training on the Code of Assessment. Guidance exists – for Assessment Officers, for Chairs of Boards of Examiners – but there’s no single view demonstrating comprehensive training coverage by role, and no routine monitoring of whether the standard minutes template is actually being used. The template itself, notably, is available but not compulsory.

    The report recommends, “as a matter of urgency and before the next assessment diet,” a standardised mandatory cyclical training programme. Training must be mandatory for Assessment Officers, exam board Chairs, and key administrators, with a process to confirm and monitor completion. The sense is that this is basic governance infrastructure that shouldn’t require external intervention to establish.

    The most difficult section of the report concerns what remains unresolved. Following the internal investigation, the university undertook additional analysis specifically in the School of Geographical and Earth Sciences (GES) – examining the rules around progression from junior to senior honours since 2021-22.

    At the time of the QAA visit, this analysis had checked more than 700 student records and confirmed two students with mistaken outcomes, with a further five students requiring investigation before confirmation.

    Critically, the QAA team confirmed with the university that no similar checks had been made in any of the other 23 schools. The university’s position was that an assessment had been made on risk, and GES was the only “high risk” school identified.

    The report notes that to carry out a whole-institution check of this nature would be a “huge task” – but also records that staff mentioned ongoing consideration of expanding the scope of the analysis beyond the School of GES to the whole institution.

    Even for those relying on risk-based sampling, Paragraph 69 is pointed:

    Given that this analysis was incomplete at the time of the TPR visit (and therefore could not be scrutinised by the TPR team), the extent to which past, present and future awards are affected is unknown.

    But we’re different

    There’s a thread running through the report about the balance between institutional standardisation and school-level delegation – a tension familiar to anyone working in a large university with collegiate or distributed structures.

    It’s typically framed as a debate about protecting academic standards. Schools or departments argue for local control precisely because they understand their disciplines best – chemistry assessment differs from history, professional body requirements vary, pedagogic traditions are specific. Autonomy is defended as the guarantor of subject-level rigour.

    But the Glasgow story inverts the logic. In this case, the distributed model didn’t protect standards – it undermined them. The 18 spreadsheets weren’t expressions of disciplinary distinctiveness – they were evidence of an absence of institutional responsibility.

    External examiners couldn’t provide the check because they were working within school-level silos rather than against an institutional standard. The thing rhetorically positioned as safeguarding academic standards became the systemic risk to them.

    Senior staff told the QAA team that the university is:

    …committed to finding the right balance between total standardisation and total delegation, with a shift to greater standardisation (and sometimes total standardisation) where University practice has drifted out of line with sector norms.

    Examples were given where established delegated practices had been replaced with greater standardisation “with little concern among staff in recent years” – including the removal of discretion from exam boards and work on UKVI processes.

    The team recommends that the university develops an approach to policy and process implementation that strikes an “appropriate balance” between complete standardisation across the university, and complete delegation to school level – with identified principles to be followed in arriving at an implementation plan for each policy.

    For me, this is essentially asking – where does legitimate disciplinary variation end and institutional abdication begin? It’s a question that could usefully be posed well beyond Glasgow.

    From a student perspective

    What does all the fragmentation actually look like from the student end? The report offers some telling details. Multiple students told the QAA team they found calculating their Grade Point Average “challenging” – unsurprisingly, given there’s no central system to help them do it.

    Student support staff in one college reported that students regularly seek advice on grade calculations, and that they have to tell them they’re “not Assessment Officers” and therefore “are not confident confirming if calculations they make for students are correct.”

    Staff and SRC Student Advice Centre representatives both identified grade calculation as the most common area of student confusion. The Code of Assessment is, recall, written primarily for staff – and even they can’t reliably interpret it.

    Handbooks are supposed to bridge this gap, but their production is delegated to schools and programmes. Practice varies wildly – some are comprehensive student guides, others focus on assessment only, some exist at course level. Staff who met with the QAA team were unaware of any central guidance on what should be in them, and noted that course administrators often have responsibility for updates.

    The risk, which the report flags, is that handbooks contain outdated information or omit valuable guidance on support services. Links to central webpages are intended to keep things current – but without oversight, nobody’s checking.

    Then there’s how the university actually talks to students. The report devotes a section to “compassionate communication”, which is supposed to involve clear, empathetic, timely messaging aligned with values of kindness and respect.

    Following the internal investigation, Glasgow delivered targeted training to staff with Board of Examiners responsibilities in the School of GES in March 2025. But the QAA team “found no evidence of the extent of coverage, frequency, or scope beyond this school.”

    Staff reported limited awareness and highlighted “the absence of a shared standard.” Students, meanwhile, described university correspondence as generally helpful but sometimes “daunting” and “strong in tone.”

    Finance-related communications were flagged as a particular problem, with the Wellbeing Team observing the knock-on impact on students.

    What’s next

    Glasgow now has to submit an action plan within four weeks covering all 21 recommendations. It will be subject to additional institutional liaison meetings in 2025-26 and 2026-27 to monitor progress, and its next external peer review – the Tertiary Quality Enhancement Review – has been brought forward by a year to 2027-28.

    More significantly for the sector, the Scottish Funding Council has commissioned QAA to conduct a national review of assessment and associated policies and procedures across all Scottish institutions.

    The scope and timeline for this review have not yet been announced, but the implication is clear – if Glasgow’s distributed governance model created these risks, how confident can we be that similar arrangements elsewhere are functioning effectively?

    Every Scottish institution with devolved academic governance should be asking itself some uncomfortable questions. Do we know what spreadsheets are being used for degree classification calculations? Is training on assessment regulations mandatory and tracked? Are external examiner themes being synthesised and actioned systematically? Can we demonstrate that intended learning outcomes are met at the point of credit award? What would a similar review find here?

    For institutions elsewhere, the read-across is less direct but still relevant. In England, for example, OfS’ B conditions cover academic standards, and there has been ongoing work on degree outcome statements and classification algorithms. The core problem Glasgow faced – complexity in regulations, variability in interpretation and reliance on individual competence without systematic training or oversight – is not unique to Scottish HE governance.

    The QAA team notes that Glasgow’s Code of Assessment is “long, dense and complex, which staff find difficult to interpret consistently.” That’s not an unusual description of academic regulations anywhere in the UK.

    The question is whether institutions have the oversight mechanisms to detect when complexity becomes a systemic risk – and whether the rhetoric of disciplinary autonomy is being used to defend standards – or to avoid the hard work of assuring them.

    As I say, this was not a review into the death of Ethan Brown. But his mother, Tracy Scott, says she feels the findings of the report:

    …support the family’s concerns at the level of incompetence within the University of Glasgow, which they feel places students at serious risk.

    To the extent that the issues in the report touch on the case, and the wider debate about duty of care, I’d finally observe that there’s a tendency in that debate to focus on the behaviours of individuals.

    But this review also reminds us that poor systems can be both a contributor to tragedy, and a legitimate potential concern for those who reasonably expect the university they are in contract with to take steps to avoid foreseeable harm.

    =====

    A University of Glasgow spokesperson said:

    Following an internal investigation into assessment regulations, the University self-referred to the Scottish Funding Council.

    The University fully accepts the recommendations subsequently made by the QAA Peer Review and the risks it identifies.

    Since February 2025, we have worked to address the issues highlighted in the internal investigation and will implement the recommendations of the QAA review through a comprehensive plan that builds on current change projects.