Category: Students

  • Wales can lead the way on student engagement – if it chooses to

    Imagine studying in a Wales where every student understands their rights and responsibilities.

    Where module feedback drives real change, where student representatives have time, resources and power to make a difference, and where complaints drive learning, not defensiveness.

    Where every student contributes to their community in some way – and where decisions can’t be made about students without students.

    When the Tertiary Education and Research (Wales) Act 2022 was being drafted, the inclusion of a mandatory Learner Engagement Code was important – Wales resolved to put into primary legislation what England had buried in the B Conditions and Scotland had largely left to institutional discretion.

    Section 125 now requires the Commission to prepare and publish a code on learner involvement in decision-making. It’s not optional guidance or best practice – it’s law.

    This year the newly formed commission (MEDR) has been informally consulting on it – but so long has now passed since the original debates that there’s a danger everyone helping to develop it will forget what it was supposed to do.

    Nobody will benefit from a Code that emerges weak or vague. The opportunity is for Wales to lead the way with some crunchy “comply or explain” provisions for universities in Wales – reflecting the fact that this has been put in primary legislation.

    The cost of getting it wrong

    We know what happens when learner engagement is treated as an afterthought. In England, providers often silence critique on reputational grounds – the Office for Students’ (OfS) free speech guidance had to explicitly state that students have the right to publicly criticise their institutions. Imagine needing regulatory clarification that criticism is allowed in a democracy.

    Meanwhile, Scottish institutions celebrate their “partnership” approach while student representatives struggle to influence decisions that matter. Sparqs frameworks look good on paper, but without regulatory teeth, they rely on institutional goodwill. And goodwill, as any student rep will tell you, tends to evaporate when difficult decisions need making.

    When module evaluation becomes a tick-box exercise rather than genuine dialogue, problems fester. When student reps are excluded from decisions about their own education, drop-out rates climb. When complaints are buried rather than learned from, the same issues affect cohort after cohort.

    I’ve seen a lot of it over the years. The disabled student who gave up trying to get adjustments implemented because every lecturer claimed the central service’s plans were “merely advisory”. The international PGT student who couldn’t complain about teaching quality because they feared visa implications. The part-time student who couldn’t access support services because everything was designed around full-time, on-campus students.

    The student facing disciplinary proceedings who wasn’t allowed an advocate and faced a panel with no student members – in contrast to the support available to staff in similar situations.

    These aren’t edge cases – they’re systematic failures that a robust Code could prevent. Wales has a genuine opportunity to do something different – to create a Code with teeth that makes learner engagement mandatory, measurable and meaningful.

    Learning from what works

    The most effective student engagement systems share common features. They’re comprehensive, covering everything from module evaluation to strategic planning, and are backed by resources, ensuring student representatives aren’t expected to volunteer countless hours without support. And crucially, they have consequences when institutions fail to comply.

    The key is moving from “should” to “must”, with a comply or explain mechanism that has genuine bite.

    Here’s how it could work. The Code would set out clear standards – not aspirations but requirements. Providers would either have to comply with the standards or publicly explain why they’ve chosen an alternative approach that delivers equivalent or better outcomes.

    But – and this is crucial – explanations wouldn’t be allowed to be boilerplate excuses. They would need to be evidence-based, time-limited, and subject to scrutiny.

    The Commission would assess compliance annually, not through tick-box returns but through triangulated evidence – student surveys, complaint patterns, representation effectiveness metrics, and crucially, the views of student representatives themselves.

    Where providers persistently fail to meet standards without adequate justification, consequences would follow – from improvement notices to conditions on funding.

    There would be an expectation of an annually agreed student partnership agreement – setting out both processes and priority actions – and an expectation that students’ unions would produce an annual report on the experiences of students at that provider.

    This isn’t about micromanaging institutions – it’s about establishing minimum standards while allowing flexibility in how they’re met. A small FE provider might implement representation differently than a large university, but both must demonstrate their approach delivers genuine student voice in decision-making.

    Student rights and democratic education

    The Code should first establish that students are both consumers with enforceable rights and partners in their education. This dual recognition ends the sterile debate about whether students are one or the other. It means providers must respect consumer rights (quality, promises kept, redress) while creating genuine partnership structures.

    Knowing your rights matters. Following Poland’s model, all students should receive comprehensive training on their rights and responsibilities within 14 days of starting. That shouldn’t be an optional freshers’ week session – it should be mandatory education covering consumer rights, representation opportunities, complaints procedures, support services, and collective responsibilities.

    Crucially, the training should be developed and delivered by the SU. There should be written materials in (both) plain language(s), recorded sessions for those who can’t attend, annual refreshers, and staff trained to respect and uphold these rights. When every graduate understands both their rights and responsibilities, Wales will transform not just higher education but society.

    Protected status and academic adjustments

    Following Portugal’s model, student representatives should get protected status. That means academic adjustments for representative duties, just as providers must accommodate pregnancy or disability. No student should face the choice between failing their degree or fulfilling their democratic mandate.

    Representatives should get justified absences for all activities – not just formal meetings but preparation, consultation, and training. Assessments should be rescheduled without penalty, deadlines adjusted based on representative workload, and attendance requirements modified. Reps should get protection from any form of academic discrimination.

    The Finnish model adds another layer – ideally, student representatives in governance should receive academic credit or remuneration (or both). Learning through representation is learning – about negotiation, governance, and strategic thinking. These are skills that matter in any career.

    Module evaluation as universal engagement

    The Estonian approach shows what’s possible when feedback becomes embedded in academic culture. Making evaluation mandatory for module completion ensures universal participation. But it must be meaningful – published results, documented actions, closed feedback loops. Every student becomes a partner in quality enhancement, not just the engaged few.

    Wales should adopt Estonia’s three-part structure – teaching quality, student engagement, and learning outcomes. This recognises that educational success requires both good teaching and student effort. No more blaming students for poor outcomes while ignoring teaching failures, and no more student satisfaction surveys that ignore whether students are actually engaging with their learning.

    Results should be published within modules – not buried in committee papers but visible where students choose modules. Previous evaluation results, actions taken, ongoing improvements – all should be required to be transparent. Future students should be able to see what they’re signing up for, and current students should see their feedback matters.

    Comprehensive scope of engagement

    Sweden’s clarity is instructive – students must be represented “when decisions or preparations are made that have bearing on their courses or programmes or the situation of students.” There are no weasel words about “where appropriate” or “when practicable” – if it affects students, students must be involved.

    In the Netherlands, where decisions are made by individuals, not committees, information must be provided and consultation must occur at least 14 days in advance. And written explanations should be required when student recommendations aren’t followed – because accountability matters in managerial decisions.

    Beyond academic structures, students should be represented on professional service boards, IT committees, estates planning groups, marketing focus groups. Decisions about campus facilities or digital systems affect students as much as curriculum design – yet these areas often lack any student voice.

    The digital environment deserves special attention. Student representatives should be involved in decisions about learning platforms, assessment systems and communication tools – not after implementation but during planning. Because digital accessibility and usability directly impact educational success.

    Consent not consultation

    Wales could be bold. Following the Dutch model, some decisions should require student consent, not just consultation. The Code could distinguish clearly between:

    Matters requiring consent (cannot proceed without student agreement):

    • Teaching and Assessment Regulations
    • Significant programme structure changes
    • Student charter content
    • Institutional policy frameworks affecting learners
    • Quality assurance procedures
    • Representation structure and changes
    • Elective module options for the following year

    Matters requiring consultation (mandatory input but not binding):

    • Budget allocations affecting student services
    • Campus development plans
    • Strategic planning
    • Staff appointments affecting students
    • Marketing and recruitment strategies

    Matters governed by a council of staff and students:

    • Student accommodation
    • Student employment
    • Student services and mental health
    • Harassment and sexual misconduct policy

    Matters delegated to the students’ union:

    • Student engagement and representation
    • Student activities and volunteering

    This isn’t radical – it’s a recognition that students are genuine partners. No other stakeholder group would accept purely advisory input on regulations governing their activities. Why should students?

    From course reps to citizens

    Another area where Wales could be genuinely radical is taking its vision of students as citizens beyond traditional representation structures – broadening “engagement” beyond academic quality.

    The European model of subject-level associations – common from Helsinki to Heidelberg – shows what’s possible. These aren’t just academic societies but genuine communities combining social activities, career development, representation, and civic engagement. They create belonging at the discipline level where students actually identify.

    In Tallinn, departmental student bodies aren’t sideshows but partners in departmental culture. They organise orientation, run mentoring, coordinate with employers, feed into curriculum development – and crucially, they’re funded and recognised as essential, not optional extras.

    In some countries there’s even a “duty of contribution” where students volunteer to help run the institution. Green officers, peer mentors, student ambassadors – multiple routes to engagement beyond traditional representation. Not everyone wants to be a course rep. But everyone can contribute something.

    Even if we’re just talking about student clubs and societies, Wales should mandate that providers support and fund these diverse engagement routes.

    Every student should serve somehow during their studies – it’s citizenship education in practice. Some will be traditional representatives, others will mentor new students, run sustainability initiatives, organise cultural events, support community engagement. All develop democratic skills. All should share responsibility for their community.

    Taking part

    Some countries maintain a tripartite principle for major bodies – equal representation of students, academic staff, and professional staff – to recognise that universities are communities, not hierarchies. Maybe that’s asking too much – but even with a minimum of two students in the room, representation means nothing without support.

    Some countries require that student reps receive all documentation at least five days in advance, training on context and background, briefings on complex issues, and support to participate fully – you can’t contribute if you don’t understand what’s being discussed.

    When new committees or working groups are established, there should be active consideration of student membership with default presumption of inclusion. Decisions and justifications should be communicated to student representatives, and there should be annual reviews of representation effectiveness with evidence-based changes.

    Some countries transform meetings from tokenistic to meaningful. Distributing materials five working days in advance means no ambushing student representatives with complex papers. Everything in accessible language, translated where needed, should be a standard too.

    The Swedish innovation of publishing all decisions and rationales builds accountability. Rather than being buried in minutes, decisions get actively communicated. Students can see what’s decided in their name and why – democracy requires transparency. And committees should adopt minimum student membership levels with voting rights – there should never (ever) be just one student in a room.

    Funded independence

    Latvia mandates that SUs receive at least 0.5 per cent of institutional income, and minimums were agreed as part of the Australian Universities Accord. This isn’t generous – it’s the minimum needed for effective representation. The Welsh Code should set a minimum as a percentage of income or fees – ensuring student bodies have resources to train representatives, gather evidence, and hold institutions accountable.

    Funding should come with independence safeguards. There should be no conditions that compromise advocacy, no reductions for challenging decisions, and protected status even when (especially when) relationships become difficult. Written agreements should protect core funding even during institutional financial difficulties.

    Beyond core funding, providers should be required to supply facilities, administrative support, IT access, and time for representatives. The split between guaranteed core funding for democratic functions and negotiated funding for service delivery would protect both representation and student services.

    Complaints as learning and conduct

    Complaints are a really important part of student engagement – and so the OIA’s Good Practice Framework, which learns from them, should be mandatory, not optional. A proper system treats complaints as valuable intelligence, not irritations to be managed.

    Wales should then go further, automatically converting failed appeals containing service complaints into formal complaints. When patterns emerge, compensation should go to all affected students, not just those who complained. And every provider should be required to publish what it’s learned from complaints over the past year, and what it’s doing about it – with sign off from the SU.

    The Swedish model’s restrictions on disciplinary proceedings protect students from institutional overreach. Proceedings are only allowed for academic misconduct, disruption of teaching, disruption of operations and harassment. And students are given full procedural rights – including representation, disclosure and presence during evidence.

    Wales should go further. Every student facing disciplinary proceedings should have the right to independent support, and any panel should include student members who are properly trained and supported. Peer judgement matters in community standards.

    And neither disciplinary nor funding processes should ever be used to silence criticism, punish protest, retaliate for complaints or discourage collective action. The free speech protections in OfS’ guidance should be baseline – students’ right to criticise their institution is absolute, whether individually or collectively.

    Disability rights are student rights

    Every year, countless disabled students arrive with hope and ambition, only to find themselves trapped in a Kafkaesque system of “support” that demands disclosure, documentation, negotiation, repetition, and often – silence. If Wales is to lead, then it should be unflinching in acknowledging the daily indignities that disabled students face – and bold in tackling the systemic failures that allow them to persist.

    Adjustments, when granted, are inconsistently implemented, and advocacy, if it exists at all, is fractured and under-resourced. In many departments, reasonable adjustments are still treated as optional extras. Central services write the plans, but academic departments dispute their legitimacy, claiming subject expertise trumps legal obligation. Students are asked to justify, to prove, to persuade – again and again. And often in public – as if their access needs were a debate.

    Disabled students can’t be expected to fight these battles alone. Wales should require institutions to facilitate advocacy, embedded close to academic departments, co-located with SUs where possible, and independent enough to challenge unlawful behaviour when necessary. Not every rep can be an expert in disability law. But every student should have access to someone who is.

    The law is clear – providers have an anticipatory duty. That means planning ahead for the barriers disabled students face, not waiting for students to hit them. But few providers conduct serious, evidence-based assessments of their disabled student population – by type of impairment, by subject area, by mode of study. Without that, how can anyone claim to be meeting the duty? Wales could also set the tone nationally with a mandatory bank of questions in the NSS that probes access, implementation, and inclusion.

    Wales’ code should mandate that providers move beyond warm words to hard strategy – analysing disability data with student input, mapping gaps, and resourcing change. Every provider should be required to publish a Disability Access Strategy – co-designed with students, informed by evidence, and backed with budget. And implementation should be monitored – not through passive complaints, but active auditing. Where there are failures, there should be automatic remedies – and if patterns persist, the Commission must intervene.

    And briefing all students on disabled students’ rights would help too. If every student understood what disabled students are legally entitled to, fewer adjustments would be denied, more peers would offer solidarity, and institutions would face pressure from all sides to comply with the law. Education here is empowerment – for disabled and non-disabled students alike.

    Wales could lead

    If all of that feels like a lot, that’s because it is.

    But that’s why it was put in primary legislation – to show what’s possible when you take student engagement seriously, to create structures that outlast changes in institutional leadership or political climate, and to graduate citizens who understand democracy because they’ve practiced it.

    But most importantly, to lead:

    The Commission will ensure that Welsh PCET providers lead the UK in learner and student engagement and representation.

    Universities Wales isn’t so sure. In its response to the Regulatory System Consultation it said:

    We do have a number of concerns about regulatory over-reach that can be found in several of the pillars. For example, in the Learner Engagement pillar, the demand for investment of resources and support for learner engagement could be deemed to be a breach of institutional autonomy, particularly in light of this being married to ‘continuous improvement’ – if this ends up being a metric on which the sector is judged, it could be particularly contentious in tight financial circumstances.

    Good grief. It really isn’t a breach of institutional autonomy for students to expect that a little slice of their fees (whether paid by them or not) will be allocated to their active engagement and will be under their control. As Welsh Government put it during the passage of the Bill:

    There is already some excellent learner engagement within the sector, but the prize now is to ensure this is the norm across all types of provisions and for all learners.

    Welsh Government talks about civic mission, distinctive Welsh values, and education for citizenship – in universities, the Code is where rhetoric can meet reality.

    Fine words should become firm requirements, and partnership can stop being what institutions do to students and become what students and institutions do together.

    I know which Wales I’d rather study in. The question now is whether MEDR has the courage to mandate it.

  • From where student governors sit, Dundee isn’t the only institution with governance challenges

    There are a couple of typical ways to “read” Pamela Gillies’ investigation report into financial oversight and decision making at the University of Dundee.

    One is to imagine that the issues in it are fairly unique to that university – that a particular set of people and circumstances were somehow not picked up properly by a governing body apparently oblivious to what was happening below the surface.

    In that extreme, the key failing was not doing all the Scottish Code for Good Higher Education Governance asks its governors to do.

    Another is to wonder whether, even with a clean bill of “good governance” health, it could happen elsewhere.

    One of the fascinating things about organisational failure is the way governance tends to be picked up as a problem – it can lead to the conclusion that, because organisational failure is not widespread, the governance issues must be local.

    If you position governance exclusively as scrutiny, it could of course be the case that the culture of governance is weak across the board – it’s just that most senior teams in universities don’t make the mistakes that were evidently made at Dundee, and thus we’d never know.

    After all, nobody questions governance when things are going well, when funding is flowing and when student numbers are on the up. If anything, in that positioning, the danger is in complacency – because governance needs to come into its own to avoid mistakes and catch issues before they become catastrophes.

    When Gillies’ report was published, I couldn’t avoid recalling countless conversations I’ve had over the years with student members of governing bodies about everything from the lateness of papers to the culture of decision making.

    So to test the waters, I pulled out 14 governance issues from the investigation and put a brief (anonymous) survey out to students’ union officers who are members of their Board, Council or Court.

    I can’t claim that 41 responses (captured in the second half of June and the first half of July) are representative of the whole sector, and nor are they representative of the whole of the governing bodies on which respondents have sat.

    But there is enough material in there to cause us concern about how universities around the UK are governed.

    A culture of control

    One issue that Pam Gillies picked up was leadership dominance, where the vice chancellor and chair were found to have “behaved like they have everything under control” while governing bodies failed to provide adequate challenge.

    When we asked whether student governors had experienced leadership that “routinely dominates discussions, controls narratives to present overly positive pictures, or makes it difficult for governors to raise concerns,” 68 per cent said they’d experienced this “a lot”. Another 27 per cent said “a little.”

    That’s 95 per cent of respondents experiencing some level of what one might generously call “narrative management” by their senior teams.

    The comments flesh out what this looks like in practice. One student governor observed:

    You are told at the start that your job is to manage the VC and the SMT but they manage the governors. The Chair and the VC behave like they have everything under control. The room just does not seem interested in education or the student experience, more whether it is running as a business.

    Another captured the emotional impact:

    Whenever I have asked a question or said something even questioning let alone critical about UEG it’s like I have suggested burning down their office. They are allowed to be both over-defensive and over-reassuring rather than treat contributions from me and some of the other more vocal governors as contributions to thinking. It makes the whole thing quite pointless.

    It’s not just about dominance – it’s also about active silencing. Gillies found that dissenting voices were marginalised and that “critical challenge was not welcomed.” Our survey bears this out.

    When asked about governors being “shut down, spoken over, dismissed as ‘obstructive,’ or otherwise discouraged when trying to challenge decisions,” 51 per cent reported experiencing this “a lot”. Another 37 per cent said “a little”.

    The mechanisms are subtle but effective. One respondent noted being warned at the start of their term that the previous student president had not been “constructive” and that to get things done, they needed to be “constructive” instead. The implied threat was clear – play nice or be frozen out.

    It was made very clear to me at the start that the previous President had not been ‘constructive’ and that if I wanted to get things done I needed to be ‘constructive’. All year I have felt torn – other governors would regularly ask me at the meal what was ‘really going on’ but I never felt like I could be critical in the actual meeting because of the ‘partnership’. I feel like the VC was under a lot of pressure to perform for the governors, and that makes it impossible to say anything about what you think is going wrong.

    Another described the choreography of exclusion:

    The power dynamics are fascinating if you’re into that sort of thing. Watch who the Chair makes eye contact with, whose contributions get minuted vs. ‘noted’, who gets interrupted vs. who can ramble for 10 minutes unchecked. I never got the premium treatment – I feel that the Chair needs some feedback on whose thoughts they obviously value.

    That isolation extends beyond meetings. Multiple respondents noted deliberate strategies to separate them from support:

    One tendency we picked up on a lot was to isolate me from support, I wasn’t allowed to discuss the papers with my CEO or have my CEO in the room. It’s only student on the board. They say that’s for confidentiality, but everyone else in the room is clearly discussing their issues with people who can put everything into a context. I think it should be the law that two students are on the board.

    The theatre of governance

    Gillies found that important decisions at Dundee were made outside formal governance structures, with a “small inner circle” controlling key outcomes. Our survey question on decision-making transparency suggests this is far from unique.

    When asked whether “important decisions are made by a small inner circle before reaching the governing body,” 51 per cent said this happened “a lot”, with another 44 per cent saying “a little”.

    The comments reveal how that manifests. One student governor described discovering a shadow governance structure:

    I think there’s a huge element of culture at my institution which prevents effective governance but it’s also the structure. There’s a meeting which isn’t included in the governance structure but everything goes to it before it can go anywhere else and it’s restricted to senior managers at the university. If it isn’t approved there, it won’t happen, even if things like rent negotiations have taken place in the ‘proper’ meetings, they can just scrap it and say ‘no, this is what needs to happen’ and then we’re just told. It feels like secret meeting which secretly governs everything and every other meeting is a rubber stamp for decisions made there.

    Another put it more bluntly:

    The meetings are very odd places, we don’t have any input at all on anything. Everything that comes to the Court is finished, and our job seems to be to politely probe what is in front of us (always once, follow ups frowned upon). Eye-opening but completely pointless.

    Gillies highlighted how late papers and missing documentation hampered effective governance at Dundee – the control of information emerges as a critical tool in maintaining this system across the sector. Over half (54 per cent) of respondents in our survey reported experiencing late papers, missing documentation, or “critical updates given verbally rather than in writing” frequently.

    But it goes deeper than administrative incompetence. When asked about financial information quality – an area Gillies found particularly problematic at Dundee – 37 per cent said they’d frequently received reports that “were unclear, seemed to obscure the true position, contained unexplained anomalies, or lacked integrated information.”

    One respondent shared a particularly telling anecdote:

    Training – our old CFO was a dick. He said that he wouldn’t train student members of Council in the finances because we ‘wouldn’t understand it’ which, in my mind, seems like something to a) find out and b) entirely irrelevant to a governor asking to see financial information.

    The systematic exclusion of student perspectives from board papers then compounds it:

    Many of the budget requests and department updates did not reflect the student experience accurately whether it was missing data from specific feedback routes or lacking in student perspective entirely, it made approvals difficult for me and difficult for the board as I would then be asked for the data and even though I can share some of the issues I know of I cannot represent the entire student body. With only 48hrs notice.

    The message seems to be that knowledge is power – and student governors aren’t meant to have it.

    Living in fantasy land

    Gillies found that Dundee’s governing body had been presented with “overly positive pictures” that obscured institutional reality. Quite striking in our survey is the disconnect between the institution presented in governance meetings and the one students actually experience.

    Multiple respondents described sitting through presentations that bore no resemblance to reality:

    The university that gets presented isn’t the university I was at as a student.

    Another elaborated:

    It feels a lot like a fantasy world in there but they really don’t know how the university actually works, and the questions they ask are so weird, like they are desperate for the university to be as good as they imagine it is when there are really a lot of problems with how it runs especially at school level.

    This fantasy is then maintained through what we might call the tyranny of positivity. When asked whether they’d felt “pressure to maintain positive messaging even when you have legitimate worries,” 61 per cent said they’d experienced this “a lot”.

    The enforcement mechanisms vary. Some are explicit:

    They love talking about ‘student voice’ in the abstract but hate it when we actually speak. I raised concerns about library hours during exams and the DVC literally rolled his eyes. Later the Chair pulled me aside and said I should ‘pick my battles more carefully’ and focus on ‘strategic matters’.

    Others are more subtle. Multiple respondents described being praised for contributions that never led to change:

    I was often praised in the minutes. ‘Thoughtful contribution from the student member.’ But praise without change feels hollow – a polite pat on the head.

    This disconnect between fantasy and reality is exacerbated by what several respondents identified as an unhealthy fixation on rankings:

    A lot of the meetings were really interested in what I had to say, but the obsession with league tables is bizarre. We spent easily an hour at the last meeting discussing how to game NSS metrics but when I suggested actually fixing the issues students raise – timetabling chaos, inconsistent feedback, broken IT systems – I got blank stares. One governor literally said ‘can’t we just manage student expectations better?’ What’s the point?

    Another observed:

    There are about sixteen of us in theory but really there are six people who speak at every meeting, and it is always about whether we are beating other universities. I don’t think the governors have any way to judge how well the university is doing other than by thinking about other universities. It is very weird.

    This comparative obsession substitutes for genuine evaluation of institutional health – where things become filtered through the lens of institutional positioning rather than student experience.

    The survey responses also reveal how regulatory compliance has become another distorting filter. Several respondents noted how the Office for Students has inadvertently created perverse incentives:

    It is very weird to me that whenever I’ve talked about student issues they are responded to with things like ‘that would not be an issue for the OfS’, like we are only supposed to worry about the student experience if OfS are doing a visit.

    It suggests that governing bodies are more concerned with regulatory perception than addressing underlying problems – a dangerous conflation of compliance with quality.

    The impossible position

    A particularly Byzantine aspect of student perceptions of governance emerges in the contradictions around representation. Multiple respondents noted being told explicitly that they were “not a representative” of students, only to have governors constantly ask them about student views:

    At the start of the year it is drilled into you that you are not a representative, and then at every meeting someone has asked me what students think, what students are saying, how students would react, and so on. It really is ridiculous.

    It creates an impossible position – student governors are simultaneously expected to embody the student voice whilst being forbidden from claiming to represent it, and are consulted when convenient but dismissed when challenging.

    The tokenism extends to how “the student experience” is conceptualised:

    There is a pressure not to rock the boat too much or the SU funding will be under threat. One other thing is that the other governors see ‘the student experience’ as one homogeneous thing. I represent 30,000 students – disabled students, commuters, mature students, international students, care leavers – but I get 5 minutes at the end of every meeting to cover ‘student matters.’ When I highlight different needs across student groups, eyes glaze over.

    One response powerfully captured another dimension of the problem:

    Too many decisions are made by white upper-middle class men who have no real understanding of student demographics or experiences and the effects that rushed, ill informed decisions can have on the student body.

    This homogeneity problem compounds all the others – if governance doesn’t reflect the communities it serves, how can it possibly understand their needs?

    Throughout the responses runs a theme of performative partnership that masks fundamental power imbalances. Student governors describe being valued for their “input” on predetermined decisions whilst being told their contributions are “premature” on anything still under genuine consideration:

    Two types of agenda items, ones where student input is ‘valued’ (anything they’ve already decided) and those where student input is ‘premature’ (anything they haven’t decided yet). Its never the right time for meaningful student contribution.

    The contrast between public and private behaviour is also revealing:

    I feel that the UET are like Jekyll and Hyde, they have listened to me outside of the meetings but when I have asked about things during Board meetings they react very defensively. I’m not supposed to be a rep for students but nobody else ever talks about students unless we count recruiting students.

    When push comes to shove

    Gillies found that committees at Dundee operated as “rubber stamping exercises” rather than providing genuine oversight. Our survey revealed similar patterns, with 46 per cent reporting committees feeling like “rubber stamping exercises.”

    Even when committees try to assert themselves, the resistance is telling:

    We had an issue with the auditors and the closest I’ve seen us come to blows as a Council was when the exec tried to treat the issue as annoying but closed and move on but Council had to say ‘actually, no, we’d like an audit of our auditors to work out how [confidential] was missed.’

    The fundamental problem, as one respondent observed, may be structural:

    I honestly think that the huge number of things the council are expected to know about and make decisions on are beyond them. They don’t meet often enough and they really do not understand their responsibilities.

    Gillies documented how Dundee’s governance processes were abandoned during crisis periods. Our survey asked about governance during “difficult periods,” and of those who didn’t say “N/A”, 51 per cent reported seeing “normal governance processes abandoned, informal advisory groups bypass committee structures, or key oversight bodies become inactive when they’re most needed.”

    It suggests that whatever thin veneer of good governance exists in normal times rapidly dissolves under pressure – precisely when robust governance is most essential:

Student input in governance is at a real risk of just becoming a box ticking exercise as I have sat in meetings where the student experience is discussed by everyone but the students in the room. Once decisions need to be made at speed all thought for student and staff is ignored and it is often because of their own burdensome governance structures that inhibit the agility needed for such a volatile time in HE.

    The human cost

    The emotional toll shouldn’t be underestimated. Multiple respondents described feeling “out of place,” “invalidated,” or like they were “betraying everyone” simply by asking questions.

    One particularly poignant comment came from a sabbatical officer who left their role early:

    It was a really tough experience as I had students relying on me. I wish that I could’ve stayed in my role for longer but the lack of transparency and wish to subdue the view for students contradicted my individual beliefs and leadership style. I was supportive and I wanted students to know what I was doing. This wasn’t always possible.

    And the lack of institutional learning is telling:

    It is telling that they spent so much time with me at the start but haven’t spent any time with me to get my feedback at the end. I feel that they should do exit interviews to learn about how intimidating the atmosphere can be.

    Perhaps most damning is the response to our final question. When asked whether they “feel confident that your governing body would identify and respond appropriately to serious institutional risks,” only 32 per cent expressed confidence.

    That means 68 per cent of student governors – governors who usually have the most intimate knowledge of how their institutions actually operate – doubt their governing body’s ability to spot and address serious problems.

    One captured the fundamental dysfunction:

    If I compare it to being on my union board I think the governors is a joke. If I ask why or how in the union we have a decent conversation. If I do it at governors the atmosphere is like I’ve betrayed everyone. And if I say something isn’t clear that is turned into something I’ve not done or read. We’re not governors. We’re an audience.

    Another summed up the experience with clarity:

    I feel that the whole thing is engineered to make the vice chancellor and her team to look good rather than gather our input or ideas, I would have side conversations with some of the community governors who shared my view but there just is not any part of any meeting where ‘input’ is welcome.

    We’re not governors. We’re an audience

Some of the most pointed critiques came in respondents’ final reflections on what governance actually means in practice:

    What frustrates me most is the wasted potential. These are genuinely smart, accomplished people who could transform this place. But they’re trapped in this weird bubble where everything’s fine and any criticism is disloyalty. I know I’m not the only one.

    The sense of governance as performance came through repeatedly:

    In the January meeting I was invited to do a presentation before the formal meeting on what student life is like and I got a lot of praise from the Chair about how eye-opening it was. But about half of the governors were not there and the PVC-E went off on one about how the university’s surveys contradicted some of the things we were saying. I feel that the whole body just doesn’t have a clue about students or staff and what it is like to be a student in 2025.

    One respondent captured the Kafkaesque nature of their experience:

    The whole ‘critical friend’ thing is such a con. We’re meant to be critical but every time I challenge something I get ‘well, Council can only advise, we cannot instruct the executive.’ So we’re legally responsible for decisions we can only ‘advise’ on? The Vice Chancellor keeps saying Council is ‘not a court’ whenever we try to hold them accountable. I’ve started asking ‘what CAN Council actually do?’ because honestly I’m not sure anymore.

    The broader implications were spelled out starkly:

    The big, big, BIG thing for us as student leaders has been ‘what Council is and is not for’. Often, when we’ve brought issues for discussion or ‘airing’ at Council, I have had every variation of ‘Council is not a court’ ‘Council can only advise the exec, it cannot instruct it’ ‘Council is for critical challenge but cannot dictate’ some of which is absolutely at odds with then being legally responsible for the decisions you have only ‘advised on’ and ‘cannot dictate’.

    And perhaps most damningly:

    As a new Sabbatical officer, I felt extremely out of place with the culture of Court meetings, as if I wasn’t supposed to be or welcome there. It made my input feel invalidated and overlooked. Structurally, important decisions are already decided upon within committees before reaching court.

    What next?

    It’s important to set what I’ve gathered in context. Student governors have a particular perspective and a specific set of confidence and cultural capital asymmetries that are bound to make being on a body of the “great and good” a difficult experience.

    41 responses is not the whole sector (and may not even be from 41 universities), and it was a self-selecting survey. But we should be worried.

In the wake of the Dundee episode, both Graeme Dey and the Scottish Funding Council have committed to exploring ways to strengthen governance to avoid a repeat.

Universities Scotland has committed to collective reflection on Gillies’ findings and the lessons they offer, to give “robust assurance” of financial management and good governance to funders, regulators, supporters and all who depend on universities.

    It has also said it will “connect” to Universities UK’s work to consider the leadership and governance skills required in the sector in times of transformation and challenge.

    As such, the same issue that students see in governing bodies is playing out nationally – there are questions that suggest a loss of autonomy, and reassurance about “performance” designed to retain it.

    There is therefore a real danger that the processes will conclude what these sorts of things always conclude – that with the right “skills” and adherence to a given Code, all will be well.

    But the experiences from students suggest that neither “getting the right skills” nor calls for better codes will solve the fundamental problems. The issue isn’t just about getting the “right” people around the table or training them better – it’s about reconsidering what we’re asking governance to do.

    Vertical or horizontal?

As I noted here and here, the Dutch experience offers an alternative. Following a series of governance scandals in the early 2000s, the Netherlands rejected both excessive state control and unfettered institutional autonomy. Its 2016 Education Governance Strengthening Act created a “third way” of multi-level democratic participation, from programme level up to institutional level.

    Rather than imposing rigid rules, the framework promoted “horizontal dialogue” where students, staff, management, and supervisors engage in ongoing conversations about their university.

A 2021 evaluation found that meaningful channels for student and staff input had been created, with improved quality of dialogue between stakeholder groups. If there are enough of them, staff and students turn out to be better at scrutiny than skilled lay members or someone from the funding council sat in the corner.

    It’s also partly about what is discussed. Most boards operate primarily in fiduciary mode (overseeing budgets, ensuring compliance) or strategic mode (setting priorities, deploying resources). While essential, these modes often crowd out what governance scholars call the “generative mode” – critical thinking, questioning assumptions, and framing problems in insightful ways.

    Generative governance asks probing questions: “What is our fundamental purpose?” and “How does this decision align with our core values?” It involves scenario planning, delving into root causes rather than symptoms, and actively considering ethical implications beyond legal compliance. And it allows senior staff to participate, rather than perform – a culture that then improves scrutiny in fiduciary mode.

It is where staff, student, and community governors could add most value – yet it’s often where their contributions are most readily dismissed as inappropriate or “operational.” The standard line that governors should be concerned with the interests of the university as a whole rather than act as representatives misses the point that understanding the lived experience of those working and studying there is essential to good governance, and actually improves fiduciary scrutiny.

Put another way, maybe better fiduciary mode scrutiny could have probed more deeply into the business plan at Dundee that leaned so heavily on recruiting Nigerian students. But it’s more likely that better generative mode governance could have surfaced what was starting to happen to the currency in Nigeria, how tough students were finding it to pay their fees, and what families were going through as the Naira collapsed.

It’s also partly about what we think “effectiveness” means. Universities facing unprecedented challenges – financial pressures, technological disruption, legitimacy crises – need governance capable of navigating complexity, not just ticking off risk registers. They need what the Dutch reforms sought – genuine accountability to the communities they serve, not just reassuring compliance with regulatory requirements.

Universities at their best are spaces where different forms of knowledge encounter each other, and where democratic values are modelled and sustained. Their governance should reflect this reality.

    As such, we need to ensure we’re solving the right problem. The issue isn’t governors who need better training or institutions that need tighter control. It’s a governance model designed for a different era and different types of organisation, struggling to cope with contemporary complexity while excluding the voices that could help navigate it.

    What we do next requires courage to move beyond the false choice between corporatisation and collegial nostalgia. A third way is possible – one that takes seriously both institutional sustainability and democratic participation, that values both expertise and lived experience, that reconciles the university interest with the interests of those who study and work there rather than separating them or elevating one of them, and that governs for the public good rather than just institutional survival.

    The students sitting in those boardrooms, feeling like audiences rather than governors, deserve better. So do the staff, the communities universities serve, and democracy itself.

    Source link

  • AUCC Partners with Spike Lee for Third Season of Entertainment Industry Fellowship

The Atlanta University Center Consortium has announced the launch of Season Three of the Spike Fellows at Gersh program, continuing its partnership with Oscar-winning director Spike Lee and The Gersh Agency to create pathways for students from historically Black colleges and universities into entertainment industry careers.

    Three students have been selected for this year’s cohort: Anwar Karim from Morehouse College, Denver Edmonds from Spelman College, and Miya Scaggs from Spelman College. The fellows were chosen based on grade point average, leadership experience, school involvement, creative work, and professional recommendations.

    The eight-week paid fellowship places students in New York or Los Angeles, where they complete rotations across different agency departments while receiving senior-level industry mentoring and participating in curated learning experiences and volunteer service projects.

    “The Spike Fellows Program continues to provide an invaluable experience and mentorship for our students who desire impact in the entertainment industry, both in front and behind the camera,” said Dr. Michael Hodge, Executive Director of the AUCC. “Each year, we see a new set of students immersed in the industry, becoming working professionals and aspiring entertainment leaders.”

    The program has achieved a 100 percent employment rate for participants, with alumni securing positions at major entertainment companies including Gersh, Netflix, Warner Brothers, and Range Media. One former fellow was inspired to pursue graduate studies at the University of Southern California’s film program.

    Beyond professional placement, the program provides comprehensive support for participants. A multi-year partnership with Ralph Lauren furnishes business attire for fellows, while networking opportunities include events like the inaugural Young Black Hollywood Mixer, which earned recognition from Deadline as one of the Best Red Carpet and Party Photos of 2024.

    The initiative targets undergraduate students from Clark Atlanta University, Morehouse College, and Spelman College who demonstrate interest in entertainment industry careers. The program aims to address equity gaps in entertainment by creating direct pathways for talented HBCU students to access industry opportunities.

    The Atlanta University Center Consortium, established in 1929, operates as a 501(c)(3) non-profit representing Clark Atlanta University, Morehouse College, Morehouse School of Medicine, and Spelman College. The organization describes itself as “the world’s oldest and largest association of historically Black colleges and universities.”

    The fellowship represents part of broader industry efforts to increase diversity in entertainment, particularly in behind-the-camera roles where representation has historically lagged. By partnering with established industry figures like Lee and agencies like Gersh, the program provides students with direct access to decision-makers and career-building opportunities typically difficult to access for underrepresented groups.

  • Universities should face the consequences for misleading students over the cost of living

    Why do students run out of money? And is it their mistake?

    It’s partly because student maintenance support has not kept pace with the cost of living.

    Last year, the Centre for Research in Social Policy (CRSP) at Loughborough University calculated that students need £18,632 a year outside London (and £21,774 a year in London) to have a minimum acceptable standard of living.

    But if you’re living away from home in England, the maximum maintenance loan is £10,227 – and it’s less than that once your parents earn over £25,000.

    And if you’re an international student, the Home Office’s “proof of funds” figure – the money you need to show you have in the bank to cover your living costs – has been (un)helpfully aligned to that inadequate figure.

    In that scenario, you’d need help with budgeting – especially if you’ve never lived away from home before, if you’ve not participated in higher education before, or if you’ve never lived in the UK before.

You’d want to know, for example, how much a TV licence costs. The good news is that your chosen university has a guide to student living costs, and it lists the licence as costing £159 per year.

    The problem? £159 was the 2021 rate – a TV licence now costs £174.50. Still, one little mistake like that isn’t going to break the bank, surely?

    Delay repay

    Over the past few years I’ve whiled away some of my train delays surfing around university websites looking at what the sector says about student cost of living.

I’ve found marketing boasts dressed up as money advice, sample student budgets that feature decades-old estimates, and reassuringly precise figures that turn out to be thumbs in the air from the ambassadors in the office.

    Often, I find webpages that say things like this:

The problem is that the “fact” turns out to be from 2023, the source on the “lowest rents” claim turns out to be “not yet reliable”, and the “one of the cheapest pints in the country” claim has as its source this story in the Independent. From 2019.

    That’s also a webpage that says you can get a bus to the seaside and back for £4.30 (it’s currently £12), a ferry to Bruges for £50 (the route was withdrawn in 2020), and a train to London for “for just over a tenner” – when even with a railcard, the lowest fare you’ll find is £22.66.

    Campus gym prices are listed as less than £20 a month (it’s actually from £22.95 for students), rent for a one-bed city flat is listed as £572 (the source actually says £623.57), and you’re even told that you can head to a “legendary” local nightclub to “down a double” for £1.90.

    Sadly, even Spiders Nightclub is having to cover “the increasing cost of basic overheads” and “the ongoing inflationary cost of purchasing stock”. The current price is £2.50.

    Those were the days

    Sometimes, I find tables like this – where the costs listed appear to be exactly the same as when the webpage was updated in 2022.

Actually, that’s not quite true. Someone has bothered to update the lower rent estimate to £500 a month since then – leaving all of the other figures unchanged.

    Archive.org allows me to see all sorts of moments when someone, somewhere, has performed an update. Of sorts.

    Here’s one where food and rent have gone up, but everything else is as it was in 2022. The main difference is that the “Yearly costs for students” lines in the table have been deleted – presumably because they would stretch credibility.

Not every university has a go at listing costs itself. Many (over 30 at the time of typing) refer their readers to the Which? Student Budget Calculator.

The Which? Student Budget Calculator was deleted in 2022 – and even when it was live, its underpinning figures were last updated in 2019.

Sometimes the Google search takes you to undated slide decks and PDFs. The metadata suggests that this one is from 2023 – although the figures in it look suspiciously similar to the numbers in the UG prospectus from 2015.

    To be fair, that’s a university that has at least got an updated chart showing sample costs in its international arrival guide – with a reassuring note that average costs are correct as of March. You’d perhaps be less reassured to find that those average costs – other than the cost of (university) accommodation – have remained exactly as they were since last year.

    Sometimes, a picture is painted of painstaking research carried out by dedicated money advisors. Here’s a table that says the minimum costs have been estimated by the university’s support teams:

    How lucky students in that city are, given that the only things that have increased over the past year are accommodation and rent:

    Actually, tell a lie. Many of the costs seem to be identical to those in 2020:

    Save us from your information

Lots of the sample budgets and costs are unsourced – but not all of them. A large number quote figures from Save the Student’s student money survey – which last year used responses from 1,010 university students in the UK to calculate the results.

    Even if that was a dataset that could be relied upon at provider or city level, that was a survey that found 67 per cent of students skipping meals to save money, 1 in 10 using food banks and 60 per cent with money related mental health problems. Not a great basis on which to budget, that.

    Others quote their costs from the NatWest Student Living Index – which for reasons I’ve explained in 2024, 2023 and 2022, isn’t an approach that I think comes close to being morally sound.

    Plenty of universities don’t list costs at all, but imply to international students that the “proof of funds” figure has been calculated by Home Office officials as enough to live on:

    It has, of course, just been copied across from DfE’s maximum maintenance loan – a figure widely believed to be wholly inadequate as an estimate of living costs for students.

    Sometimes you find things like this, a set of costs “based on feedback from our current international undergraduate and master’s students”. Someone has gone in and updated the costs for university halls – but hasn’t updated anything else, and nor have they updated the estimate for total monthly living expenses:

    Sometimes you find things like this – costs that haven’t changed in two years contained in an official looking document called “Student Regulations and Policies: Standard Additional Costs”:

    And sometimes you find miracles. Here’s a university where most of the costs haven’t increased in 18 months, and the cost of clothing has fallen dramatically – despite ONS calculating that clothing inflation is currently 5.9 per cent.

    Then there’s charts like this that are “subject to change” – although no change since last summer:

    Or unsourced tables like this, where somehow student costs have started to fall. I want to move there!

Here’s 2024. And here’s 2025:

    The long arm

    The good news for prospective students – and the bad news for universities – is that this is all now going to have to change.

    Looking at all of this through the lens of the new Digital Markets, Competition and Consumers (DMCC) Act, it’s hard to avoid the conclusion that universities have been sailing remarkably close to the wind – and that the wind direction has now changed dramatically.

    Under DMCC, the systematic provision of outdated cost-of-living information would likely constitute a serious breach of consumer protection law. The Act makes it automatically unfair to omit material information from invitations to purchase – and there’s little doubt that accurate living costs are material information for prospective students making decisions about whether and where to study.

    Crucially, there’s no longer any need to prove that students were actually misled by the information, or that it influenced their decision-making. The omission itself is the problem.

The legal framework has fundamentally shifted against universities. The scope of what counts as material information has expanded beyond the categories defined by EU obligations, while misleading actions are no longer restricted to predefined “features” of a product or service.

    Instead, any information relevant to a student’s decision can now trigger a breach – meaning universities can no longer rely on narrow, checklist-based approaches to compliance. Outdated transport costs, inflated claims about local entertainment prices, or misleading accommodation estimates all fall squarely within this expanded scope, even though they might previously have been considered peripheral to the core “product” of education.

    The Act has also lowered the threshold for proving breaches of professional diligence. Previously, universities might have argued that minor cost discrepancies didn’t cause “material distortion” of student decision-making. Now, practices need only be “likely to cause” a different decision – shifting the focus from proving impact to ensuring accurate practice from the outset.

    The Act explicitly recognises that certain groups of consumers are particularly vulnerable, and that practices which might not affect others can cause disproportionate harm to those groups.

    International students – who rely heavily on university cost estimates for visa applications and have limited ability to verify information independently – are a textbook example of vulnerable consumers. So too are first-generation university students, those from lower-income families, and young people making major financial commitments for the first time. The Act requires universities to proactively identify and mitigate risks to these vulnerable groups as part of their duty of care.

    The Competition and Markets Authority now has significant new enforcement powers, including the ability to impose civil penalties of up to 10 per cent of an organisation’s turnover and to hold corporate officers personally liable where they have consented to or negligently allowed breaches to occur.

    Given the sector-wide nature of these problems, and the ease with which accurate cost information could be obtained and maintained, it would be difficult for universities to argue that continued reliance on years-old estimates meets the standard of professional diligence now required by law.

    The sector has had years to get this right voluntarily. With enhanced legal obligations, fundamentally expanded definitions of what constitutes actionable information, lowered thresholds for proving breaches, and much sharper enforcement teeth now imminent, universities that continue to present outdated or inaccurate living costs as current information may find that their casual approach to accuracy has become a rather expensive mistake – and one entirely of their own making.


  • Why VR works for soft-skills training – Campus Review

    Virtual reality (VR) isn’t a silver-bullet replacement for lectures or labs, but it is the most practical way for higher education to deliver immersive learning effectively at scale.

  • University funding tied to antisemitism action – Campus Review

    Universities could be subject to a “report card” that assesses their responses to antisemitism, which could result in funding cuts, according to a report from Australia’s antisemitism envoy released on Thursday.

  • How can students’ module feedback help prepare for success in NSS?


    Since the dawn of student feedback there’s been a debate about the link between module feedback and the National Student Survey (NSS).

    Some institutions have historically doubled down on the idea that there is a read-across from the module learning experience to the student experience as captured by NSS and treated one as a kind of “dress rehearsal” for the other by asking the NSS questions in module feedback surveys.

    This approach arguably has some merits in that it sears the NSS questions into students’ minds to the point that when they show up in the actual NSS it doesn’t make their brains explode. It also has the benefit of simplicity – there’s no institutional debate about what module feedback should include or who should have control of it. If there isn’t a deep bench of skills in survey design in an institution there could be a case for adopting NSS questions on the grounds they have been carefully developed and exhaustively tested with students. Some NSS questions have sufficient relevance in the module context to do the job, even if there isn’t much nuance there – a generic question about teaching quality or assessment might resonate at both levels, but it can’t tell you much about specific pedagogic innovations or challenges in a particular module.

    However, there are good reasons not to take this “dress rehearsal” approach. NSS endeavours to capture the breadth of the student experience at a very high level, not the specific module experience. It’s debatable whether module feedback should even be trying to measure “experience” – there are other possible approaches, such as focusing on learning gains, or skills development, especially if the goal is to generate actionable feedback data about specific module elements. For both students and academics, seeing the same set of questions repeated ad nauseam is really rather boring, and is as likely to create disengagement and alienation from the “experience” construct NSS proposes as a comforting sense of familiarity and predictability.

    But separating out the two feedback mechanisms entirely doesn’t make total sense either. Though the totemic status of NSS has been tempered in recent years it remains strategically important as an annual temperature check, as a nationally comparable dataset, as an indicator of quality for the Teaching Excellence Framework and, unfortunately, as a driver of league table position. Securing consistently good NSS scores, alongside student continuation and employability, will feature in most institutions’ key performance indicators and, while vice chancellors and boards will frequently exercise their critical judgement about what the data is actually telling them, when it comes to the crunch no head of institution or board wants to see their institution slip.

    Module feedback, therefore, offers an important “lead indicator” that can help institutions maximise the likelihood that students have the kind of experience that will prompt them to give positive NSS feedback – indeed, the ability to continually respond and adapt in light of feedback can often be a condition of simply sustaining existing performance. But if simply replicating the NSS questions at module level is not the answer, how can these links best be drawn? Wonkhe and evasys recently convened an exploratory Chatham House discussion with senior managers and leaders from across the sector to gather a range of perspectives on this complex issue. While success in NSS remains part of the picture for assigning value and meaning to module feedback in particular institutional contexts, there is a lot else going on as well.

    A question of purpose

    Module feedback can serve multiple purposes, and it’s an open question which of those purposes different institutions consider legitimate. To give some examples, module feedback can:

    • Offer institutional leaders an institution-wide “snapshot” of comparable data that can indicate where there is a need for external intervention to tackle emerging problems in a course, module or department
    • Test and evaluate the impact of education enhancement initiatives at module, subject or even institution level, or capture progress with implementing systems, policies or strategies
    • Give professional service teams feedback on patterns of student engagement with and opinions on specific provision such as estates, IT, careers or library services
    • Give insight to module leaders about specific pedagogic and curriculum choices and how these were received by students to inform future module design
    • Give students the opportunity to reflect on their own learning journey and engagement
    • Generate evidence of teaching quality that academic staff can use to support promotion or inform fellowship applications
    • Depending on the timing, capture student sentiment and engagement and indicate where students may need additional support or whether something needs to be changed mid-module

    Needless to say, all of these purposes can be legitimate and worthwhile, but not all of them can comfortably coexist. Leaders may prioritise comparability of data – i.e. asking the same question across all modules to generate comparable results and surface priorities. Similarly, those operating across an institution may be keen to map patterns and capture differences across subjects – one example offered at the round table was whether students had met with their personal tutor. Such questions may be experienced at department or module level as intrusive and irrelevant to more immediately purposeful questions around students’ learning experience on the module. Module leaders may want to design their own student evaluation questions tailored to inform their pedagogic practice and future iterations of the module.

    There are also a lot of pragmatic and cultural considerations to navigate. Everyone is mindful that students get asked to feed back on their experiences A LOT – sometimes even before they have had much of a chance to actually have an experience. As students’ lives become more complicated, institutions are increasingly wary of the potential for cognitive overload that comes with being constantly asked for feedback. Additionally, institutions need to make their processes of gathering and acting on feedback visible to students so that students can see there is an impact to sharing their views – and will confirm this when asked in the NSS. Some institutions are even building questions that test whether students can see the feedback loop being closed into their student surveys.

    Similarly, there is also a strong appreciation of the need to adopt survey approaches that support and enable staff to take action and adapt their practice in response to feedback, affecting the design of the questions, the timing of the survey, how quickly staff can see the results and the degree to which data is presented in a way that is accessible and digestible. For some, trusting staff to evaluate their modules in the way they see fit is a key tenet of recognising their professionalism and competence – but there is a trade-off in terms of visibility of data institution-wide or even at department or subject level.

    Frameworks and ecosystems

    There are some examples in the sector of mature approaches to linking module evaluation data to NSS – it is possible to take a data-led approach that tests the correlation between particular module evaluation question responses and corresponding NSS question outcomes within particular thematic areas or categories, and builds a data model that proposes informed hypotheses about areas of priority for development or approaches that are most likely to drive NSS improvement. This approach does require strong data analysis capability, which not every institution has access to, but it certainly warrants further exploration where the skills are there. The use of a survey platform like evasys allows for the creation of large module evaluation datasets that could be mapped on to NSS results through business intelligence tools to look for trends and correlations that could indicate areas for further investigation.
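    At its simplest, the data-led approach described above comes down to computing correlations between paired score series. The following is an illustrative sketch only – the question mapping and all scores are invented, not real NSS or evasys data – pairing hypothetical per-department module-feedback averages for an “assessment” question with the matching NSS theme scores, and flagging a strong association:

    ```python
    # Illustrative sketch: correlate module-feedback averages with NSS
    # theme scores. All figures below are invented for demonstration.
    from math import sqrt

    def pearson(xs, ys):
        """Pearson correlation coefficient between two equal-length series."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical per-department averages (1-5 scale): a module-feedback
    # "assessment" question vs. the NSS "assessment and feedback" theme.
    module_scores = [3.9, 4.2, 3.5, 4.6, 3.8, 4.1]
    nss_scores    = [3.7, 4.1, 3.4, 4.5, 3.9, 4.0]

    r = pearson(module_scores, nss_scores)
    print(f"correlation: {r:.2f}")
    if r > 0.7:
        print("strong association - worth investigating as a lead indicator")
    ```

    In practice this would run over full module-level datasets exported from the survey platform, and a high correlation only flags a theme for further investigation – it does not establish that improving the module score will move the NSS result.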

    Others take the view that maximising NSS performance is something of a red herring as a goal in and of itself – if the wider student feedback system is working well, then the result should be solid NSS performance, assuming that NSS is basically measuring the right things at a high level. Some go even further and express concern that over-focus on NSS as an indicator of quality can be to the detriment of designing more authentic student voice ecosystems.

    But while thinking in terms of the whole system is clearly going to be more effective than a fragmented approach, given the various considerations and trade-offs discussed it is genuinely challenging for institutions to design such effective ecosystems. There is no “right way” to do it but there is an appetite to move module feedback beyond the simple assessment of what students like or don’t like, or the checking of straightforward hygiene factors, to become a meaningful tool for quality enhancement and pedagogic innovation. There is a sense that rather than drawing direct links between module feedback and NSS outcomes, institutions would value a framework-style approach that is able to accommodate the multiple actors and forms of value that are realised through student voice and feedback systems.

    In the coming academic year Wonkhe and evasys are planning to work with institutional partners on co-developing a framework or toolkit to integrate module feedback systems into wider student success and academic quality strategies – contact us to express interest in being involved.

    This article is published in association with evasys.


  • Why students reject second-language study – Campus Review

    Students are turning away from learning a second language other than English because they don’t see it as a viable qualification even though it is a core skill in other countries, experts have flagged.

  • Embracing complexity in writing instruction


    Early in our careers, when we were fresh-faced and idealistic (we still are!), the prepackaged curriculum and the advice of more experienced colleagues were the go-to resources. Largely, we were advised that teaching writing was a simple matter of having students walk through and complete organizers, spending about one day on each “stage” of the writing process. At the end of the writing unit, students had finished their compositions – the standardized, boring, recreated ideas that we taught them to write.

    As we matured and grew as teachers of writing, we learned that teaching writing in such simplistic ways might be easier, but it did not actually teach students to be writers. We learned with time and experience that writing instruction is a complex task within a complex system.

    Complex systems and wicked problems

    Complexity as it is applied to composition instruction recognizes that there is more than just a linear relationship between the student, the teacher, and the composition. It juggles the experiences of individual composers, characteristics of genre, availability of resources, assignment and individual goals, and constraints of composing environments. As with other complex systems and processes, it is non-linear, self-organizing, and unpredictable (Waltuck, 2012).

    Complex systems are wicked in their complexity; therefore, wicked problems cannot be solved by simple solutions. Wicked problems are emergent and generative; they are nonlinear, as they do not follow a straight path or necessarily have a clear cause-and-effect relationship. They are self-organizing, evolving and changing over time through the interactions of various elements. They are unpredictable, making it difficult to anticipate how they will unfold or what the consequences of any intervention might be. Finally, they are often interconnected, as they are symptoms of other problems. In essence, a wicked problem is a complex issue embedded in a dynamic system (Rittel & Webber, 1973).

    Writing formulas are wicked

    As formulaic writing has become prevalent in instruction and classroom writing activity, graphic organizers and structural guides, which were introduced as tools to support acts of writing, have become a wicked problem of formula; the resource facilitating process has become the focus of product. High-stakes standardized assessment has led to a focus on compliance, production, and quality control, which has encouraged the use of formulas to simplify and standardize writing instruction, the student writing produced, and the evaluation of student work. Standardization may improve test scores in certain situations, but it does not necessarily improve learning. Teachers resort to short, formulaic writing to help students get through material more quickly and to meet data and assessment compliance demands. This serves to create not only product-oriented instruction, but also a false dichotomy between process and product, ignoring the complex thinking and design that goes into writing.

    As a result of such a narrow view of and limited focus on writing process and purpose, formulas have been shown to constrain thinking and limit creativity by prioritizing product over the composing process. The five-paragraph essay, specifically, is a structure that hinders authentic composing because it doesn’t allow for the “associative leaps” between ideas that come about in less constrained writing. Formulas undermine student agency by limiting writers’ abilities to express their unique voices because of over-reliance on rigid structures (Campbell, 2014; Lannin & Fox, 2010; Rico, 1988).

    An objective process lens: A wicked solution

    The use of writing formulas grew from a well-intentioned desire to improve student writing, but ultimately creates a system that is out of balance, lacking the flexibility to respond to a system that is constantly evolving. To address this, we advocate for shifting away from rigid formulas and towards a design framework that emphasizes the individual needs and strategies of student composers, which allows for a more differentiated approach to teaching acts of writing.

    The proposed framework is an objective process lens that is informed by design principles. It focuses on the needs and strategies that drive the composing process (Sharples, 1999). This approach includes two types of needs and two types of strategies:

    • Formal needs: The assigned task itself
    • Informal needs: How a composer wishes to execute the task
    • “What” strategies: The concrete resources and available tools
    • “How” strategies: The ability to use the tools

    An objective process lens acknowledges that composing is influenced by the unique experiences composers bring to the task. It allows teachers to view the funds of knowledge composers bring to a task and create entry points for support.

    The objective process lens encourages teachers to ask key questions when designing instruction:

    • Do students have a clear idea of how to execute the formal need?
    • Do they have access to the tools necessary to be successful?
    • What instruction and/or supports do they need to make shifts in ideas when strategies are not available?
    • What instruction in strategies is necessary to help students communicate their desired message effectively?

    Now how do we do that?

    Working within a design framework that balances needs and strategies starts with understanding the types of composers you are working with. Composers bring different needs and strategies to each new composing task, and it is important for instructors to be aware of those differences. While every composer is, of course, an individual with their own proclivities and approaches, we propose that there are (at least) four common types of student composers who bring certain combinations of strategies and needs to the composition process: the experience-limited, the irresolute, the flexible, and the perfectionist composers. By recognizing these common composer types, composition instructors can develop a flexible design for their instruction.

    An experience-limited composer lacks experience in applying both needs and strategies to a composition, so they are entirely reliant on the formal needs of the assigned task and any what-strategies that are assigned by the instructor. These students gravitate towards formulaic writing because of their lack of experience with other types of writing. Relatedly, an irresolute composer may have a better understanding of the formal and informal needs, but they struggle with the application of what and how strategies for the composition. They can become overwhelmed with options of what without a clear how and become stalled during the composing process. Where the irresolute composer becomes stalled, the flexible composer is more comfortable adapting their composition. This type of composer has a solid grasp on both the formal and informal needs and is willing to adapt the informal needs as necessary to meet the formal needs of the task. As with the flexible composer, the perfectionist composer is also needs-driven, with clear expectations for the formal task and their own goals for the informal tasks. Rather than adjusting the informal needs as the composition develops, a perfectionist composer will focus intensely on ensuring that their final product exactly meets their formal and informal needs.

    Teaching writing requires embracing its complexity and moving beyond formulaic approaches that prioritize product over process. Writing is a dynamic and individualized task that takes place within a complex system, where composers bring diverse needs, strategies, and experiences. By adopting a design framework, teachers of writing and composing can support students in navigating this complexity, fostering creativity, agency, and authentic expression. It is an approach that values the funds of knowledge students bring to the writing process, recognizing the interplay of formal and informal needs, as well as their “what” and “how” strategies – those they have and those that need growth via instruction and experience. Through thoughtful design, we can grow flexible, reflective, and skilled communicators who are prepared to navigate the wicked challenges of composing in all its various forms.

    These ideas and more can be found in When Teaching Writing Gets Tough: Challenges and Possibilities in Secondary Writing Instruction.

    References

    Campbell, K. H. (2014). Beyond the five-paragraph essay. Educational Leadership, 71(7), 60-65.

    Lannin, A. A., & Fox, R. F. (2010). Chained and confused: Teacher perceptions of formulaic writing. Writing & Pedagogy, 2(1), 39-64.

    Rico, G. L. (1988). Against formulaic writing. The English Journal, 77(6), 57-58.

    Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4(2), 155–169.

    Sharples, M. (1999). How we write: Writing as creative design (1st ed.). Routledge. https://doi.org/10.4324/9780203019900

    Waltuck, B. A. (2012). Characteristics of complex systems. The Journal for Quality & Participation, 34(4), 13–15.


  • What students complain about: Ombudsman – Campus Review

    The National Student Ombudsman (NSO) has shared the themes and types of the 1,500 student complaints made to the watchdog in its first five months.