Tag: futures

  • Building Connections and Shaping Futures by Fostering Cohort Success – Faculty Focus

    Source link

  • Capability for change – preparing for digital learning futures

    Capability for change – preparing for digital learning futures

    Digital transformation is an ongoing journey for higher education institutions, but there is something quite distinctive about the current moment.

    The combination of financial uncertainty, changing patterns of student engagement, and the seismic arrival of artificial intelligence points to a future for higher education learning and teaching, and for the digital student experience, that will certainly share some core elements with current practice but is likely in many respects to look rather different.

    At the moment I see myself and my colleagues trying to cling to what we always did and what we always know. And I really do think the whole future of what we do and how we teach our students, and what we teach our students is going to accelerate and change very, very quickly now, in the next five years. Institutional leader

    Our conversations with sector leaders and experts over the past six months indicate an ambition to build consistent, inclusive and engaging digital learning environments and to deploy data much more strategically. Getting it right opens up all kinds of possibilities to extend the reach of higher education and to innovate in models for engagement. But future change demands different kinds of technological capabilities and working practices, and institutions say they are hindered by legacy systems, organisational silos, and the lack of a unified vision.

    Outdated systems do not “talk to each other”, and on a cultural level departments and central teams also do not “talk to each other” – or may struggle to find a common language. And rather than making life easier, many feel that technology creates significant inefficiencies, forcing staff to spend more time on administrative tasks and less on what truly matters.

    I think the problem always is when we hope something’s going to make it more efficient. But then it just adds a layer of complexity into what we’re doing…I think that’s what we struggle with – what can genuinely deliver some time savings and efficiencies as opposed to putting another layer in a process? Institutional leader

    In the spirit of appreciative inquiry, our report Capability for change – preparing for digital learning futures draws on a series of in-depth discussions with leaders of learning and teaching and of digital technology, digital experts, and students’ union representatives. We explore the sorts of change that are already in train, and surface insight about how institutions are thinking about building whole-organisation capabilities. “Digital dexterity” – the ability to deploy technology strategically, efficiently, and innovatively to achieve core objectives – may be yet another tech buzzword, but it captures a sense of where organisations are trying to get to.

    While immediate financial pressures may require cutting costs and reprofiling investment, long-term sustainability depends on moving forward with change – finding ways not to do more with less, but to do things differently. To realise the most value from technology investment, institutional leaders need to ensure that staff teams across the institution have the knowledge, the motivation and the tools to deploy technology in the service of student success.

    How institutions are building organisational capability

    Running through all our conversations was a tension, albeit a potentially productive one: there needs to be much more consistency and clarity about the primary strategic objectives of the institution and the core technology platforms and applications that enable them. But the effect of, in essence, imposing a more streamlined “central” vision, expectations and processes should be to enable and empower the academic and professional teams to do the things that make for a great student experience. Our research indicates that institutions are focusing on three areas: leadership and strategy; digital capabilities of institutional staff; and breaking down the vertical silos that can hamper effective cross-organisational working.

    A number of reflections point to strategy-level improvements – such as ensuring strategic alignment between institutional objectives for student success and technology and digital strategies; listening to feedback from students and staff about what they need from technology; and setting priorities and resourcing them end to end, from technology procurement through deployment to evaluation of impact. One institutional leader described what happens when digital strategies get lost in principles and forget to align with the wider success of the organisation:

    The old strategy is fairly similar, I imagine, to many digital strategies that you would have seen – it talks about being user focused, talks about lean delivery, talks about agile methodologies, product and change management and delivering value through showing, not telling. So it was a very top level strategy, but really not built with outcomes at its absolute core, like, what are the things that are genuinely going to change for people, for students? Institutional leader

    Discussions of staff digital capabilities recognised that institutional staff are often hampered by organisational complexity and bureaucracy, which is too often mirrored in the digital sphere. One e-learning professional suggested that research is needed to really understand why there is a tendency towards the proliferation of processes and systems, and to confront the impact on staff workloads.

    There may also be limits to what can reasonably be expected from teaching staff in terms of digital learning design:

    You need to establish minimum benchmarks and get everyone to that place, and then some people will be operating well beyond that. You can be clear about basic benchmark expectations around student experience – and then beyond that you need to put in actual support [such as learning design experts] to implement the curriculum framework. E-learning professional

    But the broader insight on staff development was around shifting from provision of training on how to operate systems or tools to a more context-specific exploration of how the available technologies and data can help educators achieve their student success ambitions. Value is more systematically created across the organisation when those academic and professional teams who work directly with students are able to use the technology and data available creatively to enhance their practice and to problem solve.

    Where data has been used before it’s very much sat with senior colleagues in the institution. And you know it’s helped in decision making. But the next step is to try and empower colleagues at the coal face to use data in their day to day interventions with their students… How can they use the data to inform how they support their students? Institutional leader

    Decisive leadership may be successful in setting priorities and streamlining the processes and technologies that underpin them; strong focus on professional development may engage and enable institutional staff. But culture change will come when institutions find ways to systematically build “horizontals” across silos – mechanisms for collaborative and shared activity that bridge different perspectives, languages and disciplinary and professional cultures.

    Some examples we saw included embedding digital professionals in faculties and academic business processes such as recruitment panels, convening of cross-organisation thinking on shared challenges, and appointment of “change agent” roles with a skillset and remit to roam across boundaries.

    Technology providers must be part of the solution – acting as strategic partners rather than suppliers. One way to do that is to support institutions to pilot, test, and develop proof of concept before they decide to invest in large-scale change. Another is to work with institutions to understand how technology is deployed in practice, and the evolving needs of user communities. To be a great partner to the higher education sector means having a deep understanding not only of the technological capabilities that could help the sector but how these might weave into an organisation’s wider mission and values. In this way, technology providers can help to build capability for change.

    This article is published in association with Kortext. You can download the Capability for change report on Kortext’s website. The authors would like to thank all those who shared their insight to inform the report. 

    Source link

  • Data futures, reviewed | Wonkhe

    As a sector, we should really have a handle on how many students we have and what they are like.

    Data Futures – the multi-year programme that was designed to modernise the collection of student data – has become, among higher education data professionals, a byword for delays, stress, and mixed messages.

    It was designed to deliver in-year data (so 2024-25 data arriving within the 2024-25 academic year) three times a year, drive efficiency in data collection (by allowing for process streamlining and automation), and remove “data duplication” (becoming a single collection that could be used for multiple purposes by statutory customers and others). To date it has achieved none of these benefits, and has instead (for 2022-23 data) driven one of the sector’s most fundamental pieces of data infrastructure into such chaos that all forward uses of data require heavy caveats.

    The problem with the future

    In short – after seven years of work (at the point the review was first mooted), and substantial investment, we are left with more problems than we started with. Most commentary has focused on four key difficulties:

    • The development of the data collection platform, starting with Civica in 2016 and later taken over by Jisc, has been fraught with difficulties, frequently delayed, and experienced numerous changes in scope
    • The documentation and user experience of the data collection platform have been lacking. Rapid changes have not been reflected in updates for those who use the platform within providers, or for those who support them (the HESA Liaison team). The error handling and automated quality rules have caused particular issues – indeed, the current iteration of the platform still struggles with fields that require responses involving decimal fractions (a sketch of the kind of rule at issue follows this list).
    • The behaviour of some statutory customers – frequently modifying requirements, changing deadlines, and putting unhelpful regulatory pressure on providers – has not helped matters.
    • The preparedness of the sector has been inconsistent between providers and between software vendors, and this level of preparedness has not been fully understood – in part because of nervousness among providers about regulatory consequences for late submissions.

    These four interlinked strands have been exacerbated by an underlying fifth issue:

    • The quality of programme management, programme delivery, and programme documentation has not been of the standard required for a major infrastructure project. Parts of this have been due to problems in staffing and in programme governance – but there are also reasonable questions to be asked about the underlying programme management process.
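
    On the quality-rules point above, it may help to see what such a rule looks like. Here is a minimal sketch in Python – the field name, permitted range and precision are my assumptions for illustration, not HESA’s actual specification:

    # A hypothetical quality rule for a decimal-fraction field (an
    # FTE-style value). All names and limits here are assumptions.
    from decimal import Decimal, InvalidOperation

    def check_fte(raw: str) -> list[str]:
        """Return quality-rule failures for a fraction-of-full-time field."""
        try:
            value = Decimal(raw)
        except InvalidOperation:
            return [f"{raw!r} is not a valid decimal"]
        if not value.is_finite():
            return [f"{raw!r} is not a finite number"]
        errors = []
        if not Decimal("0") <= value <= Decimal("1"):
            errors.append(f"{value} is outside the permitted range 0 to 1")
        if -value.as_tuple().exponent > 2:
            errors.append(f"{value} has more decimal places than the schema allows")
        return errors

    print(check_fte("0.5"))    # [] – passes
    print(check_fte("0.125"))  # precision failure
    print(check_fte("1.5"))    # range failure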

    Decisions to be made

    An independent review was originally announced in November 2023, overlapping a parallel internal Jisc investigation. The results we have may not be timely – the review didn’t even appear to start until early 2024 – but even the final report merely represents a starting point for some of the fundamental discussions that need to happen about sector data.

    I say a “starting point” because many of the issues raised by the review concern decisions about the projected benefits of doing data futures. As none of the original benefits of the programme have been realised in any meaningful way, the future of the programme (if it has one) needs to be focused on what people actually want to see happen.

    The headline is in-year data collection. To the external observer it is embarrassing that, while other parts of the education sector can return data on a near real-time basis, higher education cannot – universities update the records they hold on students regularly, so it should not be impossible to update external data too. It should not come as a surprise, then, that the review recommends:

    As a priority, following completion of the 2023-24 data collection, the Statutory Customers (with the help of Jisc) should revisit the initial statement of benefits… in order to ascertain whether a move to in-year data collection is a critical dependent in order to deliver on the benefits of the data futures programme.

    This isn’t just an opportunity for regulators to consider their shopping list – a decision to continue needs to be swiftly followed by a cost-benefit analysis, reassessing the value of in-year collection and determining whether or when to pursue it. And the decision has been made: there will, one day, be in-year student data. In a joint statement the four statutory customers said:

    After careful consideration, we intend to take forward the collection of in-year student data

    highlighting the need for data to contribute to “robust and timely regulation”, and reminding institutions that they will need “adequate systems in place to record and submit student data on time”.

    The bit that interests me here is the implications for programme management.

    Managing successful programmes

    If you look at the government’s recent record in delivering large and complex programmes you may be surprised to learn of the existence of a Government Functional Standard covering portfolio, programme, and project management. What’s a programme? Well:

    A programme is a unique, temporary, flexible organisation created to co-ordinate, direct and oversee the implementation of a set of projects and other related work components to deliver outcomes and benefits related to a set of strategic objectives

    Language like this, and the concepts underpinning it come from what remains the gold standard programme management methodology, Managing Successful Programmes (MSP). If you are more familiar with the world of project management (project: “a unique temporary management environment, undertaken in stages, created for the purpose of delivering one or more business products or outcomes”) it bears a familial resemblance to PRINCE2.

    If you do manage projects for a living, you might be wondering where I have been for the last decade or so. The cool kids these days are into a suite of methodologies that come under the general description of “agile” – PRINCE2 is now seen primarily as a cautionary tale: a “waterfall” (top-down, documentation-centred, deadline-focused) management practice rather than an “iterative” (emergent, development-centred, short-term) one.

    Each approach has strengths and weaknesses. Waterfall methods are great if you want to develop something that meets a clearly defined need against clear milestones and a well understood specification. Agile methods are a nice way to avoid writing reports and updating documentation.

    Data futures as a case study

    In the real world, the distinction is less clear cut. Most large programmes in the public sector use elements of waterfall methods (regular project reports, milestones, risk and benefits management, senior responsible owners, formal governance) as a scaffold in which sit agile elements at a more junior level (short development cycles, regular “releases” of “product” prioritised above documentation). While this can be done well, it is very easy for the two ideologically separate approaches to drift apart – and it doesn’t take much to read this into what the independent review of Data Futures reveals.

    Recommendation B1 calls, essentially, for clarity:

    • Clarity of roles and responsibilities
    • Clarity of purpose for the programme
    • Clarity on the timetable, and on how and when the scope of the programme can be changed

    This is amplified by recommendation C1, which looks for specific clarifications around “benefits realisation” – which itself underpins the central recommendation relating to in-year data.

    In classic programme management (like MSP) the business case will include a map of programme benefits: that is, all of the good things that will come about as a result of the hard work of the programme. Like the business case’s risk register (a list of all the bad things that might happen, and what can be done if they do), it is supposed to be regularly updated and signed off by the programme board – which is made up of the most senior staff responsible for the work of the programme (the Senior Responsible Owners, in the lingo).

    The statement of benefits languished for some time without a full update (there was an incomplete attempt in February 2023, and a promise to make another after the completed 2022-23 collection – we are not told whether this second update happened). In proper, grown-up programme management this is done systematically: at every programme board meeting you review the benefits and the risk register. It’s dull (most of the time!) but it is important. The board needs an eye on whether the programme still offers value overall (based on an analysis of projected benefits), and if the scope needed to change, the board would have the final say.
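
    The discipline is simple enough to express in a few lines. As a sketch only – the fields and the quarterly cycle below are my assumptions for illustration, not MSP’s actual artefacts or the programme’s real register:

    # Every benefits-map or risk-register entry carries an accountable
    # owner and a last-reviewed date; anything not reviewed within a board
    # cycle, or never signed off, is flagged for the next agenda.
    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class RegisterEntry:
        description: str
        owner: str          # the accountable senior owner (assumed field)
        last_reviewed: date
        signed_off: bool

    BOARD_CYCLE = timedelta(days=90)  # assuming a quarterly board

    def needs_review(entry: RegisterEntry, today: date) -> bool:
        """True if the entry is stale or was never signed off."""
        return (today - entry.last_reviewed) > BOARD_CYCLE or not entry.signed_off

    register = [
        # Mirrors the situation described above: an incomplete update in
        # February 2023, never completed or signed off.
        RegisterEntry("In-year collection enables timely regulation",
                      owner="SRO", last_reviewed=date(2023, 2, 1),
                      signed_off=False),
    ]
    for entry in register:
        if needs_review(entry, date(2023, 12, 1)):
            print(f"Escalate to board: {entry.description} (owner: {entry.owner})")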

    The issue with Data Futures was clarity over whether this level of governance actually had the power to do these things, and – if not – who was actually doing them. The Office for Students latterly put together quite a complex and unwieldy governance structure, with a Quarterly Review Group (QRG) having oversight of the main programme board. The QRG was made up of very senior staff at the statutory customers (OfS, HEFCW, SFC, DfE(NI)), Jisc, and HESA (plus one Margaret Monckton – now chair of this independent review! – as an external voice).

    The QRG oversaw the work of the programme board – meaning that decisions made by the senior staff nominally responsible for the direction of the programme were often second-guessed by their direct line managers. The programme board was supposed to have its own assurance function and an independent observer – it had neither (despite the budget being there for them).

    Stop and go

    Another role of the board is to make what are more generally called “stop-go” decisions, here described as “approval to proceed”. These are an important way of making sure the programme is still on track: you set (in advance) the criteria that need to be fulfilled in terms of delivery (was the platform ready, had the testing been done) before moving on to the next work package. Below this, incremental approvals are made by line managers or senior staff as required, and reported upwards to the board.

    What seems to have happened a lot in the Data Futures programme is what are called conditional approvals – where some of these conditions were waived on the assurance that the remaining required work would be completed. This is fine as far as it goes (not everything lines up all the time) but, as the report notes:

    While the conditions of the approvals were tracked in subsequent increment approval documents, they were not given a deadline, assignee or accountable owner for the conditions. Furthermore, there were cases where conditions were not met by the time of the subsequent approval
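
    That gap is easy to make concrete. A sketch of what a well-formed conditional approval might carry – an invented structure for illustration, not the programme’s actual paperwork:

    # An "approval to proceed" in which every waived criterion must become
    # a tracked condition with an assignee and a deadline – the things the
    # review found missing.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class Condition:
        criterion: str
        assignee: Optional[str] = None
        deadline: Optional[date] = None

    @dataclass
    class ApprovalToProceed:
        criteria_met: list[str]
        conditions: list[Condition] = field(default_factory=list)

        def well_formed(self) -> bool:
            # A conditional approval is only well-formed if every waived
            # criterion has an accountable assignee and a deadline.
            return all(c.assignee and c.deadline for c in self.conditions)

    approval = ApprovalToProceed(
        criteria_met=["platform ready"],
        conditions=[Condition("edge-case testing complete")],  # no owner, no date
    )
    print(approval.well_formed())  # False – the failure mode the review describes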

    Why would you do that? Well, you’d be tempted if you had another board above you – comprising very senior staff and key statutory customers – concerned about the very public problems with Data Futures and looking for progress. The Quarterly Review Group, as it turned out, only ended up making five decisions (and in three of these cases it just punted the issue back down to the programme board – the other two, for completists, were to delay plans for in-year collection).

    What it was meant to be doing was “providing assurance on progress”, “acting as an escalation point” and “approving external assurance activities”. As we’ve already seen, it didn’t really bother with external assurance. And on the other points the review is damning:

    From the minutes provided, the extent to which the members of the QRG actively challenged the programme’s progress and performance in the forum appears to be limited. There was not a clear delegation of responsibilities between the QRG, Programme Board and other stakeholders. In practice, there was a lack of clarity also on the role of the Data Futures governance structure and the role of the Statutory Customers separately to the Data Futures governance structure; some decisions around the data specification were taken outside of the governance structure.

    Little wonder that the section concludes:

    Overall, the Programme Board and QRG were unable to gain an independent, unbiased view on the progress and success of the project. If independent project assurance had been in place throughout the Data Futures project, this would have supported members of the Programme Board in oversight of progress and issues may have been raised and resolved sooner

    Resourcing issues

    Jisc, as developer, took on responsibility for technical delivery in late 2019. Incredibly, Jisc was not provided with funding to do this work until March 2020.

    As luck would have it, March 2020 saw the onset of a series of lockdowns and a huge upswing in demand for the kind of technical and data skills needed to deliver a programme like Data Futures. Jisc struggled to fill key posts, most notably running for a substantial period of time without a testing lead in post.

    If you think back to the 2022-23 collection, the accepted explanation around the sector for what – at heart – had gone wrong was a failure to test “edge cases”. Students, it turns out, are complex and unpredictable things, with combinations of characteristics and registrations that you might not expect to find. A properly managed programme of testing would have focused on these edge cases, and there would have been fewer issues when the collection went live.
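
    To make “edge cases” concrete: the fields and rules in the sketch below are invented for illustration, not the actual Data Futures specification, but they show the kind of test matrix that deliberately exercises the awkward combinations rather than just the typical full-time undergraduate.

    # A stand-in for the platform's quality rules over a student record,
    # tested against deliberately awkward cases. All fields are invented.
    def validate(record: dict) -> list[str]:
        errors = []
        if record.get("mode") == "dormant" and record.get("fte", 0) > 0:
            errors.append("dormant student with non-zero FTE")
        if len(record.get("registrations", [])) > 1 and "primary" not in record:
            errors.append("concurrent registrations with no primary flagged")
        return errors

    cases = [
        # (record, should the rules fire?)
        ({"mode": "full-time", "fte": 1.0, "registrations": ["R1"]}, False),
        # The edge cases: a dormant student with residual FTE, and a
        # student holding two concurrent registrations with no primary.
        ({"mode": "dormant", "fte": 0.5, "registrations": ["R1"]}, True),
        ({"mode": "part-time", "fte": 0.4, "registrations": ["R1", "R2"]}, True),
    ]
    for record, expect_errors in cases:
        assert bool(validate(record)) == expect_errors, record
    print("all edge cases behave as expected")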

    Underresourcing and understaffing are problems in their own right, but these were exacerbated by rapidly changing data model requirements, largely coming from statutory customers.

    To quote the detail from the report:

    The expected model for data collection under the Data Futures Programme has changed repeatedly and extensively, with ongoing changes over several years on the detail of the data model as well as the nature of collection and the planned number of in-year collections. Prior to 2020, these changes were driven by challenges with the initial implementation. The initial data model developed was changed substantially due to technical challenges after a number of institutions had expended significant time and resource working to develop and implement it. Since 2020, these changes were made to reflect evolving requirements of the return from Statutory Customers, ongoing enhancements to the data model and data specification and significantly, the ongoing development of quality rules and necessary technical changes determined as a result of bugs identified after the return had ‘gone live’. These changes have caused substantial challenges to delivery of the Data Futures Programme – specifically reducing sector confidence and engagement as well as resulting in a compressed timeline for software development.

    Sector readiness

    It’s not enough to conjure up a new data specification and platform – it is hugely important to be sure that your key people (“operational contacts”) within the universities and colleges that would be submitting data are ready.

    At a high level, this did happen – there were numerous surveys of provider readiness, and the programme also worked with the small number of software vendors that supply student information systems to the sector. This formal programme communication came alongside the more established links between the sector and the HESA Liaison team.

    However, such was the level of mistrust between universities and the Office for Students (which could technically have found struggling providers in breach of condition of registration F4) that it is widely understood that answers to these surveys were less than honest. As the report says:

    Institutions did not feel like they could answer the surveys honestly, especially in instances where the institution was not on track to submit data in line with the reporting requirements, due to the outputs of the surveys being accessible to regulators/funders and concerns about additional regulatory burden as a result.

    The decision to scrap a planned mandatory trial of the platform, made in March 2022 by the Quarterly Review Group, was ostensibly taken to reduce burden – but, coupled with the unreliable survey responses, it meant that HESA was unable to identify cases where support was needed.

    This is precisely the kind of risk that should have been escalated to programme board level – a lack of transparency between Jisc and the board about readiness made it harder to take strategic action on the basis of evidence about where the sector really was. And the issue continued into live collection: because Liaison were not made aware of common problems (“known issues”, in fact), the team often worked from out-of-date documentation, meaning that providers got conflicting messages from different parts of Jisc.

    Liaison, for their part, dealt with more than 39,000 messages between October and December 2023 (the peak of issues raised during the collection process) – and even given the problems noted above, they resolved 61 per cent of queries on the first try. Given the level of stress in the sector (queries came in at all hours of the day) and the longstanding and special relationship that data professionals have with HESA Liaison, you could hardly criticise that team for making the best of a near-impossible situation.

    I am glad to see that the review notes:

    The need for additional staff, late working hours, and the pressure of user acceptance testing highlights the hidden costs and stress associated with the programme, both at institutions and at Jisc. Several institutions talked about teams not being able to take holidays over the summer period due to the volume of work to be delivered. Many of the institutions we spoke to indicated that members of their team had chosen to move into other roles at the institution, leave the sector altogether, experienced long term sickness absence or retired early as a result of their experiences, and whilst difficult to quantify, this will have a long-term impact on the sector’s capabilities in this complex and fairly niche area.

    Anyone who was even tangentially involved in the 2022-23 collection, or attended the “Data Futures Redux” session at the Festival of Higher Education last year, will find those words familiar.

    Moving forward

    The decision on in-year data has been made – it will not happen before the 2026-27 academic year, but it will happen. Programme delivery and governance will need to improve, and there are numerous detailed recommendations to that end; we should expect more detail, and a timeline, to follow.

    It does look as though there are more changes to the data model to come – though the recommendation is that the model should be frozen 18 months before the start of data collection, which by my reckoning would mean a confirmed data model printed out and on the walls of SROC members in the spring of 2026. A subset of institutions would make an early in-year submission, which may not be published to “allow for lower than ideal data quality”.

    On arrangements for the 2024-25 and 2025-26 collections there are no firm recommendations – the hope is that data model changes will be minimal, and that the time will be used to ensure the sector and Jisc are genuinely ready for the advent of the data future.

    Source link

  • Possible futures for working environments

    by Nic Kipar

    This blog follows an earlier short review of the literature and is based on the author’s experience in a range of universities. It suggests how working environments might change in practice, with illustrations from the author’s own institution, the University of Glasgow.

    Introduction

    In thinking about working environments, the most effective approach is to ask individuals how they work best. This enables them to thrive in the environment most suited to them and to the particular activity they are undertaking. More importantly, staff should be given the freedom to experiment with different settings, without others imposing judgments based on their own limited perspectives. This openness fosters a supportive and adaptable workplace, enabling everyone to find the spaces that best suit their work and wellbeing.

    Embracing new thinking

    Traditionally, we have not considered whether staff on our campuses are enjoying their work environments and are able to be their most creative and effective selves. This oversight stands in contrast with the University Value of Curiosity and Discovery: “Embracing new thinking and innovation in a spirit of open minded collaboration that positively impacts on ourselves, our University, our city, society and the world.”

    In response, the University of Glasgow has recently begun incorporating a co-design element into its Workspace Futures Programme, starting with a ‘diagnose’ phase. Yet I still wonder: are we thinking boldly enough? Are we exploring possibilities that reach beyond our usual perspectives and assumptions?

    Let me pose a provocation from my colleague Dr Nathalie Tasler (personal communication, November 2024):

    Remember the Disney movie Aladdin? “Phenomenal cosmic powers… itty-bitty living space!” So how can our immensely talented and creative colleagues thrive when their environment is filled with “stop rules” (Runco, 2007)? In social psychology, stop rules are constraints—often invisible—that limit our thinking, stifle creativity, and shut down possibility thinking (Craft, 2005; Lin, 2020) before they even have a chance to take shape. When workplaces impose these restrictions, whether through rigid protocols, uninspiring spaces, or unspoken norms, how can we expect innovation and fresh ideas to flourish? What would it take to create a work environment where potential isn’t confined, but unleashed?

    Transforming everyone’s spaces

    While we have been focused on transforming student study spaces and creating vibrant, open campuses that attract students and the public alike, we may be neglecting the needs of our own staff. The University of Edinburgh (Bayne, presentation in November 2024) uses the term “buzz” to describe the energy of a thriving campus, drawing inspiration from the University of Warwick’s public events, like World Cup screenings in collaboration with local businesses, that created memorable, widely shared experiences. Edinburgh’s themes of Belonging and buzz; Sanctuary and beauty; Sustainable connections; Mobility, flexibility and flow; and Openness, public co-creation and surfacing resonate with our work on student spaces – but have we fully explored the potential of spaces that could truly empower our staff to work at their best, according to their known, or as yet unknown, preferences?

    Understanding individual preferences in workspace design is challenging. Environmental needs are deeply personal, shaped by complex and unique factors. This makes it impossible to assume that one person’s ideal workspace will suit everyone. When we project our own preferences onto others, we risk introducing bias and overlooking or misjudging their needs. These hidden barriers are created by a world designed with certain people in mind, leaving others feeling excluded. They make aspects of society accessible to some while shutting out others. These mismatches are the building blocks of exclusion, making people feel unwelcome or unable to fully participate (Holmes, 2018).

    It is one thing to offer flexible options for staff to work from home or from a campus office. But we should also look closely at the campus itself, at how we treat these spaces and how they treat us. Typically, we arrive on campus, head into buildings and into offices or meeting rooms, and operate within closed-off spaces that might be limiting our ability to think creatively or envision the future. It makes me wonder: Are we missing something essential?

    An office is an office is an office?

    We expect our staff to innovate and imagine exciting futures, yet how can we foster that kind of thinking when we confine people to uninspiring spaces? A room does not need to have white walls or dull furniture to feel stifling; even a vibrant, biophilic space can feel restrictive if it is still just four walls. What if we reimagined our workplaces so that, rather than feeling like “just another day at the office”, staff actually felt genuinely inspired to be there?

    At present, we do not offer staff the full range of spaces that might suit different types of work or support them in ways they find personally meaningful. Why is it, for example, that a staff member working in an on-campus café among students is often seen as “not really working”? Such assumptions are outdated, belonging to a pre-digital era. Why do we still insist that all staff need traditional offices, all the time?

    Offices have their purpose, of course, but not all office types are effective for all needs. Open-plan offices with cubicles, for instance, combine the worst aspects of every workspace model and are often regarded as suboptimal work environments: common problems include lack of privacy, increased noise levels, and the inability to control one’s environment, which can lead to diminished productivity, lower job satisfaction, and elevated stress levels. The systematic literature review by Colenberg et al (2021) finds a link between cramped cubicle setups in open spaces and decreased physical and mental health due to poor environmental control. I recall working in university offices in the early 1990s, when alternative approaches were simply unimaginable. Back then, an office with your name on the door was a status symbol and a sign of belonging. But why are we still behaving as though we are living in the 20th century?

    Spaces designed to fit people, not making people fit

    James McCune Smith Learning Hub (JMS) © UofG

    If someone can concentrate deeply and produce creative, high-quality work in a bustling student study space like the James McCune Smith Learning Hub (JMS), or in a moderately busy area like the Mazumdar-Shaw Advanced Research Centre (ARC) lobby, who are we to judge? For some, the energy of a café may be the perfect environment to spark ideas and focus, while others need absolute silence and solitude to work through complex problems. Some might prefer a quiet, shared workspace, finding comfort in the presence of others without the noise. Many benefit from working at home, or outside if weather permits, while others feel more motivated and inspired by coming onto campus.

    Ultimately, as long as staff are accessible when needed and are delivering excellent work, there is no “right” way to structure a work environment. What works for one person may not work for another, and that is precisely the point: a truly supportive workplace recognises and respects individual preferences and needs. By allowing each person the freedom to choose the space that best supports their productivity and wellbeing, we create a culture that values flexibility and respects diversity in how we all work best.

    Mazumdar-Shaw Advanced Research Centre (ARC) © UofG

    Welcoming variation and diversity as agents for evolution

    The psychologist Dr Lorna Champion (personal communication, November 2024) summarised this succinctly: “Evolution is based on variation. If a characteristic supports survival then it is retained and handed on; because of difference, we evolve. If we don’t have variation then we stagnate.” It is time to embrace new thinking, to break from outdated models, and to create environments that truly support and inspire staff to thrive.

    Nic Kipar leads the Academic and Digital Development team at the University of Glasgow. She played an instrumental role in the creation of the James McCune Smith Learning Hub, focusing on inclusive active learning. Nic co-leads the Enhancing Learning & Teaching Practice workstream, contributing to the university’s Learning & Teaching strategy and planning for the upcoming Keystone building, which will feature large interdisciplinary labs. Nic also chairs a working group on Pedagogy in Superlabs, pioneering these innovative spaces for the university.

    Author: SRHE News Blog

    An international learned society, concerned with supporting research and researchers into Higher Education

    Source link