The consultant you need is already on campus


When questions come up about whether services are working, overlapping, or reaching the right students, the instinct is usually the same – commission a review.

But the expertise to do that work often already exists on campus – if institutions are willing to use it.

I’ve been on both sides of that equation. I went to management school, worked as a management consultant, and later as a careers consultant across several business schools. I’ve watched students walk straight from campus into firms like McKinsey, solving high-stakes, real-world problems.

So when it came time to review our student support teams (Advice, Basic Student Needs and Wellbeing), asking Grishma, an MBA student, to take on the brief didn’t feel risky – it felt obvious. What followed challenged our assumptions, tested our data, and raised a bigger question about why higher education so often overlooks the expertise sitting on its own campuses.

What the review focussed on

Grishma led the review across the three areas using a mix of interviews, process mapping, data analysis, and systems review, with enough structure to test our assumptions without losing sight of how students actually experience services.

The analysis covered four questions:

Are we set up to deliver the biggest possible impact with the resources we have?

Are services designed around real student journeys?

Do students know what’s available, and do they trust it?

Can we clearly evidence impact and adapt as needs change?

Once we began working through them, the questions revealed far more than expected.

What we learned

One of the most useful aspects of the review was the lens Grishma brought as a business school student.

She needed no translation between how services are meant to work and how they actually feel from the student side. At the same time, her business school education forced us to be more disciplined – clearer about purpose, tighter on prioritisation, and more honest about trade-offs.

Additionally, the action research approach meant this wasn’t a hands-off consultancy exercise. Being embedded in the organisation allowed Grishma to see how decisions played out in practice, not just in theory. That depth of involvement significantly improved the quality of the output in a way that a more traditional consultancy model rarely achieves.

That combination exposed gaps we’d normalised over time, and the same themes kept coming up.

First, clarity of purpose. Teams were doing good work, but not always towards clearly shared goals. Planning often happened in response to immediate need, which led to lots of small initiatives rather than a few well-defined priorities. For example, different teams would each run their own welcome events or student workshops, duplicating effort and confusing students, instead of coordinating around a single, high-impact programme.

Second, silos. Ways of working had evolved separately, limiting visibility across services and making collaboration harder than it needed to be. In a few cases, this meant duplicated effort or students being passed between teams at key moments. For example, a student seeking financial support might contact both the advice team and the basic needs team, but without defined ways of working together, neither team had a clear picture of the student’s situation or next steps.

Finally, data. Insight was being gathered through multiple systems and methods, which made it difficult to build a consistent picture of reach, equity, outcomes, or impact. We could tell strong individual stories, but struggled to answer bigger questions about who we were missing or what was working best.

For instance, in the advice team, advisors could point to powerful examples of students helped through complex situations, but the data couldn’t easily show whether those successes were typical, who never made it to an appointment, or whether some groups were consistently under-represented.

Taken together, these issues created a student experience that could feel disjointed – particularly at transition points – and made it harder to confidently demonstrate impact to partners and stakeholders.

None of this is unique to us. And if anything, the problem is more acute on the university side. An SU might have three or four support teams working in parallel – a university can have dozens, spread across academic departments, central services, faculties, and colleges.

A student might interact with disability services, personal tutoring, counselling, hardship funds, and academic skills support without any of those teams sharing data or referral pathways. The fragmentation we found in a relatively small organisation is a microcosm of something much bigger.

But seeing it laid out clearly, using our own evidence, challenged the comforting assumption that being busy and well-intentioned automatically adds up to effectiveness.

What changed as a result

The work hasn’t sat on a shelf. Each of the three teams received tailored recommendations reflecting their specific pressures, maturity, and opportunities.

We’ve restructured a team to align roles more closely with student journeys rather than service silos. Student development was split into two directorates – wellbeing and volunteering – with wellbeing brought alongside other student support teams where there was clearer overlap in purpose and practice.

We also created a new role to bring together processes for advice and basic needs, giving students a single point of contact and enabling internal triage rather than passing students between teams.

We tightened how we collect and use data so that demand, access, and outcomes can be viewed together. Feedback is now gathered consistently across services, supported in some areas by psychology student placements focused on survey design and analysis using the PERMA model, which covers Positive emotion, Engagement, Relationships, Meaning, and Accomplishment. All teams produce end-of-semester reports, and we are working with the Data Science department to develop an annual impact report that brings together reach, trends, outcomes, and student journeys in a form suitable for senior leaders, trustees, and funders.

We also introduced clearer governance for new projects. This includes being explicit about goals from the outset, who makes which decisions, how student journeys will be reviewed, and the points at which a project should be adapted or stopped. Projects are now reviewed regularly against agreed KPIs, spend, and evidence of impact.

For example, we introduced the ‘SU How’s You Strategic Oversight Board’, meeting every six weeks to track progress on our student wellbeing calls. With a target of 22,000 calls this year and a fixed budget, the board has improved transparency and accountability, helping teams stay on track, maintain call volumes and quality, and make timely decisions about priorities and resources.

Perhaps just as importantly, it shifted how we think about service design. We’re developing a shared student–staff model in which students contribute as partners across the system, through rotational roles that span services and intentionally build future student support leaders, rather than being recruited into single, siloed teams.

What we’d take forward

This won’t work everywhere, and it isn’t a plug-and-play solution. It needs a clear brief, a committed student, good support, and a willingness to test ideas – and leave some behind.

It also required us to take the idea of being student-led seriously: not just consulting students on decisions already made, but trusting them with real responsibility for analysing systems, challenging assumptions, and shaping change. In doing so, we were forced to look more critically at our own practices, and to recognise expertise sitting much closer to home than we often assume.

Next time we’re tempted to reach for a consultant, we’ll start by looking across campus first.

Getting started

For many university departments and students’ unions, the biggest barrier isn’t whether this would be useful – it’s not knowing where to begin.

Most business schools already run modules built around real-world projects. These often sit under headings like placements, consulting projects, capstone projects, or experiential learning. The quickest route in is usually the placements or employability team, whose job is to source organisations willing to host applied projects for students.

If you search your university website for terms like “MBA projects”, “consulting project”, or “industry partnerships”, you’ll often find a named contact or generic inbox. That’s usually the best place to start.

If there’s no obvious placements team, look for programme directors or module leads, particularly for MBAs, MSc Management, Data Science, or Psychology. Programme directors are often keen to find credible, well-scoped projects that give students meaningful experience – especially ones based inside the university, where governance, ethics and access are easier to manage.

An initial email doesn’t need to be polished or technical. What matters is clearly setting out the problem you’re trying to solve, why it’s complex or interesting, what students would gain from working on it, and the level of access and support you can offer.

Not every department will say yes, and that’s normal. Timings, assessment cycles, and capacity vary widely. Treat the first conversation as relationship-building rather than transactional. Once one project lands successfully, future collaborations become much easier.

For university professional services teams, the logic is the same but the institutional politics can be trickier. Approaching your own business school to review your own services means navigating sensitivities that an external partnership wouldn’t trigger – academic departments don’t always see professional services as worthy project hosts, and professional services teams can be defensive about student scrutiny.

A director of student services reaching out to an MBA programme director with a well-scoped brief and genuine openness to findings is the way through – but it helps to have senior sponsorship and to frame the work as a genuine learning opportunity, not a cost-saving exercise.

For universities, there’s also a stronger regulatory case. OfS expects providers to be able to evidence the impact of student support – particularly around access, continuation, and attainment gaps. TEF panels look for evidence that institutions understand and act on the student experience. Work like this can feed directly into access and participation monitoring and condition of registration evidence in ways that traditional consultancy reports – which tend to arrive, get filed, and gather dust – rarely do.

For students’ unions and universities alike, this approach isn’t about cutting corners. It’s about recognising that universities already contain extraordinary expertise – and that with a bit of confidence and curiosity, organisations can turn themselves into powerful learning environments while improving services at the same time.
