Blog

  • Reflections of a Former Presidential Spouse (opinion)


    In August, Denise A. Battles stepped down after 10-plus years as president of the State University of New York at Geneseo to take a position with the SUNY system, which meant that my term as her presidential spouse came to an expected but abrupt end. I have since spent a great deal of time musing about my decade in that role, the joys and heartbreaks, the triumphs and the tragedies, and even the title … First Man? First Dude? It’s an odd occupation, since nationwide the job description is either nonexistent or as varied as the institutions where spouses and partners serve. My purpose here is to offer a few observations, derived from my experiences and those of my peers, and also humbly offer some advice to present and future executive spouses and partners.

    Denise and I met at our new faculty orientation, which seems like a lifetime ago, and grew up together as academics. She chose administration early on, and I taught for decades before giving up faculty status to become a full-time fellowship director. As she advanced from dean to provost to president, my role as the administrative “trailing” spouse altered in both subtle and overt ways at each new institution, but the core was always rooted in our dedication to the universities we served and to each other. We were fortunate to always be employed at the same university and offered ourselves to search committees as a package deal. Many of my peers gave up careers to serve as dedicated presidential spouses and partners or have positions in business or with outside organizations. For some, their ties to the institution come down to an occasional student play or alumni meet-and-greet, a few calendar events to plan and dress for. Others appear on campus virtually every day, though doing so can be fraught with peril. What’s the old saw? Why do presidents get fired? Houses and spouses (cue laughter).

    There’s a kind of isolation that comes with being a presidential spouse or partner, as virtually everyone at the institution or in the surrounding community seems either to work in some way for the president or chancellor or to be related to, or know, someone who does. That reality leaves a distance, an unspoken space many feel from campus and community acquaintances and even those considered friends. I often discussed this condition with other board members of the spouses-and-partners group affiliated with the American Association of State Colleges and Universities (AASCU), a board on which I served for over a decade. Many feel a sense of remoteness even with the myriad social outlets that come with the role—entertaining, dinners, social and athletic events, fine arts performances, donor visits, local clubs and organizations. The pandemic left many of us questioning the roles we played as presidential spouses and partners and what the future would bring for our ghostly campuses, overworked partners and largely absent student body. In many ways, that anxiety has not much changed.

    My wife and I were lucky enough to live in a stately historic presidential residence on Main Street in a quaint western New York village, mere steps from the campus. We would often sit on the front porch and greet the students and villagers, even the mayor, walking by … Pleasant as it was, we never forgot we were living in someone else’s house. I still work remotely with fellowships on a phased retirement plan for the college and recently have found myself missing the bustle of the campus and community, attending campus events, and even wearing the golden name badge signifying I was part of the campus team.

    During Denise’s presidency, I would usually see her only at the end of the day, after she had been dealing with perhaps a sticky personnel matter or one of the myriad other pressing issues on campus, and while she was still digesting the implications and finding solutions. We followed a strict code of confidentiality and professionalism about discussing these matters, which meant I was often not privy to what may have been happening. I made it a point in casual conversation with the campus and village community to refer to Denise as “the president,” to subtly suggest that I was not some kind of informational conduit and also that I knew little. After a while, folks stopped asking.

    Most presidential spouses and partners ache to do more to help their loved ones but know that unconditional support is the best strategy. They are not vice presidents or back-door conduits, as there are plenty of people on campus to serve those functions. Of course, it is true that university chancellors and presidents are well compensated for their work, but the grind offers little respite and few moments for a personal life or chances to escape the endless crises. The average life of a college presidency has shrunk to a mere 5.9 years due to the strain. Faculty, staff and, yes, administrators are being asked to do more, even as they feel anxiety about what the future will bring for their families and positions. As perhaps never before, our campuses must find a unity of purpose to face the fallout from domestic politics and world events.

    Presidential partners often face unexpected challenges when crises arise, as they may become targets for disgruntled and mentally unstable individuals from the campus and community, an unsettling and frightening reality that I unfortunately experienced too many times. Early on, I made the decision to eschew social media entirely, as the viciousness and ignorance were both unrelenting and entirely predictable. These grim possibilities are features of the job, but in the absence of some kind of orientation or guidebook, many partners are left to deal with these situations alone without anyone to confide in but their harried presidents, who can commiserate but may be legally and ethically barred from reciprocating.

    Like many presidential couples, my wife and I have been together day in and day out, pretty much continuously, since we began in academia. But “together” is a bit of a misstatement, as even though we were under the same roof, the work never ended, the email only increased and, if possible, our time together talking as a couple about the everyday things and our future was ever more brief. That reality is echoed in stories I hear from my spousal and partner colleagues across the nation—presidential relationships are being tested as never before.

    So, here’s my advice to present and future presidential partners, humbly offered and born from 10 years on the job. I could list 20 more points, but these seem like the most important ones.

    1. Make the role your own. Since there is no template, you can choose what to be or not to be, regardless of what a predecessor may have been or done. Garden club membership is not required, and you can miss that regular season game. Take your time before committing and remember that you can always say no.
    2. Find supporters and confidants among your spouse and partner peers. Family and friends are often well meaning, but, as with many occupations, cannot really understand what you are going through. AASCU’s Spouse and Partner Program offers a safe and confidential circle of fellow travelers who are more than willing to lend an ear and offer their own experiences to help you through your struggles as you help them through theirs. I recommend membership highly.
    3. Be there for your president or chancellor. Listen, but don’t try to fix anything. Doing so can be the hardest part of the job. Sometimes they just need to vent, especially during the worst of times—and if they seem upset or a bit hostile, usually it’s not about you. You are not an administrator; no one hired you to advise, and doing so may make things worse. They are privy to information that may frankly be none of your business, until it is, and if so, they will tell you what you need to know.

    In writing this piece, I don’t seek pity or sympathy for spouses and partners. I fully acknowledge the privileges that my position as a presidential spouse entailed and feel a deep sense of gratitude for having been given the opportunity to serve the university and the community. I have spent my entire working career in academia as an educator and, with this essay, seek only to inform the larger academic community as to the nature of the job and counsel those who may assume the role at some point. Presidential spouses and partners will continue to live in a strange kind of uncertainty as they struggle to support their presidents and chancellors, often while surrounded by acquaintances but still largely alone, and a bit uncertain as to what their roles truly require.

    Michael Mills is director of national fellowships and scholarships at the State University of New York at Geneseo.

    Source link

  • Transform or be transformed: why digital strategy is now central to university survival


    This blog was kindly authored by Professor Amanda Broderick, Vice-Chancellor & President of the University of East London.

    Across higher education, there is a growing realisation that no cavalry is coming over the hill. Government support arrives with one hand while being withdrawn with the other, and universities are being asked to do more, for more people, with fewer resources. The choice facing the sector is stark: we must transform, or be transformed.

    At the University of East London (UEL), we have been on this journey for some time. In many ways, it was almost serendipitous that the University reached a point of existential pressure years before similar headwinds struck the rest of the sector. That early crisis forced us to confront difficult truths, make bold decisions, and learn quickly what genuinely works. As we approach the final quarter of our ten-year strategy, Vision 2028, our transformation is evident. We have seen a 25 percentage point improvement in positive graduate outcomes (the largest in England), an unparalleled rise in NSS rankings, a move from 90th to 2nd in the country for annual student start-ups, and a financial sustainability strategy which now places us as one of only 15 universities in the country without any external borrowing, whilst delivering a £350m investment programme.

    One area underpins each of these elements of our transformation: digital.

    When we launched Vision 2028, digital transformation sat at its core – not as a technology programme, but as a strategic enabler. Our ‘Digital First’ approach was designed to ensure that the entire UEL community has the tools, confidence and freedom to innovate and develop continuously. That philosophy has shaped everything we have done since.

    We have migrated from on-premises data centres to a cloud infrastructure, becoming the first UK university to be fully cloud-based in 2019. This has improved resilience, reduced environmental impact, and transformed how we use big data, from student retention predictive modelling to generative AI personal learning assistance to business intelligence and management information. We have invested in innovation spaces that allow students to build their own compute environments, redesigned our website to offer a more personalised browsing experience, and strengthened our digital architecture to mitigate downtime.

    Sustainability has been a constant consideration – reducing data centre usage and re-using compatible hardware wherever possible. We have also made key software available anytime, anywhere, and consolidated multiple CRM-type environments into a single solution.

    But digital transformation only matters if it serves a purpose. At UEL, that purpose is careers.

    How can we prepare students for future careers if we do not embed digital skills throughout their education? That question underpins our Mental Wealth and Professional Fitness curriculum, co-designed with employers to ensure students develop future-ready digital capabilities alongside cultural capital, confidence and professional inter-personal behaviours. Introductory modules are paired with sector-specific specialisation depending on course, with Level 3 and 4 modules already covering AI and digital tools for industry, digital identity and professional networks, data literacy, visualisation, and data ethics. Employability is not an add-on at UEL; it is embedded throughout the learner journey – which means that in-demand digital skills are too.

    Our ambition extends beyond our enrolled students. We want to spread transformation across our communities so that opportunity is not confined to campus. Click Start, delivered by Be the Business and the University of East London in partnership with the Institute of Coding, is a powerful example. This four-week course equips young Londoners aged 18–30 with digital marketing and data analysis skills, delivering more than 90 hours of teaching alongside industry-recognised certificates from Google and Microsoft. Since June 2023, more than 230 young people have completed the programme – 41% women, 88% from ethnic minority backgrounds, and 70% from East London. Graduates have progressed into jobs, apprenticeships and further study, with some joining UEL itself and others using the programme as a springboard to transform their lives elsewhere.

    This ethos of applied, inclusive innovation is reflected across our courses and underpinned by active research centres and innovation hubs, from our UK Centre for AI in the Public Sector and Centre for FinTech, to our Child Online Harms Policy Think Tank and Intelligent Technologies Research Group. Alongside our industry partnerships, this cutting-edge research ensures that what students learn remains relevant, responsible, and future-focussed.

    When a student’s whole experience is designed as digital first, technology stops being a blocker and becomes an enabler. It supports our shift from a ‘university-ready student’ model to becoming a ‘student-ready university’. UEL’s Track My Future app exemplifies this approach, bringing academic, careers, and support services into a single personalised platform. It puts students’ own data into their own hands and provides a digital route-map to university life, and daily active use regularly exceeds 40,000 interactions – clear evidence that digital tools can strengthen engagement and belonging.

    Compared with when I joined UEL in 2018, the scale of the digital transformation today is unmistakable. This is what purposeful digital transformation looks like: not technology for its own sake, but a platform for inclusion, resilience and impact. In a sector facing relentless pressure, that is not optional – it is essential.

    Kortext is a HEPI Partner. Professor Amanda Broderick is speaking at Kortext LIVE on 11 February 2026 in London. Find out more and secure your seat here.

    Source link

  • Day 1 is Hard: Reflections on Being a First-Year Student (Again) – Faculty Focus


    Source link

  • The grade inflation mutant algorithm, 2026


    The publication of the Office for Students’ annual data set on degree classifications and grade inflation was initially scheduled for October of last year.

    It was delayed until now to enable further data checking – a pause that caused many data-literate observers to speculate that perhaps the venerable and much-criticised OfS algorithm (which compares the classification of degrees awarded to the perfect year that was 2010-11, controlling only for age, entry qualifications, and subject of study) might be in for an overhaul.

    This algorithm has generated results in the past that suggest that more than half of the classifications actually awarded to undergraduates were “unexplained” – the current number is just under 40 per cent.

    So, either four in ten degrees awarded in UK higher education are problematic – or a very simplistic algorithm isn’t actually very good and needs fixing.

    Occam’s razor

    So we thought OfS would take the extra weeks to rethink the algorithm. This has not happened.

    Instead, we get a more nuanced take on what is visible in this collection, which is worth quoting in full:

    The term ‘unexplained’ in this context means that changes in the characteristics of the graduating cohort included in our modelling cannot explain statistically the changes in attainment over the period.

    We are not seeking to understand what other factors might be driving the observed changes. We acknowledge that elements such as improvements in teaching quality could account for them. Our modelling cannot account for increases in degree awarding as a result of changes made in response to the pandemic. Neither can it account for entry requirements such as performance in an audition or the submission of a portfolio, as entry qualifications are limited to standard A-levels, BTECs and direct equivalents.

    Similarly, it cannot account for changes in entry qualifications as a result of the teacher-assessed grading necessitated during the pandemic. For this reason, we also classify these changes as ‘unexplained’.

    In reading this very welcome clarification you may want to think back to November’s OfS intervention on these topics. After investigating three providers (starting in 2022), England’s regulator appeared to decide that the problem was degree algorithms.

    A degree algorithm is the mechanism used by providers to calculate degree classifications from a set of module marks achieved by a student during their undergraduate study. This is a particularly British problem – in most systems globally a grade point average backed by a full transcript is far more important than any classification offered.

    In the three investigations OfS conducted it identified two particular aspects of degree algorithms – awarding a student the best result from multiple algorithms, and discounting credit with the lowest marks – that it was unsure were compatible with the requirements of registration condition B4 (which deals, in part, with the “credibility” of degrees awarded).
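
    To make those two practices concrete, here is a minimal, hypothetical sketch of a degree algorithm in Python. The module marks, credit values, discount rule and classification boundaries are all invented for illustration; no provider’s actual rules are reproduced here, and real algorithms are considerably more elaborate.

    ```python
    # Hypothetical illustration only: not any provider's real degree algorithm.
    # Each module is (mark out of 100, credits); thresholds are the conventional
    # 70/60/50/40 classification boundaries.

    def classify(mean_mark):
        """Map a weighted mean mark to a UK honours classification."""
        if mean_mark >= 70:
            return "First"
        if mean_mark >= 60:
            return "Upper second (2:1)"
        if mean_mark >= 50:
            return "Lower second (2:2)"
        if mean_mark >= 40:
            return "Third"
        return "Fail"

    def credit_weighted_mean(modules):
        """Plain credit-weighted mean of all module marks."""
        total_credits = sum(credits for _, credits in modules)
        return sum(mark * credits for mark, credits in modules) / total_credits

    def mean_discounting_lowest(modules, credits_to_discard=20):
        """Recompute the mean after discounting the lowest-scoring credits."""
        kept, to_drop = [], credits_to_discard
        for mark, credits in sorted(modules):           # lowest marks first
            if to_drop >= credits:
                to_drop -= credits                      # drop this module entirely
            else:
                kept.append((mark, credits - to_drop))  # drop part of it
                to_drop = 0
        return credit_weighted_mean(kept)

    # An invented final-year profile: (mark, credits)
    modules = [(74, 20), (73, 20), (71, 20), (70, 20), (68, 20), (44, 20)]

    rule_a = credit_weighted_mean(modules)          # straight weighted mean
    rule_b = mean_discounting_lowest(modules, 20)   # best 100 of 120 credits

    # "Best of multiple algorithms": award whichever rule gives the higher mean.
    best = max(rule_a, rule_b)
    print(f"Rule A:  {rule_a:.1f} -> {classify(rule_a)}")
    print(f"Rule B:  {rule_b:.1f} -> {classify(rule_b)}")
    print(f"Awarded: {best:.1f} -> {classify(best)}")
    ```

    In this invented profile the straight weighted mean gives a 2:1, discounting the weakest 20 credits lifts the mean over the first-class boundary, and a “best of both rules” policy would award the First, which is the kind of effect the regulator was querying.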

    This was a new departure for a regulator that had previously been content to use words like “unexplained” to cast suspicion on academic standards more generally. The fact that it found three providers at risk of breaching B4 despite the absence of any current practice that would be in breach of B4 merely served as an indication that the game has changed.

    The hardest degree

    We get the usual data release alongside the report. Here’s a plot showing the percentage point difference between the actual grades awarded and the grades modelled by the algorithm (the so-called “unexplained” awards) – with the number of graduates shown by the thin grey lines. Filters allow you to look just at first class honours or first and upper second degrees, choose the year you are interested in (the most recent, 2023-24, is the default), and to choose a minimum number of graduates at a provider for display (the default is 500).

    Mousing over one of these marks shows, in the chart at the bottom, the actual (orange) awards plotted alongside the modelled (blue) awards.


    At the top of the charts for “unexplained” first and upper second awards we find Goldsmiths, East London, and Bradford. With the exception of Goldsmiths, all recorded a slight drop in the actual (observed) award of undergraduate degrees with these classifications each year.

    Like many providers at the top end of this chart, these institutions take pride in serving under-represented and non-traditional applicants to higher education – and they are very good at what they do. Goldsmiths is a large arts-focused institution, with admissions determined by portfolio in many cases. East London and Bradford are vocationally-focused providers with strong employer links, serving a local non-traditional population.

    East London and Bradford award a far lower proportion of first class and upper second degrees than – for example – Durham, Bath, or Bristol. In any meaningful, student-facing interpretation of this phenomenon it is “easier” to get a good degree at a selective provider like that than at one more focused on serving the whole community. The hardest university to get a good degree at is Buckinghamshire New University – less than half of those who completed their course in 2023-24 achieved a first or upper second.

    It’s perhaps easier to see this phenomenon on a scatter plot showing both observed and modelled awards.


    There is a neat split by provider type – every Russell Group university awards more than 80 per cent of graduates a first or upper second, while only a handful of providers outside that group (Bath, Loughborough, Lancaster, Arts, Goldsmiths, Northumbria) do. Is that fair?

    Fairness

    The question for anyone concerned with academic standards is whether these provider-level differentials are fair. The OfS algorithm – as noted above – uses age, prior attainment, and subject of study as explanatory factors. It’s worth dealing with each in turn; a rough sketch of the modelling exercise follows the list.

    • OfS reckons that students with less-than-stellar A levels are less likely to get good degrees than those with AAA or above – so providers who recruit other kinds of learner will be penalised by the algorithm no matter how good they are at treating non-traditional learners.
    • Age doesn’t quite work how you might expect – mature students are very slightly more likely to get a first or an upper second than the traditional 18 year old entry cohort.
    • And humanities or social sciences subjects are judged to be harder to get a first in than physical sciences: so if you have (say) a huge law school and not many chemists you will struggle with the output of this algorithm.
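
    For readers who want to see the shape of the exercise, here is a minimal sketch of how a model of this kind could be fitted and an “unexplained” gap computed: a logistic regression of “got a first or 2:1” on the three characteristics, fitted to a reference cohort and then applied to a later one, with the gap reported as observed minus modelled rates. The data, variable names and model choice below are my own inventions for illustration (the actual OfS methodology is considerably more involved), but the logic is the same: anything the chosen characteristics cannot account for lands in the “unexplained” column.

    ```python
    # Illustrative sketch only: invented data, and far simpler than the real OfS model.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_cohort(n, good_rate):
        """Invent a graduating cohort with the three characteristics the model controls for."""
        return pd.DataFrame({
            "age_group":   rng.choice(["under21", "21to24", "25plus"], n),
            "entry_quals": rng.choice(["AAA", "ABB", "BBC", "BTEC"], n),
            "subject":     rng.choice(["law", "chemistry", "nursing", "history"], n),
            "good_degree": rng.binomial(1, good_rate, n),
        })

    baseline = make_cohort(5000, good_rate=0.55)   # stand-in for the 2010-11 reference year
    current  = make_cohort(5000, good_rate=0.75)   # stand-in for the latest graduating cohort

    features = ["age_group", "entry_quals", "subject"]
    X_base = pd.get_dummies(baseline[features], drop_first=True)
    X_cur = pd.get_dummies(current[features], drop_first=True).reindex(columns=X_base.columns, fill_value=0)

    # Fit the attainment model on the reference year only...
    model = LogisticRegression(max_iter=1000).fit(X_base, baseline["good_degree"])

    # ...then ask what it would predict for the current cohort's characteristics.
    modelled = model.predict_proba(X_cur)[:, 1].mean()
    observed = current["good_degree"].mean()

    print(f"Observed 1st/2:1 rate: {observed:.1%}")
    print(f"Modelled 1st/2:1 rate: {modelled:.1%}")
    print(f"'Unexplained' change:  {observed - modelled:+.1%}")
    ```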


    I’d love to show you the standard errors and p-values that would offer reassurance on the quality of this information, but I understand from OfS that there was an issue with calculating them correctly: the figures have now been removed from the annex. The team are satisfied that the coefficients are accurate, for what that’s worth, but if you end up being investigated as a result of this data I would be asking some questions.

    OfS has arrived at these insights through analysis of previous years of data – and this is a valid thing for them to have done. The failure to predict so much of what has actually happened suggests to me that other assumptions should be added to the model. It used to have disability, ethnicity, sex, and TUNDRA – these were axed from the model in 2023 ostensibly because they didn’t have much explanatory value.

    There is a commendable clarity in the technical annex that any gap between the model and reality is “a result of unobserved effects between academic years that have not been accounted for and have not been included as explanatory variables in the model”. It is good to see language like that up top too, as a counterbalance to the rather accusatory term “unexplained”.

    What of it?

    We wrote about the three investigations that have thus far come about as a result of this data when the reports were published last year. What was notable from those judgements was that OfS did not find any current evidence of grade inflation at any of the three providers involved, though at two of the three it did find a historic concern about a degree algorithm that had been in place prior to the existence of the OfS and was addressed speedily when it became apparent that it was causing problems.

    I am going to stick my neck out and say that there are likely to be no providers that are carrying out deliberate and systematic grade inflation as a matter of policy. If OfS feels that there are things providers are innocently doing that may result in grades being less than reliable what it needs to do is provide causal and statistical evidence that this is the case – and it will find this easier if it works with providers in the spirit of enhancement and continuous improvement rather than playing to the headlines.

    Source link

  • Rethinking Lead Quality for Marketing-Admissions Alignment


    Why Quality Beats Quantity in Student Recruitment

    Many institutions measure enrollment success by the size of their funnel. However, lead volume alone doesn’t translate into student enrollments, and in many cases, it creates more friction than results.

    When marketing teams are tasked with generating as many student leads as possible, admissions teams are often left to sift through a flood of prospects who were never the right fit. The result is wasted effort, strained teams, and disappointing yield. A smarter approach focuses on lead quality, not volume, and requires marketing and admissions to work together from the very beginning.

    The Risks of a Volume-Driven Mindset

    A volume-driven approach creates several hidden risks that undermine enrollment goals.

    First, marketing may deliver impressive lead numbers that admissions teams simply can’t convert. When success is defined by quantity alone, campaigns are optimized for clicks and form fills, not for intent or fit. Admissions counselors then spend valuable time chasing prospects who lack academic readiness, program alignment, or enrollment urgency.

    Second, high lead volume increases operational burden. Admissions teams are forced into reactive mode — managing inboxes, repeating outreach attempts, and documenting interactions that rarely progress. Over time, this erodes morale and reduces the attention given to the strongest applicants.

    Finally, institutions often spend more on advertising without improving outcomes. Larger budgets drive more traffic, but without stronger targeting and messaging, enrollment yield remains flat. This cycle reinforces siloed operations rather than solving for them.

    As explored in my recent article about why admissions and marketing collaboration matters, alignment across teams — not scale — is the real growth lever.

    How Discovery Shapes Lead Quality

    High-quality recruitment doesn’t start with campaigns — it starts with clarity. And clarity is the product of strong discovery paired with powerful and differentiated storytelling.

    Discovery is where marketing and admissions teams uncover what actually drives enrollment success: who thrives in the program, why they choose it, what doubts they need resolved, and what outcomes actually motivate action. Without this foundation, messaging tends to default to broad, generic claims that attract attention but fail to reach the right students.

    Strong brand strategies don’t try to appeal to everyone. They’re built around intentional differentiation and can clearly articulate who the institution is a right fit for, what it stands for, and what makes its experience distinct. This, in turn, creates deeper engagement that translates into more qualified prospects. 

    When institutional storytelling is rooted in discovery, messaging becomes more precise and authentic. Instead of overpromising or relying on broad aspirational language, marketing communicates real program strengths, expectations, and outcomes. This clarity acts as a filter. Prospective students who see themselves in the story lean in with higher intent, while those who are misaligned self-select out earlier in the funnel.

    For admissions teams, this translates into more productive conversations. Leads arrive with clearer expectations, stronger program fit, and greater readiness to move forward. 

    In short, discovery-led storytelling reduces friction across the funnel. Marketing attracts fewer but better-aligned prospects, admissions spends less time correcting misalignment, and institutions see stronger enrollment outcomes driven by relevance rather than volume.

    Building Marketing-Admissions Alignment

    True alignment requires more than good intentions. It demands shared definitions, shared metrics, and ongoing communication.

    Institutions must define key performance indicators (KPIs) that connect lead quality to enrollment outcomes — such as yield, time to application, and retention — rather than isolating marketing performance from admissions results. When teams agree on what “good” looks like, strategy becomes easier to execute.
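
    As a purely hypothetical illustration of what a shared view of those KPIs might look like, the sketch below computes yield, median time to application and first-year retention by lead source from a single combined dataset; the field names and figures are invented, not any institution’s actual schema.

    ```python
    # Hypothetical illustration: invented field names and data, not a real CRM schema.
    import pandas as pd

    leads = pd.DataFrame({
        "source":       ["search", "search", "social", "social", "referral", "referral"],
        "inquired_on":  pd.to_datetime(["2025-01-05", "2025-01-12", "2025-01-08",
                                        "2025-01-20", "2025-02-01", "2025-02-03"]),
        "applied_on":   pd.to_datetime(["2025-01-20", None, "2025-03-15",
                                        None, "2025-02-10", "2025-02-14"]),
        "enrolled":     [True, False, False, False, True, True],
        "retained_yr1": [True, False, False, False, True, False],
    })

    leads["days_to_apply"] = (leads["applied_on"] - leads["inquired_on"]).dt.days

    # A shared marketing-admissions view: quality per source, not raw lead counts.
    by_source = leads.groupby("source").agg(
        leads=("enrolled", "size"),                       # number of leads from this source
        yield_rate=("enrolled", "mean"),                  # enrolled / leads
        median_days_to_apply=("days_to_apply", "median"),
    )
    # Retention is only meaningful for students who actually enrolled.
    by_source["yr1_retention"] = leads[leads["enrolled"]].groupby("source")["retained_yr1"].mean()
    print(by_source.round(2))
    ```

    The point is less the code than the habit: both teams reading the same per-source quality numbers rather than marketing reporting lead counts and admissions reporting enrollments separately.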

    Messaging, targeting, and follow-up should also be aligned around program goals. Marketing sets expectations honestly and clearly; admissions reinforces those expectations through consistent conversations. Feedback loops allow teams to refine targeting and messaging based on real applicant behavior, not assumptions.

    This approach echoes the mindset shift outlined in my colleague Brian Messer’s recent article, which covered why institutions should stop chasing student leads and focus instead on sustainable enrollment strategies.

    Less Volume, More Conversions

    A smaller pipeline doesn’t mean weaker results. In fact, institutions that prioritize lead quality often see higher conversion rates, stronger retention, and less staff burnout.

    With fewer but better-aligned prospects, admissions teams can focus on meaningful engagement rather than time-consuming, low-yield outreach. Applicants receive clearer guidance, faster responses, and a more personalized experience. And marketing and admissions share accountability for outcomes rather than deflecting responsibility across teams.

    Key Takeaways

    • Lead quality drives stronger enrollment outcomes than raw volume.
    • Discovery is the foundation of high-quality recruitment and clearer positioning.
    • Collaboration between marketing and admissions reduces silos, increases efficiency, and improves yield.

    When marketers prioritize lead quality over lead volume, everyone wins. 

    Improve Lead Quality and Align Marketing and Admissions With Archer

    At Archer Education, we work with your marketing and admissions teams to build sustainable lead generation and enrollment strategies. Our approach focuses on establishing lasting capabilities so that your institution has the tools, training, and insights to operate with confidence. 

    Our enrollment marketing teams conduct deep discovery to inform your campaigns, while our admissions and retention teams provide personalized engagement support to prioritize student success.

    Contact us today to learn more. 

    Source link

  • The Renters’ Rights Act is a disaster for independent students


    The Renters’ Rights Act is a transformative piece of legislation set to benefit renters through greater security and lower costs, except for one major blind spot.

    In particular, it may act as a homelessness pipeline for independent students – the status given by Student Finance England to students without external familial support while at university.

    Particularly vulnerable are those who are estranged, have no living parents, or are care-leavers.

    The summer gap

    One of the key measures in the Renters’ Rights Act is the replacement of fixed-term tenancies with periodic tenancies, i.e., tenancies will be rolling, not fixed-term.

    This benefits most students, as it means that contracts can be terminated by tenants in May or June when the academic year is over, instead of being trapped in a twelve-month fixed-term contract.

    This creates the first major problem for independent students.

    Independent students rarely live exclusively with others who require year-round accommodation, and for many doing so may not be an option. So, instead of the security of a year-long contract guaranteeing accommodation, the landscape may shift so that most shared student rentals are only available between September and June.

    If independent students do manage to seek one another out and live together, this may seem to be one fix to this issue; it isn’t.

    Another key measure in the Renters’ Rights Act is to end no-fault evictions. However, there is a carve-out for student landlords to be able to evict students on a no-fault basis between June and September, provided they live in a student-only HMO. This is a major issue for students who do not have a home to return to.

    Then, there is the option for independent students to live in university halls. Unfortunately, this isn’t a secure option in many universities either. The Renters’ Rights Act allows purpose-built student accommodation to maintain fixed-term contracts. They are often only available from September to June, with providers utilising their accommodation over summer months for other uses.

    Where twelve-month tenancies are available, many purpose-built student accommodation blocks are significantly more expensive than student house shares.

    An independence tax

    Every option available to independent students is likely to add substantial costs. It seems improbable that student landlords will simply swallow the cost of having two or three fewer months of rental income over the academic year. So, there is a strong incentive for student landlords to up the cost of renting for the September to June period to a similar level to what it currently costs for twelve-month contracts.

    While the Renters’ Rights Act allows tenants to challenge unfair rises in rent, this isn’t a particularly effective measure for student housing; students are an incredibly transient group of tenants who can’t challenge an increase prior to being a tenant.

    All that is before considering the loss of the only cost-free workaround for students without a guarantor – upfront rental payments. Often, independent students have avoided the need for a UK-based guarantor by paying several months of rent in advance.

    However, the Renters’ Rights Act is set to curtail this practice by capping the amount of rent a landlord can request upfront. Without the option to pay upfront, these students will be forced toward private guarantor schemes, which are commercial services that typically charge a non-refundable fee in the region of 10 per cent of annual rent.

    Time for an extended maintenance loan

    Without substantially changing the Renters’ Rights Act to the detriment of most students, there seems to be no easy fix available beyond providing additional financial support for independent students.

    Last year, I called for the government to implement an extended maintenance loan aligned with the uplift available for other students who need year-long maintenance support – those on a “long course” – the name for those on a course which runs longer than thirty weeks and three days.

    When I wrote for Wonkhe to launch the campaign for an extended maintenance loan, I predicted that the government would make good on their promise of grants primarily to benefit the Department for Education’s public relations department. This prediction has come true – the government reintroduced grants for the poorest students, on specific courses.

    Unfortunately, this isn’t the progressive silver bullet it sounds like. It means that students on the courses eligible for grants will repay less in the future. This benefit only materialises if, at some point in the future, their income is high enough to repay their loan in full – which the government predicts will be the case for about half of borrowers.

    It’s a nice middle-earner’s income bonus in middle-age for a small number of students. While a step in the right direction and not to be scorned, it’s not the radical progressive reform it’s touted as. It changes nothing for the students struggling to cover basic living costs, for example by being forced to live at home during their degree. According to UCAS, around one-third of undergraduates now live at home, the highest level ever recorded, and that option does not exist for independent students.

    There were some incremental improvements for care-leavers last year, who are no longer to be means assessed if entering higher education after the age of twenty-five. Indeed, the government is making progress on strengthening support for care-leavers.

    Ensuring more robust implementation of care-leaver “Pathway Plans” – a statutory duty which means local authorities must support care-leavers up to the age of twenty-five – would go a long way to helping this specific group with additional costs due to the aforementioned issues, too.

    A new barrier to be broken

    So, the Renters’ Rights Act, which I should be clear I largely support and will myself benefit from, has a blind spot. It’s one I’ve raised, and multiple supportive MPs have raised, too. Independent students, particularly care-leavers, estranged students, and students with no living parents, already have a much higher attrition rate and a large attainment gap.

    This blind spot may lead to homelessness and act as a further deterrent to this group accessing higher education and reaching their full potential. You could say it is a barrier to opportunity, hoisted up by a government committed to breaking all the other barriers down.

    If the government is serious about its “Barriers to Opportunity” mission, it cannot allow a housing reform to become a homelessness pipeline for the very students who have already overcome the most to get to university.

    Source link

  • Higher education postcard: Teesside University


    In August 1856, Joseph Constantine was born in Schleswig-Holstein (then Denmark, later Germany, famously questionable) to British parents: his father, Robert, was an engineer working on the Schleswig-Holstein railway. Joseph went to Newcastle Grammar School and in 1881 moved to Middlesbrough. There he set up in the shipping business, and did very well for himself.

    He was obviously imbued with a passion for Middlesbrough. We learn from the Yorkshire Post and Leeds Intelligencer on 2 July 1930 that:

    Mr Constantine was an active member of the Tees Conservancy Commission, whose work was closely associated with Mr Amos, the general manager. It was to him that Mr. Constantine first broached the idea, in June 1916, of doing something substantial for Middlesbrough. The idea that it should be connected with higher education was his own, but it was Mr. Amos who suggested that a visit should be made to Armstrong College Newcastle.

    Mr Constantine was greatly impressed with the good work of that institution, and made up his mind to provide the youth of his own town with similar educational facilities. It was in the office of the Mayor, then Mr Joseph Calvert, that Mr Constantine disclosed his proposal and the terms of his gift. The prolongation of the war prevented Mr Constantine from seeing the fulfilment of his dream, and the changed conditions made the gift of £40,000 inadequate for the scheme. But the generosity of Mr Constantine’s widow and his family in giving the same amount enabled the building of the college to be accomplished.

    On 6 November 1922 (we read in the next day’s Leeds Mercury) the Middlesbrough Education Committee met, and in order to progress the scheme for a college, constituted itself, with representatives of Joseph Constantine (who may by then have been frail: he died six weeks later), as the governing body of the new college. A site had by then been bought, but commencing the build had run into difficulties. The governing body hence formed a sub-committee to look at other colleges to get ideas for buildings.

    In April 1927 the Town Council awarded the building contract – £65,000 – to Messrs Easton, a Newcastle firm (one alderman objected, arguing that the tender should go to a Middlesbrough firm which had bid at only £100 more). Building work was completed in time for the first students to be enrolled in September 1929. Constantine Technical College was born (Joseph Constantine was, apparently, against the college being named for him, but was persuaded by the mayor).

    It offered what we would now think of as both further and higher education, including University of London external degrees. By 1931 it was appointing its second Principal: Dr T J Murray was appointed from the Smethwick Municipal College, on an annual salary of £900, rising to £1200. ICI was offering scholarships for degree students and the students’ guild was organising its third charity rag, starting on 2 July and lasting for almost two weeks. The events list (from the South Bank Express, 18 June 1932) looked – mostly – good:

    • Saturday: motorized treasure hunt
    • Monday: students night at the Gaumont Palace, including a male beauty chorus and a female beauty competition (the latter open to all girls in Teesside over 16 years old)
    • Wednesday: opening of the amusement park by the beauty queen
    • Thursday: rag dances, three held simultaneously in Middlesbrough, Redcar and Stockton
    • Friday: boxing
    • Saturday: rag day, street collection, parade and jazz concert
    • Monday: mock civic night (presumably some sort of debating competition?)
    • Wednesday: sports day

    The college continued to develop through the 1950s and 1960s. It expanded, as can be seen by the relocation of its art school. In the 1960s there was some agitation for the creation of a technical university for the north east, for which Constantine College must have been in the frame. But these hopes were dashed in 1967, with the Secretary of State confirming that no funds would be available.

    The college renamed itself as Constantine College of Technology before becoming the Teesside Polytechnic in 1969. The local college of education was incorporated in the 1970s, and in 1992 it became the University of Teesside (this is the point where, as I wrote about last week, it was in partnership for a while with Durham University for the creation of University College Stockton). In 2009 it was renamed again, as Teesside University.

    Teesside is one of the few universities to have a biological organism named after it. Pseudomonas teessidea is a bacterium which can help to clean contaminated soil, and was discovered by Dr Pattanathu Rahman, then a Teesside University microbiologist.

    Here’s a jigsaw of the postcard – unposted, but I’d guess it dates from the 1930s, not long after the college was opened. And though unposted, there’s still a message:

    Source link

  • Students Should Insure an Investment as Important as College


    To the editor:

    We appreciate the opportunity to respond to the recent opinion essay “Degrees of Uncertainty” (Dec. 15, 2025). The author raises important questions about rising college costs, institutional incentives and the risks of oversimplifying complex financial challenges facing students and families. We are pleased that she recognizes that Loan Repayment Assistance Programs (LRAPs) help address affordability challenges and provide many benefits for students and colleges.

    However, the author questions whether students should benefit from a guarantee that their college degree will be economically valuable. 

    LRAPs are, at their core, student loan insurance. It can be scary to borrow large student loans to finance an expensive college degree. There is a market failure, however, every time a student does not attend their preferred college, study their preferred major or pursue their preferred career because they are afraid of student loans. Students should be free to pursue their passions—not forced into second-best choices because of the cost of the degree or the prospect of a lower income in the future.  

    Society also loses out—especially if the lower-income career a student wants to pursue is a human service profession, such as education, where they will invest in improving the lives of others. 

    Most purchases come with a warranty or guarantee. Why should college be different? Colleges promise to provide value to students. We applaud those colleges and universities that stand behind that promise with a financial guarantee.

    As consumers, we routinely insure our biggest risks and largest purchases. We insure our homes, cars, boats and lives—and even our pets. Why shouldn’t we insure an expensive investment in college? 

    In any class, we can expect some students will earn less than their peers. It is reasonable for students to fear being among that group. An individual student cannot diversify that risk. That is the function of insurance.  

    LRAPs spread the risk across many students, just as insurance does with other familiar risks. Most drivers can’t protect themselves from the chance of being in a car accident and facing large repair and medical expenses. Insurance spreads that risk, turning a small chance of a very large cost into a small premium that protects against that loss. 

    LRAPs serve the same function for students—without the cost—because colleges cover the program, giving students peace of mind and the freedom to attend their preferred college and pursue their passions. 

    By doing this, LRAPs are a tool that can help colleges increase enrollment and revenue. This additional revenue can be invaluable at a time when colleges face many structural challenges—from regulatory changes to the disruption of AI to declining enrollment caused by the demographic cliff. 

    LRAPs provide meaningful protection to students while maintaining clear incentives to focus on completion, career preparation and postgraduation outcomes.

    Peter Samuelson is president and founder at Ardeo Education Solutions, a loan repayment assistance program provider. 

    Source link

  • Faculty Merit Act Is Meritless (opinion)


    A recent op-ed by David Randall, executive director of the Civics Alliance and director of research at the National Association of Scholars, argues that faculty hiring in American universities has become so corrupt that it requires sweeping legislative intervention. NAS’s proposed Faculty Merit Act would require public universities to publish every higher ed standardized test score—SAT, ACT, GRE, LSAT, MCAT and more—of every faculty member and every applicant for that faculty member’s position across different stages of a faculty search. The goal, they claim, is to expose discrimination and restore meritocracy.


    The proposal’s logic is explicit: If standardized test scores are a reasonable proxy for faculty merit, then a fair search should select someone with a very high score. If average scores decline from round to round, or if the eventual hire scored lower than dozens—or even hundreds—of rejected applicants, the public, Randall argues, should be able to “see that something is wrong.”

    But the Faculty Merit Act rests on a serious misunderstanding of how measurement and selection actually work. Even if one accepts Randall’s premise that a standardized test score “isn’t a bad proxy for faculty merit,” the conclusions he draws simply do not follow. The supposed red flags the proposed act promises to reveal are not evidence of corruption. They are the expected mathematical consequences of using an imperfect measure in a large applicant pool.

    I am a data scientist who works on issues of social justice. What concerns me is not only that NAS’s proposal is statistically unsound, but that it would mislead the public while presenting itself as transparent.

    A Statistical Mistake

    The proposed act depends on a simple idea: If standardized test scores are a reasonable proxy for faculty merit, then a fair search should select someone with a very high score. If the person hired has a lower score than many rejected applicants, or if average scores decline from round to round, something must be amiss.

    This sounds intuitive. It is also wrong.

    To see why, imagine the following setup. Every applicant has some level of “true merit” for a faculty job—originality, research judgment, teaching ability, intellectual fit. We cannot observe this truth directly. Instead, we observe a standardized test score, which captures some aspects of ability but misses many others. In other words, the test score contains two parts: a signal (the part related to actual merit) and noise (everything else the test does not measure).

    Now suppose a search attracts 300 applicants, as in Randall’s own example. Assume—very generously—that the search committee somehow identifies the single best applicant by true merit and hires that person.

    Here is the crucial point: Even if test scores are meaningfully related to true merit, the best applicant will almost never have the highest test score.

    Why? Because when many people are competing, even moderate noise overwhelms rank ordering. A noisy measure will always misrank some individuals, and the larger the pool, the more dramatic those misrankings become. This is the same reason that ranking professional athletes by a single skill—free-throw percentage, say—would routinely misidentify the best overall players, especially in a large league.

    How Strong Is the Test-Merit Relationship, Really?

    Before putting numbers on this, we should ask a basic empirical question: How strongly do standardized tests actually predict the kinds of outcomes that matter in academia?

    The most comprehensive recent research on the GRE—the test most relevant to graduate education—finds minimal predictive value. A meta-analysis of more than 200 studies found that GRE scores explain just over 3 percent of the variation in graduate outcomes such as GPA, degree completion and licensing exam performance. For graduate GPA specifically—the outcome the test is explicitly designed to predict—GRE scores explained only about 4 percent of the variance.

    These studies assess near-term prediction within the same educational context: GRE scores predicting outcomes for the very students who took the test, measured only a few years later—under conditions maximally favorable to the test’s validity. The NAS proposal extrapolates from evidence that is already weak even under these favorable conditions. It would evaluate faculty hiring using test scores—often SAT scores—taken at age 17, applied to candidates who may now be in their 30s, 40s or older. Direct evidence for that kind of long-term extrapolation is scarce. However, the limited evidence that does exist points towards weak relationships rather than strong ones. For instance, Google’s internal hiring studies famously found “very little correlation” between SAT scores and job performance.

    Taken together, the research suggests that any realistic relationship between standardized test scores and faculty merit is weak—certainly well below the levels needed to support NAS’s proposed diagnostics.

    What This Means in Practice

    The proposed Faculty Merit Act raises an important practical question: Even if standardized test scores contain some information about merit, how useful are they when hundreds of applicants compete for a single job?

    Taking the GRE meta-analysis at face value, standardized test scores correlate with relevant academic outcomes at only about 0.18 (the correlation corresponding to the roughly 3 percent of variance explained noted above, since 0.18 squared is about 0.03). Treating that number as a proxy for faculty merit is already generous, given the decades that often separate testing from hiring and the profound differences between standardized exams and the actual work of a professor. But let us grant it anyway.

    Now, consider a search with 300 applicants. With a correlation of 0.18, I calculate that the single strongest candidate by true merit would typically score only around the 70th percentile on the test—roughly 90th out of 300. In other words, it would be entirely normal for around 90 rejected applicants to have higher test scores than the eventual hire.

    Nothing improper has happened. No favoritism or manipulation is required. This outcome follows automatically from combining a weak proxy with a large applicant pool.

    Even if we assume a much stronger relationship—say, a correlation of 0.30, which already exceeds what the evidence supports for most academic outcomes—the basic conclusion does not change. Under that assumption, I calculate that the best candidate would typically score only around the 80th percentile, corresponding to a rank near 60 out of 300. Dozens of rejected applicants would still have higher test scores than the person who gets the job.
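
    A short Monte Carlo sketch makes this arithmetic easy to check. It treats true merit and the test score as bivariate normal with a chosen correlation, finds the strongest applicant by true merit in each simulated pool of 300, and records where that person ranks on the test. This is my own simplification of the signal-plus-noise setup described above rather than the essay’s exact calculation, but it produces typical ranks in the same region as the figures quoted.

    ```python
    # Monte Carlo check of the "best applicant rarely has the best test score" point.
    # Assumes true merit and the test score are bivariate normal -- a simplification.
    import numpy as np

    rng = np.random.default_rng(42)

    def typical_test_rank(correlation, pool_size=300, trials=10_000):
        """Median test-score rank (1 = highest in the pool) of the top applicant by true merit."""
        ranks = np.empty(trials)
        for t in range(trials):
            merit = rng.standard_normal(pool_size)
            noise = rng.standard_normal(pool_size)
            # Observed score = signal (scaled merit) + noise, giving the chosen correlation.
            score = correlation * merit + np.sqrt(1 - correlation ** 2) * noise
            best = np.argmax(merit)                      # the strongest applicant by true merit
            ranks[t] = (score > score[best]).sum() + 1   # how many applicants out-score them, plus one
        return np.median(ranks)

    for r in (0.18, 0.30):
        print(f"correlation {r:.2f}: the top applicant by merit typically ranks "
              f"about {typical_test_rank(r):.0f}th of 300 on the test")
    ```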

    This is the point the proposal gets exactly backward. The pattern it treats as a red flag—a hire whose test score is lower than that of many rejected applicants—is not evidence of corruption. It is the normal, mathematically expected outcome whenever selection relies on an imperfect measure. Scaling this diagnostic across many searches does not make it informative; it simply reproduces the same expected misrankings at a larger scale.

    Why ‘Scores Dropped Each Round’ Proves Nothing

    The same logic applies to the claim that average test scores should increase at each stage of a search.

    Faculty hiring is not one-dimensional. Early stages might screen for general competence; later stages may emphasize originality, research direction, teaching effectiveness and departmental fit—traits that standardized tests measure poorly or not at all. As a search progresses, committees naturally place less weight on test scores and more weight on other information. When that happens, average test scores among finalists can stay flat or even decline. That pattern does not signal manipulation. It signals that the committee is selecting on dimensions that actually matter for the job.
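
    A companion toy simulation (again my own construction, with invented stage weights) shows how this plays out. Each simulated search screens a pool of 300 on a composite that leans heavily on the test, then lets later rounds weight the dimensions the test does not measure. Averaged over many such searches, the mean test score falls from shortlist to finalists to hire, with no manipulation anywhere in the process.

    ```python
    # Toy model (mine, not the essay's): honest multi-stage selection can still
    # produce falling average test scores from round to round.
    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_search(pool=300, shortlist_size=30, finalist_size=5):
        test = rng.standard_normal(pool)    # standardized test score
        other = rng.standard_normal(pool)   # originality, teaching, fit -- unmeasured by the test
        # Round 1: screen on a composite that leans heavily on the test.
        shortlist = np.argsort(0.7 * test + 0.3 * other)[-shortlist_size:]
        # Round 2: the committee now weights the dimensions the test misses.
        finalists = shortlist[np.argsort(other[shortlist])[-finalist_size:]]
        hire = finalists[np.argmax(other[finalists])]
        return test[shortlist].mean(), test[finalists].mean(), test[hire]

    results = np.array([simulate_search() for _ in range(5000)])
    print("average test score (in SD units) at each stage, over 5,000 honest searches:")
    print(f"  shortlist: {results[:, 0].mean():+.2f}")
    print(f"  finalists: {results[:, 1].mean():+.2f}")
    print(f"  hire:      {results[:, 2].mean():+.2f}")
    ```

    The decline is a selection effect: once candidates have already cleared a test-heavy screen, choosing among them on other strengths naturally favours those whose test scores were not what carried them through.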

    Transparency, Justice and Bad Diagnostics

    Randall’s op-ed, published by the James G. Martin Center for Academic Renewal, frames the proposal as a response to injustice. But transparency based on invalid diagnostics does not mitigate injustice; it produces it.

    Publishing standardized test scores invites the public to draw conclusions that those numbers cannot support—and those conclusions will not fall evenly. Standardized test scores are strongly shaped by socioeconomic background and access to resources. Treating them as a universal yardstick of merit—especially for faculty careers—will predictably disadvantage scholars from marginalized and nontraditional paths.

    From the standpoint of justice, this is deeply concerning. Accountability mechanisms must rest on sound reasoning. Otherwise, they become tools for enforcing hierarchy rather than fairness.

    If the goal is genuine academic renewal, it should begin with renewing our understanding of what numbers can—and cannot—tell us. Merit cannot be mandated by publishing the wrong metrics, and justice is not served by statistical arguments that collapse under careful inspection.

    Chad M. Topaz is a faculty member at Williams College; co-founder of the Institute for the Quantitative Study of Inclusion, Diversity and Equity; and winner of the Mary and Alfie Gray Award for Social Justice from the Association for Women in Mathematics. He is the author of Unlocking Justice: The Power of Data to Confront Inequity and Create Change, forthcoming from Princeton University Press in May, and can be found on Bluesky at @chadtopaz.

    Source link

  • FIRE statement on calls to ban X in EU, UK


    In recent days, senior United Kingdom government officials and members of the European Parliament have threatened to ban the social media platform X in response to a proliferation of sexualized images on the platform, including images of minors, created by user prompts supplied to Grok, X’s artificial intelligence application. 

    The following statement can be attributed to Ari Cohn, FIRE’s lead counsel for tech policy:

    Banning a platform used by tens of millions of EU and UK residents to participate in global conversations would be a grave mistake. 

    X and Grok are tools for communication, much like printing presses and cell phones are tools for communication. If those tools are used to create and share unlawful content, the answer must be to prosecute those individuals responsible, not to shut down a vital communicative hub in its entirety. Free nations that claim to honor the expressive rights of their citizens must recognize that mass censorship is never an acceptable approach to objectionable content or illegal conduct. Just as the United States’ attempt to ban TikTok violated core First Amendment principles, so too would an international ban of a social media platform violate basic tenets of freedom of expression. 

    As we navigate the challenges of technological advances like artificial intelligence, we must reject censorship and top-down governmental control. In our interconnected world, censorship abroad affects all of us, wherever we call home.

    Source link