Author: admin

  • Colleges Must Pursue All Legal Paths for Diversity (opinion)


    Two years ago, the Supreme Court dealt a devastating blow to opportunity in America when it gutted access to higher education for underrepresented groups. That decision was not only legally misguided but also turned a blind eye to the deep inequities that have long shaped our education system. Our colleges and universities scrambled to find lawful tools to ensure that their student bodies still reflected the breadth of talent and promise in this country.

    One of those tools was Landscape, a program recently canceled by the College Board that gave admissions officers data about a student’s high school and neighborhood while explicitly excluding race or ethnicity.

    Standardized test scores and GPAs never tell the whole story. Median family income, access to Advanced Placement courses, local crime rates and other key indicators help admissions officers see the full picture and provide crucial context to help identify high-achieving students from disadvantaged communities. These are students whom universities might otherwise overlook. Tools that give context level the playing field—not by lowering standards, but by lifting students up according to their merit and the obstacles they have overcome.

    The Supreme Court, even in striking down diversity initiatives, still made clear that universities could explore race-neutral alternatives to achieve equity. The use of socioeconomic and geographic factors is exactly such an alternative. Despite U.S. Attorney General Pamela Bondi’s recent nonbinding guidance warning against the use of geographic indicators as “proxies” for race, make no mistake: Abandoning consideration of these elements of an applicant’s background is not a legal requirement but a political choice, reflecting fear rather than courage.

    Without tools that account for the barriers students face, colleges will fall back on practices that overwhelmingly favor the privileged, shutting out low-income and first-generation students who have already beaten the odds. This spoils opportunity for millions, and our campuses and our nation will suffer for it. Diversity is not a box to check; it is a vital engine of education and democracy. Classrooms that bring together students from different walks of life prepare all graduates to lead a diverse society, foster innovation and strengthen our communities.

    We cannot allow the Supreme Court’s decision—and the chilling effect in its wake—to undo decades of progress. And we cannot allow educational institutions to abdicate their responsibility in this moment of crisis. The data that provides broader context for applicants remains available, but without the will to use it, too many doors will remain closed for the students who need them most.

    America has always promised to reward hard work and perseverance, no matter where you come from. That promise rings hollow if we allow the wealthy and well connected to monopolize educational opportunity. Colleges and universities must honor that promise by continuing to seek out and support students who have succeeded against the odds. Fairness demands it, equal opportunity requires it and the future of our country depends on it.

    The authors all serve as state attorneys general: New York Attorney General Letitia James, Connecticut Attorney General William Tong, Delaware Attorney General Kathy Jennings, Illinois Attorney General Kwame Raoul, Minnesota Attorney General Keith Ellison, New Jersey Attorney General Matthew Platkin, Vermont Attorney General Charity Clark and Washington Attorney General Nick Brown.


  • Dear DfE… Show Your Workings 


    This blog was kindly authored by Mike Crone, a final-year law student at the University of Reading. He is developing a series of blogs, articles and research on public law matters, and the future of higher education policy.

    The Issue 

    You look intently at the assessment mark. 52%. That’s it. No why. No how. Just 52%. The comments read something like: ‘Good structure. Engage more critically’. Two sentences to explain twelve weeks of work. And that number now decides your classification, your next step, maybe your career. 

    In the HEPI and Advance HE Student Academic Experience Survey 2025, 58% of respondents stated that all or most of their teaching staff gave them useful feedback. This means, however, that for the other 42% of respondents, around half, a minority, or none of their teaching staff were providing useful feedback. Similarly, 51% of respondents stated that all or a majority of teaching staff provided feedback on draft work, and 58% stated that all or the majority of their teaching staff gave them more general feedback on progress. While some of these figures are welcome, there is an issue of consistency: most students are having a positive experience of feedback from most of their teaching staff, but there are gaps in the system. For example, 14% of students stated that a minority or none of their teaching staff provided them with useful feedback. 

    While these figures have improved over the last five years, the statistics remain concerning. Where useful feedback is lacking, marks may be awarded without transparent explanation, feedback is often vague, and links to assessment rubrics may be missing or inconsistently applied. Without improvements, students are not consistently being shown how to improve, and even where rubrics are introduced, their effectiveness hinges on clarity, training, and implementation, all of which vary widely. If students question the result, they may often be told it falls under ‘academic judgement’. 

    In a system that demands students explain every idea, quote every claim, and justify every argument, surely institutions should be held to the same standard? 

    This would be concerning at any time. But in 2025, it’s urgent. Ninety-three per cent of students now use generative AI tools in their studies, up from 66 per cent just a year ago, according to the HEPI–Kortext Gen AI Survey. As the Guardian reported, thousands of UK university students were caught cheating using AI in the last academic year. The pressure on universities to modernise assessment and restore student trust has never been greater. 

    And as Rohan Selva-Radov highlighted in his HEPI Policy Note Non-Examinable Content: Student access to exam scripts, most students do not even see their exam scripts. If students cannot access the work being judged, feedback loses almost all its value. Transparency begins with access. Without it, fairness collapses. Rohan’s superb recommendations on page 10 of the Policy Note set the foundations for putting this right.  

    The Problem 

    Assessment is the foundation of credibility in higher education. But right now, that foundation is cracking. Markers vary. Some use rubrics carefully. Others rely on instinct. A recent study of programming assignments asked 28 markers to grade the same set of student submissions. The results were wildly inconsistent, and in some criteria, the level of agreement was close to random. Double marking and moderation exist, but they rarely give students clarity. Feedback still often consists of vague phrases like ‘needs depth’ or ‘some repetition’, which give no insight into how the grade was reached. 

    This is not only a pedagogical failure. It raises legal concerns. 

    Under Section 49 of the Consumer Rights Act 2015, universities must provide services with ‘reasonable care and skill’. If a student receives a grade without explanation, it risks breaching that statutory duty. Schedule 2 of the Act lists examples of unfair terms, many of which could be triggered by provisions in student handbooks or teaching contracts. 

    The Equality Act 2010 goes further. Sections 20 and 21 require universities to make reasonable adjustments where a provision, criterion, or practice places disabled students at a substantial disadvantage. Schedule 13 goes into greater depth on the duties of higher education institutions. Vague or unstructured feedback can do exactly that, especially for neurodivergent students who may rely on clarity and structure to improve. Where feedback is not intelligible, impactful, and rubric-aligned, universities may be breaching their anticipatory duty under Section 149 as well as the individual duty under Section 20. 

    Meanwhile, the formats we continue to rely on (long essays and high-stakes exams) are increasingly misaligned with the world graduates inhabit. Essays reward polish and adherence to curriculum style. Exams reward memory under pressure. Both reward conformity. Neither reflects how people learn and work today, especially in an age of technology and AI-supported thinking. 

    If students are learning differently, thinking differently, and writing differently, why aren’t we assessing them differently? 

    The Solution 

    The Department for Education (DfE) has the power to act. The Secretary of State for Education and Minister for Women and Equalities appoints the Office for Students (OfS) and sets regulatory priorities. The OfS was designed as a buffer, not a direct arm of government. But if students cannot trust how their futures are decided, then the DfE must ensure the OfS enforces transparency. This does not mean ministers marking essays. It means regulators requiring clear and fair feedback from institutions. 

    First, every summative assessment should include a short, criterion-linked justification. Paragraphs should be labelled according to the rubric. If a student scored a 2:2 in structure and a 1st in analysis, they should be told so clearly and briefly. It could be as simple as colour-coding the sections of the marking rubric and then highlighting each sentence, paragraph, or section of the feedback to show which colour-coded rubric area it corresponds to.  
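    As a rough illustration of that colour-coding idea, the criterion-to-comment mapping could be represented as simple structured data. The criterion names, colours, grade bands and comments below are invented for the example, not drawn from any institution's rubric:

    ```python
    # Minimal sketch of criterion-linked feedback: each comment is tagged
    # with the rubric criterion (and its colour code) it relates to.
    # All names, colours and bands here are illustrative assumptions.

    RUBRIC = {
        "structure": {"colour": "yellow", "band": "2:2"},
        "analysis": {"colour": "green", "band": "1st"},
    }

    feedback = [
        ("structure", "Clear introduction, but the middle sections overlap."),
        ("analysis", "Excellent critical engagement with counter-arguments."),
    ]

    def render(feedback, rubric):
        """Group comments under their rubric criterion with band and colour."""
        lines = []
        for criterion, info in rubric.items():
            lines.append(f"{criterion.title()} [{info['colour']}] - {info['band']}:")
            for crit, comment in feedback:
                if crit == criterion:
                    lines.append(f"  * {comment}")
        return "\n".join(lines)

    print(render(feedback, RUBRIC))
    ```

    Even this trivial structure makes the link between mark and criterion explicit: every comment is attached to a named rubric area with its grade band, rather than floating free at the end of the script.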

    Second, from September 2025, Jisc is piloting AI-assisted marking tools like Graide, KEATH and TeacherMatic. These systems generate rubric-matched feedback and highlight inconsistencies. They do not replace human markers. They reveal the thinking behind a mark, or its absence. 

    Pilots should be funded nationally. The results should be made public. If AI improves consistency and transparency, it should be integrated with safeguards and moderation. 

    Third, we need fewer mega-assessments and more micro-assessments. Small, frequent tasks: oral analyses, short-answer applications, real-world simulations, timed practicals. These are harder to cheat, easier to mark, and better at testing what matters: judgement, adaptability, and process. 

    British University Vietnam has already piloted an AI-integrated assessment model with a 33 per cent increase in pass rates and a 5.9 per cent rise in overall attainment. This is not theory. It is happening. But that, precisely, is the concern. A jump in attainment might reflect grade inflation or relaxed calibration rather than increased accuracy. Recent studies complicate the AI narrative: a 2025 study in BMC Medical Education found that while AI systems like ChatGPT-4o and Gemini Flash 1.5 performed well in visually observable OSCE tasks (e.g., catheterisation, injections), they struggled with tasks involving communication or verbal interpretation; areas where nuance matters most. 

    Finally, the OfS registration conditions can be updated to require forensic marking as a basic quality measure. The QAA Quality Code can be revised to mandate ‘outcome-reason mapping’. Institutional risk and satisfaction profiles can include indicators like student trust, misconduct rates, and assessment opacity. 

    It is to be noted that, as per the Competition & Markets Authority’s (CMA) guidance and the case of Clark v University of Lincolnshire and Humberside [2000] EWCA Civ 129, if assessment is not transparent, it may not be lawful, and could be left open to judicial challenge. However, it may not be wise to pursue such a challenge through an application for judicial review. The precedent set by Clark, subsequent cases, and the CMA’s guidance almost closes the door to judicial review. But, in turn, it leaves open the door to a civil action for breach of contract.  

    In conclusion, Dear DfE… please see me after class regarding the above. If students must show their workings, then so must academic institutions, with government support. With the population’s ever-increasing appetite for litigation, it would seem prudent to take pre-emptive, collaborative action to mitigate such risks.  


  • Higher Education must help shape how students learn, lead and build the skills employers want most


    For the first time in more than a decade, confidence in the nation’s colleges and universities is rising. Forty-two percent of Americans now say they have “a great deal” or “quite a lot” of confidence in higher education, up from 36 percent last year.  

    It’s a welcome shift, but it’s certainly not time for institutions to take a victory lap. 

    For years, persistent concerns about rising tuition, student debt and an uncertain job market have led many to question whether college was still worth the cost. Headlines have routinely spotlighted graduates who are underemployed, overwhelmed or unsure how to translate their degrees into careers.  

    With the rapid rise of AI reshaping entry-level hiring, those doubts are only going to intensify. Politicians, pundits and anxious parents are already asking: Why aren’t students better prepared for the real world?  

    But the conversation is broken, and the framing is far too simplistic. The real question isn’t whether college prepares students for careers. It’s how. And the “how” is more complex, personal and misunderstood than most people realize.  


    What’s missing from this conversation is a clearer understanding of where career preparation actually happens. It’s not confined to the classroom or the career center. It unfolds in the everyday, often-overlooked experiences that shape how students learn, lead and build confidence.  

    While earning a degree is important, it’s not enough. Students need a better map for navigating college. They need to know from day one that half the value of their experience will come from what they do outside the classroom.  

    To rebuild America’s trust, colleges must point beyond course catalogs and job placement rates. They need to understand how students actually spend their time in college. And they need to understand what those experiences teach them. 

    Ask someone thriving in their career which part of college most shaped their success, and their answer might surprise you. (I had this experience recently at a dinner with a dozen impressive philanthropic, tech and advocacy leaders.) You might expect them to name a major, a key class or an internship. But they’re more likely to mention running the student newspaper, leading a sorority, conducting undergraduate research, serving in student government or joining the debate team.  

    Such activities aren’t extracurriculars. They are career-curriculars. They’re the proving grounds where students build real-world skills, grow professional networks and gain confidence to navigate complexity. But most people don’t discuss these experiences until they’re asked about them.  

    Over time, institutions have created a false divide. The classroom is seen as the domain of learning, and career services is seen as the domain of workforce preparation. But this overlooks an important part of the undergraduate experience: everything in between.  

    The vast middle of campus life — clubs, competitions, mentorship, leadership roles, part-time jobs and collaborative projects — is where learning becomes doing. It’s where students take risks, test ideas and develop the communication, teamwork and problem-solving skills that employers need.  

    This oversight has made career services a stand-in for something much bigger. Career services should serve as an essential safety net for students who didn’t or couldn’t fully engage in campus life, but not as the launchpad we often imagine it to be. 


    We also need to confront a harder truth: Many students enter college assuming success after college is a given. Students are often told that going to college leads to success. They are rarely told, however, what that journey actually requires. They believe knowledge will be poured into them and that jobs will magically appear once the diploma is in hand. And for good reason: we’ve told them as much. 

    But college isn’t a vending machine. You can’t insert tuition and expect a job to roll out. Instead, it’s a platform, a laboratory and a proving ground. It requires students to extract value through effort, initiative and exploration, especially outside the classroom.  

    The credential matters, but it’s not the whole story. A degree can open doors, but it won’t define a career. It’s the skills students build, the relationships they form and the challenges they take on along the way to graduation that shape their future. 

    As more college leaders rightfully focus on the college-to-career transition, colleges must broadcast that while career services plays a helpful role, students themselves are the primary drivers of their future. But to be clear, colleges bear a grave responsibility here. It’s on us to reinforce the idea that learning occurs everywhere on campus, that the most powerful career preparation comes from doing, not just studying. It’s also on us to address college affordability, so that students have the time to participate in campus life, and to ensure that on-campus jobs are meaningful learning experiences.  

    Higher education can’t afford to let public confidence dip again. The value of college isn’t missing. We’re just not looking in the right place. 

    Bridget Burns is the founding CEO of the University Innovation Alliance (UIA), a nationally recognized consortium of 19 public research universities driving student success innovation for nearly 600,000 students. 

    Contact the opinion editor at [email protected]. 

    This story about college experiences was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter. 



  • Chatbots in Higher Education: Benefits, Challenges, and Strategies to Prevent Misuse – Faculty Focus



  • The risk of opening a legislative backdoor into the sector’s pocket


    It’s 2029. The international student fee levy is finally in place, after a complicated legislative passage, further consultation, and squabbles over implementation.

    Still riding high in the polls, though with an eye to accusations of unfunded spending commitments, Reform’s manifesto promises to jack up the levy to 40 per cent, explicitly labelling it a lever to cut net migration and unsurprisingly deaf to its effects on university balance sheets (as well as to arguments that this could in fact reduce the overall take – they have modelling which says it won’t).

    After all, the primary legislation to operationalise the government top-slice of universities’ student income leaves the exact amount of the levy to the discretion of the Secretary of State. It will be a relatively simple laying of regulations to have the new percentage in place by autumn.

    Scratch that – it’s 2032. The Conservatives are back in power (somehow). The industrial strategy has been binned, and with it the underpinnings of the “priority subject areas” that have determined which students and which courses are eligible for maintenance grants. With the pretext that those who benefit from higher education should in later life foot the bill – and the entirely accurate observation that whether maintenance comes in the form of a grant or a loan doesn’t actually affect whether students are “working every hour God sends” to support themselves while studying – the Conservative government decides to end the confusing patchwork of targeted grants it has inherited and (once again) shift student support over to maintenance loans. (Oh, and the levy income will instead be used to plug the growing apprenticeship overspend.)

    Now when the act passed there was nothing that made a cast-iron link between grants and the fee levy – indeed, there’s not a single mention of how the funds should be spent on the face of the bill, because that’s not the kind of thing you can practically legislate for. Backbenchers flagged this, ministers said it was a commitment and they would stick to it, and Labour’s majority held up.

    This hypothetical Tory Treasury is still antsy about expanding the loan book – gilts are still high, the era of rock-bottom interest rates seems a distant memory – and the price of raising borrowing for maintenance is the announcement of a multi-year freeze on tuition fees. Here we go again.

    How about this one: it’s halfway through Labour’s second term in office, and it’s becoming clear that the modular LLE hasn’t really taken off. The demand for several thousand pounds of plan 5 loan debt in return for a short course has, shockingly, not materialised. As happened with the pilot exercise, DfE tries to tempt learners in with student support grants, rather than chunked up maintenance loans. When this doesn’t bear much fruit, as with the modular acceleration programme the next play is to entirely waive tuition fees for technical courses, just deducting them from LLE entitlement instead.

    Despite low demand, the need to keep finding little pots of cash to spend for the incentivising of modular provision has stretched DfE’s willingness to let too much of the levy income go towards maintenance grants for full degrees (especially as, to the surprise of few, the department was never intending to allocate the whole haul to maintenance grants).

    Maybe there’s a damning National Audit Office report. Maybe there are anecdotal reports of spotty financial controls and agents encouraging students onto certain newly launched courses to get access to lump sums of maintenance, rather than for genuine study. With an eye on the next election and the 10-year NHS workforce plan’s final year looming, the thought pops up – wouldn’t it be politically expedient to just bring back grants for nursing students rather than fiddling around with all these industrial strategy bits and pieces?

    Final one. It’s 2038 or something, and the Office for National Statistics is finally approaching the end of its review of the classification of higher education in the national accounts which it began in 2017. To be fair to the beleaguered stats body, each UK nation has either made large changes to its higher education system in the interim, or announced wholesale reviews which have then not led to much change, leading to one pause after another. Finally though, the ONS is in a position to weigh up all the dimensions of the government’s oversight and control of the English higher education sector, which now includes the ability to skim off a set percentage of all international student income – and decides on classification within the public sector.

    All the sector submissions and parliamentary interventions which tried to advocate against the levy on these very grounds – the scare stories of controls on borrowing, limits on senior staff pay, and changes to how accounts are managed – are vindicated. (However, as Julian Gravatt has pointed out in the definitive article on the topic, the government of the day then carefully takes steps to address just enough of the specifics of the ONS’ decision and thus move universities back out of the public sector. It doesn’t want to lose out on the income the levy brings, so instead it makes changes elsewhere, to regulation perhaps, or pensions. It’s all a bit of a mess.)

    Through the trapdoor

    However the government decides to legislate for the fee levy – it might be a standalone bill, or wrapped up in a larger HE Act – it’s going to be a complicated process. Labour backbenchers have been expressing concerns since it was first mooted, but the grafting on of maintenance grants means that it will be harder for MPs to vote against.

    The sector has largely marshalled two arguments against it: that it will enormously destabilise finances, and that it’s unfair and risky to further cross-subsidise home students with international income. On the first, it’s clear that the government is not convinced that there isn’t a bit more to be squeezed, especially as it has seen much of the sector impose year after year of inflation-busting increases to overseas fee sticker prices – it’s probably no surprise that the white paper modelling saw the cost of the levy passed on, even though some universities will be unable to achieve this in practice. It’s still a sensible argument to make, though until we see to what extent the government is slow-rolling a wider package of tuition fee increases it’s hard to know whether it can gain traction.

    Equally, the argument about cross-subsidy is proving and will continue to prove ineffective, given that DfE has hinted its intention to claim that this helps higher education make the case for international student recruitment to the wider public on exactly those grounds.

    But there’s a larger, longer-term case to be made to ministers and parliamentarians, that considers the enormous unintended consequences and political risks that prising open HE balance sheets in this way will enable. Once the backdoor has been installed, it’s there for hostile actors to take advantage of, and for user error to compound the problems. It is verging on a certainty that the legislation will neither restrict the level the levy is set at nor ringfence how its takings are used.

    Now the announcement has been made it’s almost certainly too late, but the need for the government to legislate to make this a reality points to missed opportunities around cooperation on access – a sector-owned and co-funded pot of money for student support and, yes, redistribution would have been far more effective at staying out of the political fray. This levy will be square in the middle of it, for many years to come.


  • The case for collaborative purchasing of digital assessment technology


    Higher education in the UK has a solid background in leveraging scale when purchasing digital content and licenses through Jisc. But when it comes to purchasing specific technology platforms, higher education institutions have tended to go their own way, using distinct specifications tailored to their specific needs.

    There are some benefits to this individualistic approach, otherwise it would not have become the status quo. But as the Universities UK taskforce on transformation and efficiency proclaims a “new era of collaboration”, some of the long-standing assumptions about what can work in a sharing economy are being dusted off and held up to the light to see if they still hold. Efficiency – including finding ways to realise new forms of value with less overall resource input – is no longer a nice-to-have; it’s essential for the sector to remain sustainable.

    At Jisc, licensing manager Hannah Lawrence is thinking about the ways that the sector’s digital services agency can build on existing approaches to collective procurement towards a more systematic collaboration, specifically, in her case, exploring ideas around a collaborative route to procurement for technology that supports assessment and feedback. Digital assessment is a compelling area for possible collaboration, partly because the operational challenges are fairly consistent between institutions – such as exam security, scalability, and accessibility – but also because of the shared pedagogical challenge of designing robust assessments that take account of the opportunities and risks of generative AI technology.

    The potential value in collaboration isn’t just in cost savings – it’s also about working together to test and pilot approaches, and share insight and good practice. “Collaboration works best when it’s built on trust, not just transaction,” says Hannah. “We’re aiming to be transparent and open, respecting the diversity of the sector, and making collaboration sustainable by demonstrating real outcomes and upholding data handling standards and ethics.” Hannah predicts that it may take several years to develop an initial iteration of a joint procurement mechanism, in collaboration with a selection of vendors, recognising that the approach could evolve over years to offer “best-in-class” products at a competitive price to institutions who participate in collective procurement approaches.

    Reviewing the SIKTuation

    One way of learning how to build this new collaborative approach is to look to international examples. In Norway, SIKT is the higher education sector’s shared services agency. SIKT started with developing a national student information system, and has subsequently rolled out, among other initiatives, national scientific and diploma archives, and a national higher education application system – and a national tender for digital assessment.

    In its first iteration, when the technology for digital assessment was still evolving, three different vendors were appointed, but in the most recent version, SIKT appointed one single vendor – UNIwise – as the preferred supplier for digital assessment for all of Norwegian higher education. Universities in Norway are not required to follow the SIKT framework, of course, but there are significant advantages to doing so.

    “Through collaboration we create a powerful lobby,” says Christian Moen Fjære, service manager at SIKT. “By procuring for 30,000 staff and 300,000 students we can have a stronger voice and influence with vendors on the product development roadmap – much more so than any individual university. We can also be collectively more effective in sharing insight across the network, like sample exam questions, for example.” SIKT does not hold views about how students should be taught, but as pedagogy and technology become increasingly intertwined, SIKT’s discussions with vendors are typically informed by pedagogical developments. Christian explains, “You need to know what you want pedagogically to create the specification for the technical solution – you need to think what is best for teaching and assessment and then we can think how to change software to reflect that.”

    For vendors, it’s obviously great to be able to sell your product at scale in this way but there’s more to it than that – serving a critical mass of buyers gives vendors the confidence to invest in developing their product, knowing it will meet the needs of their customers. Products evolve in response to long-term sector need, rather than short-term sales goals.

    SIKT can also flex its muscles in negotiating favourable terms with vendors, and use its expertise and experience to avoid pitfalls in negotiating contracts. A particularly pertinent example is on data sharing, both securing assurances of ethical and anonymous sharing of assessment data, and clarity about ultimate ownership of the data. Participants in the network can benefit from a shared data pool, but all need to be confident both that the data will be handled appropriately and that ultimately it belongs to them, not the vendor. “We have baked into the latest requirements the ability to claw back data – we didn’t have this before, stupid, right?” says Christian. “But you learn as the needs arise.”

    Difference and competition

    In the UK context, the sector needs reassurance that diversity will be accommodated – there’s a wariness of anything that looks like it might be a one-size-fits-all model. While the political culture in Norway is undoubtedly more collectivist than in the UK, Norwegian higher education institutions have distinct missions, and they still compete for prestige and to recruit the best students and staff.

    SIKT acknowledges these differences through a detailed consultation process in the creation of national tenders – a “pre-project” on the list of requirements for any technology platform, followed by formal consultation on the final list, overseen by a steering group with diverse sector representation. But, at the end of the day, realising the value of joining up requires some preparedness to compromise – or, to put it another way, to find and build on areas of similarity rather than dwelling on what are often minor differences. Having a coordinating body like SIKT convene the project helps to navigate these issues. And, of course, some institutions simply decide to go another way, and pay more for a more tailored product. There is nothing stopping them from doing so.

    As far as SIKT is concerned, competition between institutions is best considered in the academic realm, in subjects and provision, as that is what benefits the student. For operations, collaboration is more likely to deliver the best results for both institutions and students. But SIKT remains agnostic about whether specific institutions have a different view. “We don’t at SIKT decide what counts as competitive or not,” says Christian. “Universities will decide for themselves whether they want to get involved in particular frameworks based on whether they see a competitive advantage or some other advantage from doing so.”

    The medium-term horizon for the UK sector, based on current discussions, is a much more networked approach to the purchase and use of technology to support learning and teaching – though it’s worth noting that there is nothing stopping consortia of institutions getting together to negotiate a shared set of requirements with a particular vendor pending the development of national frameworks. There’s no reason to think the learning curve needs to be especially steep. While some of the technical elements could require a bit of thinking through, the sector has a longstanding commitment to sharing and collaboration on high quality teaching and learning, and to some extent what’s being talked about right now is mostly a matter of joining the dots between one domain and another.

    This article is published in association with UNIwise. For further information about UNIwise and the opportunity to collaborate contact Tim Peers, Head of Partnerships.


  • Making grants and the levy work

    Making grants and the levy work

    Opinions vary about the desirability of the levy on international student fees, and the value of the promised return of targeted maintenance grants.

    Rightly so. The announcement, and the descriptions of the policies within it, were political in nature. They were made at a party conference rather than in a ministerial statement or consultation document – they were designed to please some, challenge others, and above all to start a debate.

    And as such, all these opinions are valuable. The government will listen to representations, seek commentary and challenge, and eventually start to spell out some more of the detail and implementation.

    Implementation couldn’t care less about opinions or political expediency. Implementation is a matter of whether something can actually be done, and how.

    My number one priority

    Let’s take the simplistic approach, and call the income the government gets from the levy something around £620m (more on that later).

    In the grand scheme of things that’s not a huge amount of money – we paid out more than £8bn on maintenance loans in 2023-24. However, the much-maligned magic money twig (the OfS’ funding for student access and success) is currently just £273m, and it is ostensibly doing part of the same job as the proposed grants – helping non-traditional students access and succeed in higher education. Of course, it mostly goes on hardship grants these days, which is neither what it is designed for nor any meaningful remedy for a student maintenance system that is not fit for purpose. But that makes the parallel even clearer.

    Any extra money going to students, in this economy and with this level of unwillingness to do anything truly radical about student hardship, is welcome. But the kicker is that it is not enough to be from a deprived background to get the new money – you also need to be studying the right subjects. As we’ve already noted, these are the same “priority subjects” as have been set within the Lifelong Learning Entitlement: vaguely STEMish, but with medicine excluded and architecture and economics added.

    You survived all you been through

    At the end of every cycle UCAS publishes data on acceptances using a fine-grained (CAH level 3) subject lens, separated by level of deprivation – which in England means the IMD quintile. From this we learn that in the most recent data (the 2024 cycle) just under 42 per cent of all England-domiciled accepted applicants from IMD quintile 1 (the most deprived group) were accepted onto a “priority subject”.


    This is a substantially higher proportion than in any other IMD quintile – it is also a substantially higher number: 39,870. We don’t get quite the same level of subject fidelity for offers and applications, but it appears that quintile 1 applicants are also much more likely to apply to priority subjects than any other group, and slightly more likely to receive an offer.

    In other words, as far as we can tell with the available data, there is not really a problem recruiting disadvantaged young people onto courses in subjects that the government is currently keen on.

    It is possible ministers are thinking that adding the grants into the mix would drive these already encouraging numbers up even higher (and away from mere dilettante whims like, er, studying medicine, law, or biology). This would appear to ignore a rather expensive and lengthy experiment that demonstrated that financial concerns (in the form, back then, of the sticker price) do not actually affect applicant behaviour all that much – and when applicant behaviour is already trending in the way you might hope, there is maybe not a lot that needs to be done.

    But if you assume that the entire annual levy covers a single year of grants for everyone in IMD quintile 1 in a priority subject – and let’s use the exact numbers here – we get 39,870 students sharing £620.52m: £15,560 each.

    That is baking in a bunch of assumptions around the way the levy is implemented, the way grant allocations are determined (is IMD, an area-based measure, really the best way to allocate individual grants?), and even whether the entire levy is to be spent directly on grants and nothing else. But if these rather optimistic assumptions are right, we’re slightly above the current maximum loan (£13,762), and beginning to approach the annual minimum wage for those aged 18 to 20 (currently just under £18k). It’s not quite enough to live as a student for a year without working at all, but it would mean someone without any other means of support might not have to “work every hour god sends.”
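    As a sanity check, the back-of-the-envelope division above can be sketched in a few lines – all figures are the ones already quoted in this article, and the model (the whole levy shared equally across one year’s quintile 1 acceptances on priority subjects) is the same deliberately simplistic one used above:

    ```python
    # Back-of-the-envelope: annual levy income shared equally among
    # England-domiciled IMD quintile 1 acceptances on "priority subjects"
    # (2024 cycle figures, as quoted in the article).
    levy_income = 620.52e6   # £620.52m assumed annual levy take
    students = 39_870        # Q1 acceptances on priority subjects

    per_student = levy_income / students
    print(f"£{per_student:,.0f} per student")  # a little over £15,500

    # For comparison (figure from the article):
    max_loan = 13_762        # current maximum maintenance loan
    print(f"£{per_student - max_loan:,.0f} above the current maximum loan")
    ```

    The point of writing it out is how fragile the headline number is: change any of the allocation assumptions (who qualifies, what counts as levy income, how much of it reaches grants) and the per-student figure moves accordingly.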

    I’ll let you be my levy

    Let’s say you are an international student looking to study an integrated (4-5 year) master’s course in biomedical engineering at the University of Leicester. You’d be charged £25,100 a year (plus £6,275 if you do a year overseas, or £3,765 if you do a year in industry). As you are resident outside the UK, you’d pay a deposit of £3,000 up front to secure your place. These figures will vary vastly depending on your choice of course and provider, but that gives you a ballpark figure.

    If you secured your place via an agent, you may have paid a fee up front to them. Your chosen university would also pay the agent a commission for each successful application – these vary hugely, but let’s say it is 20 per cent of your first year of fees. In some cases, your university would also pay a direct fee to the agent, over and above their percentage of fee income. Combined, these can get pretty intense – far into the millions for providers that use agents, with some pushing £30m.

    If you don’t quite meet some of the academic or English language requirements for your course, you may be accepted onto an international foundation year – often offered by another provider, either on behalf of your university or as a standalone course. There will be fees for this too.

    Of course, before you are accepted onto your course, you’ll need a Student visa (formerly Tier 4). For all but a handful of countries, you’ll need evidence (the example given in official guidance is a bank statement) that you currently have enough money to cover your fees for your first year plus nine months of living costs. Your visa will cost £524, and you’ll also pay a healthcare surcharge of £776 for each year of your stay.

    Let’s imagine for a moment that you never made a name for yourself

    If you are looking to design a levy, the first decision that you make will be what constitutes international fee income. Should it be the sticker price – as promoted to students? Should it, for example, include the fees an institution pays to an international agent? Should it include fees that the student pays to another institution for a co-branded international foundation year? Should you factor in that students are already paying a levy of sorts to cover the cost of issuing a visa or of providing access to the NHS? Should it include accommodation fees (or additional course fees) when these are paid directly to the provider?

    Or should a provider pay a proportion of everything it declares as (and auditors agree is) international student fee income? At what point – when the fee is paid, when the course starts, when it is declared? And is there not a case for looking at a levy on agents’ fees – there is big money to be made by agents, and unlike with providers there are no counterarguments about the student experience?

    The modelling I’ve done so far is deliberately simplistic – 6 per cent (or whatever is decided on) of declared fee income in the most recent HESA Finance Data. That’s a valid answer, but it is limited – it is not the same effect as you would get if a university had to pay 6 per cent of every international student’s fee at one of the points above. The Home Office modelling noted that in some cases fees themselves may rise to cover the levy, which may have a knock-on effect on recruitment – and that in other cases providers themselves would swallow the cost.

    If you think about it like that – and also bear in mind the Public First angle on the types of students more likely to be dissuaded by higher fees – it is difficult not to see the regressive nature of the levy: well-off providers, which recruit well-heeled middle-class students from countries where salaries are high, will pay more but will be able to pass the costs on to students. Providers newer to international recruitment, at the price-sensitive end of the market, will lose out either way, and will have to work out whether the recruitment drop from a 6 per cent fee hike costs more than 6 per cent of their current income.
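    That last trade-off can be made concrete with a toy model. It is entirely illustrative – it assumes a flat 6 per cent levy on fee income and that the only variables are the fee rise and the recruitment drop, neither of which captures the real complexity discussed above:

    ```python
    # Toy model of a provider's choice under a 6 per cent levy.
    # Option A: absorb the levy - keep all students, net 94% of fee income.
    # Option B: raise fees 6% to pass the levy on - net income per student
    #           rises, but some recruits are lost to the higher price.
    LEVY = 0.06

    def net_income(fee: float, students: float, fee_rise: float, drop: float) -> float:
        """Net fee income after the levy, given a fee rise and a recruitment drop."""
        return fee * (1 + fee_rise) * students * (1 - drop) * (1 - LEVY)

    # Break-even recruitment drop: where a 6% fee rise yields the same
    # net income as simply absorbing the levy.
    break_even = 1 - 1 / (1 + LEVY)
    print(f"break-even drop: {break_even:.1%}")  # roughly 5.7%
    ```

    On this (admittedly crude) model, a provider that expects a 6 per cent price rise to lose it more than about 5.7 per cent of its recruits is better off absorbing the levy – which is exactly the calculation price-sensitive providers will be forced into, and well-insulated ones won’t.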

    Such a funny thing for me to try to explain

    What if we don’t take the accountant’s way out? What if we calculate a levy based on what individual students actually pay?

    As noted above we don’t know – either generally or individually – what international students pay as fees. We also don’t really know how many students are currently paying them – HESA student data turns up after a quite considerable lag, and not all undergraduates (and no postgraduates!) show up in UCAS data.

    The closest we get to international student numbers, at all levels, in-year has historically been OfS’ HESES collection (which it uses to allocate OfS grant funding). I say historically because, from 2025-26 the information on domicile (previously used “for planning purposes”) will no longer be collected.

    If you want a levy based on what students actually pay, you need a new data collection covering the students involved and how much they have paid that year (perhaps separated out into qualifying and non-qualifying payments) – with all of the early iteration problems that such things bring. Data Futures may eventually get there, but not for a good few years yet.

    Designing a new data collection is not for the faint of heart – we scrapped an entire section of the Higher Education (Freedom of Speech) Act (the bit dealing with income from overseas) primarily because it is a million times easier to torturously audit other data than to collect something new. It would be expensive, both centrally and for individual providers – and it would be commercially sensitive (not all international students pay the same fee for the same course at the same university).

    Know we’re jumping the gun

    At every point in this article, I’ve tried to get across just how broad brush the current details of this policy are. As my colleague Michael notes elsewhere, there is not even clarity that these two halves of an announcement are a part of the same policy, or that it is possible to irrevocably link an income stream with an outgoing like this in the public accounts.

    It is a political announcement, and as such leaping straight to implementation slightly misses the point – like with the “scrapping” of the “fifty per cent participation target” it might well be that how it lands is more important than how it works.

    But as I’ve also tried to show, implementation has no time for political expediency. Real decisions need to be taken, and the current configuration of the sector, of the application cycle, and of the various data collections needs to be taken into account. And there’s a need to consider whether the behavioural changes you are trying to bring about would undermine the very funding flows you are counting on to pay for them – the more parts a policy has, the more unintended consequences there could be.


  • Texas Teachers, Parents Fear STAAR Overhaul Doesn’t Do Enough – The 74

    Texas Teachers, Parents Fear STAAR Overhaul Doesn’t Do Enough – The 74



    Texas public school administrators, parents and education experts worry that a new law replacing the state’s standardized test could increase student stress and the amount of time students spend taking tests, instead of reducing it.

    The new law comes amid criticism that the State of Texas Assessment of Academic Readiness, or STAAR, creates too much stress for students and devotes too much instructional time to the test. The updated system aims to ease the pressure of a single exam by replacing STAAR with three shorter tests, which will be administered at the beginning, middle and end of the year. It will also ban practice tests, which Texas Education Agency Commissioner Mike Morath has said can take up weeks of instruction time and aren’t proven to help students do better on the standardized test. But some parents and teachers worry the changes won’t go far enough and that three tests will triple the pressure.

    The law also calls for the TEA to study how to reduce the weight testing carries on the state’s annual school accountability ratings — which STAAR critics say is one reason why the test is so stressful and absorbs so much learning time — and create a way for the results of the three new tests to be factored into the ratings.

    That report is not due until the 2029-30 school year, and the TEA is not required to implement those findings. Some worry the new law will mean schools’ ratings will continue to heavily depend on the results from the end-of-year test, while requiring students to start taking three exams. In other words: same pressure, more testing.

    Cementing ‘what school districts are already doing’

    The Texas Legislature passed House Bill 8 during the second overtime lawmaking session this year to scrap the STAAR test.

    Many of the reforms are meant to better monitor students’ academic growth throughout the school year.

    For the early and mid-year exams, schools will be able to choose from a menu of nationally recognized assessments approved by the TEA. The agency will create the third test. Under the law, the three new tests will use percentile ranks comparing students to their peers in Texas; the third will also assess a student’s grasp of the curriculum.

    In addition, scores must be released about two days after students take the exam, so teachers can better tailor their lessons to student needs.

    State Sen. Paul Bettencourt, R-Houston, one of the architects behind the push to revamp the state’s standardized test, said he would like the first two tests to “become part of learning” so they can help students prepare for the end-of-year exam.

    But despite the changes, the new testing system will likely resemble the current one when it launches in the 2027-28 school year, education policy experts say.

    “It’s gonna take a couple of years before parents realize, to be honest, that you know, did they actually eliminate STAAR?” said Bob Popinski with Raise Your Hand Texas, an education advocacy nonprofit.

    Since many schools already conduct multiple exams throughout the year, the law will “basically codify what school districts are already doing,” Popinski said.

    Lawmakers instructed TEA to develop a way to measure student progress based on the results from the three tests. But that metric won’t be ready when the new testing system launches in the 2027-28 school year. That means results from the standardized tests, and their weight in the state’s school accountability ratings system, will remain similar to what they are now.

    Every Texas school district and campus currently receives an A-F rating based on graduation benchmarks and how students perform on state tests, their improvement in those areas, and how well they educate disadvantaged students. The best score out of the first two categories accounts for most of their overall rating. The rest is based on their score in the last category.

    The accountability ratings are high stakes for school districts, which can face state sanctions for failing grades — from being forced to close school campuses to the ousting of their democratically elected school boards.

    Supporters of the state’s accountability system say it is vital to assess whether schools are doing a good job at educating Texas children.

    “The last test is part of the accountability rating, and that’s not going to change,” Bettencourt said.

    Critics say the current ratings system fails to take into account a lot of the work schools are doing to help children succeed outside of preparing them for standardized tests.

    “Our school districts are doing a lot of interesting, great things out there for our kids,” Popinski said. “Academics and extracurricular activities and co-curricular activities, and those just aren’t being incorporated into the accountability report at all.”

    In response to calls to evaluate student success beyond testing, HB 8 also instructs the TEA to track student participation in pre-K, extracurriculars and workforce training in middle schools. But none of those metrics will be factored into schools’ ratings.

    “There is some other interest in looking at other factors for accountability ratings, but it’s not mandated. It’s just going to be reviewed and surveyed,” Bettencourt said.

    Student stress worries

    Even though many schools already conduct testing throughout the year, Popinski said the new system created by HB 8 could potentially boost test-related stress among students.

    State Rep. Brad Buckley, R-Salado, who sponsored the testing overhaul in the Texas House, wrote in a statement that “TEA will determine testing protocols through their normal process.” This means it will be up to TEA to decide whether to keep or change the rules that it currently uses for the STAAR test. Those include that schools dedicate three to four hours to the exam and that administrators create seating charts, spread out desks and manage restroom breaks.

    School administrators said the worst-case scenario would be if all three of the new tests had to follow lockdown protocols like the ones that currently come with STAAR. Holly Ferguson, superintendent of Prosper ISD, said the high-pressure environment associated with the state’s standardized test makes some of her students ill.

    “It shouldn’t be that we have kids sick and anxiety is going through the roof because they know the next test is coming,” Ferguson said.

    The TEA did not respond to a request for comment.

    HB 8 also seeks to limit the time teachers spend preparing students for state assessments, partly by banning benchmark tests for grades 3-8. Bettencourt told the Tribune the new system is expected to save 22.5 instructional hours per student.

    Buckley said the new law “will reduce the overall number of tests a student takes as well as the time they spend on state assessments throughout the school year, dramatically relieving the pressure and stress caused by over-testing.”

    But some critics worry that any time saved by banning practice tests will be lost by testing three times a year. In 2022, Florida changed its testing system from a single exam to three tests at the beginning, middle and end of the year. Florida Gov. Ron DeSantis said the new system would reduce test time by 75%, but the number of minutes students spent taking exams almost doubled the year the new system went into effect.

    Popinski added that much of the stress the test induces comes from the heavy weight the end-of-year assessment holds on a school’s accountability rating. The pressure to perform that the current system places on school district administrators transfers to teachers and students, critics have said.

    “The pressures are going to be almost exactly the same,” Popinski said.

    What parents, educators want for the new test

    Retired Fort Worth teacher Jim Ekrut said he worries about the ban on practice tests, because in his experience, test preparations helped reduce his students’ anxiety.

    Ekrut said teachers’ experience assessing students is one reason why educators should be involved in creating the new end-of-year exam.

    “The better decisions are going to be made with input from people right on that firing line,” Ekrut said.

    HB 8 requires that a committee of educators appointed by the commissioner review the new test that TEA will create. Some, like Ferguson and David Vinson, the former superintendent of Wylie ISD who started at Conroe this week, said they hope the menu of possible assessments districts can pick for the first two tests includes a national program they already use called Measures of Academic Progress, or MAP.

    The Prosper and Wylie districts are some that administer MAP exams at the beginning, middle and end of the year. More than 4,500 school districts nationwide use these online tests, which change the difficulty of the questions as students log their answers to better assess their skill level and growth. A 2024 study conducted by the organization that runs MAP found that the test is a strong indicator of how students perform on the end-of-year standardized test.

    Criterion-referenced tests like STAAR measure a student’s grasp of grade-level skills, whereas norm-referenced exams like MAP measure a student’s growth over the course of instruction. Vinson described this program as a “checkup,” while STAAR is an “autopsy.”

    Rachel Spires, whose children take MAP tests at Sunnyvale ISD, said MAP testing doesn’t put as much pressure on students as STAAR does.

    Spires said her children’s schedules are rearranged for the month of April, when Sunnyvale administers the STAAR test, and parents are barred from coming to campus for lunch. MAP tests, on the other hand, typically take less time to complete, and the school has fewer rules for how they are administered.

    “When the MAP tests come around, they don’t do the modified schedules, and they don’t do the review packets and prep testing or anything like that,” Spires said. “It’s just like, ‘Okay, tomorrow you’re gonna do a MAP test,’ and it’s over in like an hour.”

    For Ferguson, the Prosper ISD superintendent, a relaxed environment around testing is key to achieving the new law’s goal of reducing student stress.

    “If it’s just another day at school, I’m all in,” Ferguson said. “But if we lock it down, and we create a very compliance-driven system that’s very archaic and anxiety- and worry-inducing to the point that it starts having potential harmful effects on our kids … our teachers and our parents, I’m not okay with that.”

    This article originally appeared in The Texas Tribune at https://www.texastribune.org/2025/09/24/texas-staar-replacement-map-testing/. The Texas Tribune is a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.




  • Championing Teachers in High-Conflict Contexts

    Championing Teachers in High-Conflict Contexts

    Myssan Al Laysy Stouhi

    For Myssan Al Laysy Stouhi, the path to a Ph.D. has been anything but conventional. Born and raised in Lebanon, she has witnessed firsthand the challenges that educators face when teaching becomes an act of resilience rather than routine. Now, as she prepares to graduate this December from Indiana University of Pennsylvania’s Composition and Applied Linguistics program, Stouhi is transforming her lived experience into groundbreaking research that amplifies the voices of teachers working in crisis contexts.

    “I always had this interest because, I mean, I’m Lebanese at the end of the day,” Stouhi reflects. “Since I was born, I always lived and worked in a context, in a high conflict context. So, I wanted to do research that would bring more visibility and attention to what things are like for a teacher in Lebanon.”

    Stouhi’s academic journey began at the American University of Beirut, where she earned both her bachelor’s and master’s degrees in linguistics. After teaching there for several years, she moved to the United Arab Emirates in 2014, spending a decade as a faculty member at the University of Sharjah. It was during this time that she began envisioning a doctoral program that would allow her to continue working while pursuing advanced research.

    “I needed a Ph.D. program that was low residency,” she explains. “I spoke to professors at IUP and found scholars there who work on teacher identity, teacher emotions, teacher psychology, and teaching in crisis contexts, which was always my interest.”

    Her timing proved prescient. Between October 2019 and October 2023, Lebanon experienced what Stouhi describes as “probably the darkest period of time that Lebanon witnessed in its modern history.” The country endured a revolution against government corruption, a currency collapse that wiped out 90% of the Lebanese pound’s value, COVID-19 lockdowns, the devastating Beirut port explosion, war threats, and even earthquakes.

    “It was unbelievably bad,” she recalls.

    These concerns became the foundation for her dissertation: “English as an Additional Language (EAL) Teachers Navigate Lebanese Educational System as a Crisis Context: Challenges and Resources.” Through interviews, focus group discussions, autoethnographies, and field artifacts, she spoke with nine teachers to understand how they navigated professional and personal challenges during this unprecedented period.

    “Students’ classes were suspended in Lebanon before the quarantine, because of the revolution,” she notes. “The students weren’t going regularly to school anyway. I wanted to see what their classrooms were like, what resources they were able to draw on, what resources were absent.”

    Her research philosophy extends beyond documenting hardship.

    “Lebanon is not the only crisis context on earth,” she emphasizes. “We live in a globe of crises. Every country is subject to crises, whether it’s a natural disaster, political thing, financial thing. My ultimate goal: what can the international academic community learn from Lebanese teachers about navigating teaching in a very high-conflict context?”

    Dr. Gloria Park, her dissertation advisor, recognizes Stouhi’s unique contribution to the field.

    “Myssan is one of the most resilient and strong doctoral students I have worked with in the past 17 years at Indiana University of Pennsylvania,” Park states. “Yes, the fact that she is in a Ph.D. program in the U.S. is a form of cultural and symbolic capital, yet her continuous teaching while matriculating in a Ph.D. program to send money to her family in Lebanon as well as help the needy teachers who teach in crisis context is a testament of her commitment and desire to give back to her home country.”

    Stouhi’s non-traditional path through graduate school reflects broader changes in higher education. She participated in IUP’s summers-only high residency program, taking intensive coursework during eight-week summer sessions while maintaining her full-time teaching position. This model allowed her to balance family obligations with academic aspirations — a juggling act she began contemplating as early as 2003.

    Looking ahead, Stouhi plans to join the academic job market while pursuing activist work supporting teachers in underrepresented contexts. She’s already connected with colleagues developing capacity-building programs for Middle Eastern educators and is considering additional training in AI skills and educational leadership.

    Her message to prospective graduate students reflects the pragmatic optimism that has carried her through years of balancing crisis and opportunity. “If your dream is to get a Ph.D., then start a Ph.D. and see what it’s like, and then you can decide if this is for you or not. We make things a lot harder in our heads.”

    As Stouhi prepares to defend her dissertation, she remains connected to her Lebanese roots, visiting family annually and maintaining her commitment to educational justice. Through her research, she is working to ensure that the voices of Lebanese teachers — and by extension, educators facing crises globally — will not be forgotten but celebrated as examples of professional courage in the face of unprecedented challenges. 


  • ANNE D’ALLEVA | The EDU Ledger

    ANNE D’ALLEVA | The EDU Ledger

    Anne D’Alleva has been named president of Binghamton University. D’Alleva is currently the provost and executive vice president for academic affairs at the University of Connecticut.

    D’Alleva has led UConn’s academic enterprise, including strategic planning, budgetary management, faculty development and curriculum innovation across the university’s 14 schools and colleges. She leads initiatives that support student success, faculty excellence and institutional impact.

    D’Alleva is the first woman to serve as provost in UConn’s history. She served as dean of the School of Fine Arts from 2015, having first joined the UConn faculty in 1999 with a joint appointment in Art History and Women’s, Gender and Sexuality Studies. D’Alleva holds a bachelor’s degree in art history from Harvard University, and a master’s and Ph.D. in art history from Columbia University, with a graduate certificate in feminist theory.
