  • It’s students that suffer when those supposed to protect them fail

    24 hours after it published its “summary” report into the sudden closure of the Applied Business Academy (ABA), the Office for Students (OfS) published an insight brief on protecting the interests of students when universities and colleges close.

    When a provider closes, the regulator says it works with that provider and other bodies to try to reduce the impact on students.

    OfS’ report on ABA is notably quiet on the extent to which it has been successful there – we’ve no idea how many students were real, how many of those that were have successfully switched provider, how much compensation (if any) the impacted students have received, and so on.

    Nor has it discussed its success, or otherwise, in reducing the impact on students of the closure of ALRA drama school in 2022 or of Schumacher College last autumn.

    A number of campuses have also closed in recent years – no idea on those either. And if your course closes (or is cut or merged in a material way), OfS doesn’t even require providers to report that in, so it would neither know about it nor feature it on its “current closures” webpage (which plenty of students caught up in a closure will nevertheless find if they google “closed course office for students”).

    The other gap in knowledge thus far is the sorts of things that you might assume the regulator has noticed, done or considered in the run-up to a closure. The learning is valuable – and so the new brief shares both its experience of closures and “near misses”, and the experiences of some of those directly involved.

    There’s helpful material on the impact on students, communication and record management, and how providers may be affected by the closure of subcontracted or validated delivery partners – and the brief features anonymised quotes shared by senior managers and “a student” involved in institutional closures.

    Unexpected hits you between the eyes

    The note suggests that providers consistently underestimate the challenges and “resource-intensive nature” of closure processes – one contributor says:

    The challenge is underestimating the level of work and planning that are needed in different areas. Planning prior to a crisis developing can help the situation hugely.

    Financial complexities often catch institutions unprepared, with many discovering too late how their legal structure significantly impacts rescue options. OfS says that providers need to thoroughly understand their financial position, contractual obligations, and legal options well before any crisis occurs.

    Student data management is also a problem – incomplete or inadequate student records prove nearly useless when transfers become necessary, and the data sharing agreements essential for transferring information to other institutions are often neglected until closure is imminent.

    The human impact on students is underestimated. Students face difficulties processing their options without timely information, and providers fail to recognise how closure disproportionately affects those with caring responsibilities, part-time employment, disabilities, or those on placements – all groups who cannot easily relocate. Accommodation arrangements create more complications, with some students locked into tenancy contracts.

    Communication challenges see providers struggling to balance early transparency against having finalised options – it says that many fail to develop clear, student-focused comms plans, resulting in confusion and poor decision-making among those affected.

    Validated and subcontractual partnerships demand special attention – with one leader admitting:

    Our mechanisms were too slow to identify the risks for those students.

    Many have failed to identify and plan for contingencies despite retaining significant responsibility for these students. And refunds and compensation frameworks are neglected too – the student quoted observes:

    We were told we could claim compensation for reasonable interim costs from our institution, but without clear or prompt guidance on what this could cover, it was hard to feel confident in making decisions.

    It also says that early stakeholder engagement with agencies like UCAS or the OIA (as well as proactive communication with OfS and any other relevant regulators) is critical – in the cases it has seen, delaying that engagement until crisis was imminent meant missing valuable opportunities for support in protecting student interests.

    The benefits of hindsight

    Despite focusing on risks to continuation of study and on provider response planning and execution, astonishingly the brief never mentions Condition C3 – the core regulation governing these areas.

    Condition C4 (an enhanced version of C3) appears occasionally, but we learn nothing about its application in the cited cases, preventing assessment of the regulatory framework’s effectiveness.

    This all matters because OfS’s fundamental purpose is to assure those enrolling in the provision it regulates of a baseline level of student interest protection – not merely to offer advice.

    And the reality is that the evidence it presents reveals systematic failures across C3’s key requirements. Providers here demonstrated profound gaps in risk assessment and awareness. They “were not fully aware of the risks” from delivery partner failures, with early warning mechanisms that “should have kicked in earlier”, and seem to have failed to conduct the comprehensive risk assessments across all provision types that C3 explicitly requires.

    Mitigation planning fell similarly short of regulatory expectations. Institutions underestimated “the level of work and planning needed” while failing to properly identify alternative study options. Practical considerations like accommodation concerns with “third-party landlords” were overlooked entirely. And plans weren’t “produced in collaboration with students” as both C3 and pages like this promise:

    …we expect providers to collaborate with students to review and refresh the plan on a regular basis.

    Implementation and communication failures undermined student protection. When crises occurred, protection measures weren’t activated promptly, with students reporting “it was difficult to decide what to do next without having all the information in a timely manner.”

    Compensation processes generated confusion rather than clarity. Delivery partners neglected to inform lead institutions of closure risks, while information sharing was often restricted to “a smaller group of staff,” reducing planning capacity precisely when broad engagement was needed.

    And C3’s requirements regarding diverse student needs seem to have gone unaddressed too. Support for students with additional needs proved inadequate in practice, while international students faced visa vulnerabilities that should have been anticipated.

    C3 also requires plans to be “published in a clear and accessible way” and “revised regularly” – requirements evidently unmet here, with evidence suggesting some providers maintained static protection measures that proved ineffective when actually needed.

    Has anyone been held to account for those failings? And for its own part, if OfS knew that ABA was in trouble (partly via Ofsted and partly via the DfE switching off the loans tap), even if C4 wasn’t applied, was C3 compliance scrutinised? Will other providers be held to account if they fail in similar ways? We are never told.

    The more the world is changing

    The questions pile up the further into the document you get. Given the changed financial circumstances in the sector and the filing cabinet that must be full of providers “at enhanced risk” of financial problems, why hasn’t OfS issued revised C3 guidance? If anyone’s reading inside the regulator, based on the report I’ve had a go at the redraft that former OfS chair Michael Barber promised back in 2018 (and then never delivered) here – providers wishing to sleep at night should take a look too.

    You also have to wonder if OfS has demanded C3 rewrites from providers who have featured on the front of the Sunday Times, or who have announced redundancies. If it has, there’s not much evidence – there’s clearly a wild mismatch between the often years-old “very low risks here” statements in “live” SPPs (which I always look for when a redundancy round is threatened) and what is actually happening on the ground. I keep a live list of providers featured on Queen Mary UCU’s “HE Shrinking” webpage whose SPPs paint a picture of financial stability and infinitesimally small course closure risk, despite many now teaching courses out.

    I’ve posted before about the ways in which things like “teach out” sound great in theory, but almost always go wrong in practice – with no attempt by OfS to evaluate them, partly because it usually doesn’t know about them. I’m also, to be fair, aware that in multiple cases providers have submitted revised student protection plans to the regulator, only to hear nothing back for months on end.

    Of course in theory the need for a specific and dedicated SPP may disappear in the future – OfS is consulting on replacing them with related comprehensive information. But when that might apply to existing providers is unknown – and so for the time being, the protection promises on OfS’ own website appear to be going unmet, with impunity for the providers not meeting them:

    Student protection plans set out what students can expect to happen should a course, campus, or institution close. The purpose of a plan is to ensure that students can continue and complete their studies, or can be compensated if this is not possible.

    As such the brief reads like a mixture of case studies and “best practice”, with even less regulatory force than a set of summaries from the OIA. The difference here – as the OIA itself regularly identifies – is that the upholding of a complaint against its “Good Practice Framework” won’t be much use if the provider is in administration.

    So whether it’s holes in the wording of C3, problems in predicting what C3’s requirements might mean, a lack of enforcement over what students are being promised now, a need for C3 to be revised and updated, a need for better guidance in light of cases surrounding it, or a need for all of these lessons to be built into its new proposed C5 (and then implemented across the existing regulated sector), what OfS has done is pretty much reveal that students should have no trust in the protection arrangements currently on offer.

    And for future students, wider lessons – on the nature of what is and isn’t being funded, and whether the risks can ever be meaningfully mitigated – are entirely absent here too.

    Amidst cuts that OfS itself is encouraging, from a course or campus closure point of view, a mixture of OfS consistently failing to define “material component” in the SPP guidance, and a raft of providers either having clauses that give them too much power to vary from what was promised, or pretending their clauses allow them to merge courses or slash options when they don’t, is bad enough – as is the tactic of telling students of changes a couple of weeks before term starts, when the “offer” that “you can always break the contract on your side” is a pretty pointless one.

    But from a provider collapse perspective, it’s unforgivable. Whatever is done in the future on franchising, you’d have to assume that many of the providers already look pretty precarious now – and will be even more so if investigations (either by the government or newspapers) reveal more issues, or if OfS makes them all register (where the fit and proper person test looks interesting), or if the government bans domestic agents.

    And anyone who thinks that it’s only franchised providers that look precarious right now really ought to get their head across the risk statements in this year’s crop of annual accounts.

    Back in 2017, when DfE consulted on the Regulatory Framework on behalf of the emerging OfS, it promised that were there to be economic changes that dramatically affected the sustainability of many providers, the regulator would work with providers to improve their student protection plans so that they remained “strong” and “deliverable” in service of the student interest.

    So far they’ve proved to be weak and undeliverable. Whether that’s DfE’s fault for not getting the powers right, OfS’ for not using them, or ministers’ fault for freezing fees, taking the cap off recruitment and letting cowboys in to trouser wads of tuition fee loan money is an issue for another day. For now, someone either needs to warn students that promises on protection are nonsense, or providers, DfE and OfS need to act now to make good on the promises of protection that they’ve made.

  • These teens can do incredible math in their heads but fail in a classroom

    When I was 12, my family lived adjacent to a small farm. Though I was not old enough to work, the farm’s owner, Mr. Hall, hired me to man his roadside stand on weekends. Mr. Hall had one rule: no calculators. Technology wasn’t his vibe. 

    Math was my strong suit in school, but I struggled to tally the sums in my head. I weighed odd amounts of tomatoes, zucchini and peppers on a scale and frantically scribbled calculations on a notepad. When it got busy, customers lined up waiting for me to multiply and add. I’m sure I mischarged them.

    I was thinking about my old job as I read a quirky math study published this month in the journal Nature. Nobel Prize-winning economists Abhijit Banerjee and Esther Duflo, a husband-and-wife research team at MIT, documented how teenage street sellers who were excellent at mental arithmetic weren’t good at rudimentary classroom math. Meanwhile, strong math students their same age couldn’t calculate nearly as well as impoverished street sellers.

    “When you spend a lot of time in India, what is striking is that these market kids seem to be able to count very well,” said Duflo, whose primary work in India involves alleviating poverty and raising the educational achievement of poor children.  “But they are really not able to go from street math to formal math and vice versa.”

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    In a series of experiments, Duflo’s field staff in India pretended to be ordinary shoppers and purposely bought unusual quantities of items from more than 1,400 child street sellers in Delhi and Kolkata. A purchase might be 800 grams of potatoes at 20 rupees per kilogram and 1.4 kilograms of onions at 15 rupees per kilogram. Most of the child sellers quoted the correct price of 37 rupees and gave the correct change from a 200 rupee note without using a calculator or pencil and paper. The odd quantities were to make sure the children hadn’t simply memorized the price of common purchases. They were actually making calculations. 
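    To make the arithmetic behind that example purchase concrete, here is a small sketch of my own (not from the study) that tallies the bill the way the experiment describes, using grams and integer rupee prices to keep the sums exact:

    ```python
    # Sketch of the example purchase above: 800 g of potatoes at
    # 20 rupees/kg plus 1.4 kg of onions at 15 rupees/kg,
    # paid for with a 200 rupee note.
    potatoes_g, potato_price_per_kg = 800, 20
    onions_g, onion_price_per_kg = 1400, 15

    # Work in gram-rupees, then divide by 1000 to get rupees.
    total = (potatoes_g * potato_price_per_kg
             + onions_g * onion_price_per_kg) // 1000
    change = 200 - total

    print(total, change)  # 37 163
    ```

    The correct answer of 37 rupees (and 163 rupees change) is what most of the child sellers produced in their heads, with no calculator or paper.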

    However, these same children, the majority of whom were 14 or 15 years old, struggled to solve much simpler school math problems, such as basic division. (After making the purchases, the undercover shoppers revealed their identities and asked the sellers to participate in the study and complete a set of abstract math exercises.)

    The market sellers had some formal education. Most were attending school part time, or had previously been in school for years.

    Duflo doesn’t know how the young street sellers learned to calculate so quickly in their heads – answering that would take a longer anthropological study, observing them over time. But Duflo was able to glean some of their strategies, such as rounding. For example, instead of multiplying 490 by 20, the street sellers might multiply 500 by 20 and then remove 10 of the 20s, or 200. Schoolchildren, by contrast, are prone to making lengthy pencil and paper calculations using an algorithm for multiplication. They often don’t see a more efficient way to solve a problem.
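    The rounding shortcut described above can be written out step by step – again a sketch of my own, just to show the trick explicitly:

    ```python
    # The sellers' rounding trick for 490 x 20: round 490 up to 500,
    # multiply, then subtract the 10 "extra" twenties.
    rounded = 500 * 20       # 10000
    correction = 10 * 20     # the 10 extra twenties: 200
    answer = rounded - correction

    print(answer)                # 9800
    assert answer == 490 * 20    # same result as direct multiplication
    ```

    The shortcut works because 490 × 20 = (500 − 10) × 20, so rounding up and correcting is just the distributive law applied in a direction that is easy to hold in your head.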

    Lessons from this research on the other side of the world might be relevant here in the United States. Some cognitive psychologists theorize that learning math in a real-world context can help children absorb abstract math and apply it in different situations. However, this Indian study shows that this type of knowledge transfer probably won’t happen automatically or easily for most students. Educators need to figure out how to better leverage the math skills that students already have, Duflo said. Easier said than done, I suspect.  

    Related: Do math drills help children learn?

    Duflo says her study is not an argument for either applied or abstract math.  “It would be a mistake to conclude that we should switch to doing only concrete problems because we also see that kids who are extremely good at concrete problems are unable to solve an abstract problem,” she said. “And in life, at least in school life, you’re going to need both.” Many of the market children ultimately drop out of school altogether.

    Back at my neighborhood farmstand, I remember how I magically got the hang of it and rarely needed pencil and paper after a few months. Sadly, the Hall farm is no longer there for the town’s children to practice mental math. It’s now been replaced by a suburban subdivision of fancy houses. 

    Contact staff writer Jill Barshay at 212-678-3595 or barshay@hechingerreport.org.

    This story about applied math was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.
