  • Religion and politics aren’t supposed to mix

    Ukrainian President Volodymyr Zelensky says religion was one topic his family never mentioned at the dinner table.

    That could be because he’s from the Jewish minority, or because the overwhelming Orthodox Christian majority was split into different branches.

    Ukraine’s Orthodox have gradually become more Ukrainian, to the detriment of a once-powerful pro-Russian Church, and the trend has sped up now that Kyiv and Moscow are at war.

    The conflict between the pro-Kyiv Orthodox Church of Ukraine (OCU) and the pro-Moscow Ukrainian Orthodox Church (UOC) gets lost in the international coverage of the drama on the battlefield.

    But with about 80% of Ukrainians identifying as Orthodox Christians, even though probably fewer than half attend church regularly, this split between the two Churches seeps into politics.

    Christmas in Kyiv

    The religious conflict crept into the news last month when the pro-Kyiv Church authorized all Ukrainian parishes to celebrate Christmas on December 25 if they wished, rather than the traditional Orthodox date of January 7.

    The symbolism of allowing Christmas to be celebrated on the date used in the West was not lost on Ukrainian believers.

    The roots of this clash go back to the communist period. While Ukraine was part of the Soviet Union, it was under the umbrella of the Russian Orthodox Church.

    When the Soviet Union collapsed in 1991, the Ukrainian Orthodox Church continued to operate in the newly sovereign Ukraine, but proclaimed its loyalty to the Moscow Patriarchate.

    Ukrainian patriots objected and said they deserved their own Church. A rival Kyiv-based Church was created in 1992, soon after Ukraine’s independence; its successor, today’s Orthodox Church of Ukraine, was recognized as autocephalous (independent) by the Ecumenical Patriarchate in Istanbul — the highest authority in Orthodox Christianity — in 2019.

    The politics of praying in Ukrainian

    The two Ukrainian Churches share the same theology, liturgy and even architecture with the Russian Orthodox Church. But the Kyiv Church prays in Ukrainian rather than Church Slavonic and declares allegiance to Ecumenical Patriarch Bartholomew in Istanbul instead of Moscow’s Patriarch Kirill.

    Originally much larger, the Moscow Church saw parishes defecting to its rival, especially after the war began. Under this pressure, the Ukrainian branch declared its independence from Russia in May, condemned the invasion and refused to recognize Patriarch Kirill in its liturgies.

    It’s unclear now which Church is larger. But the head of the Kyiv Patriarchate, Metropolitan Epiphanius, told Religion News Service in May: “Every day, Ukrainians are gradually coming to understand which Church is truly Ukrainian and which Church is Russian.”

    The Moscow Patriarchate tried to wall off Russian-occupied Crimea by creating its own metropolitanate (archdiocese) there in June. The Kyiv Church refused to recognize this.

    When Putin annexed four Ukrainian territories in September — even though he did not completely control them — he tried to justify the move in religious terms, calling it a “glorious spiritual choice.”

    Sermons, spies and the Security Service

    But Kyiv increasingly saw the pro-Moscow Church as a fifth column of spies for Putin. In October, the acting head of Ukraine’s Security Service revealed it had found 33 suspected Russian agents among the Moscow Church’s clergy in Ukraine.

    Some preached pro-Russian sermons, Kyiv said, some had anti-Ukrainian literature and some were army chaplains who passed on information about Ukrainian artillery batteries to Russian agents.

    That’s when the Kyiv Church authorized all Ukrainian parishes to celebrate Christmas on December 25 if they wished. On December 1, Zelensky upped the ante by calling for an official ban on all activities of the Moscow Patriarchate’s Church in Ukraine. Parliament was asked to draft a suitable law, which may be difficult given the Ukrainian constitution’s guarantee of freedom of religion.

    In late December, Ukraine refused to renew the Moscow Church’s lease on the Cathedral of the Dormition at Kyiv’s Monastery of the Caves, traditionally the center of Ukrainian Orthodoxy.

    On January 7, Metropolitan Epiphanius, head of the pro-Kyiv Church, celebrated the traditional Christmas there to show that he was now the man in charge.

    And in its latest turn to faith, Russia called for a 36-hour truce to mark the traditional Christmas on January 7. Kyiv and its Western allies rejected this as a cynical ploy, and both sides continued shelling each other as if nothing had happened.

    The battlefield struggle is still the main story, both in its ultimate importance and in the David-and-Goliath story that readers understand. The religious rivalry will always be secondary.

    But these pinpricks on the faith front add up to a new phase in the growth of local nationalism, which helps buoy Ukrainian morale. In hoping to defeat a country he thought would easily give in, Putin has done more than anyone to forge a united and defiant Ukrainian nation.

    Three questions to consider:

    1. Why do politicians often appeal to religion during a war?

    2. Do mainstream journalists make religious angles clear in a conflict?

    3. When do separate small events add up to a noteworthy trend?

  • Tutoring was supposed to save American kids after the pandemic. The results? ‘Sobering’

    Rigorous research rarely shows that any teaching approach produces large and consistent benefits for students. But tutoring seemed to be a rare exception. Before the pandemic, almost 100 studies pointed to impressive math or reading gains for students who were paired with a tutor at least three times a week and used a proven curriculum or set of lesson plans. 

    Some students gained an extra year’s worth of learning — far greater than the benefit of smaller classes, summer school or a fantastic teacher. These were rigorous randomized controlled trials, akin to the way that drugs or vaccines are tested, comparing test scores of tutored students against those who weren’t. The expense, sometimes surpassing $4,000 a year per student, seemed worth it for what researchers called high-dosage tutoring.

    On the strength of that evidence, the Biden administration urged schools to invest their pandemic recovery funds in intensive tutoring to help students catch up academically. Forty-six percent of public schools heeded that call, according to a 2024 federal survey, though it’s unclear exactly how much of the $190 billion in pandemic recovery funds has been spent on high-dosage tutoring and how many students received it.

    Even with ample money, schools immediately reported problems in ramping up high-quality tutoring for so many students. In 2024, researchers documented either tiny or no academic benefits from large-scale tutoring efforts in Nashville, Tennessee, and Washington, D.C.

    New evidence from the 2023-24 school year reinforces those results. Researchers are rigorously studying large-scale tutoring efforts around the nation and testing whether effective tutoring can be done more cheaply. A dozen researchers studied more than 20,000 students in Miami; Chicago; Atlanta; Winston-Salem and Greensboro, North Carolina; Greenville, South Carolina; schools throughout New Mexico; and a California charter school network. This was also a randomized controlled study in which 9,000 students were randomly assigned to get tutoring and compared with 11,000 students who didn’t get that extra help.

    Their preliminary results were “sobering,” according to a June report by the University of Chicago Education Lab and MDRC, a research organization.

    The researchers found that tutoring during the 2023-24 school year produced only one or two months’ worth of extra learning in reading or math — a tiny fraction of what the pre-pandemic research had produced. Each minute of tutoring that students received appeared to be as effective as in the pre-pandemic research, but students weren’t getting enough minutes of tutoring altogether. “Overall we still see that the dosage students are getting falls far short of what would be needed to fully realize the promise of high-dosage tutoring,” the report said.

    Monica Bhatt, a researcher at the University of Chicago Education Lab and one of the report’s authors, said schools struggled to set up large tutoring programs. “The problem is the logistics of getting it delivered,” said Bhatt. Effective high-dosage tutoring involves big changes to bell schedules and classroom space, along with the challenge of hiring and training tutors. Educators need to make it a priority for it to happen, Bhatt said.

    Some of the earlier, pre-pandemic tutoring studies involved large numbers of students, too, but those tutoring programs were carefully designed and implemented, often with researchers involved. In most cases, they were ideal setups. There was much greater variability in the quality of post-pandemic programs.

    “For those of us that run experiments, one of the deep sources of frustration is that what you end up with is not what you tested and wanted to see,” said Philip Oreopoulos, an economist at the University of Toronto, whose 2020 review of tutoring evidence influenced policymakers. Oreopoulos was also an author of the June report.

    “After you spend lots of people’s money and lots of time and effort, things don’t always go the way you hope. There’s a lot of fires to put out at the beginning or throughout because teachers or tutors aren’t doing what you want, or the hiring isn’t going well,” Oreopoulos said.

    Another reason for the lackluster results could be that schools offered a lot of extra help to everyone after the pandemic, even to students who didn’t receive tutoring. In the pre-pandemic research, students in the “business as usual” control group often received no extra help at all, making the difference between tutoring and no tutoring far more stark. After the pandemic, students — tutored and non-tutored alike — had extra math and reading periods, sometimes called “labs” for review and practice work. More than three-quarters of the 20,000 students in this June analysis had access to computer-assisted instruction in math or reading, possibly muting the effects of tutoring.

    The report did find that cheaper tutoring programs appeared to be just as effective (or ineffective) as the more expensive ones, an indication that the cheaper models are worth further testing. The cheaper models averaged $1,200 per student and had tutors working with eight students at a time, similar to small group instruction, often combining online practice work with human attention. The more expensive models averaged $2,000 per student and had tutors working with three to four students at once. By contrast, many of the pre-pandemic tutoring programs involved smaller 1-to-1 or 2-to-1 student-to-tutor ratios.

    Despite the disappointing results, researchers said that educators shouldn’t give up. “High-dosage tutoring is still a district or state’s best bet to improve student learning, given that the learning impact per minute of tutoring is largely robust,” the report concludes. The task now is to figure out how to improve implementation and increase the hours that students are receiving. “Our recommendation for the field is to focus on increasing dosage — and, thereby learning gains,” Bhatt said.

    That doesn’t mean schools can simply invest more in tutoring and saturate classrooms with effective tutors. That’s not realistic with the end of federal pandemic recovery funds.

    Instead of tutoring for the masses, Bhatt said researchers are turning their attention to targeting a limited amount of tutoring to the right students. “We are focused on understanding which tutoring models work for which kinds of students.” 

    This story about tutoring effectiveness was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.

  • It’s students that suffer when those supposed to protect them fail

    Twenty-four hours after it published its (“summary”) report into the sudden closure of the Applied Business Academy (ABA), the Office for Students (OfS) published an insight brief on protecting the interests of students when universities and colleges close.

    When a provider is closing, the regulator says that it works with that provider and other bodies to try to reduce the impact on students.

    OfS’ report on ABA is notably quiet on the extent to which it has been successful there – we’ve no idea how many students were real, how many of those that were have successfully switched provider, how much compensation (if any) the impacted students have received, and so on.

    Nor has it talked about its success or otherwise in reducing the impact on students from the closure of ALRA drama school in 2022 or Schumacher College last Autumn.

    A number of campuses have closed in recent years too – there’s no data on that either. And if your course closes (or is cut or merged in a material way), OfS doesn’t even require providers to report it, so it would neither know about it nor feature it on its “current closures” webpage (which plenty of students caught up in a closure will nevertheless find if they google “closed course office for students”).

    The other gap in knowledge thus far concerns the sorts of things you might assume the regulator has noticed, done or considered in the run-up to a closure. The learning is valuable – and so the new brief shares both its experience of closures and “near misses”, and the experiences of some of those directly involved.

    There’s helpful material on the impact on students, communication and record management, and how providers may be affected by the closure of subcontracted or validated delivery partners – and it features anonymised quotes from senior managers and “a student” involved in institutional closures.

    Unexpected hits you between the eyes

    The note suggests that providers consistently underestimate the challenges and “resource-intensive nature” of closure processes – one contributor says:

    The challenge is underestimating the level of work and planning that are needed in different areas. Planning prior to a crisis developing can help the situation hugely.

    Financial complexities often catch institutions unprepared, with many discovering too late how their legal structure significantly impacts rescue options. OfS says that providers need to thoroughly understand their financial position, contractual obligations, and legal options well before any crisis occurs.

    Student data management is also a problem – incomplete or inadequate student records prove nearly useless when transfers become necessary, and data sharing agreements essential for transferring information to other institutions are often neglected until closure is imminent.

    The human impact on students is underestimated. Students face difficulties processing their options without timely information, and providers fail to recognise how closure disproportionately affects those with caring responsibilities, part-time employment, disabilities, or those on placements – all groups who cannot easily relocate. Accommodation arrangements create more complications, with some students locked into tenancy contracts.

    Communication challenges see providers struggling to balance early transparency against having finalised options – it says that many fail to develop clear, student-focused comms plans, resulting in confusion and poor decision-making among those affected.

    Validated and subcontractual partnerships demand special attention – with one leader admitting:

    Our mechanisms were too slow to identify the risks for those students.

    Many have failed to identify and plan for contingencies despite retaining significant responsibility for these students. And refunds and compensation frameworks are neglected too – the one student observes:

    We were told we could claim compensation for reasonable interim costs from our institution, but without clear or prompt guidance on what this could cover, it was hard to feel confident in making decisions.

    It also says that early stakeholder engagement with agencies like UCAS or the OIA (as well as proactive communication with OfS and any other relevant regulators) is critical – in the cases it has seen, delaying that engagement until crisis is imminent misses valuable opportunities for support in protecting student interests.

    The benefits of hindsight

    Despite focusing on risks to continuation of study and on provider response planning and execution, astonishingly the brief never mentions Condition C3 – the core regulation governing these areas.

    Condition C4 (an enhanced version of C3) appears occasionally, but we learn nothing about its application in the cited cases, preventing assessment of the regulatory framework’s effectiveness.

    This all matters because OfS’s fundamental purpose is to assure those enrolling in the provision it regulates of a baseline level of student interest protection – not merely to offer advice.

    And the reality is that the evidence it presents reveals systematic failures across C3’s key requirements. Providers here demonstrated profound gaps in risk assessment and awareness. They “were not fully aware of the risks” from delivery partner failures, with early warning mechanisms that “should have kicked in earlier”, and seem to have failed to conduct the comprehensive risk assessments across all provision types that C3 explicitly requires.

    Mitigation planning fell similarly short of regulatory expectations. Institutions underestimated “the level of work and planning needed” while failing to properly identify alternative study options. Practical considerations like accommodation concerns with “third-party landlords” were overlooked entirely. And plans weren’t “produced in collaboration with students” as both C3 and pages like this promise:

    …we expect providers to collaborate with students to review and refresh the plan on a regular basis.

    Implementation and communication failures undermined student protection. When crises occurred, protection measures weren’t activated promptly, with students reporting “it was difficult to decide what to do next without having all the information in a timely manner.”

    Compensation processes generated confusion rather than clarity. Delivery partners neglected to inform lead institutions of closure risks, while information sharing was often restricted to “a smaller group of staff,” reducing planning capacity precisely when broad engagement was needed.

    And C3’s requirements regarding diverse student needs seem to have been unaddressed too. Support for students with additional needs proved inadequate in practice, while international students faced visa vulnerabilities that should have been anticipated.

    C3 also requires plans to be “published in a clear and accessible way” and “revised regularly” – requirements evidently unmet here, with evidence suggesting some providers maintained static protection measures that proved ineffective when actually needed.

    Has anyone been held to account for those failings? And for its own part, if OfS knew that ABA was in trouble (partly via Ofsted and partly via the DfE switching off the loans tap), even if C4 wasn’t applied, was C3 compliance scrutinised? Will other providers be held to account if they fail in similar ways? We are never told.

    The more the world is changing

    The questions pile up the further into the document you get. Given the changed financial circumstances in the sector, and the filing cabinet that must be full of providers at “enhanced risk” of financial problems, why hasn’t OfS issued revised C3 guidance? If anyone’s reading inside the regulator, I’ve had a go, based on the brief, at the redraft that former OfS chair Michael Barber promised back in 2018 (and then never delivered) here – providers wishing to sleep at night should take a look too.

    You also have to wonder if OfS has demanded C3 rewrites of providers who have featured on the front of the Sunday Times, or who have announced redundancies. If it has, there’s not much evidence. There’s clearly a wild mismatch between the often years-old, “very low risks here” statements in “live” SPPs – which I always look for when a redundancy round is threatened – and reality: I keep a live list of providers featured on Queen Mary UCU’s “HE Shrinking” webpage whose SPPs paint a picture of financial stability and infinitesimally small course closure risk, despite many now teaching courses out.

    I’ve posted before about the ways in which things like “teach out” sound great in theory but almost always go wrong in practice – with no attempt by OfS to evaluate them, partly because it usually doesn’t know about them. I’m also, to be fair, aware that in multiple cases providers have submitted revised student protection plans to the regulator, only to hear nothing back for months on end.

    Of course, in theory the need for a specific and dedicated SPP may disappear in the future – OfS is consulting on replacing them with related comprehensive information. But when that might apply to existing providers is unknown – so for the time being, OfS’ own protection promises on its website appear to be going unmet, with impunity for the providers not meeting them:

    Student protection plans set out what students can expect to happen should a course, campus, or institution close. The purpose of a plan is to ensure that students can continue and complete their studies, or can be compensated if this is not possible.

    As such the brief reads like a mixture of case studies and “best practice”, with even less regulatory force than a set of summaries from the OIA. The difference here – as the OIA itself regularly identifies – is that the upholding of a complaint against its “Good Practice Framework” won’t be much use if the provider is in administration.

    So whether it’s holes in the wording of C3, problems in predicting what C3’s requirements might mean, a lack of enforcement over what students are being promised now, a need for C3 to be revised and updated, a need for better guidance in light of cases surrounding it, or a need for all of these lessons to be built into its new proposed C5 (and then implemented across the existing regulated sector), what OfS has done is pretty much reveal that students should have no trust in the protection arrangements currently on offer.

    And for future students, wider lessons – on the nature of what is and isn’t being funded, and whether the risks can ever be meaningfully mitigated – are entirely absent here too.

    Amidst cuts that OfS itself is encouraging, the picture from a course or campus closure point of view is bad enough: OfS has consistently failed to define “material component” in the SPP guidance, and a breadth of providers either have clauses that give them too much power to vary from what was promised, or pretend their clauses allow them to merge courses or slash options when they don’t. The same goes for the tactic of telling students about changes a couple of weeks before term starts, when the “offer” of “you can always break the contract on your side” is a pretty pointless one.

    But from a provider collapse perspective, it’s unforgivable. Whatever is done in the future on franchising, you’d have to assume that many of the providers already look pretty precarious now – and will be even more so if investigations (either by the government or newspapers) reveal more issues, or if OfS makes them all register (where the fit and proper person test looks interesting), or if the government bans domestic agents.

    And anyone that thinks that it’s only franchised providers that look precarious right now really ought to get their head across the risk statements in this year’s crop of annual accounts.

    Back in 2017, when DfE consulted on the Regulatory Framework on behalf of the emerging OfS, it promised that were there to be economic changes that dramatically affected the sustainability of many providers, the regulator would work with providers to improve their student protection plans so that they remained “strong” and “deliverable” in service of the student interest.

    So far they’ve proved to be weak and undeliverable. Whether that’s DfE’s fault for not getting the powers right, OfS’ for not using them, or ministers’ fault for freezing fees, taking the cap off recruitment and letting cowboys in to trouser wads of tuition fee loan money is an issue for another day. For now, someone either needs to warn students that promises on protection are nonsense, or providers, DfE and OfS need to act now to make good on the promises of protection that they’ve made.
