Tag: research

  • The Society for Research into Higher Education in 2005


    by Rob Cuthbert

    In SRHE News and Blog a series of posts is chronicling, decade by decade, the progress of SRHE since its foundation 60 years ago in 1965. As always, our memories are supported by some music of the times.

    In 2005 Hurricane Katrina hit the Gulf Coast in the USA, and a Kashmir earthquake in Pakistan killed 86,000. In London 52 people died in the 7/7 suicide bombings; Jean Charles de Menezes, wrongly suspected of being a fugitive terrorist, was killed by London police officers. Labour under Tony Blair won its third successive victory in the 2005 UK general election, George W Bush was sworn in for his second term as US President, and Angela Merkel became the first female Chancellor of Germany. Pope John Paul II died and was succeeded by Pope Benedict XVI. Prince Charles married Camilla Parker Bowles. YouTube was founded, Microsoft released the Xbox 360, the Superjumbo Airbus A380 made its first flight and the Kyoto Protocol officially took effect. There was no war in Ukraine as Greece won the Eurovision Song Contest 2005 in Kyiv, thanks to Helena Paparizou with “My Number One” (no, me neither). In a reliable barometer of the times the year’s new words included glamping, microblogging and ransomware. And the year was slightly longer when another leap second was added.

    Higher education in 2005

    So here we are, with many people taking stock of where HE had got to in 2005 – suddenly I see. Evan Schofer (Minnesota) and John W Meyer (Stanford) looked at the worldwide expansion of HE in the twentieth century in the American Sociological Review, noting that: “An older vision of education as contributing to a more closed society and occupational system—with associated fears of “over-education”—was replaced by an open-system picture of education as useful “human capital” for unlimited progress. … currently about one-fifth of the world cohort is now enrolled in higher education.”

    Mark Olssen (Surrey) and Michael A Peters (Surrey) wrote about “a fundamental shift in the way universities and other institutions of higher education have defined and justified their institutional existence” as different governments sought to apply some pressure. Their 2005 article in the Journal of Education Policy traced “… the links between neoliberalism and globalization on the one hand, and neoliberalism and the knowledge economy on the other. … Universities are seen as a key driver in the knowledge economy and as a consequence higher education institutions have been encouraged to develop links with industry and business in a series of new venture partnerships.”

    Åse Gornitzka (Oslo), Maurice Kogan (Brunel) and Alberto Amaral (Porto) edited Reform and Change in Higher Education: Analysing Policy Implementation, also taking a long view of events since the publication 40 years earlier of Great Expectations and Mixed Performance: The Implementation of Higher Education Reforms in Europe by Ladislav Cerych and Paul Sabatier. The 2005 book provided a review and critical appraisal of current empirical policy research in higher education, with Kogan on his home territory writing the first chapter, ‘The Implementation Game’. At the same time another giant of HE research, SRHE Fellow Michael Shattock, was equally at home editing a special issue of Higher Education Management and Policy on the theme of ‘Entrepreneurialism and the Knowledge Society’. That journal had first seen the light of day in 1977, being a creation of the OECD programme on Institutional Management in Higher Education, a major supporter of and outlet for research into HE in those earlier decades. The special issue included articles by SRHE Fellows Ron Barnett and Gareth Williams, and by Steve Fuller (Warwick), who would be a keynote speaker at the SRHE Research Conference in 2006. The journal’s Editorial Advisory Group was a beautiful parade of leading researchers into HE, including among others Elaine El-Khawas (George Washington University, Chair), Philip Altbach (Boston College, US), Chris Duke (RMIT University, Australia), Leo Goedegebuure (Twente), Ellen Hazelkorn (Dublin Institute of Technology), Lynn Meek (University of New England, Australia), Robin Middlehurst (Surrey), Christine Musselin (Centre de Sociologie des Organisations (CNRS), France), Sheila Slaughter (Arizona) and Ulrich Teichler (Gesamthochschule Kassel, Germany).

    I’ve got another confession to make: Shattock had been writing about entrepreneurialism as ‘an idea for its time’ for more than 15 years, paying due homage to Burton Clark. The ‘entrepreneurial university’ was indeed a term “susceptible to processes of semantic appropriation to suit particular agendas”, as Gerlinde Mautner (Vienna) wrote in Critical Discourse Studies. It was a concept that seemed to break through to the mainstream in 2005 – witness a survey by The Economist, ‘The Brains Business’, which said: “America’s system of higher education is the best in the world. That is because there is no system … Europe hopes to become the world’s pre-eminent knowledge-based economy. Not likely … For students, higher education is becoming a borderless world … Universities have become much more businesslike, but they are still doing the same old things … A more market-oriented system of higher education can do much better than the state-dominated model”. You could have it so much better, said The Economist.

    An article by Simon Marginson (then Melbourne, now Oxford via UCL), ‘Higher Education in the Global Economy’, noted that “… a new wave of Asian science powers is emerging in China (including Hong Kong and Taiwan), Singapore and Korea. In China, between 1995 and 2005 the number of scientific papers produced each year multiplied by 4.6 times. In South Korea … 3.6 times, in Singapore 3.2. … Between 1998 and 2005 the total number of graduates from tertiary education in China increased from 830,000 to 3,068,000 ….” (and Coldplay sang China all lit up). Ka Ho Mok (then Hang Seng University, Hong Kong) wrote about how Hong Kong institutional strategies aimed to foster entrepreneurship. Private education was booming, as Philip Altbach (Boston College) and Daniel C Levy (New York, Albany) showed in their edited collection, Private Higher Education: a Global Revolution. Diane Reay (Cambridge), Miriam E David and Stephen J Ball (both IoE/UCL) reminded us that disadvantage was always with us, as we now had different sorts of higher educations, offering Degrees of choice: class, race, gender and higher education.

    The 2005 Oxford Review of Education article by SRHE Fellow Rosemary Deem (Royal Holloway) and Kevin J Brehony (Surrey), ‘Management as ideology: the case of ‘new managerialism’ in higher education’, was cited by almost every subsequent writer on managerialism in HE. 2005 was not quite the year in which journal articles first appeared online; like many others, that article only appeared online two years later, in 2007, as publishers digitised their back catalogues. However, by 2005 IT had become a dominant force in institutional management. Libraries were reimagined as library and information services, student administration was done in virtual learning environments, and teaching was under the influence of learning management systems.

    The 2005 book edited by Paul Ashwin (Lancaster), Changing higher education: the development of learning and teaching, reviewed changes in higher education and ways of thinking about teaching and learning over the previous 30 years. Doyenne of e-learning Diana Laurillard (UCL) said: “Those of us working to improve student learning, and seeking to exploit elearning to do so, have to ride each new wave of technological innovation in an attempt to divert it from its more natural course of techno-hype, and drive it towards the quality agenda.” Singh, O’Donoghue and Worton (all Wolverhampton) provided an overview of the effects of eLearning on HE in an article in the Journal of University Teaching & Learning Practice.

    UK HE in 2005

    Higher education in the UK kept on growing. HESA recorded 2,287,540 students in the UK in 2004-2005, of whom 60% were full-time or sandwich students. Universities UK reported a 43% increase in student numbers in the previous ten years, with the fastest rise being in postgraduate numbers, and there were more than 150,000 academic staff in universities.

    Government oversight of HE went from the Department for Education (DfE) to the Department for Education and Employment (DfEE), then in 2001 to the Department for Education and Skills (DfES), which itself would only last until 2007. Gillian Shephard was the last Conservative Secretary of State for Education before the new Labour government in 1997 installed David Blunkett until 2001, when Estelle Morris, Charles Clarke and Ruth Kelly served in more rapid succession. No party would dare to tangle with HE funding in 1997, so a cross-party pact set up the Dearing Review, which reported after the election. Dearing pleaded for its proposals to be treated as a package, but government picked the bits it liked, notably the introduction of an undergraduate fee of £1,000 in 1998. Perhaps Kelly Clarkson got it right: You had your chance, you blew it.

    The decade after 1995 featured 12 separate pieces of legislation. The Conservative government’s 1996 Education (Student Loans) Act empowered the Secretary of State to subsidise private sector student loans. Under the 1996 Education (Scotland) Act the Scottish Qualifications Authority replaced the Scottish Examination Board and the Scottish Vocational Education Council. There was a major consolidation of previous legislation from the 1944 Education Act onwards in the 1996 Education Act, and the 1997 Education Act replaced the National Council for Vocational Qualifications and the School Curriculum and Assessment Authority with the Qualifications and Curriculum Authority.

    The new Labour government started by abolishing assisted places in private schools with the 1997 Education (Schools) Act (an immediate reward for party stalwarts, echoed almost 30 years later when the next new Labour government started by abolishing VAT relief for private schools). The 1998 Education (Student Loans) Act allowed public sector student loans to be transferred to the private sector, which would prompt much subsequent comment and criticism when tranches of student debt were sold, causing unnecessary trouble. The 1998 Teaching and Higher Education Act established General Teaching Councils for England and Wales, made new arrangements for the registration and training of teachers, and changed HE student financial support arrangements, introducing the £1,000 undergraduate fee. The 1998 School Standards and Framework Act followed, before the 2000 Learning and Skills Act abolished the Further Education Funding Councils and set up the Learning and Skills Council for England, the National Council for Education and Training for Wales, and the Adult Learning Inspectorate. The 2001 Special Educational Needs and Disability Act extended provision against discrimination on grounds of disability in schools, further and higher education.

    The 2004 Higher Education Act established the Arts and Humanities Research Council, created a Director of Fair Access to Higher Education, made arrangements for dealing with students’ complaints and made provisions relating to grants and loans to students in higher and further education. In 2005 in the Journal of Education Policy Robert Jones (Edinburgh) and Liz Thomas (HE Academy) identified three strands – academic, utilitarian and transformative – in policy on access and widening participation in the 2003 White Paper which preceded the 2004 Act. They concluded that “… within a more differentiated higher education sector different aspects of the access discourse will become dominant in different types of institutions.” Which it did, but perhaps not quite in the way they might have anticipated.

    John Taylor (then Southampton) looked much further back, at the long-term implications of the devastating 1981 funding cuts, citing Maurice Kogan and Stephen Hanney (both Brunel): “Before then, there was very little government policy for higher education. After 1981, the Government took a policy decision to take policy decisions, and other points such as access and efficiency moves then followed.”

    SRHE and research into higher education in 2005

    With long experience of engaging with HE finance policy, Nick Barr and Iain Crawford (both LSE) boldly titled their 2005 book Financing Higher Education: Answers From the UK. But policies were not necessarily joined up, and often pointed in different directions, as SRHE Fellow Paul Trowler (Lancaster), Joelle Fanghanel (City University, London) and Terry Wareham (Lancaster) noted in their analysis, in Studies in Higher Education, of initiatives to enhance teaching and learning: “… these interventions have been based on contrasting underlying theories of change and development. One hegemonic theory relates to the notion of the reflective practitioner, which addresses itself to the micro (individual) level of analysis. It sees reflective practitioners as potential change agents. Another relates to the theory of the learning organization, which addresses the macro level … and sees change as stemming from alterations in organizational routines, values and practices. A third is based on a theory of epistemological determinism … sees the discipline as the salient level of analysis for change. … other higher education policies exist … not overtly connected to the enhancement of teaching and learning but impinging upon it in very significant ways in a bundle of disjointed strategies and tacit theories.”

    SRHE Fellow Ulrich Teichler (Kassel) might have been channelling The Killers as he looked on the bright side about the growth of research on higher education in Europe in the European Journal of Education: “Research on higher education often does not have a solid institutional base and it both benefits and suffers from the fact that it is a theme-based area of research, drawing from different disciplines, and that the borderline is fuzzy between researchers and other experts on higher education. But a growth and quality improvement of research on higher education can be observed in recent years …”

    European research into HE had reached the point where Katrien Struyven, Filip Dochy and Steven Janssens (all Leuven) could review evaluation and assessment from the student’s point of view in Assessment & Evaluation in Higher Education: “… students’ perceptions about assessment significantly influence their approaches to learning and studying. Conversely, students’ approaches to study influence the ways in which they perceive evaluation and assessment.” Lin Norton (Liverpool Hope) and four co-authors surveyed teachers’ beliefs and intentions about teaching in a much-cited article in Higher Education: “… teachers’ intentions were more orientated towards knowledge transmission than were their beliefs, and problem solving was associated with beliefs based on learning facilitation but with intentions based on knowledge transmission.” Time for both students and teachers to realise it was not all about you.

    SRHE had more than its share of dislocations and financial difficulties in the decade to 2005. After its office move to Devonshire Street in London in 1995 the Society’s financial position declined steadily, to the point where survival was seriously in doubt. Little more than a decade later we would have no worries, but until then the Society’s chairs having more than one bad day were Leslie Wagner (1994-1995), Oliver Fulton (1996-1997), Diana Green (1998-1999), Jennifer Bone (2000-2001), Rob Cuthbert (2002-2003) and Ron Barnett (2004-2005). The crisis was worst in 2002, when SRHE’s tenancy in Devonshire Street ended. At the same time the chairs of SRHE’s three committees stepped down and SRHE’s funds and prospective income reached their lowest point, sending a shiver down the spine of the governing Council. The international committee was disbanded but the two new incoming committee chairs for Research (Maria Slowey, Dublin City University) and Publications (Rosemary Deem, Royal Holloway) began immediately to restore the Society’s academic and financial health. SRHE Director Heather Eggins arranged a tenancy at the Institute of Physics in 76 Portland Place, conveniently near the previous office. From 2005 the new Director, Helen Perkins, would build on the income stream created by Rosemary Deem’s skilful negotiations with publishers to transform the Society’s finances and raise SRHE up. The annual Research Conference would go from strength to strength, find a long-term home in Celtic Manor, and see SRHE’s resident impresario François Smit persuade everyone that they looked good on the dancefloor. But that will have to wait until we get to SRHE in 2015.

    Rob Cuthbert is editor of SRHE News and the SRHE Blog, Emeritus Professor of Higher Education Management, University of the West of England and Joint Managing Partner, Practical Academics. Email [email protected]. Twitter/X @RobCuthbert. Bluesky @robcuthbert22.bsky.social.


  • Science of Reading Training, Practice Vary, New Research Finds – The 74


    North Carolina is one of several states that have passed legislation in recent years to align classroom reading instruction with the research on how children learn to read. But ensuring all students have access to research-backed instruction is a marathon, not a sprint, said education leaders and researchers from across the country on a webinar from the Hunt Institute last Wednesday.

    Though implementation of the state’s reading legislation has been ongoing since 2021, more resources and comprehensive support are needed to ensure teaching practice and reading proficiency are improved, webinar panelists said.

    “The goal should be to transition from the science of reading into the science of teaching reading,” said Paola Pilonieta, professor at the University of North Carolina at Charlotte who was part of a team that studied North Carolina’s implementation of its 2021 Excellent Public Schools Act.

    That legislation mandates instruction to be aligned with “the science of reading,” the research that says learning to read involves “the acquisition of language (phonology, syntax, semantics, morphology, and pragmatics), and skills of phonemic awareness, accurate and efficient word identification (fluency), spelling, vocabulary, and comprehension.”

    The legislature allocated more than $114 million to train pre-K to fifth grade teachers and other educators in the science of reading through a professional development tool called the Language Essentials for Teachers of Reading and Spelling (LETRS). More than 44,000 teachers had completed the training as of June 2024.

    Third graders saw a two-point drop, from 49% to 47%, in reading proficiency from the 2023-24 to 2024-25 school year on literacy assessments. It was the first decline in this measure since LETRS training began. First graders’ results on formative assessments held steady at 70% proficiency and second graders saw a small increase, from 65% to 66%.

    “LETRS was the first step in transforming teacher practice and improving student outcomes,” Pilonieta said. “To continue to make growth in reading, teachers need targeted ongoing support in the form of coaching, for example, to ensure effective implementation of evidence-based literacy instruction.”

    Teachers’ feelings on the training

    Pilonieta was part of a team at UNC-Charlotte and the Education Policy Initiative at Carolina (EPIC) at UNC-Chapel Hill that studied teachers’ perception of the LETRS training and districts’ implementation of that training. The team also studied teachers’ knowledge of research-backed literacy practices and how they implemented those practices in small-group settings after the training.

    They asked about these experiences through a survey completed by 4,035 teachers across the state from spring 2023 to winter 2024, and 51 hour-long focus groups with 113 participants.

    Requiring training on top of an already stressful job can be a heavy lift, Pilonieta said. LETRS training looked different across districts, the research team found. Some teachers received stipends to complete the training or were compensated with time off, and some were not. Some had opportunities to collaborate with fellow educators during the training; some did not.

    “These differences in support influenced whether teachers felt supported during the training, overwhelmed, or ignored,” Pilonieta said.

    According to the survey responses, teachers perceived the content of the LETRS training as helpful in some ways but had concerns in others.

    Teachers holding various roles found the content valuable in learning about how the brain works, phonics, and comprehension.

    They cited issues, however, with the training’s applicability to varied roles, limited differentiation based on teachers’ background knowledge and experience, redundancy, and a general limited amount of time to engage with the training’s content.

    Varied support from administrators, coaches

    When asking teachers about how implementation worked at their schools, the researchers found that support from administrators and instructional coaches varied widely.

    Teachers reported that classroom visits from administrators with a focus on science of reading occurred infrequently. The main support administrators provided, according to the research, was planning time.

    “Many teachers felt that higher levels of support from coaches would be valuable to help them implement these reading practices,” Pilonieta said.

    Teachers did report shifts in their teaching practice after the training and felt those tweaks had positive outcomes on students.

    The team found other conditions impacted teachers’ implementation: schools’ use of curriculum that aligned to the concepts covered in the training, access to materials and resources, and having sufficient planning time.

    Some improvement in knowledge and practice

    Teachers performed well on assessments after completing the training, but had lower scores on a survey given later by the research team. Pilonieta said this suggests an issue with knowledge retention.

    Teachers scored between 95% and 98% on the LETRS post-training assessment. But in the research team’s survey, scores ranged from 48% to 78%.

    Teachers with a reading license scored higher on all knowledge areas addressed in LETRS than teachers without one.

    When the team analyzed teachers’ recorded small-group reading lessons, 73% were considered high-quality. They found consistent use of explicit instruction, which is a key component of the science of reading, as well as evidence-backed strategies related to phonemic awareness and phonics. They found limited implementation of practices on vocabulary and comprehension.

    Among the low-quality lessons, more than half were for students reading below grade level. Some “problematic practices” persisted in 17% of analyzed lessons.

    What’s next?

    The research team formed several recommendations on how to improve reading instruction and reading proficiency.

    They said ongoing professional development through education preparation programs and teacher leaders can help teachers translate knowledge to instructional change. Funding is also needed for instructional coaches to help teachers make that jump.

    Guides differentiated by grade levels would help different teachers with different needs when it comes to implementing evidence-backed strategies. And the state should incentivize teachers to pursue specialized credentials in reading instruction, the researchers said.

    Moving forward, the legislation might need more clarity on mechanisms for sustaining the implementation of the science of reading. The research team suggests a structured evaluation framework that tracks implementation, student impact, and resource distribution to inform the state’s future literacy initiatives.

    This article first appeared on EdNC and is republished here under a Creative Commons Attribution-NoDerivatives 4.0 International License.



  • The REF helps make research open, transparent, and credible – let’s not lose that


    The pause to reflect on REF 2029 has reignited debate about what the exercise should encompass – and in particular whether and how research culture should be assessed.

    Open research is a core component of a strong research culture. Now is the time to take stock of what has been achieved, and to consider how REF can promote the next stage of culture change around open research.

    Open research can mean many things in different fields, as the UNESCO Recommendation on Open Science makes clear. Wherever it is practiced, open research shifts focus away from outputs and onto processes, with the understanding that if we make the processes around research excellent, then excellent outcomes will follow.

    Trust

    Being open allows quality assurance processes to work, and therefore research to be trustworthy. Although not all aspects of research can be open (sensitive personal data, for example), an approach to learning about the world that is as open as possible differentiates academic research from almost all other routes to knowledge. Open research is not just a set of practices – it’s part of the culture we build around integrity, collaboration and accountability.

    But doing research openly takes time, expertise, support and resources. As a result, researchers can feel vulnerable. They can worry that taking the time to focus on high-quality research processes might delay publication and risk them being scooped, or that including costs for open research in funding bids might make them less likely to be funded; they worry about jeopardising their careers. Unless all actors in the research ecosystem engage, then some researchers and some institutions will feel that they put themselves at a disadvantage.

    Open research is, therefore, a collective action problem, requiring not only policy levers but a culture shift in how research is conducted and disseminated, which is where the REF comes in.

    REF 2021

    Of all the things that influence how research is done and managed in the UK HE sector, the REF is the one that perhaps attracts most attention, despite far fewer funds being guided by its outcome than are distributed to HEIs in other ways.

    One of the reasons for this attention is that REF is one of the few mechanisms to address collective action problems and drive cultural change in the sector. It does this in two ways, by setting minimum standards for a submission, and by setting some defined assessment criteria beyond those minimum standards. Both mechanisms provide incentives for submitting institutions to behave in particular ways. It is not enough for institutions to simply say that they behave in this way – by making submissions open, the REF makes institutions accountable for their claims, in the same way as researchers are made accountable when they share their data, code and materials.

    So, then, how has this worked in practice?

    A review of the main panel reports from REF 2021 shows that evidence of open research was visible across all four main panels, but unevenly distributed. Panel A highlighted internationally significant leadership in Public Health, Health Services and Primary Care (UoA 2) and Psychology, Psychiatry and Neuroscience (UoA 4), while Panel B noted embedded practices in Chemistry (UoA 8) and urged Computer Science and Informatics (UoA 11) to make a wider shift towards open science through sharing data, software, and protocols. Panel C pointed to strong examples in Geography and Environmental Studies (UoA 14) and in Archaeology (UoA 15), where collaboration, transparency, and reproducibility were particularly evident. By contrast, Panel D – and parts of Panel C – showed that what constitutes ‘open research’ can be much more nuanced and varied in these disciplines, which did not always demonstrate how they were engaging with institutional priorities on open research and supporting a culture of research integrity. Overall, then, open research did not feature in the reports on most UoAs.

    It is clear that in 2021 there was progress, guided in part by the inclusion of a clear indicator in the REF guidance. However, there is still a long way to go: open research was understood and evidenced in ways that could exclude some research fields, epistemologies and transparent research practices.

    REF 2029

    With REF 2029, the new People, Culture and Environment element has created a stronger incentive to drive culture change across the sector. Institutions are embracing the move beyond compliance, making openness and transparency a core part of everyday research practice. However, alignment between this sector move, REF policy and funder action remains essential to address this collective action problem and therefore ensure that this progress is maintained.

    To step back now would not only risk slowing, or even undoing, progress, but would send confused signals that openness and transparency may be optional extras rather than essentials for a trusted research system. Embedding this move is not optional: a culture of openness is essential for the sustainability of UK research and development, for the quality of research processes, and for ensuring that outputs are not just excellent, but also trustworthy in a time of mass misinformation.

    Openness, transparency and accountability are key attributes of research, and hallmarks of the culture that we want to see in the sector now and in the future. Critically, coordinated sector-wide, institutional and individual actions are all needed to embed more openness into everyday research practices. This is not just about compliance – it is about a genuine culture shift in how research is conducted, shared and preserved. It is about doing the right thing in the right way. If that is accepted, then we would challenge those advocating for reducing the importance of those practices in the REF: what is your alternative, and will it command public trust?

     

    This article was supported by contributions from:

    Michel Belyk (Edge Hill University), Nik Bessis (Edge Hill University), Cyclia Bolibaugh (University of York), Will Cawthorn (University of Edinburgh), Joe Corneli (Oxford Brookes University), Thomas Evans (University of Greenwich), Eleanora Gandolfi (University of Surrey), Jim Grange (Keele University), Corinne Jola (Abertay University), Hamid Khan (Imperial College, London), Gemma Learmonth (University of Stirling), Natasha Mauthner (Newcastle University), Charlotte Pennington (Aston University), Etienne Roesch (University of Reading), Daniela Schmidt (University of Bristol), Suzanne Stewart (University of Chester), Richard Thomas (University of Leicester), Steven Vidovic (University of Southampton), Eric White (Oxford Brookes University).

    Source link

  • Notes on Research Policy, Here and Abroad

    Notes on Research Policy, Here and Abroad

    Hi all. I thought I would take some time to have a chat about how research policy is evolving in other countries, because I think there are some lessons we need to learn here in Canada.

    One piece of news that struck me this week came from Switzerland, where the federal government is slashing the budget of the Swiss National Science Foundation (SNSF) by 20%. If the Swiss, a technological powerhouse of a nation, with a broad left-right coalition in power and a more or less balanced budget, are cutting back on science like this, then we might all have to re-think the idea that being anti-Science is just a manifestation of right-wing populism. Higher education as a whole has some thinking to do.

    And right now, two countries are in fact re-thinking science quite a bit. In the UK, the new head of UK Research and Innovation (roughly, that country’s One Big Granting Council) has told institutions that they might need to start “doing fewer things but doing them well”, to which Malcolm Press, President of Universities UK and vice-chancellor of Manchester Metropolitan University, added that what he was “hearing from government is that [they] don’t want to be investing in areas of research where we don’t have the quality and we don’t have the scale.” And, the kicker: “You can’t have hobbyist research that’s unfunded going on in institutions. We can’t afford it.”

    Over to Australia, where a few months ago the government set up a Strategic Examination of Research and Development, which released a discussion paper, held consultations and got feedback (which it published), and has now released six more “issue” papers for consultation which detail government thinking in many different and more detailed ways. If this sounds magical to you, it is because you are from Canada, where the standard practice for policymaking is to do everything behind closed doors and treat stakeholders like mushrooms (in the dark with only fecal matter for company), rather than a place where policy-making is treated as a serious endeavour in which public input and expert advice are welcomed.

    For today’s purposes however, what matters is not process but policy. The review is seriously considering a number of fairly radical ideas, such as creating a few national “focus areas” for research funding, which would attract higher rates of overhead, and requiring institutions to focus their efforts in one of these priority areas via mission-based compacts (which are sort of like Ontario’s Multi-Year Agreements, only they are meaningful) so as to build scale and specialization.

    Whew.

    One thing that strikes me as odd about both the UK and Australian lines of thinking is the idea that institutional specialization matters all that much. While lots of research is done at the level of the individual lab, most “big science” – the stuff people who dream about specialization have in mind when they talk about science – happens in teams which span many institutions, and more often than not across national borders as well. I get the sense that the phenomenon of institutional rankings has fried policy makers’ brains somewhat: they seem to think that the correct way to think about science is at the level of the institution, rather than labs or networks of laboratories. It’s kind of bananas. We can be glad that this kind of thinking has not infected Canadian policy too much because the network concept is more ingrained here.

    Which brings me to news here at home. 

    The rumour out of Ottawa is that in the next few months (still not clear if this is going to be fall 2025 or spring 2026) there will be an announcement of a new envelope of money for research. But very definitely not inquiry-driven research. No, this is money which the feds intend to spend as part of the increase in “defence” spending which is supposed to rise to 2% of GDP by 2025-2026 and 5% by 2035. So, the kinds of things it will need to go to will be “security”, likely defined relatively generously. It will be for projects in space, protection of critical infrastructure, resiliency, maybe energy production, etc. I don’t think this is going to be all about STEM and making widgets – there will be at least some room for social science in these areas and maybe humanities, too, though this seems to me a harder pitch to make. It is not clear from what I have heard if this is going to be one big pie or a series of smaller pies, divided up either by mission or by existing granting council. But the money does seem to be on its way.

    Now before I go any further, I should point out that I have not heard anyone say that these new research envelopes are actually going to contain new money beyond what was spent in 2024-25.  As I pointed out a couple of weeks ago, that would be hard to square with the government’s deficit-fighting commitments.

    In fact, if I had to guess right now, the best-case scenario would be that the Liberals will do this by taking some or all of the 88% of the Budget 2024 research commitment to the tri-councils and push it into these new envelopes (worst-case scenario: they nuke the 88% of the 2024 Budget commitment they haven’t yet spent and claw back money from existing commitments to make these new envelopes). 

    So, obviously no push here for institutional specialization, but where our debate echoes those of the UK and Australia is that all three governments seem to want to shift away from broad-based calls for inquiry-driven research and toward more mission-based research in some vaguely defined areas of national priority. I know this is going to irritate and anger many people, but genuinely I don’t see many politically practical alternatives right now. As I said back here: if defending existing inquiry-driven tri-council budgets is the hill the sector chooses to die on, we’re all going to be in big trouble.

    No one will be forcing individual researchers or institutions to be part of this shift to mission-driven research, but clearly that’s where the money is going to be. So, my advice to VPs Research is: get your ducks in a row on this now. Figure out who in your institution does anything that can even tangentially be referred to as “security-enhancing”. Figure out what kinds of pitches you might want to make. Start testing your elevator pitches. There will be rewards for first movers in this area.

    Source link

  • UChicago Sells Off Research Center

    UChicago Sells Off Research Center

    The University of Chicago is selling a celebrated research center as the generously endowed university navigates layoffs and program cuts amid a heavy debt load, Financial Times reported Monday.

    UChicago is reportedly selling the Center for Research in Security Prices, founded in 1960, for $355 million to Morningstar, a research and investment firm also located in Chicago. The center, known as CRSP, developed a market database more than 65 years ago that “allowed investors to measure historic rates of return for U.S. stocks,” according to its website, which notes that its data has been used in more than 18,000 peer-reviewed studies and by hundreds of entities.

    CRSP formally became a limited liability company in 2020 but remained wholly owned by UChicago and maintained its affiliation with the university and the Booth School of Business.

    The sale comes as financial issues are adding up for the university. UChicago has borrowed heavily in recent years and seen substandard returns on its $10 billion endowment. University officials recently announced plans to pause admission to multiple Ph.D. programs and to cut 400 staff jobs as the private institution grapples with a debt load that has grown to $6 billion.

    UChicago is currently trying to shed $100 million in expenses.

    The Trump administration’s cancellation of dozens of federal grants in recent months has also hurt the university’s bottom line. UChicago president Paul Alivisatos wrote in late August that the “profound federal policy changes of the last eight months have created multiple and significant new uncertainties and strong downward pressure on our finances.”

    Source link

  • Education Department takes a preliminary step toward revamping its research and statistics arm

    Education Department takes a preliminary step toward revamping its research and statistics arm

    In his first two months in office, President Donald Trump ordered the closing of the Education Department and fired half of its staff. The department’s research and statistics division, called the Institute of Education Sciences (IES), was particularly hard hit. About 90 percent of its staff lost their jobs and more than 100 federal contracts to conduct its primary activities were canceled.

    But now there are signs that the Trump administration is partially reversing course and wants the federal government to retain a role in generating education statistics and evidence for what works in classrooms — at least to some extent. On Sept. 25, the department posted a notice in the Federal Register asking the public to submit feedback by Oct. 15 on reforming IES to make research more relevant to student learning. The department also asked for suggestions on how to collect data more efficiently.

    The timeline for revamping IES remains unclear, as is whether the administration will invest money into modernizing the agency. For example, it would take time and money to pilot new statistical techniques; in the meantime, statisticians would have to continue using current protocols.

    Still, the signs of rebuilding are adding up. 


    At the end of May, the department announced that it had temporarily hired a researcher from the Thomas B. Fordham Institute, a conservative think tank, to recommend ways to reform education research and development. The researcher, Amber Northern, has been “listening” to suggestions from think tanks and research organizations, according to department spokeswoman Madi Biedermann, and now wants more public feedback.  

    Biedermann said that the Trump administration “absolutely” intends to retain a role in education research, even as it seeks to close the department. Closure will require congressional approval, which hasn’t happened yet. In the meantime, Biedermann said the department is looking across the government to find where its research and statistics activities “best fit.”

    Other IES activities also appear to be resuming. In June, the department disclosed in a legal filing that it had reinstated, or plans to reinstate, 20 of the 101 terminated contracts. Among the activities slated to be restarted are 10 Regional Education Laboratories that partner with school districts and states to generate and apply evidence. It remains unclear how all 20 contracts can be restarted without federal employees to hold competitive bidding processes and oversee them.

    Earlier in September, the department posted eight new jobs to help administer the National Assessment of Educational Progress (NAEP), also called the Nation’s Report Card. These positions would be part of IES’s statistics division, the National Center for Education Statistics. Most of the work in developing and administering tests is handled by outside vendors, but federal employees are needed to award and oversee these contracts. After mass firings in March, employees at the board that oversees NAEP have been on loan to the Education Department to make sure the 2026 NAEP test is on schedule.

    Only a small staff remains at IES. Some education statistics have trickled out since Trump took office, including its first release of higher education data on Sept. 23. But the data releases have been late and incomplete.

    It is believed that no new grants have been issued for education studies since March, according to researchers who are familiar with the federal grant making process but asked not to be identified for fear of retaliation. A big obstacle is that a contract to conduct peer review of research proposals was canceled so new ideas cannot be properly vetted. The staff that remains is trying to make annual disbursements for older multi-year studies that haven’t been canceled. 

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    With all these changes, it’s becoming increasingly difficult to figure out the status of federally funded education research. One potential source of clarity is a new project launched by two researchers from George Washington University and Johns Hopkins University. Rob Olsen and Betsy Wolf, who was an IES researcher until March, are tracking cancellations and keeping a record of research results for policymakers. 

    If it’s successful, it will be a much-needed light through the chaos.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about reforming IES was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


    Source link

  • Is research culture really too hard to assess?

    Is research culture really too hard to assess?

    Assessing research culture has always been seen as difficult – some would say too difficult.

    Yet as REF 2029 pauses for reflection, the question of whether and how culture should be part of the exercise is unavoidable. How we answer this has the potential to shape not only the REF, but also the value we place on the people and practices that define research excellence.

    The push to assess research culture emerged from recognition that thriving, well-supported researchers are themselves important outcomes of the research system. The Stern Review highlighted that sustainable research excellence depends not just on research outputs but on developing the people who produce them. The Harnessing the Metric Tide report built on this understanding, recommending that future REF cycles should reward progress towards better research cultures.

    A significant proportion of what we have learnt about assessing research culture came from the People, Culture and Environment indicators project, run by Vitae and Technopolis, and Research England’s subsequent REF PCE pilot exercise. Together with the broader consultation as part of the Future Research Assessment Programme, this involved considerable sector engagement over multiple years.

    Indicators

    Nearly 1,600 people applied to participate in the PCE indicators co-development workshops. Over 500 participated across 137 institutions, with participants at all levels of career stage and roles. Representatives from ARMA, NADSN, UKRN, BFAN, ITSS, FLFDN and NCCPE helped facilitate the discussions and synthesise messages.

    The workshops confirmed what many suspected about assessing research culture. It’s genuinely difficult. Nearly every proposed indicator proved problematic. Participants raised concerns about gaming and burden. Policies could become tick-box exercises. Metrics might miss crucial context. But participants saw that clusters of indicators used together and contextualised could allow institutions to tell meaningful stories about their approach and avoid the potentially misleading pictures painted by isolated indicators.

    A recurring theme was the need to focus on mechanisms, processes and impacts, not on inputs. Signing up for things, collecting badges, and writing policies isn’t enough. We need to make sure that we are doing something meaningful behind these. This doesn’t mean we cannot evidence progress, rather that the evidence needs contextualising. The process of developing evidence against indicators would incentivise institutions to think more carefully about what they’re doing, why, and for whom.

    The crucial point that seems to have been lost is that REF PCE never set out to measure culture directly. Instead, it aimed to assess credible indicators of how institutions enable and support inclusive, sustainable, high-quality research.

    REF PCE was always intended to be an evolution, not a revolution. Culture has long been assessed in the REF, including through the 2021 Environment criteria of vitality and sustainability. The PCE framework aimed to build on this foundation, making assessment more systematic and comprehensive.

    Finance and diversity

    Two issues levelled at PCE have been the sector’s current financial climate and the difficulty of assessing culture fairly across institutional diversity. These are not new revelations. Both were anticipated and debated extensively in the PCE indicators project.

    Workshop participants stressed that the assessment must recognise that institutions operate with different resources and constraints, focusing on progress and commitment rather than absolute spending levels. There is no one-size-fits-all answer to what a good research culture looks like. Excellent research culture can look very different across the sector and even within institutions.

    This led to a key conclusion: fair assessment must recognise different starting points while maintaining meaningful standards. Institutions should demonstrate progress against a range of areas, with flexibility in how they approach challenges. Assessment needs to focus on ‘distance travelled’ rather than the destination reached.

    Research England developed the REF PCE pilot following these insights. This was deliberately experimental, testing more indicators than would ultimately be practical, as a unique opportunity to gather evidence about what works, what doesn’t, and what is feasible and equitable across the sector. Pilot panel members and institutions were co-designers, not assessors and assessees. The point was to develop evidence for a streamlined, proportionate, and robust approach to assessing culture.

    REF already recognises that publications and impact are important outputs of research. The PCE framework extended this logic: thriving, well-supported people working across all roles are themselves crucial outcomes that institutions should develop and celebrate.

    This matters because sustainable research excellence depends on the people who make it happen. Environments that support career development, recognise diverse contributions, and foster inclusion don’t just feel better to work in – they produce better research. The consultation revealed sophisticated understanding of this connection. Participants emphasised that research quality emerges from cultures that value integrity, collaboration, and support for all contributors.

    Inputs

    Some argue that culture is an input to the system that shouldn’t be assessed directly. Others suggest establishing baseline performance requirements as a condition for funding. However, workshop discussions revealed that setting universal standards low enough for all institutions to meet renders them meaningless as drivers of improvement. Baselines are important, but alone they are not sufficient. Research culture requires attention through assessment, incentivisation and reward that goes beyond minimum thresholds.

    Patrick Vallance and Research England now have unprecedented evidence about research culture assessment. Consultation has revealed sector priorities. The pilot has tested practical feasibility. The upcoming results, to be published in October, will show what approaches are viable and proportionate.

    Have we encountered difficulties? Yes. Do we have a perfect solution for assessing culture? No. But this REF is a huge first step toward better understanding and valuing of the cultures that underpin research in HE. We don’t need all the answers for 2029, but we shouldn’t discard the tangible progress made through national conversations and collaborations.

    This evidence base provides a foundation for informed decisions about whether and how to proceed. The question is whether policymakers will use it to build on promising foundations or retreat to assessment approaches that miss crucial dimensions of research excellence.

    The REF pause is a moment of choice. We can step back from culture as ‘too hard’, or build on the most substantial sector-wide collaboration ever undertaken on research environments. If we discard what we’ve built, we risk losing sight of the people and conditions that make UK research excellent.

    Source link

  • Saudi and Australia forge new paths in education and research

    Saudi and Australia forge new paths in education and research

    During the visit, Al-Benyan met with Australia’s minister of education, Jason Clare, where discussions focused on expanding ties in higher education, scientific research, and innovation, with emphasis on joint university initiatives, including twinning programs and faculty and student exchanges designed to build stronger academic links between the two countries.

    The research collaboration was prominently featured on the agenda, with both sides highlighting opportunities in fields such as artificial intelligence, cybersecurity, renewable energy, and health sciences. The minister also discussed investment opportunities in Saudi Arabia’s evolving education sector under Vision 2030, with a view to establishing local branches and research centers.

    Australia’s expertise in technical and vocational training was another focal point, as Saudi looks to enhance human capital development and equip its young population with the skills needed to succeed in the future labor market. Both ministers underlined the importance of supporting Saudi students in Australia by strengthening academic pathways and ensuring a welcoming educational and social environment.

    As well as his meeting with Clare, Al-Benyan held talks with professor Phil Lambert, a leading Australian authority on curriculum development. Their discussions centered on collaboration with Saudi Arabia’s National Curriculum Centre to develop learning programs that promote critical thinking, creativity, and innovation.

    The meeting reviewed best practices in student assessment, teacher training, and professional certification, aligning with global standards. Opportunities for joint research on performance evaluation and digital education methods were also explored with the aim of integrating advanced technologies into classrooms.

    Al-Benyan also took part in the Saudi-Australian Business Council meeting in Sydney, where he highlighted investment opportunities in the Kingdom’s education sector in line with Vision 2030.


    Conversations covered the launch of scholarship and exchange programs, advancing educational infrastructure and technologies, and promoting joint research in priority fields such as health, energy, and artificial intelligence. The two sides underscored the importance of developing programs to enhance academic qualifications and of supporting initiatives for persons with disabilities, while reaffirming Saudi Arabia’s commitment to supporting investors through regulatory incentives and strategic backing.

    “It was a pleasure to welcome the Minister of Education, His Excellency Yousef Al Benyan, as part of the official Ministry of Education, Saudi Arabia delegation from the Kingdom of Saudi Arabia to Australia,” said Sam Jamsheedi, president and chairman of the Australian Saudi Business Forum.

    “Education is a key pillar globally and a central focus of Saudi Arabia’s Vision 2030, which aims to create a world class education system that nurtures innovation and drives future ready skills.”

    “Our Council was proud to host a roundtable with leading Australian universities and training providers, giving Ministerial attendees first hand insights into Australia’s capabilities across higher education, vocational training, and research collaboration.”

    “Australian education already has a strong presence in the Kingdom, with a growing number of partnerships across early childhood education, schooling, technical training & university programs,” he added.

    Source link

  • 10+ Years of Lasting Impact and Local Commitment

    10+ Years of Lasting Impact and Local Commitment

    Over 60,000 students have benefited from the math program built on how the brain naturally learns

    A new analysis shows that students using ST Math at Phillips 66-funded schools are achieving more than twice the annual growth in math performance compared to their peers. The analysis, conducted by MIND Research Institute and covering 3,240 students in grades 3-5 across 23 schools, found that this accelerated growth gave these schools a 12.4 percentile point advantage in spring 2024 state math rankings.

    These significant outcomes are the result of a more than 10-year partnership between Phillips 66 and MIND Research Institute. This collaboration has brought ST Math, created by MIND Education, the only PreK–8 supplemental math program built on the science of how the brain learns, fully funded to 126 schools, 23 districts, and more than 60,000 students nationwide. ST Math empowers students to explore, make sense of, and build lasting confidence in math through visual problem-solving.

    “Our elementary students love JiJi and ST Math! Students are building perseverance and a deep conceptual understanding of math while having fun,” said Kim Anthony, Executive Director of Elementary Education, Billings Public Schools. “By working through engaging puzzles, students are not only fostering a growth mindset and resilience in problem-solving, they’re learning critical math concepts.”

    The initiative began in 2014 as Phillips 66 sought a STEM education partner that could deliver measurable outcomes at scale. Since then, the relationship has grown steadily, and now, Phillips 66 funds 100% of the ST Math program in communities near its facilities in California, Washington, Montana, Oklahoma, Texas, Illinois, and New Jersey. Once involved, schools rarely leave the program.

    To complement the in-class use of ST Math, Phillips 66 and MIND introduced Family Math Nights. These events, hosted at local schools, bring students, families, and Phillips 66 employee volunteers together for engaging, hands-on activities. The goal is to build math confidence in a fun, interactive setting and to equip parents with a deeper understanding of the ST Math program and new tools to support their child’s learning at home.

    “At Phillips 66, we believe in building lasting relationships with the communities we serve,” said Courtney Meadows, Manager of Social Impact at Phillips 66. “This partnership is more than a program. It’s a decade of consistent, community-rooted support to build the next generation of thinkers and improve lives through enriching educational experiences.”

    ST Math has been used by millions of students across the country and has a proven track record of delivering a fundamentally different approach to learning math. Through visual and interactive puzzles, the program breaks down math’s abstract language barriers to benefit all learners, including English Learners, Special Education students, and Gifted and Talented students.

    “ST Math offers a learning experience that’s natural, intuitive, and empowering—while driving measurable gains in math proficiency,” said Brett Woudenberg, CEO of MIND Education. “At MIND, we believe math is a gateway to brighter futures. We’re proud to partner with Phillips 66 in expanding access to high-quality math learning for thousands of students in their communities.”

    Explore how ST Math is creating an impact in Phillips 66 communities with this impact story: https://www.mindeducation.org/success-story/brazosport-isd-texas/

    About MIND Education
    MIND Education engages, motivates and challenges students towards mathematical success through its mission to mathematically equip all students to solve the world’s most challenging problems. MIND is the creator of ST Math, a pre-K–8 visual instructional program that leverages the brain’s innate spatial-temporal reasoning ability to solve mathematical problems, and of InsightMath, a neuroscience-based K-6 curriculum that transforms student learning by teaching math the way every brain learns so all students are equipped to succeed. Since its inception in 1998, MIND Education and ST Math have served millions of students across the country. Visit MINDEducation.org.

    About Phillips 66
    Phillips 66 (NYSE: PSX) is a leading integrated downstream energy provider that manufactures, transports and markets products that drive the global economy. The company’s portfolio includes Midstream, Chemicals, Refining, Marketing and Specialties, and Renewable Fuels businesses. Headquartered in Houston, Phillips 66 has employees around the globe who are committed to safely and reliably providing energy and improving lives while pursuing a lower-carbon future. For more information, visit phillips66.com or follow @Phillips66Co on LinkedIn.

    eSchool News Staff

    Source link

  • The Shrinking Research University Business Model

    The Shrinking Research University Business Model

    For most of the past 30 or so years, big Canadian universities have all been working off more or less the same business model: find areas where you can make big profits and use those profits to make yourself more research-intensive.

    That’s it. That’s the whole model.

    International students? Big profit centres. Professional programs? You better believe those are money-makers. Undergraduate studies – well, they might not make that much money in toto but holy moly first-year students are taken advantage of quite hideously to subsidize other activities, most notably research-intensity.

    Just to be clear, when I talk about “research-intensity”, I am not really talking about laboratories or physical infrastructure. I am talking about the entire financial superstructure that allows profs to teach 2 courses per semester and to be paid at rates which are comparable to those at (generally better-funded) large public research universities in the US. It’s about compensation, staffing complements, the whole shebang – everything that allows our institutions to compete internationally for research talent. Governments don’t pay enough, directly, for institutions to do that. So, universities have found ways to offer new products, or re-arrange the products they offer, in such a way as to support these goals of competitive hiring.

    Small universities do not have quite the same imperatives with respect to research, but this business model affects them nonetheless. To the extent that they wish to compete for staff with the research-intensive institutions, they have to pay higher salaries as well. Maybe the most extreme outcome of that arms race occurred at Laurentian, whose financial collapse was at least in part due to the university implicitly trying to align itself to U15 universities’ pay scales rather than, say, the pay scale at Lakehead (unions, which like to write ambitious pay “comparables” into institutional collective agreements, are obviously also a factor here).

    Anyways, the issue is that for one reason or another, governments have been chipping away at these various sources of profit that have been used to cross-subsidize research-intensity. The situation with international students is an obvious one, but this is happening in other ways too. Professional master’s degrees are not generating the returns they used to as private universities, both foreign and domestic, begin to compete, particularly in the business sector. (A non-trivial part of the reason that Queen’s found itself in financial difficulty last year was that its business school failed to turn a profit for the first time in years. I don’t know the ins and outs of this, but I would be surprised if Northeastern’s aggressive push into Toronto wasn’t eating some of its executive education business.)

    Provincial governments – some of them, anyway – are also setting up colleges to compete with universities in a number of areas for undergraduate students. In Ontario, that has been going on for 20-25 years, but in other places like Nova Scotia it is just beginning. Some on the university side complain about these programs, primarily in polytechnics, being preferred by government because they are “cheap”, but they rarely get into specifics about quality. One reason college programs are often better on a per-dollar measure? The colleges aren’t building in a surplus to pay for research-intensity – this is precisely what allows them to do revolutionary things like not stuffing 300 first-year students in a single classroom.  

    In brief then: the feds have taken away a huge source of cross-subsidy. Provinces, to varying degrees (most prominently in Ontario), have been introducing competition to chip away at other sources of surplus that allowed universities to cross-subsidize research intensity. Together, these two processes are putting the long-standing business model of big Canadian universities at risk.

    The whole issue of cross-subsidization raises two policy questions which are not often discussed in polite company – in Canada, at least. The first has to do with cross-subsidization and whether or not it is the correct policy. I suspect there is a strong majority among higher education’s interested public who think it probably is a good policy; we just don’t know for sure because the policy emerged, as so many Canadian policies do, through a process of extreme passive-aggressiveness. Institutions were mad at governments for not directly funding what they wanted to do, so they went off and did their own thing. Governments, grateful not to be harassed for money, said nothing, which institutions took for approval whereas in fact it was just (temporary) non-disapproval.

    (I should add here – precisely because of all the passive-aggressiveness – it is not 100% clear to me the extent to which provincial governments understand the implications of introducing competition. When they allow new private or college degree programs, they likely think “we are improving options for students” not “I wonder how this might degrade the ability of institutions to conduct research”. And, of course, the reason they don’t think that is precisely because Canadians achieve everything through passive-aggression rather than open policy debates which might illuminate choices and trade-offs. Yay, us.)

    The second policy question – which we really never ever raise – is whether or not research-intensity, as it is practiced in Canadian universities, is worth subsidizing in the first place. I know, you’re all reading that in shock and horror because what is a university if it is not about research? Well, that’s a pretty partial view, and historically, a pretty recent one. Even among the U15, there are several institutions whose commitment to being big research enterprises is less than 40 years old. And, of course, we already have plenty of universities (e.g. the Maple League) where research simply isn’t a focus – who’s to say the current balance of research-intensive to non-research-intensive universities is the correct one?

    Now add the following thought: if the country clearly doesn’t think that university research matters because the knowledge economy doesn’t matter and we should all be out there hewing wood and drawing water, and if the federal government not only chops the Budget 2024 promises on research but then also cuts deeply into existing budgets, what compelling policy reason is there to keep arranging our universities the way we do? Why not get off the cross-subsidization treadmill and think of ways of spending money on actually improving undergraduate education (which the sector always claims to be doing, but isn’t much, really)?

    I am not, of course, advocating this as a course of policy. But given the way both the politics of research universities and the economics of their business models are heading, we might need to start discussing this stuff. Maybe even openly, for a change.