Tag: research

  • Anatomy of the Research Statement (opinion)


    The research statement that you include in your promotion and tenure dossier is one of the most important documents of your scholarly career—and one you’ll have little experience writing or even reading, unless you have a generous network of senior colleagues. As an academic editor, I support a half dozen or so academics each year as they revise (and re-revise, and throw out, and retrieve from the bin, and re-revise again) and submit their research statements and P&T dossiers. My experience is with—and so these recommendations are directed at—tenure-track researchers at American R-1s and R-2s and equivalent Canadian and Australian institutions.

    In my experience, most academics are good at describing what their research is and how and why they do it, but few feel confident in crafting a research statement that attests to the impact of their accomplishments. And “impact” is a dreaded word across the disciplines—one that implies reducing years of labor to mere numbers that fail to account for the depth, quality or importance of your work.

    When I think about “impact,” I think of course of the conventional metrics, but I think as well of your work’s influence among your peers in academia, and also of its resonance in nonacademic communities, be they communities of clinicians, patients, people with lived experiences of illness or oppression, people from a specific equity-deserving group, or literal neighborhoods that can be outlined on a map. When I edit research statements, I support faculty to shift their language from “I study X” to “My study of X has achieved Y” or “My work on X has accomplished Z.” This shift depends on providing evidence to show how your work has changed other people’s lives, work or thinking.

    For researchers who seek to make substantial contributions outside of academia—to cure a major disease, to change national policy or legislation—such a focus on impact, influence and resonance can be frustratingly short-termist. Yet if it is your goal to improve the world beyond the boundaries of your classroom and campus, then it seems worthwhile to find ways to show whether and how you are making progress toward that goal.

    If you’re preparing to go up for tenure or promotion, here’s a basic framework for a research statement, which you can adopt and adapt as you prepare your own impact-, influence- or resonance-focused research statement:

    Paragraph 1—Introduction

    Start with a high-level description of your overarching program of research. What big question unites the disparate parts of your work? What problem are you working toward solving? If your individual publications, presentations and grants were puzzle pieces, what big picture would they form?

    Paragraph 2—Background (Optional)

    Briefly sketch the background that informed your current preoccupations. Draw, if relevant, on your personal or professional background before your graduate studies. This paragraph should be short and should emphasize how your pre-academic life laid the foundation that has prepared you, uniquely, to address the key concerns that now occupy your intellectual life. For folks in some disciplines or institutions, this paragraph will be irrelevant and shouldn’t be included: trust your gut, or, if in doubt, ask a trusted senior colleague.

    Middle Paragraphs—Research Themes, Topics or Areas

    Cluster thematically—usually into two, three or four themes—the topics or areas into which your disparate projects and publications can be categorized. Within each theme, identify what you’re interested in and, if your methods are innovative, how you work to advance scholarly understandings of your subject. Depending on the expected length of your research statement, you might write three or four paragraphs for each theme. Each paragraph should identify external funding that you secured to advance your work and point to any outputs—publications, conference presentations, journal special issues, monographs, edited books, keynotes, invited talks, events, policy papers, white papers, end-user training guides, patents, op-eds and so on—that you produced.

    If the output is more than a few years old, you’ll also want to identify what impact (yes) that output had on other people. Doing so might involve pointing at your numbers of citations, but you might also:

    • Describe the diversity of your citations (e.g., you studied frogs but your research is cited in studies of salmon, belugas and bears, suggesting the broad importance of your work across related subfields);
    • Search the Open Syllabus database to identify the number of institutions that include your important publication in their teaching, or WorldCat, to identify the number of countries in which your book is held;
    • Link your ORCID account to Sage’s Policy Profiles to discover the government ministries and international bodies that have been citing your work;
• Summarize media mentions of your work, especially major stories such as magazine covers or features in national newspapers (e.g., “In August 2025, this work was featured in The New York Times (URL)”);
    • Name awards you’ve won for your outputs or those won by trainees you supervised on the project, including a description of why the award-giving organization selected your or your trainee’s work;
    • Identify lists of top papers in which your article appears (e.g., most cited or most viewed in that journal in the year it was published); or,
    • Explain the scholarly responses to your work, e.g., conference panels discussing one of your papers or quotations from reviews of your book in important journals.

    Closing Paragraphs—Summary

    If you’re in a traditional research institution—one that would rarely be described by other academics as progressive or politically radical—then it may be advantageous for you to conclude your research statement with three summary paragraphs.

    The first would summarize your total career publications and your publications since appointment, highlighting any that received awards or nominations or that are notable for the number of citations or the critical response they have elicited. This paragraph should also describe, if your numbers are impressive, your total number of career conference presentations and invited talks or keynotes as well as the number since either your appointment or your last promotion, and the total number of publications and conference presentations you’ve co-authored with your students or trainees or partners from community or patient groups.

    A second closing paragraph can summarize your total career research funding and funding received since appointment, highlighting the money you have secured as principal investigator, the money that comes from external (regional, national and international) funders, and, if relevant, the new donor funding you’ve brought in.

A final closing paragraph can summarize your public scholarship, including numbers of media mentions, hours of interviews provided to journalists, podcast episodes you’ve been featured on or have produced, public lectures delivered, community-led projects facilitated, or op-eds published (and, if available, the web analytics associated with these op-eds; was your piece in The Conversation one of the top 10 most cited that year from your institution?).

    Final Paragraph—Plans and Commitments

    Look forward with excitement. Outline the upcoming projects, described in your middle paragraphs, to which you are already committed, including funding applications that are still under review. Paint for your reader a picture of the next three to five years of your research and then the rest of your career as you progress toward achieving the overarching goal that you identified in your opening paragraph.

While some departments and schools are advising their pretenure faculty that references to metrics aren’t necessary in research statements, I—perhaps cynically—worry that the senior administrators who review tenure dossiers after your department head will still expect to see your h-index, your total number of publications, the number of high-impact-factor journals you’ve published in and those almighty external dollars awarded.

Unless you are confident that your senior administrators have abandoned conventional impact metrics, I’d encourage you to provide these numbers along with your disciplinary context. I’ve seen faculty members identify, for example, the average word count of a journal article in their niche, to show that their number of publications is not low but rather is appropriate given the length of a single article. I’ve seen faculty members use data from journals like Scientometrics to show that their single-digit h-index is comparable to the average h-index for associate professors in their field, even though they are not yet tenured. Such context helps your reader understand that your h-index of eight is, in fact, a high number.
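For readers unfamiliar with the metric itself: an h-index of h means you have h publications that have each been cited at least h times. A minimal sketch of the calculation (this is the standard definition, not any particular database’s implementation, and the citation counts below are invented for illustration):

```python
def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break
    return h

# Eight papers each cited at least eight times gives an h-index of 8
print(h_index([40, 22, 15, 12, 11, 10, 9, 8, 3, 1]))  # → 8
```

Note that the metric saturates: a handful of blockbuster papers can’t raise it past the total paper count, which is one reason disciplinary context matters so much.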

    You’ll additionally receive any number of recommendations from colleagues and mentors; for those of you who don’t have trusted colleagues or mentors at your institution, I’ve collected the advice of recently tenured and promoted associate professors and full professors from a range of disciplines and institutional contexts in this free 30-page PDF.

I imagine that most of the peers and mentors whom you consult will remind you to align with any guidelines that your institution provides. You should certainly do this—and you should return to those guidelines and evaluation criteria, if they exist, as you iteratively revise your draft statement based on the feedback you receive from peers. You’ll also need to know which pieces of your P&T dossier will be read by which audience—external readers, a departmental or faculty committee, senior administrators. Any colleague can tell you this; every piece of writing needs to consider both its audience and its context.

    But my biggest takeaway is something no client of mine has ever been told by a peer, colleague or mentor: Don’t just describe what you’ve done. Instead, point to the evidence that shows that you’ve done your work well.


  • What research says about Mamdani and Cuomo’s education proposals


    by Jill Barshay, The Hechinger Report
    November 3, 2025

    New York City, where I live, will elect a new mayor Tuesday, Nov. 4. The two front runners — state lawmaker Zohran Mamdani, the Democratic nominee, and former Gov. Andrew Cuomo, running as an independent — have largely ignored the city’s biggest single budget item: education. 

    One exception has been gifted education, which has generated a sharp debate between the two candidates. The controversy is over a tiny fraction of the student population. Only 18,000 students are in the city’s gifted and talented program out of more than 900,000 public school students. (Another 20,000 students attend the city’s elite exam-entrance high schools.) 

    But New Yorkers are understandably passionate about getting their kids into these “gated” classrooms, which have some of the best teachers in the city. Meanwhile, the racial composition of these separate (some say segregated) classes — disproportionately white and Asian — is shameful. Even many advocates of gifted education recognize that reform is needed. 


    Mamdani wants to end gifted programs for kindergarteners and wait until third grade to identify advanced students. Cuomo wants to expand gifted education and open up more seats for more children. 

    The primary justification for gifted programs is that some children learn so quickly that they need separate classrooms to progress at an accelerated pace. 

    But studies have found that students in gifted classrooms are not learning faster than their general education peers. And analyses of curricula show that many gifted classes don’t actually teach more advanced material; they simply group mostly white and Asian students together without raising academic rigor.

    In my reporting, I have found that researchers question whether we can accurately spot giftedness in 4- or 5-year-olds. My colleague Sarah Carr recently wrote about the many methods that have been used to try to identify young children with high potential, and how the science underpinning them is shaky. In addition, true giftedness is often domain-specific — a child might be advanced in math but not in reading, or vice versa — yet New York City’s system labels or excludes children globally rather than by subject. 

Because of New York City’s size — it’s the nation’s largest public school system, bigger than the entire school systems of 30 states — what happens here matters.

    Policy implications

    • Delaying identification until later grades, when cognitive profiles are clearer, could improve accuracy in picking students. 
    • Reforming the curriculum to make sure that gifted classes are truly advanced would make it easier to justify having them. 
    • Educators could consider ways for children to accelerate in a single subject — perhaps by moving up a grade in math or English classes. 
    • How to desegregate these classrooms, and make their racial/ethnic composition less lopsided, remains elusive.

I’ve covered these questions before in earlier columns on gifted education.

    Size isn’t everything

    Another important issue in this election is class size. Under a 2022 state law, New York City must reduce class sizes to no more than 20 students in grades K-3 by 2028. (The cap will be 23 students per class in grades 4-8 and 25 students per class in high school.) To meet that mandate, the city will need to hire an estimated 18,000 new teachers.

    During the campaign, Mamdani said he would subsidize teacher training, offering tuition aid in exchange for a three-year commitment to teach in the city’s public schools. The idea isn’t unreasonable, but it’s modest — only $12 million a year, expected to produce about 1,000 additional teachers annually. That’s a small fraction of what’s needed.
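To make “a small fraction” concrete, here is the back-of-envelope arithmetic implied by the figures above (all numbers come from the article; the variable names are mine):

```python
# Figures quoted above for Mamdani's tuition-aid proposal
subsidy_budget = 12_000_000      # dollars per year
new_teachers_per_year = 1_000    # additional teachers the subsidy is expected to produce
teachers_needed = 18_000         # estimated hires to meet the class-size mandate

cost_per_teacher = subsidy_budget / new_teachers_per_year
years_at_this_pace = teachers_needed / new_teachers_per_year

print(f"${cost_per_teacher:,.0f} per teacher; {years_at_this_pace:.0f} years at this pace")
# → $12,000 per teacher; 18 years at this pace
```

Eighteen years of output against a 2028 deadline is what makes the proposal modest, whatever its other merits.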

The bigger problem may be the law itself: Schools lack both physical space and enough qualified teachers. What parents want — small classes led by excellent, experienced educators — isn’t something the city can scale quickly. Hiring thousands of novices may not improve learning much, and it will make the job of school principals, who must make all these hires, even harder.

For more on the research behind class-size reductions, see my earlier columns.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about education issues in the New York City mayoral election was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-nyc-mayor-election-education/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).



  • ‘End of an era’: Experts warn research executive order could stifle scientific innovation


An executive order that gives political appointees new oversight of the types of federal grants that are approved could undercut the foundation of scientific research in the U.S., research and higher education experts say.

President Donald Trump’s order, signed Aug. 7, directs political appointees at federal agencies to review grant awards to ensure they align with the administration’s “priorities and the national interest.”

    These appointees are to avoid giving funding to several types of projects, including those that recognize sex beyond a male-female binary or initiatives that promote “anti-American values,” though the order doesn’t define what those values are.   

    The order effectively codifies the Trump administration’s moves to deny or suddenly terminate research grants that aren’t in line with its priorities, such as projects related to climate change, mRNA research, and diversity, equity and inclusion.

The executive order’s mandates mark a big departure from norms before the second Trump administration. Previously, career experts rather than political appointees provided oversight, and peer review was the core way to evaluate projects.

    Not surprisingly, the move has brought backlash from some quarters.

    The executive order runs counter to the core principle of funding projects based on scientific merit — an idea that has driven science policy in the U.S. since World War II, said Toby Smith, senior vice president for government relations and public policy at the Association of American Universities. 

    “It gives the authority to do what has been happening, which is to overrule peer-review through changes and political priorities,” said Smith. “This is really circumventing peer review in a way that’s not going to advance U.S. science and not be healthy for our country.”

That could stifle scientific innovation. Trump’s order could prompt scientists to discard research ideas, avoid entering the field or move to another country to complete their work, research experts say.

    Ultimately, these policies could cause the U.S. to fall from being one of the top countries for scientific research to one near the bottom, said Michael Lubell, a physics professor at the City College of New York.

    “This is the end of an era,” said Lubell. “Even if things settle out, the damage has already been done.”

    A new approach to research oversight

Under the order, senior political appointees or their designees will review new federal awards as well as ongoing grants and terminate those that don’t align with the administration’s priorities.

    This policy is a far cry from the research and development strategy developed by Franklin D. Roosevelt’s administration at the end of World War II. Vannevar Bush, who headed the U.S. Office of Scientific Research and Development at the time, decided the U.S. needed a robust national program to fund research that would leave scientists to do their work free from political pressure. 

    Bush’s strategy involved some government oversight over research projects, but it tended to defer to the science community to decide which projects were most promising, Lubell said. 

    “That kind of approach has worked extremely well,” said Lubell. “We have had strong economic growth. We’re the No. 1 military in the world, our work in the scientific field, whether it’s medicine, or IT — we’re right at the forefront.”

    But Trump administration officials, through executive orders and in public hearings, have dismissed some federal research as misleading or unreliable — and portrayed the American scientific enterprise as one in crisis. 

    The Aug. 7 order cited a 2024 report from the U.S. Senate Commerce, Science, and Transportation Committee, led by its then-ranking member and current chairman, Sen. Ted Cruz, R-Texas, that alleged more than a quarter of National Science Foundation spending supported DEI and other “left-wing ideological crusades.” House Democrats, in a report released in April, characterized Cruz’s report as “a sloppy mess” that used flawed methodology and “McCarthyistic tactics.”


  • Australia signs research pact with China – Campus Review


    Universities Australia has signed a deal with China that will encourage research collaboration and student exchanges between the two countries.



  • The white paper is wrong – changing research funding won’t change teaching


The Post-16 education and skills white paper might not have a lot of specifics in it, but it does mostly make sense.

    The government’s diagnosis is that the homogeneity of sector outputs is a barrier to growth. Their view, emerging from the industrial strategy, is that it is an inefficient use of public resources to have organisations doing the same things in the same places. The ideal is specialisation where universities concentrate on the things they are best at.

There are different kinds of nudges to achieve this goal. One is the suggestion that the REF could more closely align to the government’s missions. The detail is not there, but it is possible to see how impact could be redefined around economic growth, or how funding could be shifted more toward applied work. There is a suggestion that research funding should consider the potential of places (maybe that could lead to some regional multipliers, who knows). And there are already announced steps around the reform of HEIF and new support for spin-outs.

    Ecosystems

All of these things might help, but they will not be enough to fundamentally change the research ecosystem. If the incentives stay broadly the same, researchers and universities will continue to do broadly the same things, irrespective of how much the government wants more research aimed at growing the economy.

The potentially biggest reform has the smallest amount of detail. The paper states:

    We will incentivise this specialisation and collaboration through research funding reform. By incentivising a more strategic distribution of research activity across the sector, we can ensure that funding is used effectively and that institutions are empowered to build deep expertise in areas where they can lead. This may mean a more focused volume of research, delivered with higher-quality, better cost recovery, and stronger alignment to short- and long-term national priorities. Given the close link between research and teaching, we expect these changes to support more specialised and high quality teaching provision as well.

The implication here is that if research funding is allocated differently, then providers will choose to specialise their teaching, because research and teaching are linked. Before we get to whether there is a link between research funding and teaching (spoiler: there is not), it is worth unpacking two other implications.

The first is that the “strategic distribution” element will have entirely different impacts depending on what the strategy is and what the distribution mechanism is. The paper states that there could, broadly, be three kinds of providers: teaching only, teaching with applied research, and research institutions (who presumably also do teaching). The strategy is to allow providers to focus on their strengths, but the problem is that it is entirely unclear which strengths, or how they will be measured. For example, some researchers are doing work that is economically impactful but perhaps not the most academically groundbreaking. Presumably this is not the activity the government would wish to deprioritise, but it could be if measured by current metrics. Nor does the paper explain how providers with pockets of research excellence within an overall weaker research profile could maintain their research infrastructure.

The white paper suggests that the sector should focus on fewer but better-funded research projects. This makes sense if the aim is to improve cost recovery on individual projects, but improving the unit of resource by concentrating the overall allocation won’t necessarily improve the financial sustainability of research generally. A strategic decision to align research funding more closely with the industrial strategy would leave some providers exposed. A strategic decision to invest in research potential rather than research performance would harm others. A focus on regions, or London, or excellence wherever it may be, would each have a different impact. The distribution mechanism is a second-order question to the overall strategy, which has not yet dealt with some difficult trade-offs.

Even on its own terms, research funding seems a poor indicator of teaching specialism.

    Incentives

When the white paper suggests that the government can “incentivise specialisation and collaboration through research funding reform”, it is worth asking what links, if any, currently exist between research funding and teaching provision.

There are two ways we can look at this. The first looks at current research income from the UK government to each provider (either directly, or via UKRI) by cost centre, and compares that to the student FTE associated with that cost centre within a provider.

     


    We’re at a low resolution – this split of students isn’t filterable by level or mode of study, and finances are sometimes corrected after the initial publication (we’ve looked at 2021-22 to remove this issue). You can look at each cost centre to see if there is a relationship between the volume of government research funding and student FTE – and in all honesty there isn’t much of one in most cases.

If you think about it, that’s kind of a surprise – surely a larger department would have more of both? – but there are some providers who are clearly known for having high-quality research as opposed to large numbers of students.
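The kind of relationship check described above boils down to a correlation coefficient. A minimal sketch, using invented illustrative numbers rather than the HESA data behind the interactive chart:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical providers within one cost centre:
# government research income (£m) vs student FTE
income = [1.2, 8.5, 0.3, 4.1, 15.0, 2.2]
fte = [900, 400, 1100, 650, 300, 1500]
print(round(pearson_r(income, fte), 2))
```

A value near zero is what the text reports finding in most cost centres: research funding and student volume simply don’t move together.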

    So to build quality into our thinking we turn to the REF results (we know that there is generally a good correlation between REF outcomes and research income).

Our problem here is that REF results are presented by unit of assessment – a subject grouping that maps cleanly neither to cost centres nor to the CAH hierarchy used more commonly in student data (for more on the wild world of subject classifications, DK has you covered). This is by design, of course – an academic with training in biosciences may well live in the biosciences department and the biosciences cost centre, but there is nothing to stop them researching how biosciences is taught (outputs of which might be returned to the Education cost centre).

What has been done here is a custom mapping at CAH3 level between the subjects students are studying and REF2021 submissions – the axes are student headcount (you can filter by mode and level, and choose whichever academic year you fancy) against the FTE of staff submitted to REF2021 – with a darker blue blob showing a greater proportion of the submission rated 4* in the REF (there’s a filter at the bottom if you want to look at just high-performing departments).


    Again, correlations are very hard to come by (if you want you can look at a chart for a single provider across all units of assessment). It’s almost as if research doesn’t bring in money that can cross-subsidise teaching, which will come as no surprise to anyone who has ever worked in higher education.

    Specialisation

    The government’s vision for higher education is clear. Universities should specialise and universities that focus on economic growth should be rewarded. The mechanisms to achieve it feel, frankly, like a mix of things that have already been announced and new measures that are divorced from the reality of the financial incentives universities work under.

The white paper has assiduously ducked laying out the trade-offs and losers in the new system. Without this, the government cannot set priorities; and if it does not move some of the underlying incentives – student funding, regional funding distribution, greater devolution, supply-side spending like Freeports, staff reward and recognition, student number allocations, or the myriad other things that make up the basis of the university funding settlement – it has little hope of achieving its goals on specialisation or growth.


  • The Society for Research into Higher Education in 2005


    by Rob Cuthbert

    In SRHE News and Blog a series of posts is chronicling, decade by decade, the progress of SRHE since its foundation 60 years ago in 1965. As always, our memories are supported by some music of the times.

In 2005 Hurricane Katrina hit the Gulf Coast in the USA, and a Kashmir earthquake in Pakistan killed 86,000 people. In London 52 people died in the 7/7 suicide bombings; Jean Charles de Menezes, wrongly suspected of being a fugitive terrorist, was killed by London police officers. Labour under Tony Blair won its third successive victory in the 2005 UK general election, George W Bush was sworn in for his second term as US President, and Angela Merkel became the first female Chancellor of Germany. Pope John Paul II died and was succeeded by Pope Benedict XVI. Prince Charles married Camilla Parker Bowles. YouTube was founded, Microsoft released the Xbox 360, the Airbus A380 superjumbo made its first flight and the Kyoto Protocol officially took effect. There was no war in Ukraine as Greece won the Eurovision Song Contest 2005 in Kyiv, thanks to Helena Paparizou with “My Number One” (no, me neither). In a reliable barometer of the times, the year’s new words included glamping, microblogging and ransomware. And the year was slightly longer when another leap second was added.

    Higher education in 2005

    So here we are, with many people taking stock of where HE had got to in 2005 – suddenly I see. Evan Schofer (Minnesota) and John W Meyer (Stanford) looked at the worldwide expansion of HE in the twentieth century in the American Sociological Review, noting that: “An older vision of education as contributing to a more closed society and occupational system—with associated fears of “over-education”—was replaced by an open-system picture of education as useful “human capital” for unlimited progress. … currently about one-fifth of the world cohort is now enrolled in higher education.”

Mark Olssen (Surrey) and Michael A Peters (Surrey) wrote about “a fundamental shift in the way universities and other institutions of higher education have defined and justified their institutional existence” as different governments sought to apply some pressure. Their 2005 article in the Journal of Education Policy traced “… the links between neoliberalism and globalization on the one hand, and neoliberalism and the knowledge economy on the other. … Universities are seen as a key driver in the knowledge economy and as a consequence higher education institutions have been encouraged to develop links with industry and business in a series of new venture partnerships.”

    Åse Gornitzka (Oslo), Maurice Kogan (Brunel) and Alberto Amaral (Porto) edited Reform and Change in Higher Education: Analysing Policy Implementation, also taking a long view of events since the publication 40 years earlier of Great Expectations and Mixed Performance: The Implementation of Higher Education Reforms in Europe by Ladislav Cerych and Paul Sabatier. The 2005 book provided a review and critical appraisal of current empirical policy research in higher education with Kogan on his home territory writing the first chapter, ‘The Implementation Game’. At the same time another giant of HE research, SRHE Fellow Michael Shattock, was equally at home editing a special issue of Higher Education Management and Policy on the theme of ‘Entrepreneurialism and the Knowledge Society’. That journal had first seen the light of day in 1977, being a creation of the OECD programme on Institutional Management in Higher Education, a major supporter of and outlet for research into HE in those earlier decades. The special issue included articles by SRHE Fellows Ron Barnett and Gareth Williams, and by Steve Fuller (Warwick), who would be a keynote speaker at the SRHE Research Conference in 2006. The journal’s Editorial Advisory Group was a beautiful parade of leading researchers into HE, including among others Elaine El-Khawas, (George Washington University, Chair), Philip Altbach (Boston College, US), Chris Duke (RMIT University, Australia), Leo Goedegebuure (Twente), Ellen Hazelkorn (Dublin Institute of Technology), Lynn Meek (University of New England, Australia), Robin Middlehurst (Surrey), Christine Musselin (Centre de Sociologie des Organisations (CNRS), France), Sheila Slaughter (Arizona) and Ulrich Teichler (Gesamthochschule Kassel, Germany).

    I’ve got another confession to make: Shattock had been writing about entrepreneurialism as ‘an idea for its time’ for more than 15 years, paying due homage to Burton Clark. The ‘entrepreneurial university’ was indeed a term “susceptible to processes of semantic appropriation to suit particular agendas”, as Gerlinde Mautner (Vienna) wrote in Critical Discourse Studies. It was a concept that seemed to break through to the mainstream in 2005 – witness a survey by The Economist, ‘The Brains Business’, which said: “America’s system of higher education is the best in the world. That is because there is no system … Europe hopes to become the world’s pre-eminent knowledge-based economy. Not likely … For students, higher education is becoming a borderless world … Universities have become much more businesslike, but they are still doing the same old things … A more market-oriented system of higher education can do much better than the state-dominated model”. You could have it so much better, said The Economist.

    An article by Simon Marginson (then Melbourne, now Oxford via UCL), ‘Higher Education in the Global Economy’, noted that “… a new wave of Asian science powers is emerging in China (including Hong Kong and Taiwan), Singapore and Korea. In China, between 1995 and 2005 the number of scientific papers produced each year multiplied by 4.6 times. In South Korea … 3.6 times, in Singapore 3.2. … Between 1998 and 2005 the total number of graduates from tertiary education in China increased from 830,000 to 3,068,000 ….” (and Coldplay sang China all lit up). Ka Ho Mok (then Hang Seng University, Hong Kong) wrote about how Hong Kong institutional strategies aimed to foster entrepreneurship. Private education was booming, as Philip Altbach (Boston College) and Daniel C Levy (New York, Albany) showed in their edited collection, Private Higher Education: a Global Revolution. Diane Reay (Cambridge), Miriam E David and Stephen J Ball (both IoE/UCL) reminded us that disadvantage was always with us, as we now had different sorts of higher educations, offering Degrees of choice: class, race, gender and higher education.

    The 2005 Oxford Review of Education article by SRHE Fellow Rosemary Deem (Royal Holloway) and Kevin J Brehony (Surrey) ‘Management as ideology: the case of ‘new managerialism’ in higher education’ was cited by almost every subsequent writer on managerialism in HE. 2005 was not quite the year in which journal articles first appeared online; like many others from 2005, that article only appeared online two years later, in 2007, as publishers digitised their back catalogues. By 2005, however, IT had become a dominant force in institutional management. Libraries were reimagined as library and information services, student administration was done in virtual learning environments, teaching was under the influence of learning management systems.

    The 2005 book edited by Paul Ashwin (Lancaster), Changing higher education: the development of learning and teaching, reviewed changes in higher education and ways of thinking about teaching and learning over the previous 30 years. Doyenne of e-learning Diana Laurillard (UCL) said: “Those of us working to improve student learning, and seeking to exploit elearning to do so, have to ride each new wave of technological innovation in an attempt to divert it from its more natural course of techno-hype, and drive it towards the quality agenda.” Singh, O’Donoghue and Worton (all Wolverhampton) provided an overview of the effects of eLearning on HE in an article in the Journal of University Teaching & Learning Practice.

    UK HE in 2005

    Higher education in the UK kept on growing. HESA recorded 2,287,540 students in the UK in 2004-2005, of whom 60% were full-time or sandwich students. Universities UK reported a 43% increase in student numbers in the previous ten years, with the fastest rise being in postgraduate numbers, and there were more than 150,000 academic staff in universities.

    Government oversight of HE went from the Department for Education (DfE) to the Department for Education and Employment (DfEE), then in 2001 the Department for Education and Skills (DfES), which itself would only last until 2007. Gillian Shephard was the last Conservative Secretary of State for Education before the new Labour government in 1997 installed David Blunkett until 2001, when Estelle Morris, Charles Clarke and Ruth Kelly served in more rapid succession. No party would dare to tangle with HE funding in 1997, so a cross-party pact set up the Dearing Review, which reported after the election. Dearing pleaded for its proposals to be treated as a package but government picked the bits it liked, notably an undergraduate fee of £1000, introduced in 1998. Perhaps Kelly Clarkson got it right: You had your chance, you blew it.

    The decade after 1995 featured 12 separate pieces of legislation. The Conservative government’s 1996 Education (Student Loans) Act empowered the Secretary of State to subsidise private sector student loans. Under the 1996 Education (Scotland) Act the Scottish Qualifications Authority replaced the Scottish Examination Board and the Scottish Vocational Education Council. There was a major consolidation of previous legislation from the 1944 Education Act onwards in the 1996 Education Act, and the 1997 Education Act replaced the National Council for Vocational Qualifications and the School Curriculum and Assessment Authority with the Qualifications and Curriculum Authority.

    The new Labour government started by abolishing assisted places in private schools with the 1997 Education (Schools) Act (an immediate reward for party stalwarts, echoed nearly 30 years later when the new Labour government started by abolishing VAT relief for private schools). The 1998 Education (Student Loans) Act allowed public sector student loans to be transferred to the private sector, which would prompt much subsequent comment and criticism when tranches of student debt were sold, causing unnecessary trouble. The 1998 Teaching and Higher Education Act established General Teaching Councils for England and Wales, made new arrangements for the registration and training of teachers, changed HE student financial support arrangements and introduced the £1,000 undergraduate tuition fee. The 1998 School Standards and Framework Act followed, before the 2000 Learning and Skills Act abolished the Further Education Funding Councils and set up the Learning and Skills Council for England, the National Council for Education and Training for Wales, and the Adult Learning Inspectorate. The 2001 Special Educational Needs and Disability Act extended provision against discrimination on grounds of disability in schools, further and higher education.

    The 2004 Higher Education Act established the Arts and Humanities Research Council, created a Director of Fair Access to Higher Education, made arrangements for dealing with students’ complaints, made provisions relating to grants and loans to students in higher and further education and, after passing narrowly following much Parliamentary debate, allowed institutions in England to charge variable fees of up to £3,000. In 2005 in the Journal of Education Policy Robert Jones (Edinburgh) and Liz Thomas (HE Academy) identified three strands – academic, utilitarian and transformative – in policy on access and widening participation in the 2003 White Paper which preceded the 2004 Act. They concluded that “… within a more differentiated higher education sector different aspects of the access discourse will become dominant in different types of institutions.” Which it did, but perhaps not quite in the way they might have anticipated.

    John Taylor (then Southampton) looked much further back, at the long-term implications of the devastating 1981 funding cuts, citing Maurice Kogan and Stephen Hanney (both Brunel): “Before then, there was very little government policy for higher education. After 1981, the Government took a policy decision to take policy decisions, and other points such as access and efficiency moves then followed.”

    SRHE and research into higher education in 2005

    With long experience of engaging with HE finance policy, Nick Barr and Iain Crawford (both LSE) boldly titled their 2005 book Financing Higher Education: Answers From the UK. But policies were not necessarily joined up, and often pointed in different directions, as SRHE Fellow Paul Trowler (Lancaster), Joelle Fanghanel (City University, London) and Terry Wareham (Lancaster) noted in their analysis, in Studies in Higher Education, of initiatives to enhance teaching and learning: “… these interventions have been based on contrasting underlying theories of change and development. One hegemonic theory relates to the notion of the reflective practitioner, which addresses itself to the micro (individual) level of analysis. It sees reflective practitioners as potential change agents. Another relates to the theory of the learning organization, which addresses the macro level … and sees change as stemming from alterations in organizational routines, values and practices. A third is based on a theory of epistemological determinism … sees the discipline as the salient level of analysis for change. … other higher education policies exist … not overtly connected to the enhancement of teaching and learning but impinging upon it in very significant ways in a bundle of disjointed strategies and tacit theories.”

    SRHE Fellow Ulrich Teichler (Kassel) might have been channelling The Killers as he looked on the bright side about the growth of research on higher education in Europe in the European Journal of Education: “Research on higher education often does not have a solid institutional base and it both benefits and suffers from the fact that it is a theme-based area of research, drawing from different disciplines, and that the borderline is fuzzy between researchers and other experts on higher education. But a growth and quality improvement of research on higher education can be observed in recent years …”

    European research into HE had reached the point where Katrien Struyven, Filip Dochy and Steven Janssens (all Leuven) could review evaluation and assessment from the student’s point of view in Assessment & Evaluation in Higher Education: “… students’ perceptions about assessment significantly influence their approaches to learning and studying. Conversely, students’ approaches to study influence the ways in which they perceive evaluation and assessment.” Lin Norton (Liverpool Hope) and four co-authors surveyed teachers’ beliefs and intentions about teaching in a much-cited article in Higher Education: “… teachers’ intentions were more orientated towards knowledge transmission than were their beliefs, and problem solving was associated with beliefs based on learning facilitation but with intentions based on knowledge transmission.” Time for both students and teachers to realise it was not all about you.

    SRHE had more than its share of dislocations and financial difficulties in the decade to 2005. After its office move to Devonshire Street in London in 1995 the Society’s financial position declined steadily, to the point where survival was seriously in doubt. Little more than a decade later we would have no worries, but until then the Society’s chairs having more than one bad day were Leslie Wagner (1994-1995), Oliver Fulton (1996-1997), Diana Green (1998-1999), Jennifer Bone (2000-2001), Rob Cuthbert (2002-2003) and Ron Barnett (2004-2005). The crisis was worst in 2002, when SRHE’s tenancy in Devonshire Street ended. At the same time the chairs of SRHE’s three committees stepped down and SRHE’s funds and prospective income reached their lowest point, sending a shiver down the spine of the governing Council. The international committee was disbanded but the two new incoming committee chairs for Research (Maria Slowey, Dublin City University) and Publications (Rosemary Deem, Royal Holloway) began immediately to restore the Society’s academic and financial health. SRHE Director Heather Eggins arranged a tenancy at the Institute of Physics in 76 Portland Place, conveniently near the previous office. From 2005 the new Director, Helen Perkins, would build on the income stream created by Rosemary Deem’s skilful negotiations with publishers to transform the Society’s finances and raise SRHE up. The annual Research Conference would go from strength to strength, find a long-term home in Celtic Manor, and see SRHE’s resident impresario François Smit persuade everyone that they looked good on the dancefloor. But that will have to wait until we get to SRHE in 2015.

    Rob Cuthbert is editor of SRHE News and the SRHE Blog, Emeritus Professor of Higher Education Management, University of the West of England and Joint Managing Partner, Practical Academics. Email [email protected]. Twitter/X @RobCuthbert. Bluesky @robcuthbert22.bsky.social.


  • Science of Reading Training, Practice Vary, New Research Finds – The 74

    Science of Reading Training, Practice Vary, New Research Finds – The 74



    North Carolina is one of several states that have passed legislation in recent years to align classroom reading instruction with the research on how children learn to read. But ensuring all students have access to research-backed instruction is a marathon, not a sprint, said education leaders and researchers from across the country on a webinar from the Hunt Institute last Wednesday.

    Though implementation of the state’s reading legislation has been ongoing since 2021, more resources and comprehensive support are needed to ensure teaching practice and reading proficiency are improved, webinar panelists said.

    “The goal should be to transition from the science of reading into the science of teaching reading,” said Paola Pilonieta, professor at the University of North Carolina at Charlotte who was part of a team that studied North Carolina’s implementation of its 2021 Excellent Public Schools Act.

    That legislation mandates instruction to be aligned with “the science of reading,” the research that says learning to read involves “the acquisition of language (phonology, syntax, semantics, morphology, and pragmatics), and skills of phonemic awareness, accurate and efficient word identification (fluency), spelling, vocabulary, and comprehension.”

    The legislature allocated more than $114 million to train pre-K to fifth grade teachers and other educators in the science of reading through a professional development tool called the Language Essentials for Teachers of Reading and Spelling (LETRS). More than 44,000 teachers had completed the training as of June 2024.

    Third graders saw a two-point drop, from 49% to 47%, in reading proficiency from the 2023-24 to 2024-25 school year on literacy assessments. It was the first decline in this measure since LETRS training began. First graders’ results on formative assessments held steady at 70% proficiency and second graders saw a small increase, from 65% to 66%.

    “LETRS was the first step in transforming teacher practice and improving student outcomes,” Pilonieta said. “To continue to make growth in reading, teachers need targeted ongoing support in the form of coaching, for example, to ensure effective implementation of evidence-based literacy instruction.”

    Teachers’ feelings on the training

    Pilonieta was part of a team at UNC-Charlotte and the Education Policy Initiative at Carolina (EPIC) at UNC-Chapel Hill that studied teachers’ perception of the LETRS training and districts’ implementation of that training. The team also studied teachers’ knowledge of research-backed literacy practices and how they implemented those practices in small-group settings after the training.

    They asked about these experiences through a survey completed by 4,035 teachers across the state from spring 2023 to winter 2024, and 51 hour-long focus groups with 113 participants.

    Requiring training on top of an already stressful job can be a heavy lift, Pilonieta said. LETRS training looked different across districts, the research team found. Some teachers received stipends to complete the training or were compensated with time off, and some were not. Some had opportunities to collaborate with fellow educators during the training; some did not.

    “These differences in support influenced whether teachers felt supported during the training, overwhelmed, or ignored,” Pilonieta said.

    Teachers did perceive the content of the LETRS training to be helpful in some ways and had concerns in others, according to survey respondents.

    Teachers holding various roles found the content valuable in learning about how the brain works, phonics, and comprehension.

    They cited issues, however, with the training’s applicability to varied roles, limited differentiation based on teachers’ background knowledge and experience, redundancy, and a general limited amount of time to engage with the training’s content.

    Varied support from administrators, coaches

    When asking teachers about how implementation worked at their schools, the researchers found that support from administrators and instructional coaches varied widely.

    Teachers reported that classroom visits from administrators with a focus on science of reading occurred infrequently. The main support administrators provided, according to the research, was planning time.

    “Many teachers felt that higher levels of support from coaches would be valuable to help them implement these reading practices,” Pilonieta said.

    Teachers did report shifts in their teaching practice after the training and felt those tweaks had positive outcomes on students.

    The team found other conditions impacted teachers’ implementation: schools’ use of curriculum that aligned to the concepts covered in the training, access to materials and resources, and having sufficient planning time.

    Some improvement in knowledge and practice

    Teachers performed well on assessments after completing the training, but had lower scores on a survey given later by the research team. Pilonieta said this suggests an issue with knowledge retention.

    Teachers scored between 95% and 98% on the LETRS post-training assessment. But in the research team’s survey, scores ranged from 48% to 78%.

    Teachers with a reading license scored higher on all knowledge areas addressed in LETRS than teachers who did not hold one.

    When the team analyzed teachers’ recorded small-group reading lessons, 73% were considered high-quality. They found consistent use of explicit instruction, which is a key component of the science of reading, as well as evidence-backed strategies related to phonemic awareness and phonics. They found limited implementation of practices on vocabulary and comprehension.

    Among the low-quality lessons, more than half were for students reading below grade level. Some “problematic practices” persisted in 17% of analyzed lessons.

    What’s next?

    The research team formed several recommendations on how to improve reading instruction and reading proficiency.

    They said ongoing professional development through education preparation programs and teacher leaders can help teachers translate knowledge to instructional change. Funding is also needed for instructional coaches to help teachers make that jump.

    Guides differentiated by grade levels would help different teachers with different needs when it comes to implementing evidence-backed strategies. And the state should incentivize teachers to pursue specialized credentials in reading instruction, the researchers said.

    Moving forward, the legislation might need more clarity on mechanisms for sustaining the implementation of the science of reading. The research team suggests a structured evaluation framework that tracks implementation, student impact, and resource distribution to inform the state’s future literacy initiatives.

    This article first appeared on EdNC and is republished here under a Creative Commons Attribution-NoDerivatives 4.0 International License.




  • The REF helps make research open, transparent, and credible – let’s not lose that

    The REF helps make research open, transparent, and credible – let’s not lose that

    The pause to reflect on REF 2029 has reignited debate about what the exercise should encompass – and in particular whether and how research culture should be assessed.

    Open research is a core component of a strong research culture. Now is the time to take stock of what has been achieved, and to consider how REF can promote the next stage of culture change around open research.

    Open research can mean many things in different fields, as the UNESCO Recommendation on Open Science makes clear. Wherever it is practiced, open research shifts focus away from outputs and onto processes, with the understanding that if we make the processes around research excellent, then excellent outcomes will follow.

    Trust

    Being open allows quality assurance processes to work, and therefore research to be trustworthy. Although not all aspects of research can be open (sensitive personal data, for example), an approach to learning about the world that is as open as possible differentiates academic research from almost all other routes to knowledge. Open research is not just a set of practices – it’s part of the culture we build around integrity, collaboration and accountability.

    But doing research openly takes time, expertise, support and resources. As a result, researchers can feel vulnerable. They can worry that taking the time to focus on high-quality research processes might delay publication and risk them being scooped, or that including costs for open research in funding bids might make them less likely to be funded; they worry about jeopardising their careers. Unless all actors in the research ecosystem engage, then some researchers and some institutions will feel that they put themselves at a disadvantage.

    Open research is, therefore, a collective action problem, requiring not only policy levers but a culture shift in how research is conducted and disseminated, which is where the REF comes in.

    REF 2021

    Of all the things that influence how research is done and managed in the UK HE sector, the REF is the one that perhaps attracts most attention, even though far less funding is guided by its outcomes than is distributed to HEIs in other ways.

    One of the reasons for this attention is that REF is one of the few mechanisms to address collective action problems and drive cultural change in the sector. It does this in two ways, by setting minimum standards for a submission, and by setting some defined assessment criteria beyond those minimum standards. Both mechanisms provide incentives for submitting institutions to behave in particular ways. It is not enough for institutions to simply say that they behave in this way – by making submissions open, the REF makes institutions accountable for their claims, in the same way as researchers are made accountable when they share their data, code and materials.

    So, then, how has this worked in practice?

    A review of the main panel reports from REF 2021 shows that evidence of open research was visible across all four main panels, but unevenly distributed. Panel A highlighted internationally significant leadership in Public Health, Health Services and Primary Care (UoA 2) and Psychology, Psychiatry and Neuroscience (UoA 4), while Panel B noted embedded practices in Chemistry (UoA 8) and urged Computer Science and Informatics (UoA 11) to make a wider shift towards open science through sharing data, software, and protocols. Panel C pointed to strong examples in Geography and Environmental Studies (UoA 14), and in Archaeology (UoA 15), where collaboration, transparency, and reproducibility were particularly evident. By contrast, Panel D – and parts of Panel C – showed how definitions of open research can be more complex, because what constitutes ‘open research’ is perhaps much more nuanced and varied in these disciplines, and these disciplines did not always demonstrate how they were engaging with institutional priorities on open research and supporting a culture of research integrity. Overall, then, open research did not feature in the reports on most UoAs.

    It is clear that in 2021 there was progress, guided in part by the inclusion of a clear indicator in the REF guidance. However, there is still a long way to go: open research was understood and evidenced in ways that could exclude some research fields, epistemologies and transparent research practices.

    REF 2029

    With REF 2029, the new People, Culture and Environment element has created a stronger incentive to drive culture change across the sector. Institutions are embracing the move beyond compliance, making openness and transparency a core part of everyday research practice. However, alignment between this sector move, REF policy and funder action remains essential to address this collective action problem and therefore ensure that this progress is maintained.

    To step back now would not only risk slowing, or even undoing, progress, but would send confused signals that openness and transparency may be optional extras rather than essentials for a trusted research system. Embedding this move is not optional: a culture of openness is essential for the sustainability of UK research and development, for the quality of research processes, and for ensuring that outputs are not just excellent, but also trustworthy in a time of mass misinformation.

    Openness, transparency and accountability are key attributes of research, and hallmarks of the culture that we want to see in the sector now and in the future. Critically, coordinated sector-wide, institutional and individual actions are all needed to embed more openness into everyday research practices. This is not just about compliance – it is about a genuine culture shift in how research is conducted, shared and preserved. It is about doing the right thing in the right way. If that is accepted, then we would challenge those advocating for reducing the importance of those practices in the REF: what is your alternative, and will it command public trust?


    This article was supported by contributions from:

    Michel Belyk (Edge Hill University), Nik Bessis (Edge Hill University), Cyclia Bolibaugh (University of York), Will Cawthorn (University of Edinburgh), Joe Corneli (Oxford Brookes University), Thomas Evans (University of Greenwich), Eleanora Gandolfi (University of Surrey), Jim Grange (Keele University), Corinne Jola (Abertay University), Hamid Khan (Imperial College, London), Gemma Learmonth (University of Stirling), Natasha Mauthner (Newcastle University), Charlotte Pennington (Aston University), Etienne Roesch (University of Reading), Daniela Schmidt (University of Bristol), Suzanne Stewart (University of Chester), Richard Thomas (University of Leicester), Steven Vidovic (University of Southampton), Eric White (Oxford Brookes University).


  • Notes on Research Policy, Here and Abroad

    Notes on Research Policy, Here and Abroad

    Hi all. I thought I would take some time to have a chat about how research policy is evolving in other countries, because I think there are some lessons we need to learn here in Canada.

    One piece of news that struck me this week came from Switzerland, where the federal government is slashing the budget of the Swiss National Science Foundation (SNSF) by 20%. If the Swiss, a technological powerhouse of a nation, with a broad left-right coalition in power and a more or less balanced budget, are cutting back on science like this, then we might all have to re-think the idea that being anti-science is just a manifestation of right-wing populism. Higher education as a whole has some thinking to do.

    And right now, two countries are in fact re-thinking science quite a bit. In the UK, the new head of UK Research and Innovation (roughly, that country’s One Big Granting Council) has told institutions that they might need to start “doing fewer things but doing them well”, to which Malcolm Press, President of Universities UK and vice-chancellor of Manchester Metropolitan University, added that what he was “hearing from government is that [they] don’t want to be investing in areas of research where we don’t have the quality and we don’t have the scale.” And, the kicker: “You can’t have hobbyist research that’s unfunded going on in institutions. We can’t afford it.”

    Over to Australia, where a few months ago the government set up a Strategic Examination of Research and Development, which released a discussion paper, held consultations and got feedback (which it published) and has now released six more “issue” papers for consultation which detail government thinking in many different and more detailed ways. If this sounds magical to you, it is because you are from Canada, where the standard practice for policymaking is to do everything behind closed doors and treat stakeholders like mushrooms (in the dark with only fecal matter for company) instead of a place where policy-making is treated as a serious endeavour in which public input and expert advice is welcomed. 

    For today’s purposes, however, what matters is not process but policy. The review is seriously considering a number of fairly radical ideas, such as creating a few national “focus areas” for research funding, which would attract higher rates of overhead, and requiring institutions to focus their efforts in one of these priority areas via mission-based compacts (which are sort of like Ontario’s Multi-Year Agreements, only they are meaningful) so as to build scale and specialization.

    Whew.

    One thing that strikes me as odd about both the UK and Australian lines of thinking is the idea that institutional specialization matters all that much. While lots of research is done at the level of the individual lab, most “big science” – the stuff people who dream about specialization have in mind when they talk about science – happens in teams which span many institutions, and more often than not across national borders as well. I get the sense that the phenomenon of institutional rankings has fried policy makers’ brains somewhat: they seem to think that the correct way to think about science is at the level of the institution, rather than labs or networks of laboratories. It’s kind of bananas. We can be glad that this kind of thinking has not infected Canadian policy too much because the network concept is more ingrained here.

    Which brings me to news here at home. 

    The rumour out of Ottawa is that in the next few months (still not clear if this is going to be fall 2025 or spring 2026) there will be an announcement of a new envelope of money for research. But very definitely not inquiry-driven research. No, this is money which the feds intend to spend as part of the increase in “defence” spending which is supposed to rise to 2% of GDP by 2025-2026 and 5% by 2035. So, the kinds of things it will need to go to will be “security”, likely defined relatively generously. It will be for projects in space, protection of critical infrastructure, resiliency, maybe energy production, etc. I don’t think this is going to be all about STEM and making widgets – there will be at least some room for social science in these areas and maybe humanities, too, though this seems to me a harder pitch to make. It is not clear from what I have heard if this is going to be one big pie or a series of smaller pies, divided up either by mission or by existing granting council. But the money does seem to be on its way.

    Now before I go any further, I should point out that I have not heard anyone say that these new research envelopes are actually going to contain new money beyond what was spent in 2024-25.  As I pointed out a couple of weeks ago, that would be hard to square with the government’s deficit-fighting commitments.

    In fact, if I had to guess right now, the best-case scenario is that the Liberals will do this by taking some or all of the 88% of the Budget 2024 research commitment to the tri-councils and pushing it into these new envelopes (the worst-case scenario: they nuke the 88% of the 2024 Budget commitment they haven’t yet spent and claw back money from existing commitments to fund these new envelopes).

    So, obviously there is no push here for institutional specialization, but where our debate echoes those of the UK and Australia is that all three governments seem to want to shift away from broad-based calls for inquiry-driven research and toward more mission-based research in some vaguely defined areas of national priority. I know this is going to irritate and anger many people, but genuinely I don’t see many politically practical alternatives right now. As I said back here: if defending existing inquiry-driven tri-council budgets is the hill the sector chooses to die on, we’re all going to be in big trouble.

    No one will force individual researchers or institutions to be part of this shift to mission-driven research, but clearly that’s where the money is going to be. So, my advice to VPs Research is: get your ducks in a row on this now. Figure out who in your institution does anything that can even tangentially be described as “security-enhancing”. Figure out what kinds of pitches you might want to make. Start testing your elevator pitches. There will be rewards for first movers in this area.


  • UChicago Sells Off Research Center

    UChicago Sells Off Research Center

    The University of Chicago is selling a celebrated research center as the generously endowed university navigates layoffs and program cuts amid a heavy debt load, Financial Times reported Monday.

    UChicago is reportedly selling the Center for Research in Security Prices, founded in 1960, for $355 million to Morningstar, a research and investment firm also based in Chicago. The center, known as CRSP, developed a market database more than 65 years ago that “allowed investors to measure historic rates of return for U.S. stocks,” according to its website, which notes that its data has been used in more than 18,000 peer-reviewed studies and by hundreds of entities.

    CRSP formally became a limited liability company in 2020 but remained wholly owned by UChicago and maintained its affiliation with the university and the Booth School of Business.

    The sale comes as financial issues are adding up for the university. UChicago has borrowed heavily in recent years and seen substandard returns on its $10 billion endowment. University officials recently announced plans to pause admission to multiple Ph.D. programs and to cut 400 staff jobs as the private institution grapples with a debt load that has grown to $6 billion.

    UChicago is currently trying to shed $100 million in expenses.

    The Trump administration’s cancellation of dozens of federal grants in recent months has also hurt the university’s bottom line. UChicago president Paul Alivisatos wrote in late August that the “profound federal policy changes of the last eight months have created multiple and significant new uncertainties and strong downward pressure on our finances.”
