Tag: research

  • New test tubes or shiny buildings? The choice facing policymakers when it comes to funding research

    Let me start with a vignette. Back in 2017, we published a brilliant award-winning report on TRAC written by a young intern. This looked specifically at cross-subsidies in universities from Teaching (international students) to Research.

    Back then, there was no clear cross-subsidy towards home students, as they (more than) paid for themselves due to £9,000 fees. But the subsidy from international students towards research was large, as it remains today.

    We held a launch event at the LSE for the paper. This remains seared in my mind because, instead of being impartial, the eminent professor in the Chair attacked our young intern for having the temerity to publicise the split in resources for teaching and research.

    His (widely shared) view was that, at an institution like the LSE, research informs teaching and teaching informs research, so policy makers should not look too closely under the bonnet but instead let universities spend their resources as they see fit.

    The interesting part of this story is that the person who asked us to write the report was the LSE’s own Director of Research. He was frustrated that his colleagues seemed not to understand the financial flows in their own institution.

    A second reason why we should shine a spotlight on how universities work is that teaching and research are now split down the middle when it comes to political oversight:

    • we have one Minister for teaching and another for research;
    • we have one Whitehall Department for teaching and another for research; and
    • we have one regulator / funder for teaching and another for research.

    We might prefer it if it were not so, but it is naïve to think substantial cross-subsidies within institutions fit as naturally with these arrangements as they did with the arrangements in place back at the turn of the millennium, when TRAC was first mooted.

    In our 2017 report, we showed that, according to TRAC, only 73% of research costs were recovered. On revisiting the issue in another report three years later, we found cost recovery had fallen to 69%. Today, as the KCL report shows, the number is just 66%.

    In other words, during a decade when politicians have exalted the power of R&D to transform Britain, the level of cost recovery has been falling at almost 1 percentage point a year.

    However, what has changed over time is that this is now fairly well understood. For example, TRAC data were heavily used to show the sector’s challenges in both the Universities UK Blueprint and the recent Post-16 Education and Skills white paper.

    Let me focus on that white paper for a second. It is a slightly odd document, where you can see the joins between the three Secretaries of State (for Education, Work and Pensions and Science, Innovation and Technology) who share responsibility for it.

    In particular, the white paper recommits to improving cost recovery for research while simultaneously looking for new ways to crack down on the international students who currently provide a big cross-subsidy for research.

    The end result, as the white paper itself admits, is likely to be less research:

    We will work with the sector and other funders to address the cost recovery of research. … We recognise that this may result in funding a lower volume of research but at a more sustainable level.

    While some research-intensive institutions may celebrate this concentration, it does not feel like we have talked enough about what it could mean:

    • for research capacity in each region;
    • for the pipeline of new researchers; and
    • for the likelihood of missing out on new discoveries that may otherwise happen.

    In other words, what we have in the white paper is the perhaps inevitable result of giving the Minister for Science, Research and Innovation, Lord Vallance, the additional role of champion for the ‘Oxford-Cambridge corridor’.

    So far, I have assumed the TRAC numbers are accurate, yet we all know they are rough – or worse. A 10-year-old piece on TRAC in Times Higher Education quotes one university finance director as saying: ‘if you put garbage [data] in, you will get garbage out.’

    In preparation for this session, I spoke to one academic at a research-intensive university, who even argued: ‘TRAC is a piece of fiction to conceal how much teaching subsidises research.’

    He went on to explain that your contract might say 40% of time should be on Teaching and 40% on Research (with 20% for admin): ‘If you spend 60% on Research and 20% on Teaching, you would be in violation of contract so no one will admit to it.’

    A second academic I contacted was similarly scathing:

    ‘I think it is a classic case of looking for a lost wedding ring under the lamppost, even when you lost it a mile away. Universities obviously have an incentive to say that teaching UK students and doing research is more expensive, because they hope to get more money from the government. That is why TRAC does not lead to better business models – the stuff is known to be suspect.’

    Such criticisms may explain why I have only ever been able to find one university that has followed the logic of their own TRAC numbers by refusing to take on any major new research projects (and even they only had the ban in force temporarily).

    The lesson I take from all this is that TRAC is useful, but not enough. Some sort of calculation needs to occur to inform policy makers, funders and managers. But TRAC is not the slam dunk that people sometimes like to think it is because:

    1. the process is neither liked nor trusted by those it measures;
    2. institutions do not respond to what the data say, so look guilty of crying wolf; and
    3. every sector in search of public money does its own calculations, so the fact that TRAC exists and shows a substantial shortfall in the full economic costs of research and, increasingly, teaching home students too does not automatically give higher education institutions a leg up over other areas when lobbying the Government.

    Finally, TRAC is meant to help politicians understand the world but I think we also need to recall the motivations of political leaders. When I was in Whitehall, we struggled to persuade the Treasury to move towards full economic costing. They caricatured it as buying new test tubes when the alternative was shiny new buildings. In the end, politicians in hard hats cannot go to topping-out ceremonies for new test tubes.

    Source link

  • Insiders Reveal: How to Stay Motivated Through Your Research

    Stay Open to What Your Data Is Telling You 📊

    Dr. Patricia Bleich, Associate Professor at Keiser University, emphasizes the importance of remaining genuinely open to what your research reveals. Rather than forcing your findings to fit preconceived ideas, let the data guide your thinking and shape your conclusions. This applies whether you’re working with quantitative measurements or qualitative insights: remain curious about what your research is actually showing you.

    Why This Matters for Motivation

    One of the most demotivating experiences in research is discovering your data doesn’t support your hypothesis. But this only feels like failure if you’re attached to a specific outcome.

    When you approach your data with genuine openness and curiosity, unexpected findings become interesting discoveries rather than disappointing setbacks. This mindset shift protects your motivation and improves your research quality.

    Build your analysis on what the data actually shows, using measurable outcomes and trusted research methods to draw sound conclusions. Your goal isn’t to prove what you already believe; it’s to discover what’s actually there.

    Start Sharing Your Work Before It’s Perfect 📢

    Dr. Jenni Rose, Senior Lecturer at the University of Manchester, offers crucial advice about overcoming perfectionism through action:

    “Don’t wait for the perfect study or groundbreaking discovery to share your pedagogical insights. Every time you try something new in your classroom and reflect on what worked, you’re contributing to the scholarship of teaching.”

    This principle extends far beyond teaching research. Whether you’re studying organizational behavior, healthcare outcomes, or environmental policy, the same truth applies: your insights have value before they’re polished to perfection.

    Why Waiting Kills Motivation

    Perfectionism often manifests as silence. You tell yourself you’ll share your work when it’s “ready,” but that day never comes because perfect doesn’t exist.

    Rose emphasizes: “Your colleagues are wrestling with similar challenges, and your experiences can illuminate their path forward. Start small and start local, but start somewhere. Write that blog post, submit that short paper, or give that workshop at your teaching center.”

    Replace “teaching center” with wherever makes sense for your field: present at a graduate student colloquium, share preliminary findings at a departmental seminar, write a blog post about your methodology challenges, or submit to a student research journal.

    The benefits of early sharing:

    • You get feedback that improves your work
    • You build momentum and accountability
    • You contribute to your field sooner
    • You connect with others doing similar work
    • You combat isolation

    Whatever your focus, sharing early and often builds resilience and sustains motivation. Your preliminary findings, methodological reflections, and even your struggles can help others navigating similar territory.

  • Should research be free for all?

    In the past, Gitanjali Yadav, like many other Indian researchers, would use illegal online libraries to access academic journal articles. Now a new initiative by the Indian government brings hope as a legal alternative, delivering free article access to researchers across India.

    While Yadav benefits from the new scheme, it has brought her new challenges.

    Although India is one of the world’s top scientific research countries, many Indian universities don’t have enough funding for researchers to read the papers they need.

    The Indian government sought to solve this issue with the One Nation One Subscription (ONOS) scheme. ONOS gives public Indian academic institutions free access to academic journals.

    But just as soon as the government gave access, they also took it away. Following the ONOS launch, another important website for reading academic articles was shut down. 

    Previously, Sci-Hub, an illegal academic library, was the one-stop shop to read and download academic literature. Many Indian academics relied on it to access articles behind paywalls. But now the platform has been banned, leaving many Indian academics, and their research, without options.

    A researcher at the National Institute of Plant Genome Research, Yadav downloads and analyzes thousands of academic articles for her research. Now, her attempts to access these articles on a mass scale have led to blocks by publishers even though her institution has subscriptions. 

    Why access matters

    Researchers worry that ONOS will not be able to replace Sci-Hub effectively, and that this will lead to downstream effects in conducting research. 

    India’s access problems raise a bigger question: can countries in the Global South compete in science when they don’t have the same access to information as richer countries? 

    Obtaining academic articles isn’t always simple. Similar to checking a book out from a library, you can only read an article if it is held in a library’s collection. Access to research papers is given through universities and academic institutions. 

    Academic journal subscriptions and publishing fees are estimated to earn $10 billion annually in the United States alone. The average cost of an annual subscription to a single journal by an Indian institution is around $1,300, though journals are often sold together in packages. The costs of these packages can vary, but a large Indian institution could expect to pay $50,000 for one year’s access, making the fee unaffordable for many Indian institutions. 

    Meanwhile, the researchers who write articles and provide peer review for the journals are unpaid for their work.

    Paywall problems

    Without access through their universities, researchers find many academic articles locked behind paywalls. They can pay to read a paper, but fees average around $50 for a single article, and Indian universities can’t pay for all the journals they need.

    “You’re somebody working in an Indian laboratory,” said Peter Murray-Rust, Cambridge researcher and well-known advocate for Open Access science. “What are you going to do? You’re going to pirate it.” 

    Launched in 2011, Sci-Hub changed Indian research by allowing anyone to illegally read articles for free, even if the articles were behind paywalls. 

    One fan was Jonny Coates, the executive director of Rippling Ideas, an organization that advocates for open access to scholarly works. “There are some people who tell you, actually, what Sci-Hub’s done is it solved the access problem,” he said.

    At its peak, Sci-Hub provided access to over 81 million research articles. Many academics in India turned to it, downloading articles illegally; the country downloaded over 5 million articles from Sci-Hub in 2017 alone.

    Is ONOS a game changer?

    Comments from hundreds of Indian researchers can be found online thanking Alexandra Elbakyan, Sci-Hub’s founder: “The website Sci-Hub you have developed is like an oasis in the desert for people like me,” wrote Indian researcher Keshav Moharir. “God bless you.”

    Now the platform is banned, and the Indian research scene is changing. The Indian ban on Sci-Hub follows a 2020 lawsuit filed by major academic publishers like Elsevier and Wiley.

    When the Indian government launched the $715 million ONOS initiative earlier this year, it was heralded as a solution to the access problem because it gave eligible public institutions free access to 13,000 academic journals.

    The announcement was met with much excitement: cutting-edge research could now be pursued without financial barriers. Researchers from small institutions were enthusiastic that they could finally access the resources previously limited to top tier universities. Indian Prime Minister Narendra Modi described it as a “game-changer for Indian academia and for youth empowerment,” in an X post. 

    But ONOS has also faced criticism from researchers. More than half of those offered ONOS are still waiting to use the platform, and it’s unclear why they remain without access. At the same time, ONOS covers only a small portion of the roughly 40,000 academic journals worldwide, limiting access to specialized publications that can be important for researchers. And there are logistical challenges, highlighted by Yadav’s difficulties.

    Private universities, meanwhile, are left without ONOS or Sci-Hub. And some say it will be difficult for them to conduct research going forward.  

    Beyond India 

    India’s challenges show a bigger problem in the Global South. In contrast to institutions in high-income countries, those in the Global South have less money and fewer legal ways to read papers. That means that researchers in Global South countries are less likely to be able to read paywalled papers and include them in their own research. And because of this, their research may not be as strong or influential.

    Lack of access can also influence what type of research gets done. A recent study by researchers at NYU Abu Dhabi on paywalls and scientific data concluded that paywalls can compound disparities in who gets access and who doesn’t, and in who ends up contributing to the global production of knowledge.

    One researcher from Ghana quoted in the study noted that the availability of papers could affect which projects he recommends to his students. 

    Murray-Rust said that being able to read the body of research is so essential for conducting good science, that in many cases, piracy becomes a standard practice.

    Whether government-led schemes can replace grassroots alternatives like Sci-Hub effectively is yet to be seen. 

    Researchers like Yadav fear that ONOS will end up being more symbolic than a real change for India’s research community. For now, India’s academic community finds itself in a difficult phase of transition.


    Questions to consider:

    1. Why does it cost money to access some research studies?

    2. Who should fund scholarly research?

    3. If you put a lot of time and money into conducting a research study, would you give away the results for free? Why or why not?

  • Northern Ireland can be a testbed for research culture

    In higher education, talk of “research culture” can sometimes feel abstract. We know it matters, but what does it actually look like in practice – and how do you change it?

    Today, we’re publishing a new report on research culture in Northern Ireland that tries to answer some of those questions. Produced in partnership with CRAC-Vitae as part of the Research Culture Northern Ireland (RCNI) initiative, and supported by the Wellcome Trust, the report draws insights from across universities, government, industry, and the voluntary sector.

    Our aim was to explore how research is experienced in a small but vibrant ecosystem, and to test whether Northern Ireland might offer a different perspective on research culture – one that could be of interest not only here, but to other regions of the UK and beyond.

    Why Northern Ireland, why now?

    Northern Ireland’s research ecosystem is distinctive. Our higher education sector is small but high-performing, regularly punching above its weight in UK and international rankings. We are separated from the rest of the UK by the Irish Sea, but uniquely, we share a land border with the EU – creating opportunities for cross-border collaboration.

    Yet there are challenges too. Levels of innovation and productivity remain lower than in the UK and Ireland overall. Access to research funding is uneven. Career mobility is limited, partly shaped by geography.

    At the same time, research and innovation are high on the policy agenda. The Northern Ireland Executive’s Programme for Government highlights ambitious R&I plans, including the creation of a regional strategy to support key sectors such as cyber security and software, advanced manufacturing and life and health sciences. The appointment of Northern Ireland’s first Chief Scientific and Technology Adviser signals stronger leadership in this space, and with the CSTA shortly bringing the regional R&I strategy forward for consultation, much has developed since the report on research culture was commissioned. The Belfast Region City Deal is creating new innovation centres, while a recently published Collaborative Innovation Plan represents a coordinated commitment by Innovate UK, the Department for the Economy and Invest NI to accelerate inclusive and sustainable innovation across the region.

    To harness these opportunities, we need a research culture that enables collaboration across sectors, supports the talent we already have, and makes the region an attractive place for others to come and do research.

    Finding out

    Queen’s University Belfast and Ulster University jointly created RCNI with support from the Wellcome Trust to explore research culture in more depth, and to test interventions that might help address challenges.

    Alongside pilot projects on postdoctoral careers, practice-as-research, and the role of research professionals, we commissioned CRAC-Vitae to examine Northern Ireland’s research ecosystem through survey data (167 responses), interviews (17), and focus groups. The aim was not only to generate evidence specific to our context, but also to explore whether familiar UK-wide challenges looked different – or perhaps more visible – in a small system.

    The findings are grouped into five themes. None of them are unique to Northern Ireland – but they resonate in ways that may feel familiar to colleagues elsewhere:

    Collaboration and coordination. Collaboration is widespread, with 80 per cent of respondents reporting that they had worked with an external organisation. However, qualitative data revealed that collaborations are often informal, relying on personal networks. Smaller organisations can be excluded, and visibility across the system is limited.

    Career pathways and talent development. Career progression is constrained by limited opportunities, with 59 per cent of respondents identifying a lack of progression routes. Pathways are often fragmented, and cross-sector mobility remains low, with 52 per cent of respondents reporting difficulty moving between sectors. Talent is underutilised as a result.

    Understanding and communicating the value of research. Research has enormous civic and community benefits, but these are undervalued and misunderstood – limiting recognition and policy impact.

    Reducing administrative burden. Bureaucracy, compliance, and regulatory hurdles disproportionately affect SMEs and non-HE actors, creating inefficiencies and blocking participation.

    Strategic vision and system reform. Stakeholders see a fragmented and opaque system, lacking shared vision and coherence – only 31 per cent of survey respondents agreed there is a shared strategic vision for R&I in NI – a situation compounded by political instability.

    We know this is a small sample and just one piece in a growing evidence base. But it offers useful starting points for further discussion – and perhaps areas where regions could work together.

    Reflections for small regions

    Looking across these findings, a few reflections stand out that may be of interest to other small regions with strong research ecosystems.

    First, proximity can be a strength. The size and concentration of institutions, government, and industry in a defined area creates real opportunities to build effective networks and shared understanding of barriers. In particular, it can help identify and tackle bureaucratic friction more quickly.

    Next, collaboration is essential – but it needs structure. In small systems, personal connections carry weight. That can be a strength, but risks becoming exclusive and unwelcoming to newcomers. Creating formal mechanisms for inclusion is key.

    There’s also work to be done on harnessing existing talent. With only a handful of research-intensive institutions, we need to do more to support and retain the talent we already have. Not every research student or postdoc will have an academic career – but their skills are vital to other sectors and to addressing regional challenges.

    Finally, a joined-up voice matters. A coherent strategic focus and communication plan helps small regions do more with less. Playing to strengths, and presenting a clear message externally, is critical to attracting funding and partnerships. This project, a partnership between Queen’s and Ulster, embodies that.

    These are not answers, but starting points for reflection – and perhaps for collaboration across regions that face similar issues.

    Where could this go?

    We are realistic: these challenges cannot be solved by one project, or even one region. Our next steps will therefore follow a dual approach: influencing system-level reforms through evidence, advocacy, and convening – recognising that changes to policy and funding lie with government and funders – and also testing project-level interventions through pilot projects, generating practical learning that might inform broader reforms.

    The first of these involves a new collaboration with CRAC-Vitae to pilot innovative approaches to tracking the career outcomes of postdoctoral researchers in Northern Ireland. This aligns with our “people first” focus for this project, recognising that our research and innovation ecosystem is nothing without the talent and ideas that populate it.

    If successful, we hope that coordinated career tracking will help identify mobility trends, sectoral destinations, and skill gaps across the R&I workforce – providing the evidence needed to strengthen cross-sector pathways and retain and develop talent within NI’s R&I ecosystem.

    Building on other RCNI work exploring postdoctoral career development, these efforts aim to build a clearer picture of how people move through and beyond the research ecosystem – and how policies and practices can better support their progression.

    Although modest in scale, this pilot will address an area with little existing evidence and may offer a model for others seeking to strengthen mobility and progression across the research ecosystem.

    An invitation to reflect

    So what does this mean for colleagues elsewhere? We don’t claim to have the answers. But we think Northern Ireland’s experience highlights issues that many regions face – and raises questions that might be useful to explore collectively.

    If proximity can be a strength, how do we best harness it? If collaboration relies too heavily on personal networks, how do we make it more inclusive? If we want to value research talent beyond academia, how do we support those careers in practice? And if small regions need a joined-up voice, how do we achieve it without losing diversity?

    Northern Ireland is a small system, but that makes its challenges and opportunities more visible. We hope this report is not only useful here, but a provocation to reflect on how small research ecosystems across the UK – and beyond – might learn together.

  • Clean energy research funds under threat – Campus Review

    There has been much excitement since Australia signed a landmark agreement with the United States last month to expand cooperation on critical minerals and rare earth elements.


  • Anatomy of the Research Statement (opinion)

    The research statement that you include in your promotion and tenure dossier is one of the most important documents of your scholarly career—and one you’ll have little experience writing or even reading, unless you have a generous network of senior colleagues. As an academic editor, I support a half dozen or so academics each year as they revise (and re-revise, and throw out, and retrieve from the bin, and re-revise again) and submit their research statements and P&T dossiers. My experience is with—and so these recommendations are directed at—tenure-track researchers at American R-1s and R-2s and equivalent Canadian and Australian institutions.

    In my experience, most academics are good at describing what their research is and how and why they do it, but few feel confident in crafting a research statement that attests to the impact of their accomplishments. And “impact” is a dreaded word across the disciplines—one that implies reducing years of labor to mere numbers that fail to account for the depth, quality or importance of your work.

    When I think about “impact,” I think of course of the conventional metrics, but I think as well of your work’s influence among your peers in academia, and also of its resonance in nonacademic communities, be they communities of clinicians, patients, people with lived experiences of illness or oppression, people from a specific equity-deserving group, or literal neighborhoods that can be outlined on a map. When I edit research statements, I support faculty to shift their language from “I study X” to “My study of X has achieved Y” or “My work on X has accomplished Z.” This shift depends on providing evidence to show how your work has changed other people’s lives, work or thinking.

    For researchers who seek to make substantial contributions outside of academia—to cure a major disease, to change national policy or legislation—such a focus on impact, influence and resonance can be frustratingly short-termist. Yet if it is your goal to improve the world beyond the boundaries of your classroom and campus, then it seems worthwhile to find ways to show whether and how you are making progress toward that goal.

    If you’re preparing to go up for tenure or promotion, here’s a basic framework for a research statement, which you can adopt and adapt as you prepare your own impact-, influence- or resonance-focused research statement:

    Paragraph 1—Introduction

    Start with a high-level description of your overarching program of research. What big question unites the disparate parts of your work? What problem are you working toward solving? If your individual publications, presentations and grants were puzzle pieces, what big picture would they form?

    Paragraph 2—Background (Optional)

    Briefly sketch the background that informed your current preoccupations. Draw, if relevant, on your personal or professional background before your graduate studies. This paragraph should be short and should emphasize how your pre-academic life laid the foundation that has prepared you, uniquely, to address the key concerns that now occupy your intellectual life. For folks in some disciplines or institutions, this paragraph will be irrelevant and shouldn’t be included: trust your gut, or, if in doubt, ask a trusted senior colleague.

    Middle Paragraphs—Research Themes, Topics or Areas

    Cluster thematically—usually into two, three or four themes—the topics or areas into which your disparate projects and publications can be categorized. Within each theme, identify what you’re interested in and, if your methods are innovative, how you work to advance scholarly understandings of your subject. Depending on the expected length of your research statement, you might write three or four paragraphs for each theme. Each paragraph should identify external funding that you secured to advance your work and point to any outputs—publications, conference presentations, journal special issues, monographs, edited books, keynotes, invited talks, events, policy papers, white papers, end-user training guides, patents, op-eds and so on—that you produced.

    If the output is more than a few years old, you’ll also want to identify what impact (yes) that output had on other people. Doing so might involve pointing at your numbers of citations, but you might also:

    • Describe the diversity of your citations (e.g., you studied frogs but your research is cited in studies of salmon, belugas and bears, suggesting the broad importance of your work across related subfields);
    • Search the Open Syllabus database to identify the number of institutions that include your important publication in their teaching, or search WorldCat to identify the number of countries in which your book is held;
    • Link your ORCID account to Sage’s Policy Profiles to discover the government ministries and international bodies that have been citing your work;
    • Summarize media mentions of your work or big, important stories in news media, such as magazine covers or features in national newspapers (e.g. “In August 2025, this work was featured in The New York Times (URL)”);
    • Name awards you’ve won for your outputs or those won by trainees you supervised on the project, including a description of why the award-giving organization selected your or your trainee’s work;
    • Identify lists of top papers in which your article appears (e.g., most cited or most viewed in that journal in the year it was published); or,
    • Explain the scholarly responses to your work, e.g., conference panels discussing one of your papers or quotations from reviews of your book in important journals.

    Closing Paragraphs—Summary

    If you’re in a traditional research institution—one that would rarely be described by other academics as progressive or politically radical—then it may be advantageous for you to conclude your research statement with three summary paragraphs.

    The first would summarize your total career publications and your publications since appointment, highlighting any that received awards or nominations or that are notable for the number of citations or the critical response they have elicited. This paragraph should also describe, if your numbers are impressive, your total number of career conference presentations and invited talks or keynotes as well as the number since either your appointment or your last promotion, and the total number of publications and conference presentations you’ve co-authored with your students or trainees or partners from community or patient groups.

    A second closing paragraph can summarize your total career research funding and funding received since appointment, highlighting the money you have secured as principal investigator, the money that comes from external (regional, national and international) funders, and, if relevant, the new donor funding you’ve brought in.

    A final closing paragraph can summarize your public scholarship, including numbers of media mentions, hours of interviews provided to journalists, podcast episodes featured on or produced, public lectures delivered, community-led projects facilitated, or numbers of op-eds published (and, if available, the web analytics associated with these op-eds; was your piece in The Conversation one of the top 10 most cited in that year from your institution?).

    Final Paragraph—Plans and Commitments

    Look forward with excitement. Outline the upcoming projects, described in your middle paragraphs, to which you are already committed, including funding applications that are still under review. Paint for your reader a picture of the next three to five years of your research and then the rest of your career as you progress toward achieving the overarching goal that you identified in your opening paragraph.

    While some departments and schools are advising their pretenure faculty that references to metrics aren’t necessary in research statements, I—perhaps cynically—worry that the senior administrators who review tenure dossiers after your department head will still expect to see your h-index, total number of publications, number of high-impact-factor journals published in and those almighty external dollars awarded.

    Unless you are confident that your senior administrators have abandoned conventional impact metrics, I’d encourage you to provide these numbers and your disciplinary context. I’ve seen faculty members identify, for example, the average word count of a journal article in their niche, to show that their number of publications is not low but rather is appropriate given the length of a single article. I’ve seen faculty members use data from journals like Scientometrics to show that their single-digit h-index compares to the average h-index for associate professors in their field, even though they are not yet tenured. Such context will help your reader to understand that your h-index of eight is, in fact, a high number, and should be understood as such.
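    If you do cite an h-index, it helps to know exactly how it is computed. The sketch below is a minimal illustration, assuming you already have a list of per-paper citation counts (the numbers here are invented, not drawn from any real profile):

```python
# h-index: the largest h such that h of your papers have >= h citations each.
def h_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break     # counts are sorted descending, so no later paper can
    return h

# Invented citation counts for eight papers:
print(h_index([25, 8, 5, 3, 3, 2, 1, 0]))  # -> 3
```

    An h-index of 3 means three papers have at least three citations each. Databases like Scopus and Google Scholar apply this same rule to their own citation counts, which is one reason the figure varies between sources.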

    You’ll additionally receive any number of recommendations from colleagues and mentors; for those of you who don’t have trusted colleagues or mentors at your institution, I’ve collected the advice of recently tenured and promoted associate professors and full professors from a range of disciplines and institutional contexts in this free 30-page PDF.

    I imagine that most of the peers and mentors whom you consult will remind you to align with any guidelines that your institution provides. Definitely, you should do this—and you should return to those guidelines and evaluation criteria, if they exist, as you iteratively revise your draft statement based on the feedback you receive from peers. You’ll also need to know what pieces of your P&T dossier will be read by what audience—external readers, a departmental or faculty committee, senior administrators. Anyone can tell you this: every piece of writing needs to consider both audience and context.

    But my biggest takeaway is something no client of mine has ever been told by a peer, colleague or mentor: Don’t just describe what you’ve done. Instead, point to the evidence that shows that you’ve done your work well.

    Source link

  • What research says about Mamdani and Cuomo’s education proposals

    What research says about Mamdani and Cuomo’s education proposals

    by Jill Barshay, The Hechinger Report
    November 3, 2025

    New York City, where I live, will elect a new mayor Tuesday, Nov. 4. The two front runners — state lawmaker Zohran Mamdani, the Democratic nominee, and former Gov. Andrew Cuomo, running as an independent — have largely ignored the city’s biggest single budget item: education. 

    One exception has been gifted education, which has generated a sharp debate between the two candidates. The controversy is over a tiny fraction of the student population. Only 18,000 students are in the city’s gifted and talented program out of more than 900,000 public school students. (Another 20,000 students attend the city’s elite exam-entrance high schools.) 

    But New Yorkers are understandably passionate about getting their kids into these “gated” classrooms, which have some of the best teachers in the city. Meanwhile, the racial composition of these separate (some say segregated) classes — disproportionately white and Asian — is shameful. Even many advocates of gifted education recognize that reform is needed. 

    Mamdani wants to end gifted programs for kindergarteners and wait until third grade to identify advanced students. Cuomo wants to expand gifted education and open up more seats for more children. 

    The primary justification for gifted programs is that some children learn so quickly that they need separate classrooms to progress at an accelerated pace. 

    But studies have found that students in gifted classrooms are not learning faster than their general education peers. And analyses of curricula show that many gifted classes don’t actually teach more advanced material; they simply group mostly white and Asian students together without raising academic rigor.

    In my reporting, I have found that researchers question whether we can accurately spot giftedness in 4- or 5-year-olds. My colleague Sarah Carr recently wrote about the many methods that have been used to try to identify young children with high potential, and how the science underpinning them is shaky. In addition, true giftedness is often domain-specific — a child might be advanced in math but not in reading, or vice versa — yet New York City’s system labels or excludes children globally rather than by subject. 

    Because of New York City’s size — it’s the nation’s largest public school system, with more students than the statewide systems of 30 states — what happens here matters.

    Policy implications

    • Delaying identification until later grades, when cognitive profiles are clearer, could improve accuracy in picking students. 
    • Reforming the curriculum to make sure that gifted classes are truly advanced would make it easier to justify having them. 
    • Educators could consider ways for children to accelerate in a single subject — perhaps by moving up a grade in math or English classes. 
    • How to desegregate these classrooms, and make their racial/ethnic composition less lopsided, remains elusive.

    I’ve covered these questions before. Read my columns on gifted education:

    Size isn’t everything

    Another important issue in this election is class size. Under a 2022 state law, New York City must reduce class sizes to no more than 20 students in grades K-3 by 2028. (The cap will be 23 students per class in grades 4-8 and 25 students per class in high school.) To meet that mandate, the city will need to hire an estimated 18,000 new teachers.

    During the campaign, Mamdani said he would subsidize teacher training, offering tuition aid in exchange for a three-year commitment to teach in the city’s public schools. The idea isn’t unreasonable, but it’s modest — only $12 million a year, expected to produce about 1,000 additional teachers annually. That’s a small fraction of what’s needed.

    The bigger problem may be the law itself: Schools lack both physical space and enough qualified teachers. What parents want — small classes led by excellent, experienced educators — isn’t something the city can scale quickly. Hiring thousands of novices may not improve learning much, and will make the job of school principal, who must make all these hires, even harder.

    For more on the research behind class-size reductions, see my earlier columns:

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or [email protected].

    This story about education issues in the New York City mayoral election was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.

    This article first appeared on The Hechinger Report (https://hechingerreport.org/proof-points-nyc-mayor-election-education/) and is republished here under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).



  • ‘End of an era’: Experts warn research executive order could stifle scientific innovation

    ‘End of an era’: Experts warn research executive order could stifle scientific innovation

    An executive order that gives political appointees new oversight for the types of federal grants that are approved could undercut the foundation of scientific research in the U.S., research and higher education experts say. 

    President Donald Trump’s order, signed Aug. 7, directs political appointees at federal agencies to review grant awards to ensure they align with the administration’s “priorities and the national interest.”

    These appointees are to avoid giving funding to several types of projects, including those that recognize sex beyond a male-female binary or initiatives that promote “anti-American values,” though the order doesn’t define what those values are.   

    The order effectively codifies the Trump administration’s moves to deny or suddenly terminate research grants that aren’t in line with its priorities, such as projects related to climate change, mRNA research, and diversity, equity and inclusion.

    The executive order’s mandates mark a big departure from the norms that prevailed before the second Trump administration. Previously, career experts rather than political appointees provided oversight, and peer review was the core way to evaluate projects.

    Not surprisingly, the move has brought backlash from some quarters.

    The executive order runs counter to the core principle of funding projects based on scientific merit — an idea that has driven science policy in the U.S. since World War II, said Toby Smith, senior vice president for government relations and public policy at the Association of American Universities. 

    “It gives the authority to do what has been happening, which is to overrule peer-review through changes and political priorities,” said Smith. “This is really circumventing peer review in a way that’s not going to advance U.S. science and not be healthy for our country.”

    That could stifle scientific innovation. Trump’s order could prompt scientists to discard their research ideas, not enter the scientific research field or go to another country to complete their work, research experts say. 

    Ultimately, these policies could cause the U.S. to fall from being one of the top countries for scientific research to one near the bottom, said Michael Lubell, a physics professor at the City College of New York.

    “This is the end of an era,” said Lubell. “Even if things settle out, the damage has already been done.”

    A new approach to research oversight

    Under the order, senior political appointees or their designees will review new federal awards as well as ongoing grants and terminate those that don’t align with the administration’s priorities.

    This policy is a far cry from the research and development strategy developed by Franklin D. Roosevelt’s administration at the end of World War II. Vannevar Bush, who headed the U.S. Office of Scientific Research and Development at the time, decided the U.S. needed a robust national program to fund research that would leave scientists to do their work free from political pressure. 

    Bush’s strategy involved some government oversight over research projects, but it tended to defer to the science community to decide which projects were most promising, Lubell said. 

    “That kind of approach has worked extremely well,” said Lubell. “We have had strong economic growth. We’re the No. 1 military in the world, our work in the scientific field, whether it’s medicine, or IT — we’re right at the forefront.”

    But Trump administration officials, through executive orders and in public hearings, have dismissed some federal research as misleading or unreliable — and portrayed the American scientific enterprise as one in crisis. 

    The Aug. 7 order cited a 2024 report from the U.S. Senate Commerce, Science, and Transportation Committee, led by its then-ranking member and current chairman, Sen. Ted Cruz, R-Texas, that alleged more than a quarter of National Science Foundation spending supported DEI and other “left-wing ideological crusades.” House Democrats, in a report released in April, characterized Cruz’s report as “a sloppy mess” that used flawed methodology and “McCarthyistic tactics.”


  • Australia signs research pact with China – Campus Review

    Australia signs research pact with China – Campus Review

    Universities Australia has signed a deal with China that will encourage research collaboration and student exchanges between the two countries.


  • The white paper is wrong – changing research funding won’t change teaching

    The white paper is wrong – changing research funding won’t change teaching

    The Post-16 education and skills white paper might not have a lot of specifics in it, but it does mostly make sense.

    The government’s diagnosis is that the homogeneity of sector outputs is a barrier to growth. Their view, emerging from the industrial strategy, is that it is an inefficient use of public resources to have organisations doing the same things in the same places. The ideal is specialisation where universities concentrate on the things they are best at.

    There are different kinds of nudges to achieve this goal. One is the suggestion that the REF could more closely align to the government missions. The detail is not there, but it is possible to see how impact could be made to be about economic growth, or how funding could be shifted more toward applied work. There is a suggestion that research funding should consider the potential of places (maybe that could lead to some regional multipliers, who knows). And there are already announced steps around the reform of HEIF and new support for spin-outs.

    Ecosystems

    All of these things might help but they will not be enough to fundamentally change the research ecosystem. If the incentives stay broadly the same researchers and universities will continue to do broadly the same things irrespective of how much the government wants more research aimed at growing the economy.

    The potentially biggest reform has the smallest amount of detail. The paper states

    We will incentivise this specialisation and collaboration through research funding reform. By incentivising a more strategic distribution of research activity across the sector, we can ensure that funding is used effectively and that institutions are empowered to build deep expertise in areas where they can lead. This may mean a more focused volume of research, delivered with higher-quality, better cost recovery, and stronger alignment to short- and long-term national priorities. Given the close link between research and teaching, we expect these changes to support more specialised and high quality teaching provision as well.

    The implication here is that if research funding is allocated differently then providers will choose to specialise their teaching, because research and teaching are linked. Before we get to whether there is a link between research funding and teaching (spoiler: there is not), it is worth unpacking two other implications here.

    The first is that the “strategic distribution” element will have entirely different impacts depending on what the strategy is and what the distribution mechanism is. The paper states that there could, broadly, be three kinds of providers: teaching only, teaching with applied research, and research institutions (which presumably also do teaching). The strategy is to allow providers to focus on their strengths, but the problem is that it is entirely unclear which strengths, or how they will be measured. For example, some researchers are doing research which is economically impactful but perhaps not the most academically groundbreaking. Presumably this is not the activity which the government would wish to deprioritise, but it could be if measured by current metrics. It also doesn’t explain how providers with pockets of research excellence within an overall weaker research profile could maintain their research infrastructure.

    The white paper suggests that the sector should focus on fewer but better funded research projects. This makes sense if the aim is to improve the cost recovery on individual research projects, but improving the unit of resource by concentrating the overall allocation won’t necessarily improve the financial sustainability of research generally. A strategic decision to align research funding more with the industrial strategy would leave some providers exposed. A strategic decision to invest in research potential rather than research performance would harm others. A focus on regions, or London, or excellence wherever it may be, would each have a different impact. The distribution mechanism is a second-order question to the overall strategy, which has not yet dealt with some difficult trade-offs.

    On its own terms, it also seems that research funding is not a good indicator of teaching specialism.

    Incentives

    When the White Paper suggests that the government can “incentivise specialisation and collaboration through research funding reform”, it is worth asking what – if any – links there currently are between research funding and teaching provision.

    There are two ways we can look at this. The first looks at current research income from the UK government to each provider (either directly, or via UKRI) by cost centre – and compares that to the students (FTE) associated with that cost centre within a provider.

     

    [Full screen]

    We’re at a low resolution – this split of students isn’t filterable by level or mode of study, and finances are sometimes corrected after the initial publication (we’ve looked at 2021-22 to remove this issue). You can look at each cost centre to see if there is a relationship between the volume of government research funding and student FTE – and in all honesty there isn’t much of one in most cases.
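    For readers who want to replicate the idea rather than the chart, the check is just a per-cost-centre correlation. A minimal sketch, using invented figures rather than the actual HESA finance and student records behind the visualisation:

```python
from statistics import mean

def pearson(xs, ys):
    # Pearson correlation coefficient, computed from first principles.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical figures for one cost centre across five providers:
# government research income (£m) and student FTE.
income = [42.0, 3.1, 0.4, 18.7, 1.2]
fte = [850, 1200, 400, 600, 1500]

print(f"r = {pearson(income, fte):.2f}")
```

    Repeating this for each cost centre, and finding |r| small in most of them, is what “there isn’t much of a relationship” amounts to.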

    If you think about it, that’s kind of a surprise – surely a larger department would have more of both? – but there are some providers who are clearly known for having high quality research as opposed to large numbers of students.

    So to build quality into our thinking we turn to the REF results (we know that there is generally a good correlation between REF outcomes and research income).

    Our problem here is that REF results are presented by unit of assessment – a subject grouping that maps cleanly neither to cost centres nor to the CAH hierarchy used more commonly in student data (for more on the wild world of subject classifications, DK has you covered). This is by design, of course – an academic with training in biosciences may well live in the biosciences department and the biosciences cost centre, but there is nothing to stop them researching how biosciences is taught (outputs of which might be returned to the Education cost centre).

    What has been done here is a custom mapping at CAH3 level between the subjects students are studying and REF2021 submissions – the axes are student headcount (you can filter by mode and level, and choose whichever academic year you fancy looking at) against the FTE of staff submitted to REF2021 – with a darker blue blob showing a greater proportion of the submission rated as 4* in the REF (there’s a filter at the bottom if you want to look at just high-performing departments).
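    The mapping step itself is mundane but worth seeing. A toy sketch, in which both the CAH3 codes and the unit-of-assessment labels are invented placeholders rather than the real lookup behind the chart:

```python
from collections import defaultdict

# Hypothetical CAH3 -> REF2021 unit of assessment lookup (placeholder values).
cah3_to_uoa = {
    "CAH03-01-02": "UoA 5: Biological Sciences",
    "CAH20-01-01": "UoA 28: History",
}

# Hypothetical student records: (provider, CAH3 code, headcount).
students = [
    ("Provider A", "CAH03-01-02", 320),
    ("Provider A", "CAH20-01-01", 110),
    ("Provider B", "CAH03-01-02", 90),
]

# Aggregate headcount per (provider, unit of assessment) so it can be set
# against the FTE of staff each provider submitted to that unit in REF2021.
headcount = defaultdict(int)
for provider, cah3, n in students:
    headcount[(provider, cah3_to_uoa[cah3])] += n

for key, total in sorted(headcount.items()):
    print(key, total)
```

    The real exercise is the same join at scale, with the added wrinkle (noted above) that some academics’ outputs are returned to a unit that does not match their teaching department.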

    [Full screen]

    Again, correlations are very hard to come by (if you want you can look at a chart for a single provider across all units of assessment). It’s almost as if research doesn’t bring in money that can cross-subsidise teaching, which will come as no surprise to anyone who has ever worked in higher education.

    Specialisation

    The government’s vision for higher education is clear. Universities should specialise and universities that focus on economic growth should be rewarded. The mechanisms to achieve it feel, frankly, like a mix of things that have already been announced and new measures that are divorced from the reality of the financial incentives universities work under.

    The white paper has assiduously ducked laying out some of the trade-offs and losers in the new system. Without this, the government cannot set priorities; and if it does not move some of the underlying incentives on student funding, regional funding distribution, greater devolution, supply-side spending like Freeports, staff reward and recognition, student number allocations, or the myriad other things that make up the basis of the university funding settlement, it has little hope of achieving its goals in specialisation or growth.
