Tag: publication

  • U.K. University Apologizes to U.S. Scholar Over Publication Ban

    Sheffield Hallam University has apologized to a professor whose research into alleged human rights abuses was blocked from publication after political pressure from the Chinese security services.

    In late 2024, a study by Laura Murphy, an American professor of human rights and contemporary slavery at Sheffield Hallam, into forced labor practices allegedly faced by Uyghur Muslims was refused publication by her institution after a campaign of harassment and intimidation from Beijing, The Guardian and BBC News reported.

    Sheffield Hallam staff working in offices in mainland China faced visits from intelligence officials over the research, while access to the university’s websites was blocked for more than two years, hampering student recruitment, officials say.

    In an internal email from July 2024 obtained by Murphy using a subject access request, university officials said “attempting to retain the business in China and publication of the research are now untenable bedfellows.”

    After taking a career break to work for the U.S. government, Murphy returned to Sheffield Hallam in early 2025 and says she was told by administrators that the university was no longer permitting any research on forced labor or on China, prompting her to start legal action.

    Her solicitor, Claire Powell, of the firm Leigh Day, said that Murphy’s “academic freedom has been repeatedly and unlawfully restricted over the past two years.”

    “The documents uncovered paint an extremely concerning picture of a university responding to threats from a foreign state security service by trading the academic freedom of its staff for its own commercial interests,” Powell added.

    Murphy, who claimed her university failed to protect her academic freedom, has now received an apology and the institution has told her it “wish[ed] to make clear our commitment to supporting her research and to securing and promoting freedom of speech and academic freedom within the law.”

    “The university’s decision to not continue with Professor Laura Murphy’s research was taken based on our understanding of a complex set of circumstances at the time, including being unable to secure the necessary professional indemnity insurance,” a spokesperson for the university added.

    These circumstances relate to a libel case brought against Sheffield Hallam by a Hong Kong garment maker after its name was included in a report into forced labor published in December 2023. A preliminary ruling at the High Court in London found the report had been “defamatory.”

    The apology comes months after new free speech laws came into effect in England in August, with the Office for Students’ free speech champion Arif Ahmed warning the regulator would take action if universities bowed to pressure from foreign governments regarding contentious areas of research.

    A U.K. government spokesperson said, “Any attempt by a foreign state to intimidate, harass or harm individuals in the U.K. will not be tolerated, and the government has made this clear to Beijing after learning of this case.

    “The government has robust measures in place to prevent this activity, including updated powers and offenses through the National Security Act.”

    The Chinese Embassy in London told the BBC that the university had “released multiple fake reports on Xinjiang that are seriously flawed.”

    “It has been revealed that some authors of these reports received funding from certain U.S. agencies,” the embassy added.

    Murphy told the BBC she has received funding over the course of her career from multiple U.S. research agencies, including the U.S. National Endowment for the Humanities for work on slave narratives, the U.S. Department of Justice for work on human trafficking in New Orleans, and more recently from USAID and the U.S. State Department for her work on China.

    The Chinese Embassy said the allegations of “forced labor” in her reports “cannot withstand basic fact-check.”

  • Is it time to change the rules on NSS publication?

    Cast your mind back to 2005, when the four UK higher education funding bodies ran the first ever compulsory survey of students’ views on the education they receive – the National Student Survey (NSS).

    Back then the very idea of such a survey was controversial: we were worried about the impact on the sector’s reputation, the potential for response bias, and the possibility that students would be fearful of responding negatively in case their university downgraded their degree.

    Initial safeguards

    These fears led us to make three important decisions, all of which are now well past their sell-by date. These were:

    • Setting a response rate threshold of 50 per cent
    • Restricting publication to subject areas with more than 22 respondents
    • Only providing aggregate data to universities.

    At the time all of these were very sensible decisions designed to build confidence in what was a controversial survey. Twenty years on, it’s time to look at these with fresh eyes to assure ourselves they remain appropriate – and to these eyes they need to change.

    Embarrassment of riches

    One of these rules has already changed: responses are now published where 10 or more students respond. Personally, I think this represents a very low bar, determined as it is by privacy more than by statistical reasoning, but I can live with it, especially as research has shown that “no data” can be viewed negatively.

    Of the other two, let me first turn to the response rate. Fifty per cent is a very high response rate for any survey, and the fact that the NSS achieves a 70 per cent response rate is astonishing. While I don’t think we should be aiming to get fewer responses, drawing a hard line at 50 per cent creates a cliff edge in the data that we don’t need.

    There is nothing magical about 50 per cent – it’s simply a number that sounds convincing because it means that at least half your students contributed. A 50 per cent response rate does not ensure that the results are free from bias: if, for example, propensity to respond were in some way correlated with a positive experience, the results would still be flawed.
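
    To make that concrete, here is a minimal sketch in Python (my illustration, not part of the original piece): the cohort size, true satisfaction rate and response propensities are all invented numbers, but they show how a survey can clear the 50 per cent bar and still overstate satisfaction when happier students are more likely to respond.

    ```python
    # Minimal sketch of non-response bias: all numbers invented, for illustration only.
    import random

    random.seed(42)

    N = 2000                                   # hypothetical final-year cohort
    true_satisfaction = 0.70                   # assumed "real" satisfaction rate

    responses = []
    for _ in range(N):
        satisfied = random.random() < true_satisfaction
        # Assumption: satisfied students are more likely to fill in the survey
        p_respond = 0.60 if satisfied else 0.40
        if random.random() < p_respond:
            responses.append(satisfied)

    response_rate = len(responses) / N
    observed = sum(responses) / len(responses)

    print(f"Response rate:         {response_rate:.0%}")   # comfortably above 50%
    print(f"True satisfaction:     {true_satisfaction:.0%}")
    print(f"Observed satisfaction: {observed:.0%}")         # biased upwards
    ```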

    I would note that what limited evidence there is suggests that propensity to respond is not correlated with a positive experience, but it’s an under-researched area and one the Office for Students (OfS) should publish some work on.

    Panel beating

    This cliff edge is even more problematic when the data is used in regulation, as the OfS proposes to do as part of the new TEF. Under OfS proposals, providers that don’t have NSS data – either due to small cohorts or a “low” response rate – would have NSS evidence replaced with focus groups or other types of student interaction. This makes sense when the reason is a low absolute number of responses, but not when it is due to missing an exceptionally high response rate threshold – something Oxford and Cambridge failed to hit for many years.

    While focus groups can offer valuable insights, and usefully sit alongside large-scale survey work, it is utterly absurd to ignore evidence from a survey because an arbitrary and very high threshold is not met. Most universities will have several thousand final year students, so even if only 30 per cent of them respond you will have responses from hundreds, if not thousands, of individuals – which must provide a much stronger evidence base than some focus groups. Furthermore, that evidence base will be consistent with every other university’s, creating one less headache for assessors in comparing diverse evidence.

    The 50 per cent response rate threshold also looks irrational when set against the 30 per cent threshold for the Graduate Outcomes (GO) survey. While any response rate threshold is somewhat arbitrary, applying two different thresholds needs rather more justification than the fact that the surveys are able to achieve different response rates. Indeed, I might argue that the risk of response bias could be higher with GO, for a variety of reasons.

    NSS to GO

    In the absence of evidence in support of any different threshold, I would align the NSS and GO publication thresholds at 30 per cent and make the response rates more prominent. I would also share NSS and GO data with TEF panels irrespective of the response rate, and allow them to rely on their expert judgement, supported by the excellent analytical team at the OfS. The TEF panel could then choose to seek additional evidence if it considered this necessary.

    In terms of sharing data with providers, 2025 is really very different to 2005. Social media has arguably exploded and is now contracting, but in any case attitudes to sharing have changed, and it is unlikely that the concerns that existed in 2005 will be the same as the concerns of the current crop of students.

    For those who don’t follow the detail, NSS data is provided back to universities via a bespoke portal that offers a number of pre-defined cuts of the data and comments, together with the ability to create your own cross-tabs. This data, while very rich, does not have the analytical power of individualised data and suffers from still being subject to suppression for small numbers.

    What this means is that if we want to understand the areas we need to improve, we are forced to deduce them from a partial picture rather than being laser-focussed on exactly where the issues are – and this applies to both the Likert-scale questions and the free text.

    It also means that providers cannot form a longitudinal view of the student experience by linking to other data and survey responses they hold at an individual level – something that could generate a much richer understanding of how to improve the student experience.
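
    For illustration, the sort of individual-level linkage being argued for might look like the hypothetical sketch below – the datasets and column names are invented, and nothing like this is possible with today’s suppressed, aggregate extracts.

    ```python
    # Hypothetical sketch of individual-level linkage; datasets and columns are invented.
    import pandas as pd

    # Individual NSS responses (not currently released to providers)
    nss = pd.DataFrame({
        "student_id": [101, 102, 103, 104],
        "overall_satisfaction": [4, 2, 5, 3],      # 1-5 Likert scale
    })

    # A provider's own records, e.g. an internal first-year survey
    internal = pd.DataFrame({
        "student_id": [101, 102, 103, 104],
        "year1_survey_score": [3, 2, 5, 4],
        "used_support_services": [True, True, False, False],
    })

    # Linking at individual level gives a longitudinal view of the experience
    linked = nss.merge(internal, on="student_id")
    print(linked[["overall_satisfaction", "year1_survey_score"]].corr())
    ```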
