The Fair Work Ombudsman is scrutinising 28 universities in relation to wage underpayments and has recovered $218 million in unpaid wages for over 110,000 employees.

Photo: The Georgetown fellow, an Indian citizen, was released from detention in May. (Andrew Thomas/AFP via Getty Images)
A Georgetown University researcher who was detained by immigration agents in March will be allowed to resume his work, at least for now, according to a court settlement released Tuesday. Politico first reported the development.
The agreement does not guarantee that the postdoctoral fellow, Badar Khan Suri, will be able to stay in the U.S. long term, and it doesn’t resolve his claim that the government violated his First Amendment rights by detaining him over his pro-Palestinian comments and what the government claims are ties to Hamas. Those aspects of the case will be determined by a later ruling.
That said, as litigation continues, Suri will be protected, maintain his status as a student and remain employed.
Suri was first released from detention in May. His wife is a U.S. citizen, but her father has been identified as a former Hamas adviser, which Politico reported was likely a key factor in Suri’s arrest.
Both parties in the case agreed the settlement was a result of “good faith” negotiations, Politico noted, though the State Department and Department of Homeland Security declined to comment.
“We are encouraged that the government agreed to restore Dr. Suri and his children’s status and records,” Eden Heilman, an ACLU lawyer representing Suri, told Politico. “We know Dr. Suri is eager to rejoin the academic community at Georgetown and this will give him the opportunity to do that this fall.”

It has never been more important for providers across the sector to show that access and participation activities have an impact. With resources stretched, we need to know the work we are doing is making a measurable difference. New research from HEAT reveals a series of powerful findings:
These findings provide compelling evidence that the work being done across the sector to widen participation is not only reaching the right students but changing trajectories at scale. Crucially, this latest research includes previously unavailable controls for student-level prior attainment — adding new rigour to our understanding of outreach impact. You can read the full report on our website.
What’s next for national-level research?
Our ability to generate this kind of national evidence is set to improve even further thanks to a successful bid to the Office for Students (OfS) Innovation Fund. Through a collaboration with academics at the Centre for Education Policy and Equalising Opportunities (CEPEO) at the UCL Institute of Education, HEAT will lead on the development and piloting of a pioneering new Outreach Metric, measuring providers’ broader contribution to reducing socio-economic gaps in HE participation. More details about this project can be found here, and we look forward to sharing early findings with the sector in 2026.
Local-level evaluation is just as important
While national analyses like these are essential to understanding the big picture, the OfS rightly continues to require providers to evaluate their own delivery. Local evaluations are critical for testing specific interventions, understanding how programmes work in different contexts, and learning how to adapt practice to improve outcomes. Yet robust evaluation is often resource-intensive and can be out of reach for smaller teams.
This is where a sector-wide evaluation system helps – shared systems like HEAT provide the infrastructure to track student engagement and outcomes at a fraction of the cost of building bespoke systems. Thanks to a decade of collaboration, we now have a system which the sector designed and built together, and which provides the tools necessary to deliver the evaluation that the OfS requires providers to publish as part of their Access and Participation Plans (APP).
We’re also continuing to improve our infrastructure. Thanks to a second successful bid to the OfS Innovation Fund we are building system functionality to support providers to use their tracking data when evaluating their APP interventions. This includes an ‘automated comparator group tool’ that will streamline the process of identifying matched participant and non-participant groups based on confounding variables. By reducing the need for manual data work, the tool will make it easier to apply quasi-experimental designs and generate more robust evidence of impact.
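To make the general idea concrete, here is a minimal sketch of the kind of matching such a tool automates: 1:1 nearest-neighbour matching of outreach participants to non-participants on confounding variables. This is an illustration only, not HEAT’s actual implementation; the column names (participated, prior_attainment, deprivation_quintile) and the data are hypothetical.

```python
# Illustrative sketch only (not HEAT's actual tool): 1:1 nearest-neighbour
# matching of outreach participants to non-participants on confounding
# variables. Column names and data below are hypothetical.
import pandas as pd
from sklearn.neighbors import NearestNeighbors


def match_comparators(df, confounders):
    """Return, for each participant, the most similar non-participant."""
    participants = df[df["participated"] == 1]
    pool = df[df["participated"] == 0]

    # Standardise confounders so no single variable dominates the distance.
    mean, std = df[confounders].mean(), df[confounders].std()
    nn = NearestNeighbors(n_neighbors=1).fit(((pool[confounders] - mean) / std).values)
    _, idx = nn.kneighbors(((participants[confounders] - mean) / std).values)

    # Matching is with replacement: one comparator may serve several participants.
    matched = pool.iloc[idx.ravel()].copy()
    matched["matched_to"] = participants.index.values
    return matched


# Hypothetical example: match on prior attainment and deprivation quintile.
students = pd.DataFrame({
    "participated":         [1, 1, 0, 0, 0, 0],
    "prior_attainment":     [52, 47, 50, 60, 46, 55],
    "deprivation_quintile": [1, 2, 1, 5, 2, 3],
})
print(match_comparators(students, ["prior_attainment", "deprivation_quintile"]))
```

Once a matched comparator group is built, outcomes for the two groups can be compared directly; that comparison is the quasi-experimental step the automated tool is intended to make easier by removing the manual data work.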
Next steps – sharing through publication
With all these tools at their disposal, the next step is to support the sector to publish its evaluations. We need shared learning to avoid duplication and siloed working. HEAT is currently collaborating with TASO to deliver the Higher Education Evaluation Library (HEEL), which will, for the first time, collect and share intervention-level evaluation reports in one accessible place. By collating this evidence, the HEEL will help practitioners and policymakers alike to see what works, what doesn’t, and where we can improve together.
If we want to continue delivering meaningful progress on access and participation, we need both meaningful, critical local evaluation and powerful national insights. Centralised data tracking infrastructure can give the sector the tools it needs to do both.

In a recent piece in The New Yorker, “What Happens After AI Destroys College Writing?,” Hua Hsu tells a story that will be familiar to anyone working in higher education: students wrestling—to varying degrees—about when and how to use generative AI tools like ChatGPT in the completion of their schoolwork.
There is a range of approaches and opinions among students—as there must be, since students are not a monolith—but Hsu centers the piece on “Alex,” an NYU student who aims to become a CPA and who makes extensive and, to his mind, strategic use of these tools in various aspects of his life, including in writing the emails he exchanged with Hsu to arrange the interview for the piece.
Alex walks Hsu through his use of Claude to first crunch an article on Robert Wedderburn (a 19th-century Jamaican abolitionist) into a summary, and then, when the summary was longer than he had time to absorb before class, to reduce it to bullet points that he then transcribed into a notebook since his professor didn’t allow computers in class.
In a more elaborate example, Alex also used it to complete an art history assignment rooted in a visit to a museum exhibition, where he took pictures of the works and wall text and then fed it all into Claude.
His rationale, as told to Hsu, was “I’m trying to do the least work possible, because this is a class I’m not hella fucking with.”
At the end of the article, we check in with Alex on his finals. Alex “estimated he’d spent between thirty minutes and an hour composing two papers for his humanities classes,” something that would’ve taken “eight or nine hours” without Claude. Alex told Hsu, “I didn’t retain anything. I couldn’t tell you the thesis for either paper hahhahaha.”
Hsu then delivers the kicker: “He received an A-minus and a B-plus.”
I mean this without offense to Hsu, an accomplished writer (New Yorker staff writer and author of the best-selling memoir Stay True) and professor at Bard College, but the piece, for all its specifics and color, felt like very old news—to me, at least.
The transactional mindset toward education, something I’ve been writing about for years, is on perfect display in Alex’s actions. Generative AI has merely made this more plain, more common and more troubling, since there aren’t even any hoops to jump through in order to fake engagement. Alex is doing nothing (or nearly so) and earning credits from New York University.
On reflection, though, the story of Alex is even older than I thought, since it was also my story, particularly the line “I’m trying to do the least work possible,” which very much described significant chunks of my own college years from 1988 to 1992.
I earned quite a few credits in my time for, if not doing nothing, certainly learning nothing. Or not learning the subject for which I’d earned the credits, anyway.
How could I blame a student of today for adopting the attitude that I lived by? With my own students, when I was teaching college, I often made hay from my lackluster undergraduate performance, talking about how I skipped more than 70 percent of my class meetings second semester of freshman year but still received no grade lower than a B.
In the article, Hsu remarks that “None of the students I spoke with seemed lazy or passive.” The students “worked hard—but part of their effort went to editing out anything in their college experiences that felt extraneous. They were radically resourceful.”
I, on the other hand, at least when it came to the school part of college, was resolutely lazy and largely passive, except when it came to making sure to avoid courses I was not interested in—essentially anything outside of reading and writing—or that had a mode of assessment not suited to my skills.
My preferred structure was a lecture or lecture/discussion with in-class essay exams and/or short response papers geared to specific texts. Exams and research papers were to be avoided, because exams required studying and research papers required … research.
If you let me loose on a reading or a few chapters from a textbook, I had no trouble giving something that resembled a student doing college, even though the end result was very much akin to Alex’s. I didn’t retain anything.
But hindsight says I learned a lot—or learned enough, anyway, through the classes I was interested in and, perhaps more importantly, the noncurricular experiences of college.
While there are some similarities between my and Alex’s mindset vis-à-vis college, there is a significant difference. Alex appears to be acting out of an “optimization” mindset, where he focuses his efforts on what is most “relevant,” presumably to his future interests, like employment and monetary earnings.
I, on the other hand, majored in the “extraneous experiences.” I was pretty dedicated to the lacrosse club, showing up for practice five days a week with games on the weekend, but I also recall a game day following my 21st birthday when I was so hungover (and perhaps still drunk) that you could smell the alcohol oozing from my pores. My shifts in the midfield were half the length of my line mates’.
(That was the last time I got that drunk.)
I recall a contest at my fraternity where the challenge was to gain the most weight within an 18-hour period, during which we stuffed ourselves with spaghetti, Italian bread, chocolate pudding and gallons of water until we were sick and bloated. Another time we ground through an entire season of Nintendo Super Tecmo Bowl football over the span of a few days, skipping class if you had a matchup that needed playing. We had a group of regulars who gathered in my room to watch All My Children and General Hospital most weekdays. I am the least successful of that crew by a fair stretch.
I know that I took courses in economics, geography, Asian studies and Russian history where, like Alex, I retained virtually nothing about the course material even days after the courses, when I crammed for a test or bs’ed my way through a paper to get my B and move on to what I wanted to spend my time on.
From my perspective as a middle-aged person whose life has been significantly enhanced by all the ways I dodged schoolwork while I was in college, including spending inordinate amounts of time with the woman to whom I have been married for 25 years this August, I would say that missing out on those classes to make room for experiences was the right thing to do.
(Even though I had no understanding of this at the time.)
Will Alex look back and feel the same?
So many questions that need exploring:
Would Alex be as appalled by my indolence, my failure at optimization, as I am by his ignorance?
Have we lost our belief that we as humans have agency over this world of technology?
Is Alex actively deskilling himself, or am I failing to develop the skills necessary for surviving in the world we’ve made for students like Alex?
I wonder if Alex and I have different definitions of what it is to survive.

Nearly two-thirds of teachers used artificial intelligence this past school year, and weekly users saved almost six hours of work per week, according to a recently released Gallup survey. But 28% of teachers still oppose AI tools in the classroom.
The poll, published by the research firm and the Walton Family Foundation, includes perspectives from 2,232 U.S. public school teachers.
“[The results] reflect a keen understanding on the part of teachers that this is a technology that is here, and it’s here to stay,” said Zach Hrynowski, a Gallup research director. “It’s never going to mean that students are always going to be taught by artificial intelligence and teachers are going to take a backseat. But I do like that they’re testing the waters and seeing how they can start integrating it and augmenting their teaching activities rather than replacing them.”
At least once a month, 37% of educators use AI tools to prepare to teach, including creating worksheets, modifying materials to meet student needs, doing administrative work and making assessments, the survey found. Less common uses include grading, providing one-on-one instruction and analyzing student data.
A 2023 study from the RAND Corp. found the most common AI tools used by teachers include virtual learning platforms, like Google Classroom, and adaptive learning systems, like i-Ready or the Khan Academy. Educators also used chatbots, automated grading tools and lesson plan generators.
Most teachers who use AI tools say they help improve the quality of their work, according to the Gallup survey. About 61% said they receive better insights about student learning or achievement data, while 57% said the tools help improve their grading and student feedback.
Nearly 60% of teachers agreed that AI improves the accessibility of learning materials for students with disabilities. For example, some kids use text-to-speech devices or translators.
In the Gallup survey, more teachers agreed with statements about AI’s risks for students than about its opportunities. Roughly a third said weekly AI use would increase students’ grades, motivation, preparation for future jobs and engagement in class. But 57% said it would decrease students’ independent thinking, and 52% said it would decrease critical thinking. Nearly half said it would decrease student persistence in solving problems, ability to build meaningful relationships and resilience for overcoming challenges.
In 2023, the U.S. Department of Education published a report recommending the creation of standards to govern the use of AI.
“Educators recognize that AI can automatically produce output that is inappropriate or wrong. They are well-aware of ‘teachable moments’ that a human teacher can address but are undetected or misunderstood by AI models,” the report said. “Everyone in education has a responsibility to harness the good to serve educational priorities while also protecting against the dangers that may arise as a result of AI being integrated in ed tech.”
Researchers have found that AI education tools can be incorrect and biased — even scoring academic assignments lower for Asian students than for classmates of any other race.
Hrynowski said teachers are seeking guidance from their schools about how they can use AI. While many are getting used to setting boundaries for their students, they don’t know in what capacity they can use AI tools to improve their jobs.
The survey found that 19% of teachers are employed at schools with an AI policy. During the 2024-25 school year, 68% of those surveyed said they didn’t receive training on how to use AI tools. Roughly half of them taught themselves how to use it.
“There aren’t very many buildings or districts that are giving really clear instructions, and we kind of see that hindering the adoption and use among both students and teachers,” Hrynowski said. “We probably need to start looking at having a more systematic approach to laying down the ground rules and establishing where you can, can’t, should or should not, use AI in the classroom.”
Disclosure: Walton Family Foundation provides financial support to The 74.

Tertiary education’s new steward will focus on allocating university funding, harmonising tertiary education and negotiating mission-based contracts, according to its Terms of Reference released on Tuesday.