‘How can I know what I think till I see what I say?’: How AI is changing education and writing

  • Following HEPI’s recent Policy Note on students’ use of artificial intelligence (AI), HEPI Director Nick Hillman reviews a new book from the United States on what AI means for writing.

‘ChatGPT cannot write.’ It’s a bold statement, but one that appears near the start of the new book More Than Words: How to Think about Writing in the Age of AI and frames the 300 pages that follow.

The author John Warner’s persuasive argument is that generative AI creates syntax but doesn’t write because ‘writing is thinking.’ (I hope this is the only reason why, when asked to write a higher education policy speech ‘in the style of Nick Hillman’, ChatGPT’s answer is so banal and vacuous…) People are, Warner says, attracted to AI because they’ve not previously been ‘given the chance to explore and play within the world of writing.’

Although Warner is not as negative about using ChatGPT to retrieve information as he is about using it to write wholly new material, he sees the problems it presents as afflicting the experience of ‘deep reading’ too: ‘Reading and writing are being disrupted by people who do not seem to understand what it means to read and write.’

The book starts by reminding the reader how generative AI based on Large Language Models actually works. ChatGPT and the like operate as machines that predict the next fragment of text, known as a ‘token’, one step at a time. To me, it is reminiscent of Gromit placing the next piece of train track in front of him as he goes. It’s all a bit like a more sophisticated version of how the iPhone Notes app on which I’m typing this keeps suggesting the next word for me. (If you click on the suggestions, it tends to end up as nonsense though – I’ve just done it and got, ‘the app doesn’t even make a sentence in a single note’, which sounds like gibberish while also being factually untrue.)
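
For readers who want to see the mechanics, here is a toy sketch of the ‘fetching tokens based on weighted probabilities’ idea. A tiny bigram model like this is nothing like a real Large Language Model, and the corpus and function names below are invented purely for illustration, but the generation loop is the same in spirit: predict a likely next token, append it, repeat.

```python
import random
from collections import defaultdict, Counter

# A deliberately tiny "training corpus", invented purely to illustrate.
corpus = (
    "writing is thinking and thinking is writing "
    "reading and writing are being disrupted"
).split()

# Count which word follows which; these counts act as the "weights".
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def generate(start: str, length: int = 8) -> str:
    """Lay the next token like a piece of track: sample it from the weighted
    counts of what has followed the current word, append it, and repeat."""
    words = [start]
    for _ in range(length):
        options = follows.get(words[-1])
        if not options:
            break  # no known continuation, so stop
        (nxt,) = random.choices(list(options), weights=list(options.values()))
        words.append(nxt)
    return " ".join(words)

print(generate("writing"))
```

Nothing in that loop knows or cares what the words mean; it only records which word has tended to follow which, and that, in essence, is Warner’s objection.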

‘The result’, we are told of students playing with ChatGPT and the like, ‘is a kind of academic cosplay where you’ve dressed up a product in the trappings of an academic output, but the underlying process is entirely divorced from the genuine article.’

Writing, Warner says, is a process in which ‘the idea may change based on our attempts to capture it.’ That is certainly my experience: there have been times when I’ve started to bash out a piece not quite knowing if it will end up as a short blog based on one scatty thought or flower into a more polished full-length HEPI paper. Academics accustomed to peer review and the slow (tortuous?) procedures of academic journals surely know better than most that writing is a process.

The most interesting and persuasive part of the book (and Warner’s specialist subject) is the bit on how formulae make writing mundane rather than creative. Many parents will recognise this. It seems to me that children are being put off English in particular by being forced to follow the sort of overweening instructions that no great author ever considered (‘write your essay like a burger’, ‘include four paragraphs in each answer’, ‘follow PEE in each paragraph’ [point / evidence / explain]). Warner sees AI taking this trend to its logical and absurd conclusion where machines are doing the writing and the assessment – and ruining both.

Because writing is a process, Warner rejects even the popular idea that generative AI may be especially useful in crafting a first draft. He accepts it can produce ‘grammatically and syntactically sound writing … ahead of what most students can produce.’ But he also argues that the first draft is the most important draft ‘as it establishes the intention behind the expression.’ Again, I have sympathy with this. Full-length HEPI publications tend to go through multiple drafts, while also being subjected to peer review by HEPI’s Advisory Board and Trustees, yet the final published version invariably still closely resembles the first draft because that remains the original snapshot of the author’s take on the issue at hand. Warner concludes that AI ‘dazzles on first impression but … has significantly less utility than it may seem at first blush.’

One of the most interesting chapters compares and contrasts the rollout of ChatGPT with the old debates about the rise of calculators in schools. While calculators might mean mental arithmetic skills decline, they are generally empowering; similarly, ChatGPT appears to remove the need to undertake routine tasks oneself. But Warner condemns such analogies: for calculators ‘the labor of the machine is identical to the labor of a human’, whereas ‘Fetching tokens based on weighted probabilities is not the same process as what happens when humans write.’

At all the many events I go to on AI in higher education, three areas always come up: students’ AI use; what AI might mean for professional services; and how AI could change assessment and evaluation. The general conclusion across all three is that no one knows for sure what AI will mean, but Warner is as big a sceptic on AI and grading as he is on so much else. Because AI feedback is formulaic and driven by algorithms, Warner argues:

Generative AI being able to give that “good” feedback means that the feedback isn’t actually good. We should instead value that which is uniquely human. … Writing is meant to be read. Having something that cannot read generate responses to writing is wrong.

The argument that so many problems are coursing through education as a result of new tech reminds me a little of the argument common in the 1980s that lead pipes brought down the Roman Empire. Information is said to be corrupted by AI in the way that the water supply was supposedly contaminated by the lead channels. But the theory about lead pipes is no longer taken seriously, and I remain uncertain whether Warner’s take will survive the passage of time in its entirety either.

Moreover, Warner’s criticisms of the real-world impact of ChatGPT are scattergun. They range from the ‘literal army of precarious workers doing soul-killing tasks’ needed to support the new technology to its weighty environmental impact. This critique calls to mind middle-class drug-takers in the developed world enjoying their highs while dodging their habit’s real-world impact on developing countries.

In the end, Warner’s multifarious criticisms tot up to resemble an attack on technology that comes perhaps just a little too close for comfort to the Musicians’ Union’s attacks on synthesisers and drum machines in the early 1980s. In other words, the downsides may be exaggerated while the upsides might be downplayed.

Nonetheless, I was partially persuaded. The process of writing is exactly that: a process. Writing is not just mechanical. (The best young historian I taught in my first career as a school teacher, who is now an academic at UCL, had the worst handwriting imaginable as his brain moved faster than his hand / pen could manage.) So AI is unlikely to replace those who pen words for a living just yet.

Paradoxically, though, I also wished the author had run his text through an AI programme and asked it to knock out around 40% of the words. Perhaps current iterations of generative AI can’t write like a smart human or think like a smart human, but they might be able to edit like a smart human? Perhaps AI’s biggest contribution could come at the end of the writing process rather than the beginning? Technology speeds up all our lives, leaving less time for a leisurely read, and it seems to me that all those ‘one-idea’ books that the US floods the market with, including this one, could nearly always be significantly shorter without losing anything of substance.
