AI writing is a huge concern for writers. The WGA, currently on strike, has a ban on AI writing as one of its main demands. Indie authors are in a panic that in the near future AI-generated novels will crowd them out of the marketplace and they’ll no longer be able to sell books.
Why Listen To Me
I completed a Ph.D. in computer science, focusing on AI, and I’m a writer who has released five books. I don’t claim to be the best person in the world to comment on this, but I’m certainly better positioned than the vast majority of people who have already weighed in.
So, What Do I Think?
AI writing is a nothingburger novelty. Writers don’t need to worry about it. It’s comparable to being concerned about word processors, spell checkers, or grammar checkers. ChatGPT and its ilk are marginally useful tools that will get a little better than they currently are and will not displace skilled writers in any substantial way.
What If I’m Wrong?
I’m open to that possibility. If AI writing gets massively better every year and reaches bestselling-writer levels of performance, that would be the technological singularity. Writing at that level could be used to run companies and governments better than people can, massively impact the world, and render the future after that point unknowable.
Writers losing work would be the least of our worries in such a world. There would be far greater upheaval in all our lives.
But I’m Probably Right
Humans are ready to anthropomorphize anything. We think our pets have rich mental lives that they simply don’t. We think our cars have personalities. ELIZA, a chatbot created in 1966, would have “conversations” with people. Many of those people attributed human qualities to it and were convinced they were interacting with something conscious, rather than the relatively simple script it was: one that played games with grammatical rules to make it APPEAR that users were having a conversation with it.
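For the curious, here’s a minimal sketch (my own toy code, not Weizenbaum’s actual program) of the kind of pattern-matching and pronoun-swapping trick ELIZA relied on:

```python
# Toy ELIZA-style responder: reflect the user's own words back at them
# so the exchange merely APPEARS conversational.
import re

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(text: str) -> str:
    # Swap first-person words for second-person ones.
    return " ".join(REFLECTIONS.get(word, word) for word in text.lower().split())

def respond(user_input: str) -> str:
    match = re.match(r"i feel (.*)", user_input, re.IGNORECASE)
    if match:
        return f"Why do you feel {reflect(match.group(1))}?"
    match = re.match(r"i am (.*)", user_input, re.IGNORECASE)
    if match:
        return f"How long have you been {reflect(match.group(1))}?"
    return "Please, tell me more."  # fallback that keeps the "conversation" going

print(respond("I feel ignored by my editor"))
# -> Why do you feel ignored by your editor?
```

There is no understanding anywhere in a script like that, yet people happily read a mind into it.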
Before ChatGPT went mainstream, a similar tool acted as a dungeon master; it was built on an earlier version of the same underlying technology. People in the D&D community got excited about it, but when I used it, or read sessions other people had had with it, the users were doing a lot of the heavy lifting, giving the program the benefit of the doubt about what it wrote. It was like role-playing with a five-year-old as your dungeon master. It hasn’t gotten much better.
This is what’s happening with ChatGPT right now. It uses a statistical approach to produce something that RESEMBLES an answer, essay, or other form of writing. Humans read depth into this that doesn’t exist. The answers are often wrong and are always of low quality. When challenged on something that’s incorrect, rather than “changing its mind”, the program just revises its answer to the next most likely response. You can just as easily get it to revise its answer when it’s correct.
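To make “the next most likely response” concrete, here’s a toy sketch; the prompt, candidate answers, and probabilities are all invented (a real model scores tens of thousands of tokens, not four strings), but it shows the behaviour I mean:

```python
# Toy illustration of likelihood-based selection, NOT ChatGPT's real internals.
# Imagine the prompt is "The capital of France is ...".
continuations = {
    "Paris": 0.62,             # most statistically likely completion
    "Lyon": 0.21,              # runner-up to fall back on when challenged
    "a city in Texas": 0.09,
    "unknown": 0.08,
}

def most_likely(candidates, exclude=()):
    # Pick the highest-probability continuation that hasn't been rejected.
    remaining = {k: v for k, v in candidates.items() if k not in exclude}
    return max(remaining, key=remaining.get)

first = most_likely(continuations)
print("Answer:", first)  # -> Paris
# The user objects, so the "model" doesn't reconsider any facts;
# it simply slides down to the next most probable continuation.
print("Revised:", most_likely(continuations, exclude={first}))  # -> Lyon
```

Notice that nothing in that process checks whether the objection was justified, which is why it’s just as easy to argue the program out of a correct answer as an incorrect one.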
This makes it very good at producing amateur writing. Unfortunately, there isn’t a huge market for amateur writing. College student essays aren’t worth reading, beyond assigning a grade, and ChatGPT can produce a close approximation of that quality. Most novelists’ first books aren’t worth reading either; considerate writers make sure readers never see them. ChatGPT can produce this sort of work, but there’s already a horde of writers willing to do so cheaply or for free.
ChatGPT *WILL NOT* produce the next Harry Potter, Handmaid’s Tale, or A Song of Ice and Fire novel. It could imitate these, producing fan fiction, but writers have already been doing that and there isn’t much of a market for it. What we want from writers is to create works like these when nothing like them yet exists. ChatGPT isn’t much more likely to produce that than monkeys randomly pressing keys on a keyboard.
How Can Writers Use It?
Honestly, ChatGPT isn’t even that great as a tool for writers. Using it to help plot your story is like letting a Magic 8 Ball make narrative decisions. Maybe it’s a somewhat interesting gimmick, but it’s unlikely to produce a very engaging story.
ChatGPT is useful for producing low-quality, derivative writing. Inexpensively creating product descriptions or web copy is likely the best you can hope for. The text produced isn’t going to sell much or engage people browsing the web, so even this isn’t very valuable.
There *MIGHT* be a market for creating personalized stories for people with very specific tastes. Think of someone with a giantess fetish: something so particular that no one is writing about it (or the reader has exhausted everything available), and the reader is willing to accept very low-quality text. There are desperate writers already doing this, so ChatGPT’s only value proposition might be to create it for free instead of cheaply.
Human Writers Will Quickly Outstrip What AI Can Produce
Some writers will never produce anything better than what ChatGPT could, but these people never would have written things worth reading anyway. The majority of writers who stick with it will quickly improve and become more skilled than ChatGPT will ever get. I’d be amazed if a writer’s second book wasn’t noticeably better than anything ChatGPT could produce.
I *HOPE* the WGA just included their AI demand to have something to give in on. Anything else they can get is more valuable than a Hollywood AI ban. ChatGPT will never be a competitor for an industry writers’ room. Authors can similarly chill out. There isn’t anything to worry about here.