Why creative workers should go with our guts about ChatGPT

Defenders of CEOs often cite difficult-to-define attributes to justify their lavish remuneration.

It was chutzpah that empowered Elon Musk to make Tesla the world’s most valuable automaker. At the end of his storied career as CEO of General Electric, Jack Welch chose “Straight from the Gut” as the subtitle of his 2001 memoir. More recently, we’ve seen costly miscalculations when overmighty CEOs go with their guts.

It’s important for creative workers to heed the warnings of our guts as we respond to the exciting but disconcerting accomplishments of generative AI. The poetry, journalism, and artworks of generative AIs cause angst among creative workers because ours was the work that was supposed to be beyond the reach of the machines that easily automated the routines of clerical workers.

I’m a philosophy academic who is gobsmacked by the answers ChatGPT gives to questions that were once the mainstays of my course assessments. The fact that ChatGPT isn’t producing works of genius may relieve some philosophers. But my gut warns me that our advantage over the machines may be short-lived. If you’re impressed by the journalism of ChatGPT, which is based on GPT-3, wait until you read GPT-4’s op-eds. If you still have doubts, just wait for the commercial release of Google’s LaMDA chatbot, which is rumoured to outperform ChatGPT.

When creative workers go with our guts, we are appropriately suspicious of the reassurances of the people selling us these techs.

A transfer of wealth from the digitally clueless to the digitally savvy

Anxiety about ChatGPT’s philosophical reflections or Dall-E’s artworks is triggered by a sense of trespass on domains that were once humanity’s exclusive preserve. I propose a heuristic to guide our expectations of generative AIs:

THE GREATER THE ANGST PRODUCED BY A NEW DIGITAL TECH, THE GREATER ITS POTENTIAL TO TRANSFER WEALTH FROM THE DIGITALLY CLUELESS TO THE DIGITALLY SAVVY.

Creative workers don’t have to pretend to be experts on generative AI to understand that it is changing the rules in ways that could leave us worse off. Indeed, that’s part of the point. We should feel the same unease about generative AI that we feel when a confidence trickster invites us to play the shell game. We may think we can track the pea from shell to shell, but we should nevertheless expect to end up on the losing end of the transaction.

Microsoft’s recent multibillion-dollar investment in OpenAI, the company behind ChatGPT, suggests it expects hefty future profits. Some of those profits may be new money, but creative workers are right to be concerned that some will come out of the money now used to pay our wages.

Some creative workers are digitally informed and keen to make imaginative use of ChatGPT. But considered collectively, creative workers are less informed than the business interests now investing in generative AI.

The intrusion of AI into traditional ways of life suggests that we are about to be offered deals that we don’t fully understand. We should expect that these will enormously benefit those who offer them and leave the rest of us worse off.

Learning the lessons of Facebook and crypto

Recent history offers many occasions when we would have been better advised to listen more to our sense of unease and less to the overconfident affirmations of the people selling us novel techs.

Mark Zuckerberg now distances himself from the motto “move fast and break things” that guided Facebook’s initial expansion. The “breaking things” part of Zuckerberg’s commandment reflected Facebook’s willingness to change the ways we relate to each other and the ways businesses sell to us. That willingness was excellent for Facebook’s bottom line but not so good for democratic values: the same patterns in our data that proved so valuable to advertisers also helped to distort election outcomes.

It’s too easy to be wise after the fact. We now laugh at the digital cluelessness of 84-year-old US Senator Orrin Hatch, who in 2018 asked Mark Zuckerberg, “How do you sustain a business model in which users don’t pay for your service?” Zuckerberg’s answer, “Senator, we run ads”, seems obvious to us … now. But we are better prepared for the future if we remember our collective surprise that there could be that much money in displaying irritating ads that most of us try our hardest to ignore.

It’s obvious to us now that people paid too much for cryptocurrencies. The comparatively digitally clueless among us believed the savvy when they offered to trade our boring old dollars for a stake in the future of money. The marketing was excellent. A mysterious genius named Satoshi Nakamoto invented Bitcoin and pioneered blockchain, the technology on which it is built. Of course, those who’ve lost money are wise now. But if we are to learn from this, we should remember how easily far too many people believed the futuristic hype of crypto peddled by sellers who clearly had a better understanding of the potential and limitations of blockchain technology than we did.

What can creative workers learn from these failures of imagination? It’s better to voice doubts about Dogecoin when you’re considering buying than after the market has crashed and you’re trying to sell. The same goes for creative workers: while we still have jobs, we can influence how our employers engage with generative AIs. Now is a better time to forcefully express our unease about this intrusion than after we’ve lost our jobs.

  • Nicholas Agar is Professor of Ethics at the University of Waikato in Aotearoa New Zealand and the author of “How to be Human in the Digital Economy”. His book “Dialogues on Human Enhancement” is forthcoming with Routledge.

This article was originally published by the ABC on 31 Jan 2023 and is republished here with permission.