I’ve had Siri in my pocket for years. I have friends who get Alexa to play lullabies to help them sleep, and others who ask their Echo to “guard” their house by listening for unusual noises while they’re out of town. So when I heard about the new chatbots, I thought, “How different could they be?”
The answer? Different. Leaps-and-bounds different. When I finally took ChatGPT out for a spin — and tried Google Bard too — I got to see for myself what you may already have discovered. These are not task-oriented bots like Siri or Alexa or any other software that plays your favorite songs or finds stuff for you on the internet. ChatGPT, the new AI-powered Bing, YouChat, and their brethren are designed to be socially interactive and, more important, generative. They won’t play your favorite song; they’ll write you a song — from scratch. And they’ll do it in the time it takes you to read this sentence.
Think First Before Collaborating With a Bot
Anyone who does any writing at all, whether it’s songs or emails or entire books, is likely either thrilled at this development or quaking in their boots, wondering if their job is about to disappear. (This is one of the concerns of the screenwriters who are currently on strike.) If writing is challenging for you, you may think the playing field just got leveled. But it’s not that clear-cut. Before you ask a chatbot to assist you, there are some dicey issues you’d be wise to consider first.
Once I had tried these new bots, lots of questions bubbled to the surface. “If you get help from AI software on a story, does the software own the story or do you?” “Is your great idea still your idea?” “Do you have to tell a publisher that you had an AI collaborator?”
Who Owns the Book?
Just for fun, I posed these questions to the chatbots, but their answers were long and noncommittal. One bot advised, “If you have concerns about authorship or ownership of AI-generated content, it's advisable to consult a legal expert.” Pleased that AI still had respect for human opinion, I called prominent publishing attorney Jonathan Kirsch to see if he could shed light on my questions.
“We are at the very, very beginning of AI,” Kirsch said. “We are also at the very, very beginning of the law that governs AI.” He says that although the United States Copyright Office does not make actual laws, it has taken the position that AI-generated content does not belong to anyone, not to the bot that wrote it and not to the person who typed the initial idea into the bot. Instead, AI-generated content belongs to the public domain. That’s right, the place where long-dead writers’ books go. In other words, you can’t own that content, and anyone can use it.
How Do Publishers Feel About AI-Generated Content?
The situation is concerning for publishers; they can’t sell something that belongs to no one. Some publishing houses have already asked Kirsch to write clauses for their author contracts, essentially saying, “You are promising us that there’s no AI-generated content in the manuscript.” This promise doesn’t rely on the honor system. Software exists that can detect AI-generated content (much like plagiarism checkers), and Kirsch predicts that “some publishers, maybe many publishers, will do test runs on sample passages to see if there’s any AI in there, especially if they’ve gone to the trouble of putting in their contracts a requirement that there not be any AI.” Publishers don’t want your book if you collaborated with a bot — even on 10 percent of it.
There’s another danger in getting an assist from a chatbot. These bots don’t have their own knowledge. Their “brains” are filled with content created by humans — think the best and worst of the internet. The bots draw on this content to create their own. It’s unclear how much of what a bot writes is newly generated language and how much is pulled directly from something written by a human. If what that human wrote is copyright-protected, you may be infringing on the copyright of an actual human writer, all while thinking you’re using public domain material. If your book ends up containing “identifiable portions of the copyrighted work,” says Kirsch, you — or your publisher — may be sued for copyright infringement.
Chatbots can also provide inaccurate information — at times wildly inaccurate. There are even instances where chatbots flat-out lie. Designed to communicate like humans, they mimic our ways — all of our ways.
A Dubious Tool
Kirsch has another caution for writers, above and beyond the legal ramifications of using AI-generated content. He says, “It's bad practice for a writer to use artificial intelligence because it's announcing to the world that you don't have what it takes to write your own books or articles.” You’re offering up something that’s not unique to you, that doesn’t even belong to you. “It's a dubious tool,” he says. “It's not an answer for a struggling writer. It creates more problems than it solves.”
The truth is, however, people are already using AI to help them write. Is there any harm in using a chatbot in the preliminary stages of mapping out a book, such as brainstorming or brewing up an outline? Probably not, but if you do, be sure to take the helm when you begin the actual writing in order to avoid legal trouble. Besides, AI doesn’t have the writing chops of a seasoned author — and it shows. If you need help writing a book, an article, or a blog post, you’ll likely want to hire a professional — a human professional.