
Redefining authorship with AI writing tools

Who owns the intellectual property if you’re using AI to help write that article?

June 29, 2022

The writing industry has been around for centuries, and it has always been a tough business. Now, with AI tools like OpenAI’s GPT-3 that can produce coherent text, and others that can even write lines of code, technology is creating new realms of possibility.

While it’s unlikely that human writers will become obsolete anytime soon, this symbiotic relationship with AI tools has created a major grey area: who actually wrote it?

Apart from prompting a philosophical evaluation of what it means to be a writer, this is also an issue of authorship. Legalities aside, it’s a question that will affect how we assess creativity.

Blurring boundaries

The extent of AI assistance varies from tool to tool. Anyone with a smartphone will be familiar with auto-correct, auto-complete and auto-suggest functions.

For day-to-day communication, in emails or text messages, what matters is clarity – not flair. Telling your partner “I’m on the way home” gets the point across much more easily than “I’m embarking on the journey back to our humble dwellings.” 

In situations like this, where creativity may be more of a hindrance than a help, it might not matter so much who writes it and how. At the end of the day, the purpose is to transmit factual information accurately. And if you can sidestep miscommunication with help from an AI tool, by all means do it.

Things get a little more complicated when the AI gets “smarter”. Instead of merely predicting how a half-written sentence ends, some tools let you type a blurb, a summary or even a prompt, and generate full articles from your ideas.

This battle between idea and execution blurs the lines of authorship. It may have been your idea to begin with, but a robot wrote it and probably supplemented it with additional information that you might not have originally envisaged. Where does that leave you?

The problem with robots writing stories

Things get even messier in settings where originality and authenticity are important, such as journalism, academia and creative writing.

Say you’re writing an academic paper for a scientific journal where original research is required. The paper is evaluated not just in terms of original ideas, but also in the way it’s written. Here, the structure of the paper, the way the sentences flow and the choice of words are all integral parts of evaluating research. 

It raises the question: If an AI writes 30% of your paper for you, but you wrote the other 70%, is this plagiarism? Would it constitute cheating since you had some help?

Take another situation: you’re an author writing the next great novel on your computer. You type “Are you” and the AI tool auto-completes the line by suggesting it ends with “alright?”. If the machine accurately predicts your intentions, perhaps it doesn’t matter whether it’s “Are you alright?” or “Are you OK?”, since they both carry the same core meaning, albeit expressed differently.

But what if you typed “Are you” and the AI-generated completion was “who you want to be”? And what if, at that moment, you, the human writer, have an epiphany, decide that’s indeed an interesting plot line, and scrap your original story to make space for this new development? When that happens, whose idea was it?

Who owns the intellectual property for AI-generated content?

Imagine a future where we cannot attribute content to the original human writer. Does it absolve them of responsibility when it comes to something sensitive like hate speech? 

Some people argue that AI can never create anything truly original, as it’s trained on large bodies of text taken from the internet and/or written by actual people, and that, to a certain extent, it is merely a tool that supports the human creative process, like pen and paper.

But with the latest developments in technology, these programmes are no longer just tools; they can actually affect or dictate the decisions involved in the creative process.

Perhaps things might change in the future, but right now, the law still favours human authors.

Just last year, a federal judge in the US ruled that only human beings, not computers or AI machines, can be listed as inventors on patents. The same goes for Singapore, although “not infrequently, in cases involving a high degree of automation, there will be no original work produced for the simple reason that there are no identifiable human authors.”

Jevon Louis, Head of IP practice at Shook Lin and Bok, said on Click2View’s “The Content Show” [listen in its entirety above], “Right now, you have different jurisdictions that take slightly different approaches. But at the end of the day, they still require a human author to be named. How you derive who this person is has some differences across the jurisdictions.” 

In the US and Europe, the author is the one whose personality and creativity come through in the writing; they are the ones who have fact-checked and gone through the work. Meanwhile, in countries such as Hong Kong, India and the UK, the author is the one who enabled the AI to function and create the work, so this can be the programmer or anyone who did the training necessary to make the AI possible.

When it comes to legal issues, such as defamation, the ones who control and operate the AI are the ones who are responsible. “I suppose it falls on the individual or company overseeing the work. The ones who take the necessary steps to verify the information,” says Jevon.

He also says that anyone who uses AI writing tools should read the terms and conditions. “There are some open-source Software-as-a-Service (SaaS)-based AI tools, which may mean you need to make your work publicly available. This may not bode well for some, so make sure that the copyright is transferred and no residual rights sit with the tool provider.”

Speaking of copyright, protection for human authors usually lasts for 70 years after their death, meaning they and their estates can enjoy the fruits of their labour for two to three generations before the work enters the public domain. But what about AI-generated content?

This is a question that everyone is still grappling with. “A company generally owns the AI, so they can potentially own the piece indefinitely, defeating the purpose of the Copyright Act in the first place,” says Jevon.

What makes a writer?

The writer is slowly becoming more of an editor and publisher as the craft of writing is subject to a process of soft automation. (Can’t help thinking of Hemingway here, who reputedly said ‘write drunk, edit sober.’) But at the same time, we find it hard to say that AI tools have the ability to actually write.

Writing is a means of expression, and by extension that means you have something to express: emotions, feelings, thoughts. A non-sentient AI program has none of that. Trained on the writing of others, it can only achieve the illusion of ‘writing’ by reshuffling commonly used words into different combinations to form new sentences.

While we’re enthusiastic about these new tools’ ability to churn out SEO-friendly website copy, let’s not forget that the art of emotive, powerful, compelling writing lies in the human touch, a sentiment that Jevon agrees with.

“While it is true that AI can replace certain jobs, certain industries like the creative industry still need the human element to check the work, so AI is not foolproof yet. If we ever get to that point, that would be pretty impressive.” 

