Who owns the intellectual property if you’re using AI to help you write that article?
The writing industry has been around for centuries, and it has always been a tough business. Now, with AI tools like OpenAI’s GPT-3 that can produce coherent text, and others that can even write lines of code, technology is opening up new possibilities every day.
While human writers aren’t likely to become obsolete anytime soon, this symbiotic relationship we share with AI tools has created a major grey area: who actually wrote it?
Apart from prompting a philosophical evaluation of what it means to be a writer, this is an authorship issue. Legalities aside, it’s a question that will affect how we assess creativity.
Indeed, the extent of AI assistance varies from tool to tool. Anyone with a smartphone will be familiar with the auto-correct, auto-complete and auto-suggest functions.
For day-to-day communication, like in emails or text messages, what matters is clarity and not flair. Telling your spouse “I’m on the way home” gets the point across much more easily than “I’m embarking on the journey back to our humble dwellings.”
In situations like this, where creativity might be more of a hindrance than a help, perhaps it matters less who writes the message and how. At the end of the day, its purpose is to transmit factual information accurately. And if you can sidestep miscommunication caused by convoluted phrasing and grammatical errors with help from an AI tool, by all means do it.
Of course, things get a little more complicated when the AI gets “smarter”. Instead of merely predicting how a half-written sentence will end, some tools let you type a blurb, a summary or even just a prompt, and they will generate a full article around the ideas you want included.
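For the technically curious, here is a minimal sketch of what that prompt-to-article workflow looks like from the writer’s side, assuming the GPT-3-era openai Python package (versions before 1.0) and a purely illustrative prompt; it isn’t the setup of any specific product mentioned here.

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

# The human's entire contribution: a short blurb listing the ideas to cover.
prompt = (
    "Write a short blog post covering these points:\n"
    "- AI writing tools are becoming mainstream\n"
    "- Who owns the words they produce?\n"
    "- Human writers still matter\n"
)

response = openai.Completion.create(
    engine="text-davinci-002",  # a GPT-3 completion model of that era
    prompt=prompt,
    max_tokens=400,             # roughly a few paragraphs of output
    temperature=0.7,            # allow some creative variation
)

# Everything printed here was composed by the model, not the writer.
print(response.choices[0].text.strip())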
This tension between idea and execution blurs the lines of authorship. It may have been your idea to begin with, but a robot wrote it and probably supplemented it with additional information you might not have originally thought of. Where does that leave you?
This gets even messier in settings where originality and authenticity matter, such as journalism, academia and creative writing.
Say you’re writing an academic paper for a scientific journal where original research is required. The paper is evaluated not just on its original ideas, but also on how it’s written. Here, the structure of the paper, the way the sentences flow and the choice of words are all integral to how the research is judged.
That raises the questions: if an AI writes 30% of your paper for you, but you wrote the other 70%, is this plagiarism? Would it constitute cheating since you had some help?
Take another situation: you’re an author writing the next great classic novel on your computer. You type “Are you” and the AI tool auto-completes the line by suggesting it end with “alright?”. If the machine accurately predicts your intentions, perhaps it doesn’t matter whether it’s “Are you alright?” or “Are you OK?”, since they both carry the same core meaning, albeit expressed differently.
But what if you typed “Are you” and the AI-generated completion was “who you want to be”? And what if, at that moment, you, the human writer, have an epiphany and decide that this is indeed an interesting plot line to pursue, so you scrap your original story to make space for the new development? When that happens, whose idea was it?
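To see how open-ended these completions can be, here is a minimal sketch using the freely available GPT-2 model through Hugging Face’s transformers library; it illustrates next-word prediction in general, not any particular writing assistant.

from transformers import pipeline, set_seed

# Load a small, openly available language model for demonstration.
generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampled output repeatable

# Ask the model to continue the half-written line in a few different ways.
completions = generator(
    "Are you",
    max_length=10,           # keep the continuations short
    num_return_sequences=3,  # three alternative endings
    do_sample=True,          # sample, so the endings differ
)

for completion in completions:
    print(completion["generated_text"])

# The model is only predicting likely next words; whether a continuation
# reads as a throwaway question or a plot twist is up to the human reader.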
Imagine a future where we cannot attribute content to the original human writer. Does it absolve them of responsibility when it comes to something sensitive like hate speech?
Some people argue that AI can never create anything truly original, since these programmes are trained on large bodies of text taken from the internet, written by actual people. And to a certain extent, that is what they are: mere tools that support the human creative process, like pen and paper.
But with the latest developments in technology, these programmes are no longer just tools; they can actually affect or even dictate the decisions involved in the creative process.
Perhaps things will change in the future, but right now, the law still favours humans.
Just last year, a federal judge in the US ruled that only human beings, not computers or AI machines, can be listed as inventors on patents. The same goes for Singapore, where “not infrequently, in cases involving a high degree of automation, there will be no original work produced for the simple reason that there are no identifiable human authors.”
It seems the writer is slowly becoming more of an editor and publisher as the actual craft of writing undergoes a kind of soft automation. But at the same time, it’s hard to say that AI tools can actually write.
Writing is a means of expression, and by extension that means you have something to express: emotions, feelings, thoughts. A non-sentient AI program has none of that. Trained on the work of other writers, it only achieves the illusion of ‘writing’ by reshuffling commonly used words into new combinations to form new sentences.
While we’re enthusiastic about using these new tools to churn out SEO-friendly website copy, let’s not forget: the art of writing, the kind that is emotive, powerful and compelling, lies in the human touch.
Ever wondered what it’s like to use one of these AI writing tools? Not too long ago, our Head of Editorial, David, tested out Hypotenuse AI. He challenged it to write his life story based on a few inputs and key points. This is how it turned out.
Sign up to our newsletter for a weekly update on the latest content marketing news. Don’t forget to subscribe to our YouTube channel too!
Click2View is Southeast Asia’s premier full-service independent B2B content marketing agency, servicing clients like Microsoft, Google, Visa, Prudential, and the Lee Kuan Yew School of Public Policy.