News  |  November 14, 2019

Should AI-Written News Stories Have Bylines? Whose?

News article by Brendan Dixon, published in Mind Matters.

Excerpt:

Consider OpenAI’s GPT-2 text generation AI. OpenAI claims GPT-2 can create “coherent paragraphs of text” (though what we’ve seen stretches the meaning of “coherent”). GPT-2 also raises a question: If a writer uses AI to “write” an article, or if an article is written entirely by an AI system, what should be the byline?

When I read a piece created by another human, I am engaging with another mind. When I read a piece “authored” by an AI, however, I’m engaging with an algorithm. The human may ponder, evaluate, weigh, and rewrite. (I sure do.) An AI spews: Given this input, under these conditions, this is the output. Period.
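The fixed input-to-output behavior Dixon describes is easy to see in practice. Below is a minimal sketch, assuming the Hugging Face transformers library and the public "gpt2" checkpoint (neither is named in the article): with greedy decoding, the same prompt always yields the same continuation.

    # Minimal sketch: GPT-2 generation as a fixed mapping from input to output.
    # Assumes "pip install transformers torch"; this setup is an illustration,
    # not anything specified in the article itself.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    prompt = "Should AI-written news stories have bylines?"
    inputs = tokenizer(prompt, return_tensors="pt")

    # do_sample=False selects greedy decoding, so the output is deterministic:
    # run this twice with the same prompt and you get the same paragraph.
    output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=False)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Sampling with a temperature would vary the wording from run to run, but the point stands: the output is a function of the input and the decoding settings, with no pondering in between.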

AI can be useful in writing. Consider an AI working from a mass of field reports and “writing” an overview of very complex events. Or a health writer who must summarize a huge batch of academic papers to determine what coherent message can be gleaned from them about a controversial issue in cancer treatment or vaccination. These uses of machine analysis benefit us by augmenting what we can do — like every other tool we use. There will always be problems with dishonest uses such as AI essays, deepfakes, or other artifacts that are meant to deceive. But honest practice also raises some issues we need to think about [ . . . ]