By Matt Sutton
A news-writing robot created by Automated Insights is estimated to “write” one billion stories in 2014.
But most news outlets publishing this robot-made content are so far choosing not to inform readers about their use of the software—a decision that contradicts one of journalism’s most valuable ethical principles: transparency.
In 2013, the Automated Insights software produced 350 million news stories, according to company founder Robbie Allen, as reported in a recent Poynter article by the site's digital media fellow, Sam Kirkland.
Automated Insights, a software company based in Durham, N.C., has created what it calls a “wordsmith platform” that writes “insightful, personal reports from your data. It’s like an expert talking with each user in plain English.” In essence, the software takes raw data and places it neatly into a narrative framework that readers can easily understand.
The Los Angeles Times used similar software last month to publish an article about a 2.7-magnitude aftershock that struck roughly an hour after a 4.4-magnitude earthquake hit Los Angeles. At least 10 similar earthquake articles have been published this month. In each, the final paragraph states, “This post was created by an algorithm written by the author.”
The algorithm referred to is known as “quakebot.” The software is programmed to receive reports of earthquakes from the U.S. Geological Survey; it then places all relevant information about the event into a story template, which is reviewed by the author (in this case, Ken Schwencke) and published within a matter of minutes.
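The template-filling step described above can be sketched in a few lines of code. This is a minimal illustration only, assuming hypothetical field names and wording; it is not the actual quakebot code or the real format of the USGS feed.

```python
# Sketch of an earthquake-story template filler.
# All field names and the template wording are illustrative assumptions,
# not taken from quakebot or the USGS data feed.

QUAKE_TEMPLATE = (
    "A magnitude {mag:.1f} earthquake was reported {when} "
    "{dist} miles from {place}, according to the U.S. Geological Survey."
)

def draft_quake_story(report):
    """Fill the story template from a parsed earthquake report (a dict)."""
    return QUAKE_TEMPLATE.format(
        mag=report["magnitude"],
        when=report["time_phrase"],
        dist=report["distance_miles"],
        place=report["nearest_city"],
    )

# Example: a hypothetical report resembling the March aftershock
story = draft_quake_story({
    "magnitude": 2.7,
    "time_phrase": "Monday morning",
    "distance_miles": 2,
    "nearest_city": "Westwood, California",
})
print(story)
```

The point of the design is speed: once the feed is parsed, a draft exists in seconds, and the human journalist's role shifts from writing to reviewing before publication.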
Other news websites, however, seem to feel that it is not necessary to include such identifying information about their robotically generated material.
News organizations listed on the Automated Insights website as “partners and customers” include Yahoo, MSN, WRAL News (Raleigh, N.C.) and USA Today. No Canadian outlets appear on the list.
This means that most regular readers of online news have likely at least skimmed an artificially generated story without being told how it was produced.
There is no doubt that this software is a useful advancement for the online journalism community, one that will save reporters and editors the tedious task of analyzing endless amounts of data.
But is it ethical for news outlets not to tell their readers that the content they are reading was not created by the hand of a human journalist?
The technology is so new that even journalism educators can’t agree. David Swick, a professor of journalism ethics at the University of King’s College, in Halifax, said no. “Transparency is the key to journalism ethics,” said Swick. “Want your readers to trust you? Tell them the truth.”
However, Tim Currie, a member of the Canadian Association of Journalists’ national ethics advisory committee and a professor of online journalism at King’s, disagreed. “I don’t think ethics would require notifying a reader a story was compiled by an algorithm,” said Currie. “But readers might be interested in knowing it.”
As a second-year journalism student currently learning about ethics and law in the media, I agree with Swick’s assessment.
Ethical journalism is one of the few things keeping professional journalists a step ahead of the bloggers and citizen journalists running rampant with the 21st-century ability to share any information they choose. Without transparency, ethical journalism begins to fall apart, and the lines separating professional and citizen journalists blur.
Journalism is journalism, and we need to report responsibly and transparently regardless of what an article is about or how it was produced. If exceptions start being made for certain aspects of journalism, such as automated reporting, then other rules and codes will become less important as well.
One defence Allen gave Poynter is that news outlets might not want to disclose that a robot created certain content because readers would be more critical of it and would go looking for flaws in the software.
On April 8, an L.A. Times quakebot article drew a critical comment from a reader who wrote, “This post was created by an algorithm written by the author. So it takes an algorithm to write/post this article….wow….”
However, Swick said that this is not a valid excuse. “We’re journalists, we know that stuff always comes out,” said Swick. “How long is it really going to stay a secret for?”
As this modern journalistic tool grows in popularity, it will inevitably cross borders and be adopted by journalists worldwide. At that point, readers everywhere will learn about it and wonder why they weren’t told sooner.
The best course for news outlets withholding this information is to tell their readers exactly what they are doing: what the technology does, how they are using it and for what purpose.
It is not that this technology is a bad thing. The software is not being used to manipulate news or deceive readers, and it is not a threat to any journalism jobs. It is simply a tool meant to speed up the process of getting the facts out to the public as soon as possible.
But keeping the methods of journalism from readers is unethical, and saying that it’s because it might make readers uncomfortable or overly critical is not an excuse.
“Any time a journalism outlet or a journalist is doing something and they’re not absolutely sure that the reader would like it, they should tell the reader, let them decide, treat them like a grown up,” said Swick.
It really comes down to a question of whether news websites are more concerned with readers not trusting their methods or with readers not trusting their news.
Methods can be proven, but discovering that a news outlet has withheld something as essential as the “author” of the news could break trust and send readers elsewhere for their news.
“All we’ve got is the trust of the reader,” said Swick. “Without trust, why would they buy us?”
In an already struggling industry, it does not make sense for any journalist to do anything that could damage their reputation or their business. News outlets need to realize this now, before the technology goes mainstream, and start being transparent with their readers.
Calgary Journal reporter Matt Sutton is a second-year journalism student at Mount Royal University in Calgary.