By Casey Bukro
Can machines learn ethics?
The Associated Press already uses an automated platform capable of producing up to 2,000 stories a second. This is especially handy when companies issue quarterly earnings reports, coverage that can be drudgery for a human reporter scanning them for meaningful numbers and statistics.
The robotic journalist crunches those numbers in seconds and spits them out in readable form, not in Pulitzer Prize-winning style but adequate.
Robo-reporting is especially handy for business and sports stories heavy on numbers and scores.
Northwestern University was among the pioneers in using machine learning, or pattern-recognition software, to assemble the basics of a news report. A 2009 student project created software that wrote a headline and story from a baseball game’s box score. In 2010, two NU professors started a Chicago company, Narrative Science, to find commercial uses for the technique.
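The core idea behind such systems is simpler than it sounds: structured data goes in, templated prose comes out, with the "news judgment" encoded as rules. Here is a minimal sketch of that approach; the field names, thresholds, and phrasing are illustrative assumptions, not the actual Northwestern or Narrative Science software.

```python
# Minimal sketch of template-based story generation from a baseball box score.
# Illustrative only: data shape, verb thresholds, and wording are assumptions.

def write_game_story(box):
    """Turn a box score (a dict, assumed shape) into a headline and lede."""
    home, away = box["home"], box["away"]
    if home["runs"] > away["runs"]:
        winner, loser = home, away
    else:
        winner, loser = away, home
    margin = winner["runs"] - loser["runs"]
    # Crude news judgment, encoded as a rule: a close game is "edged,"
    # a lopsided one is "routed."
    verb = "edge" if margin <= 2 else "rout"
    headline = f"{winner['name']} {verb} {loser['name']} {winner['runs']}-{loser['runs']}"
    lede = (f"{winner['name']} defeated {loser['name']} "
            f"{winner['runs']}-{loser['runs']} on {box['date']}.")
    return headline, lede

box = {
    "date": "June 5",
    "home": {"name": "Cubs", "runs": 7},
    "away": {"name": "Sox", "runs": 2},
}
headline, lede = write_game_story(box)
```

No Pulitzer Prize here either, but the sketch shows why scores-and-numbers stories were the natural starting point: the facts arrive pre-structured, and the prose is formulaic.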
Stories written by robots have a lot of potential for the news business, and a few issues that need to be hammered out. Like ethics.
Computers, for example, could become plagiarists.
“Just because the information you scrape off the Internet may be accurate, it doesn’t necessarily mean that you have the right to integrate it into the automated stories that you’re creating — at least without credit and permission,” said Tom Kent, Associated Press standards editor, in a Digital Journal article, which cited comments Kent made to the University of Wisconsin Center for Journalism Ethics.
“I think the most pressing ethical concern is teaching algorithms how to assess data and how to organize it for the human eye and the human mind,” said Kent. “If you’re creating a series of financial reports, you might program the algorithm to lead with earnings per share. You might program it to lead with total sales or lead with net income. But all of those decisions are subject — as any journalistic decision is — to criticism.”
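Kent's point, that the lead of an automated story is an editorial decision baked into the program, can be made concrete. In a sketch like the following (the field names, templates, and configuration knob are assumptions for illustration), the same earnings data yields a different story depending on one setting, and that setting is a news judgment open to criticism like any other.

```python
# Sketch of Kent's point: which number leads an automated earnings story
# is an editorial choice encoded in configuration. All names and phrasing
# here are illustrative assumptions.

LEAD_METRIC = "eps"  # editor's choice: "eps", "revenue", or "net_income"

TEMPLATES = {
    "eps": "{company} reported earnings of {eps} per share.",
    "revenue": "{company} posted total sales of {revenue}.",
    "net_income": "{company} earned net income of {net_income}.",
}

def earnings_lede(report, lead_metric=LEAD_METRIC):
    # Same data, different lede, depending on this one parameter.
    return TEMPLATES[lead_metric].format(**report)

report = {
    "company": "Acme Corp",
    "eps": "$1.42",
    "revenue": "$3.1 billion",
    "net_income": "$410 million",
}
```

Changing `LEAD_METRIC` changes what readers are told matters most, which is exactly the kind of decision Kent says deserves the same scrutiny as any human editor's call.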
News judgment and organization are ethical questions for human and robot journalists alike, Kent pointed out in the Digital Journal article.
“Everyone has a different idea about what fair reporting is,” said Kent. “The important thing is that you devote to your news decisions on automated news the same amount of effort you devote to your ethics and objectivity decisions at any other kind of news.”
Kent has written an ethics checklist for robot journalism.
The Society of Professional Journalists code of ethics has a few tenets that do not compute. For example, the code says, “Seek truth and report it.” A robot would be truly wise if it could separate truth from lies, especially in presidential debates.
And how about this one: “Avoid conflicts of interest, real or perceived.” And: “Never plagiarize.” Or: “Avoid pandering to lurid curiosity.”
And what about salacious, sexist or offensive comments that are in poor taste? Can a robot keep up with shifting cultural etiquette or unacceptable words?
The technology is considered a look into the future: not only into how automation is likely to change the way journalism operates, but also into its likely impact on jobs in a profession already hard hit by staff cuts.
“Computers are not taking journalists’ jobs – not yet, at any rate,” The Verge reported. “Instead, they’re freeing up writers to think more critically about the bigger picture.” The goal is to write smarter and more interesting stories.
At least that is the justification often given for robotization in journalism, always with the hope that it will not turn into an invasion of the job-snatchers.
Have a question about journalism ethics? Call 866-DILEMMA. Or go to ethicsadvicelineforjournalists.org and submit a question online.