When New York Times critic Carol Vogel previewed an artist’s retrospective, readers were quick to question her report.
By Stephen Rynkiewicz
Renaissance artists might have struggled with the idea of plagiarism. Florentine salons respected tradition and uniformity, and apprentices in Piero di Cosimo’s studio learned by imitating the master. National Gallery of Art curator Gretchen Hirschauer told New York Times critic Carol Vogel that Piero’s work entered American collections partly by accident. It was attributed to other artists.
But the concept of plagiarism has evolved. When Vogel previewed Hirschauer’s retrospective of Piero’s work, a few readers were quick to question her report. It started with a list of Piero’s peculiarities, citing his contemporary Giorgio Vasari, who’s still studied in paperback. But the wording was close to an even more common source, Wikipedia. The print passage has been shortened online, and ombudsman Margaret Sullivan suggests Times editors might take further steps if a pattern emerges.
The word plagiarism first appears during the Reformation. The Random House Dictionary defines plagiarize as “to use the words or ideas of another person as if they were your own words or ideas.” Universities have moved beyond the Renaissance academy, with rules against copying and paraphrasing. The Society of Professional Journalists ethics code simply says, “Never plagiarize.”
Yet the practice continues. Evidence of plagiarism in Sen. John Walsh’s Army War College research puts him under pressure to withdraw from the November election. Repeated instances on the website BuzzFeed got a producer fired last month. And delegates to SPJ’s 2014 convention will consider adding another ethics directive: “Always attribute.”
Why this plague of plagiarism? In the modern age, it’s just so much easier to copy. It doesn’t take years of apprenticeship, just a few cut-and-paste keystrokes. Technology raises the risk of not only becoming a plagiarist, but also being uncovered as one: Sources found on the web can be compared in a web search too.
Dropping source material into a draft is dangerous, even if the intention is to recast it. A disciplined writer might copy and paste for accuracy, with proper names or direct quotes. But even that’s risky. A good place to start is with primary sources — recordings, transcripts, statements — and then to refer to them sparingly.
Attribution is more of a gray area in news media, if only because they use wire services so widely. Associated Press subscribers agree to pool their reports, generally without attribution once the original report is in print.
The blogosphere has clouded the standards further. Websites flirt with plagiarism when they relay the essence of an exclusive elsewhere, in a way that robs the source of any chance to profit from its beat. The Chicago Tribune frequently gives reminders to its online news editors (I was one) on how to cite other publications: Name and link to the source, don’t quote directly, keep the report to a sentence or two, and confirm it as soon as possible.
Competition encourages journalists not only to confirm a story but to move it forward. Once their own report is in place, it also makes them stingy about giving credit to the newsroom that put it out first. But standards are evolving here too. Web timestamps make it more apparent who scored the beat, and that may be making everyone more generous at giving due credit.
If rules on fair play are shifting, the Renaissance atelier may be where to look for direction. When they knew enough to transform their material, apprentices became journeymen and started their own studios. When journalists bring craft and intelligence to their work, they too become artists.
Submit your question to the Ethics Adviceline for Journalists.