With the relentless pressure to move quickly in the era of the Twitter-fueled 24/7 news cycle, it’s probably not surprising that there are so many high-profile journalism errors. All the more reason to double- and triple-check before pulling the trigger.

Fri., November 30, 2012
Senior contributing writer Paul Farhi is a reporter for the Washington Post.
In journalism, as in real life, stuff happens. It happened to Ben Smith on March 22, 2007. That morning, Smith, then a crack reporter and blogger at Politico, got a dynamite tip: John Edwards, the 2004 Democratic vice presidential nominee, would be announcing the suspension of his campaign for the 2008 presidential nomination at a press conference that afternoon. The decision, Smith's source said, was precipitated by his wife's health. Elizabeth Edwards' cancer had recurred.
Smith knew and trusted his source; as he wrote later, the source spoke with "authority and detail" about Elizabeth Edwards' condition. But just one source? With Edwards' press conference mere hours away, Smith consulted his editors. Go with it, they decided.
Politico, just two months old, put Smith's story on its home page under a bold, declarative headline: "Edwards to Suspend Campaign." The scoop predictably lit up the media landscape. The Drudge Report headlined it with a siren. Smith did three radio interviews. TV stories followed.
Many reporters know the weightless, slightly sickening feeling of what came next. It's like the moment when the cartoon character, having run full throttle off the edge of the cliff into midair, realizes he is about to plummet to the canyon floor.
At the ensuing press conference, Edwards indeed announced that his wife's cancer had returned. But he also said her condition wasn't sufficiently dire to suspend his campaign. Contrary to Smith's story, he was staying in. "It was a really awful moment, personally," recalls Smith, who went on to become editor of BuzzFeed. Reflecting on his mistake, he adds, "As with so many moments in reporting, I don't take a single clear lesson from this. But credibility is the coin of the realm, and I could feel the hit mine took with this."
He's right about being wrong, of course. In a perfect world, we'd all report with unerring accuracy, quote with symphonic fidelity and write with the grammar, syntax and spelling of an Oxford English lit professor. But it doesn't work like that. Mistakes happen — and they hurt not just an individual reporter and his or her publication but the media's reputation as a whole. Read on; I speak from experience.
Public trust in the news media has been falling for decades, and inaccuracies — perceived or otherwise — are a big part of the reason. Just 25 percent of those surveyed by the Pew Research Center last year said news organizations generally get their facts straight; 66 percent said stories are "often inaccurate." Only four years earlier, 39 percent viewed the media as mostly accurate; 53 percent said the opposite.
Mistakes have been a part of journalism since long before "Dewey Defeats Truman." But even with better and faster tools to check information, it's hard to argue that accuracy is improving. Deadline pressure in the era of the 24/7 news cycle is relentless, and many reporting staffs are smaller than in the past. Plus, there are fewer safety nets — editors and copy editors — to catch reporters when they misstep. News organizations are keen to fact-check the statements of politicians, but they might consider putting their own houses in better order as well.
In fact, this may be a Golden Age of non-facts, the Era of Error. In one of the largest and most comprehensive studies of journalistic inaccuracy, academics Scott Maier and Philip Meyer found that reporting errors were at their highest level in the 70 years such research has been conducted. Maier and Meyer went right to the source, or rather the sources, to draw this conclusion. They asked some 4,800 sources cited in 400 stories carried by 14 newspapers whether the stories about them were accurate. Answer: Not very often. The sources reported errors in 61 percent of news and feature stories, the highest defect rate since studies of this kind began in the 1930s.
The reported inaccuracies ranged from the relatively trivial and objective kind — misspellings, incorrect ages, titles and dates, etc. — to the more profound and subjective, such as misleading or distorted quotes, the omission of information or the overplaying of inconsequential facts (read: hype). Maier and Meyer went further, asking the sources for their perception of the stories and the newspapers' credibility. Unsurprisingly, the ratings of both declined as the number and severity of errors rose. What's more, the greater the error rate, the less likely sources said they were willing to cooperate with the newspaper again. In other words, errors not only hurt the newspaper's reputation, they damaged the media's working relationship with the very people they cover.
The even worse news about this bad news is that the research was conducted in 2002 and 2003, with the results published in 2005. While the researchers haven't revisited the topic since then, there isn't much reason to be hopeful, says Maier, a former reporter who is an associate professor of journalism at the University of Oregon. Today, reporters are busy tweeting and blogging and sometimes shooting video in addition to reporting stories. "Reporters are doing more than ever before, and editors are asked to do more, too," Maier says. Journalists "just don't have the same scrutiny and time. It's not surprising that you see errors creeping into copy more and more."
It's not hard to find a journalist who, like Ben Smith, has made an error that has been seared into his or her consciousness. Sometimes the snapshots from journalism's vast Mistaken Nation are appalling, depressing or hilarious, and sometimes all three at once. Joel Achenbach, a veteran Washington Post writer, once wrote a feature story for the newspaper about the National Spelling Bee in which he misspelled four words (the Post's droll correction of the story said the paper "feels reasonably sure that the rest of the words in the story, or at least the vast majority of them, were correctly spelled"). Another time, Achenbach miscalculated the number of hours of daylight on the shortest day of the year. "I must have gotten 100 e-mails from people who were proud of their ability to add and subtract," he says.
Trip Gabriel of the New York Times recalls a threefer from a single campaign trail dispatch last year. His story misspelled Tammy Faye Bakker's last name ("Baker"), misspelled Hillary Clinton's first name ("Hilary") and reported the wrong call letters for a TV station. "I should have done better," says Gabriel, "especially since I later wrote a piece about [Rep. Michele] Bachmann's penchant for playing fast and loose with facts," most infamously with her assertion that a vaccine could cause mental retardation. Comments Gabriel: "Count me as one of those who believes that you lose a few atoms of reader trust, individually and collectively as the Times, whenever you get even the small stuff like names wrong."
With all-the-time deadlines and a win-the-traffic scoop culture, speed kills, or at least injures, journalistic accuracy. CNN's and Fox News' race to call the U.S. Supreme Court's decision on health care reform in June led to headline-making gaffes.
Premature — that is, erroneous — news reports about the deaths of well-known people are practically a subcategory of their own, echoing Mark Twain's long-ago quip about the stories of his own demise. Media outlets in the Detroit area reported the death of famed boxing trainer Emanuel Steward in late October; Steward was ailing, but was, in fact, alive at the time. The "news" that disgraced Penn State football coach Joe Paterno had died traveled far and wide in January at least half a day before it was true. The most notorious of these false reports may have been the "death" of then-Arizona Rep. Gabrielle Giffords in a mass shooting in Tucson in early 2011. As Craig Silverman documented on his Regret the Error blog, NPR was the first to report incorrectly on Giffords' death, followed by Reuters, CNN, Fox News, The Huffington Post, the New York Times and PBS' "NewsHour."
Breaking news, naturally, is the kind most susceptible to being misreported. And thanks to the Web and Twitter, everyone can see news being reported, sometimes badly, in real time. The "fact" that the New York Stock Exchange had been flooded by Hurricane Sandy in late October was retweeted hundreds of times, including by many news organizations. The tweet rolled from Twitter feed to Twitter feed like an onrushing snowball, each time gaining credibility based on the say-so of the previous tweeter. The problem: None of the retweeters had checked the original source of the story. If they had, they would have learned that the story was a fabrication, one of several launched during the storm by a malicious individual who went by the nom-de-Twitter ComfortablySmug.
It pays to keep in mind what NPR's then-executive editor, Dick Meyer, wrote in a public apology for the Giffords mistake: "Already all of us at NPR News have been reminded of the challenges and professional responsibilities of reporting on fast-breaking news at a time and in an environment where information and misinformation move at light speed."
While traditional correction boxes haven't disappeared from newspapers, the Internet has revolutionized old notions of correcting mistakes, and not always in a good way. The ability to repost breaking material quickly means that there often are no acknowledgments of error in real time, just a succession of fixes.
In September, for example, both the Wall Street Journal and the Associated Press initially reported that the mysterious figure behind the controversial "Innocence of Muslims" video was an Israeli named Sam Bacile, whose financial backers included about 100 American Jews. Within hours, both organizations filed updates that eventually revealed that "Bacile" was a pseudonym, that there was no Jewish involvement and that the key figures behind the video were Coptic Christians. None of these subsequent reports acknowledged that the initial stories had been inaccurate. That information appeared a day later when the Journal and the AP published brief, freestanding corrections.
There's little doubt that digital technology has revolutionized the traditional corrections box. Fixing an error, and noting the correction at the top or bottom of the story, are faster and easier to do than ever. Technology also permits corrections to be crowdsourced. Many eagle-eyed digital readers are quick to flag problems when they spot them. On Twitter, where much misinformation surfaces, debunking it has become a cottage industry. A number of news organizations, including my own, have "report corrections" buttons that make it painless for those readers to alert an editor. The days when reader phone calls were passed from desk to desk by indifferent news aides would seem to be over.
Even so, it's not clear that it's any easier to get a correction now than it was in the hot-type era. In a 2007 study of 10 daily newspapers, Maier found that journalists are as reluctant as ever to acknowledge they were wrong. News sources noticed lots of errors in the newspapers' stories, he found, but they rarely complained about them. When they did, the response was about the same as if they hadn't bothered. Of the 130 cases in which a source informed the newspaper about an alleged error, the newspapers ran a correction only four times.
Then there are the mistakes that are so large, so pervasive, they may not be noticed as mistakes at all until long afterward. Several newspapers, including my own, ran probing after-the-fact stories about how they covered, often mistakenly or misleadingly, the run-up to the war in Iraq in 2002 and early 2003. The pithiest "clarification" of all time may have been the 2004 mea culpa by Kentucky's Lexington Herald-Leader: "It has come to the editor's attention that the Herald-Leader neglected to cover the civil rights movement. We regret the omission."
A more recent example may be the media's coverage of the federal stimulus law, the $831 billion package of tax cuts, extended unemployment benefits and infrastructure investment enacted just after President Barack Obama took office in early 2009. As Time magazine correspondent Michael Grunwald demonstrates in his recently published book, "The New New Deal: The Hidden Story of Change in the Obama Era," media reports about the stimulus' impact were mostly skeptical and often quite downbeat. Grunwald argues that the coverage missed what has been a positive, even transformative story: The stimulus saved jobs, helped the economy at a time when it was in free fall and jumpstarted important changes in health care, transportation and energy development.
"The national media wrote about it as if it was always negative, with very few exceptions," Grunwald says. "There was a tendency to do he-said, she-said stories that were [politicized]. Republicans were very shrewd and disciplined [in their criticism]. They had a very unified message about levitating trains to Disneyland and what a big-government, big-spending mess this was."
Grunwald thinks Washington groupthink played a role in how reporters characterized the stimulus. Some of the media response was colored by timing, he says; the stimulus was enacted several months after passage of the widely unpopular bank bailout bill during the last days of the Bush administration. The projects funded by the stimulus law were also geographically dispersed and designed for the long term, he says, defeating instant assessments of their impact.
"Every paper that looked at it had a Pulitzer in its eyes," Grunwald says. "You don't get a Pulitzer for investigations that don't uncover wrongdoing. So everyone was primed to run 'gotcha' stories.... I'm familiar with the ethic that if you don't have anything nice to say, put it in the paper. But there was something about this big government spending at a time when people were hurting that turned reporters into runaway prosecutors."
The danger is that this first very rough draft of history can calcify into something like conventional wisdom. As it happens, some of the greatest myths perpetrated by the media are about the media itself. One popular notion is that CBS News anchorman Edward R. Murrow's famous "See It Now" broadcasts in the 1950s "brought down" Sen. Joe McCarthy, the communist-hunting demagogue. Another is that President Lyndon B. Johnson concluded that he could not win reelection after another legendary CBS anchorman, Walter Cronkite, declared in 1968 that the Vietnam War was "mired in stalemate." Still another is that the press, particularly Bob Woodward and Carl Bernstein of the Washington Post, drove President Richard M. Nixon from office in 1974 with a relentless stream of revelations about the Watergate scandal (see "Watergate Revisited," Fall 2012).
These tales have been told so many times (often by journalists) that they seem true. But each is a case study in what satirist Stephen Colbert labeled "truthiness" — a statement having an emotional foundation or the ring of truth but with little or no evidence to support it. Each of these myths is dismantled in W. Joseph Campbell's 2010 book, "Getting It Wrong: Ten of the Greatest Misreported Stories in American Journalism," which demonstrates the self-reinforcing power of mistakes.
Each media-driven myth, says Campbell, a professor at American University in Washington, D.C., shares a common DNA. "It has to be simple and plausible," he says. "It has to condense complicated events into a story that's easy to grasp." The stories involving Cronkite, Murrow and Watergate all share another trait as well: They cast journalists as the hero of the story, a self-flattering and perhaps irresistible narrative for reporters.
In fact, as Campbell and others have pointed out, Murrow's critical reports about McCarthy in 1954 came long after other journalists had exposed him and his own party had turned against him (the Murrow-McCarthy confrontation was probably more important as a TV event than as a political one, validating the fledgling medium as a serious source of news and commentary). In Cronkite's case, there's no evidence that LBJ ever saw the "stalemate" broadcast or that he responded to it with the purported comment: "If I've lost Cronkite, I've lost Middle America," or words to that effect. As for Woodward and Bernstein's supposedly central role in Nixon's demise, this would only be true if one ignored the contributions of Senate hearings, judicial rulings, grand jury indictments, FBI investigations, the testimony of White House aides and the White House taping system. Woodward himself disclaimed such heroic status, calling it "horseshit."
Memory can be an unreliable substitute for facts, Campbell says. Popular culture has a way of clouding, and even overtaking, facts, too. This may be the case with Watergate and with Murrow. The movie "All the President's Men" placed the reporters at the center of its plot (in a way that Woodward and Bernstein's bestseller never did), probably sealing the journalist-as-hero myth in the popular imagination. So, too, did George Clooney's 2005 movie, "Good Night, and Good Luck," in which the Murrow-McCarthy confrontation was dramatized, shorn of most of its larger context.
But mistakes and myths injure more than just the historical record. They can distort contemporary understanding, sometimes in dangerous ways. Campbell and others have recounted how the flawed reporting in the immediate aftermath of Hurricane Katrina — with lurid and unsubstantiated tales of violence and mayhem — delayed relief efforts and diverted critical resources. More generally, these stories impugned the city's reputation (see "Myth-Making in New Orleans," December 2005/January 2006).
Like every reporter, I've had my share of misreported names, dates and facts. Each error stings, but a sting doesn't last long. There's one mistake, however, that I'll never forget or ever stop regretting.
It started on August 13 with a tip from a man named Clyde Prestowitz. Hard on the heels of the news that CNN host and Time magazine Editor-at-Large Fareed Zakaria had been suspended for plagiarizing a New Yorker article, Prestowitz e-mailed me out of the blue. "I'd like to make you aware of a similar incident from my own personal experience," he wrote.
A similar incident? This sounded promising. I responded with interest via e-mail and spoke with Prestowitz several times on the phone. According to him, Zakaria had used without attribution material from Prestowitz's 2005 book, "Three Billion New Capitalists: The Great Shift of Wealth and Power to the East." Prestowitz claimed that Zakaria had used a quote — from an interview Prestowitz had conducted with former Intel Corp. CEO Andy Grove — without citing its source in Zakaria's 2008 book, "The Post-American World."
Prestowitz, an oft-quoted economist and former trade specialist in the Reagan administration, sounded quite upset. But when I began to check his story, I found something that should have stopped me. In Amazon's "Look Inside" feature, which enables readers to browse portions of books online, I found that Zakaria had indeed noted Prestowitz's contribution in an endnote. It was tucked at the bottom of a note that first cited material from New York Times columnist and author Thomas Friedman.
Prestowitz waved me away. He said that credit appeared in version 2.0 of the book, which came out in 2011 (he was right about that; the endnote I read on Amazon was from the 2011 edition). He insisted that the credit had been added to the final edition after he and his agent had complained to Zakaria's publisher that earlier editions lacked the proper acknowledgment. The first edition and paperback (published in 2008 and 2009, respectively) didn't have the endnote, Prestowitz insisted.
At this point, I should have checked for myself. But a deadline loomed. I called Zakaria for his comment. He had a lot to say. But what he didn't say was most telling. First, he never denied the allegation that he had failed to credit Prestowitz (he said he couldn't remember). Then he launched into a justification of why such an omission would be valid. "As I write explicitly [in the book], this is not an academic work where everything has to be acknowledged and footnoted," he told me. The book contains "hundreds" of comments and quotes that aren't attributed because doing so, in context, would "interrupt the flow for the reader," he said, adding that this was "standard practice" for authors such as Malcolm Gladwell, David Brooks, Friedman and himself.
My story went to press and appeared on washingtonpost.com that night. It began to pile up reader comments, mentions on Twitter and links from other news outlets. A success, I concluded.
Several months later, I can still remember that sickening, weightless feeling. I, too, had run off the cliff.
As became clear the next day, when Zakaria's publisher alerted my newspaper, my story was dead wrong. Zakaria had indeed credited Prestowitz in every edition of his book (a trip to the public library to get hard copies of each edition confirmed this). The Post not only corrected the story, it retracted it and added an apology to Zakaria.
In retrospect, I made not one but two mistakes. First, and most obviously, I should have independently checked Prestowitz's claim. Easy. The second part isn't so obvious. I shouldn't have taken Zakaria's non-denial of the allegation as confirmation of it, nor assumed that his attitude toward acknowledging his source material was a tacit admission of guilt.
Journalistic clichés often become clichés for good reasons — they codify ethical standards, stand as warnings against familiar pitfalls and reinforce the tribal wisdom. In my case, I forgot one of modern journalism's hoariest clichés, one I won't forget again.
If your mother says she loves you, check it out. ###