From AJR, December/January 2011

The Comment Police   

Web exclusive
NPR says it’s pleased with the results of its decision to outsource the battle against offensive online comments. Posted: Wed., Dec. 15, 2010

By Madhu Rajaraman
Madhu Rajaraman (mrajaraman@ajr.umd.edu) is an AJR editorial assistant.     


It was a victim of its own success.

At least, that's how Andy Carvin sees it.

Carvin, senior strategist for NPR, says the comment threads on the news organization's Web site are intended to allow reporters to engage with the public and foster intelligent dialogue and debate. However, NPR was forced to take defensive action after barrages of inflammatory posts by trolls and spammers polluted its discussion boards and threatened to become a persistent problem.

As a result, NPR announced in October that it would outsource its Web site regulation duties to ICUC Moderation Services, a social media monitoring company based in Canada — a move that, according to Carvin, has yielded impressive results in the short time since it took effect.

"Earlier, it wasn't unusual to find 500 comments in the abuse queue [per day]. Now it's only about three, four, five a day," he says. "It's only been about a month and a half now, but so far, so good." The term "abuse queue" refers to the list of user comments that have been marked as offensive.

Before ICUC entered the picture, Carvin says, interns and other staff members handled the removal of abusive comments on their own. That worked until the volume of online commentary grew so large that NPR staff could no longer keep up.

The discussion boards on NPR's Web site have a posted set of guidelines, which is generally what directs the decision-making. Rule #1: "If you can't be polite, don't say it."

Flagging offensive content is the one duty that remains up to readers and NPR staff, and is done simply by clicking the "Report Abuse" button to the side of the comment in question. ICUC is alerted every time this happens, and the moderation company takes it from there, relieving NPR employees of endless hours spent screening controversial content. "If the comment in question is obviously breaking the rules, they block it," Carvin says. "If it's borderline or questionable in some other way, they contact my team so we can discuss and decide how to handle it."

Once in a while, he adds, a user will petition a blocked comment, appealing the decision to remove it. That triggers a similar review between ICUC and Carvin's social media team, after which the comment may or may not be unblocked. "It's often more art than science," Carvin says.

ICUC specializes in social media monitoring, online content moderation and Web support for companies and organizations. Keith Bilous, the firm's president and CEO, says that to sign up, an organization simply contacts ICUC to discuss its needs; fees and specific services are then worked out according to what the client is looking for. "It's all done remotely," he says. "It's based on the client's needs, and we moderate everything virtually." In addition to NPR, ICUC's client list includes Unilever, Calvin Klein and Intel.

Carvin says that certain topics are more likely than others to trigger incendiary comments. Stories about the Middle East are the most controversial, he says. More recently, NPR's firing of commentator Juan Williams, the Jon Stewart rally in Washington, D.C., and the midterm elections were the hot topics that drew the most attention. This time, thanks to ICUC, NPR was better equipped to deal with offensive reader responses.

Carvin says of the Juan Williams incident, "We had tens of thousands of comments coming in that week," many of which were, predictably, harsh in nature. But with a group of people dedicated solely to dealing with the one percent of problematic users, NPR reporters could focus on doing their jobs. "Having ICUC moderators there was comforting for us," he says.

NPR also saw an increase in the number of hackers writing scripts to remove comment threads altogether. "People were trying to gauge exactly when we weren't monitoring [the threads]," he says, and using these patterns to exploit employees' work habits and delete comments. This angered readers, who accused NPR of censorship when it was actually hackers who had done the damage. "These are humans," Carvin says. "Very savvy ones."

Thanks to ICUC, spammers now need to work extra hard to penetrate NPR's comment forums — something that comes as a big relief to Carvin and his colleagues.

But wouldn't it just be easier to get rid of comments on NPR's Web site altogether? A number of news organizations, including the Buffalo News, have banned anonymous comments, which are often the most problematic, and three Maine papers banned all comments for a 48-hour period in October. "Yeah, it would be easier, but it would be giving up on our mission," Carvin says, stressing the role of online discussions in informing the public. "So many times comments have contributed to our reporting."

Carvin cites last year's "balloon boy" hoax in Colorado as an example. The story of then six-year-old Falcon Heene garnered international attention in 2009, when the boy's parents falsely led authorities and the media to believe he had been carried away by helium balloons and was, presumably, in serious danger. Carvin says that while NPR's reporters were covering the incident, conversation among online readers helped clarify that the boy had never disappeared to begin with. "They started doing the physics... They crunched the numbers and found that it was impossible for the balloons to be carrying a boy of that weight." Before many other newsrooms had figured out the truth, NPR's online audience "had already made a very rational conclusion," Carvin says.

Carvin is certain that the positives of reader participation on the Web greatly outweigh the negatives. "There's a diverse range of smart people who want to help us. The more directly our journalists can engage with the public, the better and more diverse our reporting will be."

But comment moderation doesn't come cheap. Damon Kiesow, digital media fellow at the Poynter Institute, sees value in reader comments but acknowledges that a lack of resources may keep smaller, less wealthy news organizations from both fostering conversation online and keeping tabs on offensive posters.

"NPR can afford to outsource, and the New York Times can afford to set up a board," Kiesow says. "So it's a balancing act between resources and the quality of comments."

Regardless, Kiesow says, killing comments kills the conversation. "Often when I read a news story [online], I look for the comments. They add context and value to a story," he says, adding that if a news site eliminates this feature, reader interest dwindles and people who want to have a meaningful discussion won't participate. "It's not such a toxic atmosphere by default. You just have to sweep the streets, make it a nice place to live."

NPR may have toughened up on its official Web site, but Carvin says that readers looking for a more freewheeling forum can still turn to social media. Whereas npr.org has a rule forbidding obscenities, Carvin points to NPR's Facebook page as an outlet that allows for more freedom when it comes to comments. He says that although forums on Facebook, much like those on npr.org, are often used by reporters for sourcing purposes, the conversations tend to be much more casual than on the official site. Rather than enforcing a polite dialogue, Carvin says, "our Facebook users are snarky and swear like sailors" — and that's just fine with him.

###