Can comments on news platforms be salvaged? Hailed two decades ago as a forum for empowering what Dan Gillmor and Jay Rosen called “the former audience,” they have in all too many cases devolved into an open sewer of lies, hate and racism. Remember the adage that “our audience knows more than we do”? Well, there may be something to that. But it turns out that scrolling through the comments is not the way to tap into that wisdom.
The Philadelphia Inquirer this week became the latest news organization to drop most of its comments. Closer to home, when my other employer, GBH News, ended comments a few years ago in the course of upgrading its content-management system, I didn’t hear a single complaint.
On Thursday, Anika Gupta, the author of “How to Handle a Crowd: The Art of Creating Healthy and Dynamic Online Communities” (2020), offered some common-sense ideas that were aimed not only at news sites but also at the larger challenge of how to keep virtual discussions from spinning out of control.
In a talk via Zoom sponsored by Northeastern University’s School of Journalism, Gupta discussed her study of Make America Dinner Again, started in 2016 by two women in the San Francisco Bay Area, Tria Chang and Justine Lee, to bring people with differing political perspectives together over food and conversation. It took off, and Facebook approached Chang and Lee with the idea of making it a Facebook group as well.
To the extent that it’s worked, Gupta said, it’s because the group has grown slowly (to date, there are still fewer than 1,000 members), with lots of personal intervention. Some of the steps they’ve taken include staying away from hot-button topics such as whether abortion should be legal or whether teachers should have guns. Instead, they aim for “detailed, specific, ‘sideways’ questions,” as Gupta put it in her presentation. For instance, rather than asking about abortion rights, members were asked a lengthy question about how religious people justify a particular biblical quote.
They also implemented a “one-hour rule” that limits members to posting only one comment per thread per hour, which tends to keep the temperature down.
Some of the challenges they’ve faced, Gupta said, involve questions about what to do regarding members with false or offensive views. Their decision was to take aggressive action in such cases and encourage people to leave — a different approach from the one generally taken by the news business.
“A lot of news organizations are uncomfortable with this ‘if you don’t like it, you can leave’ attitude,” Gupta said.
I had a chance to ask Gupta about two issues that have bedeviled news organizations: Would requiring real names make a difference? And should comments be screened before they’re posted? Gupta’s take was that real names don’t matter all that much. Even in community online forums with real-names policies, she said, “you will be shocked about what people say about their neighbors.” (Actually, no, I wouldn’t.)
Moreover, insisting on real names can drive away people afraid of being harassed. That’s especially true with women, who, studies and anecdotal evidence show, are disproportionately singled out for online abuse.
Pre-screening, she added, is a problem because it is so labor-intensive, and it may not be realistic for larger media outlets. She also said pre-screening turns comments into something like letters to the editor, since commenters know their views are going to be read by someone at the news organization.
The New Haven Independent doesn’t require real names, but it does have a number of commenters who’ve used consistent pseudonyms over time, which Gupta said is helpful in maintaining civility. The site also screens every comment before it’s posted. The editor and founder, Paul Bass, believes that leads to more and higher-quality comments, since people who want to be constructive aren’t scared off.
Still, the Independent has had its glitches. As I wrote for the Nieman Journalism Lab a number of years ago, at one point an outbreak of sociopathy led Bass to shut down the comments temporarily. When they relaunched, commenters were required to register under their real names, though they could still post pseudonymously. That action put them on notice that they could be sued — Section 230, much discussed of late, protects the Independent, not the individuals who comment on the site.
Bass continues to see value in comments, writing in a public thread on Facebook this week:
Screening is essential. We screw up sometimes, and sometimes it gets toxic. But overall almost everyone involved with our site (readers, reporters, etc.) agrees that comments section is the best part. Lively, very wide range of points of view and racial/economic backgrounds; and some people who really know a lot more than we do! But occasionally it does feel like a sewer. I do feel comfortable zapping comments and banning people. Without our comments section, we would be more removed from readers, especially those who disagree with us. I learn so much from commenters!
I do wonder, though, if the Independent’s 2005 founding has something to do with Bass’ success with comments. Facebook was barely a thing at that time, and digital culture hadn’t become as toxic as it is today. By establishing expectations right from the start, Bass has been able to maintain a relatively civil environment for more than 15 years.
And I agree with Bass that screening — by humans — is essential. Anika Gupta said Thursday that screening by artificial intelligence isn’t going to be effective anytime soon, despite the efforts of Google to develop a system that would do just that.
At the local level, in particular, maintaining a useful comments platform is essential to keeping the audience engaged. Letting the trolls invade and taking action only after the damage has been done is exactly the wrong approach.