
It turns out that even trolls have hearts. Sometimes. That’s what Katelyn Verstraten, an online reporter for CTV Vancouver, discovered in April 2016 when she posted an article about David Pennington — a local runner who was planning to jog from the Mexico-U.S. border to Vancouver to raise awareness and funds for ocean protection and conservation. Once the story went live, Verstraten waited for her Twitter nemesis, known to her as @baabaa, to surface.

A few hours later, @baabaa and his black sheep profile photo appeared in the comment section. But rather than his usual anti-media tangents, this time he only said, “I literally can’t say anything bad about [Pennington].”

What makes @baabaa, and thousands of other black sheep, bleat online? Behavioural scientist Mark Timms believes it’s the spotlight effect — the idea that some people believe they’re noticed by a wide audience when, in reality, they aren’t. This belief in their own web celebrity helps explain why people participate in online discussions so vigorously. Susan Weinschenk, a behavioural psychologist, argues that the Internet reduces empathy: “There are certain norms in place when we are having a normal conversation that are not in place when we’re having online conversations.” The result is that all those trolls erode online communities along with digital journalism itself.

In response, many overwhelmed news sites, led by Reuters and Popular Science, removed their comment sections due to insufficient moderating personnel. Others soon followed, including the Toronto Star, Huffington Post Canada, Motherboard, and all CTV news sites.

Users cried foul over infringement of free speech, but editors are tired of defending researched articles from unfounded and ignorant jibes. Sexist and racist remarks also alienate a significant portion of the audience. And, of course, advertisers don’t like paying for ads that appear near toxic reader commentary. If online comment sections attract trolls, then publishers need to redesign them.

Back in 1997, viewers submitted questions to the staff of PBS’s NewsHour with Jim Lehrer via email. Greg Barber, who was the desk assistant on NewsHour that year, remembers this era fondly. “It was not a live experience — the Internet was very young back then,” he says, “but what I could see straight away with that experience was how exciting it was to be able to connect questions that came from folks not in the media industry, people who were experts in whatever field.” For Barber and everyone else on the staff, engaging with the audience on a personal level was good for journalism — and a constant connection with the customer was also good for business.

Barber, now the director of digital news products at The Washington Post, would eventually watch online comments devolve into a raging dumpster fire. To address the problem, the Post began collaborating with the New York Times and Mozilla, a company that encourages a community approach to software design through an open source philosophy, as demonstrated by applications like Bugzilla. Once a team was assembled, Kim Gardner, an Agile project manager at the New York Times, joined as the group snowballed into a forward-thinking organization.

The result of the Barber-Gardner-Mozilla collaboration is the Coral Project, one of the few organizations leading the charge to establish responsible commenting. It is armed with many years of research, a team of eccentric minds, and a grant from the Knight Foundation (an organization that invests in the future of journalism). Barber is now the head of strategy and partnerships for the Coral Project. He leverages 20 years of journalism experience to help the team navigate the nuances of commentary and the way it shapes newsroom engagement with audiences.

Both the Coral Project and Civil Comments, an interactive comment moderation platform created by Christa Mrgan and Aja Bogdanoff, assume that the community wants to offer intelligent debate but lacks the tools to keep troublemakers out. Another program, called Hearken, provides a service that helps news sites crowdsource story ideas by asking the public questions about what to investigate next. “Questions are the new comments,” says Hearken CEO Jennifer Brandel.

The Coral Project created software called Trust in late 2015 to address the challenge of identifying types of users and storing the results in a database. The software was beta-tested last year. Bad contributions are instantly sent through pre-moderation while non-toxic contributions are highlighted. Trust works with two other products: Ask, which allows users to ask others in the community a specific question; and Talk, the platform that integrates Coral Project software to foster constructive commentary and help create an engaged reader community.

Civil Comments asks every commenter to first review two other comments in the moderation queue. Comments are judged on quality and civility, based on how others evaluate them. “Our hypothesis was that our community would need to delete about 50 percent of comments,” says Mrgan, the vice president of design. The actual number ended up being a mere four percent.
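The peer-review mechanic Civil Comments describes can be sketched in a few lines of Python. This is a hypothetical illustration, not the company’s actual code; the civility threshold and the rating quorum are invented for the example.

```python
from collections import defaultdict

# Hypothetical sketch of peer-gated publishing: each comment collects
# civility ratings from other commenters, and is published only once it
# has enough ratings and its average score clears a threshold.
# PUBLISH_THRESHOLD and MIN_RATINGS are illustrative assumptions.

PUBLISH_THRESHOLD = 0.5
MIN_RATINGS = 3

class PeerModerationQueue:
    def __init__(self):
        self.pending = {}                 # comment_id -> text awaiting review
        self.ratings = defaultdict(list)  # comment_id -> scores in [0, 1]
        self.published = {}               # comment_id -> text shown to readers

    def submit(self, comment_id, text):
        self.pending[comment_id] = text

    def rate(self, comment_id, score):
        if comment_id not in self.pending:
            return
        self.ratings[comment_id].append(score)
        scores = self.ratings[comment_id]
        if len(scores) >= MIN_RATINGS:
            if sum(scores) / len(scores) >= PUBLISH_THRESHOLD:
                self.published[comment_id] = self.pending.pop(comment_id)
            else:
                self.pending.pop(comment_id)  # rejected by peer reviewers

queue = PeerModerationQueue()
queue.submit("c1", "Thoughtful take on the article.")
for score in (0.9, 0.8, 1.0):
    queue.rate("c1", score)
print("c1" in queue.published)  # True: peers judged it civil
```

Under this model, the four-percent deletion rate Mrgan cites corresponds to comments whose peer scores fall below the threshold.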

Civil Comments caught the eye of The Globe and Mail, where a trial is currently being conducted in its politics section. Cynthia Young, the Globe’s head of audience, decided to try the software after hundreds of Globe readers urged her to keep the comment section, saying they came to the Globe specifically to read the comments. Mrgan believes that Civil Comments will succeed because comment section quality is achieved through participation.

Not all publications want outside help with the comment challenge. Rather than pay for moderation software, Motherboard went the old-school route by creating a Letters to the Editor section, which offers engaging conversations without hate speech or personal attacks. It’s a slower process, with responses published a few days to a few weeks after the originating article appears, but the result is a more thoughtful debate. “Even if someone would write an email to disagree with the story,” says Motherboard weekend editor Emanuel Maiberg, “at least they’re sending an email. There’s just more thought that goes into it.” Readers were not pleased with the change, calling the decision cowardly.

Is it possible to design comment sections that encourage readers to provide constructive feedback in the moment? Digital native Quartz launched its own comment feature, Annotations, in 2013, allowing users to comment on a specific paragraph in the article. Remarks are anchored to a single statement rather than criticizing (or praising) the entire article. The idea is also pretty intuitive, since the user only needs to hover their cursor over a paragraph (or tap on a paragraph with a mobile device) and click on the quote bubble to read what another reader said. Remarks must be concise, since commenters are limited to 280 characters, equal to two full tweets.
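Paragraph-anchored remarks like Quartz’s Annotations can be modelled as a simple mapping from paragraph index to a list of length-capped comments. A minimal sketch (Quartz’s actual implementation is not public, so the class and method names here are invented):

```python
# Hypothetical model of paragraph-anchored annotations: each remark is
# tied to one paragraph and capped at 280 characters, as described in
# the article.

MAX_CHARS = 280

class AnnotatedArticle:
    def __init__(self, paragraphs):
        self.paragraphs = paragraphs
        self.annotations = {i: [] for i in range(len(paragraphs))}

    def annotate(self, paragraph_index, text):
        if len(text) > MAX_CHARS:
            raise ValueError("annotation exceeds 280 characters")
        self.annotations[paragraph_index].append(text)

    def remarks_for(self, paragraph_index):
        # what a reader sees after hovering over (or tapping) a paragraph
        return self.annotations[paragraph_index]

article = AnnotatedArticle(["Intro paragraph.", "A claim worth debating."])
article.annotate(1, "This statistic needs a source.")
print(article.remarks_for(1))  # ['This statistic needs a source.']
```

The design choice is the anchor itself: because each remark is bound to one paragraph, a critique cannot sprawl over the whole article.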

The posts go public immediately, without a review process, but moderators can award a gold star and a “featured” label to annotations they consider substantial contributions to the article. However, critics say the approach just moves the same problem to a different platform. Moderation is clearly key here.

Another attempt to improve comments involves eliminating anonymity. Tim Shore, the founder and publisher of BlogTO, switched to using a social media plugin because “we wanted a comment solution that required some sort of user login or profile so that users couldn’t comment anonymously like in our previous system.” BlogTO’s previous model allowed people to post under any pseudonym or email, which made it too difficult to track and ban problematic users.

But asking readers to log in using their social media account is not without its own issues. After describing the relatively harmless shenanigans of @baabaa, Katelyn Verstraten notes that people can express hostility — not with the safety of a pseudonym, but with their real names: “With Facebook, you just have to be accountable, because it links back to your profile.” She says she is still shocked by the things people say, even without anonymity.

There is also the issue of headline commenting. Some readers leave a comment after reading only the headline — despite the fact that the article may address some of their criticism. “People will make a mean comment based on [a headline],” says Verstraten. “Sometimes you just want to smack yourself in the forehead.”

For Carson Jerema, one of the Edmonton Journal’s digital news editors, the issue is mostly technical. The change from a traditional site-run comment box to the Facebook plugin funnelled comments through one multi-purpose tool. The plugin forces commenters to log in via Facebook — everything they post goes to both the website and Facebook. As with most newsrooms, there isn’t one staff member on constant comment patrol. While moderation is made easier, the publication is still exposed to hostile remarks, since readers can comment on stories published days, weeks, or even months ago. Even when a publication believes it has solved a problem on one article, another could pop up elsewhere and reflect poorly on the site’s readership at large.

But not every publication suffers equally. “We don’t really have the same problem with our comment section that other mainstream news sites have,” says Kate Robertson, an online and social media manager at Now Magazine. Since it wasn’t experiencing the same level of harassment as other publications, it retained its comment section, deciding that the benefits of constructive commentary far outweighed the drawbacks of hate speech. Farhan Mohamed, the editor-in-chief of the Daily Hive, maintained its comment section out of loyalty to the reader: “If we are not giving our readers a chance to weigh in on anything and have their say, then I think that we are doing a bit of an injustice to them.” Moderation, while complicated, has largely been handled with keyword searches that scope out toxic remarks, along with readers who report offensive comments.
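The keyword-search approach these smaller newsrooms describe amounts to a simple triage filter: flag a comment for human review when it matches a blocklist or accumulates enough reader reports. A minimal sketch, in which the blocklist terms and the report threshold are both illustrative assumptions:

```python
import re

# Minimal keyword-plus-reports triage, as smaller newsrooms describe it.
# BLOCKLIST and REPORT_THRESHOLD are placeholder values for illustration,
# not any publication's actual configuration.

BLOCKLIST = {"idiot", "moron"}  # placeholder terms
REPORT_THRESHOLD = 2

def needs_review(comment, report_count=0):
    """Flag a comment for a human moderator."""
    words = set(re.findall(r"[a-z']+", comment.lower()))
    return bool(words & BLOCKLIST) or report_count >= REPORT_THRESHOLD

print(needs_review("A perfectly civil reply"))                  # False
print(needs_review("What an idiot"))                            # True
print(needs_review("A perfectly civil reply", report_count=3))  # True
```

The point of the sketch is the division of labour: the filter only narrows the queue, and a human still makes the final call.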

What started as a constructive debate about Syrian refugees settling in Toronto soon changed direction. A December 2015 CBC News Toronto article about Prime Minister Justin Trudeau’s ambitious refugee policy inspired positive comments. But skeptical Canadians soon weighed in. “I would rather help my fellow Canadian,” said a pseudonymous user named “wingman.”

A user named “Being human” responded: “People tend to fear from the unknown, but as Syrian who came to Canada few months back i’m very grateful. We want to adapt the Canadian culture, what all we are asking for is to give us a time. We did not intend to flee from our home by choice but we forced to take this path. Syrians are good people and i hope someday you could meet someone, Please do not judge us before knowing us.”

In this case, the CBC article gave a refugee an opportunity to directly address criticism. It’s not clear why, but “wingman” went silent during the rest of the 5,161-comment conversation. CBC has since closed comments on most refugee stories to avoid hate speech. Senior director of digital news Brodie Fenlon admitted the decision was made because the comments were affecting the quality of the journalism itself: reporters feared the backlash their stories would receive.

But for every “wingman” who goes silent and every @baabaa who unexpectedly changes their behaviour, there are flocks of others who relentlessly slag diversity, immigration, and more. The emerging new solutions will never stop all of them — only their consciences can do that.
