Illustration by: Gavin McCarthy

After a nine-year-old girl died in a house fire on Sandy Bay reserve north of Portage la Prairie, Manitoba, cbc.ca was one of the first to report the tragedy. As soon as the story went up, user comments began popping up on the site: “Native people do not have the knowledge to look after a house” and “… the house went up completely in 15 minutes due to the large amount of alcohol in the building.” In a news conference, the Southern Chiefs Organization, which represents dozens of bands in Manitoba, publicized its grievances over the offensive comments on cbc.ca. The event was sparked by a complaint the SCO received from a resident of Sandy Bay who discovered that allegedly ignorant or hate-filled comments were going unmonitored.

The organization asked the provincial government to launch an investigation into CBC’s Manitoba website, suggesting charges should be laid against the public broadcaster, which, in some members’ opinions, should be held to a higher standard than other news organizations. Lyndenn Behm, SCO’s communications coordinator, says there’s been no apology from CBC since the posts on February 11. He says Aboriginals, First Nations and the residents of Sandy Bay also deserve an apology.

The response

CBC’s approach to its online comment sections is to provide as open a forum as possible for an exchange of views. But spokesperson Jeff Keay says, “We’re rethinking that now.” He adds that the broadcaster isn’t sure it sees the value in the discussion due to an “excessive degree of intemperate commentary.”

Online news providers are not held to the standards of the Canadian Radio-television and Telecommunications Commission (CRTC), so sites must develop their own guidelines and make individual judgment calls based on the Canadian Charter of Rights and Freedoms and libel laws. This can mean a choice between curbing free expression (and losing traffic) or allowing potentially defamatory comments (and being legally vulnerable).

The legal area concerning online user comments isn’t just a “grey area,” says Bert Bruser, media lawyer for the Toronto Star, “it’s a mess.”

Value of user comments to journalism

Mathew Ingram, communities editor for The Globe and Mail, encourages journalists to use and experiment with the comments sections. One of his colleagues who has enhanced her stories this way is Globe reporter Tavia Grant. She has reacted to new information from readers by doing more interviews and then updating her article for a more accurate picture.

Ingram acknowledges that some people use comments sections to vent or ride their hobbyhorse, “but there are people out there who do know something about the story you’re writing about, and who have valuable knowledge, perspective or comments.”

Monitoring online comment sections

CBC outsources the moderation of its comments sections to Manitoba-based ICUC Moderation Services Inc., which deals with the over 200,000 posts made on cbc.ca each month. ICUC says its services work within the mandates of its clients, which include trendy companies such as MuchMusic, Coors Light and Calvin Klein, as well as CTV and the Government of Canada.

On the other hand, most of globeandmail.com is semi-moderated, says technology editor Matt Frehner. A comment is flagged only if a reader finds it offensive. Flagged comments are reviewed by an editor who either accepts or removes them. Frehner says roughly 85 to 90 percent of stories are semi-moderated, but there are closed stories as well, especially ones dealing with court cases that could be jeopardized by information posted online. The Globe also asks journalists to check for unsuitable posts, but moderating 100,000 comments every month is unrealistic. As Frehner says, “You can’t spend your entire day reading comments about the conflict in Israel.”

Despite the dangers, open forums, be they comment sections or live chats, increase traffic. Frehner says that when the Globe hosts an open discussion with someone such as political columnist Jeffrey Simpson, there is a huge spike in user participation.

Meanwhile, Neil Sanderson, assistant managing editor of thestar.com, says that his paper’s site receives around 2,000 comments a day and employs five in-house moderators. The moderators come from a variety of educational programs (none with a background in journalism) and are trained to check for 18 different problems, ranging from hate speech to libel. Sanderson also says reader comments are valuable because “from a philosophical point of view, the media depend on freedom of speech. We can’t exist any other way.”

Still evolving 

Roger D. McConchie, a B.C. lawyer practising internet and defamation law, recommends that online news organizations apply the same rules to user comments that they use for daily print retractions. That could mean publishing an apology on the site or on the page where the “hate speech” or libel has been flagged, as well as removing the offensive comments from the original article.

While Canada’s internet laws have improved in the last 10 years, news organizations must still work on balancing openness with their own regulations. So the struggle continues to provide forums for lively and insightful discussions without their being derailed by ignorant and unconstructive comments. “I would have never imagined five years ago how widespread and serious the problem for individuals who are defamed has become,” says McConchie. “It’s just grown topsy-turvy.”