OTTAWA -- The federal government is proposing the creation of new powers to block online platforms that repeatedly refuse to take down harmful content, and is looking at new ways for CSIS and the RCMP to play a role when it comes to combating online threats to national security and child exploitation content.

In launching a new proposal for how to tackle harmful online content, Canada’s justice, public safety, and heritage ministers announced Thursday that they want to bring in new laws and regulations to force social media companies to be more accountable for five kinds of harmful content on their platforms: hate speech, child exploitation, the sharing of non-consensual images, incitements to violence, and terrorism.

And, they’re looking at what role federal security and intelligence agencies could play in enforcing these new rules, as well as the potential to completely block access within Canada to platforms that fail to act on content on their services that is deemed harmful.

Departmental officials outlined the proposal and issued a technical discussion paper detailing their legislative aims on Thursday.

Specifically, the government is proposing to create new rules and a compliance regime for online communication services that would force these companies to address harmful content posted on their platforms, including a requirement to review and remove, if necessary, problematic content within 24 hours of the post being flagged.

The sites may also be obliged, in certain circumstances, to preserve content and identifying information for potential future legal action. They could also have new options to alert authorities to potentially illegal content and content of national security concern if an imminent risk of harm is suspected.

The proposal would also compel platforms to provide data on their algorithms and other systems that scour for and flag potentially harmful content, to provide a rationale when action is taken on flagged posts, and to install a new system for Canadians to appeal platforms’ decisions around content moderation.

The new regime comes with a series of severe new sanctions proposed for companies deemed to be repeatedly non-compliant. These consequences would include fines of up to five per cent of the company’s annual global revenue or $25 million, whichever is higher.

And, “as a last resort,” if an online platform repeatedly failed to remove child sexual exploitation material or terrorist content, the government would seek the legal ability to block Canadians from accessing that service at all, through a court injunction forcing telecommunications service providers to restrict access to that site or service.

These aims would target what the government is calling “online communication service providers,” such as Facebook, YouTube, Twitter, Instagram, TikTok, and Pornhub. The government said the focus of the legislation would be on public content posted to these platforms and not private communications like emails, text messages, or WhatsApp messages.

“Person-to-person communications raise important considerations with respect to freedom of expression and privacy rights,” said one Canadian Heritage official, speaking with reporters about the plan on a not-for-attribution basis.

“We need this because what we're seeing, and what we've been seeing over the last few years is that violence and hate speech have been on the increase, on social media platforms… Canadians have been asking us, the government, to intervene and that's exactly what we're doing,” said Canadian Heritage Minister Steven Guilbeault in an interview with CTV News.

A NEW COMMISSIONER, ROLE FOR CSIS

In order to operate and adjudicate this new system, the government is looking to create a new “Digital Safety Commission of Canada” that would be able to issue binding decisions ordering platforms to remove harmful content when they “get it wrong.”

The commission would include a new commissioner who would enforce the rules and evaluate complaints against online communication service providers, a “digital recourse council” to act as an independent tribunal system, and an advisory board that would provide expert advice on regulations.

Officials said that it would be up to the commissioner to make decisions around platforms’ compliance and what the threshold of non-compliance would be for moving to block a platform outright, while the recourse council would adjudicate individuals’ complaints about posts that haven’t been taken down and would have the legal authority to order content be taken down.

The commissioner would also be able to conduct inspections of online communication service providers and “may enter, at any reasonable time, any place in which they believe on reasonable grounds there is any document, information or any other thing, including computer algorithms and software, relevant to the purpose of verifying compliance and preventing non-compliance.”

The proposal is to have the council and commissioner be able to conduct hearings behind closed doors “where a public hearing would not be in the public interest, including where there is a privacy interest, national security interest, international relations interest, national defence interest, or confidential commercial interest.”

Also included in this proposal are potential ways for the Canadian Security Intelligence Service (CSIS) and the RCMP to play enhanced roles. Specifically, the government is proposing changes to the mandatory reporting requirements involving the RCMP and child sexual exploitation offences, while offering new latitude to CSIS when it comes to threats to national security and terrorism.

If pursued, the measures would provide the agencies with the judicial authorization to obtain basic subscriber information like an address and phone number, or transmission data like an IP address, which the government says would help quickly identify perpetrators and would be subject to “checks and balances.”

“One of the principal issues right now is the sheer volume of extremist content online, and also the protections people have in their anonymity and the only way to take any action against the posters of this hateful content, and that which promotes violence, is to identify them,” said a CSIS official at the briefing.

CONSULTING UNTIL THE FALL

Late in the parliamentary session, the Liberals tabled but did not advance Bill C-36, which proposes to amend the Canadian Human Rights Act and Criminal Code to make it a discriminatory practice for individuals to communicate hate speech online. Officials said Thursday that this new initiative would link up with that bill.

The government says it now wants to hear from stakeholders and citizens on the proposed approach and the new powers that would be provided. This comes after considerable public backlash over now-halted Bill C-10’s mixed messaging when it comes to regulating user-generated content.

“The government recognizes this is a very complex space, and it engages many different issues,” said an official.

“By mandating website blocking, the government is creating a country-wide blocking system that will increase consumer Internet costs and gradually expand the scope of content blocking as lobby groups demand that it be used for additional purposes,” suggested Michael Geist, a University of Ottawa law professor and the Canada Research Chair in internet and e-commerce law, in an initial reaction to the proposal.

In a statement, Canadian Internet Registration Authority (CIRA) President and CEO Byron Holland said that while Canadians are concerned about harmful online content, they are “also very concerned that new laws could result in the over-policing of speech, with legitimate, lawful content being taken down inappropriately.”

“The government is going to have to work hard to strike a balance here, and make sure the blunt hammer of internet censorship isn’t used when more effective and proportionate responses are available,” Holland said.

Conservative justice critic Rob Moore issued a statement about the proposal, stating that while the Conservatives condemn hate speech and speech that incites violence, “Canada already has existing criminal law protection.”

“We believe combating hate speech is best done using the Criminal Code with additional resources to be provided to law enforcement,” Moore said. “After reviewing the proposal released today, we are deeply concerned with the Liberal’s plan to create an online speech regulator whose powers are overly broad and ill-defined. Canada’s Conservatives will follow the consultation process and review the eventual legislation with interest to ensure that the free speech rights of Canadians are protected.”

In making their case for why these new measures are necessary, officials said Canadians have an appetite for more government involvement in regulating hate speech online and pointed to similar measures taken in Germany and Australia.

The government cited a range of statistics, such as that one in five Canadians have experienced some form of online hate, with racialized Canadians being nearly three times more likely to have experienced harmful behaviour online.

The consultation will be open for feedback until Sept. 25, with the responses received informing legislation the government aims to table this fall, subject to the potential election outcome.