These are just some of the many red flags lawyers have identified in Bill C-63 since it was tabled before the House of Commons on Feb. 26.
Under the bill, if a judge is satisfied that an “informant has reasonable grounds” to fear that a future hate crime may be committed by a defendant, the defendant must abide by certain restrictions for a year.
The restrictions include, but are not limited to, wearing an electronic bracelet and observing curfews. Failure to comply comes with a one-year prison sentence.
Other red flags, say Mr. Moore and other lawyers, include broad powers given to a government-appointed Digital Safety Commission, the new ways “hate speech” would be defined and policed, and requirements on social media companies that may spur them to broadly censor online comments.
The stated purpose of the government’s long-touted bill is to prevent “online harms” by targeting content that exploits or is used to bully children, incites terrorism or violence, and “foments hatred.”
It seeks to amend the Criminal Code to add a new standalone hate crime offence that applies to all existing offences, add the provision of “fear” that someone may commit a hate crime in the future, and increase penalties for hate crimes. For example, the maximum sentence for “advocating for genocide” would increase from five years to life in prison. The Canadian Constitution Foundation (CCF) finds this worrisome.
The bill would amend the Canadian Human Rights Act to specify that posting “hate speech” online is discrimination, and it would allow people to file complaints against the poster with the Canadian Human Rights Commission. In some cases, complainants could remain anonymous if the commission deemed it necessary.
Online social media platforms would have a duty to act responsibly, protect children, and make certain content inaccessible. Failure to abide by the requirements could cost the platforms 6 percent of their gross global revenue or $10 million, whichever is greater.
Human Rights Tribunals
While some of the harms covered by the bill are relatively straightforward—sexual exploitation of children, or inducing children to harm themselves, for example—some are more nebulous. “Inciting violence” and “fomenting hatred” are more open to interpretation.

Mr. Dehaas said the attorney general, a legal expert, must be consulted under current legislation before any hate speech charges can be applied—precisely because it’s so difficult to determine that the threshold has been met.
Under Bill C-63, the courts would still hear criminal cases of hate speech, but a whole new class of hate speech cases would be heard and penalized by the CHRT.
The tribunal could impose fines of up to $50,000 and award each complainant up to $20,000.
Mr. Moore said he expects this to be an easy way for people to silence political opponents, especially since some complaints can be made anonymously.
“It’s pretty cheap to lay a complaint. It doesn’t cost you anything. And if it doesn’t even cost you your identity, you can just go ahead and do that to all of your political opponents,” he said.
Noting that the tribunal is government-appointed, he added that even in cases where the tribunal decides in favour of the accused, “the process often is the punishment here.”
Canada previously had provisions for hate speech in Section 13 of the Canadian Human Rights Act, though it was removed amid uproar about its impact on free expression.
It was used by a group of students and the Canadian Islamic Congress in 2007 against Maclean’s magazine, claiming a column by Mark Steyn contained hate speech.
The complainants filed their application with the Canadian Human Rights Commission, which dismissed the complaint; with the Ontario Human Rights Commission, which said it didn’t have jurisdiction over the issue; and with the British Columbia Human Rights Tribunal, which heard the complaint and decided to dismiss it.
Still, the case gained much attention, and Parliament voted to remove the hate speech provision in 2014. Bill C-63 would essentially reinstate the provision.
“We’re talking about codifying [a] pre-existing definition of hatred,” Mr. Virani told Mr. Serapio.
“Hatred has been defined in Supreme Court jurisprudence for at least the last 11 years in a decision called Whatcott, 2013, where it talks about something that arises to ‘detestation’ and ‘vilification.’ It doesn’t cover things like humiliating, offensive comments, things that are insulting.”
The Whatcott decision, however, is what Mr. Dehaas said sets a standard so confusing that a civil tribunal is unlikely to interpret it correctly, much less the average Canadian trying to gauge what’s OK to say and what isn’t.
Defining ‘Hate’
In 2001 and 2002, William Whatcott distributed flyers on the topic of homosexuality in Regina and Saskatoon. Residents filed complaints against him with the Saskatchewan Human Rights Commission. The case went all the way to the Supreme Court of Canada, which ruled against him in 2013.

Canada has seen a surge in concerns about schools’ approach to teaching about sexual orientation and gender identity, exemplified by Alberta’s recently announced policy to tightly control such content and limit gender transitioning for minors.
A main concern about the Online Harms Act has been that it would amplify the Liberal government’s stance that comments on this topic are discriminatory. The Million March 4 Children, which highlighted this issue last year, was often accused of being “homophobic” or “transphobic.”
Christian and Muslim parents have especially raised concerns about how gender and sexuality are treated in schools. It may be difficult for such parents to navigate where the line is that Mr. Whatcott crossed in discussing these matters.
Mr. Dehaas gave examples of such difficulties regarding some “hallmarks of hatred” laid out in the Whatcott decision.
“This is where things get frightening because some of [the] ‘hallmarks of hatred’ are things people should frankly be allowed to say,” he said.
“This is how feminists sometimes talk about men. Is that hate speech?” Mr. Dehaas said.
Another hallmark is that hate speech alleges a group is “plotting to destroy Western civilization,” Mr. Dehaas said. Such claims can be “offensive,” he added, “but what if some group really is one day plotting to destroy Western civilization?”
Another hallmark is labelling a group as “liars, cheats, criminals, and thugs,” a “parasitic race,” “pure evil,” or “lesser beasts,” or in other dehumanizing terms.
Mr. Dehaas gave examples of instances where someone might call a group “pure evil,” refer to them as “animals,” or say a group is “unlawful” in debate on social and political matters.
“When people don’t know where the line is, they stay silent to avoid punishment. That makes it hard [to] have raucous debates on all kinds of issues, which is the point of free speech,” he said.
“The Bill would require social media companies to ‘minimize the risk that users of the service will be exposed to harmful content’ with the threat of massive fines if they don’t properly mitigate the risk,” the CCF’s release said. “Social media companies will likely err on the side of caution and block large amounts of speech that is close to the legal line.”
Who Decides?
The Canadian Human Rights Tribunal isn’t the only entity given broad powers by the bill. Enforcement of the bill lies largely with the Digital Safety Commission it will create, and the commission’s “powers are incredibly broad ranging,” Mr. Geist wrote.

The focus of the commission will be tackling “harmful content.”
It can make content inaccessible, conduct investigations, demand information from regulated services, hold hearings (sometimes out of public view), establish codes of conduct, and levy penalties.
“Despite those powers, the Commission is not subject to any legal or technical rules of evidence, as the law speaks to acting informally and expeditiously,” Mr. Geist said.
Mr. Moore is also concerned about these powers, as well as the commission’s “massive mandate.”
This includes its ability to send inspectors to “enter any place in which they have reasonable grounds to believe that there is any document, information or other thing relevant to that purpose,” as defined in the bill.
This means inspectors could enter people’s workplaces at any time to examine documents. The only exception is private homes: absent the occupant’s consent, a warrant would be needed to enter.
“We are talking about some of the most draconian powers given to an agency that doesn’t exist and has no track record of integrity,” Mr. Moore says.
While the commission itself can’t make arrests or directly send people to jail, Mr. Moore says it can take its cases through the Federal Court process and obtain court orders, and people can go to jail for failure to observe those orders.
He uses health agencies’ restrictions during the COVID-19 pandemic as an example. Although Alberta Health Services couldn’t directly have pastors who kept their churches open imprisoned, it went to court to obtain orders, and the pastors were sent to jail.
“That is a potential to happen to providers of social media services that are regulated, but we don’t yet know exactly what that’s going to look like. Could it be small little people running a blog post, with the comment section? Or is that only going to be the Googles and Facebooks of the world?” he said.
Mr. Moore said he is very concerned that this legislation could be used to target political opposition.
“Excuse my skepticism, I’ve stopped believing that law will always be used by the government in a manner that a good-faith actor would use that law,” he says.
‘Constitutionally Protected Expression’
Many, including Mr. Geist and Peter Menzies, who is a former Canadian Radio-television and Telecommunications Commission (CRTC) vice-chair and Epoch Times columnist, have said that Bill C-63 is a toned-down version of the online harms legislation that the Liberal government had presented in the past. The legislation has been years in the making, starting with a 2019 campaign promise and including a previous version of the bill introduced in 2021.

The CCF says the current bill will still “significantly hamper constitutionally protected expression.”
It would require social media companies to flag content that they believe “foments hatred” and deal with content that is legal, but that “the operator had reasonable grounds to believe posed a risk of significant psychological or physical harm.”
“This appears aimed at encouraging social media companies to censor speech that the government cannot outlaw,” CCF said.