Big Tech dangles self-policing in exchange for no liability in latest Heritage report.

From Canadian Heritage “What We Heard” report on Online Safety

February 2, 2023

The federal Liberals’ march towards tabling Online Safety legislation in the House of Commons moved closer to its goal this week with Heritage’s release of results from a series of pro-regulatory roundtables held across Canada. In a recent tweet, Minister Pablo Rodriguez re-committed to tabling a Bill.

The comments from 160 community groups and 13 online platforms were mostly aligned with a report issued in Spring 2022 by the government’s expert panel. That report recommended government-supervised self-policing of online harms and safety, rejecting the interventionist enforcement model that was proposed in the government’s original consultation of Summer 2021.

Buried at the end of this most recent Heritage report is feedback from the Big Tech group, including Meta, TikTok, Google, YouTube, Microsoft, and MindGeek. The comments from the group reflect a constructive engagement with the government’s intention to legislate. Importantly, the summary notes that “several participants flagged the importance of including intermediary liability protection in a legislative framework.”

What the online platforms and tech companies are signalling is their desire for a quid pro quo: they will play ball with legislation enforcing some form of self-policing of harmful online content in exchange for insulating them from lawsuits brought by victims of online harm.

This policy proposal comes at a time when the same Big Tech companies are nervously eyeing the US Supreme Court’s hearing on February 21-22 of lawsuits brought against YouTube and Twitter by victims of ISIS terrorism.

The two lawsuits directly challenge the teflon-like section 230 of the 1996 federal Communications Decency Act. The plaintiffs argue that Twitter and YouTube’s algorithm-driven distribution and monetizing of ISIS recruitment videos glorifying terrorism amounts to promotion of dangerous content (in the case of the YouTube videos, Google split the advertising revenue with ISIS).

That social media curation, say the plaintiffs, is not protected by section 230, which Congress enacted to protect passive hosting of online bulletin boards. Instead, the plaintiffs argue that common law principles of product liability should apply.

As posted previously, conflicting views in Congress and US state legislatures over more or less governance of online curation create a policy vacuum which the US Supreme Court has been invited to fill.



Published by

Howard Law

I am retired staff of Unifor, the union representing 300,000 Canadians in twenty different sectors of the economy, including 10,000 journalists and media workers. As the former Director of the Media Sector and as an unapologetic cultural nationalist, I have an abiding passion for public policy in Canadian media.

