Online Harms Act puts tech companies on notice

Debates about regulating online content and the safety of children can get… heated, to say the least. So before the yelling starts, let’s take a look at what tech companies will actually be expected to do under the Online Harms Act.

What happened: The federal government introduced Bill C-63, which would make tech platforms responsible for content that sexually victimizes children, bullies them, or encourages them to harm themselves; intimate images shared without consent; and content that incites hatred, violence, or terrorism.

  • Online services would have to keep digital safety plans outlining what measures they are taking to limit harmful content on their platforms and how effective they’ve been.
     
  • Companies would need an internal point of contact to handle user complaints, as well as tools that let users flag harmful content and block other users.
     
  • Platforms would also have to make two categories of content completely inaccessible: anything that sexually victimizes a child and intimate content posted without consent, which specifically includes deepfakes.
     
  • If content in either of those categories is posted, companies would have 24 hours from being made aware of it to take it down.

Why it matters: Platforms have struggled to curb much of the content covered by the act, but a new Digital Safety Commission would make them either tackle the problem or pay up. Companies could be fined up to $10 million for not meeting their obligations, rising to $25 million for repeat offenders.

  • Services would have a statutory duty to protect children online. Exactly what that means would be set by the new commission, but the government pointed to measures like parental controls, warning labels, and having safe search on by default.
     
  • The bill would only apply to platforms with a minimum number of users. That threshold hasn’t been set, but the government pointed to Facebook, Twitch, and PornHub as examples of companies that would fall within its scope.

Yes, but: The law would not cover direct and encrypted messages between users, which the government said was to protect freedom of speech.