Founder and CEO of Children First Canada Sara Austin speaks during a ‘Time Is Up: Children and Families Take Over Parliament Hill to Demand Online Safety’ rally on Parliament Hill in Ottawa, on Monday, April 27, 2026. THE CANADIAN PRESS/Spencer Colby

Child advocates call for online harms bill covering AI chatbots, gaming

Apr 27, 2026 | 2:00 AM

OTTAWA —

Ottawa can’t afford to wait any longer to introduce new online harms legislation that covers AI chatbots and video games, children’s advocates and about a dozen kids told a press conference on Parliament Hill Monday.

They urged the government to move quickly to introduce its promised online harms bill.

“This is a David and Goliath battle — kids and parents up against a multi-billion dollar tech industry that is profiting off of harming our children,” Sara Austin, founder and CEO of Children First Canada, told reporters.

“We need our prime minister and every member of Parliament to work with a united perspective around what’s most important, which is keeping kids safe and acting with a sense of urgency, because kids’ lives are on the line.”

Austin said in an earlier interview that recent months have seen an escalation in the level of harm caused by the use of AI chatbots.

OpenAI banned the mass shooter in Tumbler Ridge, B.C., from using its ChatGPT chatbot due to what it called worrisome interactions, but did not alert law enforcement. The shooter got around the ban by using a second account.

Austin said the tragedy “could have potentially … been prevented had OpenAI acted sooner to disclose the risks to the police.”

One of the people who spoke at the press event Monday is Jason Sokolowski, whose 15-year-old daughter Penelope died in 2025 in connection with the terrorist group 764. Austin said Penelope’s grooming began on the online gaming platform Roblox.

Sokolowski said he wasn’t able to recognize Penelope “was being groomed and extorted on social media apps that were designed to addict her.”

“I wish she could be here to explain the horrors that she saw,” he added.

Politicians don’t understand how bad the problem is, Sokolowski said.

The federal government added 764 to its list of terror entities in December 2025. It described the group as a “decentralized transnational network of online nihilistic violent extremists.”

It said members of the group use social media and gaming platforms “to lure, groom, and extort youth to commit violent and sexual acts, including self-harm.”

Matt Richardson of the Canadian Open Source Intelligence Centre has said that in the course of his research into online spaces involving members of 764, he’s seen images of self-harm, initials and names of abusers carved into victims’ skin, animal abuse and even invitations to watch livestreamed suicide attempts.

“Many of our kids are spending extensive amounts of their daily lives on gaming platforms and they have proven to be unsafe,” Austin said.

She said there are multiple “gaming platforms that are risky for kids because they allow for chat features with kids to be able to communicate with strangers” who can pretend to be children.

Online safety advocate Carol Todd, whose daughter Amanda died by suicide in 2012 after she was targeted for online sextortion, also took part in the event in Ottawa.

She said Amanda’s story “shocked this country. But what should shock us even more is that more than a decade later, children are still being harmed in the same ways, on more powerful platforms with even less protection.”

Twelve-year-old Zachary Fathally said the government first promised to bring online harms legislation four years ago, when he was eight years old.

“We’ve been told to wait and to be patient, but we’ve waited far too long. Kids like me across Canada have waited over 1,700 days for protection. That’s four years and nearly a third of my life,” he said. “And while we’ve been waiting, kids have been getting hurt.”

Children First Canada said in a press release it was leading Monday’s event on Parliament Hill, with support from a coalition that includes medical organizations, youth and parents.

In addition to AI chatbots and gaming, the group wants the online harms bill to cover social media. It says legislation must include a duty of care for platforms requiring them to prevent foreseeable harm and introduce safety by design, and a “strong, independent regulator with enforcement power.”

The Liberal government previously introduced the online harms bill C-63 but it did not become law before last year’s federal election was called.

After initially signalling it would not bring the bill back in the same form, but would instead tackle aspects of it in other legislation, the government changed course and Culture Minister Marc Miller is now taking the lead on a new bill.

Miller has reconvened an expert group the government previously consulted. The group is expected to consider multiple questions, including whether the legislation should cover AI chatbots and if it should restrict social media access for kids and teenagers.

AI chatbot safety and social media bans for children have emerged as global political issues since the earlier version of the bill was introduced.

Austin said the government has had plenty of time to prepare and needs to take action now.

“We continue to hear from key leaders in government that they are taking their time to get this right. And I appreciate the sentiment behind that, because they’ve had a couple of false starts with online safety legislation before,” she said.

But the government doesn’t have to “recreate the wheel here,” she said, noting Canada can follow the lead of others, including the United Kingdom, the European Union and Australia.

This report by The Canadian Press was first published April 27, 2026.

— With files from Erika Morris in Montreal

Anja Karadeglija, The Canadian Press