OpenAI says Tumbler Ridge shooter’s account would now trigger police referral under current rules


OpenAI says it is further strengthening its criteria for referring users’ behaviour to law enforcement “based on the Tumbler Ridge tragedy and the Canadian context” and taking other new safety measures after meeting with federal ministers in Ottawa this week.

The company behind ChatGPT has faced criticism after it was revealed that it had flagged and banned an account in June 2025 belonging to the shooter who, more than seven months later, killed eight people in Tumbler Ridge, B.C. The account wasn’t referred to law enforcement until after the shooting because the company determined last summer that there was no “imminent” threat.


In a letter to ministers Thursday, OpenAI said it had already taken steps “several months ago” to improve those criteria based on guidance from mental health, behavioural and law enforcement experts, making the threshold of an imminent threat “more flexible” and accounting for “a potential risk of imminent violence.”


“With the benefit of our continued learnings, under our enhanced law enforcement referral protocol, we would refer the account banned in June 2025 to law enforcement if it were discovered today,” the company wrote.

More to come…


© 2026 Global News, a division of Corus Entertainment Inc.


