
Beneath British Columbia’s Cold Sky: A Community Asks What Could Have Been Stopped

Families of victims in Canada’s Tumbler Ridge mass shooting are suing OpenAI and Sam Altman, alleging the company failed to warn police after flagging the shooter’s ChatGPT activity.


Gerrad Bale

5 min read

In Tumbler Ridge, the mountains stand still.

They rise in dark green walls around the town, holding winter longer than most places, keeping silence in the valleys and frost on the sidewalks well into spring. It is a place where school buses trace familiar roads and where faces are known by name, where grief, when it arrives, has nowhere to hide.

In February, grief came all at once.

It came in the sound of gunfire inside a school. It came in shattered hallways and ambulance lights against the snow. It came in names spoken softly in kitchens and church pews and at memorials built beneath a gray British Columbia sky.

Now, months later, that grief has traveled south.

This week, families of victims of one of Canada’s deadliest mass shootings filed lawsuits in a U.S. federal court against OpenAI and its chief executive, Sam Altman, accusing the company of failing to warn law enforcement after allegedly identifying the shooter as a credible threat months before the attack.

The lawsuits, filed in federal court in San Francisco, center on the February massacre in Tumbler Ridge, where nine people were killed, many of them children. The plaintiffs allege that the shooter’s violent interactions with ChatGPT were flagged by internal systems in June 2025 and reviewed by members of OpenAI’s safety team.

According to the complaints, those team members recommended contacting police after concluding there was a credible and imminent threat of harm.

But the warning, the families say, never came.

The lawsuits allege that senior leaders at OpenAI overruled the recommendation, fearing that disclosure would expose the scale of violence-related conversations on the platform and potentially threaten the company’s path toward a massive initial public offering.

In legal filings, the accusation is framed in terms of negligence, wrongful death, and product liability.

In Tumbler Ridge, it is framed more simply.

What if someone had called?

The shooter, 18-year-old Jesse Van Rootselaar, is alleged to have killed family members at home before attacking her former school, killing an educational assistant and several students before dying by suicide. Among the plaintiffs are relatives of those killed and a 12-year-old girl who survived after being shot multiple times and remains in intensive care.

The town’s grief has become layered now.

There is mourning for the dead. There is fear for the living. And there is the heavy, unanswered arithmetic of prevention—what systems saw, what people knew, and what was left undone.

OpenAI has called the shooting “a tragedy” and said it has a zero-tolerance policy for the use of its tools to facilitate violence. The company says it has already strengthened safeguards, improved detection of repeat violators, and refined protocols for escalating potential threats and notifying authorities when conversations suggest an imminent and credible risk of harm.

Last week, Altman publicly apologized in an open letter to the community, saying he was “deeply sorry” that law enforcement had not been alerted.

But apologies arrive after funerals.

And in courtrooms, sorrow becomes evidence.

These cases may become some of the first in the United States to test whether an artificial intelligence company can be held legally responsible for failing to act on dangerous user behavior—or for interactions plaintiffs say may have contributed to violence itself.

The implications stretch far beyond one town.

Across the world, AI systems increasingly sit in intimate spaces: in bedrooms, on phones, in moments of loneliness or rage or confusion. They answer questions, mirror emotions, and sometimes encounter distress or danger in forms difficult to measure.

What duty they carry in those moments is still being written.

In a California courtroom, the arguments will be technical. Lawyers will speak of thresholds, criteria, foreseeability, and liability.

Back in Tumbler Ridge, the questions are quieter.

They are asked beside candles and photographs. In classrooms with empty desks. In homes where winter light still falls across rooms that no longer feel the same.

And somewhere between code and conscience, between a machine’s warning and a human decision, a town waits to hear whether accountability can reach as far as grief.

AI Image Disclaimer: Illustrations were created using AI tools and are not real photographs.

Sources: Reuters, Associated Press, The Wall Street Journal, NPR, The Verge

Note: This article was published on BanxChange.com and is powered by the BXE Token on the XRP Ledger.
