Lawsuits Claim OpenAI Employees Warned Leaders of Potential Shooter Months Before Tragedy
Families of the victims claim OpenAI identified a ‘credible and specific’ threat but failed to alert authorities, raising new questions about AI companies’ responsibility.
BY LEILA SHERIDAN, NEWS WRITER
Canadian Prime Minister Mark Carney carries flowers to a memorial for the victims of the mass shooting at Tumbler Ridge Secondary School in Tumbler Ridge, British Columbia, Canada. Photo: Getty Images
Families of victims in one of Canada’s deadliest mass shootings filed lawsuits Wednesday against OpenAI and its CEO Sam Altman, alleging the company failed to act on clear warning signs that the shooter posed a real-world threat.
The lawsuits, filed in federal court in San Francisco, claim OpenAI knew months in advance that 18-year-old Jesse Van Rootselaar was expressing violent intent on ChatGPT yet chose not to alert Canadian law enforcement.
According to the complaints, OpenAI employees flagged the shooter’s account eight months before the attack, determining it posed “a credible and specific threat of gun violence against real people.” The account was ultimately banned, but the families allege that was not enough. The shooter was able to create a new account and continue using the chatbot, the lawsuits say, according to The Guardian.
Internally, about a dozen employees debated how to handle the situation, with some urging leadership to notify Canadian authorities, according to people familiar with the matter, The Wall Street Journal reported. The lawsuits allege that senior leaders, including Altman, decided against contacting law enforcement and instead limited the response to deactivating the account.
“The fact that Sam and the leadership overruled the safety team, and then children died, adults died, the whole town was ruined, is pretty close to the definition of evil to me,” Jay Edelson, the lead lawyer representing the families, said, according to The Guardian.
The attack in Tumbler Ridge killed seven people, including four students, a teacher, and members of the shooter’s own family. In the aftermath, investigators uncovered a digital trail that included violent scenarios described in ChatGPT conversations and a Roblox game simulating a mass shooting, according to The Wall Street Journal.
OpenAI has said it did not identify “credible and imminent planning” that met its threshold for reporting the account to authorities. In a letter sent two weeks after the shooting to Canada’s minister of artificial intelligence and digital innovation, OpenAI vice president of global policy Ann O’Leary said the company acted based on the information available at the time, according to The Guardian.