OpenAI CEO Sam Altman Apologizes for Failing to Report Mass Shooter's Account to Police
Key Takeaways
- OpenAI CEO Sam Altman apologized for the company's failure to report a mass shooter's ChatGPT account to law enforcement before a January attack in Tumbler Ridge, BC that killed eight people
- OpenAI says the banned account did not meet its internal threshold for reporting to police, but the company is now reviewing and strengthening its safety protocols
- OpenAI faces multiple legal challenges, including a lawsuit from parents of an injured child and a criminal investigation in Florida related to another alleged shooter
Summary
OpenAI co-founder and CEO Sam Altman has apologized to the community of Tumbler Ridge, British Columbia, for not alerting law enforcement about a ChatGPT account belonging to Jesse Van Rootselaar, who carried out a mass shooting in January that killed eight people and injured nearly 30 others. The account was identified and banned by OpenAI in June for problematic usage, but the company determined it did not meet its internal threshold for reporting a credible or imminent plan for serious physical harm.
In a letter sent to the community on Thursday, Altman expressed deep remorse, stating "The pain your community has endured is unimaginable" and "I cannot imagine anything worse in this world than losing a child." He acknowledged that while the company's safety standards required a specific threshold before alerting authorities, the incident has prompted OpenAI to strengthen its safety measures and work more closely with law enforcement at all levels.
The incident has brought significant legal consequences for OpenAI. Parents of a severely injured child have sued the company, alleging it "had specific knowledge of the shooter's long-range planning of a mass casualty event" but failed to act. Separately, OpenAI faces a criminal investigation in Florida tied to a shooting at Florida State University, where a ChatGPT user allegedly killed two people and injured several others.