Florida probes ChatGPT role in mass shooting. OpenAI says bot "not responsible." - Ars Technica

Overview

Can ChatGPT be blamed for a mass shooting? Florida is investigating.

Details

OpenAI now faces a criminal probe after ChatGPT advised a gunman ahead of a mass shooting at a university in Florida, where two people were killed and six were wounded last year.

In a press release, Florida Attorney General James Uthmeier confirmed that the investigation into OpenAI’s potential criminal liability was launched after reviewing shocking chat logs between ChatGPT and an account linked to the suspected gunman, Phoenix Ikner.

The 20-year-old Florida State University student is currently awaiting trial “on multiple charges of murder and attempted murder,” Politico reported. At a press conference, Uthmeier revealed that the logs showed that ChatGPT provided “significant advice” before Ikner allegedly “committed such heinous crimes.” The attorney general emphasized that under Florida’s aiding and abetting laws, “if ChatGPT were a person,” it too “would be facing charges for murder.”

For OpenAI, the probe will test whether the company can be held criminally liable for ChatGPT’s outputs. In a statement provided to Ars, OpenAI’s spokesperson, Kate Waters, said that the company expects the answer to that question will be no.

“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” Waters said.

But Uthmeier is not so sure, and that’s why, he argues, Florida must urgently investigate. At the press conference, he noted that law enforcement is “venturing into uncharted territory” in attempting to monitor criminal activity connected to AI tools. Uthmeier said that mounting chatbot-linked public safety risks—including suicide, child sexual abuse materials, fraud, and murder—must be thoroughly probed so that the public knows definitively whether firms like OpenAI are liable for harms their products allegedly cause.

“Florida is leading the way in cracking down on AI’s use in criminal behavior,” Uthmeier said in the press release. “This criminal investigation will determine whether OpenAI bears criminal responsibility for ChatGPT’s actions in the shooting at Florida State University last year.”

Uthmeier told press that ChatGPT advised the suspected shooter on what type of gun to use, the ammunition he should get, and whether a gun would be useful at short range. These facts would likely be easy to find online if a person were so motivated, but Uthmeier suggested that ChatGPT played a role that went deeper than the average browser search might go.

Troublingly, the chatbot also advised what time of day the most people would be on campus and where exactly on campus the gunman might find larger groups of students gathered. Florida officials appear to view those insights as evidence of how AI can almost instantly combine public data in novel ways with harmful, wide-reaching impacts that firms like OpenAI should be detecting and mitigating.

To protect the public, Uthmeier issued subpoenas requesting more information, including a wide range of OpenAI’s policies and internal training materials. Demanding transparency, he’s intent on figuring out how ChatGPT is designed to navigate harmful use cases. Specifically, he wants to know when OpenAI decides to report “possible past, present and future crimes” planned using ChatGPT, the press release said.

Uthmeier stressed that he understood that ChatGPT is not a person and cannot be charged with aiding and abetting. But he said that OpenAI could be liable if the company was aware that such “dangerous behavior might take place” and failed to intervene. That’s why he has asked for organization charts outlining key leadership. He’s determined to find out “who knew what, designed what, or should have known what” was happening when bad actors attempt to plan crimes like the FSU shooting using ChatGPT.

If Florida officials discover that OpenAI leadership knew of criminal activity and prioritized profits over public safety, “then people need to be held accountable,” Uthmeier said.

“I’m a big believer in limited government,” Uthmeier said. “I believe government should only interfere in business activities when you have significant harm to our people. This is that.”

Waters told Ars that OpenAI continues to cooperate with the authorities who are investigating the mass shooting and early on “identified a ChatGPT account believed to be associated with the suspect and proactively shared this information with law enforcement.”

The company maintains that ChatGPT did nothing more than surface information already accessible online and, therefore, cannot be blamed for assisting the suspected gunman. As OpenAI tells it, unlike in lawsuits accusing ChatGPT of encouraging suicide and murder, the chatbot in this case did not urge the gunman to take any illegal or harmful actions.

“In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the Internet, and it did not encourage or promote illegal or harmful activity,” Waters said.

However, Uthmeier said at the press conference that OpenAI had committed to taking additional steps that may limit ChatGPT’s potential to be used to advise a mass shooting.

“Now OpenAI has indicated that they believe improvements and changes need to be made,” Uthmeier said. “I hope they’re right. I hope they’re right. We cannot have AI bots that are advising people on how to kill others.”

Waters did not comment on any updates to ChatGPT since the shooting, instead seeming to emphasize that the gunman’s use of ChatGPT was not typical.

“ChatGPT is a general-purpose tool used by hundreds of millions of people every day for legitimate purposes,” Waters said. “We work continuously to strengthen our safeguards to detect harmful intent, limit misuse, and respond appropriately when safety risks arise.”

