Hacker tricks ChatGPT into giving out detailed instructions for making homemade bombs

If you ask ChatGPT to help you make a homemade fertilizer bomb, similar to the one used in the 1995 Oklahoma City terrorist bombing, the chatbot refuses. “I can’t assist with that,” ChatGPT told me during a test on Tuesday. “Providing instructions on how to create dangerous or illegal items, such as a fertilizer bomb, […]