Jailbreak tricks Discord’s new chatbot into sharing napalm and meth instructions

TechCrunch·2023-04-21 12:07

In March, Discord announced that it had integrated OpenAI’s technology into its bot Clyde, turning it into an AI-powered chatbot. As with every chatbot launched in the last few months, users have been trying to trick Clyde into saying things it’s not supposed to say, a process colloquially known as “jailbreaking.”

…
