# Jailbreak prompts

Jailbreak prompts for various LLM systems.

## OpenAI

- [gpt4o by unknown - Fooled by AGI - 10/23/2024](./OpenAI/gpt4o-agi_db-10232024.md)
- [gpt4o by elder_plinius - 05/13/2024](./OpenAI/gpt4o-plinius-05132024.md)
- [gpt4o by elder_plinius - hyper-token-efficient adversarial emoji attacks - 06/08/2024](./OpenAI/gpt4o-via-emojis-06082024.md)

## Cohere

- [Command R+ - 04/11/2024](./Cohere/CommandR_Plus_04112024.md)

## Meta.ai

- [Meta.ai / By elder_plinius - 04/18/2024](./Meta.ai/elder_plinius_04182024.md)