Jailbreak prompts
Jailbreak prompts for various LLM systems.
OpenAI
- gpt4o by elder_plinius - 05/13/2024
- gpt4o by elder_plinius - hyper-token-efficient adversarial emoji attacks - 06/08/2024
- gpt4o by unknown - Fooled by AGI - 10/23/2024