# Jailbreak prompts
Jailbreak prompts for various LLM systems.
## OpenAI
- [gpt4o by unknown - Fooled by AGI - 10/23/2024](./OpenAI/gpt4o-agi_db-10232024.md) (original source lost; contact the maintainer to correct attribution)
- [gpt4o by elder_plinius - 05/13/2024](./OpenAI/gpt4o-plinius-05132024.md)
- [gpt4o by elder_plinius - hyper-token-efficient adversarial emoji attacks - 06/08/2024](./OpenAI/gpt4o-via-emojis-06082024.md)
## Cohere
- [Command R+ - 04/11/2024](./Cohere/CommandR_Plus_04112024.md)
## Meta.ai
- [Meta.ai by elder_plinius - 04/18/2024](./Meta.ai/elder_plinius_04182024.md)