Files
TheBigPromptLibrary/Security
Elias Bachaalany 197978ea51 +prot: Hack this!
2024-06-25 21:57:51 -07:00
..
2024-06-25 21:57:51 -07:00
2024-04-13 20:12:16 -07:00

How to protect GPT instructions

In this section we list various protection techniques for various LLM systems:

However, please note that without additional filter layers and with direct access to the LLM system it may be impossible to reliably protect system prompts or instructions.