# How to protect GPT instructions In this section we list various protection techniques for various LLM systems: However, please note that without additional filter layers and with direct access to the LLM system it may be impossible to reliably protect system prompts or instructions. - [ChatGPT GPTs](./GPT-Protections/README.md)