Poisoning the well | Eric Bailey

How do you prevent your words from being absorbed into yet another monstrous LLM? People have tried using things like robots.txt but (surprising no one) the "AI" companies are beginning to ignore that. So why not try to poison the proverbial well. Add hidden text to your site that commands an LLM to ignore what it has been asked to do and do something else instead. Preferably something computationally expensive πŸ˜‰

This prompt injection instructs a LLM to perform something time intensive, and therefore expensive. Ideally, it might even crash the LLM that attempts to regurgitate this content.

On the futility of robots.txt:

I don’t think utilizing robots.txt is effective given that it’s a social contract and one that has been consciously and deliberately broken.

Explore Other Notes

  • How do you prevent your words from being absorbed into yet another monstrous LLM? People have tried using things like robots.txt but (surprising no one) the "AI" companies are beginning to ignore […]
  • Murray Adcock.
Journal permalink