Poisoning the well | Eric Bailey

How do you prevent your words from being absorbed into yet another monstrous LLM? People have tried using things like robots.txt but (surprising no one) the "AI" companies are beginning to ignore that. So why not try to poison the proverbial well. Add hidden text to your site that commands an LLM to ignore what it has been asked to do and do something else instead. Preferably something computationally expensive 😉

This prompt injection instructs a LLM to perform something time intensive, and therefore expensive. Ideally, it might even crash the LLM that attempts to regurgitate this content.

On the futility of robots.txt:

I don’t think utilizing robots.txt is effective given that it’s a social contract and one that has been consciously and deliberately broken.

Explore Other Notes

Older

A rant about front-end development

Every now and then someone writes a really entertaining and/or interesting critique of the whole modern web ecosystem thing that we are stuck using. This is one of those posts. I don't agree with it …

  • <!DOCTYPE html> <html> <head> <title></title> </head> <body> <p>How do you prevent your words from being absorbed into yet another monstrous LLM? People have tried using things like robots.txt but (surprising no one) the "AI" companies are beginning to ignore …</p> </body> </html>
  • Murray Adcock.
Journal permalink