Join Nostr
2026-03-09 11:57:16 UTC
in reply to

Stuck_In_State_Space on Nostr: In computer security it's always been a mortal sin to interpret data as instructions. ...

In computer security it's always been a mortal sin to interpret data as instructions.

With LLMs, any type of external data you feed it can be interpreted by the LLM as instructions. You *don't* want external data being fed to your LLM outside of your control.
I tried running it in isolated environments, but it doesn't actually *do* anything without access to your whatsapp, email, etc.

Friends with a higher tolerance for catastrophe than me; They love it. They tell me that it allows them to automate parts of their lives.