Join Nostr
2026-01-19 17:03:32 UTC
in reply to

Tom Casavant on Nostr: Unfortunately, as nearly everyone knows, *every* LLM is susceptible to prompt ...

Unfortunately, as nearly everyone knows, *every* LLM is susceptible to prompt injection.
Some people predict that prompt injection will *always* be a problem for LLMs. And if I can tell your LLM to do what I want it to do, suddenly your exposed 'search' API endpoint is *incredibly* valuable to me.

Which is why I propose that the mere existence of a public facing LLM on your site is incredibly dangerous [to you and your site].