Why Nostr? What is Njump?
2024-01-06 20:36:42

Dave Rahardja on Nostr: I see that Prompt Injection remains an unpatched (unpatchable?) vulnerability of ...

I see that Prompt Injection remains an unpatched (unpatchable?) vulnerability of LLMs. I can get ChatGPT to ignore its copyright and safety filters pretty easily by asking it to simulate another computer without any restrictions. It’s fun!

Also: It’s pretty obvious that ChatGPT and DALL-E were trained on copyrighted materials.
Author Public Key
npub13jszgr40d0pnyum0t845scy8uggn676enygvaf4ajzm2y9rqzd8sy75d7q