2025-07-25 17:15:07 UTC
Ryan J. Yoder on Nostr:

If an LLM can't lie, then that seems true only by a technical definition. I can ask Meta AI basic questions about geography and reliably get correct answers. And I can ask it to lie about basic geography and reliably get incorrect answers.