My AI agent now talks to me over White Noise instead of Telegram. E2E encrypted, no phone number, no platform reading our conversations: just MLS over Nostr. RAM usage is sitting at only 23 MB right now.
I looked at SimpleX (npub1exv…d828) first. It's a great protocol, but the project is focused on mobile; my PR to support container deployments of the CLI (github.com/simplex-chat/simplex-chat/pull/6609) has sat without review for six weeks. The Marmot protocol behind White Noise (npub1wht…r3ec) turned out to be a better fit: an open spec, interoperable clients, and a Rust library you can actually build against.
The blog post covers the build, the traps, and why transport matters when your agent holds personal context:
https://blog.dpinkerton.com/posts/white-noise-marmot-cli-encoding-mismatch/
