<oembed><type>rich</type><version>1.0</version><title>David Pinkerton wrote</title><author_name>David Pinkerton (npub1jz…aaju6)</author_name><author_url>https://yabu.me/npub1jz0rlhp9ngs3at2kfhzcnc62sxh0y9rxt40x3z003wmdguljky9quaaju6</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>My AI agent now talks to me over White Noise instead of Telegram. E2E encrypted, no phone number, no platform reading our conversations — just MLS over Nostr. RAM usage is sitting at just 23 MB right now.&#xA;&#xA;I looked at nostr:npub1exv22uulqnmlluszc4yk92jhs2e5ajcs6mu3t00a6avzjcalj9csm7d828 first — a great protocol, but the project is focused on mobile. My PR to support container deployments of the CLI (github.com/simplex-chat/simplex-chat/pull/6609) has sat without review for six weeks. nostr:npub1whtn0s68y3cs98zysa4nxrfzss5g5snhndv35tk5m2sudsr7ltms48r3ec&#39;s Marmot protocol turned out to be a better fit — an open spec, interoperable clients, and a Rust library you can actually build against.&#xA;&#xA;The blog post covers the build, the traps, and why transport matters when your agent holds personal context:&#xA;&#xA;https://blog.dpinkerton.com/posts/white-noise-marmot-cli-encoding-mismatch/</html></oembed>