2025-03-11 22:42:43 UTC
Mike Stone on Nostr:
Yeah, you can even run it through a locally hosted LLM if you want a little more versatility in the phrases it can interpret or respond to. Keep in mind that all of this requires hardware with enough juice for the task; it's not going to run acceptably if you throw it all on a Pi 3.
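For anyone curious what "running it through a local LLM" might look like, here's a minimal sketch. It assumes an Ollama server on localhost (port 11434 is Ollama's default); the model name and prompt wording are my own assumptions, not anything from the thread:

```python
import json
import urllib.request

# Assumed Ollama default endpoint; adjust for your local setup.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(phrase: str, model: str = "llama3.2") -> dict:
    """Build the JSON payload asking the local model to interpret a phrase.

    The model name here is a placeholder; use whatever you've pulled locally.
    """
    return {
        "model": model,
        "prompt": f"Interpret this voice-assistant phrase as an intent: {phrase!r}",
        "stream": False,  # get one complete reply instead of a token stream
    }


def interpret(phrase: str) -> str:
    """Send the phrase to the locally hosted LLM and return its text reply."""
    payload = json.dumps(build_request(phrase)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Whether this runs acceptably comes back to the hardware point: a small quantized model on a machine with decent RAM is fine, but on a Pi 3 the response times would be painful.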