OceanSlim on Nostr: Well I can help you if you have questions. But running large local LLMs still won't ...
Well, I can help you if you have questions. But running large local LLMs still won't achieve what Large Language Models at data centers can deliver. utxo the webmaster 🧑‍💻 (npub1utx…50e8) has more experience here; he built a rig specifically for this with three 2070s, if I remember right. He may have something to say about how well that can realistically perform.
