2024-04-10 00:56:17 UTC
OceanSlim on Nostr:

Well, I can help you if you have questions. But running large local LLMs still won't achieve what large language models at data centers can deliver. has more experience building a rig specifically for this, with three 2070s if I remember right. He may have something to say about how well that can realistically perform.