2024-11-02 00:52:04 UTC
Jeff Triplett on Nostr:

64 GB is going to get you the best of today. 128 GB is hard to justify, but it might give you a bit more runway if model sizes change. I'm not even sure how to predict that.
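As a rough way to think about the 64 GB vs. 128 GB question, here's a back-of-envelope sketch of how much RAM a model needs. The formula and the 1.2× overhead factor are my own illustrative assumptions (bytes per parameter depends on quantization: roughly 2 for fp16, ~0.5 for 4-bit), not an exact rule.

```python
def approx_ram_gb(params_billion, bytes_per_param, overhead=1.2):
    """Rough RAM estimate for running a model locally.

    params_billion: parameter count in billions (1B params at 1 byte ~ 1 GB)
    bytes_per_param: ~2.0 for fp16, ~0.5 for 4-bit quantization
    overhead: illustrative fudge factor for KV cache, activations, etc.
    """
    return params_billion * bytes_per_param * overhead

# An 11B model at fp16 fits comfortably in 64 GB:
print(approx_ram_gb(11, 2.0))   # ~26.4 GB

# A 70B model needs 4-bit quantization to squeeze into 64 GB:
print(approx_ram_gb(70, 0.5))   # ~42.0 GB
```

By this estimate, 64 GB covers today's consumer-sized models with room to spare; 128 GB only starts to matter if you want to run much larger models unquantized.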

The latest Llama 3.2 models are fairly reasonably sized (1B to 11B for consumers): https://ollama.com/library/llama3.2

LLM plus the llm-ollama plugin is a pretty nice combo, along with the many other projects Simon writes about. Ollama can run Hugging Face models too.
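For anyone curious what that combo looks like in practice, here's a rough setup sketch. The model tag and prompt are just examples; this assumes you already have Ollama installed and running.

```shell
# Install Simon Willison's LLM CLI, then the llm-ollama plugin
pip install llm
llm install llm-ollama

# Pull a Llama 3.2 model with Ollama, then prompt it through LLM
ollama pull llama3.2
llm -m llama3.2 "Summarize this in one sentence: ..."
```

The nice part of this setup is that LLM gives you one interface (including prompt logging) across local Ollama models and hosted APIs.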