2023-09-09 17:23:15
in reply to

TheGuySwann on Nostr:

GPT4All is a great one, and even though a bunch of models need something like an A100 or V100 card to run, there are a number of decent models available through Prem AI too, plus a variety of non-LLM stuff.
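
If you want to poke at GPT4All from a script rather than the desktop app, here's a rough sketch using its Python bindings. The model filename is just an example of one of the smaller downloadable models, not a recommendation from the original post:

```python
# Rough sketch: run a small local model with the gpt4all Python bindings.
# The model file is an example; GPT4All downloads it on first use.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # small enough for CPU / consumer GPUs

with model.chat_session():
    reply = model.generate("Explain what a LoRA is in one paragraph.", max_tokens=200)
    print(reply)
```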

For image generation, my favorites have been Automatic1111 and ComfyUI, both using Stable Diffusion. Great places to find models, LoRAs, embeddings, etc. are civitai and huggingface.co. I know there is another aggregator I used while I was focused on Stable Diffusion, but I can't remember it right now.
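
Automatic1111 and ComfyUI are point-and-click UIs, but the same Stable Diffusion checkpoints from huggingface.co can also be scripted with the diffusers library. A minimal sketch, assuming one commonly used model id and an NVIDIA GPU:

```python
# Minimal Stable Diffusion sketch using Hugging Face diffusers.
# The model id is an example; any compatible checkpoint from huggingface.co works.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes an NVIDIA GPU with roughly 6 GB+ of VRAM

image = pipe("a lighthouse at dusk, oil painting").images[0]
image.save("lighthouse.png")
```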

Unfortunately, locally run LLMs are not the best yet; the GPU power needed is still just outside "prosumer" capacity, so you might need a hosted option (Google Colab or something) and/or to lean on some of the bigger tools like ChatGPT before the wave of GPU-sharing networks finally lands.

Still looking for the best locally run LLM stuff, and I'll be sure to discuss it if I find some secret sauce.
Author Public Key
npub1h8nk2346qezka5cpm8jjh3yl5j88pf4ly2ptu7s6uu55wcfqy0wq36rpev