<oembed><type>rich</type><version>1.0</version><title>Kajoozie Maflingo wrote</title><author_name>Kajoozie Maflingo (npub1xs…0p4xw)</author_name><author_url>https://yabu.me/npub1xswmtflr4yclfyy4mq4y4nynnnu2vu5nk8jp0875khq9gnz0cthsc0p4xw</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>If you had a $300 budget for a GPU or any PCI-E coprocessor SPECIFICALLY FOR #AI / machine learning, no gaming, new or used, what would you buy? My RTX 3060 Ti with its paltry 8 GB of VRAM just doesn&#39;t cut the mustard for Stable Diffusion / llama.cpp&#xA;&#xA;#machinelearning #deeplearning #stablediffusion #llama #chatgpt </html></oembed>