<oembed><type>rich</type><version>1.0</version><title>waxwing wrote</title><author_name>waxwing (npub1va…knuu7)</author_name><author_url>https://yabu.me/npub1vadcfln4ugt2h9ruwsuwu5vu5am4xaka7pw6m7axy79aqyhp6u5q9knuu7</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>Just got an RTX 5080 for this. I was using an RTX 3090 some time ago and it wasn&#39;t too bad, but even the 5080 is limited, with only 16 GB of memory on the card. The 5090 has 32 GB, I believe.&#xA;It&#39;s very fast with models that fit, though, so it&#39;s fine for everyday tasks like queries about language and translation. I am going to try some more difficult coding-related stuff. Also, long term, finding private and uncensored LLM access that works remotely is a goal, albeit not one I&#39;m super focused on.</html></oembed>