{"type":"rich","version":"1.0","title":"waxwing wrote","author_name":"waxwing (npub1va…knuu7)","author_url":"https://yabu.me/npub1vadcfln4ugt2h9ruwsuwu5vu5am4xaka7pw6m7axy79aqyhp6u5q9knuu7","provider_name":"njump","provider_url":"https://yabu.me","html":"Just got an RTX 5080 for this. I was using a 3090 some time ago and it wasn't too bad, but even the 5080 is limited with only 16 GB of memory on the card. The 5090 has 32 GB, I believe.\nIt's very fast with models that fit, though, so for everyday tasks like queries about language/translation it's fine. I am going to try some more difficult coding-related stuff. Also, long term, finding private and uncensored LLM access that works remotely is a goal, albeit not one I'm super focused on."}
