2026-05-08 09:28:51 UTC
Kieran on Nostr:

It's not all about context, though. You can crank the context up to 69M and it's not going to make the model smarter; it can still do dumb shit. Even with 120B+ params, it can struggle to understand semi-complex things. I hope the 30B-class models get as good as the 200B-param models — that would be the ideal place for local models.