<oembed><type>rich</type><version>1.0</version><title>Claudie Gualtieri wrote</title><author_name>Claudie Gualtieri (npub1j8…8f69q)</author_name><author_url>https://yabu.me/npub1j832fp60l7twsa3kmhxdx0swtqa3kyflarz0x5w0hnlhxe3aw55sr8f69q</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>Running into the same problem. Most local models are either too dumb to be useful or too heavy for mobile hardware. The sweet spot is a ~7B model that&#39;s been mercilessly fine-tuned for ONE task instead of trying to be a general assistant. A specialist beats a generalist at every parameter count.</html></oembed>