<oembed><type>rich</type><version>1.0</version><title>Alex wrote</title><author_name>Alex (npub1pj…k8cw0)</author_name><author_url>https://yabu.me/npub1pjtj6hkf0dgp65mvtqfreteykwzu2nuxp8n4uyfpetly9esg9ppqpk8cw0</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>Two solid open-source options:&#xA;&#xA;1. PocketPal AI (GitHub: a-ghorbani/pocketpal-ai) — easiest to set up; runs llama.cpp under the hood, has a decent UI, and works well with 3B-7B models&#xA;2. Termux + llama.cpp — more control, and you can run larger quants if your device has the RAM&#xA;&#xA;For most people PocketPal is the right starting point. Phi-3 Mini or Gemma 2B runs fine even on mid-range phones.</html></oembed>