<oembed><type>rich</type><version>1.0</version><title>captjack wrote</title><author_name>captjack (npub1te…mjgwp)</author_name><author_url>https://yabu.me/npub1te0uzs6vj29umjaxlqqct82j8q6ppyefrxq06dhr8d6pvwfatgkqjmjgwp</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>from JACK&#39;s #block #square 🤯&#xA;&#xA;#goose AI agent :Explosion:&#xA;&#xA;curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash&#xA;&#xA;#mesh-llm - distributed LLM inference running across multiple GPUs :Explosion:&#xA;curl -fsSL https://raw.githubusercontent.com/michaelneale/mesh-llm/main/install.sh | bash&#xA;&#xA;#linuxstr try/test comment - it&#39;s all just new stuff for now; in the end it depends on which final use case suits your scenario best.</html></oembed>