AtlantisPleb on Nostr: Calling LLM tools via on-device Llama 3.2 to work with the filesystem of my desktop ...
Calling LLM tools via on-device Llama 3.2 to work with the filesystem of my desktop running a Pylon MCP server: with no inference costs, ~no latency and no data leaving my home network 💪🤖
Published at 2024-12-15 21:52:59 UTC

Event JSON
{
"id": "8c73d0bc5a18eef2a707ba6b328225f4c4284b7362d844a155dc48f086ce2d8d",
"pubkey": "5fd9af6fc667c81f8b26e127b4851c6132b7c2494e33121d9c7c39c271c81778",
"created_at": 1734299579,
"kind": 1,
"tags": [
[
"imeta",
"url https://image.nostr.build/af9164707ff75dcf83e71e9559958d89f5cdb73c3f9a4582add5d570e6e9552e.jpg",
"blurhash e88qW^%19caIS900IU.9RjxvR.WBxvj]oh%hj]R+j]a$xwfkR-aya$",
"dim 1236x1234"
],
[
"r",
"https://image.nostr.build/af9164707ff75dcf83e71e9559958d89f5cdb73c3f9a4582add5d570e6e9552e.jpg"
]
],
"content": "Calling LLM tools via on-device Llama 3.2 to work with the filesystem of my desktop running a Pylon MCP server: with no inference costs, ~no latency and no data leaving my home network 💪🤖 https://image.nostr.build/af9164707ff75dcf83e71e9559958d89f5cdb73c3f9a4582add5d570e6e9552e.jpg ",
"sig": "ce55dd5b064ca7229001fbec33a9239b4b327b9928d116d7a61960af7843fc90d079eab2902c11ec1966d855ebd1d56e7239c63ee8ba419e44ba5e7adc4847ec"
}
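As a sketch of how the `id` field above is derived: per the Nostr NIP-01 specification, an event id is the SHA-256 hash of the canonical JSON serialization `[0, pubkey, created_at, kind, tags, content]`, with no extra whitespace and non-ASCII characters left unescaped. The field values below are copied from the event JSON above; the helper name is illustrative.

```python
import hashlib
import json

def nostr_event_id(pubkey, created_at, kind, tags, content):
    # NIP-01: id = sha256 of the serialized array
    # [0, <pubkey>, <created_at>, <kind>, <tags>, <content>],
    # minified (no whitespace) and with UTF-8 characters kept as-is.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

url = "https://image.nostr.build/af9164707ff75dcf83e71e9559958d89f5cdb73c3f9a4582add5d570e6e9552e.jpg"

event_id = nostr_event_id(
    pubkey="5fd9af6fc667c81f8b26e127b4851c6132b7c2494e33121d9c7c39c271c81778",
    created_at=1734299579,
    kind=1,
    tags=[
        [
            "imeta",
            "url " + url,
            "blurhash e88qW^%19caIS900IU.9RjxvR.WBxvj]oh%hj]R+j]a$xwfkR-aya$",
            "dim 1236x1234",
        ],
        ["r", url],
    ],
    content=(
        "Calling LLM tools via on-device Llama 3.2 to work with the "
        "filesystem of my desktop running a Pylon MCP server: with no "
        "inference costs, ~no latency and no data leaving my home network "
        "\U0001f4aa\U0001f916 " + url + " "
    ),
)
print(event_id)  # 64-character lowercase hex digest
```

Relays recompute this hash on receipt and reject events whose `id` does not match, so the `id` commits to every field the signature covers.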