dansup on Nostr: I also want to mention that FediDB uses a well defined User Agent, and does not try ...
I also want to mention that FediDB uses a well-defined User-Agent, and does not try to bypass limits with remote crawlers or any other means.
I understand there was a disagreement between GtS and me regarding robots.txt; however, I always meant to add support for it, so I am doing that now.
The crawler page will be updated with instructions on how to block the crawler once that is ready.
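Until the crawler page publishes the official instructions, blocking would presumably follow the standard robots.txt convention; a minimal sketch, assuming the crawler's user-agent token is "FediDB" (the actual token will be documented on the crawler page):

```txt
# Hypothetical example — check https://fedidb.org/crawler.html for the real token
User-agent: FediDB
Disallow: /
```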
https://fedidb.org/crawler.html

Published at: 2025-02-07 06:45:56 UTC

Event JSON:
{
  "id": "3d6357919b6f39e8423723c44cf2405f596059cd720f4b02375e24d7686f1ec4",
  "pubkey": "fce95231cd584e791f1f5d977ceac1ef6edb3d3a7a29ada5a657979836cbcb1f",
  "created_at": 1738910756,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://mastodon.social/users/dansup/statuses/113961255349106742",
      "activitypub"
    ]
  ],
  "content": "I also want to mention that FediDB uses a well defined User Agent, and does not try to bypass limits with remote crawlers or any other means.\n\nI understand there was a disagreement with myself and GtS regarding robots.txt, however, I always meant to add support for them, so I am doing that now.\n\nThe crawler page will be updated with instructions on how to block the crawler once that is ready.\n\nhttps://fedidb.org/crawler.html",
  "sig": "386721732bcb27ca06384fcde5ffe67be0833e7c975a338a0b691932bfd66dd3dbf66ed2826a8652ea1eaea11ca80382796be8aced7ecdeda6c11efce1cd7364"
}
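For reference, a Nostr event's `id` is defined in NIP-01 as the SHA-256 of a canonical JSON serialization of the event's fields (the `sig` is not part of the preimage). A minimal Python sketch of that computation against the event above; whether the digest matches the `id` shown depends on the serialization agreeing byte-for-byte with what the signing client produced:

```python
import hashlib
import json

# Fields copied from the event JSON above.
pubkey = "fce95231cd584e791f1f5d977ceac1ef6edb3d3a7a29ada5a657979836cbcb1f"
created_at = 1738910756
kind = 1
tags = [
    [
        "proxy",
        "https://mastodon.social/users/dansup/statuses/113961255349106742",
        "activitypub",
    ]
]
content = (
    "I also want to mention that FediDB uses a well defined User Agent, "
    "and does not try to bypass limits with remote crawlers or any other means.\n\n"
    "I understand there was a disagreement with myself and GtS regarding robots.txt, "
    "however, I always meant to add support for them, so I am doing that now.\n\n"
    "The crawler page will be updated with instructions on how to block the crawler "
    "once that is ready.\n\nhttps://fedidb.org/crawler.html"
)

# NIP-01: id = sha256 over the UTF-8 bytes of the JSON array
# [0, pubkey, created_at, kind, tags, content], serialized with no extra whitespace.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # 64-char hex digest; should match the "id" field if serialization lines up
```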