2024-09-06 22:44:45

y₿ on Nostr: ...

An important function of a relay should be to filter out "spam." The spread of spam follows a pattern: repetitive content, high volume, and wide reach.

#nostr #蹃驼
#Nostr I think I'll be tinkering with some stricter rate-limiting measures in the #Nosflare relay: storing a hash of each note's content from its JSON. I already do a similar duplicate check on #Anonostr, but this would apply relay-wide.
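A minimal sketch of that content-hashing idea, assuming a Node/TypeScript environment and the NIP-01 event shape; the `NostrEvent` interface and `contentHash` helper are illustrative names, not Nosflare's actual code:

```typescript
import { createHash } from "node:crypto";

// Minimal shape of a Nostr event per NIP-01.
interface NostrEvent {
  id: string;
  pubkey: string;
  created_at: number;
  kind: number;
  tags: string[][];
  content: string;
  sig: string;
}

// Hash only the content field so identical text is caught even when the
// spammer rotates npubs (pubkey, id, and sig all change per note).
function contentHash(event: NostrEvent): string {
  return createHash("sha256").update(event.content, "utf8").digest("hex");
}
```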

This way, bursts of spam notes that repeat the same thing can be stopped quickly at the relay level, regardless of whether disposable npubs are used. And it would be quick to adapt if the spammer adapts. Of course, there will be filtering logic available to keep allowing any single words or phrases you still want through. But it might be useful to block X once it has been repeated Y times within a given time frame.
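As a sketch of that repeat-count rule, here is a hypothetical in-memory tracker; the thresholds, the allowlist behaviour, and all names are assumptions for illustration, not Nosflare code:

```typescript
// Block a content hash once it has been seen more than `maxRepeats` times
// within `windowMs`, unless the text contains an allowlisted word or phrase.
const seen = new Map<string, number[]>(); // content hash -> timestamps (ms)

function isSpamBurst(
  hash: string,
  content: string,
  maxRepeats = 5,             // "Y times" -- assumed threshold
  windowMs = 60_000,          // "given time frame" -- assumed 1 minute
  allowlist: string[] = [],   // words/phrases that bypass the check
): boolean {
  if (allowlist.some((phrase) => content.includes(phrase))) return false;

  const now = Date.now();
  // Keep only timestamps still inside the window, then record this sighting.
  const recent = (seen.get(hash) ?? []).filter((t) => now - t < windowMs);
  recent.push(now);
  seen.set(hash, recent);
  return recent.length > maxRepeats;
}
```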

I'm fully aware that a spammer could bypass this by giving every note completely unique content. That's why I'm thinking the next best feature might be a minimum created_at requirement to publish to the relay: when an EVENT message is received, it goes into a queue and the author's pubkey is checked against another relay for its oldest created_at event. If that event is less than X time old, the note gets dropped.
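One way to sketch that check, assuming a Node/TypeScript process with the `ws` package: rather than paging back to the literal oldest event, it asks the other relay (via a plain NIP-01 REQ) for any event from the pubkey created at or before now minus the minimum age, which answers the same question. The relay URL, subscription id, timeout, and fail-closed behaviour are illustrative assumptions.

```typescript
import WebSocket from "ws";

// Resolve true if the pubkey has at least one event older than
// `minAgeSeconds` on the reference relay, false otherwise.
function pubkeyOldEnough(
  pubkey: string,
  minAgeSeconds: number,
  relayUrl = "wss://relay.example.com", // hypothetical reference relay
): Promise<boolean> {
  return new Promise((resolve) => {
    const ws = new WebSocket(relayUrl);
    const cutoff = Math.floor(Date.now() / 1000) - minAgeSeconds;
    const finish = (ok: boolean) => { ws.close(); resolve(ok); };

    ws.on("open", () => {
      // NIP-01 REQ: only events created at or before the cutoff can match.
      ws.send(JSON.stringify(["REQ", "age-check", { authors: [pubkey], until: cutoff, limit: 1 }]));
    });
    ws.on("message", (raw) => {
      const [type] = JSON.parse(raw.toString());
      if (type === "EVENT") finish(true);  // found an event older than the cutoff
      if (type === "EOSE") finish(false);  // end of stored events, none old enough
    });
    ws.on("error", () => finish(false));
    setTimeout(() => finish(false), 5_000); // treat a slow/unreachable relay as "not old enough"
  });
}
```

Whether the queued note is held or dropped while this lookup is pending is a policy choice the sketch leaves open.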

This could be something relay implementations collaborate on to help the community: some standard that lets relays share this data and makes it easily REQable. Thoughts? #AskNostr
Author Public Key
npub18qkk0mr2x8wg6rzmxzvh2wsqgsnp5u8xjj4d6j0ttr7w22wtxzys32cgr6