2026-02-11 10:24:11 UTC
gustav on Nostr:

A (maybe obvious) realization: anything we do to help humans determine whether some work was made by humans will eventually be used against us to train the next AI. If we include a history of changes in each file, as I outlined above, that data is extremely valuable as training data for how humans *actually* draw / write / code, and the next AI will produce even more human-like output. (However, if we are able to enforce the time component, the benefit of using AI would be significantly reduced.)
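To make the time-component idea concrete, here is a minimal sketch of what a per-file change history with enforced timing might look like. All the names (`Edit`, `record_edit`, `plausibly_human`) and the 5-second threshold are my own hypothetical choices, not an existing scheme; the point is just that a hash-chained, timestamped log lets a verifier reject histories whose edits arrive faster than a human could plausibly produce them:

```python
import hashlib
import time
from dataclasses import dataclass

@dataclass
class Edit:
    timestamp: float   # wall-clock time of the edit (seconds)
    content_hash: str  # sha256 of the file after this edit
    prev_hash: str     # chains each record to the one before it

def record_edit(history: list[Edit], content: bytes) -> list[Edit]:
    """Append a new hash-chained edit record to the history."""
    prev = history[-1].content_hash if history else ""
    h = hashlib.sha256(content).hexdigest()
    return history + [Edit(time.time(), h, prev)]

def plausibly_human(history: list[Edit], min_seconds_per_edit: float = 5.0) -> bool:
    """Reject histories whose edits are spaced closer than a human could work.

    This is the crude time-component check: it proves nothing on its own,
    but an AI pasting a finished work in one burst would fail it.
    """
    times = [e.timestamp for e in history]
    return all(b - a >= min_seconds_per_edit for a, b in zip(times, times[1:]))
```

The obvious weakness, which the post already hints at, is that such a log is also a perfect step-by-step training corpus for how humans revise their work.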

So, the conclusion is that we can't rely on checking that someone is human, or that a work was created by a human. I think we only have two options left:

1) Create an invite-only community in which AI is forbidden, with tools that help moderators kick people who break the rules. I think artists are craving such a platform to replace e.g. Instagram. Imagine an invite-only Instagram where AI is prohibited: would it work?

2) Examine whether it's possible to create a computing space that starts with a clean slate, where input/output is heavily restricted so that only new work can be created within the confines of the system, and where the system has roughly 2000s-era computing power. Not sure, but maybe this requires proof-of-work, which has huge environmental impacts at large scale; personal computing does not need to be large scale, however. :moomin_hmm:
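For what proof-of-work would mean here, a minimal sketch: find a nonce such that a hash of the work plus the nonce falls below a target, so producing the stamp costs real compute while checking it is instant. This is the generic hashcash-style construction, not a specific proposal from the post; the `difficulty` parameter and 8-byte nonce encoding are illustrative choices:

```python
import hashlib

def proof_of_work(data: bytes, difficulty: int = 16) -> int:
    """Grind nonces until sha256(data || nonce) has `difficulty` leading zero bits.

    The cost of finding the nonce grows as 2**difficulty on average;
    this asymmetry (slow to make, fast to check) is the whole trick.
    """
    target = 1 << (256 - difficulty)
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(data: bytes, nonce: int, difficulty: int = 16) -> bool:
    """Check a claimed nonce in a single hash evaluation."""
    digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty))
```

At personal-computing scale the energy cost of a low difficulty like this is negligible, which is why the environmental objection mainly applies to large-scale deployments.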