2026-04-09 07:59:02 UTC

Syntaxxor 🏳️‍⚧️ :neobot: on Nostr:

There's an excellent point at the end here that I never really considered before:
> "And of course, the people who value process knowledge the *least* are the AI bros who think you can replace skilled workers with a chatbot trained on the things they *say* and *write down*, as though that somehow captured everything they *know*."

Online posts, chats, documentation, and everything else a chatbot might train on are generally written to explain the output and structure of a thing to someone else. And while that means they'll be on the simpler side, easier to digest, it's also usually a very *lossy* process. I'm most familiar with how this works in programming, but I'm sure it applies to anything technical enough. And by "technical" I mean basically anything that involves process knowledge. So most positions outside the Board and the C-Suite.

Explaining how something works rarely gets into the nitty-gritty of exactly why each coding decision was made. Yet that's by *far* the most valuable thing to understand about any given piece of code. Those important conversations where knowledge actually gets imparted happen in far more personal contexts, usually by word-of-mouth, which means they never get documented. Because how *can* they be documented? Even when it's talked about online, in things like those tumblr posts, it often only scratches the surface of the sheer *depth* of knowledge needed to actually do something.

The best teacher, the only one whose lessons can really be trusted, is experience. And a chatbot that can only be trained by reading existing text will *never* be able to learn from experience. Thus, it can't really be trusted to actually make correct, informed decisions based on real knowledge of what's needed in a specific context.
</rant>