<oembed><type>rich</type><version>1.0</version><title>Gary ₿usey wrote</title><author_name>Gary ₿usey (npub10v…5y2c7)</author_name><author_url>https://yabu.me/npub10vkh8gavju2tqegqqp29fvdjfle92vzwkjyqw6pzzgssr58u773qn5y2c7</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>I agree with certain elements of this statement. When you wonder whether AI will surpass humans strictly in terms of &#34;smarts&#34;, as you mentioned, yes, it will exceed humanity very easily. But I can&#39;t fully agree with your notion that we&#39;ll most likely never see a generalist AI. I believe it will eventually achieve generality, but through a different means of reaching consensus. Humans tend to generalize according to the emotional impact that different types of experiences have on them. This is why human generalizations are often wrong about many things; much of the time they lack logic. Not always, but often. Given an AI&#39;s lack of emotion, there seems to be at least a small possibility of it reaching a consensus through facts alone, including statistical facts produced by observing emotional responses.</html></oembed>