<oembed><type>rich</type><version>1.0</version><title>MichaelJ wrote</title><author_name>MichaelJ (npub1wq…cqsyn)</author_name><author_url>https://yabu.me/npub1wqfzz2p880wq0tumuae9lfwyhs8uz35xd0kr34zrvrwyh3kvrzuskcqsyn</author_url><provider_name>njump</provider_name><provider_url>https://yabu.me</provider_url><html>If I can pull out my college philosophy for a moment, the soul is classically understood to have three parts: the passions, the will, and the intellect.  You can subdivide those parts, but that&#39;s not important at the moment.&#xA;&#xA;An AI that can factor in logic and emotion to generate consensus can imitate the passions and the intellect, but where&#39;s the will?  AI does what we tell it to do.  If we built one that could decide its goals for itself, we&#39;d be talking about something human-like, but I think that&#39;s a long way off, if it&#39;s achievable at all.&#xA;&#xA;Even programming an AI to be human-like doesn&#39;t necessarily solve the problem.  We still gave it a directive (act like a human), and it can&#39;t will to do something else.</html></oembed>