Author: Janosch Zwerensky
Date: 13:54:22 07/25/01
Hi,

I don't see how people even get the idea that an intellectually superhuman AI would necessarily develop a wish to see the planet dominated by others "of its own kind". It seems a major flaw to assume that every possible intelligent system would feel as strong a desire to become powerful, or to prove itself superior to other intelligent systems around it, as *some* humans do.

Regards,
Janosch