Computer Chess Club Archives



Subject: Re: To Robert Hyatt, Dan Corbit, Christophe Theron, And Other Experts.

Author: Russell Reagan

Date: 12:25:09 08/05/02



On August 05, 2002 at 14:17:56, Louis Fagliano wrote:

>As I said, the only way a machine decides anything is by "choosing" the largest
>number in an array or by "deciding" that one number is larger than, less than,
>or equal to another number. This is strictly mechanical/mathematical and does
>not involve any intelligence.

IIRC, the human brain makes its decisions through neurons firing or not firing.
That is just as simple as what computers do, if not simpler. I think you are
confusing complexity with intelligence.
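
For what it's worth, the "choosing the largest number in an array" part really
is that mechanical. A minimal sketch in C (the move scores and names below are
made up for illustration, not taken from any real engine):

/* Pick the move whose score is largest, by nothing more than comparisons. */
#include <stdio.h>

int main(void)
{
    int score[4] = { 12, -30, 45, 7 };   /* made-up scores for four moves */
    int best = 0;
    int i;

    for (i = 1; i < 4; i++)
        if (score[i] > score[best])      /* "decide" by comparing numbers */
            best = i;

    printf("best move index = %d (score %d)\n", best, score[best]);
    return 0;
}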

>The human brain, when arriving at any decision, invokes some sort of
>reasoning, and there is no reasoning involved in the above.

Sorry, but "reasoning" is not a fundamental element of human intelligence. Human
intelligence can be broken down into very, very simple "instructions", such as a
neuron that takes inputs and either fires or doesn't fire. That sounds very
similar to the on/off nature of computers to me.
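
To make the comparison concrete, here is a rough sketch of that fire/don't-fire
picture, in the spirit of a classic McCulloch-Pitts threshold unit (the inputs,
weights, and threshold below are invented for illustration):

/* Weighted sum of inputs; output 1 ("fires") if it reaches a threshold,
   otherwise output 0 ("stays silent"). The numbers are invented. */
#include <stdio.h>

static int fires(const double *input, const double *weight, int n, double threshold)
{
    double sum = 0.0;
    int i;

    for (i = 0; i < n; i++)
        sum += input[i] * weight[i];

    return sum >= threshold;             /* on/off, much like a bit */
}

int main(void)
{
    double in[3] = { 1.0, 0.0, 1.0 };
    double w[3]  = { 0.5, 0.8, 0.4 };

    printf("neuron %s\n", fires(in, w, 3, 0.7) ? "fires" : "does not fire");
    return 0;
}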

>Oh, AI exists alright.  A complex program involving hundreds or even thousands
>of numerical parameters (written by humans!)

Regardless of whether you believe in creation or evolution, humans didn't become
intelligent on their own. You seem to think that the only way computers could be
intelligent is if they were sitting around with no outside interference and one
day they just started playing chess or talking to each other. Humans were
"programmed" by a creator or by the evolutionary process, depending on which you
believe. Maybe that Newton guy's law about things not changing their path
without an external influence has farther-reaching effects than physics: computers
will never become intelligent without humans to get the ball rolling, just as
humans would never have come to exist without either a creator or a big bang.

>that a computer is comparing to
>each other makes it seem that a computer is making intelligent decisions, and
>that is what we call AI.  But again, like I said, it's the comparison of a wax
>statue to a living human.  The wax statue is AI and the living human is "real"
>intelligence.

It's not like that at all. A wax figure has only one property in common with the
person it portrays: its outward appearance.

>What computers can do in the future depends on faster and faster hardware and
>smaller and smaller circuits and larger and larger data storage space and
>retrieval.
>
>All of this will make AI seem "smarter" and "smarter".  But the difference is
>that biological creatures need to be smarter and have intelligence because of
>its survival value, so it's an evolution-driven process.  Computers will not
>get "smarter" unless humans deliberately write better and better programs and
>deliberately find ways to make smaller and smaller circuits and larger and larger
>data storage space and retrieval.  It's a human-driven process.

So what? Human intelligence cannot be attributed to humans, just as computer
intelligence cannot be attributed to computers. Both are driven by outside
forces. You are contradicting yourself here, or maybe you're just not thinking
about it with much depth.

I think it's interesting that humans think they know how they got here, when
it's absolutely impossible to tell. Imagine a time when computers are
intelligent and can function on their own, as a result of human creation. Will
the computers be able to look back and conclude that they were created by the
humans? Of course not. They just showed up one day and were "living" all of a
sudden. The computers would probably come up with some wild, complex idea. For
example, they might think that billions of years ago there was this big
explosion which led to chemical reactions for billions of years which eventually
turned into...you got it, computers! We humans would be sitting back wondering
how smart these computers really were, since they made up this fairy tale when
the actual explanation was much, much simpler.

> Computers can't
>get more advanced unless humans want them to, and that will always be the
>difference between "natural" intelligence and AI.

There you go again, making absurd, absolute claims. You certainly are
closed-minded. You have no idea what will happen 10 seconds from now, much less decades
or centuries from now, and yet you make these absolute claims.

>It doesn't mean that computers will never play chess better than any human, since
>humans make tools that far exceed their own naked physical abilities (i.e., a car
>goes faster than any human can run), but tools are exactly what they are.  AI is
>nothing more than a tool created by humans for human use in controlling and
>manipulating their environment.

Finally, an analogy that makes sense. I guess one out of several isn't bad.

>In short, "natural" intelligence is the real thing.
>AI is a tool arising as a by-product of "natural" intelligence.

When did you become lord of the universe and gain the power to declare anything
you'd like? Personally, I'd like to see more of "Here is what I think..." from
you instead of "Here is how it is..."

Russell


