Author: Robin Smith
Date: 15:42:27 03/19/01
On March 19, 2001 at 01:45:25, Christophe Theron wrote:

>Subjective experiences are defined by the states of the information processing
>entity.

This is an interesting and possibly correct conjecture. Do you have specific evidence to demonstrate its validity? Can you think of an experiment that could prove it to be correct? Can you elaborate on your conjecture to say precisely HOW information processing leads to subjective experiences?

Does a chess program, which is certainly an information processing entity, have feelings? If not, why not? If so, how do we know that?

>It's no wonder they cannot be transferred "as is" in another entity (which has a
>different structure). So you are bound to look at "feelings" from the outside
>and deduce what these feelings are only by the behaviour of the entity.

Deduction can be a very tricky business. Even with people, I sometimes think someone is feeling angry when they are just feeling tired. And that is with two people who have very similar "structures". When a dog wags its tail, I think the dog is happy. But maybe the dog is hungry and has found that wagging its tail sometimes leads to it getting food. And then take it a step further, to a robot dog that is wagging its tail. Is it feeling anything at all? How can we know? I'm not saying that it does or doesn't, just that we don't know, and assuming can be a dangerous pastime.

>Your examples about love, fear, humour, beauty are certainly very romantic, but
>it - again - sounds very old fashioned to me.

I don't understand what you are saying sounds old fashioned. Do you think the subjective experiences of love, fear, etc. are old fashioned? Do you not also share these experiences? I assume you do; you are human. So what is the old fashioned part? That I think an explanation has not yet been demonstrated? How is this old fashioned?

>Some of these "feelings" will probably appear in very complex computing machines
>and it will be possible to see it from the outside.

Yes, they likely will. But how will we KNOW they do, unless we understand the mechanism whereby they arise in humans?

Take a *very* complex computing machine, let's say a complex computer simulation of the earth, including all its weather, earthquakes, tides, volcanoes, etc. Even the best supercomputers of today can't predict the weather 3 days from now, or the next earthquake or volcanic eruption, so a model of the whole earth that can predict the exact temperature 10 weeks from now is clearly sometime in the future, if we can ever do it at all. Would a machine doing such a simulation have feelings? If not, why not? And if so, does that also mean that the thing being modeled, the earth, also has feelings?

Or let's take a group of people, perhaps the scientists and engineers working on the Manhattan Project during World War II. Clearly the complex calculations they carried out as a group were more complex than anything any one of them did as an individual. Did this group attain consciousness... as a GROUP? If not, why not? After all, the group was engaged in complex calculation. I have a hunch that there is more to it than just complex calculation.

>At the time there were no computers, and machines were made of gears, people
>could have wondered how a machine could gather information and do anything
>useful with it, but now that we have computers of such complexity and that we
>are foreseeing gigantic advances in this complexity, I think it is time to
>update our way of thinking...

Yes, and this is what I am trying to do.
But I don't think the issues are as simple as you make them out to be.

>I'm not trying to contradict you by all means. It's just that I don't see
>mysteries where you see them.

If you don't see any mystery, then you must have some idea as to where these subjective experiences come from. Please share your insights.

Bringing it back to computer chess, how would you go about programming a computer to be upset when it loses a game of chess? Or have you already done that, since it is clear that your program, Tiger, tries very hard to win?

Robin
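
P.S. To make that last question concrete, here is a toy sketch of what a purely behavioral kind of "upset" could look like. The names and numbers are my own invention and have nothing to do with how Tiger actually works: losses raise a contempt parameter, which then biases the evaluation away from quiet, drawish positions.

# Toy sketch only -- invented names, not taken from Tiger or any real engine.
# "Upset" here is purely behavioral: losses raise a contempt parameter,
# which then biases the evaluation away from quiet, drawish positions.

class ToyEngine:
    def __init__(self):
        self.contempt = 0.0   # extra "dislike" (in pawns) for drawish play

    def record_result(self, result):
        """result: +1 = win, 0 = draw, -1 = loss."""
        if result < 0:
            self.contempt = min(self.contempt + 0.25, 1.0)   # frustration builds
        elif result > 0:
            self.contempt = max(self.contempt - 0.10, 0.0)   # calms down again

    def evaluate(self, material_balance, is_drawish):
        """Static evaluation in pawns, from the engine's own point of view."""
        score = material_balance
        if is_drawish:
            score -= self.contempt   # an "upset" engine avoids quiet draws
        return score

From the outside such an engine would look as if it hates to lose, but nothing in it obviously feels anything. Which is exactly my point.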