Computer Chess Club Archives


Subject: Re: Deep Blue kns compared to kns on my 3066mhz system

Author: Vincent Diepeveen

Date: 16:31:45 07/24/02


On July 23, 2002 at 11:36:36, Robert Hyatt wrote:

>On July 23, 2002 at 10:40:00, Uri Blass wrote:
>
>>On July 23, 2002 at 10:06:18, Robert Hyatt wrote:
>>
>>>On July 23, 2002 at 04:18:46, Bo Persson wrote:
>>>
>>>>On July 23, 2002 at 00:32:44, K. Burcham wrote:
>>>>
>>>>>
>>>>>
>>>>>Some say that Deep Blue could analyze 200,000 kns in some positions.
>>>>>
>>>>
>>>>This is more the *average* speed of the system. I have seen figures of up to 1
>>>>billion nodes per second, in favourable (for speed) positions.
>>>
>>>
>>>I have a paper written by Hsu and Campbell that says their peak speed during
>>>the 1997 match was 360M for a single move.  Of course, the machine was capable
>>>of bursts to 1 billion nodes per second, which I am sure it hit at times
>>>since that only required that all 480 chess processors be busy at the same
>>>instant.
>>>
>>>
>>>
>>>>
>>>>This doesn't say, of course, what the search speed would have been in your test
>>>>position. Could have been 150M nodes/s, could have been 950M nodes/s...
>>>>
>>>>
>>>>The comparison of speed is also somewhat flawed by the fact that Deep Blue was
>>>>explicitly designed to be fast (as in nodes per second), which most of the other
>>>>programs are not.
>>>
>>>
>>>
>>>I don't agree with that.  DB was a two-fold design:  (1) fast, due to special-
>>>purpose hardware (2) good, due to adding whatever they thought necessary into
>>>the hardware.
>>>
>>>Software programs don't have the option (2) available to them, so to keep
>>>option (1) viable, they compromise.  DB didn't have to.
>>
>>In order to have a good evaluation function you need to know what is good.
>>Programmers today know this better than they did in 1997.
>>
>>Uri
>
>
>No we don't.  Chess has been chess for 100 years.  20 years ago we knew what
>we "needed".  And we knew what we could "afford".  Today, what we "need" has
>not changed one iota, but what we can afford has changed drastically.  DB just
>got to this point way before we got there...

I always have a good laugh about this. Deep Blue had about 40 patterns,
some of which they considered complex (I don't doubt it).

Those were all indexed with arrays, which gives a couple of thousand
'adjustable' parameters.
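
To make that concrete, here is a hypothetical sketch in C (the names
and table values are made up, nothing from DB itself): a single pattern
indexed through an array already fans out into many individually
tunable weights, so ~40 such patterns easily give thousands.

  /* Hypothetical sketch, not Deep Blue's actual tables: one "passed
   * pawn" pattern indexed by rank already yields eight separately
   * adjustable weights.  Index a few dozen patterns by rank, file,
   * game phase and so on, and the parameter count runs into the
   * thousands. */
  static const int passed_pawn_bonus[8] = {
      0, 10, 20, 35, 60, 100, 160, 0   /* centipawns per rank; made-up values */
  };

  int passed_pawn_term(int rank)       /* rank of the passed pawn, 0..7 */
  {
      return passed_pawn_bonus[rank];
  }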

For a project which focused more on nodes a second than on anything
else (that's the only thing IBM ever used for marketing too; they
didn't care about the rest), they even used a lazy evaluation to get
more nps, a technique which today gets used less and less by the
active programmers.

I assume that when Brutus' evaluation gets improved he'll soon have to
do without lazy evaluation too. Of course the terms 'lazy evaluation'
and 'quick evaluation' mean basically the same thing here.
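
For anyone who doesn't know the trick, here is a minimal sketch of lazy
evaluation in C. Everything in it is a stand-in: Position,
material_score(), positional_score() and LAZY_MARGIN are all assumed,
not anyone's real engine code.

  /* Minimal sketch of lazy (quick) evaluation.  If the cheap score is
   * so far outside the alpha-beta window that no plausible positional
   * swing could bring it back, the expensive evaluation is skipped. */
  #define LAZY_MARGIN 300   /* centipawns; assumed bound on positional terms */

  typedef struct Position Position;          /* opaque board type (assumed) */
  int material_score(const Position *pos);   /* cheap, e.g. kept incrementally */
  int positional_score(const Position *pos); /* expensive pattern evaluation */

  int evaluate(const Position *pos, int alpha, int beta)
  {
      int quick = material_score(pos);

      /* Lazy cutoff: skip the expensive part when it cannot matter. */
      if (quick - LAZY_MARGIN >= beta || quick + LAZY_MARGIN <= alpha)
          return quick;

      return quick + positional_score(pos);
  }

The point of the trick is that the full evaluation only runs when the
cheap score lands near the window, which buys nps at the cost of some
accuracy near the margins.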

It shows how much DB was focused on getting more nodes a second.
They sure managed that part. 126 million nodes a second, with peaks up
to 300+ million in endgames, is pretty impressive.

Of course that's the only good thing about the whole machine, and we
will forgive them for that. It was already a hell of a job to get so
many nodes a second.

If I remember correctly, someone posted that in order to get more
nps, even OLD processors (probably with bugs in their eval) were mixed
in with newer, faster processors. Just to get more nodes a second.

I would be completely sick if I ran my latest diep in a tournament
and then, in order to show SGI/NWO a bit more nodes a second, added a
bunch of processes from old diep versions just to get the node count up :)

Best regards,
Vincent


