Computer Chess Club Archives


Subject: Re: Mathematical impossibilities regarding Deep Blue statements by Bob

Author: Robert Hyatt

Date: 07:25:41 01/30/02



On January 30, 2002 at 09:43:49, Vincent Diepeveen wrote:

>Hello,
>
>I would like to address some major nonsense about the search depth of
>Deep Blue.
>
>First of all, we know that Deep Blue was doing only fullwidth search
>with loads of extensions.
>
>a) 12(6) doesn't mean they searched 18 plies. It means they searched
>   12 plies fullwidth, of which 6 plies were done in hardware.
>
>Nevertheless Bob again claims it is 18 ply fullwidth. Note that he
>bases his claim on an EMAILED and not PUBLISHED statement from a
>non-programmer who said it was Deep Thought printing stuff out
>like that. Note that the only published depth is 12 ply in IEEE99,
>an OFFICIAL document.

Note that the email I quoted was from a _member_ of the Deep Blue team.
And note that he _clearly_ said that, to be certain, he asked "CB" (for
those who don't know Hsu: CB, short for Crazy Bird, was his nickname).  I
think that is enough to debunk the above statement.  The _designer_ of
the chip said 12(6) was 12 plies in software, 6 plies in hardware, as the
two emails clearly showed.

>
>Let's assume Bob, as the only defender of IBM here, is right,
>and do some math on the MINIMUM number of nodes
>they would need for an 18 ply search.
>
>Also we know they didn't use a hashtable in hardware.
>
>It is not surprising that in the complex middlegame Deep Blue searched
>just as deep as in the late endgame, about 11(6) to 12(6).
>
>That means Knuth's lemma is of course working for them, otherwise
>they would have searched many plies deeper in endgames.
>
>Knuth's lemma says that, for even depths, you need about the square root
>of b^depth nodes, where b is the average number of moves.
>
>Now everyone can easily figure out what the average number of legal moves
>is (considering they extend checks, so those positions do not count).
>
>I measured this at 40, and I am the only person here who has measured it
>recently. The previous figure of 35 INCLUDED checks, is about 50 years
>old, and only said something about positions in GM games, so 40 is a
>reasonable number for the middlegame.
>
>Note that in this math we assume PERFECT move ordering, we do not count
>the huge number of nodes singular extensions need, and we ignore the fact
>that Deep Blue proved in the games to be tactically weaker than Crafty
>would be even at 15 plies. The truth is that they DID do some extensions
>of course; it is even mentioned that in the software part they did
>'dangerous extensions'.
>
>For an 18 ply fullwidth search: square root of 40^18 = 40^9
> = 262,144,000,000,000
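
For anyone who wants to reproduce the quoted arithmetic, here is a minimal
sketch in C; the branching factor of 40 and the depth of 18 are simply the
figures claimed in the quote above, not independent measurements.

    #include <stdio.h>

    int main(void) {
        /* Knuth's minimal alpha-beta tree: about sqrt(b^d) = b^(d/2) nodes
           for an even depth d with perfect move ordering. */
        unsigned long long b = 40;      /* average legal moves, as quoted above */
        int d = 18;                     /* fullwidth depth being discussed */
        unsigned long long nodes = 1;
        int i;
        for (i = 0; i < d / 2; i++)
            nodes *= b;                 /* 40^9 */
        printf("minimal nodes: %llu\n", nodes);   /* 262144000000000 */
        return 0;
    }

The disagreement below is about the inputs, not about this arithmetic.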


You can make up all the math you want, but it doesn't prove anything.  I
_know_ that DB's branching factor was roughly 4.0, as was discussed _here_
a few years ago after several of us looked carefully at their logs.

To go to depth 18 requires roughly 262,000 times (about 2^18) as many
nodes as searching to one ply.  I just did a few 1 ply searches on various
positions.  The most complex one took 600 nodes, the simplest took 3.

If we assume 100 nodes, then they need to search 262K * 100 nodes to reach
depth=18.  That requires 100 seconds at 262K nodes per second.  Easily reachable
by them.  If you assume 300 nodes for 1 ply, then 300 seconds is enough to reach
depth=18.  And notice they didn't reach 12(6) on _every_ search.  Some were a
bit shallower.
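
To make that back-of-the-envelope estimate explicit, here is a minimal
sketch; the growth factor, the 1 ply node count and the speed are just the
rough figures used in the paragraph above, not measured values.

    #include <stdio.h>

    int main(void) {
        /* Rough time-to-depth estimate: cost of a 1 ply search, multiplied
           by an overall growth factor for the deeper search, divided by
           speed.  All three inputs are the rough figures from the text. */
        double growth  = 262000.0;     /* assumed growth from 1 ply to depth 18 */
        double one_ply = 100.0;        /* assumed nodes for a 1 ply search */
        double nps     = 262000.0;     /* assumed nodes searched per second */
        double seconds = growth * one_ply / nps;
        printf("%.0f seconds to reach depth 18\n", seconds);   /* 100 */
        return 0;
    }

At the 200M nodes per second quoted elsewhere in this thread, the same
node count would take well under a second.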

Your math is bad.  As always.

>
>Now we know Deep Blue ran at 200M nodes a second, which from a
>hardware viewpoint is an incredible achievement and the only
>thing IBM cared about. IBM *never* said anything about searching
>deeper.
>
>So for an 18 ply search the theoretical minimum would have been
>(assuming a 100% speedup for Deep Blue, whereas in reality they
>wrote down that they got 20% out of it, as published in IEEE99
>by Hsu):

And as I mentioned, we know DB searched a peak of 1B nodes per second.
It is _also_ possible that his 200M figure is the result of taking 20%
of that 1B nodes per second...  That we don't know (yet), although I will
certainly ask when the chance arises.

>
>262,144,000,000,000 / 200,000,000 nodes per second = 1,310,720 seconds
>needed to reach 18 plies
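
That division is easy to check; a minimal sketch using only the two
figures quoted above:

    #include <stdio.h>

    int main(void) {
        /* Time for the quoted node count at the quoted speed. */
        double nodes = 262144000000000.0;   /* 40^9, from the quote above */
        double nps   = 200000000.0;         /* 200M nodes per second */
        double s     = nodes / nps;          /* 1,310,720 seconds */
        printf("%.0f seconds (about %.1f days)\n", s, s / 86400.0);
        return 0;
    }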




Imaginary math.  See above.

>
>We definitely need to note that in 1995 not a single program with mobility
>got *near* 11 or 12 plies at 3 minutes a move, not even WITH nullmove.
>
>Even in 1997 it was a pain to reach 11 to 12 plies with just nullmove for
>programs with mobility.
>
>Only years after that did things improve quickly, but a single email
>from a person who didn't make the chip is quoted by Bob and believed
>by some idiots here who are bad at math.
>
>18 plies fullwidth without a hashtable is unreachable, CCC audience; with
>nullmove things change rapidly, however. This is why nullmove deserves the
>Nobel prize, and the defenders of Deep Blue must admit, hard as it is to
>say in 2002, that a program that just searched fullwidth and was promoted
>only by its nodes per second got 11 to 12 plies and not 17 to 18. 12 plies
>is in fact already a really good fullwidth search depth; if I let DIEP
>search fullwidth WITH hashtables and singular extensions, I don't even
>get over 10 plies.
>
>The reason is simple. I do my singular extensions even at 4 plies from the
>leaves, whereas DB 'only' did them 7 plies from the leaves. If no forward
>pruning (such as the nullmove R=3 I use in DIEP) is used, it is simply
>impossible to overstate the huge problems singular extensions cause.
>
>Best regards,
>Vincent Diepeveen
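
Since nullmove R=3 keeps coming up in this thread, here is a minimal,
self-contained sketch of the idea in C. It is a toy: the "positions" are
just integers, the evaluation is arbitrary, and none of it is DIEP's,
Crafty's, or Deep Blue's actual code; it only shows where the reduced
null-window search sits and why a fail high lets a node be cut off.

    /* Toy illustration of nullmove pruning with R = 3. */
    #include <stdio.h>

    #define R     3
    #define WIDTH 8           /* toy branching factor */
    #define INF   1000000

    static long nodes;
    static int  use_null;     /* 1 = nullmove pruning on, 0 = off */

    /* Arbitrary toy evaluation and move generator; not a real game. */
    static int evaluate(int pos)          { return (pos * 37) % 201 - 100; }
    static int make_move(int pos, int mv) { return pos * WIDTH + mv + 1; }

    static int search(int pos, int depth, int alpha, int beta, int null_ok)
    {
        int mv, score;
        nodes++;
        if (depth == 0)
            return evaluate(pos);

        /* Nullmove: give the opponent a free move and search R plies
           shallower with a null window; a fail high prunes this node. */
        if (use_null && null_ok && depth > R) {
            score = -search(pos, depth - 1 - R, -beta, -beta + 1, 0);
            if (score >= beta)
                return beta;
        }

        for (mv = 0; mv < WIDTH; mv++) {
            score = -search(make_move(pos, mv), depth - 1, -beta, -alpha, 1);
            if (score >= beta)
                return beta;          /* normal alpha-beta cutoff */
            if (score > alpha)
                alpha = score;
        }
        return alpha;
    }

    int main(void)
    {
        use_null = 0;
        nodes = 0;
        search(1, 8, -INF, INF, 1);
        printf("plain alpha-beta:  %ld nodes\n", nodes);

        use_null = 1;
        nodes = 0;
        search(1, 8, -INF, INF, 1);
        printf("with nullmove R=3: %ld nodes\n", nodes);
        return 0;
    }

On this artificial tree the node counts mean nothing for chess; the point
is only the structure of the reduction.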


