Computer Chess Club Archives



Subject: Re: Mathematical impossibilities regarding Deep Blue statements by Bob

Author: Ed Schröder

Date: 06:57:13 01/31/02



On January 31, 2002 at 07:56:21, Albert Silver wrote:

>>Say 200K is good for 8 plies average, being 1000 x faster with a branching
>>factor of 4 gives: 4x4x4x4x4 = 1024 -> 5 extra plies.
>>
>>So with 200M NPS you might be able to search 13 plies brute force in best case.
>>
>>Subtract a couple of plies (1 to 3) for the way DB did singular extensions and
>>the picture fits, that is: DB was searching 10-12 plies as the log files
>>confirm.
>>
>>This 12(6) isn't 18, you must have misunderstood its meaning.
>>
>>Ed
>
>I will add this, as I cannot comment on the numbers and math presented. I have a
>lot of trouble believing Kasparov would go down to a program hitting only 10-12
>plies in a 6-game 40/2h match. Either it is amazingly smart with a super eval,
>which all evidence suggests it had serious tuning issues with, or it is doing
>some very deep calculating.
>
>                                      Albert


The above was programmer stuff indeed, hard to explain without a decent knowledge
and understanding of a) extensions and b) brute force, and of what they do to a
chess program. I will give it a try...

Extensions: under certain circumstances a chess engine will search one or two
plies deeper. DB is known to have used massive extensions, far more than
today's chess engines. The goal is to find certain tactics earlier. The
disadvantage is that extensions blow up the search, and thus the iteration
depth decreases.
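As an illustration only (DB's singular extensions were far more elaborate; the toy tree and the check flag below are my assumptions, not DB's code), a typical extension inside a negamax search looks like this:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    value: int = 0                      # static eval, from the side to move
    in_check: bool = False              # side to move is in check
    children: List["Node"] = field(default_factory=list)

def negamax(node, depth, alpha=-10**9, beta=10**9):
    # The extension: an in-check node is searched one ply deeper, so
    # the tactic behind the check is seen earlier -- at the cost of a
    # bigger tree and thus a lower overall iteration depth.
    ext = 1 if node.in_check else 0
    if depth + ext <= 0 or not node.children:
        return node.value
    best = -10**9
    for child in node.children:
        score = -negamax(child, depth - 1 + ext, -beta, -alpha)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:
            break                       # beta cutoff
    return best
```

With the extension, a checking line is resolved one ply beyond the nominal horizon; without it, the search stops at the in-check node and trusts its static value.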

Brute Force: none of today's chess programs uses brute force (BF) any
longer. BF evaluates every node in the massive tree, even the silliest ones.
The advantage: no errors in the tree (the search). DB was a BF program.

Today's programs use only selective search, an algorithm that prunes senseless
parts of the tree. The gain is enormous. The disadvantage: errors in the search.
The art: make as few errors as possible.
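The size difference between the two approaches can be made concrete with a toy node count on a uniform tree (the "keep the best k moves" rule below is a crude stand-in for real pruning schemes, not any engine's actual algorithm):

```python
def count_full_width(branching, depth):
    # Nodes a brute-force search visits in a uniform tree:
    # every move at every node, no errors possible.
    if depth == 0:
        return 1
    return 1 + branching * count_full_width(branching, depth - 1)

def count_selective(branching, depth, keep=2):
    # Crude selective search: only the 'keep' most promising moves
    # are searched at each node; the rest are pruned away unseen,
    # which is where the errors come from.
    if depth == 0:
        return 1
    return 1 + min(keep, branching) * count_selective(branching, depth - 1, keep)

# Branching factor 4, 8 plies:
print(count_full_width(4, 8))   # 87381 nodes
print(count_selective(4, 8))    # 511 nodes, but pruned lines are never checked
```

The pruned tree is orders of magnitude smaller, which is exactly the "enormous gain" that buys the extra plies, paid for with the risk of pruning away the one refutation the search never looked at.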

Some conclusions:

1) The DB of 1997 (200M NPS), using selective search as we see it in today's
chess programs, would certainly hit 4-5 plies more.

2) The massive extensions used by DB (see Vincent's posting elsewhere) guarantee
a couple of things: a) seeing tactically much more than the 10-12 plies suggest,
and b) a drop of 1.5-2 full iteration depths. Meaning that without their
extensions DB could have searched 1-2 plies deeper, making it 12-14 instead of
10-12.

3) Add up points 1 and 2 -> a program of today using the DB hardware would
search 17-19 plies in the middle game, if not deeper.
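The depth-from-speed estimate quoted at the top works out the same way in logarithms (the branching factor of 4 is taken from the quote):

```python
import math

def extra_plies(speedup, branching_factor):
    # Each extra ply multiplies the tree by the effective branching
    # factor, so a raw speedup of S buys about log_b(S) extra plies.
    return math.log(speedup, branching_factor)

# 1000x faster hardware with an effective branching factor of 4:
print(round(extra_plies(1000, 4)))  # 5, since 4**5 = 1024 ~ 1000
```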

The conclusions are debatable, but I think I am very close to the truth.

Ed




Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.