Author: Robert Hyatt
Date: 11:48:30 08/27/01
On August 26, 2001 at 08:00:36, Uri Blass wrote:

>This is not my conclusion.
>There are logfiles, and we can get impressions of which program is better based on
>the logfiles.
>
>Deeper Blue was a union of hardware and software.
>
>You cannot conclude from one part (hardware) about the level of the whole thing.
>We also do not know the exact details of the hardware.
>
>The number of nodes per second is not enough to know how much faster it is than the
>programs of today.
>
>Deeper Blue did not use hash tables in the last plies, and the damage from not
>using hash tables should also be considered when we try to estimate the speed
>difference.

Perhaps this "damage" is really minimal? I.e., Junior doesn't hash in the last ply or two, and Crafty doesn't hash in the q-search. At the depths they reached, perhaps the last 4-6 plies don't matter as much.

In fact, think about what 200M nodes per second would do to any reasonable hash implementation. That would require 1.6 gigs of memory for each second of search; for 180 seconds, you need a huge amount of memory. Not hashing near the leaves helps Crafty search deeper on memory-limited machines, and it might have been a plus for DB too: if those last few plies overwrite the hash table terribly, then the search will suffer; and if the table fills up from the first N plies, then the last few plies won't have any space, so you lose again. Their solution might not have been elegant, but it _might_ have been optimal when you notice that there are no 100+ gigabyte machines around yet. Yet that is exactly what would be needed for a machine that fast.

>There are also other problems.

Those might have unexpected solutions...

>Uri