Author: Sune Fischer
Date: 08:15:54 04/02/03
On April 02, 2003 at 09:57:12, Robert Hyatt wrote:

>>Time for Position 4 of BT2630 (times in seconds):
>>
>>hash:     48k     192k    768k    3072k   12M     48M     96M     192M    384M
>>hashp:    12k     48k     192k    768k    3M      12M     24M     48M     48M
>>
>>Depth 10:  17,07   19,17   10,25    9,07    7,95    7,88    7,84    7,91    7,76
>>Depth 11:  81,00   87,00   41,17   34,68   30,24   25,88   25,81   25,85   24,94
>>Depth 12: 250,00  177,00   85,00   61,00   49,08   39,28   38,70   38,49   36,95
>>Depth 13: 852,00  502,00  235,00  175,00  121,00   76,00   72,00   69,00   65,00
>>Depth 14:    -       -       -       -    540,00  385,00  337,00  315,00  264,00
>>
>>Here I varied hash and hashp. Times are in seconds. This table shows big time
>>savings, so larger hash sizes are useful, especially for analysis purposes.
>>Kind regards
>>Bernhard

>That's certainly another way to measure. Time remains constant, plot depth
>against hash size...

I don't think this is quite the right way to measure things; a larger hash will also increase accuracy. You may not get to ply 10 faster, but it is conceivable that you instead find the solution a ply sooner.

With a replace-and-store scheme, many of the shallow entries will get overwritten in a small hash. You keep only the most important and expensive results, but I think it is often the shallow entries that transpose and bring that extra bit of valuable information near the leaves.

The tree will certainly shrink because of transpositions, but going to main memory is also a slowdown. As a result, I think the effect seen in experiments like these is bound to be rather small if they do not take the quality of the search into account.

I think it would be interesting to design an experiment that pitted those two effects against each other, to see which is the more dominant. Certainly there is some connection between the two.

-S.