Computer Chess Club Archives



Subject: Re: Optimizing Hash table size

Author: Ernest Bonnem

Date: 14:36:25 02/29/04



This is what I have observed:
At a given time t in the analysis, the chess engine has gone through N nodes
(or t times the average nodes/sec).
I have found that for some engines (Shredder "classic"), these N nodes have
delivered, on average, approximately 2N bytes to the hash table (supposing, for
instance, that a node which delivers a hash entry delivers 16 bytes, but is 8
times less frequent than nodes which deliver no hash content: quiescent-search
nodes, terminal nodes, others?).
The hash entries fill the hash table, of size H0 bytes, little by little, but
the random nature of the filling means that as the table fills up, some new
entries replace older entries: for instance, when the hash table is half full,
there is only a 50% chance that a new entry will not replace an existing one.
So the filling H of the hash table is of the form
H = H0 (1 - exp(-kt))
where k can be determined from how the hash table fills initially (for small t,
H = H0.kt).
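
As a sanity check on this model, here is a small simulation (a sketch in Python, not taken from any engine): if each new entry lands in a uniformly random slot of a table with H0 slots, the filled fraction after n insertions is 1 - (1 - 1/H0)^n, approximately 1 - exp(-n/H0), which is exactly the H = H0 (1 - exp(-kt)) curve with kt = n/H0.

```python
import math
import random

def simulate_fill(slots, insertions, seed=0):
    """Insert entries at uniformly random slots; return the filled fraction."""
    rng = random.Random(seed)
    table = [False] * slots
    for _ in range(insertions):
        table[rng.randrange(slots)] = True
    return sum(table) / slots

def predicted_fill(slots, insertions):
    """Fill fraction predicted by H/H0 = 1 - exp(-kt), with kt = insertions/slots."""
    return 1.0 - math.exp(-insertions / slots)

# With as many insertions as slots, the table is only ~63% full, not 100%,
# because later entries increasingly land on already-filled slots.
slots = 100_000
for n in (slots // 2, slots, 3 * slots):
    print(f"insertions={n:7d}  simulated={simulate_fill(slots, n):.3f}"
          f"  predicted={predicted_fill(slots, n):.3f}")
```

Note that the curve also explains the "half full = 50% chance of replacing nothing" observation above: the probability that a new entry hits an empty slot is just the empty fraction of the table.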

After a few minutes, the hash table is more than 99% full (can it ever reach
100% in practice?) and every new hash entry replaces an old one.
So while your needs increase (the engine is now at depth 20, 21...), your hash
resource has reached its limit.

You can see this in the percentage of hash hits (reported by Winboard, for
instance): in the beginning, when the hash table is less than 50% full, hash
hits are around 30% or more (Ruffian); later this percentage decreases, and I
presume that once the hash table is "full" (meaning the hash resource is now
constant, as new values replace old, but not so old, values), the hash hit
percentage decreases proportionally to 1/t. That soon means close to zero, and
the hash table then contributes almost (?) nothing to the search.
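
The 1/t behaviour can be illustrated with the same crude model (an illustration only; the 16-byte entry size and 1M nodes/sec rate below are assumptions, not measurements): once full, the table holds a fixed number of entries, while the number of nodes searched keeps growing like t, so the fraction of searched positions that can still be found in the table shrinks like 1/t.

```python
def hit_rate_estimate(hash_bytes, bytes_per_entry, nodes_per_sec, t_sec):
    """Crude 1/t estimate once the table is full: stored entries divided by
    total nodes searched so far, capped at 1."""
    entries = hash_bytes // bytes_per_entry
    nodes = nodes_per_sec * t_sec
    return min(1.0, entries / nodes)

# 256 MB table, 16-byte entries, 1M nodes/sec: the estimate halves every
# time the elapsed search time doubles.
for t in (30, 60, 120, 240):
    print(t, round(hit_rate_estimate(256 * 1024**2, 16, 1_000_000, t), 3))
```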

In conclusion:

* of course, for best efficiency in Deep Position Analysis, use the largest
hash table size you can!

* if, just for the fun of it, you want to know what a "sufficient" hash table
is for an analysis of a given (short) duration t, determine it by observing
your engine: the initial filling of your hash table over duration t should be
around 50%. This actually seems to be the origin of the "famous" Fritz5 formula
given by Chessbase:
 HT(KB) = 2 x F(MHz) x t(sec).
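
Plugging numbers into that formula: a 2 GHz machine analysing for 60 seconds per position would call for 2 x 2000 x 60 = 240,000 KB, roughly 234 MB.

```python
def fritz5_hash_kb(cpu_mhz, t_sec):
    """The Chessbase Fritz5 rule of thumb: HT(KB) = 2 * F(MHz) * t(sec)."""
    return 2 * cpu_mhz * t_sec

kb = fritz5_hash_kb(2000, 60)   # 2 GHz CPU, 60 seconds per position
print(kb, "KB =", kb / 1024, "MB")
```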


On February 29, 2004 at 15:53:11, Derek Winter wrote:

>I know that this is an old topic for you experts, but for others, it is still a
>mystery.
>
> Assuming that ample RAM is available for the hash table (over 1.5 GB), is there
>a formula for calculating the optimal hash table size to use in Fritz for Deep
>Position Analysis with parameters like these:
>
>             Variation Length:   20 ply
>             Branching for both: 3:2:2
>             Time used:          200 seconds/ply
>
>                  Thanks, Derek



Last modified: Thu, 15 Apr 21 08:11:13 -0700

Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.