Computer Chess Club Archives

Subject: Re: How important is a big hash table? Measurements...

Author: Robert Hyatt

Date: 22:01:41 03/29/03

On March 29, 2003 at 07:45:32, K. Burcham wrote:

>
>
>I think also that the RAM available for resources and a large hash has exceeded
>the MHz rating for now. In other words, with four slots of 256, or two slots of
>512, there is more than enough memory for our hash settings for now.

Here's some rough math.  I'm running tests to see how this holds up in practice
and will post the results tomorrow.  I'm running the first 6 bt2630 positions
to a fixed depth, with hash sizes starting at 48K bytes and doubling each
run.  I'll post the sizes and the total search time for the 6 positions.
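
To make that schedule concrete, here is a throwaway sketch (in C) of the sizes
such a doubling run walks through, assuming the 16-byte entries used in the
math below; the 256MB upper bound is arbitrary and not part of the actual test:

  #include <stdio.h>

  int main(void) {
      /* enumerate the doubling hash sizes, starting at 48K bytes, and show
         how many 16-byte entries each size holds (illustration only) */
      const long long entry_size = 16;   /* bytes per hash entry */
      long long bytes;
      for (bytes = 48 * 1024; bytes <= 256LL * 1024 * 1024; bytes *= 2)
          printf("hash = %9lld bytes  ->  %9lld entries\n",
                 bytes, bytes / entry_size);
      return 0;
  }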

Back to the math:  Crafty searches about 2.5M nodes per second as an upper
limit in middlegame positions, using my dual 2.8GHz Xeon.  For a search
in a 40 moves/2hrs game, Crafty will average roughly 6 minutes a move,
counting the first N moves played in zero time from book, and the time saved
by pondering.  6 minutes = 360 seconds, and 2.5M nodes/sec * 360 sec = 900M
positions.  Some fraction of that is not stored in Crafty's hash table.  Until
I get the actual data out, I'll guess that at least 3/4 of those positions are
not hashed.  That means I need to store 225M positions (maybe).  At 16 bytes
per position, 3.6 gigabytes of RAM are needed to hold exactly that many
entries.  Since the hash signatures are _not_ uniformly distributed, I'd
probably go for a hash table 2x that big to have some confidence that not
many positions are getting replaced, since replacement hurts tree size.

8 gigs roughly.  Pretty big.
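
For what it's worth, here is one way a 16-byte entry can be laid out; this is
only an illustration of where the "16 bytes per position" figure comes from,
not Crafty's actual hash entry format:

  #include <stdint.h>

  /* hypothetical 16-byte hash table entry -- an illustration only, not
     Crafty's actual layout: one 64-bit word for the position signature,
     one 64-bit word packing score, draft, bound type and best move */
  typedef struct {
      uint64_t signature;   /* Zobrist hash key of the position */
      uint64_t data;        /* packed score / draft / bound / best move */
  } hash_entry_t;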

If you instead assume 90% of searched nodes are at depth=0 and beyond (the
q-search), then only 10%, or 90M positions, need to be stored.  That's about
1.4 gigs of RAM; to be safe, double that to roughly 3 gigs.
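
Both guesses drop out of the same back-of-envelope formula; here is a quick
sketch of it, where the 0.25 and 0.10 fractions and the 2x safety factor are
just the assumptions from above:

  #include <stdio.h>

  /* back-of-envelope hash size: nodes/sec * seconds/move * fraction of
     nodes actually stored * bytes/entry, doubled as the safety margin
     against early replacement (result in decimal gigabytes) */
  static double hash_gb(double nps, double seconds, double stored,
                        double entry_bytes, double safety)
  {
      return nps * seconds * stored * entry_bytes * safety / 1e9;
  }

  int main(void) {
      double nps = 2.5e6, secs = 360.0, entry = 16.0;
      /* 1/4 of nodes stored -> ~7.2 GB ("8 gigs roughly") */
      printf("25%% stored: %.1f GB\n", hash_gb(nps, secs, 0.25, entry, 2.0));
      /* 1/10 of nodes stored -> ~2.9 GB ("double that to 3 gigs") */
      printf("10%% stored: %.1f GB\n", hash_gb(nps, secs, 0.10, entry, 2.0));
      return 0;
  }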

I'm not sure about the .75 or .90 multiplier for the q-search.  And it is
also not easy to prove that storing less than the full tree is sub-optimal,
although logically it should be easy to see.  My results should be ready
tomorrow morning and I'll post them as soon as I can...


>
>If the processors were to take a big jump in speed, then the larger hash would
>be helpful. If we had a 6000 MHz processor, with 3 gigs of RAM and hash set at
>1000 megs, then I think this hash size could be used. It seems to me that over
>the last ten years, the total amount of memory available in today's systems
>has exceeded the processing power.
>
>I think someday we will look back on 1000 megs of hash as small.
>
>kburcham

I think that one day folks will look back and see 1000 terabytes as small.


