Computer Chess Club Archives



Subject: How to handle HUGE hash table?

Author: James Flanagan

Date: 06:34:11 03/29/00


Dann Corbit recently distributed a set of "difficult" positions, asking people
to try solving them with different computer programs.  I attacked the second
position with a modified version of Crafty 17.10, which found the draw after
about 18 hours on my AMD K6-2. The answer came around ply 16. There were 52 (!)
legal moves in the root position.

My 48M hash table was clearly inadequate under these circumstances.  Does anyone
have experience with techniques for expanding the hash table for extremely long
analysis runs?

The obvious solution would be to write the hash table to disk.  However, true
random access would be extremely inefficient on disk, so some sort of logical
organization of the data would be necessary so that related positions are stored
together on the same page of disk/memory (a rough sketch of what I mean follows
the list below).  There are several different storage schemes that I've considered:

- similar positions receive similar hash keys

- positions generated around the same time in the run are stored together (this
would require a memory-resident index from hash key to page; see the second
sketch at the end of this post). A variation of this would involve moving a
board position to a different page based on the time of its most recent access.

- make a separate hash table starting from each root move, plus links between
transpositions in different tables

- use a linked list or tree structure in which each board position is linked to
parent and child positions, which would be stored as close as possible to each
other in physical memory; hashing would be replaced by tree traversal.
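
To pin down what I mean by keeping related probes on the same page, here is the
kind of layout I am picturing: the high bits of the Zobrist key pick a disk page
and the low bits pick a slot inside that page, so a probe and all of its
re-probes cost a single page read.  This is only an untested sketch with made-up
sizes and names, not anything taken from Crafty:

#include <stdint.h>
#include <stdio.h>

#define PAGE_BYTES        4096                       /* one disk page        */
#define ENTRY_BYTES       16                         /* packed hash entry    */
#define ENTRIES_PER_PAGE  (PAGE_BYTES / ENTRY_BYTES) /* 256 entries          */
#define NUM_PAGES         (1u << 16)                 /* 256 MB table on disk */

typedef struct {
    uint64_t key;        /* full Zobrist key for verification */
    int16_t  score;
    uint8_t  depth;
    uint8_t  flags;
    uint32_t best_move;
} HashEntry;             /* 16 bytes */

/* High bits of the key pick the page, low bits pick the slot within it,
 * so every probe/re-probe for a position touches exactly one page.     */
static uint32_t page_of(uint64_t key) { return (uint32_t)(key >> 48) & (NUM_PAGES - 1); }
static uint32_t slot_of(uint64_t key) { return (uint32_t)key & (ENTRIES_PER_PAGE - 1); }

/* Probe: read the page once, then re-probe entirely in memory. */
static int probe(FILE *table, uint64_t key, HashEntry *out)
{
    HashEntry page[ENTRIES_PER_PAGE];
    long offset = (long)page_of(key) * PAGE_BYTES;

    if (fseek(table, offset, SEEK_SET) != 0 ||
        fread(page, PAGE_BYTES, 1, table) != 1)
        return 0;

    for (uint32_t i = 0; i < 8; i++) {               /* short probe cluster  */
        uint32_t s = (slot_of(key) + i) & (ENTRIES_PER_PAGE - 1);
        if (page[s].key == key) {
            *out = page[s];
            return 1;
        }
    }
    return 0;                                        /* miss */
}

With that layout a probe costs one seek plus one 4K read, and a small in-memory
cache of recently read pages would keep the hottest pages off the disk entirely.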

Each of these approaches has obvious difficulties.  Does anyone have
constructive suggestions or experience in this area?
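
For the second scheme, this is roughly the bookkeeping I have in mind: newly
searched positions get appended to whatever page is currently open, and a small
memory-resident index remembers which page each key ended up on.  Again, every
name and size here is invented and nothing has been run:

#include <stdint.h>
#include <stdlib.h>

#define ENTRIES_PER_PAGE  256
#define INDEX_BUCKETS     (1u << 20)   /* key -> page id, lives in RAM */

typedef struct {
    uint64_t key;
    uint32_t page_id;                  /* 0 = empty / not found        */
} IndexSlot;

typedef struct {
    uint64_t key[ENTRIES_PER_PAGE];
    /* ...scores, depths, best moves would go here too... */
    int      used;
    uint32_t id;
} Page;

static IndexSlot *index_table;         /* memory-resident index        */
static Page       open_page;           /* page currently being filled  */
static uint32_t   next_page_id = 1;    /* page ids start at 1          */

static void flush_page(const Page *p)  { (void)p; /* write page p to disk */ }

static void init_store(void)
{
    index_table  = calloc(INDEX_BUCKETS, sizeof(IndexSlot));
    open_page.id = next_page_id;
}

/* Record a freshly searched position: append it to the open page and
 * note which page it went to.  Positions searched around the same time
 * therefore end up together on the same disk page.                    */
static void store(uint64_t key)
{
    if (open_page.used == ENTRIES_PER_PAGE) {
        flush_page(&open_page);
        open_page.used = 0;
        open_page.id   = ++next_page_id;
    }
    open_page.key[open_page.used++] = key;

    IndexSlot *slot = &index_table[key & (INDEX_BUCKETS - 1)];
    slot->key     = key;               /* overwrite on collision       */
    slot->page_id = open_page.id;
}

/* Probe: one index lookup in RAM; a hit names the single page to read
 * from disk (not shown).  Returns 0 when the key is unknown.          */
static uint32_t lookup_page(uint64_t key)
{
    IndexSlot *slot = &index_table[key & (INDEX_BUCKETS - 1)];
    return (slot->key == key && slot->page_id != 0) ? slot->page_id : 0;
}

The "most recent access" variation would just mean re-appending a position to
the open page whenever it is probed successfully and updating its index slot,
so frequently touched positions keep migrating onto fresh pages together.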

-Jim Flanagan.


