Author: Dan Newman
Date: 00:02:02 05/28/98
On May 27, 1998 at 21:04:08, Robert Hyatt wrote:

>here's the scheme I've used for about 20 years now: make each hash
>signature in two parts... the lower 48 bits of the hash signature from
>the book position, the upper 16 bits from the hash signature of the
>*parent* position. When sorting, all the hash signatures from the same
>parent will be in one "clump" on disk. If you measure the size of this
>"clump" and do some clever indexing, you can get away with doing *one*
>I/O to read in *every* possible position that can be produced by playing
>a book move from the current position. (you will also have positions
>from other parent positions as well since you only took 16 bits from
>the parent hash key) but this turns out to be not very big overall...
>and you can play bullet chess without keeping the entire book in memory,
>and still make 10 moves without using a second off of your clock...

Ah ha. Neat. I didn't catch the 16-bits-from-the-parent-position trick when I looked at Crafty's book code.

I guess this scheme results in some positions occurring multiple times in the book (i.e., those with more than one parent), but I suppose there aren't an awful lot of these. In the cases where this occurs, the win/loss/draw counts for a position could be path dependent (unless you do something to make them the same).

Actually, now that I think about it, this might be a good thing. The win/loss/draw counts for positions proceeding from a parent position should be comparable (on the same "scale") so that a choice can be made between them. If a position's counts are inflated by counts from another path to that position, then it may look better than it actually is. (Does this make sense?)

I think I'm going to make some changes to my book code...

-Dan.