Computer Chess Club Archives



Subject: Re: trouble getting tt tables working

Author: Robert Hyatt

Date: 08:12:24 03/07/98


On March 07, 1998 at 01:30:48, Will Singleton wrote:

>On March 06, 1998 at 20:40:42, Robert Hyatt wrote:
>
>>
>>first thing to look at is 32 bits is not enough.  you will get way too
>>many false matches (collisions)..  enough to produce bogus results.
>>
>
>I see from Nelson's icca article that you had been using 40 bits, and he
>intimated that it should or would be expanded to 64.  I assume that was
>due to memory limits.  Is there a correlation between hashkey and index
>size?  That is, if I go to 40 or 64 bits, is the optimum overall table
>size affected?
>
>

Actually, you have to read what Harry wrote carefully.  We computed a
64-bit hash signature throughout Cray Blitz.  We only stored the upper
40 bits in the hash table, but we used the low-order bits for the
hash probe address, which means that far more than 40 bits were actually
accounted for.  I.e., with our usual 16M-entry table (the smallest we
ever used in competition) we used the entire 64 bits, because we used
the low-order 24 bits for the random probe address...
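That split can be sketched in C as follows.  The 16M-entry size and the 24/40 split come from the post; the function names are purely illustrative:

```c
#include <stdint.h>

/* Illustrative sketch of the scheme described above: with a 16M-entry
   table (2^24 slots), the low 24 bits of the 64-bit signature select
   the slot and the upper 40 bits are the stored tag, so all 64 bits
   participate in match detection. */

#define TABLE_BITS  24                    /* 16M entries */
#define TABLE_SIZE  (1ULL << TABLE_BITS)
#define INDEX_MASK  (TABLE_SIZE - 1)

static uint64_t tt_index(uint64_t sig) {
  return sig & INDEX_MASK;                /* low 24 bits: probe address  */
}

static uint64_t tt_tag(uint64_t sig) {
  return sig >> TABLE_BITS;               /* upper 40 bits: stored tag   */
}
```

Note that tag and index together reconstruct the full signature, which is why a match on both is a match on all 64 bits.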


>>
>>I didn't see an example of what kind of problem you are having...
>>
>
>My results are unpredictable, something is causing wrong move selection.
> I posted the logic and the pseudo-code in the hope that perhaps someone
>could see a flaw in my understanding of the process.
>
>Since the process seems clear enough, however, I guess it's just a
>question of some creative time with the debugger :-<
>
>Will

There are several things to watch out for.  Obviously, you can store
three different results:

1.  EXACT, when alpha < VALUE < beta.

2.  LOWER, when VALUE >= beta.

3.  UPPER, when VALUE <= alpha.

You can then return the stored value if and only if the draft (the
depth to which that entry was searched) is sufficient for the current
search.  Screwing this draft up will wreck things for anyone.
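A minimal probe sketch for the three entry types and the draft check (the struct layout, NO_CUTOFF sentinel, and function name are assumptions, not the post's actual code):

```c
/* Illustrative lookup: return a usable score, or NO_CUTOFF when the
   entry's draft is insufficient or its bound doesn't apply here. */

enum { EXACT, LOWER, UPPER };

typedef struct {
  int type;    /* EXACT, LOWER, or UPPER              */
  int draft;   /* depth the stored search was done to */
  int value;
} TTEntry;

#define NO_CUTOFF (-32767)

static int tt_lookup(const TTEntry *e, int depth, int alpha, int beta) {
  if (e->draft < depth) return NO_CUTOFF;                      /* too shallow */
  if (e->type == EXACT) return e->value;                       /* true score  */
  if (e->type == LOWER && e->value >= beta)  return e->value;  /* fail high   */
  if (e->type == UPPER && e->value <= alpha) return e->value;  /* fail low    */
  return NO_CUTOFF;                                            /* no cutoff   */
}
```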

When you store LOWER, you have a move to store: the one that caused
the beta cutoff (fail high).  When you store UPPER, you don't.
(Naturally, on EXACT you also have a best move, but it's found in the
PV, not in the last move you searched.)  If you store the wrong move,
you can break the efficiency of alpha/beta by searching nonsense moves
first and blowing up the search space required.
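A hedged store sketch for that rule (again, the struct and names are illustrative assumptions):

```c
/* Illustrative store: only LOWER (fail-high) and EXACT entries carry a
   trustworthy best move; on UPPER (fail-low) no searched move proved
   best, so never store one. */

enum { EXACT, LOWER, UPPER };
#define NO_MOVE 0

typedef struct {
  int type, draft, value, move;
} TTEntry;

static void tt_store(TTEntry *e, int type, int draft, int value, int move) {
  e->type  = type;
  e->draft = draft;
  e->value = value;
  e->move  = (type == UPPER) ? NO_MOVE : move;   /* avoid bogus moves */
}
```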

If you do null-move, be careful: you can probe the wrong table and get
a false match.

If you store mates, be sure you store "mate-in-N-from-*this*-position"
rather than mate from the root; otherwise you will truly go ape when
you encounter lots of mate threats.
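One common way to do that conversion (the MATE constant and the 500-point window are assumptions for illustration): if a score at ply p means "mate in k more plies", it equals MATE - (p + k), so adding p before storing leaves MATE - k, a distance from *this* position; subtract the probing node's ply on retrieval.

```c
/* Illustrative mate-score adjustment.  Scores near +/-MATE encode
   distance-to-mate from the root; shift by ply on store so the table
   holds distance from the current position, and shift back on probe. */

#define MATE 32000

static int score_to_tt(int score, int ply) {
  if (score >  MATE - 500) return score + ply;   /* we are mating */
  if (score < -MATE + 500) return score - ply;   /* we are mated  */
  return score;
}

static int score_from_tt(int score, int ply) {
  if (score >  MATE - 500) return score - ply;
  if (score < -MATE + 500) return score + ply;
  return score;
}
```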

One other test is to create a routine Validate(hash_signature) that
you can call (compiled in when -DDEBUG is set) to take your
incrementally-updated hash signature, compute a duplicate completely
from scratch, and compare them, reporting any differences.  Forgetting
to properly unmake an en passant capture or a pawn promotion is the
easiest way to break hashing: it corrupts the signature, and that will
cause many problems since you then get false matches all over the
place...
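A runnable toy version of that check (the 64-square piece-code board, the xorshift key generator, and all names here are illustrative assumptions, not Crafty's actual code):

```c
#include <stdint.h>

/* Illustrative Validate(): maintain a signature incrementally, and in
   debug builds recompute it from scratch and compare. */

enum { SQUARES = 64, PIECES = 12 };

static uint64_t zobrist[PIECES][SQUARES];
static int      board[SQUARES];     /* 0 = empty, else piece code 1..12 */
static uint64_t signature;          /* incrementally maintained         */

static void init_zobrist(void) {
  uint64_t s = 0x9E3779B97F4A7C15ULL;        /* deterministic toy PRNG  */
  for (int p = 0; p < PIECES; p++)
    for (int sq = 0; sq < SQUARES; sq++) {
      s ^= s << 13; s ^= s >> 7; s ^= s << 17;
      zobrist[p][sq] = s;
    }
}

static void put_piece(int piece, int sq) {   /* incremental update      */
  board[sq] = piece;
  signature ^= zobrist[piece - 1][sq];
}

static void remove_piece(int sq) {           /* incremental update      */
  signature ^= zobrist[board[sq] - 1][sq];
  board[sq] = 0;
}

static uint64_t signature_from_scratch(void) {
  uint64_t sig = 0;
  for (int sq = 0; sq < SQUARES; sq++)
    if (board[sq]) sig ^= zobrist[board[sq] - 1][sq];
  return sig;
}

/* Returns nonzero when incremental and from-scratch signatures agree. */
static int Validate(void) { return signature == signature_from_scratch(); }
```

A forgotten un-make shows up immediately: any stray xor into the incremental signature makes Validate() fail.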

I run this every now and then after serious changes.  In fact, if you
look at Crafty, I have a module Validate() that tests *everything*: it
compares every bitmap, piece count, and so forth, to pick up on any
data structure corruption.  I call it before/after every MakeMove()
and also before/after any UnMakeMove().  It slows me down by a factor
of 5x or so, but catches lots of errors after major changes...





Last modified: Thu, 15 Apr 21 08:11:13 -0700

Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.