Computer Chess Club Archives



Subject: Correction

Author: Gerd Isenberg

Date: 11:24:35 12/28/02



On December 28, 2002 at 13:13:39, Gerd Isenberg wrote:

>On December 27, 2002 at 21:03:24, Dan Andersson wrote:
>
>>Ugh! Talk about destroying a nice run time behaviour. Using a 4K hash and a
>>rehashing scheme would get you a mean almost identical to one. The algorithm you
>>describe would probably have a mean close to one also, but the standard
>>deviation will be horrible to behold. And the missed-probe behaviour will be
>>real bad. Iterating over the move list raises the cost of an error linearly, or
>>very nearly so. Real stupid. There is no excuse whatsoever to use that
>>algorithm.
>>
>>MvH Dan Andersson
>
>Hi Dan,
>
>Thanks for the hint. In some endgame positions I got up to 20% collisions; in
>the opening or early middlegame, from under 1% to 4%. So about 5% on average.
>

Oops, I updated a counter in the wrong place.
The number of collisions is much greater!

    if ( Count50 + 4 <= GameCount ) {
        tryRepCount++;                           // a repetition is possible at all
        if ( RepetitionHash[RepeHashIdx()] > 1 ) {
            mayRepCount++;                       // table says "maybe" (collisions included)
            if ( Repetition() > 0 )              // exact check: iterate over move list
                RealRepCount++;
        }
    }
RealRepCount is about 2-10% of mayRepCount.

But mayRepCount is consistently < 10% of tryRepCount.
So the table saves more than 90% of the Repetition() calls, where the move-list
iteration occurs.

Gerd


>Before this I used the even more stupid approach of iterating back over the
>move list, comparing Zobrist keys, first one 4 plies before and then in 2-ply
>decrements as long as the moves are reversible. So the 4KB table was a nice
>improvement for me.
>
>But anyway, time to try Bruce's approach.
>
>Regards,
>Gerd



