Author: Heiner Marxen
Date: 08:24:44 12/09/03
On December 08, 2003 at 16:50:09, Steffen Jakob wrote:
>On December 08, 2003 at 16:19:48, Tim Foden wrote:
>
>>On December 08, 2003 at 11:02:22, Alvaro Jose Povoa Cardoso wrote:
>>
>>>
>>>I've been thinking about the efficiency of the history heuristic at high search
>>>depths.
>>>It seems to me that the history table will be overwritten many times if we have
>>>a search of several billions of nodes. Additionally, as the search moves to
>>>different parts of the tree the history table values will be somewhat trashed.
>>>What do you think we could do about this?
>>>Maybe limit the history heuristic to a certain depth (ex: the nominal depth).
>>>
>>>Comments anyone?
>
>Hi Tim!
>
>I hope you enjoyed your trip back to Spain. :-)
>
>>What happens in GLC is whenever it increments a value in the history table it
>>checks it against a maximum. If the maximum value is exceeded, it divides all
>>values in the table by 2.
>
>I do almost the same! I don't modify the values in the table but remember a
>shift value which is used to reduce the history values on demand:
>
>int historyShift = 0;
>
>inline Value getLimitedHistoryValue(Value historyValue) {
>    Value limitedValue = historyValue >> historyShift;
>    if (limitedValue > MAX_HISTORY_VALUE) {
>        historyShift++;
>        limitedValue = limitedValue >> 1;
>    }
>    return limitedValue;
>}
And you are sure that the values in the table do not overflow?
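For contrast, here is a rough sketch of the whole-table halving that Tim describes; that variant does keep the stored counters themselves bounded. The 64x64 from/to layout, the cap value and the depth*depth bonus are just guesses on my part:

static const int MAX_HISTORY_VALUE = 1 << 14;   /* cap value is a guess */
static int historyTable[64][64];                /* indexed by from/to square */

void addHistory(int from, int to, int depth)
{
    historyTable[from][to] += depth * depth;    /* bonus formula is a guess */
    if (historyTable[from][to] > MAX_HISTORY_VALUE) {
        /* Age every entry, so no counter can grow without bound. */
        for (int f = 0; f < 64; f++)
            for (int t = 0; t < 64; t++)
                historyTable[f][t] >>= 1;
    }
}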
>In the past I didn't limit the history value. Then I realized that it became a
>dominant sort criterion in deep searches, which was of course very bad.
Here I agree, of course. :-)
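Just to make that concrete, a tiny self-contained sketch of how a capped history value stays a mere tie-breaker among the quiet moves; the score numbers and the helper are purely illustrative, not taken from GLC or Hossa:

enum {
    SCORE_HASH_MOVE   = 1000000,   /* illustrative ordering scores only */
    SCORE_CAPTURE     =  100000,
    HISTORY_SCORE_CAP =   50000    /* deliberately below the capture score */
};

int orderingScore(bool isHashMove, bool isCapture, int historyValue)
{
    if (isHashMove) return SCORE_HASH_MOVE;
    if (isCapture)  return SCORE_CAPTURE;
    /* Clamped history can only order quiet moves among themselves,
       never push one of them above a capture or the hash move. */
    return historyValue < HISTORY_SCORE_CAP ? historyValue : HISTORY_SCORE_CAP;
}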
Cheers,
Heiner