Author: Robert Hyatt
Date: 17:02:45 08/15/04
On August 15, 2004 at 17:33:39, Bruce Cleaver wrote:

>Here's an idea: most programs implement nullmove with R = 2 or R = 3 (even
>adaptive nullmove uses R = 1, 2, or 3).
>
>Suppose the truly optimal value for R is at 2.2 (not 2.0), the idea being that
>you always reduce the search 2 plies, and then 20% of the time (done
>probabilistically) reduce 3 plies (i.e. if random() <= 0.2, R = 3 else R = 2).
>The same goes for R = 3, or whatever integer value you are using.
>
>I know it goes against the grain having a non-deterministic approach, but an
>extra 20% of the search done at R = 3 vice R = 2 could yield large benefits
>(or terrible blunders, of course). R = 4 is way too large by experience, but
>maybe R = 3.1 is better than R = 3, and R = 2.5 is better than R = 2.
>
>Just an idea :)

I have tried this in the past, i.e. I use an R value of 3 that drops to 2 near the leaves, in one unit-step-function type of "drop". I have tried reducing it fractionally, so that rather than a sudden drop from 3 to 2 there is a gradual drop from 3 to 2 in fractional decrements. I didn't find any particular plus or minus, and didn't keep it. It is still in my notes as "something to play more with when I have time..."
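[Editor's note: a minimal sketch of the probabilistic fractional-R idea described above, not taken from the post or from any engine's actual code. The search() routine, the omitted null-move make/unmake, and the constants are assumptions for illustration only.]

  /* Probabilistic fractional null-move reduction: expected R = 2.2 plies.
     Hypothetical sketch; search() and the null-move make/unmake are
     assumed to exist elsewhere in the engine. */
  #include <stdlib.h>

  #define R_BASE 2      /* always reduce at least 2 plies               */
  #define R_FRAC 0.2    /* reduce one extra ply this fraction of nodes  */

  int search(int alpha, int beta, int depth);   /* assumed to exist */

  int null_move_value(int beta, int depth) {
    int R = R_BASE;
    if ((double) rand() / RAND_MAX < R_FRAC)
      R++;                                      /* 20% of the time R = 3 */

    /* zero-width search below the null move at reduced depth;
       the caller cuts off when the returned value is >= beta */
    return -search(-beta, -beta + 1, depth - R - 1);
  }

The variant Hyatt describes in the reply would instead make R a deterministic function of remaining depth (interpolating from 3 down to 2 as the search approaches the leaves) rather than a random choice at each node.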