Author: Mathieu Pagé
Date: 15:35:29 08/15/04
On August 15, 2004 at 17:33:39, Bruce Cleaver wrote:

>Here's an idea: most programs implement nullmove with R = 2 or R = 3 (even
>adaptive nullmove uses R = 1, 2, or 3).
>
>Suppose the truly optimal value for R is at 2.2 (not 2.0), the idea being that
>you always reduce the search 2 plies, and then 20% of the time (done
>probabilistically) reduce 3 plies (i.e. if random() <= 0.2, R = 3 else R = 2).
>The same goes for R = 3, or whatever integer value you are using.
>
>I know it goes against the grain having a non-deterministic approach, but an
>extra 20% of the search done at R = 3 vice R = 2 could yield large benefits
>(or terrible blunders, of course). R = 4 is way too large by experience, but
>maybe R = 3.1 is better than R = 3, and R = 2.5 is better than R = 2.
>
>Just an idea :)

It won't work. Take 100 nodes as an example, and suppose the optimal value is
R = 2.2. If you use R = 2, you get 100 nodes searched with a value of R near
the optimal one. If you use R = 3, you get 100 nodes searched with a value of
R far from the optimal one (that is, it could have been nearer). If you use
R = 3 at 20 nodes and R = 2 at the other 80, you get 80 nodes with a value of
R near the optimal one and 20 that could have been nearer. I think the first
option is the best, unless you can say in which nodes R = 3 is better than
R = 2. That idea is already known as Adaptive Null Move, I think.

Mathieu P.
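For concreteness, here is a minimal C sketch of the probabilistic reduction
Bruce describes, next to the deterministic adaptive scheme Mathieu points to
(in the spirit of Heinz's adaptive null-move pruning, where R grows with the
remaining depth). The function names, the depth > 6 threshold, and the use of
rand() are illustrative assumptions, not code from any actual engine:

    #include <stdlib.h>

    /* Bruce's proposal: approximate a fractional R (e.g. 2.2) by using
       R = 3 in 20% of null-move searches and R = 2 in the other 80%.
       Call srand() once at engine startup before using this. */
    static int probabilistic_reduction(double target_r)    /* e.g. 2.2 */
    {
        int base = (int)target_r;                          /* 2 */
        double frac = target_r - (double)base;             /* 0.2 */
        double u = (double)rand() / (double)RAND_MAX;      /* uniform in [0,1] */
        return (u < frac) ? base + 1 : base;               /* 3 with prob. 0.2 */
    }

    /* Mathieu's counterpoint: if you can say *where* R = 3 is better,
       pick it there deterministically instead of at random. */
    static int adaptive_reduction(int depth)
    {
        return (depth > 6) ? 3 : 2;    /* deeper subtrees tolerate more reduction */
    }

The contrast makes the argument concrete: the random version spends its extra
ply of reduction at arbitrary nodes, while the adaptive version spends it only
where some condition (here, remaining depth) suggests it is safe.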