Computer Chess Club Archives




Subject: Re: When to do a null move search - an experiment

Author: Vincent Diepeveen

Date: 08:57:43 04/26/04


On April 26, 2004 at 11:48:35, José Carlos wrote:

>On April 26, 2004 at 11:32:26, Tord Romstad wrote:
>>On April 26, 2004 at 10:39:42, José Carlos wrote:
>>>  An interesting experiment, of course. But I think your conditions are rather
>>>different from 'most' programs. I mean:
>>>  - You allow any number of null moves in a row (most programs don't do even
>>This has no importance, I think.  My experience is that I almost always get the
>>same score and PV when I enable/disable several null moves in a row, and that
>>the difference in number of moves searched is *very* tiny.
>  You're probably right, as you've tested and I speak from intuition, but at
>first sight, it seems that the fact that you allow several null moves in a row
>will increase your percentage of null-moves-tries/total-nodes-searched, and thus
>that avoiding unnecessary null moves will be a good idea.

In *all* experiments I did with null move, in a program not using *any* forward
pruning other than null move, the best thing was to *always* null move.

I invented double null move to prove that null move gives the same results as a
normal full-width search for a depth n of my choosing. I also use it because it
finds zugzwangs, and I am sure that is very helpful, because the weakest link
counts.

So double null move always completely outgunned doing a single null move, then
disallowing a null move, and then allowing the next one.

The quickest way to get a cutoff in a tree is, of course, by a null move.

So always allowing a null move is also intuitively the best thing to do.

Any forward pruning or weak qsearch you might use can, however, change that, of
course.
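The scheme described above — always try a null move, but never three in a row —
can be sketched roughly as follows. This is illustrative pseudocode, not code
from Diep or any real engine; search(), qsearch(), make_null_move() and the
reduction R are all assumed names:

```c
/* Illustrative sketch of double null move.  null_count tracks
 * consecutive null moves along the current line.  Allowing up to
 * two in a row means a pair of null moves cancels out into a
 * reduced-depth full-width search, so zugzwang positions are
 * still detected. */
#define R 3                      /* assumed null-move reduction */

int search(Position *pos, int depth, int alpha, int beta, int null_count)
{
    if (depth <= 0)
        return qsearch(pos, alpha, beta);

    /* Always try the null move first -- the cheapest possible
     * cutoff -- unless the last two plies were both null moves. */
    if (null_count < 2) {
        make_null_move(pos);
        int score = -search(pos, depth - 1 - R, -beta, -beta + 1,
                            null_count + 1);
        unmake_null_move(pos);
        if (score >= beta)
            return score;        /* null-move cutoff */
    }

    /* ... normal full-width move loop, passing null_count = 0 ... */
    return alpha;
}
```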

>>>  - You do pruning and reductions that are not public domain
>>>  This is important because your results mean: 'in a non-standard null move
>>>implementation (where you try more null move searches than most others) and with
>>>a lot of heavy pruning and reductions (I assume they're heavy according to your
>>>posts here and the high depths Gothmog gets) and in a MTD(f) root search,
>>>limiting null move application seems to benefit". This conclusion is, apart from
>>>interesting of course, very intuitive.
>>This is a very good point.  During our discussion last week, several people
>>claimed that checking the static eval before doing a null move search wasn't
>>really necessary, because the cost of a null move search would be tiny
>>compared to the cost of searching the moves anyway.  This isn't really
>>true for Gothmog, because most of the moves are usually not searched to
>>full depth.
>>I am not sure it is fair to describe my pruning and reductions as "not
>>public domain", though.  It is true that I haven't published any paper
>>and that my engine is not open source (at least not yet), but I am happy
>>to describe the tricks I use to anybody who wants to listen.  :-)
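For reference, the static-eval guard discussed above usually amounts to a check
like the sketch below before paying for the null-move search; static_eval() and
the comparison against beta are assumptions for illustration, not Gothmog's
actual code:

```c
/* Illustrative guard: only try the null move when the static
 * evaluation already suggests a fail-high is plausible. */
if (static_eval(pos) >= beta) {
    make_null_move(pos);
    int score = -search(pos, depth - 1 - R, -beta, -beta + 1);
    unmake_null_move(pos);
    if (score >= beta)
        return score;
}
```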
>  True. I didn't mean they're a secret, but they're not implemented in most
>chess programs. Expressing myself correctly in English is not easy for me...

That's the case for all of us.

>>Whether the use of MTD(f) is important is hard to say, but my guess is
>>that it isn't.
>  Probably not. I mentioned it just to underline that your program is different
>from the average, though it's true that MTD(f) is more popular now than a few
>years ago.
>  José C.
>>>  Thanks for the data. You sure inspired many others to do similar tests (I'll
>>>do when I have time).
>>Yes, it would be interesting to see the results of similar experiments for
>>other engines.  It seems like a sure bet that this is a YMMV issue.  :-)



Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.