Computer Chess Club Archives


Subject: Re: Odd behavior in lazy eval

Author: Stan Arts

Date: 06:28:46 10/19/04

On October 19, 2004 at 08:24:45, Rick Bischoff wrote:

>I am still struggling with my huge node count in WAC #Iforgotthenumber -- anyway,
>I am trying out lazy evaluation, i.e., only counting material and seeing if I
>can get a cutoff from there. Here is the code in the evaluation function:
>
>	//int x = whiteScore - blackScore;
>	//extern bool wtm;
>	//if ( !wtm ) x = -x;
>	//if ( x + 200 <= alpha ) return alpha;
>	//else if ( x - 200 >= beta ) return beta;
>
>When this code is off, as shown above, I have the following stats from a depth-7
>search:
>
> 6     157     295     112333 1. Ne5 Qxb3 2. Nxd7+ Kc8 3. axb3 Kxd7 4. Rxa7
> 7     153    1132     397249 1. Ne5 Qxb3 2. Nxd7+ Kc8 3. axb3 Kxd7 4. Rxa7 Nd5
>1294061 nodes (1041179, 0.804583) 31.162 seconds (41526.9 NPS)
>4000 nullmove try 1330 nullmove succeed (%33.25)
>
>OK, so when I turn on the lazy evaluation:
>
> 6      53     403     198712 1. Ne5 Rf6 2. Bg5 Qxb3 3. axb3 Re6
> 7     141    1720     752626 1. Ne5 Qxb3 2. axb3 Bxb5 3. Nxg6 Nxg6 4. Be3
>move Ne5
>3314059 nodes (2830629, 0.854128) 66.653 seconds (49721.1 NPS)
>171058 nullmove try 66872 nullmove succeed (%39.0932)
>
>So you can see, not only has the PV in the last two iterations changed
>drastically, but the node count has also increased by a factor of 2 or more!
>
>Clearly, this has to be a bug? From the opening with lazy eval off:
>
> 6       0      50      24360 1. d4 d5 2. Bf4 Bf5 3. Nf3 Nc6
> 7      10     263     107854 1. d4 d5 2. Bf4 Nc6 3. Nf3 Bf5
>move d4
>278745 nodes (225693, 0.809676) 6.184 seconds (45075.2 NPS)
>5307 nullmove try 2870 nullmove succeed (%54.0795)
>
>With it on:
>
> 6       0      48      24099 1. d4 d5 2. Bf4 Bf5 3. Nf3 Nc6
> 7      10     257     110823 1. d4 d5 2. Bf4 Nc6 3. Nf3 Bf5
>move d4
>348815 nodes (289003, 0.828528) 6.985 seconds (49937.7 NPS)
>9995 nullmove try 3934 nullmove succeed (%39.3597)
>
>So, I am still getting a node increase here, but the PV has remained the same.
>What gives!?


Hi

One thing I experienced when I tried this was that if I return beta or
alpha instead of the actual lazy evaluation value, my search tree grew a
lot in size. Probably because hash table values that are stored as exactly
alpha or beta give a lot fewer cutoffs than the actual lazy score does:
the real score lies further above beta or below alpha, so the stored bound
is more often good enough to be used at other nodes.
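
For instance, something like this fail-soft version (a minimal sketch
reusing the names from your snippet; LAZY_MARGIN and FullEvaluate() are
just stand-ins, not your actual code):

	extern int  whiteScore, blackScore;  // material totals, as in your snippet
	extern bool wtm;                     // true when white is to move
	int FullEvaluate();                  // stand-in for the full positional eval

	const int LAZY_MARGIN = 200;         // the same 2.00 margin you use

	int Evaluate(int alpha, int beta)
	{
	    int x = whiteScore - blackScore;
	    if (!wtm) x = -x;

	    // Return the lazy score itself instead of the window bound: it lies
	    // well outside [alpha, beta], so the value stored in the hash table
	    // is a stronger bound and gives cutoffs at other nodes more often.
	    if (x + LAZY_MARGIN <= alpha) return x;  // fail-soft, below alpha
	    if (x - LAZY_MARGIN >= beta)  return x;  // fail-soft, above beta

	    return FullEvaluate();
	}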

Maybe your futility margin in qsearch, for deciding whether captures can
get back to alpha, is too small in relation to the lazy evaluation margin
you have. In positions with passed pawns or with king safety in danger you
then start seeing too little, because too-small margins for risky
decisions at different places start to add up, I guess. That could also
explain your PV difference and the big score difference.
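
To give a rough picture of the qsearch side (the margin value and the
function name are made up, only to show the relation between the two
margins):

	const int QS_FUTILITY_MARGIN = 200;  // assumed value, not from your post

	// If the stand-pat score plus the victim's value plus the margin still
	// cannot reach alpha, qsearch skips the capture. When this margin and
	// the lazy margin are both tight, their errors add up along the line.
	bool CaptureIsFutile(int standPat, int victimValue, int alpha)
	{
	    return standPat + victimValue + QS_FUTILITY_MARGIN <= alpha;
	}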

Hmm, and if you detect cases like insufficient material in your
evaluation function, a margin of 2.00 is probably far too small. Or you
could also check for these cases before you decide to lazy-quit.
I would use a large value anyway (for instance at least 3.50-5.00),
because this still gives a nice speedup (when I tested, about 10-20% in
most positions, up to double the speed in very tactical positions with
lots of mates or big material losses in the search tree) and there is
little downside of missing big positional score swings or being blind to
endgame cases. I read that Rebel uses 0.50, which I don't understand. So
I gave that a try anyway :) and indeed there wasn't too much difference
in normal play, but it also wasn't really faster (only a few % over using
a big margin).
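
So before the lazy exit you could do something like this (just a sketch;
all the predicate names are hypothetical, standing for whatever your
engine can already check cheaply):

	struct Position;                             // your board representation
	bool InsufficientMaterial(const Position&);  // hypothetical predicates
	bool HasAdvancedPassedPawn(const Position&);
	bool KingUnderAttack(const Position&);

	const int BIG_LAZY_MARGIN = 400;  // centipawns, inside the 3.50-5.00 range

	// Only take the lazy exit when the full evaluation cannot plausibly
	// swing by more than the margin.
	bool LazyQuitAllowed(const Position& pos)
	{
	    return !InsufficientMaterial(pos)
	        && !HasAdvancedPassedPawn(pos)
	        && !KingUnderAttack(pos);
	}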

Greetings
Stan



