Computer Chess Club Archives


Subject: Re: not using nullmove? [generalized null move]

Author: Russell Reagan

Date: 08:02:09 02/14/04

On February 14, 2004 at 10:41:15, Robert Hyatt wrote:

>There is one other major flaw. You just made the search _many_ times slower.
>
>:)
>
>The math:
>
>Doing an 8-ply search, assuming a branching factor of 38:
>
>The single 8-ply search will visit (per the Knuth/Moore minimal-tree formula)
>
>N1 = 38^4 + 38^4 - 1 leaf nodes.
>
>The 38 separate 7-ply searches (one per root move) will visit
>
>N2 = 38 * (38^4 + 38^3 - 1) leaf nodes.
>
>N1 = 4,170,271
>
>N2 = 81,320,266
>
>N2 / N1 = 19.5, so roughly 19.5 times slower.  Not a good thing.
>
>This is the penalty you pay if you want the ply=2 scores.  Alpha/beta only gives
>the best move / ply=1 score...  Additional information has huge additional cost.
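
As a sanity check on those numbers, here is a small C program (not from the
original thread, just an illustration) that evaluates the Knuth/Moore
minimal-tree leaf count b^ceil(d/2) + b^floor(d/2) - 1 for both search plans:

    #include <stdio.h>
    #include <math.h>

    /* Leaf count of a perfectly ordered alpha/beta tree of depth d
       and branching factor b (Knuth/Moore minimal tree). */
    static double minimal_leaves(double b, int d)
    {
        return pow(b, (d + 1) / 2) + pow(b, d / 2) - 1.0;
    }

    int main(void)
    {
        double b = 38.0;
        double n1 = minimal_leaves(b, 8);     /* one 8-ply search       */
        double n2 = b * minimal_leaves(b, 7); /* 38 separate 7-ply ones */
        printf("N1 = %.0f\n", n1);            /* 4170271                */
        printf("N2 = %.0f\n", n2);            /* 81320266               */
        printf("N2/N1 = %.1f\n", n2 / n1);    /* 19.5                   */
        return 0;
    }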

This sounds to me like you're leaving out part of the picture :) He _is_ getting
something pretty significant in return for the speed hit (very accurate scores
for every root move, relatively speaking). This would be like me saying that
iterative deepening is stupid because you end up doing N searches and wasting
time. If you did iterative deepening and didn't take advantage of the scores
from previous iterations, that _would_ be stupid, but no one does that.
Likewise, he isn't taking a 20x speed hit for nothing.
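
To make that concrete, here is a minimal iterative-deepening skeleton in C.
The search() stub and its names are mine, not from any particular engine; a
real version would run alpha/beta to the given depth, trying the previous
iteration's best move first:

    #include <stdio.h>

    typedef struct { int best_move; int score; } Result;

    /* Stub standing in for a real alpha/beta search to 'depth' that
       tries 'first_move' before the other root moves. */
    static Result search(int depth, int first_move)
    {
        Result r = { first_move, depth };  /* placeholder values */
        return r;
    }

    static Result iterate(int max_depth)
    {
        Result r = { 0, 0 };
        for (int d = 1; d <= max_depth; d++) {
            /* Reusing the previous iteration's best move keeps the
               alpha/beta tree near minimal, so the repeated shallow
               searches pay for themselves. */
            r = search(d, r.best_move);
            printf("depth %d: best=%d score=%d\n", d, r.best_move, r.score);
        }
        return r;
    }

    int main(void) { iterate(8); return 0; }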

Initially it may be 20x slower, but if those accurate scores let you do pruning
and reductions, your effective branching factor drops, and that beats any
constant-factor slowdown: an exponential improvement at the cost of a constant
overhead. Right?


