Author: Uri Blass
Date: 19:44:46 04/01/04
On April 01, 2004 at 22:40:46, Uri Blass wrote:
Correction: I changed the name of a variable and forgot to change the relevant comment.
Here is the new code:
int Search(int depth,int alpha,int beta,int verify)
{
int last,Newdepth,val,hash_depth;
int first_alpha=alpha;
int iidfirst=first_move[ply];
int valmax=alpha;
/*there is a move stack similar to a lot of programs:
we always have first_move[0]=0 and first_move[ply] equal to the sum of
perft(1) of the positions in the line that led to this position*/
int i=first_move[ply];
int reduction;
/*the function gen generates legal moves*/
gen();
last=first_move[ply+1];
/*last-i is the number of moves of the side to move.
The hash did not help to prune and neither did the null move, so under
the right conditions we now do internal iterative deepening to find the
first move to search*/
hash_depth=probe_hash_depth_upper(beta);
/*hash_depth can be 0 in case that there is no hash entry;
min_reduction is the minimal reduction that can be used for IID*/
if (hash_depth<depth-min_reduction)
{
reduction=iid_reduction(depth,alpha);
if (depth>reduction+hash_depth)
{
/*it is important to start with the right move:
search to a reduced depth to find one, and stop as soon as a move
generates a cutoff*/
while (i<last)
{
makemove(gen_dat[i].m);
val=-Search(depth-reduction,-beta,-valmax,verify);
undomove();
if (val>valmax)
{
valmax=val;
iidfirst=i;
}
if (val>=beta)
break;
i++;
}
replace_move_data(iidfirst,valmax);
i=first_move[ply];
}
}
while (i<last)
{
/*probability_to_fail_high_too_small also considers the previous moves:
if, for example, the good captures were already searched, then I may
decide based on the evaluation and the depth that the probability to
fail high is too small and return alpha, even though at the previous
move I did not know it because I still hoped the score would get above
alpha.
Both chance_fail_high_this_move and probability_to_fail_high_too_small
may use a function that evaluates the probability, but I do not suggest
calling a function that evaluates the exact probability: it may be
cheaper to do some lazy evaluation, and to compute the exact probability
only when the lazy evaluation cannot decide whether the probability is
too small*/
if (probability_to_fail_high_too_small(depth,alpha))
return alpha;
/*the function sortmove finds the next move to search and puts it in
place i in the global move stack.
It is logical to spend more time when the depth is bigger, so the
function gets not only the index in the move stack but also the depth*/
sortmove(i,depth);
/*it is possible that I cannot decide for all moves at once that there
is no chance they will fail high, but I may decide it for the next move
alone.
I expect the function to say yes very fast in most cases; the only case
when I expect no chance for the move to fail high is when the move is a
quiet move and the evaluation is far enough below alpha*/
if (chance_fail_high_this_move(depth,alpha,i))
{
makemove(gen_dat[i].m);
/*first we check for a draw by repetition or by endgame material;
note that after makemove we must undo the move before returning*/
if (TryDraw())
{
undomove();
return 0;
}
/*now we calculate the extensions so we can use the hash tables later
to decide if to prune the move*/
Newdepth=depth+extensions();
if (TryHashTables(Newdepth)>=beta)
{
undomove();
return beta;
}
/*I will also try ETC (enhanced transposition cutoff) here in case the
depth is big enough, by generating all moves; I still need to write the
code for it*/
if (nullconditions(depth))
{
reduction=calculate_null_reduction();
donullmove();
gen();
/*starting_q_search_depth is simply a parameter that I give qsearch at
the beginning; I need it because I search differently at different
plies of the qsearch*/
if (reduction<Newdepth)
val=-Search(Newdepth-reduction,-beta,-beta+1,verify);
else
val=-Quies(-beta,-beta+1,starting_q_search_depth);
undonullmove();
if (val>=beta)
{
if (verify==1)
{
depth--;
verify=0;
}
else
return val;
}
}
if (depth <= 0)
{
gen();
/*store the score instead of returning, so the move gets undone below*/
val = -Quies(-beta,-alpha,starting_q_search_depth);
}
else
{
if (depth>small_depth)
{
/*evalprobability searches every legal move to depth-small_depth in
order to get an exact evaluation that is later used to estimate, for
every move, the probability that it is best.
These probabilities are then used to decide about reductions and
extensions, and singular extension may be a special case of this.
I also think about an updateprobability function that updates the
probabilities of all moves, called only when the remaining depth is big
enough.
I also think about storing in the hash tables a relatively small tree
with probabilities, and updating the probabilities in all nodes of that
tree every time I get new information (a change of the root score, or a
fail high or fail low at big enough depth)*/
evalprobability(depth);
}
/*in case of a reduction I want to investigate further only after an
unexpected fail high;
Reduction_Conditions holds when the probability was not small enough to
prune but is still small*/
if (Reduction_Conditions())
{
val=-Search(Newdepth-PLY-Ply_Reduction(),-beta,-alpha,verify);
if (val>alpha)
val=-Search(Newdepth - PLY, -beta, -alpha,verify);
}
else
val = -Search(Newdepth - PLY, -beta, -alpha,verify);
}
undomove();
if (val >= beta)
return beta;
if (val > alpha)
alpha = val;
}
i++;
}
return alpha;
}
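To illustrate the move-stack convention from the comment near the top (first_move[0]=0 and first_move[ply] marking where each ply's moves begin in one shared array), here is a minimal sketch; gen_toy() and the move encoding are hypothetical stand-ins for the real generator:

```c
#include <assert.h>

#define MAX_MOVES 4096
#define MAX_PLY   64

/* one shared move stack for the whole search:
   first_move[ply] is the index where the moves of the current ply start,
   and the generator sets first_move[ply+1] to where they end */
static int gen_dat[MAX_MOVES];
static int first_move[MAX_PLY + 1] = {0};   /* first_move[0] is always 0 */
static int ply = 0;

/* toy stand-in for gen(): pretend the side to move has n legal moves */
static void gen_toy(int n)
{
    int last = first_move[ply];
    for (int j = 0; j < n; j++)
        gen_dat[last++] = j;                /* dummy move encoding */
    first_move[ply + 1] = last;
}

/* last - i in the search above: the move count of the side to move */
static int move_count(void)
{
    return first_move[ply + 1] - first_move[ply];
}
```

After 20 root moves, a makemove (ply becomes 1) and 30 child moves, the child's moves occupy gen_dat[20..49]; first_move grows along the line by the perft(1) of each position, as the comment describes.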
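The internal-iterative-deepening loop reduces, in miniature, to picking the first move to search from reduced-depth scores; here shallow[i] is a hypothetical stand-in for the value of -Search(depth-reduction,...) for move i:

```c
#include <assert.h>

/* miniature of the IID loop above: scan the moves of the ply, remember
   the index of the best reduced-depth score, and stop as soon as one
   move fails high against beta */
static int iid_pick_first(const int *shallow, int first, int last,
                          int alpha, int beta)
{
    int iidfirst = first;
    int valmax = alpha;
    for (int i = first; i < last; i++) {
        int val = shallow[i];
        if (val > valmax) {
            valmax = val;
            iidfirst = i;     /* best move so far: search it first later */
        }
        if (val >= beta)      /* cutoff: no need to look at the rest */
            break;
    }
    return iidfirst;
}
```

With no cutoff the whole ply is scanned and the best index wins; with a cutoff the scan stops at the move that failed high, which is exactly the move the full-depth search should try first.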
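The lazy-evaluation idea behind probability_to_fail_high_too_small (decide cheaply in most cases, and pay for an exact evaluation only when the cheap bound cannot decide) might look like the sketch below; full_eval(), the material bound and both margins are hypothetical, not the post's actual functions:

```c
#include <assert.h>

#define LAZY_MARGIN 500          /* hypothetical: largest plausible
                                    positional swing, in centipawns */

static int full_eval_calls = 0;  /* counts how often the expensive path runs */

/* hypothetical expensive exact evaluation; a toy stand-in here */
static int full_eval(int material)
{
    full_eval_calls++;
    return material + 30;        /* pretend positional terms add 30cp */
}

/* 1 = the probability to fail high is too small, so return alpha;
   a cheap material bound decides most cases, and only the unclear
   band pays for the exact evaluation */
static int probability_too_small(int material, int depth, int alpha)
{
    int margin = LAZY_MARGIN + 100 * depth;  /* more hope at bigger depth */
    if (material + margin < alpha)           /* hopeless even with a huge swing */
        return 1;
    if (material - margin >= alpha)          /* clearly alive, skip exact eval */
        return 0;
    return full_eval(material) + 100 * depth < alpha;
}
```

The two cheap branches never touch full_eval(), which is the point of the lazy evaluation the post recommends.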
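The verify flag implements verified null-move pruning: when the null-move search fails high in verification mode, the cutoff is not trusted; instead the depth shrinks by one ply and the node is searched normally with verification switched off. A small sketch of just that decision (the struct and function names are mine, not the post's):

```c
#include <assert.h>

/* outcome of a fail-high null-move search under verified pruning */
struct null_outcome { int depth; int verify; int cutoff; };

static struct null_outcome after_null_fail_high(int depth, int verify)
{
    struct null_outcome r = { depth, verify, 0 };
    if (verify == 1) {
        r.depth = depth - 1;   /* depth--;   search on at reduced depth */
        r.verify = 0;          /* verify=0;  no further verification below */
    } else {
        r.cutoff = 1;          /* return val; the cutoff stands */
    }
    return r;
}
```

This matches the branch in the code where verify==1 leads to depth-- and verify=0 rather than an immediate return.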