Author: Daniel Shawul
Date: 06:13:03 04/02/04
On April 02, 2004 at 04:31:58, Vasik Rajlich wrote:
>On April 01, 2004 at 21:33:00, Uri Blass wrote:
>
>>I decided that there is probably no hope that movei will become commercial with
>>all the competition from smart people like Tord (and in any case sharing ideas is
>>not going to reduce the chances of that happening), so I decided to
>>share my thoughts about searching that I still do not use in movei.
>>
>>I want to have some general search code (not only for chess) in which it will be
>>possible to implement fail-soft, fail-hard, MTD and many other search ideas by
>>choosing the right defines.
>>
>>I still do not have defines for the different possibilities, but I decided to post
>>the following heavily commented code that is based on ideas I have had about
>>search; I do not use most of the ideas in this code in movei.
>>
>>I hope that other people will help me to make this code more general: to also
>>allow ideas like MTD if the right things are defined, to change things to support
>>both fail-hard and fail-soft, and to add general code for the iterate function
>>and the hash functions.
>>
>>The idea is that there is going to be some general search code, not only for
>>chess, that every programmer will be able to use as a basis.
>>
>>Some of the procedures or functions may be empty in the first versions, or may
>>return a constant (for example, if the program does not use IID it may simply
>>return infinity from the function that calculates whether to use IID).
>>
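A minimal sketch of what such defines could look like in a generic negamax, before the code itself. Everything below is illustrative: FAIL_SOFT, gen_moves(), make(), unmake() and evaluate() are placeholder names, not movei code.

#define FAIL_SOFT 1
#define INFINITE  32000

/* placeholder engine interface -- any program supplies its own */
extern void gen_moves(void);
extern int  move_count(void);
extern void make(int i);
extern void unmake(int i);
extern int  evaluate(void);

int generic_search(int depth, int alpha, int beta)
{
    int i, val;
    int best = -INFINITE;            /* fail-soft tracks the real best score */

    if (depth <= 0)
        return evaluate();

    gen_moves();
    for (i = 0; i < move_count(); i++) {
        make(i);
        val = -generic_search(depth - 1, -beta, -alpha);
        unmake(i);
#if FAIL_SOFT
        if (val > best)
            best = val;              /* may lie outside [alpha,beta]     */
        if (best > alpha)
            alpha = best;
        if (alpha >= beta)
            return best;             /* return the bound that was proven */
#else
        if (val >= beta)
            return beta;             /* fail-hard: clamp to the window   */
        if (val > alpha)
            alpha = val;
#endif
    }
#if FAIL_SOFT
    return best;
#else
    return alpha;
#endif
}

A fail-soft return value is also what an MTD-style driver would want, since it reuses the returned bound to move its search window.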
>>int Search(int depth,int alpha,int beta,int verify)
>>{
>> int last,Newdepth,val;
>> int first_alpha=alpha;
>> int iidfirst=first_move[ply];
>> int valmax=alpha;
>> /*there is a move stack similar to a lot of programs:
>> we always have first_move[0]=0 and first_move[i] is the sum of perft(1) of
>> the positions in the line that led to the current position.*/
>>
>> int i=first_move[ply];
>> int reduction;
>> /*the function gen generates legal moves*/
>> gen();
>>
>>
>> last=first_move[ply+1];
>> /*last-i is the number of moves of the side to move.
>>
>> The hash table did not help to prune and neither did null move, so now, in the
>> right conditions, you do internal iterative deepening to find the first move to
>> search.*/
>> reduction=iid_reduction(depth,alpha);
>> if (depth>reduction)
>> {
>> /*it is important to start with the right move:
>> search to a reduced depth to find a move, and stop when a move generates a
>> cutoff*/
>> while (i<last)
>> {
>> makemove(gen_dat[i].m);
>> val=-Search(depth-reduction,-beta,-valmax,verify);
>> undomove();
>> if (val>valmax)
>> {
>> valmax=val;
>> iidfirst=i;
>> }
>> if (val>=beta)
>> break;
>>
>> }
>> replace_move_data(iidfirst,valmax);
>> i=first_move[ply];
>> }
>>
>> while (i<last)
>> {
>> /*probability_to_fail_high_too_small also considers the previous moves.
>> If, for example, good captures were already searched, then I may decide based on
>> the evaluation and the depth that the probability to fail high is too small and
>> return alpha, something I did not know at the previous move because I still hoped
>> the score would get above alpha.
>> Both chance_fail_high_this_move and probability_to_fail_high_too_small may use a
>> function that evaluates a probability, but I do not suggest calling a function
>> that evaluates the exact probability, because it may be cheaper to do some lazy
>> evaluation; only if the lazy evaluation fails to decide should the exact
>> probability be calculated.*/
>> if (probability_to_fail_high_too_small(depth,alpha))
>> return alpha;
>> /*the function sortmove finds the next move to search and puts it in place i in
>> the global move stack.
>> It is logical to spend more time on ordering when the depth is bigger,
>> so the function gets not only the index of the move in the move stack but also
>> the depth*/
>>
>> sortmove(i,depth);
>> /*it is possible that I cannot decide for all remaining moves that they have no
>> chance to fail high, but I may still decide it for the next move alone.
>> I expect the function to say yes very fast in most cases; the only case in which
>> I expect no chance for the move to fail high is when the move is a quiet move
>> and the evaluation is far enough below alpha*/
>> if (chance_fail_high_this_move(depth,alpha,i))
>> {
>> makemove(gen_dat[i].m);
>> /*first we check for draw by repetition or by endgame rules*/
>> if (TryDraw())
>> {
>> undomove();
>> return 0;
>> }
>> /*now we calculate the extensions so that we can later use the hash tables to
>> decide whether to prune the move*/
>> Newdepth=depth+extensions();
>> if (TryHashTables(Newdepth)>=beta)
>> {
>> undomove();
>> return beta;
>> }
>> /*I will also try ETC (enhanced transposition cutoff) here, by generating all
>> moves, in case the depth is big enough; I still need to write code for it*/
>> if (nullconditions(depth))
>> {
>> reduction=calculate_null_reduction();
>> donullmove();
>> gen();
>> /*starting_q_search_depth is simply the parameter that I give qsearch at the
>> start; I need it because I search differently at different plies of the qsearch*/
>> if (reduction<Newdepth)
>> val=-Search(Newdepth-reduction,-beta,-beta+1,verify);
>> else
>> val=-Quies(-beta,-beta+1,starting_q_search_depth);
>> undonullmove();
>> if (val>=beta)
>> {
>> if (verify==1)
>> {
>> depth--;
>> verify=0;
>> }
>> else
>> return val;
>> }
>> }
>> if (depth <= 0)
>> {
>> gen();
>> val = -Quies(-beta,-alpha,starting_q_search_depth);
>> }
>> else
>> {
>> if (depth>small_depth)
>> {
>> /*evalprobability searches every legal move to depth-small_depth in order to
>> give an exact evaluation that is going to be used for estimating the probability
>> of every move to be best, and the probabilities are going to be used later to
>> decide about reductions or extensions; singular extensions may be a special case
>> of this.
>> I also think about having an updateprobability function that updates the
>> probabilities for all moves, called only when the remaining depth is big enough.
>>
>> I also think about storing in the hash tables a relatively small tree with
>> probabilities, and updating the probabilities in all the nodes of that tree
>> every time I get new information, like a change of the root score or a fail
>> high or fail low, when the depth is big enough.*/
>>
>> evalprobability(depth);
>> }
>> /*in case of a reduction I want to re-search to full depth only after an
>> unexpected fail high.
>> Reduction_Conditions holds when the probability was not small enough to prune
>> but is still small*/
>> if (Reduction_Conditions())
>> {
>> val=-Search(Newdepth-PLY-Ply_Reduction(),-beta,-alpha,verify);
>> if (val>alpha)
>> val=-Search(Newdepth - PLY, -beta, -alpha,verify);
>>
>> }
>> else
>> val = -Search(Newdepth - PLY, -beta, -alpha,verify);
>> }
>> undomove();
>> if (val >= beta)
>> return beta;
>> if (val > alpha)
>> alpha = val;
>> }
>> i++;
>>
>> }
>> return alpha;
>>
>>
>>}
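The two probability functions in the code above are left abstract. One cheap way to approximate them, in the spirit of the comments (a quiet move with the evaluation far enough below alpha has little chance to fail high), is a futility-style margin. The margins and the helpers eval_lazy() and is_quiet_move() are guesses for illustration, not Uri's implementation; PLY is the fractional-ply unit used in the posted code, with an assumed value.

#define PLY 4

static const int margin[4] = { 0, 150, 350, 600 };   /* per remaining ply */

extern int eval_lazy(void);           /* cheap static eval of side to move     */
extern int is_quiet_move(int i);      /* no capture, promotion or check        */

int probability_to_fail_high_too_small(int depth, int alpha)
{
    int plies = depth / PLY;
    if (plies < 0) plies = 0;
    if (plies >= 4)
        return 0;                     /* never prune far from the horizon      */
    /* even with a generous bonus the score cannot reach alpha                 */
    return eval_lazy() + margin[plies] <= alpha;
}

int chance_fail_high_this_move(int depth, int alpha, int i)
{
    int plies = depth / PLY;
    if (plies < 0) plies = 0;
    if (plies >= 4 || !is_quiet_move(i))
        return 1;                     /* always search deep nodes and tactics  */
    /* a quiet move far below alpha has practically no chance to fail high     */
    return eval_lazy() + margin[plies] > alpha;
}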
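For the ETC idea mentioned in the comment (which Uri says he still needs to write), a rough sketch: before the main search at high depth, probe the hash table of every child position and take the cutoff if one of them already refutes the node. probe_hash() and HASH_UPPER are assumed names; makemove(), undomove(), gen_dat and PLY are taken from the quoted code.

/* returns 1 on a hash hit of at least the requested depth */
extern int probe_hash(int depth, int *score, int *flag);
#define HASH_UPPER 1                  /* stored score is an upper bound        */

int try_etc(int first, int last, int depth, int beta)
{
    int i;
    for (i = first; i < last; i++) {
        int score, flag;
        makemove(gen_dat[i].m);
        /* an upper bound <= -beta for the child means >= beta for us          */
        if (probe_hash(depth - PLY, &score, &flag)
            && flag == HASH_UPPER && -score >= beta) {
            undomove();
            return 1;                 /* cutoff: this node fails high          */
        }
        undomove();
    }
    return 0;
}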
>
>Hi Uri,
>
>One general comment: I've also had the idea to do both MTD(f) and PVS. Now I
>realize that this is not feasible. Too many things are different.
>
>The same holds true for a number of issues. Are you going to be essentially
>selective, or essentially brute force? Are you going to be fast, or intelligent?
>
>It makes sense to experiment a bit when you are starting, but I think you're
>past that point.
Whether you are a starter or not, experimenting won't hurt.
In fact, I would like to have optional MTD(f) in my program.
daniel
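Since optional MTD(f) came up: the driver itself is only a few lines around a zero-window call. This sketch reuses the Search() signature from Uri's posted code and assumes it is made fail-soft and backed by a transposition table; the names are illustrative, not from a particular engine.

#define INFINITE 32000

extern int Search(int depth, int alpha, int beta, int verify);

/* f is the score guess, usually the result of the previous iteration */
int mtdf(int f, int depth)
{
    int g = f;
    int lower = -INFINITE;
    int upper = +INFINITE;

    while (lower < upper) {
        int beta = (g == lower) ? g + 1 : g;
        /* zero-window probe around the current guess */
        g = Search(depth, beta - 1, beta, 1);
        if (g < beta)
            upper = g;                /* failed low: true score < beta   */
        else
            lower = g;                /* failed high: true score >= beta */
    }
    return g;
}

The fail-hard "return beta" and "return alpha" lines in the posted code would have to return the proven bounds instead, otherwise the driver converges one point per probe; that is part of why mixing PVS and MTD(f) in one framework is harder than it looks.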
>
>Good luck.
>
>Vas