Computer Chess Club Archives


Subject: Re: Repeatability (questions for Omid)

Author: Robert Hyatt

Date: 08:24:44 12/19/02


On December 18, 2002 at 18:43:26, Bruce Moreland wrote:

>On December 18, 2002 at 13:13:09, Robert Hyatt wrote:
>
>>Actually I found it "non-conclusive" as I reported.  It helped in some places,
>>hurt in others, and the end-of-term stuff here (and then the SMT stuff on this
>>new machine) side-tracked me for a while...  I still have plans to play with it
>>further.
>
>So you took an initial stab at repeating this and failed, if I can read between
>those lines.

No.  But I did do a good implementation of the approach, which involved adding
a parameter to the Search() procedure to indicate whether or not to "verify"
a search (false below an existing verification point).
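
Roughly, in C terms (the names here are illustrative, not a paste from
Crafty):

    typedef struct position POS;   /* engine-specific; stubbed for the sketch */

    /* "verify" is non-zero until a verification search has already been
       done above this node; below that point it is passed down as zero */
    int Search(POS *pos, int alpha, int beta, int depth, int verify);

The root call passes verify = 1 so that the whole tree is subject to
verification.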

However, the results (for me) were inconclusive.  But then again, I didn't
change my adaptive R=3~2 null-move depth, which means what I was testing was
not exactly what he used.  And additional testing/tweaking seemed to be needed
to better tune the verification depth, for one thing...
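
(By R=3~2 I mean an adaptive reduction rather than a fixed one.  The rule
below is only an illustration, not Crafty's actual formula:

    /* adaptive null-move reduction: the larger reduction when plenty of
       depth remains, the safer one otherwise; the cutover is a tunable */
    int null_reduction(int remaining_depth) {
      return (remaining_depth > 6) ? 3 : 2;
    }

Omid's verified-null-move results were, as I recall, with a fixed R=3.)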

I have not tossed the idea out; I simply ran out of play-time near the end of
our semester and had to put it aside.  Then this hyper-threading box showed up
early and generated a bit of discussion here, so I got side-tracked there as
well.  :)


>
>I implemented this in Gerbil last night and ran it.
>
>I found that this was inferior to both R=2 and R=3 at every one-second
>interval, with ECM, between 1 and 20 seconds.
>
>Meaning that it never produces more solutions in a given number of seconds.
>
>General R=3 also is never better than R=2, given this testing methodology, in
>Gerbil.
>
>It is possible that I implemented it wrongly.  There are a couple of things that
>I don't understand and had to guess about:
>
>1) When the null-move search comes back with fail high, and verify is true, I
>will do a regular search with reduced depth.  If this search fails high, I cut
>off like normal.  If this search does not fail high, I have to re-search with
>the original depth.  What I don't understand is what I do if the initial reduced
>depth search modified alpha.  I am assuming that I put alpha back the way it was
>and start over.

That is how I did it, yes...
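
In rough C, the shape of the whole thing is something like the following
sketch.  The names (POS, SearchMoves(), and so on) are made up for the
sketch, and the verification reduction VR is one of the things that needs
tuning:

    typedef struct position POS;   /* engine-specific; stub for the sketch */

    #define R  3    /* null-move reduction (adaptive R=3~2 in my case) */
    #define VR 1    /* verification reduction; an open tuning question */

    extern int  null_ok(const POS *p);
    extern void MakeNullMove(POS *p);
    extern void UnmakeNullMove(POS *p);
    /* the normal move loop: search all moves at "depth" within (alpha,beta) */
    extern int  SearchMoves(POS *p, int alpha, int beta, int depth, int verify);

    int Search(POS *p, int alpha, int beta, int depth, int verify) {
      if (depth > R && null_ok(p)) {
        int value;
        MakeNullMove(p);
        value = -Search(p, -beta, -beta + 1, depth - 1 - R, verify);
        UnmakeNullMove(p);
        if (value >= beta) {
          if (!verify)
            return value;           /* ordinary null-move cutoff */
          /* verification: a regular search at reduced depth, with no
             further verification below this point */
          value = SearchMoves(p, alpha, beta, depth - VR, 0);
          if (value >= beta)
            return value;           /* fail-high verified; cut off as usual */
          /* not verified: fall through to a full-depth re-search.  Passing
             the original alpha again is the "put alpha back the way it was
             and start over" step. */
        }
      }
      return SearchMoves(p, alpha, beta, depth, verify);
    }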


>
>2) I don't know if this implementation allows two consecutive null moves or
>not.  Specifically, I don't know what "null_ok()" does.  I am assuming that if
>I don't allow two null moves in a row already, I can continue to not allow two
>null moves in a row.


I did it just like my normal program.  No consecutive null-move searches, and
I did continue to use the "avoid-null" hash table trick as well.
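
A guess at the shape of such a test, reusing the stubs from the sketch above
(LastMoveWasNull(), InCheck(), and HashSaysAvoidNull() are invented names):

    extern int LastMoveWasNull(const POS *p);
    extern int InCheck(const POS *p);
    extern int HashSaysAvoidNull(const POS *p);

    int null_ok(const POS *p) {
      if (LastMoveWasNull(p))   return 0;  /* no two consecutive null moves */
      if (InCheck(p))           return 0;  /* a null move is illegal in check */
      if (HashSaysAvoidNull(p)) return 0;  /* "avoid-null" trick: a stored hash
                                              bound already shows the null-move
                                              search would not fail high here */
      return 1;
    }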



>
>3) I am assuming that if the search cuts off when doing the reduced-depth
>searches, the depth recorded in the hash table should be the original depth,
>and not the reduced depth.

That is what I did, although it might be a problematic assumption; I didn't
give it much thought, since the code already worked that way normally...
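
As a fragment, in the same made-up vocabulary as the sketches above
(HashStore() and its argument order are invented):

    #define LOWER_BOUND 1   /* bound type: value is a lower bound (fail high) */

    extern void HashStore(const POS *p, int draft, int type, int value, int move);

    /* inside Search(), at the point where the cutoff is returned: */
    if (value >= beta) {
      /* store the node's ORIGINAL depth as the draft, not (depth - VR),
         even though the cutoff came from the reduced-depth verification
         search */
      HashStore(p, depth, LOWER_BOUND, value, best_move);
      return value;
    }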


>
>I can't find any bugs in my implementation, if my assumptions were correct.
>
>bruce


