Computer Chess Club Archives



Subject: Re: My miscellany of minor chess programming problems ?

Author: Robert Hyatt

Date: 11:29:21 08/29/05



On August 29, 2005 at 12:03:22, rasjid chan wrote:

>On August 29, 2005 at 10:12:51, Robert Hyatt wrote:
>
>>On August 29, 2005 at 06:02:09, Gian-Carlo Pascutto wrote:
>>
>>>On August 27, 2005 at 23:30:52, Robert Hyatt wrote:
>>>
>>>>On August 27, 2005 at 15:36:15, rasjid chan wrote:
>>>>
>>>>>On August 27, 2005 at 10:09:21, Jon Dart wrote:
>>>>>
>>>>>>> So this mean we don't need to keep pv[][] if we have hash tables (it is
>>>>>>> like double accounting).
>>>>>>
>>>>>>I think this is false, given you have a finite size hash table and must
>>>>>>eventually replace something. I have some experience: I used to retrieve pv from
>>>>>>the hash table but now use a pv array and back up the scores.
>>>>>>
>>>>>>--Jon
>>>>>
>>>>>They tell me so and I begin to doubt. Maybe as Dr Hyatt says, backup the pv.
>>>>>It may be best to be simple as I don't yet know how hashing twists and turns
>>>>>within.
>>>>>
>>>>>Rasjid
>>>>
>>>>Just remember this.  While searching the PV, _after_ you search a move on the PV
>>>>path, you do a lot of other searching.  Any of which can overwrite the PV move
>>>>so that you get no move at that point, and a resulting short PV.  The "back up"
>>>>method has no significant cost associated with it, since it is not done very
>>>>often in a PVS-type search...
>>>>
>>>>If you really don't care, the hash table approach works much of the time, and
>>>>does have zero overhead.  The array backup method has a finite but small
>>>>overhead.  I find that avoiding the short PVs helps in testing and debugging,
>>>>but in real games is irrelevant with respect to the game outcome...
>>>
>>>What about setting a flag in the hashtable that it is a PV move and should not
>>>be overwritten?
>>>
>>>I never did this, but thinking of it, why not?
>>>
>>>--
>>>GCP
>>
>>
>>one reason:  if you search to depth=30, you don't just have 30 PV moves.  :)
>>And I am talking about depth=30 as in the PV is actually just 30 plies long.
>>But there are _lots_ of other moves that will look like PV moves, where you back
>>a best move up, but then it gets thrown out.
>>
>>This has been tried in the past.  And the problem is that you can and will end
>>up with a _huge_ population of such positions at times, which basically reduces
>>the size of the hash table to almost nothing, killing overall performance.  All
>>to get the PV that is easy enough to get by backing it up.  :)
>>
>
>"I played with this a few years back, and someone suggested the "flag" idea.  I
>was astounded at the number of such "PV" nodes that I found in the table just by
>running through every entry, after doing searches in problem positions..."
>
>I think the key here is experience and I would not want to rely on my logic too
>much. "Why not" sometimes fails, and when I read about it I said "... why not.."
>That's simple to answer -
>
>Even normal hashing gives priority in order of EX,UB,LB and never overwrite
>an EX of greater or equal draft of the current search sequence. The next
>iteration is always with ++depth for the PV. I doubt collisions (different
>positions vying for the same slot) will happen to coincide with the 30-60 PV
>nodes (so the ...one to million ... of Diepeveen). So the logic seems to favour
>the PV being intact. I think maybe others reason along a similar line.
>
>Safer to just backup. It won't weaken a program.
>
>Rasjid.


Here is a good test...

(I did this a long while back; there might even be comments in main.c about it,
but I am not sure.)

Take crafty, which does two things:

1.  it backs up the PV;

2.  it probes the hash table to extend the PV after a search ends, so that those
"short" PVs get lengthened.

And add a feature to simply probe the hash table and produce a brand-new PV
after you get back from searching a move that returns a backed-up score.  What I
found was two things:

(1) The PVs from the hashing approach were absolutely no better, and were, on
occasion, shorter.

But more importantly,

(2) the hash-reconstructed PV occasionally had a bogus move.  Bogus in that the
backed up PV would not have that move.  And on occasion, bogus in that the move
was actually illogical in the position.

What happened in one specific case I tried to debug was this:

The search reached position P with a remaining depth of N.  On backing up it
stored the result, and the move stored there is the one I want, since it is part
of the PV.  But while searching other alternatives for the same root move, the
search found a quicker way to reach position P, leaving more depth remaining
than the previous search, and got a beta cutoff.  It then stored a "best" move
that is not really a best move, just "a move good enough to cause a cutoff".
Or, the "best move" previously stored might not be the move you would choose
along this different search path even though the position is the same, because
the search will probably extend differently below that node and settle on a
different best move.  There are a lot of ways to overwrite the key hash
positions with perfectly valid information that nevertheless yields wrong PV
moves.  In some cases the PV might actually contain _better_ moves than the
moves that led to the terminal score that was backed up.  Which is OK, except
when debugging.

If you have to do it that way, it is workable.  But I personally prefer to do it
in a way that is as accurate as possible.  And since the cost is pretty much
immeasurable (in terms of time wasted), doing it more accurately seems
reasonable.





Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.