Author: Richard Pijl
Date: 01:16:14 07/08/03
On July 07, 2003 at 19:31:58, Ricardo Gibert wrote:

>I was reading about alternatives to alpha-beta search and got an idea that
>generalizes ETC (Enhanced Transposition Cut-offs). It is based on the following
>observations:
>
>1 - The greater the draft of a cut node, the more confidence that the cut node
>will remain a cut node with deeper search.
>
>2 - Having more than one move that produces a cut gives more confidence that the
>cut node will remain a cut node with deeper search.
>
>One problem with (2) is that additional cut moves may simply transpose into one
>another without having any real independent significance. However, if the backed
>up scores of each of the cut moves are not equal to one another, they can be
>presumed to be "independent". I'm not 100% sure about this, so let me know if
>this is reasonable.
>
>Now the way to generalize ETC based on (1) and (2) is to also accept
>"independent" transposing moves with a lower associated draft provided there are
>more than one of them. Such transpositions can be found at nearly zero cost,
>since when you fail to find a transposition of sufficient draft, you must run
>through all the legal moves anyway in your search for one.
>
>This can be even further extended by accepting even shallower drafts provided
>enough of them are found to offset risk.
>
>An example would be to accept two independent moves of draft = x - 1 in lieu of
>one move of draft x. If this is too risky, it could be made safer by requiring
>three moves of draft x - 1, etc.
>
>In a nutshell, the idea is to allow the trade-off of draft for redundancy to
>increase the payoff of using ETC. If this idea is workable I would not expect it
>to give a big payoff. Also, it probably isn't original either, since it is
>rather simple.

I've been looking at ETC for the last week and have implemented it in the Baron. One of the things I discovered is that you must be _very_ careful accepting a cut node based on a hash entry without sufficient depth.
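The draft-for-redundancy trade-off Ricardo describes can be sketched as a small acceptance test over the hash entries collected during the ETC scan. Everything below (`CutEntry`, `etc_accept`, the field names) is a hypothetical illustration of the idea in the post, not code from any actual engine:

```c
#include <stdbool.h>

/* One candidate cut move found while running through the legal moves
 * during the ETC probe: the draft stored in its hash entry and its
 * backed-up score.  CutEntry and etc_accept are hypothetical names. */
typedef struct { int draft; int score; } CutEntry;

/* Accept a transposition cutoff at required draft d if either:
 *  - a single entry has draft >= d (ordinary ETC), or
 *  - at least `need` entries have draft >= d - 1 with pairwise
 *    distinct scores ("independent" in the sense of the post). */
bool etc_accept(const CutEntry *e, int n, int d, int need)
{
    int shallow = 0;
    for (int i = 0; i < n; i++) {
        if (e[i].draft >= d)
            return true;                  /* ordinary ETC cutoff */
        if (e[i].draft >= d - 1) {
            bool independent = true;      /* distinct from earlier scores? */
            for (int j = 0; j < i; j++)
                if (e[j].draft >= d - 1 && e[j].score == e[i].score)
                    independent = false;
            if (independent && ++shallow >= need)
                return true;              /* redundancy offsets the risk */
        }
    }
    return false;
}
```

Making the rule safer, as the post suggests, is just a matter of raising `need` (three draft-`x - 1` entries instead of two).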
Errors introduced by this type of pruning seem to lead to serious problems: although a speed increase can be seen (a given ply is reached faster), some easy positions are no longer solved. I haven't analysed it thoroughly, but I assume that when you cut with insufficient depth, the result is stored in the hash table as if it had sufficient depth (if not at the same ply, then one ply earlier, i.e. one move up in the search). If that entry is reused, the search gets blind spots. This problem also occurs when extensions are not taken into account.

What I currently do is a 'verified' ETC: when a possible cutoff is found, the move causing it is searched first in the normal search. If the depth is sufficient, the cutoff happens during the normal search; if the depth is insufficient, there is still a pretty good chance that the move is good and will produce a cutoff anyway. But the errors described above can no longer happen.

Richard.
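The 'verified' ETC described above might look like the following sketch: instead of cutting on a too-shallow hash hit, the candidate move is promoted to the front of the move list so the normal search verifies it at full depth. All names (`EtcAction`, `etc_verify`, `promote`) are hypothetical illustrations, not the Baron's actual code:

```c
/* Possible outcomes of probing a hash entry at an ETC node. */
typedef enum { ETC_IGNORE, ETC_CUTOFF, ETC_TRY_FIRST } EtcAction;

/* entry_draft: depth stored with the hash entry; needed_depth: depth
 * required at this node; entry_score/beta: the fail-high test. */
EtcAction etc_verify(int entry_draft, int needed_depth,
                     int entry_score, int beta)
{
    if (entry_score < beta)
        return ETC_IGNORE;        /* entry implies no cutoff at all */
    if (entry_draft >= needed_depth)
        return ETC_CUTOFF;        /* ordinary, safe ETC cutoff */
    return ETC_TRY_FIRST;         /* too shallow: verify by searching */
}

/* For ETC_TRY_FIRST: move the candidate to the front of the move list
 * so the normal search tries it first.  A genuine cutoff then happens
 * inside the search at the correct depth, so no entry with inflated
 * depth is ever stored and no blind spots develop. */
void promote(int moves[], int n, int candidate)
{
    for (int i = 0; i < n; i++) {
        if (moves[i] == candidate) {
            for (int j = i; j > 0; j--)
                moves[j] = moves[j - 1];
            moves[0] = candidate;
            return;
        }
    }
}
```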