Author: Robert Hyatt
Date: 08:51:34 08/29/00
On August 29, 2000 at 07:44:08, Pat King wrote:

>On August 28, 2000 at 23:32:52, Robert Hyatt wrote:
>
>>On August 28, 2000 at 22:36:26, Larry Griffiths wrote:
>>
>>>On August 28, 2000 at 22:21:40, Robert Hyatt wrote:
>>>
>>>>On August 28, 2000 at 19:32:05, Larry Griffiths wrote:
>>>>
>>>>>I have been reading about the History Heuristic and have seen pros and cons
>>>>>about it.
>>>>>
>>>>>I plan on implementing it to see what happens. This heuristic is related to
>>>>>killer moves and uses the from and to squares in a 64 x 64 array to maintain
>>>>>history information when moves are best moves or cutoffs. Each entry has 2 to
>>>>>the depth power added to it when a best move or cutoff is found.
>>>>>
>>>>>Would you recommend the History Heuristic, and has anything changed for the
>>>>>better with the method described above?
>>>>>
>>>>>Thanks in advance.
>>>>>
>>>>>Larry.
>>>>
>>>>
>>>>Works fine, but don't use 2^depth... for reasonable search depths, that will
>>>>overflow 32-bit counters almost immediately. I use depth^2, which is much
>>>>safer...
>>>>
>>>>Other than that it works fine. If you don't get a cutoff by the time you have
>>>>tried a few history-ordered moves, you probably should give up and just search
>>>>the rest of the moves in random order.
>>>
>>>I have Jonathan Schaeffer's paper "The History Heuristic and Alpha-Beta Search
>>>Enhancements in Practice". I also figured the counters might overflow, and it
>>>looks like he ran his tests to around 9 plies. He also describes that the
>>>history tables can become flooded with information, decreasing their usefulness.
>>>I wondered if this was due to an overflow of his counters at plies 8 and 9.
>>>
>>>Excuse me Bob, but I have not done powers in quite a while, and I was thinking
>>>2^depth amounted to shifting the binary value 2 left by depth positions. Maybe I
>>>am just tired, but is depth^2 the same as depth squared? I plan on using 64-bit
>>>counters, so I am not worried about overflowing them. I thought I would
>>>also try different formulas for calculating the weights.
>>>
>>>Larry.
>>
>>
>>Yes, depth^2 is depth squared, which is much safer than 2^depth.
>>
>>Another point is that after each search, you need to scale the history counts
>>back a bit. I shift them all right 8 bits, which means after 4 moves they are
>>zero, if a particular entry has had no use in the previous 4 searches.
>>
>>
>>64-bit counters would solve the problem for now, allowing 2^depth to work.
>>Slower, of course.
>
>My approach is to test whether I'm about to overflow, and scale back the counts
>at that point, rather than at every search. I.e.,
>
>if (MAXCOUNT-history[square]<depth*depth) halvehistory();
>history[square]+=depth*depth;
>
>adding an "if" for every cutoff, but saving halvehistory every search, and never
>worrying about an overflow. Perhaps I'm keeping information too long this way,
>however.
>
>Pat

That is a costly thing to do in the search, of course... If you use depth*depth,
it will take a _long_ time to overflow a 32-bit counter. But testing during the
search for an impending overflow seems like overkill and wasted cycles during
the search.
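[Editor's note: a minimal sketch in C of the scheme the thread describes, assuming
a 64 x 64 from/to table and 32-bit counters. The names history_table,
update_history, age_history, MAXCOUNT, and halve_history are illustrative
assumptions, not taken from Crafty or any other engine.]

/*
 * History heuristic sketch, per the discussion above:
 * a 64x64 counter table indexed by a move's from and to squares.
 */
static unsigned int history_table[64][64];   /* indexed [from][to] */

/* On a beta cutoff (or when a move proves best), credit the move with
   depth squared. Hyatt's point: 2^depth would overflow a 32-bit counter
   almost immediately at reasonable search depths, depth^2 will not. */
void update_history(int from, int to, int depth)
{
    history_table[from][to] += (unsigned int)(depth * depth);
}

/* Hyatt's aging step, run once after each completed search: shift every
   counter right 8 bits, so an entry that gets no use for four consecutive
   searches decays to zero. */
void age_history(void)
{
    int from, to;
    for (from = 0; from < 64; from++)
        for (to = 0; to < 64; to++)
            history_table[from][to] >>= 8;
}

/* Pat's variant instead checks for impending overflow on every update and
   halves the whole table only when needed, roughly:

       if (MAXCOUNT - history_table[from][to] < depth * depth)
           halve_history();
       history_table[from][to] += depth * depth;

   Hyatt's objection is that this test runs on every cutoff inside the
   search, while depth*depth takes a very long time to overflow 32 bits
   anyway. */

In move ordering, untried moves would then be sorted by descending
history_table[from][to]; per Hyatt's advice earlier in the thread, once a few
history-ordered moves have failed to produce a cutoff, the rest can simply be
searched in the order they were generated.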