Author: Robert Hyatt
Date: 09:24:26 09/18/01
On September 18, 2001 at 11:50:40, Vincent Diepeveen wrote:

>On September 18, 2001 at 11:31:00, Robert Hyatt wrote:
>
>>On September 18, 2001 at 09:10:02, Vincent Diepeveen wrote:
>>
>>>On September 17, 2001 at 12:00:45, Gian-Carlo Pascutto wrote:
>>>
>>>Hello, now that I saw this link and downloaded the source I
>>>realized this was also published in Advances in Computer Chess, volume 8.
>>>Of course I have that book, so I first removed some dust from it,
>>>after which I opened it and saw on page 71 the following claim
>>>by M. Brockington and J. Schaeffer:
>>>
>>>"... Although most parallel alpha-beta programs take months to develop,
>>>the game-independent library allows users to integrate parallelism
>>>into their application with only a few hours of work"
>>>
>>>[cough]
>>>
>>>Well, if no one here manages to do that, who am I to say that the
>>>remainder of this algorithm is worth trying?
>>>
>>>For sure there is a speedup problem.
>>>
>>>But there is more. It's a completely different algorithm than YBW.
>>>
>>>Which means that APHID already decides who has to search what before the
>>>relevance of a parallel split has been established. Considering that nowadays
>>>we use nullmove big time, this makes APHID completely outdated, because
>>>in short it doesn't wait at all!
>>
>>
>>You need to _read_ the article first. That version of Crafty used null-move
>>R=2, recursively. Changing to an adaptive R of 2-3 would be a trivial change.
>
>Bob, the speedups claimed are for search depths of 5 ply to 8 ply MAXIMUM.
>
>Based entirely upon a program called
>
>>
>>>
>>>The first few pages say that a major problem with YBW is the idle time.
>>>
>>>This refers to the fact that at each node YBW needs the result of the
>>>first move before splitting the other moves. Now this is of course very
>>>true, but it also directly shows that APHID's branching factor is by
>>>definition bigger than that of algorithms which use YBW.
>
>>this is incorrect. Cray Blitz didn't use YBW either. It tried to do that
>>whenever possible, but it _never_ waited, _ever_. It would choose to split
>
>I'm doing that partly too, but then I directly re-split if possible, so I see
>DIEP as a program whose splitting strategy is completely YBW dominated.
>
>Did you re-split in Cray Blitz when possible?

Yes, I did...

>
>>_somewhere_ rather than sitting idle, taking a chance that the parallel work
>>done was needed.
>
>>>Back in 1996, for example, TheTurk, which used APHID, searched 2500 nodes
>>>a second. I don't need to mention that for today's networks, capable
>>>of searching millions of nodes a second, that branching factor is a bigger
>>>problem than it used to be in 1997.
>>>
>>>Note that Deep Blue seemingly used a similar approach to APHID, which,
>>>considering its search depth, was of course no problem to use.
>>
>>
>>Deep Blue's search was not related to the APHID approach in any fashion. Their
>>search was completely different for obvious hardware reasons.
>
>The explanation given by Hsu in IEEE99 is pretty similar to the story
>the APHID guys give...

I don't see how. Hsu's algorithm is two-level. APHID is not.

>
>>
>>>
>>>>On September 17, 2001 at 11:54:35, Vincent Diepeveen wrote:
>>>>
>>>>>On September 17, 2001 at 07:10:46, Gian-Carlo Pascutto wrote:
>>>>>
>>>>>Well, show me the code and I can run it here on my 100mbit
>>>>>network then!
>>>>
>>>>ftp://ftp.cs.ualberta.ca/pub/games/aphid
>>>>
>>>>APHID libs, Crafty version for them
>>>>
>>>>You need PVM (Parallel Virtual Machine, a standard message-passing
>>>>lib), and probably a quick look at some of their docs on how
>>>>to set up the config files.
>>>>
>>>>You can get the papers/thesis and all from
>>>>
>>>>http://www.cs.ualberta.ca/~games/aphid/index.html
>>>>
>>>>--
>>>>GCP
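To make the null-move point above concrete, here is a rough sketch of what
"fixed R=2 versus an adaptive R of 2-3" amounts to. This is not actual Crafty
code; search(), make_null_move(), unmake_null_move(), in_check() and the depth
threshold are placeholder names, and real programs add more conditions before
trying a null move.

/*
 * Sketch only: going from a fixed R=2 to an adaptive R of 2-3 is
 * essentially a one-line change in the reduction amount.
 */
extern int  search(int alpha, int beta, int depth);   /* placeholder */
extern void make_null_move(void);                     /* placeholder */
extern void unmake_null_move(void);                   /* placeholder */
extern int  in_check(void);                           /* placeholder */

int try_null_move(int alpha, int beta, int depth)
{
    int R, score;

    if (depth < 2 || in_check())   /* typical cases where null move is skipped */
        return alpha;              /* caller reads anything < beta as "no cutoff" */

    /* fixed reduction:    R = 2;                                          */
    /* adaptive reduction, the "R=2-3" change: deeper searches reduce more */
    R = (depth > 6) ? 3 : 2;

    make_null_move();              /* hand the opponent a free move */
    score = -search(-beta, -beta + 1, depth - 1 - R);
    unmake_null_move();

    return score;                  /* score >= beta means a null-move cutoff */
}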
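And since the whole argument turns on where YBW allows a split, here is an
equally rough sketch of the Young Brothers Wait idea. Again, this is not DIEP,
Crafty or Cray Blitz code; Position, MoveList, the helper functions and the
split-depth threshold are all placeholders.

/*
 * Sketch only: the first ("eldest") move at a node is always searched
 * serially, and only after it fails to cut off is the node offered to
 * idle processors as a split point.
 */
typedef struct Position Position;                       /* placeholder */
typedef struct { int count; int move[256]; } MoveList;  /* placeholder */

extern void generate_moves(Position *p, MoveList *ml);  /* placeholders */
extern void make_move(Position *p, int move);
extern void unmake_move(Position *p, int move);
extern int  idle_processors(void);
extern int  parallel_search(Position *p, MoveList *ml, int next,
                            int alpha, int beta, int depth);

#define MIN_SPLIT_DEPTH 4          /* placeholder threshold */

int ybw_search(Position *pos, int alpha, int beta, int depth)
{
    MoveList ml;
    int i, score;

    if (depth == 0)
        return 0;                  /* stand-in for a real evaluation */

    generate_moves(pos, &ml);

    /* Eldest brother: always searched serially by this processor. */
    make_move(pos, ml.move[0]);
    score = -ybw_search(pos, -beta, -alpha, depth - 1);
    unmake_move(pos, ml.move[0]);
    if (score >= beta)
        return score;              /* cutoff: a split here would have been wasted */
    if (score > alpha)
        alpha = score;

    /* Young brothers: only now is this node treated as a split point. */
    if (idle_processors() && depth >= MIN_SPLIT_DEPTH)
        return parallel_search(pos, &ml, 1, alpha, beta, depth);

    for (i = 1; i < ml.count; i++) {
        make_move(pos, ml.move[i]);
        score = -ybw_search(pos, -beta, -alpha, depth - 1);
        unmake_move(pos, ml.move[i]);
        if (score >= beta)
            return score;
        if (score > alpha)
            alpha = score;
    }
    return alpha;
}

The "idle time" the paper complains about is exactly the wait for that first
serial move; APHID avoids the wait by deciding ahead of time who searches what,
which is the trade-off being argued about above.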