Author: John Merlino
Date: 13:19:23 11/13/01
On November 12, 2001 at 15:52:25, Scott Gasch wrote:

>On November 09, 2001 at 20:25:52, John Merlino wrote:
>>
>>To give a quick answer:
>>
>>My test was with a book with just under 150,000 games in it. It took about 250MB
>>of RAM (which ended up requiring about 100MB of swapping on my machine), and a
>>little less than 4 hours to process at a depth of 40 plies. The result (after
>>pruning zero-weight branches, which is, in effect, the same as your "straining"
>>process) was a file that was about 550K. If I had not pruned the zero-weight
>>branches, the file would have been about 6.1MB. Admittedly, though, this timing
>>is during a debugging process, and I have not actually tried running it with a
>>release build.
>
>What is taking it so long? Is it swapping? That will kill the speed of book
>generation, of course. Is the PGN reader just really slow? Have you tried
>profiling the code during book generation? It might give you an aha.
>
>>However, I think our PGN reader code is one of the main bottlenecks. It appears
>>to only be able to import about 100 games/second, and nobody else has reported
>>anything less than 2,000 games/second. That's probably something I should look
>>at.

It's DEFINITELY the PGN reader (which is very old and VERY picky code, meaning
that I am reluctant to change any of it). It appears to average 15-20ms per game,
which is why I'm only getting about 60 games/second total speed during import.
Oh well....

According to my survey, there's probably only about 100 people who actually use
it (and I'm only half-kidding).

jm
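
[Editor's note: a minimal sketch of the "prune zero-weight branches" step described above, written in C++. This is not the actual book-builder code from the post; the names (BookNode, weight, PruneZeroWeight) are hypothetical and only illustrate the general idea of dropping subtrees that carry no weight before the book is written out, which is what shrinks the output from roughly 6.1MB to 550K.]

    #include <map>
    #include <memory>
    #include <string>

    struct BookNode {
        int weight = 0;                               // accumulated weight for this position
        std::map<std::string, std::unique_ptr<BookNode>> children;  // move text -> subtree
    };

    // Recursively remove zero-weight subtrees.
    // Returns true if this node itself carries no weight and can be dropped.
    bool PruneZeroWeight(BookNode &node) {
        for (auto it = node.children.begin(); it != node.children.end(); ) {
            if (PruneZeroWeight(*it->second))
                it = node.children.erase(it);         // whole subtree carries no weight
            else
                ++it;
        }
        return node.weight == 0 && node.children.empty();
    }

[After this pass, only lines that were actually reached with nonzero weight remain in the tree, so the file written to disk contains just the usable book moves.]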