Computer Chess Club Archives



Subject: Re: Tablebase generation - what am I overlooking?

Author: Angrim

Date: 15:52:42 02/13/02



On February 13, 2002 at 18:39:23, Russell Reagan wrote:

>How is the decompression handled with the large TB's? Obviously you can't
>decompress a 3GB file from disk and probe it every move. Does it do partial
>decompression of a certain number of bytes? I'm curious how this works.
>
>Russell

Block compression. The file is split into 8k chunks, which are
then compressed individually.  To locate a compressed chunk in the
file, an index is used, which is loaded into RAM.  This is why the
tablebases need a big chunk of RAM.  I think it is 4 bytes of
index per 8k of uncompressed file, but I haven't looked recently.
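A minimal sketch of the scheme described above, in Python. The 8k block size and the 4-byte-per-block offset index are from the post; the choice of zlib for the per-block compression and all function names are illustrative assumptions, not the actual tablebase format:

```python
import io
import struct
import zlib

CHUNK = 8192  # uncompressed block size, as described above

def build(data: bytes):
    """Compress `data` in fixed-size blocks; return (blob, index).
    The index holds one 4-byte offset per 8k of uncompressed data,
    plus a sentinel offset marking the end of the last block."""
    out = io.BytesIO()
    offsets = []
    for i in range(0, len(data), CHUNK):
        offsets.append(out.tell())
        out.write(zlib.compress(data[i:i + CHUNK]))
    offsets.append(out.tell())  # sentinel
    index = struct.pack("<%dI" % len(offsets), *offsets)
    return out.getvalue(), index

def probe(blob: bytes, index: bytes, pos: int) -> int:
    """Return the byte at uncompressed offset `pos`, decompressing
    only the single 8k block that contains it."""
    block = pos // CHUNK
    start, end = struct.unpack_from("<2I", index, 4 * block)
    return zlib.decompress(blob[start:end])[pos % CHUNK]
```

Because each block is compressed independently, a probe needs one index lookup (in RAM), one disk read of the compressed block, and one small decompression, never the whole file.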

When you have a terabyte (uncompressed) of tablebases, this
can really cut into your available ram.
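Working out the numbers for the figures quoted above (4 bytes of index per 8k of uncompressed data, one terabyte of tablebases):

```python
uncompressed = 1 << 40            # 1 TiB of uncompressed tablebases
entries = uncompressed // 8192    # one index entry per 8k block
index_bytes = entries * 4         # 4 bytes per entry
print(index_bytes // (1 << 20))   # → 512 (MiB of RAM for the index alone)
```

So the index alone is on the order of half a gigabyte, which was a very large share of available RAM on 2002-era hardware.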

Angrim




Last modified: Thu, 15 Apr 21 08:11:13 -0700

Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.