Author: Peter McKenzie
Date: 02:52:02 01/30/02
On January 30, 2002 at 05:37:19, Andreas Stabel wrote:

>On January 30, 2002 at 01:56:33, Dann Corbit wrote:
>
>>Let's suppose that you need 170 bits to encode a chess position. Now, with that
>>position [for instance], you may have automatically stored 50 permutations of it.
>>The net number of bits needed to store each of those 50 positions is 170/50 =
>>3.4 bits. Any time you run into one of the others, you perform the same
>>algorithm and discard any duplicates (low-order keys already stored). You end
>>up storing a single position, which represents an entire class of positions.
>
>... snip ...
>
>>The math is incredibly simple. A lot less work than decompressing a page from a
>>Nalimov tablebase file, I would guess.
>
>A final time:
>Let's suppose you need 170 bits to encode a chess position; that is the
>same as saying you have 2^170 possible positions, of which you want to store one.
>Now suppose that by some miracle all these positions fall into equivalence
>classes of 50, or let's even say 64, "permutations", so the total number of
>positions to store is only 1/64 of the initial number. Then the number of
>bits needed is ***NOT*** 170/64; rather, the number of positions becomes
>2^170/64 = 2^170/2^6 = 2^(170-6) = 2^164, which still takes 164 bits.
>
>i.e. you ***DO NOT*** divide the number of bits by 64 -- you divide the number
>of positions!!!
>So the answer is: you subtract the base-2 logarithm of the factor by which you
>divide the number of possible positions from the number of bits:
>170 - log2(64) = 170 - 6 = 164

I finally gave in and had a quick scan of this thread ... the argument here
seems pretty conclusive. Unless there is something else to this magical
compression scheme that was overlooked?

>Best regards
>Andreas Stabel
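Andreas's point is easy to sanity-check numerically. The sketch below (my own illustration, not from the thread; the 170-bit figure and the class size of 64 are taken as given from the discussion) shows that shrinking the set of positions by a factor of 64 removes only 6 bits, not a factor of 64 of the bits:

```python
import math

# Hypothetical numbers from the thread: 2^170 distinguishable positions,
# grouped into equivalence classes of 64 "permutations" each.
total_positions = 2 ** 170
class_size = 64

# Bits needed to index N equally likely items is log2(N).
bits_before = math.log2(total_positions)
bits_after = math.log2(total_positions // class_size)

print(bits_before)  # 170.0
print(bits_after)   # 164.0 -- i.e. 170 - log2(64), not 170 / 64
```

Dividing the *position count* by 2^k always subtracts exactly k from the *bit count*, because log2(N / 2^k) = log2(N) - k.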