Author: Angrim
Date: 16:16:06 01/30/02
On January 30, 2002 at 15:21:53, Dann Corbit wrote:

>On January 30, 2002 at 08:37:07, Vincent Diepeveen wrote:
>
>>this is dead wrong, because then we can search that entire
>>search space without nullmove and with nullmove we do it in
>>a few seconds then, yet we do not search it within a few
>>seconds.
>>
>>secondly mirroring reduces position by factor n.
>>
>>Say n = 2 for pawns ==> 2^250 / 2 = 2^249
>>
>>The whole problem is that you guys still haven't figured out highschool math!
>
>I talked to Les on the phone to see where he was coming from. Allow me to
>clarify.
>
>You cannot store all of the chess positions in 2^81 bits. Here is what you can
>do:
>1. Take a billion positions and reduce them to 1 billion/40 (or some such
>factor).
>2. Store them to disk.
>3. Look them up.
>
>That's it. Not so bad, really.
>
>For positions which don't have a lot of pieces and for which sliding pieces make
>the key move, you may be able to store the entire set to disk in just a few bits
>per position, because of the fact that you only store a small subset to disk.
>
>You can't encode "all of chess" in this way and get the same savings.

Nice clarification. And if that were actually what the method could do, it would
be reasonably useful for opening databases. Unfortunately, what it actually does
is:

1. Take a billion positions that you want to store.
2. Generate 39 billion other positions that are in some sense equivalent.
3. Store a billion positions to disk.
4. Claim that you stored 40 billion positions and got 40:1 compression.

The critical point is that you cannot compress an arbitrary set of positions
with this method.

Angrim
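The counting argument behind that last point can be sketched numerically. To encode an arbitrary set of n positions drawn from a universe of U distinguishable positions, you need at least log2(C(U, n)) bits, since every distinct subset must get a distinct encoding. The universe and subset sizes below are illustrative toy numbers, not figures from the thread; the bound is computed via `lgamma` so it works for astronomically large U:

```python
from math import lgamma, log

def log2_comb(u: float, n: float) -> float:
    """log2 of the binomial coefficient C(u, n), computed through
    lgamma so that astronomically large u is no problem."""
    return (lgamma(u + 1) - lgamma(n + 1) - lgamma(u - n + 1)) / log(2)

def min_bits_per_position(universe_bits: int, n: int) -> float:
    """Information-theoretic lower bound, in bits per stored position,
    for encoding an arbitrary n-element subset of a universe of
    2**universe_bits positions."""
    return log2_comb(2.0 ** universe_bits, n) / n

# Toy example: a universe of 2**40 positions, one million stored.
# Each position still costs roughly 40 - log2(10**6) + 1.44 ~ 21.5
# bits -- nowhere near a free 40:1 saving over a naive encoding.
print(min_bits_per_position(40, 10**6))
```

The bound only drops when the stored set is not arbitrary, i.e. when the encoder may pick which positions count as "stored" — which is exactly the sleight of hand in steps 2 and 4 above.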