Author: Vincent Diepeveen
Date: 16:04:45 03/09/04
On March 09, 2004 at 10:48:15, Bob Durrett wrote:

>On March 09, 2004 at 09:08:45, Vincent Diepeveen wrote:
>
>>On March 09, 2004 at 08:45:07, Bob Durrett wrote:
>>
>>If you've got 3-4GB of RAM you can generate the 6-men for DIEP, no problem.
>
>That is interesting! Suppose 7 men or more were desired? How much RAM needed
>to do the job?

It depends upon the time I get to implement it. Look, all the 6-men will still take 2 years with Eugene's efforts there, so the discussion is theoretical anyway.

But let's assume a 6-men: if you put effort into the generator, you don't need much RAM then. Doing it the easy way, like my 6-men generator does (I'm lazy, I'll admit it), needs quite a bit more RAM.

The question is what you want to generate in 7 men. You first need a shitload of pawnless nonsense EGTBs before you can start on the interesting ones containing pawns.

In DIEP format the 7-men are pretty big. Where for the 6-men my index scheme needs 2 x 2.62T entries in total, for the 7-men it needs:

2 x : In total 1511 EGTBs using 311,825,928,380,540 entries

So that's like 2 x 311,825,928,380,540 = 622T entries. That's quite a lot: more than a factor 100 more, and 3000+ files.

Let's quote the smallest file without pawns:

728 krbnkrn 01240208 358752350880 -- -- -- --

So each of the 2 files will need 358,752,350,880 entries. 358 billion entries a file!!!!

Using the simple generator style I have now, that's 358G / 8 = 45GB. So with like 48GB of memory or so you can try generating it the simple way. There are a lot of tricks to do it with less RAM, but that eats a lot of programming time, and time is something I hardly have.

Until then, if you find someone with 3-4GB of RAM to generate EGTBs for DIEP, just give a shout.

>Maybe most people do not have that much RAM. That might drastically cut down on
>the number of people who could generate multi-men tablebases.

Like near to zero.

>Perhaps a very large computer, or maybe a special purpose computer would be
>needed.
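The arithmetic behind the 45GB figure can be sketched in a few lines, assuming the "simple generator" keeps one bit per index entry in RAM (that matches the entries/8 division above; this is an illustration, not DIEP's actual code):

```python
# RAM estimate for the simple generation style described above,
# assuming one bit per index entry held in memory (entries / 8 = bytes).
# Entry counts are the ones quoted in the post.

def ram_gb(entries, bits_per_entry=1):
    """Bytes needed for an in-memory table, expressed in GB (10^9 bytes)."""
    return entries * bits_per_entry / 8 / 1e9

krbnkrn_entries = 358_752_350_880      # smallest pawnless 7-man file quoted
print(ram_gb(krbnkrn_entries))         # ~44.8 GB, hence "like 48GB memory or so"

seven_man_total = 311_825_928_380_540  # all 7-man entries, one side to move
print(ram_gb(seven_man_total) / 1000)  # ~39 TB if one table had to hold them all
```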
>It might also be possible to create a new algorithm for generation of
>tablebases which would accommodate smaller RAMs at the expense of maybe taking
>longer [and much swapping with hard drive?] to generate the tablebases.

Perhaps, can, might. The problem is time. No one gets paid to generate EGTBs. Nalimov wrote his thing entirely in his spare time, and without Kadatch I doubt we would be any further, as the sizes of the files are just huge.

>There is another issue: Suppose that 7-, 8-, or 9-man tablebases were
>available. Would an ordinary PC be able to utilize such tablebases? Would

The 6-men compressed in DIEP format are about 100GB, but they are not generated yet. I estimate Nalimov needs 2.6-3TB for his. We are far away from having them all yet. Suppose it's like 2.6TB, does your ordinary PC support that? Perhaps in 2010 it will...

>existing chess-playing programs [GUIs?] be able to work with such large
>tablebases? It takes a little bit of time to load Fritz as it is. Maybe use of
>very large tablebases would make it take forever to load.

The only loading problem is the compression tables created by Kadatch's compression. You can also extract all your EGTBs and you will have no delay; all the 5-men will eat only like 30GB then.

>Perhaps there are other practical limitations to the use of very large
>tablebases too.

The only limitation is the time programmers can invest into creating easy solutions for the users. I'm sure with some type of supercompression you can get all the 6-men down to like 10GB easily. My win/draw/loss experiments now show different results.

This is the raw DIEP format:

24-02-2004 21:24 61.496.784 krpkr_b.dtb
24-02-2004 19:53 61.496.784 krpkr_w.dtb

Compressed with datacomp from Andrew Kadatch using 8KB chunks it's like 21.8MB. Using RAR it's like 14MB. Using RKC it's 8MB, so like 2.5 times more compact than Kadatch. That however eats 800MB of RAM and a fast 2.25GHz CPU, and like 24 hours to calculate how to compress it ideally.
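The compression factors implied by those sizes can be spelled out directly (the compressed MB figures are the rounded ones quoted above, so the ratios are approximate):

```python
# Compression ratios for one krpkr file, using the sizes quoted above.
# "MB" here is 10^6 bytes, close enough for the rough factors in the post.
raw = 61_496_784                    # krpkr_w.dtb, raw DIEP format, in bytes
compressed = {
    "datacomp (Kadatch, 8KB chunks)": 21.8e6,
    "RAR": 14e6,
    "RKC": 8e6,
}
for name, size in compressed.items():
    print(f"{name}: factor {raw / size:.1f}")
# RKC vs Kadatch: 21.8 / 8 = ~2.7, i.e. "like 2.5 times more compact"
```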
Decompressing is identical: also like 24 hours, an entire evening. So compressing the EGTBs down to small sizes might eat more time than generating them, if you want to look at it like that. I see no limits yet to what is possible, however; the only problem is the time invested. Kadatch's Huffman code works right out of the box, so to speak (still busy implementing it, though).

The 2 Nalimov files are, by the way, together:

07-07-2000 11:32 68.946.403 krpkrnbb.emd
07-07-2000 11:33 81.247.218 krpkrnbw.emd

So DIEP's EGTBs, even raw (without investing that extra time), are 20MB where Nalimov needs 140MB. Say a factor 7 difference for 5-men, using raw unmodified 5-men files of DIEP. Yet a better index scheme can do a lot, and better compression can do a lot. There is quite a difference between 8MB and 140MB: that's already a factor 17.5 difference (which for 6-men will directly convert into a factor 40-50 difference or so).

>Thoughts = ?
>
>Bob D.

>>For Nalimov you might need a bit more.
>>
>>With DIEP only CPU speed and memory access count. Harddisk speeds, unless very
>>very slow, are not most interesting.
>>
>>>
>>>The solution is thousands of years old. It's called "divide and conquer."
>>>
>>>Simply get a large number of people to create the tablebases.
>>>
>>>For example, a seven-piece tablebase can be split into n mutually-exclusive types of
>>>seven-piece endgames and then each person generates the tablebase for his/her
>>>part.
>>>
>>>This assumes some coordination. I hereby nominate Nalimov to do that
>>>coordination.
>>>
>>>Bob D.
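The DIEP-versus-Nalimov size factors above, spelled out as arithmetic (the 20MB and 8MB DIEP figures are the rounded sizes quoted earlier, and the post rounds Nalimov to 140MB, so the exact ratios differ slightly):

```python
# Size factors for the krpkr tablebase, DIEP vs Nalimov, from the
# byte counts and rounded MB figures quoted in the post.
nalimov = 68_946_403 + 81_247_218   # krpkrnbb.emd + krpkrnbw.emd, ~150 MB
diep_baseline = 20e6                # ~20 MB quoted for both DIEP files
diep_rkc = 8e6                      # ~8 MB with the heavy RKC compression

print(nalimov / diep_baseline)      # ~7.5 ("factor 7" with Nalimov rounded to 140 MB)
print(nalimov / diep_rkc)           # ~18.8 ("factor 17.5" with the same rounding)
```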
Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.