Author: Guido
Date: 04:11:54 05/29/02
On May 28, 2002 at 03:30:53, Russell Reagan wrote:

>Some of the 6-piece tablebases are over a gigabyte in size. Some of those files
>take over a month to generate (assuming you even have hardware capable of
>generating them in the first place), and there are still many more files to be
>generated before the 6-piece tablebases are complete.
>
>So, I'm wondering how long we (we being Bob or whoever is doing this computing)
>are going to continue generating these things. In a few years (or however long
>it takes), when the 6-piece tablebases are complete, will we start on the
>7-piece tablebases? After those are complete many more years later, do we then
>start on the 8-piece tablebases?
>
>It seems to me that, since this problem grows exponentially, it will eventually
>cause a major problem somewhere down the line, because the size of hard disks
>does not seem to be growing exponentially with respect to time. Eventually we
>will get to the point where a single tablebase file will not fit onto any hard
>disk. Eventually the exponential growth of the time it takes to produce the
>files will exceed the computing power of the time.
>
>When that happens, will the tablebase generation continue on? Or will you look
>at the problem and think, "well, it took 30 years to generate the N-piece
>tablebases, so it will take at least 10,000 years to generate the N+1-piece
>tablebases" and decide to stop? Or will you continue on in the hope that
>hardware advances will help things move along more quickly?
>
>If you will eventually stop generation of the tablebases, when would you
>estimate that will be? Not in years' time, but after which level of tablebase
>generation? After the 8-piece tablebases are complete? Or the 7-piece ones?
>Any ideas?
>
>Russell

On the basis of my experience with EGTB generation, the complete 6-man tablebases and the simple 7-man ones (without pawns and with duplicated pieces) could be generated with only 2 GB of RAM, but under the following conditions:

- a very fast 64-bit processor, able to manage files of practically unlimited size;
- a very large, fast, error-free hard disk.

For example, the KRBBKRR ending occupies little more than 85 GB in my indexing scheme. So a disk of at least 4 times this size is needed, plus some GB for the derived tablebases (KRBKRR, KBBKRR and KRBBKR, evaluated previously). Therefore 400 GB could be the smallest disk, keeping in mind that 8 bits per position are not sufficient (but 16 are not necessary!).

The factor of 4 comes from the fact that, for each ending, there are always two tablebases, and the work must be saved periodically: since the tablebases are updated on disk, there are two current files and two backup files.

I spoke of a very fast, error-free hard disk for the following reason: supposing that 200 is the longest sequence of the above-mentioned ending, and that a save is made every cycle, an estimate of the bytes read/written is C*85*200 GB, where IMHO a rough value for C could be 10-12. In total, about 200 TB would have to be read and written without errors to generate such an ending.

On my small PC I tried generating the same ending using only RAM, and then using a small amount of RAM plus disk. The increase in CPU time is not very big, about 20-30%, but I did this check only on very simple endings.
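To make the arithmetic above easy to check, here is a small C sketch that reproduces the budget. Every constant is the one stated in this post; the multiplier C is only the rough guess mentioned above, not a measured value.

/* Back-of-the-envelope check of the KRBBKRR figures in this post. */
#include <stdio.h>

int main(void) {
    const double tb_gb   = 85.0;   /* one KRBBKRR tablebase, in GB        */
    const int    files   = 4;      /* 2 tablebases x (current + backup)   */
    const double longest = 200.0;  /* assumed longest sequence, in cycles */
    const double c_lo = 10.0, c_hi = 12.0;  /* rough I/O multiplier C     */

    printf("minimum disk: %.0f GB (plus some GB for derived tablebases)\n",
           files * tb_gb);
    printf("total I/O   : %.0f-%.0f TB read/written without errors\n",
           c_lo * tb_gb * longest / 1000.0,
           c_hi * tb_gb * longest / 1000.0);
    return 0;
}

Running it prints 340 GB and 170-204 TB, which is where the 400 GB smallest disk and the "about 200 TB" above come from.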
I think that the CPU time needed to generate the 6- and 7-man endings is surely the biggest problem, together with the required characteristics of the disks. But when 64-bit processors and large, fast disks become cheaply available, the EGTB computations could be done for the most part in parallel by many people on different computers (as is done now for Mersenne primes), with each participant sending their results to the others, thus reducing the years this job requires.

Ciao
Guido
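P.S. For the distributed idea, the bookkeeping rule is that an ending can only be generated once the sub-endings reached by captures already exist, so a coordinator would hand out only endings whose dependencies are done. Below is a minimal C sketch of that rule; the piece-letter string format is just an illustration, not the naming scheme of any real generator.

#include <stdio.h>
#include <string.h>

/* Print every sub-ending obtained by capturing one non-king piece.
   Duplicated pieces (the two bishops, the two rooks) give the same
   sub-ending twice. */
static void print_dependencies(const char *ending) {
    size_t n = strlen(ending);
    for (size_t i = 0; i < n; i++) {
        if (ending[i] == 'K')  /* kings are never captured */
            continue;
        char sub[16];
        memcpy(sub, ending, i);                 /* pieces before the capture */
        memcpy(sub + i, ending + i + 1, n - i); /* skip piece i, keep '\0'   */
        printf("  needs %s\n", sub);
    }
}

int main(void) {
    /* "KRBBKRR" = white K,R,B,B vs black K,R,R */
    printf("KRBBKRR depends on:\n");
    print_dependencies("KRBBKRR");
    return 0;
}

For KRBBKRR this prints exactly the derived tablebases named above: KBBKRR, KRBKRR (twice, one per bishop) and KRBBKR (twice, one per black rook).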