Author: GuyHaworth
Date: 12:02:36 09/08/02
I don't think Rob Hyatt and I are in disagreement about the parallelisability or
distributability of EGT generation.
Distributability: an example. There is nothing to stop computer A generating
KQQKQP while computer B generates KQQKRP. All they require is that the successor
endgames KQQKQQ/R/B/N and KQQKRQ/R/B/N, reached on promotion, have been generated
first (along with the smaller endgames reached by captures).
There is plenty of 'width' available in the lattice of endgames to allow
independent generation of EGTs.
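
To make that concrete, here is a small Python sketch of my own (the names and the
dependency lists are illustrative only, and deliberately limited to the promotion
successors mentioned above; a real scheduler would also track the sub-endgames
reached by captures). Each batch it releases consists of mutually independent
endgames, so each one could be handed to a different computer.

    # A toy dispatcher over the endgame lattice.  DEPENDS_ON is deliberately
    # incomplete: only the promotion successors named above are listed, not the
    # capture successors a real generator would also need.

    DEPENDS_ON = {
        "KQQKQP": {"KQQKQQ", "KQQKQR", "KQQKQB", "KQQKQN"},
        "KQQKRP": {"KQQKRQ", "KQQKRR", "KQQKRB", "KQQKRN"},
    }

    def dispatch(targets, completed, generate):
        """Hand out every target endgame whose prerequisite EGTs already exist.

        Endgames released in the same batch are mutually independent, so each
        could be generated on a different computer at the same time."""
        pending = set(targets)
        while pending:
            batch = {e for e in pending if DEPENDS_ON.get(e, set()) <= completed}
            if not batch:
                raise RuntimeError("prerequisite EGTs missing for: " + ", ".join(sorted(pending)))
            for endgame in batch:        # in reality: submit these as parallel/remote jobs
                generate(endgame)
                completed.add(endgame)
            pending -= batch

    # e.g. once the eight promotion-successor EGTs exist, KQQKQP and KQQKRP can
    # be generated at the same time on different machines:
    done = set.union(*DEPENDS_ON.values())
    dispatch(["KQQKQP", "KQQKRP"], done, lambda e: print("generating", e))
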
Next, one can distinguish phases in the generation of an EGT (a toy sketch of
phases a, c and d follows the list):
a) the initialisation of the EGT, i.e. identifying and valuing the terminal and
   transition positions; this is a major piece of work and is commonly underestimated
b) [ an optional phase, available with twin-sourcing:
   confirmation that the initialisation of the EGT-generation is correct ]
c) retrograde or repeated-sweep generation of the EGT
d) independent validation of the integrity of the EGT generated
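
For concreteness, here is a toy single-processor sketch of phases a, c and d in
Python. The game graph, the names and the win/loss/draw-only values (no
depth-to-mate, no real chess positions) are mine, chosen only to show the shape
of the computation, not how a real EGT generator works.

    UNKNOWN, WIN, LOSS, DRAW = "?", "win", "loss", "draw"

    # Toy game graph: position -> positions reachable in one move.  Position "d"
    # has no moves, so the side to move there has lost; the cycle a -> e -> a
    # can never be resolved and ends up drawn.
    SUCC = {
        "a": ["b", "e"],
        "b": ["d"],
        "c": ["d", "e"],
        "d": [],
        "e": ["a"],
    }

    # (a) initialisation: value the terminal positions.  A real generator would
    #     also value the transition positions here, from the sub-EGTs already built.
    value = {p: (LOSS if not SUCC[p] else UNKNOWN) for p in SUCC}

    # (b) would cross-check that initialisation against an independently produced
    #     second source; it is omitted here.

    # (c) repeated-sweep generation: back values up until a full sweep changes nothing.
    changed = True
    while changed:
        changed = False
        for p, succ in SUCC.items():
            if value[p] != UNKNOWN:
                continue
            if any(value[q] == LOSS for q in succ):      # some move reaches a lost position
                value[p], changed = WIN, True
            elif succ and all(value[q] == WIN for q in succ):  # every move reaches a won position
                value[p], changed = LOSS, True
    for p in SUCC:                                       # positions never resolved are drawn
        if value[p] == UNKNOWN:
            value[p] = DRAW

    # (d) independent validation: every value must be consistent with its successors.
    for p, succ in SUCC.items():
        vs = [value[q] for q in succ]
        if value[p] == WIN:
            assert LOSS in vs
        elif value[p] == LOSS:
            assert all(v == WIN for v in vs)             # vacuously true for terminal "d"
        else:
            assert LOSS not in vs and any(v != WIN for v in vs)

    print(value)    # a, e drawn; b, c won; d lost
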
Within 'c', a further opportunity for parallelisation exists:
c1) [ notional: divide the range of the EGT-index into N parts ]
c2) have processor 'k' back up the frontier positions in part k of the index
Agreed: you would not start the next retro-phase until all N processors had
completed the current one. Shared memory is needed for this, not to mention
multiple disc-drives; a toy sketch of this synchronisation pattern follows.
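
The sketch below shows c1/c2 using Python threads and a barrier. The subtraction
game, the names and the constants are mine, and CPython's GIL means the sweeps
are merely interleaved rather than truly parallel; the point is only the
division of the index range into N parts and the rule that no processor starts
the next retro-phase before every processor has finished the current one.

    import threading

    N_WORKERS = 4
    SIZE = 1000                          # toy "EGT index": positions 0 .. SIZE-1
    UNKNOWN, WIN, LOSS = 0, 1, 2

    value = [UNKNOWN] * SIZE
    value[0] = LOSS                      # initialisation: the only terminal position
    changed = [True] * N_WORKERS
    barrier = threading.Barrier(N_WORKERS)

    def successors(n):
        # toy subtraction game: from n you may move to n-1 or n-2; no move = loss
        return [m for m in (n - 1, n - 2) if m >= 0]

    def worker(k):
        lo, hi = k * SIZE // N_WORKERS, (k + 1) * SIZE // N_WORKERS  # part k of the index
        while True:
            progress = False
            for n in range(lo, hi):                      # back up only part k
                if value[n] != UNKNOWN:
                    continue
                succ = [value[m] for m in successors(n)]
                if LOSS in succ:                         # a move reaches a lost position
                    value[n], progress = WIN, True
                elif succ and all(v == WIN for v in succ):  # every move reaches a won position
                    value[n], progress = LOSS, True
            changed[k] = progress
            barrier.wait()                               # all parts of this retro-phase done
            done = not any(changed)                      # every worker sees the same answer
            barrier.wait()                               # ... before any flag is overwritten
            if done:
                break

    threads = [threading.Thread(target=worker, args=(k,)) for k in range(N_WORKERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # In this toy game the lost positions are exactly the multiples of 3.
    assert all((value[n] == LOSS) == (n % 3 == 0) for n in range(SIZE))
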
In summary, EGT-generation is as parallelisable as required:
resource-availability is the only limit.
g