Computer Chess Club Archives


Subject: Re: Interview with Christophe Theron (from ChessBase) about GT2 and CT14

Author: David Rasmussen

Date: 05:55:39 04/19/01



On April 18, 2001 at 16:01:06, Uri Blass wrote:
>
>I know it but the point is that programs are very fast and every algorithm in
>chess that is better for deep search is probably better for today's hardware.
>>

That is simply not true. One example is Enhanced Transposition Cutoff (ETC), as
I've mentioned earlier. ETC will always be an improvement in the long run, since
you gain an exponential amount of time by spending a constant (but, on today's
systems, large) amount of time. That trade is often not beneficial on today's
systems, but on a much faster system it would be.
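To make the idea concrete, here is a minimal sketch of an ETC probe loop. The names (TTEntry, child_key, the map-based table) and the bound convention are invented for this sketch; a real engine would use Zobrist hashing and a fixed-size hash table:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Toy transposition-table machinery, stand-ins for real engine code.
struct TTEntry {
    int depth;           // search depth the entry was stored at
    int score;           // score from the side-to-move's perspective
    bool is_upper_bound; // true if score is an upper bound (fail-low entry)
};

using TTable = std::unordered_map<std::uint64_t, TTEntry>;

// Hypothetical: hash key of the position reached by playing `move`.
std::uint64_t child_key(std::uint64_t key, int move) {
    return key ^ (0x9E3779B97F4A7C15ULL * (move + 1)); // toy Zobrist-style mix
}

// Enhanced Transposition Cutoff: before searching any child subtree, probe
// the table for every child position. If a stored entry already proves the
// child's score <= -beta (so the parent's negamax score >= beta) at
// sufficient depth, fail high immediately without searching anything.
bool etc_cutoff(const TTable& tt, std::uint64_t key,
                const std::vector<int>& moves, int depth, int beta) {
    for (int move : moves) {
        auto it = tt.find(child_key(key, move));
        if (it != tt.end() && it->second.depth >= depth - 1 &&
            it->second.is_upper_bound && -it->second.score >= beta)
            return true; // one child refutes this node: cutoff
    }
    return false; // no cheap cutoff; fall through to the normal search
}
```

The extra hash probes at every interior node are exactly the "constant (but large)" cost above: they buy nothing when the table is cold, but prune whole subtrees when it is warm.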

>>If you draw the two functions with a larger a value, you will see that at some
>>point they cross, and then n*log(n) wins. You know this, of course. But my point
>>is that the domain of chess, modern chess program implementation and modern
>>hardware, is in a state like bubblesort at the very beginning of the graph (1-5
>>data elements). That means the constants are often far more important than the
>>algorithmic complexities, because the datasets are _very_ small (far too short
>>time controls).
>
>I guess that it is the case for hardware of 1970 but I guess that it is not
>truth for the hardware of today unless you play games at 0.001 second per move.
>

Again, I disagree. I am talking about all the algorithmic enhancements that are
_not_ used today because of the state of our systems. There are many constant-time
or polynomial-time enhancements one could make that would gain exponential
amounts of time, but which would only start to pay off once the number of nodes
searched at normal time controls were 1,000 or 100,000 or 1,000,000,000 times
larger or more.
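The crossover can be sketched with a toy cost model: without the enhancement a search to depth d visits roughly b^d nodes; with it, every node pays a constant overhead factor but the effective branching factor shrinks. All constants here (b = 3.0, b_eff = 2.8, overhead factor 6x) are invented for illustration, not measurements from any engine:

```cpp
#include <cmath>

// Cost of a plain search: roughly b^d nodes at branching factor b, depth d.
double cost_plain(double b, int d) { return std::pow(b, d); }

// Cost with a hypothetical enhancement: every node is (1 + c) times more
// expensive, but the effective branching factor drops to b_eff < b.
double cost_enhanced(double b_eff, double c, int d) {
    return (1.0 + c) * std::pow(b_eff, d);
}
```

At shallow depths the constant factor dominates and the enhancement loses; at depths reachable on much faster hardware, the smaller exponential base wins by an ever-growing margin. That is the regime the enhancements above are waiting for.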




Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.