Computer Chess Club Archives



Subject: Re: O(1) garbage

Author: Dann Corbit

Date: 15:27:05 05/15/01


On May 15, 2001 at 18:10:30, Jesper Antonsson wrote:
[snip]
>>All I care about is that in 8x8 chess, for each additional ply of depth I try
>>to do, I get an exponential increase in the work required to do the search.
>>For all depth N that are doable in the universe I live within.
>
>Does the big O definition say anything about what's doable in your universe?

That is the only point of big-O notation.  If it told us nothing about what is
doable in practice, why would we bother with complexity analysis at all?

Things that are demonstrably [read 'measurably'] O(1) can be done in essentially
the same time for all n.
Things that are demonstrably O(log log n) can be done nearly instantaneously for
very large n.
Things that are demonstrably O(log n) can be done quickly for large n.
Things that are demonstrably O(n) can be done fairly quickly for large n.
Things that are demonstrably O(n^2) can be done for fairly large n.
Things that are demonstrably O(n!) or O(exp(n)) can be done only for small n.
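
To hang some rough numbers on that list, here is a small C sketch.  The figure
of 1e9 simple operations per second and the sample sizes are just assumed round
numbers, chosen only to show the shape of the curves:

/* Back-of-the-envelope feasibility table.  ASSUMPTION: a machine doing
 * 1e9 simple operations per second; the sizes n are arbitrary samples.
 * An "inf" in the exponential row is itself the point: the time
 * overflows a double long before the work would complete. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double ops_per_sec = 1e9;
    const double sizes[] = { 1e3, 1e6, 1e9 };

    for (int i = 0; i < 3; i++) {
        double n = sizes[i];
        printf("n = %.0e\n", n);
        printf("  log n : %.2e s\n", log(n) / ops_per_sec);
        printf("  n     : %.2e s\n", n / ops_per_sec);
        printf("  n^2   : %.2e s\n", n * n / ops_per_sec);
        printf("  2^n   : %.2e s\n", pow(2.0, n) / ops_per_sec);
    }
    return 0;
}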

That's why we bother to find out.  This sort of analysis tells us that
(heuristically speaking) chess or TSP will cause problems at large input sizes
(plies for chess, cities for TSP) if we want an exact solution.  But it will
also tell us that if we have a sorted list of cities, we can find a given city
almost instantly, despite the list containing every city in the world.  That
sort of analysis will tell us that we can compute combinations and permutations
for a deck of cards without too much trouble, but that DNA will be a bit of a
problem.
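
The sorted-list-of-cities case is just a binary search.  A minimal C sketch
follows; the handful of city names is a placeholder for a real table, and even
a table holding every city in the world would need only on the order of 20
comparisons per lookup:

/* O(log n) lookup in a sorted list via the standard library's bsearch.
 * The five names below are placeholders; the point is that the number
 * of comparisons grows with log n, not n. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int cmp(const void *a, const void *b)
{
    return strcmp(*(const char *const *)a, *(const char *const *)b);
}

int main(void)
{
    const char *cities[] = { "Berlin", "Cairo", "Lima", "Oslo", "Tokyo" };
    const size_t n = sizeof cities / sizeof cities[0];
    const char *key = "Oslo";

    const char **hit = bsearch(&key, cities, n, sizeof cities[0], cmp);
    printf("%s %s\n", key, hit ? "found" : "not found");
    return 0;
}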

Why do we care whether a sort is O(n^2) or O(n*log(log(n))) in the first place,
when we will never sort more than a few billion things, tops?  Because we want
to get some sort of grip on what problem size is doable.
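
To hang one concrete number on that (assuming roughly 1e9 element operations
per second, which is only a round figure):

/* Why the sort's growth rate matters even for a bounded n.
 * ASSUMPTION: about 1e9 element operations per second. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double ops_per_sec = 1e9;
    const double n = 1e9;              /* "a few billion things, tops" */
    const double secs_per_year = 3.15e7;

    printf("n^2           : %.1e s  (about %.0f years)\n",
           n * n / ops_per_sec, n * n / ops_per_sec / secs_per_year);
    printf("n*log(log(n)) : %.1e s\n", n * log(log(n)) / ops_per_sec);
    return 0;
}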

I learned complexity analysis from a purely heuristic standpoint.  From the
large cluster of dissenters, it is clear that some other people learned
complexity analysis from a purely mathematical standpoint.  Quite frankly, I see
no use in that.  We already know that if it is an algorithm, it will terminate.
If we want to know the answer to the question "WHEN?" then we need to know how
the algorithm behaves as a function of its input size.

If the algorithm is exponential, we *know* that it is *stupid* to cram a great
big input down its throat.  It won't ever come back.
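
For chess in particular, here is a rough sketch of what "won't ever come back"
means.  The branching factor of about 35 and the rate of 1e6 nodes per second
are assumed round numbers for a plain, un-pruned search, not measurements of any
real engine:

/* Rough cost of a full-width minimax search of the given depth.
 * ASSUMPTIONS: average branching factor ~35, 1e6 nodes per second.
 * Both are illustrative round numbers. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double branching = 35.0;
    const double nodes_per_sec = 1e6;
    const double secs_per_year = 3.15e7;

    for (int depth = 5; depth <= 20; depth += 5) {
        double nodes = pow(branching, depth);
        printf("depth %2d: ~%.1e nodes, ~%.1e years\n",
               depth, nodes, nodes / nodes_per_sec / secs_per_year);
    }
    return 0;
}

Alpha-beta roughly halves the exponent in the best case, but the curve is still
exponential in depth, which is the whole point.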

>>So I will continue to say that N+1 is hard, while some will say it is O(1).
>>I'm waiting to see their algorithm however, that will search to depth=20 and
>>depth=21 in the same amount of time.
>
>Not 20->21, but perhaps 200-201,

exp(200) is a rather large value.  If chess remains exponential, this value is
still absurdly out of reach.  Maybe someone will come up with something better
and search(pos,200) will make sense.  Right now, it doesn't.

>and certainly with 6000->6001.

exp(6000) is so large that it has no useful meaning for humans, other than for
discussing things like sizes of galaxies and quantities of atoms.  When it comes
to the time to complete a task, it is effectively no different from forever.

>And this
>algorithm can be constructed, you know that and I know that. That it hasn't been
>done isn't relevant.

That it would be completely useless is relevant.


