Author: Andrew Dados
Date: 12:49:59 05/09/01
On May 09, 2001 at 15:14:10, Dann Corbit wrote:
>On May 09, 2001 at 15:02:21, Andrew Dados wrote:
>
>>On May 09, 2001 at 14:17:33, Dann Corbit wrote:
>>
>>>On May 09, 2001 at 14:10:02, Andrew Dados wrote:
>>>[snip]
>>>>I would rather say that 'a finite number of states in a problem means O(1)'.
>>>>Or 'if it is solvable once and for all (any chess position, in our case), then
>>>>it's O(1)', because we proved there exists a formula computable without loops.
>>>>We don't know such a formula yet, but we know it exists. For a given traveling
>>>>salesman problem we still need those loops, for now.
>>>>
>>>>Now, set up an 8-piece position and leave it overnight for Crafty with 5-piece
>>>>TBs. Would it stay O(exp(n))?
>>>
>>>In storage, yes; in computation, no. We have already performed the calculation.
>>>Now, again, some will say "the storage cost is finite," and indeed it is, for
>>>any n. But as n increases, storage needs increase exponentially.
>>
>>Not exponentially. You'll see fewer positions with 32 pieces than with 31.
>>Don't tell me 32 is too big a number for estimating limiting bounds.
>
>Not at all. In fact, if you plug 32 into the equation, you will find an
>estimate of the time to compute it, which was what was wanted all along.
>
>>>Now, in the case of tablebase files, clearly n cannot increase forever. But it
>>>will increase until you can't store the data. Not now, and probably not ever.
>>>Despite all this, the storage requirements for tablebase files are exponential.
>>>
>>>>Surely not. Most likely you'll get O(1). So we are already catching that O(1)
>>>>tail, slowly :)
>>>
>>>Again, we are trading O(exp(n)) computational complexity for O(exp(n)) storage
>>>complexity (which is also not a solution). I suspect that we won't ever have
>>>the 32-piece tablebase file, for the same reason that we won't ever compute
>>>chess all the way to the end.
>>>
>>>Trading space for time is a common solution. Consider the O(n) calculation of
>>>n!. I can precompute a table of factorials and store them. Then my calculation
>>>of factorials is O(1) {a simple table lookup}, but my storage is O(n). [A small
>>>code sketch of this trade follows the quoted thread below.]
>>
>>You cannot store all natural numbers
>
>They are not a finite set. You cannot compute any algorithm over the natural
>numbers that is not O(1).
>
>>, nor can you compute all n!.
>
>Not sure what you mean by this. Can you expand?
>
>>In
>>contrast, the number of chess positions, while it may seem huge to us now, is
>>bounded, and they can all be stored.
>
>Storage is another issue entirely. It may or may not be connected to
>computational complexity.
>
>I can store a list of all the cities on the earth. Is TSP now O(1)?
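[Before I answer: a small Python sketch of the factorial space-for-time trade
quoted above, assuming some fixed upper bound N on the arguments we will ever
ask about. N = 1000 here is my own arbitrary choice. Build the table once in
O(N); every query after that is O(1).]

  # Precompute n! for all n <= N once: O(N) time, O(N) storage.
  N = 1000  # assumed bound; pick whatever the application needs
  fact = [1] * (N + 1)
  for i in range(1, N + 1):
      fact[i] = fact[i - 1] * i

  print(fact[20])  # each query is now a plain table lookup

[Each lookup is O(1), but the table itself still grows with N, which is exactly
the storage point made above.]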
Arrgh... you keep coming back to that TSP and its list of all cities... If in
TSP we were given the same number of cities in fixed locations every time, then
it would be O(1), sure.
In chess we have one fixed startup position. Period. You have the same input
every time here.
You can't compare a whole class of problems to one well-defined problem. A fixed
problem doesn't scale, because there is nothing to vary on the input. And once
solved, it has O(1) computational cost; a sketch of that follows below.
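To make that concrete, here is a minimal Python sketch (a toy game of my own
invention, nothing from Crafty or real tablebase code): solve one fixed
instance by memoized minimax once, and every later query is a pure table
lookup, i.e. O(1) for that fixed problem.

  from functools import lru_cache

  # Toy fixed instance: a pile of 10 stones; players alternately take 1 or 2;
  # whoever takes the last stone wins. One fixed position, like chess's
  # fixed startup position.
  @lru_cache(maxsize=None)
  def wins(stones):
      # True if the player to move wins with perfect play.
      if stones == 0:
          return False  # no move left: the previous player took the last stone
      return any(not wins(stones - take) for take in (1, 2) if take <= stones)

  print(wins(10))  # solved once; asking again just hits the memo table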
>
>>You are right that today, for all practical purposes, the complexity of minimax
>>applied to chess equals that of minimax applied to the stock market, namely
>>O(exp(n)). However, the asymptotic behaviour for large n (large means n > 100
>>for me :)
>
>Large n is different for every f(n). For f(n) = O(1), there *is* no large n.
>For f(n) = exp(n!)!, n = 5 is large.
>
>>will not be of order O(exp(n)).
>>
>>And no matter how slow (or fast) we are computing today, we need to distinguish
>>between practical cost now and theoretical asymptotic cost.
>
>They mean the same thing. Do we only care about sorting integers as the count
>grows to infinity? Do we only care about the most efficient ways to visit all
>cities on a route with the minimum distance if there are an infinite number of
>them?
>
>We will never have an infinite collection to sort. We will never have an
>infinite set of cities to visit. We want to know how much it will cost us to
>sort a vector or visit the cities as n changes.
>
>>Since one day we will reach O(1000)=O(1001) and then our cost will be O(1) :)
>
>O(1000) = O(1001) and, in fact, O(999999999999999999999999999999999999999999)
>are all the same value, exactly.
>
>Consider that O(1) + O(1) = O(1) and that an arbitrary constant multiplier
>exists for all of these big-O functions.
Since n^5000 > exp(n) for n < 1000, I estimate the computational cost at
O(n^5000) < O(exp(n)). According to your argument, we should not care about
too big an n :).
Now, if you agree that O(n^5000) is a better cost than O(exp(n)), why don't we
use O(n^5000)? (A quick check of that inequality follows below.)
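Checking that inequality in Python by comparing logarithms, since the numbers
themselves are astronomically large: n^5000 > exp(n) exactly when
5000*ln(n) > n.

  from math import log

  # n^5000 > e^n  <=>  5000*ln(n) > n; see where the polynomial stops winning.
  for n in (10, 100, 1000, 10000, 100000):
      print(n, 5000 * log(n) > n)

The polynomial wins up to roughly n = 54,000, far beyond n = 1000.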
Here's another one: since there are fewer than 2^100 chess positions, solving
chess will require at most 2^100 steps. Yes... this is O(1). Where did you get
that O(exp(n)) thingie at all?? :)
Also, Big-O is defined by asymptotic behaviour, Dann; you can't arbitrarily
declare which numbers are big and which are small. For that matter,
alpha-beta + TT will never be O(exp(n)).
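For the record, the standard textbook definition (nothing controversial here):

  $f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \text{ such that } |f(n)| \le c\,|g(n)| \text{ for all } n \ge n_0$

The constants c and n_0 absorb all the talk about big or small numbers; the
definition only constrains behaviour beyond n_0.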
I think most of us know what Big-O is; we just differ in classifying the chess
problem as finite or not.
-Andrew-