Computer Chess Club Archives



Subject: Re: O(1) garbage

Author: Robert Hyatt

Date: 20:14:44 05/15/01



On May 15, 2001 at 17:52:07, Jesper Antonsson wrote:

>On May 14, 2001 at 13:58:50, Robert Hyatt wrote:
>
>>On May 13, 2001 at 20:42:29, Jeremiah Penery wrote:
>>
>>>On May 13, 2001 at 19:17:36, Jesper Antonsson wrote:
>>>
>>>>On May 11, 2001 at 21:42:53, Jeremiah Penery wrote:
>>>>
>>>>>On May 11, 2001 at 16:43:38, Jesper Antonsson wrote:
>>>>>
>>>>>>My understanding of this topic is without a doubt as good or better than yours,
>>>>>>thank you very much. I agree that it is not meaningful to talk about complexity
>>>>>>for sorting 1000 items, while I don't agree that's comparable to chess. You,
>>>>>>however, don't seem to understand that *depth* was the input considered at the
>>>>>>start of this argument, and that *chess* is the topic.
>>>>>
>>>>>It seems you don't understand the point of the O-notation.  It's not supposed to
>>>>>tell you that algorithm FOO takes time X to run on input BAR.  It's supposed to
>>>>>let you _estimate_ the running time of algorithm FOO on input BAR+1 (BAR+2,
>>>>>BAR+3, etc...) when you know the running time on input BAR.  ***You know that
>>>>>the algorithm will eventually complete, no matter what the input!***  But, for
>>>>>example, naming a sorting algorithm O(1) says that it will sort 1000000 numbers
>>>>>just as fast as it sorts 5 numbers.  When you call a tree-searching algorithm
>>>>>O(1), when depth is the input, you're saying that it takes the same amount of
>>>>>time to search depth 500 as it does to search depth 3.  Obviously, this is wrong
>>>>>for any chess program.
>>>>
>>>>This is all wrong, my friend. The "O-notation" doesn't have a "point" (or at
>>>>least not that one)
>>
>>
>>That is totally incorrect.  From one of many books, "Introduction to Algorithms"
>>by Udi Manber:
>>
>>"The purpose of algorithm analysis is to predict the behavior, expecially
>>the running time, of an algorithm."
>>
>>As for the "mythical" requirement that N be unbounded, I give this explicit
>>definition of big-O:
>
>Bob, you are dreaming. You keep repeating that "unbounded" nonsense, but at
>least I have never given that requirement.


I'm not dreaming.  It has been stated _multiple_ times.  If N is "bounded"
then the search is O(1), rather than O(W^(D/2)).  I happen to not buy that
since my books don't mention it, but some insist on it...
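
For reference, the O(W^(D/2)) figure is the Knuth/Moore best case: with
perfect move ordering, alpha-beta on a uniform tree of width W and depth D
visits about W^ceil(D/2) + W^floor(D/2) - 1 leaf nodes, versus W^D for plain
minimax.  A quick sketch of what that growth looks like (my own illustration,
W = 38 assumed, not code from any engine):

    import math

    W = 38                    # assumed branching factor for chess
    for D in range(1, 11):
        minimax = W ** D                                       # full-width leaf count
        alphabeta = W ** math.ceil(D / 2) + W ** (D // 2) - 1  # best-case alpha-beta
        print(f"depth {D:2d}: minimax ~{minimax:.2e}, "
              f"alpha-beta best case ~{alphabeta:.2e}")

Either way, each added ply multiplies the work by a factor between roughly
sqrt(W) and W; it is never constant.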




>
>>"We say a function g(n) is O(f(n)) for another f(n) if there exist constants
>>c and N such that for all n>=N, we have g(n) <= cf(n).
>
>Exactly! Now when you have the definition in front of you, you should be able to
>easily see that chess is O(1). Why? Because when n>=5000 or whatever,
>g(n) <= c*1. (n is target depth, g(n) running time, c is a big constant) It
>isn't harder than that.
>

And I maintain that definition of O(1) is flawed and nonsensical.  Say I
redefine the TSP to include every city currently on the surface of planet
Earth and no more.  Do you still claim that when I go from solving 1000
cities to solving 2000 cities, the solution is O(1)?  I don't, even though
the number of cities has some upper bound, because I can do the math to see
that the upper bound isn't going to be reached for a long time.  In the case
of chess, your constant c has to be on the order of 38^10000.  I claim that
is close enough to infinity that the two can't be distinguished.

In another thread I noted that one trillion years is about 3*10^28
nanoseconds.  I posed the question: "If we double speed every year, how long
until we can count to 38^10000, given that today, at one operation per
nanosecond, we could only count to about 3*10^28 even with a full trillion
years?"  I also claim that the universe will collapse and big-bang again
_long_ before we are even a fraction of the way done with that "count".
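
A back-of-the-envelope check of that claim (my own sketch, not from the
original thread; it assumes one operation per nanosecond today and a
doubling of speed every year):

    import math

    target_bits = 10000 * math.log2(38)   # log2(38^10000), about 52,479 bits
    today_bits = math.log2(3e28)          # what one trillion-year count reaches now
    print("decimal digits in 38^10000:", len(str(38 ** 10000)))   # 15,798
    print("years of doubling needed:", math.ceil(target_bits - today_bits))

That works out to roughly 52,000 years of annual speed doubling before a
single trillion-year counting session could even reach 38^10000.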




>>I don't see any _requirement_ that n or N be unbounded.  The "for all n" only
>>says that the expression is valid for each n that is >= N.  I.e., for large
>>enough n, the function g(n) is no more than a constant times the function
>>f(n).  The function g(n) may be less than cf(n), even substantially less;
>>the O notation bounds it only from above."
>
>Yup. So chess is also O(2^n), but you repeatedly claimed that it *isn't* O(1),
>which is quite wrong, of course.

Just give me that algorithm that will search from depth D to depth D+1 in
constant time.  Then I'll believe.



>
>>So I don't see how anyone can say "The O notation does not have a point."  Every
>>book I have gives this "point".
>
>Of course, even definitions are made for a reason, but the O notation is old
>and its definition is carved in stone. You can't go around claiming things
>that oppose the definition just because the consequences of that definition
>don't fit your intuitive notion of what is useful in a particular case.
>
>>  Nor do I see any requirement (yet) that says
>>that unless the size of the input is essentially without limit, O notation does
>>not apply.
>>
>>
>>
>>>
>>>Then why take the time to find the O(f(n)) of any algorithm?  The point of doing
>>>it is so that you can see whether algorithm A will run faster than algorithm B
>>>(quicksort runs faster than bogosort, for example) with arbitrary input.  If you
>>>have two sorting algorithms, one with O(n + 5000) and one with O(n!), you can
>>>say that for very small N, the O(n!) program will be faster, but otherwise it
>>>will be much slower.  This is the "point" of O(f(n)).  If you don't correctly
>>>find the O(f(n)) of an algorithm (or algorithms), how will you ever know which one to
>>>choose for a particular job?
>>>
>>>>and O(1) does not guarantee the same running time for different
>>>>inputs. The O-notation is clearly defined, and my use adheres to the definition.
>>>
>>>O(1) is _constant_ running time, regardless of input length.  You can make a
>>>graph and see this very clearly.  Any algorithm with different running times for
>>>different N must use an O(f(n)) that actually has an "n" in it - O(n),
>>>O(log(n)), O(n*log(n)), O(n!), O(exp(n)), etc.
>>>
>>>>It's quite simple math, look it up.
>>>
>>>I agree, the math is quite simple.
>>
>>
>>Nobody is looking this up.  Everyone is using their own personal interpretation
>>of something they were exposed to years ago.  I am _still_ waiting on someone
>>to quote a book that _requires_ the number of inputs to be unlimited.  And I
>>would hope that everyone knocks off the "O is not used to estimate the
>>run-time of an algorithm" claims unless they can quote a source.  I just
>>quoted one of mine.  I will be happy to quote others if requested...
>
>You are fighting a straw man, Bob. We don't require the depth to be unlimited.
>It doesn't matter. Whether the depth is unlimited or not, chess is O(1). Just
>look at the definition you gave.

I'm waiting on your O(1) algorithm.  I want to be world champion again.


