Computer Chess Club Archives



Subject: Re: A final stab at big-O

Author: Andrew Dados

Date: 13:10:09 05/09/01



On May 09, 2001 at 16:05:37, Robert Hyatt wrote:

>On May 09, 2001 at 16:02:59, Andrew Dados wrote:
>
>>On May 09, 2001 at 15:49:26, Dann Corbit wrote:
>>
>>>Suppose that I have a collection of sorting algorithms.
>>>
>>>I know that the count of vector elements to be put into sorted order will never
>>>exceed a size_t.
>>>
>>>I have a counting sort that can sort in O(n*log(element_length))
>>>
>>>I have a relaxed-quick-heapsort that can sort in O(log(n!))
>>>
>>>I have a shell sort that can sort in O(n^(1 + epsilon/n))
>>>
>>>Now, since we know that the count will never exceed a size_t, have all these
>>>calculations really simplified to O(1)?  Is there really no way to choose
>>>between them?  If they really do retain their original behaviors, then what is
>>>the utility of declaring them to be O(1)?
>>
>>In chess, no matter what position, it is guaranteed to be solved after less than
>>2^100 steps. Now find me a sorting algorithm that will require less than
>>Constant steps * no matter what size_t *. Can't you see that simple difference?
>>
>>-Andrew-
>
>
>Where does that 2^100 steps come from?

If you run minimax, you visit only legal positions in order to solve the
starting position. I just estimate that number at 2^100 (or whatever constant
approximates it better).
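For scale, here is a back-of-envelope sketch of that figure (the branching factor of ~35 and game length of ~80 plies below are my own commonly cited hypothetical estimates, not numbers from this thread):

```python
# Rough order-of-magnitude check of a 2^100 bound on minimax work.
# The 35-moves-per-position and 80-ply figures are hypothetical
# ballpark estimates, not taken from the original post.
from math import log2, log10

upper_bound_log2 = 100                 # log2 of the 2^100 bound
print(f"2^100 ~ 10^{upper_bound_log2 * log10(2):.0f}")   # roughly 10^30

# A naive full-game-tree estimate: 35 legal moves per position,
# over a game of about 80 plies, gives 35^80 nodes.
tree_log2 = 80 * log2(35)
print(f"35^80 ~ 2^{tree_log2:.0f}")    # far above 2^100
```

Whatever the exact constant, the point stands: it is a fixed constant, independent of any input-size parameter.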

>And from complexity theory what does it matter anyway?

A guaranteed fixed number of steps means O(1), afaik.
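A minimal sketch of why the formal O(1) answer is unhelpful for the sorts above (my own illustration, not from the thread): even with n bounded by a size_t, the hidden constants differ enormously, and Stirling's approximation shows log(n!) grows like n*log(n), so the O(log(n!)) bound is just the familiar O(n log n).

```python
# Compare the step-count bounds at the largest 32-bit size_t value.
# Even though a fixed bound on n makes every sort "O(1)" formally,
# the constants hiding in that O(1) are wildly different.
from math import lgamma, log

def log2_factorial(n):
    # log2(n!) computed via lgamma: lgamma(n + 1) == ln(n!)
    return lgamma(n + 1) / log(2)

n = 2**32 - 1
print(f"log2(n!)  ~ {log2_factorial(n):.3e}")   # ~ n*log2(n), per Stirling
print(f"n*log2(n) ~ {n * log(n, 2):.3e}")       # nearly the same magnitude
print(f"n^2       ~ {float(n)**2:.3e}")         # astronomically larger
```

So collapsing everything to O(1) discards exactly the information you would use to choose between the algorithms.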






Last modified: Thu, 15 Apr 21 08:11:13 -0700

Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.