Computer Chess Club Archives



Subject: Re: A final stab at big-O

Author: Robert Hyatt

Date: 13:05:37 05/09/01



On May 09, 2001 at 16:02:59, Andrew Dados wrote:

>On May 09, 2001 at 15:49:26, Dann Corbit wrote:
>
>>Suppose that I have a collection of sorting algorithms.
>>
>>I know that the count of vector elements to be put into sorted order will never
>>exceed a size_t.
>>
>>I have a counting sort that can sort in O(n*log(element_length))
>>
>>I have a relaxed-quick-heapsort that can sort in O(log(n!))
>>
>>I have a shell sort that can sort in O(n^(1 + epsilon/n))
>>
>>Now, since we know that the count will never exceed a size_t, have all these
>>calculations really simplified to O(1)?  Is there really no way to choose
>>between them?  If they really do retain their original behaviors, then what is
>>the utility of declaring them to be O(1)?
>
>In chess, no matter what the position, it is guaranteed to be solved after less than
>2^100 steps. Now find me a sorting algorithm that will require less than a
>constant number of steps *no matter what size_t is*. Can't you see that simple difference?
>
>-Andrew-


Where does that figure of 2^100 steps come from?  And, from the standpoint of
complexity theory, why does it matter anyway?
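As a side note on Dann's question, the distinction he is pointing at can be made concrete with a little arithmetic. Even if n is capped by a size_t (say n <= 2^32), so every one of his sorts is formally "O(1)" with an enormous constant, the actual step counts still differ by orders of magnitude. The sketch below (not from the original post; the bound of 2^32 is an assumption for illustration) also shows that O(log(n!)) and O(n log n) are essentially the same class, by Stirling's approximation:

```python
import math

# Assume n is bounded by a 32-bit size_t, the largest input any of the
# sorts will ever see.  Formally every algorithm is then "O(1)".
n = 2**32

# Step counts at that maximum n for three growth rates:
n_log_n = n * math.log2(n)                           # e.g. heapsort
log_n_fact = math.lgamma(n + 1) / math.log(2)        # log2(n!) via lgamma
n_squared = float(n) * n                             # e.g. insertion sort

print(f"n log2 n ~ {n_log_n:.3e}")      # ~1.37e11
print(f"log2(n!) ~ {log_n_fact:.3e}")   # ~1.31e11 -- same ballpark
print(f"n^2      ~ {n_squared:.3e}")    # ~1.84e19 -- ~10^8 times worse
```

So the "O(1)" label erases a factor of roughly n/log2(n) ~ 10^8 between the choices, which is exactly the utility Dann is asking about: the asymptotic class still predicts relative cost even when n is bounded.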




Last modified: Thu, 15 Apr 21 08:11:13 -0700

Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.