Computer Chess Club Archives



Subject: Re: A final stab at big-O

Author: Josh Strayhorn

Date: 13:15:27 05/09/01



On May 09, 2001 at 16:02:59, Andrew Dados wrote:

>On May 09, 2001 at 15:49:26, Dann Corbit wrote:
>
>>Suppose that I have a collection of sorting algorithms.
>>
>>I know that the count of vector elements to be put into sorted order will never
>>exceed a size_t.
>>
>>I have a counting sort that can sort in O(n*log(element_length))
>>
>>I have a relaxed-quick-heapsort that can sort in O(log(n!))
>>
>>I have a Shell sort that can sort in O(n^(1 + epsilon/n))
>>
>>Now, since we know that the count will never exceed a size_t, have all these
>>calculations really simplified to O(1)?  Is there really no way to choose
>>between them?  If they really do retain their original behaviors, then what is
>>the utility of declaring them to be O(1)?
>
>In chess, no matter what the position, it is guaranteed to be solved after less than
>2^100 steps. Now find me such a sorting algorithm that will require less than
>a constant number of steps *no matter what size_t*. Can't you see that simple difference?
>
>-Andrew-
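
Just to put rough numbers on Dann's three bounds before answering: even with n capped
by a size_t, the three cost functions stay far apart over the whole range, which is
exactly the information the "they are all O(1)" view throws away.  A quick
back-of-the-envelope sketch in C (the element length L and epsilon are made-up
parameters here, and log(n!) is taken via lgamma()):

#include <math.h>
#include <stdio.h>

/* Rough cost models for the three sorts, evaluated at a few input sizes.
   L (element length) and EPS are invented for illustration only. */
int main(void)
{
    const double L = 64.0, EPS = 0.5;
    const double sizes[] = { 1e3, 1e6, 1e9 };
    for (int i = 0; i < 3; i++) {
        double n = sizes[i];
        double counting = n * log2(L);                /* O(n*log(L))         */
        double rqheap   = lgamma(n + 1.0) / log(2.0); /* O(log(n!)), in bits */
        double shell    = pow(n, 1.0 + EPS / n);      /* O(n^(1 + eps/n))    */
        printf("n=%.0e  n*log(L)=%.3e  log(n!)=%.3e  n^(1+eps/n)=%.3e\n",
               n, counting, rqheap, shell);
    }
    return 0;
}

The relative ordering of those columns, not the O(1) label, is what picks the sort.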

I do not think that the fact that a problem is guaranteed to be solvable in a bounded
number of steps means that the algorithm used to solve it is O(1).  At most it means
that an O(1) algorithm exists for the problem, namely looking up the precomputed
answer.  In any case this is tautological: you cannot use the solution to a problem to
solve that same problem.  The efficiency of the algorithm used to generate the
solution the first time is what tells you how difficult the problem is.  Since chess
is not yet solved, the algorithm used to approach the solution is the only one
relevant to this discussion, and it is exponential.
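
To make the "exists an O(1) algorithm" point concrete, here is a toy sketch (the
positions, the table and the search are invented for illustration, nothing more):
once every answer has been precomputed, probing the table is O(1), but filling the
table is where the original search, and its exponential cost, lives.

#include <stdio.h>

#define N_POSITIONS 8            /* pretend the whole game has 8 positions */

static int table[N_POSITIONS];   /* +1 = win, -1 = loss; filled by solve_all() */

/* Stand-in for the real, expensive search (exhaustive minimax or similar). */
static int solve_by_search(int pos)
{
    /* ... the exponential work would happen here ... */
    return (pos % 2) ? 1 : -1;   /* dummy answer */
}

/* The costly step: computing every answer once. */
static void solve_all(void)
{
    for (int p = 0; p < N_POSITIONS; p++)
        table[p] = solve_by_search(p);
}

/* O(1), but only meaningful after solve_all() has already paid the price. */
static int probe(int pos)
{
    return table[pos];
}

int main(void)
{
    solve_all();
    printf("position 3: %d\n", probe(3));
    return 0;
}

This is the tablebase situation in miniature: probing a finished table is cheap,
generating it is not, and the generation cost is the one that measures the problem.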

JS.


