Author: Andrew Dados
Date: 13:24:09 05/09/01
On May 09, 2001 at 16:12:04, Dann Corbit wrote:

>On May 09, 2001 at 16:02:59, Andrew Dados wrote:
>
>>On May 09, 2001 at 15:49:26, Dann Corbit wrote:
>>
>>>Suppose that I have a collection of sorting algorithms.
>>>
>>>I know that the count of vector elements to be put into sorted order will
>>>never exceed a size_t.
>>>
>>>I have a counting sort that can sort in O(n*log(element_length)).
>>>
>>>I have a relaxed-quick-heapsort that can sort in O(log(n!)).
>>>
>>>I have a shell sort that can sort in O(n^(1 + epsilon/n)).
>>>
>>>Now, since we know that the count will never exceed a size_t, have all
>>>these bounds really simplified to O(1)? Is there really no way to choose
>>>between them? If they really do retain their original behaviors, then what
>>>is the utility of declaring them to be O(1)?
>>
>>In chess, no matter what the position, it is guaranteed to be solved after
>>fewer than 2^100 steps. Now find me a sorting algorithm that will require
>>fewer than some constant number of steps *no matter what the size_t is*.
>>Can't you see that simple difference?
>
>I write sorting algorithms for a living. I even wrote a book chapter on
>them. If I ever gave someone a sort that took 2^100 steps to complete, I
>would be the laughing stock of the world.

And what does that have to do with the fact that a constant (however big) is
O(1)? I am not arguing about WHETHER CHESS IS PRACTICALLY SOLVABLE; I am just
trying to show you that there exists an algorithm that solves ANY chess
position at a cost of O(1). Unlike sorting any number of elements, or the TSP
with any number of cities.

-Andrew-
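For readers following the asymptotics, two short derivations make the exchange precise (these are standard identities, not part of the original posts). First, Stirling's approximation shows that the O(log(n!)) bound quoted for the relaxed-quick-heapsort is the same as the familiar comparison-sort bound O(n log n). Second, the "everything becomes O(1)" step holds for any running time on a bounded domain, because a function on a finite domain has a finite maximum. A sketch in LaTeX:

% log(n!) is just the comparison-sort bound, by Stirling's approximation:
\[
  \log(n!) \;=\; \sum_{k=1}^{n} \log k \;=\; n\log n - n + O(\log n) \;=\; \Theta(n \log n).
\]
% If input size never exceeds a fixed bound N, any running time T is
% formally constant-bounded:
\[
  n \le N \;\Longrightarrow\; T(n) \le \max_{1 \le m \le N} T(m) = \text{const} \;\Longrightarrow\; T(n) = O(1).
\]

Here N stands for the fixed upper bound on the input (SIZE_MAX in the sorting case; in the chess case the single bound 2^100 covers every position). The disagreement above is only about whether hiding such a constant inside O(1) says anything useful.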