Author: Dann Corbit
Date: 12:49:26 05/09/01
Suppose that I have a collection of sorting algorithms, and I know that the count of vector elements to be put into sorted order will never exceed a size_t.

I have a counting sort that can sort in O(n*log(element_length)).
I have a relaxed quick-heapsort that can sort in O(log(n!)).
I have a Shell sort that can sort in O(n^(1 + epsilon/n)).

Now, since we know that the count will never exceed a size_t, have all these calculations really simplified to O(1)? Is there really no way to choose between them? And if they really do retain their original behaviors, then what is the utility of declaring them to be O(1)? A rough numeric sketch follows.
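As a rough illustration (not part of the original post), here is a minimal C sketch that plugs assumed sample values for n, element_length, and epsilon into the three quoted bounds. The point is only that the formulas yield very different operation counts even though n is capped by SIZE_MAX, so collapsing them all to O(1) discards exactly the information that distinguishes the algorithms.

    #include <stdio.h>
    #include <math.h>

    /* Illustrative only: evaluate the three quoted cost formulas at one
     * sample input size.  The chosen values of n, element_length, and
     * epsilon are assumptions, not anything stated in the post. */
    int main(void)
    {
        double n              = 1.0e6;  /* assumed element count          */
        double element_length = 32.0;   /* assumed key length in bits     */
        double epsilon        = 1.0;    /* assumed Shell-sort constant    */

        /* counting sort: O(n * log(element_length)) */
        double counting = n * log2(element_length);

        /* relaxed quick-heapsort: O(log(n!)),
         * approximated via Stirling as n*log2(n) - n/ln(2) */
        double heap = n * log2(n) - n / log(2.0);

        /* Shell sort: O(n^(1 + epsilon/n)), taken as quoted */
        double shell = pow(n, 1.0 + epsilon / n);

        printf("counting sort        ~ %.3e operations\n", counting);
        printf("relaxed quick-heap   ~ %.3e operations\n", heap);
        printf("Shell sort           ~ %.3e operations\n", shell);
        return 0;
    }

The absolute numbers are meaningless (the big-O constants are ignored), but the spread between the formulas at a single bounded n is the practical answer to "is there really no way to choose between them?"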