Author: David Rasmussen
Date: 14:26:39 05/17/01
On May 17, 2001 at 16:44:43, Jesper Antonsson wrote:

>On May 16, 2001 at 18:31:19, Robert Hyatt wrote:
>
>>On May 16, 2001 at 18:10:12, Jesper Antonsson wrote:
>>
>>>On May 15, 2001 at 22:03:19, Robert Hyatt wrote:
>>>
>>>>>Not 20->21, but perhaps 200->201, and certainly with 6000->6001. And this
>>>>>algorithm can be constructed; you know that and I know that. That it hasn't
>>>>>been done isn't relevant.
>>>>
>>>>6000 is not enough, using the 50-move rule. You need over 10,000 plies before
>>>>you run out of pawn pushes or captures. And that tree will never be searched,
>>>>period. 38^10000 is big enough that it could be considered infinite with no
>>>>loss of accuracy, since nobody can count to that number using any speed of
>>>>processor and amount of time you want to name. I.e., one trillion years is only
>>>>5 x 10^23 nanoseconds. I don't think the universe is expected to last that
>>>>long. Counting once per nanosecond, you have barely started by the time the
>>>>universe ends: you have reached 10^23. You are counting to 10^10000. Is
>>>>that close enough to infinity to be happy?
>>>
>>>No. :-) This has nothing to do with big O.
>>>
>>>>Saying that eventually chess becomes O(1) is therefore simply nonsense. Unless
>>>>you can tell me _exactly_ when it will become O(1). From my vantage point, it
>>>>will _never_ happen.
>>>
>>>That is even more nonsense than before, Bob. It doesn't "become" O(1) at some
>>>depth or time or whatever. It either is or is not. And by definition, it is.
>>>
>>>>Assume machines get 2x faster a year. When do we get a machine that can count
>>>>to that ridiculous number? I.e., what is log2(38^10000)? I claim it is so big
>>>>no human will ever see chess turn into O(1).
>>>>
>>>>Which means it will stay at O(W^(D/2)).
>>>
>>>Again, it doesn't "turn" into O(1); it either is or is not. Come on, you use a
>>>small, straightforward definition incorrectly, then actually *quote* the
>>>definition and *still* persist in using it incorrectly again and again.
>>>What's up with that?
>>
>>As I said...
>>
>>(1) O() is used explicitly to predict algorithm run-time as the input parameter
>>is increased;
>>
>>(2) chess _clearly_ requires W times as long to do a depth D+1 search as it does
>>to do a depth D search, for all D that are "reasonable".
>
>Yup.
>
>>By those two statements, chess is _clearly_ O(W^(D/2)).
>
>Nope. (Because big O notation doesn't deal with "reasonable" D's. On the
>contrary, it deals with asymptotic behaviour, as in extremely large D's.)
>
>>To say it is O(1) is simply a useless statement.
>
>Perhaps, but in that case it is a correct useless statement. :)
>
>>(And note that by chess, I mean alpha/beta minimax search as "the" algorithm
>>used to search the tree.)
>
>Well, then, that's a different matter. I can agree that "pure" alpha/beta in
>some sense is O(W^(D/2)). Of course, that doesn't say much unless we know
>"W"...
>
>>If someone walked into my office and said "Hyatt, here is a game tree search
>>algorithm I found (alpha/beta minimax), and I want to use it to search a game
>>tree that is based on a game exactly like chess in every way, except that the
>>pieces move differently, for example, and I want you to tell me how to estimate
>>the run-time for my tree search as I try to go a ply deeper, and then another
>>ply deeper, and so forth."
>>
>>I would answer that question with "this is an O(W^(D/2)) problem" (which is
>>the same as O(W^D) to the theorists, but not to me). Would you _really_ tell
>>them that it is O(1)? I wouldn't...
>
>I would probably not mention big O notation, and instead just say that it will
>behave exponentially for reasonable inputs. If he requested big O notation, I
>would be forced to say O(1). I certainly wouldn't lie to him just because I had
>some idea that the lie would be more useful.

You wouldn't lie.
You would just take a stand on what your definition of an algorithm is, what his is, and what question he would really like answered, since his question is too vaguely formulated when posed to people with 'odd' but also correct definitions of normal, yet unfortunately informal, terms. Big O notation is a very simple, purely mathematical tool that is perfectly formalized. And it says nothing about algorithms; it says something about functions. Which functions you choose to look at is your choice, but if you don't agree with the people you're communicating with about what you're talking about, it is worth nothing. If he meant his question in the way that most of us here interpret it (because it _does_ demand interpretation, and yours is just different from ours), we would be getting the wrong answer from you. What you're saying isn't "true"; it is just one interpretation of a question with some formal machinery (big O notation) applied. And a very odd interpretation, according to most people here.
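The formal point being argued can be written out in one line. Under the textbook definition (a standard statement, not taken from the post):

```latex
% Definition of big-O:
\[
  f(D) \in O(g(D)) \iff
  \exists\, c > 0,\ \exists\, D_0 :\quad
  |f(D)| \le c\, g(D) \quad \text{for all } D \ge D_0 .
\]
% If the rules bound game length by a constant D_max, the search
% time T(D) takes only finitely many values, so
\[
  T(D) \;\le\; \max_{D' \le D_{\max}} T(D') \;=:\; c
  \quad\Longrightarrow\quad T \in O(1).
\]
```

This is Antonsson's claim in full: any function on a bounded domain is O(1), which is also exactly why the notation says nothing useful about feasibility at reasonable depths.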
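As a quick sanity check of the counting argument quoted above, a few lines of Python suffice (a sketch; the branching factor 38 and the ~10,000-ply bound are the figures from the thread, not measured values). Note that the quoted "5 x 10^23 nanoseconds" looks like a slip: a trillion years is nearer 3 x 10^28 nanoseconds, which only strengthens Hyatt's point, since the tree size is around 10^15798.

```python
import math

NS_PER_YEAR = 365.25 * 24 * 3600 * 1e9   # nanoseconds in one Julian year

# One trillion years of counting, once per nanosecond
# (comes out near 3.2e28, not the 5e23 quoted in the thread):
trillion_years_ns = 1e12 * NS_PER_YEAR

# Size of the hypothetical full game tree: branching factor ~38,
# game length capped near 10,000 plies by the 50-move rule.
log10_tree_size = 10000 * math.log10(38)  # about 15798

print(f"trillion years ~ 10^{math.log10(trillion_years_ns):.1f} ns")
print(f"38^10000 ~ 10^{log10_tree_size:.0f}")
```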
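For reference, the O(W^(D/2)) figure both posters accept comes from the best case of alpha/beta (Knuth and Moore, 1975): with perfect move ordering, a depth-d search of a uniform tree with branching factor w examines w^ceil(d/2) + w^floor(d/2) - 1 leaves, against w^d for plain minimax. A small sketch with the thread's w = 38:

```python
import math

def minimax_leaves(w: int, d: int) -> int:
    """Leaves examined by plain minimax: w^d."""
    return w ** d

def alphabeta_best_leaves(w: int, d: int) -> int:
    """Best-case leaves for alpha/beta with perfect move ordering
    (Knuth & Moore): w^ceil(d/2) + w^floor(d/2) - 1."""
    return w ** math.ceil(d / 2) + w ** (d // 2) - 1

# The savings grow with depth: roughly "square root of the tree".
for d in (4, 8):
    print(d, minimax_leaves(38, d), alphabeta_best_leaves(38, d))
```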