Computer Chess Club Archives



Subject: Re: Never Say "Impossible"

Author: Robert Hyatt

Date: 10:12:40 05/09/01



On May 09, 2001 at 12:41:52, Uri Blass wrote:

>On May 09, 2001 at 10:12:29, Robert Hyatt wrote:
>
>>On May 09, 2001 at 05:57:03, Uri Blass wrote:
>>
>>>On May 09, 2001 at 05:42:08, Jeremiah Penery wrote:
>>>
>>>>On May 09, 2001 at 03:39:42, Uri Blass wrote:
>>>>
>>>>>On May 09, 2001 at 01:11:50, Dann Corbit wrote:
>>>>>
>>>>>>On May 09, 2001 at 00:41:38, Uri Blass wrote:
>>>>>>[snip]
>>>>>>>No.
>>>>>>>Chess is clearly polynomial, and even O(1), like every solvable practical
>>>>>>>problem.
>>>>>>>
>>>>>>>You need only constant time to solve chess; the problem is just that the
>>>>>>>constant is large.
>>>>>>>
>>>>>>>The only problems that are exponential are problems where the size of the
>>>>>>>input has no finite upper bound.
>>>>>>>You can view the input of every practical problem as something that has an
>>>>>>>upper bound, so saying that an algorithm is not O(1) has only mathematical
>>>>>>>meaning, not practical meaning.
>>>>>>
>>>>>>Unbelievable that you also do not understand Big-O notation.
>>>>>
>>>>>I was responding to Hyatt's sentence:
>>>>>"In the case of chess it is clearly exponential, which is _not_ any form of a
>>>>>polynomial."
>>>>>
>>>>>I did not say that O(1) is not also O(exp(n)).
>>>>>I understood from Hyatt's sentence that chess is not O(1), and not even
>>>>>O(n^500000).
>>>>
>>>>If I understand this correctly, a minimax search of the chess tree should be
>>>>something like O(36^n), where n is the search depth.  Of course, for most of
>>>>today's programs this can perhaps be reduced to O(3^n) or so by accepting the
>>>>errors of selective search.
>>>
>>>I think that even with minimax it is better to use hash tables and avoid
>>>searching the same position twice.
>>>>
>>>>I've had very little work with big-O notation, but I did read a couple of the
>>>>links given by Dann.  Someone please correct me if the above is wrong. :)
>>>>
>>>>><snipped>
>>>>>>I suggest that you read my other posts on the topic.
>>>>>>
>>>>>>Anyone who says that an O(1) process is not O(exp(n)) does not know what the
>>>>>>notation means!!!
>>>>>>
>>>>>>Anyway, you can say what O(f(n)) is by looking only at the algorithms not the
>>>>>>data.  Do you really understand this!?
>>>>>
>>>>>I understand what you mean, but every algorithm is practically O(1) when
>>>>>there is a bound on the size of the data, because exp(constant) is just
>>>>>another constant.
>>>>>
>>>>>Algorithms to solve problems like sorting can be described theoretically as
>>>>>not O(1) because, by definition, there is no finite upper bound on the
>>>>>number of elements.
>>>>
>>>>Yes, theoretically you can give infinite elements, but that's really a pointless
>>>>exercise.  An infinite number of elements in any algorithm should take infinite
>>>>time - that's what makes it infinite.  If you divide infinity by 1000000000,
>>>>it's still infinity.
>>>>
>>>>You can theoretically give an infinite-size tree to a tree-searching algorithm
>>>>too - like the algorithms in chess programs... So tree-search algorithms aren't
>>>>O(1) (by your definition!), yet chess algorithms are?
>>>>
>>>>>For practical purposes you can say that there is a finite upper bound, so
>>>>>solving practical problems can be done in O(1); the only problem is the
>>>>>constant.
>>>>>
>>>>>Algorithms to solve problems like chess, Go, or other games can be described
>>>>>as O(1).
>>>>
>>>>I think that's sort of like wiggling through a loophole in the notation.  The
>>>>constant term would be so huge it may as well be infinity, for practical
>>>>purposes.
>>>
>>>I agree that it is not relevant for practical purposes.
>>>
>>>>It's very possible that to solve chess you'd need more material than
>>>>exists in the universe just to store the information, so again it's a pointless
>>>>argument.
>>>
>>>You need less than that, but it is not practical for the near future.
>>>Storing all the legal positions requires fewer than 10^50 bytes.
>>>
>>>Uri
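For scale, Uri's 10^50 figure can be sanity-checked against a crude counting bound (my own back-of-envelope, not a number from the thread): placing one of 13 values (12 piece types or an empty square) on each of 64 squares gives a raw upper bound of 13^64, around 10^71; legality and piece-count constraints cut this enormously, and published estimates of the number of legal positions sit far below 10^50.

```python
import math

# Naive configuration count: 13 choices (12 piece types or empty)
# on each of 64 squares.  This vastly overcounts legal positions.
raw = 13 ** 64
print(math.floor(math.log10(raw)))   # order of magnitude of the naive bound
```

The gap between ~10^71 raw configurations and the far smaller legal-position count is exactly where legality arguments (piece counts, pawn structure, check constraints) do their work.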
>>
>>
>>10^50 is not _close_.  Each position is much more than just the squares the
>>pieces are on.  It must include the game history used to reach that position,
>>to account for repetitions and the 50-move rule.
>The question is what "solving chess" means.
>If solving chess means creating an unbeatable program, then this information is
>not relevant.
>You only need the 32-piece tablebases, and even less than that, because most of
>the legal positions are not relevant; you only need to store positions that you
>can actually get into (for example, you do not need to know the result of the
>game from the position after 1.f3 f6, because you will never reach this
>position in a practical game).
>
>If your first move is 1.e4 and after 1.d4 you reply 1...Nf6 and if these moves
>are correct based on the 32 piece tablebases then you do not need information
>about the position after 1.d4 d5 because you can be unbeatable without getting
>there.
>
>Uri


But you are +still+ missing the point.  To construct those tablebases you have
to do exactly what you are saying is not needed _after_ they are constructed.
The construction phase will _definitely_ have to handle positions and pathways,
or it won't include the 50-move rule and it will be wrong.

This argument becomes "once I search the tree with the _real_ rules, then I
can search the resulting database without regard to some of the rules, because
they are already incorporated into the database."  That argument has a large
fallacy built in: you can't do it until after you have done it once.  Once you
have done it once, the next time you can do it more easily...
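The construction-phase point can be sketched on a toy game of my own invention (nothing like real tablebase generators, just the shape of the argument): once the rules include a move-counter draw, the "position" you must tabulate is (square, counter), not the square alone, which is why the counter cannot be waved away during construction.

```python
# Toy game: a token on squares 0..4; the mover steps +1 or -1 (mod 5);
# landing on square 0 wins; if LIMIT moves pass without anyone landing
# on 0, the game is a draw -- a miniature "50-move rule".
from functools import lru_cache

N, LIMIT = 5, 6

@lru_cache(maxsize=None)
def value(square, counter):
    """Game value for the player about to move: 1 win, 0 draw, -1 loss."""
    if counter == LIMIT:
        return 0                       # counter-rule draw
    results = []
    for step in (1, -1):
        nxt = (square + step) % N
        if nxt == 0:
            results.append(1)          # mover wins immediately
        else:
            results.append(-value(nxt, counter + 1))
    return max(results)

# The tabulated state space is N * (LIMIT + 1) entries, not N entries.
# Drop the counter from the state and the recursion never terminates,
# because positions repeat forever -- the table would simply be wrong.
print(value(1, 0), value(2, 0))
```

From square 1 the mover wins at once; from square 2 best play is to shuffle until the counter runs out, so the correct value is a draw, and that answer only exists because the counter is part of the state.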




Last modified: Thu, 15 Apr 21 08:11:13 -0700

Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.