Computer Chess Club Archives



Subject: Re: about a wrong conception that I had.

Author: Christophe Theron

Date: 13:31:44 09/26/03



On September 25, 2003 at 22:04:24, Uri Blass wrote:

>On September 25, 2003 at 20:52:26, Christophe Theron wrote:
>
>>On September 25, 2003 at 13:37:49, Uri Blass wrote:
>>
>>>I decided to revert the changes that I made in the last few months and start
>>>again from version 8080a.
>>>
>>>I am going to redo some of the changes, but the changes involving incremental
>>>data are probably not going to be done.
>>>
>>>The changes included, for every square, information about the queen directions
>>>in which it is blocked by white and the queen directions in which it is
>>>blocked by black, with the information calculated incrementally.
>>>
>>>It also included the directions in which every piece can go.
>>>
>>>I thought that this information was good because it helped me generate moves
>>>faster (I do not need to check moves of the rook in a direction it cannot go),
>>>and I thought of it as an investment for the future, because it may be worth
>>>making the code slower now if a lot of the things I am going to do later
>>>become faster (like evaluating trapped pieces).
>>>
>>>I decided that it is not a good idea, because the speed of the program is not
>>>proportional to the number of calculations, and it seems that having more
>>>arrays also makes other things that I want to add slower, because the program
>>>cannot use the fast memory for them.
>>>
>>>It caused me to change my opinion about slow searchers.
>>>I had the conception that a slow searcher is a better design decision because
>>>it is possible to add knowledge at almost no cost, but this was based on a
>>>wrong assumption (the assumption that speed is proportional to the number of
>>>calculations).
>>
>>
>>
>>The wrong conception is to distinguish between slow and fast searchers.
>>
>>Knowing that a program is a fast or slow searcher tells you absolutely nothing
>>about what it is doing inside. It also tells you absolutely nothing about its
>>strengths and weaknesses.
>
>I guess that I did not explain it well.
>I will explain my theory in other words.
>
>1) Suppose some calculation takes 1/1,000,000 of a second per node.
>
>This 1/1,000,000 of a second per node may be a small price if you search 10,000
>nodes per second (it only makes your program 1% slower), but it is a big price
>if you search 1,000,000 nodes per second (it makes you twice as slow).
>
>I made my program slower in nodes per second by incrementally calculating some
>data that I did not calculate before.
>
>My theory was that by searching fewer nodes per second I could do future
>calculations relatively faster (or at the same speed, or faster thanks to the
>data that is calculated incrementally).
>
>I agree to pay the linear price of making the program 50% slower in the near
>future if I know that later I will only gain speed, and I assumed that later I
>might gain speed thanks to using arrays to do fewer calculations.



I understand what you meant, but one wrong assumption might be that you think
you need to do your small additional computation at every node.

Maybe you don't need to. Maybe you assume a program does, but actually it does
not.

Of course in your own case you know whether you need to do it at every node.
But you do not know whether some other program does it at every node.

So, again, seeing the NPS of a program tells you close to nothing.
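The arithmetic behind the 1/1,000,000-second example above can be checked with a few lines. This is a sketch of plain accounting, not code from any real engine; the overhead and the two NPS figures are taken from the example:

```python
# Effect of a fixed per-node overhead on search speed.  A searcher running
# at `nps` nodes per second pays nps * overhead extra seconds for every
# second of search, so the new speed is nps / (1 + nps * overhead).

def slowdown(nps, overhead_seconds_per_node):
    """Return (new_nps, slowdown_factor) after adding a per-node cost."""
    new_nps = nps / (1.0 + nps * overhead_seconds_per_node)
    return new_nps, nps / new_nps

OVERHEAD = 1e-6  # 1/1,000,000 of a second per node, as in the example above

for nps in (10_000, 1_000_000):
    new_nps, factor = slowdown(nps, OVERHEAD)
    print(f"{nps:>9} nps -> {new_nps:>9.0f} nps ({factor:.2f}x slower)")
```

At 10,000 nps the factor is 1.01 (about 1% slower); at 1,000,000 nps it is exactly 2, the "twice as slow" case. The same fixed cost per node hurts the fast searcher far more.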




>2) The only problem is that I cannot say that a calculation takes 1/1,000,000
>of a second per node, because for a slow searcher the calculation can take more
>time: the slow searcher may need to use slow memory for it or for other things,
>while the fast searcher can use the fast memory.



While I agree with you and understand the problem (overflowing the processor's
L1 or L2 cache), it might be shortsighted to try to change your program to adapt
to this.

It's a philosophical problem actually.

Some chess programmers have the processors of the future in mind. They say, for
example:
* most computers have enough memory to store 6-man endgame databases, so I do
not add any knowledge related to these endings to my evaluation.
* in the coming years processors will have 1MB of L1 cache, so I can assume my
700KB chess engine + data will fit in the cache.
(These programmers come mainly from rich countries, where the cost of a
high-end computer is less than an average month's salary.)

Personally, I have the processors of the past in mind. I want my chess engine
to work on normal people's computers, the ones that are 3 or even 5 years old.
I even want it to be competitive on 386s and 486s! I also want my engine to
work fine on the most recent computers, but in general I do not have to worry
about this: if my program performs well on a 3-year-old computer, I am sure it
will perform very well on a recent one.

So if I had this L1 cache problem, I would certainly try to solve it. I would
not use big tables, and I wouldn't use bitboards either.




>3) I think that I now understand better the reason for your rule not to
>calculate now things that you can calculate later.
>
>The cost can be more than the time it takes to calculate now.
>The problem is that calculating now may make other things slower when the
>program becomes bigger (for example, if the calculation uses big arrays).



That can be a secondary reason, but the main reason is that in a chess program
there is a good chance that something you compute in advance is not going to be
used at all. The main cause of this is the nature of the alpha-beta algorithm
(pruning occurs when the score of a branch goes above beta, and there is no way
to predict when that is going to happen). It's funny that in a chess program
the same alpha-beta-like pattern occurs everywhere: in any routine you do not
know exactly how much work is going to be needed, and you must be prepared to
abort the work in progress at any moment.
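To make the cutoff pattern concrete, here is a minimal negamax with alpha-beta on a toy tree. The tree and its leaf values are made up purely for illustration; this is not anyone's engine code:

```python
# Minimal negamax with alpha-beta on a toy game tree, showing the point
# made above: a branch is abandoned the moment its score reaches beta,
# so any work "prepared in advance" for its remaining moves is wasted.

visited = []                           # leaves actually evaluated

def alphabeta(node, alpha, beta):
    """node is either a leaf score (int) or a list of child nodes."""
    if isinstance(node, int):          # leaf: static evaluation
        visited.append(node)
        return node
    best = -10**9
    for child in node:
        score = -alphabeta(child, -beta, -alpha)
        if score > best:
            best = score
        if best > alpha:
            alpha = best
        if alpha >= beta:              # beta cutoff: abort this node now;
            break                      # remaining children are never visited
    return best

tree = [[3, 5], [2, 9]]                # root -> two nodes of two leaves each
print(alphabeta(tree, -10**9, 10**9))  # minimax value of the root: 3
print(visited)                         # [3, 5, 2] -- the 9 is never evaluated
```

The second branch is cut off after its first leaf refutes it, so the leaf with value 9 is never reached; any data precomputed for it would have been thrown away.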

That's why your data structures must be very well thought out and adapted to
what you are going to do. They must not force you to compute anything in
advance, and when you do have something to compute, the data structure must
make it as easy and fast as possible.

OK, what I'm saying here seems extremely obvious and could be said for any
program.

But the consequence is that attack tables, bitmaps, and bitboards are in my
opinion a bad idea in a chess program. Want to know if A attacks B? Look at the
board; don't try to prepare the work in advance.
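The "look at the board" idea can be sketched as a purely on-demand check. The 8x8 array representation and the rook-only case here are simplifications chosen for illustration, not a real engine's data structure:

```python
# Does a rook on (r0, c0) attack square (r1, c1)?  Instead of maintaining
# attack tables, walk from the attacker toward the target when the question
# is asked, stopping at the first occupied square.  '.' marks an empty
# square on an 8x8 board.

def rook_attacks(board, r0, c0, r1, c1):
    """True if a rook on (r0, c0) attacks (r1, c1)."""
    if (r0, c0) == (r1, c1) or (r0 != r1 and c0 != c1):
        return False                  # rooks move along ranks and files only
    dr = (r1 > r0) - (r1 < r0)        # step direction: -1, 0, or +1
    dc = (c1 > c0) - (c1 < c0)
    r, c = r0 + dr, c0 + dc
    while (r, c) != (r1, c1):
        if board[r][c] != '.':        # the ray is blocked before the target
            return False
        r, c = r + dr, c + dc
    return True

board = [list("........") for _ in range(8)]
board[0][0] = 'R'                     # rook on row 0, column 0
board[0][4] = 'p'                     # blocking pawn on the same rank

print(rook_attacks(board, 0, 0, 0, 3))  # True: nothing in between
print(rook_attacks(board, 0, 0, 0, 6))  # False: the pawn blocks the ray
```

Nothing is updated as pieces move; the answer is recomputed only when it is actually needed, which is exactly the trade-off argued for above.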

The importance of the data structures you are using is huge in chess. It took me
several years to find the basic data structure I'm using now.

It would be a mistake to establish a data structure and then stick to it at all
costs. No amount of deep thinking will let you find the right data structure on
the first attempt. Your research may show you that your data structure is not
suited to this or that. When you realize it, continue working with it for a
while and note what other problems you have with it. When you have gathered
enough understanding of what you need, change your data structure and start
over. You will probably have to do it anyway. It's not a sign of failure; it
just shows that your understanding of the problem is improving.



    Christophe




Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.