Computer Chess Club Archives



Subject: Re: Hash Collision

Author: Uri Blass

Date: 08:17:37 12/08/02



On December 08, 2002 at 10:47:20, Sune Fischer wrote:

>On December 08, 2002 at 10:26:40, Uri Blass wrote:
>>The point is that when you have better hardware you may have more collisions
>>because of bigger hash tables and faster hardware.
>
>A larger table makes it closer to 64 bits, i.e. if you have 2^20 entries you have
>32+20 = 52 bits, double that and you get 53 bits, etc., so larger tables in this
>case generate larger and safer keys.

How do you define your hash table?
I use the following struct:

typedef struct tagHASHE
{
	__int64 key;              /* full 64-bit Zobrist key                  */
	unsigned int depth:7;     /* remaining depth of the stored search     */
	unsigned int flags:2;     /* entry type (exact / lower / upper bound) */
	signed int value:16;      /* score returned by the search             */
	move best;                /* best move found at this position         */
} HASHE;

It means that I always have 64 bits for the key.
Do you use a key that depends on the number of entries in the hash table,
and if so, how do you define it?

I guess that when 2^k is the number of entries in the hash table
I need to have something like
__int64 key:32+k;
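
For illustration only, here is a minimal sketch of the kind of packed entry I am
guessing at (tagPACKED, packed_table and probe are placeholder names, the 2^20
size is only an example, and "move" is the same type as in my struct above):

#define HASH_BITS 20                  /* example: 2^20 entries                */
#define HASH_SIZE (1 << HASH_BITS)

typedef struct tagPACKED
{
	unsigned int key32;           /* upper 32 bits of the 64-bit key      */
	unsigned int depth:7;
	unsigned int flags:2;
	signed int value:16;
	move best;
} PACKED;

PACKED packed_table[HASH_SIZE];

/* the low HASH_BITS of the key select the slot and the stored upper 32 bits
   verify it, so 32+HASH_BITS bits are effectively checked, as you describe */
PACKED *probe(unsigned __int64 key)
{
	PACKED *e = &packed_table[key & (HASH_SIZE - 1)];
	if (e->key32 == (unsigned int)(key >> 32))
		return e;
	return 0;
}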

>
>>>>Today a 48 bit key may be better, but I do not consider the maybe 3 elo
>>>>improvement that programs may get from using 48 bits instead of 64 to be very important.
>>>
>>>Well 3 elo here and 3 elo there, it all adds up.
>>>Three elo is still more than the 0.001 elo you lose because of a collision every
>>>10th game :)
>>
>>Every 10th game is only for today; things may be worse in the future, and I do
>>not want to change things today only to change them back later.
>
>Neither do I, but I don't plan to change this ever. Packing things for storage
>in the hash is pretty common and there is no downside other than a few cycles
>for packing and unpacking.
>
>>I already have things that I need to change in the future (for example, I use
>>integers for the history table, and when the hardware gets faster this may
>>lead to bad move ordering because these numbers can be bigger than 2^32).
>>
>>I know that crafty has the same problem.
>>
>>It is not good for long analysis and in the future (maybe in 2010 or 2015) it is
>>not going to be good for tournament games.
>
>Okay, that is a different problem.
>You can "normalize" the table every few moves, e.g. by dividing all entries by
>10.

I memset the history table after every move, so I guess you mean that
I can do it every 2^31 nodes.
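
Just as a rough sketch of that normalization (the array name and the 64x64
layout here are only placeholders, not my real code):

unsigned int history[64][64];         /* [from][to] counters */

void age_history(void)
{
	int from, to;

	/* dividing every counter (by 2 here, or by 10 as you suggest) keeps the
	   relative ordering while keeping the values far below 2^32 */
	for (from = 0; from < 64; from++)
		for (to = 0; to < 64; to++)
			history[from][to] /= 2;
}

Calling something like that every so many nodes would avoid the overflow
without throwing the ordering information away completely.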

Uri


