Author: blass uri
Date: 00:31:19 08/17/99
On August 16, 1999 at 16:08:36, Robert Hyatt wrote:

>On August 15, 1999 at 03:43:41, Keith Kitson wrote:
>
>>From the discussions here I get the impression that there must be some research
>>that can be done to generate a table showing reasonably optimum hash memory
>>sizes for a given time allocation and speed of processor. Has anyone done any
>>research in this area?
>>
>>I suspect that anyone who is not aware of the hash/processor speed/time control
>>ratio effect may not always be running Hiarcs732 at its optimum settings in the
>>games they play.
>>
>>I would expect that in future releases of strong programs the author should give
>>more guidance on optimum settings for the parameters mentioned above, where
>>program strength could be affected. Better still, how about an intelligent
>>algorithm, labelled say 'strongest setting', that determines the speed of the
>>processor and the time controls set, and either automatically sets or suggests
>>the optimum hash size for the determined parameters?
>>
>>Do we have any programs that do this at present?
>>
>>Keith
>
>Here is the basic idea:
>
>A table entry is some number of bytes; I will use B to represent this (this is
>16 in Crafty, for reference).
>
>A program can search some number of nodes per second; let's call this N. And in
>any game, it is going to have some target time per move (in 40 moves in 2
>hours, you could assume 180 seconds if you don't have an opening book). Let's
>call this T. In a typical search, this program is going to search NT nodes,
>which makes sense. To completely hold these positions, you theoretically need
>NTB bytes of hash table. However, this would mean that you are optimally
>distributing entries throughout the table. But it is a good first cut.
>
>For an example, let's take Crafty. It searches about 600K nodes per second on
>my quad Xeon. So the 'optimal' table size (so far) would be 600K * 16 * 180,
>assuming 180 seconds per move.
This assumption is wrong because you may use 1000 seconds for some moves at a
tournament time control, and in that case the hash tables are too small. The
average time per move is also more than 180 seconds, because you predict part
of the opponent's moves with the permanent brain.

Uri

>But this isn't 'finished.' You really need to increase this, because no
>hash function will provide a perfectly uniform distribution of the entries
>throughout the table... there will always be some clustering. So let's go for
>a factor of 2. With that much memory, the hash table ought to see only minimal
>overwriting. 600K * 3K (an approximation of 16*180) gives us 1.8 gigabytes of
>memory. This is doable, but such machines are not common. Fortunately, Crafty
>shrinks this because it doesn't hash _any_ of the q-search nodes. And since
>well over 50% of the total search is in the q-search, over 50% of the nodes
>don't need to be stored. So this drops back to well under one gigabyte, which
>is definitely doable.
>
>The formula gives you a good idea, but note that for really fast numbers, the
>sizes are likely impossible on a typical PC.