Computer Chess Club Archives



Subject: Re: Source code to measure it - results

Author: Vincent Diepeveen

Date: 19:34:18 07/15/03


On July 15, 2003 at 21:13:23, Jeremiah Penery wrote:

>On July 15, 2003 at 20:19:34, Vincent Diepeveen wrote:
>
>>On July 15, 2003 at 15:24:19, Gerd Isenberg wrote:
>>
>>Gerd, use it with a bigger hashtable, not such a small
>>table.
>>
>>400MB is really the minimum to measure.
>
>Why?
>
>Measuring 90MB, something like 99.65% of the accesses should be to RAM and not
>cache.  With 100MB, it's 99.8%.  Yet when I measure those two things, I get a
>whole 6.1ns latency difference according to your test.  Even measuring only
>20MB, 98.4% of the allocated memory cannot be in cache. (All of this assumes
>that the program takes up 100% of the cache, which it won't.)
>
>There's something wrong that causes memory access time to be reported much
>higher when testing larger 'hashtable' sizes.  Anything large enough to
>overwhelm the cache should report similar, if not almost identical, results.
>However, your program gives wildly different numbers.
>
>Trying to allocate 12500000 entries. In total 100000000 bytes
>  Average measured read read time at 1 processes = 183.935982 ns
>
>Trying to allocate 11250000 entries. In total 90000000 bytes
>  Average measured read read time at 1 processes = 177.806427 ns
>
>Trying to allocate 43750000 entries. In total 350000000 bytes
>  Average measured read read time at 1 processes = 253.592331 ns

The only thing I was not sure about was the RNG used for this type of test,
because the remainder of its output gets used to index the table; a sketch of
that kind of loop is below.
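
For illustration, here is a minimal sketch of the kind of loop such a test
runs. This is only a reconstruction, not the actual source: the xorshift
generator, the 8-byte entry type, and the POSIX timing calls are all
assumptions.

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <time.h>

/* 8-byte entry, matching 12500000 entries = 100000000 bytes above. */
typedef uint64_t entry_t;

static uint64_t rng_state = 88172645463325252ULL;

/* xorshift64: a stand-in RNG, not necessarily what the real test uses. */
static uint64_t rng(void)
{
    rng_state ^= rng_state << 13;
    rng_state ^= rng_state >> 7;
    rng_state ^= rng_state << 17;
    return rng_state;
}

int main(void)
{
    const size_t n_entries = 12500000;   /* ~100MB, as in the quoted run */
    const size_t n_reads   = 10000000;
    entry_t *table = malloc(n_entries * sizeof(entry_t));
    if (!table) { perror("malloc"); return 1; }

    /* Touch every page first, so the timed loop measures RAM reads,
       not first-touch page faults. */
    for (size_t i = 0; i < n_entries; i++)
        table[i] = (entry_t)i;

    volatile entry_t sink = 0;           /* keeps the reads from being optimized out */
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < n_reads; i++)
        sink += table[rng() % n_entries]; /* remainder of RNG output as index */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    printf("average read time = %f ns\n", ns / (double)n_reads);

    free(table);
    return 0;
}

Note that a loop like this times the RNG and the divide for the remainder
together with the memory read, and an out-of-order CPU can overlap several
independent reads, so the number it prints is not a pure latency.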

>In the last test, I can't be completely sure I wasn't paging at all.  I didn't
>see the disk light flashing, but it's possible that this hit the disk more than
>once, which would make the number look much higher than it should.

You can check that by turning on perfmon. Browsing the internet at the same
time also has a bad influence here; some online software really eats bandwidth.
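
As a sketch of an alternative check from inside the program itself
(Windows-specific, and only an assumption about how one could do it, not
something the current test does): read the process page-fault counter before
and after the timing loop. Take the first reading after the table has been
initialized, so the first-touch faults are excluded; if the count still climbs
during the timed loop, the run was paging.

#include <stdio.h>
#include <windows.h>
#include <psapi.h>   /* link with psapi.lib */

/* Returns the cumulative page-fault count of this process. */
static DWORD page_faults(void)
{
    PROCESS_MEMORY_COUNTERS pmc;
    GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc));
    return pmc.PageFaultCount;
}

int main(void)
{
    /* ... allocate and initialize the table here, touching every page ... */
    DWORD before = page_faults();
    /* ... run the timed read loop here ... */
    DWORD after = page_faults();
    printf("page faults during the timed loop: %lu\n",
           (unsigned long)(after - before));
    return 0;
}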

90MB is of course far too little to use for such tests.

At SGI I basically tested with sizes of around 50GB.

>Still, relative to the other results people have given, this is not so bad,
>since I have only PC2100 memory (133MHz DDR).

My experience is that, compared to others who ran similar tests, the numbers
match when using 400MB or more as the hashtable size.

If you want to find an explanation, I guess the RNG is the weak link in the
test. That's why I had asked for Dieter's help there, as in the past he said
he knew a lot about RNGs.

He hasn't reported back on that yet. We will see.

The RNG rotates the bits a bit; usually that works well. I do not know,
though, whether it works so well once the remainder is taken. A sketch of
that concern is below.
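
To make that worry concrete, two separate things can go wrong when indexing
with a remainder: ordinary modulo bias, which stays tiny as long as the table
size is far below the RNG's range, and weak bit structure in the generator
itself, which the remainder exposes directly. The rotate/XOR generator below
is a deliberately bad hypothetical stand-in (its state repeats after at most
32 steps), not the RNG actually used in the test:

#include <stdio.h>
#include <stdint.h>

/* Deliberately weak 'rotate the bits a bit' generator (hypothetical). */
static uint32_t bad_state = 0x12345678u;

static uint32_t bad_rng(void)
{
    bad_state = (bad_state << 5) | (bad_state >> 27); /* rotate left by 5 */
    bad_state ^= 0x9E3779B9u;                         /* mix in a constant */
    return bad_state;
}

int main(void)
{
    /* Count how often each bucket is hit when indexing by remainder. */
    enum { N_BUCKETS = 10 };
    unsigned long hits[N_BUCKETS] = {0};

    for (long i = 0; i < 10000000; i++)
        hits[bad_rng() % N_BUCKETS]++;

    /* A uniform RNG would put about 1000000 hits in each bucket; this one
       is far off, because it only ever produces a handful of distinct
       values, so a test driven by it would keep re-reading the same
       table entries. */
    for (int b = 0; b < N_BUCKETS; b++)
        printf("bucket %d: %lu\n", b, hits[b]);
    return 0;
}

If the generator driving a latency test had a defect like this, the test
would hit a small set of addresses that quickly become cache-resident, and
the reported latency would be skewed in ways that depend on the table size.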


