Computer Chess Club Archives



Subject: Re: Seg fault problems on 64bit linux

Author: Carey

Date: 10:29:09 09/06/05



Mr. Hyatt, I hope you don't mind me cutting out some of this quoting.  I know
your habit is to quote everything, and I've tried to do that for you, but the
thread is just getting too long.

On September 06, 2005 at 10:21:44, Robert Hyatt wrote:

>>Nowadays, to be honest, I usually don't care that much about alignment.  As long
>>as it's word aligned, then that's good enough for most of the programming I do.
>>When you are allocating half a gig of memory, it usually doesn't matter too much
>>if the array happens to be page aligned or such.  Just as long as it meets basic
>>alignment.  And most compilers get that right.
>
>I care because of cache linesize.  For example, in the hash table, I always
>fetch 4 entries in one "gulp" which is 64 bytes.  I'd like them to be in one
>cache line when possible.  If I don't make sure that the hash table is aligned
>on a 64-byte boundary, I can get two cache miss penalties on _every_ hash table
>probe that would normally get one miss.  That is significant...

Oh yeah.  I didn't mean to say the benefits aren't substantial.

When I do numerical stuff, I have to do things like that too.

I was just saying that usually I work with stuff that cares more about total
data size than raw speed.  So I don't normally pull every trick I can to
squeeze the last cycle of performance out of it.  And I don't normally put a
lot of effort into putting my data on page boundaries, etc.

(As a side note, talking about alignment... Back in the early Pentium days, I
used to work on a program that needed 'double' alignment for some temporary
storage for some assembly routines.  Performance would drop 10%-20% if it was
misaligned.  I fought the compiler for weeks trying to get it to align the
arrays properly and so on.  I finally decided the linker was just buggy and that
it was going to do whatever it wanted to do each time it ran, with no
consistency.  So I ended up writing an alignment routine for alloca() and
slightly modifying the asm to use that.)
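
The trick itself is simple: allocate a little extra, then round the pointer up
to the boundary you want.  A rough modern sketch (the name alloc_aligned is
mine, and I'm using malloc() and C99's uintptr_t instead of the alloca()
version I actually wrote):

    #include <stdlib.h>
    #include <stdint.h>

    /* Return a pointer aligned to 'align' (a power of two) inside a
       slightly over-sized block.  The caller frees *base, not the
       returned pointer. */
    void *alloc_aligned(size_t size, size_t align, void **base)
    {
        uintptr_t raw = (uintptr_t) malloc(size + align - 1);

        *base = (void *) raw;
        if (raw == 0)
            return NULL;
        return (void *) ((raw + align - 1) & ~(uintptr_t) (align - 1));
    }

With align = 64 that would also handle the cache-line case you describe for
the hash table.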


>>I agree it can be annoying.  I'm not disagreeing with you.
>>
>>I'm just saying that's the way things are.  C was written a long long time ago.
>>It was used a long time before people really started caring about 64 bit
>>integers.
>
>Yes, but this problem also existed at the 16/32 boundary as well; that was the
>same sort of "problem child" for programming...

Right.

That's what I've been saying.  We old "DOSers" went through the same sort of
stuff people are dealing with now for 64 bits.

16 vs. 32 bit integers, pointers, etc. etc.

For us, it's no big deal. We've been there and learned the hard way.

For the new generation of programmers in the past 10+ years, this is all new
stuff.



>>And now that people do care, they give you enough support to do things the way
>>you need to.  You just need to give up the habit of using plain 'char' and 'int'
>>and such.
>
>Problem with that is exactly what data type do I use for 64 bits?  Every
>compiler is not C99 compliant.  Every compiler doesn't support "long long".  So
>it still remains  an issue of portability...


Use the official C99 data type: uint64_t.

If you need an integer that's at least 32 bits but doesn't have to be exactly
32 bits, and it needs to be the fastest available, then you can use
uint_fast32_t, which might be 32 bits on some systems but 64 on others.

And so on.

Those are all defined in stdint.h.
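
For example (just the declarations; the variable names are mine):

    #include <stdint.h>

    uint64_t      bitboard;   /* exactly 64 bits, unsigned            */
    uint_fast32_t counter;    /* at least 32 bits, whatever's fastest */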

That's likely to be supported on most compilers still in use today.  C99 has
been around for six years now.  Older compilers exist, but with the rapid pace
of hardware development and change, they get outdated pretty quickly.

If the compiler is older than that, then you have to do a bit of work to
determine what types to use.  You set things up so the preprocessor looks at
which compiler is being used, and then choose "long long" or "_uint64" or
whatever that particular old compiler gives you.  You just typedef that to
uint64_t and let the rest of the program go about its merry way.
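
Roughly like this (the C99 version test is standard; the old compiler macros
are from memory, so check the manuals):

    #if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 199901L
    #include <stdint.h>                    /* real C99: nothing to do  */
    #elif defined(_MSC_VER)
    typedef unsigned __int64 uint64_t;     /* Microsoft's pre-C99 name */
    #elif defined(__GNUC__)
    typedef unsigned long long uint64_t;   /* gcc had this before C99  */
    #endif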

You could either write your own version of stdint.h that figures all that stuff
out, or you could use / modify one of the public domain versions of stdint.h &
inttypes.h that already do that.  ( http://www.lysator.liu.se/c/q8/ for one
example.  There are others but I don't want to do the searching. )


It'd be nice if C provided a bit more type checking.  That way you could be
sure not to accidentally mix your 64-bit bitboard with an integer (of any
size).  You could use arrays and be sure of never using the wrong type of
index.  (For example, every so often in my chess program, I'll accidentally do
Pieces[WHITE] instead of Colors[WHITE].)

But C never has had that kind of typechecking and never will.
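
About the closest you can get is wrapping the value in a one-field struct,
since C won't silently mix distinct struct types (a sketch, my names):

    typedef struct { uint64_t b; } Bitboard;

    Bitboard occupied;
    /* occupied = 5;      -- compile-time error                   */
    /* occupied.b |= 5;   -- legal: you have to say what you mean */

But that gets clumsy fast for arithmetic and indexing, which is why most of us
don't bother.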




>>If you don't like the default nature of int or char, then don't use them.  Go
>>ahead and specify whether they are signed or unsigned and whatever particular
>>size you need.  C99 has the tools just waiting to be used.  You just have to
>>give up the habit of using 'char' and 'int'.  (A habit that can be very hard
>>to break...)
>
>Again I will remind you that not all compilers are C99 compliant.  If I were
>just writing code for me, this would not be a problem.  But I write code that is
>compiled on every compiler and system known to man, and a few that are not..


No, they aren't all C99.

But that's a good place to start.

Better to start with something that is standard and widely supported (and
will become more so as the years go by) than trying to figure it out yourself
on every new system that comes along.

In other words, instead of trying to figure out every possible system and
combination all by yourself, first give stdint.h a try.  The preprocessor can
check the C version and include it if it's there, and if not, fall back to the
manual compiler-specific setup sketched above.

That's likely to work on most systems (and more so in the future).  Then you
only have to worry about the few old systems without a C99 compiler and figure
those out for yourself.  (You've probably already done a lot of that.  Or you
can use the various public domain stdint.h headers that try to do the same
thing.)  There will be far fewer of those than the total number of systems and
compilers, which means less to maintain.  And once the few old systems are
done, there won't be many 'new' ones coming along that need special attention,
because those will probably be C99 compatible.

Put in a few preprocessor checks so that if something still isn't defined,
the build stops with an #error message telling the user to check his compiler
manual for the right settings, and to send them to you for that particular
system.
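
Something like this (HAVE_64BIT_TYPE is a made-up name; use whatever your
detection code defines):

    #ifndef HAVE_64BIT_TYPE
    #error "No 64 bit type known here.  Check your compiler manual and report the settings."
    #endif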

And then just put in a few safety checks at the start of the program to make
sure the data sizes really are what you expect, and that some user hasn't come
along and modified things or set up some custom config that isn't going to
work.  A few sizeof() calls, a few shifts to see where a word becomes zero,
and so on.  (Since you are working with bitboards, you can leave out the tests
for decimal or non-binary systems.  You can assume it really does have binary
bits.)
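
For example (a sketch; call it from main() before anything else):

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>   /* or your own fallback header */

    static void sanity_checks(void)
    {
        uint64_t v = 1;
        int bits = 0;

        if (sizeof(uint64_t) != 8) {
            fprintf(stderr, "uint64_t is not 8 bytes\n");
            exit(EXIT_FAILURE);
        }

        /* shift until the word becomes zero to count the real width */
        while (v != 0) {
            v <<= 1;
            bits++;
        }
        if (bits != 64) {
            fprintf(stderr, "uint64_t shifts out after %d bits\n", bits);
            exit(EXIT_FAILURE);
        }
    }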






