Computer Chess Club Archives


Subject: Re: Java versus C Speed Comparison

Author: Robert Hyatt

Date: 16:23:19 01/11/03


On January 11, 2003 at 16:10:18, Miguel A. Ballicora wrote:

>On January 11, 2003 at 14:21:20, Robert Hyatt wrote:
>
>[snip]
>
>>>>C is _not_ great for writing code that has to run on multiple architectures.
>>>
>>>I believe this is false.
>>
>>You can believe what you want, but experience is on my side here.  I've
>>ported way too many programs from various languages.  C is only better than
>>assembly.  Other languages such as FORTRAN and ADA are far simpler.
>
>As a portable language C is only better than assembly? well, let's leave it at
>that. That is your opinion.
>

Yes.  What language would you put down there with C?

Note that I have not said "C is bad".  I have been using it forever.  I
have been saying "it could be significantly better with better type
casting."



>
>>>>For a comparison, try FORTRAN.  I have run FORTRAN code on 16, 32 and 64 bit
>>>>machines with _zero_ changes to the source, because I could specify how long
>>>>a variable had to be, precisely, without worrying about "does the architecture
>>>>support this?"
>>>
>>>You certainly can write C code that does not need to be changed in other
>>>architectures. If you prefer other languages, that is another issue.
>>
>>Again, it depends.  How are you going to do bitmaps?
>>
>>long long won't cut it as that can be 128 bits.  long won't cut it as that
>>can be 32 or 64 bits.  That is the problem.  For many things "int" works
>>fine.  Loop counters.  Array indices.  But for some things I need to know
>>how many bits.  Random number generators come to mind.  As does the bitmap
>>stuff in Crafty.  The bit-tickling stuff done by the crypto guys.  Etc...
>
>I told you already, for key types that are the core of the program you can
>and probably should use a typedef. Not a big deal.


And what typedef am I going to use for a 64 bit integer that will work on
all platforms?  The fact that I would need to resort to a "typedef" indicates
that something is "missing".
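
The usual answer looks something like this (a sketch with made-up names,
not Crafty's actual source, and it still rests on per-compiler guesses,
which is exactly the point):

#include <limits.h>
#include <stdio.h>

/* Guess a 64-bit unsigned type.  The #if test and the BITBOARD name
   are illustrative only. */
#if defined(_MSC_VER)
typedef unsigned __int64 BITBOARD;    /* Microsoft's spelling of 64 bits */
#else
typedef unsigned long long BITBOARD;  /* usually 64 bits, not guaranteed */
#endif

/* Break the build if the guess is wrong: the array gets a negative
   size unless BITBOARD is exactly 64 bits wide on this platform. */
typedef char bitboard_must_be_64_bits[sizeof(BITBOARD) * CHAR_BIT == 64 ? 1 : -1];

int main(void) {
    printf("BITBOARD is %u bits here\n",
           (unsigned)(sizeof(BITBOARD) * CHAR_BIT));
    return 0;
}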



>
>>>>Yes C is good.  And yes, it _could_ be better, if only the standards committee
>>>>would write a _standard_ with no missing pieces...
>>>
>>>I do not criticize the standards committee because they had to accommodate needs
>>>of people that have all sorts of different hardware.
>>>You cannot shovel down their throats int32 when the word size of the machine is
>>>36 bits without hurting performance.
>>
>>So?  You can do _both_ as has been pointed out many times here.
>
>Why int32 and not int36? The committee tried to keep things simple and I agree
>about the decision.

int36 doesn't really fit a machine of today.  For 20 years we seem to have
settled on multiples of 8, which is ok by me...

But one correction.  The committee did _not_ try to keep things simple.  They
simply could not agree and punted the decision.  Just as they punted a few
other issues (such as: is char signed or unsigned?  They _could_ have made a
decision there.  They _chose_ not to.  Ditto for bit-fields: from which end
of the word are they allocated?)
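
To make the char point concrete (just an illustration), the same few
lines answer differently depending on the compiler:

#include <stdio.h>
#include <limits.h>

/* Whether plain char is signed or unsigned is implementation-defined.
   CHAR_MIN is 0 where char is unsigned and negative where it is signed,
   so this one source file prints different answers on different systems. */
int main(void) {
    if (CHAR_MIN < 0)
        printf("plain char is signed here (CHAR_MIN = %d)\n", CHAR_MIN);
    else
        printf("plain char is unsigned here (CHAR_MIN = %d)\n", CHAR_MIN);
    return 0;
}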




>
>>>Another thing is that I guess that writing C compiler could have been something
>>>relatively easy when a new architecture appeared thanks to the simplicity of the
>>>language.
>>
>>Probably.  But I can't say the same for porting sophisticated programs to a
>>new architecture.  That can be a mess...
>
>but that was the reason why there was always a C compiler ready pretty quick for
>a new architecture, wasn't it?

There wasn't "always a C compiler".  When I started programming there _was_
no "C".  A C compiler shows up first on a new architecture because C is
popular and has become an O/S implementation language.  But that doesn't mean
it can't be improved.

>Anyway, C programs have been ported from different architectures without much
>problem when the code was written carefully. IIRC, one of the first versions of
>Excel was ported from the Mac to the PC (or vice versa, I do not remember).
>
>[snip]
>
>>But that _still_ doesn't address the problem that "int" can be anything from
>>16 to 64 bits today, and maybe even 128 bits in the future.  IE when I want
>>to grab bits 16-31 of a "thing" I shift it right 16 bits and AND with _what_?
>>65535?  Not on a 64 bit architecture.  -1 ^ 65535?  maybe...
>
>unsigned int x, y;
>y = (x >> 16) & 0xffff;
>
>or more general,
>
>unsigned int x, y;
>y = (x & 0xffffffff) >> 16;

Not if you use bit-fields.  Which is _another_ issue of course.

However, if I am writing code, and I am using a 32 bit int, and I want the
upper 16 bits, I am going to shift right 16 and use that.  I'm _not_ going to
also AND it with 0xffff as that is a wasted instruction.  Unless the compiler
uses something besides 32 bits to hold the value, which I would _like_ to be
able to control...
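
Roughly the difference (a sketch, made-up names):

#include <stdio.h>

/* If x is known to be exactly 32 bits, the shift alone delivers bits
   16-31 and the AND is a wasted instruction.  If the type might be
   wider (a 64-bit unsigned long, say), the mask is required to throw
   away bits 32 and up. */
unsigned int bits_16_to_31_known_32(unsigned int x) {
    return x >> 16;                /* enough only when unsigned int is 32 bits */
}

unsigned long bits_16_to_31_any_width(unsigned long x) {
    return (x >> 16) & 0xffffUL;   /* portable whatever the width of long */
}

int main(void) {
    unsigned int v = 0xDEADBEEFu;
    printf("%#x  %#lx\n", bits_16_to_31_known_32(v), bits_16_to_31_any_width(v));
    return 0;
}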



>
>>But it is a headache that could be eliminated.  IE on a PC you know that
>>short=16, int=32, long=32 and long long=64.  But _only_ on a PC.  It would
>>be nice to know something about specific types that applies to _all_
>>architectures...
>>int8, int16, int32 and int64 would do just fine since the older architectures
>>with 36, 48 and 60 bits are pretty much history...
>
>You assume that those architectures are history, that might be true today but C
>was not born today.

The standard was revised _recently_ again.

Three years ago, in fact...
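
For reference, that revision is C99, and it does add exact-width
typedefs in <stdint.h>, though they are only required on platforms
whose hardware actually has types of those widths.  A minimal sketch:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint64_t bitmap = 1;           /* exactly 64 bits, where available */
    uint_least32_t counter = 0;    /* at least 32 bits, always present */
    bitmap <<= 63;                 /* safe: the width is known         */
    printf("%llu %lu\n", (unsigned long long)bitmap, (unsigned long)counter);
    return 0;
}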



>
>Miguel


