Computer Chess Club Archives



Subject: Re: Java versus C Speed Comparison

Author: Miguel A. Ballicora

Date: 13:10:18 01/11/03



On January 11, 2003 at 14:21:20, Robert Hyatt wrote:

[snip]

>>>C is _not_ great for writing code that has to run on multiple architectures.
>>
>>I believe this is false.
>
>You can believe what you want, but experience is on my side here.  I've
>ported way too many programs from various languages.  C is only better than
>assembly.  Other languages such as FORTRAN and ADA are far simpler.

As a portable language, C is only better than assembly? Well, let's leave it at
that. That is your opinion.


>>>For a comparison, try FORTRAN.  I have run FORTRAN code on 16, 32 and 64 bit
>>>machines with _zero_ changes to the source, because I could specify how long
>>>a variable had to be, precisely, without worrying about "does the architecture
>>>support this?"
>>
>>You certainly can write C code that does not need to be changed in other
>>architectures. If you prefer other languages, that is another issue.
>
>Again, it depends.  How are you going to do bitmaps?
>
>long long won't cut it as that can be 128 bits.  long won't cut it as that
>can be 32 or 64 bits.  That is the problem.  For many things "int" works
>fine.  Loop counters.  Array indices.  But for some things I need to know
>how many bits.  Random number generators come to mind.  As does the bitmap
>stuff in Crafty.  The bit-tickling stuff done by the crypto guys.  Etc...

I told you already: for key types that are the core of the program, you can
and probably should use a typedef. Not a big deal.

>>>Yes C is good.  And yes, it _could_ be better, if only the standards committee
>>>would write a _standard_ with no missing pieces...
>>
>>I do not criticize the standards committee because they had to accommodate the
>>needs of people who have all sorts of different hardware.
>>You cannot shove int32 down their throats when the word size of the machine is
>>36 bits without hurting performance.
>
>So?  You can do _both_ as has been pointed out many times here.

Why int32 and not int36? The committee tried to keep things simple, and I agree
with the decision.

>>Another thing is that I guess writing a C compiler could have been something
>>relatively easy when a new architecture appeared, thanks to the simplicity of
>>the language.
>
>Probably.  But I can't say the same for porting sophisticated programs to a
>new architecture.  That can be a mess...

But that was the reason why there was always a C compiler ready pretty quickly
for a new architecture, wasn't it?
Anyway, C programs have been ported across different architectures without much
problem when the code was written carefully. IIRC, one of the first versions of
Excel was ported from the Mac to the PC (or vice versa, I do not remember).

[snip]

>But that _still_ doesn't address the problem that "int" can be anything from
>16 to 64 bits today, and maybe even 128 bits in the future.  IE when I want
>to grab bits 16-31 of a "thing" I shift it right 16 bits and AND with _what_?
>65535?  Not on a 64 bit architecture.  -1 ^ 65535?  maybe...

unsigned int x, y;
y = (x >> 16) & 0xffff;

or, more generally,

unsigned int x, y;
y = (x & 0xffffffff) >> 16;

>But it is a headache that could be eliminated.  IE on a PC you know that
>short=16, int=32, long=32 and long long=64.  But _only_ on a PC.  It would
>be nice to know something about specific types that applies to _all_
>architectures...
>int8, int16, int32 and int64 would do just fine since the older architectures
>with 36, 48 and 60 bits are pretty much history...

You assume that those architectures are history; that might be true today, but
C was not born today.

Miguel




