Computer Chess Club Archives


Subject: Re: Java versus C Speed Comparison

Author: Robert Hyatt

Date: 08:10:07 01/10/03

On January 10, 2003 at 05:12:04, David Rasmussen wrote:

>On January 09, 2003 at 17:36:11, Robert Hyatt wrote:
>
>>I think the entire concept of "short", "int" and "long" is badly flawed.  It
>>would have been so much more logical and clean to simply go with int16,
>>int32 and int64.
>>
>>I don't personally like "long long", as it is a syntactical oddity in light
>>of char, short, int and float/double.
>
>There is a reasonable explanation for this, at least. The idea is that "int"
>should be whatever is the most natural unit of integer calculation on a
>machine. In many cases, you don't care how many bits a type can store; the
>lower limits given by the standard are enough. You just want to know that by
>writing "int" you get something that on every platform is supposed to be
>simple, fast, signed (no weird problems with subtraction, etc.),

But _not_ for "real codes".  Do I _really_ want to use int when it _might_ be
a 16-bit value that won't hold the counter I need?

No.

The point being that _I_ know what I need for various values, and I'd like to
be able to specify that explicitly.  In C, I cannot do so with the current
types.  Having "int = fastest word length" is OK.  But only "OK".  It falls
short of what is needed.
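For illustration, a minimal C sketch of that trap (not from the original
post): the standard only promises a 16-bit range for int, so the same counter
code can be correct under one compiler and silently broken under another.

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* The C standard only guarantees INT_MAX >= 32767, so a node
       counter declared as plain "int" is fine where int is 32 bits
       and overflows where it is 16 -- same source, two behaviors. */
    printf("int range : %d .. %d\n", INT_MIN, INT_MAX);
    printf("long range: %ld .. %ld\n", LONG_MIN, LONG_MAX);
    return 0;
}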



> and maps naturally onto the
>architecture in question. For certain problems and/or on certain platforms,
>int isn't the fastest, or for some applications knowing the exact width of an
>integer type is important. In these cases, we must do something different
>from the default. From my point of view, it is a question of the ability to
>specify your design concisely. So I think that _both_ forms should be
>available. That


There we agree, and that is what I have said all along.  "int" is OK, but I
would like to see specific forms as well, i.e. int, int16, int32, int64 and
int128.
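As it happens, C99's <stdint.h> spells this almost exactly as wished, int16_t
through int64_t (there is no standard int128).  A minimal sketch, assuming a
C99 compiler:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int16_t  score = -300;                         /* exactly 16 bits */
    int32_t  nodes = 2000000000;                   /* exactly 32 bits */
    uint64_t hash  = UINT64_C(0x123456789ABCDEF0); /* exactly 64 bits */

    /* The PRI* macros from <inttypes.h> supply the matching printf
       conversion for each fixed-width type. */
    printf("score=%" PRId16 " nodes=%" PRId32 " hash=%" PRIx64 "\n",
           score, nodes, hash);
    return 0;
}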





>is, sometimes it is very natural to say "just give me whatever happens to be
>a natural int on this platform; I know it's at least 16 bits wide, the
>standard says so, and that's good enough for this purpose". Such users
>shouldn't be forced to choose a specific width, say int16, when on some other
>platform int32 is faster and would do the same job. On the other hand,
>sometimes you want something exactly 16 bits wide, but in that case the ugly
>thing is that we have to say it's an int. Most of the time, when we want
>something to be x bits wide, we don't interpret it as a normal integer but
>rather as a bitfield in some sense. So both notions should be available. They
>aren't. Nor are a lot of other useful features in C (many of them, including
>this one, are found in Ada). On the other hand, a lot of weird, ugly and
>error-prone features _are_ available. C is an ugly, ugly language :)
>
>/David
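For what it's worth, C99's <stdint.h> does offer both notions side by side:
exact-width types and "fastest, at least N bits" types.  A minimal sketch,
assuming a C99 compiler:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* "Whatever is fast, but at least 16 bits" -- the natural-int
       notion, with the minimum width spelled out explicitly. */
    int_fast16_t counter = 0;

    /* "Exactly 16 bits" -- the bitfield-like notion.  int16_t is
       optional in the standard; it exists only on platforms that
       really have a 16-bit two's-complement type. */
    int16_t packed_move = 0;

    printf("int_fast16_t: %zu bytes, int16_t: %zu bytes\n",
           sizeof counter, sizeof packed_move);
    return 0;
}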


