Computer Chess Club Archives



Subject: The upcoming C9x Standard

Author: Dann Corbit

Date: 17:22:42 09/22/99



On September 22, 1999 at 20:02:46, Robert Hyatt wrote:
[snip]
>No... it is intuitively obvious to the casual observer that a 128 bit
>integer will be known as a "long long long variable".  Unless you are on
>a machine that uses 16 bits for regular ints, in which case it would be a
>"long long long long int".
>
>I much prefer Microsoft's approach without the underscore...  i.e. int64 or
>int32 or int16, as that is _absolutely_ clear as to what you want, and the
>compiler is free to comply or produce an error.  As it is, who in the hell
>knows what a "long" will get you?  16 bits?  32 bits?  64 bits?  It all
>depends on the underlying architecture...  And it is insane to leave it 'open'.

To my way of thinking, int should be the fastest integer on your machine
(whatever it is).  And long should be the largest integer on your machine
(whatever that means).
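
The current mess is easy to demonstrate.  A little program like this (the
sizes it prints depend entirely on your platform and compiler) shows why
nobody knows what long will get you:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* sizeof counts bytes; CHAR_BIT converts to bits.  Typical
           results: int is 32 bits on most current machines, while
           long is 32 or 64 bits depending on the platform, which is
           exactly the complaint. */
        printf("int  : %lu bits\n", (unsigned long)(sizeof(int)  * CHAR_BIT));
        printf("long : %lu bits\n", (unsigned long)(sizeof(long) * CHAR_BIT));
        return 0;
    }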

Named sizes will be addressed to some degree by the new typedefs (some
required, some optional):

Committee Draft  --  August 3, 1998   WG14/N843

7.18.1.1  Exact-width integer types

[#2] The following designate exact-width signed integer
types:

        int8_t       int16_t      int32_t      int64_t

The following designate exact-width unsigned integer types:

        uint8_t      uint16_t     uint32_t     uint64_t

(These types need not exist in an implementation.)
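
Where uint64_t does exist, it is exactly what a chess programmer wants for
bitboards.  Something like this (assuming an implementation that provides
the optional 64-bit exact-width type; the names are just for illustration):

    #include <stdint.h>

    typedef uint64_t bitboard;   /* one bit per square, a1 = bit 0 */

    /* Return the board with the bit for the given square (0..63) set. */
    static bitboard set_square(bitboard b, int square)
    {
        return b | ((bitboard)1 << square);
    }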

7.18.1.2  Minimum-width integer types

[#1] Each of the following types designates an integer type
that has at least the specified width, such that no integer
type of lesser size has at least the specified width.  These
type names have the general form of int_leastn_t or
uint_leastn_t, where n is the minimum required width.  For
example, int_least32_t denotes a signed integer type that
has a width of at least 32 bits.

[#2] The following designate minimum-width signed integer
types:

        int_least8_t              int_least32_t
        int_least16_t             int_least64_t

The following designate minimum-width unsigned integer
types:

        uint_least8_t             uint_least32_t
        uint_least16_t            uint_least64_t

(These types exist in all implementations.)
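
The point of the _least types is that they must exist even on oddball
hardware where the exact-width ones cannot (say, machines with 9-bit
chars).  Something like this is portable everywhere (the variable name is
just for illustration):

    #include <stdint.h>

    /* int_least32_t exists on every C9x implementation; it may be
       wider than 32 bits where no exact 32-bit type is available. */
    int_least32_t evaluation_score = -15000;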

7.18.1.3  Fastest minimum-width integer types

[#2] The following designate fastest minimum-width signed
integer types:

        int_fast8_t               int_fast32_t
        int_fast16_t              int_fast64_t

The following designate fastest minimum-width unsigned
integer types:

        uint_fast8_t              uint_fast32_t
        uint_fast16_t             uint_fast64_t

(These types exist in all implementations.)

____________________

202 The designated type is not guaranteed to be fastest for
    all purposes; if the implementation has no clear grounds
    for choosing one type over another, it will simply pick
    some integer type satisfying the signedness and width
    requirements.
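
The _fast variants are for things like loop counters, where you want at
least N bits but whatever width the machine handles quickest.  A small
sketch (the function and parameter names here are made up):

    #include <stdint.h>

    /* int_fast16_t promises at least 16 bits, but the compiler may
       pick a wider type (often a full machine word) if it is faster. */
    void clear_history(int_least16_t *table, int_fast16_t count)
    {
        for (int_fast16_t i = 0; i < count; i++)
            table[i] = 0;
    }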

>>I suggested to the committee that they free the bitfields from those poor unions
>>that hold them captive.  E.g.:
>>int foo:64; /* 64 bit integer */
>>unsigned int bar:12; /* 12 bit integer */
>>I will admit that it could be problematic to take the address of a one bit
>>integer.  Just waste 7 bits in such a case.
>>
>>Oh well, water under the bridge.
>
>
>Those shouldn't be a problem, and you should _never_ take the address of a bit
>field in any compiler I can think of.  In fact, the last time I tried it, the
>compiler rejected it instantly...  but in any case, long long is gross.  128 bit
>ints are coming, and long long long is _really_ gross...

It seems to me that there are *way* too many new typedefs (I have listed just a
few of them -- there are more).  And what is worse for forgetful folks like me is
that now I will have to remember which types are required and which are optional.

If they had allowed the
   int foo:size;
notation, that would take care of every conceivable size.  "No preexisting
practice," they said.  But we already have bitfields in structs and unions, and
there are already arbitrary-precision integer packages like MIRACL by Michael Scott.
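
For the record, here is the existing practice the proposal would have
generalized (the struct and member names are made up):

    /* Bit-fields already give you arbitrary widths, just not as
       free-standing objects: */
    struct move_info {
        unsigned int from  : 6;   /* source square, 0..63 */
        unsigned int to    : 6;   /* destination square   */
        unsigned int piece : 4;   /* piece code           */
    };
    /* And, as Bob notes above, &m.from is rejected by the compiler:
       you cannot take the address of a bit-field. */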

Oh well, you can't have everything your way, I guess.  But in my case, less
would have been more: less documentation, more types.  Efficient if the
compiler can do it, and if not, well, I *wanted* a 12 bit type and should
expect to pay a performance price for demanding one.


