Author: Tom Kerrigan
Date: 13:07:36 04/29/00
On April 29, 2000 at 11:52:28, William Bryant wrote:

>On April 29, 2000 at 04:25:21, David Blackman wrote:
>
>>On April 29, 2000 at 02:22:02, Will Singleton wrote:
>>
>>>On April 28, 2000 at 19:24:09, José Carlos wrote:
>>>
>>>>On April 28, 2000 at 19:13:33, Will Singleton wrote:
>>>>
>>>>>I get compiler warnings about implicit int-to-short conversions, even for a
>>>>>statement like
>>>>>
>>>>>x = -x;
>>>>>
>>>>>where x is a short.
>>
>>>I'm using CodeWarrior for the Mac. I assume, therefore, that this compiler
>>>doesn't handle shorts well, so I might want to convert shorts to ints and
>>>note any changes.
>>>
>>>The only reason I use them is to conserve memory, which is kind of
>>>ridiculous nowadays. I wonder if there is some reason for the warning,
>>>like, on some occasions the implicit conversion might fail.
>>>
>>>Will
>>
>>On a Mac (assuming it's not a 68K Mac) there are no 16-bit instructions
>>except load with zero extend, load with sign extend, and store (and maybe
>>zero-extend and sign-extend in registers; I don't remember for sure). So it
>>has to convert to int, negate, and convert back to short. In any case, the C
>>standard says it has to be done that way (unless the optimiser can prove
>>that it makes no difference).
>>
>>I agree with Bruce that you shouldn't use short (or char) for single
>>variables. It costs speed, and I can see no benefit at all.
>>
>>Only use them in large arrays, or in structs that will be used in large
>>arrays. It's worth doing an explicit cast back to the right type when
>>storing into a short or char. And it's worth examining each cast carefully
>>to make sure that either overflow is impossible, or that overflow does what
>>you want it to.
>
>Unless something has changed recently, the size of int is undefined in the C
>and C++ programming languages. It may be 16 bits, 32 bits, or 64 bits; this
>is up to the processor and compiler.
>
>In the header <types.h>, they define rather easy-to-use fixed-size integer
>types. I prefer UInt32, SInt32, UInt8, SInt8, etc. to unsigned int, int,
>unsigned char, and char. These data sizes should be consistent over time.
>
>William
>wbryant@ix.netcom.com

Here's the nice thing about ints: if you don't care whether a variable is
16-bit, 32-bit, or 64-bit, you can just declare it as an int, and the
compiler will use the processor's word size. Then your program will run
reasonably well on different processors.

-Tom
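
A minimal sketch of the warning under discussion and of the explicit cast
David recommends, assuming a C compiler with narrowing warnings enabled (the
variable and values are illustrative, not from the thread):

    #include <stdio.h>

    int main(void) {
        short x = -42;

        /* Arithmetic on a short is promoted to int, so -x is an int;
           storing it back into x narrows it, which is what the
           compiler warns about. */
        x = -x;            /* may draw an implicit int-to-short warning */

        /* An explicit cast documents the narrowing and silences the
           warning; for this value, -x cannot overflow a short. */
        x = (short)(-x);

        printf("%d\n", x);
        return 0;
    }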
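For the fixed-size types William mentions, a sketch assuming a C99 compiler
with <stdint.h>, whose uint32_t/int32_t typedefs play the same role as the Mac
UInt32/SInt32 ones; it also shows that int's width floats with the platform,
as Tom notes:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint32_t fixed = 0;  /* exactly 32 bits on any conforming compiler */
        int      word  = 0;  /* whatever width the compiler picks (>= 16 bits) */

        printf("uint32_t: %u bytes, int: %u bytes\n",
               (unsigned)sizeof fixed, (unsigned)sizeof word);
        return 0;
    }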