Author: Matt Taylor
Date: 17:06:18 01/11/03

On January 11, 2003 at 16:10:18, Miguel A. Ballicora wrote:

>On January 11, 2003 at 14:21:20, Robert Hyatt wrote:
>
>[snip]
>
>>>>C is _not_ great for writing code that has to run on multiple architectures.
>>>
>>>I believe this is false.
>>
>>You can believe what you want, but experience is on my side here. I've ported way too many programs from various languages. C is only better than assembly. Other languages such as FORTRAN and ADA are far simpler.
>
>As a portable language C is only better than assembly? Well, let's leave it at that. That is your opinion.

It is true that Hello World programs are usually quite portable.

>>>>For a comparison, try FORTRAN. I have run FORTRAN code on 16, 32 and 64 bit machines with _zero_ changes to the source, because I could specify how long a variable had to be, precisely, without worrying about "does the architecture support this?"
>>>
>>>You certainly can write C code that does not need to be changed on other architectures. If you prefer other languages, that is another issue.
>>
>>Again, it depends. How are you going to do bitmaps?
>>
>>long long won't cut it as that can be 128 bits. long won't cut it as that can be 32 or 64 bits. That is the problem. For many things "int" works fine. Loop counters. Array indices. But for some things I need to know how many bits. Random number generators come to mind. As does the bitmap stuff in Crafty. The bit-tickling stuff done by the crypto guys. Etc...
>
>I told you already, for key types that are the core of the program you can and probably should use a typedef. Not a big deal.

Should every program that needs these types have to define them itself, or should we save the time spent porting programs and have the compiler do it for us? This isn't a radical transformation of the C language, and it fits perfectly well in the C paradigm.

>>>>Yes C is good. And yes, it _could_ be better, if only the standards committee would write a _standard_ with no missing pieces...
>>>
>>>I do not criticize the standards committee because they had to accommodate the needs of people who have all sorts of different hardware. You cannot shove int32 down their throats when the word size of the machine is 36 bits without hurting performance.
>>
>>So? You can do _both_ as has been pointed out many times here.
>
>Why int32 and not int36? The committee tried to keep things simple, and I agree with that decision. Read what I say at the end of this post.

You can have both without defining either.

>>>Another thing is that I guess writing a C compiler could have been something relatively easy when a new architecture appeared, thanks to the simplicity of the language.
>>
>>Probably. But I can't say the same for porting sophisticated programs to a new architecture. That can be a mess...
>
>But that was the reason why there was always a C compiler ready pretty quickly for a new architecture, wasn't it?
>Anyway, C programs have been ported between different architectures without much trouble when the code was written carefully. IIRC, one of the first versions of Excel was ported from the Mac to the PC (or vice versa, I do not remember).

Excel is hardly a good example. Look at Crafty -- it needs a 64-bit bit vector. Look at CRC algorithms, where the constant used is different depending on what your data size is.
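
To make the typedef point above concrete: a program that depends on exact widths usually pins them down in one place, either by hand per platform or, since C99, through <stdint.h>. A minimal sketch, assuming a C99 compiler; the header and type names here are illustrative, not taken from any particular program:

/* widths.h -- the one place that knows exact sizes */
#include <stdint.h>                    /* C99 exact-width types */

typedef uint64_t bitvec64;             /* e.g. a 64-bit bit vector for bitmaps */
typedef uint32_t crcword32;            /* e.g. a CRC-32 accumulator */

/* Before C99, the same header would pick the underlying type per platform: */
/* typedef unsigned long long bitvec64;    gcc and most Unix compilers      */
/* typedef unsigned __int64   bitvec64;    Microsoft compilers              */

Porting then means touching this one header instead of every shift and mask in the program, which is the typedef argument; the counter-argument below is that the compiler could provide the whole family (int8, int16, int32, ...) so nobody has to maintain such a header at all.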
>[snip]
>
>>But that _still_ doesn't address the problem that "int" can be anything from 16 to 64 bits today, and maybe even 128 bits in the future. IE, when I want to grab bits 16-31 of a "thing", I shift it right 16 bits and AND with _what_? 65535? Not on a 64-bit architecture. -1 ^ 65535? Maybe...
>
>unsigned int x, y;
>y = (x >> 16) & 0xffff;
>
>or more generally,
>
>unsigned int x, y;
>y = (x & 0xffffffff) >> 16;
>
>>But it is a headache that could be eliminated. IE, on a PC you know that short=16, int=32, long=32 and long long=64. But _only_ on a PC. It would be nice to know something about specific types that applies to _all_ architectures... int8, int16, int32 and int64 would do just fine, since the older architectures with 36, 48 and 60 bits are pretty much history...
>
>You assume that those architectures are history; that might be true today, but C was not born today.

No, but C99 was.

I think you also missed another point. Why does the committee even -have- to define specific forms of intxx? If the compiler understands that int32 = 32-bit int, it can just as easily recognize that int36 = 36-bit int. It is not difficult. You yourself pointed out that emulation logic is trivial.

Furthermore, boolean variables become efficient (bool = int1). If the address of a boolean variable is never taken, it can be coalesced with other variables to save space. This also allows simultaneous testing of multiple boolean variables -- greater efficiency. At the same time it eliminates the worry of "how many bits can I fit into this type or that type?"

The argument that it would complicate compilers is lame at best. The logic can be implemented simply when performance is not required; it could be as simple as a preprocessor-like translation. This is a BETTER compromise between performance and portability than the existing system.

-Matt
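
For the concrete example above (grab bits 16-31 no matter how wide int happens to be), exact-width types remove the ambiguity Hyatt describes, and C's bit-fields are the closest existing analogue to the "bool = int1" idea. A minimal sketch, assuming C99's <stdint.h>; the function and field names are made up for illustration:

#include <stdint.h>

/* Bits 16-31 of a 64-bit value; the shift and mask mean the same thing
   on any host, regardless of how wide int or long happens to be. */
static uint32_t bits_16_to_31(uint64_t x)
{
    return (uint32_t)((x >> 16) & 0xFFFFu);
}

/* Several one-bit flags packed into one word, roughly what int1 would buy;
   the compiler may coalesce them because their addresses cannot be taken. */
struct move_flags {
    unsigned capture    : 1;
    unsigned check      : 1;
    unsigned en_passant : 1;
};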