Author: Robert Hyatt
Date: 11:21:20 01/11/03
On January 11, 2003 at 12:30:16, Miguel A. Ballicora wrote:

>On January 11, 2003 at 11:30:44, Robert Hyatt wrote:
>
>>On January 11, 2003 at 02:10:45, Miguel A. Ballicora wrote:
>>
>>>On January 10, 2003 at 21:34:38, Robert Hyatt wrote:
>>>
>>>>>Then you sacrifice performance, particularly for machines that do not have
>>>>>8-bit chars and weird configurations! The performance hit in those cases must
>>>>>be huge. You cannot have 100% portability and best performance at the same
>>>>>time. C gives, IMHO, the best compromise. History showed that.
>>>>
>>>>Actually it doesn't show that. C was developed 30+ years ago on a very simple
>>>>architecture. The basic language structure has survived for a long time, but
>>>>the data types (particularly integer) have really lagged behind, and kludges
>>>>like "long long" are the result of short-sightedness...
>>>
>>>Well, it survived 30+ years, so history is saying something. IMHO, the data
>>>types had a lot to do with the success of the language. They allowed writing
>>>portable and efficient code for diverse machines with completely different
>>>word sizes, and I did not make that up; it really happened.
>>
>>Yes, but not quite like you think. C is _great_ for working on a specific
>>architecture. Efficient. Easy to write good code. Readable. Etc.
>>
>>C is _not_ great for writing code that has to run on multiple architectures.
>
>I believe this is false.

You can believe what you want, but experience is on my side here. I've ported
way too many programs, in various languages. C is only better than assembly.
Other languages such as FORTRAN and Ada are far simpler.

>>For a comparison, try FORTRAN. I have run FORTRAN code on 16-, 32- and 64-bit
>>machines with _zero_ changes to the source, because I could specify how long
>>a variable had to be, precisely, without worrying about "does the architecture
>>support this?"
>
>You certainly can write C code that does not need to be changed on other
>architectures.
>If you prefer other languages, that is another issue.

Again, it depends. How are you going to do bitmaps? "long long" won't cut it,
as that can be 128 bits. "long" won't cut it, as that can be 32 or 64 bits.
That is the problem. For many things "int" works fine: loop counters, array
indices. But for some things I need to know exactly how many bits I get.
Random number generators come to mind. As does the bitmap stuff in Crafty.
The bit-tickling stuff done by the crypto guys. Etc...

>>Yes, C is good. And yes, it _could_ be better, if only the standards committee
>>would write a _standard_ with no missing pieces...
>
>I do not criticize the standards committee, because they had to accommodate
>the needs of people that have all sorts of different hardware. You cannot
>shove int32 down their throats when the word size of the machine is 36 bits
>without hurting performance.

So? You can do _both_, as has been pointed out many times here.

>Another thing is that I guess writing a C compiler could have been something
>relatively easy when a new architecture appeared, thanks to the simplicity of
>the language.

Probably. But I can't say the same for porting sophisticated programs to a new
architecture. That can be a mess...

>>>The bottom line is that you do not like the data type structures but I do.
>>>It is a matter of taste.
>>
>>It is more than taste. You might not be as concerned about portability as I
>>am. But in my case, portability is an important issue, and C has some distinct
>>problems in that area. Problems that _could_ be addressed by standards.
>
>I am concerned about portability, no less than you. As I say, it is a matter
>of taste. A few portability details in C are left to the programmer to take
>care of (like ANDing with a mask). I do not mind those tasks; it is a small
>price to pay for the simplest language with the highest portability possible
>that does not hurt performance.
>
>Miguel

But that _still_ doesn't address the problem that "int" can be anything from
16 to 64 bits today, and maybe even 128 bits in the future. IE when I want to
grab bits 16-31 of a "thing", I shift it right 16 bits and AND with _what_?
65535? Not on a 64-bit architecture. -1 ^ 65535? Maybe... But it is a headache
that could be eliminated. IE on a PC you know that short=16, int=32, long=32
and long long=64. But _only_ on a PC. It would be nice to know something about
specific types that applies to _all_ architectures... int8, int16, int32 and
int64 would do just fine, since the older architectures with 36, 48 and 60
bits are pretty much history... int could _still_ work as "fastest counter
available", but int64 would be nice when I want 64 bits, period. "long long"
is ridiculous. What will it be on 64-bit machines that can do 128-bit integer
math? What will it be on 128-bit machines that can do 256-bit integer math?
"long long long long" and "long long long long long long long long"? Ridiculous.
Last modified: Thu, 15 Apr 21 08:11:13 -0700
Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.