> For example, why does C99 have the abomination "long long", even though 64
> bit code could easily be accommodated by char/short/int/long? Because far
> too many people wrote code that assumed "long" was 32 bits, and the C
> compiler vendors didn't want to break that.
Well, I wouldn't really choose to complain about THAT particular example, personally... It would be kind of awkward to have "long" be 64 bits on a 32-bit system, and probably inefficient too, since LOTS of code uses longs and manipulating 64-bit ints on a 32-bit system has to be slower. With a separate "long long", people only pay that cost when they actually need a potentially-64-bit value. Yes, it's a bit of a pain and not as clean as just using "long", but I can certainly see the logic in it, above and beyond merely accommodating people who wrote broken code assuming a 32-bit "long"...
And the ABI issue you mentioned is a big deal-breaker as well... HOW would you propose to solve that, other than leaving "long" alone? You can't just change all the standard library functions that used to take/return "long" to "int" (or some new typedef), because all existing code quite properly assumes they take/return a "long", since that's how they've always been defined. Plus, there are tons of non-standard third-party libs to think of, which would also be affected and which you could never hope to change all of...

(On a side note: am I the only one who hates the fact that various socket functions these days take typedefs like "socklen_t" instead of the traditional "int"? I wouldn't mind so much, but apparently that's defined as an unsigned type instead of the signed "int" it has historically always been... Sure, unsigned makes more sense in retrospect, but geez... And now GCC complains about passing in a pointer to an "int" (which is how things have always been done) for stuff like accept()/getsockname()/etc., since the pointee isn't unsigned. ;-/ Yeah, you can disable that warning, thankfully, but it might still be a nice warning to leave enabled for OTHER code where a signedness mismatch legitimately IS a mistake; here it's a case of the API changing under you, which just isn't cool...)
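For what it's worth, the warning goes away if you just declare the length variable as socklen_t rather than int. A minimal sketch (the wrapper function name is my own invention) using getsockname(), which has the same length-pointer signature as accept():

```c
/* Return the address family of fd's local address, or -1 on error.
   Declaring the length as socklen_t (not int) matches the modern
   POSIX prototype and avoids GCC's pointer-signedness warning. */
#include <sys/socket.h>

int local_family(int fd)
{
    struct sockaddr_storage addr;
    socklen_t addrlen = sizeof(addr);   /* socklen_t, not the historical int */

    if (getsockname(fd, (struct sockaddr *)&addr, &addrlen) != 0)
        return -1;
    return addr.ss_family;
}
```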
> Those who could read the C89 standard, and made no assumptions about
> "long", except what was *promised* in the C89 standard: "long is the
> largest integer type".
Well, if you change it to "largest integer type native to the current platform", it still works... ;-) No, I know what you're saying... I'm old enough to remember the conversion from 16-bit systems to 32-bit; there, "long" was 32-bit, even though the system was 16-bit, so what you say certainly makes sense... I just don't really have a problem with "long long", personally...
The real fun is going to come if/when we ever go to 128-bit systems: I guess the only choice at that point will be to keep "long" 64-bit and make "long long" the only 128-bit integer, or else invent yet another new native type... Either choice is kind of ugly...
Copyright © 2017, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds