Imagine: the future
Posted Apr 25, 2025 11:52 UTC (Fri)
by uecker (guest, #157556)
In reply to: Imagine: the future by tialaramex
Parent article: Some __nonstring__ turbulence
I think parent is right. Having a fat pointer means you embed some hidden information, and that is what is not "low-level." You can have length-prefixed string pointers in C just fine, and I use them in some projects. You certainly do not need Rust for this. As another commenter pointed out, the APIs are all just built around traditional C strings, and we would need to agree on a single string pointer type to really get people to switch (which we should do in WG14).
Posted Apr 25, 2025 13:28 UTC (Fri)
by tialaramex (subscriber, #21167)
[Link] (1 response)
The thing for WG14 to have done was land the fat pointer types for, say, C99. I think that would have been one of those "controversial at the time but rapidly proved correct" choices, like removing gets in C11. If you (or WG14 as a whole) make that happen in C2b that'd be welcome, but it's very late; I do not expect I would make any use of it, as the decades of C programming are behind me.
The Pascal-style length-prefixed string is completely orthogonal to the fat-pointer string slice. That's why, if I look inside a Rust binary, it has text like "EPERMENOENTESRCHEINTREIOENXIOE2BIGENOEXECEBADF" baked inside it: there aren't any zero terminators _or_ length prefixes; neither is needed to refer to EPERM or E2BIG in that text with a slice.
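The point can be seen directly in a few lines of Rust. The offsets below are computed from the quoted text itself; each name is just a pointer-plus-length view (`&str` slice) into one shared buffer, with no terminator or prefix stored anywhere (the `ERRNO_NAMES` name is mine; in a real binary such slices come from ordinary string literals the compiler has merged):

```rust
// One run of text with neither NUL terminators nor length prefixes.
static ERRNO_NAMES: &str = "EPERMENOENTESRCHEINTREIOENXIOE2BIGENOEXECEBADF";

fn main() {
    // Each slice is (pointer into ERRNO_NAMES, length) and nothing more.
    let eperm: &str = &ERRNO_NAMES[0..5];    // bytes 0..5  spell "EPERM"
    let e2big: &str = &ERRNO_NAMES[29..34];  // bytes 29..34 spell "E2BIG"
    println!("{eperm} {e2big}");             // prints "EPERM E2BIG"
}
```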
Posted Apr 27, 2025 10:29 UTC (Sun)
by wahern (subscriber, #37304)
[Link]
Notably, Ritchie's proposal lacked variable-size automatic-storage arrays. With the VLA proposal, you can declare arrays on the stack: int a[n][m];. In Ritchie's proposal you were required to use malloc (or alloca?): int (*a)[?][?] = (int (*)[n][m]) malloc(n * m * sizeof (int));. I don't think this was an intrinsic limitation; I think Ritchie just objected to the ambiguity of how and when sizeof evaluated the integral expression(s) used in the type definition (i.e. n and m above). MacDonald's paper mentions that the compiler would have to cache the evaluated value so it reflected the value of the type at its declaration; subsequently modifying n or m wouldn't change the result of sizeof. Except when using VLAs in parameters, or with variable-length structures (part of the original proposal), this wasn't quite true (or was true but irrelevant), something fat pointers avoid.
It would have been better if we had ended up with Ritchie's proposal. But I surmise (based on those papers alone) that MacDonald's won the day because 1) it was easier to implement--no ABI changes nor introduction of fat pointers into the compiler architecture; 2) GCC seemed to already have most of the implementation, albeit with a slightly different syntax; 3) Ritchie gave short shrift to declaring automatic-storage variable-size arrays; 4) MacDonald was working at Cray, so presumably was deemed to speak for pressing industry demands. #3 and #4 seem especially pivotal given everybody's preoccupation with numerical computing rather than bounds safety per se.
Posted Apr 25, 2025 16:54 UTC (Fri)
by khim (subscriber, #9252)
[Link]
> I think parent is right. Having a fat pointer means you embed some hidden information and this is what is not "low-level."
That's just nonsense. By that reasoning we shouldn't have structs, opaque pointers, or any other way of “hiding” information… yet that's literally the bread and butter of any large project, including the Linux kernel. You need something-else-not-C for that, though. We have all that stupidity because C literals behave like they do, and without changing the language one couldn't fix that problem. Yeah, that, too. Rust got saner strings than C because it started from scratch, while C++ is a mess because it hasn't. The question of “how often a new language has to be introduced” is a good one, but it feels as if the right answer is “somewhere between every 10 years and every 20 years, with all languages being supported for about 3-5x as long”. Certain things simply do need a full rewrite on new foundations, with certain critical fixes embedded… and yet the only way to do that efficiently is the well-known “one funeral at a time” way, which means languages have to change with generations… and those come around roughly once every 15 years.
