
Who's afraid of a big bad optimizing compiler?

Posted Jul 26, 2019 18:18 UTC (Fri) by andresfreund (subscriber, #69562)
In reply to: Who's afraid of a big bad optimizing compiler? by jerojasro
Parent article: Who's afraid of a big bad optimizing compiler?

> I know, the first thing the article says is: "the C standard grants the compiler the right to make some assumptions that end up causing these weird/non-obvious things if you don't guard against them", but I must ask: why can't the compiler grow an optimization mode that does the right thing (and thus avoids all of the issues documented here) in the presence of global variables and concurrent execution? That situation (globals + concurrency) is so common that solving it in the compiler would benefit most, if not all, C users, in both kernel and user space.

I'd say that C11/C++11 made a large step in that direction by formalizing the memory model and adding built-in atomics. Before that, there really was no way to get correctly working (even though formally undefined) concurrent programs without relying on compiler implementation details.

It does, however, require you to actually use the relevant interfaces.
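
To make that concrete, here is a minimal sketch (mine, not from the thread) of what using those interfaces looks like for the classic "stop flag" case; the function names are made up:

    #include <stdatomic.h>
    #include <stdbool.h>

    void do_work(void);                          /* hypothetical work function */

    static atomic_bool stop_requested;           /* shared between threads */

    /* Called from another thread to ask the worker to stop. */
    void request_stop(void)
    {
        atomic_store_explicit(&stop_requested, true, memory_order_release);
    }

    void worker_loop(void)
    {
        /* With a plain bool the compiler could hoist the load out of the loop;
         * the explicit atomic load forbids that and orders the accesses. */
        while (!atomic_load_explicit(&stop_requested, memory_order_acquire))
            do_work();
    }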

I don't quite see how you'd incrementally get to a language that doesn't have any of these issues without making it close to impossible to ever incrementally move applications towards that hypothetical version of C. There's basically no language that lets you use shared memory without requiring escape hatches from its safety mechanisms to implement fast concurrent data structures (e.g. Rust needing to go to unsafe for core pieces). And the languages that come closest require a different enough approach that it's hard to imagine C moving towards it.

That's not to say that C/C++ have progressed far enough towards allowing one to at least opt into safety. The C11/C++11 memory model and the atomics APIs are a huge step, but they came at the very least 10 years too late (while some of the formalisms were developed somewhat more recently, there ought to have been at least some progress before then). And there are plenty of other issues for which no proper facilities are provided (e.g. signed integer overflow handling, mentioned in nearby comments).
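
As an illustration of that overflow point (again just a sketch of my own, not something from the comment): before C23's <stdckdint.h>, standard C offers no checked signed arithmetic, so code typically falls back to manual range checks or a compiler extension:

    #include <limits.h>
    #include <stdbool.h>

    /* Hypothetical helper: report overflow instead of invoking undefined behaviour. */
    static bool checked_add(int a, int b, int *result)
    {
    #if defined(__GNUC__) || defined(__clang__)
        /* GCC/Clang extension, not standard C. */
        return !__builtin_add_overflow(a, b, result);
    #else
        /* Manual pre-check; the addition is only performed when it cannot overflow. */
        if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
            return false;
        *result = a + b;
        return true;
    #endif
    }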



Who's afraid of a big bad optimizing compiler?

Posted Jul 29, 2019 16:44 UTC (Mon) by PaulMcKenney (✭ supporter ✭, #9624)

I would say 30 years too late rather than just 10, but yes. There was a surprisingly large amount of concurrent C code written during that 30 years prior to C11. It is only natural for people to want to discount that code as "broken" so it can be ignored, but some of that code is rather important. The problem is that we do not have a reasonable way to identify the problem areas, even for the small fraction of that code for which source code is publicly available.

So there are great opportunities for innovations in the area of locating old concurrent C code in need of an upgrade! :-)

