Optimization-unstable code
Posted Dec 6, 2013 1:40 UTC (Fri)
by dashesy (guest, #74652)
In reply to: Optimization-unstable code by mti
Parent article: Optimization-unstable code
A compiler flag to warn any time the compiler optimizes something on the assumption that the alternative is undefined would be helpful too. Right now compilers are actually more than happy when I accidentally trigger one of these undefined behaviors, because they can just simplify the whole thing to the equivalent of return 0; and produce code that is much faster, although unexpected and thus useless.
As far as compiler writers are concerned, their paycheck depends on how much faster (or smaller) the compiled binary is, while mine requires the binary to work first. The concept of correctness depends on perspective, I guess.
Posted Dec 6, 2013 3:36 UTC (Fri)
by nybble41 (subscriber, #55106)
[Link] (2 responses)
It's easy to look at optimizations based on undefined behavior as merely making the compiler authors' jobs easier, but that isn't really the case. The compiler is just doing the best it can under the assumption that the programmer wrote what he or she meant, meaning that any undefined cases are really "don't cares", i.e. will never happen. A good example is NULL pointer dereferences: if you dereference a pointer without first checking that it isn't NULL, the fact that the NULL dereference case is undefined means that the compiler can take this as a hint that you know something it can't: that the pointer won't be NULL at that point in the program. Signed overflow is another such case, since the fact that it's undefined behavior allows the compiler to assume no overflow will take place and optimize loops which it would otherwise be unable to optimize due to the need to handle the overflow.
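As a sketch (not from the comments themselves; the function and struct names here are invented for illustration), both hints play out in C like this:

```c
#include <stddef.h>

struct device { int refcount; };

/* NULL-dereference hint: the unconditional dereference tells the
 * compiler that dev cannot be NULL afterwards, so it may legally
 * delete the later check as dead code. */
int get_refcount(struct device *dev)
{
    int n = dev->refcount;   /* UB if dev == NULL, so assume dev != NULL */
    if (dev == NULL)         /* dead code under that assumption */
        return -1;
    return n;
}

/* Signed-overflow hint: because i + 1 overflowing is undefined, the
 * compiler may assume the counter never wraps, which lets it reason
 * about the trip count and, e.g., vectorize the loop. */
long sum_upto(int n)
{
    long s = 0;
    for (int i = 1; i <= n; i++)
        s += i;
    return s;
}
```

With optimization enabled, the NULL check in get_refcount typically disappears from the generated code entirely.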
A compiler is expected to produce correct code first and foremost, not just code which is small and/or fast. However, it can only generate correct code when it's given correct programs. A correct program won't depend on undefined behavior.
Posted Dec 13, 2013 1:29 UTC (Fri)
by pjm (guest, #2080)
[Link] (1 responses)
A path towards avoiding costs of undefined behaviour optimizations

The danger of this “treat definedness as knowledge of data values” approach is of course that the resulting compiler behaviour (and the behaviour of compiled binaries) makes C/C++ a more dangerous choice of language. Runtime efficiency is a significant reason why people choose C/C++, but it's worth looking for other approaches that provide that efficiency without the costs in security bugs and obstacles to debugging (the near-impossibility of reasoning about program behaviour when ‘if’ blocks can be optimized away).

A more explicit approach to conveying that information to the compiler would be something more along the lines of ‘__builtin_assume(tun != NULL)’. It could be wrapped in a macro that tests for compiler version and falls back to expr_ ? 0 : 1 << -1 (suitably parenthesized and void-cast), at least for the case that expr_ has no side effects. This could even be built into an assert-like macro in the NDEBUG case, so that such assertions provide both a testing benefit without NDEBUG and a speed benefit with NDEBUG. Similarly, it mixes well with the design-by-contract approach of specifying all the preconditions of a function.

[Programmer-specified preconditions also give a path to overcoming those concerns about how to warn: if the program claims that the given preconditions are complete, and the compiler determines that the given preconditions don't imply tun != NULL, then the compiler can issue a warning or error as soon as it sees tun->sk. It requires work, but the payoff in debugging time and bug cost can be worthwhile.]

By itself, an ‘assume’ facility provides only optimization rather than undefinedness-bug prevention; but by providing an alternative, it makes it more reasonable to change compiler behaviour to make C/C++ a less dangerous choice of implementation language.

Posted Dec 13, 2013 3:25 UTC (Fri)
by dashesy (guest, #74652)
[Link]

Optimization-unstable code

> the fact that the NULL dereference case is undefined means that the compiler can take this as a hint that you know something it can't: that the pointer won't be NULL at that point in the program

Undefined behaviors in C are like a minefield: there are many of them, and programmers learn them by heart only after hitting them hard. The compiler does not help in teaching about them, and reading alone is not enough because they are very subtle. If a warning were issued whenever such an optimization kicks in, one would learn them and could also opt in to the optimization when it is known to be safe.
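The assert-like ‘assume’ macro pjm describes might look something like the sketch below; the ASSUME and CHECKED_ASSUME names are invented here, and the version tests are only illustrative:

```c
#include <assert.h>
#include <stddef.h>

/* Portability shim: on compilers without __has_builtin, treat it as 0. */
#ifndef __has_builtin
#  define __has_builtin(x) 0
#endif

#if __has_builtin(__builtin_assume)
#  define ASSUME(expr_) __builtin_assume(expr_)
#elif defined(__GNUC__)
#  define ASSUME(expr_) ((expr_) ? (void)0 : __builtin_unreachable())
#else
/* Fallback from the comment: the false branch is undefined (1 << -1),
 * so the compiler may assume expr_ holds. Only valid when expr_ has
 * no side effects. */
#  define ASSUME(expr_) ((void)((expr_) ? 0 : 1 << -1))
#endif

/* Assert-like wrapper: a checked assertion without NDEBUG (testing
 * benefit), an optimization hint with NDEBUG (speed benefit). */
#ifdef NDEBUG
#  define CHECKED_ASSUME(expr_) ASSUME(expr_)
#else
#  define CHECKED_ASSUME(expr_) assert(expr_)
#endif

int deref_example(int *tun)
{
    CHECKED_ASSUME(tun != NULL);  /* compiler may drop later NULL checks */
    return *tun;
}
```

In a debug build a NULL argument aborts at the assertion instead of silently invoking undefined behaviour; in an NDEBUG build the same line becomes pure optimization fodder.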