Possible advantage
Posted Jun 11, 2008 0:38 UTC (Wed) by tialaramex (subscriber, #21167) In reply to: Implications of pure and constant functions by nix
Parent article: Implications of pure and constant functions
I'd say, from a little experience, that it's an advantage during debugging. The closer the resemblance between source and executable, the more chance you have of understanding what you're seeing in the debugger. If, for example, you used an unnecessary temporary, the debugger cannot show you the value of that temporary. If you call a side-effect-free function like strlen() several times, the actual code may call it just once, meaning that breaking on entry to strlen() will not do what you expect.

I recently deleted some code which read something like as follows:

    int cache[CACHE_WIDTH];
    if (!cache) {
        log_critical("Could not allocate cache");
    }

A naive programmer might be quite surprised to see his debugger skip the last three of those lines during single stepping, but in reality not a single byte of code was emitted or executed for them due to a trivial optimisation.
Posted Jun 13, 2008 16:17 UTC (Fri) by giraffedata (guest, #1954) [Link] (2 responses)
Hear, hear. There are two distinct ways to look at a program: 1) instructions to a computer; 2) description of the solution of a computational problem. The primary audience for (1) is a computer; for (2) it's a human. In view (2), a compiler's job is to produce a machine program that computes the solution described by the source code. A lot of programmers like to do that work themselves, but I think that is an inefficient use of brain power (for everyone who works on that code).
That's definitely my experience. But there is a middle ground. I write human-oriented code and let the compiler do its job normally. But when I debug at the machine level I add -O0 to the compile options. That's usually described as "don't optimize", but as I consider optimization to be an integral part of the compiler's job, I view it as, "Make the machine program track the source code as closely as possible."
Posted Jun 17, 2008 21:38 UTC (Tue) by roelofs (guest, #2599) [Link] (1 responses)
"...and watch your bug go away." :-)
That seems to happen to me more often than not. Most recently it turned out to be a gcc code-generation bug, 32-bit only, 3.4.x only. (Pretty sure not related to const, though C++ is a different beast, so I'm not 100% certain.)
Greg
Posted Jun 18, 2008 2:00 UTC (Wed) by giraffedata (guest, #1954) [Link]
Of course, this is mostly with older compilers. (The new ones are too slow for me.)
Possible advantage
"The closer the resemblance between source and executable, the more chance you have of understanding what you're seeing in the debugger."

I don't really see any advantage of making the code resemble the output of the compiler.
Possible advantage
Of all the bugs I've analyzed with GDB, I'd say about 5% stop manifesting when I disable optimization. One was a compiler optimizer bug, one was me lying to the compiler (unintentionally, of course), and the rest were wild pointers causing stack corruption.