> Implicitly generated code hurts. Default constructors hurt. Default parameters hurt even more. You might disagree, but I've had a lot of time to come to this opinion.
Sorry, but C++'s poor mix of features doesn't mean that those features are always bad.
Plus, C's lack of initialisation also hurts a lot, no?
Object-oriented design patterns in the kernel, part 1
Posted Jun 9, 2011 16:56 UTC (Thu) by jd (guest, #26381)
Well, no. Implicit code is (IMHO) always a Bad Thing because when the specs or compiler change, the resultant binaries will change behaviour when given the same source.
The source is a specification document, to all practical intents and purposes, which the compiler uses to generate a program. If the program can change in nature for the same specification, the specification is incomplete and insufficient.
In other words, by depending on something external (in this case, the compiler) you cannot do reliable testing. There are too many external parameters that you can never reliably take account of. Good engineering practice is to always work to reduce or eliminate the unknowables. And if the compiler is any good, explicitly initializing to what would have been the compiler's default costs nothing: the instructions the compiler would have generated implicitly are the same ones your explicit initialization produces, so the redundancy is elided. If the compiler isn't any good, you really shouldn't be trusting it to do the Right Thing anyway.
If you explicitly initialize, you always know the state of a variable, no matter what new C or C++ standard is produced. This is guaranteed safe.
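To make that concrete, here is a minimal C++ sketch (the struct and field names are mine, purely illustrative):

    // Explicit initialization pins down the state in the source itself,
    // independent of which C or C++ standard the compiler implements.
    // If the explicit values match the compiler's own defaults, a decent
    // optimizer elides the redundant stores, so the explicitness is free.
    struct buffer {
        char         *data;
        unsigned long len;
    };

    buffer make_buffer() {
        buffer b = { 0, 0 };  // explicit: known state under every standard
        return b;
    }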
Compiler-tolerant software is necessarily well-engineered software. Yes, it's more work, but if you wanted to avoid work, you'd not be using C or C++, you'd be using a fourth- or fifth-generation language instead. The only reason to use anything in the C family is to be able to balance development efficiency with code efficiency. Higher-level languages offer better development efficiency; lower-level ones (Fortran, assembly, etc.) offer better code efficiency. Neither is useful on its own for something like an OS kernel, which is why nobody writes general-purpose kernels on the scale of Linux in either Erlang or IA64 assembly.
(Kernels do exist in both of those, and indeed in Occam, but because of the tradeoffs they are almost always much more special-purpose.)
Object-oriented design patterns in the kernel, part 1
Posted Jun 13, 2011 0:20 UTC (Mon) by cmccabe (guest, #60281)
> Plus C's lack of initialisation also hurt a lot, no?
I remember a thread here on LWN a while back where people were complaining about a performance regression in the kernel caused by zeroing struct page. So it seems that at least in some corner cases, not initializing memory is a feature. In a perfect world, non-initialization probably should be something that the programmer has to ask for specifically, rather than the default. But C was designed in the 1970s-- give it a break already.
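One way to picture that inverted default is a hypothetical C++ sketch along these lines (the uninit tag and cell template are inventions for illustration, not any real library or kernel interface):

    // Hypothetical opt-out scheme: values are zeroed unless the
    // programmer explicitly asks for uninitialized storage.
    struct uninit_t {};
    const uninit_t uninit = {};

    template <typename T>
    struct cell {
        T value;
        cell() : value() {}           // default: value-initialized (zero)
        explicit cell(uninit_t) {}    // opt-out: value left indeterminate
    };

    void example() {
        cell<int> counter;            // safe by default: starts at 0
        cell<int> scratch(uninit);    // hot path: deliberately uninitialized
    }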
When C++ was designed, in the mid-1980s, the decision was made to keep all the old uninitialized variable behavior and add some more. So if you create a C++ class with some primitives, and forget to initialize them in one of the constructors, they'll be uninitialized. This also applies to copy constructors. The rationale was that programmers shouldn't have to pay for what they didn't use.
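A short sketch of that trap (the class and member names are made up):

    #include <string>

    class widget {
        std::string name;   // class type: its constructor always runs
        int         count;  // primitive: stays garbage unless listed below

    public:
        widget(const std::string &n, int c) : name(n), count(c) {}
        widget() : name("anonymous") {}   // bug: count is never initialized
    };

A hand-written copy constructor that copies name but forgets count has exactly the same problem.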
Luckily there is some relief in sight, in the shape of Coverity and similar static analysis tools. These can catch most uses of uninitialized variables.
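The compilers themselves catch the simple cases, too; GCC and Clang both warn about code like the following under -Wall (via -Wuninitialized, which in older GCC needs optimization enabled):

    // g++ -Wall -O2: warning: 'factor' may be used uninitialized
    int scale(bool doubled) {
        int factor;          // no initializer
        if (doubled)
            factor = 2;      // the !doubled path never sets it
        return factor * 10;
    }

Tools like Coverity add interprocedural analysis on top of this, tracking values across function calls, which is where the compiler warnings tend to give up.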