> No, that would be a violation of the ODR. Structures and definitions
> cannot change after they've been used.
It's only a violation of the ODR if the differing definitions have external linkage and none of them are weak symbols.
There are actually a lot of macros that change the behavior of standard headers. _GNU_SOURCE, _BSD_SOURCE, and _SVID_SOURCE are three popular ones.
You seem to be confused about how include files work in C and C++. The way they work is that each translation unit (that's a .cpp file to you) has to scan through every file it includes, recursively. There are no shortcuts, and the compiler cannot cache this work across translation units.
The reason I said it was O(n^n) is that n^n is the upper bound on the time complexity. Remember that you can include .c or .cpp files. In reality, most projects' compilation times grow more slowly than this, but they are still exponential in the number of files, and the compile times seen by real-world projects like WebKit reflect this.
> One can write well defined
> modules and interfaces in both languages. One can write poorly structured
> code in both languages.
I agree. A good programmer can write good code in any language. A bad one can write Vogon poetry in any language.
There are a lot of projects I like and respect that use C++: LLVM, OpenCV, Ceph, WebKit, and many others. C++ will be around for a long time. For new projects, however, I would encourage people to look at newer languages like Google Go. Progress hasn't stood still, and we have learned some things since the early nineties. I swear!