> It's 2012. Why is a security critical application written in a language that tolerates integer overflow/underflow?
Even *new languages* such as D (and it isn't the only one) don't handle integer overflow/underflow correctly..
And you don't even need to change the language: you can "patch" C to check for overflow, but it isn't done by default: http://blog.regehr.org/archives/715
> Is performance so critical on today's hardware that we can't afford a runtime environment that checks for these kind of stuff?
Probably not: remember that we also happily use very slow languages..
I blame mental inertia: C did it this way, so we should do it this way too, never mind that nowadays the bottleneck is usually memory access time, not the CPU..