Paul Graham wrote about the use of Lisp at NASA. For Java, see http://javapathfinder.sourceforge.net/ - that's a framework for formal verification of Java programs. I don't have a link about its use on spacecraft, unfortunately.
Oh, and the Spirit and Opportunity rovers run VxWorks without any memory protection (it's available in VxWorks but is rarely used).
>Sure. But the same can be said about JVM and CLR - and they don't even try to offer full capabilities of full-blown OS!
Mostly because they are implemented in unmanaged languages and try to do a lot of work in software that should be done in hardware - which, as you've said, benefits from economies of scale and is designed by hordes of Intel engineers.
>As the saying goes: memory protection is the worst form of security... except all the others that have been tried.
So how come memory protection can't protect us from flaws in JVM/CLR/Flash?
>Sorry, but no cigar. Traditional supercomputers excels on these tasks. The only problem: these beasts were so expensive most researches dropped them and learned to use much "worse" architecture to do real work.
Well, OK. But traders still have a requirement to access A LOT of data with the least possible latency, and that calls for an architecture that modern [super]computers are ill-suited to provide.
But we might be moving in that direction even with general-purpose computers. It's quite possible that in 10 years we'll have practical phase-change non-volatile RAM, which would make Linux and other traditional OSes obsolete. That's another reason I find managed OSes interesting.
>Sorry, but you, again, are rewriting history. PC had no graphic acceleration initially. It got it in 1991 (with S3 86C911) and kept it ever since. Sure, today we don't have 2D acceleration in hardware - but that's because 3D units can emulate it quite efficiently.
The Commodore 64 and the original Apple Mac had it, though, as did tons of other architectures. If you read the history, the X11 server was first designed for a machine with full graphics acceleration (hardware vector drawing primitives, a hardware mouse cursor, etc.) and only then ported to a machine with a dumb framebuffer.
And yet we've had a resurgence of graphics accelerators once it became clear that CPUs couldn't keep up with graphics workloads.
>Yeah, I've seen and heard tales. Sure, some hiccups happened, but somehow the only country where cyberwar managed to destroy significant value is Iran - which kinda makes it all less scary than you want to imply.
SQL Slammer was a walk in the park: it infected only a few tens of thousands of machines, and yet it managed to bring down the networks of whole countries. Several million infected routers could kill the whole Internet for a period of time.
Just imagine: a new worm starts infecting routers, making them slam the network with continuous scans at full speed. Within minutes you could have millions of routers infected. Such a scenario is unlikely, but it's certainly possible - especially since router hardware is slowly converging on just a few platforms.
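The "millions in minutes" intuition is just logistic growth, the same math used in analyses of random-scanning worms like Slammer. Here's a toy discrete-time sketch; the vulnerable population and per-host scan rate are made-up illustrative numbers, not measurements of any real worm:

```python
# Toy model of a random-scanning worm over the IPv4 address space.
# Each infected host scans random addresses; a scan that lands on a
# still-vulnerable host infects it. All parameters are assumptions.

ADDRESS_SPACE = 2**32       # IPv4 addresses being scanned at random
VULNERABLE = 2_000_000      # assumed vulnerable routers (illustrative)
SCANS_PER_SEC = 100         # scans per second per infected host (illustrative)
DT = 1.0                    # time step, seconds

def simulate(seconds):
    """Return infected-host counts per time step, starting from one host."""
    infected = 1.0
    history = []
    for _ in range(int(seconds / DT)):
        susceptible = VULNERABLE - infected
        # chance a single random scan hits a still-vulnerable host
        hit_rate = susceptible / ADDRESS_SPACE
        new_infections = infected * SCANS_PER_SEC * hit_rate * DT
        infected = min(infected + new_infections, VULNERABLE)
        history.append(infected)
    return history

if __name__ == "__main__":
    h = simulate(600)  # ten minutes
    for t in (60, 300, 600):
        print(f"t={t:>3}s  infected≈{h[t - 1]:,.0f}")
```

With these numbers the curve stays nearly flat for the first minute and then saturates the whole vulnerable population in a few minutes - the classic slow-start/explosion shape of a scanning worm.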