> GC is great for batch-processing, but real-time? It does not work in practice beyond toy samples.
Hmmm... Been there, done that: an embedded system running an RTOS and RT Java, driving an avionics MIL-STD-1553 bus at 50Hz with <30ms latency under heavy load for many hours. Very non-trivial (several hundred thousand LOC).
Yes, we had to pay attention to memory allocations in the time-critical code paths, but that was actually pretty easy. And yes, we used the real-time GC that RT Java gave us, essentially running in the background.
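To make the "pay attention to allocations in the time-critical paths" point concrete, here is a minimal sketch of the standard trick: pre-allocate an object pool at startup and only borrow/return in the hot loop, so the steady state produces no garbage. The `BusMessage` type and pool API are my own illustration, not the actual system's code.

```java
import java.util.ArrayDeque;

public class MessagePool {
    // Hypothetical fixed-size buffer for a 1553-style message.
    static final class BusMessage {
        final byte[] payload = new byte[32];
        int length;
    }

    private final ArrayDeque<BusMessage> free = new ArrayDeque<>();

    public MessagePool(int capacity) {
        // All allocation happens here, once, outside the time-critical path.
        for (int i = 0; i < capacity; i++) {
            free.push(new BusMessage());
        }
    }

    public BusMessage acquire() {
        // Failing loudly beats allocating under a real-time deadline.
        BusMessage m = free.poll();
        if (m == null) throw new IllegalStateException("pool exhausted");
        return m;
    }

    public void release(BusMessage m) {
        m.length = 0;   // scrub before reuse
        free.push(m);
    }

    public int available() {
        return free.size();
    }
}
```

The 50Hz loop then only calls `acquire()`/`release()`, which shuffle pre-allocated objects around without ever invoking `new`.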
And we had to tune some other parts of the code where some programmers had made dreadful data structure choices (which worked fine under low utilization, but didn't scale). But mostly things worked, and we didn't really pay much attention to memory allocations across the majority of the application. No stop-the-world pauses, no burps in latency (there were some before we tuned).
The key to the whole thing was having enough memory available so the GC could come along behind and clean up without having to stop the rest of the system from running. Similar tests on a much more memory-constrained device did not scale as well, but note: it is just a matter of scaling.
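The "enough memory" requirement can be put as a rough back-of-envelope (my numbers, not from the original system): a concurrent collector keeps up only if free headroom covers what the application allocates during one GC cycle, plus a margin for bursts.

```java
public class GcHeadroom {
    // Minimum free-heap headroom (MB) so the application cannot outrun
    // the collector in the middle of one concurrent GC cycle.
    public static double minHeadroomMb(double allocRateMbPerSec, double gcCycleSec) {
        return allocRateMbPerSec * gcCycleSec;
    }

    public static void main(String[] args) {
        // Illustrative figures: 20 MB/s of allocation and a 2 s concurrent
        // cycle need at least 40 MB of spare heap, before any safety margin.
        System.out.println(minHeadroomMb(20.0, 2.0));
    }
}
```

Shrink the heap below that and the collector has to stop the mutator to catch up, which matches the behavior seen on the memory-constrained device.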
I'd done similar things on Windows with C/C++ and all the explicit memory management, so I'm very aware of the trade-offs. I'll take the GC thank-you-very-much, and RT GC does work just fine, IMO.
Copyright © 2017, Eklektix, Inc.