For web servers, GC languages already dominate.
Err, no. Yaws is about the only major Web server that isn't written in C. And that's entirely because C is the only thing that performs well enough.
For the CGI-like "backend", yes, GC languages dominate, mainly because performance isn't as big a problem there and the real web server can take actions to limit the performance problems of the GC'd language code. Also, the person running the application is often closely tied to the person writing it, and so they can throw money at their users' performance problems.
On the desktop, I can't think of a single GUI app that I would rather write (or see written) in C or C++ instead of one of the languages mentioned above.
There are still very few GUI applications that aren't written in C, and again it's mainly because of performance and memory usage. For instance, the "revelation GNOME applet" currently has a virtual size of ~125MB, with an RSS of 32MB; this is a Python application that provides a single text entry and an icon on my panel ... and it is far from unique. About the only major GC'd application I use is xemacs, and all too often I have to restart it because its memory usage spirals out of control (and I wouldn't call it fast).
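(For anyone who wants to check numbers like these themselves, a minimal sketch on Linux; "revelation" is just the process name from the example above, substitute whatever you're measuring:)

```shell
# Virtual size (VSZ) and resident set size (RSS), in KB, for a named process:
ps -o vsz=,rss= -C revelation

# Or read them straight out of /proc for any PID (here, the current shell):
awk '/^VmSize:|^VmRSS:/ {print $1, $2, $3}' /proc/self/status
```

Note that VSZ counts every mapped page (shared libraries included), so it overstates the cost of any one process; RSS is closer to what the application is actually holding in RAM.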
It just makes sense to have the inexpensive computers do the extra work instead of the expensive programmers.
That is wrong in two ways: 1) The computers are now not doing real work for their users; instead they are doing busy work for the programmers (on the users' time). 2) Doing it properly is often not that expensive for a good programmer, who already has to manage other resources. But, yes, users are often still letting programmers charge them millions of units of work in exchange for not having to do a single unit themselves. I doubt any economy can make this sustainable long term, and you only have to look at people using dillo and/or lighttpd to see the choices being made.
Copyright © 2017, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds