> Companies routinely run Linux and Unix on tens of thousands of servers, and keep all of them up to date with all the right patches. Why would that same infrastructure not work for Linux desktops?
Because it's much harder to do for desktops. And I've actually written my own cluster computing system for Amazon EC2 that runs more than 2000 nodes (Linux, of course) at peak times.
For servers it's easy - you create config files and start the required services. Easy peasy lemon squeezy.
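To make that concrete, "config files plus services" in Puppet is roughly this - a minimal sketch, and the package, path, and service names here are illustrative assumptions, not a real module:

```puppet
# Hypothetical manifest: install a package, push its config, keep the service up.
package { 'nginx':
  ensure => installed,
}

file { '/etc/nginx/nginx.conf':
  ensure  => file,
  source  => 'puppet:///modules/nginx/nginx.conf',  # assumed module layout
  require => Package['nginx'],
  notify  => Service['nginx'],  # restart on config change
}

service { 'nginx':
  ensure  => running,
  enable  => true,
  require => Package['nginx'],
}
```

That's the whole server story: declare the end state, let the agent converge every machine to it. There's no interactive step anywhere, which is exactly what breaks down on desktops.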
It's much more difficult for desktop software. Puppet has lots of useful templates for servers, but almost nothing for desktops. For example, some HP printers require an hplip setup that can only be done interactively. Fail.
> apt and yum both provide you with all the info you need to keep your systems on the software that _you_ want them to be on (which isn't necessarily the latest and greatest that's been released, you can trivially run your own repositories that only contain approved software and everything can trivially update from there)
So now we're talking about running your own repositories and vetting every change that goes into them. That'll require at least one $100k-a-year high-level sysadmin.
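And "trivially run your own repositories" hides real work. A bare-bones private apt repo looks something like this - a sketch assuming `dpkg-dev` is installed, `/srv/repo` is served over HTTP, and `repo.internal` is a hypothetical hostname:

```
# Publish a directory of vetted .deb files as a flat apt repository.
mkdir -p /srv/repo
cp approved-packages/*.deb /srv/repo/
cd /srv/repo
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

# Clients then point at it with a one-line sources entry:
# deb [trusted=yes] http://repo.internal/ ./
```

The commands are the easy part; the $100k sysadmin is the person who decides which packages are "approved", re-tests them on every upstream release, and signs the repo properly instead of using `[trusted=yes]`.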
On Windows it can be done by an MCSE (Minesweeper Consultant, Solitaire Expert). They probably won't understand how this devilish Active Directory works, but it'll work well enough.
That's the problem with Linux - you HAVE to have a solution that can be deployed by an average technician. And right now the only way to do that with Linux is to restrict functionality to a known-good set (Chromebooks, various set-top box devices).