Some interesting publicity
The first claim - that a given Linux server gets more updates than a given Windows server - could at least be verified. Whether the figure means anything is another story. Updates to a Linux system cover the vast array of packages available there. Many of them result from active code audits and fix obscure problems that are difficult to exploit. Of the large number of security problems fixed by Linux distributors each year, it is a good bet that most of them are never exploited to compromise even a single system. How many systems have you encountered that are threatened by any of these recently-patched problems?
- The Hangul Terminal vulnerability ("Since it is not possible to embed a carriage return into the window title the attacker would then have to convince the victim to press 'Enter' for it to process the title as a command...")
- Insecure temporary files in gzip. It is a local vulnerability, but the chances of it being used are very small.
- The file vulnerability, which requires an attacker to convince the system administrator to run "file" on a specially-crafted file.
...and so on. It is good that these problems are being fixed, but they do not threaten most users. The updates to that Windows system, by contrast, are far more likely to be addressing serious vulnerabilities that are being actively exploited.
The second claim in the TechWeb article ("many of the attacks aimed at Windows
vulnerabilities are written by Linux experts") requires a response. How,
exactly, did they come by this information? It is, after all, rare for
authors of malware to include their resumes with the code. This statement
is pure slander which has been reported as fact. One can only hope that a
correction will be forthcoming.
Posted Jun 12, 2003 1:30 UTC (Thu) by proski (subscriber, #104)
Posted Jun 12, 2003 13:10 UTC (Thu) by smoogen (subscriber, #97)
With all the multi-libbing that Linux has now, many a Linux box faces a similar issue. You update that XYZ package, but with all the apps running on your system still linked to the old libXYZ, you either have to restart them or find them a bit flaky or insecure, depending on what happens next. My old IBM/Sun/HP manuals tell me that I need to put the box into single-user mode to apply patches, and that for patches affecting system libraries a reboot may be needed for best functioning. At the core of things, the real factor is not the uptime between reboots but the amount of time during which you can perform the work you need to. If somehow Windows could do this better than a Linux box, then the user might not care if he has to reboot once a day.
Posted Jun 12, 2003 13:21 UTC (Thu) by Ross (guest, #4065)
Posted Jun 13, 2003 21:09 UTC (Fri) by esh (guest, #140)
Posted Jun 12, 2003 7:23 UTC (Thu) by Dom2 (guest, #458)
-Dom
Posted Jun 12, 2003 8:29 UTC (Thu) by AnswerGuy (guest, #1256)
Damned statistics
Clearly authors can pick their conclusion and generalize from some anecdotal statistics to support whatever they want to say. One particular Linux machine required "three times as many" updates as a particular "Windows platform" (whatever that means). What were the usage and exposure profiles of the systems in question? My multipurpose Linux web, DNS, FTP, and mail server (with VPN and ssh functions), sitting on the "bastion" segment exposed at some branch office or partner site, vs. some internal dedicated print server?
This doesn't even ask what they were counting: five specific RPMs vs. two "jumbo patch kits" from MS? Patch kits, each consolidating 20 fixes (and 10 new bugs).
If the discrepancy had been the other way, the author could have just asserted that Linux exposures weren't being fixed as rapidly, or some such.
Of course all of this is the problem with relying on the media for information. The media is concerned with finding sound bites and keeping eyes scanning over more advertisements. There is an imperative to "find a story" and to "find an angle" even if there is no real story to be had and even when the only "angle" available is a straightforward recounting of facts.
In this particular case the excerpt is unfortunately indicative of the whole article. It's about four lines of text in their rendering, out of only 20 lines total. So this one paragraph is 20% of the article, and almost nothing is said in the rest of it.
Basically I can sum the story up thus: Microsoft continues to pay lip service to security issues by publishing "certification standards" and announcing "streamlining efforts", while they and 10 other companies have also published some bureaucratic standards that should delay full disclosure by at least 30 days in most cases, under the dubious aegis of the OIS.
Two potentially useful statistics: 80 security patches per year; 95% of attacks occur after patches for those exploits have already been released. Of course these numbers aren't credible from this source, and no citations for them are evident.
I read this sort of story as: "Nothing happened in the field of Linux and MS Windows security today; Journalists forced to babble about meaningless anecdotes as if they contain relevant metrics; More non-news at Eleven"
Another difference that should be considered is the number of reboots required to apply all patches, and the number of "exclusive" updates that cannot be performed together with other updates. Also, it would be interesting to compare the number of packages that change their license during an update.
Some interesting publicity
The 'it's fewer reboots' argument is actually getting to be a misnomer from us old dinosaurs. Many of the Windows updates can be applied without rebooting the machine; the reboot is there to be absolutely sure there aren't any routines running that depend on old DLLs in memory.
Reboots and Linux/Windows
At least on Linux, fuser(1) makes it easy to tell which processes are using the old library.
Reboots and Linux/Windows
On many (most?) Linux systems you can put lsof(8) to good use. "lsof +L1" - sans quotes - will show you all commands which have files open where the link count has fallen below 1. Those are files that have actually been deleted on the file system, but the file itself is still kept around as long as a process uses it. For this to show you old, untrustworthy processes, the old libraries need to be deleted, not just left around next to the shiny new replacements. But then again, you would want to delete those old library versions anyway - they are not to be trusted after all. The same goes for the programs themselves if a process still refers to an old version that has since been replaced (and deleted).
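To make the idea concrete, here is a minimal sketch (hypothetical, not from the comment above) of the same check done by walking /proc directly: it lists processes that still map files marked "(deleted)", which is roughly what "lsof +L1" reports for replaced libraries. It assumes a Linux /proc filesystem and typically needs root to see every process.

    #!/usr/bin/env python3
    # Illustrative sketch: report processes that still map deleted files
    # (for example, shared libraries replaced by a package update).
    import os

    def stale_mappings():
        for pid in filter(str.isdigit, os.listdir("/proc")):
            try:
                with open(f"/proc/{pid}/cmdline", "rb") as f:
                    name = f.read().split(b"\0")[0].decode() or "?"
                with open(f"/proc/{pid}/maps") as maps:
                    # The kernel appends " (deleted)" to the pathname column
                    # when the mapped file no longer exists on disk.
                    stale = sorted({line.split(None, 5)[-1].strip()
                                    for line in maps
                                    if line.rstrip().endswith("(deleted)")})
            except OSError:
                continue  # process went away, or permission denied
            if stale:
                yield pid, name, stale

    for pid, name, files in stale_mappings():
        print(f"{pid} ({name}) still uses:")
        for path in files:
            print("   ", path)

Anything it reports is a candidate for a restart of that particular process, rather than a reboot of the whole machine.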
Reboots and Linux/Windows
The point is not that they're fixing trivial vulnerabilities. It's the fact that they form part of an attack chain. An attacker may well gain access to a box as a less privileged user. From there, he can execute another attack to gain privileged access. Fixing these sorts of small vulnerabilities can make this step much harder.