From: Nathan Myers <email@example.com>
Subject: latest Forrester report
Date: Sun, 26 Jan 2003 13:43:11 -0500
To the editor,
The latest report from Forrester Research was disappointingly
unprofessional in several respects.
Its dig about "the open-source socialist fringe" demonstrates a
characteristic confusion: the term "open source" was invented
specifically for participants to distance themselves from the Free
Software movement's political opinions. By definition, there can
be no such thing as an "open-source socialist fringe". Nonetheless,
the report would better have observed that even the putatively
fringiest socialists' code works demonstrably better than the
convicted monopolists' output, and let readers draw their own conclusions.
Its dismissive treatment of desktop use of Free operating systems as
a "gaffe" that wouldn't "make sense", is similarly unprofessional.
If the writers think no Free Software is ready for desktop use, they
neither support the claim, nor offer any estimate of how long it will
be before any will be ready. The many successful desktop deployments
to date, and the unexplainable paucity of failures, would surely
mystify the authors if they considered the matter.
The authors pretend that only open-source software produces additional
costs "like documentation, support and commercial add-ons", which
"swell a company's IT budget". What do they imagine swells the IT
budgets of companies dependent on proprietary software? Similarly,
they recommend staffing a technology center with "skeptics--not gurus".
Since a guru is, by definition, the most competent available individual,
"skeptics" must be those less competent. They beg the question,
skeptical of what? Might skepticism about the wisdom of depending
on the goodwill of a criminal monopolist qualify?
The blanket advice, "companies ... should treat open source like
commercial software: Hands off the code," betrays a deep failure to
understand the success of Free Software to date. Decisions about
participation in Free Software projects belong at the lowest levels
of the company, where the costs and benefits to each project may be
evaluated directly, without reference to ideology. If a particular
group has the needed skills on hand, and would benefit from engaging
with others to improve their tools, what does it matter how
sophisticated the rest of the company is about building software?
Better advice for a CIO would be, "Hands off: encourage line managers
to make reasoned choices." Such good advice is too generally
applicable, somehow, to put into a report.
The tacit advice to ignore the second most widely-deployed Linux
distribution, Debian, is simply irresponsible. Support for Debian
installations is as readily obtained as for most distributions they do
recommend, and Debian has unquestionably better future prospects than
most. The Debian project's continued success must so mystify the
authors that they dare not mention it at all.
The report's final predictions -- Microsoft freeing its "language
runtime" (thus making its OS, somehow, magically scalable from embedded
systems to mainframes), and a million-dollar "Ellison Prize" for
people who no longer write code, somehow generating an outpouring of
innovation -- smack of fevered fantasy. Where did we get the Free
Software we have? That's where to look for it in the future.
Many of Free Software's key components (including the BSD TCP/IP stack
used in Microsoft's operating systems) came out of (socialistic?)
direct government grants to solve specific problems. Some arose from
the "socialist fringe" the report disparages. Most were developed to
meet specific needs by people hired to satisfy those needs, and then
found uses (and development support) worldwide. Many of those people
were hired by, or on behalf of, governments. Is that socialistic?
The code works.
The report's flaws come from the same place as those of most research firms'
reports: sponsorship. Who paid Forrester to have this report written?
It looks stitched together from scraps of position papers from IBM and
an embedded-system vendor. The authors clearly do not understand the
field they pretend to analyze. Instead, they have constructed a fantasy
world in which they can echo the wishes of their sponsors.
We should not allow the report's apparently-positive remarks to mislead
us about the merits of the report or its publisher.
From: Leon Brooks <firstname.lastname@example.org>
Date: Wed, 29 Jan 2003 09:33:04 +0800
Hi, Fred; with regard to your recent pontifications on security:
> the article said: "...more than 50% of all [CERT] security advisories ...
> in the first 10 months of 2002 were for Linux and other open-source
> software solutions."
The implication is that Linux has more bugs than everything else combined. You
also implied an acceptance of WinInformant's wildly errant conclusions
evidently founded on the same implication.
> None of this excuses or lessens the seriousness of Windows' own problems,
> of course, but it does show that as Linux grows in popularity, it will
> have its own full share of bugs and security problems, too.
This assertion is independent of WinInformant's, and it is wrong too. Bugs have
nothing to do with popularity; if anything, more participants in a given
development process imply fewer bugs. In real life, the bug reporting
process extends to more decorative issues that a project with fewer
developers wouldn't have the resources to worry about.
Quoting InformationWeek again:
> It's hard to imagine a less inflammatory or more obvious assertion - that
> all operating systems have bugs and security issues
Unfortunately, you did not limit yourself to this assertion. If you had, you'd
be in the clear. You tried to borrow some of WinInformant's facade of cleverness
and bend CERT's reports to support your statement in such a way that you
appeared to be conservative. That was damn silly, and you deserved to be
flamed for it.
You then go on to raise and knock down a straw man by putting up a few mild
objections to your point, namely that there aren't really that many bugs, and
they can be fixed faster. Let's look at those.
> We can avoid CERT's problem of counting the same bug more than once if
> we compare the security patch/update counts for one popular distribution
> and version of Linux to one popular version of Microsoft Windows.
First off, the problem lies not with CERT, but with careless or overzealous
researchers interpreting the raw CERT data wrongly.
Second off, you do avoid that problem, but you smack face-first into another,
one which is actually worse ("out of the frying pan, into the fire").
Slammer/Sapphire, currently the bane of MS-SQL servers the world over (still
one probe every 2 minutes or better in a Class C subnet as I write) is not
counted as a Windows bug, but a similar problem in PostgreSQL would be
counted as a bug in, say, the SuSE, Slackware or Caldera Linux distributions.
There is no direct Windows equivalent for a Linux distribution. No Windows
version ships with anything like Mandrake's 4000 or Debian's 11000 or so
(slightly more granular) packages. Or, for that matter, with anything like
the same amount of control over them. Microsoft's "157 products" are barely a
drop in the bucket compared to that.
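The double-counting problem can be made concrete with a small sketch. The advisory records below are invented for illustration (they are not real CERT data): one upstream bug shipped by four distributions shows up four times in a naive per-product tally, inflating the "Linux" count relative to a single-product Windows count.

```python
# Hypothetical advisory records: (vulnerability id, products affected).
# Purely illustrative -- not real CERT advisories.
advisories = [
    ("VU-2002-0001", ["Red Hat", "SuSE", "Debian", "Mandrake"]),  # one upstream bug
    ("VU-2002-0002", ["Red Hat", "Debian"]),                      # one upstream bug
    ("VU-2002-0003", ["Windows"]),                                # one Windows bug
]

# Naive tally: every (advisory, product) pair counts once, so a single
# upstream bug is charged separately to each distribution shipping it.
naive = {}
for _vuln, products in advisories:
    for product in products:
        naive[product] = naive.get(product, 0) + 1

linux_naive = sum(n for p, n in naive.items() if p != "Windows")
linux_unique = len([v for v, ps in advisories if any(p != "Windows" for p in ps)])

print("naive Linux count: ", linux_naive)   # two bugs counted six times
print("unique Linux bugs: ", linux_unique)
print("Windows count:     ", naive["Windows"])
```

Run against this toy data, the naive method reports six "Linux advisories" against one for Windows, when the underlying ratio of distinct bugs is two to one.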
You also do not correctly address the issue of bug severity. A typical
Microsoft bug results in, as the mythical CERT CA-96.13 says, "the total
destruction of your entire invasion fleet and [...] unauthorized access to
files" by remote control. A typical Linux bug results in remote access as an
ordinary or even crippled (chrooted and/or owns no files) user, or the
possibility of local escalation to superuser. Your "quick example" is
exceptional, not typical.
Perhaps more terrifying are the Windows bugs that _cannot_ be fixed. Because
of the way Windows is designed, in all known versions, it will _always_ be
possible to push a stick through the spokes of the Windows message-passing
system and escalate privs. IE's MIME handling under Windows is still badly
broken, and as far as I can tell, always will be.
Just to labour the point, consider this list of known, unpatched Internet
Explorer vulnerabilities - http://www.pivx.com/larholm/unpatched/ - including
"Silent delivery and installation of an executable on a target computer", and
contrast that with the Open Source competition (Mozilla, Konqueror and
derivatives) which patched and tested the most recent SSL vulnerability in
under a day (95 minutes from notification to fix-release for Konqueror).
> The open source community has fragmented into myriad competing segments,
> each with its own different, and increasingly quasi-proprietary,
> distributions of software.
Using a few prominent examples to speak for all Linux distributions is grossly
careless. In general, Linux distributions include little if any proprietary
software, and most have downloadable distributions which are both libre and
gratis. Many, notably Debian and Mandrake, make a point of GPLing all of
their specialised tools, and many distributions borrow chunks from each
other. In the case of Sun's "Mad Hatter" distribution, they borrowed RedHat's
entire distribution en bloc.
Fragmentation is - from a security perspective - good. A software monoculture
is vulnerable. _Any_ monoculture is vulnerable. Linux runs on 13 hardware
architectures, Windows on 2 (really only 1), a typical Linux distribution
provides a sheaf of different window managers, web browsers, mail clients,
office suites, databases, webservers, scripting languages and so on.
From a user perspective, the choice represented by fragmentation is good. For
a current example, a parochial Australian girls' school installed Linux and
defaulted the girls' desktops to KDE. Within days, a significant number of
the students had discovered and settled on GNOME and lighter window managers
like IceWM and BlackBox. The helldesk didn't explode as a result (choice of
WM is part of the user context); in fact the support crew do much less
running around than they did with Windows, which offered no such choices.
Is Linux ready for the desktop? The 20,000 (soon to be 200,000) users in Rio
Grande do Sul's State schools think so.
There's a lot more which could be said about your article, but it hardly seems
worthwhile. Do more research, and come at the issues with more hard facts and
fewer fancy theories. Don't try to justify mistakes; it's much more useful to
learn from them.
http://cyberknights.com.au/ Modern tools; traditional dedication
http://slpwa.asn.au/ Member, Linux Professionals WestOz
http://linux.org.au/ Committee Member, Linux Australia
http://linux.org.au/~leonb/lca2003/ THE Oz Linux Technical Conf:
excellent event, photos here!
From: email@example.com, firstname.lastname@example.org, email@example.com, firstname.lastname@example.org
Subject: The Langa Letter - an opposing view
Date: Mon, 27 Jan 2003 17:18:33 -0800 (PST)
Fred's comment about "severity" is, as he points out, inherently
subjective. His numerical analysis is also subject to further issues that
he simply ignores.
For example the 157+ bug count for RH 7.2 or 7.3 includes fixes for many
overlapping products and many which are rarely installed by Linux users --
RH simply includes a lot of optional stuff. Meanwhile the count for
Microsoft may still be artificially low, since MS is known to deliberately
minimize the number and severity of their bug reports. Many of their 30+
reported patches might include multiple fixes and descriptions which
downplay their significance.
Fred also, inexcusably, argues that "first availability" of a fix (in
source form, sometimes in focused, though public, mailing lists and venues)
"doesn't count" as faster. That is simply jury rigging the semantics to
support a prejudiced hypothesis.
Another approach to looking at the severity of bugs is to view the effect
of exploits on the 'net as a whole.
In the history of Linux there have only been a few widespread worms
(episodes where a bug's exploit was automated in a self-propagating
fashion). Ramen, Lion and Adore are the three which come to mind.
Subjectively, the impact of these was minimal. The aggregate traffic
generated by them was imperceptible on the global Internet scale. Note
that the number of Linux web, DNS and mail servers had already surpassed MS
Windows servers by this time --- so the comparison is not numerically
skewed in Linux's favor.
Compare these to Code Red, Nimda, and the most recent MS SQL Server
worm. The number of hosts compromised, and the effect on the global
Internet have been significant.
I simply don't have the raw data available to make any quantitative
assertions about this. However, the qualitative evidence is obvious and
irrefutable. The bugs in MS systems seem to be more severe than
comparable bugs on Linux systems.
If a researcher were really interested in a rigorous comparison, one could
gather the statistics from various perspectives --- concurrently trying to
support and refute this hypothesis.
Fred is right, of course, that Linux has many bugs --- far too many.
However, he then extends this argument too far. He uses some fairly shoddy
anecdotal numbers, performs trivial arithmetic on them and tries to pass
this off as analysis to conclude that there is no difference between MS XP
security (and that of their other OSes) and Linux's (Red Hat's).
I won't pass my comments off as anything but anecdotal. I won't look up
some "Google" numbers to assign to them and try to pass them off as
I will assert that Linux is different. That bugs in core Linux system
components are fewer, less severe, fixed faster, and are (for the skilled
professional) easier to apply across an enterprise (and more robust) than
security issues in Microsoft based systems.
The fact that numerous differences between these two OSes make statistical
comparison non-trivial doesn't justify the claim that there is no
difference.
Further anecdotal observations show that the various Linux distributions
and open source programming teams have done more than simply patch bugs as
they were found. Many of the CERT advisories in Linux and elsewhere (on
the LWN pages, for example: http://www.lwn.net/ ) are the result of
proactive code auditing by Conectiva, Gentoo, S.u.S.E., IBM and The MetaL
group at Stanford, among many others. In addition many of these projects
are significantly restructuring their code, their whole subsystems, in order
to eliminate whole classes of bugs and to minimize the impact of many
others. For instance the classic problems of BIND (named, the DNS server)
running as root and having access to the server's whole filesystem used to
be mitigated by gurus by patching and reconfiguring it to run "chroot"
(locked into a subdirectory tree) and with root privileges dropped after
initial TCP/port binding (before interacting with foreign data). These
mitigations are now part of the default design and installation of BIND
9.x. Linux and other UNIX installations used to enable a large number of
services (including rsh/rlogin and telnet) by default. These services are
now deprecated, and mainstream distributions disable most or all network
services by default and present dire warnings (in their various
enabling dialog boxes and UIs) before allowing users to enable them.
These changes are no panacea. However, they are significant in that they
hold out the promise of reducing the number and severity of future bugs,
and they artificially inflate recent statistics (since the majority of this
work has been over the last two or three years).
Fred will undoubtedly dismiss these comments as being more "rabid
advocation" by a self-admitted Linux enthusiast. He may even point to MS'
own widely touted "trustworthy computing" PR campaign as evidence of a
parallel effort on "the other side of the Gates." However this message
isn't really written to him.
It's written to those who want to make things better.
The real difference between security in MS and in Linux is qualitative
rather than quantitative. With Linux every user and administrator is
empowered to help themselves. Every one of us can, and many more of us
should, accept a greater responsibility for our systems and their integrity
and security. Linux users (including corporations, governments and other
organizations) can find and fix bugs and can participate in a global
community effort to eliminate them and improve these systems for everyone.
Let's not get wrapped up in blind enthusiasm and open source patriotism.
But let us not fall prey to the claim that there is no difference. There
is a difference and each one of us can be a part of making that difference.
Page editor: Jonathan Corbet