Linux distributions often patch the software they distribute, to fix bugs
or add features. Anything they add is pushed upstream to the project
responsible for the package—at least in theory. When that theory is
not borne out in practice, it can lead to the kind of unhappiness and
finger pointing that went along with a recent OpenSSH release. The
release notes point at Debian for failing to report it upstream, but the bug was actually fixed much earlier, in Red Hat Enterprise Linux 4 (RHEL4).
The bug in question is rather nasty, allowing a local attacker to hijack the
X Window System programs of a user who logged in using ssh with X forwarding
enabled. Under those
circumstances, the ssh client and server arrange that any X programs started on the
logged-in machine actually display on the client machine's desktop. This
is very useful for running X programs across the internet as the X
traffic is encrypted as part of the ssh session.
Due to a broken interaction with Internet Protocol version 6
(IPv6)—the next generation protocol for internet traffic—ssh
gets confused about the port number of the X server. If a particular port
(which maps to the X DISPLAY environment variable) is not available to be
used under IPv4—the protocol in use today—but is available under
IPv6, the ssh server will incorrectly set DISPLAY. If it is an attacker's
program that is listening on the IPv4 port, it will be able to hijack X
programs that get run.
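The corrected behavior can be sketched in Python: before claiming a display number, a server must verify that the corresponding port (6000 plus the display number) can be bound under every available address family, not just one. The function names below are illustrative, not taken from the OpenSSH source.

```python
import socket

X_BASE_PORT = 6000  # X display :n listens on TCP port 6000 + n

def port_free_both_families(port):
    """Return True only if the port can be bound over IPv4 and (where
    supported) IPv6.  A port taken under either family must be rejected;
    otherwise a process already listening on the other family's side
    could receive the forwarded X traffic."""
    for family in (socket.AF_INET, socket.AF_INET6):
        try:
            s = socket.socket(family, socket.SOCK_STREAM)
        except OSError:
            continue  # address family not supported on this host
        try:
            if family == socket.AF_INET6:
                # Keep the IPv6 socket from also claiming the IPv4 port.
                s.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 1)
            s.bind(("", port))
        except OSError:
            return False
        finally:
            s.close()
    return True

def pick_display(first=10, count=1000):
    """Scan display numbers, as sshd does, for a usable forwarding port."""
    for n in range(first, first + count):
        if port_free_both_families(X_BASE_PORT + n):
            return n
    return None
```

The buggy behavior amounted to checking only one family: a bind failure under IPv4 combined with a success under IPv6 still produced a DISPLAY pointing at the IPv4 port.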
Until fairly recently, this would not have happened on most Linux
boxes, because IPv6 was generally not enabled. In that case, the ssh server
would recognize that it could not get the port it wanted and try another,
eventually setting DISPLAY correctly. Because IPv6 is much newer, these
kinds of bugs may exist in other network programs. This bug should serve
as a reminder to developers to closely check their IPv6 support.
Clearly, though, the bug fell through the cracks. The OpenSSH team shows
its annoyance in the release notes:
We apologise for any inconvenience resulting from this release
being made so shortly after 4.9. Unfortunately we only learned of
the below security issue from the public CVE report. The Debian
OpenSSH maintainers responsible for handling the initial report of
this bug failed to report it via either the private OpenSSH security
contact list (email@example.com) or the portable OpenSSH Bugzilla.
It was reported
in January to the Debian bug tracking system, but not fixed and released
until late March. OpenSSH does releases every six months or so, with 4.9
being released on March 30. Having to turn around another release four days
later to fix a problem that was known for a few months could certainly make
for annoyed developers. So how did the bug get fixed in Debian, with a
Common Vulnerabilities and Exposures (CVE) number being assigned, but without
notifying the OpenSSH team?
The Debian bug entry is instructive, because it documents some of the
steps that led to the hurried release. In particular, Phil Miller
thought he had done the right thing to report the problem in February:
As noted in the control section, I have forwarded this to Theo
de Raadt, the point of contact for security issues found in OpenBSD's
That email must have gotten lost or been eaten by a spam filter, as de Raadt would presumably have gotten it to the right
people had he seen it. The bug description clearly puts it in the realm of
a security problem, but the bug was not classified that way in the Debian
system. Had it been, it would have been handled differently, possibly
triggering an email to the proper place. But the bug report also shows
that Red Hat fixed it in 2005.
It was reported to Red Hat by a customer and entered into their Bugzilla
as bug #163732. Unfortunately, that bug report is confidential because it
contains potentially sensitive
customer information. This makes it difficult to track further.
Indications are that it was not seen as a security problem and that it was
believed to have been already known as an OpenSSH bug. Apparently no one
checked to make sure the OpenSSH folks knew of it, though.
Closer cooperation between the OpenSSH maintainers for Red Hat and the
upstream team would probably have helped. Red Hat has been carrying the patch
along for quite some time. Because the security implications were not
clear and the patch is quite simple, it may not have seemed all that
necessary to get it upstream. There are, though, more than twenty patches listed in
the Fedora OpenSSH CVS repository for Rawhide, which will become Fedora 9.
The OpenSSH team would be well served by paying closer attention to various
distribution patches to their code as well. It is certainly plausible that
those interested in finding security holes to exploit might start by seeing
if any patches floating around for critical services like OpenSSH were
useful. By being more proactive, OpenSSH might have found and fixed this
bug much earlier. The way this particular bug avoided notice seems to be
mostly happenstance; if there is blame to be placed, there is plenty to go
around.
RHEL and other "enterprise" distributions have long support cycles, which
means that the versions of various packages being maintained are well behind the upstream
project. It doesn't take very many bug reports getting shot down because
they have already been fixed in a more recent version before distribution
maintainers lose enthusiasm for making those reports. But it is an
essential part of the process. The OpenSSH team has the reputation of
being somewhat difficult to work with, which may have helped this
particular problem get overlooked.
It is a difficult problem to solve fully. Distributions have their own
set of requirements which may be in opposition to those of the upstream
project. Those projects may also have policies and procedures that
distributions are not up to speed on. The Linux kernel often sees the same
kind of conflicts, which is why distributions often maintain their own set
of kernel patches for features their customers need. But it is in
everyone's best interest to work those problems out so that distributions
carry along as few patches as possible while upstream projects do not miss
out on bug fixes and features.
Your editor is typing this from the Linux Foundation's collaboration
summit, currently in progress in Austin, Texas. The day's agenda includes
giving a talk on the state of the kernel during the evening reception;
beer-fueled hecklers would appear to be in your editor's near future.
The first day, though, included a rather more sober panel on the state of
the Linux desktop which revealed some interesting thoughts on where things
are headed.
This panel, moderated by Steven Vaughan-Nichols, featured John Hull from
Dell, David Liu (gOS), Jim Mann (HP), Timothy Chen (Via), Kelly Fraser
(Xandros), Grégoire Gentil (Zonbu), Ellis Wang (Asus), Debra
Kobs-Fortner (Lenovo), and a representative from Everex whose name your
editor did not catch. Together, they represented a wide range of
industries, from component makers and operating system vendors to providers
of complete systems. They take different approaches to the Linux desktop,
but they are all optimistic about where it is heading - though some are
more so than others.
So how are these vendors doing with desktop Linux? While all of the
vendors were optimistic, some were more guarded than others. Dell states
that sales have "met expectations," but are aimed mostly at niche markets
so far. There is, they say, a lot of interest in emerging markets, where
users can start with Linux from the outset and do not have to migrate from
other platforms. HP was also moderate in its enthusiasm, saying that its
sales are "right about at the industry average." Lenovo was cautiously
optimistic; their Thinkpad offerings are targeted at business users, which is
a slower market to get into. According to Lenovo, most of their
Linux-based sales are custom products designed for specific businesses.
Rather more enthusiasm came from gOS, the company which supplied the
distribution for Wal-Mart's low-end PC. Sales, they say, are "very good."
Asus is clearly happy with the success of the Eee PC. That success, they
say, comes from the effort put into designing a complete solution for
users, with features like quick booting and solid-state storage: "you drop
it, it still works." Everex says that "sales are brisk"; the company is
pleased and will continue to offer Linux-based products - including the
"MyMiniPC", a small system aimed specifically at MySpace users. Via's
components are found in a number of small Linux systems, including the Eee
PC, so Via is happy.
It's too early for real results from Zonbu, which is
trying to use Linux-based systems for a "computers as a service" business
model. But, says Zonbu, Linux is the best platform for companies trying
new models. Finally, Xandros also is optimistic, especially about "new
form factors" for the desktop, a place where Microsoft, they say, is
struggling to compete.
The panel was asked what the development community can do to help these
desktop businesses; in response, Arjan van de Ven piped up from the
audience, asking what the companies are doing for the kernel community.
From Lenovo, the word is that developers can work to get drivers into
enterprise distributions as soon as possible. That request, of course,
gets back to the tension
between enterprise distributions and the desire for current code; this
subject was not pursued further here, though. Dell would like to see more
collaboration with other vendors in the production of drivers. The Via
representative came straight out and said that "we don't do much" to
support the community, but insisted that their intentions are good. He
said that community support is hard for a Taiwanese company to do, but
didn't say why. Via does plan to open a community site at linux.via.com.tw with driver code and
more, but this site is not yet in place.
Support of users came up briefly. The HP representative said that the
company expects distributors to provide backup support, but the first call
will always go to the vendor of the hardware. That can be a problem,
especially for the small devices which are seeing so much success at the
moment; a single support call can wipe out any profit on the sale of one of
those systems. Selling "constrained systems" which only do a few things
helps; but, earlier, Mr. Mann had also talked about the difficulty of
installing additional applications on these systems. There would appear to
be some tension between providing a truly open device and
keeping support costs down. The word from Asus is
that a system like the Eee PC generates a lot of relatively trivial calls -
things like "how do I search on the web?" So there is a real need to train
users which has little to do with Linux itself.
On the subject of applications, the gOS representative discussed a strategy
of putting as much as possible on the web. The problem with local
applications which look like Microsoft products is that users then expect
those applications to behave like Microsoft products. It is better to have
something which is obviously different and, presumably, better. Xandros
called for better style guides and consistency throughout the interface;
clones of other products are not what the market needs. On the HP side,
the biggest request was "don't make people open a terminal."
Perhaps the most amusing comment came from the Via representative, who
described a "Maddog/Shuttleworth" choice. He asserted that his
grandparents would find Jon "maddog" Hall (who was in the audience) to be a
rather scary presence, while Mark Shuttleworth comes across as a friendly
gentleman. Our interfaces, he says, need to look more like Mark
Shuttleworth. Your editor, who has always found Maddog to be one of the
friendliest people he knows, does not entirely buy into this analogy. But
perhaps there is something to be said for clean-shaven interfaces.
There was some talk of asking suppliers to provide hardware which is
supported by free software. Perhaps the most telling comment came from
Lenovo, which, apparently, has been asking for Linux-supported hardware
"for a number of years." Free drivers are not a priority, though; the
first priority is just having things work. So there is still some work to
be done in this direction.
Arguably the most interesting theme which came from this discussion - and
from the first day of the summit as a whole - is that nobody is really
pushing all that hard to get Linux into traditional desktop settings. The
real action at the moment would appear to be in small devices like the Eee
PC. These "greenfield" areas where there is no established presence to
compete against offer vendors a market where they are not trying to migrate
users away from other products. They would appear to be convinced that
Linux can be a strong contender there - maybe the strongest. So soon we
may truly see the year of the Linux desktop - for specific types of
desktops, at least.
Over the last few years, we have seen the rise of video content on the web,
but much of that content has been locked up in non-free formats. Patented
video codecs are a big part of the problem; though there are free
alternatives (Theora and Dirac, for example), they are not
widely used. Free software projects often use videos as part of their
marketing and documentation, using screencasts to highlight interesting or exciting features
of the program for example. But the choices for collecting and distributing
video content leave much to be desired for free software advocates.
The Fedora project has been looking into this problem lately, in support of
its FedoraTV project. A recent thread on the fedora-advisory-board
mailing list looks at various alternatives now that the original host
of FedoraTV content, luluTV, has gone out
of business. Greg DeKoenigsberg outlines the problem:
The original goal of Fedora TV was to provide a "Fedora-friendly" home for
videos that we had some control over. I think this is still a worthwhile
strategic goal, but since we no longer have the help of a dedicated
[host, I] no longer think it's a sensible tactical goal.
The question that follows: "we've got lots of people who are excited about
making Fedora videos. What's the best way, in the short term, to gather
videos together to make them accessible?"
He goes on to outline the criteria for finding a near-term solution,
starting with the absolute requirements: Ogg Theora format, one-click
download, and a robust, stable hosting site. Also important, but not as
critical are things like the ability to extract static screenshots for
posting in various places, an easy way for community members to know when
new videos are available (an RSS feed for example), and a way for uploaders
to easily associate a license with their video. These should resonate with
most projects that have an interest in providing a video forum for their
community as they are likely to have many of the same needs.
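The RSS criterion is straightforward to satisfy. As a sketch of the "easy way to know when new videos are available" requirement, the following builds a minimal RSS 2.0 feed with Ogg Theora enclosures using only the Python standard library; the feed title and file names are invented for illustration.

```python
from xml.etree import ElementTree as ET

def build_video_feed(title, link, videos):
    """Build a minimal RSS 2.0 feed; each video is a (title, url, size)
    tuple published as an Ogg Theora enclosure."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    for vtitle, url, size in videos:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = vtitle
        # The enclosure element is what feed readers and video
        # clients like Miro actually download.
        ET.SubElement(item, "enclosure",
                      url=url, length=str(size), type="video/ogg")
    return ET.tostring(rss, encoding="unicode")

feed = build_video_feed(
    "Project Screencasts",                        # hypothetical feed name
    "http://example.org/videos/",
    [("Installer walkthrough",
      "http://example.org/videos/installer.ogv", 12345678)])
```

A feed like this also dovetails with the one-click-download requirement, since the enclosure URL points directly at the video file.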
Transcoding the videos to Flash to reach the largest possible audience is
DeKoenigsberg's "controversial" criterion. It is an
unfortunate truth that, even for fairly strong free software proponents, the
Flash browser plugin provides the simplest route to viewing online videos.
Other solutions exist and work, but require a great deal more effort to
enable additional software repositories so that the proprietary or patented
codecs can be installed. Interestingly, there were no arguments presented
against the transcoding suggestion.
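The transcoding step itself is routine. The sketch below assembles (but does not run) an ffmpeg invocation that converts an Ogg Theora file to Flash video; it assumes an ffmpeg build with FLV support, and the specific flags are common choices rather than anything the Fedora discussion settled on.

```python
def flash_transcode_command(source_ogv, dest_flv, width=None):
    """Return an ffmpeg command line, as an argument list, that
    transcodes an Ogg Theora file to Flash video."""
    cmd = ["ffmpeg", "-i", source_ogv]
    if width is not None:
        # Scale down for web delivery, keeping the aspect ratio.
        cmd += ["-vf", "scale=%d:-1" % width]
    # The FLV container only accepts a few audio sample rates,
    # so resample to 44.1kHz to be safe.
    cmd += ["-c:v", "flv", "-ar", "44100", dest_flv]
    return cmd

cmd = flash_transcode_command("intro.ogv", "intro.flv", width=640)
# Run with subprocess.check_call(cmd) once ffmpeg is installed.
```

A hosting site would run a command like this once at upload time, keeping the Theora original as the canonical copy.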
For Fedora, where Theora—or other free codec—viewers are easily available, Flash transcoding
might be less of a requirement. Other projects, especially those that are
cross-platform, may find that a large part of their community is either
unable or unwilling to install additional software to view videos. Users
of non-free operating systems are largely unaware of the video codec
problems; their OS comes with a no-extra-cost video viewer that just
works. Because of that, transcoding to Flash does at least provide a way to
present videos that can be relatively easily viewed by free and non-free
software users alike.
Various solutions to the hosting problem were discussed, from partnering
with archive.org to rolling their own
using MediaWiki, Plumi, or some of the technology released
by luluTV. One of the suggestions that got the most attention was to
create a Miro channel hosted, at
least temporarily, on Fedora project servers. Miro has a lot of promise as
a viewer and organizer of videos, with a BitTorrent client built-in, but it
doesn't solve the other half of the problem: how to allow the community
to upload content.
There is, it seems, a growing need for a free community video forum, both
from a code and a hosting perspective. The bandwidth and storage requirements of video
are enormous, so covering the actual cost will be a big challenge. Places like YouTube allow short videos to be uploaded, but
they can only be played back via Flash. In addition, their software is not
free, so they only solve parts of the problem.
There are no obvious free
solutions, yet, but it is a problem that we will be facing more frequently.
Somehow leveraging Miro as a free, cross-platform video delivery system may
make the most sense. Providing a way for the community to upload video
content into the channels would make for a mostly working FedoraTV, and
could serve other projects with similar needs. Miro supports
free codecs as well, which might help to start weaning people away from their
current non-free codec addiction. Then we can start figuring out how to
pay for the network and hard disk capacity required.
Page editor: Jonathan Corbet