The life of South African Mark Shuttleworth has been a kind of geek dream: found and sell Internet company for $500+ million in mid-20s; spend $20 million to become the second space tourist; and create a GNU/Linux distribution with a cool name that has become the most popular on the desktop.
Here, he talks to Glyn Moody about Ubuntu's new focus on the server side, why Ubuntu could switch from GNOME to KDE, and what happens to Ubuntu and its commercial arm, Canonical, if Shuttleworth were to fall out of a spaceship.
I believe you made about $500 million when you sold the certificate authority Thawte Consulting to Verisign in 1999. Creating a GNU/Linux distribution is not the most obvious follow-up to that: what were the steps that led from the early part of your life to the current phase?
I have a belief that we should all paint our lives as boldly as we can, and we should explore the things that are the most interesting to us personally. I'm always disappointed when I see people asking the question: "What's going to be the next big thing? What career should I choose? Where will the most money be paid?"
It's impossible to know what the future holds, but it's very possible to know what you might be personally interested in. So after Thawte, I spent some time setting up the [Shuttleworth] Foundation and some time setting up the [HBD] Venture Capital group, which I wasn't going to run personally, but which I thought was a good thing to have, and put a team in place to do that.
And then I thought: what are the most interesting challenges out there, what are the opportunities that I'm sort of uniquely positioned to do? And the opportunity to go to Russia and train there and then fly was the opportunity that I chose.
After that, it was more difficult. There were three things that I was looking at. Each of them was exploring the impact of the Internet in society and in commerce, but in different ways. And of all of them, [Ubuntu] is the project I thought was the most interesting, the most difficult, the biggest scale project. And ultimately, if we succeed, the one that will have the biggest impact. So I took this one on.
Given that Ubuntu's roots are on the desktop, what's behind the recent shift in strategy to address the server side too?
That's not a change in strategy, it's more a pull through. We started with a very narrow focus on the desktop, and that allowed us to punch in. As we've penetrated the industry, there's a natural pull through where someone who's started using us on their desktop has now started setting up Ubuntu on a server.
You could always run Ubuntu on a server; there was never a significant reason not to. That body of users has now reached a critical mass on the server, and so our server work is now more a response to that than a shift in strategy. We continue to make the desktop our labor of love; the server requires a very enterprise-oriented approach. We've built out a dedicated team that just handles that. We haven't re-assigned people who are desktop specialists and asked them to test a server.
You're not worried you're spreading yourselves too thinly?
That is a risk, and that's something we discuss here a lot. There are benefits to offering a platform that can be used in both configurations. We see companies often saying: "We love your desktop. We would definitely choose your desktop if we could also use you on the server."
Companies don't like to introduce arbitrary diversity in technology. Everybody has heterogeneous systems, but they don't like to make that situation worse without a very good reason for it. Ubuntu is a very good server for certain use-cases now, just like Ubuntu is a very good desktop for certain use-cases. Our challenge over the next couple of years is just to broaden the base to which it appeals on both fronts.
On the server, it's very much a question of taking time to build the portfolio of relationships with other vendors. There are a lot of applications - what we call solutions - which are now free software-based: standard web-serving, mail-serving and so on. Ubuntu does very well for those. Increasingly, the challenge for us now is to build out the portfolio of non-free software certifications, everything from Oracle through SAP and thousands and thousands of pieces in between. That will take time; it's not something we can achieve overnight.
One of the interesting things you've floated recently is the idea of coordinated releases amongst GNU/Linux distributions. Where did the idea come from, and what would the benefits be?
That's really what Ubuntu's all about. We want to express fully the real nature of free software, as a true commercial, economic entity in its own right.
What I'm really, profoundly interested in, is how a different approach to technology makes new things possible.
The business model of the proprietary software industry is licensing software to new customers or updates of software to existing customers. You make money when you have a new version. So there's an imperative both to release new versions and to have a whole bunch of new features in those versions, specific features that you articulate in advance.
In the free software world, we don't have that to cloud our thinking. We accept that development goes at the pace that it goes. If we operate on a basis that we only integrate new features into the platform when we consider them ready, then we can effectively release the platform at any time. When you look at the world through those glasses, it makes sense then to articulate not that you'll ship the product when you have certain features, but that you'll ship it at a certain time. That's actually really useful to all of your users, because they can plan for a particular time. This wasn't our stroke of genius: GNOME was the one that really championed this idea.
We took the fairly radical step of saying we could do that across the whole ecosystem. The reason that is radical is because when you're one project, you can make decisions for yourself. But obviously as Ubuntu, we aggregate everyone from the Linux kernel to the GNOME project through the Firefox web browser and the Apache web server, and a ton of stuff in between. So people said: "How on earth will you tell them when to ship their stuff so that you can ship what you want?"
We've simply taken the view that we have a very carefully-managed release process, and a new version from one of those projects just doesn't get in unless it's ready at the time it needs to be ready for us to have confidence that it can be integrated and tested.
What this has really done is it's separated, very elegantly, the processes associated with R&D, which is focused on what new features we're going to develop, and how to manage that, which is very difficult to put on a particular schedule, from the process of integration, testing and distribution.
Now, if I look at a company like Oracle or Microsoft, they have both of those responsibilities. So you end up in this horrible situation where they start saying now: "you'll have the next generation file system in this version and it'll ship on that date." And then reality intervenes, and that puts them in a very awkward situation. We just don't have that.
To come back to the original idea, we try to understand what's the essential difference between the way we produce software and the way other people produce software, and what becomes possible because of that, that wasn't possible before, both economically and technologically. That's really what Ubuntu's all about. We want to express fully the real nature of free software, as a true commercial, economic entity in its own right.
Have you had any feedback yet from the other distributions?
Not yet, no. This is something that we've only just started articulating. My hope is that other distributions will see the benefits of synchronizing all of our releases. It doesn't matter whose cycle we converge on, but the idea of synchronizing releases then cues all of those thousands of other projects, that if they want their latest technology shipped by a particular date, if they're able to get it done by a particular time, then that will happen not just with Ubuntu, but with a whole bunch of different platforms. I think it's a powerful idea.
There are commercial interests that might block it. It will be interesting to see if the other commercial distributions are nervous to put themselves in a situation where they really are being compared, apples to apples. We'll see.
Given that more and more computing will be done in the cloud, is that going to be a threat or an opportunity for Ubuntu?
It's a real opportunity, both on the server side and on the client side. To build a server-side cloud infrastructure, you want an operating system which is not licensed per seat or per processor or per machine or per instance. It is simply freely available with all of its updates, and Ubuntu meets that.
You can go from a hundred instances in the cloud to a hundred thousand instances in the cloud and legally pay Canonical no more money. You will probably want to have some sort of support relationship with us, but that's entirely separate from the actual licensing of the platform, and it's not required in any way. We cut a deal to support you in the way that you need support.
So, economically on the server side that's a very big winner, and Ubuntu is seeing a lot of adoption and traction there. You also want something that can be shrunk down so that in your cloud server you only have the pieces which you really need. Every extra piece is an extra piece of disk space that's not being used; it's an extra piece of memory that's not being used. It's an extra thing that can have a security issue that's not being used. And so you may as well get rid of it. Ubuntu's very modular - probably the most modular of the commercial platforms; this comes from our Debian heritage.
On the client side, for cloud computing you really want something that "speaks the Internet", and does so very well and very securely, and speaks the web very well and very securely. Ubuntu running Firefox is a really compelling option there.
So I think there's a good chance that the next YouTube is running in the cloud and running on Ubuntu.
One of the versions of Ubuntu is Gobuntu, which has no non-free elements whereas Ubuntu does have some. Where do you stand on the question of including proprietary elements in a free software distribution?
Very clearly, I'm a pragmatist. The non-free pieces of Ubuntu are nothing to do with Canonical's commercial interests. It's not like we've put pieces in there that suit us and don't suit anybody else. They're drivers for hardware where the manufacturers of that hardware haven't yet wrapped their heads around the idea of releasing the source code that makes their hardware work. They're not applications.
We work with those vendors to help them understand that in fact it's to their advantage to make their source code open source. They will get much better quality. We have real examples of this. We have much better quality drivers with much better reliability that make their hardware more attractive to a bigger portion of the market.
But we are willing to put in drivers that are not yet open source, because we figure it's more important to give everybody's grandma the opportunity to actually run free software applications on a free software environment, even if they need some proprietary drivers to get their hardware going. That puts us squarely in the pragmatist camp rather than the purist camp.
Gobuntu is an attempt to create a version of Ubuntu that does away with that, but also that is specifically designed to be a platform where other ideas about Copyleft can be explored - this meme about collaborative creation of something is extremely powerful and software is just the tip of the iceberg - we've already seen Wikipedia. I think every industry is going to need to adjust its thinking to say: "How can this participative computing phenomenon energize us?"
Gobuntu aimed to do that. People didn't really flock to it, so I think we will stop doing Gobuntu. People liked the idea, but not the people who would actually invest their time in it. I think it's too closely associated with Ubuntu. There's another one called gNewSense, which is exactly the same - Ubuntu with all the non-free stuff taken out. But because it's a separate organization, people feel more comfortable participating there. I don't mind, really.
On a related issue, do you worry that GNOME is becoming too involved and enmeshed with Microsoft technologies? If the patent problem with GNOME becomes too great, might you switch to KDE one day?
I think it's very healthy that we have multiple desktop platforms, and that they're both committed to free software and sources of innovation and inspiration and competition. We picked GNOME mostly because of its approach to the release cycle and because it had a real strong commitment back in 2004 to usability.
Since then, KDE has also embraced the idea of usability as a primary driver, and they've done some really interesting things on the technology front. I keep a level of awareness of KDE, and I run KDE at home just to make sure I have a sense of where it's going and how it is doing. I like the rivalry. We might [switch]; it's good to have that option.
As for patents in software, I think society does a very bad deal when it gives someone a monopoly in exchange for nothing. The traditional patent deal was you gave someone a monopoly in exchange for disclosure of a trade secret. You can't really have trade secrets in software.
Of course, the entrenched interests like to frame this as "patents are all about innovation", when they really aren't. There's very strong, academic, peer-reviewed research that suggests that patents stifle the pace of change and innovation.
The real insight with patents is that what society is buying with that monopoly is disclosure. And so the real benefit to society is accelerated disclosure of new ideas - not convincing people to invest. People have ideas all the time. You can't stop the human mind from innovating. People do research and development to win customers, that's what it's really about. It's not to file patents. So the entrenched patent holders really aren't doing much of a service to society when they articulate their position in very flawed terms.
With regard to GNOME and Microsoft, I'm not concerned. My view is that to win, you have to have your own vision. You have to have a very clear idea of what you can deliver that's unique. You can't go around sort of chasing someone else's coat tails. So while I respect the people in the free software community who invest a lot of time in making compatible implementations of other people's technology, I don't think that's the real recipe for success for free software. We have to give people a reason to use our platform for itself, not because it's a cheap version of someone else's.
And in fact, the real successes of free software have been the places where it has just blown away the alternatives. The Internet runs on free software, and not because it has copied anything from Microsoft. The proprietary software guys like to accuse free software of not innovating and not doing anything other than sort of walking down the same path that they've already walked, which is always easier. That's just not true, but guys like the Mono Project are reinforcing that stereotype.
Finally, one of the issues that has traditionally preoccupied the Linux community is: what happens if Linus falls under a bus? So I was wondering what happens to Canonical and Ubuntu if you fall under a spaceship or something?
Fall *out* of a spaceship! Well, I've made suitable preparations so that if I'm looking the wrong way when the bus comes, economically both Canonical and Ubuntu are fine: there are provisions in my will to make any additional investments needed.
As to the other things that I do for the project, they will have to find someone else to step into my shoes. You know, there's a lot of good talent, both technically and commercially and socially. I think the project would continue.
Glyn Moody writes about open source at opendotdotdot.
Bryan Che, a member of the product management team at Red Hat, recently announced
Nightlife, a project he hopes will motivate people to donate their
computer's downtime to processing data for scientific research and other
socially beneficial work. The heavy lifting will be done by the University
of Wisconsin-Madison's Condor
workload management system which will be responsible for the scheduling and
logistics of donated computer power and, in the end, Che hopes to build a
network of more than a million nodes of Fedora systems to help process data
for everything from Web-indexing projects to medical research.
"[W]e have begun talking with the guys over at Wikia about helping them index the Web
for their open source search engine," says Che. "It would be great if we
could help with tasks for the Fedora infrastructure team at some point with
things like automated builds or tests. There is a lot of scientific
research that requires lots of computing power, and there are lots of
students who could use access to a grid for research. I'd love to have all
sorts of projects like these participate."
Che says that the scope and type of projects that join will largely be
dictated by the community, and he's hoping to draw on its collective
expertise to "shape Nightlife into a useful community service." His end
goal, however, isn't just to make computer resources available but to also
develop a basis for larger infrastructure projects. Che notes, "For
example, much of the high performance computing (HPC) jobs these days are
done on Linux — and particularly Fedora or Red Hat. This puts us in a
prime position to be able to shape and build out an entire open source
stack for research computing on grids. Today, many people depend upon
proprietary (and often costly) libraries for their scientific research or
even enterprise computing. Nightlife will provide us a great forum to
engage these users to see what are their needs and provide them with a
fully open source solution that they can use for their valuable [work]."
Naturally, security is of primary importance when individual computers
are clustered together or outside data is inserted into a system for
processing. Che says the Nightlife team takes security very seriously and
has a number of measures in place to protect users' computers and ensure
the application code is safe as well.
"[W]e will require that projects that want to leverage Nightlife must
distribute their packages and source code through Fedora," explains
Che. "This will allow us to inspect what the applications are doing and
make sure there isn't anything malicious. On the execution side, one of the
capabilities that we've added to Condor recently is integration with our
libvirt virtualization technology. This will enable people to execute
Nightlife jobs entirely within a virtual machine bubble that is shielded
from their physical computers.
"We are also looking at taking advantage of SELinux technology, which we've
developed with the NSA, as a mechanism for
tightly locking down jobs so that they can only perform tasks for which
they are explicitly granted permission."
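For readers curious what the Condor side of this looks like: in HTCondor's "vm" universe, the submit description asks for the whole job to run inside a guest rather than as an ordinary process. The sketch below is hypothetical: the image name and resource values are invented, and the exact attributes available depend on the Condor version deployed, so the HTCondor manual should be consulted for the real options.

```ini
# Hypothetical HTCondor submit description for a vm-universe job.
# The image name and sizes are invented for illustration; attribute
# names follow HTCondor's vm-universe support.
universe      = vm
vm_type       = kvm              ; run the job under KVM/libvirt
vm_memory     = 512              ; guest RAM, in megabytes
vm_networking = false            ; keep the guest off the network
vm_disk       = nightlife-job.img:hda:w
log           = nightlife-job.log
queue
```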
Che is quick to point out that although Fedora has committed plenty of
resources to Nightlife, it is not Fedora-specific — indeed it's not
even Linux-specific. Since Condor supports executing processes on many
different platforms, Mac OS, Windows, Unix, and Linux distributions of any
flavor are capable of donating resources. Not all features will be
available on non-Linux platforms, however, if they lack certain underlying
technologies. For instance, Windows lacks a built-in hypervisor for running
virtual environments and doesn't support SELinux for lock-downs.
"I would welcome anyone to donate spare capacity to Nightlife [and] I'd
hope that people from all sorts of platforms join us," encourages
Che. "[T]here isn't any reason why other communities couldn't participate
with us and even start adding some of these capabilities to a Nightlife
client for their platforms. From a development standpoint, the upstream
code lives in the Condor project at the University of Wisconsin. So, anyone
can contribute at that project as well without having any involvement with
[Fedora]."
When the project was announced last week, some community
members were puzzled as to why Fedora chose to use Condor instead of BOINC, a similar
project developed at the University of California, Berkeley. Che points out
that, though the two efforts have a lot in common, they each have an
entirely different focus. He says BOINC's mission is "very much focused on
enabling desktops/laptops to provide computing capacity as part of a larger
grid [while] Condor is more general-purpose; it can take idle capacity and
utilize it well, but it is primarily a good resource scheduler for
[dedicated grids]."
While some people's comparisons of Condor and BOINC focus on the
technology behind the projects, others see similarities between the Condor
and Nightlife projects themselves. In actuality, they are really quite
different. "Condor's client can use a BOINC client to process data as
backfill (when there are no other jobs to run)," notes Che. "So, there is
no need to view these projects as competitive. Indeed, one possibility is
to use Nightlife to increase the number of machines participating in
BOINC." Of course, a low barrier to entry is also important for widespread
adoption of Nightlife. Since many enterprises and researchers already run
Condor for their dedicated grids, Che says it was a logical choice for the
project.
Dr. Keith Laidig can easily see the intrinsic value of Nightlife and how
it will benefit the scientific community at large. He runs the computing
infrastructure for the computational biophysics group in the Department of
Biochemistry at the University of Washington, and regularly relies on outside
computing power to crunch data for researchers. Under the direction of
Professor David Baker, about four years ago the group created Robetta, an automated prediction
server that farms out work to other systems via Condor which has proven
"quite successful at keeping the wait times [for research results] down to
the range of 'months'."
Laidig recently told
the Nightlife community, "If we had access to more computing power,
even that available from modest periods of inactivity, we could put that
power to work to address many pressing issues in bio-medical research such
as HIV/AIDS vaccine design, improvement of existing drugs and/or design new
drugs, and creation of new methods to harness biology to address issues
such as carbon sequestration."
As Laidig explained to LWN, reducing the wait times for results to even
a matter of weeks is not out of the question. "Given sufficient computing
power, the processing time would drop even further. In principle, the
processing could take a day or less — depending on computing power,
queue depth, etc."
Laidig says it's hard to estimate just how much donated computer access
his lab would need in order to see an appreciable rise in research
turn-around time, but he estimates they currently use around 300 - 400
processors running around the clock to maintain the current work
flow. "Should we gain, say, 1,500 machines that could work for 8
hours... we'd be matching that — taking into account overhead. Now,
I'd like to increase that by a factor of ten or more."
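Laidig's arithmetic can be checked directly; the figures below assume the upper end of his 300 - 400 processor estimate:

```python
# Rough check of Laidig's capacity estimate: 400 dedicated processors
# running around the clock versus 1,500 donated machines available
# for 8 hours each.
dedicated_cpu_hours = 400 * 24    # current around-the-clock pool
donated_cpu_hours = 1500 * 8      # hypothetical Nightlife donors

print(dedicated_cpu_hours)  # 9600
print(donated_cpu_hours)    # 12000

# The surplus is roughly what grid overhead (job staging, preemption,
# failed runs) can be expected to consume, which is why the two pools
# come out "matching".
surplus = donated_cpu_hours / dedicated_cpu_hours - 1
print(round(surplus, 2))    # 0.25
```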
Though he would be happy to see Nightlife flourish, Laidig notes there
are some things to consider before committing your computer's resources to
the project. "Not to throw a wet blanket on things, but [there are] issues
that folks should keep in mind. Their gear would be using electricity and
generating heat. There are also network bandwidth considerations as well
— some data-sets necessary to undertake distributed work can be
sizable (100 MBs) which can soak up resources. There's the local disk space
[to consider], as well."
"Folks should be made aware of the 'costs' of contributing. Then, should
their desire to contribute outweigh the costs, they should join up!"
Some community members have indeed expressed concerns about the
energy consumption associated with idling computers and suggest that the
ecological harm of running the CPUs and fans of an unattended machine
outweighs the benefit of charity in the name of science. In response to an
animated discussion about Nightlife at Slashdot, one enterprising
commenter tested how much energy his idle computer uses and found that
it cost upwards of $70 per year. Che responded
to the criticism by acknowledging that although cycle harvesting can be
viewed as a "waste of energy," it can, in fact, save energy in the
long run. In addition to the notion that energy to process data will
eventually be used at some point or another anyway, Nightlife also
distributes energy consumption over a wide geographical area, thereby
reducing the overall energy burden on a single data center or location.
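For a sense of where a figure like that comes from, here is the back-of-the-envelope calculation; both the idle draw (80 watts) and the electricity rate ($0.10/kWh) are assumptions, not numbers from the Slashdot test:

```python
# Annual cost of leaving a machine idling around the clock.
# Assumed values: 80 W idle draw, $0.10 per kilowatt-hour.
idle_watts = 80
hours_per_year = 24 * 365            # 8760
rate_per_kwh = 0.10

kwh_per_year = idle_watts * hours_per_year / 1000.0
annual_cost = kwh_per_year * rate_per_kwh
print(round(kwh_per_year, 1))   # 700.8
print(round(annual_cost, 2))    # 70.08
```

A somewhat power-hungry desktop at typical residential rates lands right in the "upwards of $70" range.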
Future plans for Nightlife include making it a first-boot option for
Fedora so when a user does a fresh install, they are prompted to donate
computer power to the project. Of course, before Che can attain his
million-node goal, there are several smaller goals to accomplish along the
way. "At the earliest, we wouldn't be able to start reaching numbers at
this level until after Fedora 10 — and that's probably pushing [it]."
Laptop installation has traditionally been one of the biggest challenges
faced by Linux users. These systems come with no end of special-purpose
hardware, and they bring particular needs of their own. More recently,
getting a laptop into a basic, working state has become less of a challenge
- at least, for carefully-chosen systems. Life has gotten much easier in
recent years.
But a contemporary laptop user is not content with "it boots Linux." A
well-provisioned laptop in 2008 should be able to make full use of all the
hardware, suspend and resume reliably, avoid turning presentations into
extended projector-related hassles, and get the most out of the battery.
Your editor has, in the past, proved that he could get a laptop to suspend
through a sufficient investment of his life into building kernels and
tweaking configurations. Your editor, in the present, has little patience
for that kind of messing around. The manual creation of power management
configurations
should really, at this point, go the way of hand-crafting XFree86
modelines. Both were once ways of showing one's advanced Linux skills, but
both are now just unnecessary pain.
A period of relatively little travel recently made it possible to follow
through on an old suggestion from Arjan van de Ven: install a number of
distributions on a laptop and compare how they perform. To this end, your editor's
aging Thinkpad X31 was pressed into service with offerings from several
distributors. In each case, a recent stable (or occasionally beta)
distribution was installed while doing a minimum of work beyond clicking
"next": no "expert" installations were done. All available updates were
applied. Then, a number of things were checked:
- Powertop was installed (if not
already present) and run to measure the steady-state power usage of the
machine. The laptop was as idle as your editor could get it to be,
with the backlight at minimum brightness; the system was left long
enough for the power usage numbers to stabilize. The idea was to get
the lowest possible value for each distribution.
- Suspend (to RAM) and hibernate (suspend to disk) were tested.
- Various laptop-specific buttons were tested. The X31, for example,
has a button combination which controls a small light which
illuminates the keyboard.
- The wireless network adapter was tested. The X31 presents an
interesting complication in that it has an Atheros-based adapter,
which, until recently, has not been supportable with free software.
- An external monitor was connected to determine how much work is
required to drive an external projector.
During the process, any other events of note were recorded as well.
Late in the process of writing this article, your editor was lucky enough
to receive a shiny new HP 2510p laptop, thanks to the generosity of the
folks at HP (and Bdale Garbee in particular). This machine, being based on
Intel chipsets, is fully supported by free software. It promises to make
future travels much more pleasant; having a toy like this show up in the
mail makes it hard to maintain a grumpy
attitude. The above tests were run on the new machine, but only for a
subset of the distributions.
Debian Lenny
Your editor chose to perform this experiment with a mid-May Debian Lenny
testing release, rather than the aging stable distribution. That installed
a system with a 2.6.22 kernel which, of course, has no ath5k driver. So no
wireless on the X31 for Debian users - at least, not without installing the
proprietary MadWifi module. Unsurprisingly, the Debian installer did not
offer MadWifi as an option.
Suspend works, as long as the user does not mind a corrupted display on
resume; it's possible to see enough to perform an orderly reboot, but not
much more. It is strange that Debian would have this problem; suspend has
worked on this laptop with kernels significantly older than 2.6.22.
Hibernate was not accessible via its usual place on F12, but, when
invoked from the menus, worked properly. Other laptop keys worked without
problems.
The external display port did not work under Debian. The only way to get
video out of that port is to have the monitor plugged in when the system
Power consumption on an idle system was 10.7 watts, with the system waking
up an average of 67 times every second. This is far from the worst power
performance your editor saw over the course of this exercise, but also far
from the best.
All told, Debian Lenny in its current form is not one of the better
systems for laptops - at least, for this particular laptop. Some of the
other distributors have made much more progress in this area in recent
releases.

Fedora 9
The installation from the Fedora 9 DVD went without any significant
problems. One of the nicest things about this particular distribution was
its inclusion of the ath5k driver as part of its 2.6.25 kernel. It seems
that ath5k does not work well for all chipsets, but the X31 wireless
adapter works quite well with it. So, with Fedora 9, the X31 laptop
works with 100% free software.
Another thing worthy of note: Fedora 9 was the only distribution tested
which offered to install the system on an encrypted disk. Given the
frequency with which laptops are lost, encrypting the data on them seems
like something a lot of users would want to have.
Suspend and hibernate worked on this system, with one little glitch: the
backlight remained on after the system was suspended. Your editor ran into
the same problem with Ubuntu Hardy during its development cycle; after some
conversation in Launchpad,
the problem was quickly fixed. So a bug has been filed in the Fedora
tracker pointing to that resolution, but no activity has been seen so far.
The power consumption for Fedora was 8.9 watts, with the processor waking
up an average of 45 times per second. The NetworkManager applet offers a
"disable wireless" operation which, indeed, will disable the wireless
interface. It does not power it down, though, so power consumption is
unchanged. Actually uninstalling the ath5k
module dropped power consumption to 8.2 watts.
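Those wattage differences translate directly into battery life. The quick estimate below assumes a 47 Wh battery pack (typical for a ThinkPad X31, but not a figure from these tests):

```python
# Converting the measured idle draws into rough battery-life
# estimates. The 47 Wh capacity is an assumption, not a measurement.
battery_wh = 47.0
for label, watts in [("Debian idle", 10.7),
                     ("Fedora idle", 8.9),
                     ("Fedora, ath5k removed", 8.2)]:
    print(f"{label}: {battery_wh / watts:.1f} hours")
```

On these assumptions, the gap between Debian and a trimmed-down Fedora works out to well over an hour of runtime.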
Plugging into an external display worked, though it was necessary to open
the "screen resolution" dialog to enable the external port.
On the 2510p, the display was run in a strange, non-native resolution
during the installation, making the text harder to read. The installed
system, however, did not have this problem. This system ran at 11.0 watts,
with a surprising 145 wakeups per second. Following Powertop's advice,
your editor shut down the Bluetooth interface and the HAL CD polling
daemon, bringing power usage down to 10.1 watts. Once again,
NetworkManager was unable to save any power by disabling the wireless. The
hardware's wireless button did power down the interface, bringing power
usage down to 8.6 watts. But (and this is true for all
distributions tested), NetworkManager was never able to make use of that
interface again until the system was rebooted.
All told, Fedora 9 works quite nicely for laptop installations; this
distribution has made quite a bit of progress over the last few releases.
Some grumpiness about the GNOME setup is appropriate, though. Fedora's
hackers seem especially enamored of those dialog notifier windows which pop
up from the panel icons. The experience is rather like trying to work
while being heckled by a sizable crowd of unhelpful bystanders.
One window, in particular, announced that closing the lid would no longer
suspend the system because some (unnamed) program was blocking that
action. That might be useful information, but knowing which program was
getting in the way would have been more helpful. But even more helpful
would be to not have to dismiss little notifier windows all the time.
There's also something in the GNOME system on Fedora which feels entitled
to adjust the backlight brightness anytime it thinks that the user has
screwed it up again. This happens even after the "dim display on idle"
options have been disabled, and often results in making the display
brighter on an idle system. If the user has set the backlight brightness,
the system should not presume to readjust it. One should not have to
wrestle with one's computer over the brightness of the display.
OpenSolaris
Some whim or other inspired your editor to install the OpenSolaris 2008.05
release. It has been almost ten years since the last encounter with
Solaris, so, perhaps, it was time for a brief reunion. Brief it was.
The installation procedure for this operating system is textual; it seems
rather primitive next to the effort Linux distributors have been putting
into making their installers attractive. There is a license acceptance
stage, where the poor user gets to scroll through all of the licenses
applicable to the software in this distribution - 244 licenses in all.
There's no requirement to indicate acceptance, though.
The installed system worked with the Atheros wireless by virtue of a
binary-only driver. Initially, though, it only worked so well; this system,
from Sun "the network is the computer" Microsystems, installs itself
configured to use a local hosts file (only) for hostname lookups. Your
editor had to manually tweak nsswitch.conf to get it to use DNS. Sun's
equivalent to
NetworkManager is the "network automagic daemon," which is obscure in spots
but seems to work. There is no power savings to be had from turning off
the wireless interface.
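For reference, the nsswitch.conf change involved is the usual one for this file; a minimal sketch of the relevant line (the "before" state is an assumption about how the OpenSolaris install ships):

```
# /etc/nsswitch.conf
# As installed (assumed): hostnames come from /etc/hosts only
#   hosts: files
# After the tweak: fall back to DNS when the hosts file has no entry
hosts: files dns
```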
On the power front, once your editor tracked down a Powertop port, the
system was seen to be drawing 11.5 watts. Unlike with any Linux
distribution, Solaris runs the processor at its fastest speed at all times;
there does not appear to be any concept of CPU frequency control. The
laptop fan runs constantly under Solaris.
There is no suspend capability, no hibernate. In general, it would appear
that the Solaris developers have not put a whole lot of effort into the
power management problem so far - at least, not on x86; the OpenSolaris power
management page says that life is better with the Sparc port and that
all this goodness is coming to x86 Real Soon Now.
The external video port did not work at all under OpenSolaris. Your editor
was charmed to notice that the Solaris folks have retained the classic "log
off now or risk your files being damaged" message in the shutdown dialog.
On the 2510p, the OpenSolaris CD brought up GRUB, but did not succeed in
booting into the installer.
All told, OpenSolaris has some catching-up to do. Laptops were almost
certainly not at the top of the priority list for Project Indiana, but it
is still a little discouraging to see how far behind things are.
openSUSE 11.0 Beta 3
The openSUSE development cycle is heading toward its close, so your editor
decided to go with the beta 3 release. It must be said that this
distribution got off on rather the wrong foot; it puts up an end-user license agreement which prohibits
redistribution for compensation, bundling openSUSE with any other "offering,"
reverse engineering, transfer of the software, use in a production
environment, or publishing benchmark results (but only if you're a software
vendor). Users are required to stop using the software upon termination of
the license, which happens after 90 days, after the next release, or
whenever Novell says so. And, just in case one was considering the crime
of using the release for too long:
The Software may contain an automatic disabling mechanism that
prevents its use after a certain period of time, so You should back
up Your system and take other measures to prevent any loss of files.
There's a certain amount of weasel-wording to the effect that Novell is not
trying to take away any rights conferred by the real licenses on the
software it ships. So the EULA has little force. But it is not consistent
with the mores of the community from which Novell took this software, and
it leaves a bad taste in one's mouth.
Installation is relatively straightforward, though a bit more
mouse-intensive than some other distributions. But one has to watch
carefully: openSUSE, by default, configures the system to automatically log
into the user account created at installation time. An amusing addition is
that, after suspending and resuming the system (which works), a password
prompt will be presented, even though none is required on a cold boot.
openSUSE, like Fedora, thinks that it's smarter than the user and is
entitled to readjust the backlight at any time.
As mentioned, suspending the system worked without trouble. Hibernation,
however, failed; it goes straight to resume without halting the system.
openSUSE ships the ath5k driver, so the wireless interface worked
flawlessly with free software. The external monitor port is always on
under openSUSE; the dialogs offered to create a Xinerama setup.
Power consumption was 11.2 watts, with 106 wakeups happening per second.
Your editor noticed that beagled was running; something which was not
observed on other systems. Powertop noticed too, and politely offered to kill it
off; that brought the system down to 78 wakeups with slightly less power
used. Removing the ath5k driver brought consumption down to 10.8 watts.
Experience with the 2510p was quite similar. Hibernate still fails. Power
usage is a low 9.0 watts; 8.8 when the "kill beagled" option is selected.
Unfortunately, this lower usage is likely to be a result of the wireless
not working. NetworkManager is able to present a list of access points, but
does not succeed in associating with any of them. This is a device with a
free driver, well supported in the 2.6.25 kernel shipped by openSUSE; its
failure to work is discouraging.
Many of the glitches encountered in this distribution are easily explained
by pointing out that it is a beta release. One can only assume that many
of them will be fixed up before the final version. With that done,
openSUSE has the potential to be a solid system for laptops; many of the
right pieces are there. Your editor, though, will have a hard time
considering an openSUSE installation; that unpleasant EULA has left a bad
taste.
Ubuntu 8.04
Ubuntu made its name partially through its attention to laptop
installations, so your editor had reasonably high expectations from the
"Hardy Heron" long-term-support release. Those expectations were met, for
the most part.
The installation CD did its job, and the resulting system worked well. The
Ubuntu time zone selector deserves special mention, though: it tries to pan
the world map under the mouse, with the effect that the target one is
aiming for moves away as one gets close. It's a video game of sorts, but
it can be a little frustrating, especially with a laptop-style mouse.
Wireless works, but Ubuntu silently installs the MadWifi driver to bring
that about. Suspend and hibernate work, as do the various Thinkpad
buttons. Ubuntu demonstrates some of the same backlight obnoxiousness as
the other GNOME-based distributions - but quite a bit less of it.
This system drew 9.5 watts of power, with 47 wakeups per second. With this
configuration, disabling the wireless in NetworkManager did reduce power
usage considerably - down to 8.1 watts. It would seem that the MadWifi
driver still knows something about powering down the hardware that ath5k
doesn't. Even so, removing MadWifi entirely dropped consumption still
further, to 7.8 watts.
On the 2510p, things generally worked well. Power consumption was 10.1
watts, with an amazing 217 wakeups per second, though. Part of the problem
here appears to be a bug in the i915 driver which causes it to generate a
steady stream of interrupts if the 3D engine is engaged. Ubuntu turns on
Compiz by default, causing the video processor to pound on the CPU.
Turning off "visual effects" cut the wakeup rate considerably. Following
Powertop's advice and disabling the Bluetooth interface as well dropped the
system down to 9.7 watts and 50 wakeups per second.
Here's a table summarizing some of the results reported above:
(Table legend: "Y" = encrypted install option; "N" = no external video.)
The second power number, when present, indicates what is achievable with
minimal tweaking: turning off wireless or letting Powertop shut things
down. More invasive techniques (unloading modules, for example, or
changing kernel boot parameters) are not reflected in the table.
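As a concrete illustration, the "more invasive techniques" mentioned above come down to commands like the following. This is only a sketch: the module and device names are assumptions for the hardware reviewed here, and since the real commands need root, the script just prints what it would do unless DO_IT=1 is set in the environment.

```shell
#!/bin/sh
# Sketch of the manual power tweaks described in the text.
# By default this only shows the commands; set DO_IT=1 (as root)
# to actually run them.
run() {
    if [ "${DO_IT:-0}" = "1" ]; then
        "$@"
    else
        echo "would run: $*"
    fi
}

run modprobe -r ath5k    # unload the wireless driver entirely
run hciconfig hci0 down  # power down the Bluetooth interface
```

After tweaks like these, re-running Powertop shows whether the wakeup count and power draw have actually dropped.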
For the 2510p, the results are:
Two other distributions were tried, but did not make it all the way through
the process:
- Gentoo. Playing with Gentoo has been on the list for years. So an
install disk was downloaded and your editor launched into the "quick
install guide." It is clear that Gentoo employs a rather long
value of "quick." This guide prints over many pages, includes 39
"code listings," requires creating each filesystem by hand, etc. Your
editor would still like to play with Gentoo, but there was no time for
such an exercise now. Life has gotten too short to go through that
kind of obstacle course just to get Linux installed on a computer.
- Slackware. In this case, your editor was able to get through the
somewhat rustic Slackware 12.1 installation procedure. It was kind of
nostalgic to see LILO again. The system ran, and even brought up the
window system, but the system would lock hard as soon as your editor
tried to bring up a terminal window. That, too, was not the sort of
experience which had been sought.
What comes out of all this work is that the Linux community now has a few
good options for laptop-friendly distributions. Getting Linux running well
on a laptop need no longer be an act of advanced wizardry.
That said, there's clearly still room for improvement. Even well-supported
hardware does not always cooperate well. For a laptop system, in
particular, it is important to be able to power down unneeded hardware
without having to dig into the system configuration or unload kernel
modules. If the wireless interface, FireWire port, modem, Bluetooth
interface, etc. are not being used, they should not be drawing power.
After all, if the laptop's user is going to have something to actually
do through a long series of LinuxWorld keynotes, it's important to
stretch that battery as far as possible. Progress has been made, but there
is more to do.
Your editor must now make a choice as to which distribution will remain on
these laptops. For the X31, the choice makes itself: Fedora. It works the
best while installing only free software. One could retrofit a 2.6.25
kernel into an Ubuntu installation to get the ath5k driver, but it's nicer
to not have to do that. For the 2510p, the choice is not quite so clear.
It might, in the end, be Ubuntu for the slightly lower power consumption
and fewer backlight hassles. The potential (not always realized) for
online upgrades might also tip things a little more in the Ubuntu
direction. All of that will have to be traded off against Fedora's
out-of-the-box encrypted installation, though.
But either Ubuntu or Fedora is a fine choice for this machine;
it is nice to be in a position where there are a couple of high-quality
options to choose from.
Page editor: Jake Edge