The 2010 edition of linux.conf.au was held on January 18 to 22 in
Wellington, New Zealand. A number of the talks from this event have been
covered elsewhere on LWN, with more to come; this article will talk about
several other sessions and your editor's impressions of the conference as a
whole. In brief: it was a highly successful event which easily lived up to
the high standards set by LCA.
One often goes to conferences to see the speakers perform. It's a rare
event, however, which gets them up on stage together to do a Maori war
dance. The speakers' dinner on Tuesday night featured plenty of good food,
"Fiasco" wine, and a group which gave lessons on how to do the Haka (which only coincidentally
sounds a lot like "hacker"). Much noise was made, much fun was had, and,
much to the participants' chagrin, videos were made.
Benjamin Mako Hill presented the Wednesday morning keynote. He started off
with a discussion of the open source/free software divide, noting that he
is very much in the free software camp. The open source side, he said,
emphasizes practical benefits, whereas freedom has inherent benefits.
The rest of his talk was dedicated to one specific benefit (a rather
practical one, in your editor's opinion) that comes with free software:
freedom from antifeatures.
Antifeatures are behaviors added to proprietary software as a way of
exerting some sort of control over users. It can be a simple matter of
extracting money from users - requiring them to pay more to have
advertising or spyware features removed, for example. It can be a matter
of market segmentation; see, for example, the several versions of Windows
Vista offered by Microsoft or the removal of raw image support from some
Canon cameras. Vendors may be trying to secure monopolies; software which
detects third-party batteries in devices and disables the power-saving
features is an example. "Protecting copyrights" is another; there are, he
says, no Facebook fan clubs for dongles or for the unskippable tracks at the
beginning of DVDs.
In all of these cases, the cited behavior works against the interests of
the people actually using that software; these features are not something
that users have requested. They are all also features which are entirely
unsustainable in the free software world. Even if a free software project
were to implement this sort of antifeature - something which happens rarely
- others will quickly disable it; see the Okular cut-and-paste story
for an example. Software
freedom means the freedom to remove functionality we don't want.
Mako has set up a wiki site
where he is collecting interesting examples of software antifeatures.
How can we make a community which is more welcoming? Matthew Garrett
addressed this question from a number of viewpoints, without necessarily
coming to a lot of conclusions. The problem, he says, is that, as a
community, we tend to be hostile - even if truly unprovoked aggressive behavior
is relatively rare. We tend to value code over everything else, and we
value technical excellence above behavioral excellence. The result is that
the community is not terribly functional as a whole; it has not gained the
behavioral standards that one would normally associate with a community,
and we're getting big enough that we really need to do something about it.
In general, we don't hate each other; we can get together at conferences
and not punch each other in the face. It has only happened to him once at
LCA, Matthew says, and he deserved it.
So what do we do? Codes of conduct can help, but only if we are willing to
enforce them. We need to decide whether we are willing to tolerate
poisonous people if they are technically strong enough. There should be a
greater willingness to point out unacceptable behavior; Matthew would
especially like to see respected community members doing more of this.
What works best, though, may be the simple power of positive examples.
Glyn Moody's keynote focused on the power of sharing, and how ideas from
our community have spread out and influenced the wider world. For
example, consider open access to scientific results, which have been
increasingly bottled up by the publishing industry. The ArXiv.org repository was announced within a
week of when Linus announced his first kernel release; since then, open
access has become an increasingly strong force in the scientific community.
Related to that was the race to completely sequence the human genome. A
company called Celera was a late entry with a scary agenda: sequence the
genome, then patent as much of it as possible. In the end, though, a lone
hacker named Jim Kent was able to bash out a system which solved the
problem first, using a 200-system Linux cluster. He won the race by a few
days and put the results into the public domain, heading off the patent
threat.
Project Gutenberg - which predates Linux by some years - is an interesting
example. Despite having significant resources, this project only had ten
books online by 1991. By 1997, though, that number had expanded to 1000.
The spread of the Internet clearly helped in this regard, but a wider
understanding of the importance of freely-available information also
played a part.
Sharing is moving into a number of other realms; Glyn described sites like
Facebook and Twitter as simply a means for the sharing of lives. Openness
is also moving into government - to an extent. The use of a Creative
Commons license for the content on the Change.gov site was a clear sign that things
are changing. Still, things are not really open; it's the traditional
power structure with a bit of data released - "shared source government."
The final part of the talk went rather far afield into the areas of climate
change, environmental problems, and the financial crisis. In the end, Glyn
said, these problems are all the result of a failure to share. Our
community, he said, has shown how sharing is done, and we've exported that
knowledge widely. Now we need to find a way to apply it to these larger
problems. That is quite the challenge; your editor can't wait to see the
patches that result.
Andrew "Tridge" Tridgell is concerned about a different threat: patent
attacks on free software. These attacks, he fears, are only going to
become more common; the community as a whole needs to learn how to defend
itself. Patent defense, Tridge says, begins with the developers.
To that end, developers should learn how to read patents, a process which isn't
obvious from the outset. Many developers have come to the conclusion that
looking at patents can be dangerous - triple damages for willful
infringement and all that. Tridge's point is that most free software
projects cannot withstand even single damages. There is no point in
worrying about a triple death when a single death is enough. So, rather
than walking through the minefield with a blindfold on, it's better to take
the blindfold off and step around the mines.
There are three ways to defend against patent claims. Developers tend to
turn to prior art, but that is a difficult and dangerous way to go;
establishing prior art can be much harder than most people expect.
Invalidating patents is even worse; that can almost never be done
successfully. The best defense, he says, is finding ways to not infringe
on the patent in the first place. The cost is low, the certainty is
higher, and it can lead to a stronger defense for free software in
general. Non-infringement, normally, is achieved through a combination of
careful reading of the patent and the crafting of workarounds where needed.
The problem is that the GPL requires broad licensing of patents; if a
patent is not licensed for all users of the code, that code cannot be
distributed. There are good reasons for this requirement, but it also can
make us into an attractive target: a company which wishes to settle a
patent suit cannot stop with buying a license for itself; it must buy a
license for the entire community. That's the sort of situation which makes
patent trolls dream of dollar signs.
The situation changes, though, when we find an effective workaround for a
patent. That workaround essentially invalidates the patent, eliminating
the threat. When proprietary companies find workarounds, they tend to keep
them to themselves; there's no point in helping their competition avoid the
payment of royalties. In the free software world, though, we can
distribute workarounds broadly, to the point that proprietary software
companies can pick them up too. That will kill the value of the patent
entirely, drying up any associated revenue stream. After a few episodes
like that, the free software community will
look like the "toughest, meanest kid on the block," and patent trolls will
be inclined to leave us alone.
Workarounds must be done rigorously, though, with help from lawyers. That
is a challenge: the legal community is not known for open sharing of
information on topics like this. We need a forum where engineers and
lawyers from competing companies can talk openly about patents, but such a
forum does not yet exist.
Josh Berkus updated attendees on the state of PostgreSQL; it is, he
says, an exciting time for the project. He started by announcing that the
upcoming release will be named 9.0, not 8.5 as had been previously
expected. That's because this release contains a number of features which
they hadn't thought would be ready by now; these include hot standby,
streaming replication, a 64-bit Windows port, the new DO
statement, and more. The dot-zero number also reflects the fact that some
of these features "might not work perfectly" in this release.
The PostgreSQL development process has changed in the last couple of years
in response to the difficult 8.2 cycle which dragged out for six months
longer than anybody had expected. It has proved difficult to manage
committer and reviewer time for PostgreSQL. The way it works now is that,
every other month, the project enters a "commitfest," at which point the
outstanding patch queue is emptied. Patches may be merged, rejected, or
deferred, but, in any case, some sort of disposition is decided upon. This
process helps to ensure that patches move through the system, it allows
contributors to see which patches are stalled and why, and it should help
to train new reviewers and committers for the future.
The final commitfest for 9.0 goes through the end of January; after that
the project goes into stabilization mode, with the final release expected
sometime around June or July.
One widely-anticipated feature for 9.0 is hot standby. This feature works
by taking the transaction logs from the primary database server and copying
them to one or more standby systems. Those systems fold the logs into
their copy of the database. The result is that the backup systems may be
slightly behind the primary database, but they stand ready to take over at
any time. While they are in standby mode, they are able to handle
read-only queries, helping to distribute the load somewhat.
A related new feature is streaming replication. It aims to solve the same
problem as hot standby, with some changes: streaming replication is
for sites which are concerned about never losing any data, want minimal (as
in a few seconds) downtime should a failover be necessary, and which are
less concerned about multi-node scalability. Such sites can set up
replicated servers which receive transaction log data almost immediately
after each transaction completes. The replicated servers are thus very
close to the state of the primary server. The feature works, though Josh
notes that administering it is a bit awkward in 9.0.
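As a rough sketch of how the pieces fit together (parameter names as documented for PostgreSQL 9.0; the host name and user are hypothetical), a primary/standby pair combining hot standby with streaming replication might be configured along these lines:

```
# postgresql.conf on the primary (PostgreSQL 9.0)
wal_level = hot_standby         # log enough detail for standby-side queries
max_wal_senders = 3             # allow streaming connections from standbys

# postgresql.conf on the standby
hot_standby = on                # accept read-only queries during recovery

# recovery.conf on the standby
standby_mode = 'on'
primary_conninfo = 'host=primary.example.com port=5432 user=replicator'
```

The standby continuously replays the WAL stream it receives over the primary_conninfo connection while serving read-only queries, which is what makes the near-immediate failover described above possible.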
The "explain" feature has been enhanced in 9.0. In addition to the
semi-human-readable version that PostgreSQL has used for some time,
"explain" can now output its results in XML, JSON, or YAML format. This
change is meant to make it easier for graphical frontends to interpret the
output, but developers are starting to discover that some of the formats
(YAML in particular) are easier to read than the classic format.
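The machine-readable formats are indeed easy to consume. As an illustration (a sketch only: the plan below is a trimmed, hand-written stand-in for real EXPLAIN (FORMAT JSON) output, so the exact field set may differ from what a 9.0 server emits), a graphical frontend might flatten the nested plan tree like this:

```python
import json

# Hand-written stand-in for EXPLAIN (FORMAT JSON) output: a list with one
# entry per statement, each holding a nested "Plan" tree.
explain_output = """
[
  {
    "Plan": {
      "Node Type": "Hash Join",
      "Total Cost": 230.47,
      "Plans": [
        {"Node Type": "Seq Scan", "Relation Name": "orders",
         "Total Cost": 145.00, "Plans": []},
        {"Node Type": "Hash", "Total Cost": 70.10,
         "Plans": [{"Node Type": "Seq Scan", "Relation Name": "customers",
                    "Total Cost": 55.00, "Plans": []}]}
      ]
    }
  }
]
"""

def walk(plan, depth=0):
    """Flatten the nested plan tree into (depth, node type, cost) rows."""
    rows = [(depth, plan["Node Type"], plan["Total Cost"])]
    for child in plan.get("Plans", []):
        rows += walk(child, depth + 1)
    return rows

rows = walk(json.loads(explain_output)[0]["Plan"])
for depth, node, cost in rows:
    print("  " * depth + node + "  (cost=" + str(cost) + ")")
```

Pulling fields out of a parsed structure like this is far less fragile than scraping the classic text format, which is precisely the motivation Josh described.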
Finally, Josh talked about the project's upcoming transition to git for its
source code management. They are hoping to free themselves of CVS in the
next development cycle, but a couple of developers are still dragging their
feet. It seems that this little problem will be overcome sooner or later.
Meanwhile, the PostgreSQL project appears to be in good shape and getting
stronger.
In conclusion: LCA 2010 was a busy and interesting event. Your
editor's main grumble was that the schedule was so full of useful talks
that he never got to go out and enjoy the beautiful, sunny weather which
only occurred while the conference was in session. LCA retains the things
that make it special: interesting talks on a wide variety of topics, a
unique mix of people, lots of fun, and a generally friendly atmosphere.
Also notable was the presence of more women than at any other event your
editor has ever seen - and the fact that nobody even felt the need to
comment on it.
Even an article of this length - along with the other half-dozen LWN
articles coming from this conference - cannot cover all of the interesting
things that happened there. Also noteworthy were Selena Deckelmann's
lightning talk on using free software to help overturn a rigged African
election, Gabriella Coleman's keynote on free software culture, Patrick
Brennan's talk on Albany Senior High School, which abruptly
shifted to Linux in 2009, Joel Stanley's push for hardware designed
explicitly to run free software, and, needless to
say, the traditional Penguin Dinner, even if memories from that particular
event tend to be a bit fuzzy.
LCA 2010 organizers Andrew and Susanne Ruthven are to be commended on their
stewardship of this venerable event. LCA might not have been in Australia
this year, but they managed to keep all that makes LCA worthwhile while
bringing it to an interesting new venue. For added fun - since organizing
a conference like LCA is evidently not enough work on its own - they also
threw having a baby into the mix and still kept everything together (with a
lot of help from the rest of the organizing team, needless to say). They
are probably more than ready to pass the baton on to next year's organizing
team, which announced that LCA 2011 will return to Brisbane, Australia,
probably in early February.