Many programs - free and proprietary - offer a plugin interface to make it
easy to add new functionality. In many situations, the existence of a
well-defined plugin interface has been a key driver for the success of the
system as a whole; imagine Firefox, for example, without its extension
mechanism. The GNU compiler collection (GCC) is an example of a complex
system which could benefit from such an interface, but which currently
lacks one. GCC developers have been talking about adding a plugin API, but
it is far from clear that this will be done; how this decision goes may
have major consequences for how GCC works with its wider development
community and the free software community as a whole.
GCC is designed as an extended pipeline of cooperating modules.
Language-specific front-end code parses code in a specific source language and
turns it into a generic, high-level, internal representation. Various
optimization passes then operate on that representation at various levels.
At the back end, an architecture-specific module turns the optimized
internal code into something which will run on the target processor. It's
a long chain of modules; at each point in the chain, there is an
opportunity to see the code in a different stage of analysis and processing.
There can be a lot of value in hooking into an arbitrary point in that
chain. Static analysis tools need to look at a program at different levels
to get a sense for what is going on and look for problems or opportunities
for improvement. New types of optimization passes could be added at
specific points, making the compiler perform better. Project-specific
modules could look for problems (violations of locking rules, perhaps) tied
to a given code base. Language-specific modules can provide tighter
checking for certain constructs. And so on.
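The appeal of such hook points can be sketched in a toy model. The Java below is purely illustrative - the pass names, the hook mechanism, and the string-list "IR" are all invented here, and bear no resemblance to GCC's actual internals - but it shows how a plugin could observe or rewrite the code between stages without touching the core compiler:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.UnaryOperator;

// A toy pass pipeline with hook points between stages. Everything here
// is invented for illustration; GCC's real internals look nothing like this.
public class PipelineDemo {
    // Passes run in order; each transforms the intermediate representation (IR).
    private final List<Map.Entry<String, UnaryOperator<List<String>>>> passes =
            new ArrayList<>();
    // "Plugins" register callbacks to run after a named pass.
    private final Map<String, List<UnaryOperator<List<String>>>> hooks =
            new HashMap<>();

    void addPass(String name, UnaryOperator<List<String>> fn) {
        passes.add(Map.entry(name, fn));
    }

    void registerHook(String afterPass, UnaryOperator<List<String>> callback) {
        hooks.computeIfAbsent(afterPass, k -> new ArrayList<>()).add(callback);
    }

    List<String> run(List<String> ir) {
        for (Map.Entry<String, UnaryOperator<List<String>>> pass : passes) {
            ir = pass.getValue().apply(ir);
            // Each hook may inspect the IR at this stage, or rewrite it.
            for (UnaryOperator<List<String>> cb :
                     hooks.getOrDefault(pass.getKey(), List.of())) {
                ir = cb.apply(ir);
            }
        }
        return ir;
    }

    public static void main(String[] args) {
        PipelineDemo pipeline = new PipelineDemo();
        // The core "compiler": one trivial optimization pass that drops no-ops.
        pipeline.addPass("optimize", tokens -> {
            List<String> out = new ArrayList<>(tokens);
            out.removeIf("nop"::equals);
            return out;
        });
        // A "plugin": a project-specific checker hooked in after that pass.
        pipeline.registerHook("optimize", tokens -> {
            if (tokens.contains("lock") && !tokens.contains("unlock")) {
                System.out.println("warning: lock without unlock");
            }
            return tokens;
        });
        System.out.println(pipeline.run(List.of("lock", "work", "nop", "work")));
    }
}
```

Run as written, the project-specific checker prints its warning and the pipeline returns [lock, work, work]; the point is that the checker never needed to know how the "optimize" pass works internally.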
Currently, adding this sort of extension to GCC is not a task for the faint
of heart. The GCC build system is known to be challenging, and GCC's
internal documentation is, one might say, not quite as complete as one
might like. Researcher Alexander Lamaison described it this way:
Out of the 6 months, 4 were spent learning the GCC internals and
fighting the GCC build process, 1 was spent writing up leaving 1
month of actual productive research... I fully understand that
this can seems strange to people who know GCC like the back of
their hand, but to a newcomer it is a huge task just to write a
single useful line of code. I'm sure many give up before ever
reaching that point.
Once they have overcome these problems, developers adding extensions to GCC run
into another problem: if they want to distribute their work, they end up in
the business of shipping a whole new compiler. Brendon Costa, who works on
the EDoc++ GCC extension, noted:
I approached the debian maintainers list with a debian package for
this project to see if they would include it in the official
repositories. It was not accepted and the reason for that is
because it includes another patched version of GCC which takes up
too much disk space. They don't want to accept these sorts of
projects because they all effectively require duplicates of the
compiler.
Both of these problems could be addressed by adding a plugin mechanism to
GCC. A well-defined API would make it relatively easy for developers to
hook a new tool into the compiler without having to understand its
internals or fight with the build process. If an off-the-shelf GCC could
accept plugins, distributors could ship those plugins without having to
include multiple copies of the compiler. Given that we would all benefit from
a more capable GCC, and given the many examples of how other systems have
benefited from a plugin architecture, one would think that the addition of
plugins to GCC would not be a controversial thing.
It seems that one would be wrong, however. In a recent discussion on
plugins, two concerns were raised:
- Adding plugins to GCC would make it easy for people to create and
distribute proprietary enhancements.
- A plugin API would have to be maintained in a stable manner, possibly
impeding further GCC development.
There were also some suggestions that, if the effort put into a plugin API
were, instead, put into documentation of GCC internals, the overall benefit
would be much higher.
The proprietary extensions concern is clearly the big stumbling block,
though. Some participants stated that
Richard Stallman has blocked any sort
of GCC plugin mechanism for just this reason - though it should be noted
that Mr. Stallman has not contributed directly to this discussion. But,
given that GCC remains a GNU project, it is not hard to imagine that
anything which could lead to proprietary versions of GCC would encounter
a high level of opposition.
The attentive reader may have spied some similarities between this
discussion and the interminable debate over kernel modules. The kernel's
plugin mechanism has certainly enabled the creation of proprietary
extensions. In the GCC case, it has been suggested that any plugins would
have to be derived products and, thus, covered by the GPL. This, too, is
an argument which has been heard in the kernel context. In that case,
concerns over the copyright status of proprietary modules have kept them
out of most distributions and, in general, cast a cloud over those
modules. Something similar would probably happen to proprietary GCC
modules: they would not be widely distributed, would be the subject of
constant criticism, and would be an impetus for others to replace them with
free versions. It is hard to imagine that there would be a thriving market
for proprietary GCC extensions, just as there is no real market for
proprietary GIMP extensions - even though Photoshop has created just that
kind of market.
It has also been pointed out that the status quo has not prevented the
creation of proprietary GCC variants. As an example, consider GCCfss - GCC
for Solaris systems. This compiler is a sort of Frankenstein-like grafting
of the GCC front end onto Sun's proprietary SPARC code generator. Back
when Coverity's static analysis tools were known as the "Stanford checker,"
they, too, were a proprietary tool built on top of GCC (the current version
does not use GCC, though). People wanting to do proprietary work with GCC
have been finding ways to do so even without a plugin mechanism.
The GCC developers could also look to the kernel for an approach to the API
stability issue and simply declare that the plugin API can change. That
would make life harder for plugin developers and distributors, but it would
make it even harder for any proprietary plugin vendors. An unstable API
would not take away the value of the plugin architecture in general, but it
would avoid putting extra demands onto the core GCC developers.
In general, GCC is at a sort of crossroads. There are a number of
competing compiler projects which are beginning to make some progress; they
are a long way from rivaling GCC, but betting against the ability of a free
software project to make rapid progress is almost never a good idea. There
is a pressing need for better analysis tools - it is hard to see how we
will make the next jump in code quality without them. Developers would
like to work on other enhancements, such as advanced optimization
techniques, but are finding that work hard to do. If GCC is unable to
respond to these pressures, things could go badly for the project as a
whole. Developer Ian Lance Taylor fears the worst in this regard:
I have a different fear: that gcc will become increasingly
irrelevant, as more and more new programmers learn to work on
alternative free compilers instead. That is neutral with regard to
freedom, but it will tend to lose the many years of experience
which have been put into gcc. In my view, if we can't even get
ourselves together to permit something as simple as plugins with an
unstable API, then we deserve to lose.
Back at the beginning of the GNU project, Richard Stallman understood that
a solid compiler would be an important building block for his free system.
In those days, even the creation of a C compiler looked like an overly
ambitious project for volunteer developers, but he made GCC one of his
first projects anyway (once the all-important extensible editor had been
released). His vision and determination, combined with a large (for the
times) testing community with a high tolerance for pain, got the job done.
When Sun decided that a C compiler was no longer something
which would be bundled with a SunOS system, GCC was there to fill in the
gap. When Linus created his new kernel, GCC was there to compile it.
It is hard to imagine how the free software explosion in
the early 1990's could have happened without the GCC platform (and
associated tool chain) to build our code with.
The vision and determination that brought us GCC has always been associated
with a certain conservatism which has held that project back, though. In
the late 1990's, frustration with the management of GCC led to the creation
of the egcs compiler; that fork proved to be so successful that it
eventually replaced the "official" version of GCC. If enough developers
once again reach a critical level of frustration, they may decide to fork
the project anew, but, this time, there are other free compiler projects
around as well. Perhaps, as some have suggested, better documentation is
all that is really required. But, somehow, the GCC developers will want to
ensure that all the energy which is going into improving GCC doesn't wander
elsewhere. GCC needs that energy if it is to remain one of the cornerstones
of our free system.
The libre-java movement got a huge boost on November 13, 2006, "Java
Liberation Day". On that day, Sun announced its intent to distribute
all the source code for their implementations of Java Standard Edition
(SE, code name OpenJDK),
Enterprise Edition (EE, code name Glassfish) and Micro
Edition (ME, code name PhoneME) as free
software. All are licensed under the GPL, using the classpath
exception clause for selected libraries.
Sun went out of its way to make its commitment to the GPL, with
everything it implies, very clear in an extensive FAQ.
Two things, unrelated to source code, are reserved: the specification and
certification of Java remain with the JCP
(Java Community Process), while Sun
controls the usage of the Java(TM) trademark. Everybody
is free to run, copy, distribute, study, change, share and improve the
source code for any purpose as long as they follow the tit-for-tat,
share-alike, copyleft rules of the GPL for all the code they release.
Sun not only contacted big Free Software names, like Richard Stallman
and Eben Moglen, before the event; the company also made sure that the
existing libre-java communities knew about the plans before it all hit
the press. Jonathan Schwartz, the Sun CEO, himself explicitly reached
out to the existing GNU Classpath, GCJ and Kaffe communities with an offer to collaborate
with the developers making
libre-java on GNU/Linux a reality. The Sun OpenJDK engineers came to
the Fosdem conference to have a DevJam
and share their ideas with the existing libre-java communities.
Sun couldn't liberate all the code at once, but has done so in stages
over the last year for anything it could get all the rights to. At
the same time, Sun has been slowly opening up the internal development process and
switching to a public repository for all the code (Kelly O'Hair keeps
a blog that is
a great case study for moving a large project to Mercurial). Currently
all of the code for Glassfish and phoneME has been released, while about 96%
of the OpenJDK code base is available (almost all of the parts not yet
released are held back because Sun couldn't get the rights from some third
party). Rich Sands gives a great overview of how all this looked
from inside Sun in his one
year retrospective on Java Liberation Day.
Meanwhile, an effort called IcedTea was started to put
OpenJDK into GNU/Linux distributions by moving it quickly towards that
100% free software line. As Andrew Haley explained during the official announcement, it is an experimental build tree that
tries to stay close to the OpenJDK upstream but does not contain any
proprietary binary blobs. Those pieces are either stubbed out or
replaced with code from GNU Classpath. IcedTea can be built using only
free software. In particular, you can use GCJ for the first build phase
to bootstrap those parts that depend on a working java
compiler and runtime. It can be easily built on a modern
GNU/Linux distribution with a simple
./configure && make; or at
least that is the theory. It builds out of the box on Fedora 7 or
8. For Debian
some additional steps are still needed.
Because OpenJDK didn't have an open bug database or (until recently) a
public source code repository, IcedTea has an open public bug tracker and a
Mercurial repository. All discussions about the code are done on the
public OpenJDK distro-pkg-dev mailing list.
IcedTea is what most GNU/Linux distributions ship now for x86 and x86_64 in
addition to GCJ for other architectures. IcedTea also adds some features that
make life easier for distribution packagers. It adds patches so that
the runtime and core libraries link to system shared libraries, like
zlib and libjpeg, which should make security updates simpler. It adds
support for using things like the system wide tzdata files for the
TimeZone utility classes. It also supports using the system
installed certificate authority files for security related checks as
used in the ssl network classes. Thomas Fitzsimmons, who helped with
the above items, keeps a Packager's Wishlist for IcedTea.
IcedTea also contains some bug fixes from people who submitted patches
to the OpenJDK upstream, which haven't been accepted yet, to give them a
wider testing audience. Others have offered alternative backends for
existing packages that make java applications integrate better in a
GNU/Linux system. An example is the GConf backend for the
java.util.prefs package by Mario Torre, which he ported
from GNU Classpath.
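The API that such a backend sits behind is the standard java.util.prefs one. Here is a minimal usage sketch (the class name is made up for illustration, but the Preferences calls are the documented standard API):

```java
import java.util.prefs.Preferences;

// Minimal use of the java.util.prefs API. Applications code against this
// interface; a backend such as the GConf one mentioned above slots in
// underneath it, so the same code stores its settings wherever the
// platform's active backend puts them.
public class PrefsDemo {
    public static void main(String[] args) {
        Preferences prefs = Preferences.userNodeForPackage(PrefsDemo.class);
        prefs.put("theme", "dark");      // stored by whichever backend is active
        prefs.putInt("fontSize", 12);
        System.out.println(prefs.get("theme", "light"));
        System.out.println(prefs.getInt("fontSize", 10));
    }
}
```

The application never learns which backend is in use; swapping the default file-based store for a GConf-backed one requires no changes on this side of the API.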
Lastly, IcedTea is a testbed for porting OpenJDK to other
architectures beyond the currently supported x86, x86_64 and sparc. There are
two different approaches. The first is to start with the Hotspot C++
Interpreter. HotSpot is the runtime of OpenJDK and actually contains
another byte code interpreter by default, the Template
Interpreter, which is a bit harder to port. After requests from the
community, Sun also released its older C++ based interpreter to help
the porting effort. Gary Benson recently made a breakthrough and got
that working for PowerPC (both 32 and 64 bit). He wrote a guide to porting
IcedTea that will hopefully help people porting to other
architectures. Of course the interpreter alone is slow, but the benefit is that you get a full system up and running
that is similar to the existing architectures with full support for
all features. The next
step is to add support for the HotSpot just-in-time (JIT) compilers, which
will be a lot more work.
Another approach is taken by the Cacao team, which is working on replacing
the whole HotSpot runtime with
libjvm.so based on the Cacao JITs, but reusing all of
the core libraries. This now works for s390 and powerpc. Cacao also
supports alpha, arm and mips, so this is an interesting path for
getting a faster port. You will have to replace all of the Hotspot
runtime to get it. The Cacao VM is still missing some features,
like annotations and full debugging support.
There are some other interesting developments around former GNU
Classpath-based projects that are now experimenting with combining
their code and the new OpenJDK and/or PhoneME projects. Dalibor Topic
worked on a Google Summer
of Code project to make the OpenJDK javac compiler a source-to-bytecode
front end for the GCJ native compiler. The MIDPath project combines
elements of SE and ME, plus different backends (SDL, FB, AWT, X11,
Gtk+ and Qt) to provide access to MIDP2, a mobile devices standard used
on many phones for various platforms. JaLiMo provides all of the above
packaged for the Maemo and OpenMoko platforms. JNode is a full operating system written
in java that is now built upon a hybrid of OpenJDK and GNU Classpath
libraries. There is also the IKVM
project that is providing a java implementation, translation and
interoperability framework for .net implementations like Mono.
With all this one could safely say that The Java Trap
has been dismantled. But even with some replacements from GNU
Classpath, IcedTea/OpenJDK is still missing some small parts. The
java.sound implementation isn't complete, but Sun released
the parts that it could. The Java Management Extensions (JMX) implementation is missing SNMP
support. Applets are currently supported through gcjwebplugin, which
has the benefit that there is finally a 64-bit plugin, but it is still
incomplete. There is no support yet for Java Network Launch Protocol (JNLP aka Java Web
Start), although there is netx, which might be
added to IcedTea soon.
So when you put all the above together, what Java version do we get?
Officially what the GNU/Linux distributions ship now isn't Java(TM) since it
isn't certified. And IcedTea comes with the following warning:
IcedTea is derived from OpenJDK, Sun's open-source implementation
of the Java SE platform. At this time the build from which IcedTea
was constructed corresponds to an early build (b21) of JDK 7. When
JDK 7 is complete it will implement the Java SE 7 Platform
Specification. Work on that specification is underway, but far
from final. Any APIs in the JDK 7 implementation, whether new or
old, are therefore subject to minor adjustments, major revisions,
or even outright removal between now and the time that the Java SE
7 Platform Specification is finalized. Please take these facts
into account before depending upon IcedTea.
Red Hat has recently signed the OpenJDK
Community TCK License Agreement. This gives them access to the
test suite that determines whether an implementation that is derived
from the OpenJDK project complies with the Java SE 6 specification. This
will cover the
binary releases that they ship (while the source code will of course still
be available under the GPL). This agreement does, however, contain an NDA
which prevents talking about how compatible the current code is.
The test compatibility kit itself is still proprietary, so it will be
hard to work together with the community at large on this.
One year after Java Liberation Day modern GNU/Linux distributions are
starting to ship with a more complete free java-like stack than ever
before, but there are still some small loose ends to tie up. It was a
learning experience for all communities involved as suddenly the whole
free java ecosystem changed. There was certainly a lot of frustration
about the speed with which things opened up. On the other side, there was
the fact that shipping something purely free seemed more important
than compatibility. But Sun consistently kept its promises about
opening up. The existing libre-java communities learned to respect
and take advantage of the now free reference implementation of the code that they
worked so hard to replace over the
last decade. Now the fruits of collaboration and reusing each other's
code are materializing. In less than another year, it will be common to
have a full, free, java stack, either SE, ME or EE, wherever you
find GNU/Linux running.
[ Mark Wielaard has been doing libre-java development since 1999. He is
also the maintainer of GNU Classpath. He is currently employed as an
engineer by Red Hat, working on non-java-related projects. ]
Long-time Red Hat employee and Mozilla contributor Christopher Blizzard
recently took a new job as a member of the Evangelism team at Mozilla
Corporation. Just settling in – he started just over a week ago
– he graciously
agreed to be interviewed. His answers provide a look at
evangelism at Mozilla, what his role will be, along with a bit of a
retrospective on his days at Red Hat.
LWN: What does it mean to be on the Evangelism team at Mozilla?
What kinds of things does the team do?
The Evangelism team at Mozilla has quite a few roles to play. We
handle a lot of external communications, including some amount of
press, although there's a separate press team that handles most of
that. Sometimes it's tactical – responding to a bad blog entry or
press result and repairing misleading facts or conclusions when that's
possible. But it's strategic as well in the sense that we try and
understand and compose the story of Mozilla into a consumable form and
then try and tell that to the world. Mozilla has a good story to tell
and helping people understand who we are and what we're doing opens
doors both for us and for others. It's an important role in any
organization, but especially ours.
We also do a bit of internal communications facilitating as well. As
the project and the company grow past certain sizes that kind of
"internal understanding" role is going to be more and more critical.
Organizations that don't understand themselves get into trouble in
pretty short order. We're around to keep that from happening as best
we can. Think of it as guarding the culture as new people come on
board.
We also do a lot of technical evangelism. Just as examples,
you can see that work in Mark Finkle's weblog where he does a lot of
work describing add-ons and extensions and what's going on in that
area. John Resig also does a lot of work telling the story of our
technology. Asa Dotzler and Seth Bindernagel do a lot of work with the
community directly, while Deb Richardson and Eric Shepherd both do a lot
of work on documentation and external communications.
We're a diverse group with a lot of different roles, but that's the
nature of the audiences we face.
LWN: What are your specific near-term tasks as part of the team?
My role is really to figure out how to work with
other open source projects and help them figure out how to properly
leverage what we're doing in Mozilla. This includes organizational
development. For example, we are a non-profit, public benefit organization
that acts like a business. But we are also an open source project with a
very active non-corporate contributor base that targets a consumer market and
has a strong product focus and over 100 million users - we would love to
see that replicated in other places as well and we would love the chance to
teach others how to do it.
We also share a common alignment with a huge number of other open
source projects. Open source projects require an open playing field
to build on. These usually take the form of "open standards" and
right now form the basis of the web that we see today, and many of the
forms of communication used on the Internet. HTTP, HTML, CSS, SMTP,
Jabber, etc. These were standards that anyone was able to implement
and saw an explosive growth in use as a result. But these systems are
under attack by companies like Microsoft and Adobe, attempting to
replace them with a proprietary platform under the control of a single
company. These represent an attack on the web itself, and should be
taken very seriously. Remember that the web is still the killer app,
and keeping that open and protected is paramount.
Our role has to be larger than just delivering a browser that normal
people can use. We also have to push the web itself forward to make
sure that it remains competitive against other platforms and is the
platform of choice for development. This means evolving the languages
that the web uses, adding capabilities to the browser itself (video,
audio, canvas, SVG, others) and then putting it into a consumable
package that people love to use on the widest possible set of platforms.
That's the context for my role. What I will try and do is to make
sure that well-aligned open source projects understand this story and
know what we're doing on the ground so that we can help them and they
can help us. Mozilla has a huge footprint of users and we want to
make sure that other open source projects know how to take advantage
of that. What this actually means in terms of actions is still
something that's being figured out but we'll start to see movement in
the next few weeks.
There are some easy first steps. First steps include helping with our
embedded and mobile stories (where open source and Linux have a lot of
leverage) and sparking discussions on performance and footprint. I
will also continue with my Linux role and act as a contact for the
Linux distributions that are shipping Firefox and Mozilla technologies.
LWN: Are there specific ways that you will be using your Red Hat and
Linux background in your new job?
I was at Red Hat for nearly 9 years, and I am a vastly changed person
as a result of that time, so of course! But a lot of the work that I
did there will be used indirectly. I learned a lot of things at Red
Hat. I learned to think strategically, how to build and run teams,
what makes successful projects and a nearly rabid devotion to building
products that really help the people who are using them. How
important design and a user focus is to building a successful
product. The incredible importance of brand in the development of
products and how to build messaging around that. I learned some
important lessons about how to mix business and open source projects.
That recognizing companies is an incredibly important part of making
many projects successful and cultivating those relationships can
create wins on both sides. I learned that open source itself isn't a
business model, but requires a different way to think about how you
deliver value to people and organizations. But the most important
thing that I probably walked away with was that the most important
factor that seems to make an organization successful (aside from
having a market to work in!) is the people that you bring to the
table. Red Hat was filled with wonderful people, and still is. I
take that lesson to heart in choosing the people around me and who I
will try and bring to Mozilla as well. It's incredibly important.
LWN: Will you still be involved in OLPC development? In an official
capacity as part of your job with Mozilla? What kinds of things will you be
doing?
I certainly won't be involved to the same level I was. I had day to
day involvement in the software development and design process and
what OLPC will deliver to the public is something that I both deserve
much of the credit and much of the blame for. There are still some
touch points with OLPC from time to time, largely around the browser
that's included on the OLPC machine, but I'm not that involved with
them right now.
LWN: What led to deciding to leave Red Hat after so many years?
The answer to that question is complicated and deeply personal. Some
small part of it was just employment diversification. I had been
there for 9 years, which is a huge portion of my young life, and I
felt like I should experience something else.
But it was also a question of leverage for me. Red Hat is an
enterprise company. They are doing well in that market and are doing
a great job of developing value for that customer segment and bringing
the story of open source along with them. They are a committed
organization and are well-liked and continue to make good decisions.
But they aren't going to be able to change the world from the back
office and have become far too conservative over the years to go out
and touch human beings directly. And they aren't going to be large
enough any time soon to be able to have credible experiments to grow
into a "consumer" market. I learned that the hard way with both OLPC
and Mugshot and I decided it was time to go somewhere that would let
me affect a lot of people and create some leverage for open source.
Mozilla is that place.
But, really, that question should be "why Mozilla?" Mozilla is still
a pretty small organization, less than 150 people. I feel like I have
a lot to bring to the organization, both based on my experience inside
of Mozilla (which is longer than my Red Hat experience!) and from my
work at Red Hat. Mozilla is trying to create change in the world in a
very real way by touching people directly, and doing so using open
source methodologies under the umbrella of a public benefit company. They have a
fantastic team from the top to the bottom of the organization, want to
compete and want to keep their users winning by creating a great
product that protects the web. It's unique in the market, and I hope
to do my part to keep it that way.
LWN: What things do you look back on that you did at Red Hat that
you are most pleased with? Are there things you would have done
I've always been pretty happy with the desktop work that happened at
Red Hat. These days Ubuntu gets most of the credit for the Linux
Desktop but it was _clearly_ Red Hat (and Ximian/Novell!) that did
much of the heavy lifting to get it from where it was in the mid-1990s
to where it is today, and that is still the case. GNOME in particular
would have not gotten past the stages of its infancy without Red Hat's
early involvement, and I like where it's gotten today. (Anyone else
remember GNOME 0.27?) I had a part to play there with my early
Mozilla development work, which I very carefully selected for that
reason, and I think that helped things get on the right track. It was
a good decision and I really like the outcome.
The OLPC work was quite satisfying, not only because of the
proposed effect - helping millions of poor kids gain a foothold in the
world of knowledge that I take for granted every day - but also
because I was able to take that decade of knowledge of what was broken
in computing and apply it. A lot of what you will see in OLPC is a
result of that, sometimes with me acting as the guy driving ideas and
sometimes acting as a champion for the ideas of others. I can take
some solace that even though I'm not working on it today, a lot of
that work I can point at as my own. I hope that many of the other
team members feel the same way.
But I made a huge number of mistakes. I passed on some very early
leadership opportunities where I probably could have made an earlier
difference and pushed things harder. Failing to grow out of an early
engineering position and thinking about wider issues around teams,
users, strategy, markets and business. I should have done that
earlier and I think that if I had, we would all be a little better
off.
But on reflection, it would be hard to point to many things that I
would want to change. It's still a good company and I am happy to
have worked there.
Our thanks to Christopher for taking time to respond, especially during
his probably rather busy first week on the job.
Page editor: Jonathan Corbet
Inside this week's LWN.net Weekly Edition
- Security: Fingerprint recognition using fprint; New vulnerabilities in apache2, MySQL, openssh, samba...
- Kernel: sys_indirect(); Supporting electronic paper; PID namespaces.
- Distributions: JeOS, Mamona, Online Desktops and CBI; Fedora 8 Everything Spin; CAELinux; Interview with Dan Walsh
- Development: A test run of Firefox 3,
KDE 4.0 RC1, History and Future of OLPC SimCity, Art Tablets for Krita,
new versions of Samba, Open Computer Forensics Architecture, Segue CMS,
Shared Registry System, Audacious, Audacity, XMMS2, Equinox, GNOME,
GARNOME, BZFlag, UniConvertor, Claws Mail, nova, GNU CLISP,
- Press: Code monkey's guide to cryptographic hashes, another Year of the Linux Desktop,
MontaVista partners with Movial,
what's really going on with Oracle Linux, Dell selling Ubuntu-based servers,
Wal-Mart to restock Linux PCs, Swedish police arrest Dan Egerstad, History
of open-source Java, Archos 604 review.
- Announcements: New BusyBox lawsuits, EFF Wins Internet subdomain patent reexamination, EFF on telecom spying,
GNU Affero General Public License v3, GPL compliance case against Iliad,
Coverity scans Java, Nagios and GroundWork partner, Mandriva Linux product of the year award,
LAC2008 cfp, OLS cfp, Velocity Conf cfp, PyCon 2008, User Friendly turns ten.