Maps are cool; there's no end of applications which can make good use of
mapping data. There is plenty of map data around, but it's almost
exclusively proprietary in nature. That makes this data hard to use with
free applications; it's also inherently annoying. We, as taxpayers, own
those streets; why should we have to pay somebody else to know where they go?
Your editor likes to grumble about such things; meanwhile, the OpenStreetMap project (OSM) is busily
doing something about it. OSM has put together a database and a set of
tools making it easy for anybody to enter location data with the intent of
producing a free mapping database with global coverage. It is an ambitious
project, to say the least, but it's working:
Right now on each and every day, 25,000km of roads gets added to
the OpenStreetMap database, on the historical trend that will be
over 200,000km per day by the end of 2009. And that doesn't include
all the other data that makes OpenStreetMap the richest dataset available.
OSM data is not limited to roads; just about any point or
track of interest can be added to the database. If current trends
continue, OSM could well grow into the most extensive geolocation database
anywhere - free or proprietary. And those trends could well continue; one
of the nice aspects of this kind of project is that no particular expertise
is needed to contribute. All you need is a GPS receiver and some time; some OSM
local groups have even acquired a set of receivers to lend out to
interested volunteers. This is our planet, and we can all help to map it.
All this work raises an interesting question, though: under what license
should this accumulated data be distributed? Currently, the OSM database
is covered by the Creative Commons
Attribution-ShareAlike 2.0 license. It is a copyleft-style license,
requiring that derived products be made available under the same license.
So, for example, if a GPS navigator manufacturer were to include an
enhanced version of the OSM database in its products, it would have to
release the enhanced version under the CC by-SA license.
The OSM project is not happy with this license, though, and is looking to
make a change. The attribution requirement is ambiguous in this context;
do users need to credit every OSM contributor? Does making a plot of OSM
data with added data layered on top create a derived product? But the
scariest question is a different one: can the CC by-SA license cover the
OSM database at all?
Copyright law covers creative expression, not facts. The information in
the OSM database is almost entirely factual in nature; one cannot copyright
the location of a street corner. So what OSM is trying to protect is not
the individual locations, but the database as a whole. Copyright law does
allow for the protection of databases, but that law is far more complex
than the law for pure creative works, and it varies far more between
jurisdictions. Europe has a specific (though much-derided) database right,
the US has far weaker
database protections, and other parts of the planet lack this
protection altogether. So it may well be that, if some evil corporation
decides to appropriate the OSM database for its own nefarious, proprietary
purposes, there will be nothing that the OSM project can do about it.
So the project is thinking of making a switch to the Open
Database License (ODbL), which is still being developed. It, too, is a
copyleft-style license, but it is crafted to make use of whatever database
protection is available in a given jurisdiction. To that end, the ODbL is
explicitly structured as a contract between the database owner and the
user. In any jurisdiction where database rights are not recognized under
copyright law, the
contractual nature of the ODbL should provide a legal basis to go after violators.
But the use of contract law muddies the water considerably; there are good
reasons why free software licenses are carefully written to avoid that
path. Contracts are only valid if they are explicitly and voluntarily
entered into by all parties. If the OSM project cannot show that a license
violator agreed to abide by the license, it has no case under contract
law. The project has
a plan to address this problem:
To ensure that potential users are aware of and agree to the
contract terms, we are proposing to require a click-through
agreement before downloading data. (All registered users would
agree to this on signing up so will not need a further
click-through on each download.)
Registration and clickthrough licensing are obnoxious, to say the least.
But, in any case, the only people who will go through that process are
those who obtain the database directly from OpenStreetMap. The ODbL allows
redistribution, naturally, and it does not require that explicit agreement
be obtained from recipients of the database. So it is hard to see an
outcome where copies of the database lacking a "signed" contract do not
proliferate. Additionally, reliance on contract law makes it
very hard to get injunctive relief, weakening any enforcement efforts.
The ODbL includes an anti-DRM measure; if a vendor locks down a copy of the
database with some sort of DRM scheme, that vendor must also make an
unrestricted copy available. The license also tries to distinguish between
"collective databases" (which are not derived works) and "derivative
databases" (which are). Drawing layers on top of an OSM-based map is a
collective work; tracing lines from such a map is a derivative work. It
is, in general, a complex bit of work.
It is complex enough that a number of OSM contributors are wondering if
it's all worth it. Jordan Hatcher is one of the authors of the ODbL, and
he supports its use with OSM, but even he understands the concerns that some people have:
The [Science Commons] point is that all this sort of stuff can be a
real pain, and isn't what you are really doing is wanting to create
and manipulate factual data? Why spend all the time on this when
the innovation happens in what you can do with the data, and not
with trying to protect the data in the first place.
There is an active group within OSM which is opposed to this kind of
licensing and would, in fact, rather just get down to the task of
collecting and distributing the data. They express
themselves in terms like this:
One thing I really love about OSM is the pragmatic, un-political
approach: You don't give us your data, fine, then we create our own
and you can shove it.
Not: You don't give us your data, fine, then we create a complex
legal licensing framework that will ultimately get you bogged down
in so many requests by prospective users who would like to use our
data and yours but cannot and you will sooner or later have to
release your data according to the terms we dictate and then we
will have won and the world will be a better place.
These contributors would rather that OSM release its data into the public
domain - or something very close to that. Rather than put together a
complicated license, they prefer to just publish their data for anybody to
use as they see fit. There have been all of the usual discussions,
resembling every "GPL vs. BSD" licensing flame war one has ever seen - except
that the OSM folks appear to be a very polite crowd. It comes down to the
usual question: will the OSM database become more complete and useful if
those who extend it are forced to contribute back their changes?
The public domain contingent clearly does not believe that any improvements
to the database obtained via licensing constraints will be worth the
trouble. So it seems likely that there will be some sort of fork involving
the creation of a smaller, purely public-domain OSM database. It may well
be an in-house fork, with the public domain data being merged into the
larger, more restrictively licensed database for distribution. Regardless
of how that goes, this split raises issues of its own: how are the two
databases to be kept distinct in the face of cooperative additions and edits?
Any relicensing of the database also brings up another interesting
question: what to do about all of the existing data, which may or may not
be copyrighted by those who contributed or edited it? The license change
may well require a process of getting assent from all contributors and
purging data obtained from those who do not agree. This
proposed timeline shows how the project is thinking about working
through this task. It is hard to imagine this process going entirely smoothly.
The OSM community clearly has a set of thorny issues to work out. Given
that, it's not surprising that this process has already been dragged out
over the better part of a year. How this issue is eventually resolved will
certainly serve as an example - not necessarily a good example - for other
projects working on free compilations of factual data.
Let us hope that OSM can come to a
solution which lets this project continue to grow and generate a valuable
database that we all will benefit from.
Comments (46 posted)
The news that Wikipedia was in the process of switching away from Red Hat and
Fedora—and to Ubuntu—has stirred up some Fedora
folks. The relatively short, 13-month support cycle for Fedora releases
was fingered as a major part of the problem in a gigantic thread
on the fedora-devel mailing list. Some would like to see Fedora be
supported for longer, so that it could be used in production environments,
but that is a fundamental misunderstanding of what Fedora has set out to
do.
The idea of supporting Fedora beyond the standard "two releases plus one
month", which should generally yield 13 months, is not new. It was, after
all, the idea behind the Fedora Legacy
project. Unfortunately, Fedora Legacy ceased operations at the end of
2006, largely due to a lack of interested package maintainers. So, calls for a
"long term support" (LTS) version of Fedora are met with a fair amount of skepticism.
Just such a call went up in response to the Wikipedia news. Patrice Dumas
outlined the need:
[...] it seems to me that a true Fedora LTS is
missing, that would allow those who want things that are new, including
for testing but cannot afford changing everything each year (servers for
example or user desktops). It seems to me that fedora ends up being used
almost exclusively as single user desktop, so that testing of other
functionalities is likely to be less widespread.
Fedora is not meant for production use, nor for those who cannot upgrade at
least yearly. It has an entirely different mission, which
Jon Stanley sums up:
Well, in all fairness, Fedora's stated goal is to advance the state of
free software. You get that by being bleeding-edge. Unfortunately,
being bleeding edge also means not being suitable for production
environments - these are two fundamentally incompatible goals. This is
why Red Hat Linux split into two - Fedora and RHEL. RHEL is a
derivative distribution of Fedora.
Many believe that folks who want "Fedora LTS" would be better served by Red
Hat Enterprise Linux (RHEL) or, for those that do not want to pay for a distribution with
support, an RHEL
derivative such as CentOS or Scientific Linux. But those don't have the
package diversity available with Fedora. A stable release would also need
to freeze major packages at a particular version—only backporting
security fixes into that version—which is definitely not what is done
with Fedora while it is being supported. Dumas wants to see something that
finds a middle ground:
Fedora legacy (or fedora lts) would not be the same than centos. Maybe a
Centos + repository with more recent stuff would be, but currently I
think that there is something in the middle between fedora and centos
that is missing.
The Extra Packages for
Enterprise Linux (EPEL) project is meant to help fill that gap, by
maintaining additional packages—beyond what Red Hat
maintains—for RHEL and compatible distributions. Typically, though,
those packages will also be held at a version level that will, with time,
grow rather obsolete, at least to those who want to more closely follow the
upstream project. And, of course, there aren't as many packages available
for the enterprise distributions, even with EPEL, as there are for Fedora.
This would seem to be the classic tension between "bleeding edge" and stable as
described by Stanley. Though it isn't clear how it would solve that
problem, there are calls for reviving Fedora Legacy. There are few opposed
to the idea of continuing Fedora support—if enough people can be
found to do it—but the implementation details seem to bog things down.
There is a bit of a "chicken and egg" problem in that attracting package
maintainers is hard to do without a project to point to, but convincing the
Fedora Engineering Steering Committee (FESCo) that it is worthwhile without
having those maintainers will be difficult.
One of the sticking points is the availability of
infrastructure—servers and bandwidth primarily—for any nascent
legacy project to use. The Fedora board is seen as being resistant to
allowing the use of the Fedora infrastructure for such a project. In
response to someone who pointed out that the board's approval is not
required, Dumas disagrees:
When it requires cooperation with the infrastructure, it does. It is
also possible to start something external like rpmfusion, but the amount
of work is very big. My proposal only made sense if the economies of
scale realized by working inside the fedora project were realized.
Still, if somebody provides the infrastructure, sure I'll try to help
with a project similar than the one I proposed, but I cannot myself do
anything for the infrastructure part.
There is also the question of what kind of guarantees a legacy project
would make about how long it would support older releases. Dumas and
others seem to be in favor of essentially no commitment: maintainers would
continue supporting their packages for as long as they wished. While
there is some attraction to that idea—it certainly reduces the number
of maintainers required—it is unclear that it actually provides a
useful service. The idea that some security fixes are better than none is
attractive, but David Woodhouse cautions against that view:
If we present the _appearance_ of a distro with security updates, while
in fact there are serious security issues being unfixed, then that is
_much_ worse than the current "That distro is EOL. Upgrade before you
get hacked" messaging.
For anything to have the Fedora name on it, it _must_ have guaranteed
security fixes for at least the highest priority issues.
As the original Fedora Legacy project wound down,
it left just this kind of impression by promising support,
but often not delivering it. For several years, updates
for serious security problems were delivered late, if at all. Any new
effort in that
direction would have to be very clear about what it was delivering
and how it planned to get the job done.
A project that offered few, if any, guarantees would not be seen as
something very useful, but making guarantees that don't get met is far worse.
While there are clearly Fedora users that would be interested in hanging on
to their operating system for longer than one year, it isn't clear that there
are enough of them—and, more importantly, enough maintainers—to
make a legacy project successful. Agreement on the goal of the project,
along with the promises it would make to adopters, is important. It is
difficult to see how the Fedora powers-that-be could allocate resources to
such a project without those things. As Shmuel Siegel points out:
You are looking for
infrastructure support from Fedora without indicating that there is a
benefit to Fedora. Supply without demand is no more useful than demand
without supply. Since Fedora views itself as "the cutting edge distro",
you have an uphill PR fight. Give the Fedora project a reason to spend
some of their limited resources on you. At least let them know your
target audience and why they would be interested.
At least at this point, it doesn't seem like a revival of Fedora Legacy is
in the cards, which leaves the problem unaddressed. Perhaps adding enough
additional packages to EPEL will allow CentOS to truly become "Fedora
LTS". It should be noted that, while the original concern that LTS users
might be switching to Ubuntu could well be valid, Ubuntu LTS doesn't have a
solution to the problem of package versions slowly becoming obsolete either.
Newer packages and
stability are fundamentally at odds—trying to solve that problem is
probably far too large of a job for any community distribution.
Comments (114 posted)
Like many communities, the Linux community depends heavily on conferences
as a way to help our developers and users know each other and work well
together. We make highly effective use of electronic communications, but
there is truly no substitute for occasionally getting together, sharing a
beer or three, and engaging in some high-bandwidth discussion. So it
stands to reason we want our events to be as productive and useful as
possible, especially given the expense of participating in them.
Your editor recently had the good fortune of attending, over the course of one
week, two conferences which are arguably the oldest and the newest in our
community. They were both interesting events, but they were very different
in their organization and attendance. Each shows both strengths and
weaknesses in our organization of face-to-face events.
Arguably, the first Linux-related event ever was Linux-Kongress 1994.
That gathering brought together developers working
on the Linux kernel for the first time; it played host to a large portion
of the (quite small) development community. For a period of time thereafter,
Linux-Kongress was the development event for
people working at or near the kernel level. It didn't take too long for
other conferences (notably Linux Expo in the US) to grab some of the
spotlight, but, unlike Linux Expo, Linux-Kongress is still an active conference.
The 2008 event, in Hamburg, Germany, was well organized and a
lot of fun; it was a pleasant gathering of a part of the community which
your editor visits far too rarely. It was a technical conference for
technical people, with a number of well-known developers present.
But it must be said: Linux-Kongress is a small and relatively obscure event
in 2008. There were maybe 200 attendees; much of the northern European
development community was absent. Even some developers based in Hamburg
declined to attend. The quality of the talks was not uniformly good,
though some were excellent. And, in stark contrast to the recent Linux
Plumbers Conference, it's hard to point at much work that got done.
For something that was once the Linux
development gathering, Linux-Kongress has clearly come down in the world.
It is interesting to observe that Europe, while being the home to large
numbers of free software developers, lacks a definitive development
conference. That is not to say that no interesting events happen there;
GUADEC and Akademy are probably the biggest desktop conferences, and the
FOSDEM event is something to look forward to. But
developers looking for a pan-European, Linux-oriented conference will not find
one. LinuxConf.eu, a combination of the UKUUG and Linux-Kongress events
held in Cambridge last year, offered the potential to become such an event,
but the LinuxConf.eu idea appears to have stalled for now.
From Hamburg, your editor flew straight to New York City, where the
Linux Foundation's End User Summit was held. This event, happening
for the first time, differs greatly from Linux-Kongress in many ways. To
begin with, it was an invitation-only event, and one which explicitly
excluded the press (which is why there have been no LWN articles from
there). It was also intended to host a mixture of developers and users,
and to allow them to talk to each other. These characteristics led to a
different sort of conference experience.
The invitation-only nature of some Linux Foundation events naturally leads
to complaints. We do not run an invitation-only community; excluding
people from our conferences seems to run counter to the inclusive
atmosphere we normally try to encourage. The Linux Foundation's reasoning
here is easy to understand, though: many of the targeted end users (mainly
from the financial industry in New York) have a hard time talking about
what they are doing in any setting. In an open conference with press in
attendance, those people will simply keep their mouths closed - if they
show up at all.
The user community represented by the financial industry is important; they
are a significant part of the business which keeps the enterprise
distributions going. Even now, they are highly sought after as customers.
It is important to know what they are thinking and what their biggest
difficulties with Linux are. In the absence of an event like the End User
Summit, this information will only be communicated directly to the enterprise
distributors under a non-disclosure agreement. An invitation-only summit
is fundamentally exclusive at one level, but it does help the development
community (as opposed to one or two companies) get a sense for what this
user community is thinking.
So what are they thinking? They feel some stress between the stability of
enterprise distributions and the desire to have the features developed by
the community in recent years. They want good tracing mechanisms, but do
not necessarily need the dynamic tracing provided by tools like
DTrace or SystemTap. They like Linux because its broad hardware support
frees them from reliance on any specific hardware vendor. They are very
interested in work on next-generation filesystems. Some of them, at
least, very much want to better understand how our development process
works and, possibly, participate in it. See the Linux Foundation's press
release for a summary of what was discussed there.
It was a productive gathering, especially once the CEOs got off the stage
and the attendees were able to talk to each other. But it points out
another thing that we, as a community, lack: there are few forums where
developers and users can get together and learn from each other.
Developers tend to prefer the company of other developers; convincing them
to go to more user-oriented events can be a challenge. So the closest
thing we have to a combined user/developer event is the single-vendor
conferences held by companies like Red Hat and Novell. Those, needless to
say, are not the most community-oriented gatherings. They are not the best
way to learn what our users are thinking.
The proposed LinuxCon event, to be co-located with the 2009 Linux Plumbers
Conference, may help to fill in this gap somewhat.
Our community is blessed with a wealth of interesting gatherings
worldwide. But that doesn't mean that we can't do better. Whether the
subject is a true pan-European Linux gathering, user-oriented conferences,
or something else altogether, there are always opportunities to find ways
to help our community be more cohesive and productive. The trick is to
expand communications to a broader community - as seen in our newest
conference - while growing the open collaborative spirit exemplified by our oldest.
Comments (14 posted)
Page editor: Jonathan Corbet
Inside this week's LWN.net Weekly Edition
- Security: HTTP response splitting; New vulnerabilities in cups, drupal, kernel, libxml2...
- Kernel: 2.6.28 merge window, part 2; The source of the e1000e corruption bug; Reworking vmap().
- Distributions: K12Linux - Fedora 9 with LTSP; CentOS 4.7 Server CDs, Fedora 10 Snapshot 2, Foresight 2.0.5, OpenSUSE 11.1 beta 3, RPM 4.6.0 rc 1
- Development: Mozilla releases Firefox 3.1 Beta 1, Android source, new versions of SQLite, Gutenprint, conntrack-tools, sqlmap, amplee, Gallery, Xtreme, OrangeHRM, Asymptote, DiffPDF, GPGME, Wine, Elisa, UFRaw, MTDP, PiTiVi, Kamaelia, Jikes RVM, bzr, GIT, OpenGrok.
- Press: A look at open-source hardware, Bruce Perens on open standards in vertical markets, Adobe's Flash Player 10 for Linux, Wind River acquires Mizi Research, Why OO.o failed in Belgium, Animating slide shows in OO.o Impress, Dillo 2.0 review, Mozilla Fennec review, LSB 4.0 review, FOSS attitude problems.
- Announcements: AEGIS invests in Accessibility, Moblin adds members, DroboApps open-sourced, Infobright ICE 32 bit, GPL Compliance Engineering Guide, LF puts a value on Linux, Plat'Home contest winners, free Linux training notes, LF User Summit wrap-up, Symbian moves toward open-source, Nordic Perl Workshop CFP, ApacheCon New Orleans, Camp KDE Jamaica.