Leading items

A big setback on software patents

Up until the last moment, it looked like things might go the right way. The European Council's attempt to adopt the software patent directive as a no-debate item seemed doomed as a result of opposition from Denmark and a few other countries. In the end, however, the Council violated its own procedural rules by adopting the directive anyway, and nobody stood up to stop it. Barring an unlikely sequence of events, software patents will become the law in the European Union.

The unlikely sequence of events is this: the European Parliament will have a second reading of the directive in the next few months; at that reading, it will have the opportunity to reject or amend the directive. The Parliament had, the first time through, added amendments which made it clear that the patenting of software was not to be allowed, so there is reason for hope. The problem is that, on the second reading, an absolute majority of votes is required for any amendment. Simply getting enough members into the chamber to create a majority is often a problem with the European Parliament, so getting enough of them to vote for positive changes in the patent directive will be doubly challenging. To many observers, fixing a directive on the second reading seems just about impossible.

There is reason to hope, however. The fact that the Council ignored the Parliament's request to restart the procedure, and the manner in which the directive was adopted, have upset a number of members of Parliament. These members may just find enough energy to haul themselves down to the debate and vote to reassert the Parliament's authority. If these members continue to hear from their constituents in the meantime, they should be even more motivated.

In other words, now is not the time to give up and let up on the pressure. Instead, it is more important than ever that EU citizens express their views to their representatives. With enough effort, this battle might, just yet, be won.

And it is an important battle. The possible effects of software patents on small European businesses have been well discussed. But the absence of software patents in Europe has had a chilling effect on software patent enforcement in general. Currently, a patent holder could make life difficult for free software in the U.S., but European developers would just sneer in that smug manner unique to Europeans talking about American ways. So a patent challenge against, say, the Linux kernel could be a problem for an American company or developer, but it would be unlikely to impede Linux itself.

In a world with global software patent legislation, however, the situation is different. A patent challenge could shut down Linux over much of the planet; there would be no place for the software to run to. For this reason, European resistance to software patents helps to protect all of us; the forces behind software patenting understand that fact well. So we must hope that the European Parliament can find the energy to stand up for its rights.

Comments (26 posted)

Is the kernel development process broken?

According to some, the 2.6 development process has gone far out of control. Wildly destabilizing patches are routinely accepted, to the point that every 2.6.x release is really a development kernel in disguise. There are no stable kernels anymore. As evidence, they point to certain high-profile regressions, such as the failure of 2.6.11 to work with certain Dell keyboards.

It is true that the process has changed in 2.6, and that each 2.6 release tends to contain a great deal of new stuff. The situation is nowhere near as bad as some people claim, however. The problems which have turned up have tended to be minor, and most have not affected all that many users. Big, embarrassing security bugs, data corruption issues, etc. have been notable mostly by their absence. Kernel developers like Andrew Morton don't think there is a problem:

I would maintain that we're still fixing stuff faster than we're breaking stuff. If you look at the fixes which are going into the tree (and there are a HUGE number of fixes), many of them are addressing problems which have been there for a long time.

Even so, there is a certain feeling that some 2.6 kernels have been released with problems which should not have been there. Last week, in an effort to improve the situation, Linus posted a proposal for a slight modification to the kernel release process. The new scheme would have set aside even-numbered kernel releases (2.6.12, 2.6.14, ...) as "extra-stable" kernels which would include nothing but bug fixes. Odd-numbered releases would continue to include more invasive patches. The idea was that an even-numbered release would follow fairly closely after the previous odd-numbered release and would fix any regressions or other problems which had turned up. With luck, people could install an even-numbered release with relative confidence.
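
For illustration only, here is a minimal sketch in Python of how the proposed even/odd convention would have sorted 2.6.x releases; the function name is hypothetical, and nothing of this sort was part of the proposal itself.

    # Hypothetical sketch of the numbering convention Linus proposed: even
    # 2.6.x releases would take only bug fixes, odd ones the invasive work.
    def classify_release(version):
        """Classify a 2.6.x version string under the proposed scheme."""
        major, minor, patch = (int(part) for part in version.split(".")[:3])
        if (major, minor) != (2, 6):
            raise ValueError("the scheme was only described for 2.6.x releases")
        if patch % 2 == 0:
            return "extra-stable (bug fixes only)"
        return "development (invasive patches allowed)"

    for release in ("2.6.11", "2.6.12", "2.6.13", "2.6.14"):
        print(release, "->", classify_release(release))

Under that convention, 2.6.12 and 2.6.14 would have been the "extra-stable" releases, with 2.6.13 carrying the more invasive work.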

Over the course of a lengthy discussion, an apparent consensus formed: the real problem is a lack of testing. In theory, most patches are extensively tested in the -mm tree before being merged. -mm does work well for many things, and it has helped to improve the quality of patches being merged into the mainline. But the -mm kernels are considered to be far too unstable by many users, so they are not tested as widely as anyone would like. Even quite a few kernel developers work with the mainline kernels, since they provide a more stable development platform.

The next step in the testing process is Linus's -rc releases. These kernels, too, are not tested as heavily as one might like. Many developers blame the fact that most of the -rc kernels are not really release candidates; they are merge points and an indication that a release is getting closer. Since users do not see the -rc kernels as true release candidates, they tend to shy away from them. For what it's worth, Linus disagrees with the perception of his -rc kernels:

Have people actually _looked_ at the -rc releases? They are very much done when I reach the point and say "ok, let's calm down". The first one is usually pretty big and often needs some fixing, simply because the first one is _inevitably_ (and by design) the one that gets the pent-up demand from the previous calming down period.

But it's very much a call to "ok, guys, calm down now".

The fact remains, however, that many people see a "release candidate" rather differently than Linus does. There are some -rc kernels which clearly are release candidates; 2.6.11-rc5 is an obvious example. But even that kernel did not see enough testing to turn up the Dell keyboard problem.

The real problem seems to have two components. The first is that widespread testing by users is a vital part of the free software development process. This is especially true for the kernel: no kernel developer has access to all of the strange hardware out there, but the user community, as a whole, does. The only way to get the necessary level of testing coverage is to have large numbers of users do it. But here is where the second piece of the puzzle comes in: most users are unwilling to perform this testing on anything other than official mainline kernel releases. So certain classes of bugs are only found after such a release takes place.

One proposed solution was to bring back the concept of a four-number release: 2.6.11.1, for example. These releases would exist solely to deal with any show-stopper bugs which turn up after a major mainline release. Linus was negative about this idea, mostly because he didn't think anybody would be willing to do that work:

I'll tell you what the problem is: I don't think you'll find anybody to do the parallel "only trivial patches" tree. They'll go crazy in a couple of weeks. Why? Because it's a _damn_ hard problem. Where do you draw the line? What's an acceptable patch? And if you get it wrong, people will complain _very_ loudly, since by now you've "promised" them a kernel that is better than the mainline. In other words: there's almost zero glory, there are no interesting problems, and there will absolutely be people who claim that you're a dick-head and worse, probably on a weekly basis.

Linus went on, however, to outline how the process might work if a "sucker" were found who wanted to do it. The charter for this tree would have to be extremely restricted, with many rules limiting which patches could be accepted. The "sucker tree" would only take very small, clearly correct patches which fix a serious, user-visible bug. Some sort of committee would rule on patches, and would easily be able to exclude any which do not appear to meet the criteria. These conditions, says Linus, might make it possible to maintain the sucker tree, if a suitable sucker could be found.

As it turns out, a sucker stepped forward. Greg Kroah-Hartman has volunteered to maintain this tree for now, and to find a new maintainer when he reaches his limit. Chris Wright has volunteered to help. Greg released 2.6.11.1 as an example of how the process would work; it contains three patches: two compile fixes, and the obligatory Dell keyboard fix. 2.6.11.2 followed on March 9 with a single security fix. So the process has begun to operate.

Greg and Chris have also put together a set of rules on how the extra-stable tree will operate. To be considered for this tree, a patch must be "obviously correct," no bigger than 100 lines, a fix for a real bug which is seen to be affecting users, etc. There is a new stable@kernel.org address to which such patches should be sent. Patches which appear to qualify will be added to the queue and considered by a review committee (which has not yet been named, but which "will be made up of a number of kernel developers who have volunteered for this task, and a few that haven't").
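
As a rough, unofficial sketch of the one mechanical rule in that list, the following Python snippet (a hypothetical helper, not a tool used by the stable maintainers) counts the changed lines in a unified diff and flags patches over the 100-line limit; the other criteria require human judgment.

    # Hypothetical pre-check against the size rule for the extra-stable tree.
    # It only measures patch size; "obviously correct" and "fixes a real,
    # user-visible bug" still need a human reviewer.
    import sys

    def count_changed_lines(patch_path):
        added = removed = 0
        with open(patch_path) as patch:
            for line in patch:
                if line.startswith("+") and not line.startswith("+++"):
                    added += 1
                elif line.startswith("-") and not line.startswith("---"):
                    removed += 1
        return added, removed

    if __name__ == "__main__":
        added, removed = count_changed_lines(sys.argv[1])
        total = added + removed
        print("%d lines added, %d removed (%d changed)" % (added, removed, total))
        if total > 100:
            print("Too big for the extra-stable tree's 100-line limit.")
        else:
            print("Within the size limit; the other criteria still apply.")

The rules do not say whether added and removed lines are counted together or separately; the sketch assumes the stricter combined count.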

The rules seem to be acceptable to most developers. There was one suggestion that, to qualify, patches must also be accepted into the mainline kernel. Being merged into the mainline would ensure wider testing of the patches, and would also serve to minimize the differences between the stable and mainline trees. The problem with this idea is that, often, the minimal fix which is best suited to an extra-stable tree is not the fix that the developers want for the long term. The real fix for a bug may involve wide-ranging changes, API changes, etc., but that sort of patch conflicts with the other rules for the extra-stable tree. So a "must be merged into the mainline" rule probably will not be added, at least not in that form.

How much this new tree will help remains to be seen. It may be that its presence will simply cause many users to hold off on testing until the first extra-stable release is made. This tree provides a safe repository for critical fixes, but those fixes cannot be made until the bugs are found. Finding those bugs requires widespread testing; no new kernel tree can change that fact.

Comments (32 posted)

The 2005 Debian Project Leader election

March 9, 2005

This article was contributed by Joe 'Zonker' Brockmeier.

The Debian Project Leader (DPL) election is fast approaching. The nomination period ended on February 28, and the campaigning period runs through March 21. The field of candidates is much broader than in recent years, with six serious candidates vying for the role of Debian Project Leader. Current DPL Martin Michlmayr is not running for re-election.

The candidates for 2005, each of whom has published a platform, are Matthew Garrett, Andreas Schuldei, Angus Lees, Anthony Towns, Jonathan Walther, and Branden Robinson.

We sent a list of questions to each candidate to find out where they stand on issues facing Debian in 2005. The first question we posed to the candidates was how they would help to ensure that Sarge would be released this year, and whether too much emphasis was being placed on a new stable release.

In his platform, Walther endorsed the idea of a six-month release cycle, borrowed from the OpenBSD project, saying it could "turn Debian into a monster powerhouse of software goodness." In his response, he added that he was unsure of the limits of the DPL's authority, but would do "everything in my power to get Sarge out the door immediately, as-is, and formalize the OpenBSD/Ubuntu/Xouvert 6-month release cycle."

Towns responded that there were a variety of reasons that Sarge had been delayed, and that "the release team currently have a handle on them." He also said that releasing Sarge is "the highest priority for the project at this point, and the highest priority of the DPL is to do everything possible to ensure that the release team and those working on resolving the remaining issues have the support and resources they need to do their work quickly and effectively."

Lees pointed out that the DPL "is not a position with direct control over Debian's actions" and that the DPL "is there to provide a single point of contact with the outside world and to ensure the relevant groups within Debian coordinate effectively." He also said that he is confident that the Sarge release would go out this year without intervention from the DPL, but "would of course try to ensure that the relevant technical teams have the resources they need to avoid any further delays."

As for the importance of stable releases, Lees said that the stable releases are necessary to provide "a static fork to provide security fixes against and a known minimum point from which package maintainers must ensure smooth upgrades." The ideal release point, according to Lees, would be "around the 1.5-2.5 year point, so shorter than the Sarge release cycle - but not by much."

Garrett noted that Sarge is close enough to release that "anything the DPL does is more likely to slow things down than speed them up."

The release team have assured me that the list of awkward problems is now small and under control, and I'm inclined to trust them on this.

A more interesting question is probably how we can prevent Sarge from happening again. A large part of the problem is that many people have lost faith in us ever making timely releases, which ends up costing us a lot - without the feeling that you're working towards a release, there's far less incentive to make sure that your code is in good condition and help track down bugs in other packages. I want to deal with this problem by making people believe that we can actually make releases when we say we will, and I think the first step towards that will be to make sure that we have a list of concrete goals for our next release the moment we've finished with Sarge.

He also said that slow releases cost Debian not only users, but development effort as well.

Robinson told LWN that he would work closely with the Release Management team to find out what they need and "try to get those needs satisfied, whether they involve hardware for build daemons, additional personnel for the security or debian-installer teams, or simply general encouragement (some would say whip-cracking) to get the release-critical bug count down."

He also said that Debian is compared "unfairly and unfavorably to the bleeding-edge nature of some distributions" and could "greatly mitigate that criticism by establishing a more predictable and regular release cycle."

Finally, Schuldei said that Sarge should be in "deep freeze already" by the time the next DPL takes office on April 17. Schuldei also said that regular releases "are important for Debian and are one of my priorities."

The next question we posed to the candidates was whether Ubuntu had hurt Debian by drawing away development effort, how Debian should work with projects derived from Debian, and whether Debian was "infrastructure" for other projects.

Schuldei responded that Ubuntu "cherry-picked from Debian's most active developers."

When your hobby becomes your job, it is easy to lose interest in participating in the hobby outside of work. And working in a start-up company can easily become an all-consuming activity. Given this combination, it was probably inevitable that developers working on Ubuntu would have less time and energy to expend on Debian itself.

Those Ubuntu developers who used to work on Debian infrastructure were missed painfully, indeed. I hope that "Small Teams" as described in my platform can help by building lots of small multiplying knowledge pools which would make Debian resilient against loss of single individuals and enable it to grow able successors very quickly.

Schuldei told LWN that Debian "should more actively incorporate the good things that it sees other distributions" do and that if Debian "managed the 'taking' as well as the 'giving' [to other projects] there would be little limit to its potential."

Robinson said that Canonical Ltd. (the company that sponsors Ubuntu) is a "mixed blessing."

Previous companies that centered their identities around Debian (such as Stormix and Progeny) have not had the resources to hire more than a handful of Debian developers. Canonical has hired many. It's a good thing to see so many Debian developers able to more closely align their careers with their passions -- it's something I've enjoyed for nearly five years, so I can hardly begrudge others that same condition.

At the same time, Canonical's interests are not identical to Debian's. If Canonical is to operate anything like a conventional business that realizes revenue, it cannot help but pursue paths to do so. The Debian Project doesn't have that pressure on it. Inevitably in such an environment, at least some Debian developers who work for a commercial interest are going to experience tension between what's good for Debian and what's good for their employer, even if that divergence is perceived as merely short-term. In the short term, Debian needs to release sarge. We cannot count on Canonical, Linspire, Progeny, Xandros, Hewlett-Packard, or any of Debian's other benefactors to solve our problems for us -- they will not supply the magical second step between "collect underpants" and "RELEASE!", to spin an old joke.

He also said that Debian has to be "frank about it" and accept that some developers may be drawn away from Debian.

Garrett pointed out that Ubuntu "has taken some effort away from Debian, but it's also contributed a lot back."

One of the major advantages that Ubuntu has over Debian is that their development process makes it much easier to push new technologies. We've already gained from that in at least one case, since Debian's Project Utopia stack is heavily based on the code in Ubuntu. That would have been much harder to coordinate if it hadn't been demonstrated in a working scenario first. Remember that Ubuntu hasn't existed for all that long - it's hard to have any great certainty what the long-term effects will be.

One of the fundamental reasons for free software is the right to produce derived works, and I think that making it as easy as possible for others to produce derived distributions is the best way for Debian to support that. The number of distributions based on Debian is large enough that I think we class as infrastructure, but don't think that's incompatible with making releases.

Providing employment for Debian developers is "a good thing" according to Lees, though he expects "some inevitable divergence between Ubuntu and Debian as Ubuntu strives to differentiate itself."

The core axiom of free software however is that having someone copy and modify your software doesn't reduce its value to you. Whatever happens, Debian is a process not a product and it will eventually incorporate any code that the Developers deem worthwhile.

What I'm really excited about from Ubuntu is some of the tools they're working on, like bug trackers and version control tools. These tools are being developed specifically for the unique needs of distributors, rather than authors, and it will be very interesting to see what they become.

Towns said that the only way Ubuntu draws developers away from Debian "is by providing a better environment for hacking -- whether that be by paying for the work, or being more fun, or being more satisfying, or all of the above."

I think it's great that there are projects that some people find more enjoyable than Debian, and the great thing about free software is that those of us who prefer Debian can just take the work they do for Ubuntu and use it ourselves. And vice-versa, too -- all without anyone being unhappy about code theft or having to involve lawyers or formal agreements or anything of the sort.

I think Debian works quite well both as a distribution of its own, and as infrastructure for other distributions; I hope it will improve as both.

According to Walther, projects like Ubuntu or Knoppix help Debian rather than hurt it. "Because of our licensing, we can always fold things back in from other projects that work out well."

We also asked candidates if they had any idea why so many people were running this year, as opposed to past years that saw only a few candidates.

Walther quipped, "because the incumbent decided not to run for re-election."

Schuldei told LWN "some of the candidates clearly believe that Debian is in need of their special knowledge or ability. I myself believe that my vision for Debian and my experience in implementing change in social groups will help the Debian Project to reach new heights and strength."

Robinson said that "people are getting a better idea of what they want out of a Project Leader."

I don't know of many precedents in our field; no other free software project of Debian's size entrusts its entire membership with electing its leadership. We're striving to identify the right balance of personality traits and experience that will equip us to face new challenges with confidence, rather than butting our heads against the same old brick walls that have stymied us for years.

Garrett said that he can't speak for the other candidates, but "I'm standing because I think Debian has problems that need fixing, and I think being DPL is the best way that I can help fix them. Perhaps our problems are more obvious this year than in the past?" Lees told LWN that he has no idea why so many people are running for DPL, and that he's running "at the insistence of several other Debian developers, probably in response to some of the more radical factions that are gaining influence within Debian." Towns said that there have been "a lot of fairly controversial questions raised or decided...and in the midst of all this the next release of our operating system has continued slipping. It seems plausible to me that the range of candidates represent the range of different views within the project of how to approach these issues."

Another topic that comes up frequently when discussing delays for Sarge is dropping architectures. We asked the candidates if they thought Debian should drop any of its architectures in order to release on a more timely basis. There was not a great deal of enthusiasm for this idea among DPL candidates. Walther is against the idea of dropping architectures altogether. "I see no need to drop any architecture, but I do see it as a good thing to release each architecture separately. This prevents the lowest common denominator from retarding the distribution as a whole."

Towns said, simply, "That's a decision for the release and archive teams to make." Lees said that there was "no correlation between the number of architectures and any delay in release," as far as he could see. Schuldei said, "yes, that's one possible option."

Garrett told LWN that dropping architectures would not speed up the release, and would "undoubtedly reduce the quality of our distribution. There are whole classes of bugs that only show up when you port to a wide range of platforms."

In any case, which architectures should we drop? M68K is often used as an example, but is actually one of the better architectures in terms of keeping up. Mips and Arm aren't widely used on the desktop, but we get a great deal of enthusiasm from embedded developers.

If we get to the point where an architecture can't pull its weight, then we'll drop it. We're not there yet.

Robinson said that the idea that dropping an architecture would benefit the release cycle "seems to meander between a vague notion and an article of faith." He also said that he has yet to see a proposal that explains how it would benefit the release cycle, and that he needs "more convincing...to support such a dramatic step. For some architectures, Debian is the only modern option for a GNU/Linux installation. It'd be a shame to give that in exchange for an unproven benefit."

Finally, we asked the candidates what the biggest challenge facing the DPL would be. Schuldei told LWN that scalability was the biggest problem facing Debian.

A lot of Debian's hottest issues over the past few years have been capacity issues: making sure the autobuilder network scales to handle our package count; making sure the NM process scales to meet the number of incoming applicants; making sure the security team scales to handle the architecture count; etc. While many of these issues are largely technical in nature, the task of identifying and resolving chokepoints before they become a problem is one that requires managerial attention, and the DPL is best suited to provide this oversight. The social structure of Debian still stems from its early years. With the size of 900+ active developers the social bonds and self-regulatory functions are just not good enough any more nowadays for it to work as smoothly and effectively as it used to be.

The changes in the leadership and small team infrastructure as well as nurturing of good working climate will address this effectively and will allow Debian a new growth cycle.

Garrett sees communication as the largest hurdle for Debian:

We're bad at it. A large part of the problem facing the release is that half the time nobody's sure why we can't release yet. People get into arguments over whether or not people are passing on enough information. It's all wasted effort, and it's all entirely unnecessary. If there's one thing that I would hope to do as DPL, it's to ensure that people know who they're supposed to be speaking to whenever they have a problem. In principle, that's not too difficult, but it's something nobody's really succeeded at yet.

Lees told LWN that Debian "basically works" and said it was difficult to sort out a minor issue to highlight as a problem. He also touched on communication as a problem, and said VoIP would be an "interesting way to improve the quality of communication...since email seems to bring out the worst in people. I would hope that improving the nature of the communication would make it easier to address other issues that arise within Debian."

Towns said that the biggest single issue was "getting Sarge out the door, but that's primarily an issue for the release team to handle." Robinson didn't respond directly to the question of the biggest challenge for Debian, but also pointed out in his responses that "the collective psyche of the project gets antsy when a release process has dragged on for too long."

The general level of irritability seems to go up. We are nearly three years pregnant with sarge, and we need to be delivering our latest offspring soon. The challenge is to practice good obstetrics, and preserve the health and well-being of ourselves and our release. In my campaigns for Debian Project Leader over the years I've consistently prescribed medicine for our ails, and I'm ready to assist my fellow developers with the delivery.

Walther also told LWN that the release cycle is the largest problem for the project.

It has caused a stagnation where we focus on putting in new packages and fixing old bugs, but the mantle of fresh new innovation that made us stand out in the early days has been passing on to other distributions. With a quicker release cycle we can definitely get that back in short order. We have all the resources and manpower.

Debian Developers may vote for DPL from March 21 through April 11. The voting procedure is described in section A of the Debian Constitution. We'd like to thank each of the candidates for responding to our questions, and wish them good luck in the election.

Comments (7 posted)

Page editor: Jonathan Corbet


Copyright © 2005, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds