Leading items
Python gears up for 2.6 and 3.0
Things are heating up in the Python world in advance of two major synchronized releases of the language. As it heads towards Python 3000 (aka Py3k or Python 3.0), alongside the transitional version 2.6, the development team is narrowing its focus to just those items that are required for the releases. Along the way, the conversations taking place on python-dev provide a look inside the development and release process decisions that a project needs to make as releases loom.
Py3k is the next-generation version of Python, as we described last September. It will not be backward compatible with programs written for Python 2.x in a wide variety of ways. Python 2.6 is an effort to bridge the gap, enabling much of the 3.0 functionality so that new programs can start using it. It can also provide warnings for code that will not work with Py3k.
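Neither release is final yet, so the details may still shift, but the 2.6 alphas already give a feel for how that bridge is meant to work. The snippet below is an illustrative sketch (not taken from the article) using two of the planned migration aids: the __future__ import for the Py3k print function (PEP 3105) and the -3 command-line switch that warns about constructs 3.0 removes.

```python
# Illustrative sketch of the planned 2.6 migration aids; details could
# change before the final releases.
from __future__ import print_function   # opt in to the Py3k print function

import sys

def greet(name):
    # With the __future__ import, the 3.0 function form works in 2.6 too.
    print("Hello,", name, file=sys.stderr)

greet("world")

# Running this as "python2.6 -3 example.py" is intended to emit warnings
# for code that 3.0 drops, for example:
#     d.has_key(k)            # -> write "k in d" instead
#     print >> sys.stderr, x  # -> print(x, file=sys.stderr)
```

The more mechanical rewrites are meant to be handled by the 2to3 conversion tool that ships alongside the releases.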
Python 2.6 was originally scheduled for an April 2008 release, in advance of the August 2008 release planned for Py3k. The two are now slated for synchronized releases, roughly monthly, leading up to a final release in early September 2008. The synchronization is seen as important for two reasons, as Python's Benevolent Dictator For Life (BDFL) Guido van Rossum outlines:
Because Py3k is such a radical change, the 2.x series will continue for a long time. Van Rossum's recent PyCon keynote (PDF slides) mentions five years as the time frame for 2.6 to be supported, with 2.7 and 2.8 releases possible. A stable development platform for the next few years is very important for current Python users, as is giving them a long time to migrate their code.
The third alpha of Py3k was released at the end of February along with the first alpha of 2.6. Additional alpha releases of each are slated for April and May as laid out in Python Enhancement Proposal (PEP) 361. Those are to be followed by betas in June and July with the final release planned for September 3. All of that adds up to a fairly aggressive schedule, but the team seems confident—at least so far.
One of the issues that the Python hackers are trying to figure out is how to track the items still left to be done. van Rossum describes the scope of the problem:
No one had any major objections to van Rossum's suggestion of using the bug tracker to track the tasks, with Christian Heimes pointing out:
The bug tracker allows for different priorities to be set on bugs (or tasks) that are entered into it, which led van Rossum and others to wonder about the proper usage of that field. One of the problems is distinguishing between issues that must be addressed before the next release versus those that must be addressed sometime before the final release. In some sense, both are "critical" and "show-stopping" (depending on which show you are focused on). Brett Cannon reported the scheme they came up with:
This can elevate bugs that are relatively minor, but need to be handled before a final release, into a category that inflates their importance. But, not elevating the bugs can lead to them incorrectly being set aside for a later release. van Rossum wondered about this bug priority "inflation", but it is the way that 2.6/3.0 release manager Barry Warsaw wants to handle things:
Other projects or project managers might make different decisions on how to handle bug priorities, but the important thing is to make a reasonable decision quickly. Once that was done, the tasks were added to the tracker and could be prioritized correctly within the framework and without a lot of hand-wringing about which way is "best". It is an important skill for project managers of all kinds to learn.
Things are progressing rapidly on python-dev these days—not surprising with two major releases due in less than six months. There is a lot of work to be done, but the Python hackers aren't shrinking from those tasks. In addition, the team has been able to change its processes as needed to support the tight schedule. With hard work and a bit of luck, that should put Py3k and its 2.6 sibling on our development machines by autumn.
Who maintains dpkg?
The Debian project is known for its public brawls, but the truth of the matter is that the Debian developers have not lived up to that reputation in recent years. The recent outburst over the attempted "semi-hijacking" of the dpkg maintainership shows that Debian still knows how to run a flame war, though. It also raises some interesting issues on how packages should be maintained, how derivative distributions work with their upstream versions, and what moral rights, if any, a program's initial author retains years later.

Dpkg, of course, is the low-level package management tool used by Debian-based distributions; it is the direct counterpart to the RPM tool used by many other systems. Like RPM, it is a crucial component in that it determines how systems will be managed - and how much hair administrators will lose in the process. And, like RPM, it apparently causes a certain sort of instability in those who work with it for too long.
Ian Jackson wrote dpkg back in 1993, but, by the time a few years had passed, Ian had moved on to other projects. In recent times, though, he has come back to working on dpkg - but for Ubuntu, not for the Debian project directly. One of his largest projects has been the triggers feature, which enables one package to respond to events involving other packages in the system. This feature, which is similar to the RPM capability by the same name, can help the system as a whole maintain consistency as the package mix changes; it can also speed up package installations. Triggers have been merged into Ubuntu's dpkg and are currently being used by that distribution.
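The article does not show what a trigger actually looks like; the sketch below follows dpkg's triggers design (a triggers control file plus a new "triggered" postinst invocation), with the directory and the index-rebuilding command invented for illustration.

```sh
#!/bin/sh
# Illustrative postinst fragment for a package that declared, in its
# DEBIAN/triggers control file, an interest in a directory:
#
#     interest /usr/share/doc-index
#
# Any package installing files under that path activates the trigger;
# dpkg then calls this postinst once, at the end of the run, with
# "triggered" as the first argument, rather than once per package.
set -e

case "$1" in
    triggered)
        # "$2" lists the trigger names that fired; rebuild the index once.
        rebuild-doc-index     # hypothetical command
        ;;
    configure)
        rebuild-doc-index
        ;;
esac
```

That batching is where the promised speedup comes from: expensive work like rebuilding an index happens once per dpkg run instead of once per package installed.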
The upstream version of dpkg shipped by Debian does not have trigger support, though, and one might wonder why. If one listens to Ian's side of the story, the merging of triggers has been pointlessly (perhaps even maliciously) blocked for several months by Guillem Jover, the current Debian dpkg maintainer. So Ian concluded that the only way to get triggers into Debian in time for the next release ("lenny") was to carry out a "semi-hijack" of the dpkg package. By semi-hijack, Ian meant that he intended to displace Guillem while leaving in place the other developers working on dpkg, who were encouraged to "please carry on with your existing working practices."
Ian also proceeded to upload a version of dpkg with trigger support, and without a number of other recently-added changes. It is worth noting that all of this work went into a separate repository branch, pending a final resolution of the matter. So when the upload was rejected (as it was) and Ian was deprived of his commit privileges (as he was), there was no real mess to clean up.
Those wanting a detailed history of this conflict can find it in this posting from Anthony Towns. It is a long story, and your editor will only be able to look at parts of it.
One of the relevant issues here is that Guillem Jover appears to be a busy developer who has not had as much time to maintain dpkg as is really needed. Since the beginning of the year, he has orphaned a number of other packages (directfb and bmv, for example) in order to spend more time on dpkg. But, as a result of time constraints, a number of dpkg patches have languished for too long.
While this was happening, Guillem put a fair amount of the time he did have into reformatting the dpkg code and making a number of other low-level changes, such as replacing zero constants with NULL. Ian disagrees strongly with the reformatting and such - unsurprisingly, the original code was in his preferred style. And this is where a lot of the conflict comes in, at two different levels. Ian disagrees with the coding style changes in general, saying:
Many developers will disagree on the value of code reformatting; some projects (the kernel, for example) see quite a bit of it. Judicious cleaning-up of code can help with its long-term maintainability. All will agree, though, that reformatting can make it harder to merge large changes which were made against the code before the reformatting was done. This appears to be a big part of Ian's complaint: unnecessary (to him) churn in the dpkg code base makes it hard for him to maintain his trigger patches in a condition where they can be merged.
Code churn is a part of the problem, but Ian's merge difficulties are also a result of doing the trigger work in the Ubuntu tree rather than in Debian directly. Ian did try to unify things back in August, but that was after committing Ubuntu to the modified code. Ubuntu's dpkg is currently significantly different from Debian's version, and, while one assumes that, sooner or later, Debian will acquire the trigger functionality, there is no real assurance that things will go that way. Dpkg has been forked, for now, and the prospects for a subsequent join are uncertain.
Ian also asserts that, as the creator of dpkg, he is entitled to special consideration when it comes to the future of that package. His semi-hijack announcement makes that point twice. But one of the key features of free software is this: when you release code under a free license, you give up some control. It seems pretty clear that Ian has long since lost control over dpkg in Debian.
So who does control this package, and how will this issue be resolved? Certainly Ian's hijack attempt found little sympathy, even among those who think that dpkg has not been well maintained recently. There are some who say that the disagreement should be taken to the Debian technical committee, which is empowered to resolve technical disputes between developers. But faith in this committee appears to be at a low point, as can be seen in this recent proposal to change how it is selected:
Meanwhile, the discussion has gone quiet, suggesting that, perhaps, it has been moved to a private venue. The dpkg commit log, as of this writing, shows that changes are being merged, but triggers are not among them. It is hard to imagine that the project will fail to find a way to get the triggers feature merged and the maintenance issues resolved, but that does not appear to have happened yet.
Installfest generates 350 Linux computers for schools
On Saturday, March 1st, Untangle and the Alameda County Computer Resource Center (ACCRC) organized the first of what they hope will be many "Installfest for Schools" events. It took place at four San Francisco Bay area locations (San Francisco, Berkeley, San Mateo and Novato) and refurbished 350 older computers with Ubuntu for northern California schools.
The primary goal of the installfest was to give children in disadvantaged neighborhoods the same access to technology that students in wealthy school districts grow up with. However, the event was also about curbing waste. 132 million PCs were bought in the year 2000 alone, and none of them can run Vista. But older hardware works great with GNU/Linux, and extending the life of these PCs will keep thousands of tons of toxic electronic waste out of the landfill. And let's not forget about budgetary waste: with many states facing budget crises that will inevitably force deeper classroom spending cutbacks, why should our schools have to spend their scarce resources on proprietary software licenses? In fact, cutbacks may create an incredible window of opportunity for the GNU/Linux desktop movement to establish itself within schools.
The installfest drew approximately 130 free and open source software community volunteers across the four locations. We started with over 1,000 older, discarded computers that had been collected by ACCRC through donations from the general public, local businesses and municipal governments. Some of the computers were smooth sailing: they met the hardware specification, had all of the necessary components and installed without any problems. Other computers had software install problems, but those were easy to solve because so many of the Bay Area's most hardcore free and open source software gurus participated; with their combined expertise, no error message went unaddressed. The rest of the computers required a little more care, as many of them were missing a hard drive, a NIC or enough RAM to run Ubuntu. Yet by disassembling the problematic boxes it was easy to form a pool of spare parts that could be stitched back together to create working computers. The week after the installfest, ACCRC put the finished systems through a 72-hour burn-in test, and we now have 350 computers that have already started being donated to schools.
The Ascend School in Oakland received the first batch of nine computers. Other schools that have received open source computers from the ACCRC include:
- Lockwood School (Oakland)
- Whittier Elementary School (Oakland)
- Casa Grande High School (Petaluma)
- Woodside Elementary School (Concord)
- KIPP San Francisco Bay Academy (San Francisco)
- Mission High School (San Francisco)
Computer hardware and software specifications
![installfest computers](https://static.lwn.net/images/ifest/sm-ifest_sm.jpg)
The minimum specification for each computer was an 800MHz processor (PIII or AMD), 256MB of RAM and a 20GB hard drive, but we were pleasantly surprised to find a handful of P4 processors in the mix as well. One location even received a batch of six dual-core systems with elegant slim cases—who throws those out, and what else are they looking to get rid of?—but ironically we couldn't install them during the event because they were only equipped with DMS-59 ports that required special monitor cables.
Each system received a fresh copy of Ubuntu 7.10 desktop with the latest apt-get upgrade applied as of February 27, 2008. Because the computers were going into schools with little or no GNU/Linux expertise, it was important to try to create a positive first experience, so we worked with Creative Commons to package samples of pictures from Flickr and music from Jamendo to show off the fun side of the donated computers. No Starch Press also donated PDF copies of Ubuntu for Non-Geeks, which were loaded onto each computer so that help for common support questions was never more than a click away.
Install specifications
Each location was set up with 10 to 40 workstations that had permanent keyboards, mice, monitors and cables, so that the volunteers only had to move the desktops themselves back and forth. The process started by booting from custom install CDs, with the packages applied over the network from Apache HTTP servers. The custom CDs were optimized to make the Ubuntu installation as fast and easy as possible. Physically placing the CD into the drive and booting from it was really all that was required, because the additional content from Creative Commons and No Starch Press was bundled as Debian packages that were automatically installed over the network just like the other Ubuntu updates and patches.
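The article does not publish the actual CD configuration, but setups like this are commonly wired together with a debian-installer preseed file on the CD. The fragment below is only a hedged sketch of the general idea; the mirror address and the extra package names are invented for illustration.

```
# Illustrative preseed fragment (not the real installfest configuration):
# point the installer at the local Apache mirror and pull in local packages.

d-i mirror/country string manual
d-i mirror/http/hostname string 192.168.0.10      # local Apache server
d-i mirror/http/directory string /ubuntu
d-i mirror/http/proxy string

# Locally built .debs (Creative Commons media, the No Starch PDF) only
# need to live in a repository the installer can reach:
d-i pkgsel/include string installfest-media installfest-docs
```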
The installfest networks were based on dual Pentium III servers with a RAID array and Gigabit network cards plugged into a 24-port Gigabit switch. It was important to have a fast setup because updating as many as 40 systems at once placed a heavy load on drives and network connections. Electricity was also a concern, as most of the outlets available were on 15- or 20-amp circuits. Given the intensity of the installation/reboot workload and the relatively power-hungry CRT monitors, we drew the line at five workstations per 15-amp circuit; an extra machine might have fit, but blowing the circuit breaker would have caused a big disruption—especially if the breaker happened to be in a locked closet.
Community goes the extra mile
With 130 volunteers showing up, Untangle and ACCRC had a lot of help in pulling off the Installfest for Schools. However, the community did far more than just show up; our volunteers really went the extra mile to save the day as we stumbled across a handful of unexpected hiccups. One particularly inspirational moment came when the San Mateo location ran out of computers: rather than close the location early, our volunteers drove their own cars across the Bay to pick up extra hardware! We also owe a debt of gratitude to three members of the San Francisco Linux Users' Group (Christian Einfeldt, Jim Stockford and Daniel Mizyrycki), who worked long hours to set up and clean up that location.
We also received lots of help from free and open source software organizations. Mozilla in particular stepped up to the plate, blogging about the event and then bringing schwag and pizza for all 130 volunteers! Mozilla also wanted to get its hands dirty, and team members showed up to lend a hand at each location. Creative Commons and No Starch Press helped put together content, while O'Reilly, OSI, the Linux Foundation, Sun and Canonical helped get the word out with supportive blog mentions that encouraged participation.
Future plans
Moving forward, Untangle and ACCRC hope to continue organizing bigger and better Installfests for Schools. Our goal is to turn the one-time regional event into a distributed national event occurring on a regular basis. If we're able to find some friendly organizations to help out, we may even be able to go international. Stay tuned because you'll be hearing from us sooner rather than later about the next Installfest for Schools.
Anyone wishing to help should stay informed by signing up for the installfest mailing list. As we move toward a distributed national event, we need all the help we can get identifying local schools, donors of old computers, and feet-on-the-street volunteers to make sure everything goes smoothly. That work will be coordinated on the mailing list.
[ Andrew Fife, of Untangle, is one of the organizers of the project. ]
Page editor: Jonathan Corbet