
Zimmerman: We’ve packaged all of the free software…what now?

Ubuntu CTO Matt Zimmerman ponders package managers on his blog. He is concerned that the success of the package manager approach has led to that model being applied to many kinds of deployment problems (data, embedded, client/server, and so on) that might be better solved in other ways. "No single package management framework is flexible enough to accommodate all of the needs we have today. Even more importantly, a generic solution won't account for the needs we will have tomorrow. I propose that in order to move forward, we must make it possible to solve packaging problems separately, rather than attempting to solve them all within a single system."


Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 8:38 UTC (Wed) by ledow (guest, #11753) [Link] (1 responses)

Sounds like the old "consolidate vs. separate" argument all over again. "Nationalise / denationalise", "expand / streamline": the same arguments come around for the same things over and over again.

The real answer lies somewhere in the middle, i.e. letting people choose what they want to do rather than demanding that all package management be either separate or unified. And as with all things code-wise, this tends to mean that people get both options and choose what they want to use: either a "do-all" piece of software that they configure some things out of, or a collection of separate tools that they glue together to meet their needs.

Either way, preaching one way or the other just makes you look silly to the other 50% of people.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 9:07 UTC (Wed) by mjthayer (guest, #39183) [Link]

> The real answer lies somewhere in the middle
Wasn't that what he was saying? That free software packaging is good, but not the answer to all questions?

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 9:17 UTC (Wed) by djzort (guest, #57189) [Link]

Didn't Debian solve Ubuntu's packaging?

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 9:38 UTC (Wed) by mjthayer (guest, #39183) [Link] (6 responses)

Regarding data and packaging ("Data, in contrast to software, has simple requirements. It just needs to be up to date and accessible to programs. Packaging and distributing it through the standardized packaging process is awkward, doesn’t offer tangible benefits, and introduces overhead."), one of the comments on Zimmerman's blog pointed out that packaging data ensures that you have at least one version of that data available without needing net access. Which leads (me) to the idea that the data could be automatically packaged. When the original packaging (Ubuntu live CD?) is done, a package could be generated from the most recent data available at the time. Then, whenever new versions of the data are found on the net at "system runtime", that data could be pulled, a new package automatically generated locally, and that package automatically installed. The packager's task then becomes managing that automatic packaging process (and finding ways to verify the data when it is automatically packaged).

That approach might have applications in other situations too.
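
A minimal sketch of that kind of automatic local repackaging, assuming dpkg-deb and dpkg are available; the data URL and package name below are made up purely for illustration:

    #!/usr/bin/env python3
    # Sketch of the "automatic local repackaging" idea: fetch a data file,
    # wrap it in a trivial binary .deb with dpkg-deb, and install it so the
    # package database still knows about the data. The URL and package name
    # are placeholders, not a real feed.
    import subprocess
    import tempfile
    import urllib.request
    from datetime import date
    from pathlib import Path

    DATA_URL = "https://example.org/feeds/geoip.dat"  # hypothetical data feed
    PKG_NAME = "local-geoip-data"                     # hypothetical package name
    VERSION = date.today().strftime("%Y.%m.%d")       # version derived from fetch date

    with tempfile.TemporaryDirectory() as tmp:
        root = Path(tmp) / PKG_NAME
        (root / "DEBIAN").mkdir(parents=True)
        payload = root / "usr" / "share" / PKG_NAME
        payload.mkdir(parents=True)

        # Fetch the latest data; checksum/signature verification would go here.
        urllib.request.urlretrieve(DATA_URL, str(payload / "geoip.dat"))

        (root / "DEBIAN" / "control").write_text(
            f"Package: {PKG_NAME}\n"
            f"Version: {VERSION}\n"
            "Architecture: all\n"
            "Maintainer: Local auto-packager <root@localhost>\n"
            "Description: Automatically repackaged data snapshot\n"
        )

        deb = Path(tmp) / f"{PKG_NAME}_{VERSION}_all.deb"
        subprocess.run(["dpkg-deb", "--build", str(root), str(deb)], check=True)
        subprocess.run(["dpkg", "-i", str(deb)], check=True)  # needs root

The interesting part, as noted above, would be the verification step before the fetched data is trusted.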

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 10:27 UTC (Wed) by mjthayer (guest, #39183) [Link] (4 responses)

Another thought that occurs to me: the main packaging database on a system could act as a sort of meta-database. One could have a dpkg database on the system and a Java environment in parallel (somewhere out of the way of dpkg), and the meta-database would be aware of all of them and would be able to find or install software with a single command, automatically deciding which of the packaging systems was the right one.

After all, free software is all about choice, but the choice of packaging system currently tends to be rather all-or-nothing.
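
A very rough sketch of that kind of front end, in Python for illustration; the probe commands are assumptions (a real tool would use each packaging system's own query API), and the whole thing ignores the cross-system dependency problem raised below:

    #!/usr/bin/env python3
    # Sketch of a "meta-database" front end: ask each known packaging system
    # whether it can provide a name, and delegate the install to the first
    # one that can. The probe commands are illustrative and assumed to exit
    # non-zero for unknown names; a real tool would use proper query APIs.
    import subprocess
    import sys

    BACKENDS = [
        # (label, probe command, install command); first match wins
        ("dpkg/apt", ["apt-cache", "show"], ["apt-get", "install", "-y"]),
        ("CPAN",     ["cpan", "-D"],        ["cpan"]),
    ]

    def install(package: str) -> int:
        for label, probe, install_cmd in BACKENDS:
            known = subprocess.run(probe + [package],
                                   stdout=subprocess.DEVNULL,
                                   stderr=subprocess.DEVNULL).returncode == 0
            if known:
                print(f"Installing {package} via {label}")
                return subprocess.run(install_cmd + [package]).returncode
        print(f"No backend knows about {package}", file=sys.stderr)
        return 1

    if __name__ == "__main__":
        sys.exit(install(sys.argv[1]))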

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 10:33 UTC (Wed) by tzafrir (subscriber, #11501) [Link] (1 responses)

What happens when Eclipse depends on gtk? Do we take it from the Java repo or from the deb repo?

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 13:27 UTC (Wed) by dskoll (subscriber, #1630) [Link]

The meta-packager would have to have some rules. (I run into this a lot with .deb packages vs. CPAN modules.)

For example, the meta-packager could be told to prefer .debs if they are new enough to satisfy requirements. Otherwise, fall back on CPAN.

This is a very hard problem, though. And if you want to allow more than one version of a package to be installed at the same time, it gets totally nightmarish. This is why (in my experience) most real-world systems consist of distributor packages plus a few hand-installed and hand-maintained non-packages. And this makes security updates complex and annoying.
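
A rough sketch of that preference rule, assuming apt, dpkg, and the cpan client are available; the deb/CPAN name pair and the required version are made-up examples:

    #!/usr/bin/env python3
    # Sketch of the rule described above: prefer the distribution .deb when
    # its candidate version is new enough, otherwise fall back to CPAN. The
    # deb/CPAN names and the required version are made-up examples; a real
    # meta-packager would also have to map names and track file ownership.
    import re
    import subprocess
    import sys

    def deb_candidate(package: str):
        """Return the version apt would install, or None if unavailable."""
        out = subprocess.run(["apt-cache", "policy", package],
                             capture_output=True, text=True).stdout
        m = re.search(r"Candidate:\s*(\S+)", out)
        return m.group(1) if m and m.group(1) != "(none)" else None

    def new_enough(version: str, required: str) -> bool:
        """Compare versions using dpkg's own rules."""
        return subprocess.run(
            ["dpkg", "--compare-versions", version, "ge", required]
        ).returncode == 0

    def install(deb_name: str, cpan_module: str, required: str) -> int:
        candidate = deb_candidate(deb_name)
        if candidate and new_enough(candidate, required):
            return subprocess.run(["apt-get", "install", "-y", deb_name]).returncode
        # Packaged version missing or too old: fall back to CPAN.
        return subprocess.run(["cpan", cpan_module]).returncode

    if __name__ == "__main__":
        # Example: prefer libjson-perl >= 2.0, otherwise install JSON from CPAN.
        sys.exit(install("libjson-perl", "JSON", "2.0"))

Everything that makes this hard in practice (mapping names between systems, recording which backend owns which files, handling parallel versions) is exactly what is waved away here.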

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 13:43 UTC (Wed) by amck (subscriber, #7270) [Link] (1 responses)

The package manager is there to provide consistency guarantees for the _system_, not just the software in the package. If you have multiple package management systems (e.g. dpkg + CPAN + Python eggs, ...) then you can't guarantee that the necessary dependencies are installed, that the language extensions match the version of the language installed, and so on.

So if you want them all to work together, you implement a set of rules, and you end up with a 'meta-package-manager' on top to enforce consistency. Think about how you would manage evolving packaging policies in such an environment.

Alternatively, you implement one package manager on the computer (e.g. .deb or .rpm) and create converters to ensure that Python eggs ship as Debian-system-consistent .deb packages, etc. This is the most flexible approach: it means that only one place needs to change to make installed software match any changes in Debian policy, and it stops Python developers from being locked into the deb format and policy rules. This is what is done today.
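
For the Python case, one existing converter of this kind is stdeb; a hedged sketch of that flow, assuming python-stdeb and the usual Debian build tools are installed and using a made-up sdist name:

    #!/usr/bin/env python3
    # Sketch of the converter approach for Python: turn an upstream sdist
    # into a Debian source package with stdeb's py2dsc, then build it with
    # the normal Debian toolchain so dpkg tracks it like any other package.
    # The tarball name is an example; assumes python-stdeb and dpkg-dev.
    import subprocess
    from pathlib import Path

    sdist = "example-module-1.0.tar.gz"  # hypothetical upstream sdist

    # 1. Convert the sdist into a Debian source package under ./deb_dist/
    subprocess.run(["py2dsc", sdist], check=True)

    # 2. Build a binary .deb from the generated source package.
    srcdir = next(p for p in Path("deb_dist").iterdir() if p.is_dir())
    subprocess.run(["dpkg-buildpackage", "-rfakeroot", "-uc", "-us"],
                   cwd=srcdir, check=True)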

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 9:06 UTC (Thu) by mjthayer (guest, #39183) [Link]

> Alternatively, you implement one package manager on the computer (e.g. .deb or .rpm) and create converters to ensure that Python eggs ship as Debian-system-consistent .deb packages, etc. This is the most flexible approach: it means that only one place needs to change to make installed software match any changes in Debian policy, and it stops Python developers from being locked into the deb format and policy rules. This is what is done today.

What about having, say, apt/dpkg as the meta-manager and CPAN/whatever plugins for it, so that apt-get commands also search through CPAN and so on? Of course, one is then relying on both the Debian people and the CPAN people to test that the two work well together. I don't think that is such a big issue though, because in the end the individuals who would be doing that testing are the same ones who package and test the stuff in Debian today.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 13:51 UTC (Wed) by vonbrand (subscriber, #4458) [Link]

Right, the "get the latest version of <foo> out" is the way to go. But for data (just as for code) it is imperative that someone look it over and check it doesn't break (at least not too badly) and keep possible dependencies up to date (What use is the latest clamav database, which needs a new program, if my installation is outdated?). Exactly good plain old packaging work. What distributions did was precisely to take most of that burden from the hapless end user, and we thank them heartily for it. But that can work only if your package management system knows about all stuff you have on your machine, i.e., one packaging system only. (Yes, I remember my days futzing around with open source packages installed from code on our Sun machines. Fond memories, to be sure; but handling that mess was a real pain. And then installing unofficial sources over Slackware or early Red Hat didn't go much better.)

BTW, the benefit of a single, standardized package over "get to the sources and build your own, custom-tailored package" (be it for code or data) is that there are others out there feeling the same pain if something goes awry, so it will get fixed (even faster if you lend a hand). There are others with the exact same stuff installed whom you can ask in case of trouble.

Besides, what is this "client/server where you have to ensure client and server are matched" nonsense? The solution to that one has been to make the client be the web browser, or use standardized interfaces to databases, or whatnot. "Client/server" applications in this sense are so old-fashioned... No "this is the package that the Internet has to install to work with my stuff" type of solution will ever work (thank $DEITY for that).

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 13:50 UTC (Wed) by nlucas (guest, #33793) [Link] (14 responses)

What now? Well, for desktop usage I don't think it makes sense to have to wait for the next version of the distribution before installing newly released versions of applications.

If a new Skype version is released with some cool new feature, the user will not understand why they have to wait 6 months and be forced to update everything (especially if the new distro version has some radical changes, like KDE 4.0, which are far from stable).

It should be the distro's mission to make sure every package in the new distro version is compatible with the previous version, so the user can just take a deb from the next distro version and install it on the LTS version (no big problem if that forces the update of many other packages).

This would have the advantage of providing a stable user ABI across versions and attracting ISV interest. This is one of the big advantages of Windows systems over Linux distros.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 14:02 UTC (Wed) by vonbrand (subscriber, #4458) [Link] (11 responses)

The problem isn't "get latest <foo> out to users", the problem is making sure the whole system works reasonably well. Having a few thousand packages installed, each available in a few dozen versions, is just impossible to check for minimal sanity.

BTW, apropos the much-touted Windows compatibility, you will regularly run into programs that plainly don't work on the latest version of the operating system (ever wondered why many people have been reluctant to follow the "service pack" treadmill?). Yes, this has happened to me with a Microsoft game for Win98, for which all requirements were fulfilled on WinNT. The only thing that worked was the startup splash; after that it just crashed badly. And that even in the face of heroic efforts by Microsoft people to make sure that "important legacy applications" still work, even if they rely on undocumented behaviour, plain bugs, or even outright security failures. "Not so important legacy applications" (like some my daughter uses frequently) just don't work at all on anything except their native WinXP right now.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 15:20 UTC (Wed) by nlucas (guest, #33793) [Link] (10 responses)

It's very hard, but not impossible.

Don't forget there are many times more incompetent developers on Windows than all Linux developers combined.
Win9x and WinNT are such different systems that it's hardly surprising incompatibilities happen. Also, Win9x is already an unsupported system (which is one of the best things since sliced bread for any sane Windows developer starting a new application).

If a developer actually follows the Microsoft guidelines on how to write a Windows program, then it will work from Windows 2000 to Windows 7. But it's not an easy task (although not as difficult as writing a complex Linux application that will work on every system).

Microsoft doesn't maintain bug and undocumented-behavior compatibility for everything, just when a big enterprise pays them or some very famous application relies on it. It's not in their interest to do so, so don't expect them to do it for free.

One example is the anti-virus programs that stopped working because they often rely on undocumented APIs. Microsoft finally decided security was more important than compatibility and dropped them.

Anyway, I only sporadically do any Windows development nowadays, so I don't care much anymore. I just don't like reading myths, even if they are related to a common "enemy".

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 19:43 UTC (Wed) by vonbrand (subscriber, #4458) [Link] (9 responses)

Yes, the proportion of incompetent developers is roughly constant across systems. In any case, MSFT advertised a program written by them to work as long as certain requirements were met, and they were met by WinNT. It just didn't work, period.

No, MSFT didn't "decide security was more important", they just pulled the rug out from under antivirus vendors when they decided to (try to) take over that segment.

"Writing a complex application that works on every Linux system" is reasonably easy (much easier than it was to write anything to run on a reasonable range of Unix systems back when), as long as you ship source. If you try to ship binaries, you are out of luck.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 21:31 UTC (Wed) by mjthayer (guest, #39183) [Link] (8 responses)

> "Writing a complex application that works on every Linux system" is reasonably easy (much easier than it was to write anything to run on a reasonable range of Unix systems back when), as long as you ship source. If you try to ship binaries, you are out of luck.

From my personal experience, even shipping binaries of complex applications is easier than people believe, as long as you are willing to build them for all the architectures you want to support (most likely the two x86 ones).

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 11:13 UTC (Thu) by nlucas (guest, #33793) [Link] (7 responses)

Well, that's one thing I agree with. Shipping binaries is the easy part.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 11:23 UTC (Thu) by mjthayer (guest, #39183) [Link] (6 responses)

> Well, that's one thing I agree with. Shipping binaries is the easy part.

When I said "shipping", I included building and solving binary compatibility issues in that. It really isn't too hard: build on the oldest system you want to support, decide which dependencies you can rely on the system to provide and ship the rest statically, and load a couple of things at runtime if in doubt. There are still a couple of gotchas (the autopackage web page has good information on the subject, although C++ ABIs are no longer a problem, and glibc is getting better at not pulling in new features unless you actually use them), but it is still quite feasible. And once you have solved the problems, they stay solved on the whole.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 11:31 UTC (Thu) by nlucas (guest, #33793) [Link] (2 responses)

Right. That's exactly what I was agreeing to.
And the current C++ ABI stability plays a big part in that, in contrast to the hell it was earlier (and worse on Windows, because there was more than one mainstream compiler).

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 11:35 UTC (Thu) by mjthayer (guest, #39183) [Link] (1 responses)

> And the current C++ ABI stability plays a big part in that, in contrast to the hell it was earlier (and worse on Windows, because there was more than one mainstream compiler).

I thought that Windows people used COM for ABI because it is well defined.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 15:25 UTC (Thu) by nlucas (guest, #33793) [Link]

COM has nothing to do with C++ objects, although there are Microsoft extensions to the C++ language that make it easy to work with COM from C++.
You can use COM in a pure C program.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 15:18 UTC (Thu) by foom (subscriber, #14868) [Link] (2 responses)

> although C++ ABIs are no longer a problem

Hm. Unfortunately, I think they will be again at some point soon. It sounds like when G++'s C++0x support is finalized, they're going to have to change the ABI of std::string and maybe a few other core classes. (okay, so, that's not a C++ ABI break, only an STL ABI change).

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 12, 2010 13:25 UTC (Mon) by HelloWorld (guest, #56129) [Link] (1 responses)

Do you have any reliable sources that state this?

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 15, 2010 14:51 UTC (Thu) by nix (subscriber, #2304) [Link]

There have been discussions to this effect on the GCC mailing list. It's not unsubstantiated scuttlebutt. (Perhaps a way will yet be found to avoid a std::string ABI change, but none is yet evident.)

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 17:11 UTC (Wed) by jrn (subscriber, #64214) [Link] (1 responses)

> It should be the distro mission to make sure every package in the new distro version is compatible with the previous version, so the user can just take a deb from the next distro version and install it on the LTS version (no big problem if that would force the update of many other packages).

You’re in luck, then. For unrelated reasons (being able to recover from an interrupted upgrade), this is part of Debian policy.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 18:16 UTC (Wed) by nlucas (guest, #33793) [Link]

Then either I have been unlucky with Ubuntu the few times I tried this, or they are not doing exactly the same thing (which really doesn't surprise me that much).

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 15:25 UTC (Wed) by xxiao (guest, #9631) [Link] (4 responses)

I don't get his point at all... Besides deb there are rpm, ipk, apk, and so on, and there are also LFS and Gentoo. The ecosystem is pretty diverse and you can always find some distribution and/or framework to start with; in fact, we may already have too many choices. Ubuntu is just one case in this big picture, so what is he worrying about?

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 18:36 UTC (Wed) by Frej (guest, #4165) [Link] (3 responses)

The sad part of packaging systems is that they remove the direct contact between those who produce the product (Mozilla) and their actual customers.

I.e., the app producer doesn't have the option of actually supporting their users. And for users it's quite disruptive to update everything every 6 months to get a new version of, say, Chrome and Rhythmbox.

Not always...

Posted Jul 7, 2010 21:12 UTC (Wed) by marduk (subscriber, #3831) [Link]

> The sad part of packaging systems is that they remove the direct contact
> between those who produce the product (Mozilla) and their actual customers.

> I.e., the app producer doesn't have the option of actually supporting
> their users.

OTOH if you need support for how one application behaves with another, or with the OS itself, which the app producer may have little or no knowledge of, then that's where you need the support from the package distributor. This is frequently the case in my experience. Also, there's nothing keeping you from submitting a bug to the product's bug tracking system. Add to this the fact that many people are actually paying for support from the distributor and not the producer, and it makes more sense to contact the distributor. In the same way, I don't have to hunt down the producer of the battery in my laptop when it's faulty; I contact the laptop's manufacturer because that's where I got the battery.

> And for users it's quite disruptive to update everything every 6 months to
> get a new version of, say, Chrome and Rhythmbox.

That's pretty distro-specific. Some distros don't have arbitrary time spans on package updates. :) Or you could always roll your own.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 7, 2010 21:35 UTC (Wed) by vonbrand (subscriber, #4458) [Link]

You mean that is the happy part... as a developer I certainly don't want to have to deal with OpenSolaris, Red Hat Enterprise Linux, SUSE Linux Enterprise Server, Fedora, Debian, Gentoo, and the list goes on forever. On top of struggling with the Windows and MacOS du jour, and a few versions back.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 7:21 UTC (Thu) by niner (subscriber, #26151) [Link]

"And for users it's quite disruptive to update everything 6 months to get a new version of say... chrome and rhythmbox."

Are there still distros out there that don't have repositories with upgraded applications? If I need a newer version of an application than my distribution provides, I just head over to http://packages.opensuse-community.org/ and find a repository containing it. No need to wait 6 or 8 months.

Zimmerman: We’ve packaged all of the free software…what now?

Posted Jul 8, 2010 19:09 UTC (Thu) by pliden (guest, #68580) [Link]

Is it not exactly this problem which the Psys library is supposed to solve? (http://gitorious.org/libpsys/pages/Home)

As far as I have understood it, it will allow simple, independent installations of packages that integrate nicely with the current package system. It will work with any package system as long as a back-end is written for that package system.

