Debian squeezes out Chromium

By Jake Edge
September 8, 2010

Web browsers tend to be fast-moving targets. They are frequently updated both to close security holes and to add new functionality. Two of the most popular free software browsers, Firefox and Chromium, also have fairly short lifecycles, which often requires distributions to backport security fixes into releases that upstream no longer supports. Both Mozilla and Google are more focused on the Windows versions of their browsers—where application updates with bundled libraries are the norm—which makes life difficult for Linux distributions. So difficult that it appears Debian will be dropping Chromium from the upcoming 6.0 ("Squeeze") release.

On September 1, Giuseppe Iuculano posted to debian-release asking that the release team allow him to replace Chromium version 5, which was in the testing repository, with version 6. Version 5 will no longer receive updates from Google, and Iuculano was concerned that it would be too difficult to backport patches for some "security issues" with SVG in WebKit because of major refactoring in that codebase. Roughly a week later, he uploaded Chromium 6 and asked that the team either unblock it so that it could move from unstable into testing, or remove Chromium 5 from testing. The team opted for the latter.

One of the big problems is that Chromium uses a WebKit version that is bundled into the browser source, rather than using a particular released version of the library. Each time Google updates the browser, a new WebKit comes with it, and the old browser goes into an unsupported state. In order to keep using the v5 browser, any security fixes from the new browser and bundled libraries would have to be reflected into the older code—not a small task by any means.

In addition, Chromium versions come fast and furious: v5 was only supported for roughly two months, so the release team was worried that the same would be true of v6. Meanwhile, Squeeze has been frozen, which means that new features are not being added. For a release that prides itself on stability, there really was no choice but to drop Chromium.

In response to Debian project leader Stefano Zacchiroli's request for information to better understand the decision, release assistant Julien Cristau put it this way:

We were given a choice between removing chromium-browser from testing, or accepting a diff of
22161 files changed, 8494803 insertions(+), 1202268 deletions(-)
That didn't seem like much of a choice. I don't have any reason to believe the new version won't have the same problem 2 months (or a year) from now, and as far as I know neither the security team nor the stable release managers usually accept that kind of changes in stable.

Zacchiroli noted that he is a Chromium user and wants to have a clear story for why the browser is not in Squeeze, both for users and for upstream. While it won't be available in the Squeeze repository (testing right now, but stable once it is released), Chromium will likely still be available in the newly official backports repository. That led Michael Gilbert to suggest a slightly different interpretation of what "supported in stable" might mean:

I think that this need is justification to declare backports "officially supported by the debian project". Thus when asked this question, you can point to the fact that chromium is indeed supported on stable, just via a different model than folks are used to. That is of course assuming someone is willing to support the backport. I may do that if Giuseppe isn't interested.

Having chromium not present in stable proper helps the security team immensely.

While it may help the security team, there are still some things to be worked out for users of the backports repository. Currently, packages that come from backports do not get automatically updated when doing an apt-get upgrade (or its GUI equivalent). That would mean that users would have to remember to go grab the latest Chromium whenever a security update came out. Since backports has become an official part of Debian, there is thought that changing the behavior to pick up updates from there would make sense.
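
For a concrete picture of that manual step (a sketch only: the archive URL, suite name, and package name below reflect the lenny-era conventions and may differ), a user adds the backports archive and then explicitly asks apt for the package:

# added to /etc/apt/sources.list (illustrative mirror URL)
deb http://backports.debian.org/debian-backports lenny-backports main

# a plain "apt-get upgrade" will not pull newer versions from backports;
# the backported package has to be requested explicitly each time:
apt-get update
apt-get -t lenny-backports install chromium-browser

That explicit "-t lenny-backports" request is the step users would have to remember to repeat whenever a security update came out.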

It's not just Chromium that is affected, however. Squeeze will ship with Iceweasel—Debian-branded Firefox—in its repository, but there seems to be a belief that over time, as the shipping Iceweasel version falls further and further behind, it too will be coming from backports. That would give further reason to make backports updates automatic, and that seems to be the consensus on the debian-release list.

This is a problem that we haven't heard the last of. For any distribution with a long-lasting support window (like Ubuntu LTS, Debian, or any of the enterprise distributions), it is going to be very difficult to keep supporting older browsers. The alternative, which is the direction that most are taking, is to update to the latest upstream version throughout the distribution's support window.

New browser versions often require newer system libraries, though, which may conflict with other applications' requirements. Either the browser can be backported to use the libraries that shipped with the distribution, or the newer libraries can be bundled with the browser. Ubuntu has taken the latter approach, choosing to bundle some libraries for 10.04 LTS.

It's also possible that eventually it may be more than just web browsers that adopt a fast release schedule with fairly short support windows. If that happens, it seems likely that distributions will need to pool their resources to backport fixes into the older releases or just follow the upstream releases. It will be especially prevalent for cross-platform software, where the Windows or Mac OS X versions are the most popular. Bundling libraries is the usual path taken by applications on those platforms, so they don't suffer under the same constraints that Linux systems do.

Keeping up with what seems to be an ever-increasing pace of development, while still maintaining stability for users, is a tricky problem to manage. Distributions may find that it is one they can't manage alone—or at all. If the latter, distributions will increasingly have to rely on stability coming from the upstream projects, but there is tension there as well.

Fast-moving projects are likely to make changes to fundamental components, changing both the program's behavior and its interface. While that doesn't necessarily make the application unstable in the usual sense of the term, it does change things in ways that stability-oriented distributions try to avoid. It's a difficult balancing act and we'll have to see how it plays out.



On stability

Posted Sep 9, 2010 4:21 UTC (Thu) by ringerc (subscriber, #3071) [Link]

The big issue with fast-changing apps and stability is that they make it impossible to win the "bug race".

You fix the worst bugs in 1.0. Meanwhile, 1.1 has come out with fixes ... and a whole bunch of bugs^Wfeatures. By the time you fix those, 2.0 comes out with yet another set of exciting quirks to discover and work around. Things never sit still for long enough for issues to be ironed out properly.

This is OK if the fast-moving upstream has really good automated testing and QA. Even then, though, there'll be integration bugs, platform-specific quirks, etc to deal with, and it becomes very frustrating chasing all those when all you want to do is keep up with security fixes.

On stability

Posted Sep 9, 2010 4:47 UTC (Thu) by roelofs (guest, #2599) [Link]

The big issue with fast-changing apps and stability is that they make it impossible to win the "bug race".

Indeed. But the bigger issue, I think, is: what makes Firefox and Chromium so special that they have to pop out a major new, compatibility-busting release every few months? Internet Exploder certainly doesn't; Safari doesn't (AFAIK); and neither does Opera (also AFAIK). This seems like a classic case of cranio-rectal impaction on the part of upstream.

(Of course, if there is a reasonable excuse, I'd love to hear it.)

Greg

On stability

Posted Sep 9, 2010 5:32 UTC (Thu) by drag (subscriber, #31333) [Link]

> Indeed. But the bigger issue, I think, is: what makes Firefox and Chromium so special that they have to pop out a major new, compatibility-busting release every few months?

It's probably related to the issue of how Linux distributions can't give enough of a crap to put serious effort to care about binary compatibility with applications. You know; so it's possible for software vendors to provide and users to take advantage of newer features in newer software and/or rely on proven older software without tearing their hair out.

It's probably a lot easier for Mozilla and Google to simply bundle the dependencies they require with their software than it is to leave it up to the distributions and package management systems to do it for them.

On stability

Posted Sep 9, 2010 6:32 UTC (Thu) by rvfh (subscriber, #31018) [Link]

I think Greg's question was not about bundled libraries, but about the rhythm at which new versions come out for Firefox/Chromium compared to, say, Internet Explorer.

IE v8 has been out for some time now, and it looks like there are two schools of thought in the browser landscape: the fast and furious on one side and the stable on the other.

It would be interesting to understand why Firefox is still such a fast-moving target after so many years of existence.

Well, yeah.

Posted Sep 9, 2010 6:56 UTC (Thu) by khim (subscriber, #9252) [Link]

It would be interesting to understand why Firefox is still such a fast-moving target after so many years of existence.

Because the web is changing? 3D, Drag-N-Drop, WebStorage, WebM, file access, etc. - all these features can only be used by web developers if old versions without them are declared obsolete. Microsoft's answer to this challenge is called Silverlight: you can use all these nifty features there already, and then it does not matter whether the user runs IE6 or IE9. Chrome/Firefox are trying to propose a different answer, but it will only work if features are pushed to end users. It does not matter to a web developer whether a feature was committed to a git repository a year ago or just yesterday; what matters is when the feature is pushed to end users. So it does not matter when a new version is released, what matters is when the old version is killed: Google showed this nicely with Chrome, and Firefox followed.

Well, yeah.

Posted Sep 10, 2010 0:38 UTC (Fri) by ras (subscriber, #33059) [Link]

I don't think it is the web. Yes, HTML 5 is pushing things along a bit now. But the last major revision was HTML 4, which was released in 1998. That is hardly a frantic rate of change. Ditto for the other major underlying standards - CSS, SVG, Javascript. WebM has nothing to do with it - it is just a codec, a plugin.

I can think of two reasons for the pace browsers are developed at. The first is that it is a monumental task, and when you are undertaking a monumental task you break it down into small bits and release early, release often, otherwise you will be overwhelmed. This is because, as standards go, those published by the W3C are real pricks. We have enormous teams of programmers implementing them. As I said, they haven't changed much in 12 years, yet a seemingly simple thing like rendering ACID3 correctly is a huge challenge. This is what happens when you publish the standard before writing the code. The IETF's policy of producing working code and then publishing the standard is how it should be done. The difficulty of implementing fully compliant HTML, CSS, and SVG is a great example of why it should be done that way.

The second reason is to do with marketing. I used to use, and still occasionally use, Firefox 1.0.1. It has UI bugs, it crashes, and it renders things badly. But it was fast, and if they had just fixed those bugs I would be happy running it today. (Although maybe not for much longer, given the advent of HTML 5.) Fixing those bugs didn't require millions of lines of code to change every few months. What does require a new release every few months is a mindshare competition. (Why do you think Ubuntu does it?) What triggers most of those millions of lines of changes is new eye candy - something like rearranging the tabs and title bar, or the introduction of an "awesome bar".

So it is a combination of those two things - the size of the task means implementing a web browser will take hundreds if not thousands of man-years, and that means a steady stream of releases along the way. This is not unlike what we see with the kernel or any other large project. We know Debian can handle that. But then you mix those necessary changes with the avalanche of bubble and froth created by a mindshare competition - i.e. change for no reason other than that it generates publicity - and you become incompatible with a distribution that values stability over most other things.

Well, yeah.

Posted Sep 10, 2010 1:49 UTC (Fri) by iabervon (subscriber, #722) [Link]

Actually, the practical effect of the standards being such a pain is that, despite there not being new blessed versions of the standards documents, there is a lot of change in the de facto standards. This includes things that have always been in the standard, but which nobody previously expected to work. It also includes things that were never specified, but where enough popular browsers did something sensible and the same that people came to rely on that being available. Just because nobody wrote up standards documents of what was expected for a while doesn't mean that the de facto standards didn't keep changing.

Well, yeah.

Posted Sep 10, 2010 2:23 UTC (Fri) by ras (subscriber, #33059) [Link]

> there is a lot of change in the de facto standards.

I disagree. The de facto standards trail the real ones by a large margin.

That is because the de facto standards are to a large extent determined by the oldest browser in use. Until very recently that was IE 6, which was released in 2001 and didn't comply with the standards in force even back then. Google and Facebook only just stopped supporting it.

This "the browsers must change because the web is changing" thing is a complete furphy. Certainly the web is changing. But that is because people are writing millions of millions of lines of javascript as they figure out how to use this "new" platform. But just because the stuff we build on top of the underlying platform changing at a dizzying rate doesn't mean the platform itself must be changing. On the contrary, it hasn't, until now with HTML5. Inventing new ways to do web pages no more depends on HTML/javascript change than the advancement of the linux kernel depends on gcc changing.

Well, yeah.

Posted Sep 10, 2010 20:57 UTC (Fri) by njs (guest, #40338) [Link]

Sure, but one result of those millions of lines of javascript is that developers are running into limitations of the web platform; the platform has to change if it wants to be competitive with Flash/Silverlight/etc.

On stability

Posted Sep 9, 2010 6:54 UTC (Thu) by josh (subscriber, #17465) [Link]

>> Indeed. But the bigger issue, I think, is: what makes Firefox and Chromium so special that they have to pop out a major new, compatibility-busting release every few months?

>It's probably related to the issue of how Linux distributions can't give enough of a crap to put serious effort to care about binary compatibility with applications. You know; so it's possible for software vendors to provide and users to take advantage of newer features in newer software and/or rely on proven older software without tearing their hair out.

You have that entirely backward. Linux distributions are the *only* ones that end up actually addressing library compatibility issues. Software vendors used to systems with no sensible library system (Windows and OS X) treat Linux exactly the same way, either because they want to have a pile of forked upstream libraries without pushing those changes upstream or maintaining a real library fork, or simply because they don't care about handling Linux correctly and the least common denominator requires shipping everything with the application.

Meanwhile, Linux distributions get stuck trying to figure out whether they can build each crazy upstream application against the system version of the library without breaking whatever crazy assumptions the bundled version allowed, and without rewriting large parts of the application's build system.

On stability

Posted Sep 9, 2010 9:06 UTC (Thu) by fb (subscriber, #53265) [Link]

> or simply because they don't care about handling Linux correctly and the least common denominator requires shipping everything with the application.

There is also the fragmentation that makes handling things correctly harder. From the perspective of the software vendor, "Desktop Linux" is a blip on the radar, and a very fragmented one. (How many Fedora, Ubuntu, Suse, Debian etc versions are there to test against?)

The software vendor can rationally decide that fewer users will be hit by a bug if it puts even more resources into testing on Windows and Mac OS than into testing things on this large matrix of different Linux distributions and all their versions in use.

On stability

Posted Sep 9, 2010 17:05 UTC (Thu) by josh (subscriber, #17465) [Link]

That's what "unstable" versions of distributions are for; users of those distributions will happily do the testing for you, and report a pile of bugs either upstream or by way of the distribution. That still doesn't explain shipping the (modified) source code to umpteen libraries in the application source tree.

"Handling things correctly" doesn't mean "test with every obscure Linux distribution", or even "test with the top N distributions". It means "make sure it builds in whatever reasonably up-to-date distribution the developers run, handle shared library versioning sanely, document the list of build dependencies, let Linux distributions package it, and work with them when they report bugs to you".

Well, it's a race...

Posted Sep 9, 2010 6:46 UTC (Thu) by khim (subscriber, #9252) [Link]

Internet Exploder certainly doesn't; Safari doesn't (AFAIK); and neither does Opera (also AFAIK).

Internet Explorer certainly released often enough back when it had competition (IE1: August 1995, IE2: November 1995, IE3: August 1996, IE4: September 1997, IE5: March 1999, IE5.5: July 2000, IE6: August 2001). When it reached 90%+ market share, Microsoft decided to kill HTML and replace it with XAML. When that failed it started releasing new versions again - albeit still sluggishly. Safari enjoys Mac OS lock-in (the Windows version is a joke), while browsers which don't have such lock-in need to innovate faster, so both Firefox and Opera released a few versions per year (sure, they were named "minor versions", where Chrome calls its versions "major", but this is just PR). For example Opera released version 10.60 just two months ago.

The only recent difference is the faster obsolescence of older versions - and this is related to the HTML5 push: if we are proposing it as a viable alternative to Flash or Silverlight then we must guarantee that new features will reach the majority of users quite fast. And the only way to achieve that is to aggressively upgrade clients - as Flash and Silverlight do.

Well, it's a race...

Posted Sep 9, 2010 15:53 UTC (Thu) by smoogen (subscriber, #97) [Link]

History of things:

Internet Explorer was originally Spyglass Enhanced Mosaic. Spyglass was a company whose idea was that, by making browsers that companies could specialize, it could get into the corporate market versus the consumer market that Netscape was focused on. The problem is that consumer markets move very, very quickly and corporate ones do not. So Spyglass ended up in a game of catch-up with its 5 programmers against Netscape's 20. However, Spyglass thought it had an ace up its sleeve with a deal with Microsoft... until they realized that Microsoft could hire hundreds of engineers to rework their source code.

IE1->IE4 were mostly Spyglass code with lots of additions from Microsoft. IE5 was mostly Microsoft with some stuff from the remains of Spyglass (who had gone away in 1998 or so because they had not asked for a per copy payment from Microsoft in exchange for the source code... ) I have been told that sometime after Microsoft reached 90%, they repurposed most of the engineers working on the browser to other projects thus the slow down of 'features' until Mozilla restarted the race.

On stability

Posted Sep 9, 2010 8:46 UTC (Thu) by epa (subscriber, #39769) [Link]

What do you mean by 'compatibility-busting'? New Firefox releases (and, I assume, Chrome and Chromium) have pretty good compatibility with everything except old plugins. A new release may add features, but the web being what it is, it's rare for browsers to break compatibility with existing sites or web applications.

That is why most distributions, including the commercial ones with long support windows, are happy to package the latest browser versions as they come out. Keeping your users on the same four-year-old browser (with only security fixes backported) is not long-term support worthy of the name.

On stability

Posted Sep 9, 2010 9:29 UTC (Thu) by NAR (subscriber, #1313) [Link]

A new release may add features, but the web being what it is, it's rare for browsers to break compatibility with existing sites or web applications.

Actually Opera 10.60 sometimes crashes on gmail, so I went back to 10.10. Bug report was sent.

On stability

Posted Sep 10, 2010 23:43 UTC (Fri) by giraffedata (subscriber, #1954) [Link]

A new release may add features, but the web being what it is, it's rare for browsers to break compatibility with existing sites or web applications.
Actually Opera 10.60 sometimes crashes on gmail, so I went back to 10.10. Bug report was sent.

I assume accidental incompatibility, which this surely is, isn't what the OP had in mind when complaining of compatibility-busting updates.

One thing I thought of when I read that is compatibility with user procedures. If the user knows how to use Opera 59, will he be able to use Opera 60 the same way? In the eight years or so I've been using Opera, both on Linux and Windows, I've upgraded two or three times and each time features I used disappeared. Sometimes they disappeared for good; other times they went into hiding, bound to a different key or something. Consequently, my policy now is not to upgrade. If I were using a system where updates were essentially mandatory, I'd be pretty bothered.

On stability

Posted Sep 9, 2010 17:04 UTC (Thu) by foom (subscriber, #14868) [Link]

New firefox requires new xulrunner and rendering libraries, which are not always compatible, and may require recompiling all dependent applications. If it was only firefox and thunderbird, and nothing else used the rendering engine, then sure, you could simply upgrade to the latest version without thinking. But there's other *desktop* apps involved too...

On stability

Posted Sep 9, 2010 21:41 UTC (Thu) by roelofs (guest, #2599) [Link]

What do you mean by 'compatibility-busting'?

I meant more or less what the article was talking about, at least in part--i.e., the bundling of custom, system-incompatible libraries/toolkits such as Chromium and Webkit. (Other articles have covered Firefox's bundling and occasional forking of system libraries, not to mention its API disaster called "xulrunner.") But beyond that, there's the issue of "self-compatibility," which is what's relevant to the backporting of security fixes. Granted, it's unusual to ding a project on the pace of changes to its internals, but then again, in today's desktop systems there's no greater attack surface than the web browser (and its dependencies). Firefox's never-ending stream of vulnerabilities makes 1990s sendmail look good.

I'm not a complete luddite; I get the need for apps that are as central to the user experience as browsers are to innovate, add features, etc. But with that great power comes great responsibility--i.e., to make the browser significantly more secure than the average desktop app--and I'm not seeing an acknowledgment of that responsibility. Indeed, the short support cycles and general level of code churn that limits the ability of others to provide such support are arguably an abdication of that responsibility. (And, for what it's worth, I really don't see a need for the feature cycles to be so rapid. What are the appalling omissions in, say, a 2007 browser--or even a 2009 one--that are blocking the deployment of critical new web stuff?)

Note that nothing I've said implies that distributions should not be able to package newer releases if that's what makes sense for them. My beef is with the other end, i.e., development practices that penalize those distros (or end users) that don't want to upgrade more than once every couple of years.

Greg

On stability

Posted Sep 9, 2010 22:52 UTC (Thu) by jspaleta (subscriber, #50639) [Link]

Look at it this way: Google has a master plan for its Chrome browser that quite frankly does not include the needs of other Linux distributions or their users. Chromium the project has a codebase which is the leading edge of that plan for Chrome the product.

Do you really think that ChromeOS is going to have a browser with this sort of rate of churn? The rapid rate of development for Chromium right now has a lot to do with Google's overall plan for its own product line. They have real consumer device targets in mind for Chrome and ChromeOS and what you are seeing is the development run-up towards very specific end-goals.

Let's face it, other Linux distributions are not the target audience and are not driving the development curve. But the development curve does make sense if you look at stable ChromeOS deployments as the end goal of the rapid development push that is going on now with Chromium. We can get as mad as we want about that reality, and it's not going to make any difference at all.

-jef

On stability

Posted Sep 10, 2010 15:24 UTC (Fri) by nevyn (guest, #33129) [Link]

> Do you really think that ChromeOS is going to have a browser with this
> sort of rate of churn?

Yes. Look at the Nexus One, that has had ~4 download and reboot OS upgrades in the 6 months I've had it.

Release engineering and stability is something old people talk about.

On stability

Posted Sep 10, 2010 16:57 UTC (Fri) by jspaleta (subscriber, #50639) [Link]

Which version of Android are you running? 2.1 as originally shipped or did you upgrade to 2.2?

On stability

Posted Sep 10, 2010 21:03 UTC (Fri) by nevyn (guest, #33129) [Link]

2.2 now, I guess. I bought it in Feb. and hit the update button whenever it told me to (which, as I said, is like 4 times).

The apps on it are even worse; it was a significant timesaver when the last OS update allowed me to turn on automatic updates for them.

On stability

Posted Sep 10, 2010 18:31 UTC (Fri) by bronson (subscriber, #4806) [Link]

> Release engineering and stability is something old people talk about.

True! I nominate this for a quote of the week.

On stability

Posted Sep 9, 2010 8:51 UTC (Thu) by fb (subscriber, #53265) [Link]

> what makes Firefox and Chromium so special that they have to pop out a major new, compatibility-busting release every few months? Internet Exploder certainly doesn't; Safari doesn't (AFAIK); and neither does Opera (also AFAIK). This seems like a classic case of cranio-rectal impaction on the part of upstream.

When it's the Linux kernel release cycle, the words "release early, release often" come to everyone's mind, and the release speed is something of a lesson to be learned. An achievement of excellence with regards to software development.

Isn't the same thing that Firefox and Chromium are doing? Releasing early, and releasing often?

[...]

Firefox and chromium development cycles are focused on desktop users.

The position of "stability" in the priorities of many Linux distributions is IMO based on a "server-room mentality": stability cannot be at _any_ risk, and there is a sys-admin. If new features are needed, the admin will manually do something about it.

Desktop users, in general, live a different life: (i) there is no "professional full-time admin" for the box (it must "just work") (ii) the whole point of that computer is to browse the internet, print flight tickets, and use VoIP. They need the features, and _some_ stability risk is an acceptable trade off. It just so happens that most desktop Linux users are comfortable as sys-admins.

Off-topic: FWIW _all_ the desktop Linux users I knew during my PhD who were not comfortable as sys-admins are now Mac users. Every time I asked their reason for migrating (these were people who had Linux installed at home) the answer was, in essence, that Mac OS didn't require them to play sys-admin in order to get things to work.

On stability

Posted Sep 9, 2010 10:37 UTC (Thu) by nicooo (guest, #69134) [Link]

> When it's the Linux kernel release cycle, the words "release early, release often" come to everyone's mind, and the release speed is something of a lesson to be learned. An achievement of excellence with regards to software development.

The kernel doesn't bundle 17 different libraries.

On stability

Posted Sep 9, 2010 12:14 UTC (Thu) by fb (subscriber, #53265) [Link]

>> When it's the Linux kernel release cycle, the words "release early, release often" come to everyone's mind, and the release speed is something of a lesson to be learned. An achievement of excellence with regards to software development.
> The kernel doesn't bundle 17 different libraries.

How is that related?

As far as I had understood both "roelofs" and "ringerc" were complaining about release speed, the time a release is maintained, and new features bringing new bugs. AFAICT that has nothing to do with bundling of libraries.

On stability

Posted Sep 9, 2010 13:29 UTC (Thu) by nicooo (guest, #69134) [Link]

The excuse for bundling libraries was that it helps them release faster.

On stability

Posted Sep 9, 2010 15:39 UTC (Thu) by ringerc (subscriber, #3071) [Link]

In truth I really like rapid releases - for desktop software on home machines, and in development tools/libraries. Bugs in software I use myself I can work around or fix, or I can roll back to an older version until the issue is fixed in a later release.

It's only when I have my sysadmin hat on and am responsible for the reliability of a network of machines used by other people that I start to want stability and time to fix things before the next update breaks everything all over again. Even then, I still really like rapid releases ... it's only when they're accompanied by the total abandonment of any support for any older releases that they bug me.

I do think FF and Chrome may be rushing into the future a little *too* fast - not in the sense of improving too rapidly, but in being unwilling to keep a release or two around for at least security fix purposes.

On stability

Posted Sep 9, 2010 21:54 UTC (Thu) by roelofs (guest, #2599) [Link]

As far as I had understood both "roelofs" and "ringerc" were complaining about release speed, the time a release is maintained, and new features bringing new bugs.

Yup.

AFAICT that has nothing to do with bundling of libraries.

To the extent the bundled libraries are modified, it certainly does. And even where they're not modified, the mere fact that they're additional copies means extra pain when security issues affect (or may affect) them--particularly when the browser version is no longer maintained by upstream.

But that discussion is more relevant to an earlier article.

Greg

On stability

Posted Sep 16, 2010 14:57 UTC (Thu) by robbe (subscriber, #16131) [Link]

The kernel community (Greg KH, with much help from others) do release updates to certain kernels, essentially creating upstream-supported stable branches.

I cannot see Firefox or Chromium upstream doing that. They seem to jettison support for older versions once a new one is out.

Another difference is that the big distros seem to fund quite a bit of manpower working on the kernel. How many people have their Firefox/Chromium work paid for by a Linux distro?

On stability

Posted Sep 9, 2010 10:36 UTC (Thu) by djm (subscriber, #11651) [Link]

Here are two good reasons for FF and Chromium's rapid release cycle: 1) fixing security problems (these are large and complex applications by necessity); 2) adding new features, since the web platform is still evolving fast.

The alternative to this, as you correctly identify, is Internet Explorer: a browser that substantially lags FF and Chromium in standards compliance and features, and in which very serious vulnerabilities are not fixed _for years_ because of the engineering difficulties in doing so.

On stability

Posted Sep 13, 2010 16:19 UTC (Mon) by gerv (subscriber, #3376) [Link]

The evolution of the web platform. In the past fifteen months, Firefox has added at least the following major features:

* Web fonts
* Vast improvements in SVG support
* Major speed improvements for JavaScript (JITs and other magic)
* Out of process plugins, to stop Flash crashing your browser all the time (which required a major internal rearchitecture)
* Profile sync
* Large chunks of HTML5 and CSS3
* Rendering system changes to support mobile and GPU-accelerated graphics

You can't do this with small patches on top of a stable base.

Gerv

On stability

Posted Sep 13, 2010 18:03 UTC (Mon) by Trelane (subscriber, #56877) [Link]

They also added support for Cairo 1.10 in trunk last week. This brings a lot of improvements in the longer term, but is trouble in the short term (my build system now needs retweaking).

Stability of Chrome on Ubuntu

Posted Sep 9, 2010 8:10 UTC (Thu) by Cato (subscriber, #7643) [Link]

Very topical, since the Google Chrome 6.0 that I got from the Google repositories segfaults on startup with Ubuntu 8.04. Not what I expect from a stable browser, and Google Chrome 5.0 works fine.

Stability of Chrome on Ubuntu

Posted Sep 9, 2010 8:51 UTC (Thu) by MisterIO (guest, #36192) [Link]

Actually Chromium 6, which is in Debian Unstable, works better than Chromium 5.

Debian squeezes out Chromium

Posted Sep 9, 2010 8:18 UTC (Thu) by michaeljt (subscriber, #39183) [Link]

I think that this is partly the result of a couple of inconsistencies in the FLOSS/Linux distribution model. Distributions can't decide whether they want to be packagers or downstream developers and end up distributing massively patched pieces of software. And they try to cater to users who want a stable distribution with long-term support, but who still want Chromium and Firefox (probably many even want the latest versions). Nothing wrong with that of course, no one is asking humans to be robotically consistent and logical, but you have to decide where to stop. And it sounds like Debian is moving that way too.

It would be lovely to see a more tiered distribution, with a base set of packages for which the upstream is known to be reliable at maintaining stable releases and quick to apply patches from downstream (that way distributions can patch upstream directly and adopt new bug-fix releases faster in the knowledge that upstream is not likely to put silly things into them) and other packages sorted according to how reliable upstream is known to be, but still adopting upstream's bug-fix releases reasonably fast (or new major releases for more difficult cases like Firefox and Chromium). Then the user can make their own informed choices as to what they install.

I think that my other dream would be for the packaging to be maintained upstream as well (though probably by distribution people), so that when a new version is to be pulled in, distributions can just examine changes and build.

Debian squeezes out Chromium

Posted Sep 9, 2010 17:54 UTC (Thu) by tzafrir (subscriber, #11501) [Link]

So no problem with Chromium bundling WebKit, which is on its way to become a "system library"?

Who maintains the security fixes in the bundled copy of webkit in chromium? Google?

Debian squeezes out Chromium

Posted Sep 9, 2010 19:34 UTC (Thu) by michaeljt (subscriber, #39183) [Link]

> Who maintains the seucirty fixes in the bundled copy of webkit in chromium? Google?

In my picture, yes. It is their choice to bundle a library and assume responsibility for it rather than use the system version. The distribution will do stability testing and supply upstream with fixes as it does now, as long as upstream plays responsibly, and at the same time it will provide its users with the information they need to judge how much of a security risk they are taking by installing Chromium. And Google will no longer have the distribution conveniently taking responsibility in front of the user for the security aspects of their (Google's) decisions.

Debian squeezes out Chromium

Posted Sep 16, 2010 13:49 UTC (Thu) by Lennie (guest, #49641) [Link]

It all sounds a bit like http://www.debian.org/volatile/ as well

The problem is even visible with short distribution cycles

Posted Sep 9, 2010 11:05 UTC (Thu) by cdamian (subscriber, #1271) [Link]

With Fedora, for example, it sometimes happens that a new Firefox is released just after one of the six-month Fedora releases. So users have to live with an "outdated" browser for half a year.

And on the web six months is a long time. There are repositories for Firefox and Chromium which always contain the newest version, but these are usually not as well tested as the distribution packages.

Debian squeezes out Chromium

Posted Sep 9, 2010 11:56 UTC (Thu) by Trou.fr (subscriber, #26289) [Link]

On a not-completely-related note, now that backports are "official", I couldn't find any word about security updates for them. Any idea?

Debian squeezes out Chromium

Posted Sep 9, 2010 15:19 UTC (Thu) by patrick_g (subscriber, #44470) [Link]

According to the FAQ there is no security support.

Debian squeezes out Chromium

Posted Sep 9, 2010 14:10 UTC (Thu) by rhertzog (subscriber, #4671) [Link]

This article comes out one day after the removal of chromium from squeeze and assumes the decision is final. It would not be the first time that a package re-enters testing once the situation is clarified and the security team confirms that it is able to provide some security support for it.

Moritz Muehlenhoff of the security team just confirmed that they were planning security support for chromium much like they do for iceweasel:
http://article.gmane.org/gmane.linux.debian.devel.release...

And the chromium maintainer also agreed to backport security fixes once v6 is no longer the current stable version upstream. I hope it will get back into squeeze before the release.

This is an internal communication failure within Debian that became an external communication failure: the maintainer was frustrated (Julien Cristau removed the package without giving him an explanation), blogged about it, and you picked up the news very quickly (maybe too quickly).

Automatic upgrade of backports

Posted Sep 9, 2010 17:55 UTC (Thu) by ametlwn (subscriber, #10544) [Link]

Currently, packages that come from backports do not get automatically updated when doing an apt-get upgrade (or its GUI equivalent).
This is fixed easily by adding
Package: *
Pin: release a=lenny-backports
Pin-Priority: 200
to /etc/apt/preferences. It is the recommended setting for using Debian backports, see the official instructions.
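
As a quick sanity check (illustrative only; chromium-browser is just an example package, and the priority numbers follow apt_preferences(5)), apt-cache policy shows whether the pin has taken effect:

apt-cache policy chromium-browser
# without the pin the lenny-backports entry sits at priority 1 (NotAutomatic),
# so installed backports packages are never upgraded automatically; with the
# pin it shows 200, which is enough to upgrade already-installed backports
# while still losing to the main archive's default 500 for fresh installs.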

Automatic upgrade of backports

Posted Sep 10, 2010 12:06 UTC (Fri) by wookey (subscriber, #5501) [Link]

Maybe we should just stick chromium and iceweasel into volatile and have done with it.

Debian squeezes Chromium back in

Posted Oct 4, 2010 13:18 UTC (Mon) by zack (subscriber, #7062) [Link]

As of today, chromium is back in Debian Squeeze (which is not released yet, but still in freeze) http://packages.qa.debian.org/c/chromium-browser/news/201...

The lesson to learn here is that, as long as Squeeze (or any other Debian suite, FWIW) is not released, there is still a margin of variability in the software it contains. If that margin weren't there, the suite would have been released already. The point of a freeze, in Debian, is to stop package acceptance into the suite *by default* and to apply thorough scrutiny to what goes in and what gets out.


Copyright © 2010, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds