It's possible that an update could fix it, but if you have applied all the 10.04 updates and still have a problem, move to a newer version. Not all fixes can be backported, and you are now complaining about a version that's over a year old.
Linux Mint goes to 11
Posted Jun 7, 2011 6:12 UTC (Tue) by Cato (subscriber, #7643)
That's one of the big problems with a time-based release strategy for an LTS - good for marketing and potentially bad for stability. In fact I'm not a fan of time-based releases for software that's hard to debug - "release when ready" with thorough automated regression testing is more likely to give good results.
Is anyone working on a distributed automated test suite for Linux distros, a bit like CPAN Testers - http://wiki.cpantesters.org/wiki/WhatIsCPANTesters - or this IBM effort from 2005: http://linux.slashdot.org/story/05/06/05/1426206/Linux-Ke... ?
My feeling is that the testing model for Linux is really quite ad hoc, with little automation and fairly random coverage, so major regressions are introduced (Intel graphics worked fine in Ubuntu 8.04 and earlier), are only found when distros are at the beta or release stage, and often go unfixed.
Posted Jun 7, 2011 18:14 UTC (Tue) by dlang (✭ supporter ✭, #313)
But if you think about it for a little bit, you will see why this is not the case.
How much would it cost to buy one of every model of machine produced this year? Then think about having to go back to machines produced in past years, and about housing and powering all of those machines.
And you really would have to have one of every machine, because of the various oddball bugs in BIOS/firmware/etc. that show up.
Microsoft doesn't have this problem, because it is the hardware manufacturer's responsibility to be Windows compatible, not Microsoft's responsibility to work with random hardware.
Unfortunately, Linux is not in such a dominant position, so Linux has to work around the bugs, and bugs are only found when the code is tested on the specific hardware.
The people who released the Intel drivers in the kernel used in 10.04 thought they had everything working (and I'm sure they tested on plenty of real hardware), but once it got out into the wild, it turned out that a lot of hardware didn't behave the way Intel expected once it was wired into motherboards and driven by various customized BIOSes.
As for the problem of 'fixing' 10.04 without upgrading, this gets to the age-old problem of trying to backport only the 'right' things. The number of changes in each kernel release is massive (approaching 10,000 per release), deciding which of them need to be backported is an inexact science, and the changes involved in major work frequently end up being so large that backporting them causes more stability problems than it solves.
You could try running a newer kernel on your 10.04 box to see if that solves your problem, but for something as fundamental as this, I expect it will take a new kernel to fix it (not just an updated version of the old kernel with backported fixes). Ubuntu may release an optional kernel upgrade for 10.04, but one of the "advantages" of an LTS/enterprise release is that people don't want the kernel to change.
You can't have it both ways; you need to decide between 'no changes' and 'bugs get fixed'.
Posted Jun 8, 2011 11:18 UTC (Wed) by Cato (subscriber, #7643)
Imagine if you could install a nightly alpha version of Ubuntu as a dual-boot, using a separate test filesystem, purely to participate in a wide-scale automated test. Some tests would require human inspection (video/sound problems), but it should be possible to determine automatically whether the system froze or crashed (perhaps after a user-initiated reboot).
The ideal is to make it as easy as running SETI@Home - tell the system to reboot into the automated nightly testing setup when you are finished for the day, then it does 99% of the work of testing.
This is harder than CPAN Testers, since hardware is involved, but finding installation errors is useful, and even statistical information such as "80% of machines with Intel GPU model XYZ are failing" would be of some use, particularly if logs are uploaded automatically.
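A minimal sketch of the kind of test client described above: run a fixed list of checks, record pass/fail for each, and bundle the results with basic system identifiers into a JSON report that could then be uploaded. The check commands and report format here are hypothetical illustrations, not any real distro test suite's.

```python
# Hypothetical sketch of an automated nightly-test client: run checks,
# collect pass/fail results, and build an uploadable JSON report.
import json
import platform
import subprocess

CHECKS = {
    # name -> shell command; exit status 0 counts as a pass
    "kernel_booted": "uname -r",
    "root_fs_readable": "ls /",
}

def run_checks(checks):
    """Run each check command and record whether it succeeded."""
    results = {}
    for name, cmd in checks.items():
        proc = subprocess.run(cmd, shell=True,
                              capture_output=True, text=True)
        results[name] = {
            "passed": proc.returncode == 0,
            "output": proc.stdout[-1000:],  # keep logs small for upload
        }
    return results

def build_report(results):
    """Bundle results with basic machine identifiers as a JSON string."""
    return json.dumps({
        "machine": platform.machine(),
        "release": platform.release(),
        "results": results,
    })

if __name__ == "__main__":
    # In the scenario above, this report would be uploaded to a central
    # aggregator rather than printed.
    print(build_report(run_checks(CHECKS)))
```

The statistical aggregation ("80% of model XYZ are failing") would happen server-side, once many machines report against the same nightly build.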
On the issue of updating 10.04 without a kernel version change: without KMS, the kernel wouldn't be involved. My problem with KMS is that it ties the flaky graphics driver support right into the kernel, and in some cases stops the boot (Plymouth etc.) or causes freezes. Without KMS, an Xorg+driver update would be enough, which is logistically easier.
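For readers hitting the boot failures described here, one widely used workaround of that era (not something the comment itself proposes) was to disable KMS via a kernel boot parameter, falling back to user-space mode setting:

```shell
# Sketch, assuming a GRUB 2 system such as Ubuntu 10.04.
# In /etc/default/grub, add "nomodeset" (disables KMS for all drivers)
# or "i915.modeset=0" (Intel driver only) to the kernel command line:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
# Then regenerate the GRUB configuration and reboot:
#   sudo update-grub
```

This trades KMS features (flicker-free boot, consoles on high-resolution modes) for the older, often more forgiving code path.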
I know that KMS is good for the future, with GPUs that don't support 2D, but for today's GPUs, and particularly Intel's, it's a major pain. It should not have been allowed anywhere near an LTS, or perhaps Ubuntu 10.04 LTS should have been delayed by a year to let the drivers stabilise.
KMS has to happen sometime, but it has greatly reduced the actual stability of Linux for some people. In fact, Linux is generally much less stable on the desktop than when I first started using it in 1996, perhaps because it has much wider hardware support, or perhaps because it is evolving much faster. On the server, of course, Linux is fine and very stable.
Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds