Kamp: A Generation Lost in the Bazaar
Posted Aug 20, 2012 23:56 UTC (Mon) by neilbrown (subscriber, #359)
> That's how most corporations work in theory, for instance, and also how the Linux kernel project works in practice.
I'm sorry, but I don't see that. The functionality of Linux is certainly going up, but I'm convinced that quality is going down (and that observation is from reading code).
You can only scale by delegation when the property being delegated is transitive. However, taste is not transitive, and consistent taste is needed for high quality.
The Linux development style tends to promote compromise rather than quality (Linus won't resolve your dispute; he'll just ignore both of you until you resolve it yourselves). Compromise is certainly pragmatic and functional, but it is unlikely to be elegant.
I'm not meaning to attack Linux here. Its success speaks for itself, and I wouldn't try to change anything that would threaten that. But let's not pretend that quality - of the sort that comes from a single guiding taste - has anything to do with it.
Posted Aug 21, 2012 14:41 UTC (Tue) by intgr (subscriber, #39733)
Perhaps it's true that the addition of new features is driving the average quality down, but I also get the impression that, as features stabilize and mature, their quality improves over time again.
One clear case is the mess that was Wi-Fi support back when it got started: the crappy "code drop" drivers got cleaned up, the duplicate Wi-Fi stacks got merged into one, and the result gained uniform feature support and a uniform configuration interface.
Another example is the ARM cleanups and unifications in recent releases.
I would say that these are a sign of increasing quality and elegance, and a direct result of "the Linux development style", and Linus in particular.
Would you disagree?
Posted Aug 21, 2012 23:04 UTC (Tue) by neilbrown (subscriber, #359)
There are occasionally opportunities to do some refactoring to impose some design on something that has become an ad-hoc collection of features. And sometimes, these opportunities are turned into realities. And that is great.
However I don't think of this as 'design' in the way the original article was using the term. It is more of a case of "struggle along with no real guiding force until the weight of the mess becomes unbearable and then make the effort to clean it up".
That is the sort of design that can scale. It doesn't need just one person. It allows lots of work to be done in parallel and then while there is lots of working code and lots of examples to draw from, an improved design can be drafted and existing code moved over to it - slowly and by lots of people.
It is "a priori" design that people seem to value, but that sort of design doesn't really scale. "Post hoc" design can scale to some extent, and it is what you see happening in Wi-Fi and ARM (and lots of other places).
Of course, a consequence of "post hoc" design is that lots of things will never get redesigned, because the weight of the ugliness never gets high enough. Maybe that is where autotools is: it is ugly, but not quite ugly enough to force a rewrite. The cost/benefit ratio is still too high.
So while there are islands of good design in Linux, they can be expected to degrade over time unless someone puts consistent work into cleaning up. It's the constant battle between energy and entropy.
Posted Aug 22, 2012 8:13 UTC (Wed) by tnoo (subscriber, #20427)
Getting a working implementation first, and then abstracting from it while benchmarking alternative, more elegant solutions, is the only process that works when the problem is hard or not fully understood.
Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds