
Kamp: A Generation Lost in the Bazaar

Here's a troll of sorts by Poul-Henning Kamp, posted to the ACM Queue site. "That is the sorry reality of the bazaar [Eric] Raymond praised in his book: a pile of old festering hacks, endlessly copied and pasted by a clueless generation of IT 'professionals' who wouldn't recognize sound IT architecture if you hit them over the head with it. It is hard to believe today, but under this embarrassing mess lies the ruins of the beautiful cathedral of Unix, deservedly famous for its simplicity of design, its economy of features, and its elegance of execution." Perhaps it's just venting by somebody who got left behind, but perhaps he has a point: are we too focused on the accumulation of features at the expense of the design of the system as a whole?

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 14:49 UTC (Mon) by fb (subscriber, #53265) [Link]

He makes his concluding point clearer in the article's comments section:

> ... My point is not that there are no cathedrals today, but that people don't recognize them as such or see any value in them.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:45 UTC (Mon) by pboddie (guest, #50784) [Link]

The author references a Brooks publication in his rant, but I wonder whether he has considered the lessons of Brooks' other works. One of the problems with developing lots of software that has to work together is scaling up to actually do just that, and one of the choices made to make this happen is to reduce the communications overhead even if that means people duplicating effort, making their own technology choices, and so on. If the resulting system can withstand this "anarchy" both in terms of resources and maintainability then such a tradeoff has to be considered acceptable because the alternative is that you don't deliver anything at all.

(In "The Mythical Man Month" I seem to recall Brooks going into some detail about the distribution of paper manuals to development groups on a regular basis during the development process. Although the overhead of such communications would be less in today's networked reality, it emphasises the kind of effort you need to coordinate groups within a monolithic project.)

I think the author conflates "cathedral" with "software engineering" and "bazaar" with "hobbyist practices" when in fact both styles of development are best practised with proper software engineering techniques. There are probably plenty of cathedral projects suffering from the kind of unsophisticated techniques that the author bemoans.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:00 UTC (Mon) by henning (guest, #13406) [Link]

He clearly has a point: "But to anyone who has ever wondered whether using m4 macros to configure autoconf to write a shell script to look for 26 Fortran compilers in order to build a Web browser was a bit of a detour, [there is] well-reasoned hope that there can be a better way."

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:16 UTC (Mon) by landley (subscriber, #6789) [Link]

Apparently he missed that the "Cathedral" in CATB was the FSF, and the "Bazaar" was Linux development. The 1997 usenix paper was comparing two different styles of free software development.

Rob

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 12:45 UTC (Tue) by njwhite (guest, #51848) [Link]

Exactly what I was thinking as I read it. He seems to be appropriating ESR's terminology for something wholly different.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:17 UTC (Mon) by patrick_g (subscriber, #44470) [Link]

What was the canonical way to compile a program during the early days of UNIX (before the Cathedral, and even before "AT&T spun off Unix to commercialize it")? What was the better way PHK is talking about?

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:21 UTC (Mon) by corbet (editor, #1) [Link]

Lots of stuff shipped with a collection of makefiles...Makefile.sunos, Makefile.hpux, Makefile.irix, ... That and lots of #ifdef lines.
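
Something like this, roughly -- a from-memory sketch with made-up file names, picking the hand-maintained makefile for whatever the platform turned out to be:

    #!/bin/sh
    # No feature probing at all: just choose the makefile written for this system.
    case `uname -s` in
        SunOS) make -f Makefile.sunos "$@" ;;
        HP-UX) make -f Makefile.hpux  "$@" ;;
        IRIX*) make -f Makefile.irix  "$@" ;;
        *)     make -f Makefile       "$@" ;;
    esac

The cost, of course, was maintaining each of those makefiles by hand.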

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 18:10 UTC (Mon) by samroberts (guest, #46749) [Link]

Also, imake, but even the X project it was written for has abandoned it. Disliking autotools shouldn't lead you into the belief that nothing can be worse!

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 19:43 UTC (Mon) by wahern (subscriber, #37304) [Link]

Shipping a bunch of different Makefiles isn't necessarily worse. Consider that in the intervening 20 years, all of the extant platforms have converged significantly, due in no small part to POSIX and SUS. Today, apart from major feature differences for which there's no simple solution, there are fewer incompatibilities. Papering them over requires some ifdef's, yes, but it's not that bad. Using separate Makefiles (1) allows automatic builds to work in the common case, and (2) simplifies fixing issues because the learning curve wrt the build for that project is shallow.

In the modern context, the value of autotools and similar is less, and the relative cost higher.

The bigger issue today is remaining "portable" between Linux and Windows. That causes many times more headaches than even the most backwards of Makefile hacks.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 21:23 UTC (Mon) by cmccabe (guest, #60281) [Link]

I hate how the discussion about build systems gets framed as a binary choice between using hand-rolled Makefiles and using autotools. There are other build systems in the world, like CMake! And they even have portability to Windows, which autotools does not.

The so-called portability of autotools is a mirage, anyway. autotools requires you to write portable shell scripts in order to be portable. Since most developers couldn't do this to save their lives, it is effectively Linux-only.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 22:29 UTC (Mon) by nix (subscriber, #2304) [Link]

People writing "configure.ac"s with significant shell fragment content (which is surely going to be checking for something more complex than the mere presence or absence of functions), yet who can't even be bothered to read the Shellology chapter in the Autoconf manual or even use the freely-available POSIX spec on opengroup.org when writing their shell fragments... well, words fail me. People like that *deserve* to lose.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 22:35 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

You mean 99% of developers who simply treat autotools as a magic box that prints cryptic incantations on their consoles?

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 1:11 UTC (Tue) by cmccabe (guest, #60281) [Link]

Let's just consider a random configure.ac that happens to be lying around on my computer.

http://ceph.com/git/?p=ceph.git;a=blob;f=src/gtest/config...

There's something non-portable about this one. Can you spot what it is?

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 2:04 UTC (Tue) by hummassa (subscriber, #307) [Link]

That it needs pthreads?

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 5:38 UTC (Tue) by pbonzini (subscriber, #60935) [Link]

"test x == y" is unportable.

But it's otherwise a very well-written configure.ac.
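
For anyone wondering, the difference is a single character. A minimal sketch (the variable name here is made up):

    # Portable: POSIX test(1) spells string equality with a single '='.
    test "x$with_pthreads" = "xyes" && echo "building with pthreads"

    # Unportable: '==' is a bash extension; a stricter /bin/sh such as dash
    # rejects it as an unknown operator, and the check fails on such systems.
    test "x$with_pthreads" == "xyes" && echo "building with pthreads"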

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 20:40 UTC (Tue) by cmccabe (guest, #60281) [Link]

Good find. I hope this puts to rest the idea that you need to write "significant shell content" to run into portability problems with autotools. You only need to write one line, or half a line.

By the way, gtest has since switched to CMake.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 22:36 UTC (Tue) by nix (subscriber, #2304) [Link]

For that project, who cares? Ceph is intrinsically nonportable, IIRC.

(Arguing that Autoconf is awful because it includes shell fragments is also vitiated in the presence of a project that uses any manually-written shell scripts at all. As most do.)

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 6:15 UTC (Wed) by cmccabe (guest, #60281) [Link]

As a sometime contributor to Ceph, I can definitely say that no, it is NOT "intrinsically nonportable." The servers which implement the filesystem itself are all userspace processes, which should, in theory, be portable to any POSIX OS. In fact, you can see a few places in the source where someone added #ifdef DARWIN and similar things. There was never a serious effort to get things working well on non-Linux platforms, just due to limited resources and interest. But that may change in the future.

And yes, the kernel component is Linux-only, but there are other ways to access the filesystem.

> (Arguing that Autoconf is awful because it includes shell
> fragments is also vitiated in the presence of a project
> that uses any manually-written shell scripts at all. As
> most do.)

All the standalone shell scripts that are currently in the Ceph server repo are used for testing purposes. That's why they're in directories with names like 'qa', 'test', and so forth. Later on, the project started moving more towards Python for testing... but I digress.

Would you be ok with a build system that required you to write Makefiles in C? Oh, but your project includes .c files, so your arguments are vitiated. Come on, this is absurd, and you know it.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 7:40 UTC (Wed) by cmccabe (guest, #60281) [Link]

In case it's not clear, the opinions about autotools here are my own, not necessarily those of other Ceph developers. I'm not claiming to speak for them (although if you are reading this, guys, consider ditching autotools).

C.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 13:38 UTC (Wed) by nix (subscriber, #2304) [Link]

I'd be unhappy with a build system that required me to write makefiles in C because C is a bad language for writing makefiles in. However, I note that makefiles are themselves little more than dependencies tied to shell script fragments, so clearly the shell and make are relatively tightly associated (indeed, make knows internally what characters are common shell metacharacters, and stuff like that). configure scripts can in any case accept C fragments too (and Pascal fragments, C++ fragments and the like). They are used where they are suitable, i.e. when trying to figure out properties of the compilation environment.

What I was trying to point out is that the nonportability of shell script is a bad argument against autoconf if you use any other shell script in your source tree, because you are just as likely to introduce nonportabilities there as in the configure.ac (and they would be just as hard to detect). The portability or otherwise of C is irrelevant to this argument: your objection is a non sequitur.

configure scripts depend on the shell environment in any case, so allowing you to introduce your own shell fragments introduces no new dependencies, which was very important to the Autoconf designers. (Again, the manual *says* all this. This is not secret hidden knowledge.)

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 18:27 UTC (Wed) by cmccabe (guest, #60281) [Link]

> However, I note that makefiles are themselves little more than
> dependencies tied to shell script fragments, so clearly the shell and make
> are relatively tightly associated (indeed, make knows internally what
> characters are common shell metacharacters, and stuff like that).

Again, this is not a binary choice between plain old Makefiles and autotools. My choice is CMake.

CMake has its own scripting language which is portable across platforms (including Windows), versioned, and specifically designed for this use-case. Using the policy mechanism you can tell CMake "act as if you were CMake 2.6." Anyone with a version of CMake higher than or equal to 2.6 can then compile your project exactly as you intended. Shell scripts don't have any of these things.
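
A minimal sketch of what that looks like (project and file names are made up; this assumes a hello.c sitting next to the CMakeLists.txt):

    # Write a tiny project pinned to CMake 2.6 behaviour, then build it out of tree.
    cat > CMakeLists.txt <<'EOF'
    cmake_minimum_required(VERSION 2.6)   # refuse to configure with anything older
    cmake_policy(VERSION 2.6)             # newer CMake versions act as 2.6 did
    project(hello C)
    add_executable(hello hello.c)
    EOF
    mkdir -p build && cd build && cmake .. && make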

Yes, shell scripts are appropriate for some use cases. But a build system is just not one of them. Let's use the right tool for the job rather than trying to explain how with a little duct tape, a plunger can be used as a hairbrush.

At the end of the day, nothing can make your code portable but you. But the build system can help you or hurt you. If it closes off platforms to you, like autotools closes off Windows to you, that hurts. If it forces you to write shell scripts to do everything, that hurts portability too. I've spent a long time fixing build breakages that resulted from autotools' poor choices. The question of what kind of Makefile.am or configure.ac a superintelligent Zen master who has studied AIX, IRIX, csh, tcsh, bash, zsh, dash, and every other shell out there might write is not interesting to me. In the real world, autotools = not portable.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 19:21 UTC (Wed) by hummassa (subscriber, #307) [Link]

That is really not my experience. I prefer CMake to autotools for new code, but CMake was not really cross-compiler-friendly the last time I checked. Autotools is really much more complicated, but it will even work in MinGW/Cygwin without a lot of problems... and it will support cross-compiling without a lot of EXTRA effort. Yes, it is possible that CMake works well with MSVC, but I wouldn't know about it.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 20:17 UTC (Wed) by boudewijn (subscriber, #14185) [Link]

I know that it does work well with MSVC. I also know that it works well when cross-compiling from Linux to Windows. Maybe the setup is a bit more involved than with autotools, maybe it's not -- maybe the people who feel it is more involved just don't remember the effort they expended on their autotools setup when they had to work with it for the first time. And yes, cmake works very well with MinGW on Windows itself.

Really, people who have an opinion on cmake and autotools in relation to windows, whether it's with mingw, msvc, or icc, but haven't checked out the way the kde-windows project uses cmake should check it out. Doing that will make their life _much_ easier.

Kamp: A Generation Lost in the Bazaar

Posted Aug 24, 2012 1:16 UTC (Fri) by cmccabe (guest, #60281) [Link]

CMake has supported cross-compiling for a long time. http://www.vtk.org/Wiki/CMake_Cross_Compiling

Various projects may have trouble with cross-compiling, because their authors did something problematic (like assuming that they could build a binary with the target compiler and then run it, or using hard-coded paths in a dumb way). However, this is just as true with autotools. Fix the bugs and move on.
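
Roughly, the approach on that wiki page is a toolchain file plus one -D flag; a sketch (the MinGW triplet and sysroot path are illustrative, not gospel):

    # Describe the target once, in a toolchain file...
    cat > toolchain-mingw.cmake <<'EOF'
    set(CMAKE_SYSTEM_NAME Windows)
    set(CMAKE_C_COMPILER   i686-w64-mingw32-gcc)
    set(CMAKE_CXX_COMPILER i686-w64-mingw32-g++)
    set(CMAKE_FIND_ROOT_PATH /usr/i686-w64-mingw32)
    # Search for libraries and headers only in the target root, never on the build host.
    set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
    set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
    set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
    EOF

    # ...then point cmake at it when configuring the build directory.
    cmake -DCMAKE_TOOLCHAIN_FILE=$PWD/toolchain-mingw.cmake ..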

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 5:46 UTC (Thu) by quotemstr (subscriber, #45331) [Link]

> can't even be bothered to read the Shellology chapter

I read the chapter and decided to ignore it. I'm not interested in working around ancient HP-UX bugs or supporting systems too old to have decent support for here-documents.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 0:33 UTC (Tue) by HelloWorld (guest, #56129) [Link]

CMake is a steaming pile of crap and the sooner it dies, the better.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 1:56 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

Yet it works and works great. It can build usable project files for XCode and MSVS, and it's the only real build system that can do it.

Scons and waf are also nice.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 11:42 UTC (Tue) by cortana (subscriber, #24596) [Link]

Cross compiling with it is a significant pain. The way dependencies are searched for is extremely hard to debug--it either works or it fails with no indication of why. I usually end up having to hack debug statements throughout /usr/share/cmake-2.8/Modules in order to work out what's going on. IIRC the breakage is usually caused by the macros for finding dependencies not taking into account that CMAKE_FIND_ROOT_PATH would ever be used. There is also the magical handling of the toolchain file (why can't the relevant variables be set with -D on the command line?), but that is a minor issue that can be easily worked around once the necessary amount of hair has been pulled out.

Some day I hope to use a build system that does not make me pine for the autotools, but that day has not yet come to pass.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 11:46 UTC (Tue) by cortana (subscriber, #24596) [Link]

I'll add that I evaluate a build system by considering how it caters to the needs of the following categories of user:

1. Project developers
2. End users (who simply want to build the project)
3. Translators (who need rules for extracting, merging and updating translations)
4. Porters (who want to cross compile to another system, or build with a native toolchain on another system)

CMake, and other sadly popular systems such as SCons or waf, fall over when considering the needs of those who aren't project developers themselves. For all its warts, autotools is the only system that caters adequately to the needs of all of the above groups.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 14:06 UTC (Tue) by hummassa (subscriber, #307) [Link]

CMake is the second best build system, far far far behind autotools. Yes, it does not cater well to Group 4 (especially if cross-compiling is needed), but it's OK for the other three.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 15:27 UTC (Tue) by pboddie (guest, #50784) [Link]

I've spent the last few days cross-compiling, and although the overhead of the build system in question may involve the use of autotools by some projects, my own rant would have something to do with people writing and promoting new tools whilst ignoring use-cases that result in those new tools being completely useless for various groups of people.

I've seen quite a bit of build system evolution and apathy towards cross-compiling in my time - just search the Python bug-tracker for the debris of attempts to get Python's build system to stop doing "I need to run myself on the host now!" and actually work for cross-compilation - and hoping that target devices will be powerful enough for everyone to forget about cross-compilation isn't a useful strategy.

I suppose people can argue that the "anarchy" has not only resulted in the construction of autotools - a dubious claim, since autotools does consider things that later tools ignore or miss, and thus does seem to have had some thought given to its functionality - but also the lack of maintenance of autotools and the proliferation of other tools that don't always measure up. Those things have more to do with the perceived cost of fixing things or developing new things than any particular development strategy, though.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 14:22 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

I felt a little sick reading this:
>For all its warts, autotools is the only system that caters adequately to the needs of all of the above groups.

Autotools at most is an equal-opportunity serial murderer, which chooses its victims from any of the four listed groups indiscriminately.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 17:21 UTC (Wed) by dashesy (guest, #74652) [Link]

I enjoyed your analogy; I may borrow it for many discussions at work.
I wish LWN would add an "Unread comments by $USER" feature.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 22:09 UTC (Thu) by nix (subscriber, #2304) [Link]

Recent versions of the autotools add parallel murder support if you set the AUTOMURDER_JOBS environment variable.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 18:22 UTC (Tue) by boudewijn (subscriber, #14185) [Link]

Well, I actually gave it a first try not that long ago, and I found it really easy to cross-compile and package my first application from Linux for Windows using cmake. If you have to hack around like you describe, I suspect you're taking the wrong approach to begin with.

On the other hand, the cmake systems for all the dozens of dependencies I used in that case had been tested in KDE's wonderful emerge project beforehand, and that helped a lot.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 8:05 UTC (Tue) by BlueLightning (subscriber, #38978) [Link]

Funny, those are usually my sentiments about autotools when I am forced to deal with them....

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 22:55 UTC (Mon) by marcH (subscriber, #57642) [Link]

> Disliking autotools shouldn't lead you into the belief that nothing can be worse!

Wow, a quadruple negation. Totally appropriate for a discussion about autotools; well done!

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:14 UTC (Mon) by jmorris42 (guest, #2203) [Link]

I see a more subtle problem with the bazaar after watching it for a bit.

The cathedral builders, much like the company design teams of old, had a coherent vision. The people who built UNIX had a small group of people who all met each other and shared a goal. Pretty much the same thing for the early days of GNU, Emacs, etc.

Now we have warring tribes fighting what amount to Wikipedia-style edit wars, carried out in slow motion in the repositories of projects and distributions. We don't agree on the general direction anymore. By knocking down the gates and allowing anyone in, we allowed anyone in. So there's no more agreement on where we want to go and no real way to decide. So we get UNIX haters with commit access to the core cultural artifacts of UNIX.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:20 UTC (Mon) by thumperward (guest, #34368) [Link]

> So we get UNIX haters with commit access to the core cultural artifacts of UNIX

Who are these "UNIX haters", and what "commit access" do they have to "the core cultural artifacts of UNIX"?

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:23 UTC (Mon) by theophrastus (guest, #80847) [Link]

I suspect that a detailed reply to this question is most likely to take this otherwise very interesting discussion in a less productive direction. (Hint: search previous LWN comment sections that went two sigma in number of comments.)

Personally, I've always seen the cathedral versus the bazaar metaphor as an ersatz for rules-based versus crisis-driven (entropic) evolution. Both are necessary, the latter to get out of local minima. ((Must resist making sci-fi references to Babylon 5 or Dune... [grit]))

Perhaps we're just at a point where the pendulum of activity should swing back toward the rules-based "priesthood" (and Poul-Henning Kamp is a voice in the wilderness...)? That is, progress is never going to be monotonic.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:23 UTC (Mon) by landley (subscriber, #6789) [Link]

Bazaar development can't do user interfaces. Any time "shut up and show me the code" isn't the correct response to the problem at hand, the development model melts down into three distinct failure modes:

1) Endless discussion with no resolution,

2) Fork to death with everybody implementing their own vision and then no way to merge them back together,

3) Delegate the problem to nobody by either A) separating implementation from interface and focusing on the engine, or B) making it so configurable that the fact it has no sane defaults is now somehow your fault.

Bazaar development either needs empirical tests so everybody can agree on what "good" looks like, or a Benevolent Dictator For Life who makes the calls on aesthetic issues.

Trying to do a GUI collaboratively is like asking Wikipedia to write a novel. Too many cooks spoil the soup by making it taste bad, not by causing it to lack nutrition. We've only _selectively_ overcome Brooks' law.

Rob

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:58 UTC (Mon) by Company (guest, #57006) [Link]

This has nothing to do with graphical user interfaces. It applies equally well to command line utilities, APIs and probably lots of other things.

It's just that GUIs are where everyone can see it. And probably because you got used to the fact that find uses single-dash arguments, perl takes -e while python takes -c, and memcpy() uses dest before src while g_value_copy() uses src before dest.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 16:41 UTC (Mon) by drag (subscriber, #31333) [Link]

> Bazaar development either needs empirical tests so everybody can agree on what "good" looks like, or a Benevolent Dictator For Life who makes the calls on aesthetic issues.

That's how you end up with people calling Gnome developers fascists.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:18 UTC (Mon) by landley (subscriber, #6789) [Link]

Gnome didn't make just _one_ mistake. But in this context, it couldn't figure out if it was steered by committee or by fiat, so it seesawed back and forth a few times, which didn't go over well.

The standard way to move from "committee" to "fiat" is by forking, an example would be Galeon (and later Firefox) forking off of Mozilla. Or you can start over and be simple but reuse some parts, ala XFCE (done by Olivier Fourdan).

But Gnome started life as a reaction to KDE (a protest over licensing issues), which meant early on KDE got all of the pragmatists and Gnome got all the idealists. There's way more wrong with Gnome than just this one organizational issue. :)

Rob

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:28 UTC (Mon) by danieldk (subscriber, #27876) [Link]

> meant early on KDE got all of the pragmatists and Gnome got all the idealists

Yeah, well, except that big elephant in the room, that likes to wear a colored hat.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:29 UTC (Mon) by drag (subscriber, #31333) [Link]

> Gnome didn't make just _one_ mistake.

Nobody in this thread mentioned any mistakes. At least not yet.

I am merely pointing out that making arbitrary decisions about what is good for your own software project isn't too popular among some Linux circles.

> The standard way to move from "committee" to "fiat" is by forking, an example would be Galeon (and later Firefox) forking off of Mozilla.

Neither of these are really examples of forks. Galeon started off as its own web browser that used the Mozilla engine, and Firefox is Mozilla's own project, in which they split the non-browsing bits off into separate applications and put a better UI on top.

Galeon essentially died, replaced by Epiphany (at least in spirit..) which now uses webkit and has largely gone unappreciated because Chrome kicks so much ass.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 21:54 UTC (Mon) by cmccabe (guest, #60281) [Link]

I don't understand why "shut up and show me the code" isn't the right response to a UI discussion. At the end of the day, user interfaces aren't some magical thing that only experts can fathom. They're just like any other engineering problem.

Unfortunately, there's a strong "popularity contest" aspect to user interfaces. We use QWERTY because QWERTY is familiar. QWERTY is familiar because we use QWERTY. etc. Similarly, a lot of arbitrary choices were made over the last 20 years (close box on the right of the window, etc). Well-meaning engineers who create "unconventional" user interfaces find that most people are allergic to them, simply because they're unfamiliar. (Case study: GNOME3.)

Another aspect is that UIs are targeted at different types of users. emacs and vi are great user interfaces-- for programmers. Not necessarily for non-programmers. Failure to design something with the needs of its intended audience in mind is another leading cause of UX failure.

None of this stuff means that UI design is some sooper secret thing that non-designers can't possibly comprehend. Just know your audience, and stick to the familiar whenever possible, and you'll be most of the way there.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:31 UTC (Mon) by hummassa (subscriber, #307) [Link]

UI design (and its supersets, interaction design and experience design) is not sooper seecret, but to do a Good user interface you have to have a Good toolset and knowledge -- just as to make a good program you have to be a good programmer.

Any programmer can program a UI. But if it is done without regard to the best practices of UI design (reading the UI guidelines is mandatory, but by no means sufficient), it will be an ugly and/or clunky UI.

It can even be successful! What you said about QWERTY is true, because one of the tools in the UI designer's toolset is "establishing the current mental model of a user when dealing with a product or service". And one of the great problems with UI design from the point of view of a programmer is that good UI design requires a lot of user testing and friction -- you know, things the proverbial programmer is not quite fond of.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 17:14 UTC (Thu) by jedidiah (guest, #20319) [Link]

No. The "best practices of UI design" are pretty worthless. The main focus should always been the end user and the use case. Otherwise you get nonsense like Ribbon and GNOME3. You end up with lots of academics completely divorced from reality spouting "principles" that may or may not have anything relation to anything.

You end up with UI disasters that go into the release of a project without even getting decent end user feedback first.

Even the much over hyped interfaces created by Apple suffer from this kind of foolish ivory tower approach to real problems.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 18:48 UTC (Thu) by hummassa (subscriber, #307) [Link]

> No. The "best practices of UI design" are pretty worthless.

As I said in another topic, people who say stuff like this do not know anything about UI design. And this is PROVED by your next phrase:

> The main focus should always be the end user and the use case.

THAT is the FIRST best practice of UI design. Design for the user, thinking about the user, prototyping, iterating, testing with the user, etc.

> Otherwise you get nonsense like Ribbon and GNOME3.

Well, actually... user tests show that the Ribbon can actually be good when it's well designed, and lots of people around the world use Office 201x without lots of problems. Some people actually conducted PROPER testing on it.

GNOME3 does not have anything like that AFAICT, so you are putting two very different things (united only by your dislike of them) together. Anecdotal evidence == no evidence.

> You end up with UI disasters that go into the release of a project without even getting decent end user feedback first.

Repeating what I already said: *real* UI designers don't do that.

> Even the much over hyped interfaces created by Apple suffer from this kind of foolish ivory tower approach to real problems.

At Apple, there was one engineer responsible for the calls on UI design. They had a very good eye for detail -- and that's why Apple is where it is today -- but not a lot of the best UI design practices like user testing. Especially in the last iterations. And they are lost in the controversial skeuomorphism fad (the jury is still out on whether it's good, bad or just plain ugly -- no conclusive user testing AFAIK).

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 19:57 UTC (Thu) by renox (subscriber, #23785) [Link]

>Well, actually... user tests show that the Ribbon can actually be good when it's well designed, and lots of people around the world use Office 201x without lots of problems. Some people actually conducted PROPER testing on it.

A user interface which takes up lots of vertical space, which is at a premium, instead of using the sides, which are much less used -- "PROPER testing"?
Somehow I doubt it!

Speaking about failure, I remember the Gnome's decision to use "spatial browsing" which was backed supposedly by usability research, they had to revert it in the end.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 20:38 UTC (Thu) by hummassa (subscriber, #307) [Link]

> A user interface which takes up lots of vertical space, which is at a premium, instead of using the sides, which are much less used -- "PROPER testing"? Somehow I doubt it!

The MSOffice201x ribbon actually takes less vertical space than LibreOffice's menu plus two (or three, in some cases) lines of toolbars. It is also spatially better distributed, and the size relations between the icons reflect their "use count" in user testing. See Deborah Hix's usability work... IIRC lateral toolbars have a problem because the distance the mouse has to travel is bigger. Anyway, people who do a lot of DTP or other "give me my vertical space" applications IME make use of a rotated display whenever possible!

> Speaking about failure, I remember the Gnome's decision to use "spatial browsing" which was backed supposedly by usability research, they had to revert it in the end.

I have heard those rumors, but I have never seen any real research.

Kamp: A Generation Lost in the Bazaar

Posted Aug 24, 2012 6:40 UTC (Fri) by dgm (subscriber, #49227) [Link]

> Anecdotal evidence == no evidence.

Lots of "anecdotes" == overwhelming evidence.

Kamp: A Generation Lost in the Bazaar

Posted Aug 24, 2012 20:08 UTC (Fri) by rgmoore (✭ supporter ✭, #75) [Link]

Lots of "anecdotes" == overwhelming evidence.

No. You only get meaningful, much less overwhelming, evidence if you have a representative sample, and listening to whoever yells the loudest is not a reliable way to get it. Improper sampling is exactly why the plural of anecdote is not data.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 19:42 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link]

Otherwise you get nonsense like Ribbon and GNOME3.

You're not helping yourself out with those examples. It's admittedly anecdotal, but my personal experience with both is that I initially hated them for being different, but found that they're actually more efficient than the old system once I figured out how to use them. If anything, I'd hold them up as examples of how we should be skeptical of people's instinctive and uninformed reaction to UI changes, and pay more attention to people who have actually studied this stuff.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 9:04 UTC (Tue) by fb (subscriber, #53265) [Link]

> 3) Delegate the problem to nobody by either A) separating implementation from interface and focusing on the engine, or B) making it so configurable that the fact it has no sane defaults is now somehow your fault.

Ouch. I've seen "B" so often in my life that it just isn't funny.

[...]

Does anyone else have memories of a dreaded question about 'less' charset configuration that took up a full screen of text when installing Debian? Every time I faced it I realized that there was really nobody in charge of reviewing the Debian installation steps.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:22 UTC (Mon) by slashdot (guest, #22014) [Link]

Uh?

Projects like GNOME 3 which have a "vision" are usually the shitty ones, since that vision is usually crap (at least compared to the visions by leading commercial companies, since those have money directly on the line depending on how good their vision turns out to be, while the open source guys are usually just crazy lunatics doing whatever).

On the other hand, Linus Torvalds doesn't enforce any vision for Linux, yet Linux is one of the most successful projects precisely because by not having a vision, it literally does everything and makes everyone happy.

What REALLY matters is:
1. Having an intelligent and capable person maintaining the project
2. Having open policies that only care about technical merits of the work submitted
3. Having lots of developer/company interest and thus a vast community

"We have a coherent vision" is usually shorthand for "We are morons who are too stupid to design truly generic and flexible software, and need a "coherent vision" so that our small brain can cope".

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 7:00 UTC (Tue) by tnoo (subscriber, #20427) [Link]

Thanks, you spelled out nicely what I was about to write.

Another similarly successful example is Python, with its "BDFL" (benevolent dictator for life).

Having one person with an open mind and very strong aesthetic and technical viewpoints as the ultimate decision maker seems to be the way to success.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 18:54 UTC (Mon) by iabervon (subscriber, #722) [Link]

When you say "UNIX haters", are you referring to the people who brought up legitimate issues with UNIX that Ken Thompson acknowledged in the epilogue to their handbook? Pretty much all of the work that has gone into making it possible for non-professionals to use and administer UNIX systems has been addressing items from the UNIX-Haters Handbook.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 1:21 UTC (Wed) by jmorris42 (guest, #2203) [Link]

To say UNIX was perfect and needed no improvement would be kinda stupid, the sort of thing Team Amiga would say. No, the question isn't "Are there problems in current UNIX/Linux?" The question is "What is the best way to improve the UNIX implementation in the currently popular Linux/GNU/X systems?"

One way is to discard the base ideas the whole thing rests on and just make it more like Windows, iOS or Mac. (Saying Mac OS to make clear the distinction between the largely ignored UNIX underbelly of OS X and the visible part.) Another way is to evolve it to more closely approach the ideal of the basic ideas. In which case things like Plan 9 are a better lodestar to steer toward and steal ideas from.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:15 UTC (Mon) by thumperward (guest, #34368) [Link]

Worse is Better predates CATB by a decade and describes exactly how Unix got to where it did over the decade before that. Raymond may have put a modern spin on it by comparing GCC (the original "cathedral" which, ironically, has a famously baroque and overblown design of the type alluded to in this column) to Linux, but the underlying contrast between the rigid, planned development model and the loose bolt-features-on-and-worry-about-the-bugs-later one had been made long before free software had escaped the clutches of the FSF and become the bazaar the column rails against.

Seeing as the original column was a troll, I suppose it's time to wheel out the "BSD is dying" meme. FreeBSD's supposed philosophy has been to impose some order on this allegedly out-of-control pile of gruesome hacks. If Poul-Henning Kamp thinks it's failed in its goal, then I suppose it's outlived its usefulness.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 15:18 UTC (Mon) by landley (subscriber, #6789) [Link]

BSD is no more dying than Cobol is: Zombies can't die.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 16:55 UTC (Mon) by drag (subscriber, #31333) [Link]

FreeBSD and OpenBSD are worth caring about, because both have their uses. Other variants are irrelevant. Just like how only a few Linux distros actually matter.

I wish Solaris would just curl up and die, but ZFS has it hanging on a bit longer. Oracle's DTrace implementation on Linux will hopefully prove interesting and may be another nail in the coffin for SunOS. I strongly suspect that Oracle wouldn't mind seeing Solaris die off either.

And that is pretty much it for classic Unix systems. Between Microsoft and Linux there isn't much room left for it.

AIX and the like will still matter to some, but nobody wants to move to using those systems if they can help it. They are kinda like 'Blackbeard the pirate' during his heyday. Psychologically damaged relics of a bygone era.

Nowadays it's ridiculous to talk about portable software and not to automatically assume that you are including OS X or Windows into the discussion.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 20:07 UTC (Mon) by wahern (subscriber, #37304) [Link]

OS X is a BSD as far as portability is concerned, but in any event is strongly POSIX compliant, and is SUSv3 certified. Part of Kamp's argument, I think, is that rather than rely on cargo cult programming practices, people should simply follow the fscking POSIX standard more closely. Rather than writing Linux software and then porting it to something else, they should write POSIX software and then make it work on Linux, etc. Usually it'll just work, because Linux, while having many non-standard interfaces, supports almost all of the standard analogs.

The elephant in the room is Windows. Compared to the differences among AIX, Linux, Solaris, OS X, FreeBSD, NetBSD, OpenBSD, etc., Windows is a completely alien dimension. The world would be much better off if more projects stopped trying to be portable to it. It's almost always a red-headed stepchild in a build, and the demand seems to come entirely from developers and techies who refuse to give up Windows as their personal development environment. Instead they place the burden on the rest of us in the form of crappy implementations.

Supporting Windows is almost like supporting TRON, in terms of common functionality of the platform. The least-common-denominator interfaces are stuck in the 1980s, and a project makes an immense sacrifice in complexity and/or usefulness by trying to work on both. Obviously it _can_ be done, but in the vast majority of cases it probably _shouldn't_ be done.

The only standard Windows cares about is C++. They have no support for contemporary POSIX/SUS or C standards. And their fundamental approach to API design conflicts with that of Unix derivatives. I'm not saying it's better or worse, just very different. And that makes porting sophisticated software difficult, to say the least.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 20:21 UTC (Mon) by apoelstra (subscriber, #75205) [Link]

>The elephant in the room is Windows. Compared to the differences among AIX, Linux, Solaris, OS X, FreeBSD, NetBSD, OpenBSD, etc., Windows is a completely alien dimension. The world would be much better off if more projects stopped trying to be portable to it. It's almost always a red-headed stepchild in a build, and the demand seems to come entirely from developers and techies who refuse to give up Windows as their personal development environment. Instead they place the burden on the rest of us in the form of crappy implementations.

The reason that developers try to support Windows is to increase their userbase and (hopefully) gain more developers, because the vast majority of open-source projects are hopelessly understaffed. Nobody really likes it, or likes being portable to it. But the fact is that a lot of people use it, and if we're writing software for people to use, we need to support it.

The fact that these developers -don't- use Windows as their personal development environment is the reason that implementations are so crappy.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 21:20 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

People whining about POSIX should just shut up. POSIX is by now a small part of the whole system. About the only thing you can write strictly within POSIX are mail daemons (well, maybe also ftp daemons). Almost everything else would need system-specific bits.

And that's OK, because otherwise there'd be no need for different operating systems. We'd just all happily use One True Faceless OS.

And I also disagree about Windows being a horrible OS for build systems. It's actually a GREAT OS for build systems because it forces you to avoid hacky shortcuts like assuming that /usr/include always has all the relevant libraries or always installing binaries into /usr/bin.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 13:14 UTC (Tue) by Wol (guest, #4433) [Link]

Don't tell that to the LibreOffice folks.

Last I heard, and I'm sure it's speeded up since then, a "make clean, make" took A WEEK on an 8-core Windows monster.

My 3-core gentoo box takes a couple of hours.

Cheers,
Wol

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 14:43 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

Again, that's probably an issue with their build system. MSVC really really wants you to pass it several files at once. If you do this, then it becomes several times faster than GCC on simple C/C++ code.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 19:21 UTC (Tue) by knico (guest, #67323) [Link]

If the files are local to the machine yes, Windows builds can be very fast.

If your files are on a network drive, Windows builds are very, very slow. Visual Studio seems to have huge network latencies due to its sub-optimal dependency checking.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 21:39 UTC (Mon) by epa (subscriber, #39769) [Link]

It doesn't matter much that Mac OS X is POSIX compliant or certified, since POSIX and Unix(tm) don't standardize any user interface - or at least not any UI that a user-facing application would want to use. Libraries like Qt go a long way towards a cross-platform interface for writing applications, but there still remains a lot of work to port and maintain your program from Linux to Mac, or vice versa.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 22:48 UTC (Mon) by drag (subscriber, #31333) [Link]

> Windows is a completely alien dimension.

Actually it's not. Microsoft started off as a Unix vendor and even though it has a lot of VMS influences the Unix influence definitely shines through.

It's certainly a lot more Unix-like than, say, AS/400 or VSE/ESA. Ever used a thirty-_ONE_ bit operating system? I have. It's not pleasant.

>They have no support for contemporary POSIX/SUS or C standards.

Actually Microsoft Windows does have POSIX/SUS support.

The NT kernel has the ability to support multiple 'userspace personalities'. This is a side effect of its layered 'Hybrid Microkernel' design with internal APIs and message passing. Originally it only supported the NT userland. Later on, with Windows 2000, Win32 support was added. POSIX is just yet another 'native' userland API that is supported by Microsoft Windows.

Numerous government entities have POSIX requirements for operating systems. In order to get contracts with those entities, Microsoft must provide a POSIX operating system. Microsoft previously did this through SFU and now does it through SUA.

One of the options you get with the higher-priced versions of Windows Vista and Windows 7 is that they can effectively be a Unix operating system, depending on your specific configuration. Of course nobody gives a flying flick about POSIX, so it's being deprecated in Windows 8.

Maybe this will help put the whole 'POSIX' thing into perspective. Maybe not.

If POSIX is all that important, then why aren't people using the native BSD-style userland that Microsoft offers rather than trying to shoehorn in a Linux environment via a third-party Win32 binary (aka Cygwin or other proprietary competitors)? The answer, it seems, is that it's a lot easier to ignore POSIX completely if you are aiming for portability.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 0:27 UTC (Tue) by wahern (subscriber, #37304) [Link]

Windows' POSIX support is circa 1988. What I mean, for example, is that Windows API designs tend to leverage threads and callbacks, while Unix uses readiness notification through file descriptors. There are big differences between those two paradigms, and bridging them requires a lot of bug prone boilerplate code.

That's just one example. I18N and L10N is another. The list goes on-and-on.

The best way for an application to simultaneously support Unix and Windows is not to try to share those components which heavily rely on system-specific features. That particularly applies to GUI software.

The problem is that people try to design highly detailed, portable interfaces, when instead of thinking about small details they should think big. That is, break out huge chunks of the application and develop them in parallel, but separately. Trying to write them "portably" as one cohesive whole is foolhardy.

Same logic applies to the build environment. Trying to write or find the "one build system to rule them all" is a fool's errand. You can do better than autotools or CMake by simply not trying to accomplish what those two tools ultimately fail to do. Not developing the Windows portion of a GUI application completely in Visual Studio is just asking for endless and unnecessary headaches.

Point being, if you don't mix portable and non-portable code, you don't really need a complex, "portable" build solution.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 0:36 UTC (Tue) by wahern (subscriber, #37304) [Link]

People don't use Windows SUA because it's not installed by default. The purpose of writing Windows software is, presumably, so other Windows users can run it, not only so other people with SUA can run it.

Windows is presumably dropping their ancient POSIX support for the same reason they've never supported newer C versions, and perhaps for the same reason their commitment to C# was so short-lived -- because their management team doesn't really care about anything other than the old Win32 API which keeps vendors locked into the platform.

Supporting POSIX or C isn't going to draw people to the platform; it'll just make everybody else's life easier. Attempting to support newer C++ features is what will help draw and keep developers, as will their typical overhyped promises about an SDK/API revolution in the next Windows product.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 5:38 UTC (Thu) by quotemstr (subscriber, #45331) [Link]

> perhaps for the same reason their commitment to C# was so short lived

C# commitment? Short-lived? Says who? All the new stuff supports C# as much as it supports JavaScript and C++.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 9:54 UTC (Tue) by tialaramex (subscriber, #21167) [Link]

You seem to have only the vaguest idea of what was going on with NT and personalities, so I guess it might help to fix some of that before we get to the "Why" part.

- There is no real "NT userland". The NT-specific stuff all lives below the personalities, and is thus accessible from all of them. Unlike a Unix, Microsoft does not officially document and support the system call interface to their kernel.
- Win32 originates in Windows NT (although obviously the design owes plenty to Win16) and was present as a personality essentially from the platform début. This is the "native" personality of NT operating systems, and the one with access to all the nice toys invented in the last 25 years.
- The original POSIX and OS/2 subsystems date to the early 1990s too.

You're correct that POSIX is present because it was felt to be advantageous in contract negotiations. You would be wrong if you imagined that Microsoft intended /anyone/ to use this interface. It's drastically hobbled, implementing only a bare minimum of POSIX features and serving to demonstrate (as someone has commented above) that POSIX is "useless" because it doesn't specify enough to build much real software. BSD sockets, for example, are not a mandatory POSIX feature, and thus Microsoft omits them. Likewise X is not present, nor is any alternative Unix-like windowing system.

The OS/2 personality is there for the same reason. It's a checkbox item. Can this all-singing all-dancing NT run our old OS/2 software? Yes says the salesman. Then the NT stuff is installed, the OS/2 app doesn't work, and Microsoft explains that oops, that's using an unsupported API but hey, one of the many Windows ISVs will be happy to offer you an alternative that does run on NT.

Services For Unix isn't an evolution of the POSIX personality. It's Microsoft reluctantly accepting the reality that people wanted to actually _run_ Linux software (hence all the GNU utilities) on their expensive NT systems and that all too often now the IT management is present when the salesman makes his pitch and will laugh uncontrollably if the POSIX personality is mentioned. They needed a new pig, as my old CEO would say. So they bought Interix (which is what customers who desperately needed this stuff to actually work had been using instead of the POSIX personality) and renamed it SFU (I'm surprised they couldn't find an excuse to insert a 'T' in that abbreviation) but of course they've essentially allowed it to stagnate so that it's becoming irrelevant again.

It's a mistake to imagine that NT's innards resemble those of a Unix at all. It's similarly _capable_ but that's all. Roughly everything you can do on Linux from a certain era you can also do on NT from that era. But the /way/ you do them is very different. There is a different philosophy at work in the Microsoft kernel team than in Linus' project.

Today the Microsoft salesman will deflect questions about cross platform compatibility by suggesting that you can run all your "legacy" (ie non-Microsoft) software in a virtual machine. If you're dumb enough to fall for that, you deserve everything you will get.

Microsoft cross-platform

Posted Aug 21, 2012 13:39 UTC (Tue) by Wol (guest, #4433) [Link]

Actually means your old Windows software will run on your new Windows.

Which may be true for stuff MS wrote, but I have problems running WordPerfect on XP, and on 7 it won't even think of installing!

Cheers,
Wol

Microsoft cross-platform

Posted Aug 22, 2012 12:09 UTC (Wed) by vonbrand (guest, #4458) [Link]

It's not. An MSFT-written game (I forget the details) for Win98 which exactly met WinNT 4.0's specifications wouldn't make it past its splash screen.

Microsoft cross-platform

Posted Aug 22, 2012 15:01 UTC (Wed) by cortana (subscriber, #24596) [Link]

IIRC, DirectX on NT 4 never made it past version 2, so it's not surprising that a game written by MS (and hence one that probably relied on the latest DirectX) wouldn't work.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 8:42 UTC (Wed) by drag (subscriber, #31333) [Link]

> BSD sockets, for example, are not a mandatory POSIX feature, and thus Microsoft omits them.

Microsoft SUA supports BSD sockets. Over IPv6, no less. But that's neither here nor there, and thanks for the mini lesson on NT personalities. I am not an NT guy.

I am sure that we all can agree at this point that POSIX has no relevance any more if your concern is about portable software. It still may be useful for some API lawyering, but besides that it is entirely insufficient for most software.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 5:35 UTC (Thu) by quotemstr (subscriber, #45331) [Link]

> Supporting Windows is almost like supporting TRON

You think that's bad? Emacs still has a fully-maintained DOS build. Not too long ago, we had a long discussion about 8.3 filenames.

AIX is dying too ...

Posted Aug 21, 2012 13:11 UTC (Tue) by Wol (guest, #4433) [Link]

From what I can make out, IBM's attitude is "linux is going to replace it. If customers are happy to pay us to maintain it, we're happy to take their money".

But AIX is now very definitely a profit centre. When it can no longer compete in customers' minds, and starts losing money, it will be gone.

Cheers,
Wol

AIX is dying too ...

Posted Aug 21, 2012 13:59 UTC (Tue) by drag (subscriber, #31333) [Link]

> But AIX is now very definitely a profit centre. When it can no longer compete in customers' minds, and starts losing money, it will be gone.

It's hella expensive to migrate away from it. Just like any enterprise platform that a large-ish corp depends on for core functionality. The time and money it will take for most companies to move away from AIX far outstrips the cost of keeping it.

(Not to also mention the risk to your career if you decide to take something that is working and replace it with something that is cheaper... only to find the new solution doesn't work.)

IBM knows this and sets up licensing accordingly and will continue to make a killing for a very long time.

If it wasn't for this fact, 'big iron unix' would have been dead years ago, I expect.

AIX is dying too ...

Posted Aug 21, 2012 17:46 UTC (Tue) by dgm (subscriber, #49227) [Link]

AIX will die, not because enterprises migrate away from it, but because no new projects are going to be started on it. Eventually all of the old projects will be replaced by something bigger, faster, more integrated and/or more "cloudy".

AIX is dying too ...

Posted Aug 21, 2012 18:03 UTC (Tue) by dark (guest, #8483) [Link]

But "eventually" could easily be decades, as we found out in the run-up to Y2K :) I guess we'll see what's left in 2038.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 16:53 UTC (Mon) by alankila (guest, #47141) [Link]

As a certain wise man once said to me, people do an amazing amount of really hard work to avoid doing some very light work.

This phrase could be applied to the design and execution of libtool, and most certainly to autoconf, and all that. I guess I enjoyed reading this rant, but such enjoyment does me no credit; I enjoyed it because all it did was agree with my disgust for crappy build systems like autotools.

Much of the complexity appears to stem from the bad assumption that good design is flexible and adaptable. It is reflected in the notion that a good build system adapts to the environment, rather than simply dictating how the environment must work to be acceptable to the build system.

Socially, we all do our own thing and try to get along; nobody can force anybody else to do anything. Where people can't agree, complex compatibility measures get designed to allow every choice that exists, and the world moves on. The end result, however, is often not great. The bazaar's problems are not at the same level as the end user's problems. The bazaar's problem is cooperation and allowing choice; the end user's problems are more like lack of consistency and of an overall product vision and roadmap. The bazaar, I guess, cannot deliver what end users need.

I fear the future when Wayland arrives and, with its client-side decorations, we will see per-app window decorations. Not only will decorations look different, the close buttons will be at the left edge on one half of the applications and at the right edge on the others, because the developer groups for these applications were different and did not agree on where the buttons should go. Time is running out.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:43 UTC (Mon) by mjg59 (subscriber, #23239) [Link]

You know that clients can already override the window manager's behaviour, right?

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:52 UTC (Mon) by apoelstra (subscriber, #75205) [Link]

> You know that clients can already override the window manager's behaviour, right?

They can, but that's a lot of unnecessary work. App developers generally don't like making UI decisions, because they involve a lot of effort and always make half the userbase angry. But if Wayland -forces- them to make these decisions, every project will make different ones.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:58 UTC (Mon) by mjg59 (subscriber, #23239) [Link]

"Client side decorations" doesn't mean "Every client must reimplement its own decorations".

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 18:35 UTC (Mon) by alankila (guest, #47141) [Link]

The current Wayland plan is still to force clients to do the decorations.

I favor this option personally because I believe it allows excellence in user interface design, but I also foresee a likely result, which is that different classes of applications will have an entirely different look and feel at the window-decoration level as well.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 19:03 UTC (Mon) by robert_s (subscriber, #42402) [Link]

"I favor this option personally because I believe it allows excellence in user interface design"

The problem there being that 90% of developers think they're excellent at user interface design. And the people who suffer will be the users who disagree.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 22:18 UTC (Mon) by drag (subscriber, #31333) [Link]

If the developer writes shitty applications, then why on earth would users inflict those applications on themselves?

Drawing your own decorations is not the same as forcing users to use bad software.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 17:21 UTC (Thu) by jedidiah (guest, #20319) [Link]

You're forcing a specialist to be a generalist. They are going to be bad at it because they are not generalists. Division of labor exists for a reason.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 6:47 UTC (Tue) by zdzichu (subscriber, #17118) [Link]

Sorry, I think you still don't get it. Client-side decorations will, in 99% of cases, be provided by the UI toolkit, be it GTK+ or Qt, not by the application itself (although that will be possible, just as it is possible now).

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 11:20 UTC (Tue) by cortana (subscriber, #24596) [Link]

I'm *really* not looking forward to a world where all the GTK+ 3 programs have different decorations from the GTK+ 2 programs and the Qt 4 and Qt 3 programs. Not to mention the programs that run in Wine, and those that use FLTK, etc.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 22:25 UTC (Thu) by mpr22 (subscriber, #60784) [Link]

Most app developers will make the decision "let the windowing toolkit draw the decorations", because they're lazy. The ones who will make the decision "draw my own decorations" are already making that decision today.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:03 UTC (Mon) by slashdot (guest, #22014) [Link]

This guy is an idiot.

He uses ***FreeBSD compiled from source*** and complains that the (automatic!) compilation process is complicated!

And then he bashes Autoconf which is the only reason that software even compiles on FreeBSD rather than being only for Linux.

Guess what: if he used something like Ubuntu or Windows (or Android or iOS...) rather than fucking FreeBSD, he'd get what he wants.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:15 UTC (Mon) by juhl (subscriber, #33245) [Link]

Please just shut up and go get a clue.

*PLONK*

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 19:01 UTC (Mon) by hummassa (subscriber, #307) [Link]

I know it's slashdot (the user, not the site), but I actually agree with him, and I think that Kamp ignored what the Cathedral and the Bazaar actually are in ESR's original text, amongst many other things. And slashdot's complaints about the text actually make a lot of sense. So please, re-read the text and hit yourself with your cluestick.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 19:36 UTC (Mon) by tzafrir (subscriber, #11501) [Link]

(Quoting the article here)

> Here is one example of an ironic piece of waste: Sam Leffler's graphics/libtiff
> is one of the 122 packages on the road to www/firefox, yet the
> resulting Firefox browser does not render TIFF images. For
> reasons I have not tried to uncover, 10 of the 122 packages need
> Perl and seven need Python; one of them, devel/glib20, needs both
> languages for reasons I cannot even imagine.

Libtiff, Python and Perl are used by various components of the GTK+ library/stack that Firefox uses. Firefox uses GTK+ rather than completely implementing printing and such on its own as it used to do (but even before that, Firefox relied on GTK+ for some basics). I believe that this is what the article refers to as code reuse. If the author's desktop is GNOME, XFCE and/or LXDE, or even if he merely uses the GIMP, he probably already has GTK+ built from a previous install and did not need to build it just for installing Firefox.

> you will find that you need three different versions of the
> make program

Maybe this is just FreeBSD? Where in Firefox's recursive build dependencies on Linux can I find any make other than gmake?

As for building glib without perl / python support: I have no idea if the FreeBSD ports system is designed to support this. The Gentoo portage system was designed to support exactly those things. But then again, most users of "Ubuntu" (as in slashdot's comment) and the likes normally don't bother (and hence don't consider this a required feature of the packaging system :-).

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 21:45 UTC (Mon) by epa (subscriber, #39769) [Link]

He still has a legitimate point. If Firefox does nothing with TIFFs, why does it require libtiff? If the answer is that GTK contains functions which call libtiff, that's not a good enough reason if those functions are never used by Firefox. Fixing this would require more conditional compilation and/or looser, run-time linking of dependencies. Unfortunately both those things multiply the number of possible configurations and hence places for bugs to hide.
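
To make the conditional-compilation option concrete, here is a rough sketch of what a port or distributor could do at configure time. The --without-libtiff switch is from memory of the GTK+/gdk-pixbuf configure scripts of that era, so treat the exact flag name as illustrative:

    # Hypothetical port recipe: build the library without its optional TIFF
    # loader, so Firefox's dependency chain never pulls in libtiff.
    ./configure --without-libtiff
    make
    make install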

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:25 UTC (Mon) by hummassa (subscriber, #307) [Link]

Have you considered the fact that it's possible that internally one or more of ff's dependencies uses libtiff for something?

Or -- more probable -- the fact that the ports system has a bug (because it does not have USE flags like portage or something) that does not let you compile those dependencies without libtiff?

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:25 UTC (Mon) by tzafrir (subscriber, #11501) [Link]

Libtiff is not required by Firefox itself. It is required by one of the GTK+ packages. GTK+ would actually work just fine without TIFF support, but I figure that the maintainers of the GTK+ port in FreeBSD thought it would be a nice idea for GTK+ programs to have TIFF support.

So you could potentially have two copies of GTK+ on the system. But that would in fact involve a longer build time. Assuming you do want TIFF support in some other GTK+ programs, why not have it just enabled in GTK+?

So either fix it the "Ubuntu" way (build GTK+ with TIFF support for everybody) or the "Gentoo" way (build GTK+ without TIFF support for everybody). Keeping multiple copies increases your overall build time (not to mention other kinds of overhead).

Oh, and do you mean that Firefox (and Chromium, likewise) have this nice habit of not reusing system components? Well, this is a known issue. In the works, I guess.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 1:52 UTC (Tue) by roc (subscriber, #30627) [Link]

Supporting TIFF in a Web browser is simply a bad idea. There's no important use-case which isn't just as well served by other image formats. Exposing TIFF support means that some site might come to depend on it and now the Web platform has just a bit more unnecessary complexity. Also, you'll have increased the security attack surface for no good reason.

Firefox can use system codecs, that's not the issue here.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 23:10 UTC (Tue) by epa (subscriber, #39769) [Link]

Yup, the problem is that the build process tends towards maximalism, since there is only one official build for GTK and it has to include any feature that anybody might want. Any work done by upstream to make dependencies optional is neutralized by the packager, who has the choice of building with libfoo as a hard dependency, or not at all.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 4:56 UTC (Tue) by dvdeug (subscriber, #10998) [Link]

I'm trying to understand what possible difference it makes. Yes, in the theoretical world where everything is perfect, we'd only have one language to fill the hole that Python/Perl/Ruby fills. But even in that world, if it had to interact with this world and its variety of graphics formats, I still think libtiff would likely be one of the basic support packages that would get pulled in when building all the libraries that Mozilla depends on. There is no reason that libgtk should be limited to the features that Mozilla needs, or that Mozilla shouldn't depend on it because it provides a feature that Mozilla doesn't need. Libraries just don't work that way.

Back to Perl and Python ... so? What's the cost? If you're talking about a tiny embedded system where only supporting one is a major space saver, then I understand the frustration, but if you're compiling Mozilla, both Perl and Python are negligible in size.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 22:41 UTC (Mon) by nix (subscriber, #2304) [Link]

I note also that he spends much of the article complaining about Autoconf and Libtool. Go look at their principal authors -- not much of the dotcom abandoned generation there; many of them were doing free software development in the early 90s or even before, and most of those people are still around, so the callow newcomers get their stuff vetted by the old farts of PHK's generation.

(I do wonder if, as a Linux user since 1997 and a Unix user since 1993, I would count as a dotcom abandoned generation member simply because I entered the workforce in 1999...)

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 13:50 UTC (Wed) by markhb (guest, #1003) [Link]

You are apparently the only person Google is aware of who has ever used the phrase, "dotcom abandoned generation". For the benefit of those of us whose work and life experience hasn't had anything at all to do with Silicon Valley or even any technology startups, can you please explain what you meant by the phrase and how it relates to the rest of the discussion?

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 16:34 UTC (Wed) by jwakely (guest, #60262) [Link]

From the original PHK article:

> "So far they have all failed spectacularly, because the generation of lost dot-com wunderkinder in the bazaar has never seen a cathedral and therefore cannot even imagine why you would want one in the first place, much less what it should look like."

Lost dot-com wunderkinder?

Posted Aug 22, 2012 17:40 UTC (Wed) by man_ls (guest, #15091) [Link]

Many in the dot-com generation have seen enough crumbled cathedrals to last us a lifetime, while the bazaars continue to thrive. Look at the previous generation of monolithic operating systems (VMS, System/3x0), all the proprietary Unixen, their intellectual successor Plan 9, the *BSD family, all reduced to legacy status. Consider closed browsers as they lose the ongoing war against open source. Behold the multitude of proprietary abandoned web servers, while their Free counterparts continue to live in Linux repos and promiscuously accept patches from anywhere. Even the GNU project, the original cathedral, has adopted a more open approach to development. Not to speak of proprietary developments, supposedly cathedralicious but in reality more like oversized shacks, with reeking innards that require breathing equipment just to delve into.

It's a mystery how PHK can blame the bazaar for anything. In fact I wonder why he even wants to install Firefox on FreeBSD at all instead of, I don't know, Abaco for Minix.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 17:41 UTC (Mon) by hummassa (subscriber, #307) [Link]

Yes! That was exactly my opinion of the article (I don't know if the guy is an idiot, but the article certainly is STUPID).

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 7:51 UTC (Tue) by jond (subscriber, #37669) [Link]

Weirdly enough, merely using Ubuntu is not enough to save you from libtool or autotools. Are you sure you read the article?

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 13:17 UTC (Tue) by hummassa (subscriber, #307) [Link]

Yes it is. You just apt-get install firefox (actually, you click on the "install firefox for me please" icon) and you are done.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 8:15 UTC (Tue) by Pawlerson (guest, #74136) [Link]

You've got the point. FreeBSD and friends are relics of the past which shouldn't even be taken into account or serve as examples. They're dead, but summoned as the living dead by some anti-Linux companies.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 19:44 UTC (Mon) by iabervon (subscriber, #722) [Link]

He seems to have identified legitimate problems, but attributing them to the bazaar model seems to be entirely unsupported. In fact, half of the things he talks about were screwed up in the cathedral era and the bazaar just hasn't touched them. Unless, of course, he considers the UNIX diaspora (*BSD, HP-UX, AIX, Solaris, etc.) to be part of the Bazaar, which might be a bit accurate. But even there, the lesson would really be: before the bazaar was recognized as a real model, it was actually used, but really badly.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 20:18 UTC (Mon) by Klavs (guest, #10563) [Link]

I actually agree with PHK that "craftsmanship" has gone by the wayside, at least in the sense of a "finished product, including the compilation process". Though I tend to believe it is not the "way of the bazaar" one should blame - more the ever-increasing pressure to "get something out", and since the standard autotools mess "works", people just leave it at that and focus on their project's goal instead of bothering with those details (they only scratch their own itch in that sense).

I bet there are just as many well-educated programmers who do the exact same half-assed build thing and are just glad it works, so they can focus on what their managers want them to deliver.

And since it's Open Source - I'm sure most of the projects would welcome an update to their build process, to help the situation.. anyone?

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 20:19 UTC (Mon) by Klavs (guest, #10563) [Link]

One might argue that the "get something out" pressure is related to the bazaar, but I'd disagree, since cathedral-based projects also have requirements to deliver.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 22:15 UTC (Mon) by pboddie (guest, #50784) [Link]

I see that autotools gets a lot of criticism here, but people do seem to forget that outside the open source mainstream criticised in the article, the alternatives are not better systems but are instead Makefiles with hardcoded library paths or, as someone else pointed out, a directory of "this worked for me once" system-specific Makefiles for ancient Unix implementations.

Just to avoid having to fix up Makefiles on a regular basis, I once applied autotools to an open source project (whose code was actually of a high quality) without having any experience with it beforehand, and it saved me (and hopefully others) a lot of time. It wasn't particularly pleasant, but other projects have failed to replicate autotools' functionality, so instead of pausing real life to undertake the mammoth task of making the perfect tool, we accept the compromise and do the job we were supposed to be doing.

It's possible to make autotools do the right thing if you're conservative and don't delegate the task to automake: it's precisely when you do delegate that you end up with the resulting configure script sniffing around half the system, trying to figure out whether you have the NIS header files in order to build a calculator program or whatever.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:06 UTC (Mon) by alankila (guest, #47141) [Link]

It's also possible to use, say, pkg-config. Which can spit out cflags and ldflags and compilation flags without any autotools voodoo. There is no need to hardcode library paths, that is simply a strawman.
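
A minimal sketch of the pkg-config usage being described; gtk+-2.0 is just an example of a package name registered by an installed .pc file:

    # Ask pkg-config for the compiler and linker flags instead of
    # hard-coding include and library paths.
    cc $(pkg-config --cflags gtk+-2.0) -o hello hello.c $(pkg-config --libs gtk+-2.0)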

Autotools brings a lot of complexity, and I would personally be far less opposed to it, if:

1. there were fewer code generation layers and fewer programming languages and fewer programs involved. Code generation is imho an amateur's tool. Very few systems that use this method stay maintainable, and it generates pain at every level when you hunt for the cause of an error message in generated files that get increasingly bewildering the further down the autogeneration chain the file is.

2. (related to 1.) the results were prettier. A hand-crafted Makefile for a project can be a kilobyte or two. Autotools spits 50 kB behemoths into every directory. Hand-rolled Makefiles are actually really pretty in comparison.

3. when it doesn't generate something right, there should be some way to reset the project tree to start over that isn't "rm -rf and fetch again from version control". There are so many caches and vague directories involved, and if you don't manage to hit every one of them, it won't generate things right and this makes everything related to fixing autotools builds horrible. In practice I've found the rm -rf solution to work best.

4. the build system were more stable. It seems that every new version breaks a build that used to work before.

So yeah, before I start to rage incoherently, I'll just stop here.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:38 UTC (Mon) by hummassa (subscriber, #307) [Link]

> So yeah, before I start to rage incoherently, I'll just stop here.

Too late. :-D

Yes, there are alternatives to autotools. Normally they are prettier, but less complete. If I have to take some code multiplatform nowadays, I normally go with CMake... but sometimes autotools is necessary, depending on how multiplatform you want to go (you know, QNX is still a thing out there!).

And, moreover, if someone already autotooled a project (like Firefox!), you have to do a lot of work to convert it to CMake (or another tool of your preference). It is janitorial, unglamorous work (so volunteers normally don't want to do it) and it takes some code that does X and transforms it into some other code that does X+0 (so managers won't let you do it, because they want work that makes code go from X to X+Y, preferably for a large positive value of Y).

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 7:07 UTC (Tue) by cmccabe (guest, #60281) [Link]

I've done a few conversions to CMake, and none of them took longer than a weekend. The main issues were administrative (getting cmake installed on the build servers, making sure the continuous build continued to work). The maintenance overhead of autotools is very, very high, and you'll get back those two days and more in a few weeks. And you'll have a build system that you understand and have confidence in, rather than a black box.
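
For anyone who hasn't tried it, the day-to-day workflow after such a conversion looks roughly like this (assuming a CMakeLists.txt at the top of the source tree):

    # Out-of-source build: all generated files stay under build/.
    mkdir build && cd build
    cmake ..    # configure: locate compilers and dependencies, generate Makefiles
    make        # build with the generated Makefiles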

I don't know much about QNX. I believe CMake supports it in some form, but they had trouble finding anyone who actually used QNX to do maintenance.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 8:14 UTC (Wed) by drag (subscriber, #31333) [Link]

> I don't know much about QNX.

Probably not much call for cross compiling suites for the software on gas pumps.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 20:44 UTC (Thu) by hummassa (subscriber, #307) [Link]

But some of us have to... (and voting machines, and ATMs...)

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 10:18 UTC (Tue) by pboddie (guest, #50784) [Link]

It's also possible to use, say, pkg-config. Which can spit out cflags and ldflags and compilation flags without any autotools voodoo. There is no need to hardcode library paths, that is simply a strawman.

I wasn't defending autotools as such, but pointing out that the alternative in the wider world, beyond the group that the author berates and believes is responsible for the downfall of civilisation, is not another configuration tool but in fact nothing at all.

Of course there is no need to hard-code library paths, and of course you can use other tools instead, but the hard reality is that anything is a step up from the default, which is not to address deployment issues whatsoever. To say that such an observation is a "strawman" is laughable: I never claimed that there was a "need" for any such practice.

The fact is that the set of all people who can write code and the set of all people who know about software engineering practices to an acceptable degree do not fully intersect, and where they do intersect, people often choose an established tool over another for a variety of reasons, rather than making the choice purely on technical merit.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 0:16 UTC (Wed) by alankila (guest, #47141) [Link]

Okay. I believe your criticism of my criticism is fair. I withdraw it -- I misunderstood. I simply found the juxtaposition you made so gross that I objected instinctively.

If I am reading this communication correctly, you are stating that autotools is in fact *the only* option "in the wider world". If that is the case, I imagine it might be time to narrow some of that wider world so that the design could still be sane while working for almost everybody...

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 13:51 UTC (Wed) by pboddie (guest, #50784) [Link]

I was just saying that the author thinks that people should just move up from autotools and use something better or write something cleaner, when in fact a lot of software developers aren't even aware of autotools or other ways of solving the problems of configuration and deployment. Indeed, they might not even be aware that those problems exist. For those developers "even" autotools is better than what they're doing.

In essence, the author is enjoying a luxury problem: he can complain about people who know what he is talking about while overlooking the hordes of software developers who hard-code everything, fire up Visual Studio, and pass round the resulting binary. It's like someone at a gentleman's club complaining about the manners of the doorman when just along the street people are swearing and fighting each other.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 22:33 UTC (Tue) by nix (subscriber, #2304) [Link]

Hang on, you say your autotools animosity derives from attempts to cross-compile? And you push *pkg-config*? pkg-config is *immensely* worse than autotools at cross-compilation: it barely supports it at all, and when it doesn't, it tends to cause silent, horrible failures (picking up .pc files for entirely the wrong architecture, with no understanding of sysroots or any of the other delicacies involved in cross-compilation).
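
For context, the usual workaround is to point pkg-config at the target's .pc files by hand through its own environment variables; whether that amounts to real cross-compilation support is exactly what is in dispute here. A sketch, with illustrative paths:

    # Keep pkg-config away from the build host's .pc files.
    export PKG_CONFIG_SYSROOT_DIR=/opt/sysroots/armv7                    # prefixed onto -I/-L paths
    export PKG_CONFIG_LIBDIR="$PKG_CONFIG_SYSROOT_DIR/usr/lib/pkgconfig" # replaces the default search path
    export PKG_CONFIG_PATH=                                              # no fallback to host paths
    pkg-config --cflags --libs glib-2.0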

Your claim that code generation is an amateur's tool is also laughable in the extreme.

As for resetting things, 'make distclean' is your friend (assuming you're either following the GNU Coding Standards or using Automake, which lets you follow them automatically).

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 23:43 UTC (Tue) by alankila (guest, #47141) [Link]

I said nothing about cross-compiling. You're somehow combining unrelated statements from different contexts. I have previously said that I'm pleased that Android is largely inoculated against autotools because it is so bad at cross-compiling, though. Of course, I was hastily corrected and am now aware that a large part of the autotools macros do not actually need to run the resulting programs, but can do their work just by inspecting the output of cpp (or something).

As to the quality of autotools and its approach to code generation, I personally believe it is a prime example of code generation done wrong. Regardless, I agree that code generation IS an essential tool for some other contexts (e.g. I have nothing against gcc transforming C to assembly), but when you write configure.ac to generate configure.in to generate configure to generate Makefile.in to generate Makefile to generate the final product ... At some point you have used too much of a good thing, damnit. And I think my objection is not really about the many layers involved, but about the quality of the implementation at each layer, which imho sucks. Hard.

"make distclean". I will try to remember that if it ever happens that I have to deal with autotools again.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 0:55 UTC (Wed) by cesarb (subscriber, #6266) [Link]

> but when you write configure.ac to generate configure.in to generate configure to generate Makefile.in to generate Makefile to generate the final product ...

AFAIK, configure.ac is just the modern name for configure.in. You will not have both at the same time, and both are used to generate configure.

Usually, you do not use configure to generate Makefile.in. Instead, you use automake to generate Makefile.in from Makefile.am, and configure to generate Makefile from Makefile.in.

So, what you have is that you write configure.ac to generate configure, Makefile.am to generate Makefile.in, and configure and Makefile.in together generate Makefile. But that is simplifying things, there is also config.h.in to generate config.h together with configure, and some stamp.h.in/stamp-h thing.

But wait, there is more! I forgot aclocal. So, you write configure.ac to generate aclocal.m4 to generate configure. And then there is libtool. Since this post is already too long, let me just link to a page of its manual with an easy-to-understand ASCII art graphic: https://www.gnu.org/software/libtool/manual/html_node/Int... (there is one for autoconf too, see https://www.gnu.org/savannah-checkouts/gnu/autoconf/manua...).

autotools is fun!
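
A sketch of that generation chain run by hand, for the curious; in practice autoreconf drives all of these steps for you:

    aclocal                  # configure.ac (+ m4/)      -> aclocal.m4
    autoconf                 # configure.ac + aclocal.m4 -> configure
    autoheader               # configure.ac              -> config.h.in
    automake --add-missing   # Makefile.am               -> Makefile.in
    ./configure              # configure + *.in files    -> Makefile, config.h
    make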

Kamp: A Generation Lost in the Bazaar

Posted Aug 27, 2012 11:35 UTC (Mon) by pboddie (guest, #50784) [Link]

I almost feel I could get some quick technical support here. People reading my other comments may get the impression that I really like autotools, but I spent the weekend updating OpenWrt packages that rely on the underlying build systems of projects that are mostly autotools systems, and there can be some weird things going on.

First of all, a lot of projects use automake which doesn't really lend itself to concise, easily diagnosable scripts - there's plenty of "where does the Makefile get *that* from?" and "do I really need a few hundred lines of settings for hello_world.c?" - and having written .in files by hand a while ago, I do get the feeling that people might just be taking something that works for someone else and stuffing it into a situation where it doesn't necessarily make sense. Here, I disagree with the author of the article: the "bazaar" may make such arguably inappropriate re-use easier since you can just search the Internet for something that works, but it isn't a precondition of "bazaar" development that you just copy stuff around.

Cross-compilation is always going to stress build systems, and I found cases of things wanting to run recently built or installed things to configure themselves (Gtk+, I think) as well as a particularly bizarre interaction with pkg-config producing build-time "staging" paths instead of "target" paths. I'm pretty sure that OpenWrt does some magic to make pkg-config do the right thing, but then some other inscrutable thing seems to be overriding it.

What I would probably conclude from much of this is that less is, more often than not, more: a more economically stated set of configuration resources would probably produce fewer surprises, especially outside the narrow mode of operation of the upstream developers.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 13:32 UTC (Wed) by nix (subscriber, #2304) [Link]

The code generation I was really alluding to was not GCC's C->assembly transformations, but the various tools that run during a GCC build to turn machine description files into (often immense and ugly) C files implementing some aspect of those md files in the finished compiler. You could interpret the md files at runtime to implement the same behaviour, but it would be intolerably slow.

As for autotools' use of code generation, that has one raison d'etre: to ensure that a configure script has no dependencies other than on core shell-script tools. In this it is quite different from cmake and most other build tools (you can't build a cmake project without cmake installed). This property is probably an essential one for anything involved in the bootstrapping core (GNU Make has a similar trick, with a shell script that will build an initial make for you on a system that has none). This property is perhaps less important these days, when everyone has a Linux box at hand, but in the old days of proprietary Unix when even compilers were hard to come by and barely worked this property was essential.

As for 'make distclean', this is not an autotools thing but rather a make target defined in the GNU coding standards, and thus implemented even by GNU projects that do not use Autoconf at all. Automake happens to generate this target because part of its purpose is to make it easier to conform to the GNU coding standards by generating most of its mandated targets automatically.
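
For reference, the GNU Coding Standards define a small hierarchy of clean targets; a rough sketch of what each level removes (individual projects may deviate):

    make clean              # objects and binaries produced by make
    make distclean          # additionally, everything configure generated
    make maintainer-clean   # additionally, files only the maintainer's tools can regenerate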

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 21:10 UTC (Mon) by lieb (guest, #42749) [Link]

I think he misses the point. He may go back a ways but I go back to when there was only one version of UNIX and it ran on my pdp-11/70. I suppose that is pre-Autotools because none of it was "portable" yet. Ok, I'm old(er). To the point:

1. Autotools was a good idea at the time. M4 was also the only "reasonable" tool at the time. I don't believe even Perl was around then. Having said that, it is long past its expiration date. See the KDE dev's article on converting to cmake. It is not perfect but it is better.

2. POSIX as the end-all, be-all of solving portability has been a mirage. It too was a nice start, back when some of us put together and reviewed the first /usr/group standard that was the seed for POSIX. Too bad the commercial vendors pretty much ignored it.

3. He mistakes the various UNIXes he names for the bazaar. It was more like a whole town of cathedrals. They all developed and released _their_ version under the same assumptions and marketing lock-in attempts as they used for their proprietary ones. All, of course, paid lip service to (2).

4. LWN did three articles a little while ago on the UNIX design: the good, the bad, and the downright ugly, unfixable bits thereof. Linux is much better than either of its predecessors, both the SVR4- and the *BSD-based ones, and has fixed or avoided many of their shortcomings, but it is not "perfect" either. Sometimes it's barely adequate.

5. The final bit. The proof of the pudding. All of the others are dead. Linux (and UNIX before it) must have done something right whereas VMS, MPS, Sun, DEC, Tandem etc. didn't.

Evolution is messy. There are dinosaurs and dead ends. Yeah, we let all the kids in and they made a mess or two until they got potty trained. They also have produced some wicked smart code. That happens in open source. Everybody can read it and lots think they can also write it. In the long run, I prefer it to the alternative, where one secretive (IP-sensitive) organization tries to control it. The last one of these is Microsoft, and this is their major weakness. Yes, they have some unity to it (I'm giving them the minimally deserved benefit of the doubt), but in comparison they have so few people who really know how it works. All you have to do is look at the mess of Vista and now Windows 8. One of these days they'll make a wrong turn and join VMS in the nursing home.

Linus is a genius in a way most of his *BSD contemporaries missed. No, I am not talking about his prowess as a kernel coder, although he is *really*, *really* good. His genius has been his ability to get all the rest of us to do the work, good work, for free! And he let us go and get our companies to pay for it - without a single marketing presentation to anybody. Pure genius. Tom Sawyer, painting the fence, was a piker by comparison.

The younger generation

Posted Aug 21, 2012 2:40 UTC (Tue) by man_ls (guest, #15091) [Link]

Thanks for a balanced comment, and much more interesting than the original rantish and trollish article. People often forget how progress works: the new thing does not need to be perfect, just better than what came before. Asking for perfection rarely helps a software project.

I would add that the "pile of old festering hacks, endlessly copied and pasted by a clueless generation of IT 'professionals'" is often a direct consequence of "the beautiful cathedral of Unix, deservedly famous for its simplicity of design". The LWN articles on Unix principles you cite did show the point very clearly: Unix principles do not fit all situations. Without trying to assess systemd as a replacement, SysV init is a good example of a mess brought about by trying to follow the purity of Unix. There is a point in allowing a younger generation to disregard all those principles and remove the pile of hacks, to start something anew. And often it brings along a simpler vision, governed by a smaller team.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:08 UTC (Mon) by neilbrown (subscriber, #359) [Link]

> One of Brooks's many excellent points is that quality happens only if somebody has the responsibility for it, and that "somebody" can be no more than one single person—with an exception for a dynamic duo.

Doesn't that just say it all? Quality (as defined here) doesn't scale. And at Internet Speed - if it doesn't scale, it doesn't survive :-(

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:20 UTC (Mon) by reddit (guest, #86331) [Link]

It can scale if that one person delegates things while still checking things and keeping ultimate responsibility.

That's how most corporations work in theory, for instance, and also how the Linux kernel project works in practice.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:56 UTC (Mon) by neilbrown (subscriber, #359) [Link]

> and also how the Linux kernel project works in practice.

I'm sorry, but I don't see that. The functionality of Linux is certainly going up, but I'm convinced that quality is going down (and that observation is from reading code).

You can only scale by delegation when the property delegated is transitive. However taste is not transitive and consistent taste is needed for high quality.

The linux development style tends to promote compromise rather than quality (Linus won't resolve your dispute, he'll just ignore both of you until you resolve it yourself). Compromise is certainly pragmatic and functional, but is unlikely to be elegant.

I'm not meaning to attack Linux here. Its success speaks for itself and I wouldn't try to change anything which would threaten that. But let's not pretend that quality - of the sort that comes from a single guiding taste - has anything to do with it.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 14:41 UTC (Tue) by intgr (subscriber, #39733) [Link]

> but I'm convinced that quality is going down (and that observation is from reading code).
> The linux development style tends to promote compromise rather than quality (Linus won't resolve your dispute, he'll just ignore both of you until you resolve it yourself).

Perhaps it's true that the addition of new features is driving the average quality down, but I also get the impression that, as features stabilize and mature, their quality improves over time again.

One clear case is the mess that was Wi-Fi support back when it got started: how the crappy "code drop" drivers got cleaned up, how the duplicate Wi-Fi stacks got merged into one, and how the result gained uniform support for features and a uniform configuration interface.

Another example is the ARM cleanups and unifications in recent releases.

I would say that these are a sign of increasing quality and elegance, and a direct result of "the Linux development style", and Linus in particular.

Would you disagree?

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 23:04 UTC (Tue) by neilbrown (subscriber, #359) [Link]

You make an excellent point.

There are occasionally opportunities to do some refactoring to impose some design on something that has become an ad-hoc collection of features. And sometimes, these opportunities are turned into realities. And that is great.

However I don't think of this as 'design' in the way the original article was using the term. It is more of a case of "struggle along with no real guiding force until the weight of the mess becomes unbearable and then make the effort to clean it up".

That is the sort of design that can scale. It doesn't need just one person. It allows lots of work to be done in parallel and then while there is lots of working code and lots of examples to draw from, an improved design can be drafted and existing code moved over to it - slowly and by lots of people.

It is "a priori" design that people seem to value, but that doesn't really scale. "Post hoc" design can scale to some extent and is what you see happening in Wi-Fi and ARM (and lots of other places).

Of course a consequence of "post hoc" design is that lots of things will never get redesigned, because the weight of the ugliness never gets high enough. Maybe that is where autotools is: it is ugly, but not quite ugly enough to force a rewrite. The cost/benefit ratio is still too high.

So while there are islands of good design in Linux, they can be expected to degrade over time unless someone puts consistent work into cleaning up. It's the constant battle between energy and entropy.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 8:13 UTC (Wed) by tnoo (subscriber, #20427) [Link]

I would claim that this is the only way to move forward. Evolution is just more efficient than intelligent design (sorry for using these expressions in the software context). It is not possible to design a sane interface for the mess that actually existing hardware is, and that will include future hardware. Similarly for any aspect of the kernel under real-world loads and use cases.

Getting some working implementation, and then abstracting from it while benchmarking alternative, more elegant solutions, is the only process that works if the problem is not fully understood, or is hard.

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:41 UTC (Mon) by bojan (subscriber, #14302) [Link]

I honestly don't understand the point he's trying to make here. He at the same time claims that open source people are unqualified adolescent idiots and that they somehow managed to work through this mess all these years (which he claims is exceedingly difficult to do). Which is it?

We'd all like the world to be perfect. Buy one programming book of the only programming language in existence, write your perfect program (which nobody tried to create before), compile it with a perfect (and only) compiler and receive a unique and perfect binary copy to run on the only architecture available...

Kamp: A Generation Lost in the Bazaar

Posted Aug 20, 2012 23:45 UTC (Mon) by hummassa (subscriber, #307) [Link]

> We'd all like the world to be perfect. Buy one programming book of the only programming language in existence, write your perfect program (which nobody tried to create before), compile it with a perfect (and only) compiler and receive a unique and perfect binary copy to run on the only architecture available...

Oh, man, NO, please. That way lies madness and tyranny.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 0:06 UTC (Tue) by bojan (subscriber, #14302) [Link]

I was being sarcastic. Of course.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 3:03 UTC (Tue) by fest3er (guest, #60379) [Link]

Corbet asked, "[A]re we too focused on the accumulation of features at the expense of the design of the system as a whole?"

To this question, I will opine yes, there are many in the industry who focus on adding features rather than fixing what already exists. Take udev, for example. Every time I upgrade udev in my firewall project, it's weeks before it works again. And now udev can't be built by itself; it requires a bunch of stuff I neither want nor need. GCC: I had to change a bunch of C++ source code to handle syntax changes between 3.5 and 4.3. And now 4.6 won't build old grub (last I checked).

It comes down to adherence to the old 'grow or die' adage. Far too many believe their software will die if they don't pile on the features--creeping featuritis. Far too many have forgotten the basic principles of UNIX utilities, one of which is, "Do one thing, and do it well."

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 7:26 UTC (Tue) by khim (subscriber, #9252) [Link]

This is a great post. It shows what's really wrong out there.

Well, it starts in the usual fashion:

To this question, I will opine yes, there are many in the industry who focus on adding features rather than fixing what already exists.

This is the usual complaint - nothing new here.

Take udev, for example. Every time I upgrade udev in my firewall project, it's weeks before it works again. And now udev can't be built by itself; it requires a bunch of stuff I neither want nor need.

The fascinating thing is that the example of "feature creep" we are given is a rare project which concentrates on removing duplicated features.

Basically these two items boil down to: features are evil, we need to get rid of them… except for the ones I want to use, of course.

GCC: I had to change a bunch of C++ source code to handle syntax changes between 3.5 and 4.3. And now 4.6 won't build old grub (last I checked).

The same tune again: GCC removed features which it was not supposed to have in the first place, and now the sky is falling. So yes, again: features are evil, but the ones I personally need are vital.

Far too many have forgotten the basic principles of UNIX utilities, one of which is, "Do one thing, and do it well."

Why do you think people forgot about this? They remember that motto. But the smart ones remember that it's the motto of a system which is dead and buried, as well (its descendants are alive, and some of them are even widely used, but they all violate said motto in one way or another).

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 15:41 UTC (Tue) by JEFFREY (guest, #79095) [Link]

"But smart ones remember that it's motto of a system which is dead and buried"

*Not* dead and buried. Those of you who wish to write massive bloatware and spend the rest of your lives debugging, have fun.

The real smart ones will settle for "worse is better," and "Doing one thing well [enough]." And those who do will not be stuck in a quagmire of runtime engines and debuggers. They are the ones who will actually get their tasks done and their legacy will be a handful of scripts that are still used years after their authors' deaths.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 16:18 UTC (Tue) by khim (subscriber, #9252) [Link]

The real smart ones will settle for "worse is better," and "Doing one thing well [enough]." And those who do will not be stuck in a quagmire of runtime engines and debuggers. They are the ones who will actually get their tasks done and their legacy will be a handful of scripts that are still used years after their authors' deaths.

You mean things like TeX? Sorry to disappoint you, but it was on the fast track to extinction and oblivion before its descendant embraced the "feature creep" approach, which may, just may, give it a new lease of life.

*Not* dead and buried. Those of you who wish to write massive bloatware and spend the rest of your lives debugging, have fun.

You may spend the rest of your life doing the mostly pointless work of writing massive bloatware and debugging it, or you can spend your time writing a handful of scripts that are still used [many] years after their authors' deaths, where "many" will equal two or, if you are lucky, ten.

I try to keep scripts intended for my own consumption as small and simple as possible because they are not the point (the result of their work is), but if you want to write some program for someone else then you only have two sensible choices:

  • Write massive bloatware and hope that it'll be useful in the future.
  • Write a lean and mean system which, with almost 100% certainty, will not be useful.

The choice is yours, but I'm not sure why you think the second outcome is preferable.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 17:14 UTC (Tue) by JEFFREY (guest, #79095) [Link]

I was thinking more along the lines of tar.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 18:54 UTC (Tue) by rgmoore (✭ supporter ✭, #75) [Link]

I'm not sure how well tar fits with the "do one thing and do it well" motto anymore. Source code for the latest version of GNU tar weighs in at 13.5 MB and 825 files. It has more than 40 single-letter command-line switches and over 100 long options. It may have been a nice, simple program once, but we've kept asking it to do more and more things until it became a monster.
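
To put that in perspective, the same backup can be made with the classic three-letter invocation or with a small sampling of the long options that have accumulated over the years:

    # Classic usage: create, gzip, write to a file.
    tar -czf backup.tar.gz project/

    # A taste of GNU tar's long options.
    tar --create --gzip --sparse --one-file-system --exclude-vcs \
        --listed-incremental=project.snar --file=backup.tar.gz project/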

I think this kind of thing is more or less inevitable. We've been asking our computers to do more and more, and you can't do that without adding more complexity somewhere. It can either be at the level of adding more simple, single-purpose programs or at the level of adding complexity to existing programs, but one way or the other the system gets more complex. The only way we can keep the nice, simple, clean Unix that we had 30 years ago is to stop asking it to do more than we did with it 30 years ago.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 20:38 UTC (Tue) by raven667 (subscriber, #5198) [Link]

Well said.

It's always an interesting thought experiment to see how much of what we do with modern systems could fundamentally be done with 30-year-old tech. Instead of HTTP and the Web there was Telnet or 3270, which fulfilled many of the same goals as an application delivery platform. Graphics would be right out, but the core functionality of apps like Twitter, Facebook, GMail, word processing and office apps, etc. could be done on 30-year-old systems. Maybe the biggest problem would not be functionality but scale: no system of the day could scale to the many millions of active users that current systems support, AFAIK.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 16:23 UTC (Tue) by adobriyan (guest, #30858) [Link]

Bonus points for getting away with it and not being cursed dozens of years after your handful of scripts was put into production. But I guess dead authors do not care.

Accumulating features is an essential design component

Posted Aug 21, 2012 15:58 UTC (Tue) by southey (subscriber, #9466) [Link]

Most open source projects pride themselves on backwards compatibility, so your overall system design must accumulate features. Somehow the kernel needs to handle an external hard drive over many different interfaces, one of which also connects to the display (not a reality 21 years ago). Now there is Intel's plan for recharging using WiFi.

Sure, with incredible vision, developers have gone back to change the design (IDE-SCSI?, merging i386 and amd64 subkernels). But that is code refactoring not design.

However, backwards compatibility has to end for various reasons. Probably with GCC it is related to handling and enforcing different standards as much as to new features. It is not really effective to maintain K&R's 1978, ANSI's 1983, C89/90 in 1990, C99 in 1999, and C11 in 2011 in the same compiler. Also, code should be maintained so that deprecated features (or non-standard functionality) are removed. Otherwise you have to go back in the time machine and use contemporary tools of the same age as the code.
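
For concreteness, the standards coexist in GCC as per-invocation dialect switches rather than as separate compilers; a quick sketch (flag availability depends on the GCC release):

    gcc -std=c89 -pedantic -c foo.c   # ISO C90 / ANSI C
    gcc -std=c99 -pedantic -c foo.c   # ISO C99
    gcc -std=c11 -pedantic -c foo.c   # ISO C11 (recent releases only)
    gcc -std=gnu89 -c foo.c           # C90 plus GNU extensions (the long-time default)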

Accumulating features is an essential design component

Posted Aug 21, 2012 22:48 UTC (Tue) by nix (subscriber, #2304) [Link]

With GCC, with a very few exceptions (the removal of a few ill-thought-out language extensions in 4.0, the removal of 'traditional mode' for complexity, ill-definedness, and no-one-uses-it reasons), most code broken by new GCC releases was simply buggy all along, but GCC was incorrectly accepting it. GCC never supported C83, and nothing post-C89 has ever been removed (nor, likely, ever will be). C78 was never supported either: traditional mode supported C78 plus a bunch of common vendor extensions plus a *lot* of bugs because traditional mode was basically never used by anyone for anything for decades before its removal.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 7:29 UTC (Tue) by jpfrancois (subscriber, #65948) [Link]

So a tool written mostly in the cathedral era, used to provide portability across different styles of cathedral, is an ugly piece of shit, and the fault lies with the people who came after the mess was created? They can be blamed for still using it, but, as the quote goes, there are two kinds of systems:
"the kind nobody uses and the kind everyone bitches about"

His example does not really support his thesis. That is not a reason to discard the entire thesis, but come on: the beautiful cathedral of Unix is more like the Sagrada Família, and autotools is a child of the pre-dotcom era.

Generations Won For Java

Posted Aug 21, 2012 8:43 UTC (Tue) by rwst (guest, #84121) [Link]

This all is why people just take the platform, install a JRE and do real work.

Generations Won For Java

Posted Aug 21, 2012 10:21 UTC (Tue) by pboddie (guest, #50784) [Link]

I wonder if the author has seen Apache Maven. Plenty of extra material right there.

Generations Won For Java

Posted Aug 21, 2012 10:21 UTC (Tue) by bojan (subscriber, #14302) [Link]

You forgot a few steps: buy all available RAM in the local store, install it and then wait 2 minutes each time for the program to start. ;-)

It runs everywhere!

Posted Aug 21, 2012 12:35 UTC (Tue) by man_ls (guest, #15091) [Link]

And then convince everyone else to install the latest JRE to run your package. Wait, not 1.6; it has to be 1.7. Yes, all 50+ MB of it. No, IcedTea is no good; what sort of freak are you? ARM, what's that? Wait, you don't have all the Apache Commons libraries already? Choose between the following weird installation options: as an applet (does not work on Firefox or Chrome any more; IE required), as an inconvenient .jar file which will quickly get lost, as a hacky .bat or .sh wrapper on a .jar distributed inside a zipped directory, or as a .exe generated using black arts (which won't work on your Linux system anyway).

Life is good again! Such is progress!

It runs everywhere!

Posted Aug 21, 2012 14:12 UTC (Tue) by fb (subscriber, #53265) [Link]

Sure, Java and the JVM did not solve every deployment problem on earth. News at 11.

Compared to the possible amount of deployment/running errors you can get running C/C++, I'd say that Java (and all the associated tooling around it) actually brought a great deal of sanity to the table.

It runs everywhere!

Posted Aug 21, 2012 15:01 UTC (Tue) by man_ls (guest, #15091) [Link]

It is not just that Java did not solve every deployment problem; in fact Java has created additional deployment problems on every platform it runs on, which are just a handful. To name just a few issues: download a gigantic runtime, deploy a specific version of the JVM, deploy associated libraries (which may or may not conflict with those in the JRE), reserve a lot of memory for it (often more than the JVM provides by default) and make the whole thing run in a sensible way for end users without having to type an arcane command. A configuration nightmare. Compare with Python: a small runtime, standard library locations in the system, a simple way to run scripts, runs truly everywhere.

As to runtime errors, you are right about C/C++, but there is a reason why people put up with them. And Java is not without issues. I don't know if you have ever been in jar hell: looking for an error which manifests only on the web server, caused (supposedly) by a specific version of one of the libraries installed, but you have no way to find out whether it was embedded in the JRE, loaded from the classpath, from the library path, or embedded in the .war file. (Normally you assume it is using the .jar in the .war, and only after a lot of head-scratching do you work up from there.)

For development you need not only a way to manage all of the above, but also an alien build system (Ant, alien even to the JDK) and a way to manage dependencies (Maven), and then you have a lot of random packages not integrated with anything else -- not even with those already in the system. No wonder the only way to do sane development with Java is from within Eclipse, which provides a lot of the missing pieces -- but good luck trying to package and run that code elsewhere.

Makes autotools look like something easy. At least you run a standard set of commands: ./configure && make && make install and you are done, sometimes -- or you google the error.

It runs everywhere!

Posted Aug 21, 2012 15:19 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

>download a gigantic runtime, deploy a specific version of the JVM
That's not true. The JVM is incredibly backwards-compatible, to the point that it becomes excessively so. That means you simply need a recent enough version of the JVM.

>make the whole thing run in a sensible way for end users without having to type an arcane command. A configuration nightmare.
That's partially true. Java launchers are system-dependent, though WebStart helps somewhat.

But it's the same problem with Python: you have to package a Python app separately for Linux, Windows and Mac OS X.

Three answers in one

Posted Aug 21, 2012 16:23 UTC (Tue) by man_ls (guest, #15091) [Link]

But it's the same problem with Python
Not really. Python runs source files directly, without a separate compilation step, so you can just git clone a repo and run it (double-click on Windows): no need to package anything. If you want to package software, Python includes Distutils, which makes installing software a breeze. Finally, Python has the official repository PyPI, and a tool called pip to download third-party libraries. All of that as part of Python itself, as officially supported tools. Hard to beat.
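In practice it boils down to one of these (the package name is just an example):

    pip install SomePackage       # fetch and install a library from PyPI
    python setup.py install       # install a source checkout via Distutils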

Three answers in one

Posted Aug 21, 2012 22:52 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

Try to beat Java WebStart. I simply give you a link and it downloads and starts the application (with all its dependencies).
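The link points at a small XML descriptor that the javaws launcher interprets; a minimal sketch, with codebase, jar and class names made up:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Minimal Web Start descriptor; all names and URLs here are invented. -->
    <jnlp spec="1.0+" codebase="http://example.com/app" href="app.jnlp">
      <information>
        <title>Example App</title>
        <vendor>Example Vendor</vendor>
      </information>
      <resources>
        <j2se version="1.6+"/>
        <jar href="app.jar" main="true"/>
      </resources>
      <application-desc main-class="com.example.Main"/>
    </jnlp>

Click it, and the JRE fetches the jars listed there, caches them and runs the main class.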

Now try that with Python. On Windows.

Go on, do it. You'll have to install Python first, or you won't be able to run anything. After you've installed it you might notice that clicking on a script results in an ugly console window (even though you might have a GUI app), so you need some sort of runner.

Three answers in one

Posted Aug 21, 2012 23:16 UTC (Tue) by man_ls (guest, #15091) [Link]

But to use WebStart you need to install the JRE first. How is that different from installing Python, except that Python is much smaller and available for many more platforms (and pre-installed on Mac OS X and most Linuxen)? Once installed, I can beat your single link with a single command: pip install package. I can dictate it to Aunt Tilly over the phone.

I don't know much about WebStart, and I have not seen it used for anything remotely serious; I don't know the steps needed to package something with it either. For packaging, distributing, installing and using software on a regular basis, and for basic configuration management, Python is great.

Three answers in one

Posted Aug 22, 2012 2:42 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

>But to use WebStart you need to install the JRE first.
Go to http://java.com and download it. Big deal.

>Once installed I can beat your single link with a single command: pip install package. I can dictate it to aunt Tilly over the phone.

- Pip? Is that a character from Great Expectations? Do you want me to reread that awful novel? How dare you!

Once you say "command line" you are WAY outside the realm of acceptable for end-user apps. WAAAAAAY too far.

> I don't know much about WebStart, but I have not seen it used for anything remotely serious. For packaging, distributing, installing and using software on a regular basis, and for basic configuration management, Python is great; I don't know the steps needed to package something using WebStart though.

Java has a package management system; it's called Maven. It's somewhat similar to pip/setuptools except that it works at _build_ _time_. And as a result you can generate Java WebStart packages, self-installing executables, etc.
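A dependency is declared once in the project's pom.xml and Maven resolves it (plus its transitive dependencies) from a repository at build time; a minimal sketch, with example coordinates:

    <!-- Minimal pom.xml sketch; the project and dependency coordinates are just examples. -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example</groupId>
      <artifactId>demo</artifactId>
      <version>1.0</version>
      <dependencies>
        <dependency>
          <groupId>commons-io</groupId>
          <artifactId>commons-io</artifactId>
          <version>2.4</version>
        </dependency>
      </dependencies>
    </project>

Running mvn package downloads the declared jars at build time, and the appropriate plugins can turn the result into a WebStart bundle or an installer.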

The greatest problem with Java really is its 'heavy' JVM. Sun missed the opportunity to make it lightweight, alas.

Apache's role in all of this

Posted Aug 22, 2012 10:13 UTC (Wed) by man_ls (guest, #15091) [Link]

So, both WebStart and Python require the user to download a runtime before using them. Unsurprisingly. Beyond that, Python is geared more toward professional users, for which I am grateful (since amateurs are not going to use WebStart or pip anyway).

Maven is not part of Java; it is an outside package. Which brings us to another of Java's treats: the big ecosystem of Apache software. But wait, why is such a big corpus of code necessary at all? To make up for Java's deficiencies, which means the 50MB+ JRE monstrosity is not even enough to provide a decent set of collections. Each useful third-party package you find out there is likely to depend on a few Apache libraries, meaning that you will soon be managing tens of .jar files. To put this into perspective, there are 39 Apache Commons projects, which are supposed to provide foundation libraries. Which is why Maven is needed. Whew!

Apache code is maintained by the ASF, which is at odds with Oracle. I guess the worst problem here is that Java is controlled by Oracle, a hostile corporation. Compare with Python, Perl, C or even JavaScript (which seems to be in the hands of a handful of browser vendors, but at least they cooperate).

Apache's role in all of this

Posted Aug 22, 2012 15:38 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

>So, both WebStart and Python require the user to download it before using it. Unsurprisingly. From there Python is more geared to professional users, for which I am grateful (since amateurs are not going to use WebStart or pip anyway).

Actually, it is possible to install Java using browsers' automatic plugin detection.

>To provide for Java deficiencies, which means the 50MB+ JRE monstrosity is not even enough to have a decent set of collections.

Please, stop repeating nonsense. Commons Collections is a dead project; it hasn't been necessary for a loooooooong time. The Sun JVM in fact has one of the best collection libraries, including parallel and non-blocking collections which you'd be hard-pressed to find in other languages.

You might also actually browse the list of Apache Commons libraries. About half of them are thin wrappers over other libraries, long-dead projects, or both.

Apache's role in all of this

Posted Aug 22, 2012 16:12 UTC (Wed) by man_ls (guest, #15091) [Link]

I was not repeating anything, it is just my own recollection. If the situation has improved nowadays, good for them!

It runs everywhere!

Posted Aug 21, 2012 15:44 UTC (Tue) by pboddie (guest, #50784) [Link]

Sun missed the boat with regard to tooling: back in the early days of Java, had Sun made javac just that little bit more powerful, it could have solved many build needs there and then. Instead, people probably used Makefiles for a while before Apache Ant came along.

And the best retort to "install a JRE" is, of course, "which one?" I have to run stuff which only works with Sun's Java - it can't be anything like IcedTea, even though IcedTea most probably implements the breadth of the required functionality - and so the practice of industry hacks targeting a single, narrow, effectively proprietary platform continues, while everyone claims to be using open standards.

Oh, and those industry hacks will all be working in a cathedral-style project, up to their necks in dodgy code, with "security through obscurity" as one of the project value statements.

But I agree with you that high-level languages with managed environments can and should provide significant simplification over systems programming languages like C and C++. The emergence of stuff like Maven indicates that there's plenty of complexity remaining, however.

It runs everywhere!

Posted Aug 21, 2012 16:16 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

> And the best retort to "install a JRE" is, of course, "which one?"
And the best retort to it is, of course, "the most recent one". Sun JVM is very backwards-compatible.

It runs everywhere!

Posted Aug 21, 2012 16:39 UTC (Tue) by man_ls (guest, #15091) [Link]

Sun JVM is no more; Oracle JVM comes with a very annoying binary license:
Unless enforcement is prohibited by applicable law, you may not modify, decompile, or reverse engineer Software.
Remember where we are writing: we may be forgiven for disliking these restrictions.

Besides, Oracle only distributes JVMs for a very limited set of eight operating system and processor architecture combinations: Linux x86, Linux x64, Mac OS X, Solaris x86, Solaris SPARC, Solaris x64, Windows x86, Windows x64. Considering that ARM systems probably outnumber all of these combined, it is a serious limitation.

In all this discussion I haven't even brought out the most annoying point of Java development: its "native" graphical toolkits AWT and Swing, which are not integrated with any environments and look alien everywhere. Luckily there is SWT, but it does not precisely simplify the configuration management problem.

It runs everywhere!

Posted Aug 21, 2012 22:52 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

Just use OpenJDK. It's fully functional for 99% of applications.

It runs everywhere!

Posted Aug 22, 2012 9:21 UTC (Wed) by cortana (subscriber, #24596) [Link]

Or fully functional for 99% _of_ an application. Like Wine. :)

It runs everywhere!

Posted Aug 22, 2012 14:54 UTC (Wed) by pboddie (guest, #50784) [Link]

Actually, I was recently impressed with Wine running a fairly old, but still 3D-based, game that I thought would be a migration show-stopper. Even the fancy joystick controller worked as intended.

Meanwhile, with Java runtime environments, if you don't want to run the Sun JRE or just can't, then even though OpenJDK should do the job because of its common heritage and proximity to the Sun JRE, that's no guarantee it will. As I noted, people in certain industries are great at touting the supposed openness of a technology while deploying something that only works with one vendor's specific product.

It also doesn't inspire confidence that Oracle as a vendor, ignoring the reputation currently enjoyed by the company after its recent behaviour, still lives on pre-Internet time with respect to security updates. That said, I do wonder how much attention any of the JREs get in comparison to widely-used open source projects.

It runs everywhere!

Posted Aug 22, 2012 15:07 UTC (Wed) by cortana (subscriber, #24596) [Link]

Unlike with most Java programs, which are only tested on the Sun^WOracle JRE, at least there is a large community of users who try out Windows programs on Wine, and file bugs/send patches to get things fixed. :)

It runs everywhere!

Posted Aug 22, 2012 1:07 UTC (Wed) by pflugstad (subscriber, #224) [Link]

FWIW, Sun/Oracle JVM is available on ARM (and PowerPC), but it's only free for trial use:

<http://www.oracle.com/technetwork/java/embedded/downloads...>

I've had a lot better luck with those than I have with the various IcedTea incantations on ARM/Linux.

It runs everywhere!

Posted Aug 22, 2012 8:48 UTC (Wed) by nim-nim (subscriber, #34454) [Link]

Java didn't bring any sanity to the table, nor did it solve any deployment problem.

SUN used its 'rewrite everything in Java' mantra to create an ecosystem with no legacy code to manage, and apps so balkanized they could never grow to a full system, and then exploited this situation to avoid thinking about deployment.

In fact, it promoted bad engineering for years to postpone tackling the deployment problem as long as possible, and the attitude got so pervasive it infected Sun's Java competitors: don't reuse code, copy it; don't propagate fixes, fork libs; have everyone hardcode classpath dependency chains; use Maven to download specific binary artifacts to avoid thinking about code lifecycle; use OSGi to pile up jar files without having to sort them. The only successful commercial environment is the application server, with dedicated sysadmins to sort out the mess and the app server itself trying to salvage deployment because core Java is unable to do it.

As a result, more than a decade later, now that enough Java code has accumulated that Sun/Oracle finally has to tackle deployment, they have hit a hard wall of technical debt, and the modularization of Java keeps getting postponed from release to release (it will be in Java 7; oops, let's split Java 7 into 7 and 8 with the modularization in 8; oops, 8 will ship without it, wait for 9 instead; and so on).

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 13:07 UTC (Tue) by xan (guest, #58606) [Link]

> Perhaps it's just venting by somebody who got left behind, but perhaps he has a point: are we too focused on the accumulation of features at the expense of the design of the system as a whole?

I'll quote this in the next (or the next 50) GNOME flamefests.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 14:14 UTC (Tue) by dgm (subscriber, #49227) [Link]

I will be right after you to point out that the quote is grossly out of context.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 14:20 UTC (Tue) by drag (subscriber, #31333) [Link]

I thought people got pissed off about Gnome _removing_ features and caring too much about design.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 18:01 UTC (Tue) by hummassa (subscriber, #307) [Link]

Gnome removes things not because they care about design, but because they want new shiny things and just eliminate the code for the old, working things.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 22:29 UTC (Tue) by Company (guest, #57006) [Link]

That assumes GNOME actually wants something.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 10:38 UTC (Wed) by dgm (subscriber, #49227) [Link]

Don't anthropomorphize GNOME, it hates it.

(Sorry, couldn't resist).

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 13:15 UTC (Thu) by pboddie (guest, #50784) [Link]

Don't speak on GNOME's behalf, either! It hates that almost as much as being anthropomorphized!

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 22:33 UTC (Tue) by bojan (subscriber, #14302) [Link]

The "design" of Gnome 3 is based on some misguided notion that there is a "philosophy" a desktop should follow. Creating a good desktop is essentially a utilitarian problem. When things become (measurably) more cumbersome, that is a design failure, no matter how much or how little one cares about the design. Similarly, when well established and understood metaphors are replaced with half baked hacks that have surprising effects, that is another design failure.

As for removing features, that is OK, as long as the functionality can be obtained in some other way which is equal to or better than the existing one. In the case of the removal of the type-ahead feature in Nautilus, this is certainly not the case. Ergo, people complained.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 8:20 UTC (Wed) by drag (subscriber, #31333) [Link]

The original poster seemed to think that taking a quote from a blog article complaining about how Linux/free software-type folks lack direction and discipline, and favor features willy-nilly, would be useful for insulting GNOME developers in future discussions.

I am just pointing out that people tend to piss and moan because of features being removed and Gnome caring too much about their design, which is quite the opposite.

How much you appreciate the design philosophy that you believe the GNOME project has is not really pertinent to my comment.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 23:05 UTC (Wed) by bojan (subscriber, #14302) [Link]

What I was trying to point out is that I think both you and the original poster have it wrong. Legitimate complaints about Gnome (3) are based on neither the amount of caring for "grand design" nor feature creep/removal. It is far more mundane than that. As I said, it is essentially a utilitarian problem.

People were able to do certain things the old way. They find that in the new system they either cannot do them or that they are more cumbersome. That's about it. They couldn't give a toss about the rest.

So, the pertinence of my comment in relation to Gnome design is in the fact that that design is concerning itself with irrelevancies (i.e. the philosophy), so it cannot possibly be the source of legitimate complaints.

Kamp: A Generation Lost in the Bazaar

Posted Aug 23, 2012 17:46 UTC (Thu) by jedidiah (guest, #20319) [Link]

The problem with GNOME3 is not their vision but that they didn't leave the old version in place while they created it. You should be able to upgrade any distro from GNOME2 to GNOME3 without seeing any differences. Your old interfaces should remain intact and in place.

That doesn't happen with something like Ubuntu 12.04.

They didn't just make something new. They trashed the old stuff while they were at it. They made new forks necessary just by refusing to leave the old stuff alone.

Kamp: A Generation Lost in the Bazaar

Posted Aug 24, 2012 1:56 UTC (Fri) by bojan (subscriber, #14302) [Link]

Exactly my point. Thank you.

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 17:34 UTC (Tue) by renox (subscriber, #23785) [Link]

I don't think that the cathedral vs. bazaar part is interesting (even if I cannot help thinking that Plan 9 never got a good desktop; so much for cathedral development). Some of his concrete points are interesting, though, and some not so much:

1) lack of portability to FreeBSD:
==> why aren't those patches upstream?

2) the software in the FreeBSD ports collection contains at least 1,342 copied and pasted cryptographic algorithms.
==> if true, given the security aspect, that is a big issue!

3) libtiff is used by one library even though Firefox cannot render TIFF images
==> most probably someone made a mistake in the dependencies, or there is a bug/dead code in FF; a very minor issue.

4) one package needs both Perl and Python
==> an even more minor issue

5) you will find that you need three different versions of the make program, a macroprocessor, an assembler, and many other interesting packages.
==> code reuse and dev freedom have drawbacks..

6) libtool, which tries to hide the fact that there is no standardized way to build a shared library in Unix. Instead of standardizing how to do that across all Unixen—something that would take just a single flag to the ld(1) command
==> That is pure BS; what about existing systems? (A sketch of the single-flag case follows below the list.)

7) he hates autoconf
==> the implementation is bad (I think everybody agrees on that), but some here prefer autotools to the alternatives (CMake, for example), so what this means is that we still don't have "very good" build tools; a sad situation indeed.

So (2) is a big issue, (1) is an issue too, and (7) is bad as well.
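As for point 6, on a modern Linux/ELF system the whole business really is close to a single flag (file names made up):

    cc -fPIC -c foo.c              # position-independent object code
    cc -shared -o libfoo.so foo.o  # the "single flag" that produces a shared library

The "existing systems" objection is that AIX, HP-UX and friends never worked that way, which is largely why libtool exists.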

Kamp: A Generation Lost in the Bazaar

Posted Aug 21, 2012 20:59 UTC (Tue) by tterribe (✭ supporter ✭, #66972) [Link]

> 3) libtiff is used by one library even though Firefox cannot render TIFF
> images
> ==> someone made a mistake in the dependencies (most probably) or there is
> a bug/dead code in FF, a very minor issue.

No, libtiff is a dependency of GTK+, but of course Firefox does not rely on GTK's image loading routines to determine what formats it supports in web pages, because it does not use GTK on all platforms.

Kamp: A Generation Lost in the Bazaar

Posted Aug 22, 2012 6:34 UTC (Wed) by renox (subscriber, #23785) [Link]

> of course Firefox does not rely on GTK's image loading routines to determine what formats it supports in web pages, because it does not use GTK on all platforms.

'of course'? That's not so obvious: Firefox displays H.264 when the OS supports that format; the same policy could be used for TIFF images.


Copyright © 2012, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds