
Poettering: systemd for Administrators, Part XII

The twelfth installment of systemd for administrators covers securing services. "In this iteration of the series we want to focus on a couple of these security features of systemd and how to make use of them in your services. These features take advantage of a couple of Linux-specific technologies that have been available in the kernel for a long time, but never have been exposed in a widely usable fashion. These systemd features have been designed to be as easy to use as possible, in order to make them attractive to administrators and upstream developers..."

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 0:35 UTC (Sat) by Rudd-O (guest, #61155) [Link]

I will save everybody here some time by posting a generic rant aimed at preempting the typical rants people usually post when Lennart Poettering makes news:

RANT
RANT
RANT
RANT

OK. Done. Now you can proceed with your lives without having to bother ranting about Lennart, while the rest of us who derive much enjoyment from his work every day can continue to be unmolested by Lennart haters.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 0:39 UTC (Sat) by HelloWorld (guest, #56129) [Link]

Let's not kid ourselves, that never worked.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 2:37 UTC (Sat) by elanthis (guest, #6227) [Link]

I do find it funny. I'm probably one of the biggest ex-Linux-proponent-turned-detractors who still tries to stay up-to-date with and active in the Linux scene, yet I think Lennart is one of the better minds in the Linux community. His drinking of the GNOME 3 kool-aid is unfortunate, but without his many (oddly) controversial projects, Linux wouldn't even have a remote chance of competing with real desktop OSes. I can't fathom why all the fanboy nerds who want to believe that Linux is a major force outside the LAMP and HPC spaces hate Lennart's work so much; without him, the gap between Windows 7 and Linux would be so large as to be insurmountable. He and folks like Kay, Michael, Peter, and the other co-authors of the "Lennart projects" are practically the only people even trying to keep Linux technologically relevant to modern computer users' needs.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 3:35 UTC (Sat) by martin.langhoff (subscriber, #61417) [Link]

Agreed. The platform needs to move forward with changes like these. There is disruption ahead; if we don't take it, or we mess up the execution, the Linux platform will quickly become even less relevant for non-server usage.

It is interesting though. The server space -- LAMP, routers, cloudy stuff and all its variations -- moves at a slower pace, and packs a lot less disruption. On the !server side, OTOH, the churn is relentless; and getting crazier now with all the different form factors and wider variation in hw.

This reminds me of the legendary evergreen paranoia of billg, who couldn't believe that the DoJ would consider MS a monopoly -- his view was that with the incessant churn and competition, the market leader was a blink away from losing any market advantage. Once you spend enough time close to the churn of trying to be a widely used end-user platform, with quickly changing expectations, the mindset becomes... understandable.

(Here I am thinking about desktops, laptops, netbooks, tablets, mini-tablets, phones, ebook readers and whatever other variants people dream up next week.)

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 13:34 UTC (Sat) by Wol (guest, #4433) [Link]

Actually, I think you'll find that churn is a deliberate tactic by MS; it WASN'T a major factor before Win95.

Okay, there was a bit of hardware churn - VisiCalc never transitioned from Apple to PC, etc. etc. - but that's because they didn't watch the ball.

From Win95 on, Microsoft is both actively churning, AND hiding the ball. The churn in the server space is a lot slower because MS is finding it a lot harder to hide the ball.

Cheers,
Wol

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 15:58 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

Not really, IMHO.

It was mostly caused by the explosion in PCs. Before the '90s there was similar churn among the old 8-bit/16-bit computers (ZX Spectrum, Commodore 64, ...).

During the '90s it transitioned to the PC market, with lots and lots of manufacturers building new hardware. MS has actually been pretty good at providing reasonable support for hardware developers and outstanding backwards compatibility for drivers. One can take an NT4 driver from '95 and it'll probably work on Win8 with only small modifications.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 13:08 UTC (Sat) by farnz (subscriber, #17727) [Link]

I think Lennart comes in for a lot of criticism for two reasons:

  1. He doesn't work around other people's bugs - he reports them, and helps upstream fix them. In the short term, this can result in his software being apparently horrendous (I'm thinking PulseAudio here), as it exposes huge numbers of upstream bugs that other people had worked around.
  2. Very little of what he does is new; nothing that PulseAudio, systemd, or indeed Avahi does was something you simply could not do at all before his projects existed. Instead, he takes things that a competent geek could rig up, given 2 to 3 days' work, and makes them manageable for everyone, often fixing hidden gotchas in the process.

The second point takes a little expanding - you could get the user-visible effects of PulseAudio with ALSA's dmix. You can start a system running with upstart. You can get names into DNS with dynamic DNS and DHCP integration. None of the problems Lennart's projects set out to solve are completely unsolved problems when he starts - he "just" moves from a 10% solution that's good enough for enthusiasts, to a 99% solution that's good enough for almost everyone.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 16:36 UTC (Sat) by HelloWorld (guest, #56129) [Link]

> The second point takes a little expanding - you could get the user-visible effects of PulseAudio with ALSA's dmix.
No you couldn't. dmix simply doesn't do what PulseAudio does: per-application volume control and transferring streams from one device to another. And it's the same with systemd, socket activation simply wasn't possible before, because there was no convention about how to pass an existing socket file descriptor to a newly started daemon.
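
For readers unfamiliar with the mechanism: the convention systemd introduced is that inherited sockets start at file descriptor 3 and are advertised through the LISTEN_PID/LISTEN_FDS environment variables (documented in sd_listen_fds(3)). A minimal sketch of the receiving side, in Python rather than the C helper library:

```python
# Sketch of the systemd socket-activation convention. systemd passes any
# inherited sockets starting at fd 3 and describes them via two environment
# variables; before this convention existed, there was no agreed way for an
# init system to hand a live socket to a daemon it starts.
import os
import socket

SD_LISTEN_FDS_START = 3  # first inherited fd, per the sd_listen_fds(3) convention

def listen_fds():
    """Return sockets handed over by the init system, or [] if started normally."""
    # LISTEN_PID names the intended recipient, so the fds aren't
    # accidentally consumed by some unrelated child process.
    if os.environ.get("LISTEN_PID") != str(os.getpid()):
        return []
    n = int(os.environ.get("LISTEN_FDS", "0"))
    return [socket.socket(fileno=SD_LISTEN_FDS_START + i) for i in range(n)]

# A socket-activated daemon would then do something like:
#   socks = listen_fds()
#   sock = socks[0] if socks else bind_our_own_socket()
```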

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 20:30 UTC (Sat) by farnz (subscriber, #17727) [Link]

Which just goes to show how hard it was to do before PulseAudio. I did per application volume control with a nasty hack in asound.conf to set up a default PCM with a softvol control named using an environment variable. Not trivial, the way it is in PulseAudio, but not impossible either. Similarly, using some nasty hacks with the copy and route plugins, you could make ALSA redirect streams between devices. Again, not nearly as easy as it is in PulseAudio, but it works.
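
The kind of asound.conf fragment being described looks roughly like this (a sketch only; the control name is illustrative, and the actual hack picked it from an environment variable per application):

```
# Hypothetical ~/.asoundrc / asound.conf fragment: route the default PCM
# through a softvol plugin layered on dmix, so each distinctly named
# control gets its own volume slider in the mixer.
pcm.!default {
    type softvol
    slave.pcm "dmix"             # share the device between applications
    control.name "App Playback"  # per-application: use a distinct name
    control.card 0
}
```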

Socket activation is explicitly modelled on the inetd way of working (where a socket is passed to a newly started daemon) - again, not made as simple as it is by systemd, but quite possible before.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 10:21 UTC (Sun) by hongli (guest, #75254) [Link]

It's true that none of the things he does are strictly things that weren't possible before, but I argue that they're still better and therefore worth doing.

For example, the old init system uses shell scripts all over the place. The problem with shell scripts is that *a lot* of operations require a fork() and an exec(), which makes things less efficient than they should be. Reading the contents of a PID file? Exec cat. Sending a signal to a process? Exec kill. Each of those commands in turn executes a lot of redundant code: loading glibc, allocating memory, etc. A lot of people respond to this by saying "who cares, I boot my system and never shut it down". I'm baffled by their apparent ignorance of (or indifference to) desktops, laptops and mobile devices.
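
The fork()/exec() cost can be made concrete with a small sketch (file names are hypothetical; both snippets signal a daemon whose PID is stored in a pid file, the first spawning extra processes, the second using only bash builtins):

```shell
#!/usr/bin/env bash
sleep 60 &                   # stand-in for a daemon
echo "$!" > /tmp/demo.pid    # init scripts commonly record the PID this way

# Classic init-script style: forks a subshell and execs /bin/cat
# just to read one line.
kill -0 "$(cat /tmp/demo.pid)" && echo "classic: alive"

# Builtin style: 'read' plus a redirection, and bash's builtin 'kill'.
# No extra processes at all.
read -r pid < /tmp/demo.pid
kill -0 "$pid" && echo "builtin: alive"

kill "$pid"                  # clean up the stand-in daemon
rm -f /tmp/demo.pid
```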

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 13:37 UTC (Sun) by farnz (subscriber, #17727) [Link]

Absolutely - I can't think of a Poettering project that isn't a net improvement on what came before. But that's the reason Poettering comes in for criticism; he's not doing stuff that couldn't be done before his project started, so when you hit one of the bugs (either in his code, or merely exposed by his code), it's all too easy to go "Poettering sucks! systemd/avahi/PulseAudio/ifplugd/project-of-the-day doesn't work nearly as well as sysvinit/dynamic-dns/asound.conf-of-doom/ethtool-based-hacky-shell-script/thing-that-did-this-before because its bugs make it completely unusable!".

It doesn't matter that once the bugs are fixed, the project is a net improvement on what came before; it doesn't even matter that there are already improvements if you don't trip over the bugs. What the complaining users see is "it worked before, now it doesn't. Must be Poettering's fault, as that's the last thing I changed". Heck, to the complaining users, it doesn't even matter that they've seen a pile of other small improvements from Poettering's code; they've hit a bug, and all they can see is "it worked before PoetteringProject, it doesn't work now".

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 18:39 UTC (Sun) by hongli (guest, #75254) [Link]

I think the problem is that Lennart doesn't do enough *reputation management*. His choices are sound in a perfect world, but in the real world systems are not perfect, have bugs, don't completely conform to specifications, and are used by people - creatures who are known to be boundedly rational. I think his mistake was that he didn't do enough to educate people about the strictness of his software, or enough to prevent breakage. I'm not saying it's his fault, but it is his problem. I've never used systemd, but I keep hearing from people that it breaks the boot process. Lennart could have improved the systemd error messages so that users know why things break and how to fix them, or he could provide some kind of validation tool so that users can verify that the system will boot correctly prior to switching to systemd, or he could introduce some kind of "non-strict mode" that's off by default but that users can enable. All of this would have reduced the likelihood that people blame him for all the problems.

I try to build as much "reputation management code" into my software as possible so that people don't blame me for problems I'm not responsible for. For example, one of the pieces of software I write is a web application server. When something goes wrong, my software tries very, very, very hard to explain to the user what went wrong, why it went wrong, where it went wrong, and how to fix it. To further reduce user annoyance, the message is displayed in an aesthetically pleasing way. It's all psychology, you see - if people see a beautiful error message page then they're less likely to be annoyed, or to feel anxious and blame all the problems on my software. My software also actively works around broken environments. Some people set their ulimit stack size to 80 MB by mistake. Because my software is heavily multithreaded, they would exhaust their virtual memory if they launched my software, or they would wrongly conclude that my software is a memory hog because the VSIZE is large. To prevent that kind of thing, I force a thread stack size of 128 KB in my software.
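
The stack-size trick is easy to sketch (illustrative code, not the actual server; the 128 KB figure comes from the comment above):

```python
# A heavily threaded server that relied on the inherited `ulimit -s` would
# reserve that much virtual memory per thread -- 80 MB x N threads looks
# like a memory hog in VSIZE. Forcing a small explicit stack size caps the
# per-thread reservation regardless of the rlimit. (Some platforms require
# a larger minimum than this; check threading.stack_size docs.)
import threading

threading.stack_size(128 * 1024)  # must be called before the threads are spawned

def handle_request():
    pass  # stand-in for a request-handling thread body

workers = [threading.Thread(target=handle_request) for _ in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()
```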

I realize that this sounds like a huge pain in the ass and that from a technical point of view you shouldn't need all these things. But we're working with people here so I think these things are very very important.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 19:38 UTC (Sun) by obi (guest, #5784) [Link]

Well, one could argue that one of the reasons Lennart has been getting less pushback on systemd than on PulseAudio is that he's been documenting it more, and has provided an elaborate rationale in his many blog posts. This is despite the fact that systemd is much more central to the system than PulseAudio.

I have to give it to him: I'm not sure many people would have the stomach to wade in and improve/change/fix things that sysadmins have been doing for decades, and to keep at it in spite of the incessant flaming. When I first heard about systemd or the /run transition, I thought "sounds fantastic, but how is he going to get people to agree?". Well, in these cases it seems people are mostly agreeing - surprisingly enough.

As an aside: you should really give systemd a shot, it's great; I'm not running it on my servers (yet), but on my laptops/desktops it's a great improvement.

And thanks for Passenger - I've been using it for quite a while now, and it's indeed a great piece of software!

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 9:29 UTC (Mon) by farnz (subscriber, #17727) [Link]

If my experiences are typical, part of the reason for less noise than you might expect is that the people who make lots of noise are generally not the people who make the decisions; a big clue is when someone says "[projectname] is about choice" - those people tend to make noise rather than help get things done. As long as the distro guys who actually make decisions don't push code before it's ready for prime time, the noisemakers don't see the code in unfinished buggy state.

With PulseAudio, it got pushed as a default to users when it wasn't quite stable enough to be useful. The noisy people got compelled to use it even though it didn't always work perfectly, and thus generated lots of noise.

With systemd, they deliberately delayed pushing it as a default (it was supposed to land for Fedora 14, but missed), because it wasn't quite ready. By the time systemd was pushed as a default, it was actually quite good (and getting better all the time); as a result, the noisy people didn't have a huge amount to complain about.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 17:18 UTC (Sat) by tetromino (subscriber, #33846) [Link]

The real problem with #2 is that there are people for whom the old-fashioned 10% solution works well enough, but the Poettering solution fails disastrously (e.g. booting with systemd hard-locks a machine—I personally experienced this), and the barrier to hacking the Poettering solution (reading and modifying system-level C code inherently takes more mental effort than doing the same for shell scripts) is sufficiently high that the user never bothers with the Poettering solution again, and goes back to the traditional solution, until some upstream project (Gnome) forces him to install Poettering code, once again breaking his system…

All that said, I agree with Poettering's goals, and that the traditional Unixy way of doing things is often hardly ideal. I just wish the code he put out were less fragile on non-standard setups.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 18:19 UTC (Sat) by HelloWorld (guest, #56129) [Link]

> All that said, I agree with Poettering's goals, and that the traditional Unixy way of doing things is often hardly ideal. I just wish the code he put out were less fragile on non-standard setups.
What makes you think that it's actually Poettering's code that is fragile, rather than something else involved? When I switched to systemd, my system also didn't boot properly. And guess what? It wasn't because of systemd, but because of broken LSB init script headers in some init script. sysvinit didn't actually care about those headers, while systemd did, and yet people whine about systemd. This is actually a perfect example of point no. 1: Lennart doesn't bother to work around other people's bugs, and rightly so.
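
The headers in question are the LSB comment block at the top of each init script, which systemd's SysV compatibility layer parses for dependency ordering (a sketch; service and facility names are illustrative):

```
### BEGIN INIT INFO
# Provides:          mydaemon
# Required-Start:    $network $remote_fs
# Required-Stop:     $network $remote_fs
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: Example daemon
### END INIT INFO
```

sysvinit ignores this block entirely, so a mistake here - say, two scripts whose headers effectively require each other - is invisible until something like systemd starts honoring it and finds a dependency cycle.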

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 18:33 UTC (Sat) by dlang (subscriber, #313) [Link]

And here we have a disconnect: if switching to a new tool doesn't work with the existing scripts/data, is that a problem with the new tool or with the scripts/data?

LP considers it a problem with the other stuff; admins tend to consider it a problem with the new tool.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 18:54 UTC (Sat) by HelloWorld (guest, #56129) [Link]

> if switching to a new tool doesn't work with the existing scripts/data, is this a problem with the new tool or with the scripts/data
The answer to that question, as I'm sure you can tell, is yes.

Jokes aside, I believe it very much depends on the specific case we're talking about. Yet, many people won't bother figuring out precisely what went wrong and prefer to blame some random person or program instead.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 20:00 UTC (Sat) by magfr (subscriber, #16052) [Link]

I think this is the failure of 'Be lenient in what you accept' striking once more.
You should be strict in both what you accept and in what you produce, otherwise someone out there will be sloppy and send crap that abuses your leniency.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 20:12 UTC (Sat) by dlang (subscriber, #313) [Link]

I agree that the technically correct answer is 'it depends', but the technically correct answer isn't what really matters.

For users there is a regression, things that used to work are no longer working. It doesn't really matter which component is doing something wrong, the end result is that the user can't work.

If the reason for this is "we opted to go with this new tool", then the cause for the regression is that new tool (even if the technical fault is that some other component doesn't quite comply with the specs, but in a way that never mattered before).

This also points out the difference between the official spec of an API and the practical spec: how much of the official spec you really need to implement to make things work (the de-facto spec). As a practical matter, things that don't need to be implemented 'properly' to make things work probably won't be. So if you later change things in such a way that they break when these previously 'optional' things aren't right, you need to think really hard about what value you gain by requiring them to be right, and what fallback options you can provide (either ignoring the broken info, or extracting what value you can from it) rather than breaking completely.

There are quite a few people doing infrastructure work for Linux who don't pay attention to this sort of thing, and it causes all sorts of problems for users. The case mentioned above, where script headers didn't matter before but break under systemd, is a perfect example of this, though the systemd developers are not the only offenders.

If you are really starting from scratch, with no installed base (like Android did), then you can just implement the new way of doing things without worrying about backwards compatibility, but if you are writing something that you hope to get added to an existing system (and this includes writing a new version of an existing system, android ICS, Gnome 3, KDE 4, systemd, etc), then you do have to deal with backwards compatibility and the de-facto standard.

Yes, there are times when you can decide to break the de-facto standard (not being willing to do so under any condition leads to a Windows-like mess), but you should be very reluctant to do so.

Poettering: systemd for Administrators, Part XII

Posted Jan 27, 2012 17:06 UTC (Fri) by jeremiah (subscriber, #1221) [Link]

I write and design a fair number of APIs. The habit I've tried to get into is to never release an API without first writing a reference implementation and then seeing what part of the designed API is actually needed. It's always a balancing act for me between writing the RI, using the RI (full test cases), and cleaning up the API. I don't know how many times I've put something out there that seemed complete and elegant at the time, and once we tried to use it, it was just a pile of char[]. I think people run into problems by depending on 1.0 APIs, not that they have any choice a lot of the time. APIs, file formats, etc. all seem to suffer from the standard write-the-code-then-rewrite-it-all problem. Things always seem to settle down after/during that third rewrite. <snark>Gnome 3 being the exception.</snark> At least writing software is fun, right?

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 19:27 UTC (Sun) by iabervon (subscriber, #722) [Link]

Actually, I think this is a failure of bulletproofing: an invalid-but-interpretable script or config file need not work as desired, but it shouldn't crash the system. Systemd should take responsibility for getting you to a state where you can tell what was invalid and what was wrong with it, and where you can try to fix it. The real need is not to be lenient in what you accept, but to be defensive about what you accept.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 19:41 UTC (Sun) by HelloWorld (guest, #56129) [Link]

When I said my system didn't boot properly, I didn't mean it didn't boot *at all*. Due to the broken init script headers, there was a dependency cycle among a few services, which led to some services not being started (notably dbus). The system was still usable enough to figure out what the problem was and fix it.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 18:54 UTC (Sat) by tetromino (subscriber, #33846) [Link]

> What makes you think that it's actually Poettering's code that is fragile, rather than something else involved?

Testing. I managed to narrow the problem down to /etc/fstab, which I am quite certain was valid; replacing it with the bare default, without encrypted swap and without all the bind mounts that were needed for multiple development chroots, made systemd stop hard-locking the machine a minute after booting (instead, it failed at shutdown, but that's another story). I tried to read systemd source to figure out what could be causing the problem, gave up, and went back to openrc, which Just Works™, and boots noticeably faster than systemd as a nice bonus.
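
For reference, the kind of /etc/fstab entries involved look roughly like this (a hypothetical sketch; paths and device names are illustrative):

```
# Bind mounts shared into development chroots
/home/dev/src       /srv/chroot/f16/src   none   bind   0 0
/home/dev/pkgs      /srv/chroot/f16/pkgs  none   bind   0 0
# Encrypted swap, set up via a dm-crypt mapping
/dev/mapper/cryptswap  none               swap   sw     0 0
```

Entries like these are perfectly valid, just rarer than a stock install's, which is presumably why they exercised less-tested code paths.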

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 19:38 UTC (Sun) by iabervon (subscriber, #722) [Link]

Hard-locking the machine a minute after boot when /etc/fstab has certain valid but unusual things in it kind of sounds to me like a kernel bug between cgroups and clever VFS operations. I suppose the other possibility is that systemd segfaults, causing the kernel to panic (since it doesn't like killing init). But the most likely thing is probably that systemd set up the filesystem as you specified and used the features the kernel advertised (that nothing used before on your system), and the kernel broke.

Poettering: systemd for Administrators, Part XII

Posted Jan 24, 2012 22:12 UTC (Tue) by anselm (subscriber, #2796) [Link]

None of the problems Lennart's projects set out to solve are completely unsolved problems when he starts - he "just" moves from a 10% solution that's good enough for enthusiasts, to a 99% solution that's good enough for almost everyone.

That would also be a pretty fair description of what Steve Jobs did in his life. People like to diss Lennart Poettering while Steve Jobs is basically on the fast track to sainthood. It's a strange world.

Poettering: systemd for Administrators, Part XII

Posted Jan 26, 2012 8:15 UTC (Thu) by dlang (subscriber, #313) [Link]

> That would also be a pretty fair description of what Steve Jobs did in his life. People like to diss Lennart Poettering while Steve Jobs is basically on the fast track to sainthood. It's a strange world.

Actually, I suspect that the overlap between the people who want to canonize Jobs and the people who have even heard of Lennart is very small.

It's telling that a lot of the criticism of Lennart is that he is blindly copying things from Windows or OS X. That hardly sounds like the same people who are Apple fans.

Poettering: systemd for Administrators, Part XII

Posted Jan 26, 2012 9:06 UTC (Thu) by anselm (subscriber, #2796) [Link]

It's telling that a lot of the criticism of Lennart is that he is blindly copying things from Windows or OS X.

I don't think the idea behind systemd (for example) is to make Linux more like OS X – systemd takes some inspiration from OS X because launchd is fundamentally a reasonable concept, but I wouldn't describe systemd as »blindly copied«. Actually, it seems to me that quite a considerable amount of independent creativity went into making the idea work well on Linux. So this »criticism« does not appear to be grounded in fact – it sounds like propaganda.

Anyway, what else did St. Steve do in his life but copy stuff from others and refine it?

Poettering: systemd for Administrators, Part XII

Posted Jan 26, 2012 9:17 UTC (Thu) by dlang (subscriber, #313) [Link]

Note that what I was replying to seemed to imply that people are being hypocritical in criticizing Lennart, because the same people praise Steve Jobs and the two are doing the same thing.

I agree they are doing the same thing, but I disagree that the same people are criticizing Lennart and praising Jobs.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 18:35 UTC (Sat) by Zack (guest, #37335) [Link]

"competing with *real* desktop OSes" .. ?

"the fanboy nerds" .. ?

That makes it sound like you're "one of the biggest" alright.

Excuse me for not taking your comment as particularly serious or unbiased.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 20:21 UTC (Sat) by elanthis (guest, #6227) [Link]

Like I said, I think Linux is crap on the desktop, and I'm not afraid to admit that. I used Linux exclusively for 10 years. Did the whole "everyone should use Linux, it's Free, and does everything a home user would need" song and dance. Pushed for it in the organizations I worked in, and even saw large scale desktop Linux rollouts happen as a result. Got a lot of friends and family to use it for a time. End of the story is that those all ended disastrously because of a constant stream of stupid obvious bugs, horrible UI design, and outright missing critical features. I _know_ Linux was and still is a complete and utter failure on the desktop, and I say that based on a hell of a lot more experience than "lolz I use Linux and got my equally boring spouse to use it and its so much better than Win98 was hurr-derp".

I'm seriously starting to believe that FOSS just doesn't work for desktop software in the general case, with things like Firefox being a rare jewel of an extraordinary exception rather than a model for what all FOSS can ever hope to be. Desktop software requires a QA approach that a mob of disorganized hobbyist individuals just doesn't seem to be capable of, for reasons that make a lot of logical sense (QA is soul-sucking and boring as hell, and everybody I know who does it hates it with a passion). Yet as much as QA is something that nobody wants to do, Microsoft has more QA engineers (i.e. SDETs) in a single sub-department of their XBL team than there are contributors to the entirety of GTK, GNOME, Xorg, and Mesa. The quality difference is, quite simply, intractably immense.

Poettering: systemd for Administrators, Part XII

Posted Jan 21, 2012 23:59 UTC (Sat) by dskoll (subscriber, #1630) [Link]

I think Linux is crap on the desktop

You're entitled to think whatever you like, but that doesn't make it anything more than your opinion. My opinion is diametrically opposite to yours; I use and like Linux on all my desktops and I feel completely disconcerted and lost the odd time I have to use Windows.

Listening to people spout "Linux is crap on the desktop" gets boring pretty quickly.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 1:17 UTC (Sun) by drag (subscriber, #31333) [Link]

hehe. It's a matter of perspective.

For my purposes the Linux desktop is undoubtedly superior in many very significant ways. I have no desire to use OS X and Windows desktop is extremely frustrating on many levels.

For the 'average person', not so much.

It's very frustrating sometimes.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 3:01 UTC (Sun) by dskoll (subscriber, #1630) [Link]

It's also a matter of what you're used to. My parents and kids all use Linux on the desktop. They have never used anything else, so they're quite comfortable with it (and also pretty uncomfortable with Windows when they have to use it.)

Maybe if they'd started out with Windows, Linux would have seemed strange and uncomfortable.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 4:53 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link]

Well, Linux usage share on the desktop seems to support the argument that Linux is not that great for desktop users.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 5:32 UTC (Sun) by sfeam (subscriber, #2841) [Link]

It would only support the argument if the usage share were low among the desktop users who have tried it. While that might be true, I haven't seen any evidence presented to prove it.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 17:21 UTC (Sun) by raven667 (subscriber, #5198) [Link]

While of course this is an anecdote and not evidence, in my experience most of the Linux desktop users in the technical services fields have switched to Mac OS X within the last 5 years. The point is that I know a lot of people who not only tried Linux but used Linux desktops for years, and who switched away because they were tired of fiddling with an unreliable minority system and just wanted something where all the features - sound, graphics, networking, power management - would continue to work from day to day.

Poettering: systemd for Administrators, Part XII

Posted Jan 28, 2012 0:25 UTC (Sat) by zlynx (subscriber, #2285) [Link]

Yep. I am one of those people. I used Linux exclusively on my laptop from 2004 to 2008. Then I went to using OS X on a Macbook Pro.

In OS X it sure was nice that my WiFi always worked and always instantly reconnected after suspend resume. It was also very nice that suspend always suspended and resume always resumed. It was great having OpenGL that was both fast and reliable. It was amazing to have OS updates that didn't break anything.

In 2011 I went back to Linux. Fedora 15 (16 now), on a new PC laptop. Because it happens to use all Intel chips, it works pretty well. It still has a few big annoyances, like the fact that WiFi takes over 20 seconds to reconnect after resume. OpenGL is slow. Sometimes I have to unplug and replug the USB keyboard. Some Bluetooth mice just don't work (oddly Microsoft and Apple mice do). And the trackpad will sometimes completely freak out. And I need to use a few funky options on my kernel command line so it doesn't panic on reboot. And the update from F15 to F16 required me to hand-edit those same kernel options into the grub file. Oh, and I also had to manually fix a 32-bit required library that the update installed on the 64-bit system.

In a lot of ways OS X and Windows 7 are a much better desktop experience. Linux is mostly there, but has so many little nits.

Poettering: systemd for Administrators, Part XII

Posted Feb 15, 2012 18:52 UTC (Wed) by mfedyk (guest, #55303) [Link]

Excuse me, but the point of Fedora is to be bleeding edge and expose those nits. You need to use a distro that focuses more on stability and polish; openSUSE and Debian stable would probably be two of the more popular choices available.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 6:10 UTC (Sun) by tetromino (subscriber, #33846) [Link]

> Well, Linux usage share on the desktop seems to support the argument that Linux is not that great for desktop users.

Only if you assume that people make a conscious, rational choice of desktop operating system. And that assumption is hard to justify; for far too many categories of items (think of food, music, elected politicians, or romantic partners), people's choices tend to be semi-conscious and utterly irrational.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 4:25 UTC (Mon) by drag (subscriber, #31333) [Link]

The sole purpose of an operating system is to run applications.

That's it. The entire purpose of an operating system is to:

1. Make it easy to write applications.
2. Make it easy to run applications.

You do not have to be aware of this fact to use or judge operating systems either. When your applications do not run on your operating system, your operating system is worthless. When your applications do not run on your hardware, your hardware is worthless.

The Linux desktop had its chance with Linux netbooks. For quite a few months, Linux netbook systems were among the most popular items sold on places like Amazon.com and quite a few others. How many of the people that bought those systems actually went back and bought a second Linux system? I am sure that it is in the single digits.

Linux fails because:
1. It makes it harder to write applications, because you have to deal with distributions' BS before you can reach your audience, and there is no standardization.
2. It does not run any of the applications people want to run after decades of using Windows XP.

You can get all hand-wavy and start talking out of the side of your mouth about Microsoft being like McDonald's, people being too stupid for Linux, people just not having tried Linux yet, some grand conspiracy from Microsoft holding Linux down, etc. etc... all of it is true to some extent, but picking one thing and saying that it is _the_ problem really amounts to ignoring reality.

And it's not even things like Photoshop or video games. It's having the ability to check your email, run one of the hundreds of thousands of special-purpose applications written for this or that corporation, integrate into Active Directory, and a hundred other really PITA and mundane things that Windows does that Linux can't do or makes difficult. It's all the hundreds of thousands of schools teaching Microsoft stuff. It's the businesses training people in it, that depend on it, that are happy with it. It is the 'geek squads' and millions of people that make a living working with individuals and businesses to work around Windows' limitations.

It's not just the low-hanging fruit that is killing Linux; it's the long, long, long tail.

Incidentally, this is why you will never see ARM systems displace x86 desktops (even with Windows 8). Even if ARM manufacturers discover a significant market segment that the x86 folks are not exploiting correctly, it will take years to get to the point where the applications are up to speed, and by then Intel and friends would have competitive products to eliminate any desire to migrate in the first place.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 12:28 UTC (Mon) by michaeljt (subscriber, #39183) [Link]

> Linux fails because:
> 1. It makes it harder to write applications because you have to deal with distributions' BS before you can reach your audience [...]

(Slightly) off-topic, but I always find the contrast between the Unix philosophy of "do one thing and do it well" and the Linux distributions philosophy of "try to do everything people could ever need" rather interesting.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 20:20 UTC (Mon) by dlang (subscriber, #313) [Link]

the Unix philosophy is to have many tools that can be used together, each of which is designed to "do one thing well"

the Linux distros generally do a good job of packaging all these tools together so that users can "do everything they ever need"

there's no inherent conflict between these two pieces.

"desktop environments" tend to break these philosophies badly, at both ends.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 12:39 UTC (Mon) by dgm (subscriber, #49227) [Link]

> Linux fails because:
> 1. It makes it harder to write applications because you have to deal with distributions' BS before you can reach your audience and there is no standardization.
> 2. It does not run any of the applications people want to run after decades of using Windows XP.

That's in clear contradiction with OS X's success. Note you can write:
1. It makes it harder to write applications because you have to deal with distribution(Apple)'s BS before you can reach your audience.
2. It does not run any of the applications people want to run after decades of using Windows XP.

That leaves us with only "standardization". Can that alone be the cause of Linux's "failure"? I don't think so. If you want to know where the problems are, you have to dig a bit deeper. What are the true differences between the _commercially_ successful OSes (Windows, OS X, iOS)?

* comes with the hardware.
* has a powerful marketing department behind it that will pull all the tricks in the bag to make it desirable for you.
* has a vast amount of "entertainment" software available.
* you are expected to do nothing except maybe pay for it.

What success???

Posted Jan 23, 2012 15:54 UTC (Mon) by khim (subscriber, #9252) [Link]

That's in clear contradiction with OSX success.

Historically MacOS had a 3-4% market share. Today it has 6-7% (depending on who's measuring). And it looks like it achieved this "success" not because people switched from Windows, but because people bought slick hardware which included MacOS as the only option "out of the box".

An apples-to-apples comparison (Hackintosh vs pirated Windows vs Linux) shows that MacOS itself is not all that popular.

2. It does not run any of the applications people want to run after decades of using Windows XP.
It does: BootCamp works fine for rare cases (if you need to run Quicken a couple of times per year), or you can run it seamlessly if compatibility is important.
That leaves us with only "standardization". Can that alone be the cause of Linux's "failure"?

Absolutely. If you do offer a stable ABI, it's no guarantee of success (see Windows Phone); if you break the ABI, you lose (see Windows Phone 7). Linux breaks its ABI a few times per year.

What success???

Posted Jan 23, 2012 17:29 UTC (Mon) by dgm (subscriber, #49227) [Link]

> And it looks like it achieved this "success" not because people switched from Windows, but because because people have bought slick hardware which included MacOS as the only option "out of the box".

Thanks, you made my argument for me. ;-)

>> 2. It does not run any of the applications people want to run after decades of using Windows XP.

> It does: BootCamp works fine for rare cases (if you need to run Quicken a couple of times per year), or you can run it seamlessly if compatibility is important.

I have used Wine to great success many times. Also, having an XP virtual machine does wonders (hey, Win7 does that too!). So this is also not the problem.

>> That leaves us with only "standardization". Can that alone be the cause of Linux's "failure"?

> Absolutely. If you do offer stable ABI then it's not guarantee of success (see Windows Phone), if you break the ABI then you lost (see Windows Phone 7). Linux breaks ABI few times per year.

When was the last time you heard somebody say that some random program stopped working in Linux because of an unstable ABI? I haven't. Not a single time. Why? Because all the Linux users I know of use software that comes from their distro repositories.

What success???

Posted Jan 23, 2012 18:39 UTC (Mon) by paulj (subscriber, #341) [Link]

When was the last time you heard somebody say that some random program stopped working in Linux because of an unstable ABI? I haven't. Not a single time. Why? Because all the Linux users I know of use software that comes from their distro repositories.

Referencer is a neat little C++ GNOME app to manage BibTeX. The GNOME APIs it uses have been deprecated, along with the C++ bindings. Starting with Fedora 16, the -devel packages for the old GNOMEmm libraries no longer exist. Presumably there's some technical reason for that; I haven't investigated it yet. As a consequence, the "referencer" app no longer builds on Fedora 16, and they're considering whether to remove it from Fedora.

So at the moment I'm trying to fix up referencer to use the new APIs that replace the deprecated ones. This mostly involves replacing uses of the GNOME::VFS::Uri class with Gio::File - which offers almost-but-not-quite the same functionality (plus a superset of other stuff) - and one or two other classes that have changed from GNOME:: to Gtk:: something, but are otherwise essentially the same with respect to what referencer required of them, at least.

Ok, this example is from the API side, not ABI, but still..

Yesterday...

Posted Jan 23, 2012 19:31 UTC (Mon) by khim (subscriber, #9252) [Link]

What was the last time you heard somebody that some random program stopped working in Linux because unstable ABI?

Wrong question. Right question: when was the last time you heard that some random program is no longer available in a distribution repository? And the answer will be: yesterday (gnochm). And it's quite hard to find out how many developers just gave up, but there are some estimates. At this point Windows Phone 7 has fewer users than Linux - yet there are some 50,000 apps for it. Debian has less than half of that. Sure, applications in Debian are usually more serious than some random screensavers in the Android/iOS/WP7 markets - but this is exactly the problem: it's not that hard to develop a Linux program (Android or iOS APIs certainly look more alien than Linux APIs to a Windows developer), but try to make it available for the user... and you'll hit the wall.

Why? because all the Linux users I know of use software that comes from their distro repositories.

Yup. That means that for 99% of users Linux is absolutely useless. People complain that iOS is a "golden cage" because the only way to publish an iOS program is to ask Apple, but in comparison to Linux it's paradise: Apple will kick you out only if it decides to compete with you, while Linux distributions can kick you out for a bazillion different reasons - the most popular being "this obsolete application uses the XYZ library which we want to remove and the authors are not responsive". Come on: just why should the authors be "responsive"? They have created a program, it works, users are happy; what right do you have to demand anything else? As drag said:
The entire purpose of an operating system is to:

1. Make it easy to write applications.
2. Make it easy to run applications.

As I've said years ago: friendliness to ISVs is almost directly proportional to market share. Linux is 1% (because of the aforementioned problems), Mac OS is 5-10% (it's better than Windows, but it still likes to do things like dropping support for Carbon when the 100% incompatible replacement had only been available for eight years), and the rest is Windows (which is horrible in many ways but has excellent backward compatibility).

As long as the answer to the "how do I install a program on Linux?" question remains "use the distribution's repo, stupid", Linux will be a 1% niche.

P.S. Note: I use Linux myself and I'm quite happy with it. I still hate the fact that I need to compile some programs from source, and I understand that as long as that's the case I cannot in good conscience recommend it to my non-IT friends.

Yesterday...

Posted Jan 24, 2012 14:56 UTC (Tue) by dgm (subscriber, #49227) [Link]

> Right question: when was the last time you heard somebody that some random program is no longer available in their distribution repository.

A good question for the distro, but not in the context of a reply to the original assertion, which was: "Absolutely. If you do offer a stable ABI, it's no guarantee of success (see Windows Phone); if you break the ABI, you lose".

In the same vein, Microsoft DOES break ABI and API compatibility from time to time. I'm told that there are many programs that worked just fine under XP but no longer do under Vista and Win 7.

Yesterday...

Posted Jan 24, 2012 15:33 UTC (Tue) by khim (subscriber, #9252) [Link]

In the same vein, Microsoft DOES break ABI and API compatibility from time to time. I'm told that there are many programs that worked just fine under XP but no longer do under Vista and Win 7.

Sure. Nobody is perfect. Some programs don't work or work incorrectly - especially low-level stuff like Visual Studio 6.0 (there are some problems with the debugger). But the fact that Visual Studio 6.0 (released over 10 years ago) still works at all (and Visual Basic 6.0 is still fully supported) tells you something.

Yesterday...

Posted Jan 24, 2012 19:08 UTC (Tue) by HelloWorld (guest, #56129) [Link]

> But the fact that Visual Studio 6.0 (released over 10 years ago) still works at all
Visual Studio 6.0 never worked to begin with, unless a very loose definition of "work" is applied.
And besides, making old software work on Linux is usually just a matter of installing a few really old libraries.

Yesterday...

Posted Jan 24, 2012 21:30 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

Please, Visual Studio 6 was a very good IDE. It was fast and snappy, and with the Tomato plugin it had great autocomplete.

Visual C++ 6 was mediocre, mostly because it didn't support features like SFINAE and member templates. But it worked OK for simple C++.

Yesterday...

Posted Jan 24, 2012 16:04 UTC (Tue) by mpr22 (subscriber, #60784) [Link]

I'm told that there are many programs that worked just fine under XP, but no longer do under Vista and Win 7.

Yes. The difference being, a significant proportion of them were doing things that Microsoft had told people to stop doing back when Windows 2000 was current.

Yesterday...

Posted Jan 24, 2012 16:51 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

Not really. The Windows ABI is well designed and incredibly stable. One can still run NT4-era programs without much trouble now. Windows Vista mostly broke a lot of programs with embedded Internet Explorer controls, and usually not even seriously.

Besides, Windows uses "bring all your libraries with you" ideology, so it's very easy to package stuff.

To be fair, the Linux kernel ABI and libc are very stable. And if you care to bring along all your other libraries, you can still run software from 1995 on the newest distributions. However, as we know, bundling libraries is discouraged on Linux.

Yesterday...

Posted Jan 25, 2012 1:13 UTC (Wed) by Trelane (subscriber, #56877) [Link]

> However, as we know, bundling libraries is discouraged in Linux.

Last I knew, it was discouraged everywhere. Otherwise, you get to maintain security updates for all those libs.

Yesterday...

Posted Jan 25, 2012 6:37 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

Yet it's commonplace on Windows because it's easy. On Unix systems it's noticeably more complex.

Yesterday...

Posted Jan 25, 2012 16:37 UTC (Wed) by nybble41 (subscriber, #55106) [Link]

Setting LD_LIBRARY_PATH is "noticeably more complex"? Personally I find that quite easy to do, and prefer it to the security nightmare which is the Windows default of automatically loading libraries from the current directory.
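For what it's worth, the whole trick usually fits in a launcher script shipped next to the binary. A minimal sketch ("myapp" and the ./libs layout are hypothetical names for illustration):

```shell
#!/bin/sh
# Hypothetical launcher shipped next to the real binary ("myapp").
# Prepend the bundled ./libs directory to the dynamic linker's search
# path, then hand control to the application.
APPDIR="$(cd "$(dirname "$0")" && pwd)"
LD_LIBRARY_PATH="$APPDIR/libs${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
echo "library search path: $LD_LIBRARY_PATH"
# exec "$APPDIR/myapp" "$@"
```

The ${VAR:+...} expansion keeps any pre-existing LD_LIBRARY_PATH behind the bundled directory, so system libraries still resolve normally.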

Yesterday...

Posted Jan 25, 2012 16:48 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

Try to bundle glibc with your app.

Bundling MSVCRT in Windows is a piece of cake.

Yesterday...

Posted Jan 25, 2012 17:11 UTC (Wed) by nybble41 (subscriber, #55106) [Link]

I'll grant that glibc is an exception. Among other things, it's closely tied into the dynamic loader (ld-linux), the path to which is hard-coded into every executable, so setting LD_LIBRARY_PATH isn't sufficient here. The glibc team recommends a chroot environment, though you might be able to get away with specifying the proper loader explicitly. (It can be run as a normal program, with the dynamic executable path as an argument.)
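A quick sketch of running the loader explicitly (using /bin/ls and /bin/echo as stand-ins for the application; readelf comes from binutils): ask a binary which interpreter it wants, then invoke that loader directly with the executable as its argument.

```shell
# Extract the hard-coded interpreter path from a binary, then invoke
# the loader itself as a program, passing the real executable as an
# argument. Adding --library-path here would point it at a bundled
# library directory instead of the default search path.
loader=$(readelf -l /bin/ls | sed -n 's/.*interpreter: \(.*\)]$/\1/p')
echo "requested loader: $loader"
"$loader" /bin/echo "loaded via explicit interpreter"
```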

Out of curiosity, why would you *want* to bundle glibc? They actually do a very good job of ensuring backward-compatibility, versioning the symbols where necessary. The fact that many Windows developers seem to feel it necessary to bundle their version of the C runtime with their application feels more like a handicap than a feature by comparison; on UNIX, the C runtime is traditionally considered a system component, not part of the application. If you're going so far as to bundle glibc, I don't see why you wouldn't just statically link the entire executable. That would certainly be much simpler from a distribution point-of-view.

Yesterday...

Posted Jan 25, 2012 17:14 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

Well, I've just built an executable on a recent Ubuntu. And now I want to run it on RHEL 6.2. Guess what happens?

Static linking of glibc would be nice if it were supported.

Actually this is the case which is absolutely identical in Windows world.

Posted Jan 25, 2012 17:21 UTC (Wed) by khim (subscriber, #9252) [Link]

Well, I've just built an executable on recent Ubuntu. And now I want to run it on RHEL 6.2. Guess what happens?

Absolutely the same thing as in the Windows world. Hint: recent versions of MSVCR*.dll just flat out refuse to work if you don't install them properly. To make a portable application you need to jump through hoops. On Linux it means using an older version of glibc; on Windows you need a specially crafted project. And you should not use features not available in older versions of your OS (or you can use them conditionally when they are available). This is not such a big deal. Lack of forward compatibility is [relatively] easy to handle, but the fact that Linux constantly breaks backward compatibility is a complete disaster.
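The "guess what happens" part can be made concrete: a glibc-linked binary records, per symbol, which GLIBC_* version it was linked against, and the newest such tag is what an older distro's glibc will refuse. A sketch (readelf from binutils; /bin/ls is just an example binary):

```shell
# List the glibc symbol-version requirements baked into a binary.
# Each undefined symbol carries the version it was linked against
# (e.g. memcpy@GLIBC_2.14); building on an older system yields older
# tags and therefore a binary that loads on more distributions.
readelf -V /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -u
```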

Actually this is the case which is absolutely identical in Windows world.

Posted Jan 25, 2012 17:29 UTC (Wed) by raven667 (subscriber, #5198) [Link]

That article is pretty funny, at every point the author has to say "wait, it's actually more complicated than that" and then drop another level down the rabbit hole. It's amazing Windows works as well as it does because what goes on under the hood is just ridiculous.

Actually this is the case which is absolutely identical in Windows world.

Posted Jan 25, 2012 18:06 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

Windows is like that. Over the years the famous DLL Hell problem led to a lot of "solutions" which are worse than problems they try to solve.

Yet it all somehow clunks along...

Actually this is the case which is absolutely identical in Windows world.

Posted Jan 25, 2012 17:32 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

>Absolutely the same thing as in the Windows world. Hint: recent versions of MSVCR*.dll just flat out refuse to work if you don't install them properly.

Not true. MSVCRT DLLs work just fine if you put them beside the application. It's just that they have special 'magical' hooks into SxS.

But you can bundle them with your app. MS even recommends it: http://support.microsoft.com/kb/326922 and I've done it personally.

>On Linux it means using older version of GLibC

Which means that I have to find the most ancient distro, replicate the whole toolchain there (hey, do you know what fun it is to compile GCC 4.7 on CentOS 4?) and then build the app.

That is a very real pain point. And Something Must Be Done.

I'm thinking about binary-patching the executable file to make it look like it requires an older version of glibc.

Actually this is the case which is absolutely identical in Windows world.

Posted Jan 25, 2012 17:49 UTC (Wed) by jrn (subscriber, #64214) [Link]

Might be worth trying lsbcc first.

Sorry, but you are wrong...

Posted Jan 25, 2012 19:05 UTC (Wed) by khim (subscriber, #9252) [Link]

Not true. MSVCRT DLLs work just fine if you put it beside the application.

Nope. msvcrt.dll (the old one from MSVC 6.0) does. Newer ones just complain bitterly "R6034 An application has made an attempt to load the C runtime library incorrectly". You can only install them as Shared Side-by-Side Assemblies (as explained here).

It's just that they have special 'magical' hooks into SxS.

Not only that. They are always shared (a real private installation is no longer an option) and they must be installed in the system, not just dropped in the directory with your binary.

I'm thinking about binary-patching the executable file to make it look like it requires an older version of glibc.

Will Not Work™. Linux does not even store the version of glibc needed to run your application in a binary linked with glibc. What it does store is the version of each library function called from your program. And if you call a new function that is not available in an old version of glibc (because the headers redirect "open" to "__open_2"), then you really need a glibc (or some other library) which actually includes that function.

What you can do is create a library which provides such symbols (as weak aliases, probably) for older versions of glibc. But this sounds like an SDK - and if you are doing an SDK then you can just use an older version of glibc in it...

Sorry, but you are wrong...

Posted Jan 25, 2012 20:10 UTC (Wed) by boudewijn (subscriber, #14185) [Link]

"Nope. msvcrt.dll (the old one from MSVC 6.0) does. Newer ones just complain bitterly "R6034 An application has made an attempt to load the C runtime library incorrectly". You can only install them as Shared Side-by-Side Assemblies (as explained here)."

And certain version(s) of Visual Studio would actually create the side-by-side assembly wrong, which meant we had to patch some XML files when delivering our application on Windows. That was quite an eye-opener.

Frankly, having produced software for Windows for twenty years, Linux for fifteen years and OS X for three years, I find there's precious little to advocate one platform over another. All have peculiarities, pitfalls and idiosyncrasies that consume roughly the same amount of time when making end-user software ready for installation.

Free software on Linux has the advantage that as a volunteer developer I can leave most of that to the distributions, though. Makes life easy and lets me focus on coding...

There are huge difference, however...

Posted Jan 25, 2012 21:27 UTC (Wed) by khim (subscriber, #9252) [Link]

Frankly, having produced software for Windows for twenty years, Linux for fifteen years and OS X for three years, I find there's precious little to advocate one platform over another. All have peculiarities, pitfalls and idiosyncrasies that consume roughly the same amount of time when making end-user software ready for installation.

The interesting question happens after you've managed to write working installer.

On Windows, compatibility is a Big Deal™ - this means that the next versions of Windows will include a bunch of hacks which try to keep your software from breaking. They will not always work (that's not really possible), but Windows will try to guess what to do with your program to make it usable (for example, Windows Vista automatically asks for administrator privileges for programs named install.exe or setup.exe - unless they include a manifest which disables such requests: this way old installers work and new installers may decide not to request admin privileges by default).

On MacOS the same happens - but for a limited time only. For example, when the Mac transitioned from PowerPC to x86 it only supported old binaries for a pathetic five years. Developers were forced to recompile (and sometimes rewrite) everything in this short period.

On Linux... nobody cares. Beauty of the desktop is paramount, and if it requires breaking the ABI - nobody will think twice. Libraries are added and removed in each revision of the OS (sometimes even minor security updates change SO versions), files are moved around without any kind of autodetection, etc. And it looks like the temporary stabilization I talked about back then was short-lived: in the last 3-4 years almost everything was broken on the desktop (at levels above libx11/glibc).

There are huge difference, however...

Posted Jan 26, 2012 1:34 UTC (Thu) by HelloWorld (guest, #56129) [Link]

> On Linux... nobody cares.
Yeah, except that they *do* care. That posting you've linked to is full of FUD and lies, you can find the details in my response to that posting (which I had written before realizing I was responding to a 3-year-old posting. Oh well).

That's funny. really...

Posted Jan 26, 2012 8:18 UTC (Thu) by khim (subscriber, #9252) [Link]

That posting you've linked to is full of FUD and lies

Hardly. Instead it contains the bitter truth - which nicely explains why Linux is constantly losing the battle for the desktop.

you can find the details in my response to that posting (which I had written before realizing I was responding to a 3-year-old posting. Oh well).

Nope. Not well. In fact it makes all the difference in the world. All the compatibility layers were afterthoughts, often added to the system years later - after users had cried themselves hoarse and developers had left. In cases where they were added from the start (V4L) they were often buggy - and the response from the driver developers was often "just stop using this obsolete interface", which is insulting for developers (who have other plans besides the need to chase random changes in the Linux ABI) and absolutely unsuitable for users (who have no way to "stop using the obsolete interfaces").

As I've said: the situation is slowly improving, but it's still far from perfect. You continue to say that you've run gtk2 and gtk3 applications side by side, that you've run Qt3 applications, etc., but you forget to say what you needed to do to make them work. Usually you need to find some libraries or modules and install them, use LD_LIBRARY_PATH, or other tricks. Nothing works out of the box. This is what I call "nobody cares": "naïve" developers think that backward compatibility is the OS developers' responsibility first and theirs second, if at all (most think OS developers should solve everything without them); "naïve" users don't care who's responsible - but they do know they are not (especially if they paid for the application and the OS... since the OS is often free and "you can't get much for free", the developer is usually the one who's drowned in complaints); and "self-righteous" Linux desktop architects "know" it's not their problem. The end result? 1% on the desktop, a lost mobile platform, etc.

BTW the current 1% is actually a good result: at least you still have hardware which you can use to play your power games. Don't count on it being available forever, though.

That's funny. really...

Posted Jan 26, 2012 11:28 UTC (Thu) by HelloWorld (guest, #56129) [Link]

> Nope. Not well. In fact it makes all the difference in a world. All the compatibility layers were afterthoughts and they were often added to the system later, often years later.
Please, stop spreading bullshit. ALSA has had an OSS emulation layer ever since it was merged into the kernel. PulseAudio has supported OSS and ALSA compatibility for ages.

> In cases where they were added from the start (V4L) they were often buggy
[citation needed]

> As I've said: situation is slowly improving, but it's still far from perfect. You continue to say that you've run gtk2 and gtk3 applications side-by-side, that you've run Qt3 applications, etc but you forget to say what you needed to do to make them work.
Nothing. I use ekiga (gtk2) and pavucontrol (gtk3). Both are shipped with Debian; the package manager pulls in the required dependencies. And I've used Xilinx ISE, which shipped with Qt3 included (and even if it hadn't, there'd be no problem, as Debian still ships Qt3 packages).

Besides, I'd like to see some kind of backup for your claim that compatibility is as important as you seem to think it is. The only "application" I use that only supports OSS is Unreal Tournament, which works just fine. And I don't know many other people who use applications of similar age. I can't think of anyone, actually.

That's funny. really...

Posted Jan 26, 2012 14:44 UTC (Thu) by khim (subscriber, #9252) [Link]

Please, stop spreading bullshit. ALSA has had an OSS emulation layer ever since it was merged into the kernel.

It's one thing to have something called an "emulation layer". It's something else entirely to support old programs. For years said emulation layer was buggy and basically unusable. When I start Adobe Reader I should not lose the ability to watch YouTube videos - even if both use the "obsolete" OSS interfaces.

The Linux desktop finally got good, usable OSS emulation in 2008 - years after the switch from OSS to ALSA.

> As I've said: situation is slowly improving, but it's still far from perfect. You continue to say that you've run gtk2 and gtk3 applications side-by-side, that you've run Qt3 applications, etc but you forget to say what you needed to do to make them work.
Nothing.

Hardly.

I use ekiga (gtk2) and pavucontrol (gtk3). Both are shipped with debian, the package manager pulls in the required dependencies.

Ah, so to make them work you only need to give up your freedom and stop choosing your software for yourself - you should just use what your distribution offers you. This is a good band-aid, but it does not solve the problem. ISVs are still out of the loop, and this means Linux is still unsuitable for the desktop.

The only "application" I use that only supports OSS is Unreal Tournament, which works just fine. And I don't know many other people who use applications of similar age. I can't think of anyone, actually.

That's because there are none. Unreal Tournament is a remnant of the brief era in which it looked like Linux was gearing up to be a real contender for the desktop. Then the "great desktop designers" started breaking stuff repeatedly and ISVs abandoned their Linux efforts.

Besides, I'd like to see some kind of backup for your claim that compatibility is as important as you seem to think it is.

Well, it's kinda hard to do a proper scientific experiment in this area, thus we only have one observation - but it's damning. We've had lots of OSes created for user-facing devices (desktops, mobile phones, tablets). Some of them cared about ABI stability (Windows/WindowsCE, MacOS/iOS, Palm, Symbian, etc.), some did not (WindowsCE, Linux, Palm, etc.). Note that a couple of OSes are in both categories at once (WindowsCE and Palm). That's because they had two distinct phases: in one phase they cared about backward compatibility very much, and in the next they dropped it to create a "greater, more popular platform". In all cases these attempts led to disaster: the platform either died altogether or went below 1% market share for many years (most died; WindowsCE became the incompatible Windows Phone 7, and while it's not technically dead yet, its market share collapsed catastrophically).

The correlation looks quite striking - but one-sided. There are plenty of OSes which had good backward compatibility yet failed anyway, but we have no widely used OSes which treat backward compatibility as cavalierly as Linux does. Even MacOS does it better: it gives ISVs a short time (just a few years) till they are forced to make major changes to their programs - but at least in this short period old programs still work just fine, and neither users nor developers are forced to search for the solution on forums. Start a PowerPC program on an Intel Mac - and it works, no questions asked. Believe me, the difference between a PowerPC binary and an Intel binary is much larger than the difference between OSS and ALSA.

That's funny. really...

Posted Jan 26, 2012 15:10 UTC (Thu) by HelloWorld (guest, #56129) [Link]

> It's one thing to have something called "emulation layer". It's something else entirely to support old programs. For years said emulation layer was buggy and basically unusable. When I start Adobe Reader I should not lose the ability to watch YouTube videos - even if both use "obsolete" OSS interfaces.
What you're asking for has nothing whatsoever to do with compatibility. You're asking for new features that exceed the ones that OSS provided in the first place, not for compatibility.

> Ah, so to make them work you only need to give up your freedom and stop choosing your software for yourself - you should just use what your distribution offers you. This is good band-aid, but it does not solve the problem. ISVs are still out of the loop and this means Linux is still unsuitable for a desktop.
This is, again, just bullshit. Using a package manager has *nothing* to do with giving up freedom, it is simply the most common and convenient way to install software on Linux, and nothing stops ISVs from adopting it. In fact, they do, e. g. Skype provides packages for various distros and a statically linked binary, in case all else fails.

> That's because there are none.
Yes, and that's the reason why nobody really cares about OSS compatibility (except for trolls like you).

> Correlation looks quite striking - but one-sided.
Yeah, except that it doesn't, because Windows really isn't all that backward-compatible either. Command & Conquer for Windows 95 wouldn't work on Windows XP (the installer crashes). And I didn't get Unreal II to work on Windows 7 either.

Anyway, I'm really tired of wasting my time with your stupid trolling attempts. Have a nice life.

Please

Posted Jan 26, 2012 15:31 UTC (Thu) by corbet (editor, #1) [Link]

Can we please aim for a slightly higher level of discourse here?

This is aimed at everybody, not just the immediate parent post. Disagreements are fine, but we do not need to fling personal insults at each other. If you think somebody is a troll, then don't feed them! If you must respond, please find a way to do so without turning LWN into some sort of elementary school playground, OK?

No, that's regression...

Posted Jan 26, 2012 16:29 UTC (Thu) by khim (subscriber, #9252) [Link]

What you're asking for has nothing whatsoever to do with compatibility. You're asking for new features that exceed the ones that OSS provided in the first place, not for compatibility.

I used a RealVideo player in parallel with an MP3 player over OSS in 1998, sorry. ALSA broke this ability not just for OSS applications: native ALSA applications cannot issue any sound if Adobe Reader hogs the device either. And support was only reintroduced in 2008. That's 10 years, give or take.

In fact, they do, e. g. Skype provides packages for various distros and a statically linked binary, in case all else fails.

Exactly! They provide one version for Windows (85-90%+ market share), one version for Mac (5-10% market share) and eight versions for Linux (2-3% market share). Why? Because all these Linux versions have forsaken compatibility in the pursuit of pretty colors. And even then it does not work with all versions of Linux. For most ISVs this mess just does not make sense.

There is hope: at least they can write "Ubuntu 10.04+ 32-bit" and are not forced to provide separate versions for Ubuntu 10.04, 10.10, 11.04, 11.10... but the fact that it's problematic to use the 32-bit version on 64-bit Ubuntu is already quite aggravating, and the need to provide eight packages to cover most (but not all!) distributions is just wrong.

Yes, and that's the reason why nobody really cares about OSS compatibility (except for trolls like you).

I don't care either at this point. OSS compatibility was a sore point five years ago, but today it's a thing of the past. Sadly, Linux desktop developers invent something new to break almost every year.

Command & Conquer for Windows 95 wouldn't work on Windows XP (the installer crashes).

Unless you run it in compatibility mode. I know: right-clicking and choosing Run this program in compatibility mode is an ubercomplicated trick and not all Windows users can do that. But it says something about expectations, too.

And I didn't get Unreal II to work on Windows 7 either.

Which is still possible if you install the appropriate .dll files. Yes, Unreal II is a rare case where Microsoft decided that a small piece of API was not worth supporting in future versions of Windows, so you need to hunt down and install the Windows Server 2003 DirectMusic Patch with the proper versions of the DirectMusic .DLLs. Somehow such activity is perceived as simple (some even say trivial) on Linux, yet when you hit the same trouble on Windows you gave up immediately. Doesn't that say something about your expectations?

No, that's regression...

Posted Jan 26, 2012 20:22 UTC (Thu) by HelloWorld (guest, #56129) [Link]

> I've used RealVideo player in parallel to MP3 player with OSS in 1998, sorry.
Then the sound card you used probably had a hardware mixer.

> ALSA broke this ability not just for OSS applications, native ALSA applications can not issue any sound if Adobe Reader hogs the device either.
It works perfectly fine if your sound card has a hardware mixer (as my Asus A7V880 did), and if it doesn't, you can use dmix.
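(For reference, routing sound through dmix can be forced with a couple of lines of ALSA configuration - a minimal sketch only; on many systems recent ALSA already uses dmix by default for cards without a hardware mixer, and the exact card setup is an assumption:)

```
# ~/.asoundrc - route the default PCM through the dmix software-mixing
# plugin so that several applications can play sound at the same time.
pcm.!default {
    type plug
    slave.pcm "dmix"    # dmix mixes multiple streams in software
}
```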

> Exactly! They provide one version for Windows (85-90%+ market share), one version for Mac (5-10% market share) and eight versions for Linux (2-3% market share). Why? Because all these Linux versions have forsaken compatibility in the pursuit of pretty colors. And even then it does not work with all versions of Linux.
Which version of Linux does the statically linked binary not work on? And besides, this still isn't a backwards compatibility issue, but a cross-distro interoperability problem. I agree that having so many different distros sucks.

> There are hope: at least they can write "Ubuntu 10.4+ 32-bit." and are not forced to provide separate versions for Ubuntu 10.4, 10.10, 11.04, 11.10... but the fact that it's problematic to use 32bit version on 64bit Ubuntu is already quite aggravating
Yet again, this is not a backwards compatibility issue but a cross-architecture compatibility problem specific to one distro (i. e. debian and its derivatives). It's trivial on, say, Fedora.
Aside from that, it is trivial to make the 32-bit version run on a 64-bit machine: just use the statically linked version. But then, why would anybody do that, given that a 64-bit version exists? You keep mixing up different issues and whining about things that nobody cares about in the real world.

> Unless you'll run it in compatibility mode.
Do you think I'm stupid or something? I tried that of course, it just didn't help.

> Which is still possible if you install appropriate .dll files.
Err, no. I got it to start somehow, but it ran unplayably slowly and it had massive glitches in the menu and the HUD. It was unplayable.

That's funny. really...

Posted Jan 26, 2012 18:12 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

>Yeah, except that it doesn't, because Windows really isn't all that backward-compatible either. Command & Conquer for Windows 95 wouldn't work on Windows XP (the installer crashes). And I didn't get Unreal II to work on Windows 7 either.

I actually have the original C&C disks. I tried to install C&C from them, but the disks have become unreadable over the years. So I downloaded it from TPB and installed it just fine on Win7 in compat mode.

There's a good story about Windows compatibility - they've actually made a special-purpose memory allocator for Win95 to work around a bug in Sim City.

Read it here: http://www.joelonsoftware.com/articles/APIWar.html - that's how much MS cared about compatibility.

That's funny. really...

Posted Jan 26, 2012 19:56 UTC (Thu) by HelloWorld (guest, #56129) [Link]

> I actually have the original C&C disks. I've tried to install C&C from them, but disks have become unreadable over the years. So I downloaded it from TPB and installed it just fine in Win7 in compat mode.
Well, it didn't work when I tried to install it from my disks. Perhaps this issue only applies to the German version or something.

That's funny. really...

Posted Jan 29, 2012 15:00 UTC (Sun) by boudewijn (subscriber, #14185) [Link]

Hm... Emboldened by this discussion I tried to install any of my collection of Corel Painter disks on my Windows 7 installation. None would start. Items tried range from a ten year old Corel Painter Essentials disk to a Corel Painter X trial -- we're now at Corel Painter XII, btw.

We can blame Corel, of course... Their products have never had a reputation for being solid, well-built applications, but then, wasn't the contention in this thread that on Windows that's not necessary, since Windows keeps everything running through amazing binary compatibility through the ages?

Have you tried the usual tricks?

Posted Jan 29, 2012 16:12 UTC (Sun) by khim (subscriber, #9252) [Link]

Have you tried running the setup in "Windows XP" compatibility mode? Corel Painter 11 works fine. You can also find and start setup.exe directly. In general, the appropriate help page can help you.

AFAICS from forum posts, Corel Painter Essentials works fine with Windows 7... with the exception that it insists on scanning the whole system drive at each program start. Which is sloow, as you can guess (a 45-minute startup time is not unheard of).

Yes, Windows tries very hard to stay compatible - but it cannot fix all the bugs in all the programs.

There is a huge difference, however...

Posted Jan 26, 2012 8:11 UTC (Thu) by boudewijn (subscriber, #14185) [Link]

"The interesting question happens after you've managed to write working installer."

After having created the installer, I write the next version of the application, which means I'll have to write update installers and, on Windows, figure out some way of semi-automatically updating the existing installed base. Which is always fun if you've got half a million users.

And after having created the installer, I get bug reports, often about obscure incompatibilities between the development and testing systems and the user's Windows system. For which I sometimes have to buy the actual hardware the user uses, so I can figure out that the problem is related to the particular version of the Intel graphics driver Asus installs on that particular laptop model.

Or (ten years ago) that the application crashes Windows because the font the user has installed has some broken truetype code embedded that causes a kernel panic when hyphenating.

We have to face it: all operating systems suck, all hardware sucks and there is no such thing as a free lunch. Arguing that Windows makes life easier for a software developer than Linux is an exercise in futility.

Sorry, but you are wrong...

Posted Jan 26, 2012 12:17 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

You're confusing two issues. You DO need a manifest to bind with the correct MSVCRT.

However, you can ship the MSVCRT DLLs along with your app by simply copying them, without any installation. That's called "Private Assemblies": http://msdn.microsoft.com/en-us/library/aa375674.aspx

Let me quote this: http://msdn.microsoft.com/en-us/library/ms235299.aspx

>To deploy Visual C++ redistributable files, you can use the Visual C++ Redistributable Package (VCRedist_x86.exe, VCRedist_x64.exe, or VCRedist_ia64.exe) that is included in Visual Studio, or use Redistributable Merge Modules, or you can directly install specific Visual C++ DLLs to the application local folder. An application local folder is a folder that contains an executable application file. DLLs must be deployed to the application local folder.

As I've said, I've done this personally. It works.
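(As a concrete illustration of the app-local deployment the MSDN quote describes - the names below are hypothetical:)

```
MyApp\                 <- the "application local folder"
  MyApp.exe            <- the application binary
  MyApp.exe.manifest   <- manifest binding the app to the bundled CRT
  msvcr90.dll          <- VC++ runtime DLLs copied next to the .exe
  msvcp90.dll
```

The loader picks up the DLLs sitting next to the executable, so no installer step or system-wide registration is needed.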

Yesterday...

Posted Feb 6, 2012 7:20 UTC (Mon) by elanthis (guest, #6227) [Link]

It's mostly commonplace on Windows because upstream often gives us little choice. Even if I compiled something like TinyXML, where would I put it so that it would be found? How would I version the DLL? How do I make sure that my app isn't relying on a broken modified version someone else installed? It's not hard to build redistributable packages for Windows DLLs, but most upstream developers just don't bother. Possibly the installer toolchain just needs to be nicely integrated with Visual Studio.

The Linux ecosystem solves this with ham-fisted militarized regulation of software via the distribution package repositories. Which in turn results in very slow update cycles, constant churn and a need to follow the bleeding edge of the entire OS to get a bug in one app fixed, and an inability to get a lot of software packaged for enough users, because that packaging work has to be done 1000 times over for every possible distribution.

Which way is better? It depends on the use case (server, embedded, desktop, etc.). The desktop and consumer device market seems to be leaning towards bundled libraries, via means like the iOS App Store, Steam, and soon the Windows Store. Especially when Web hyperlink capabilities are used properly, they allow users to find software the most natural way (the Web) and then install it with a minimum of fuss. No need to dig through a package repository after already having found the software online. No need to find out that the latest available version of the software requires a library version the OS hasn't packaged yet. No need to find out that the latest version of the software has a bunch of features and fixes that the OS-bundled package doesn't include and won't include for another 6 months (in which case the entire OS must be upgraded, along with possibly getting a whole new UI and a new slew of bugs, just to get that one application update).

The primary issues of bundled libraries -- space and security -- are a lot less interesting than most people think. Disk space is certainly cheap these days, and even the memory costs of loading multiple versions of a library for different apps are easy to ignore in today's post-32-bit world. The library security issues are things to be concerned about, but less so than most think. Sure, getting a patched OpenSSL right away is important on Linux, but then Linux is primarily a server OS. Everything it's running poses some Internet-facing risk, and quite a few libraries are processing data that is literally sent to the machine from any ol' random node on the Internet.

On a modern desktop OS, practically the only app that ever even hits the wide-open 'Net is the Web browser. Users don't have much need to care that Random Grandma-Friendly Family Genealogy Wizard 2008 doesn't have the latest patched libpng, because frankly it doesn't matter whether it does or not; it doesn't grab random PNGs off the Internet and it's not an attack vector for any plausible threat.

Of course the "consumer-friendly" Linux-based OS (Android) uses this same application-centric distribution model. Users install and update individual self-contained applications or entire OS updates, not individual components of an OS+Apps repository. Small wonder.

Yesterday...

Posted Feb 7, 2012 15:01 UTC (Tue) by dgm (subscriber, #49227) [Link]

Very accurate. But do not forget another of the consequences of the application-centric model: malware.

On a repository-based system you have an additional layer of people reviewing the software that you will end up running on your machine. Part of the responsibility for the excellent track record of desktop Linux with regard to malware rests with this layer of people. A tightly controlled App Store, like Apple's, has a similar effect, but having access to the source code makes repositories much more effective.

Yesterday...

Posted Jan 24, 2012 15:43 UTC (Tue) by dgm (subscriber, #49227) [Link]

> it's not that hard to develop Linux program (Android or iOS APIs certainly look more alien then Linux APIs for Windows developer), but try to make it available for the user... and you'll hit the wall.

Not my experience. Many moons ago I wrote a silly little app that relied on the GTK 1.2 toolkit. After publishing it on Sourceforge, an Ubuntu packager (hi Chris) got in touch, and shortly after I could install it from the repositories. It doesn't get any easier, does it?

> Friedliness to ISV is almost directly proportional to market share

We do agree on that. But, again, that's not what I was replying to. Lack of (ABI) compatibility is not the cause of failure. Rather, ISV unfriendliness (which is a broader subject) is. Or put another way, you can be successful being incompatible (OS X) and you can fail being compatible (OS/2). It's a bit more complex than that.

OEM friendliness is also a sore point. Note I mean the kind that just rushes cheap consumer hardware with crappy closed-source drivers out the door and moves on. Currently you can buy any cheap piece of sh^Whardware and expect it to work on Windows. Or you can go to an Apple store and shell out extra bucks for Apple-approved hardware that will work with Macs. In either case, the rules are clear and easy. With Linux... well, it's a bit more work. But surprisingly, Joe and Jane Sixpack don't care a bit. They want it easy, so they can get on with their own lives. Who knows, maybe they are up to something.

In any case, we seem to be in a stalemate situation. Ubuntu is trying HARD to break that, but even after all these years, you cannot go out and look for "made for Ubuntu" (or whatever) hardware and software in stores. Maybe the time will come. But a few things will have to change for that.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 13:42 UTC (Mon) by dskoll (subscriber, #1630) [Link]

You have nailed the cause for Linux's failure on the desktop. It has absolutely nothing to do with supposed inferiority of Linux, the Linux desktops, or Linux applications. It's simply the network effect of Windows. When "everyone" runs Windows, of course all the niche application developers target Windows. Then it becomes necessary to run Windows and you get a feedback loop. And you get a huge community built around Windows training, Windows development, etc. And this doesn't even take into account Microsoft's enormous marketing machine.

The network effect is the same reason Google+ is dead in the water. Even if Google+ were by some objective measure "better" than Facebook, it doesn't matter. All your friends are on Facebook, so you'll go on Facebook too.

Thankfully, the Linux community is large enough to support a pretty vibrant software ecosystem, even on the desktop. So those of us who like Linux on the desktop will continue to use it and ignore those who spout opinions as if they were facts.

Sorry, but here you are wrong.

Posted Jan 23, 2012 16:01 UTC (Mon) by khim (subscriber, #9252) [Link]

The network effect is the same reason Google+ is dead in the water. Even if Google+ were by some objective measure "better" than Facebook, it doesn't matter. All your friends are on Facebook, so you'll go on Facebook too.

By this logic Facebook should be dead, too: five years ago "all your friends" were on MySpace. And before that they all used Friendster.

I'm not saying Google+ will kill Facebook - I'm just pointing out that network effects, while powerful, can be overcome. Not if you shoot yourself in the foot every five minutes, of course.

Sorry, but here you are wrong.

Posted Jan 23, 2012 16:23 UTC (Mon) by dskoll (subscriber, #1630) [Link]

By this logic Facebook should be dead, too: five years ago "all your friends" were on MySpace. And before that they all used Friendster.

The key here is timing. Facebook came at just the right time when social networking was ready to explode. Also, when Facebook was launched, Myspace wasn't as overwhelmingly dominant as Windows was when Linux actually became usable on the desktop.

Microsoft came at just the right time when PCs exploded. (Before MS-DOS, "everyone" used CP/M or some other system, but Microsoft got a lucky break.) By the time Linux really was viable as a desktop system ~1999 or so, Microsoft was utterly entrenched with a huge community and network effect.

Linux may eventually eat away at Microsoft's market share, but it's going to take a very long time. It will also take a few major mis-steps on Microsoft's part to erode their enormous market advantage.

Linux desktop market share

Posted Jan 22, 2012 8:17 UTC (Sun) by dskoll (subscriber, #1630) [Link]

Right, because we all know that desktop market share is determined purely on the basis of technical merit. Decades of Microsoft strong-arming and anti-competitive business practices can't possibly have anything to do with it.

Linux desktop market share

Posted Jan 22, 2012 8:34 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link]

This excuse is wearing a little bit thin now. After all, Android completely slaughtered Windows Mobile in half a year, despite all the Microsoft strong-arming.

Let's face it, Linux is not that much better than modern Windows OSes, and in a lot of regards it's actually inferior. For example, in the corporate world ActiveDirectory rules supreme and we're only now getting close with Samba4.

From the technical point of view, Windows has the best graphics stack (DX11 + driver infrastructure) and quite good audio stack (I can easily tune my 7.1 audio output on Windows, something I still can't do on Linux).

So there are no real compelling technical reasons for people to switch.

Linux desktop market share

Posted Jan 22, 2012 9:50 UTC (Sun) by Pawlerson (guest, #74136) [Link]

Your example with Android just proves that market share is not about technical things, but about software. Microsoft doesn't have any single advantage like MS Office, Photoshop, or games in the mobile market. Bring the mentioned applications to desktop Linux and Linux will become the most popular operating system. When it comes to technical things, Linux also has advantages over proprietary systems: better USB handling, a file system that doesn't fragment like hell, software repositories, very fast startup, no viruses, security fixes that come within hours not months, etc.

Linux desktop market share

Posted Jan 22, 2012 10:05 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link]

Software can be ported to new platforms, especially if they are powerful enough.

WinCE had lots and lots of software when the first Android phones came out. Yet Android won because technically it is head-and-shoulders above anything WinCE can offer.

>better USB handling
Which nobody really cares about. Windows is good enough.

>file system that doesn't fragment like hell
NTFS works well enough so most users simply don't care.

>software repositories, very fast startup, no viruses, security fixes that comes after hours not months etc.
Which is nice, but not enough to switch.

You see, Windows computers in general work pretty well. With a properly installed antivirus they don't get that many infections, and most important USB devices are now supported by default on Win7/8.

So Linux has to offer something that is better than "I guess it's OK". We can do this on mobile devices and on embedded devices. On desktops? I'm not sure.

Linux desktop market share

Posted Jan 22, 2012 14:09 UTC (Sun) by Pawlerson (guest, #74136) [Link]

Yeah, I agree with your points. In my opinion the main culprit of Linux 'unpopularity' on desktops is the lack of commercial software. Things like the graphics and audio stacks aren't so important, but still. Thankfully they're improving rapidly.

Linux desktop market share

Posted Jan 22, 2012 17:27 UTC (Sun) by raven667 (subscriber, #5198) [Link]

Well, commercial/proprietary software is also more difficult to support on Linux because once you get above the kernel and glibc there is no coherent single platform that can be targeted. Android gives that and you can see how commercially successful it has been.

Linux desktop market share

Posted Jan 22, 2012 22:57 UTC (Sun) by jrn (subscriber, #64214) [Link]

Technically, there's LSB, too.

Linux desktop market share

Posted Jan 23, 2012 11:22 UTC (Mon) by Pawlerson (guest, #74136) [Link]

IMHO proprietary software vendors should just focus on Ubuntu LTS or RHEL.

Linux desktop market share

Posted Jan 23, 2012 22:18 UTC (Mon) by sorpigal (subscriber, #36106) [Link]

Even that doesn't really solve the problem.

Ubuntu LTS and RHEL are still not supported for very long, compared to Windows, and at the end your app *will* be dead.

Try this: Pick a graphical binary from 10 years ago that was written for Windows 2000. Pick one from 10 years ago that was written for a then-enterprise Linux distro. Try to run the Win2k one on Windows Server 2008 or Windows 7, then try to run the Linux one on RHEL6. Report findings.

For any non-trivial application I practically guarantee you that you will either (1) not start due to missing libs, or (2) crash, or (3) fail to work for a non-obvious reason.

In the case of (1) I will further bet you that the libs you need are not officially packaged for your chosen distro.

And now we see the problem. I can still run *games* written for Win 3.1 on modern Windows and get sound and everything without lifting a finger (okay, maybe I toggle some compatibility flags in the properties of the executable). Try finding a *binary* of an X11 game that ran on Linux in the 90s and see how well it works.

I'm pretty sure I can take almost any executable written for NT 4 and run it today on modern Windows. Compare that with e.g. RHEL 1.0 and RHEL 6 and see how well it fares.

Having no platform and no care for backwards compatibility makes Linux unattractive to ISVs and users alike. When software you like, and maybe paid for, rapidly stops working you tend to give up.

Linux desktop market share

Posted Jan 23, 2012 23:01 UTC (Mon) by dlang (subscriber, #313) [Link]

actually, most unix programs from 10 or 20 years ago do still run on linux systems. most libraries are backwards compatible, or where they aren't it's possible to have both versions.

there are exceptions, especially with the "desktop environment" libraries, but if you don't tie your software explicitly to one of these desktop environments things 'just work'

I've got commercial programs that I have purchased that run just fine on my up-to-date system that were last compiled about 5 years ago, and I've run other things that are significantly older without a problem. I have heard of people who love wordperfect still running the binaries that were distributed by Caldera back pre-2000 on new systems.

Yes, you can find things that won't work, but the same is true in the windows world. Every time a new version of windows comes out, there are a slew of compatibility issues that break specific programs.

Not a good choice...

Posted Jan 23, 2012 23:35 UTC (Mon) by khim (subscriber, #9252) [Link]

There are exceptions, especially with the "desktop environment" libraries, but if you don't tie your software explicitly to one of these desktop environments things 'just work'

The problem with this approach is that the end result looks ugly as sin. It does not play nice with themes, it does not show nice notification messages, etc. Here we, again, hit the iceberg problem: don't, for a minute, think that you can get away with asking anybody to imagine how cool this would be. Don't think that they're looking at the functionality. They're not. They want to see pretty pixels.

Even geeks want to see pretty pixels - that's why they migrate to MacOS in droves! MacOS is not as backward-compatible as Windows and thus will probably never be as popular as Windows (the abandonment of PowerPC emulation in Lion was a nasty surprise for a lot of my friends who are using MacOS now), but it does better than Linux, where you can create a server-side (CLI-based) application, or an ugly as sin (yet powerful and fast) application for geeks, but where applications for Joe Average don't exist.

The whole story makes me kinda sad: Linux desktop developers (GNOME, KDE and others) do enormous work to make desktops pretty - and the goal here is, apparently, to attract users. Yet they do everything they could to make absolutely sure there will be no hundreds of thousands of pretty apps for said desktop - which makes the rest of their work pretty pointless. Why? They are intelligent, absolutely not crazy people so... why? I just can not understand.

Not a good choice...

Posted Jan 23, 2012 23:51 UTC (Mon) by dlang (subscriber, #313) [Link]

I disagree on the importance of pretty pixels being consistent in every application.

for example, what apps in windows tie in to the windows theming tool? and how many people change the desktop theme in windows?

people can make pretty apps using gtk or qt and don't have major problems with the libraries being unavailable, even years later (or they provide the libraries themselves to avoid using the packaging system)

as for your feeling that windows is the nirvana of compatibility, one phrase: 'DLL Hell'. It's a problem common enough to have had a term coined for it. This is why on a windows production server you see very few cases of multiple apps running on one server (which is why virtualization is so wonderful for them)

however, with your final statement

> The whole story makes me kinda sad: Linux desktop developers (GNOME, KDE and others) do enormous work to make desktops pretty - and the goal here is, apparently, to attract users. Yet they do everything they could to make absolutely sure there will be no hundreds of thousands of pretty apps for said desktop - which makes the rest of their work pretty pointless. Why? They are intelligent, absolutely not crazy people so... why? I just can not understand.

you are making a statement that I absolutely agree with, especially when they make decisions that drive away existing users.

Not a good choice...

Posted Jan 24, 2012 0:32 UTC (Tue) by khim (subscriber, #9252) [Link]

for examples, what apps in windows tie in to the windows themeing tool?

Most of them. Some are coded incorrectly and don't react well to a different theme, but they are rare. And until Windows Vista most had trouble with DPI changes. But then again, to properly support different DPIs on Linux you need to use modern libraries which are in constant flux; the ages-old stable-ABI approach will only give you bitmap fonts.

and how many people change the desktop theme in windows?

It's the same 80/20 myth: sure, a limited number of people will use themes, most can live without sound, and so on - but if you reject all these "unnecessary niceties" then you'll lose most users.

people can make pretty apps using gtk or qt and don't have major problems, even years later with the libraries being unavailable (or they provide the libraries themselves to avoid using the packaging system)

Sure, you can provide libraries with the program - but then you get this "ugly as sin" look. I can tolerate P4V, which looks totally alien on my desktop; my niece will not.

as for your feeling that windows in the nirvana of compatibility, one phrase 'DLL Hell'.

When was the last time you used Windows? Yes, long, long ago it was a serious problem. But mitigation techniques have more-or-less solved it today. The most visible remnant is the problem with fonts: if an application installs some unusual fonts then they can replace the standard fonts, and then your apps look awful (or in some cases text cannot be read at all). Surprisingly enough, Linux also has this problem - specifically with fonts.

This is why in a windows production server you see very few cases of multiple apps running on one server (which is why virtualization is so wonderful for them)

The server is a different thing. There the ability to run "out-of-the-box" is not as important (because you have a skilled sysadmin) and the ability to fix problems is more important (again: because it's the sysadmin's job). Linux does fine on the server.

On the desktop, though... it's a disaster. Sure, it's not hard for me to find out what libraries or symlinks this or that program needs (using objdump, strace, etc.) and add them, but if my niece tries to run, for example, Google Earth and it does not work, then the only thing she can really do is try to reinstall - which, of course, does not help.
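(The kind of diagnosis described here can be sketched with standard tools - a hedged illustration, with /bin/ls standing in for the troublesome program:)

```shell
# Ask the dynamic loader which shared libraries the binary needs and
# whether each one can be resolved on this system.
ldd /bin/ls

# Show only the unresolved dependencies; ldd reports missing libraries
# as "not found", so no output here means everything resolves.
ldd /bin/ls | grep "not found"
```

A missing library then has to be supplied by installing the right package or symlinking a compatible version by hand - exactly the kind of step a non-technical user cannot be expected to perform.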

Not a good choice...

Posted Jan 24, 2012 10:52 UTC (Tue) by paulj (subscriber, #341) [Link]

Except you can't change themes in GNOME anymore. At least, not easily. It's been stripped out of the default UI. There's some advanced tweak tool, but there's no easy way to preview a theme. The UI shell themes are even harder to change - it requires restarting the shell to test a new theme.

(And don't get me started on all the other stuff that's been removed :( ).

GNOME, KDE etc. ABI breakage

Posted Jan 24, 2012 0:11 UTC (Tue) by dskoll (subscriber, #1630) [Link]

The whole story makes me kinda sad: Linux desktop developers (GNOME, KDE and others) do enormous work to make desktops pretty - and the goal here is, apparently, to attract users. Yet they do everything they can to make absolutely sure there will be no hundreds of thousands of pretty apps for said desktop - which makes the rest of their work pretty pointless. Why? They are intelligent, absolutely not crazy people, so... why? I just cannot understand.

I can't speak for those developers, but I'll take a guess: They only care about source-code compatibility (at most). With free software, the developers feel they can break the ABI without a second thought because you "just" recompile. And you can break the API with only a passing thought because developers are supposed to keep up.

I do agree that this is a large barrier to proprietary Linux desktop apps. But I disagree that it makes the Linux desktop technically inferior to the Windows desktop. It hurts the marketing and it annoys proprietary developers, but those are political issues, not really technical ones.

GNOME, KDE etc. ABI breakage

Posted Jan 24, 2012 0:28 UTC (Tue) by dlang (subscriber, #313) [Link]

the idea that binary compatibility can break because you can just recompile the app has some defense (not a great one, but some)

however the idea that all developers are supposed to 'keep up' with API changes is just silly (to put it politely) and needs to be re-thought.

even if you only want free software to be able to run on your desktop, breaking the API and expecting every application developer in the world to change their software to still run and keep doing the things it did before is deciding that you are not building a real system, just a toy system.

requiring application updates to take advantage of new features is expected; requiring them just to keep working is not.

GNOME, KDE etc. ABI breakage

Posted Jan 25, 2012 11:21 UTC (Wed) by sorpigal (subscriber, #36106) [Link]

Indeed; source-level compatibility is sufficient for most purposes, but a stable API is essential.

Not even a recompile can fix things when the API changes radically and this is not just a problem for commercial apps. The fact that I can't run my favorite open source CD burning front end, unmaintained for 8 years now, and that I couldn't run it almost as soon as it stopped being maintained, is a problem for me. I have many such examples.

GNOME, KDE etc. ABI breakage

Posted Jan 26, 2012 0:57 UTC (Thu) by HelloWorld (guest, #56129) [Link]

What stops you from putting the required libraries in the program directory (and set LD_LIBRARY_PATH accordingly) and be done with it? After all, if the program (xcdroast?) hasn't been maintained for 8 years, who cares if the libraries it uses are?

GNOME, KDE etc. ABI breakage

Posted Jan 31, 2012 13:12 UTC (Tue) by sorpigal (subscriber, #36106) [Link]

It was cdbakeoven, and as near as I can tell it was a kernel change that finally broke it beyond my willingness to repair. I gave up on the idea of leaving a goodly chunk of KDE2 in a separate directory tree just for one program.

Incidentally, have you tried using xcdroast lately? It's an adventure.

This makes Linux crap on desktop

Posted Jan 24, 2012 0:46 UTC (Tue) by khim (subscriber, #9252) [Link]

I do agree that this is a large barrier to proprietary Linux desktop apps. But I disagree that it makes the Linux desktop technically inferior to the Windows desktop.

I don't want to talk about terms: it makes Linux desktop crap. As drag noted:

The 100% specific purpose of an operating system is to:

1. Make it easy to write applications.
2. Make it easy to run applications.

If my desktop does not make it easy to run the application then it's crap - by definition. Whether it's technical inferiority or some other kind of inferiority is unimportant.

This makes Linux crap on desktop

Posted Jan 24, 2012 3:33 UTC (Tue) by viro (subscriber, #7872) [Link]

khim, would you please stop morphing the subject line? This newly acquired habit of yours is really annoying...

Troll

Posted Jan 24, 2012 14:46 UTC (Tue) by dskoll (subscriber, #1630) [Link]

You are trolling. There is no substantial difference between the "Linux Desktop" and the "Windows Desktop". Windows, Icons, Menus, Pointers... they're all basically the same. OpenOffice vs. MS Office: They're both horrible in their own ways, but neither is really any worse than the other. Firefox, Thunderbird, productivity apps: About the same on Linux and Windows.

You may be right that the Linux desktop environment is hostile to proprietary desktop application deployment. That may be a problem for Linux's desktop adoption, but it doesn't make it "crap".

Look, I run a business where everyone (including non-technical staff) uses Linux on the desktop. We're just as productive as a Windows shop and no-one has any problems getting work done. The apps are different, but not worse.

If you want to complain bitterly that Linux on the desktop is "crap", I suggest you do it on a forum with MS fanbois rather than one with thinking people.

Troll

Posted Jan 24, 2012 17:43 UTC (Tue) by nye (guest, #51576) [Link]

>If you want to complain bitterly that Linux on the desktop is "crap", I suggest you do it on a forum with MS fanbois rather than one with thinking people.

Characterising anyone who disagrees with you as 'trolling' and 'non-thinking' seems a little excessive, especially when that category includes, by most estimates, around 99% of all desktop computer users.

Part of the reason LWN has such an exceptionally low S:N ratio, even by the standard of online discussion forums, is the reality distortion field that makes sane discussion almost entirely impossible.

Troll

Posted Jan 25, 2012 14:21 UTC (Wed) by dskoll (subscriber, #1630) [Link]

Characterising anyone who disagrees with you as 'trolling' and 'non-thinking' seems a little excessive, especially when that category includes, by most estimates, around 99% of all desktop computer users.

He was trolling.

And 99% of desktop users don't think that Linux is "crap". Probably 95% of desktop users have never used Linux on the desktop so they don't even have an opinion.

LWN has such an exceptionally low S:N ratio...

I think LWN has an excellent S:N ratio. But at the same time, occasionally it's necessary to call a troll a troll.

Not a good choice...

Posted Jan 24, 2012 0:57 UTC (Tue) by Trelane (subscriber, #56877) [Link]

> Even geeks want to see pretty pixels - that's why they migrate to MacOS in droves!

I was under the distinct impression that the migration was because it was a Unix system with commercial app support and real hardware vendor support, as opposed to the headaches inherent in buying a system Designed for Windows and trying to get Linux working on it.

Not a good choice...

Posted Jan 24, 2012 4:29 UTC (Tue) by raven667 (subscriber, #5198) [Link]

I'll chime in. I've been using Mac OS X for about 5 years or so, since 10.4, and the reasons you state are exactly the reasons it replaced my Linux desktop, which I had been using for about 10 years. I had a system which had working, solid video, wireless and power management out of the box, had a reasonable terminal emulator, ssh and X11. What more do you need? Actually, I'm going to say my PowerBook G4 is the best computer I've ever owned; it has every conceivable feature. I still have it, and I'd probably use it more if it could decode 720p h.264 video, but it's just too slow.

Not a good choice...

Posted Jan 24, 2012 10:48 UTC (Tue) by paulj (subscriber, #341) [Link]

The whole story makes me kinda sad: Linux desktop developers (GNOME, KDE and others) do enormous work to make desktops pretty - and the goal here is, apparently, to attract users. Yet they do everything they can to make absolutely sure there will be no hundreds of thousands of pretty apps for said desktop - which makes the rest of their work pretty pointless. Why? They are intelligent, absolutely not crazy people, so... why? I just cannot understand.

Because they're building prototypes for their own enjoyment? Because all the main Linux vendors seem to be focussed on the 2 main money-pots for Linux: enterprise server support & embedded system development support? You can't really blame the remaining few of those who're working, often in their own time, on the Linux desktop, for not being able to get all the boring QA & long-term support work done that's needed for a really mass-market Linux.

I blame the vendors more, for having given up on making a business out of desktop Linux/Unix.

Not a good choice...

Posted Jan 24, 2012 14:46 UTC (Tue) by rahulsundaram (subscriber, #21946) [Link]

Nonsense. Look at where the majority of commits for GNOME including GNOME Shell is coming from.

Not a good choice...

Posted Jan 24, 2012 15:23 UTC (Tue) by paulj (subscriber, #341) [Link]

If it's not nonsense, where can I buy a support contract for GNOME-shell, etc.? OK, right now RHEL has a relatively modern desktop, as it's only a year old, but RHEL on average will leave you with a desktop that's at least a year out of date. RHEL5 on release shipped with a nearly 1-year-old GNOME desktop, RHEL6 with a 1-year-old desktop. I really would like to pay Red Hat, but the last RHEL release cycle took 4 years. I'm not really prepared to risk spending 2 years on a 3+ year old Linux desktop. Red Hat's public statements also say they have little interest in anything beyond an ultra-stable corporate desktop:

http://www.redhat.com/about/news/blog/whats-going-on-with...

I'd like a QAed desktop, yes. But I don't quite need it QAed for >1 year, nor do I wish to have my desktop release schedule depend on the ultra-conservative corporate server product. I'd like something in between Fedora and RHEL Workstation basically. Maybe it's just me, however it's extremely rare to find people running RHEL or CentOS for their desktops or laptops (outside of managed corporates). So probably I'm not alone in finding RHEL too conservative for desktop use.

I do appreciate that RedHat invest resources into the desktop, but I stick to my claim: Not being done for mass-market Linux desktop. Their own statements seem to concur.

Not a good choice...

Posted Jan 24, 2012 15:35 UTC (Tue) by rahulsundaram (subscriber, #21946) [Link]

"Because they're building prototypes for their own enjoyment."

That's the nonsensical part (not the part about the consumer desktop). You have no way to defend that. Having said that, the latest bits of any FOSS technology, be it the latest Apache release or the latest GNOME release, take time to get into an enterprise release. You can't have the latest and have it supported for a decade at the same time. So bringing it up as an example of poor support is a non-starter. If you want support for GNOME Shell, wait till RHEL 7.

Not a good choice...

Posted Jan 24, 2012 16:06 UTC (Tue) by paulj (subscriber, #341) [Link]

So scratch the "for enjoyment" part then (the motivation matters little, relative to the end result). The widely used Linux desktop stack remains a prototype - for a product next-to-0 normal users have any interest in using as a desktop, for a vendor that has stated explicitly that it has no commercial interest in normal users.

Other than "for enjoyment", my claim doesn't quite seem to be nonsense.

Not a good choice...

Posted Jan 24, 2012 16:11 UTC (Tue) by paulj (subscriber, #341) [Link]

Oh, I'd dearly love to be proven wrong. Ubuntu comes close to doing that, but they employ far far fewer of the people who provide my desktop than RedHat do.

When I'm not a student again and can afford it, I will probably have to just buy both Ubuntu & RedHat support contracts, and just try to shoe-horn my problems into their product support structures somehow, I guess.

Not a good choice...

Posted Jan 24, 2012 16:29 UTC (Tue) by rahulsundaram (subscriber, #21946) [Link]

Motivation matters a lot actually and is my central disagreement with you. Disregarding motivation is pretty insulting to people who spend a lot of time on this.

Not a good choice...

Posted Jan 24, 2012 16:49 UTC (Tue) by paulj (subscriber, #341) [Link]

If you look at my original comment, I was offering a list of reasons (well 2), with question marks after them, in response to khim's question. One option was "prototypes for their enjoyment" another was "vendors focussed on enterprise". This wasn't meant to be an exhaustive list, but I apologise if your motivation wasn't on there. Still, I don't feel the "for their enjoyment" option is insulting to Linux/Unix desktop developers (though, I'm not a desktop developer myself). Perhaps my understanding is lacking, and/or perhaps you're being a little over-sensitive.

Linux itself is quite compatible

Posted Jan 23, 2012 23:12 UTC (Mon) by khim (subscriber, #9252) [Link]

For any non-trivial application I practically guarantee you that you will either (1) not start due to missing libs

If you try to run pre-GNOME/KDE apps they will most likely work. I know people who still use libc5-based applications (mathematical and statistical packages, etc).

But if you want to use desktopy apps... you are out of luck (as I wrote before). GNOME and KDE certainly made the Linux desktop prettier, yet they started an upgrade treadmill which is far more exhausting than anything found in the proprietary world: not only do you need to upgrade your OS to use newer versions of software, you then need to upgrade most of your applications, too! Sure, it works somehow if you only concentrate on the free software world (applications die from time to time, but important ones are ported to the latest and greatest version of the underlying architecture), but this approach practically guarantees that ISVs will ignore you - and then real users will ignore you as well.

Linux itself is quite compatible

Posted Jan 23, 2012 23:35 UTC (Mon) by dlang (subscriber, #313) [Link]

> If you'll try to run pre-GNOME/KDE apps they will most likely work.

change this to non-GNOME/KDE instead of pre- and you are correct (not everyone who writes a new application builds it for GNOME/KDE)

Linux desktop market share

Posted Jan 23, 2012 23:55 UTC (Mon) by HelloWorld (guest, #56129) [Link]

> Try finding a *binary* of an X11 game that ran on Linux in the 90s and see how well it works.
I played the original Unreal Tournament (released in 1999) something like two hours ago. It works just fine; it even uses PulseAudio (via padsp).
Oh, by the way, I tried to get Unreal II working on Windows 7 a few months ago, but I couldn't. I also couldn't get Command & Conquer 1 (SVGA version) to work on Windows XP; the installer simply crashed.

Linux desktop market share

Posted Jan 24, 2012 4:34 UTC (Tue) by raven667 (subscriber, #5198) [Link]

You might try dosbox; it's necessary on Windows to run old DOS software, since the built-in DOS support blows goats. Also, gog.com. I too have tried running my old DOS and Windows games on a modern Windows: the DOS games are mostly handled by dosbox, but old Windows 9x games just don't work on NT very reliably. Win9x and WinNT may look similar, but they are very different, and software is not very portable between them. Anyone remember iBCS?

Linux desktop market share

Posted Jan 24, 2012 4:38 UTC (Tue) by HelloWorld (guest, #56129) [Link]

The SVGA Version of Command & Conquer is a Windows 95 application, dosbox won't help (plus I don't care about that game any more anyway.)

Linux desktop market share

Posted Jan 24, 2012 5:08 UTC (Tue) by raven667 (subscriber, #5198) [Link]

I played Red Alert back in the day and found it difficult to get running on modern systems as well. Thankfully Starcraft is better designed and built and more modern in every respect.

Linux desktop market share

Posted Jan 25, 2012 18:05 UTC (Wed) by sorpigal (subscriber, #36106) [Link]

I run old Windows games via wine with mostly great success. Oftentimes the experience is far better than that of friends who try them under Windows 7.

Linux desktop market share

Posted Jan 25, 2012 10:44 UTC (Wed) by cortana (subscriber, #24596) [Link]

Two games I had terrible problems running on Linux just a couple of years after they released were Tribes 2 and Civilization: Call to Power. IIRC Tribes had broken audio, and CTP would segfault on the menu screen while doing something involving the network (or maybe it would just lock up... ISTR having to SSH into the system to kill it; at least on Windows you can always press ctrl+alt+delete to get to task manager!)

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 9:37 UTC (Sun) by Pawlerson (guest, #74136) [Link]

It depends on the user's needs. If someone doesn't play games and doesn't use specialized software, then Linux is enough for them. I have Kubuntu next to XP, which I use 'just' for playing games. However, I installed wine yesterday and realized most of my favorite games work out of the box. Linux + wine probably gives you a much better gaming experience than OS X - wine seems to work better on Linux, thus it's better as a gaming platform. :)

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 9:17 UTC (Sun) by AndreE (guest, #60148) [Link]

Saying that it is "crap" is certainly unnecessary exaggeration. Still, there are a lot of things that work quite nicely in Windows and OSX that haven't been properly addressed in Linux.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 11:34 UTC (Sun) by epa (subscriber, #39769) [Link]

I use and like Linux on all my desktops
That's what the disagreement is about, I think. You like it - but you are a technically literate person. You are not providing IT for two hundred non-technical users who are less able to work around problems they may encounter. I like Emacs and LaTeX, but I would never say that they are any good as a way for most people to produce documents because that ignores the reality that 99% of people have different needs from the techie 1%.

Looking after two PCs for my parents doesn't give as much experience as managing a Linux desktop rollout in a whole organization, but it is a taster of the same difficulties. That said, Windows too has its problems, but people are more used to them.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 15:04 UTC (Sun) by dskoll (subscriber, #1630) [Link]

That's what the disagreement is about, I think. You like it - but you are a technically literate person. You are not providing IT for two hundred non-technical users who are less able to work around problems they may encounter

I don't provide IT support for two hundred non-technical users, but I do for around 10. Those 10 are the non-technical people at my company plus my family. In my experience, Linux on the desktop is no worse than Windows when it comes to a non-technical user's experience. The difficulties with Linux are different from the difficulties with Windows, but not any more problematic.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 5:29 UTC (Sun) by cmccabe (guest, #60281) [Link]

There are a lot of good open source desktop applications out there-- Gimp, Pidgin, LibreOffice, and so forth.

I don't know what an SDET or an XBL is, but I do know good quality assurance engineers who enjoy their jobs. Maybe it's easier when you're working for a company you don't hate.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 22:28 UTC (Sun) by elanthis (guest, #6227) [Link]

I've yet to meet anyone who actually liked the GiMP in any way when compared to its proprietary rivals. Likewise with OpenOffice. These programs got popular for being lowercase free, not because they're at all quality software compared to the incumbent proprietary offerings.

This is in stark contrast to the server offerings on Linux, where aside from directory services and management the Linux/FOSS offerings are leading edge in terms of quality, features, and performance.

An SDET is a QA engineer; Software Development Engineer in Testing (compared to a regular dev at MS, called an SDE). The folks who write tests, test harnesses, etc. A lot of folks in the Windows OS SDET groups are actually Linux nerds, surprisingly enough (surprised me, at least), and Cygwin and various Linux/UNIX tools play a pretty big role in their work. You're not likely to hear that advertised much, naturally. :)

XBL is the Internet services division for XBox (XBox Live) and Games for Windows Live. It's a tiny division compared to the Windows Desktop division. And, btw, not recognizing what XBL is puts you at odds with about 60% of normal consumers, which is just another tiny bit of evidence that Linux folks are completely out of touch with what normal non-nerd computer users care about or do with their PCs and electronics. Meeting computer geeks who aren't also gamers is so "weird" feeling. It's like meeting cinema buffs who don't know what Netflix is. :)

Also, hating Microsoft has nothing to do with hating QA. I've met very, very few Microsoft employees who hate the company in any way; the few I have met were mostly just unhappy with how various dev divisions at Microsoft are still using the waterfall dev model (mostly the business software groups; one might note that SCRUM was invented at Microsoft in the Visual Studio division). Microsoft treats its people very well, there's a lot of really cool people who work there, lots of great parties and social events... honestly most people outside the FOSS camp love Microsoft's products and services and think they're a great company. Microsoft hate is an Apple/Linux user phenomenon, not a general trend in the rest of the industry.

Yeah, Microsoft's marketing and business groups can be pretty slimy, but the engineering groups are quite disconnected from those. FOSS proponents tend to forget that in a company with over 100,000 employees, the public faces and the internal cogs can be and often are quite at odds with each other. Kind of like how some groups at Sony lobby against DRM and other groups lobby for it: big companies can be quite schizophrenic. :)

Most people hate test engineering because it's boring and immensely less interesting than developing new technology. Hence the problems in Linux-land, where most of the volunteer workers are busy rewriting half the desktop stack every few years instead of polishing and fixing bugs and such. I don't blame the developers for doing that - I would never ever accept a QA engineer job because I know damn well I'd hate it with a passion and hence would do a very lazy half-assed job at it, and I much prefer doing fun new projects to maintaining boring old projects. I can blame Red Hat, Canonical, and so on for not paying for lots of QA engineers and QA testers, though, since they're trying to get money out of us in exchange for horrible desktop products. (Well, not Red Hat anymore, but I did used to buy their offerings back when they were still pretending to be a desktop OS company.)

The really sad bit is how all the Linux desktop folks are echoing the "PC is dead" nonsense (PC sales are up, not dying off - mobile is complementary to the desktop, not a replacement for it) and trying to chase markets that Linux cannot succeed in. Linux-the-OS-family is NOT a significant force in the mobile market, although Linux-the-kernel is seeing good success thanks to Android. Meego, Tizen, whatever are all flops. Likewise on tablets, the release of Windows 8 is going to end things for Linux there quite quickly, especially if Intel can realize its dream of tablets on x86. Why use Linux (Android or otherwise) when Windows tends to get better battery life, has less bugs, more features, more software, and isn't fractured between 20 different incompatible offerings? One might also note that iOS is still dominating the developer market in the mobile space, due to a combination of better APIs, better dev tools, and a user base that actually buys software.

The Linux API story is pretty bad, btw. You have to rely on a bazillion different libraries written by different developers with different naming and interface conventions, often needing to rewrite glue code for each of them to tie into the application's I/O, memory, or debugging systems. That's a huge part of why unified platforms like Qt or .NET are so popular; there's no need to learn a ton of different API paradigms just to make a simple MP3 player. Tying that back into this thread, that's one of the great things about systemd - the unified monolithic platform approach is just so much easier to build off of than the scrambled, disconnected grab-bag UNIX Philosophy approach the traditional init/service/logging/session systems were all cobbled together with. OS X and iOS have a very solid unified API, and Windows is making decent progress at modernizing and unifying its APIs (somebody still needs to take the core C Win32 API out back and shoot it, though; ugh, is that thing horrific).

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 0:11 UTC (Mon) by jimparis (subscriber, #38647) [Link]

> I've yet to meet anyone who actually liked the GiMP in any way when compared to its proprietary rivals.

If you sat me in front of a Windows machine with two icons, GIMP and Photoshop, and asked me to do some image editing, I'd click GIMP.

> Likewise with OpenOffice. These programs got popular for being lowercase free, not because they're at all quality software compared to the incumbent proprietary offerings.

You could just as well say that the incumbent proprietary offerings got popular for being first and most familiar. You're not offering much proof of anything you say here -- it's just assertions and anecdotes.

> And, btw, not recognizing what XBL is puts you at odds with about 60% of normal consumers, which is just another tiny bit of evidence that Linux folks are completely out of touch with what normal non-nerd computer users care about or do with their PCs and electronics.

I could believe your claim that 60% of consumers would recognize "XBox Live", but chastising someone for not identifying "XBL" in a conversation that had nothing to do with gaming is a bit of a stretch. Even the Wikipedia page for "Xbox Live" doesn't mention the acronym XBL -- contrary to the Playstation Network, which _is_ commonly referred to as PSN in the general media.

> Meeting computer geeks who aren't also gamers is so "weird" feeling. It's like meeting cinema buffs who don't know what Netflix is. :)

This was more like meeting a cinema buff who didn't know what you meant when you said "Sony's SXRD beats DLP, but NFLX doesn't show it."

> The really sad bit is how all the Linux desktop folks are echoing the "PC is dead" nonsense ( PC sales are up, not dying off - mobile is conplimentary to the desktop, not a replacement pf it)

Just google for "pc sales figures" or "post-pc" and you'll see people from Microsoft, Apple, IBM, and everywhere else saying the same thing, with numbers. It's not "Linux desktop folks" making things up.

> One might also note that iOS is still dominating the developer market in the mobile space due to a combination of better APIs, better dev tools, and a user base that actually buys software.

Linux users are perfectly willing to buy software if it's any good.
Look at the recent Humble Bundle statistics: users pay what they want for the same software, and Linux users paid more than twice as much as Windows users on average.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 9:59 UTC (Mon) by dgm (subscriber, #49227) [Link]

> Linux users are perfectly willing to buy software if it's any good.

Beware of over-generalization. The Linux user base is big (in the millions) despite what many would like to think. As such, it's a very diverse group of people. Obviously, Mac and Windows users are still bigger groups, but I believe that diversity is more or less the same.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 7:55 UTC (Mon) by niner (subscriber, #26151) [Link]

> I've yet to meet anyone who actually liked the GiMP in any way when compared to its proprietary rivals. Likewise with OpenOffice. These programs got popular for being lowercase free, not because they're at all quality software compared to the incumbent proprietary offerings.

Maybe we'll meet someday then you can at least claim one. I started out doing graphics stuff with GIMP and every time I have to use Photoshop I find it quite difficult to use and even limited.

Even though I started out using Microsoft Office, I switched to OpenOffice 10 years ago because it was much more stable and intuitive. For example, the page formatting is in the Format menu where all other formats are, not in File which makes just no sense at all.

> one might note that SCRUM was invented at Microsoft in the Visual Studio division

About as true as the claim that Microsoft invented the internet. Scrum is older than Visual Studio, and its inventors have nothing to do with Microsoft at all.

> honestly most people outside the FOSS camp love Microsofts products and services and think they're a great company. Microsoft hate is an Apple/Linux user phenomenon, not a general trend in the rest of the industry.

Most Microsoft/Windows haters I know are actually Microsoft users that never even tried Linux.

> FOSS proponents tend to forget that in a company with over 100,000 employees...

According to Wikipedia, Microsoft had 92,000 employees in 2011.

> Most people hate test engineering because it's boring and immensely less interesting than developing new technology. Hence the problems in Linux- land where most the volunteer workers are busy rewriting half the desktop stack every few years instead of polishing and fixing bugs and such.

And still, for example, the Perl community switched pretty much wholesale to test-driven development more than half a decade ago. There are now millions of test cases for all the modules on CPAN.

> Likewise on tablets, the release of Windows 8 is going to end things for Linux there quite quickly, especially if Intel can realize its dream of tablets on x86. Why use Linux (Android or otherwise) when Windows tends to get better battery life, has less bugs, more features, more software, and isn't fractured between 20 different incompatible offerings?

Wow, you're now predicting the future? Right now Windows tablets have exactly 0 market share, because they don't even exist. Yet you can already say that they will have less bugs, more features and more software? In contrast to the desktop, Windows on the tablet actually starts out at zero, with no software available at all. So it actually has to compete in a market on equal ground with already-established players - a situation in which Microsoft has lost just about every time it has tried (remember the Zune?).

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 17:11 UTC (Mon) by rgmoore (✭ supporter ✭, #75) [Link]

Even though I started out using Microsoft Office, I switched to OpenOffice 10 years ago because it was much more stable and intuitive.

It has some other nice, user-friendly features, too. For example, Microsoft tries to keep everyone on the upgrade treadmill by defaulting to their latest, backwards incompatible file format and popping up warning messages if you try to save in some non-MS Office format. OO.o assumes that if you opened a text file, you want to save a text file and don't need to be warned that it's going to stay as text. This is very helpful if you're editing, say, .csv files that other programs use to share information.

I think that's a general trend. Free Software usually assumes that its users are experienced and know what they're doing, while shrinkwrapped proprietary software is more likely to treat its users as if they're total novices. There's obviously not a perfect correlation- some proprietary software is really designed for experts and has a very steep learning curve, and occasional free software packages are designed for novices- but it seems like a common difference. The problem is that this makes it harder for complete novices to get started in Free Software- they may want and need the hand holding- while the minor differences between packages interfere with proprietary software users switching to Free Software.

Poettering: systemd for Administrators, Part XII

Posted Jan 24, 2012 12:51 UTC (Tue) by nye (guest, #51576) [Link]

>It has some other nice, user-friendly features, too. For example, Microsoft tries to keep everyone on the upgrade treadmill by defaulting to their latest, backwards incompatible file format and popping up warning messages if you try to save in some non-MS Office format.

Your analogy is wrong. I'm sure it wasn't your intention, but what you've written is pure FUD.

In fact, both MS Office and LO work in *exactly the same way* - by default both save in their own format, and pop up a warning if you want to save in another format. In both cases it's possible to change the default format to whatever you want.

Poettering: systemd for Administrators, Part XII

Posted Jan 24, 2012 19:19 UTC (Tue) by rgmoore (✭ supporter ✭, #75) [Link]

You're right; I didn't get the behavior exactly right, but there is still an important difference. Both warn you if you're trying to save in a non-native format (unless you've changed the default and are saving in that new default). But LibreOffice has a "warn me about using non-default format" checkbox in the warning dialog, so you can disable it if you think you know what you're doing. As far as I can tell - and this is after a fair amount of looking, because it annoys me so much - there's no way of turning off the warnings in MS Office.

Poettering: systemd for Administrators, Part XII

Posted Jan 24, 2012 11:16 UTC (Tue) by elanthis (guest, #6227) [Link]

> For example, the page formatting is in the Format menu where all other formats are, not in File which makes just no sense at all.

Somewhat amusing, given that just this morning I helped someone else who was utterly confused by formatting-related dialogs in OOo.

OOo today is very similar to that OOo 10 years ago, which is very similar to StarOffice, which is very similar to Office 95. One might have noticed that Microsoft is still actively improving the UI to Office these 17 years hence. Just saying.

> About as true as that Microsoft invented the internet. Scrum is older than Visual Studio and the inventors have nothing to do with Microsoft at all.

My apologies, I was confusing Scrum with the McCarthy Core Protocols. They blur together a bit once you learn both.

> Acording to Wikipedia, Microsoft had 92,000 employees in 2011.

I believe that figure is wrong. Or possibly the number I was told includes contractors and other "non-employees" that nonetheless work full-time for MS.

> And still, the Perl community, for example, switched almost entirely to test-driven development more than half a decade ago. There are now millions of test cases for all the modules on the CPAN.

I'm impressed if CPAN does have tests for every last module. I'm not (frequently) a Perl user so I was unaware of that. Thank you for pointing it out.

You're not implying that CPAN's test coverage is the norm in software (FOSS or otherwise), rather than an (awesome) exception to the rule, are you?

> Yet you can already say that they will have fewer bugs, more features and more software?

Comparing the Win8 builds to the latest Ubuntu? Yes. Yes I can.

Poettering: systemd for Administrators, Part XII

Posted Jan 24, 2012 16:53 UTC (Tue) by rgmoore (✭ supporter ✭, #75) [Link]

One might have noticed that Microsoft is still actively improving the UI to Office these 17 years hence.
s/improving/changing/ and you might be correct. There's a strong argument for "if it ain't broke, don't fix it", so there's a serious question about whether all of Microsoft's UI development has been an improvement or just change for change's sake. I'm not claiming the UI should be static - at the very least there need to be changes to incorporate new features, and a change in the scale of the program may require a complete UI revamp - but I'm skeptical that every change is an improvement.

Poettering: systemd for Administrators, Part XII

Posted Jan 26, 2012 18:13 UTC (Thu) by wookey (subscriber, #5501) [Link]

The couple of users I know who have used Office for a long time certainly bitch and moan about 'new Office' and how much they hate the changes. I've not heard the same complaints about OpenOffice.

Users _do_ complain about formatting differences and doc format incompatibilities when switching back and forth or dealing with others. But there is nothing _wrong_ with what OO does; it's just the network effect of many more Office users and (so far as I know) current Office still not supporting ODF out of the box.

Continuing the general anecdotes of 'normal users' switching to Windows: I know two who used to be Ubuntu users but now have Windows laptops. That's mostly because their new machines came with Windows and they didn't have huge incentives to change it. Most of that is that it's just easier to be doing the same as the majority (doc formats, general knowledge of how to fix things).

I helped one get access to a mercurial repo a few days ago and she immediately recognised that this would be _so_ much easier on Ubuntu, because she could just do apt-get install mercurial; hg clone <repo>.
On Windows it was a faff. She kept saying 'Windows is fine so long as you aren't trying to do anything more than browse and write docs'.

So users do understand, and it's not really about proprietary apps. It's largely inertia in terms of what comes with the machines, combined with network effects (at least in these cases).

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 14:44 UTC (Mon) by pyellman (guest, #4997) [Link]

>I've yet to meet anyone who actually liked the GiMP in any way when compared to its proprietary rivals.

My now 14-year-old daughter learned to use the Gimp alongside various commercial alternatives (Photoshop, Corel), and, at this point, immediately downloads and installs the Gimp on any machine she uses, despite having access to the aforementioned commercial alternatives. She is quite competent with all the products mentioned. This was with absolutely no pressure from me, other than introducing her to the Gimp (nearly 6 years ago) and a one-time discussion of its benefits, which include always having access to that software.

If you are ever in the Charlottesville, Virginia, area, please stop by and I will introduce you.

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 22:32 UTC (Mon) by cmccabe (guest, #60281) [Link]

Hi elanthis,

It seems like a lot of what you posted is off-topic, as well as factually incorrect. I'm glad that you're a gamer, own an XBOX, and so forth, but it's hardly relevant to this article.

Poettering: systemd for Administrators, Part XII

Posted Jan 22, 2012 9:10 UTC (Sun) by AndreE (guest, #60148) [Link]

You have decent points, but it would be easier to take these comments seriously if they weren't full of weasel words and pejorative castings like "herp-derp"

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 11:01 UTC (Mon) by cortana (subscriber, #24596) [Link]

Reddit is leaking... :)

Poettering: systemd for Administrators, Part XII

Posted Jan 24, 2012 11:20 UTC (Tue) by elanthis (guest, #6227) [Link]

Hey, at least I cut down on the swearing. :)

Neat

Posted Jan 21, 2012 4:49 UTC (Sat) by dskoll (subscriber, #1630) [Link]

I've never used systemd (I run Debian Squeeze which does not yet ship it). I'm also leery of radical changes to SysVinit, but I have to say these features look really cool and interesting. They could improve Linux systems' security quite a bit. In spite of bad ideas like "the journal", Poettering also comes up with some very good ideas.

Neat

Posted Jan 21, 2012 11:24 UTC (Sat) by zdzichu (subscriber, #17118) [Link]

Well, credit where it's due. All of the described things are standard Linux features that have been available for years. What systemd provides is a simple, coherent and easy-to-use interface for configuring them, which is itself as important as the features themselves.
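To make that concrete, here is a sketch of a hardened unit file using the kind of directives the article describes (the service name and paths are invented for illustration); each line switches on a kernel facility that would otherwise take pages of custom C to set up:

```ini
# /etc/systemd/system/example.service -- hypothetical service, for illustration
[Unit]
Description=Example hardened daemon

[Service]
ExecStart=/usr/bin/example-daemon
# Give the service a private /tmp, invisible to other processes
PrivateTmp=yes
# Cut the service off from the network entirely
PrivateNetwork=yes
# Drop every capability except binding to low ports
CapabilityBoundingSet=CAP_NET_BIND_SERVICE
# Mount these read-only for this service, and hide /home from it
ReadOnlyDirectories=/etc /usr
InaccessibleDirectories=/home
```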

Neat

Posted Jan 21, 2012 13:43 UTC (Sat) by martin.langhoff (subscriber, #61417) [Link]

Well, it takes some courage to use all the interesting features in Linux in depth and as a central part of your OS component. It means you're not portable to any POSIXy kernel, and the flamers will be out to get you :-)

Neat

Posted Jan 23, 2012 19:52 UTC (Mon) by wahern (subscriber, #37304) [Link]

Indeed. Enjoy your local root Linux exploit this morning. When you're tied to a single POSIXy system, you'll have a monoculture just like an all Windows shop. Several tons of features and a dash of automation isn't going to get rid of these all too common exploits.

Systemd is pretty neat from a single _user's_ perspective. But ignore portability at your peril if you're a developer or you maintain a significant number of services. That's not a flame, just common sense. It doesn't detract from systemd one iota.

Now if Red Hat could get some useful process management interfaces integrated into POSIX and (or) adopted by other OSs, then that'd be worthy of some serious accolades.

Neat

Posted Jan 23, 2012 21:31 UTC (Mon) by HelloWorld (guest, #56129) [Link]

> Now if Red Hat could get some useful process management interfaces integrated into POSIX and (or) adopted by other OSs, then that'd be worthy of some serious accolades.
The interfaces that systemd offers are well-documented and can easily be implemented on pretty much any UNIX system. That's all that can be reasonably expected from Red Hat. If other vendors choose not to implement these interfaces, it's definitely not Red Hat's job to make them.

Neat

Posted Jan 24, 2012 4:14 UTC (Tue) by raven667 (subscriber, #5198) [Link]

If someone wanted to go down that road, which I think is probably a good idea, they can treat the project like OpenSSH. OpenSSH is very much developed for OpenBSD, using OpenBSD only features just like systemd. What most people run is Portable OpenSSH which is a separate patch set against OpenBSD OpenSSH that makes it portable to other operating systems. OpenSSH is exactly the model that systemd could use if a vendor wanted to ship that instead of sysvinit or whatever they are shipping.

I'm not sure that will actually happen, though; Mac OS X already has the similar launchd, and the rest of the *BSD and UNIX variants seem pretty set in their ways.

Neat

Posted Jan 26, 2012 2:56 UTC (Thu) by alankila (guest, #47141) [Link]

It's technically easier to deliver a working, sophisticated solution in a monoculture. Here's why. On the monoculture's side, you have two pieces of software: systemd and the Linux kernel.

In the heteroculture case, you'd have systemd + a stable API which it must use + the kernel of the user's choice. The API component would be adjusted per kernel.

What happens now is that you need to maintain this intermediate kernel-specific API, which is pure overhead relative to the monoculture solution, and you can't expose any features in systemd which are not supported in all of the API+kernel combinations. (If you add kernel-specific features, you are back to a monoculture: the API layer is there, but it is actually useless because the stuff only works against one specific kernel.) So the more kernels there are, the less likely it is that your new awesome software can actually do anything new and interesting.

The point is, this thinking is one of the primary problems. There has always been too much flexibility historically, or as I put it: open source programmers are scared to delete code. The code they should delete is all this intermediary API glue.

Neat

Posted Jan 26, 2012 19:28 UTC (Thu) by wahern (subscriber, #37304) [Link]

It depends. If by easier you mean you can reuse more existing libraries and components, yes. But what matters far more is the design and complexity of the interface.

While PID file management is a PITA (something which a 128-bit getlpid()+lkill() would easily fix), all of the other stuff is trivial to do. You can daemonize, with a watchdog, by calling fork()+setsid()+fork()+chdir()+chroot()+setuid().

I recently tried to compile pkg-config the other day from scratch. It's basically a couple of .c files. Yet it has an insane circular dependency on glib; it uses glib's list implementation. Ignoring for the moment that <sys/queue.h> exists on all FOSS implementations and OS X (and is trivial to copy into your project anyhow), does it really make sense to reuse glib for a simple list implementation? glib is a monstrous piece of software. You would be effectively deleting a ton of code by rolling your own list implementation from scratch.

Now I'm not saying people should always roll their own daemon management code instead of using systemd. But consider that when you use systemd you're also using a gajillion other components, many of which reside in the Linux kernel. Do you really want to tie yourself to all that dead weight when you could get 80% of the features by using, e.g., inetd? Why not let the user decide whether to use systemd or something else? Most of the conveniences benefit the administrator, not the developer.

Neat

Posted Jan 28, 2012 4:37 UTC (Sat) by alankila (guest, #47141) [Link]

So it takes pkg-config to build glib, but pkg-config can't be built without glib? Oh well, chicken-and-egg problems are unfortunately commonplace in software development. You need a compiler to compile a compiler, after all...

In the end, I'm not moved by arguments about sheer complexity or software size. What matters is that everybody runs the same software and therefore there exists certain fixed capability that can be used to build other software. I think it's called platformization.

Some solutions seem very popular, such as dbus, and it appears to have made an impressive amount of cooperation between programs possible. I also love pulseaudio for doing the impossible: reducing the volume control problem to the adjustment of a single slider, and giving every application the capability to play audio together.

I think systemd and the /usr merge will again remove tons of useless complexity when it comes to managing services and deciding where to place the relevant files. I'm all for removing choice like this which doesn't seem to do anything else but hold the platform back. systemd appears very sophisticated to me and it's productized well, so it will make new and interesting capabilities available to all.

Neat

Posted Jan 28, 2012 17:31 UTC (Sat) by Darkmere (subscriber, #53695) [Link]

"Back in the day", pkg-config used to ship an inline copy of glib (2.x) in case it needed to be built on a system without glib; if it was built against this copy, it was linked statically into pkg-config itself.

This was removed in early 2011.

Also, it's possible to build pkg-config against an external glib without already having pkg-config, simply by setting the environment variables for linking/including against glib yourself.

Hot news: pkg-config is optional; just export GLIB_CFLAGS, GLIB_LIBS, ZLIB_CFLAGS and ZLIB_LIBS.

And if you are one of those who think that pkg-config is "unnecessary bloat" (quite a few people thought that when it was introduced back in the day: "autoconf could do that!"), then you can still just export the variables for all the libraries and be done with it ;)

Neat

Posted Jan 21, 2012 13:54 UTC (Sat) by Wol (guest, #4433) [Link]

What systemd also provides is the application of the Unix philosophy - "do one thing and do it well" - to the mess that is SysV init.

Rather than have hundreds of lines of bug-prone shell script duplicated across hundreds of scripts, it replaces them with little C modules. Fix it once, and fix it for good.

Again, typical Lennart.

Cheers,
Wol

Neat

Posted Jan 21, 2012 14:08 UTC (Sat) by stumbles (guest, #8796) [Link]

That part of systemd I like; do one thing well.

Still not really fond of writing the service files, though. I could not figure out why virtuoso.service would not start; I just happened to stumble on the fact that I was using the wrong Type=: it turns out virtuoso forks. Aside from that, yeah, I never really cared for the symlink-ism of sysvinit and always thought it too messy. But that's just me.
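For a daemon that forks like that, the relevant setting is the forking service type; a minimal sketch (the description and paths here are illustrative, not taken from any real package):

```ini
# Sketch of a unit for a forking daemon; paths are illustrative
[Unit]
Description=Virtuoso database server

[Service]
# The daemon forks and its parent exits; without Type=forking,
# systemd assumes the exiting parent *is* the service and
# considers it dead.
Type=forking
ExecStart=/usr/bin/virtuoso-start
# A PID file lets systemd identify the main process unambiguously
PIDFile=/run/virtuoso.pid
```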

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 15:56 UTC (Mon) by willnewton (subscriber, #68395) [Link]

The link is broken. Strange that no-one has noticed this. ;-)

Poettering: systemd for Administrators, Part XII

Posted Jan 23, 2012 16:17 UTC (Mon) by rahulsundaram (subscriber, #21946) [Link]

Maybe because it's not broken?


Copyright © 2012, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds