Then why bother?
Posted Jan 4, 2009 22:51 UTC (Sun) by khim (subscriber, #9252)
In reply to: LSB is a hoax - on desktop at least by drag
Parent article: Android netbook is a possibility (Inquirer)
> Well you can't expect LSB to do everything.

No? Why bother then? ISVs need a 100% solution - and LSB isn't one. Time will tell whether Android is, but LSB has failed so far.

> It's not LSB's fault that the problem is so difficult and such a mind-blastingly huge waste of time and effort compared to getting people to agree with each other... but it's something that is required anyways.

The problem is not difficult at all. You need some standard and some way to ensure that people will create products compatible with said standard. And in the real world that means money.

> Each Android application runs as its own entire 'fork()', right? So it shouldn't be very difficult to be able to integrate Android apps into a desktop situation.

You are missing the point completely. Yes, it should be possible to run Android applications under GNOME or KDE. But this is not the solution. It looks to me that you don't even understand the problem.
The problem is: usability of old software on a modern system. Basically, I take a CD with a 10-plus-year-old program (Neverhood or Office 97, WordPerfect or Chrono Cross), put it in the drive, start the program - and use it. What is the success rate? The story goes more or less like this:
1. PlayStation/Wii/etc - 95-99% success (even on newer console versions).
2. Windows - 90-95% success.
3. MacOS - 80-90% success.
4. Linux - 5-10% success.
As long as Linux does not offer such a capability, it's a non-contender for the Consumer Desktop. "But you often only need a few tweaks!" Right - and that means Linux is close to being ready for the Enthusiast Desktop or maybe even the Enterprise Desktop. But for the consumer desktop you need something usable without HOWTOs or tweaks. Android promises this. Whether it will fulfill this promise or not - time will tell. But so far it looks fine: when it was found that you could call some undocumented features via reflection - the Android developers pulled the plug. SONY/Nintendo/etc achieve their success in the same way: developers are punished quite severely if they go beyond the spec, but as a result programs (mostly games, of course) are usable for decades...

> Just like how the barriers between 'CPU' and 'GPU' and how they are handled are slowly being erased (See also Intel Larrabee and AMD/ATI's Fusion), the differences in mobile phones, internet-focused netbooks, and even general purpose computing systems are going to be eliminated.

Yes, but if the Linux community does not produce something usable in 100% plug-and-play mode, the result will not be Linux-based. So far there are very few contenders - and Android is among them while LSB is not.

As for the broader Linux community... what can you say when vital compatibility stuff stays unmerged for years? Clearly the situation is perfectly hopeless there - not even worth discussing...
Posted Jan 5, 2009 0:51 UTC (Mon) by dlang (guest, #313)
For the game consoles there is basically zero compatibility between games produced for one piece of hardware and the new models.

For Windows the compatibility was really good up through Win98, but since then Microsoft changed a bunch of stuff, so a lot of Win98-and-earlier software won't work without rewriting it. And even without that, haven't you ever heard of 'DLL hell' (where one piece of software needs a new version of a DLL and another needs an old version of the same DLL)? That doesn't require 10-year-old software to trigger it; it can be triggered by conflicts in brand-new software from different vendors.

Apple has done a good job (they had to - they've changed hardware platforms enough that anything less would have put them out of business).

For Linux most old software will just work on new versions; the only issue is that you may have to get copies of older libraries as well (and unlike Windows, *nix has provisions for having multiple versions of a library and letting the software specify which one it needs - see the sketch below).

The problem isn't that the Linux software won't work, it's that there is no easy way to set up the appropriate libraries for it to work.
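To make the soname point concrete, here is a minimal C sketch (assuming both libstdc++.so.5 and libstdc++.so.6 are installed, e.g. via a distro compat package) showing that the dynamic linker resolves each soname independently, so two major versions can coexist:

/* soname_demo.c - minimal sketch; the library names are just an
 * example of two sonames of the same library installed side by side.
 * Build: cc soname_demo.c -o soname_demo -ldl
 */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Each dlopen() names a specific soname; the dynamic linker
     * resolves each one independently, so different major versions
     * do not conflict the way same-named DLLs do. */
    void *v5 = dlopen("libstdc++.so.5", RTLD_NOW | RTLD_LOCAL);
    printf("libstdc++.so.5: %s\n", v5 ? "loaded" : dlerror());

    void *v6 = dlopen("libstdc++.so.6", RTLD_NOW | RTLD_LOCAL);
    printf("libstdc++.so.6: %s\n", v6 ? "loaded" : dlerror());

    if (v5) dlclose(v5);
    if (v6) dlclose(v6);
    return 0;
}

Ordinary binaries do the same thing implicitly: their DT_NEEDED entries name a specific soname, which is why installing the .5 package next to the .6 package satisfies both old and new programs.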
There are some exceptions (mostly in terms of interpreted languages, of the ones I know about) but not that many.

Even with OSS (which you posted the link to), the issue isn't that the software doesn't run, it's that it doesn't play nicely with other things using sound at the same time - but that software never did. So while that's not nice for a modern desktop, it's no worse than the software was 10 years ago when it was first written.

I agree that the LSB is basically meaningless, but the overall compatibility in Linux is actually pretty good.
Posted Jan 5, 2009 1:37 UTC (Mon) by Kit (guest, #55925)
> For the game consoles there is basically zero compatibility between games produced for one piece of hardware and the new models.

Not true in the slightest for any of the major current-generation systems. The Wii can play Gamecube games, the Xbox 360 (tri-core PPC) can play Xbox (x86) games, and the PS3 (Cell) can play PS2 ("Emotion Engine", vaguely MIPS-like) as well as PS1 games (MIPS).

> and unlike Windows, *nix has provisions for having multiple versions of a library and letting the software specify which one it needs

I'm not sure I've seen any support for multiple versions beyond simply changing the .so's name with every version (which you could do with .dlls on Windows).
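There is a bit more machinery available than renaming, though: ELF symbol versioning lets a single .so export several versions of the same symbol, which is how glibc keeps old binaries running across ABI changes. A minimal sketch, with a hypothetical libfoo and version script:

/* libfoo.c - minimal sketch of ELF symbol versioning (GNU toolchain).
 * One shared object exports both the old and the new ABI of foo(),
 * so binaries linked against either keep working.
 * Build: cc -shared -fPIC libfoo.c -Wl,--version-script=foo.map -o libfoo.so
 * where foo.map (hypothetical) contains:
 *   VERS_1.0 { };
 *   VERS_2.0 { } VERS_1.0;
 */
int foo_v1(void) { return 1; }   /* old ABI, kept for already-linked binaries */
int foo_v2(void) { return 2; }   /* current ABI */

/* Bind each implementation to a version node; '@@' marks the default
 * that newly linked programs will pick up. */
__asm__(".symver foo_v1, foo@VERS_1.0");
__asm__(".symver foo_v2, foo@@VERS_2.0");

Windows DLLs have no per-symbol equivalent of this, which is part of why side-by-side versioning there took a different route.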
Posted Jan 5, 2009 2:48 UTC (Mon) by dlang (guest, #313)
I ran into this yesterday on my Dad's laptop.

He installed Mulberry (a precompiled binary) and it wanted libstdc++.so.5, and his system has libstdc++.so.6.

First try: make a symlink to .6; result: additional errors.

Second try: use the Ubuntu package manager to install the .5 version (in addition to the .6 version); result: a system with both versions installed and apps depending on each one working happily.

I've dealt with this for several other libraries over the years, and in almost every case there have been no problems having multiple versions of the libraries on the system.

A year or so ago I dug out some ancient Linux binaries (WordPerfect and the first Civ game that was released for Linux, both from an early Caldera release), and was able to get them both running without much hassle.
Posted Jan 5, 2009 11:39 UTC (Mon) by tialaramex (subscriber, #21167)
There's a difference between the marketing bullet point "backward compatible" and actual backward compatibility. After all, the Linux kernel's continued support for early ELF binaries with the original system calls means in theory you can run a 13 year old Linux program on today's Fedora, but no-one pretends (as the grandparent poster does for consoles) that means 99% compatibility.
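The long-lived part of that stability is the system-call boundary: the kernel keeps old syscall numbers and semantics working even as everything above them churns. A minimal C sketch of that interface, assuming an ordinary glibc system:

/* syscall_demo.c - minimal sketch: a binary that talks to the kernel
 * through raw syscalls depends only on the stable syscall ABI, which
 * is why very old static binaries still run on modern kernels.
 * Build: cc syscall_demo.c -o syscall_demo
 */
#include <sys/syscall.h>
#include <unistd.h>

int main(void)
{
    const char msg[] = "hello from the stable syscall ABI\n";
    /* The syscall() wrapper forwards straight to the kernel entry
     * point; SYS_write resolves to the platform's write(2) number. */
    syscall(SYS_write, 1, msg, sizeof(msg) - 1);
    return 0;
}

The compatibility gap described here is everything above this line - libraries, toolkits, sound systems - which is where old dynamic binaries actually break.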
The Wii plays Gamecube games, mostly. I'd buy the idea that it could even be 99% of them (not that anyone can think of more than ten decent Gamecube games). The 360 plays /some/ Xbox titles, but notably not several best selling ones that people are most likely to own (it's all very well having a big list of titles like "Barbie Horse Adventures" and various movie tie-in generic platformers, but no-one actually owns those, they're shelf-filler). Microsoft claims that overall "more than 50%" work, but they count a game as "working" even if it's noticeably slower and has visual defects.
The PS3 is even worse - some models, those released early in Japan and North America, had expensive compatible hardware to run most of the GTA games (with "some issues" ie you'll probably still wish you were using a real PS2) but those released later (e.g. in Europe) rely purely on software emulation. Which basically isn't adequate for any of the top-selling games on the PS2. This means even if a game works on your friend's PS3, it may not work on yours. You have to compare the model of hardware you have to a compatibility list...
The console makers know that back compat is mostly a marketing bullet point, customers (particularly the most profitable ones who buy lots of new games) don't really use the back compat. So it's important to have some sort of offering, but you can't afford to spend much R&D money on it. If it falls out naturally from evolving hardware, great. Otherwise, too bad.
I think the "enterprise" Linux distributions can do a pretty good job of supporting the same binaries for 10 years or so, via compat packages, configuration of the environment (with judicious symlinks etc.) and so on. It shouldn't be a huge surprise if the same isn't true in Fedora 11 where the OS itself has a life of only 12-13 months.
Posted Jan 5, 2009 13:11 UTC (Mon) by khim (subscriber, #9252)
Sorry, but the fact that the Xbox 360 and PlayStation 3 (both with radical changes in hardware from their predecessors) support any backward compatibility means the developers spent millions and millions of dollars to achieve it. Backward compatibility is an easy way to solve the chicken-and-egg problem: nobody buys your console because there are no games for it - and game developers don't create games because there are no buyers for said games! Microsoft decided to solve the problem by other means: just give money to game developers directly - this should be incentive enough. SONY decided that "it's not so important" - and the PlayStation 3 became a pariah.

Linux developers don't have the money to solve the problem "the Microsoft way", so the fact that they ignore the compatibility problem and talk about "Linux on the consumer desktop" is puzzling. There is no way to achieve it - at least with LSB/GNOME/KDE/etc => Linux on the consumer desktop is a non-starter. With Android... there is a chance. A small chance to be sure, but a chance... I don't fault Fedora at all - it's not a system for a consumer desktop, so this level of compatibility is not a requirement.
Posted Jan 8, 2009 12:45 UTC (Thu) by rwmj (subscriber, #5474)
> The console makers know that back compat is mostly a marketing bullet point, customers (particularly the most profitable ones who buy lots of new games) don't really use the back compat.

That's a joke, right? The Wii has backwards compatibility with about a half-dozen consoles. Nintendo sells the older games by the bucketload through their online service.

Rich.
Posted Jan 5, 2009 12:44 UTC (Mon) by khim (subscriber, #9252)
> For the game consoles there is basically zero compatibility between games produced for one piece of hardware and the new models.

It used to be true. Ten years ago! SONY changed the trend with the PlayStation 2 - almost 100% compatible with the original PlayStation (only some accessories can not be used). SEGA and Nintendo had "new and improved incompatible consoles" (the SEGA Dreamcast and Nintendo GameCube) with no compatibility with previous models - and as a result SEGA went bankrupt while Nintendo survived only because of the success of the GBA... compatible with the original GB and GBC! Surprisingly enough the Xbox 360 and PlayStation 3 are bad with backward compatibility, where the Wii can support games from the GameCube (no charge), NES, SNES, Nintendo 64, Sega Master System, Sega Genesis, TurboGrafx-16, Neo Geo, Commodore 64 and MSX (for a fee)! Of course it's not the only reason for the runaway success of the Wii, but SONY found that the PSP gained in popularity significantly once support for original PlayStation titles was added. Sorry, but times have changed. Backward compatibility is king today. And Linux does not support it. Heck: today old Windows programs are more compatible with the latest version of Linux than old Linux programs.

> For Windows the compatibility was really good up through Win98, but since then Microsoft changed a bunch of stuff, so a lot of Win98-and-earlier software won't work without rewriting it.

Have you actually tried to use it or is it hearsay info too? Compatibility is certainly not perfect, but Windows XP and Windows Vista include A LOT OF stuff intended to make software compatible. Ironically enough the worst compatibility is with Microsoft's own packages: because Microsoft can just push updates. Even there: basic functions of MS Office 97 work perfectly fine with Windows Vista - but some obscure features are broken. If you install some old program, Windows XP knows how to handle it. Sometimes it just uses a "fixed" version of some .dll, transparently to the old program!

> haven't you ever heard of 'DLL hell' (where one piece of software needs a new version of a DLL and another needs an old version of the same DLL)?

Of course I have! I reinstall my niece's Windows system regularly - it's the only way to keep it working after you've installed/deinstalled a hundred different programs or so. But you miss the point: dll hell is certainly a VERY bad thing. But the alternative (you can only install something if you know how the thing works) is totally unacceptable for a consumer desktop. If you can eliminate dll hell and keep the system backward compatible - it'll be great (Google tries to do exactly that with Android), but if you can not provide compatibility - don't even start. Dll hell is infinitely preferable to an unusable program from a non-IT person's POV.

> For Linux most old software will just work on new versions

Sorry, but "will just work" has different meanings for a geek and a normal person. For a geek "will just work" == "will need some tweaks here and there and after that it's half-usable". For a normal person "will just work" == "I insert the CD or start the install program, wait 15 minutes and use it". Anything else means "it does not work"!

> The problem isn't that the Linux software won't work, it's that there is no easy way to set up the appropriate libraries for it to work.

It does not matter. It does not work - that's what matters. And libraries are a pretty minor thing. Think about it: if you install a program and it does not add icons to the "Start Menu" - is it working or not? My niece will say: "of course not - how do I run it? well... too bad - I'll try to install something else". Yes, it looks insane to a geek, but that's how a non-IT person perceives stuff: even "minor" issues are perceived as insurmountable...

> Even with OSS (which you posted the link to), the issue isn't that the software doesn't run, it's that it doesn't play nicely with other things using sound at the same time - but that software never did.

Sorry, but no. It worked perfectly 10 years ago with a "SoundBlaster Live!". It does not work today on today's system. Again: does not work == I've installed it, started it and can not use it. It does not matter WHAT EXACTLY makes it unusable - I've installed it and it does not work, period. Now, don't get me wrong: I know old stuff on Linux is usually in the state "you only need to tweak it a bit, maybe change permissions and voila" - but it's not in a working state from the non-IT person's POV...

P.S. Recall what was really, really wrong with Windows Vista? High system requirements? Nope: KDE4 is in the same league. Hardware compatibility? No: today Windows XP is worse for new hardware, yet people are asking their friends to somehow make a miracle and install Windows XP on this %@###!%@! "Windows Vista capable laptop". Then what? Simple: it broke a lot of stuff and showed a lot of scary warnings when you did things "as usual". As long as Linux does not offer a smooth upgrade path - it's a non-contender for the consumer desktop...
Posted Jan 5, 2009 21:14 UTC (Mon) by dlang (guest, #313)
There is a difference between things not being able to work and things requiring effort to make work (by loading the appropriate libraries). There can be new tools written to assist in loading the appropriate libraries without requiring any changes to either the old binaries or the new systems - see the sketch below.
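As a rough illustration of that idea, here is a minimal C sketch of such a tool (the paths and names are hypothetical): it points the dynamic linker at a directory of legacy libraries and then execs the unmodified old binary:

/* legacy_run.c - minimal sketch of the "loader tool" idea:
 * run an unmodified old binary against a directory of legacy
 * libraries, leaving the system's own libraries untouched.
 * Build: cc legacy_run.c -o legacy_run
 * Usage: ./legacy_run /opt/legacy-libs ./old-app arg1 arg2 ...
 */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc < 3) {
        fprintf(stderr, "usage: %s <libdir> <program> [args...]\n", argv[0]);
        return 2;
    }

    /* Directories in LD_LIBRARY_PATH are searched before the default
     * system paths, so the old binary's sonames resolve against the
     * legacy copies first. */
    setenv("LD_LIBRARY_PATH", argv[1], 1);

    execv(argv[2], &argv[2]);
    perror("execv");            /* reached only if the exec failed */
    return 1;
}

A real tool would also have to discover which sonames the old binary needs and fetch them, but nothing about that requires changing either the binary or the new system.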
I also think that you are very mistaken if you think that the right thing to do is to make Linux work like Windows. We already have many things that are designed in much more robust ways; what we need isn't to throw away what we have to chase Windows, but to get new tools written that take advantage of what we already have.

This is where Ubuntu has done such a good job: they have written a lot of these tools (or at least put them together in ways that no other distro ever did) to make things easier for the user. If running old binaries is so valuable then someone will eventually write tools to automate the library discovery/install process for them.
Posted Jan 6, 2009 11:00 UTC (Tue) by khim (subscriber, #9252)
> There is a difference between things not being able to work and things requiring effort to make work (by loading the appropriate libraries).

Of course! And that's exactly where the split between the consumer desktop and the corporate/geek desktop lies. A company does have guys dedicated to this process, a geek can do it with Google's help, but a "normal consumer" can not do it at all.

> There can be new tools written to assist in loading the appropriate libraries without requiring any changes to either the old binaries or the new systems.

Sure. Microsoft does it too. The catch: such tools must be employed automagically and work "behind the scenes" without help from the user. If you ask for manual intervention - you've lost.

> I also think that you are very mistaken if you think that the right thing to do is to make Linux work like Windows.

There is no need - Windows does the required thing via band-aids over band-aids, and this approach is failing them: Windows Vista was mostly rejected by consumers (where they had a choice) for this very reason.

> We already have many things that are designed in much more robust ways.

Sorry, but no. Linux breaks significantly more easily than Windows. It's easier to fix, of course - but as I've said: there is no one who can do it behind the keyboard! If we are talking about the consumer desktop, that is.

> This is where Ubuntu has done such a good job: they have written a lot of these tools (or at least put them together in ways that no other distro ever did) to make things easier for the user.

Yes - but these tools fail constantly and need manual intervention. This is not a consumer desktop.

> If running old binaries is so valuable then someone will eventually write tools to automate the library discovery/install process for them.

By this time Linux, of course, is abandoned and the mark "for geeks only" is applied to it for the next ten years. Backward compatibility matters, forward compatibility matters. Consumers are using the computer in a "simple" way: they buy it with the system pre-installed, install stuff accumulated over the years (sometimes stuff is thrown away - but it takes years, not months), then expect to download/buy/find new stuff and install it there. Often, when Windows is finally broken enough to be totally unusable, the system is just replaced with a new one! The OS is never updated (unless autoupdate does it) and is never upgraded. Linux fails on both fronts: old stuff does not work, and new stuff can not be used either! I tried to help my friend install Skype a year or so ago: Skype had already abandoned Dapper by then - before Hardy was out! So much for "long term support"...

As far as the consumer desktop is concerned Windows is a poor choice, but Linux is not even a contender! Android... maybe... time will tell. Ubuntu? No - not even close...
Posted Jan 5, 2009 18:31 UTC (Mon) by drag (guest, #31333)
LSB can't provide it because the people that produce the platforms can't agree on everything. It IS NOT IN LSB's CONTROL. They can only work with what they are given.

They can go around and wave their hands and try to make a 100% solution, but it's completely bullshit if nobody follows it, which nobody will.

As for 'Why bother?'... there is a certain subset of Linux OS that people can agree on, more or less. So... LSB defines that for ISVs. It at least gives people something to work with.

LSB doesn't do the dictating to distributions. Distributions do the dictating and LSB has to work with what they can find in common and try to formalize it.

-----------------------------------------

What is required right now is to get people to stop trying to make slightly-incompatible versions of the Linux OS and make one based on the same core. Everything else you talked about is completely irrelevant and is not going to happen until people decide to grow up and start getting rid of competing 'mainstream' Linux distributions.

If I can't produce binaries that work on both Ubuntu and Fedora, then what is the point of worrying about backward compatibility? Backward compatibility with WHAT EXACTLY? Redhat? Fedora? Debian? Slackware? Suse? Ubuntu?

Hell, I don't even have compatibility _right_now_. How the hell is anything supposed to be compatible with software from 10 years ago when Linux software wasn't compatible with anything back then in the first place?
Posted Jan 5, 2009 21:34 UTC (Mon) by jspaleta (subscriber, #50639)
So when Suse was derived from Slackware that was childish? When Ubuntu was derived from Debian that was childish? When Mandrake was derived from RHL that was childish? When Debian was founded as an alternative to SLS that was childish?

Get rid of mainstream linux distributions? What exactly would you replace the linux distribution concept with? Which diamond in the rough out of the hundreds of non-mainstream linux distributions would you hold up as the core platform that everyone needs to target? Perhaps we should de-construct the last 16 years or so of distribution development, and go back to Slack or SLS as a core platform and live with the frustrations thereof?
I think you grossly over-simplify the problem. Is there any linux distribution of merit which does not include out-of-kernel-tree patches of significant value to its userbase? Or other patches in the plumbing layer around the kernel?

Hell, which upstream projects would you even include in that core platform? And how would you get the upstream projects to agree to hold their API and ABI stable long enough to give the core platform any significant longevity, while also pledging the manhours to keep the platform patched for vulnerabilities and bugfixes that do not jeopardize the platform's stability?

The entire open software project model is a whirlwind of individual moving pieces, with very different rates of development. You are only going to impose order on how projects interact and form a core platform by injecting significant manpower across a number of key projects with the express purpose of making consistent ABI and API stabilization a priority ahead of other interests.

Perhaps projects are manpower-strapped and they don't have the ability to push forward and keep a shared platform specification maintained. It's not a matter of "growing up", it's a matter of making choices about how to use limited resources among competing priorities. The people doing the development of the kernel and associated plumbing do not necessarily put ABI and API stability at the forefront of competing interests when it comes to how they burn manpower. The people looking for a core platform are going to have to pony up that additional manpower to make the core platform happen... it's as simple as that. Implying the current project developers are childish is a gross over-simplification of the problem. Developer time is limited, and platform stability commitments require manpower resources which compete with active development interests.

For an individual corporate entity it's far easier to take a limited snapshot of that storm, fork it off, and then impose API and ABI stability on that snapshot, slowing down the rate of development significantly in their walled garden and increasing the maintenance burden with regard to bug fixing and vulnerability patching.

-jef
Posted Jan 5, 2009 21:55 UTC (Mon) by khim (subscriber, #9252)
> Get rid of mainstream linux distributions?

What's the point? Mainstream linux distributions work just fine for geeks - why kill them? The fact that mainstream linux distributions can not produce a consumer desktop does not mean they are useless.

> Hell, which upstream projects would you even include in that core platform?

None, of course. Maybe the kernel - because it's too big and relatively well-supported. The core platform must be the ultimate upstream. Packages and libraries must be brought into the core distribution when the core distribution is ready to include them, not the other way around.

> And how would you get the upstream projects to agree to hold their API and ABI stable long enough to give the core platform any significant longevity, while also pledging the manhours to keep the platform patched for vulnerabilities and bugfixes that do not jeopardize the platform's stability?

Two possible solutions:
1. Don't expose underlying projects in your core distribution API.
2. Be ready to support packages which must be exposed in your API without help from upstream.

So far the Android team did everything right: they refused to provide a "C ABI" (despite the outcry from the public) and I hope when/if they provide such an ABI it'll be as restrictive as their Java ABI...
Posted Jan 6, 2009 0:41 UTC (Tue) by dlang (guest, #313)
That post was logical and relatively calm, and it even mentioned Ubuntu without bashing it.

Posts like this are very welcome. Thanks for changing your attitude a bit.
Posted Jan 6, 2009 1:01 UTC (Tue) by jspaleta (subscriber, #50639)
And since bashing Canonical is what is expected of me... I'll go ahead and do just that. I'm more than happy to live up to your expectations of me. There's so very, very much to find fault with in Canonical's approach to doing things.

If we are talking about trying to create a usable platform experience... it doesn't help when corporate entities like Canonical commit to pushing experimental features into distributions they control which deliberately break the consensus-based cross-desktop specification work going on at freedesktop.org. Specifications which application developers rely on to form the basis of interprocess UI services.

It's perfectly fine to want to experiment with different UI approaches. But to commit to shipping that UI experiment - one that impacts how multiple applications work - as a default in an OEM-customized distribution, in the community distribution you manage as a corporate entity, without first reaching a consensus with upstream on how to minimize disruption to existing upstream application functionality, does not help create a cohesive multi-project "platform" environment that non-Canonical-employed application developers can rely on.
-jef
Posted Jan 5, 2009 21:41 UTC (Mon) by khim (subscriber, #9252)
> LSB can't provide it because the people that produce the platforms can't agree on everything.

If LSB can't provide it then LSB must die. It's as simple as that. LSB has no value, it gives you nothing - except the right to put yet another useless stamp on your distribution. It's not used by ISVs (how many LSB packages can you name?), it's a huge time sink for a lot of people - why does it exist? To pay salaries for the "professionals" who create this stillborn standard?

> They can go around and wave their hands and try to make a 100% solution, but it's completely bullshit if nobody follows it, which nobody will.

If you are big enough then there is a possibility that people will follow a 100% solution. Time will tell if Google and the OHA are big enough.

> As for 'Why bother?'... there is a certain subset of Linux OS that people can agree on, more or less. So... LSB defines that for ISVs. It at least gives people something to work with.

Nobody follows LSB, so why are you still beating this dead horse? Name one who bought this bullshit and is still around to tell the tales. There are some LSB packages - but they are made from native Linux packages by companies who want one more stamp, not by companies who foolishly tried to use LSB as a platform.

> Hell, I don't even have compatibility _right_now_. How the hell is anything supposed to be compatible with software from 10 years ago when Linux software wasn't compatible with anything back then in the first place?

Easy: you make the platform, declare it "version 1.0" (or 2.0 or 10.0) and say "we will support binary compatibility for this version for 10 years". It was done before. GNOME, KDE, etc - they all support backward compatibility... but it does not really work. Why? Easy: the system includes some compatible components (GTK+ or Qt) and some incompatible components (GStreamer, libstdc++, etc). Programs use both the stable components and the unstable ones - and the end result is an unstable system. The only way to overcome this is to limit the amount of stuff in your distribution. Zero unstable components. You can not fix this or that problem without breaking binary compatibility? No problem - fix it! We'll gladly include your version in the next release - expect it to be out in 2018 or 2019. What? You can not wait that long? Ok - then implement some workaround.

Normal distributors can not work this way: if Ubuntu tries to limit choices - free software developers will just declare jihad and refuse to support such a distribution. And since there are no ISVs to fill the void... the whole platform will just disappear. The OHA does have the weight to say to developers "my way or the highway" - or so it looks today. If that happens - we'll have a linux (albeit not GNU/Linux, sadly) for the consumer desktop. If not... well, maybe someone else will do it...

P.S. The really sad thing is that individual projects work this way already. They just can not agree to do "big switches" synchronously - and so we have major breakage twice per year instead of once per decade...
Posted Jan 5, 2009 20:58 UTC (Mon) by mjthayer (guest, #39183)
That could actually be the point. Getting "native" Linux applications to work in a plug and play way when you stick in a CD is more or less impossible as things are. However, getting Windows applications to plug and play under Linux via Wine actually seems more realistic (if the application works at all of course - persuading Windows application authors to test under Wine might be the best hope for that). The same might apply for Android applications.
Posted Jan 8, 2009 11:25 UTC (Thu) by njs (subscriber, #40338)
So, umm, your argument is that the reason the consumer market has not switched en masse to Linux is that lots of people would like to, but have decided to wait because they have piles of Linux software from 1998 sitting around, and modern Linux can't support it, so they just stay with... Windows?

Backwards compatibility of the sort you mention will start mattering a few years after Linux becomes mainstream and there are lots of funky binary-only apps being distributed through non-distribution channels -- if that day ever arrives. (Personally I sort of hope the latter part never happens, but anyway.) Until then, basically no-one cares about that lack of functionality, because no-one needs it, and community-developed FOSS is never written before there is user demand.

There's a separate question of whether we'll be able to provide compatibility then, when it does start mattering, but given our excellent support for *Windows* apps from 1998 (and for that matter, Mac apps [http://gwenole.beauchesne.info//en/projects/basilisk2], and I bet some of those consoles too), I'm not staying up nights worrying about our ability to make frikkin' ELF-with-some-old-libraries work.
Posted Jan 8, 2009 12:11 UTC (Thu) by khim (subscriber, #9252)
> So, umm, your argument is that the reason the consumer market has not switched en masse to Linux is that lots of people would like to, but have decided to wait because they have piles of Linux software from 1998 sitting around, and modern Linux can't support it, so they just stay with... Windows?

Nope. My argument is that it's the reason a few "early adopters" return back in frustration. "The ability to seamlessly run 10-year-old apps" and "the ability to run apps 3-5 years newer than your distribution" are necessary features, not sufficient ones. Transition is an inherently slow and costly process: Microsoft spent many billions and over 10 years to switch consumers from DOS to Windows - and it was in control of both DOS and Windows back then! But without the expected ability to run newer programs on older systems and older programs on newer systems you are losing what few consumers you manage to gain!

> Backwards compatibility of the sort you mention will start mattering a few years after Linux becomes mainstream and there are lots of funky binary-only apps being distributed through non-distribution channels -- if that day ever arrives.

Actually such compatibility matters here and now. How can I use Python 3 on my Ubuntu 8.04 desktop? How can I use OpenOffice 3 on an Ubuntu 8.04 desktop? The answer is "there is no easy way" and frankly it's insane. The ultimate goal must be 8-10 years back and 3-5 years forward, but so far you don't have even a few months of compatibility in the forward direction and 1-2 years in the backward direction. That's not a consumer-acceptable rate.

> Until then, basically no-one cares about that lack of functionality, because no-one needs it, and community-developed FOSS is never written before there is user demand.

Then it just means that community-developed FOSS can not ever produce a consumer desktop. Not a big deal, we have big companies for that. Maybe it'll be Android, maybe it'll be something else - but eventually someone will produce a FOSS consumer desktop; it just won't be community-developed...

> There's a separate question of whether we'll be able to provide compatibility then, when it does start mattering, but given our excellent support for *Windows* apps from 1998 (and for that matter, Mac apps, and I bet some of those consoles too), I'm not staying up nights worrying about our ability to make frikkin' ELF-with-some-old-libraries work.

Windows and Mac apps are supported well because Microsoft and Apple spent billions on solving this problem. They planned in advance (applications from before the planning era are not well-supported at all) and while they made a few mistakes, they never suggested recompilation as the solution to a compatibility problem. I think that eventually we'll have a FOSS consumer desktop, but it'll never be a GNOME/KDE/etc desktop. They respect the rights of the developer more than the rights of the user, so they can not cover this niche.
Posted Jan 8, 2009 13:21 UTC (Thu) by njs (subscriber, #40338)
Please demonstrate one of these real existing users that switched to Linux, but then switched back when they could not run 10-year-old Linux programs.

The argument about the forward direction you just added is more reasonable, but still -- please demonstrate one of these real existing users who switched to Linux, and was desperate to have the latest and greatest version of random apps like Python 3 and OO.org, *but* was scared of upgrading to the latest and greatest version of their distro.

Unless you can demonstrate that such users exist and are the normal case, your claim that this is the actual cause of Linux's failure to dominate the desktop is simply inaccurate.

> Windows and Mac apps are supported well because Microsoft and Apple spent billions on solving this problem.

Please read harder. Microsoft did not spend billions developing Wine. The opposite, if anything. Wine's existence disproves your argument that wah-wah FOSS people cannot achieve compatibility with anything -- they can do pretty amazing things, in fact, so long as doing so actually accomplishes something useful. The reason no-one spends much effort on compatibility for Linux apps is that if they did, *under present circumstances* -- which are different from the circumstances for Windows and Mac OS! -- no-one would use it anyway, and it would have no positive effect in the real world.
Posted Jan 8, 2009 14:00 UTC (Thu) by khim (subscriber, #9252)
> Please demonstrate one of these real existing users that switched to Linux, but then switched back when they could not run 10-year-old Linux programs.

These people don't really create a lot of noise (who will admit they made a mistake?) but you can find a lot of horror stories related to distro upgrades (use Google if you wish) - usually it ends with "Switch back to Windows? No way!" (or something similar), but that just means the guy is geeky enough to actually stick with Linux.

> please demonstrate one of these real existing users who switched to Linux, and was desperate to have the latest and greatest version of random apps like Python 3 and OO.org, *but* was scared of upgrading to the latest and greatest version of their distro.

Visit any newbie forum. Usually such people are ridiculed and ostracized, so you'll only ever find one or two messages from them and then a long thread about the virtues of the system upgrade - you can safely presume at this point the person in question gave up on Python3/OO.org/whatever and maybe on Linux in general. I can name one such guy with 100% certainty, however: myself. I'm using Linux on my "server system" but stopped trying to switch to Linux on my main system. Too much hassle. It was fun to tweak this and that 10 years ago - and Windows was pretty unreliable back then. Today Windows is pretty reliable, I can throw random stuff at it (including OO.org 3.0, Firefox 3.0 and so on) and it usually lasts 2-3 years before it becomes totally unusable - at which point it's just reinstall and start from scratch. I personally know quite a few Linux administrators who are using Windows (or sometimes Mac) on the desktop "because it's easier to support"! If you can not convince a guy whose work is to support a thousand Linux systems to use Linux on his/her desktop because "it's too much hassle" - who can you convince?

> Microsoft did not spend billions developing Wine.

Microsoft spent billions developing a backward-compatible system. Emulation is not invention. It's hard work, but if you manage to do it - you get all the properties of the original for free. There are free emulators for most consoles (except the latest generation), yet there are zero free consoles in the world.

> The reason no-one spends much effort on compatibility for Linux apps is that if they did, *under present circumstances* -- which are different from the circumstances for Windows and Mac OS! -- no-one would use it anyway, and it would have no positive effect in the real world.

Sorry, but this is bullshit. People do try to solve this problem (there are a lot of failed solutions like Autopackage or 0install) - but without an organized effort from distributors it's impossible. That's one of the reasons for the low Linux penetration in my company, for example. We have a choice of OS here: Linux, Mac or Windows on the laptop (the desktop is always Linux because a lot of our developer tools work only under Linux and only under one particular version of Linux). Most choose Windows (including me, of course), roughly 1/4 choose Mac and less than 5% choose Linux. Why? Too much hassle. Programs don't play well with others: flash can be blocked by acrobat reader, for example. This is mostly the result of backward compatibility neglect: both flash and acrobat reader use OSS, and linux still does not support such programs right on modern hardware.