2 updates per month should have delta updates
Posted May 5, 2014 15:39 UTC (Mon) by clump (subscriber, #27801)
In reply to: 2 updates per month should have delta updates by juliank
Parent article: CyanogenMod 11.0 M6 is available
Other strange aspects of the Android world: random forum attachments/binaries, "modders", lack of source code for certain projects, etc.
Posted May 5, 2014 15:43 UTC (Mon)
by marduk (subscriber, #3831)
[Link] (1 responses)
I'm not sure it ever will catch up. There does not seem to be (enough) demand in the community (e.g. Google, hardware providers) to make it happen.
Of course it could happen further downstream at the risk of random breakage from upstream.
Posted May 5, 2014 16:00 UTC (Mon)
by juliank (guest, #45896)
[Link]
Posted May 5, 2014 16:18 UTC (Mon)
by shmerl (guest, #65921)
[Link]
Posted May 5, 2014 16:38 UTC (Mon)
by tsmithe (guest, #57598)
[Link] (3 responses)
In fact, it's chaos, with no real infrastructure. Rather than take the methods of and lessons learned by the 'traditional FOSS' community and change or improve them, this new ecosystem does almost everything from scratch. It's so inefficient -- just when traditional FOSS has got itself a fairly sleek development cycle.
Posted May 5, 2014 16:52 UTC (Mon)
by juliank (guest, #45896)
[Link] (2 responses)
Posted May 5, 2014 16:53 UTC (Mon)
by juliank (guest, #45896)
[Link] (1 responses)
Posted May 6, 2014 16:35 UTC (Tue)
by clump (subscriber, #27801)
[Link]
It just so happens that Valve has a Linux-based SteamOS, but there's nothing stopping users from installing Steam alone on their distribution.
Posted May 5, 2014 22:27 UTC (Mon)
by drag (guest, #31333)
[Link] (110 responses)
Why would Android even want to 'Catch up'?
Android is massively more popular than any sort of Linux distribution when it comes to end users. It has an exponentially larger number of packages available for it and a far wider developer audience.
It's successful in every single way that Linux user-oriented systems aren't.
It's massively popular. It has a relatively strong API for applications. It defeated Microsoft. It defeated Apple. It's the most powerful mobile platform currently available. It has a rich set of development tools and yet is still friendly enough to be used by the majority of people. etc etc.
They sold 967,775,800 Android phones last year. And that is only the large publicly traded corporations selling phones. There are probably still millions of devices sold as forks or offshoots using AOSP. And then there are tablets, kiosks, vehicle entertainment systems, etc etc. Android's 'long tail' is extremely long.
Debian has 48,574 packages.
Android has 1,203,791.
Going to an OS-level package management model like Apt/Yum would be a HUGE step backwards.
The traditional Linux package management scheme has its benefits and its purposes, but it's not the be-all and end-all. To say that Android is somehow slow or sloppy compared to Linux distributions is just us wanting to be on the wrong side of history.
The whole move to virtualization and things like docker is largely driven by the fact that software management in Linux is such a huge PITA even for sysadmins.
In fact it's entirely the other direction; Linux distributions have a huge number of things to learn from Android.
Posted May 5, 2014 22:31 UTC (Mon)
by paravoid (subscriber, #32869)
[Link] (46 responses)
Posted May 6, 2014 1:05 UTC (Tue)
by torquay (guest, #92428)
[Link] (45 responses)
No. The entire idea of a distro is broken in many ways. A distro is essentially a monolithic mudball, made up of a bazillion packages, glued together to mask the core problem of API instability within the OSS world. Every distro is its own little fiefdom, requiring the unnecessarily repeated (ie. wasted) work of packaging. The packaging processes introduces a bottleneck between software creators and consumers. Even the word distro is wrong: in reality every Linux "distro" is actually a separate Linux-based operating system.
Docker et al, and to some extent the "Fedora.next" effort, see these problems and are working towards addressing them.
Android definitely got one thing right: API stability. This is far better than the typical "distro" situation, where there are no guarantees that software written for version N of a given distro will work for version N+1. Going cross-distro is even more problematic.
Posted May 6, 2014 1:46 UTC (Tue)
by pizza (subscriber, #46)
[Link] (7 responses)
Android's forward compatibility basically consists of perpetually carrying forward the old API stack and runtime. There's no inherent reason why everyone else can't do that; it's just a matter of figuring out how to pay for the man-hours that this will require on top of new development efforts.
(I might add that the forward/backward compatibility is considerably more flaky if you target the NDK and thus bypass the fixed/versioned ABI of the Java/Dalvik VM+libraries)
Posted May 6, 2014 1:55 UTC (Tue)
by raven667 (subscriber, #5198)
[Link] (6 responses)
Well yes, if there was strong ABI compatibility on the Linux desktop then there would be strong ABI compatibility on the Linux desktop. It's not technically impossible but it is not being done and is not prioritized.
Posted May 6, 2014 13:04 UTC (Tue)
by renox (guest, #23785)
[Link] (5 responses)
Because nobody is earning 'big money' with the Linux desktop, and most 'amateur' developers(*) don't care about such a feature.
*: amateur isn't a bad word or a criticism.
Posted May 6, 2014 15:26 UTC (Tue)
by raven667 (subscriber, #5198)
[Link] (4 responses)
Posted May 6, 2014 16:29 UTC (Tue)
by torquay (guest, #92428)
[Link] (3 responses)
Posted May 6, 2014 18:12 UTC (Tue)
by tuna (guest, #44480)
[Link] (2 responses)
Posted May 6, 2014 18:57 UTC (Tue)
by dlang (guest, #313)
[Link] (1 responses)
The comment above was about the Linux kernel: there they declare zero stability for internal interfaces, but very strong stability (not quite 100%, but close to it) for external interfaces.
I wish that more higher level layers took this approach.
Posted May 8, 2014 19:30 UTC (Thu)
by tuna (guest, #44480)
[Link]
Posted May 6, 2014 16:49 UTC (Tue)
by clump (subscriber, #27801)
[Link] (9 responses)
I presume you're thinking of the ISV ecosystem. This ecosystem, no matter the operating system (let alone package manager), largely ignores native software distribution. In the Linux world, there happen to be generous community members that package, distribute, and maintain software that is typically installed by end users. LibreOffice, for example. Tarball, .deb, .rpm? Sure. Yum install LibreOffice? Sure. Are there native LibreOffice packages (let alone maintainers) for Mac and Windows?
Per a different comment, Docker probably doesn't do what you think it does. If a user wanted an Android app or a Linux end user application the solution wouldn't be a PaaS-oriented container download complete with its own operating system image. Docker aims to fix a problem for developers that happens to exist on any platform, which is providing a platform to create and run your application with all its dependencies.
Posted May 6, 2014 17:24 UTC (Tue)
by torquay (guest, #92428)
[Link] (8 responses)
And therein lies the rub: "someone is maintaining it". Let's look at this closer. Who exactly is maintaining LibreOffice at Fedora? Let's assume it's a volunteer and not a RH employee. Why do we need to have someone to make a package? What happens if the volunteer gets bored, goes on holidays, moves onto other things, or gets hit by a bus? Why can't we simply install the latest version of LibreOffice directly from the LibreOffice site, instead of waiting for the package to be made?
Does the volunteer in question also make the corresponding rpm package for Suse, and deb packages for Debian, Ubuntu, Mint? Or Mageia, Mandriva, or whatever it's called these days? If not, why is this effort duplicated? Why do we need 100000 separate packages and 100000 separate packagers for each Linux-based OS?
What happens when version N+1 of Fedora is released? Based on history, as soon as Fedora N+1 is released, version N generally gets no software updates apart from security fixes. So, would "yum update LibreOffice" still work? Probably not. So if we want to update LibreOffice, we're forced to upgrade Fedora to N+1. Fedora is notorious for breaking things willy nilly at each upgrade, and to a large extent this affects other "distros" as well. Why should I end up with a half-broken set of applications and OS components, when all I wanted was to upgrade just one application?
The point is that distro-specific packages are not suitable for user-facing software. They are a bottleneck. They get in the way. They do not scale. They delay and/or effectively prevent upgrades. In other words, they are not an effective way for software developers to deliver their software to users.
A much better solution is a solid baseline OS which provides API and ABI guarantees, clearly separated from the applications that run on top it. This is in contrast to the current practice of chucking everything together in a massive mudball and covering up the problems.
Posted May 6, 2014 18:15 UTC (Tue)
by tuna (guest, #44480)
[Link] (5 responses)
Posted May 6, 2014 19:06 UTC (Tue)
by mathstuf (subscriber, #69389)
[Link] (4 responses)
Posted May 6, 2014 22:29 UTC (Tue)
by clump (subscriber, #27801)
[Link] (2 responses)
Posted May 6, 2014 23:19 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link] (1 responses)
It depended on OpenSSL 0.9.7b, which hasn't been packaged in forever. Attempting to download and install it caused pretty much ALL of the packages to be marked for deletion.
And that's just one example...
Posted May 6, 2014 23:33 UTC (Tue)
by dlang (guest, #313)
[Link]
Posted May 8, 2014 19:32 UTC (Thu)
by tuna (guest, #44480)
[Link]
Posted May 6, 2014 18:29 UTC (Tue)
by pizza (subscriber, #46)
[Link]
It's actually a LO developer, but your point is still valid.
> Why can't we simply install the latest version of LibreOffice directly from the LibreOffice site, instead of waiting for the package to be made?
Basically, it's due to the reality of not having a platform monoculture defined and enforced by some benevolent dictator. Whether or not that is a good thing depends on one's point of view.
Everything else flows from that; the wide variety of platform-level options and combinations means that you can't make many assumptions about what the end-user's setup looks like.
The only way to solve this is to restrict choice at the platform level. While a valid option, this happens to run counter to the historical reasons for Linux's adoption to begin with.
Historically, this hasn't been that big a deal for software distributed as source code (those interested can do the work to port it over), but it is a far bigger problem for software distributed in binary form -- only the author can do the porting work.
While I don't think many folks out there deliberately try to make it *harder* for binary software distribution, they don't really care much about making it easier, except as a side effect of making it easier for software they do care about -- F/OSS delivered in source form. At the source level, the ABI and API problems largely go away, but the platform definition problems remain. (FWIW Fedora.next's focus is on improving the platform definition/baseline.)
> A much better solution is a solid baseline OS which provides API and ABI guarantees, clearly separated from the applications that run on top it. This is in contrast to the current practice of chucking everything together in a massive mudball and covering up the problems.
It's a different solution, not necessarily a *better* one. It solves some problems, but creates others as well -- and history shows that it is far more prone to "covering up problems in a massive mudball" than the current status quo.
Case in point -- we already have what you've asked for, in the form of RHEL/SLES, and they are so API/ABI guaranteed that they are effectively set in stone for ten years. The price one pays for that guarantee is that what they do ship is obsolete almost immediately, so in order to do more interesting/modern things, one has to install various mudballs on top of that baseline. The ISVs are the worst offenders here, as their mudballs have the distinction of not playing well with others'.
Posted May 6, 2014 18:29 UTC (Tue)
by clump (subscriber, #27801)
[Link]
Nobody seems to be defending Android often requiring users to download large binary images for simple updates. I also imagine you're not arguing Android is good at long term maintenance of point releases, let alone major releases...
If you want N+1 look into the RHEL 6 Developer Toolset.
Posted May 7, 2014 0:15 UTC (Wed)
by rqosa (subscriber, #24136)
[Link] (26 responses)
> Going cross-distro is even more problematic.
However, it is apparently possible for the likes of Humble Store, Desura, and Steam to provide cross-distro packages for a fairly large number of games. Granted, there are occasional problems (e.g. depending on libpulse vs. libasound), but overall it seems to work pretty well. (Incidentally, some of those games are free software, such as this one and this one except for its sound / graphics / level-map data, and yet they still manage to get paying customers.)
Posted May 7, 2014 3:30 UTC (Wed)
by mathstuf (subscriber, #69389)
[Link] (25 responses)
They do this by shipping their own "platform" libraries (SDL and the like) and bundling everything else. I think all they expect on the host is X, glibc, and a driver capable of running the games you want. One of the Humble Bundle games even ships its own WINE to run on Linux (IIRC, LIMBO).
Posted May 7, 2014 3:49 UTC (Wed)
by torquay (guest, #92428)
[Link] (1 responses)
Which, to my understanding, is essentially what Docker does, but in a more formalised manner. This is a perfectly workable and understandable solution to the problems caused by the "distro" model of software distribution.
Distro-loving folks bemoan bundled libraries, but when you ask them to guarantee stable APIs and ABIs so that bundling is not required, they wash their hands of the problem, pretend it doesn't exist, and say "it's upstream's fault".
Posted May 7, 2014 5:17 UTC (Wed)
by rqosa (subscriber, #24136)
[Link]
> Distro-loving folks bemoan bundled libraries
Not really… it's more that distro developers bemoan in-tree source copies of libraries (especially forked / API-incompatible ones) in upstream source trees, because that prevents them from building the program against system-provided libraries. For anything third-party, though, I don't think any distro developers would object to that — it's not their problem.
Posted May 7, 2014 5:29 UTC (Wed)
by rqosa (subscriber, #24136)
[Link] (22 responses)
> They do this by shipping their own "platform" libraries (SDL and the like) and bundling everything else.
The thing is, though, that's pretty much what you have to do on every other operating system. (Want to use SDL in your Windows game? You have to bundle it. Qt? Same thing. Or suppose you're depending on QuickTime… the last time I saw a Windows program that needed this, it didn't bundle it but instead required the end-user to download the QuickTime installer and run it themself.)
Posted May 7, 2014 15:09 UTC (Wed)
by raven667 (subscriber, #5198)
[Link] (21 responses)
Posted May 8, 2014 1:20 UTC (Thu)
by rqosa (subscriber, #24136)
[Link] (20 responses)
> The only reason to bundle them on Linux […] is the poor compatibility from version to version
The trouble is, for a lot of libraries, it is expected that you must bundle (or static-link) the library with your applications — an expectation that's usually true outside of desktop-Linux distros, because there the library isn't included with the OS. And since the app developers have to bundle the library anyway for that reason, the library developers have little incentive to freeze their API, and the app developers have little incentive to avoid depending on very specific versions of the library.
Consider Allegro, for example — the last time I compiled a game that uses it, the game seemed to require a 5.1.x version, i.e. the current "unstable" branch (which tends not to be packaged by distros); it would not work with 5.0.x. Consequently, I ended up compiling Allegro 5.1.8 myself instead of using a distro-provided build. However, for anyone who got the game in binary form, that work has already been done for them, because the binary they got has Allegro bundled with it — and that's true for all OSes that the game is available for!
So for this particular development "platform", the situation is basically the same on GNU/Linux / desktop-Linux as on every other OS, both for end-users and for app-developers. (For cross-distro builds, the main APIs you can rely on to be present are libc, the X client libs, Mesa, and ALSA; or at least those are the ones that commercial game developers seem to be targeting.)
Posted May 8, 2014 5:46 UTC (Thu)
by khim (subscriber, #9252)
[Link] (19 responses)
Nope. The situation is not even close. With any other system you know that you need to compile and bundle this library. With Linux you have no idea if you need to do that or not. Which later turns installation of your application into a quest: find which library is missing and invent some clever way to inject it into a binary app. Dependencies don't really work because the names of prerequisites are not constant even in the same distro (they change between releases), and if you consider different distributions then it's totally hopeless.
I've tried to do it with GIMP 2.8 and failed: I was able to start it on Ubuntu 12.04 Precise Pangolin but was unable to convince it to show anything at all. Not even sure what was missing, I just gave up in the end. Should I even mention that the very same version of GIMP works just fine without any issues on Windows XP, which is much older than Precise Pangolin?
That's with new stuff. With old stuff it's even worse. Surprisingly enough, the best way to run old programs under Linux is to take their Windows version and run them under WINE. If that is not a failure of the model then I don't know what is.
Posted May 9, 2014 1:01 UTC (Fri)
by rqosa (subscriber, #24136)
[Link] (18 responses)
> With any other system you know that you need to compile and bundle this library. With Linux you have no idea if you need to do that or not.
The developer didn't know whether they had to bundle Allegro or not… so, of course, they did bundle it (using static-linkage). Problem solved! …for the person building it, not for you — unless you are the one doing the build.
> invent some clever way to inject it into a binary app
Which is just a matter of using static-linkage, or setting RPATH or RUNPATH, or setting LD_LIBRARY_PATH (e.g. from a shell script wrapper — I've seen some binary-only stuff do it this way), plus maybe building with libltdl in situations where dlopen is called with a pathname.
> the very same version of GIMP works just fine without any issues on Windows XP
That's only because someone else already made a Windows build with all necessary dependencies bundled in.
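To make the RPATH/RUNPATH option concrete, here is a minimal sketch of the "$ORIGIN" approach. The library name libfoo, the function foo_init() and the directory layout are made-up placeholders, not anything from GIMP or Allegro:

    /* main.c -- app that ships a private copy of libfoo in ./lib next
     * to the executable; libfoo and foo_init() are placeholders. */
    #include <stdio.h>

    extern int foo_init(void);   /* provided by the bundled libfoo.so */

    int main(void)
    {
        printf("libfoo says %d\n", foo_init());
        return 0;
    }

    /* Build so the dynamic linker searches <exe dir>/lib first:
     *
     *   gcc main.c -o app -Llib -lfoo -Wl,-rpath,'$ORIGIN/lib'
     *
     * Then ship the binary together with lib/libfoo.so; no system-wide
     * install and no LD_LIBRARY_PATH wrapper script is needed. */

The same effect can be had after the fact with a wrapper script that sets LD_LIBRARY_PATH, but baking the relative path in at link time means there is one less moving part for the end user.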
Posted May 9, 2014 8:00 UTC (Fri)
by khim (subscriber, #9252)
[Link] (17 responses)
The sad fact is that you are not even wrong.
Sure. They went and changed one option in one place. Compare that with the number of hoops one needs to jump through on Linux.
Many FOSS guys clearly live in some kind of cocoon which separates them from the rest of the world. It's so thick it's not even funny. No, really. What can you talk about with a guy who says the reason an Android app, a Linux app, or even a Windows app works is because someone is maintaining it? It does not even look like he's joking! The reality of the big wild world flies straight over his (or her?) head!
There are a few facts about the desktop (smartphone, etc.) which we need to keep in mind when we are discussing… anything, really. Basically, people treat software like you would treat your furniture or bed linen. You don't rush out of the door to replace your furniture when IKEA issues a new model of bed, right? And you don't plan to replace your bed sheets when you finally decide to "upgrade" your bed? Well, that's how people treat software, too. Thus we have all the facts listed above.
Distributions fail in this world. Completely. Totally. Utterly. Compare with Android.
Now, of course nobody prevents you from solving these problems. You can compile gcc 4.9 in a way that it'll be able to produce RHEL4.5-compatible binaries. You can bundle all the required components (even GLibC can be bundled if it's really needed). You can rigorously test your software with all popular distributions and release new versions in a timely manner when they break. This is what Valve is trying to do with Steam. But if they succeed, then the next question will be: why bother with distributions at all? Why not drop support for Steam on Linux and keep only SteamOS? If distributions leave these problems to Valve, then I'm sure that's exactly what will eventually happen.
Distributions brilliantly solve the problem of using Linux as an "old-style Unix": when one large (and expensive!) box was shared among many developers, distribution was not a problem (software was only ever used on the very same box where it was developed) and the ability to have a bazillion packages was both a boon (hey, I don't need to build libXYZ from source and waste my precious tiny quota, it's now available in our system as a system-wide library!) and a necessity (because HDDs were not large enough to cope with gigabytes of software). In today's world the problems they tried to solve are no longer as acute, and they fail miserably at solving the problems which are real in today's world.
Posted May 9, 2014 8:08 UTC (Fri)
by dlang (guest, #313)
[Link] (1 responses)
but you are contradicting yourself
You say that Android supports something forever; well, if you compile something for RHEL5 it will run on newer systems as well.
But you say you want C++11 features; well, if you want Android 4.4 features, your app isn't going to work on any older Android system.
Same thing with iOS, or Windows. If you use a feature introduced in a recent version, it's not going to work on older versions.
If you want no development at all, then you can have an absolutely compatible OS ABI, but as soon as you allow for new features to be added, software developed for new versions won't work on old ones.
Posted May 9, 2014 8:41 UTC (Fri)
by khim (subscriber, #9252)
[Link]
Fair enough, I guess.
Why? You can pick, e.g., GCC 4.7 for use as an "NDK toolchain" and use it to target Android 2.3 by choosing "API level" 9. It's a perfectly valid combination.
Except when such a combination is supported. And of course combinations of C++11 features and old versions of the OS are supported on both Android and Windows. Not sure about iOS, but since Apple uses its "walled garden" approach to push updates (they are even being sued now for that!) this problem is not as acute there.
Why? GNU/Mess is the only [relatively] popular platform where it's true. On most other platforms I could create a program which probes the OS for capabilities and uses only those which are available. Often new capabilities can even be used on old OS versions! Here is how it's done on Android.
Most developers out there don't even think about that: they just assume such things will be available. It's only natural, after all: people are replacing computers only every 3-5 years, so of course OSes 3-5 years old must be supported! For them the whole Linux distributions mess is quite a shock.
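(For the curious: on the Java side that probing is just a check of Build.VERSION.SDK_INT before touching a newer API. The same probe-then-use pattern in native code usually looks something like the following C sketch; libfancy and fancy_render() are invented names used only for illustration, not a real Android or desktop API.)

    /* probe.c -- sketch of "probe for a capability, fall back if absent".
     * Library and symbol names are made up. Link with -ldl on glibc. */
    #include <dlfcn.h>
    #include <stdio.h>

    typedef int (*fancy_render_fn)(const char *scene);

    int main(void)
    {
        /* Try the optional library by bare soname so the normal search
         * path (RPATH, LD_LIBRARY_PATH, system dirs) applies. */
        void *h = dlopen("libfancy.so.1", RTLD_NOW | RTLD_LOCAL);
        fancy_render_fn fancy = h ? (fancy_render_fn)dlsym(h, "fancy_render")
                                  : NULL;

        if (fancy)
            fancy("main-menu");      /* newer, optional code path */
        else
            printf("fancy renderer unavailable, using the plain path\n");

        if (h)
            dlclose(h);
        return 0;
    }

Nothing there depends on the OS being new enough at build time; the decision is made on the user's machine, which is why "built with a new toolchain" and "runs on an old release" are not mutually exclusive.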
Posted May 9, 2014 11:44 UTC (Fri)
by jwakely (subscriber, #60262)
[Link]
Pick a different enterprise/LTS distro and you might have a point, but that's not true for RHEL5+ ;-)
Posted May 9, 2014 12:15 UTC (Fri)
by clump (subscriber, #27801)
[Link] (6 responses)
My entire point on this thread is that phones could learn from desktops. I seem to have caused all kinds of fits and panics with that sentiment. The response seems to be "it's easier to put software on phones".
If you just throw phones away who cares about download sizes, software updates, or even security? I do, and if history is any indicator, you will too.
Posted May 9, 2014 15:25 UTC (Fri)
by khim (subscriber, #9252)
[Link] (5 responses)
Not even close. An app is published there. Once. Then people buy it and use it. If it does not work they get a refund (and even that is not always easy to do). That's it.
It was the same way before appstores, BTW: you get the box and it says on its cover: designed for Windows 3.1 (or Windows XP, or Windows 7). If you are lucky (and usually you are) it'll work on other versions of Windows, too. If you are not lucky… get a refund. If you are really unlucky and can not get a refund… try to sell the box to someone.
Sure. But there is a big difference: when we are talking Android, Windows, Solaris, Mac OS, iOS, etc., that someone is the developer of the OS. And it does its work without the luxury of having access to the program! Not only is the source inaccessible, the binaries are often not accessible, either! Most programs are internal, private affairs; people will not send them to you for the purpose of maintaining them.
Then you should stop pretending that you are doing something people should care about. They can continue to use other OSes and ignore what goes on in your little corner. Think OpenBSD: they don't really care to cater to people's wants and needs and people are reciprocating (as in: no one cares, except for OpenSSH).
Most of them. These are supported on a "best effort" basis. They are not guaranteed to work but usually they do—and that's enough.
Now you are painting a nice strawman. Sorry, but no. The phone you've just bought may come with an old version of Android, sure, but it'll work with a lot of "latest and greatest" programs. Many apps support at least Gingerbread, and if you have at least Ice Cream Sandwich (which is three years old by now!) then most apps will work for you. Things like Autocad 360 or Microsoft Office Mobile or Need for Speed—they all work on Android 4.0+ just fine.
Report it to Google. They will kill the app even if it was not installed via the Google Play service. Note that the appropriate feature was delivered to all users of all versions of Android starting from Android 2.3.
What exactly could they learn? How to make the life of the user miserable? There are a bazillion simpler ways to achieve that "nirvana".
Posted May 9, 2014 16:12 UTC (Fri)
by clump (subscriber, #27801)
[Link]
I absolutely agree with your assessment of the old Windows days. I don't think that's good. Why? Because of security, updates, user control, etc. You could point out that I'm probably in the minority for feeling that way. Per the broader thread, you'd be right.
Posted May 10, 2014 4:15 UTC (Sat)
by rqosa (subscriber, #24136)
[Link] (3 responses)
> you get the box and it says on its cover: designed for Windows 3.1
I remember what it was like using Windows 3.11, and as far as I'm concerned, desktop-Linux is paradise compared to that!
For one thing, dealing with conflicting .so dependencies on Linux is nowhere near as bad as the real "DLL hell" that we had back then. It was like this: a program would often come bundled with its dependencies, and then install those dependencies system-wide, overwriting whatever version might already be there. So now that you've installed program A, program B doesn't work anymore… or worse, you've installed programs C D E F G and H, then went back and tried to run program B several months since the last time you ran it, it doesn't work anymore, and you have no idea which thing you did in the meantime is the one that broke it. Or maybe an old program still works, but one of its dialog boxes looks totally different now. Or even this: you installed some program that depends on Video for Windows, and some time later on you discover that installing VfW had the side effect of replacing Media Player with a newer version that looks completely different, without asking you.
(I seem to remember one time when Trumpet TCP mysteriously stopped working, so for at least the next few months I resorted to running an old-fashioned terminal emulator and downloading files with lynx and sz! I don't remember for sure what made it start working again… most likely it was a system reinstall. And of course, your "user data" wasn't cleanly separated out into directories under /home, but rather was scattered throughout all the applications' own directories, making a reinstall painful.)
> if you are lucky (and usually you are) it'll work on other versions of Windows, too. If you are not lucky…
… you wait 2 or 3 years until you get a chance to try it on Windows 98, and it still doesn't work… and then you wait 13 more years until you get a chance to try it on Windows Vista (32-bit), and through sheer trial-and-error finally manage to find the right locations in C: to copy the files to (to work around its installer, which always crashes) plus the right QuickTime version (2.1.2, 32-bit) plus the right video mode (16bpp) plus the right backwards-compatibility mode (Windows 95), and then it finally works. All of this really happened to me.
Oh but wait, it doesn't end there: now I want to run this thing, too, and it also depends on QuickTime — a much newer version of QuickTime, that is. And who knows what might happen if I try to install both versions at once? (QuickTime is one of those things that seems to insist on being installed system-wide, rather than having an app include its own private copy in its own installation directory; I'm not sure why. The intended way to install that game is to first download QuickTime from Apple and install it with its own installer, then run the game's installer afterwards.) So what I ended up doing was to just run it on a different computer… or sometimes even run it in Wine. (Tip: with Wine, you can set up multiple different "WINEPREFIX" directories, and each one will be its own separate environment with a completely different set of "system-wide" libraries etc.)
Posted May 10, 2014 14:43 UTC (Sat)
by raven667 (subscriber, #5198)
[Link] (2 responses)
Posted May 10, 2014 21:43 UTC (Sat)
by clump (subscriber, #27801)
[Link]
Posted May 10, 2014 22:26 UTC (Sat)
by rqosa (subscriber, #24136)
[Link]
> do you have any more recent experience?
Well, even today I believe it's pretty common for Windows programs to have an installer that insists on installing the program "system-wide" and only wants to have one version of it installed at a time — I can think of one like this where I would have liked to have two different versions on hand for use. QuickTime for Windows is also like this.
Interestingly, Wine actually has a way (the WINEPREFIX mechanism) to let you work around things like this.
> What we are discussing is how current systems work
It stopped being only about current systems once Windows 3.1 was brought into the discussion… and it wasn't me who first mentioned it.
Posted May 10, 2014 3:01 UTC (Sat)
by rqosa (subscriber, #24136)
[Link] (6 responses)
> They went and changed one option in one place.
This guy only had to set one option:
> Compare with the number of hoops one needs to jump through on Linux
Usually the number of hoops is 1. I was saying that any one of those 4 options alone is sufficient in the common case. (Tip: the dynamic linker expands the variable "$ORIGIN" within an executable's RPATH to the directory that contains that executable. That lets you put .so files into the same directory as the executable, or a subdirectory of it, and the dynamic linker will prefer those libs over "system" libs for that one program.)
For programs/libraries that try to dynamically load a library by filename (not pathname), static-linkage doesn't work (unless using something like Libtool's "dlpreopening" — see below), but the other options should work. (Those other 3 are basically just different ways of doing the same thing: setting the dynamic linker's library search path.)
Now, for programs/libraries that try to dynamically load a library by pathname, the simple solutions won't work. (I'm pretty sure that Windows has the exact same issue, since LoadLibrary/LoadLibraryEx can take a pathname parameter too.) That's where Libtool / libltdl may be useful: it was designed to (among other things) emulate dlopen / dlsym / etc. on systems that don't even support dynamic linkage, by "dlpreopening" them. (This involves building an executable with a bunch of libraries statically-linked in and then arranging for wrapper-versions of dlsym et al. to be able to resolve references to those libraries' symbols.)
Beyond that, there may be some other situations where bundling libraries can't really be done, e.g. a library that wants to launch a daemon running under the same UID as the program that links to the library (or detect that the daemon has already been started by another process and talk to the existing one). Situations like this are not unique to Unix, though… just look at KDE on Windows, for example (disclaimer: it's been 3 or 4 years since I last tried it out).
> You can bundle all the required components […] This is what Valve is trying to do with Steam.
Exactly my point! (Like I said, though: they're not the only ones doing that.)
> But if they succeed, then the next question will be: why bother with distributions at all?
Because the "bundle all the required components"-separately-for-each-app way of doing things has its downsides. It sometimes forces you to update lots of apps all at once every time a library vulnerability is found. (This used to happen a lot with the compression library zlib.) It wastes memory and storage space. (Imagine if you have 5 apps that use the same 30 libs, and you have at least one instance of each of the 5 running all at once, on a machine without much RAM.) And in the absence of a single central "app store" (e.g. the way Windows and MacOS used to be and mostly still are), you often have every single app running its own "updater", with the result that the user gets pestered to run updates way too often.
Personally, the way I prefer to get the programs I use is to get many of them from the distro, plus get some from 3rd-party repositories (e.g. Arch's AUR, Ubuntu's PPAs), plus get some cross-distro binary packages with bundled dependencies (read: games), plus compile a few things from upstream source. Sticking to just one way is too limiting.
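To illustrate the filename-vs-pathname distinction above, here is a tiny sketch; libbar.so.2 is a made-up name and the absolute path is only an example:

    /* dlopen_search.c -- why loading by bare filename is friendlier to
     * bundling than loading by absolute pathname. Link with -ldl. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* Bare filename: the dynamic linker consults RPATH/RUNPATH,
         * LD_LIBRARY_PATH and the system cache, so a copy bundled next
         * to the executable (via $ORIGIN) can satisfy it. */
        void *by_name = dlopen("libbar.so.2", RTLD_NOW);

        /* Absolute pathname: the search path is bypassed entirely, so
         * only a copy at exactly this location will do. */
        void *by_path = dlopen("/usr/lib/libbar.so.2", RTLD_NOW);

        printf("by name: %s, by path: %s\n",
               by_name ? "found" : "not found",
               by_path ? "found" : "not found");

        if (by_name) dlclose(by_name);
        if (by_path) dlclose(by_path);
        return 0;
    }

The first form is the one that bundled libraries and RPATH tricks can satisfy; the second is the kind of hard-coded lookup where the simple solutions stop working.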
Posted May 10, 2014 14:40 UTC (Sat)
by raven667 (subscriber, #5198)
[Link] (5 responses)
Just to make sure we don't lose the main point and go off on a tangent, the libraries which games are bundling so they are cross-distro compatible are libraries that the distro _also_ packages, like SDL, libpng, freetype, ALSA libs, zlib, etc. Why is that? What are the distros doing such that developers who want to make cross-distro binaries have to bundle rather than relying on what the OS provides? Shouldn't the distros be organized such that packagers can rely on what they provide?
Posted May 10, 2014 21:48 UTC (Sat)
by rqosa (subscriber, #24136)
[Link] (4 responses)
> what are the distros doing such that […]
It's not because of something the distros are doing, it's because of something the library developers are doing: changing the APIs frequently. And they do that because, to them, their libraries are not "vendor provided platform libraries that come with the system", they're libraries that programs will need to bundle with them. After all, on every other OS, these libraries are not provided with the system. (For example, that one game I linked to above depends on a bleeding-edge version of Allegro, which the distros generally don't want to package.)
> like SDL, libpng, freetype, ALSA libs, zlib, etc.
However, ALSA is one that really is always a "vendor provided platform librar[y]" with an API that's pretty much unchanging by now, and so you should use the system-provided copy of it.
Posted May 10, 2014 22:38 UTC (Sat)
by raven667 (subscriber, #5198)
[Link]
What distros aren't doing is providing sufficient backpressure for the madness to stop, either by shipping all the relevant versions of libraries or by standardizing what version and API they ship; library authors aren't being given sufficient motivation to care because you can just rebuild the world for the next release, right?
Posted May 11, 2014 1:28 UTC (Sun)
by khim (subscriber, #9252)
[Link] (2 responses)
Distributions are also part of the problem. Even if a library has a [relatively] stable ABI (e.g. expat) there is no cross-distribution way to request it. It may be installed or not installed (that's the whole point of a distribution!), thus application developers are forced to treat it as if it's never available.
That is why distributions are pointless: application developers can only rely on the core components which are always installed anyway, and if we decide that it's not a good idea to stuff end-user programs (like LibreOffice or GIMP) into a distribution (because they rightfully belong in an appstore), then... what do we have left? Bits and pieces of integration and plumbing (things like dbus) which could not be used by applications anyway (because they are not in the core part), or could be used at the expense of frustration for the end user (who does not know enough to know to install dbus to run his or her favorite game)?
In a world where appstore applications (or even separately distributed applications) are first-class citizens, classic Linux distributions are not only not needed, they are actively harmful!
Posted May 11, 2014 9:03 UTC (Sun)
by juliank (guest, #45896)
[Link] (1 responses)
Seriously? An end user that does not have dbus installed? Are you joking?
If someone does not have dbus installed, they will be experts, because no real desktop runs without it.
Posted May 11, 2014 14:03 UTC (Sun)
by khim (subscriber, #9252)
[Link]
You, again, don't see the forest for the trees. Yes, today d-bus is basically installed everywhere (since it's needed by systemd). OK, but that's today, in 2014. The situation 10 years back (when d-bus was first introduced) was quite different. And even if d-bus is there, it does not mean that the facilities needed by your program are there, too. What should you do to support sound output? There are OSS, ALSA, pulseaudio, etc… and even today there are "non-experts" without pulseaudio because some distributions don't offer it.
It's not as if I'm saying anything new: Adobe said the same thing almost a decade ago. What has changed today? Ah, right, Adobe decided that enough is enough. Make no mistake: "Pepper" Flash is there not because it somehow solves the compatibility problem. No, it's there for ChromeOS (and it's developed on Google's dime, not Adobe's). A Linux "distribution" built on top of Gentoo but which does not offer any flexibility to the end user.
There is nothing like Windows Installer on Linux (which resolved former horrors). Windows Installer may not be as capable as apt or yum, but its interface is stable, and one can use it to install prerequisites (such as MSDE or .NET). On Linux… it just does not work: even if you use DPKG or RPM properly and specify dependencies correctly, often you can not just take a binary developed for one version and run it on another version of the very same distribution. Grab
Posted May 6, 2014 0:25 UTC (Tue)
by pabs (subscriber, #43278)
[Link] (31 responses)
Posted May 6, 2014 8:30 UTC (Tue)
by NAR (subscriber, #1313)
[Link] (25 responses)
If they didn't want to press your ideological views, I don't think that is necessarily a problem. Now the question is: what is the purpose of a Linux distribution? To provide a stable and useful OS to users, or to evangelize? They failed miserably at the first task and I'm not interested in the second.
Posted May 6, 2014 10:29 UTC (Tue)
by pizza (subscriber, #46)
[Link] (24 responses)
You say you're not interested in the second, yet your benchmark for "usefulness" seems to be "number of users".
Meanwhile, several million folks (myself included) disagree with your assertion that Linux distributions failed miserably to provide a stable and useful OS to users.
Posted May 6, 2014 12:43 UTC (Tue)
by torquay (guest, #92428)
[Link] (23 responses)
There is no need to assert anything, as this is plainly true. A given random distro (really a separate Linux-based OS) is not stable in the sense of API stability from one version to another.
The OSS "community", as a whole, is incapable of creating a viable competitor to Android, or Mac OS X, or Windows. The only success is partial, and comes with caveats: three separate parts of the community (ie. Debian, Red Hat and Suse) have managed to create competitors to traditional UNIX servers. However, there are no guarantees that a piece of software written for Debian will run on RHEL, or Suse (or choose your favorite permutation).
Posted May 6, 2014 13:26 UTC (Tue)
by pizza (subscriber, #46)
[Link] (5 responses)
So.. what you're saying here is that a loose collection of (mostly) volunteers working (mostly) for free are incapable of directly competing with tightly-focused, top-down, multi-billion R&D budgets.
And I'm not sure where you got the idea that this is the only measure of success, and that this is what "the community" is trying to become.
Posted May 6, 2014 14:36 UTC (Tue)
by raven667 (subscriber, #5198)
[Link] (4 responses)
... Yes? This seems an uncontroversial point. Even the Linux kernel fits your second criterion better than the first; there are millions or billions of dollars from hundreds or thousands of companies poured into R&D. The loose collection of mostly volunteers works best in niche software where the target audience is technical and can participate, so servers and development tools are much stronger than end-user GUI software.
>> The OSS "community", as a whole, is incapable of creating a viable competitor to Android, or Mac OS X, or Windows.
... Because the stated goal of most distributions and desktop environments is to get software in front of users? You can try to redefine success to be anything you want it to be, I guess, but I think most desktop developers would like their software to be widely used.
http://www.gnome.org/about/
Let's be clear: the Linux desktop isn't terrible. I think we can both be impressed by how much has been accomplished and know that it could be better. The way we've been trying to package and distribute software just doesn't seem to scale as well as other systems like Android, Mac and Windows.
Posted May 6, 2014 18:52 UTC (Tue)
by dlang (guest, #313)
[Link] (3 responses)
> ... Yes? This seems an uncontroversial point. Even the Linux kernel fits your second criterion better than the first; there are millions or billions of dollars from hundreds or thousands of companies poured into R&D. The loose collection of mostly volunteers works best in niche software where the target audience is technical and can participate, so servers and development tools are much stronger than end-user GUI software.
However, Linux kernel development is anything but "tightly focused, top-down" directed.
Posted May 6, 2014 19:55 UTC (Tue)
by raven667 (subscriber, #5198)
[Link] (2 responses)
Posted May 6, 2014 20:59 UTC (Tue)
by dlang (guest, #313)
[Link]
Posted May 6, 2014 21:21 UTC (Tue)
by pizza (subscriber, #46)
[Link]
Heh, they do a fair amount of the latter in order to accomplish the former. :)
In all fairness, the closer you get to the bare metal the less room there is for re-imagining things. No matter how you slice it, the kernel still has to communicate with hardware and manage system resources -- and there are usually hard numbers to prove or refute the suitability of new ideas.
It's when you start interacting with these fiddly, inconsistent humans that stuff starts going awry. I swear, why can't everyone just agree on everything?
Posted May 6, 2014 13:34 UTC (Tue)
by pizza (subscriber, #46)
[Link] (16 responses)
You're begging the question.
If "API stability" is important to you, you have a choice of several distros that provide that stability. If it isn't, you make the choice using whatever criteria that do matter. Either way, you're not going to just "choose randomly", so you don't get to use that as "proof" of general, "plainly true" unsuitability.
Posted May 6, 2014 14:20 UTC (Tue)
by torquay (guest, #92428)
[Link] (15 responses)
There is actually no such "distro". Even stuff written for RHEL 5 is not guaranteed to run (or compile) on RHEL 6.
The apparent "choice" is also an indicator of greater failure. The very fact that there are multiple competing (and incompatible) Linux-based OSes (distros) indicates that the Linux-focused OSS community, as a whole, is incapable of organising itself into a coherent group, and is incapable of agreeing on anything besides the kernel (and perhaps glibc).
Instead, we have endless infighting, endless software rewrites (where a given previous component is suddenly replaced with half-baked "new technology") and needless repeated duplication of effort (rpm vs deb, GTK vs Qt, Gnome vs KDE vs whatever, etc, etc).
Posted May 6, 2014 15:22 UTC (Tue)
by raven667 (subscriber, #5198)
[Link]
It's not as bad as it used to be, and it's getting better all the time. Much current development is sustainable even if it isn't dominant, so if the stars align some day there is a wide body of software that could be ready.
Posted May 6, 2014 15:33 UTC (Tue)
by pizza (subscriber, #46)
[Link] (13 responses)
I hate to break it to you, but *nobody* provides that sort of guarantee. Not Microsoft, not Apple, not Google, not Redhat, *nobody*.
> ...the Linux-focused OSS community, as a whole, is incapable of organising itself into a coherent group and is incapable of agreeing on anything ...
When, throughout the span of recorded history, has this *ever* happened? (other than existential threats along the lines of "if we don't band together, $badguys will invade our land, kill our women, and rape our sheep")
That aside, you still haven't explained why you expect a large, dispersed pile of people who don't share any common ideals or goals to self-organize into a tightly focused entity.
Because you seem to be complaining that it's the apple's fault it's not a banana, and that if only the apple was more banana-like, we could sell more apples at banana stands.
Posted May 6, 2014 16:22 UTC (Tue)
by torquay (guest, #92428)
[Link] (12 responses)
There is no need to explain, as it's an observation of how things currently are, in the context of API stability, software maturity, efficiency, and market share. People are of course free to be as dis-organised as they like, and are free to have API instability, replace components willy-nilly with half-baked tech, pointlessly reinvent the wheel and duplicate effort, and basically create software and OSes that are essentially irrelevant except in the (boring) server context.
Posted May 6, 2014 17:25 UTC (Tue)
by pizza (subscriber, #46)
[Link] (11 responses)
Win32 itself isn't the problem (it's sort of analogous to a small subset of libc combined with an ELF linker/loader) but it's also not terribly useful on its own. Instead, folks write stuff targeting the multitude of layers of crap that were [middle-]layered on top of Win32; the crap that anyone trying to get stuff done actually used. (The way MS sort of solved this was to essentially bundle every MS-produced library ever previously shipped. And assuming you weren't trying to use something whose functionality they later decided to bundle into the core OS; backwards compatibility for 3rd-party stuff tended to be a little flaky after that point.)
And let's not forget the Win9x/WinNT API (and behavioral!) differences (WinXP finally unified the kingdom, huzzah!) affecting things as core as winsock, but also the likes of media playback.
All that said, MS in general has tried pretty hard to maintain backwards compatibility, because that was their core value proposition. Then they tried to throw it under the bus with Win8 and Metro. :)
> The same can't be said for any Linux based OS, apart from Android.
Android at the SDK layer is decently backwards-compatible (providing you stick only to what's in the official APIs) but if you write or interact with any sort of native code (via the NDK), it's actually a worse mess than desktop Linux is claimed to be -- and that's before the various vendors perform their mangling. :)
FWIW Android 4.0+ has finally stabilized a lot of this churn; in part due to standardizing more stuff into the core OS, but also due to the various vendors finally realizing that going off and doing their own thing hurts them more than it helps them...
Posted May 6, 2014 18:01 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link] (10 responses)
It's about as complete as you can get in a mainstream OS. It certainly has a bigger scope than libc or even POSIX.
> And let's not forget the Win9x/WinNT API (and behavioral!) differences (WinXP finally unified the kingdom, huzzah!) affecting things as core as winsock, but also the likes of media playback.
And most of 9x apps actually worked just fine in XP/2k, if they used only Win32 API.
> Android at the SDK layer is decently backwards-compatible (providing you stick only to what's in the official APIs) but if you write or interact with any sort of native code (via the NDK), it's actually a worse mess than desktop Linux is claimed to be -- and that's before the various vendors perform their mangling. :)
Posted May 6, 2014 18:53 UTC (Tue)
by pizza (subscriber, #46)
[Link] (7 responses)
Win32 (as opposed to "The Windows API") is quite limited.
Win32 is little more than the core syscall API, consisting solely of the contents of kernel32.dll, user32.dll, and gdi32.dll. The only thing you listed that's actually part of Win32 is "a standard GUI framework", and that's because the core GUI framework is actually a kernel-level feature.
Audio isn't technically part of Win32, but it was present in the first Win32 implementations (ie WinNT and Win32s) in the form of basic waveout support, no codecs. Codec support along with unaccelerated video playback APIs first shipped with Win95, I believe.
DirectX 1.0 was released *after* Win95, and was first bundled with Win95OSR2 and NT4 in 1996.
Accelerated video came via DXVA, which was introduced with Windows 2000.
Shall I go on?
> Windows 2000 unified 9x and NT families, but who cares...
No, WinME was released after Win2000; it wasn't until XP came along that the 9x series finally ended.
Posted May 6, 2014 19:10 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link] (6 responses)
> Win32 is little more than the core syscall API, consisting solely of the contents of kernel32.dll, user32.dll, and gdi32.dll. The only thing you listed that's actually part of Win32 is "a standard GUI framework", and that's because the core GUI framework is actually a kernel-level feature.
> DirectX 1.0 was released *after* Win95, and was first bundled with Win95OSR2 and NT4 in 1996.
GameSDK was earlier a part of Win32s, which allowed running 32-bit applications on top of Windows 3.11; in particular it allowed direct access to bitmaps and optimized palette manipulation (/me switches into nostalgia mode).
> Shall I go on?
Posted May 6, 2014 21:13 UTC (Tue)
by pizza (subscriber, #46)
[Link] (5 responses)
Please don't pretend that a given API's current presence (or even as of WinXP RTM) means that it was *always* available for use.
But if your point is that once MS introduces an Official API they tend to carry it forward indefinitely (with some exceptions), then yes, I would agree.
(Incidentally, isn't it nice to completely control the entire platform from top to bottom, along with a nine-figure R&D budget?)
Posted May 7, 2014 12:44 UTC (Wed)
by tialaramex (subscriber, #21167)
[Link] (4 responses)
The compatibility promise is implicit, not explicit, so engineers aren't just obliged to support stuff that they documented as working, but anything commonly relied on, including stuff which they specifically said nobody should do, but everyone did it anyway. That's enough to have any maintenance programmer tearing their hair out.
Raymond Chen has extensively documented examples where Win32 has to do something completely insane, that you would put top of your list of code smell fix-that-first items but which they are unable to do anything about because Windows programmers rely on "but it worked when I did it 15 years ago" as an excuse above all else.
And it hurts end users too, by resulting in confusing and unintuitive behaviour that can't be fixed without breaking the software they love.
Posted May 7, 2014 13:08 UTC (Wed)
by NAR (subscriber, #1313)
[Link]
Posted May 7, 2014 14:58 UTC (Wed)
by renox (guest, #23785)
[Link] (2 responses)
Posted May 8, 2014 0:07 UTC (Thu)
by mathstuf (subscriber, #69389)
[Link] (1 responses)
Posted May 8, 2014 6:22 UTC (Thu)
by cjr (guest, #88606)
[Link]
Posted May 6, 2014 18:54 UTC (Tue)
by dlang (guest, #313)
[Link] (1 responses)
Posted May 6, 2014 19:57 UTC (Tue)
by raven667 (subscriber, #5198)
[Link]
Posted May 6, 2014 15:28 UTC (Tue)
by drag (guest, #31333)
[Link] (4 responses)
There are thousands of apps on F-Droid.
I tried to find a way to count them, but nothing easy presented itself.
I would not be surprised if the number of new open source Android applications in the past few years outnumbers the number of new end-user packages in Debian or Fedora by an order of magnitude.
Posted May 6, 2014 17:54 UTC (Tue)
by mathstuf (subscriber, #69389)
[Link] (3 responses)
Posted May 6, 2014 19:07 UTC (Tue)
by mathstuf (subscriber, #69389)
[Link] (2 responses)
Posted May 8, 2014 18:36 UTC (Thu)
by pj (subscriber, #4506)
[Link] (1 responses)
Posted May 8, 2014 21:20 UTC (Thu)
by mathstuf (subscriber, #69389)
[Link]
Posted May 6, 2014 0:47 UTC (Tue)
by pizza (subscriber, #46)
[Link] (2 responses)
...by being the cheapest, not through any inherent technical superiority. (in many ways, Android is still technically the weakest!)
> Android has 1,203,791.
I'll grant you that there are many more "packages", but the vast majority of those are basically boilerplate wrappers/replacements for hitting a random web site. Even the majority of the "games" are basically wrappers around flash, designed around milking money from the gullible, a dollar at a time.
Posted May 6, 2014 1:51 UTC (Tue)
by raven667 (subscriber, #5198)
[Link]
Posted May 6, 2014 4:35 UTC (Tue)
by donbarry (guest, #10485)
[Link]
The entire platform is designed, like Apple's, to support privacy-invasive apps which can mine user data from the most gullible users. And let's not forget the deliberate removal of all GPL code save the kernel to create a "friendlier" environment -- friendlier to those hostile to free software.
The entire culture and ecosystem of Android is hostile to the ethos of free software. There's not even any indication of license on the app store, which is itself a proprietary app, as is now large parts of the "Google Play" infrastructure which apps are encouraged to use.
Google sang different hosannas to different camps to gain their market share -- remembering that primarily they are an *advertising* company, not a search company, and are now maximizing their ability to monetize this system. The phone companies have flocked to it because they get it for free, without the restrictions Apple places on its offering or the per-unit cost of Microsoft's offering. And they have loaded up most of their systems with their own layers of proprietary filth.
Technically, the system is competent. In terms of being responsive to the *user*, offering by default expectations of privacy, and a community built around such expectations, they've totally failed. But they never had any interest in this: they wanted to monetize the user data stream, and they enabled the carriers and the app writers to further monetize this stream. Parasites left and right, maintaining huge databases on the behavior of each and every consumer, while talking about how "free" the platform is.
Never has the difference between free as in freedom and free as in beer been more clear.
Posted May 6, 2014 8:42 UTC (Tue)
by russell (guest, #10458)
[Link]
Posted May 6, 2014 10:57 UTC (Tue)
by juliank (guest, #45896)
[Link]
There's no need to go all the way. But just shipping the core OS (that is, /system) in packaged form would help prevent lots of rebuilds, reducing build times and host storage, because binaries can be shared between multiple devices. On top of that, you can still add an app store.
That's relatively similar to how Ubuntu is structured on the touch side.
Posted May 6, 2014 12:56 UTC (Tue)
by leoc (guest, #39773)
[Link] (6 responses)
Posted May 6, 2014 15:31 UTC (Tue)
by drag (guest, #31333)
[Link] (5 responses)
The traditional Linux distribution model lacks on both counts when it comes to fulfilling the needs of end users and the developers that serve them.
All I am saying is that if anybody is 'playing catchup', it's going to be the Linux distros as an end-user platform.
Posted May 6, 2014 15:45 UTC (Tue)
by clump (subscriber, #27801)
[Link]
Posted May 6, 2014 20:18 UTC (Tue)
by anselm (subscriber, #2796)
[Link] (2 responses)
Debian GNU/Linux has served me well as a desktop and server OS, for all sorts of tasks including offering Internet services, professional publishing, and various flavours of software development, on a considerable number of machines for more than 15 years now. I call that »success«.
Posted May 6, 2014 22:14 UTC (Tue)
by jspaleta (subscriber, #50639)
[Link] (1 responses)
Utility, or otherwise stated, usefulness, can be much more easily determined in an ad hoc fashion, after the fact, with no concern at all as to what the original intent was. Something does not have to succeed to have significant utility to you, or me, or to society. But to call such utility a success is a mischaracterization of the product and a missed opportunity to learn something vital about the marketplace in which the product was introduced. Indeed, products can be successful and have zero utility for me personally. And similarly, if a product fails to be useful to me personally, I cannot claim that it is a failure. There are many examples of successful products that I personally find useless.
Now obviously products that fail to succeed can still be useful, can still have utility... and very observant entrepreneurs and experimenters can learn from such useful failures and retool or redirect them into successes, by learning from the original _failure_ and defining new goals and building towards those goals. But simply moving the goal posts about what success means, without acknowledging the failure, is just rationalizing delusion, which prevents a deep learning, assessment and understanding of the market. Such feel-good revisionism only makes it harder to effectively define goals and reach true product success in the iterative process of product refinement.
That being said. I think Debian is pretty successful in the context of the project's stated goals. But I would also add, that if the Debian project was being created from scratch today, I think the project would make a number of different technology choices. I'd wager that project architects would be heavily influenced by day-to-day use of "app" distributions used by other platforms and would build on that model instead of the packaging model commonly in use now. What exists now in linux projects is as much about inertia and familiarity as it is about technical superiority for a given set of prioritized requirements.
-jef
Posted May 6, 2014 23:28 UTC (Tue)
by anselm (subscriber, #2796)
[Link]
Debian, unlike other Linux distributions, actually has stated goals. While it is not perfect, I think that overall it is doing fine.
I'm not convinced. Everybody is screaming for »apps« these days, because »apps« seem to be the newest cool thing, but it is by no means clear that a Linux distribution built on »apps« would actually work any better than the packaging-based ones we already have. Somebody should build one so we can see whether it will catch on. Right now the »app« proponents remind me of Professor Tanenbaum arguing that it would be a lot better if Linux was based on a microkernel. There are certain things to be said for that, to be sure, and the monolithic model espoused by Linux-as-we-know-it does have certain weaknesses that a microkernel-based approach might be able to avoid, but the microkernel approach comes with its own problems and so far has not been competitive in the general OS arena.
Android doesn't count in this context because it has the benefit of a captive audience; a general »app-based« Linux distribution would have to be able to compete with today's Ubuntu, Debian, CentOS or openSUSE before we can say that it is a viable proposition.
Posted May 7, 2014 18:39 UTC (Wed)
by leoc (guest, #39773)
[Link]
The kind of end users who love mainstream platforms demand convenience and simplicity at the expense of everything else. I'd argue that Desktop Linux can and does succeed perfectly well on its own terms by simply continuing to ignore the needs of those people.
All I am saying is that if anybody is 'playing catchup', it's going to be the Linux distros as an end-user platform.
Yeah, we "traditional Linux distribution" users really need to catch up to the malware availability and censorship of superior platforms.
Posted May 6, 2014 14:19 UTC (Tue)
by ccchips (subscriber, #3222)
[Link] (7 responses)
Posted May 6, 2014 15:41 UTC (Tue)
by drag (guest, #31333)
[Link] (6 responses)
AOSP is as open source as OpenSuse or Debian.
What you experience with difficulties on tablets is because the lack of standardization of the ARM systems.
You know the whole Microsoft and Intel cabal, the horrors of ACPI, BIOS, UEFI... and how that stuff is forced on all x86-type systems? Well, that is _standardization_. That stuff is why you are able to take a random laptop out of a store and install any Linux version you want and it will more than likely run. It'll run like shit, but it'll still run.
Exactly how does deb/apt/yum/rpm actually solve this problem?
Oh, it doesn't. It has absolutely nothing to do with it.
> I don't get system-level security updates.
I can't apply security fixes to my Linux desktop without being forced to upgrade all my applications to their newest versions.
If the versions of the software I want to use do not happen to be the ones that were the most convenient for my distribution to compile when it was released 6-12 months ago then that means I can't use them.
Unless I completely ignore the package management system and download and install software like it was 1995 and personally maintain it... which I am forced to do on a regular basis.
> I can't upgrade the system to newer versions.
I upgrade my Android systems all the time to newer versions.
> I'd say Android's success has come at the expense of technical excellence and versatility.
uhuh.
Posted May 6, 2014 18:58 UTC (Tue)
by dlang (guest, #313)
[Link] (1 responses)
Posted May 6, 2014 19:16 UTC (Tue)
by micka (subscriber, #38720)
[Link]
So "AOSP is as open source as OpenSuse or Debian", maybe, but AOSP is not Android if the main advantage of Android is its apps.
Posted May 6, 2014 21:33 UTC (Tue)
by debacle (subscriber, #7114)
[Link]
Don't blame others for not using Debian :~) When Debian releases security fixes, you get the same library or program as in stable, with just the security problem fixed. Same for Debian testing. I even believe that other Linux distributions offer the same service for you.
> Unless I completely ignore the package management system and download and install software like it was 1995 and personally maintain it... which I am forced to do on a regular basis.
While using Linux since 1993, currently both at home and at work, on desktop, on servers, and on embedded systems (measurement devices), I almost never have to do this. For the embedded devices I have to backport many packages from Debian wheezy or jessie to squeeze, though.
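For what it's worth, here is a sketch of what that looks like in practice on a stable Debian box; the package and version strings are only illustrative, not a reference to a specific advisory:
  # A stable security update keeps the same upstream version; only the
  # Debian revision grows another suffix carrying the isolated fix.
  apt-get update
  apt-cache policy openssl                 # e.g. installed 1.0.1e-2+deb7u6, candidate 1.0.1e-2+deb7u7
  apt-get install --only-upgrade openssl   # pulls in just the fixed package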
Posted May 7, 2014 0:16 UTC (Wed)
by clump (subscriber, #27801)
[Link] (2 responses)
If you want long-running versions of software, run an enterprise distribution. You could also statically-compile your binaries and run into exactly the same security and update problems that exist in the phone world. If your argument is that Linux runs poorly because you choose to run unsupported or unpackaged software, nothing about Android will help you.
You seem to simply be dismissive and have adopted a negative, fatalistic tone.
Posted May 7, 2014 19:14 UTC (Wed)
by Wol (subscriber, #4433)
[Link] (1 responses)
Maybe because you did? You moaned that Android versions have to be specific to certain tablets. That's because (1) the hardware is different across different tablets, and (2) Linux can't always probe to find it. Do you *really* want to brick your tablet because a Linux probe to find the graphics driver accidentally hoses the hard drive controller?
Do you remember the trouble with Intel graphics chips a few years back (or was it network chips), where laptops kept getting hosed and it was a nightmare resetting them? That's the *norm* for ARM hardware, and much as I don't like admitting that Microsoft has done some good, as drag says they *have* forced a standardised x86 hardware layout. So Linux knows where and how to find stuff.
That's what all this stuff about "device tree" is about - telling linux what hardware is where on ARM systems, and it doesn't (yet) work that nicely or well.
Cheers,
Wol
Posted May 8, 2014 14:58 UTC (Thu)
by clump (subscriber, #27801)
[Link]
Posted May 6, 2014 15:36 UTC (Tue)
by clump (subscriber, #27801)
[Link] (2 responses)
Android is popular. To your point, it's popular despite its deficiencies. By your logic, imagine Android's popularity were it to learn lessons from more mature platforms.
Docker's origins are in PaaS, and its surrounding projects still lean that way. Including everything needed to run your application is useful, but don't forget Docker is a method of packaging your application *and* running it. For this to be relevant to phones, it would be akin to making a new phone with every application update.
Virtualization isn't a way to circumvent package management. Virtualization helps with utilization, compartmentalization, platform diversity, and decoupling operating systems from hardware.
Posted May 6, 2014 18:04 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link] (1 responses)
Adding delta-encoding for Cyanogen updates is not complicated, T-Mobile and AT&T (at least) do it for their OTA Android updates. And it doesn't require splitting the base system into packages.
It's just that nobody bothered enough to add it to Cyanogen.
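For the record, generating and applying a file-level delta is not exotic; here is a minimal sketch using the stock bsdiff/bspatch tools (the file names are hypothetical, and real OTA delta packages do more than this, e.g. per-file diffs and signature checks):
  # Build side: produce a binary patch between two full update zips.
  bsdiff cm-11-M5-hammerhead.zip cm-11-M6-hammerhead.zip M5-to-M6.patch

  # Device/recovery side: reconstruct the new zip from the old one plus the patch,
  # then verify it against the published checksum before flashing.
  bspatch cm-11-M5-hammerhead.zip cm-11-M6-rebuilt.zip M5-to-M6.patch
  sha256sum cm-11-M6-rebuilt.zip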
Posted May 6, 2014 18:36 UTC (Tue)
by clump (subscriber, #27801)
[Link]
Posted May 6, 2014 21:19 UTC (Tue)
by debacle (subscriber, #7114)
[Link] (6 responses)
Posted May 7, 2014 3:24 UTC (Wed)
by mathstuf (subscriber, #69389)
[Link] (5 responses)
Posted May 8, 2014 22:21 UTC (Thu)
by debacle (subscriber, #7114)
[Link] (4 responses)
Posted May 8, 2014 22:53 UTC (Thu)
by juliank (guest, #45896)
[Link] (3 responses)
What music do you listen to that you care about those fields?
Posted May 9, 2014 2:33 UTC (Fri)
by mathstuf (subscriber, #69389)
[Link]
Anyways, Vanilla doesn't support such tags because the Android MediaStore[1] doesn't keep them.
[1]https://developer.android.com/reference/android/provider/...
Posted May 11, 2014 10:32 UTC (Sun)
by debacle (subscriber, #7114)
[Link] (1 responses)
Posted May 12, 2014 6:06 UTC (Mon)
by mathstuf (subscriber, #69389)
[Link]
Posted May 6, 2014 23:47 UTC (Tue)
by shmerl (guest, #65921)
[Link]
However, the Linux kernel does explicitly acknowledge and respect the boundary between kernel space and user space. In other words, the kernel has strong ABI and API guarantees for stuff that uses the kernel. How I wish the idea of strict API/ABI boundaries was more prevalent and enforced in the rest of the OS stack!
You seem to be responding to a comment that hasn't been written. You're also asserting that because multiple distributions exist there is automatically waste, and that a universal API would be "good".
Android definitely got one thing right: API stability. This is far better than the typical "distro" situation, where there are no guarantees that software written for version N of a given distro will work for version N+1. Going cross-distro is even more problematic.
Except that in my case "yum install LibreOffice" provides me with something akin to your guarantee. Do I then care about API's? Does an Android market user care about API's? We all care that "it works". The reason an Android app, a Linux app, or even a Windows app works is because someone is maintaining it.
> And therein lies the rub: "someone is maintaining it". Let's look at this closer. Who exactly is maintaining LibreOffice at Fedora? Let's assume it's a volunteer and not a RH employee. Why do we need to have someone to make a package? What happens if the volunteer gets bored, goes on holidays, moves onto other things, or gets hit by a bus? Why can't we simply install the latest version of LibreOffice directly from the LibreOffice site, instead of waiting for the package to be made?
Seriously? You can install LibreOffice directly from the site. It just so happens it's also available through the generosity of others at various distros. Lucky us. Sometimes projects do release in .deb, .pkg, .rpm and others. Not sure what your argument is. You aren't suggesting nobody maintains Android/iOS software, I hope.
> What happens when version N+1 of Fedora is released? Based on history, as soon as Fedora N+1 is released, version N generally gets no software updates apart from security fixes. So, would "yum update LibreOffice" still work? Probably not. So if we want to update LibreOffice, we're forced to upgrade Fedora to N+1. Fedora is notorious for breaking things willy-nilly at each upgrade, and to a large extent this affects other "distros" as well. Why should I end up with a half-broken set of applications and OS components, when all I wanted was to upgrade just one application?
When Fedora makes releases, older versions are updated to a point. This is documented. Are you seriously arguing Android has an advantage here? Would you argue carriers are good about maintaining their hardware/software images? Do carriers have stated update policies like Fedora?
> The point is that distro-specific packages are not suitable for user-facing software. They are a bottleneck. They get in the way. They do not scale. They delay and/or effectively prevent upgrades. In other words, they are not an effective way for software developers to deliver their software to users.
Except that I've provided reasons it's superior for me and no doubt others. Distro packages in no way inhibit you from installing by other means. Package managers very nicely solve the issues you've mentioned in the rest of the paragraph. Do individually installed packages make security concerns easier? Do they ease maintenance and reporting?
They do this by shipping their own "platform" libraries (SDL and the like) and bundling everything else. I think all they expect on the host is X, glibc, and a driver capable of the games you want. One of the Humble Bundle games even ships its own WINE to run on Linux (IIRC, LIMBO).
So for this particular development "platform", the situation is basically the same on GNU/Linux / desktop Linux / Unix-like Linux as on every other OS, both for end-users and for app-developers.
That's only because someone else already made a Windows build with all necessary dependencies bundled in.
Which is just a matter of using static-linkage, or setting RPATH or RUNPATH, or setting LD_LIBRARY_PATH (e.g. from a shell script wrapper — I've seen some binary-only stuff do it this way), plus maybe building with libltdl in situations where dlopen is called with a pathname.
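A minimal sketch of the shell-script-wrapper variant (the directory layout and binary name are invented; this mirrors the general pattern several binary-only Linux game ports use):
  #!/bin/sh
  # Launcher that prefers the libraries bundled under the game's own lib/ directory.
  HERE="$(dirname "$(readlink -f "$0")")"
  LD_LIBRARY_PATH="$HERE/lib:$LD_LIBRARY_PATH"
  export LD_LIBRARY_PATH
  exec "$HERE/bin/game.x86_64" "$@"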
When Android is as old as RHEL 5, then you can start talking about how it is actually providing better long-term support.
But you say you want C++11 features; well, if you want Android 4.4 features, your app isn't going to work on any older Android system either.
Same thing with iOS or Windows. If you use a feature introduced in a recent version, it's not going to work on older versions.
If you want no development at all, then you can have an absolutely compatible OS ABI, but as soon as you allow for new features to be added, software developed for new versions won't work on old ones.
No, really. What can you talk about with a guy who says the reason an Android app, a Linux app, or even a Windows app works is because someone is maintaining it?
Think of "maintaining" as "doing the work". To make software work on any platform (Linux desktops, Android, Windows, Solaris, Mac OS, iOS, etc.), someone has to "do the work". The app store model in the phone world often means that the original developer is doing the work. The typical Linux desktop world has others, maintainers if you will, often doing that work. You and others are asserting it's easier to "make things work" outside of desktop Linux. Fabulous!
> There are a few facts about desktops (smartphones, etc.) which we need to keep in mind when we are discussing… anything, really.
For the reasons you mentioned, I don't want the typical smartphone software model. I don't want software and hardware to throw away every few years. I don't want software without tighter interaction between the system and the application. I don't want to be without assurances that I'm patched against known security problems.
> They assume that someone will “maintain” the software. But in many cases there is no one who could even try to do that. Source is incomplete or unavailable, there are no developers, and so on. A lot of companies run their operations on PDP-11 emulators on top of contemporary hardware (no, I'm not joking), which, of course, would make no sense in a world where the reason an Android app, a Linux app, or even a Windows app works is because someone is maintaining it.
The "someone maintaining" is often the distribution. And how do distributors feel about unmaintained, incomplete, insecure code? It gets removed or isn't even provided. This is far worse in the app store world. How often do we hear of Android malware being offered from the Play Store? This is clearly a weakness of the distribution/approval model of app stores.
> You can try to target something like Debian stable or RHEL, but since there is no way to use newer versions of libraries with these ancient versions of the OS… it works, but it is a huge PITA.
For RHEL, check out the RHEL Developer Program. There are ways (Software Collections, etc) to use newer versions of libraries and runtimes. It's not a PITA either.
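As a sketch of how that works in practice (collection names vary by RHEL/CentOS release; devtoolset-2 is just one example), the newer toolchain installs next to the system one rather than replacing it:
  # Install a collection; nothing under /usr/bin changes.
  yum install devtoolset-2-gcc devtoolset-2-gcc-c++
  # Run a shell (or a single build) with the collection's newer gcc first in PATH.
  scl enable devtoolset-2 bash
  gcc --version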
> There is a maintained, stable API which guarantees that a properly written program will continue to work “forever”.
Ah, yes, "properly written". You and I both know many, many apps aren't "properly written". So you do have lots of apps that don't work on the now too old phone you just bought. "Properly written" is a big tent to hide under, and can excuse many of your distro complaints.
> The problem of “unmaintained” or “poorly maintained” software exists, of course, but it's mitigated by the fact that it's easy to write software which is compatible with a wide range of versions and hard to write software which is incompatible. You can, e.g., link your binary with the system version of OpenSSL (which will then be broken when OpenSSL is upgraded), but to actually do that you'll need to pull OpenSSL from the device, since it's not included in the SDK!
As a user, I'm poked. There are no mailing lists or other decent ways to notify users that their wallpaper app is stealing their personal information or bank credentials. Just install the app and hope for the best. I trust distributions far more in this regard. Apps being easy to write would make the problem worse.
> This is what Valve is trying to do with Steam. But if they succeed, then the next question will be: why bother with distributions at all? Why not drop support for Steam on Linux and keep only SteamOS? If distributions leave these problems to Valve, then I'm sure that's exactly what will eventually happen.
Steam is a game app store. Steam for Linux seems simply like a test bed for SteamOS. That SteamOS will let you do more than games is interesting to me.
> Distributions brilliantly solve the problem of using Linux as an “old-style Unix”: when one large (and expensive!) box was shared among many developers, distribution was not a problem (software was only ever used on the very same box where it was developed), and the ability to have a bazillion packages was both a boon (hey, I don't need to build libXYZ from source and waste my precious tiny quota, it's now available in our system as a system-wide library!) and a necessity (because HDDs were not large enough to cope with gigabytes of software). In today's world the problems they tried to solve are no longer as acute, and they fail miserably at solving the problems that actually matter today.
I don't agree with your characterization. Distros do a brilliant job of providing safe, easy-to-install-and-maintain platforms for servers, desktops, and other uses. Many (not all) distros also offer a wide range of software, backed by maintainers who work to ensure the software is current, secure, and works as well as possible. As we've all discussed, this is a different use case for phones.
> The app store model in the phone world often means that the original developer is doing the work.
> To make software work on any platform, Linux desktops, Android, Windows, Solaris, Mac OS, iOS, etc, someone has to "do the work".
> For the reasons you mentioned, I don't want the typical smartphone software model. I don't want software and hardware to throw away every few years. I don't want software without tighter interaction between the system and the application. I don't want to be without assurances that I'm patched against known security problems.
> You and I both know many, many apps aren't "properly written".
> So you do have lots of apps that don't work on the now too old phone you just bought.
> There are no mailing lists or other decent ways to notify users that their wallpaper app is stealing their personal information or bank credentials.
> My entire point on this thread is that phones could learn from desktops.
> Not even close. An app is published there. Once. Then people buy it and use it. If it does not work they get a refund (and even that is not always easy to do). That's it.
That's it, except when a new phone comes out. Except when a new feature becomes available. Except when a new version of Android comes out. The developer, not Google, not Apple, not the community, not the distribution does that work. You can argue that an app store is a better way for a developer to get his/her app directly to users. Fabulous! My point still is that somebody does the work, easy or not.
> Then you should stop pretending that you are doing something people should care about. They can continue to use other OSes and ignore what goes on in your little corner. Think OpenBSD: they don't really care to cater to people's wants and needs, and people are reciprocating (as in: no one cares, except for OpenSSH).
Going straight to hyperbole eh? Many major distributions offer SELinux, mailing lists, errata/alerts, targeted updates of libraries like openssl. Smartphone users are affected by security issues. Ignoring phone security will not fix the problem.
> Now you are painting a nice strawman. Sorry, but no. The phone you've just bought may come with an old version of Android, sure, but it'll work with a lot of “latest and greatest” programs.
In three sentences you make an accusation, then contradict yourself.
> Report it to Google. They will kill the app even if it was not installed via the Google Play service. Note that the appropriate feature was delivered to all users of all versions of Android starting from Android 2.3.
So if I'm a user and want to know if I'm vulnerable to something (please see my original comment) the solution is to notify Google of an app I already know is malicious?
> What exactly could they learn? How to make the life of the user miserable? There are a bazillion simpler ways to achieve that “nirvana”.
Ah, hyperbole again. My wishlist:
You could argue that what I'm describing is more akin to a Linux distribution. You'd be right. Taking the list alone, however, you wouldn't find anything incompatible with what's made smartphones successful today.
-DALLEG_SUFFIX="-static"
> cross-distro binary packages with bundled dependencies (read: games)
It's not because of something the distros are doing, it's because of something the library developers are doing: changing the APIs frequently.
> dbus to run his or her favorite game)?
teamviewer_linux_x64.deb, for example: try to install it on a new version of Ubuntu, then say how well it works. Distributions should stop using the ridiculous “it's all upstream's fault” excuse. In that particular case the underlying libraries are all there, with a proper ABI. The brokenness belongs to the ABI which is provided by the distribution, that is: an ABI for which “upstream” is the distribution itself!
Actually, on the technical level, it is a success. It shows that they managed to create a stable platform, something at which most (if not all) Linux distributions failed.
I disagree with your assertion that Linux distributions failed miserably to provide a stable and useful OS to users.
> And I'm not sure where you got the idea that this is the only measure of success, and that this is what "the community" is trying to become.
> We make GNOME 3: a complete free software solution for everyone.
If "API stability" is important to you, you have a choice of several distros that provide that stability.
But *nobody* provides that sort of guarantee. Not Microsoft, not Apple, not Google, not Red Hat, *nobody*.
Eh? Notwithstanding that the Win32 API is horrible, the fact is that stuff written for the official (and old!) Win32 API will work on the current Windoze 8. While the guarantee can't be 100% (both OS and software writers can and do interpret API docs differently, etc.), there is an enforced and concerted effort to make that guarantee as solid as possible. The same can't be said for any Linux-based OS, apart from Android.
That aside, you still haven't explained why you expect a large, dispersed pile of people who don't share any common ideals or goals to self-organize into a tightly focused entity.
Windows 2000 unified 9x and NT families, but who cares...
Nope, you just need to package all the libraries you need and you're golden. NDK is pretty stable.
That's a distinction without a difference. Microsoft tries to maintain stability for its entire documented API (unless otherwise noted), which is quite vast.
Nope. Win32 has a client-side GUI decoration rendering, but the actual drawing operations and event handling (call it 'X-server') are in-kernel.
Nope. The first beta-versions of DX were available through GameSDK. It was released just after Win95 as a separate download, but only the next version of DirectX became popular enough.
Well, yes. By the time of WinXP I could write pretty much everything using documented Windows API with near-perfect stability record.
ABI compatibility, not a free lunch
And eventually you deprecate, it may take a long time but so what?
If popularity is your only concern, then why use anything except Windows or iOS?
> Popularity is less important than success.
You've defined success as popularity earlier. This is a common mistake Android advocates make.
> The traditional Linux distribution model lacks on both counts when it comes to fulfilling the needs of end users and the developers that serve them.
Compared to Android, a traditional Linux desktop is far more successful at sane updates, package management, and I'd argue security. You need to define developer goals for the second half of your comment. You also seem to conflate phones and desktops. Phones can't serve different purposes and learn from desktops?
> All I am saying is that if anybody is 'playing catchup', it's going to be the Linux distros as an end-user platform.
You have no argument other than popularity.
Why would you need to upgrade your desktop applications?
> That stuff is why you are able to take a random laptop out of a store and install any Linux version you want and it will more than likely run. It'll run like shit, but it'll still run.
What's happened to the quality of your posts?
> Exactly how does deb/apt/yum/rpm actually solve this problem? Oh, it doesn't. It has absolutely nothing to do with it.
And because package managers have nothing to do with generic hardware/software, why did you bring it up?
> I can't apply security fixes to my Linux desktop without being forced to upgrade all my applications to their newest versions.
Heartbleed was a recently fixed system-library bug. "yum install openssl -y" fixed it for me. Nothing about updating "all my applications". With regard to Heartbleed and other vulnerabilities, I have various mailing lists and other ways to check and be notified if I'm vulnerable. The same is not true of telephone software.
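For illustration, one way to confirm that kind of targeted fix on an RPM-based system (the CVE id is Heartbleed's; the exact changelog wording varies by distribution):
  # The fixed package keeps the same upstream version; the changelog names the CVE.
  rpm -q --changelog openssl | grep CVE-2014-0160
  # Pull in just the fixed package, then restart affected services.
  yum update openssl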
> Maybe because you did? You moaned that Android versions have to be specific to certain tablets. That's because (1) the hardware is different across different tablets, and (2) Linux can't always probe to find it. Do you *really* want to brick your tablet because a Linux probe to find the graphics driver accidentally hoses the hard drive controller?
I'm afraid you're quite mistaken. Please read through the thread.
Pertaining to my comment, to catch up would be to adopt sane updates and maybe a saner distribution model than "app stores" and random forums. The original concern is downloading hundreds of megabytes of software every couple of weeks. This isn't progress, this is regression closer to the days of giant, slowly released service packs. In this light, typical Linux distros are way, way ahead.
> Going to an OS-level package management model like Apt/Yum would be a HUGE step backwards.
Because? Hundreds of megabytes of updates is a good thing for small changes? Not every update in Android requires a re-flash, certainly.
Drag, your package count is misleading
So far, as a CyanogenMod + f-droid.org user of some years, I'm not really impressed by Android. It is more or less usable, but it's not comparable to a real system like Debian or any other Linux distribution.
Yes, my music collection (mainly swing, tango, folklore, classical) is tagged. Generally I'm using title, artist (= orchestra), performer (= singer), date (= recording date), composer, lyricist, and genre. In some rare cases I even have location, language, or album. So far, none of the Android music players let me use this information.
