Why Steam on Linux matters for non-gamers
Posted Sep 19, 2013 13:21 UTC (Thu) by drag (guest, #31333)
In reply to: Why Steam on Linux matters for non-gamers by Chousuke
Parent article: Why Steam on Linux matters for non-gamers
Even if Steam weren't DRM'd, most Steam users would still happily use it, because the distribution model it uses is very effective.
The traditional Linux distribution model is fine for what it is, but there is a huge amount of free software that it will never be suitable for. It has effectively hit its scalability limits due to the centralized packaging approach.
Posted Sep 19, 2013 14:31 UTC (Thu)
by ewan (guest, #5533)
That's got a lot more in common with the Linux distro model than it does with the other traditional approach of getting each bit of software from its individual vendor separately.
Posted Sep 19, 2013 14:55 UTC (Thu)
by mpr22 (subscriber, #60784)
Posted Sep 19, 2013 18:25 UTC (Thu)
by Gerardo (subscriber, #37539)
Posted Sep 19, 2013 20:25 UTC (Thu)
by khim (subscriber, #9252)
Posted Sep 22, 2013 8:32 UTC (Sun)
by krake (guest, #55996)
Posted Sep 22, 2013 11:44 UTC (Sun)
by khim (subscriber, #9252)
Really? OK, then. Let's consider a simple example: I'm producing a game for the upcoming Ender's Game launch, which means the game should become available in the first week of November 2013, but not before the film's release (November 1). And it should be held back till December for Australia and till January for New Zealand. How can I do that in the "Linux repo" model? When should I apply for the position of package maintainer, when will I receive said position, and how could I choose the day of release? Note that while Apple does not formally guarantee that your app will be approved by a certain date (which creates some grief for some developers), it does, of course, say that once your app is approved, you use iTunes Connect to release it by setting the date when the app will be available to customers.
Posted Sep 22, 2013 12:14 UTC (Sun)
by krake (guest, #55996)
Even huge releases, such as GNOME's or KDE's full product portfolio, are usually available on the day of release, customarily with only a single week of uploading and automated reviewing/testing.
I think it would be reasonable to assume that for something far smaller, like a single product, this could be cut down to a day or two, maybe even just hours.
Posted Sep 22, 2013 13:10 UTC (Sun)
by khim (subscriber, #9252)
Try it. Unless you are offering a well-known product you'll need to spend an inordinate amount of time trying to explain why you want this product included, and even if you succeed (which is not guaranteed if you are not willing to open-source your product) it will only be included in the next version of the distro. There are exceptions (like the Ubuntu shop), but these appeared after the rise of app stores and, in fact, try to emulate them.
Posted Sep 22, 2013 14:45 UTC (Sun)
by krake (guest, #55996)
My software is currently packaged by a respective service provider so I don't know exactly how they do it, but I might be intrigued enough to try for myself next time I create something new.
I would expect that publishing times vary quite a lot between channels, just like they do on mobile app store channels.
As far as license restrictions are concerned, there might be some for Linux distribution channels, just like there are for app store channels. Since I haven't read up on any of the respective legal documents I can't say for sure, but I would be surprised if they were more limiting than mobile app stores.
Well, for GNewSense maybe, but then that is their goal.
Posted Sep 22, 2013 17:03 UTC (Sun)
by khim (subscriber, #9252)
IOW: you don't know anything about how distributions work, you don't know anything about their strengths and weaknesses, and you only assume they behave in some way you expect them to. Sorry to disappoint you, but they don't behave as you would expect: delays measured in months and years are not uncommon (as I've already pointed out) and it's almost impossible to push your software via a regular distribution unless you publish it under an open-source license first. The most you can expect is to build your software in the form of a package suitable for some Linux distribution (using the openSUSE Build Service or anything else) and then try to convince people to download your stuff from your server and install it manually.

There exist analogues of the app stores for Linux (Ubuntu's shop is one example), but these are separate entities and they were brought into this world after the rise of Steam and app stores, not before. And they are still not as easy to use as their counterparts from the MacOS/Windows and mobile worlds (the SDK is still in a preview state, e.g.). Well, that's something. It took a decade (Steam is over 10 years old, remember?) for the Linux guys to finally grok the difference between distributions and app stores; hopefully it'll take less than 10 years to finally catch up with the rest of the world on the availability and usability side.

P.S. The really funny thing is that the first Linux app store was created before Steam, but of course everyone derided it and insisted that they didn't need anything like that when they had regular Linux distributions where, you know, the "community" is in charge, not the actual developer of the software. Because, you know, one cannot trust the developer (which is, ironically, true to some extent, but then developers can only stand so much abuse till they leave… and offering the highest amount of abuse and the least lucrative users was not a winning combination).
Posted Sep 22, 2013 18:20 UTC (Sun)
by lsl (subscriber, #86508)
That, too, is a good thing. Think about it: why would I make random code part of my product that I can't fix and can't even _look at_ to determine what it actually does? This is crazy.
Posted Sep 22, 2013 19:11 UTC (Sun)
by khim (subscriber, #9252)
Why should it be part of your product? Why can't a distribution play the role of, you know, a distribution mechanism and give me the ability to install the software I want and bug the author of said software in case of error? Why must everything pass through the distribution's buildfarm? If you don't want to have your good name “tarnished” by such an awful thing as a popular game or a usable CAD then create a separate channel (as Ubuntu did). Leave the “there should only be FOSS” craziness to gNewSense—at least those guys don't pretend they make something Joe Average can use. I respect those guys as much as the Free60 or KolibriOS guys: they are creating something they like, they know that what they are creating will never be mainstream… and they are OK with that. Fine. That I can understand. But when you create a general-purpose operating system, why don't you make it, you know, general-purpose?
Posted Oct 2, 2013 11:52 UTC (Wed)
by krake (guest, #55996)
I obviously don't know how all distributions work and how everything works for them, but I can definitely draw my conclusions based on what I see in those I am monitoring closely enough.
> Sorry to disappoint you, but they don't behave like you would expect, delays measured in months and years are not uncommon
Sorry to disappoint you, but no.
Heck, sometimes the time between tagging (if I am privileged to that information) and the package availability being announced via an automated notification mail is a couple of hours. And that process involves several steps, including building, testing and uploading.
> it's almost impossible to push your software via regular distribution unless you publish it under open-source license first
That "almost" must be easily surmountable; in all those years of using various Linux distributions I've installed proprietary software an uncountable number of times. The only times this wasn't available was when the vendor didn't allow redistribution, but then no other form of software repository would be able to distribute it either.
Posted Oct 2, 2013 14:12 UTC (Wed)
by khim (subscriber, #9252)
Sure, but if I'm a developer I'm usually not interested in that time. I'm interested in deploying the application to the end user, not to a “distribution channel”. Distributions which give you the ability to push your update into the stable channels exist (Fedora, for example), but usually you need to wait for the next release, which happens quite infrequently. Why not? Ubuntu does that today. It's a relatively new development, of course, but it's perfectly doable.
Posted Oct 2, 2013 17:59 UTC (Wed)
by krake (guest, #55996)
Well, the time to the channel is the one controlled by a third party. The time to the upload is controlled by the developer, the time from the channel is controlled by the user.
Once the package is available on the channel, the user needs to pull it from there. The factors of that are the user becoming aware of the package and the time needed for download.
The time to awareness will depend on many factors, like whether the user visits the channel regularly (or has that automated), or whether the channel sends out notifications, or whether the vendor sends out notifications.
The distributor can of course influence those, e.g. by restricting bandwidth, but why would they?
> Why not? Ubuntu does that today. It's a relatively new development, of course, but it's perfectly doable.
I have no experience with Ubuntu, so just to be sure: you are saying that Canonical's shop redirects you to a vendor-controlled download server once the purchase is completed?
Posted Oct 2, 2013 18:29 UTC (Wed)
by khim (subscriber, #9252)
Only if the user is adventurous enough to play with "unstable" channels. Most users don't want to visit unstable channels, which can very well kill their system, just to receive a stable version of some software. They do this for various reasons, but usually a new version is only pushed to the stable "channel" when a new stable version of the distribution is released. Which may take months if you are lucky, or years if you are not.

No. It gives you access to an app store which looks like a typical app store, where you only need to grant the right to distribute your application under certain conditions, where you can verify license keys, etc. When Oracle decided to retire the Operating System Distributor License for Java, for example, Java was removed from the Ubuntu store.
Posted Oct 2, 2013 18:54 UTC (Wed)
by nybble41 (subscriber, #55106)
The "app store" model *is* an "unstable" channel. The point of calling the channel "unstable" isn't that it contains unstable software, but rather that the software hasn't been tested together as part of a stable distribution. Putting your app in an app store and telling users to install from there is no different from putting it in e.g. Debian sid. Either way, you're asking them to install software which may be stable on its own but remains unproven as part of a larger system.
> No. It gives you access to the app store which looks like a typical app store, where you only need to grant the right to distribute your application under certain conditions, where you can verify license keys, etc.
Yeah, I can see why proprietary software vendors might want that, but my impression is that most users of FLOSS operating systems prefer similarly FLOSS applications. We don't want "app stores" selling locked-down proprietary applications which fail to grant essential rights and require things like license keys. That's the sort of thing we switched away from Windows or iOS to avoid.
Posted Oct 2, 2013 19:02 UTC (Wed)
by krake (guest, #55996)
Multiple channels for different usage scenarios might not be common for app stores yet, or they might handle the different requirements differently, so it is understandable that some of the more traditional channel names lead to wrong conclusions :)
They are not about the stability of the software but the stability of the version, meaning "unstable" will see faster changes to the version than "stable".
It might help if you think about the levels in terms of update speeds.
Private end users usually want to have new software fast, so they use the fastest channel.
End users in controlled environments might only have access to some medium speed channel, much like getting operating system updates from a company internal update server so that sysadmins can decide when to clear updates on a case by case basis.
Administrators of mission-critical systems will usually subscribe to multiple channels: slow ones for non-critical updates, fast ones for critical updates.
The multiple channels are basically a feature previously only available to enterprise customers of large software vendors.
> They do this for various reasons, but usually a new version is only pushed to the stable "channel" when a new stable version of the distribution is released.
You are confusing two different things.
You are talking about a service provided to those who prefer controlled update points over getting the most recent release as quickly as possible.
> No. It gives you access to the app store which looks like a typical app store, where you only need to grant the right to distribute your application under certain conditions, where you can verify license keys, etc.
Right, that's what I thought as well. An earlier comment suggested that the Ubuntu Store and/or other stores could distribute software without the software's vendor having allowed that.
Posted Oct 2, 2013 20:10 UTC (Wed)
by khim (subscriber, #9252)
Not really. Users want some software updated fast and most software to stay stable. I don't particularly care whether the LibreOffice I use occasionally to print some documents is up-to-date or not, and I certainly don't want to see it changing suddenly (as it did with conditional formatting, which made it much harder to use), but for programs which I use frequently and which I want to help develop I want to see the latest version available. If I'm a software developer that will be GCC, and if I'm a crazy painter (sane painters use Photoshop on Mac or Windows) it'll be GIMP.

BTW, about GCC: I use MSVC 2012 to develop software which is usable on all systems from Windows XP (released a dozen years ago) to Windows 8.1. Can I use GCC 4.8 to develop software usable on Wheezy? Wheezy was released less than half a year ago, right? No? Why the heck not? Weren't Linux developer tools touted as superior to Microsoft's "junk"?

True. Sysadmins here support Ubuntu 12.04 LTS and Windows 7 (which is older by two years than said version of Ubuntu), but I can use GIMP 2.8 (and the aforementioned MSVC 2012) on Windows, yet only GIMP 2.6 and GCC 4.6 on Linux. This makes me sad.

What about users who don't want it? You have been touting the virtues of this "perfect" model for so long, yet I still see no instructions which explain how to install GIMP 2.8 on Ubuntu 12.04 LTS. I can probably build it from sources, but, well... distributions were invented exactly to make sure I would not need to do that, right?
Posted Oct 3, 2013 2:25 UTC (Thu)
by hummassa (subscriber, #307)
Ok, let's lay down the pipe.
How come? Hm, khim *must* be right, let's check it, just in case. Get a random C program, compile it with gcc-4.8 under sid, copy the binary to a wheezy machine. Works. OK, copy the binary to a lenny machine. Still works. Nope. Not right.
Packages.debian.org says that gimp in wheezy is 2.8.
I conclude *I* must be intoxicated, and that I cannot parse you anymore.
Posted Oct 3, 2013 2:48 UTC (Thu)
by dlang (guest, #313)
The correct test would be to compile on Wheezy and try to run the resulting binary on Sid.
You can't compile a program with the latest Windows 8 APIs and run it on older versions of Windows either.
If you could always run new software on old systems it would mean that you are not using any new features, which would stop all development.
Posted Oct 3, 2013 2:59 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
Posted Oct 3, 2013 3:21 UTC (Thu)
by dlang (guest, #313)
> Actually, you can do it just fine. As long as you don't use the API in question.
while this is technically true, when you compile on a new system, you pull in new versions of libraries (or at least library headers), and so you are going to end up using new versions of the various APIs unless you go to a lot of effort to avoid it (much more effort than just compiling on an older system)
Posted Oct 3, 2013 3:34 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (11 responses)
>#define WINVER 0x0501
And that's it. Really.
Now, how can I do this with Ubuntu 13.04? I'd really like my programs compiled against the system glibc to work on RHEL6.
Posted Oct 3, 2013 3:42 UTC (Thu)
by dlang (guest, #313)
On windows you may be able to do that for some microsoft controlled libraries because they control everything.
but if you use any third party libraries, you can't just set WINVER and count on it working, because the library version isn't tied to the windows version.
The way to do the equivalent on Linux is to get a set of the headers for whatever version you want to compile against, and use those for your compile, but Ubuntu doesn't provide RHEL libraries any more than RHEL provides Slackware libraries.
Remember that most of the APIs for RHEL were frozen well before 6.0 was released in 2010, so what you are asking for is even harder than saying that you want something compiled against the 13.04 libraries to work against Ubuntu 10.10 (or more likely 10.04) libraries.
If you want everything for an ecosystem to be controlled by one entity, go get yourself a Mac.
Posted Oct 3, 2013 4:00 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
> the way to do the equivalent on Linux is to get a set of the headers for whatever version you want to compile against, and use those for your compile, but Ubuntu doesn't provide RHEL libraries any more than RHEL provides Slackware libraries.
Compiling stuff on Windows for older versions of OS? Easy, just set WINVER and maybe a couple of compiler switches. And don't forget to package your libraries. On Mac OS X it's the same (__MAC_OS_X_VERSION_MAX_ALLOWED), except that there's also a standard bundle format.
On stock Linux distributions it's basically impossible. You have to install a parallel environment in some kind of container.
Posted Oct 3, 2013 10:48 UTC (Thu)
by khim (subscriber, #9252)
Actually it's possible to do: as I've pointed out earlier, LSB solves this problem. What you cannot really do is use a newer version of GCC. Not because of glibc, but because of libstdc++. Just like glibc, libstdc++ is not designed to be linked into many different modules, and it's not forward-compatible. LSB solves this problem, too, but only up to GCC 4.6 (since libstdc++ is part of GCC you need to modify GCC itself; you cannot just modify headers and libraries). This is a valid approach, too.
Posted Oct 3, 2013 11:08 UTC (Thu)
by hummassa (subscriber, #307)
This is incredible BS. LSB 4.1 is supported perfectly by Debian, via the installation of the (oooh, surprise) "lsb" package. Even C++ packages compiled on Fedora 12 work if you install the (surprise, again!) "lsb-cxx" package. Just "alien"ate it, install it, voilà.
Posted Oct 3, 2013 11:46 UTC (Thu)
by khim (subscriber, #9252)
That's not support. That's a joke. It reminds me of my first experience with Debian many years ago. I used Red Hat back then with a couple of custom PAM modules and wanted to play with Debian. And the shiny new (back then) version of Debian supposedly included PAM support. OK, fine, I installed Debian and tried this “support”. Can you guess what happened? Right: it was possible to install PAM and get the shiny new…

Sure, you can use it to run some LSB-compatible software (which does not really exist), but can you develop something for Debian using LSB? How will you know which version of LSB to target to support which version of Debian? Where will you find libraries to access APIs not included in LSB (and there are many APIs not included in LSB)? The LSB package, yes, that's supported by Debian. LSB development… nope. LSB was developed as a good basis for the hypothetical “Linux distribution SDK”, but nobody bothered to actually make one. Because it's more or less impossible without help from distribution makers, and distribution makers are not interested.
Posted Oct 3, 2013 13:11 UTC (Thu)
by peter-b (guest, #66996)
Posted Oct 6, 2013 12:12 UTC (Sun)
by krake (guest, #55996)
The problem is not the interest of distribution makers; the problem with LSB is that it codifies what RHEL ships at the time of an LSB release.
To be actually useful for both SDK and application vendors, it would have to specify a situation that distributions would then strive to implement.
I've been to an LSB meeting once. SDK providers asked for a binary-compatible update of some libraries for the standard's next(!) version, but the request was turned down because those versions would not already be available in RHEL at the time of the standard's release.
Posted Oct 3, 2013 14:49 UTC (Thu)
by jwakely (subscriber, #60262)
Could you expand on this part please?
What changed after 4.6, and what changes can you make to "fix" it?
Posted Oct 3, 2013 16:36 UTC (Thu)
by khim (subscriber, #9252)
Nothing has changed after 4.6, but as I've said, it's a lot of work to support a newer version of GCC, and the Linux Foundation has only done said work for GCC 4.6, not for GCC 4.7 or 4.8. The problem with LSB is that it tries to solve the backward-compatibility problem by adding some packages “on the side”, which obviously does not work: it's very easy to break compatibility if you are not thinking about it, and the only way to make sure it's not broken is to use it. Compare the situation with the Android SDK, e.g.: Google actually builds things using said SDK, which means that if it's broken, problems are quickly found and fixed. And if some important APIs are needed, then they are added to the SDK, too. Distributions don't use LSB to build anything, which means that problems with LSB are very low priority (if they have any priority at all).
Posted Oct 4, 2013 14:19 UTC (Fri)
by jwakely (subscriber, #60262)
Posted Oct 4, 2013 10:40 UTC (Fri)
by nix (subscriber, #2304)
Posted Oct 6, 2013 12:27 UTC (Sun)
by krake (guest, #55996)
Yes, but I didn't claim differently. We were talking about the general update rate of the distribution channel, i.e. the time between a package being released/uploaded and it being available to users.
My guess is that you are somewhat confused by common channel names.
An "unstable" channel reacts almost immediately to package uploads, a "stable" channel updates its package list at pre-determined dates.
The common app stores are all "unstable" channels, there is no coordination between publishing times of packages of different vendors.
> What about users who don't want it?
Channel selection is the user's choice on a private end user's machine, the administrator's choice in a managed setup.
Posted Oct 6, 2013 13:07 UTC (Sun)
by khim (subscriber, #9252)
I'm not sure what you are talking about, but I'm talking about user experience on one side and developer experience on the other side. The user [naturally] does not want to know about channels, update rates and other such things. S/he just wants to play the latest Angry Birds or visit some website using Chrome. Developers want some way to make that possible. Everything else is technical details.

Sure—but that is what users are supposed to use! Last time I checked, Debian's position was that “yes, you can use Debian unstable, but there are no promises”. If users are not supposed to use that thing (except for a brave few) then developers cannot use it to deliver their software to the users. With Android or Windows the user can decide if s/he wants the beta release or the stable release (even on a corporate phone/desktop… well, if the developer offers a beta at all, of course), while on Linux the user cannot even use the latest stable version!

Do you want to imply that it's perfectly reasonable to install “unstable” on a mission-critical server or on the CEO's desktop? Because that does not look like what the Debian project implies in their explanations.

Well, that makes sense for the OS, but why should “an IT department” decide what kind of software must be installed there? An IT department decides what should be installed here, of course, but then individual teams decide if they need MSVC 2010 or MSVC 2012, e.g. Note that both of them go on top of Windows 7, because Windows 8 (released almost a year ago, remember?) is considered “too new and too risky”. With Linux there is no such choice: either one picks the “you are on your own” unstable channel or one is stuck with years-old programs.
Posted Sep 22, 2013 18:04 UTC (Sun)
by lsl (subscriber, #86508)
No, you don't. Almost no distribution requires a justification for packaging some piece of software as long as it complies with the packaging guidelines. "It's useful and I'm willing to do the work" is justification enough. Do you have a counter-example? RHEL et al. obviously don't count.
The single major reason delaying package reviews is packages conflicting with the guidelines. In practice this often boils down to one of the following: bad interaction with other parts of the system which the submitter did not anticipate, licensing issues, bundled libraries and, of course, broken packaging due to an inexperienced submitter.
Those issues usually get resolved but the review is stuck until then.
Also, at least in Fedora, it's the maintainer who decides in which version the new package will appear. It's just that many choose to only ship it for the current and future versions.
Posted Sep 22, 2013 18:53 UTC (Sun)
by khim (subscriber, #9252)
You forgot to mention that it must be free, which automatically excludes most of the software available. And “I'm willing to do the work” means not “I'm ready to read the guidelines and make a package in accordance with them”, but rather “I'm ready to change my package when the guidelines change”. Not even Apple (which offers access to the most lucrative market out there) does that! It may refuse to allow a newer version into the store if you don't upgrade it in accordance with the updated guidelines, and it may remove it if it finds out that you actually violated a third-party license or something like that, but in general: what was added to the App Store once will still be available there years later.

There is nothing wrong with trying to push FOSS, but this goal is fundamentally incompatible with the goal of having a sizable presence on the desktop. You could always contact the RPM Fusion maintainers, of course, but it's not part of Fedora and there is a large wishlist already, which includes such pearls as packages which already exist for RPM Fusion Russia, Chromium (which violates Fedora's policies on handling libraries) and various other issues.

I know. But Fedora is a rare exception: most other distributions only upgrade packages in exceptional circumstances, even if the maintainer wishes otherwise. Which, frankly, makes absolutely no sense from the user's POV. The currently used distribution model basically asks "do you want to change your wardrobe, your car and your house once per month or once per five years?" Which is ridiculous: I want to change my wardrobe when it goes out of fashion, but I only change my house once per few years (some people never change their house). The offer to completely redecorate my house just to get a new tie sounds crazy, but somehow the offer to break my [perfectly working and tuned up] desktop environment just to get a GEGL-based GIMP is normal? Gosh.

Frankly, I don't know what "problem" distributions are solving in their current form. Disk space savings? Come on: my phone has 32 GB of flash and my desktop has a terabyte (well, four, actually, but who's counting?)! Why not install all the essential components from the start and allow me to pick and choose the rest? Ah, security… but why is security affected to such a degree by just installing features I'll not use? All these "numerous dependencies" are mostly shared libraries which represent just a random set of bytes as long as they are stored on my system but not actually used by programs.
Posted Sep 22, 2013 20:19 UTC (Sun)
by lsl (subscriber, #86508)
They provide convenience. Tons of it. Aside from my web browser, the upstreams of virtually all the software I use ship source code and source code only. This has nothing to do with any ABI instabilities; they don't ship binaries for Solaris or OS X or whatever either.
For the majority of those programs I don't really care what exact version I'm running, just that it's recent and bugs get fixed. So I simply install it from the distribution repo and get it updated through yum/apt/zypper/pacman. I don't have to track upstream myself.
Only for some projects which I care deeply about might I want something other than what the standard distribution offer can provide. There I just pull the code from git or hg and work with that. I get to pick where to invest my time. I don't have to track the upstreams of hundreds or thousands of programs and libraries just to keep the system running. Oh, and since I'm already involved with the code and the upstreams I care about, why not help my distro make nice packages of it so others don't have to care?
The system works very well. Not for everybody and not for every kind of software but for many.
Posted Sep 22, 2013 20:50 UTC (Sun)
by khim (subscriber, #9252)
Really? I somehow don't find it more “convenient” when I find out that to run the latest version of GIMP I must upgrade the whole thing, with unpredictable results. Or maybe you wanted to say that they provide “convenience for packagers”? Well, maybe, but I, as a user, am not impressed.

If you don't care about the version of the software then why would you want to track it at all? This makes no sense! Something like a monolithic Android release would work just as well. And what do you do with the few programs which you do care about? Wow. That means that if I want to just use some features from the latest released version of Inkscape I'll need to become a co-developer? Thanks, but no, thanks. Most of my friends are not software engineers, they don't want to be software engineers; they just want to draw, or play, or write. They do not want to pull the code from git or hg and work with that—and, increasingly, neither do I. The fact that the source code is available should not mean that only someone who can actually track dependencies and compile it deserves to use it.
Posted Sep 22, 2013 21:21 UTC (Sun)
by lsl (subscriber, #86508)
Uhm, I wrote that. I want a somewhat recent version that gets bugfixes and new features. If I just install something from upstream one time and then forget about it I don't have that. Aside from that 'yum install gimp' (which gives me 2.8.6, btw) is much more convenient than installing GIMP from upstream.
> Something like monolithic Android release will work just as well.
I don't understand what you mean here. Who is going to create a monolithic release image with all the programs I want to use? The result of just stuffing a whole distro archive with >10k packages into some image isn't going to be acceptable to most people.
Posted Sep 22, 2013 21:32 UTC (Sun)
by khim (subscriber, #9252)
Why would you want to stuff >10k packages into the image? You only need to include libraries with enough users. If there are 2-3-5 packages which need a particular library they can carry it with them; it's not a big deal. This simple procedure will shrink your list from >10k packages to 300-500 packages (or maybe even fewer), which can easily be included in a single image. Programs themselves can come from the program authors: somehow this works for MacOS, Windows, Android and iOS, so why wouldn't it work for Linux?
Posted Sep 27, 2013 8:41 UTC (Fri)
by Quazatron (guest, #4368)
[Link] (1 responses)
Every single piece of software you have on your system can potentially be exploited to escalate privileges. Obviously, the less cruft you have lying around, the smaller the probability of having an exploitable bug.
It's the traditional convenience vs. security trade-off.
Posted Sep 27, 2013 14:54 UTC (Fri)
by raven667 (subscriber, #5198)
[Link]
Posted Sep 22, 2013 18:10 UTC (Sun)
by lsl (subscriber, #86508)
[Link]
Playing such control games just doesn't work in an open source environment, and that's a good thing. It doesn't even work well in today's proprietary world, and more and more producers switch to a whole-world-on-the-same-day model.
Posted Sep 19, 2013 19:26 UTC (Thu)
by andrel (guest, #5166)
[Link] (4 responses)
Posted Sep 20, 2013 9:15 UTC (Fri)
by mpr22 (subscriber, #60784)
[Link] (3 responses)
Posted Sep 22, 2013 8:28 UTC (Sun)
by krake (guest, #55996)
[Link] (2 responses)
Steam of course could be different and not require any manual work on Steam's part but the mobile app stores all seem to require review by the store provider.
I haven't followed app store policies closely enough to be sure, but I am not aware of any of them guaranteeing an upper bound for delays.
Posted Sep 22, 2013 11:26 UTC (Sun)
by khim (subscriber, #9252)
[Link] (1 responses)
There is no guarantee because there are scanners which try to catch and stop the publishing of malware, but in most cases the promise that your app appears in the store listings within hours, not weeks, is satisfied.
Posted Sep 22, 2013 12:01 UTC (Sun)
by krake (guest, #55996)
[Link]
So for Google it is closer to uploading to a Linux distribution repository, with the assumption that the uploaded package will show up on mirrors within the next synchronization cycle.
I guess Apple is the odd one out then, with their infamous reviewers.
Posted Sep 20, 2013 0:04 UTC (Fri)
by lsl (subscriber, #86508)
[Link]
Posted Sep 22, 2013 8:38 UTC (Sun)
by krake (guest, #55996)
[Link] (15 responses)
So the resulting difference is more that in the "App store" model the developers have to be the packagers, while in the "Linux repo" model they can be the packagers.
Posted Sep 22, 2013 11:32 UTC (Sun)
by khim (subscriber, #9252)
[Link] (14 responses)
No, the resulting difference is that it takes hours to publish something in an App store and months to publish something in a distribution—which makes all kinds of marketing efforts absolutely pointless: either your offering will not be available in the distribution when the ads go out, or it will have been available for weeks if not months and will be "old news" when the ads go out.
Posted Sep 22, 2013 12:06 UTC (Sun)
by krake (guest, #55996)
[Link] (13 responses)
Any distribution where it would take months, or even just weeks from upload to becoming-available would lose all its developers pretty fast.
Posted Sep 22, 2013 13:06 UTC (Sun)
by khim (subscriber, #9252)
[Link] (12 responses)
Really? News to me. I'm talking about new packages here—and in most cases these are not added to the "stable" repo but are only included in the next release. Which means months of delay. Sure, a few distributions are doing better (Arch and Gentoo come to mind) and very few offer what people really want (the ability to get this-new-game-everyone-talks-about yet keep "good old" productivity applications untouched). Or rather: most distributions offer such an ability in some form, but few actually support it (any mix-and-match installation is usually considered "tainted" and people will usually ask you to stop doing that). And even in the distributions which do offer to push applications between releases, all the testing must be done in the open, which means you cannot really do controlled beta-testing. IOW: App Stores are more-or-less a continuation of brick-and-mortar shops, where developers and users interact with limited control by mediators (shop owners), while distributions put packagers firmly in control, which means both developers and users are not satisfied and go elsewhere.
Posted Sep 22, 2013 14:33 UTC (Sun)
by krake (guest, #55996)
[Link] (11 responses)
Yes. The software business requires fast feedback cycles. Waiting for months for an uploaded artifact to appear on the channel would make that impossible.
> I'm talking about new packages here
Of course. Old packages usually remain on the channel and don't have to be re-uploaded at regular intervals or anything like that.
> And even in the distributions which do offer to push applications between releases all the testing must be done in the open which means you can not really do a controlled beta-testing.
True. The beta testers select what they test, not the developers who test their software.
> IOW: App Stores are more-or-less continuation of brick-and-mortar shops where developers and users interact with limited control by mediators (shop owners) while distributions put packagers firmly in control which means both developers and users are not satisfied and go elsewhere.
This is interesting and very welcome news to me!
All app stores I had observed so far did require pre-packaging, i.e. they only allowed uploads of packages, making them equivalent to distribution repositories, which require the same packaging step.
Even worse, I had experienced situations where some target platforms required packaging even for local testing!
The only system that improved on that which I knew about was the OpenSUSE Build Service, which takes care of packaging and distribution and thus, as you said, only puts very limited obstacles between developers and users.
Good to know that some app stores have incorporated that now as well!
Posted Sep 22, 2013 16:36 UTC (Sun)
by khim (subscriber, #9252)
[Link] (10 responses)
Have you actually checked what happens in the real world, or do you want to discuss some imaginary alternate reality? Simple check: GIMP (a very popular image manipulation program) and Ubuntu (one of the most popular Linux distributions). If you use the latest LTS version you are stuck with GIMP 2.6.12 (released 1.5 years ago), if you use the latest stable version of Ubuntu you are stuck with GIMP 2.8.4 (released over half a year ago), and only if you pick the unstable and unfinished version of Ubuntu do you finally get GIMP 2.8.6 (which is the latest stable version of GIMP). Can we, please, discuss the realities of this world, not your fantasies? Well, sure. How else is this stuff supposed to work? This may not be very convenient, but I don't see what's so problematic about that. The OpenSUSE Build Service really solves a minor and mostly unimportant part of the problem. The really hard stuff happens before building and packaging (when the developer needs to actually write that stuff, you know) and after packaging (when QA must test the result and it must somehow be delivered to the end user). It's not bad software, but it does not even try to solve the distribution problem: for a distribution channel to work it must come pre-installed with your OS (or it must be promoted heavily, like Amazon's Appstore), and AFAICS the OpenSUSE Build Service does not even have a UI for that. With Google Play you reach half a billion users or so, with Apple's App Store you reach two million premium users (the most lucrative ones) or so, even with Samsung's store or Amazon's store you reach a hundred million users or so. Who do you reach if you pick the OpenSUSE Build Service? And how?
Posted Oct 2, 2013 12:11 UTC (Wed)
by krake (guest, #55996)
[Link] (9 responses)
Yes I have and I do. Almost daily. Email notifications can be a handy thing.
> Can we please, discuss realities of this world, not your fantasies?
That's what I do, but we can discuss your fantasies if you prefer.
For example, in order to discuss the situation for the GIMP packages, can you provide the datetimes of their upload and those of their respective appearance on their distribution channel?
I am unfortunately neither monitoring GIMP nor Ubuntu so I lack data on when the packages were uploaded into the package-in-queue and when the packages appeared in the repository they were uploaded to.
> Well, sure. How else this stuff is supposed to work?
Don't ask me. Your previous comment seemed to indicate that app stores somehow did not require any packaging before uploading while Linux repositories did. Hence me being positively surprised, since it wasn't what I had heard and witnessed so far.
> This may not be very convenient, but I don't see what's so problematic about that.
I didn't claim it was problematic, I just personally find it highly inconvenient given the alternatives on systems that do not require that.
> OpenSUSE Build Service really solves minor and mostly unimportant part of the problem
Maybe. It was just the only thing that came to my mind that had the properties of what you seemed to describe, i.e. not requiring any packaging as part of the upload process.
It is of course not relevant for the discussion if none of the app stores allow you to do that either.
Posted Oct 2, 2013 14:33 UTC (Wed)
by khim (subscriber, #9252)
[Link] (8 responses)
GIMP 2.8 was added to Debian “Tue, 08 May 2012”. It became available to Debian users “Tue, 04 May 2013” (when Wheezy was finally released). That's almost a whole year! And it's not available in Ubuntu LTS to this day. Why are you interested in this time? Sure, that time is measured in hours, but end users don't see the package for months and years. And if I'm a developer I'm usually interested in the last one: what does it change for me if the package is available in some obscure place, if users don't see it and can't use it? The important property of app stores is the emulation of regular stores. The app developer determines the date when the software is available (it must be submitted in advance if a review process is involved, but even Apple allows you to pick the exact time when a reviewed and approved app will become available to the end user), and the end user determines when s/he wants to upgrade it (or install it for the first time). Distributions reserve all the timing rights to themselves: the developer cannot affect this time at all, and the user must either update everything or nothing. Not cool. Not cool at all.
Posted Oct 2, 2013 18:33 UTC (Wed)
by krake (guest, #55996)
[Link] (7 responses)
Added as in uploaded or added as becoming part of the repository?
> It became available to Debian users “Tue, 04 May 2013”
That would suggest that the first of the two dates was the upload date, not the date when it was added to the Debian package repository; however
> when Wheezy was finally released
suggests that the second date is a Debian release date, which makes it highly unlikely that it is also the date of the package being added to the repository (nobody adds new packages on the day of release).
So from those pieces I gather that GIMP 2.8 appeared in the Debian package respository on Tue, 08 May 2012.
According to Wikipedia that version was released on Thu, 03 May 2012.
So, assuming a fully ready (packaged and tested) GIMP was uploaded on that day, it took 5 days for the package to become available.
> Why are you interested in this time?
It is the only time that is added by the app store or repository provider. All time before that is controlled by the software vendor, all time after that is controlled by the person or automated system installing the software.
> Sure, that time is measured in hours, but end-users don't see the package for months and years
So we are talking about users who neither use automated update checking nor manually open the store frontend, or who have it configured to not download new package lists on open?
The only app store that I have experience with that works differently is the PlayStation store, which displays new games in some kind of overlay when the user idles in the main menu. Neither Nokia's Ovi Store nor BlackBerry's App World ever did that on my devices.
> App developer determines the date when software is available
Specifying a later date, as in wait for a certain date even if approved earlier, is the only difference that I can see.
I wonder if anyone has ever approached any Linux distributor with a request for that and if, why they decided not to add that capability.
Posted Oct 2, 2013 20:10 UTC (Wed)
by khim (subscriber, #9252)
[Link] (6 responses)
Does it really matter? That's the date from GIMP's package changelog, and I don't care about a few days. Years - yes, those I do care about. You have a very strange definition of "available package". To me it's very simple: a package is available when it's available. That is: when I can actually use it. If I start the package manager, the app store application or the app store's website and cannot click "Install" (or "Upgrade"), then the package is not available. Believe me: most users have this exact definition of package availability. Remember? Simple things should be simple, complex things should be possible. You want to skip the first part and talk only about the second one, but in reality the first part is more important! Ok, perhaps I'm missing something obvious. Let's consider one simple example. My needs are simple: I want to develop software usable on Ubuntu 12.04 LTS and use GIMP 2.8 to edit my photos. What should I do to achieve that? We are talking about “Joe Average” use. The one who never installs a new OS (when it's broken beyond repair s/he calls the support people, who will do a new installation for a fee) yet wants to use a couple of recently released programs (you know: games, maybe some photo-editing software like GIMP). I can be a “Joe Average” user with Android or Windows (to some degree), but on Linux I get only “features previously only available to enterprise customers of large software vendors”. Sorry, but I neither need them nor want them. My needs are simple and they are not met.
Posted Oct 3, 2013 2:27 UTC (Thu)
by hummassa (subscriber, #307)
[Link] (1 responses)
You *do* get that this site's name stands for "Linux Weekly News", don't you? If Linux does not meet your needs, are you here just for the trolling factor?
Posted Oct 3, 2013 10:56 UTC (Thu)
by khim (subscriber, #9252)
[Link]
Posted Oct 4, 2013 10:45 UTC (Fri)
by nix (subscriber, #2304)
[Link] (1 responses)
Your plaint appears to be that you want software to appear instantly in LTS releases of distributions (RHEL, Debian stable, Ubuntu LTS...). Well, guess what, the long-term-support releases are called that because they are *stable*. They are stable because they *do not change unnecessarily*. Massive backports of everything everyone ever wants into LTS releases would make them not LTS releases anymore.
Posted Oct 4, 2013 15:31 UTC (Fri)
by raven667 (subscriber, #5198)
[Link]
To have the system work like this though requires a commitment to ABI stability that at this time can only be assured by never changing any libraries or applications, it would require re-defining the system in terms of a core and a GUI and applications with ABI guarantees between the layers, rather than the unlayered system we have now where everything tracks a full dependency graph to everything else.
I'm not sure I'm explaining my thoughts very well, I might need more coffee yet 8-)
Posted Oct 6, 2013 12:55 UTC (Sun)
by krake (guest, #55996)
[Link] (1 responses)
Of course it does, if the discussion's topic is the time needed for a certain step in a chain of events. The time required for the step is determined by the start time and the end time, so it really matters a lot which one we are looking at, because from one we need to find the other.
In this case the time given was the end time, so missing one was the start time, which was approximated to be about 5 days earlier.
> You have a very strange definition of "available package". To me it's very simple: a package is available when it's available.
That is exactly my definition as well.
It is then reported as available to the user running the update request.
> That is: when I can actually use it.
The time between the software becoming available and you using it is totally up to you. There is no difference between any forms of software distribution, even for software pressed onto CDs this time is your choice and your choice only.
> We are talking about “Joe Average” use.
I had assumed as much, but your comment about users not seeing updates for months or years threw me off. The only way for that to happen is the user having turned off automatic updates and never checking for updates manually.
I certainly know about users that do not check for updates manually and have automatic updates turned off, but I am afraid no distribution method in the world would make them see new versions.
Any user who does check for updates or has that delegated to some automated process will see changes if there have been any.
The example with GIMP 2.8 on Debian shows that a user who checks for updates daily would not see the update for five days after release by the software vendor.
I agree that this is not very fast, but probably acceptable for most users unless they have been waiting impatiently for exactly that release to happen.
> I can be “Joe Average” user with Android or Windows (to some degree), but on Linux I get only “features previously only available to enterprise customers of large software vendors”.
Actually on Linux (at least most distributions) you have the choice.
Posted Oct 6, 2013 13:34 UTC (Sun)
by khim (subscriber, #9252)
[Link]
Actually, that's something you obviously need to do. If you are “Joe Average”, that is. Remember: today's “Joe Average” does not know about the existence of the command line (Ok, Ok, Ok, this may be an overstatement: many of them actually know that such a thing exists, but very few of them have ever seen it) and has only a vague understanding of the configuration knobs available in the GUI. You may ask: how can s/he survive in the Windows world, where such things are also sometimes needed? Well, that's easy: they do this under guidance from others. In the west it's usually done by the computer vendors (who will automatically refuse to deal with you if you ever mention the fact that you've used some other version of the OS than what was preinstalled), while in the east it's separate firms (one, two, three—this makes them somewhat more flexible because they can install different versions of the OS, but again, if the user reinstalled the OS then the warranty is null and void). The gist of the situation is the same in all cases: every time you need to go to the command line and/or need to change a config file it's a big problem, and it's really expensive for you. If that happens on the few computers in your firm it's a disaster (and may be significantly more expensive than just the payment for the “computer master”). Do you still want to imply that it's safe for such a user to switch to the unstable channel to receive the latest stable version of GIMP? I was under the impression that it does not work that way. The only way for that to happen is to have the OS (installed by the “computer master”, not by the user, remember?) tuned to the “stable channel”. And I kind of assumed that it's the default setting (at least it was the default setting in all companies, schools and other places where I've seen Linux in use). Do you want to imply that you have stats which show that users actually use the “unstable channel” en masse?
Posted Sep 23, 2013 10:15 UTC (Mon)
by Cato (guest, #7643)
[Link]
It also includes features such as game state syncing, in-game instant messaging, social network (profile pages, friends, games played), and so on.
Generally Steam is a great app store / platform, to which PC gamers have shown great loyalty because Valve are mostly concerned with doing the right thing for gamers, and making money as a side effect - unlike EA Games whose ostensibly similar Origin app store / platform is extremely unpleasant to use, and best avoided (Windows only in any case).
Why Steam on Linux matters for non-gamers
In the Steam (also Android and iOS) software delivery model, package updates are built and made available at the convenience of the developers. In the Linux distro model, package updates are built and made available at the convenience of the distro's maintainer(s) for the package in question.
Developers don't want to act as package maintainers, but they very much want the ability to push updates when they want/need them and to do that they must act as packagers, too.
I think it would be reasonable to assume that for something way smaller, like a single product, this could be cut down to a day or two, maybe even just hours.
Also, new software will likely take longer than updates, but again that seems to be the case everywhere.
Why would I make random code part of my product that I can't fix and can't even _look_at_ to determine what it actually does?
My observation is that the time between uploading the package and it appearing on the respective distribution channel is mostly measurable in hours.
Naturally none of those steps, including the one between upload and package availability on the channel, can take longer than the overall time.
In all likelihood the building and testing steps take the majority of it.
> My observation is that the time between uploading the package and it appearing on the respective distribution channel is mostly measurable in hours.
The only times when this wasn't available is when the vendor didn't allow redistribution, but then no other form of software repository would be able to distribute it either.
My understanding of App Stores was that packages are actually delivered from the App Store's servers, thus requiring that the App Store owner has distribution rights from the package vendors, just like Linux distributors.
The time to the upload is controlled by the developer, the time from the channel is controlled by the user.
The distributor can of course influence those, e.g. by restricting bandwidth, but why would they?
I have no experience with Ubuntu, so just to be sure: you are saying that the Canonical shop redirects you to a vendor-controlled download server once the purchase is completed?
I was talking about artificially prolonging the time it takes for a user to download the software, e.g. by throttling bandwidth, which I am not aware of any serious distributor doing. Actually, they are usually working hard to allow the best possible speeds, by mirroring the package repository to lots of servers all over the world.
Very similar to software rollout tools and company internal update servers provided by some operating system software vendors to their enterprise customers, just made available to all customers who want it.
Private end users usually want to have new software fast, so they use the fastest channel.
End users in controlled environments might only have access to some medium speed channel, much like getting operating system updates from a company internal update server so that sysadmins can decide when to clear updates on a case by case basis.
Actually, you can do it just fine. As long as you don't use the API in question.
Let me quote this incredible effort to limit APIs to the ones supported by WinXP:
>#include <windows.h>
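For readers who haven't seen it, the "incredible effort" being mocked amounts to defining the target-version macros before the include. A hypothetical sketch (0x0501 is the Windows XP API level; this only compiles against the Windows SDK and is shown purely for illustration):

```c
/* Restrict <windows.h> to the API surface that existed in Windows XP.
 * The macros must be defined before the header is included. */
#define WINVER       0x0501   /* Windows XP */
#define _WIN32_WINNT 0x0501
#include <windows.h>

/* Any function introduced after XP is now hidden from this
 * translation unit, so using a newer API fails at compile time. */
```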
On Windows that works just fine for the core Windows services. Which cover quite a lot of ground - from loading images and UI to network protocols support.
I can package most of the third-party libraries with my application - but I can't package glibc and _everything_ depends on it. And there's no way to say "use symbols that existed in glibc x.y.z".
> I can package most of the third-party libraries with my application - but I can't package glibc and _everything_ depends on it. And there's no way to say "use symbols that existed in glibc x.y.z".
On stock Linux distributions it's basically impossible. You have to install a parallel environment in some kind of container.
-mmacosx-version-min=XXX
on MacOS does something like this. The problem lies not in the fact that it's impossible to do in principle, but in the fact that nobody bothers to do it in a timely manner: LSB is not supported by Debian, it's a totally separate thing, and a new version of GCC is supported by LSB not when it's released, but years later.
> LSB is not supported by Debian
This is incredible BS. LSB 4.1 is supported perfectly by Debian, via the installation of the (oooh, surprise) "lsb" package.
The /lib/libpam.so.0 library… and that's it. You could not use it for the logon authentication, it's not used by xlock, etc. Even packages compiled in Fedora 12 in C++ work if you install the (surprise, again!) "lsb-cxx" package.
What changed after 4.6, and what changes can you make to "fix" it?
As I tried to explain before, names like "unstable" do not refer to the stability of the software on the channel, but rather to the lack of fixed update interval.
All are usually available in parallel, e.g. an IT department can choose to use different channels for different types of machines.
We were talking about the general update rate of the distribution channel, i.e. the time between a package being released/uploaded and it being available to users.
The common app stores are all "unstable" channels, there is no coordination between publishing times of packages of different vendors.
Channel selection is the user's choice on a private end user's machine, the administrator's choice in a managed setup.
"It's useful and I'm willing to do the work" is justification enough.
Also, at least in Fedora, it's the maintainer who decides in which version the new package will appear.
It's just that many choose to only ship it for the current and future versions.
They provide convenience. Tons of it.
For the majority of those programs I don't really care what exact version I'm running, but just that it's recent and bugs get fixed. So I simply install it from the distribution repo and get it updated through yum/apt/zypper/pacman. I don't have to track upstream myself.
There I just pull the code from git or hg and work with that.
It doesn't affect the user interface, but it does affect the user experience, because now there's a probably-unpaid middleman who might go on holiday at the wrong time, or have their depression suddenly get worse so they become non-responsive because they're too busy doing things like getting out of bed and remembering to eat, or have their computer fried by lightning, or have a personality clash with the upstream developer or another package maintainer, sitting between you and the shiny new version of your favorite piece of software.
If the former, how long did it take to become part of the repository it was uploaded to? If the latter, what was the time of the upload?
http://en.wikipedia.org/wiki/GIMP_version_history#GIMP_2.8
Longer than the couple of hours that I have witnessed for some packages, but still a lot faster than months or years as others have claimed.
They notify about updates but so do Linux distribution update managers.
I can definitely see the value in that, i.e. uploading your package a month in advance to have some buffer for review delays.
Who's trolling whom? There are at least two versions of Linux which can be used by “Joe Average”. Sadly, neither of them can be used to develop software, and both lack a decent graphics editor. So by now it's a race: either one of these will receive the things needed for development and serious work, or the other Linux versions will get their act together. Sadly, it looks like the first will happen and we'll be faced with the need for another “careful examination” like the one already discussed.
Once a package is put into a repository and the repository's index has been updated (which usually is very fast), the package will be seen by any machine requesting an update of the index.
You can subscribe to enterprise channels, often without extra cost. It is not something you have to do.
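The index-update mechanics described above can be observed directly with the Debian/Ubuntu tooling; an illustrative sketch (assumes an apt-based system with network access; output varies by release):

```shell
# Fetch the current repository indexes; any package added since the
# last index rebuild becomes visible to this machine immediately.
apt-get update

# Ask the freshly fetched index which gimp versions it advertises
# and which one is installed.
apt-cache policy gimp
```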