
Improving Ubuntu's application upload process

By Jonathan Corbet
September 5, 2012
There has been no shortage of articles recently on how Linux "lost" the desktop. Fingers have been pointed in all directions, from the Windows monopoly to competing desktop projects to Linus Torvalds's management style. Over in the Ubuntu camp, though, there does not appear to be any sense that the desktop has been lost; they are still working hard to win it. Ubuntu's recently proposed new application upload process highlights its vision of the desktop and what it thinks needs to be done to make things happen there.

The problem

Serious Linux users tend not to think of availability of software as a problem; distribution repositories typically carry tens of thousands of packages, after all, and any of those packages can be installed with a single command. The problem with distribution repositories, from Ubuntu's point of view, is that they can be stale and inaccessible to application developers. The packages in the repository tend to date from before a given distribution release's freeze date; by the time an actual distribution gets onto a user's machine, the applications found there may be well behind the curve. In some cases, applications may have lost their relevance entirely; as Steve Langasek put it:

Why put an upstream through a process optimized for long-term integration of the distribution when all they care about is getting an app out to users that gives them information about this month's beer festival?

Beyond that, getting a package into a distribution's repository is not something just anybody can do; developers must either become a maintainer for a specific distribution or rely on somebody else to create and add a package for their application. And, in most distributions, there is no place in the repository at all for proprietary applications.

Ubuntu's owner Canonical sees these problems as significant shortcomings that are holding back the creation of applications for the Linux desktop; that, in turn, impedes the development and adoption of Linux as a whole. So, a few years back, Canonical set out to remedy these problems through the creation of the Ubuntu Software Center (USC), a repository by which developers could get applications to their users quickly. The USC is not tied to the distribution release cycle; applications added there become available to users immediately. There is a mechanism for the handling of payments, allowing proprietary applications to be sold to users. A glance through the USC shows a long list of applications (some of which are non-free) and other resources like fonts and electronic books. Guides to nearby beer festivals are, alas, still in short supply.

Naturally, Canonical does not want to provide an unsupervised means by which arbitrary software can be installed on its users' systems. Experience shows that it would not take long for malware authors, spammers, and others to make their presence felt. So the process for putting an application into the USC involves a review step. For paid applications, for which Canonical takes a 20% share of the price, there appears to be a fully-funded mechanism that can review and place applications quickly. For free applications, though, review is done by a volunteer board, and that group, it seems, has been having a hard time keeping up with the workload. The result is long delays in getting applications into the USC, discouraged developers, and frustration all around.

Automatic review

The new upload process proposal aims to improve the situation for free applications; Canonical does not seem to intend to change the process for paid applications. There are a number of changes intended to make life easier for everybody involved, but the key would appear to be this:

We should not rely on manual reviews of software before inclusion. Manual reviews have been found to cause a significant bottleneck in the MyApps queue and they won’t scale effectively as we grow and open up Ubuntu to thousands of apps.

In other words, they want to make the process as automatic as possible, but not so automatic that Bad Things make it into the USC.

The first step requires developers to register with the USC, then request access to upload one or more specific packages. Getting that access will require convincing Canonical that they hold the copyrights to the code or are otherwise authorized to do the upload; it will apparently not be possible for third parties to upload software without explicit permission, even if the software is licensed in a way that would allow that to happen. A review board will look at the uploader's application and approve it if that seems warranted.

Once a developer has approval, there are a few more steps involved in putting an application into the USC. The first is to package it appropriately with the Quickly tool and submit it for an upload. That is mostly basic packaging work. Uploads through this mechanism will be done in source form; binaries will, it seems, be built within the USC itself.

But, before the application can be made available, it must be accompanied by a security policy. The mechanism is superficially similar to the privilege scheme used by Android, but the USC bases its security on the AppArmor mandatory access control mechanism instead. The creation of a full AppArmor profile can be an involved process; Canonical has tried to make things simpler by automating most of the work. The uploader need only declare the specific access privileges needed by the application. These include access to the X server, access to the network, the ability to print, and use of the camera. Interestingly, access to spelling checkers requires an explicit privilege.
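To picture what such a declared-privileges policy might compile down to: the fragment below is a purely illustrative sketch of an AppArmor profile, not the output of Canonical's actual generator, and the application name "beerfest" is invented. An application that declared X, network, and per-user data access might end up confined by something along these lines:

```
# Hypothetical AppArmor profile for a USC application "beerfest"
# (illustrative only; field names and paths are assumptions, not USC output)
/opt/beerfest/bin/beerfest {
  #include <abstractions/base>
  #include <abstractions/X>        # declared privilege: X server access
  network inet stream,             # declared privilege: network access

  /opt/beerfest/** r,              # the application may read its own files
  owner @{HOME}/.beerfest/** rw,   # per-user data directory

  # anything not listed above is denied by default
}
```

The point of the scheme is that the uploader writes only the short privilege declarations; the archive machinery generates the full profile, so mistakes of omission fail closed.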

All (free) USC applications will run within their own sandbox with limited access to the rest of the system. Only files and directories found in a whitelist will be accessible, for example. Applications will be prevented from listening to (or interfering with) any other application's X server or D-Bus communications. There will be a "helper" mechanism by which applications can request access to non-whitelisted files; the process will, inevitably, involve putting up a dialog and requiring the user to allow the access to proceed. That, naturally, will put some constraints on what these applications can usefully do; it is hard to imagine a new compiler working well in this environment, for example. The payoff is that, with these restrictions in place, it should not be possible for any given application to damage the system or expose information that the user does not want disclosed.

And, with all that structure in place, Canonical feels that it is safe to allow applications into the USC without the need for a manual review. That should enable applications to get to users more quickly while taking much of the load off the people who are currently reviewing uploads.

Naming issues

From the discussion on the mailing lists, it would seem that the biggest concern has to do with the naming of packages and files. If an application in the USC uses a name (of either a package or a file) that is later used by a package in the Ubuntu or Debian repositories, a conflict will result that is unlikely to make the user happy. Addressing this problem could turn out to be one of the bigger challenges that Ubuntu has to face.

Current USC practice requires all files to be installed under /opt; this rule complies with the filesystem hierarchy standard and prevents file conflicts with the rest of the distribution. The problem, according to David Planella (one of the authors of the proposal), is that a lot of things just don't work when installed under /opt:

We are assuming that build systems and libraries are flexible enough to cater for an alternative installation prefix, and that it will all just work at runtime. Unfortunately, this has proven not to be the case. And I think the amount of coordination and work that'd be required to provide solid /opt support in Ubuntu would be best put in other parts of the spec, such as its central one: sandboxing.

In other words, the /opt restriction was seen as making life difficult for developers and Ubuntu lacks the resources and will to fix the problems; the restriction has thus been removed in the proposal. With Ubuntu, Debian, and USC packages all installing files into the same directory hierarchy, an eventual conflict seems certain. There has been talk of forcing each USC package to use its own subdirectory under /usr, a solution that, evidently, is easier than /opt, but nothing has been settled as of this writing.
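The conflict the list is worried about is easy to state concretely: two packages cannot both ship a file at the same path. A minimal sketch of the kind of check an archive could run, using hypothetical file lists rather than real package contents:

```python
# Detect file-path conflicts between a USC upload and the distro archive.
# The file lists below are invented examples, not real package contents.

def find_conflicts(usc_files, archive_files):
    """Return the paths shipped by both the USC package and the archive."""
    return sorted(set(usc_files) & set(archive_files))

usc_pkg = [
    "/usr/bin/beerfest",
    "/usr/share/applications/beerfest.desktop",
]
archive_pkg = [
    "/usr/bin/beerfest",        # a later Debian package claims the same name
    "/usr/bin/brewery-tools",
]

print(find_conflicts(usc_pkg, archive_pkg))   # -> ['/usr/bin/beerfest']
```

A real implementation would compare against the Contents index of the whole archive rather than a single package, but the core check is just this set intersection; the hard part is deciding what to do when it is non-empty.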

Presumably some solution will be found and something resembling this proposal will eventually be put into place. The result should be a leaner, faster USC that makes it possible to get applications to users quickly. Whether that will lead to the fabled Year of the Linux Desktop remains to be seen. The "app store" model has certainly helped to make other platforms more attractive; if its absence has been one of the big problems for Linux, we should find out fairly soon.



Improving Ubuntu's application upload process

Posted Sep 5, 2012 18:18 UTC (Wed) by aliguori (subscriber, #30636) [Link]

Great article!

I saw this posting in a few sources and read through most of it. It sounded appealing, but after reading this article I see it falls short of what I was hoping for.

I really just want a way that I (as an upstream) can get my package to users ASAP after I do a release and without the semi-random collection of patches that tend to get added.

That does not seem to be what this is though :-/

Improving Ubuntu's application upload process

Posted Sep 6, 2012 13:57 UTC (Thu) by stefanor (subscriber, #32895) [Link]

I don't think that's an entirely solvable problem. Packaging an application very often involves making some changes. The upstream developers don't usually think too much about how it'll eventually be deployed.

Improving Ubuntu's application upload process

Posted Sep 7, 2012 3:01 UTC (Fri) by pabs (subscriber, #43278) [Link]

You can avoid some of those random changes by being distribution-friendly and running various static analysis tools; here are a couple of links for Debian:

http://wiki.debian.org/UpstreamGuide
http://wiki.debian.org/HowToPackageForDebian#Check_points...

As an upstream developer I always run whohas before every release to find out what distro patches/bugs I may have missed and then merge or fix those when there is time.

It sounds like what you want is one of these, that way you get to do the packaging instead of distributions and you can target all distros instead of just Ubuntu.

http://zero-install.sourceforge.net/
http://listaller.tenstral.net/
http://code.google.com/p/autopackage/
http://zero-install.sourceforge.net/comparison.html

Improving Ubuntu's application upload process

Posted Sep 10, 2012 21:49 UTC (Mon) by mathstuf (subscriber, #69389) [Link]

> As an upstream developer I always run whohas before every release to find out what distro patches/bugs I may have missed and then merge or fix those when there is time.

Wow, never knew about whohas. *Adds to personal packaging tools comps file* Thanks!

Improving Ubuntu's application upload process

Posted Sep 5, 2012 19:24 UTC (Wed) by klbrun (subscriber, #45083) [Link]

The number 2 desktop OS, with 5% of the market, is Mac OS. The reason for this is 80% of Mac OS users don't care about the OS, they only care about the glamour Apple has created. Apple could be running Windows under the covers for all those people care.

Ubuntu is not going to conquer the desktop because it is a "better Linux" (unless the desktop goes away except for die hard Linux aficionados). The question is, is there a future for the desktop computer? Will it be replaced by the mobile device that plugs in to the big screen and links to bluetooth mouse and keyboard?

I just had an electrician in to replace a chandelier. He had a desktop computer, but it broke, and now that he has an iPhone, he sees no need to get his desktop fixed.

Improving Ubuntu's application upload process

Posted Sep 5, 2012 19:39 UTC (Wed) by dlang (subscriber, #313) [Link]

> The question is, is there a future for the desktop computer? Will it be replaced by the mobile device that plugs in to the big screen and links to bluetooth mouse and keyboard?

why would a mobile device plugged into a big screen, keyboard, speakers, etc not be treated as a traditional desktop?

Think about how laptops are used.

yes, on the go their keyboards and screens are used, but just about everyone who uses a laptop in a common location for any length of time has a full keyboard and screen that they hook up (either with many cables, or with a docking station). At that point, the 'laptop' is really no different from a small 'desktop' system. But the important point is that it's _used_ as a desktop system.

Why would shrinking the laptop down to a handheld size change how it's used when docked?

If your answer is that the software will work differently because it's a mobile device, then bzz, wrong answer. The software will need to work the same as current desktops as that's what users want to use. The device may have _other_ software that is more suited for the mobile environment, but that's not what the successful devices are going to be using when 'docked'

Improving Ubuntu's application upload process

Posted Sep 5, 2012 20:06 UTC (Wed) by khim (subscriber, #9252) [Link]

> Why would shrinking the laptop down to a handheld size change how it's used when docked?

Because the transition period will change the software.

> If your answer is that the software will work differently because it's a mobile device, then bzz, wrong answer. The software will need to work the same as current desktops as that's what users want to use. The device may have _other_ software that is more suited for the mobile environment, but that's not what the successful devices are going to be using when 'docked'!

Sure. And this is exactly what'll happen. The only problem: it'll not be the software we currently use on the desktop!

Think about the previous such transition (when the PC replaced UNIX workstations and dedicated word processors) or the one before that (when said workstations replaced big iron). Every time the software goes full circle (from the "joke software" which is not as capable as the software from the previous era to the "real software" which is even more capable) and every time the end result looks similar - but with a completely revamped core. Why do you think this time will be any different? Because this time around developers kept the core intact (both iOS and Android are using UNIX-style cores)? I don't think it'll be enough. The NT kernel supported POSIX, too, you know...

Improving Ubuntu's application upload process

Posted Sep 5, 2012 20:43 UTC (Wed) by dlang (subscriber, #313) [Link]

software didn't change significantly with the shift from 'desktop' to 'laptop' form factor. I don't see why it would change significantly when moving to the 'docked mobile' form factor.

Yes, in parallel with this, there is going to be development in software for the 'mobile, handheld' form factor, but so far, no software designed for that form factor works reasonably with large screens and full keyboards.

as for your claims about the migration from big iron down, the user interaction has been far more stable than you indicate.

The user interaction migration has really been

1. batch jobs (i.e. punch card era)

2. keyboard/screen/text

3. mouse/keyboard/screen/text in windows/graphics

some people are claiming that there will be a

4. touchscreen (i.e. mobile)

that everything will migrate to, but while a touchscreen has big advantages for specific workloads and situations, it's horrible for a huge amount of stuff that people use computers to do.

For the first three steps, each step included all the capabilities of the step before it, just with added flexibility. However the touchscreen mode of operation does not cleanly and easily replace the prior modes of use, and as such it's not going to replace the prior modes, instead it's going to supplement them.

So when you have your handheld CPU/storage device (aka phone) connected to a keyboard and screen, you are going to be using mode 3 software, not mode 4 software (as always, with some exceptions)

Improving Ubuntu's application upload process

Posted Sep 5, 2012 22:53 UTC (Wed) by khim (subscriber, #9252) [Link]

> software didn't change significantly with the shift from 'desktop' to 'laptop' form factor.

Laptops never existed as a separate software or hardware category. Sure, mobile CPUs are less powerful than desktop CPUs and the gap between desktop GPUs and mobile GPUs is even larger, but ultimately they come from the same company, from the same lab; they are just slight variations.

> I don't see why it would change significantly when moving to the 'docked mobile' form factor.

They will change on the road there, when they are transitioned to mobile. Microsoft hoped that this switch would be similar to the switch between desktop and laptop (that's why it's pushing x86-based tablets, BTW), but it does not look like this will happen: people are buying a totally different architecture with a totally different OS instead.

> Yes, in parallel with this, there is going to be development in software for the 'mobile, handheld' form factor, but so far, no software designed for that form factor works reasonably with large screens and full keyboards.

This is the same argument SGI and Sun used back in the day. Look at them now. The simple fact which decides everything: where is the money?

Mobile is already bigger than the PC when you just count the number of devices (smartphones only, obviously) and pretty soon it'll be bigger in $$, too.

> as for your claims about the migration from big iron down, the user interaction has been far more stable than you indicate.

It's not the question of user interaction. It's the question of direction of development. Please read The Innovator's Dilemma or at least the Wikipedia article. The very fact that

> For the first three steps, each step included all the capabilities of the step before it, just with added flexibility. However the touchscreen mode of operation does not cleanly and easily replace the prior modes of use, and as such it's not going to replace the prior modes, instead it's going to supplement them.

means that the existing desktop is doomed. Not because you can easily replace it with a mobile/desktop hybrid today, but because some people are ready to use mobile as a replacement for the desktop/laptop, yet no one can use a desktop/laptop as a replacement for mobile today.

Think about it: there is a clear and simple road from mobile to the mobile/desktop hybrid: you just need to add a nice dock and install a faster, more powerful CPU. More RAM would be nice (1GB-2GB is not enough for many desktop programs), but bigger storage is not a requirement (32GB-64GB is enough for most laptop/desktop users). Oh, and you need to add some capabilities to mobile programs to make them usable with a large monitor, too - but you can always borrow code from the desktop version of the same program, so it's not a big deal. This approach was already tested on the transition between phones and tablets; it works.

Now, move in the other direction. There is no road from the desktop/laptop to the mobile/desktop hybrid. x86 CPUs and contemporary desktop/laptop software are too power-hungry and unwieldy to use on mobile today. This means that there is a huge chasm ahead: before you'll be able to sell the first mobile/desktop hybrid based on a desktop OS and desktop hardware, you need to shrink them both a lot (in physical terms).

Can you bridge this chasm? Sure, no doubt about it! It shrinks every year and soon you'll be able to cross it… only to find out that all the districts are already claimed and people are already living happily there.

This is what Microsoft is experiencing right now. It has only gotten a kind-of-competitive OS for mobiles this year: the Windows core is just too big to run on the "underpowered" mobile hardware of 2007 or 2008, and the Windows CE core was never good enough to power a mobile/desktop hybrid. But today the mobile war is basically over: it's a split between Android and iOS. They will fight a long and bitter trench war, but other possibilities are already more or less closed: this train has not just left the station, it's about half-way to the next one!

Even if Microsoft (using its billions in the bank) is able to catch this train, it'll not change the fate of the Linux desktop: Microsoft shows every indication of keeping WRT/WP8 hardware closed and not available to Linux vendors. And the same with Intel: it, too, is trying very hard to catch this train - but it is embracing Android on the road there, so no GNOME or KDE in sight.

Improving Ubuntu's application upload process

Posted Sep 5, 2012 23:22 UTC (Wed) by dlang (subscriber, #313) [Link]

you are mixing up established software and desktop approaches to software with specific processor chips.

Linux works perfectly fine on X86, ARM, etc.

touchscreen software is horrid to use on a big screen with a keyboard and mouse. it's not just a small change, it's a completely different paradigm.

Linux will work without a problem, Microsoft may or may not be able to adapt, Apple has made this sort of conversion in the past, so if they think it's worthwhile they will be able to do it.

There _is_ a very straightforward path for desktop software to work on docked mobile devices.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 10:11 UTC (Thu) by khim (subscriber, #9252) [Link]

> You are mixing up established software and desktop approaches to software with specific processor chips.

No. I'm talking about platforms. The Linux desktop is not relevant to the fate of the war: it's too small. And Windows is pretty much tied to x86. Windows itself can be ported to ARM (Windows RT is just that, even if they intentionally crippled it), and all the programs can be ported too, but why would anyone except Microsoft do that? Most users prefer Android and iOS today.

> touchscreen software is horrid to use on a big screen with a keyboard and mouse. it's not just a small change, it's a completely different paradigm.

Sure. ASUS Transformer is much less useful because of that. But you can change and fix the programs. You can not easily change the underlying platform.

> There _is_ a very straightforward path for desktop software to work on docked mobile devices.

Absolutely! In some different universe where docked mobile devices will be based on Windows or the X Window system. In our reality these devices will be based on Android or iOS. This excludes the existing software. Sure, some libraries and other important pieces will be reused, but the desktop as we know it? Forget about it.

The best case scenario: it'll survive as some kind of very niche player (similarly to how it's a very niche player today). Sure, you can use Linux to build an ATM or kiosk, but why would you do that when there is another familiar platform (Android)? Heck, even today such devices are often using Windows (even if you need to spend money to buy it), because it's more familiar to the developers.

The existential threat for the "traditional Linux desktop" today is not Windows or MacOS, but Android and iOS. I'm not touching iOS with a ten-foot pole (you need to sell your soul before you can ever do anything with it), but I find it hard to justify a fight against Android: what would it accomplish? Android is already free, after all…

Improving Ubuntu's application upload process

Posted Sep 6, 2012 6:58 UTC (Thu) by iabervon (subscriber, #722) [Link]

When not plugged into a docking station, a laptop has a slightly smaller screen, a keyboard without some of the duplicate keys, and a pointing device that works a bit differently. That is, at its most different, it's not that different. Really, the main difference is that the hotplug system gets a major workout. A phone-class device spends most of its time without any keyboard at all, with a much smaller (in inches) screen than any laptop since the 80s, and with a touchscreen (which is much better than a regular pointing device for some things and much worse for others). That's way more different from a desktop than a laptop is. And a phone-class device (in its common configuration) has a different ordering of what operations are more convenient than other operations.

Of course, it's entirely plausible that someone's "desktop computer" will actually be a virtual machine running on their phone. But that sort of user probably spends 75% of their time away from their desk, with their phone behaving in ways that work when you're standing on a ladder with a hammer in one hand. They're not just getting rid of the tower case next to their monitor, they're no longer going over to their desk any more, because the time they're on the go is all of their billable hours.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 7:12 UTC (Thu) by dlang (subscriber, #313) [Link]

I am not saying that it will run the desktop inside a VM, but I also don't see touchscreen software running well on a 30" non-touchscreen monitor.

If you have a Linux base, there's nothing (conceptually) preventing you from running libreoffice and thunderbird when docked, but then suspend/checkpoint that software when you undock, using touch-friendly Android apps on the small screen. You may have some 'mobile' apps running in small windows on your big screen (no need to have two weather apps for example)

Yes, right now Android and Linux have completely different userspace libraries, but are they really that different in their functionality and API? or could you have one libc that will work for both?

I'm a bit puzzled that nobody has created the ability to run android apps on a full Linux box yet (I think we're getting very close on the kernel capabilities side, but Android hasn't taken advantage of all the recent mainstream developments yet).

Once you have the ability to access the android app store and run regular Linux apps, I expect that something like the Transformer phone/laptop will start to appear, with normal Linux apps when docked and limited to android apps when undocked.

I don't expect such things to first show up from the "big boys", they will be trying to do the all singing all dancing single solution to everything. Instead I expect that some of the low-end "knock-off" companies will throw something like this together and start selling it with it working well enough that others will then jump on the bandwagon

Improving Ubuntu's application upload process

Posted Sep 6, 2012 7:22 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

>If you have a Linux base, there's nothing (conceptually) preventing you from running libreoffice and thunderbird when docked, but then suspend/checkpoint that software when you undock, using touch-friendly Android apps on the small screen.
Apparently, Ubuntu has done exactly this. It hasn't gone anywhere.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 7:40 UTC (Thu) by dlang (subscriber, #313) [Link]

I heard that they announced that they were going to, not that they had done so.

If they have done so, the reason it hasn't gone anywhere is that nobody knows about it. I run Ubuntu but have no idea how I would install an Android app.

There are a couple android apps I'd really like to have, the webex app for one (since Webex has chosen not to support running on 64 bit linux under any conditions, but has a working android app)

Improving Ubuntu's application upload process

Posted Sep 6, 2012 7:49 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

They've demoed it this winter. I think there was even an article on LWN about it. They haven't released a public version of it, unfortunately.

Anyway, I think that's a dead end. Interface switching between the ugly OpenOffice and some slick Android app would be too jarring. Far more likely that Android apps will be modified to work in a full "desktop" mode.

So imagine this: you plug your phone into a docking station, and Android apps are snapshotted and then migrated to the dock's CPU (already possible with CRIU or OpenVZ). It can be animated on screen by "expanding" the application window from phone size to classic desktop size. All smooth, no jarring switches between eye candy and fugly apps.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 8:07 UTC (Thu) by dlang (subscriber, #313) [Link]

well, if they haven't released it, it seems a bit early to say 'nobody cared'. And as for their demo, did it really implement everything? or just enough to give examples? If they include it in 12.10, publicise it, and then nobody cares, I'll start to believe you.

docking units don't have CPUs.

Plus you have this amazing assumption that people who write small android apps can scale them up to full features apps without a problem.

your attitude that the current apps and android apps cannot co-exist, all existing software must be thrown out and re-written is exactly what I expect from the "big boys" I mentioned above.

I expect that the reality is going to be that the convenience of having your entire system with you is going to overcome a lot of the 'opposition' to the continued use of existing 'ugly' apps.

but once people actually have the options we'll see what happens with it.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 10:28 UTC (Thu) by khim (subscriber, #9252) [Link]

> Plus you have this amazing assumption that people who write small android apps can scale them up to full features apps without a problem.

Where have you seen this assumption? Yes, people who write Android apps today may not be able to write the full-blown desktop apps. But they work in the same companies as the people who write the full-blown desktop apps! They can easily co-opt code and people from today's big apps.

This is similar to the UNIX-to-Windows switch which happened with such big apps 10-15 years ago. Sure, it was painful back then and it will be painful tomorrow, but apps follow the money. And the money is in Android and iOS apps, not in Ubuntu apps.

> your attitude that the current apps and android apps cannot co-exist, all existing software must be thrown out and re-written is exactly what I expect from the "big boys" I mentioned above.

Where have you seen such an attitude? It didn't happen last time, why would it happen this time? Yes, unimportant parts (things like support for the X Window system or the Win32 API) will be ripped out, but the core will be kept. It'll be quite painful for the people who tied up the core with the Win32 API, but there will be some adapters, etc. The transition has already happened not once, but twice or in some cases thrice (from big iron to VMS, then to UNIX and finally to Windows); it'll happen yet another time.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 14:36 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

>well, if they haven't released it, it seems a bit early to say 'nobody cared'. And as for their demo, did it really implement everything? or just enough to give examples? If they include it in 12.10, publicise it, and then nobody cares I'll start to believe you.
They've implemented it enough to be useful. Still, none of the large OEMs cared.

>docking units don't have CPUs.
What? I'm off to USPTO then!

But seriously, that's just a logical progression - simply add the ability to easily offload apps from a tiny CPU in a phone to a real desktop CPU.

>Plus you have this amazing assumption that people who write small android apps can scale them up to full features apps without a problem.
Why not? We've already seen this happening with iPad apps. Quite a few "real" desktop applications were quickly ported to it (like OmniGraffle, for example).

Improving Ubuntu's application upload process

Posted Sep 6, 2012 19:07 UTC (Thu) by dlang (subscriber, #313) [Link]

Connecting two CPUs together efficiently is hard, and not something you are going to do with a docking station (the connector alone would be a large percentage of the size of the mobile device; you aren't just talking about a USB cable or something like that).

Also, a docking station with its own CPU, memory, etc. isn't cheap, but it also already exists (it's called a "desktop computer").

What you seem to be wanting is checkpoint/restore capability to move running applications from one machine to another. With the appropriate emulation, the desktop doesn't need to have the same processor architecture as the mobile device.
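Checkpoint/restore of this sort is not purely hypothetical; CRIU (Checkpoint/Restore In Userspace) does roughly this on Linux today, within a single architecture. A rough sketch of the workflow, assuming CRIU is installed on both machines and run as root; the PID, the paths, and the host name "dock" are placeholders:

```shell
# On the mobile device: freeze the running app (PID 1234) and
# dump its state (memory, file descriptors, etc.) to disk
criu dump -t 1234 --images-dir /tmp/ckpt --shell-job

# Ship the checkpoint images to the docked machine
scp -r /tmp/ckpt dock:/tmp/

# On the dock: resurrect the process where it left off
criu restore --images-dir /tmp/ckpt --shell-job
```

Note that CRIU itself only restores onto a compatible kernel and CPU; crossing processor architectures, as suggested above, would additionally require emulation or a retargetable runtime.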

Docking stations are much simpler; they are little more than convenient ways to plug in several devices at once instead of having multiple wires.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 19:10 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

> connecting two CPUs together efficiently is hard
You don't need to do it. Just migrate all running apps to the docked CPU.

>Also, a docking station with it's own CPU, memory, etc isn't cheap, but it also already exists (it's called a "desktop computer")
Yup, and right now it's not integrated with phones. This is just the next step and it has nice smooth continuity from pads/phones.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 10:20 UTC (Thu) by khim (subscriber, #9252) [Link]

Once you have the ability to access the android app store

Sorry to burst your bubble, but the Android app store will not be available on this Frankenstein: it's only available on certified devices.

You can create a separate app store (like Amazon and the Nook are doing), but this will require significant marketing to attract developers to your store. I don't see anyone with deep enough pockets interested.

I don't expect such things to first show up from the "big boys"; they will be trying to do the all-singing, all-dancing single solution to everything. Instead I expect that some of the low-end "knock-off" companies will throw something like this together and start selling it, with it working well enough that others will then jump on the bandwagon.

Yup. The only problem: when the "big boys" finally jump on this train, they will probably throw away the "regular Linux userspace" to comply with the certification requirements.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 16:06 UTC (Thu) by drag (guest, #31333) [Link]

> You can create separate app store (like Amazon and Nook are doing), but this will require significant marketing to attract the developers to your store. I don't see anyone with deep enough pockets interested.

There are alternative app stores for Android. Have been since the beginning. And not just Amazon. There were others before that.

example:
http://slideme.org/

...

The problem you are alluding to, of course, is that while Android is open source, the Android apps are not. So Google has certification requirements for the platform before you can distribute their software at no cost.

This is not an insurmountable barrier. I don't know what would be required for Ubuntu to make a 'frankenstein' OS, but it's something they will have to work out with Google if they want people to have access to the Google applications they are familiar with.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 17:42 UTC (Thu) by khim (subscriber, #9252) [Link]

There are alternative app stores for Android. Have been since the beginning. And not just Amazon. There were others before that.

example:
http://slideme.org/

What devices are they selling? AFAICS this is just an app store for Android. This is not what Amazon and the Nook are doing, and it is not what I'm talking about.

The problem you are alluding to, of course, is that while Android is open source, the Android apps are not. So Google has certification requirements for the platform before you can distribute their software at no cost.

Yup. Please read the requirements before you go any further. For example, though the Android source code could be ported to run on a phone that doesn't have a camera, the CDD requires that, in order to be compatible, all phones must have a camera. This allows developers to rely on a consistent set of capabilities when writing their apps.

"Certification" is not something you can do as an afterthought: if you want to receive Google Play then you must design your device specifically for Google's requirements.

This is not an insurmountable barrier. I don't know what would be required for Ubuntu to make a 'frankenstein' OS, but it's something they will have to work out with Google if they want people to have access to the Google applications they are familiar with.

This is not about a 'frankenstein' OS. This is about 'frankenstein' hardware. I doubt Google will want to certify, as a "compatible device", any hardware which needs a third-party platform to be fully usable. This is what happened with tablets, for example: third-party tablets all lacked Google Play till Honeycomb, because it was impossible to certify them.

And if Android is extended to properly support mouse, keyboard, and large monitors, then why would you want to develop programs for Ubuntu? You'll develop them for Android instead; that will cover a larger audience.

Improving Ubuntu's application upload process

Posted Sep 8, 2012 16:17 UTC (Sat) by cesarb (subscriber, #6266) [Link]

> the CDD requires that in order to be compatible, all phones must have a camera.

That FAQ seems to be outdated. Take a look at the current CDD (https://source.android.com/compatibility/4.1/android-4.1-...):

> Device implementations SHOULD include a rear-facing camera, and MAY include a front-facing camera.

That is a SHOULD, not a MUST.

Improving Ubuntu's application upload process

Posted Sep 8, 2012 18:17 UTC (Sat) by khim (subscriber, #9252) [Link]

Yes, the requirements changed with Android 2.3. This is when MUST was replaced with SHOULD with regard to the camera.

But I'm not talking about specifics here; I'm talking about the fact that to get Google Play you need to satisfy a pretty long and detailed list of requirements.

Want to create something new and exciting? Feel free: Android is FOSS, after all! Want to get hundreds of thousands of applications and, perhaps, a small percentage of sales (I'm not sure whether Google actually does this, but it would be the logical approach)? Talk with Google first and see if they'll be interested in your creation enough to alter the CDD.

Improving Ubuntu's application upload process

Posted Sep 9, 2012 7:43 UTC (Sun) by danieldk (subscriber, #27876) [Link]

> Ubuntu is not going to conquer the desktop because it is a "better Linux" (unless the desktop goes away except for die-hard Linux aficionados). The question is, is there a future for the desktop computer? Will it be replaced by the mobile device that plugs in to the big screen and links to bluetooth mouse and keyboard?

For media consumption (regular home users), yes, I believe smartphones and tablets will replace the desktop PC and even laptops. Even for simpler productivity applications (simple text processing, time management), we will probably see tablets becoming increasingly popular. I think there are two important causes:

1. The user interfaces of newer mobile devices are radically simpler than their desktop counterparts. Sure, you could port such an interface to the desktop, with changes for mouse/keyboard use. But:

2. Mobile devices are far more flexible in their use. You can carry an iPad around the house, to meetings, etc. You can also dock it, and have something that almost resembles a very user-friendly desktop.

But I do think there is a future for the desktop/laptop computer in more heavy-duty production work, for instance development or movie production.

The sad thing is that not everyone has realised that mobile computing and desktop computing require radically different user interfaces. The latter requires good window management and should be focused on mouse and keyboard use; the former is oriented toward touch interfaces and simplicity.

It would be nice if the Linux desktop could focus on what it does best: providing a good environment for programming and other heavy-duty productivity work. No more big desktop adventures; just iterative refinement and finishing off all the rough edges. If the Linux desktop focused on just the desktop, things would become much simpler. Now that all the action is taking place in the mobile sphere, it's much easier to catch up.

For the mobile market, there's already a big (open-source-ish) ecosystem: Android. Let's focus on keeping Google honest and pushing CyanogenMod, rather than trying to morph traditional desktop environments into mobile environments. It'll be hard to make any dent there, and there already is a good open-source Linux-based player.

mobile computers replacing desktop

Posted Sep 9, 2012 18:22 UTC (Sun) by giraffedata (subscriber, #1954) [Link]

When the handheld computer is docked, you have a desktop computer. So this proposition isn't that desktop computers will be replaced by handheld computers, but that the desktop computer of the future will share major parts with a handheld one.

That seems really unrealistic to me. The handheld parts are so constrained by the mobility requirement that the desktop would be far less capable per unit cost than if it were made entirely of desktop-optimized parts. Any savings from reusing parts would be outweighed by the loss from weakening both the desktop and the handheld.

And do bear in mind that one's data will be accessible by all of the handheld and desktop computers one uses, even without docking.

There is a separate proposition that I think some people confuse with this docking idea: most desktop computers will actually be replaced by handhelds. I.e. people will compute in their hands, not at their desks. There will be far fewer big screens and mice.

mobile computers replacing desktop

Posted Sep 9, 2012 19:19 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link]

Who says anything about savings? Pads and "smart docks" with CPU/GPU offloading support can very well be more expensive than traditional desktops.

However, they'll also be much more flexible. Just like laptops versus desktops: laptops were initially much more expensive, but they have already taken over the desktop market.

mobile computers replacing desktop

Posted Sep 9, 2012 19:43 UTC (Sun) by giraffedata (subscriber, #1954) [Link]

Who says anything about savings?

Just guessing at reasons people might think it makes sense to share parts between the mobile and desktop computer.

You apparently have a particular configuration in mind that would be much more flexible because the desktop is somehow married to the handheld than if the desktop were standalone. I'd like to hear what you have in mind.

In comparing the evolution of the laptop to the evolution of the handheld, I note that the laptop was always just a scaled down desktop. E.g. it ran the same OS and most of the same applications. Every advance was designed to make it more like the former desktop. In contrast, the handhelds have taken off in a separate direction. That should make a difference.

mobile computers replacing desktop

Posted Sep 9, 2012 21:41 UTC (Sun) by khim (subscriber, #9252) [Link]

Just guessing at reasons people might think it makes sense to share parts between the mobile and desktop computer.

You are still looking at the whole story from the wrong direction.

This is not about inventing desktop/mobile hybrids. No. This is about attaching a large monitor, keyboard, and mouse to a handheld or tablet. And only then inventing desktop/mobile hybrids.

Why would anyone do that? Are they geeks or just eccentric? Neither.

Let's start with facts.

Sales of PCs in 2011: 364 million units, 3.8% growth compared to 2010.

Sales of smartphones in 2011: 486 million units, 63.2% growth compared to 2010.

What does it mean? Well, one simple thing: the money is moving in the direction of smartphones (and, to a much lesser degree, tablets). And fast.

Even today there are a lot of people who have a smartphone and no PC (or who have a brand-new smartphone and a 5-6-year-old PC). Tomorrow most smartphone owners will not own a PC - but they will still want to use a large monitor, mouse, and keyboard.

If the docking station for a smartphone is cheaper and provides a usable experience, they'll pick it over a PC. And as smartphones become more and more powerful, this combination will feel more and more natural.

And, just like what happened with workstations, at some point people will simply stop buying desktops and laptops. It's hard to predict exactly when that'll happen, but when it does, anyone with no presence on mobile will go bankrupt. It's that simple.

Now you see why Microsoft is ready to bet quite literally everything on winning the mobile market? And why Intel spends billions to somehow shoehorn its CPUs into mobile phones? The writing is on the wall.

In comparing the evolution of the laptop to the evolution of the handheld, I note that the laptop was always just a scaled down desktop. E.g. it ran the same OS and most of the same applications. Every advance was designed to make it more like the former desktop. In contrast, the handhelds have taken off in a separate direction. That should make a difference.

Absolutely. Laptops were never an existential threat to the developers of desktop software. Handhelds are such a threat - exactly because there is continuity between desktops and laptops, but no continuity between laptops and handhelds.

mobile computers replacing desktop

Posted Sep 10, 2012 1:43 UTC (Mon) by dlang (subscriber, #313) [Link]

You are comparing a mature market (desktops) with an emerging market (mobile devices). The rate of growth of mobile devices is going to flatten out in a few years.

As for the rate of growth, there's an xkcd for that: http://xkcd.com/1102/

While some people will stop buying desktops and laptops, claiming that anyone who only produces desktops and laptops will go bankrupt is rather far-fetched.

That's like saying that, since many people have already stopped buying desktops and instead buy laptops, anyone who doesn't have a laptop presence, and all the companies who only make devices and cards that work in desktops, are going to go bankrupt. That's far from true; they can continue to make a nice profit selling their devices.

IBM only makes mainframes and servers; the vast majority of people don't buy such systems, and haven't for MANY years. Why didn't they go bankrupt decades ago?

mobile computers replacing desktop

Posted Sep 10, 2012 1:51 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

Because IBM sells stuff to entities that think along the lines of "the more expensive the better" and "we're not going to use all these new-fangled x86 CPUs; if PowerPC was good enough for Abraham Lincoln, it's good enough for everyone!"

mobile computers replacing desktop

Posted Sep 10, 2012 2:23 UTC (Mon) by dlang (subscriber, #313) [Link]

There are lots of situations where the Power chips are far better than the x86 chips.

The question is whether these added capabilities are worth the money.

If you have an application that is easy to split across multiple systems, the answer is probably no, but for applications that are harder to scale, the dramatic improvement in per-thread performance on a Power chip can be worth a LOT.

x86 didn't win because it was better than its competition; it won because it was more common. It was more common because it was 'good enough' and cheaper than the competition, and it is what happened to be picked for the PC that got cloned.

In many ways x86 is a far worse architecture than just about anything out there, but AMD and Intel have put HUGE amounts of effort into speeding it up. Even with all this effort, though, it's not the best-performing chip available, and it's not the most power-efficient chip available.

mobile computers replacing desktop

Posted Sep 10, 2012 2:35 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

Not really. It's just that IBM is willing to produce, by brute force (using large dies), CPUs that cost $20,000 each (I'm not exaggerating). That's why they can get to 6GHz clock speeds.

That could easily be replicated in x86 using the same methods, but Intel and AMD are probably not interested. It's a niche, after all.

mobile computers replacing desktop

Posted Sep 10, 2012 2:48 UTC (Mon) by dlang (subscriber, #313) [Link]

I'm not disputing that it's a niche.

However, the x86 architecture is rather poor in many areas (its extremely low register count, for example).

SPARC, POWER, and Alpha are all better architectures than x86 (or amd64), but since they can't run the software that runs on x86, and that software is a huge percentage of the industry, they are limited to the server niche.

The highest-performing processors always have a large price premium attached to them. IBM makes really good profit margins on the POWER line, but those chips do also include capabilities that are not found on x86 systems. These capabilities are usually not worth their cost, which is why it remains a niche, but if you need them, they are cheap at that price.

But the point was that just because they didn't jump on the latest bandwagon doesn't mean that they went bankrupt.

mobile computers replacing desktop

Posted Sep 10, 2012 3:08 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

Let me quote the inimitable Linus Torvalds: http://yarchive.net/comp/linux/x86.html

>And the baroque instruction encoding on the x86 is actually a _good_
>thing: it's a rather dense encoding, which means that you win on icache.
>It's a bit hard to decode, but who cares? Existing chips do well at
>decoding, and thanks to the icache win they tend to perform better - and
>they load faster too (which is important - you can make your CPU have
>big caches, but _nothing_ saves you from the cold-cache costs).

>The low register count isn't an issue when you code in any high-level
>language, and it has actually forced x86 implementors to do a hell of a
>lot better job than the competition when it comes to memory loads and
>stores - which helps in general. While the RISC people were off trying
>to optimize their compilers to generate loops that used all 32 registers
>efficiently, the x86 implementors instead made the chip run fast on
>varied loads and used tons of register renaming hardware (and looking at
>_memory_ renaming too).

And that's true. The ARM folks found out that while ARM is nice and great for low-performance, low-power computing, you really need to do all the sorts of tricks x86 does once you want to move to high-performance computing. A large register file does not give you much in that case.

mobile computers replacing desktop

Posted Sep 10, 2012 5:44 UTC (Mon) by dlang (subscriber, #313) [Link]

On the other hand, the x86 design is hard to scale up in speed and to use more transistors efficiently. Other designs do scale up better.

mobile computers replacing desktop

Posted Sep 10, 2012 6:33 UTC (Mon) by khim (subscriber, #9252) [Link]

Not really. There are designs which cover the lower-power/lower-speed niche (ARM and MIPS), and there are designs which cover the higher-power/higher-speed niche (Itanic, POWER), but there are no designs which scale better than x86.

That's why most TOP500 computers use it: they need something with the best available performance per watt (wall power is one of the most important limitations for supercomputers), and that "something" is either a GPU or x86. Often both.

mobile computers replacing desktop

Posted Sep 10, 2012 6:56 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

Technically, the latest ARMs have a better performance/power ratio than x86-64. But they still can't approach the single-core speed of x86.

mobile computers replacing desktop

Posted Sep 10, 2012 7:07 UTC (Mon) by dlang (subscriber, #313) [Link]

> Technically, the latest ARMs have a better performance/power ratio than x86-64. But they still can't approach the single-core speed of x86.

which doesn't approach the single-core speed of the Power systems.

each of them has a market, they don't have to directly compete.

Unless, that is, you are shipping binary-only software and aren't willing to ship multiple builds - and the industry has been in that mindset for a long time.

But in the mobile space they don't have the luxury of a single binary target, and until and unless they do (which seems unlikely in the short term), we have a real chance for competition between different processors, and for different processors to be used in different places. In the short term it's easier to try to mandate a hardware monoculture, but I don't think anyone has enough power to do so, especially with Android and Linux _not_ mandating one, giving everyone an example and an option to work with when they run into the monoculture's limitations.

mobile computers replacing desktop

Posted Sep 10, 2012 11:55 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

> which doesn't approach the single-core speed of the Power systems.
It does. Easily.

New Xeons beat all the PPCs except for the top-of-the-line POWER7 CPUs. And then the top-of-the-line PPCs beat Xeons only because of their humongous die size and clock speeds (with corresponding TDP).

In reality there's no magic sauce that makes PPC inherently better than x86. They both use hardware decoders to split instructions into pipelineable sub-instructions. Sure, decoders for RISCs are easier to implement, but the difference in the number of transistors is trivial for desktop/server CPUs.

I had several run-ins with IBM mainframe guys about a year ago. They told my client how great and fast their mainframes were - $100,000 per system couldn't be wrong, could it? So my client went and leased a 'small' unit, telling us how great it would be.

I/O turned out to be slow as hell compared to our simple Xeon-based server with a RAID of Intel SSDs. CPU performance indeed was higher, but not by much - and we got one of the fastest CPUs available.

mobile computers replacing desktop

Posted Sep 10, 2012 13:16 UTC (Mon) by khim (subscriber, #9252) [Link]

But in the mobile space they don't have the luxury of having a single binary target, and until and unless they do (which seems unlikely to happen in the short term), we have a real chance for competition between different processors, and for different processors to be used in different places.

Define "short term". We had a similar situation on the PC, too. Early versions of many programs supported many different CPUs (I've personally used Turbo Pascal on x86-based systems and on Z80-based systems). But eventually the market settled on "IBM PC compatible" and most other systems died off. The Mac survived (barely), but all other IBM PC-incompatible personal computers perished; x86-incompatible systems perished without exception. This is why Intel and MIPS are so desperate to push their CPUs into the mobile space: they know that if they don't do it in a year or two, they'll lose the opportunity forever. Already the biggest complaint about x86-based and MIPS-based Android phones is poor ARM emulation and the poor behavior of many popular programs with binary-only components.

In the short term it's easier to try and mandate a hardware monoculture, but I don't think anyone has enough power to do so, especially with Android and Linux _not_ mandating it and giving everyone an example and an option to work with when they run into the monoculture limitations.

Bwa-ha-ha. Just wait and see. Windows NT also supported a multitude of CPUs, but in the end… the market chose just one architecture: x86. Short-term? Nope, this is not a question of the short term. I'm not sure who'll win the mobile race (there is still some time left, and MIPS and x86 both have advantages: MIPS is free, and x86 makes it easy to port software from the PC), but in the end it'll be one choice.

mobile computers replacing desktop

Posted Sep 10, 2012 6:02 UTC (Mon) by khim (subscriber, #9252) [Link]

But the point was that just because they didn't jump on the latest bandwagon doesn't mean that they went bankrupt.

Oh yes, it does. IBM sold off its PC division because it was unprofitable, and they no longer sell workstations either.

They survived by shifting to other venues: they are doing consulting work, they are selling mainframes, but they no longer produce PCs or workstations. That part is gone.

The problem with most players in the desktop arena today is that they have no plan B, or if they do (HP has printers and enterprise software, for example), it is not big enough to support the whole structure.

Today they face a difficult dilemma: either they try to shift to mobile, or they need to switch to some other field (like IBM did 20 years ago). The desktop will no longer support them.

They have time (you're right: right now, today, mobile apps tend to be very weak at any kind of content creation), but it's a large shift; I'm not sure they have enough time.

mobile computers replacing desktop

Posted Sep 10, 2012 6:21 UTC (Mon) by khim (subscriber, #9252) [Link]

as for the rate of growth, there's an xkcd for that: http://xkcd.com/1102/

Beep, beep, beep. Bullshit detected.

you are comparing a mature market (desktops) with an emerging market (mobile devices). The rate of growth of mobile devices is going to flatten out in a few years

I'm comparing two markets of almost identical sizes - that's why the xkcd is irrelevant here. They have almost identical sizes (PCs have a [slightly] larger installed base, but smartphones already have bigger per-year sales). It's true that the growth of mobile devices is going to flatten out in a few years (when there are more smartphones than people on Earth), but why does it matter? We are already talking about big enough numbers to seal the PC's fate.

IBM only makes mainframes and servers, the vast majority of people don't buy such systems, and haven't for MANY years. why didn't they go bankrupt decades ago?

They almost did. You can read a recounting of that era from the guy who saved them from that fate. IBM survived, but not as a producer of workstations or PCs. Sure, they sell mainframes and servers - but this is not where most of their money comes from.

mobile computers replacing desktop

Posted Sep 10, 2012 6:53 UTC (Mon) by giraffedata (subscriber, #1954) [Link]

I'm comparing two markets of almost identical sizes

I wondered why you were comparing markets at all. We were having a discussion about whether mobile computing would replace desktop computing, and the figures you cited were in fact only tangentially related to that question. The figures that would help with that are how fast the number of people with mobile computers is growing and how fast the number of people with desktop computers is declining (if at all). The sales numbers totally skip the disposal side of the equation, and the high growth rate of sales is less predictive when you realize there's a saturation point.

As for the matter of Lenovo taking over IBM's personal computer product line, I'm afraid I've become lost in the analogies again, and I can't see what that teaches us about whether desktop computing will still be around as mobile computing grows.

mobile computers replacing desktop

Posted Sep 10, 2012 13:16 UTC (Mon) by khim (subscriber, #9252) [Link]

The figures that would help with that are how fast the number of people with mobile computers is growing.

This is what the numbers above show, isn't it? With a market growing as fast as smartphones are, the disposal side of the equation is not relevant yet: 50% of the smartphones ever sold were sold in the last 18 months. We can safely assume they are all still in use.

And how fast the number of people with desktop computers is declining (if at all).

It's growing right now :-) The disruptive collapse will happen later. UNIX workstations grew for more than a decade after the introduction of personal computers. In fact DEC was portrayed as an unstoppable force and as the pinnacle of business acumen in the 1980s. Business Week warned IBM in 1986: "Taking on Digital Equipment Corp. these days is like standing in front of a moving train." This was five years after the introduction of the IBM PC!

As for the matter of Lenovo taking over IBM's personal computer product line, I'm afraid I've become lost in the analogies again, and I can't see what that teaches us about whether desktop computing will still be around as mobile computing grows.

It shows what happens when you try to stop a disruptive collapse. IBM owned the PC market. Heck, it created the "IBM PC-compatibles" market. But this happened because the thing was not created by "big IBM" (that attempt predictably failed); it was created by a semi-autonomous group. When "big IBM" realized what had happened, it immediately tried to change direction to make sure it had control over the market. It tried, with the PS/2 and OS/2, to make sure personal computers would be confined to their niche and would not hurt sales of the RS/6000 and big iron. It failed, of course, and IBM eventually relented, but by that time it was too late: ThinkPads were (and are) good devices (as was the PS/1), and they are quite popular among some users (ThinkPads, not the PS/1), but ultimately IBM was unable to keep up with the PC market.

mobile computers replacing desktop

Posted Sep 9, 2012 19:30 UTC (Sun) by khim (subscriber, #9252) [Link]

That seems really unrealistic to me. The handheld parts are so constrained by the mobility requirement that the desktop would be far less capable per unit cost than if it were made entirely of desktop-optimized parts. Any savings from reusing parts would be outweighed by the loss from weakening both the desktop and the handheld.

Déjà vu: your explanation replicates the reasons which should have kept DEC and Sun viable forever. Just replace "desktop" with "workstation" and "handheld" with "personal computer". Remember the days when the first personal computers stuffed everything into one case with a keyboard and used a TV instead of a big, professional monitor? It's quite literally impossible to put a hot, powerful CPU, powerful graphics, and all the goodies workstations offered into such a tiny box.

A couple of decades later, the same reasoning was supposed to keep the laptop a niche product.

Today it's supposed to keep the desktop alive and well.

There is a separate proposition that I think some people confuse with this docking idea: most desktop computers will actually be replaced by handhelds. I.e. people will compute on their hands, not at their desks. There will be far fewer big screens and mice.

This will happen to some extent, but I doubt a lot of users will use a tiny tablet (10"-11" is tiny compared with a 30" monitor) when an alternative is available.

If you compare quarter-century-old workstations with today's personal computers and with the personal computers of that era, you'll find that, surprisingly, today's PCs resemble those old workstations more than they resemble those old PC relics - but it does not change anything: people switched to more affordable PCs when they could, and over time brought back the niceties they lost in that switch, while the companies which tried to sell those wonderful machines went bankrupt.

Today's phone is not powerful enough to drive today's desktop, but it is more powerful than all those old workstations!

mobile computers replacing desktop

Posted Sep 9, 2012 22:44 UTC (Sun) by giraffedata (subscriber, #1954) [Link]

I guess I'm not following your analogy. You're saying a Sun workstation is to an Apple II as MacBook is to iPhone?

If so, I don't see it. An Apple II is an inexpensive approximation of a Sun workstation. An iPhone does a whole different job from a MacBook.

mobile computers replacing desktop

Posted Sep 9, 2012 23:27 UTC (Sun) by khim (subscriber, #9252) [Link]

I guess I'm not following your analogy. You're saying a Sun workstation is to an Apple II as MacBook is to iPhone?

In a sense, yes.

An iPhone does a whole different job from a MacBook.

Sure, you can use it as a phone, a camera, a music player - but these things are not important here. What is important is that you can use it as a poor man's MacBook, too. Well, this is less true for an iPhone, because you cannot actually program it on the phone, but with Android you can do that... in many ways, in fact. You can write simple scripts or use Java or C++. Sure, just like CAD or office suites, these IDEs are a poor substitute for the full-blown analogues found on a desktop (or laptop), but there is no reason to believe they will always be underpowered.

mobile computers replacing desktop

Posted Sep 10, 2012 1:36 UTC (Mon) by dlang (subscriber, #313) [Link]

> And do bear in mind that one's data will be accessible by all of the handheld and desktop computers one uses, even without docking.

That is a dream that depends on having unlimited bandwidth available everywhere, something that is far from true today; and while things will improve, it's very unlikely that it will ever be true.

There are still large areas of the world where dial-up or satellite are the only options for Internet connections.

The fact that the mobile phone companies are being greedy with data usage costs is another strike against this idea.

> That seems really unrealistic to me. The handheld parts are so constrained by the mobility requirement that the desktop would be far less capable per unit cost than if it were made entirely of desktop-optimized parts. Any savings from reusing parts would be outweighed by the loss from weakening both the desktop and the handheld.

I really don't see how a combo device hurts compared to a plain mobile device. If you just add a DisplayPort connector to an existing system, you have everything you need to make it work (the keyboard, mouse, and network can attach via USB).

Yes, it will be far less flexible, and far more expensive than an equivalent desktop device, but with laptops we already see that many people are willing to make the trade-off of a less powerful, more expensive device that they can move around. I know of some people who buy laptops, but almost never move them off of a desk at home.

> I.e. people will compute on their hands, not at their desks. There will be far fewer big screens and mice.

Unless people's eyesight improves significantly and their fingers shrink, I really don't expect this to happen across the board. Yes, people who mainly consume data produced by others may shift this way, but people who create data (even if it's only spreadsheets, word processing, or non-trivial amounts of e-mail) are very unlikely to make this shift.

As a result, all the computers used for people's work are unlikely to be replaced by handheld devices, but they could be replaced by dockable mobile devices the same way that they are frequently being replaced today by dockable laptops. If you go into a large company nowadays and walk the cube farm, you are at least as likely to see docking stations or bundles of cables to plug into a laptop as you are to see a desktop. As mobile devices get more powerful, they will start to become 'good enough' when docked, and the trend can easily continue until the laptops are replaced by mobile devices (very probably eliminating the phone on everyone's desk in the process).

But these people do not need to run mobile apps at the office (mobile apps tilt heavily toward content consumption and games); they need desktop apps that can create content. Mobile apps tend to be very weak at any kind of content creation.

mobile computers replacing desktop

Posted Sep 10, 2012 3:14 UTC (Mon) by giraffedata (subscriber, #1954) [Link]

And do bear in mind that one's data will be accessible by all of the handheld and desktop computers one uses, even without docking.
That is a dream that depends on having unlimited bandwidth available everywhere,

Connection to the Internet is not the only way to have data accessible by multiple computers. Proximity replication works too. If every time I walk into my house (or hut) my pocket computer and desktop computer replicate with each other via Bluetooth, I have the same data accessible on both computers (and I don't consider that docking).

And for the new computing paradigm to assume that data is tied to some physical entity (meaning it makes more sense to plug a keyboard into that entity than to have duplicate hardware just to attach to the keyboard), there would have to be not just a few, but huge numbers of people, people who matter, who can't share data between computers.

You know, I once thought the computer in my house would some day be wired into every appliance and work station in my house. As it turned out, every appliance and work station in my house has its own computer now. Sometimes it works out that way.

mobile computers replacing desktop

Posted Sep 10, 2012 5:42 UTC (Mon) by dlang (subscriber, #313) [Link]

Unless there is a significant breakthrough in battery technology, you aren't going to want to have your mobile device auto-connect to your system and act as a hard drive without plugging it in. Nor will a complete copy of your files be an option as mobile storage grows (it will just take too long).

And if you plug it in, you may as well take advantage of being plugged in for your communications (and possibly other things) as well.

I also don't think that the idea of just moving the files around is really going to work that well in practice. The incompatibilities that creep in between versions of software make it hard to have two machines that will always work with the same file.

Not to mention the fact that people like to leave work open when they suspend/undock and move to a new location. You can't do that if you just move data.

> And for the new computing paradigm to assume that data is tied to some physical entity (meaning it makes more sense to plug a keyboard into that entity than to have duplicate hardware just to attach to the keyboard), there would have to be not just a few, but huge numbers of people, people who matter, who can't share data between computers.

Here I disagree with you. As long as things 'sometimes' don't work, the new paradigm is not going to take over and shut down the existing one; even a smallish minority of people not accepting the new way will keep it from becoming 'THE' way to do things.

mobile computers replacing desktop

Posted Sep 10, 2012 6:52 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

We already have wireless charging that can work at distances up to 3-5 meters.

>Unless there is a significant breakthrough in battery technology, you aren't gong to want to have your mobile device auto-connect to your system and act as a hard drive without plugging it in. It's also not going to be an option to do a complete copy of your files as mobile storage grows (it will just take too long)
"Hard drive"? What is it? You mean people earlier carried their only copy of data on a physical medium without backing it up into the Cloud? How quaint!

Probably you won't even _have_ much local data. It's already happening. All my code is on GitHub, all my photos are on Picasa, email on GMail, music on Google Music, etc. I have a backup of everything on my small NAS, and I'm even going to get rid of that in favor of Amazon Glacier.

>not to mention the fact that people like to leave work open when they suspend/undock and move to a new location. you can't do that if you just move data.
Duh. Move the computation with the data.

> Here I disagree with you. As long as things 'sometimes' don't work, the new paradigm is not going to take over and shut down the existing one; even a smallish minority of people not accepting the new way will keep it from becoming 'THE' way to do things.
Nope. Smallish minorities will be swept by the change. As usual.

mobile computers replacing desktop

Posted Sep 10, 2012 7:02 UTC (Mon) by dlang (subscriber, #313) [Link]

Please go back and read the thread.

I say that the 'cloud' is not going to be good enough to keep your data all there

I get a reply that the person doesn't mean the 'cloud', they mean carrying a physical copy of the data.

I point out the problems with that, and the reply is that the 'cloud' makes that obsolete.

So go back up the thread and read the problems with that.

If you are moving the computation along with the data, you are back to the docking station that we were talking about before the claim was made that it was better to just move the data.

> Nope. Smallish minorities will be swept by the change. As usual.

Nope: minorities that are happy with how they do things get ignored by people who want to claim that everyone has changed how they do things.

mobile computers replacing desktop

Posted Sep 10, 2012 7:09 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

>I say that the 'cloud' is not going to be good enough to keep your data all there
It's _already_ good enough. And it's going to become even easier in the future.

>nope, minorities that are happy with how they do things get ignored by people who want to claim that everyone has changed how they do things.
Sure. Nobody cares about those people still keeping Amigas and ZX Spectrums alive. It's just that they become totally irrelevant.

mobile computers replacing desktop

Posted Sep 10, 2012 7:14 UTC (Mon) by dlang (subscriber, #313) [Link]

>> I say that the 'cloud' is not going to be good enough to keep your data all there
> It's _already_ good enough. And it's going to become even easier in the future.

It's only good enough if you live in an area with good Internet connectivity and don't need to use mobile networks much.

If you are not so lucky, the cost (in time, and sometimes in money) of moving the data back and forth to the cloud makes it impractical.

Game companies are giving up on cloud based DRM, and customers are rebelling against game companies that want to have even their single-user games require full time Internet connectivity.

That hardly sounds like it's "there" now.

It's OK to have your stuff take advantage of good connectivity where it's available, but if you make it _require_ good connectivity, you are abandoning large parts of the market.

mobile computers replacing desktop

Posted Sep 10, 2012 10:20 UTC (Mon) by khim (subscriber, #9252) [Link]

It's only good enough if you live in an area with good Internet connectivity

Well, this is where all the PCs live. In areas where you don't have good Internet connectivity (such as Africa) you don't have PCs, just mobile phones, so all these discussions are even less relevant.

and don't need to use mobile networks much.

Huh? What does this have to do with anything? If you need to use mobile networks then you use them via your phone. How can you use them with a PC?

Game companies are giving up on cloud based DRM,

Source?

and customers are rebelling against game companies that want to have even their single-user games require full time Internet connectivity.

Customers complain. A lot. Yet they still buy more games when cloud-based DRM is used, thus I doubt it'll change. Some customers don't want to have anything to do with such DRM schemes, but those can be served later. First you sell games to new customers with DRM and all the other lockdown schemes, and then, years later, GOG (or someone similar) sells unlocked games to the rest of the public.

It's Ok to have your stuff take advantage of good connectivity if it's there, but if you make it _require_ good connectivity, you are abandoning large parts of the market.

Sure, but these are the less-affluent parts of the market. If you can squeeze more money from the ones who don't mind the always-on requirement, then this is still a net win.

mobile computers replacing desktop

Posted Sep 10, 2012 12:42 UTC (Mon) by dlang (subscriber, #313) [Link]

>> It's only good enough if you live in an area with good Internet connectivity

> Well, this is where all the PCs live. In areas where you don't have good Internet connectivity (such as Africa) you don't have PCs, just mobile phones so all these discussions are even less relevant.

Sorry to surprise you, but it's not just places like Africa that lack good connectivity.

It's also places in the US.

And it's not even limited to people living on farms. I live just outside of Los Angeles, and the best I can get is 1.5Mb down. There are people within 10 miles of me who cannot even get that, and who don't have any cellular network coverage at their houses. In some cases these are million dollar houses, so it's not just the poor people who are impacted. I'm not even talking 4G/LTE coverage; I'm talking 3G and voice coverage that is spotty.

I have a friend who lives within 5 miles of the Jet Propulsion Lab in Pasadena who can get cell coverage outside his house, but inside it usually doesn't work.

mobile computers replacing desktop

Posted Sep 10, 2012 16:34 UTC (Mon) by khim (subscriber, #9252) [Link]

In some cases these are million dollar houses, so it's not just the poor people who are impacted.

It's not a question of how much money a given person has. It's a question of how much a given person is ready to spend on internet, phone, and all the other goodies. If s/he does not want to spend enough to get a good internet in their home then why do you think s/he'll be ready to spend substantial money on software and hardware?

mobile computers replacing desktop

Posted Sep 10, 2012 17:17 UTC (Mon) by Jonno (subscriber, #49613) [Link]

> If s/he does not want to spend enough to get a good internet in their home then why do you think s/he'll be ready to spend substantial money on software and hardware?

Because "substantial money on software and hardware" is only about €5'000, while a good internet connection cost about €10'000 per km away from the nearest ISP junction. You know, not everyone live in big cities...

mobile computers replacing desktop

Posted Sep 12, 2012 18:46 UTC (Wed) by nix (subscriber, #2304) [Link]

It's the UK too. Whole villages (and some towns) have full exchanges, no cable modem fitted yet, no new broadband installs, and DACSes landing on lines everywhere -- and the first person to ask for broadband gets hit with the full cost of the exchange upgrade. Oddly, everyone there either survives on high-latency, hyper-expensive horrors such as satellite broadband (more expensive for 1Mb/s, 4Gb/month, metered, HTTP-only than I pay for two 40Mb/s, 250Gb/month lines on domestic ADSL) or goes without.

Who submitted == Apple security model

Posted Sep 5, 2012 20:06 UTC (Wed) by david.a.wheeler (subscriber, #72896) [Link]

Interesting, but I think another thing that could be done would be to put something in place so that you know WHO submitted the software. That's actually the primary way Apple deals with the iPhone... you can write malware, but the cops will be arriving later.

Who submitted == Apple security model

Posted Sep 5, 2012 20:44 UTC (Wed) by drag (guest, #31333) [Link]

They depend on the credit card infrastructure that the banks (all credit card companies are banks or owned by banks) have set up to monitor people. That way, if you get a credit card number, you have a decent chance of being able to track down who you are dealing with.

Unless Ubuntu decides to charge people a small fee for the ability to upload programs, they won't be able to use the same system.

This would probably not be popular. HOWEVER, political implications aside, this is NOT a terrible idea, IMO. One-time fees can be beneficial for multiple parties involved.

Who submitted == Apple security model

Posted Sep 5, 2012 22:55 UTC (Wed) by khim (subscriber, #9252) [Link]

A one-time fee is how the Chrome Web Store (CWS) solves this problem, too.

Who submitted == Apple security model

Posted Sep 6, 2012 9:44 UTC (Thu) by njwhite (guest, #51848) [Link]

Which I, for one, despise; that is why my Chromium extension isn't on their 'store.' That, and signing a complex legal document that I lack the training or the money (for a lawyer) to understand.

A requirement to give a provable identity in order to distribute your software is a dangerous thing.

I'd far rather see some slick desktop interface on (say) freecode (née freshmeat), which aggregates from wherever the developers can comfortably distribute, under their own terms.

Who submitted == Apple security model

Posted Sep 6, 2012 10:06 UTC (Thu) by njwhite (guest, #51848) [Link]

> I'd far rather see some slick desktop interface on (say) freecode (nee freshmeat), which aggregates from wherever the developers can comfortably distribute, under their own terms.

Using some sort of web-of-trust-based approach for determining the trustworthiness of the programs and authors (Ingo Molnar mentioned this recently, and I would very much like somebody to figure out how to make it work).

Who submitted == Apple security model

Posted Sep 6, 2012 16:56 UTC (Thu) by drag (guest, #31333) [Link]

> A requirement give a provable identity in order to distribute your software is a dangerous thing.

You are not required to provide identity in any case to distribute software. Except maybe on iOS.

What you are required to do is provide some form of identity in order to use another person's service to distribute your software. This is not a bad thing. If they are providing a service and you have a relationship with them, then requiring payment for access to that service is not wrong.

Your one-time payment would go to services and vetting, so that people can be paid to go through software and check it out as 'safe' or not. Is this not a real issue?

Right now the 'vetting' is done by requiring third parties (party 1: developers, party 2: users, party 3: distributions) to build the software and then only allowing users easy access to those builds. The 'distributions' vet their 'vetters' by requiring years of devotion and history before anyone is allowed to build and upload software.

It seems to me that process is no less distasteful than asking for a payment.

Either system is ripe for abuse, for different reasons. But that is why users need to know which 'software distributors' they can trust, regardless of the method used. If you are going to delegate your security to other parties (the developers and those who vet and distribute), then it's your responsibility to be somewhat aware of who, and what type of people, you are dealing with.

Q: "Who watches the watchers?",
A: the people being watched, of course.

One time fees

Posted Sep 7, 2012 11:25 UTC (Fri) by man_ls (guest, #15091) [Link]

One-time fees can be beneficial for multiple parties involved.
Let me state the obvious: fees are mostly beneficial for the receiving party, i.e. Ubuntu here. But of course it all depends on how small the fee really is. There is quite a range, from Apple's $99 to Android's $25 (source); an even lower fee would probably not change things much, as the existence of a fee is itself the biggest hurdle for application developers. If Ubuntu wants to attract developers they may want to skip the fee.

Improving Ubuntu's application upload process

Posted Sep 5, 2012 20:11 UTC (Wed) by mattdm (subscriber, #18) [Link]

Apps like the beer-festival interactive advertisement are an artifact of the iPhone app store environment and the relatively limited features of mobile Safari. Such a thing is handy on one's phone, because it can provide an optimized user interface for a small space. Because of this and because of the mania over "there's an app for that", phone app stores are full of event-of-the-week apps — like for example LinuxCon for iPhone.

On a desktop or laptop, is there a point in making these things "apps" rather than web sites? There are significant disadvantages to the app approach — all the things this project is trying to work around. The primary advantage is the app-as-info-channel model smartphone users have come to expect. Maybe that alone makes this worth doing, but since web sites are cross-platform and web browsers are already widely deployed, I'm not convinced it's the best approach for this kind of thing.

Don't get me wrong — I'm all for making an easier interface for end-users to find and self-install applications. I'm just skeptical about that whole class of apps on non-phones. It's probably a more useful channel for games....

Improving Ubuntu's application upload process

Posted Sep 6, 2012 11:28 UTC (Thu) by pboddie (guest, #50784) [Link]

Agreed. The craze for repackaging Web content as "apps" may excite the producers of these things, along with willing (and perhaps also ignorant) consumers. But apart from third-party attempts to make the user interfaces of sites better (and even that can be dubious if someone's "app" is acting as a middle-man), indulging the craze seems like an inefficient throwback to "hit parade" thinking and to a time when only new applications could deliver new functionality.

Surely the retort to such things should be, "Steve, there's an Internet for that."

Improving Ubuntu's application upload process

Posted Sep 6, 2012 19:01 UTC (Thu) by dlang (subscriber, #313) [Link]

remember when "having a website" was not much more than converting a company brochure to HTML and putting it up?

many "apps" are in that state now, and just like this has largely faded away (or rather, become so useless that nobody cares about your site if that's all you do), these apps will probably fade away as well.

Part of the problem is that Web bookmarks are so clumsy to use on mobile devices that there really isn't a good way to distribute and use them.

Improving Ubuntu's application upload process

Posted Sep 9, 2012 12:58 UTC (Sun) by valhalla (subscriber, #56634) [Link]

The process of providing information on a current event such as the beer festival can be improved over the website / website-wrapped-in-an-app model, but I don't think that improved custom apps for each event are what is best for users.

Of course, if your aim is to let me spend more and more time ignoring[esc]bdwawatching ads, or to prevent me from getting information from your competitor, locking me into an app is a good solution, but it is definitely not good for me.

On the other hand, something that would be useful is improving websites with RSS feeds and an RSS-like way to subscribe to time/space-based information (with the ability to choose between server-side sorting, to save bandwidth, and client-side sorting, to save privacy), displayed by a choice of apps on both the desktop and mobile devices.

Improving Ubuntu's application upload process

Posted Sep 9, 2012 17:43 UTC (Sun) by jwarnica (guest, #27492) [Link]

I'm not sure that a pamphlet-as-an-app is a bad thing. I'm also not sure what the dev tools for such things look like (personally, my research has been from the software-developer viewpoint more than the content-creator one). Do things like Dreamweaver have a "publish to app store" button?

Anyway. The requirement for web content is that it is continuously updated: even, and especially, the old content. People don't use CMS systems to make today's content fit in today's template; they use CMS systems so content from 5 years ago magically fits in today's template. I'm exaggerating and simplifying for effect, but you get the point.

If you ship a pamphlet-as-an-app, it's done, and it's done forever.

Improving Ubuntu's application upload process

Posted Sep 6, 2012 8:59 UTC (Thu) by etienne (guest, #25256) [Link]

> Uploads through this mechanism will be done in source form ...
> ... applications will run within their own sandbox

There are a lot of security checks which would be better done on the source code (i.e. once, and not at each execution), even automatically.
For instance, by automatically analysing the source, you can deduce which files are needed and which libraries are linked in (and so which package versions are needed), and then at run time only allow access to those files (without a user popup): file names/paths accepted at compile time.
You can also reject straight away some function calls (like "system()" or some library bindings), even if at run time you never found a way to trigger them.
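A crude version of this kind of pre-upload rejection can be sketched with a grep pass over the submitted source tree. This is only an illustration under assumed paths (/tmp/app-src is hypothetical); a real checker would parse the code rather than pattern-match it:

```shell
# Hypothetical pre-upload scan: reject source trees that call system()
# or popen() directly. grep is a crude stand-in for real static analysis.
mkdir -p /tmp/app-src
cat > /tmp/app-src/main.c <<'EOF'
#include <stdlib.h>
int main(void) { system("date"); return 0; }
EOF

if grep -rnE '(system|popen) *\(' /tmp/app-src; then
    echo "REJECTED: forbidden calls found"
else
    echo "ACCEPTED"
fi
```

A pattern-based scan is easy to fool (e.g. calling via dlsym()), which is why such checks complement, rather than replace, a run-time sandbox.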

Also, for the "Guides to nearby beer festivals" apps, it would be nice to have a "use-by date" where at that date, a check for a newer version is automatically done; if no new version is found the app is automatically dis-installed - else the O.S. asks if the user want to "renew the subscription" or remove the app.
Moreover, if version updates is done magically (the next month of beer festivals), there is no need for the app to ask for network access privilege.

Improving Ubuntu's application upload process

Posted Sep 7, 2012 10:21 UTC (Fri) by krake (guest, #55996) [Link]

"stalled under /opt; this rule complies with the filesystem hierarchy standard and prevents file conflicts with the rest of the distribution. The problem, according to David Planella (one of the authors of the proposal), is that a lot of things just don't work when installed under /opt"

I really don't buy that.
I've been installing proprietary software into /opt for years, and on the Free Software front I have built and installed software as complex as all of KDE's products into any number of prefixes.

Heck, during development I even run make install as a non-privileged user, into non-FHS trees, and even under $HOME.

The more I think about it the more I believe their observations are based on their application launcher failing to provide the appropriate environment, e.g. properly expanding $XDG_DATA_DIRS and friends.
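The kind of launcher environment being described can be sketched as a small wrapper script. Everything here is a hypothetical example (the /opt/myapp path is invented); it just shows how the XDG search paths would be extended so data installed outside /usr is still found:

```shell
# Hypothetical wrapper for an application installed under /opt/myapp:
# prepend its data directory to XDG_DATA_DIRS so .desktop files, icons,
# and other shared data outside /usr are still found by the desktop,
# and put its binaries on PATH.
export XDG_DATA_DIRS="/opt/myapp/share${XDG_DATA_DIRS:+:$XDG_DATA_DIRS}"
export PATH="/opt/myapp/bin:$PATH"
echo "$XDG_DATA_DIRS"
```

If the launcher fails to set these up, the application will look correct from a shell where the user exported them by hand but break when started from the desktop, which matches the observation above.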

Improving Ubuntu's application upload process

Posted Sep 10, 2012 21:58 UTC (Mon) by mathstuf (subscriber, #69389) [Link]

> I've been installing proprietary software into /opt for years and on the Free Software front built and installed software as complex as all of KDE's products into any number of prefixes.

KDE uses a useful build system (CMake). Not every project does, unfortunately :( . Those are the ones that tend to blow up outside of /usr. If DESTDIR and PREFIX are honored, there's hope of things working. I've come across some projects which don't even have install rules. Loads of fun.

Personally, I prefer $HOME/misc/root/$project[-$version][-$description] for local builds (vim, git, tmux, boost, cmake, etc.) that I use for testing or upgrading when I'm too lazy to spin local RPMs.
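The DESTDIR/PREFIX convention mentioned above can be sketched with a minimal, hypothetical project (the paths and the "hello" script are invented for illustration): PREFIX is the run-time location compiled into the install rule, while DESTDIR is a staging root that packagers prepend so "make install" never touches the live system.

```shell
# Minimal sketch of the PREFIX/DESTDIR convention (hypothetical project).
mkdir -p /tmp/hello-src && cd /tmp/hello-src
printf '#!/bin/sh\necho hello\n' > hello.sh
# The Makefile's install rule honors both variables:
printf 'PREFIX ?= /usr/local\ninstall:\n\tinstall -D -m 755 hello.sh $(DESTDIR)$(PREFIX)/bin/hello\n' > Makefile
# Stage into /tmp/stage as a packager would, relocated to /opt/hello:
make install DESTDIR=/tmp/stage PREFIX=/opt/hello
ls /tmp/stage/opt/hello/bin/hello
```

Projects that hard-code /usr in their install rules instead of composing $(DESTDIR)$(PREFIX) are exactly the ones that "blow up" when asked to install under /opt or $HOME.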

> e.g. properly expanding $XDG_DATA_DIRS and friends.

git finally does this in its latest version :) . One less ~/.* file. Lots more to go though :/ . I wish $XDG_RUNTIME_DIR would get more traction, to avoid /tmp pollution (tmux and uzbl mainly; I have a partial patch somewhere for uzbl).


Copyright © 2012, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds