
Shuttleworth: Unity on Wayland

Mark Shuttleworth has described the next major step for the Unity interface: putting it on the Wayland OpenGL-based display system. "But we don't believe X is setup to deliver the user experience we want, with super-smooth graphics and effects. I understand that it's *possible* to get amazing results with X, but it's extremely hard, and isn't going to get easier. Some of the core goals of X make it harder to achieve these user experiences on X than on native GL, we're choosing to prioritize the quality of experience over those original values, like network transparency."


Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 0:05 UTC (Fri) by juanjux (guest, #11652) [Link] (6 responses)

I think Wayland or some simple graphical backend is something that can't be bad to have. The question is: will the current proprietary drivers (NVidia & ATI) work with Wayland?

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 0:08 UTC (Fri) by tzafrir (subscriber, #11501) [Link]

Also: what about all the PowerVR-based ones?

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 1:23 UTC (Fri) by marineam (guest, #28387) [Link] (4 responses)

Wayland and other ideas for the future of graphics on Linux came up today at Keith Packard's talk at the Linux Plumbers Conference[1]. The basic premise was that recent developments like KMS and GEM have made it possible for X alternatives to actually be viable, the big example being Wayland. When X is no longer responsible for the low-level video driver, X becomes a heck of a lot easier to replace.

On the flip side though, none of this applies to things still using the traditional driver-in-X model, such as proprietary drivers and all the odds and ends that aren't Intel, nVidia, and ATI. Keith's answer was basically that Nouveau is doing well, which covers the last proprietary driver; as for the others, all he had was "I don't know."

I'm assuming that for the "I don't know" category Ubuntu will still include X and a traditional window manager as an alternative to Wayland, and set up the toolkit to use either backend. The few users in that boat won't have the smooth experience Unity is aiming for, but probably won't see a significant change from their current state.

[1] http://www.linuxplumbersconf.org/2010/ocw/proposals/483

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:10 UTC (Fri) by mjthayer (guest, #39183) [Link]

> On the flip side though, none of this applies to things still using the traditional driver-in-X model such as proprietary drivers and all the odds and ends that aren't Intel, nVidia, and ATI.

I can't say for sure, but I suspect that getting basic (that is, unaccelerated) KMS drivers working for the "odds and ends" would be feasible if someone finds the hardware to test. With the big caveat that that needs people to understand KMS. But if someone were to produce a nice KMS driver stub, which more or less worked except for a few empty functions with big "fill in XXX here" stickers there would be people able to fill in the gaps and lift things like the actual initialisation and register poking from the DDX drivers. I fear that I may have to port a DDX to KMS myself in the not too distant future, so perhaps I should give that a try.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 13:31 UTC (Fri) by wookey (guest, #5501) [Link] (2 responses)

On the flip side though, none of this applies to things still using the traditional driver-in-X model such as proprietary drivers and all the odds and ends that aren't Intel, nVidia, and ATI. Keith's answer was basically that Nouveau is doing well which covers the last proprietary driver and as for the others all he had was "I don't know."

That's a very 'desktop' view of the world.

ARM is rapidly becoming a platform that matters for shiny graphics use and we have a choice of GPUs: Imagination Tech PowerVR, ARM Mali, Nvidia GeForce, and Broadcom BCM2727(?).

Guess how many of those have free drivers? Exactly zero. So just as we get to a reasonable state on the desktop we have the same old mess all over again for new netbooks, tablets, phones, TVs and general widgetry. Suggesting that this problem has been dealt with is a very long way off the mark - it's about to get a whole pile worse.

Do please heckle at every opportunity you get at conferences, product unveilings, design meetings and so on. The people who matter need to hear over and over that this is going to be a massive PITA and that they need to fix it, as it is being fixed on the desktop. The engineers already know this, but there are lawyers and execs who still don't get it and are afeared of everyone's patents. I'm doing my bit here at ARM every time they come to tell us how cool it all is. I tell them it stinks :-)

Pretending we don't have a big problem does no-one any good. And unfortunately, whilst we have 4 choices all with the same problem, we have limited leverage: we can't pick the good choice because there isn't one. If we can get one manufacturer to open up then the others are likely to follow, as has happened reasonably convincingly on the desktop.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 15:07 UTC (Fri) by BenHutchings (subscriber, #37955) [Link] (1 response)

Keith says that the most pressure for avoiding the need for X is coming from the embedded world. (Maybe not your bit of the embedded world.)

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 0:28 UTC (Sat) by Lennie (subscriber, #49641) [Link]

Well, the only thing which was suggested was that X was too much overhead for smaller devices, so Wayland would be a better fit.

But if Rasterman, someone who works on X and graphics and embedded/mobile, does not agree (which he supposedly has said), then I don't agree either.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 0:48 UTC (Fri) by thoffman (guest, #3063) [Link] (87 responses)

Hmmm, I like my network transparency, and use it a lot, both at work and at home. Although, I now tend to use NX or Vinagre/Vino more often than the port forwarding via OpenSSH I used to rely on.

So, although I'm hopeful and supportive of most anything which would make my Linux desktop more responsive, I hope this Unity on Wayland effort will not prevent new Gnome apps from being usable through a networked desktop.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 1:21 UTC (Fri) by JohnLenz (guest, #42089) [Link] (84 responses)

The next generation of GNOME (and KDE) apps seems to be moving towards much heavier use of OpenGL and the graphics card. So even if the desktop is based on X, I don't see what good network transparency at the X level will do. Sure, there is indirect GLX rendering over the network, but that is too slow, so you won't be able to run the program over the network anyway. Instead, the rendering will have to happen on the client, using the client's graphics card.

I currently use network transparency of X too, but it looks like the X split of rendering on the server will need to be abandoned to instead use something like VNC where the rendering is allowed to happen on the client.

Perhaps Wayland plus SPICE could be used to replace the network transparency.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 1:43 UTC (Fri) by gmaxwell (guest, #30048) [Link] (78 responses)

Or— we could just stop using this dancing candy cram-ware.

I'm sure someone loves all this bouncing fading crud that halts your machine for seconds at a time just to play a pretty animation— but for me the computer is an important _tool_ and things like speed and network transparency are important for me.

Fortunately there is a whole suite of actively developed toolsets focused on users like me (things like xmonad), as well as all the "classic" unix tools. The biggest downside to using them now is that it means breaking from your distro's default configuration and thus losing part of the outsourced systems-administration value it provides. Perhaps a distro fork will arise targeting people who are technically competent and more interested in productivity.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 2:36 UTC (Fri) by bjacob (guest, #58566) [Link] (13 responses)

I think the two main ways in which the approach to network-transparent GUI apps has changed since X was designed are:

1. web apps (more generally, in-browser apps --- the CUPS config tool over port 631 now appears like a visionary precursor!)

2. the _clients_ now have insanely powerful graphics hardware; in any case, there is no reason anymore for wanting to do graphics on the server side (where I mean "server" in its proper network sense, not in the GPU sense).

And for non-GUI apps, SSH is all you need...

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 3:30 UTC (Fri) by davide.del.vento (guest, #59196) [Link]

Mmmm, not so much. Yes, there are some web apps like the CUPS config (which ironically I always needed only locally), and there are things that can be done in "reverse" compared to how they are done now.

But there will still be the need for "normal" X-forwarding via ssh, so a distro that completely kills it will be a deal breaker for me.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 7:09 UTC (Fri) by rqosa (subscriber, #24136) [Link] (10 responses)

The way that Web apps are now (usually), they have a big deficiency in comparison to console or X apps running over SSH: when you use the console/X app, it runs with your UID, but when you use the Web app, it runs with the same UID as it does for everyone else. This is bad for security; it's as though every app were a setuid app.

Apache's "mod_suexec" is one solution to this problem, but its limitations (it can only run CGI apps, and it chooses which UID to run as according to the directory where the program resides) make it rather impractical.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 12:51 UTC (Fri) by alankila (guest, #47141) [Link] (9 responses)

Webapps actually have a far more interesting issue than the one you talk about. The most important problem is that all end-user data is generally accessible from the same server-side UID, as the person who logs in to a web application isn't using a distinct UID on the server side. The data is in fact usually written to SQL storage, and there are no explicit security tags that the database server could check on behalf of the application: thus all data is available to every query, at least in principle.

Thus, all security features must be implemented through other kinds of checking, often manually, by comparing the user id on the database row being requested with the user id currently logged in. Not everyone remembers to do this all the time, though: generally, a missing check for things like "is this user authorized to view that page" results in information-disclosure bugs.

I think a lot of real-world systems don't run a ton of webapps within one server and one uid. I personally tend to isolate running webapps inside virtual machines and use reverse proxying techniques to expose them where I want them. Virtual machines can be backed up and recreated wherever I want, so they are actually quite convenient to move as black boxes from a system administrator's perspective...

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 14:03 UTC (Fri) by rqosa (subscriber, #24136) [Link] (7 responses)

> Webapps actually have far more interesting issue than the one you talk about. The most important problem is that all end-user data generally is accessible from the same server-side UID, as the guy who logs in a web application isn't using a distinct UID on the server side.

Uh, that is the issue I'm talking about. Or at least it's a consequence of it (because the app always runs as the same UID, all of its stored data is accessible to that UID).

> The data is in fact usually written in SQL storage

But, the grandparent post was talking about, essentially, using a Web browser & Web apps to do what we formerly did/still do with an X server & remote X clients. That is to say, to take the kind of apps which now are X clients (e.g. image editor, email user agent, text editor, office suite, RSS feed reader, terminal emulator, XMPP client, etc.) and make them into Web apps. These don't usually use SQL; or if they do, they use a per-user database instance, e.g. an SQLite file owned by the user, or a per-user instance of mysqld (IIRC, KDE's "Akonadi" does this).

There's no good reason for these apps to suddenly become dependent on a central RDBMS server, just because they have migrated from one remote-user-interface protocol (X Window System) to another one (HTTP + HTML5 + JavaScript + whatever)!

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 14:51 UTC (Fri) by alankila (guest, #47141) [Link] (6 responses)

Hm. Well, suexec does nothing to solve that problem as far as I know, so I concluded that you must be talking about separating webapps from each other, not separating the users within a webapp from each other.

Otherwise I am in agreement.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 17:07 UTC (Fri) by rqosa (subscriber, #24136) [Link] (5 responses)

> suexec does nothing to solve that problem as far as I know

Well, with suexec, the UID a CGI program runs as is determined by what directory it's in. A typical usage scenario is that each user has a "home" (or "public_html") directory (that is, a directory found at a path like "~user/public_html" or something similar on the machine where Apache runs, which Apache then exposes to HTTP clients as the URL "http://hostname/~user/") which may contain CGI programs, and when one of those programs is executed, suexec will set the UID for its process to the UID of the user who owns the "home" directory it's in. (Or maybe it just picks the UID that owns the program file; I don't remember which way it is, but it doesn't make much difference.)

So, basically, suexec will separate webapps that "belong" to one user from webapps that "belong" to other users. Now, if you take one CGI program and make multiple copies of it, each belonging to a different user (that is, each in a different user's Apache home dir), then the different users of that app are separated from each other. But that is an ugly kludge, necessary only because of the limitations of suexec. So suexec isn't a good solution for this problem.

(Also, suexec is only compatible with CGI programs. CGI has its own problems, the biggest of which is that it requires every webapp process to exit immediately when it finishes generating a response message; that is really bad for performance. There are much better IPC protocols for webapps, such as SCGI, AJP, and FastCGI.)
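
For what it's worth, the per-directory model described above could look roughly like this in Apache configuration terms (an illustrative sketch only; module paths and directory layout are assumptions, and suexec itself must be compiled in and enabled):

```apacheconf
# Illustrative sketch. With suexec enabled, CGI programs served out of a
# user's public_html directory are executed under that user's UID/GID
# rather than as the shared Apache user.
LoadModule userdir_module modules/mod_userdir.so
LoadModule suexec_module  modules/mod_suexec.so

UserDir public_html
<Directory "/home/*/public_html">
    Options +ExecCGI
    AddHandler cgi-script .cgi
</Directory>

# A request for http://hostname/~alice/app.cgi now runs app.cgi as
# user "alice" -- which is exactly the per-user separation (and the
# per-user-copy kludge) being discussed above.
```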

Here's a suggestion: for "single-user" webapps, the UID to run the app as should be determined by the user specified in the HTTP request, with HTTP authentication (basic or digest).

Look at mod_wsgi...

Posted Nov 5, 2010 21:13 UTC (Fri) by Pc5Y9sbv (guest, #41328) [Link]

For Python web apps, you can use mod_wsgi which can use daemon processes under different UIDs to run different apps. It uses Unix domain sockets between httpd and the app daemon and runs many requests through one daemon instance, so you don't have the overhead of suexec on every web request. With a bit of work, this model ought to be portable to any app language, not made Python-specific, just by standardizing the unix domain socket protocol used for the request proxying. The mod_wsgi module is written in C, and I suspect would be a reasonable starting point if you gutted its embedded Python runtime (used when apps are not split into separate daemon processes), and instead made a separate daemon process stub that contained the app language runtime.
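
The daemon-mode setup described above could be sketched like so (names and paths are invented; the directives themselves are mod_wsgi's documented ones):

```apacheconf
# Sketch of mod_wsgi daemon mode: each app runs in its own daemon
# process pool, under its own UID, reused across many requests.
WSGIDaemonProcess alice_app user=alice group=alice processes=2 threads=5
WSGIScriptAlias /alice /srv/wsgi/alice_app.wsgi
<Location /alice>
    WSGIProcessGroup alice_app
</Location>

# A second app under a different UID, isolated from the first:
WSGIDaemonProcess bob_app user=bob group=bob processes=2 threads=5
WSGIScriptAlias /bob /srv/wsgi/bob_app.wsgi
<Location /bob>
    WSGIProcessGroup bob_app
</Location>
```

Communication between httpd and the daemons goes over Unix domain sockets, as described above, so there is no per-request fork/exec overhead as with suexec CGI.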

I've frequently considered that it would be nice to have a generalized mod_wsgi like this, and a user-mapping variant that could manage a daemon process pool with each authenticated web user getting his own app instance, which can be reused for many requests and shut down automatically when idle for too long. There is already some basic pool management in mod_wsgi, but it needs more features.

However, other aspects of the security model need to be matured, as web frameworks have such an in-built idea of one app for all users. You'd really need to make all of your server side resources now owned by the individual web users, e.g. good user/group models for files and good multi-role policies for your RDBMS.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 11:33 UTC (Sat) by alankila (guest, #47141) [Link] (3 responses)

Um. The model of using CGI programs run by a central Apache from the user's home directory is probably not the future. It would make slightly more sense to enable browsers to execute applications without a web server, by emulating some common protocol such as FastCGI, so that the application could be run without exposing it on any URL, even one that was just locally visible.

I don't think anybody is going to actually do desktop apps in the web browser. The most important feature of a web application is probably still the fact that it's accessible anywhere and requires no installation from the user's viewpoint. A local application implemented as a web app is only available locally and needs to be installed, so that advantage is wholly gone.

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 0:36 UTC (Sun) by rqosa (subscriber, #24136) [Link] (2 responses)

> The model of using CGI programs run by a central apache from user's home directory is probably not in the future.

Indeed, it isn't.

Here's how it should be instead: When a person goes to use a remote app, they point their browser at the URL for the host where that app resides and the pathname on that host where the app is installed; for example "http://hostname/bin/my_app.py". Then, the user enters their authentication credentials (use HTTP digest or basic authentication for this) for that remote host. Then, any subsequent HTTP requests from that user will be forwarded (by SCGI or AJP or similar) to an instance of that app running as that user's UID. So, the Web app is installed in just one location, but there will be multiple running instances of the app, one instance per user. (Think about what happens with, for example, a host running an SSH server where many users log in via SSH and then run various console apps and X apps. It's the same principle: apps are installed system-wide, and there's a separate running instance of each app for each user using the app.)

> enable browsers to execute applications without a web server

I think this is already possible. (If you've got the Python documentation installed, try going to "/usr/share/doc/python/html/index.html" in a browser, type something in the search box, and press "Go".) But I wasn't talking about running web applications locally.

> I don't think anybody is going to actually do desktop apps in the web browser.

The trouble is, some people here are saying that we don't need X Window System anymore, because we don't need X's network-transparency anymore, because we have a better way to use apps remotely: Web apps. But, with X, most apps that you can run locally (image editor, text editor, etc.) you can also run remotely, and lots of people use this feature. That won't be possible if those apps migrate to a non-networked UI system (e.g. Wayland).

If we're really going to adopt HTTP + HTML5 + (whatever else) as the replacement for remote X, we've got to have these same kinds of apps available for it!

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 0:38 UTC (Sun) by rqosa (subscriber, #24136) [Link]

s/requests from that user/requests from that user to that URL/

Shuttleworth: Unity on Wayland

Posted Nov 9, 2010 0:29 UTC (Tue) by alankila (guest, #47141) [Link]

Hm. Okay, I start to see the point of the argument. I do have some severe skepticism that we'll rewrite gimp as a web application anytime soon, though. I rather expect that something similar to RDP/VNC, or a yet-to-be-defined networking protocol, will be used to take remote connections to Wayland applications.

The most popular X-forwarded application I use personally is xterm and that's mostly because I'm too lazy to open local terminals and use separate ssh connections for them. If I had to choose between X-style vs. VNC-style, I guess I actually prefer VNC-style remoting because of the ability to leave the session running perpetually on the server. Unfortunately, in practice, VNC is not really such a stellar protocol, and I've seen RDP between 2 Windows systems perform better than VNC seems able to, for some reason.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 5:44 UTC (Sat) by butlerm (subscriber, #13312) [Link]

The data is in fact usually written in SQL storage and there are no explicit security tags that the database server could check on behalf of the application: thus all data is available to every query, at least in principle.

There are good ways to fix that problem, namely "virtual private databases". You can implement them in any database that has update-able views that can filter on session variables.

I have an application that sets the database session state to match the application session when handling each page request. Until that state is set, all the "tables" return zero rows. After it is set, all the virtual tables contain only the rows the user is allowed to have access to, only those rows can be updated, and the application can only insert rows into the same range. Near perfect isolation. Any kind of attack can only affect the data of the logged in user.
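
A minimal sketch of that session-state pattern, in modern PostgreSQL terms (table, column, and setting names are invented for illustration):

```sql
-- Underlying table, never exposed to the application directly.
CREATE TABLE orders_raw (
    id       serial  PRIMARY KEY,
    owner_id integer NOT NULL,
    payload  text
);

-- The application sets this once per request, after authenticating:
--   SET app.current_user_id = '42';
-- Reading the setting with missing_ok = true yields NULL when it has
-- not been set, so the view returns zero rows until the session state
-- is established -- matching the behaviour described above.
CREATE VIEW orders AS
    SELECT id, owner_id, payload
      FROM orders_raw
     WHERE owner_id = current_setting('app.current_user_id', true)::integer
      WITH CHECK OPTION;  -- inserts/updates through the view must stay in range
```

A simple single-table view like this is updatable in PostgreSQL, and the CHECK OPTION prevents the application from inserting or updating rows outside the logged-in user's range.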

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 19:46 UTC (Fri) by misiu_mp (guest, #41936) [Link]

Of course you can use VNC over SSH too; just tunnel the right port, like this:
ssh -L 5902:localhost:5901 user@remotehost
(then point your VNC viewer at localhost:2).

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 3:27 UTC (Fri) by davide.del.vento (guest, #59196) [Link]

Completely agreed

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 3:57 UTC (Fri) by Kit (guest, #55925) [Link] (39 responses)

> but for me the computer is an important _tool_ and things like speed
> and network transparency are important for me.

Having a tool be capable of what it's intended to do is important, but would you really want to use an oily screwdriver with no grip? Or a hammer where the metal on the handle is flaking?

That's part of the idea behind this "dancing candy cram-ware", as you call it. Even modern netbooks have MORE than enough power to handle these animations and transitions; it's just that the current stack has a variety of issues (immature drivers, 2D operations that act as basically a worst-case scenario for the graphics accelerator, etc). Animations and transitions can work really well when _done_ well. Any that are showy, flashy, or long are prime examples of _bad_ ones... determining what's good is a bit harder, with subtlety generally being best.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 4:53 UTC (Fri) by gmaxwell (guest, #30048) [Link] (14 responses)

I would say instead that these systems are so fast that they ought to have no time in which to display the bling. Every action should occur so fast that unless the animation is slowing it down you wouldn't be able to perceive the animation. If there are still cycles left over the system should be conserving battery (if it's battery powered) or pre-calculating possible next moves on my part (if it's not).

So done well— most of them would be invisible. Unfortunately they aren't done nearly that well. My favorite peeve today is the combination of gnome-screensaver not reliably measuring idle status when using the keyboard exclusively, and an uninterruptible several-second fade-out animation when it decides to blank.

It's not that I want an oily screwdriver. I want the nano-diamond-tipped, tungsten-carbide, rocket-powered screwdriver. I want amenities, but they ought to be ones I consider helpful rather than hindrances. If someone wants to paint it pretty colors— that's fine as long as it doesn't damage the atomically sharp pointy end. But absolutely no wind-load-adding spoilers please.

I fully expect that different people will have different preferences in this regard. Unfortunately I don't feel that there are any good "power users" distros these days which don't leave me playing sysadmin over every piece of minutia (e.g. gentoo). Although I feel like the reduction in sysadmin work I get from using fedora vs gentoo is constantly decreasing due to "usability improvements", which seem to take the form of making the user's first hour 1% easier at the expense of adding a 10% cost to the user's next 20 years. Things like having to use some opaque "registry editor" in order to set distinct lock and save times — when almost 20 years ago xscreensaver gave me a perfectly accessible text file (or even a GUI!) with these settings.

The value of animations

Posted Nov 5, 2010 5:55 UTC (Fri) by sladen (guest, #27402) [Link] (3 responses)

Look carefully at certain platforms, like the iPhone. The animations (done for free by the GPU) are masking the general setup and teardown of applications running on the main CPU. The user gets their perceived instant response, and because "something is happening" doesn't mind the 1–2 seconds it takes for the application to appear.

The value of animations

Posted Nov 5, 2010 9:20 UTC (Fri) by mjthayer (guest, #39183) [Link] (2 responses)

> The animations (done for free by the GPU) are masking the general setup and teardown of applications running on the main CPU.

I don't think things are quite as simple as you suggest. One, the animations aren't done by the GPU alone, they need support (think loading, preparing, scheduling) from the rest of the system. Two, the GPU draws power to do those animations, which is a cost. And three, perhaps most relevantly, they are not free from a developer time perspective. At least in theory (though this may not apply to software developed by volunteers and/or enthusiasts), that time could have been put into reducing your general setup and teardown time rather than creating animations.

The value of animations

Posted Nov 5, 2010 11:27 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link] (1 responses)

"I don't think things are quite as simple as you suggest. One, the animations aren't done by the GPU alone, they need support (think loading, preparing, scheduling) from the rest of the system."

Negligible.

"Two, the GPU draws power to do those animations, which is a cost. And three, perhaps most relevantly, they are not free from a developer time perspective."

The GPU draws power to draw stuff in any case. And most effects are so simple that from the GPU's point of view they are essentially free.

The value of animations

Posted Nov 5, 2010 15:42 UTC (Fri) by drag (guest, #31333) [Link]

Yes.

If you're interested in speed and battery life, then using the GPU to its full extent will get you both faster than trying to depend on the CPU alone.

Using the CPU to do things that the GPU can do faster just means you're wasting cycles and ruining your efficiency and performance.

The GPU is now as much a part of your computer as floating-point processing or DMA. It's no longer possible to treat it like some sort of optional add-on or something you only use for games. It's a native part of the architecture and should be easy for application writers to take advantage of.

In PCs this has been true for a while, and in the mobile world it is more and more true. After all, you can look at the requirements for Windows Phone 7... they require a DirectX 9-capable GPU.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:15 UTC (Fri) by mjthayer (guest, #39183) [Link]

> It's not that I want a oily screwdriver. I want the nano-diamond tipped tungsten-carbide rocked-powered screwdriver.

And instead of that you got the Tungsten Graphics powered one! (Sorry, couldn't resist there...)

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:58 UTC (Fri) by roc (subscriber, #30627) [Link] (8 responses)

Animations serve more purposes than just distracting you during a delay.

For example, even if the application can move a visual object from point A to point B instantly, an animation can still be a helpful cue to remind the user that motion has occurred. Our brains aren't designed to process objects teleporting around.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 11:47 UTC (Fri) by orabidoo (guest, #6639) [Link] (7 responses)

"Our brains aren't designed to process objects teleporting around."

That's fine as a general case, by default. By all means provide a pretty animation, OpenGL-powered or otherwise, to make that window minimize to the taskbar or wherever.

BUT, it so happens that many of us, technically minded users, already know exactly what we expect from our computers when we press a key or click a button.

In those cases, having stuff visibly move around is just a plain distraction. The human eye, like that of most animals, is designed to follow stuff that moves and pay much more attention to it than to the static background.

And if I *know* that the window is going to minimize, my brain is already onto what I want to do with that window out of the way. So to have an eye-catching animation at that point is not just harmless eye-candy. It's actively distracting me from where my mind wants to go.

For that reason, every power-user friendly GUI and desktop should have an option to disable all animations. Current GNOME/Metacity has a gconf key for that (Apps->metacity->general->reduced_resources), which is nice. I sure hope the future GNOME shell(s) also have an equivalent setting.
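
For the GNOME 2-era setup named above, flipping that gconf key from the command line would look something like this (the key path is as given above; the invocation is a standard gconftool-2 settings command):

```shell
# Disable animations in Metacity (GNOME 2 / gconf):
gconftool-2 --type bool --set /apps/metacity/general/reduced_resources true

# ...and to turn them back on:
gconftool-2 --type bool --set /apps/metacity/general/reduced_resources false
```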

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 13:07 UTC (Fri) by flammon (guest, #807) [Link] (6 responses)

If you clicked on a window control button and the window disappeared, was it closed, minimized, or moved to another desktop?

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 13:26 UTC (Fri) by dskoll (subscriber, #1630) [Link] (5 responses)

It depends on which button you clicked. I use XFCE without any animations and I'm never confused about what happens when I do things to windows.

I use network transparency all the time. Eye-candy is great for those who want it, I suppose, but please keep an escape-hatch for those of us who like network transparency.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 13:38 UTC (Fri) by Janne (guest, #40891) [Link] (4 responses)

"It depends on which button you clicked."

Well, duh. But people don't always know which button does what. If the UI can guide them with animations and such, that's only a good thing. An app window that minimizes into its button on the taskbar is a GOOD IDEA. If the window simply vanished, users can be left confused as to what happened. Even you. What if your aim was a few pixels off, and you accidentally closed the window instead of minimizing it? With animations you would instantly know that you had closed the window, instead of minimizing it.

And I keep hearing comments about "technically minded people". You do know that those people are in the minority? Most people are NOT "technically minded"; they just want to get their stuff done. And if they can get it done elegantly, all the better.

And I find these comments about "eye-candy that freezes the desktop" strange. I have all kinds of animations and the like on my Mac, and the UI does not freeze.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 15:14 UTC (Fri) by dskoll (subscriber, #1630) [Link]

But people don't always know which button does what.

A UI that doesn't make that clear is fundamentally broken and no amount of animation can fix that.

To be clear: If people want to implement fancy animations, that's fine. I don't care. Even make it the default if you like. But make it possible to switch them off because I do care if animations are forced on me.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 15:42 UTC (Fri) by tjc (guest, #137) [Link] (1 responses)

But people don't always know which button does what.... If the window simply vanished, users can be left confused as to what happened.

Well, maybe the first time they don't know what happened. But if someone clicks a button five times, and the same thing happens every time-- and they still don't know what's going on-- then they have issues that can't be addressed by the UI.

Everyone is confused from time to time, but it usually passes. There are very few people who live in a state of perpetual confusion, so why target a UI at some imaginary, gormless twit who doesn't even exist?

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 10:22 UTC (Sat) by Janne (guest, #40891) [Link]

With attitude like this, it's no wonder that Linux on the desktop is perpetually stuck at under 1% market-share...

People are not computer wizards. It might be obvious to you and me how and why computers work the way they do, but the rest of the people have no idea. The computer should do everything in its power to help the user. But every time something like that is attempted in Linux, we get whining about "dumbing down" the UI or something. Only in Linux is complexity considered a good thing, and helping the user considered a sign of stupidity.

The end result is that Linux on the desktop is something that normal people do not want to use.

And sure, people will learn which button does what. But animations still help. When you have a dozen apps in the taskbar, it's useful to have an animation that shows you which of them is the app you just minimized. Sure, you could visually scan the taskbar, but you must admit that an animation is a lot faster way to do this.

And there are even studies about this. Researchers set up two functionally identical systems. The difference was that one system looked plain and basic, while the other had nice graphics ("useless eye-candy", as it's called in the Linux community). It was found that people were more productive on the system that looked better. People found the better-looking system more pleasant to use, and that in turn made them more productive. And happy users are a good thing.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 21:21 UTC (Sat) by orabidoo (guest, #6639) [Link]

"Well, duh. But people don't always know which button does what."

Well duh right back. As I said above, I'm all for having such friendly animations on by default.

I'm just pointing out a good reason why a subset of users find them counterproductive, and pleading that every GUI should have an option to turn animations off. I don't mind if the knob is quite well hidden, like a gconf key. Just let those of us who like to think ahead of the computer save that 0.10s of time, or feel like we did. Thanks.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 8:36 UTC (Fri) by janpla (guest, #11093) [Link] (1 responses)

Kit said:

"Having a tool be capable of what it's intended to do is important, but would you really want to use an oily screw driver with no grip? Or a hammer where the metal on the handle is flaking?

That's part of the idea behind this "dancing candy cram-ware", as you call it. Even modern netbooks have MORE than enough power to handle these animations and transitions ..."

- All this may be true, but there are some (I am one) who avoid this kind of thing because it is too intrusive and too much of a distraction. I am perfectly happy with graphics where relevant and useful, but in my view trying to work in the middle of an advanced light-show will only detract from the real enjoyment of computer programming.

Apart from that, I think it is deeply unfair to compare X to a broken tool. To take you up on the tool-analogy, you may prefer a sleek-looking electric drill with automatic cable roll-up, cool colour and some impressive graphics printed on the body, but if you want to drill a hole, all you need is a hand-cranked drill; and if you know how, you can normally do a much better job faster, because you have far better control over it.

X may be hand-cranked, but it is a very well-designed tool and there is nothing broken about it.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 8:57 UTC (Fri) by marcH (subscriber, #57642) [Link]

>X may be hand-cranked, but it is a very well-designed tool and there is nothing broken about it.

Some insiders do not agree: http://lwn.net/Articles/390389/

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 8:57 UTC (Fri) by codefisher (guest, #64993) [Link] (19 responses)

I think people are forgetting the real purpose of all these animations - well, at least the good ones - which is to provide feedback to the user about what action just happened. If everything happens at blindingly fast speed, as some people want, it may leave you confused as to what actually happened.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 13:08 UTC (Fri) by gmaxwell (guest, #30048) [Link] (18 responses)

For the user's first hour on the system this kind of training wheels may be necessary… but for their next twenty years?

The way I see it— if _I_ need animations to tell what the system has done then the system has already failed. The computer's behavior should consist mostly of deterministic direct responses to my actions so that I should almost never need help figuring out what it has done.

In the rare cases of doubt (such as a cat jumping on the keyboard) I should not have to fuddle out what happened from my memories of the computer's graphical interpretive dance, instead a log/history should be provided which I can reference whenever I need to.

There are a great many operations that a computer can conduct which have no intuitive mapping to an animation. We would weaken our computers to uselessness if we constrained their easily accessible abilities to those which could be represented accurately as dance. I could possibly memorize a long list of animations— "When the screen vibrates up and down, the system has unmounted a disk and it can be removed."— but part of the reason for having _language_ is that you don't need to memorize a unique symbol for every possible idea. Textual communication can provide a precise, searchable, learnable, archivable, and accurate information channel between the computer and the user. Language is a tool which is equally available to all applications, including GUI ones.

Much of the Unix environment already works this way, certainly the CLI itself does— but it seems that many desktop tools come from a different culture where they use things like focus-stealing animated popups with ideograms to inform the user about system activities. When users complain that messages sometimes disappear, never to be recovered, before they had a chance to see them, the 'desktop' culture seems to think "let's make the animation slower and more intrusive!". If that kind of thing makes someone happy, good for them— but it isn't something that I want.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 15:51 UTC (Fri) by drag (guest, #31333) [Link] (16 responses)

[quote]The way I see it— if _I_ need animations to tell what the system has done then the system has already failed. The computer's behavior should consist mostly of deterministic direct responses to my actions so that I should almost never need help figuring out what it has done.[/quote]

Lolz over Lolz. :)

Don't you know that, you know, text scrolling is an ANIMATION?

The way you make it sound, it's like your computer is just something with a big red button on the front that you press and it says "DO WHAT I WANT", and then it plugs into your mind or something.

It's all about information feedback. There are lots of ways to provide information, and lots of different ways to receive it. If you want to live in a weird sort of Max Headroom universe where all that exists is just you and your PC, then that's an interesting idea, but I (and most people) want to interact with the real world.

This means things happening outside your control and interacting with you and your computer. GPS, temperatures, news feeds, message notifications, etc. All sorts of stuff is going on all the time. We want 'augmented reality', 'feedback', and that sort of thing. It's the dream to be able to someday go 'Hello, Computer' and have some sort of meaningful response.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 17:41 UTC (Fri) by gmaxwell (guest, #30048) [Link] (15 responses)

Don't you know that, you know, text scrolling is an ANIMATION? The way you make it sound, it's like your computer is just something with a big red button on the front that you press and it says "DO WHAT I WANT", and then it plugs into your mind or something.

Please don't be silly. I think I made it amply clear that I don't expect to work with the computer without it communicating with me— but I want it to communicate only when required or requested, and I want it to use the high-bandwidth channel of _language_ for that communication— except when the things being communicated are intuitively and obviously graphical, or when graphics are the most efficient choice— instead of what I characterized as "interpretive dance" or cave drawings.

I certainly do want to interact with the outside world— but I also want to be able to control that interaction. A small status indicator, additional comments in addition to some output the computer is already providing. Among humans we generally consider it impolite to interrupt someone with something unless it's urgent or you know that its something that they want to know about. My computer is far too stupid to reliably know when something meets that criteria, so it ought to be especially cautious in its interruptions unless I tell it otherwise.

It's not like you really have a choice in the matter. In environments where the computer is constantly presenting the user with a barrage of focus-stealing choices, users quickly learn to simply confirm everything that comes before them. "Install this?" "Yes." "Remove this?" "Yes." "Transfer your bank account to Nigeria?" "Yes. er. SHIT!". My bandwidth is finite— and I'd much rather spend it on the interactions I have initiated.

Well, duh. But people don't always know which button does what. If the UI can guide them with animations and such, that's only a good thing.

Not always— but experienced users USUALLY do. Why should I pay the price of an animation every time just to get a small benefit in a small minority of cases? Give me a session history, give me an undo, give me a "WTF just happened" button. These things would all be great. An animation? Without things like undo, an animation just lets you know how screwed you are a bit faster. Without a history/WTF button, an infrequently encountered animation is likely to be inscrutable.

The kinds of events which are likely to confuse me are also likely to not be representable by an animation. I'm not going to be confused by accidentally dropping a file in the wrong directory, I'm going to be confused by something like a glob having unintended consequences.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 18:20 UTC (Fri) by drag (guest, #31333) [Link] (14 responses)

[quote]It's not like you really have a choice in the matter. In environments where the computer is constantly presenting the user with a barrage of focus-stealing choices, users quickly learn to simply confirm everything that comes before them. "Install this?" "Yes." "Remove this?" "Yes." "Transfer your bank account to Nigeria?" "Yes. er. SHIT!". My bandwidth is finite— and I'd much rather spend it on the interactions I have initiated.[/quote]

Well, you can avoid that just by using software that does not suck.

The only time I want my attention to be stolen from what I am working on is if it's something damn important. Then in those cases I WANT my attention to be stolen.

But really, nobody is advocating that we should have constant big swooping animations that do nothing but get between you and whatever text box you happen to be interacting with at the time.

And these animations don't really cost you anything. If you think that having a translucent notification box pop up to tell you you've received an email is going to take away from whatever you're doing, you're probably very wrong.

Or at least you should be wrong.

We have had hardware since the late 1990s that is perfectly capable of performing the functions needed for what people are trying to achieve with things like Unity, Gnome-3, etc. Apple's first OS X desktop ran with no GPU acceleration at all!

It's just that graphics suck in Linux. This is what _may_ get fixed if we can break away from the tyranny of proprietary video drivers and everything-we-use-must-be-X.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 18:44 UTC (Fri) by gmaxwell (guest, #30048) [Link] (6 responses)

I can't speak for anyone but myself, but I find things like a "translucent notification box pop up to tell you received a email" very distracting and irritating. It inevitably appears over material I'm currently reading or in the way of something I'm typing, and I have to wait for it or perform some action to dismiss it. Any software that does that _sucks_ in my book.

I'm sure that some other people, perhaps most other people, are completely fine with that sort of thing. I wish you luck in creating software for those people to use. Though I am somewhat skeptical that most people actually prefer this sort of thing— outside of computers almost nothing else provides indications in that kind of intrusive— "interrupt driven"— way. (My car doesn't overlay a gigantic oil can on my windshield when the oil pressure is low— it lights up a discreet check-engine light and I can attach an OBD tool to find out the cause. When my office paper mailbox has a letter, it's left sticking out where I can see it— no one copies the letter onto a transparency and then slams it in my face.)

But even if I really am in the crazy minority here, please don't think that you speak for everyone. You certainly don't speak for me— and at least a few other curmudgeons like me— and I've been using computers long enough to have a pretty good idea what works for me. That kind of annoyance isn't how I work; it isn't what I want. I put up with this kind of behavior from my computer only insofar as putting up with it is less costly to my time and endurance than maintaining every part of my systems on my own. But as far as I'm concerned, it's a step down from a system that provides no notification at all.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 16:08 UTC (Sat) by andreashappe (subscriber, #4810) [Link] (5 responses)

> But even if I really am in the crazy minority here, please don't think that you speak for everyone.

Time to cut back on the hyperbole...

The X protocol currently gets in the way more than it helps -- don't take my word for it, Keith Packard's should be enough. There are people trying to improve that: look at the quality of the X stack; it is a long way ahead of the things I had to use in the last millennium.

Animations might be added. So what? Scrolling is an animation, and tear-free window movement was made possible by that animation work. Who suffered from that? Wayland might be the way forward, and there's still X11 as a possible client to it.

If you don't like them: turn them off. Come on, that would have taken less time than the whining on this forum. So you don't like those transparent popups that disappear after 2-3 seconds and hide some 8 cm² in the top right corner of your screen, where most of your work seems to happen: disable them. You are able to disable them; some no-clue first-time Linux user surely ain't able to enable them. If you (for some reason) need to reinstall your Linux distribution every year, automate it. Create a package that does all the magic for you -- other people might even like to use it.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 20:19 UTC (Sat) by gmaxwell (guest, #30048) [Link] (4 responses)

To quote my first message on this thread:
Perhaps a distro fork will arise targeting people who are technically competent [and are] more interested in productivity.

I run a distribution in order to outsource basic system maintenance. I have more pressing things to do with my time, and I'm willing to tolerate the consequences of system operation that I don't agree with, but that doesn't mean I don't have preferences. I'm speaking up here because I believe it would be a disservice to everyone who shares these interests for me to sit quietly while the people pushing features harmful to those interests are so vocal.

You make it sound like it's so easy to disable these things. Sadly, it usually is not— in the interest of "usability" the mere option to disable these things is often eliminated entirely, or if it remains at all it is deeply hidden (often inside some undiscoverable registry tool). Just because I am more capable than joe-random doesn't mean my time is less valuable, that I am more patient, or that I am infinitely capable. In cases where the functionality is eliminated, patching the software breaks updates and leaves me tracking development, which is the work I was hoping to avoid by using a distribution in the first place.

Going back to the subject that started this sub-thread: If network transparency is abandoned in the GNU/Linux desktop infrastructure, I can't simply turn a knob to bring it back! Remote X is functionality I use _every day_. I have three windows open on my laptop right now to a system with a large amount of RAM which is able to work on data sets that I can't reasonably work on locally. It works great. And the notion of it only working via shims or with arcane software which I have to maintain myself troubles me greatly.

I'm certainly not opposed to _performance improvements_. By all means, making it faster has my full support. The discussion here was about tossing functionality (which I find critical) in order to enable performance improvements which are mostly inconsequential to me. I am not comforted by the argument that this change is urgently needed to enable improvements like increasingly intrusive animations.

Posted Nov 6, 2010 10:22 UTC (Sat) by Janne (guest, #40891)
With attitude like this, it's no wonder that Linux on the desktop is perpetually stuck at under 1% market-share...

Janne, I must admit that I'm not quite sure if you're trolling me or not but if you are I guess I'm going to fall for it.

Your market share strawman is not well supported by the evidence. Systems with clearly superior user experience have time and time again failed to capture really significant market share (Mac OS for the longest time and even today it's only at perhaps 7%, BeOS, etc).

You're also making the erroneous assumption that I care about having 7% market share (like OS X) vs 2% market share (numbers source). I don't. I care about having a usable _computer_ (as opposed to a home entertainment center, which has largely orthogonal usability requirements). I care about having a good option to recommend to other technical people. I care about not having to build my own desktop software stack, even though I would probably be able to create one which met my needs— I have other things that I'm working on. While I'd love to see most people running Free software, 7% wouldn't be much of an improvement against the 85% on Windows for that purpose... even if I believed that we could close the market-share gap with UI improvements.

People use computers for different purposes. Even windows has a small market share if I count televisions and video game systems as "computers". I wonder if we're using 'desktop' market share numbers which are diluted by a great many use cases which would be better served by an appliance? If I were to care about market share— I'd want to first care about getting 100% of uses which are best met by powerful computing systems rather than by media players or the like.

People are not computer-wizards.

I am, and I am not alone. And I want a system which is useful for me to run. I also want other people to have systems which are useful for them, even if their needs are different from mine. I feel that none of the major distributions are catering to my interests; I think that's unfortunate and I hope it changes. The major distributions and major Linux desktop software suites are clearly prioritizing non-technical novice users today. They even say so explicitly. They may actually be failing to satisfy the needs of those users too, but failing to make your target happy isn't the same as having a target which includes other people.

The computer should do everything in its power to help the user.
It seems to me that the people carrying the biggest "help people" banner often do the most harm. I too want the computer to help people, even non-technical people. I suspect we have very different ideas of what "help" means. I can assure you that adding more popups and interface interrupting animations will not help _me_ in the slightest. Other folks, perhaps, but I don't intend to speak for anyone else.

And sure, people will learn which button does what. […] And there are even studies about this. Researchers set up two functionally identical systems. The difference was that one system looked plain and basic, while the other has nice graphics […]. It was found that people were more productive on the system that looked better.
If you provided citations, I would read them. But what concerns me is that this seems like an unhealthy obsession with the initial impression. Your first few hours with a system are entirely different from your next twenty years with it. Unless you are worried about every last fraction of a percent of market share, I believe you should optimize as much for the 'next twenty years' as is possible without turning people off completely. (For example, I think Blender fails a bit too hard on the initial impression.)

Perhaps animations can play a useful role in a typical user's "next twenty years"— but the animations that do probably won't be the same training-wheels animations that you'll create if you're optimizing for the initial impression. I found the example about minimizing pretty humorous. Why would I want that? If I care, it's because I either don't know what I did, or because I wish I hadn't done it. In either case what I need is an undo button, not an animation. An animation might make it a little easier to manually undo my mistake, but that's really a half-step... We have computers to eliminate manual processes. How many significant usability improvements are we missing because everyone focused on usability is primarily focused on newbies and the initial impression?

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 21:22 UTC (Sat) by dlang (guest, #313) [Link] (3 responses)

adding more info here.

It's really hard to put a couple hundred gigs of RAM into a laptop, but trivial to remote the display from a server that has a couple hundred gigs of RAM to a laptop that you can carry into a conference room.

You may try to argue that the app could be written to work that way through other means, but that misses the point: with X, the app author doesn't have to decide whether the app should be network-accessible or not. If app authors have to go to extra effort to make their stuff network-accessible, most of them won't (after all, nobody needs that anyway; that's why the feature was removed from Linux systems to start with, right?) and the only apps that will be networkable are the ancient X apps (that predate the change), or high-end 'enterprise' commercial apps where someone is paying for the feature.

This leaves out the huge middle ground where the app author never thought about the need to be networked, but the app ends up being the perfect thing to use when backed by the right hardware. Instead, someone will have to fork or recreate the app in a networked version.

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 9:31 UTC (Sun) by roc (subscriber, #30627) [Link] (2 responses)

> with X the app author doesn't have to make a decision of if the app should
> be network accessable or not.

Maybe true for simple apps, but complex apps are basically unusable over modest-latency links unless they've been significantly optimized to reduce round-trips to the X server. There are a lot of X APIs that you simply cannot use if you want to be fast over the network.
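That round-trip cost is easy to sketch with a toy model (all numbers below are assumptions for illustration, not measurements of any real toolkit or link):

```python
# Toy model: each synchronous X request stalls the client for one full
# round trip, so total stall time grows linearly with the request count.
def sync_stall_ms(round_trips: int, rtt_ms: float) -> float:
    """Total time (ms) spent blocked on the wire for `round_trips` requests."""
    return round_trips * rtt_ms

# The same hypothetical app issuing 500 blocking requests at startup:
lan_ms = sync_stall_ms(500, 0.2)   # assumed 0.2 ms LAN -> 100 ms of stalls
wan_ms = sync_stall_ms(500, 30.0)  # assumed 30 ms WAN  -> 15 s of stalls
print(lan_ms, wan_ms)
```

Note that bandwidth never enters the model, which is why apps tuned for remote X batch their requests and avoid calls that force a reply.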

> this leaves out the huge middle ground where the app author never thought
> about the need to be networked, but that app ends up being the perfect
> thing to use when backed by the right hardware. Instead someone will have
> to fork or recreate the app in a networked version.

Or just run it under a modern screen-remoting tool.

Shuttleworth: Unity on Wayland

Posted Nov 9, 2010 2:02 UTC (Tue) by nix (subscriber, #2304) [Link] (1 responses)

There are a lot of systems on fast LANs with big servers nearby. Low-latency LANs are downright *commonplace* these days: why not optimize for them?

Shuttleworth: Unity on Wayland

Posted Nov 9, 2010 6:40 UTC (Tue) by dlang (guest, #313) [Link]

At this point the argument isn't even about optimizing for them; it's just arguing that we should support them with something a little more efficient than bitmap images of the screen being shipped around (the VNC approach).
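The gap between shipping bitmaps and shipping drawing commands can be sketched with assumed figures: one uncompressed full-screen update (the VNC-style worst case) versus one compact drawing command. Real remoting tools compress and send only damaged regions, so this is only an upper bound on the difference:

```python
# Assumed figures: a full 32-bit 1920x1080 framebuffer update versus a
# small structured drawing command (e.g. a fill-rectangle request).
width, height, bytes_per_pixel = 1920, 1080, 4
frame_bytes = width * height * bytes_per_pixel   # whole-screen bitmap
command_bytes = 24                               # hypothetical request size

print(frame_bytes)                    # bytes per uncompressed update
print(frame_bytes // command_bytes)   # size ratio between the approaches
```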

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 19:30 UTC (Fri) by dskoll (subscriber, #1630) [Link] (6 responses)

And these animations don't cost you anything really. If you think that having a translucent notification box pop up to tell you received a email is going to take away from whatever your doing your [sic] probably very wrong.

Are you kidding me? Those notification boxes drive me crazy. There I am, deep in an xterm or an emacs debugging session and some stupid box obscures my text? I want the computer to stay out of my face!

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 20:06 UTC (Fri) by drag (guest, #31333) [Link] (2 responses)

It depends on what is important to you.

For example, at my current job some of the emails I get are critical and far more important than anything I happen to be working on, unless I am working on an emergency... at which time I would have three phone lines blazing, people talking over everybody else, etc. Then a little popup in the corner of my window is going to be the last thing on my mind.

In my old job I couldn't care less. There was no communication that mattered enough to be answered right away.

But now I WANT to see that stuff. I WANT to be interrupted. That's a good thing. Because if I get a notification and act fast enough I can stop those above mentioned emergencies. :)

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 10:25 UTC (Sat) by modernjazz (guest, #4185) [Link]

There's another issue here, too: there are legitimate examples of work that require good graphics. I for one am looking forward to being able to run GLSL-requiring scientific visualization software on my ATI GPU someday, and start leveraging OpenCL for numeric computation. (My GPU is not supported by Catalyst anymore, or I could be doing those things now as long as I was using a proprietary driver.)

The problem is there just hasn't been enough effort put into open-source drivers until recently, and the quest for "bling" has really ramped up those efforts. Just as the commodity/gaming market increased the power and decreased the price of computing, this benefits both "serious" and "fluffy" use cases.

So I'm happy about where things have been going, even though it has made X a pain in the neck for the last couple of years. (Fortunately, it seems to be getting better, at least for me.) But I would bemoan the loss of network transparency in situations where I didn't need the absolute highest-performance graphics.

Shuttleworth: Unity on Wayland

Posted Nov 11, 2010 17:13 UTC (Thu) by cdmiller (guest, #2813) [Link]

And thus *CHOICE* of the interrupt and its mechanism is important. This is why I use gcalcli rather than the web interface of Google Calendar, for example. I'm sure you wish for (or have already invented) a process whereby only the important emails, rather than every message, interrupt your day-to-day work.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 22:31 UTC (Sat) by Wol (subscriber, #4433) [Link] (2 responses)

Or they hide the very thing they're SUPPOSED to be telling me about.

Perfect example: KDE. I have a taskbar at the bottom of my screen that currently says "Konsole (2)", i.e. I have two Konsoles (currently hidden). Let's say I put my mouse over it: it now displays what those two consoles are. All fine and dandy - UNTIL I actually want to select the upper of the two.

If I don't know which one I want, or I'm slightly hesitant, or I'm not good at moving my mouse, or or or ... the mouse hovers over the FIRST Konsole description a tiny moment too long, the information popup appears, COMPLETELY obscures the second Konsole button that I actually want, and JUST WON'T GO AWAY until I go back to "Konsole (2)", get rid of the whole damn lot, and have to start ALL OVER AGAIN.

Don't forget - these information popups have a habit of following the mouse. In other words, if you're slightly unsteady, or can't aim quite right, or anything else where the mouse is wobbly, there's a damn good chance the popup is going to pick a damn inconvenient place to appear.

Quite why the KDE people chose the place they did for the popup I'm moaning about I do not know - it is INCREDIBLY stupid, but hey, I'm sure they have some very clever people who thought it was a good idea ... :-)

Cheers,
Wol

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 22:43 UTC (Sat) by dlang (guest, #313) [Link]

In KDE, if you right-click on the taskbar and select task manager settings, there is a 'grouping and sorting' option; you can configure it to never group, or to group only when the taskbar is full.

Personally, I choose to have it never group, and I set the taskbar tall enough to show enough rows to have a usable amount of text in each of the icons.

KDE obscuring tool tips

Posted Nov 7, 2010 1:32 UTC (Sun) by boog (subscriber, #30882) [Link]

It's annoying and surely a bug. No doubt it will be fixed soon.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 19:30 UTC (Fri) by rgmoore (✭ supporter ✭, #75) [Link]

The computer's behavior should consist mostly of deterministic direct responses to my actions so that I should almost never need help figuring out what it has done.

That's only likely to be true for UI-limited tasks. I expect my computer to be doing many things at once. Some of those tasks don't and can't happen instantly because they require substantial processing or data-retrieval time. Many of them are background tasks that run without requiring my explicit instructions every time. I want feedback about what's happening with those tasks, and some kind of unobtrusive desktop effect can be a better way of providing it than yet another message window popping up.

And that's just for desktop notification type effects. There are other useful things you can do with graphics. For example, I find that I wind up with overlapping windows fairly regularly, even though I have a very large monitor with multiple virtual desktops. I like some of the eye-candy effects that are used to help with that problem: translucent window borders, temporarily reconfiguring the desktop so I can see miniature versions of all windows, etc. Those kinds of effects may not be vital, but they make the system easier to use, which should be the single biggest goal of development.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 12:28 UTC (Fri) by pboddie (guest, #50784) [Link] (1 responses)

Having a tool be capable of what it's intended to do is important, but would you really want to use an oily screw driver with no grip? Or a hammer where the metal on the handle is flaking?

What a flawed analogy! It would be more appropriate to liken a plain, functional desktop to a working screwdriver and a fancy, animation-heavy desktop to a screwdriver with a diamond-encrusted gold handle with a fluffy comfort grip. The former is sufficient to get the job done whereas the latter looks great if you want to give a demo.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 21:58 UTC (Fri) by Kit (guest, #55925) [Link]

>What a flawed analogy! It would be more appropriate to liken a plain,
>functional desktop to a working screwdriver and a fancy, animation-heavy
>desktop to a screwdriver with a diamond-encrusted gold handle with a fluffy
>comfort grip. The former is sufficient to get the job done whereas the
>latter looks great if you want to give a demo.

Your version only further illustrates the point I was making. _BADLY_ done transitions and animations are FAR easier to notice than _well_ done ones, because _well_ done ones you only really notice subconsciously, while badly done ones demand your attention.

Well done transitions must be VERY quick, and completely smooth. They must be over far faster than a person could actually react to them, because they're only supposed to provide a hint at what's going on. People that are used to a system operating a specific way might not like it, because people fear change... but to a user that has to learn both systems, the ones where transitions and animations are used well will be far easier to learn. And then, after using the system for a while, once they're used to how it operates, the one with the transitions will _continue_ to be the more pleasant one to use.

Animations and transitions can transform an interface from feeling like a computer to feeling like an actual physical thing, obeying normal physical behavior.

---

There are other non-animation/transition effects that the system can use to improve usability, such as applying an effect to the windows of an application that appears to be frozen, or very subtly dimming background windows (but it needs to be subtle enough that you wouldn't notice, likely even if someone told you it was doing it). Humans notice far more than what they're consciously aware of; interfaces should take advantage of that.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 8:08 UTC (Fri) by laf0rge (subscriber, #6469) [Link] (4 responses)

I could not agree more. I have long given up on using KDE, Gnome or any other 'desktop environment'. A simple window manager like ion3 or its derivatives, a large number of uxterms and one window that runs firefox seem more than sufficient for me.

And as for making 'the linux desktop' attractive to end-users in an office: I could not care less, personally. I am interested in making technology work for those people who have an interest in technology and want to understand it. People who use it so much that adapting the human being to the computer yields much more productivity than trying to adapt the computer to human beings.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 8:40 UTC (Fri) by pheldens (guest, #19366) [Link] (3 responses)

Gnome and KDE have proven to be unreliable, and to change way too often (now again with gnome-shell). ion3 with a load of console apps, plus graphical apps where they make sense, is the best DE for me. Last night I noticed there's an ion3 fork starting up at http://notion.sourceforge.net/

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:11 UTC (Fri) by Karellen (subscriber, #67644) [Link] (2 responses)

<blockquote>Gnome and kde have proven to be unreliable, and changing way too often</blockquote>

WTF?

Since the Gnome 1.0 release in March '99, there has been *one* incompatible change, when 2.0 was released, in June 2002. Since then, no incompatible changes in over 8 years. Gnome 3 is (currently) scheduled for March 2011, and there's no reason to think that it will last for any less time than Gnome 2 did.

Similarly, KDE 1.0 was in July '98, 2.0 in October 2000, 3.0 in April 2002, and 4.0 in Jan 2008. While 1 and 2 were fairly short-lived, 3 was a lot more mature, lasting for 6 years; and with a lot of the technologies in 4 still being built upon in newer minor releases at nearly 3 years in, I predict that 4 will be a longer-lasting base than 3 was.

And, of course, apps written for Gnome 1, and KDE 1, 2 & 3 should all still run fine on any current and future Gnome/KDE desktops. You don't have to rewrite your KDE3 app to KDE4 technology if you don't want to. You can keep developing it against the old libs for as long as you want.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 12:51 UTC (Fri) by BeS (guest, #43108) [Link]

>Gnome 3 is (currently) scheduled for March 2011, and there's no reason to think that it will last for any less time than Gnome 2 did.

Reading this blog post about the plans for Gtk4, I have the feeling that Gtk3 and GNOME3 will have a rather short life:

http://blogs.gnome.org/desrt/2010/11/02/gtk-hackfest-summary

Shuttleworth: Unity on Wayland

Posted Nov 8, 2010 18:19 UTC (Mon) by jond (subscriber, #37669) [Link]

Gtk1 is for all intents and purposes dead and buried, running gtk1 apps is not really possible anymore.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:31 UTC (Fri) by nhippi (subscriber, #34640) [Link]

Yeah, give us back those snappy and simple Xaw and XView applications! Indexed color modes too, who needs all these shades?

Back to seriousness: is there really much point in investing heavily in desktop window management? Most users end up switching mostly between browser tabs. Just look at how many people skip Evolution or Thunderbird and just go with Gmail nowadays. Developers and HC users will still have some xterms open too, but they will probably use something tiling to manage them (like now).

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 19:24 UTC (Fri) by Simetrical (guest, #53439) [Link] (15 responses)

Why do you imply GPU acceleration is only useful for "dancing candy cram-ware"? It's useful for mundane things like video playback, not to mention games. Plus for saving power, or so I've been told. Even browsers are all becoming GPU-accelerated these days. GPUs make things faster, it's bad interface design that makes things slower.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 20:42 UTC (Fri) by gmaxwell (guest, #30048) [Link] (14 responses)

> Why do you imply GPU acceleration is only useful for "dancing candy cram-ware"? It's useful for mundane things like video playback, not to mention games. Plus for saving power, or so I've been told. Even browsers are all becoming GPU-accelerated these days. GPUs make things faster, it's bad interface design that makes things slower.

(1) We have GPU acceleration already. Perhaps it could be made to work better. I'm all for that. I protest the idea that we must toss the very useful network transparency just to get the small incremental improvement that might come in cases where network transparency isn't being used.

(2) While I'll concede the power-savings bit, I don't actually believe that we need GPU acceleration. A fairly typical PC can compute and blit out 1080p video at 60FPS without any "gpu acceleration" at all. The system is already far beyond my reaction time, at least until someone adds a bunch of fancy animations.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 16:17 UTC (Sat) by andreashappe (subscriber, #4810) [Link]

> A fairly typical PC can compute and blit out 1080P video at 60FPS without any "gpu acceleration" at all. The system is already far beyond my reaction time— at least until someone adds a bunch of fancy animations.

Yeah, but the CPU utilization drop from 40% to 4-5% was kinda nice. Do you buy CPUs just to have them spammed by tasks they are not suited to?

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 16:26 UTC (Sat) by Darkmere (subscriber, #53695) [Link] (6 responses)

I'm afraid I cannot agree with you about that. The local systems I have ( Core 2 Duo as well as some Atom based systems) simply _cannot_ do even 720p decode and playback without glitching either with dropped frames or audio latencies.

This may be because the current Linux-based software stack makes it impossible to deal with this properly, but that doesn't really make a difference to the end user.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 19:16 UTC (Sat) by gmaxwell (guest, #30048) [Link] (2 responses)

Then you must have a broken driver or the like. A Core 2 Duo at 1.6GHz does 1080p playback fine for me. Atom is indeed problematic, but as a desktop it's something of a throwback system: it is basically a mobile-device processor, so you shouldn't be surprised that it has problems driving desktop-sized screens. I was making something of an extreme example: RGB 1080p60 is about 3 Gbit/s of data. It's not a light load, and video decompression is not a minor task. If systems can do this then they ought to be able to run a desktop environment! :)
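The 3 Gbit/s figure is easy to check as a back-of-envelope computation (assuming uncompressed 24-bit RGB and no chroma subsampling):

```python
# Uncompressed 1080p60 in 24-bit RGB:
# pixels per frame * 3 bytes per pixel * 60 frames/s * 8 bits/byte.
bits_per_second = 1920 * 1080 * 3 * 60 * 8
print(round(bits_per_second / 1e9, 2))  # ~2.99 Gbit/s
```

Real video arrives compressed, of course; this is the rate at which decoded pixels must be pushed to the screen.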

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 19:43 UTC (Sat) by Darkmere (subscriber, #53695) [Link]

Pulseaudio + intel HDA is probably the main cause here, really.

Shuttleworth: Unity on Wayland

Posted Nov 9, 2010 15:36 UTC (Tue) by nye (subscriber, #51576) [Link]

FWIW, my Atom D510 can happily play 1080p x264 to a 1920x1200 display without issues. I imagine it is accelerated to some extent, though, so I'm not sure how relevant that really is. (On the other hand, enabling desktop effects in KDE is sluggish and unpleasant, so perhaps it's not accelerated *very much* :D.)

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 19:58 UTC (Sat) by sfeam (subscriber, #2841) [Link]

+1 on above comments. I have seen no problem with 1080p video on Core 2 Duo systems. Heck, my netbook (Core 2 solo) manages 720p just fine. I have seen some audio problems, but nothing that wasn't fixed by cutting PulseAudio out of the pipeline.

Shuttleworth: Unity on Wayland

Posted Nov 8, 2010 8:26 UTC (Mon) by buchanmilne (guest, #42315) [Link] (1 responses)

What GPU does your Atom system have?

While my Atom-based netbook can handle some 720p content, it struggles with others. However, my Atom-based HTPC, which has an ION GPU, running XBMC on Linux, plays almost all 1080p H.264/5.1 content (with sound going through pulseaudio) without going over 20% CPU utilisation. But, the Nvidia ION chipset has VDPAU with H.264 decoding support, whereas the cheap intel chipset on my Atom-based netbook doesn't.

I don't think the problem is the rest of the software stack, it's probably that your GPU doesn't have accelerated decoding, or the driver doesn't support it.

Shuttleworth: Unity on Wayland

Posted Nov 8, 2010 12:47 UTC (Mon) by Darkmere (subscriber, #53695) [Link]

Intel chipset for the graphics card there, but what I was commenting on was the statement that the CPUs are strong enough and we do not _need_ GPU acceleration, which I find quite wrong.

Simply the amount of data to shuffle is enough to choke certain machines, and will remain so for quite a while.

(Especially with "slow" graphics memory/shared memory setups which remain common on laptops of the cheaper inclination)

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 17:55 UTC (Sun) by Simetrical (guest, #53439) [Link] (5 responses)

"We have GPU acceleration already. Perhaps it could be made to work better. I'm all for that. I protest the idea that we must toss the very useful network transparency just to get the small incremental improvement that might come in cases where network transparency isn't being used."

Well, we aren't, are we? You can still run X on top of Wayland, and Wayland will presumably support other types of networked desktops. Every OS does, after all. Why do you think Wayland will wind up being less nice to use over the network than X in the end? I've found even NX to be almost unusably slow at 50 ms latency, between uptown and downtown Manhattan. Regular old X forwarding didn't even work at that latency, practically speaking (taking minutes to even draw the window).

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 18:57 UTC (Sun) by dlang (guest, #313) [Link] (4 responses)

If you could run Wayland apps on top of X you would not be tossing the ability to do things over the network, but the problem with the current 'plan' is that it encourages people to write all new apps for Wayland instead of for X, and anything written for Wayland would not be able to run across the network.

Different people have different tolerance for the effects of latency. While you consider 50ms unusable, other people have been reasonably happy with X over dialup (~300ms latency).

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 23:23 UTC (Sun) by Simetrical (guest, #53439) [Link] (3 responses)

Wayland can bring its own way of running things over the network. There's no reason you can't have network transparency that's oriented around compressed bitmaps. If the application uses lots of bitmaps anyway, you won't use much more bandwidth, and if it's mainly text, then the graphics will compress very well if you choose the right algorithm. This is the way things like VNC work today. What's wrong with it?
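As an illustrative sketch (not actual Wayland or VNC code) of why bitmap-push remoting can be cheap for text-heavy screens: ship only the delta between frames, compressed. Unchanged pixels become long runs of zeros that compress almost entirely away.

```python
import zlib

WIDTH, HEIGHT = 640, 480

def make_frame(fill):
    """A fake 8-bit grayscale framebuffer as a flat byte string."""
    return bytes([fill]) * (WIDTH * HEIGHT)

def frame_delta(old, new):
    """XOR the frames; unchanged pixels become zero bytes."""
    return bytes(a ^ b for a, b in zip(old, new))

old = make_frame(0xFF)            # a white "page of text"
new = bytearray(old)
new[:2000] = bytes(2000)          # simulate a small text update

payload = zlib.compress(frame_delta(old, bytes(new)))
print(len(old), len(payload))     # 307200 bytes raw; the delta compresses far smaller
```

A real remoting protocol would also track dirty rectangles instead of scanning whole frames, but the bandwidth argument is the same.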

The lag I saw with X forwarding is not a question of individual tolerance. When I tried regular X forwarding with Chromium, it took minutes just to draw the window once at startup. It was not usable as an interactive application by any stretch.

With NX, it was usable, but with lag of a couple of seconds on everything I did. This should not be necessary -- it should take exactly one round-trip for my mouse click to get to the other computer and all changes to get back. NX was taking dozens of times that. We live in an era of high bandwidth and high latency; the X way of doing things no longer makes sense. Pushing around bitmaps is a much better strategy, and will become ever more so with time, as network connections get faster and latency remains constant.

Unless I'm missing something, which is entirely possible, since I have only the vaguest idea of how anything related to graphics works. In that case, corrections appreciated. :)

Shuttleworth: Unity on Wayland

Posted Nov 8, 2010 2:49 UTC (Mon) by dlang (guest, #313) [Link] (2 responses)

Yes, Wayland could bring its own way of running things over the network, but as near as I can tell they don't intend to. Their attitude is that nobody needs that capability (or, if they do, all they need is to run VNC to remote the entire desktop as bitmap images and deltas).

Shuttleworth: Unity on Wayland

Posted Nov 11, 2010 18:03 UTC (Thu) by Quazatron (guest, #4368) [Link] (1 responses)

VNC is, in my experience, much better than X over the same link.

Shuttleworth: Unity on Wayland

Posted Nov 11, 2010 18:41 UTC (Thu) by gmaxwell (guest, #30048) [Link]

Depends on the link. Over an ethernet LAN, 802.11a/n, or even the internet between my work and home (10ms RTT, >10mbit/sec file transfers), X11 is a big win vs VNC. I sometimes confuse myself a bit by starting something up on a remote system and only notice when I go to save and don't see my local file systems. That's not a mistake anyone would make with VNC.

Over slower links VNC will stay usable (if slow) while X becomes useless.

There are various X-protocol-compressing proxies available for these situations, but I haven't had cause to use them for years. Networks got faster.

Shuttleworth: Unity on Wayland

Posted Nov 8, 2010 8:10 UTC (Mon) by nix (subscriber, #2304) [Link]

Judging from the comments, you are not the only one. I know that without X network transparency, half of what I do at home and 98% of what I do at work would abruptly become impossible or horrifically difficult. We need network transparency, or something that emulates it effectively enough to be invisible. And no, shipping giant bitmaps about is *not* acceptable: we need something like the render extension if rendering windows full of text (always likely to remain a common requirement) is to remain tolerable. However, if that's the only extension beyond giant-bitmap shipping and latency-reduction hacks, it would probably be fine, as that's pretty much what we're doing today.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:06 UTC (Fri) by modernjazz (guest, #4185) [Link] (4 responses)

Something I don't understand is why OpenGL doesn't make it _faster_ to run over the network. I've never sat down and learned OpenGL, so I am probably deeply confused about this. But if I understand correctly, GL is basically a "compressed" description of what you want rendered on the screen, much in the same way that a line drawing can often require less storage than the corresponding bitmap (depending of course on what is considered "acceptable" resolution). Reducing the amount of data needed to render the scene would seem to improve, not hurt, one's ability to run applications over the network.

So, are the concerns that Wayland will break network transparency merely "short-term," meaning that the technology is feasible but simply not in place yet? Or are they "long-term," meaning the OpenGL path is basically not amenable to network transparency?

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 10:07 UTC (Fri) by dgm (subscriber, #49227) [Link] (2 responses)

One word: textures.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 22:05 UTC (Fri) by Kit (guest, #55925) [Link]

> One word: textures.

For a normal desktop environment, that shouldn't be a huge deal. The problem is, desktop apps seem a bit hell-bent on making GPUs' lives as difficult as possible, and will frequently do the least efficient thing possible when it comes to drawing. Toolkits and applications are too insistent on throwing everything away and doing it all over again from scratch, instead of reusing what has _already_ been uploaded to the GPU, when that's what the application wanted to render anyway.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 10:35 UTC (Sat) by modernjazz (guest, #4185) [Link]

But a 2D texture is a bitmap. So VNC or any other "dumb" protocol that sends bitmaps back and forth has to cope with just as much data as sending the texture. And if, as the commenter above suggests, the texture is sent just once and can be re-used, it seems like it should be less expensive in the long run.

3D textures are of course worse, but one could restrict use of them to cases that are strictly necessary so as not to kill network performance.

Shuttleworth: Unity on Wayland

Posted Nov 8, 2010 18:37 UTC (Mon) by daniel (guest, #3181) [Link]

So your proposal is to develop a network transparent version of OpenGL that does not rely on X?

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:59 UTC (Fri) by dgm (subscriber, #49227) [Link]

Nothing prevents having X as a client application of Wayland, the same way you are probably using a text console as a window in X today. In fact, you could be running many X sessions in Wayland, all in parallel and side by side.

What's more, this way you don't force applications where network transparency makes no sense (like video players and games) through the X protocol. The desktop itself is probably a good example of an application that's better run close to the display hardware.

In the end, what you will get is a (much) simpler X server, focused on the network protocol and independent of the hardware. Good stuff, if you ask me.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 20:07 UTC (Fri) by jonas.bonn (subscriber, #47561) [Link]

I don't really see the problem. Most applications are written using a toolkit, thus abstracting out the backend. The toolkits can be made to choose what they render to, so GTK, for example, can be made to render to X if it finds DISPLAY set and to Wayland otherwise. Thus, if you launch a GTK app from an SSH session with X forwarding, it will find DISPLAY set and render, transparently, over the network.
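A minimal sketch of the selection logic described above. The backend classes here are invented stand-ins for illustration, not real GTK API; the only real ingredient is the DISPLAY environment variable that `ssh -X` sets.

```python
import os

class X11Backend:
    name = "x11"          # render through the X protocol (networkable)

class WaylandBackend:
    name = "wayland"      # render directly to the local compositor

def pick_backend(environ=None):
    """Use X if DISPLAY is set (e.g. under 'ssh -X'), else Wayland."""
    env = os.environ if environ is None else environ
    return X11Backend() if env.get("DISPLAY") else WaylandBackend()

print(pick_backend({"DISPLAY": "remote:10.0"}).name)  # x11
print(pick_backend({}).name)                          # wayland
```

The appeal of doing this in the toolkit is that applications never see the choice: the same drawing calls end up on whichever backend was selected at startup.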

Network displays

Posted Nov 5, 2010 2:23 UTC (Fri) by ringerc (subscriber, #3071) [Link]

The loss of network support at the display protocol level could be a real loss - or the push required to finally get something better in place.

I've been using Ubuntu at work to serve remote X thin clients over LTSP, and overall it works fairly well. I'm not overly attached to X, though; the network "transparency" model falls down on the fact that network round trips add latency, so you have to test your toolkits/apps with network X to find problems. Additionally, shared memory is not available over the network, so anything that uses xshm needs to be able to fall back to protocol requests. Ditto DRI. In other words, network X is not truly transparent. I've had to hunt down several bugs that only affect apps run over network X, including one particularly bad one in Evolution's tooltip handling that made the compose window take *minutes* to appear.

I just hope Wayland can present something a bit better than a plain frame buffer to clients, so it's possible to implement a smart (RDP or ICA-like) network client not just a dumb frame buffer client like VNC.

The irony

Posted Nov 5, 2010 2:37 UTC (Fri) by martinfick (subscriber, #4455) [Link] (26 responses)

To think that with slow machines and slow networks in the late 1980s, network transparency was acceptable. Strange that with networks and PCs having increased by several orders of magnitude in speed since, suddenly the display is too slow? The math here just doesn't seem right. We can stream video across the internet with YouTube or Skype, but a desktop has to run locally? Really, what is wrong with this picture? I am sure there will be lots of technical excuses to defend this, but it just doesn't seem to add up. Something is seriously wrong with the desktop if it can't keep up with the web. Add in the twist that the desktop is supposedly going to be replaced by webapps and a webdesktop... are webapps fast enough to run remotely over the web, but not over X? Something stinks in graphics land.

The irony

Posted Nov 5, 2010 3:00 UTC (Fri) by bjacob (guest, #58566) [Link] (23 responses)

I'll tell you what the difference is:

* in the network-transparent X desktop, the graphics are (at least partially) done *server-side*. So we're killing ourselves doing roundtrips between the client and the server (so GUI snappiness is hurt by network latency), and we don't scale as we tax the poor server too much.

* in modern web apps, the graphics are done on the *client side* (in the browser, in JS). No round trips, and newer web standards (canvas! WebGL!) allow web apps to do client-side the same graphics operations that a local application could do, with WebGL even giving fairly direct access to the GPU.

The irony

Posted Nov 5, 2010 3:14 UTC (Fri) by martinfick (subscriber, #4455) [Link] (22 responses)

This is a lame excuse; surely X could be improved to do more client-side if that were truly the problem. And the argument falls on its face when you consider that the people pushing for this change are likely the people who do not even use X in a network-transparent way in the first place. That means they are complaining about the performance of a local X server WITHOUT ANY network latency (while using remote X as an excuse)!!! Surely this local model could be enhanced without killing X. Again, as I said in my original post, this does not add up. A web app written in JavaScript making remote web RPC calls is good enough, but local graphics on a local X server is not? What could possibly be required for normal desktop usage that even Gigabit ethernet (and local X) could not handle, when 10Mbit could handle CAD remotely in 1987? I am sure my memory is forgetting how bad it was in 1987, but surely it can't be getting worse with faster technology, so why drop X now, suddenly?

The irony

Posted Nov 5, 2010 3:33 UTC (Fri) by bjacob (guest, #58566) [Link] (9 responses)

My point was that the whole idea of a network transparent protocol like X is now looking obsolete. It is, if you want, trying to do network transparency at a level that now appears like the wrong level (given current hardware, and given how the browser is successful with its different model).

Then, about your point that at least the network transparency of X should come for free when the server is local --- no idea, I'll let others reply here. But what's the point of a network protocol if it's going to be less and less used with remote servers?

The irony

Posted Nov 5, 2010 11:47 UTC (Fri) by sorpigal (guest, #36106) [Link] (8 responses)

I strongly disagree. X does network transparency at the right level, or at least a much better level than any other current system.

The irony

Posted Nov 5, 2010 12:58 UTC (Fri) by ibukanov (subscriber, #3942) [Link] (7 responses)

> X does network transparency at the right level,

Over the network, VNC has worked better for me than X, especially on high-latency links. That tells me that, from a practical point of view, X alone does not provide the right answer.

The irony

Posted Nov 5, 2010 15:23 UTC (Fri) by drag (guest, #31333) [Link]

* Citrix ICA
* Microsoft RDP
* Redhat Spice

VNC is not the only game in town, of course. X Windows networking is, indeed, very very cool. But it's been a very long time since it had any sort of monopoly over remote applications.

Windows users have been enjoying Windows-apps-over-internet for many many years now.

Does anybody have a good estimate of how many people use 'GoToMyPC'? It's a huge number, they all do it over the internet, and it works far better, and far more easily, than X Windows does.

The irony

Posted Nov 5, 2010 15:54 UTC (Fri) by deepfire (guest, #26138) [Link] (5 responses)

This is all based on the fact that the X implementation (the Xorg stack) we use daily isn't particularly efficient.

As I've already said below, if you want an apples to apples comparison see Nomachine's NX. As I said, I use it daily, and my experience is extremely positive.

And yes, it's open source.

The irony

Posted Nov 6, 2010 22:17 UTC (Sat) by ceswiedler (guest, #24638) [Link] (4 responses)

Slight correction: there's an open, but somewhat old, version of NX. Recent versions maintained by Nomachine are free-as-in-beer for noncommercial use.

NX is excellent and I highly recommend it for remote X access, even on a local network since it provides session restoration and "just works". From what I understand, it compresses extremely well due to the nature of the X protocol, since it can see when things actually need to be sent to the client. A VNC or RDP server by comparison only has the final rendered product.

The irony

Posted Nov 7, 2010 11:24 UTC (Sun) by deepfire (guest, #26138) [Link] (3 responses)

No, you are wrong, see http://www.nomachine.com/sources.php

The sources for the core transport libraries are all there.

The missing stuff is the end-user application code, which they make money from.

The irony

Posted Nov 7, 2010 20:41 UTC (Sun) by dtlin (subscriber, #36537) [Link] (2 responses)

http://www.nomachine.com/redesigned-core.php

The new core of NX 4.0 is made up of a set of libraries written from the ground up to ensure portability, flexibility and state-of-the art performance. NX 4.0 core libraries will not be made open source. Although NX 3.x core compression technology and earlier versions will remain GPL, NoMachine engineers will not be developing the code further.

The irony

Posted Nov 7, 2010 21:27 UTC (Sun) by rahulsundaram (subscriber, #21946) [Link] (1 responses)

The irony

Posted Nov 8, 2010 17:12 UTC (Mon) by dtlin (subscriber, #36537) [Link]

neatx is a wrapper for the 3.x NX core libraries, much like NoMachine's nxserver.

It does not support the NX 4.0 protocol, and never will, because there's nobody working on it anymore and the libraries are not open.

The irony

Posted Nov 5, 2010 3:42 UTC (Fri) by davide.del.vento (guest, #59196) [Link] (8 responses)

I partially agree. Probably the people trying to "kill" X don't use network transparency, and thus don't care about it the way you and I do.
Nevertheless, running remote X apps over the network can be painful. I speak from experience: one day we had a dozen laptops, connecting wirelessly through a single router, running interactive X sessions on a system in the basement. It sucked.
I'm sure we could improve the X protocol and make it much faster, but I am not sure we can improve it enough (e.g. if a user clicks a button, that info must go to the server to decide what that button does, right? And latency is there to kill us...)
Now, completely killing network transparency doesn't solve the problem either, or does it? Besides, installing some exotic "high performance" stuff on our "basement machine" would be almost impossible... Or is Wayland transparent on the "basement machine", requiring only stuff on the laptops?

The irony

Posted Nov 5, 2010 4:17 UTC (Fri) by dlang (guest, #313) [Link] (2 responses)

The time it takes to make the network hop to the server over a local gig-E network is so small that it just doesn't matter. The message will travel faster than your system will time-slice with a jiffies setting of 100Hz.

Over high-latency links X performs poorly because it serializes everything, so you get a huge number of round trips. But since these are very standard messages that have the same answer for all applications, most of this data can be cached and replied to locally, eliminating the network latency. There's still the message-passing and parsing latency, and most of these messages could be combined to save that, while still keeping the network transparency in place.
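The caching idea above can be sketched as a local proxy that memoizes replies to idempotent requests, so only the first occurrence pays a round trip. The request names are invented for illustration; this is not the real X11 wire protocol.

```python
class CachingProxy:
    """Sits between an application and a remote display server,
    answering repeated read-only queries from a local cache."""

    def __init__(self, remote_lookup):
        self._remote = remote_lookup   # callable that costs a round trip
        self._cache = {}
        self.round_trips = 0

    def query(self, request):
        if request not in self._cache:
            self.round_trips += 1      # only a cache miss hits the network
            self._cache[request] = self._remote(request)
        return self._cache[request]

# Toy "server": pretend each lookup is a slow network round trip.
proxy = CachingProxy(lambda req: f"reply-to-{req}")
for _ in range(100):
    proxy.query("InternAtom WM_PROTOCOLS")
print(proxy.round_trips)   # 1 -- the other 99 round trips were eliminated
```

This is essentially what NX's protocol-level caching does, at much greater sophistication: correctness depends on only caching requests whose answers cannot change under you.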

The irony

Posted Nov 5, 2010 13:30 UTC (Fri) by rgoates (guest, #3280) [Link] (1 responses)

The last time I checked (which was several years ago), an interactive X session sends a network packet for each and every keystroke. A horrible waste of bandwidth, but I'm not sure how you could improve that without hurting interactivity (or significantly changing the client-server model X is built around).

The irony

Posted Nov 6, 2010 3:42 UTC (Sat) by mfedyk (guest, #55303) [Link]

just like..... ssh.

please explain why nx hasn't become the wire protocol for x...

The irony

Posted Nov 5, 2010 7:32 UTC (Fri) by kevinm (guest, #69913) [Link]

> if a user click a button, that info must go to the server to decide what that button does, right? and latency is there to kill us...

Not necessarily. We can solve the problem in the same way that web applications do: provide a lightweight VM on the UI side, and allow the application to push small chunks of bytecode down to the UI to tell it how to respond to things like button-pushes.
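A toy sketch of that idea: the application pushes a small handler program to the display side once, and subsequent button clicks are handled locally with no round trip. The mini-instruction set here is invented purely for illustration.

```python
class UISideVM:
    """A lightweight VM running on the display side of the link."""

    def __init__(self):
        self.handlers = {}
        self.state = {"counter": 0}

    def install(self, event, program):
        """Receive a handler program from the application (one round trip, once)."""
        self.handlers[event] = program

    def dispatch(self, event):
        """Run the installed handler locally; no network traffic needed."""
        for op, arg in self.handlers.get(event, []):
            if op == "incr":
                self.state[arg] += 1
            elif op == "set":
                self.state.update(arg)

vm = UISideVM()
vm.install("button.click", [("incr", "counter")])   # pushed once by the app
for _ in range(3):
    vm.dispatch("button.click")                     # handled entirely locally
print(vm.state["counter"])   # 3
```

This is the same trade-off web apps make with JavaScript: move the latency-sensitive logic to where the user is, and sync state back asynchronously.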

The irony

Posted Nov 5, 2010 16:03 UTC (Fri) by deepfire (guest, #26138) [Link] (3 responses)

Please, forgive me for the third reference to NX in this thread, but I'm really, really surprised to see people discuss X issues and not factoring it in.

So, there are these Nomachine people from Italy, and they seem to have done some pretty good open-source work on optimising the hell out of the X protocol implementation.

At least, I use NX daily; my experience is very positive, and by now I consider it indispensable.

NX and others

Posted Nov 5, 2010 19:10 UTC (Fri) by boog (subscriber, #30882) [Link] (2 responses)

Has anybody gotten around to trying xpra?

http://code.google.com/p/partiwm/wiki/xpra

NX and others

Posted Nov 10, 2010 9:56 UTC (Wed) by nix (subscriber, #2304) [Link]

Yes, but its keyboard handling is bad enough that I use it only to host xemacses that I'm then going to connect to with gnuclient.

NX and others

Posted Nov 11, 2010 5:10 UTC (Thu) by njs (subscriber, #40338) [Link]

I use it all the time. As nix says, keyboard handling is its weakest point (esp. with non-US layouts), and I've seen some poorly coded apps flip out when the "virtualized" WM doesn't respond as quickly as they're assuming it will, but mostly it works well for me.

Disclaimer: I wrote it, so any show-stopper bugs affecting me *would* be fixed now, wouldn't they ;-).

The irony

Posted Nov 5, 2010 4:41 UTC (Fri) by jwb (guest, #15467) [Link] (2 responses)

A lot of people have tried to make all the drawing happen in the server, by having the client basically upload the view program into the server and having the server run it. One recent one was "Berlin", later renamed "Fresco", which of course failed, but it was a nice idea. Their real problem was that they relied on a graphics middleware called GGI, which was independent of the output target and therefore totally useless. GGI is completely forgotten at this stage, I think.

Another famous one of course was Display PostScript.

The irony

Posted Nov 5, 2010 13:14 UTC (Fri) by vonbrand (subscriber, #4458) [Link]

Another famous one of course was Display PostScript.

Oh, you mean that junk that came with Suns, and made me compile plain X for them on arrival because anything using the display was unbearably slow?

The irony

Posted Nov 12, 2010 2:54 UTC (Fri) by jmorris42 (guest, #2203) [Link]

> Another famous one of course was Display PostScript.

And of course it lives. It started at NeXT, evolved into Display PDF, and is still around in the end product of NeXTSTEP, now known as Apple's OS X. If they can separate the display rendering from the application, I really don't understand why this argument is even taking place. If it is possible, the argument should be focused on identifying the limitations in X that prevent a similarly performing system, and on how they might be corrected.

The irony

Posted Nov 5, 2010 13:16 UTC (Fri) by alankila (guest, #47141) [Link] (1 responses)

The reason streaming video from YouTube works is that it's a nice compressed noninteractive bytestream. The fact that it gets displayed on screen and is actually a video is pretty much irrelevant from the server's viewpoint; the server just sends data.

The JavaScript RPC model is actually pretty interesting. The web is evolving to allow the programmer the freedom to select a suitable boundary between client and server. Sometimes you just want dumb data frames streamed from the server (similar to VNC in a browser; it's doable); sometimes you run almost the entire application on the client and only send data back to the server so it can save work that really occurred almost completely on the client. In fact, in the most extreme design the server just feeds the original UI files, and afterwards no more interaction with the server occurs at all.

Your comment also seems to be missing the point that Linux web apps still do run over X. The browser is an X client. It worries me that you seem to jumble everything together here.

The irony

Posted Nov 5, 2010 15:53 UTC (Fri) by lacostej (guest, #2760) [Link]

> it's a nice compressed noninteractive bytestream

and buffered on the client side !

Legacy X and network transparency

Posted Nov 5, 2010 5:52 UTC (Fri) by PO8 (guest, #41661) [Link] (20 responses)

Folks should keep in mind that even with Wayland world domination in place, X won't be going anywhere anytime soon. The plan is to provide a legacy X server hosted by Wayland. This will enable both remote and local existing X clients to work just as they always did on the local display. The only thing that won't work over the wire is new Wayland-only apps. This will be sad, but given the nature of graphics use by these apps, their local-ness is pretty much inevitable anyhow.

Remoting a window manager, in particular, hasn't been a terribly useful option for a long time.

Legacy X and network transparency

Posted Nov 5, 2010 7:13 UTC (Fri) by jzbiciak (guest, #5246) [Link] (16 responses)

Reading the 20,000ft version of how Wayland works, it seems like it ought to be possible (eventually) to run a Wayland app over the network also. The main difference is that it looks like you'd use a simple bitmap-oriented VNC-like protocol between the Wayland client and the compositor rather than the complex beast that X is. (I don't know if anyone has proposed this--it just looks to me like the obvious way to do it.)

I'm actually quite OK with this, since it plays well into the advances we've made (fast CPUs == fast compression, plus ever-increasing bandwidth), and does a better job of tolerating the one major bit that hasn't advanced much: round-trip latency.

I have an X application I sometimes need to run remotely over a VPN link over VDSL. I have gobs of bandwidth, but the RTT sucks. The app is barely usable. In contrast, Windows Remote Desktop and VNC both work just fine over the same link. Both of the latter seem to be more of the "dumb bitmap plus compression" school of thought, and that seems to work pretty well with modern setups.

Legacy X and network transparency

Posted Nov 5, 2010 7:34 UTC (Fri) by dlang (guest, #313) [Link] (7 responses)

And this approach throws away the huge advantage that the Wayland folks (or at least their advocates in this thread) are claiming: the ability to take advantage of powerful local graphics capability.

Legacy X and network transparency

Posted Nov 5, 2010 12:50 UTC (Fri) by i3839 (guest, #31386) [Link] (5 responses)

That is not necessarily true. If you chop the window up into a bunch of tiny textures, you can cache it all on the graphics card and only update the bits that changed. So it should still be plenty fast, because there's only one round trip.
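A minimal sketch of that tile-caching idea (the protocol and names are hypothetical, purely for illustration): the window is chopped into small tiles, each tile's content is hashed, and only tiles whose hash changed since the last frame are resent.

```python
# Minimal sketch of tile-based damage tracking (hypothetical protocol):
# the window is chopped into small tiles; only tiles whose content changed
# since the last frame are (re)sent to the display side, in one batch.
import hashlib

TILE = 4  # tile edge in pixels (tiny, to keep the example readable)

def tiles(frame):
    # frame: list of rows, each row a list of pixel values
    h, w = len(frame), len(frame[0])
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            block = tuple(tuple(row[x:x + TILE]) for row in frame[y:y + TILE])
            yield (x, y), block

def damage(prev_hashes, frame):
    """Return the changed tiles plus the updated hash cache."""
    updates, hashes = [], {}
    for pos, block in tiles(frame):
        digest = hashlib.sha1(repr(block).encode()).hexdigest()
        hashes[pos] = digest
        if prev_hashes.get(pos) != digest:
            updates.append((pos, block))   # only these go over the wire
    return updates, hashes

frame1 = [[0] * 8 for _ in range(8)]
frame2 = [row[:] for row in frame1]
frame2[1][1] = 255                      # a single pixel changes...

_, cache = damage({}, frame1)           # first frame: everything is sent
updates, cache = damage(cache, frame2)
print(len(updates))                     # ...so only 1 of the 4 tiles is resent
```

A real remoting layer would keep the unchanged tiles as textures on the remote GPU and compress the updated ones, but the bandwidth win comes from exactly this diff.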

The main advantage of Wayland is that it simplifies the whole graphics stack enormously. It uses DRI2/KMS, just like X does, so it doesn't give extra possibilities or a magic performance increase.

Legacy X and network transparency

Posted Nov 5, 2010 15:19 UTC (Fri) by drag (guest, #31333) [Link] (4 responses)

YES THIS.

Wayland is much simpler because it depends on a modern graphics stack. It'll be faster than X, though, simply because it has much less overhead and a cleaner implementation. It won't be magical, of course; only modest improvements. Probably better in terms of battery life, too....

There is also no reason why you need to give up the X Window System to use Wayland. I use X just fine on Microsoft Windows, and lots of people use X just fine on OS X. Given that Wayland is a naturally composited interface, a Wayland-specific X server that draws to off-screen buffers will allow natural integration and backwards compatibility with current applications.

Not that there is a Wayland DDX yet, like the MS Windows and XQuartz DDXes, but it's certainly going to be a requirement. It's one of those things that will have to be made before Wayland is usable.

Applications that use Wayland will immediately benefit from being 'native Wayland', but X apps won't get left out in the cold.

Legacy X and network transparency

Posted Nov 6, 2010 4:22 UTC (Sat) by rqosa (subscriber, #24136) [Link] (3 responses)

> Applications that use Wayland will immediately be able to benefit from being 'native wayland'

These native Wayland apps use DRI2 to draw to offscreen buffers, right? So, isn't it true that there's no reason why X clients couldn't also be made to use DRI2 to draw to offscreen buffers?

And in that case, there's no significant speed penalty to using X (because the X clients then are doing direct rendering without needing to go through the X server (as in AIGLX)), right?

And if Wayland doesn't have a speed advantage over X, then what is its advantage?

Legacy X and network transparency

Posted Nov 6, 2010 6:04 UTC (Sat) by PO8 (guest, #41661) [Link] (2 responses)

The big advantage of Wayland is simplicity. Because it is so much simpler to implement than X, our tiny pool of X developers is better leveraged. Because modern applications typically just want to emit OpenGL at the end of the day (albeit maybe by some client-side library such as Cairo) and most modern hardware directly supports OpenGL, having X "get in the way" just ticks app developers off by making their job harder. At some point, X starts looking like a huge bag on the side of app interactions with the display, hence Wayland.

Legacy X and network transparency

Posted Nov 8, 2010 16:14 UTC (Mon) by renox (guest, #23785) [Link] (1 responses)

Given that you apparently still need to keep an X server for legacy applications AND the ability for applications/toolkits to speak X when they want network transparency, adding Wayland will add code, not remove it.
So this 'simplicity' isn't very convincing: yes, Wayland itself is simple, but as it's not a complete solution, the result won't be simple!

Legacy X and network transparency

Posted Nov 9, 2010 0:19 UTC (Tue) by alankila (guest, #47141) [Link]

Well, hopefully a method to move a single application's window over the network can be found, somehow. If Wayland is to win, it absolutely has to replace X, and that includes some kind of support for this feature. So you can be pretty sure that use of X will be seen as a bug once we actually have a Wayland-managed display system going.

Legacy X and network transparency

Posted Nov 5, 2010 17:50 UTC (Fri) by jzbiciak (guest, #5246) [Link]

If I'm not mistaken, it's still possible to use the video card for acceleration, even if what you generated didn't go to the display. Otherwise, it seems like rendering in the client and sending to the compositor wouldn't work.

You just lose the shared-memory efficiency when you hand over the results, since the client's video card and the compositor's video card are in two entirely different boxes.

Legacy X and network transparency

Posted Nov 5, 2010 9:07 UTC (Fri) by marcH (subscriber, #57642) [Link] (3 responses)

> ... and does a better job of tolerating the one major bit that hasn't advanced much: round-trip latency.

Err... are you seriously expecting network latency to become better? You know that it depends on the speed of light, right?

I know it could be better than Ethernet *on the LAN* but I doubt that X protocols are soooo chatty they would feel any difference.

> I have an X application I sometimes need to run remotely over a VPN link over VDSL.

DSL offers notoriously bad round-trip times (20-30ms) because of the massive amount of Forward Error Correction. You should either look for an ISP that allows you to tune your FEC (as explained here: http://www.dslreports.com/faq/2182), or for an entirely different and better access technology like DOCSIS. Maybe even 3G has better latency than DSL. Does anyone know?

Legacy X and network transparency

Posted Nov 5, 2010 15:22 UTC (Fri) by centenary (guest, #71028) [Link] (2 responses)

> Err... are you seriously expecting network latency to become better? You know that it depends on the speed of light, right?

I think you're misreading his point. His point *is* the fact that network latencies won't improve (which you're also saying here). Since network latencies won't improve, bitmap-oriented protocols have an advantage since X-forwarding performs poorly under network latency.

Legacy X and network transparency

Posted Nov 5, 2010 17:58 UTC (Fri) by jzbiciak (guest, #5246) [Link] (1 responses)

Yeah, that was pretty much my point: bandwidth has made, and will likely continue to make, fairly large leaps, while latency will make incremental gains at best, or occasionally go backwards depending on the technology you use.

I didn't call out physics as the cause. I figured the speed of light should be pretty obvious in this crowd. :-) As for using DSL vs. something else: At least I'm not using satellite. *shudder*

In the end it wasn't a technical decision on my part anyway: The state of broadband being what it is around here, my shopping experience for a provider amounted to telling the sales person "I run servers", and seeing what happened. The cable guys told me "have a nice day," whereas the DSL guys asked "with or without static IP?" It may've changed since then, but does it really matter? I'm now spiraling way off topic.

Legacy X and network transparency

Posted Nov 7, 2010 15:52 UTC (Sun) by marcH (subscriber, #57642) [Link]

> I'm now spiraling way off topic.

With your shopping experience, yes, maybe... on the other hand, the latencies of broadband technologies are quite relevant.

Legacy X and network transparency

Posted Nov 5, 2010 15:48 UTC (Fri) by deepfire (guest, #26138) [Link] (3 responses)

> I have an X application I sometimes need to run remotely over a VPN link
> over VDSL. I have gobs of bandwidth, but the RTT sucks. The app is barely
> usable. In contrast, Windows Remote Desktop and VNC both work just fine
> over the same link. Both of the latter seem to be more of the "dumb bitmap
> plus compression" school of thought, and that seems to work pretty well
> with modern setups.

Well, this is somewhat of a strawman. The fact that X is not implemented particularly efficiently does not mean that X is not fundamentally better than dumb protocols.

If you want to compare apples to apples, see NX, the heavily optimised implementation of the X protocol. I use it daily, and FWIW it beats the crap out of the competition.

Legacy X and network transparency

Posted Nov 6, 2010 6:20 UTC (Sat) by butlerm (subscriber, #13312) [Link] (2 responses)

NX is orders of magnitude better than X over a WAN, but in my experience it still pales in comparison to RDP.

That said, I sure hope someone is looking at a mid-layer API that can be adapted to virtually any combination of user interface toolkit and display communication protocol without having to reduce everything to a bitmap first.

Why should nearly any kind of application be programmed to an API that is designed to be non-remotable? That is the highway to balkanization. What we need is a generic mid-layer API that can reasonably support both scenarios without the historical infelicities of X, so you do not have to re-port your entire application just because you want to run it remotely on occasion, preferably without feeling you are watching satellite tv during a snowstorm.

Legacy X and network transparency

Posted Nov 6, 2010 9:37 UTC (Sat) by quotemstr (subscriber, #45331) [Link]

Sorry. I'm a heavy user of both RDP and NX, and I definitely prefer the latter. Not only is it actually faster, but there's more fine-grained control over authentication, forwarding mechanisms, and so on with X than there is with RDP.

Legacy X and network transparency

Posted Nov 6, 2010 13:09 UTC (Sat) by deepfire (guest, #26138) [Link]

Ditto, my experience with RDP and its kind is what prompted me to switch to NX in the first place.

Legacy X and network transparency

Posted Nov 8, 2010 8:16 UTC (Mon) by nix (subscriber, #2304) [Link] (2 responses)

And because remoting a window manager is a bad idea (odd, I did it last week and it worked fine), we can just drop the whole idea of a window manager! Nobody needs that flexibility, right? Everyone can just use Metacity, perhaps with some extra themes! There are no competitors worth speaking of!

(oh, wait.)

Legacy X and network transparency

Posted Nov 12, 2010 3:07 UTC (Fri) by jmorris42 (guest, #2203) [Link] (1 responses)

> we can just drop the whole idea of a window manager!

Yea, that is just an extra helping of fail to go along with ditching network transparency. So not only are we supposed to be happy tossing "The Network is the Computer, the Computer is the Network" we also lose "Mechanism not policy" along with it.

After both of those are gone, might as well just buy a Mac and be done with it.

Ya know, one of the attractions of Free Software for me was the hope of freedom from being abandoned by a vendor. But with the lemming-like action of the distributions chasing "The Year of Linux on the Desktop", it looks like we (the *NIX-loving folk who were the early adopters) are about to be abandoned. Thankfully we will at least have the option to fall back to a distribution preserving the *NIX way... even if we have to fork it off from an existing one and maintain it. Right up until Firefox and Chromium go Wayland and drop X support; then things might get messy.

Legacy X and network transparency

Posted Nov 14, 2010 21:07 UTC (Sun) by nix (subscriber, #2304) [Link]

Even then, Konqueror should still work. I don't see Qt dropping X support: after all, they already support, what, three or four or is it five different graphics layers? Adding Wayland shouldn't require *them* to drop X :)

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 8:59 UTC (Fri) by rqosa (subscriber, #24136) [Link] (4 responses)

What about doing this:

  • Have the toolkits use OpenGL for drawing instead of XRender. (Qt has experimental support for this, though it seems to be broken currently; a Qt program executed with the switch "--graphicssystem opengl" will use OpenGL for drawing.)
  • Have each OpenGL-using X client which is running locally send its drawing commands to the video hardware directly, to be rendered to that client's offscreen buffer (except for the window manager, which renders onscreen).
  • Have the window manager do compositing, same as now.
I'm not convinced that Wayland has any real benefit over doing things that way. This page seems to say that Wayland's benefit over X is that, in Wayland, the compositing window manager decides which window receives each pointer event, and at which window-local coordinates (because it knows the location/size/shape/transformation of every window, whereas the X server doesn't have that information when a compositing window manager is in use). But is it really all that difficult to make it possible for the X server to delegate this responsibility to the window manager?
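The input-routing argument above can be sketched concretely (all class and function names here are hypothetical, not any real compositor's API): whoever knows each surface's position and transform can map a global pointer event to a surface and window-local coordinates; whoever doesn't know the transforms can't.

```python
# Sketch of compositor-side input routing: the compositor knows each
# surface's position and transform, so it alone can turn a global pointer
# event into (surface, local x, local y). All names are hypothetical.
import math

class Surface:
    def __init__(self, name, x, y, w, h, rotation_deg=0):
        self.name, self.x, self.y, self.w, self.h = name, x, y, w, h
        self.rot = math.radians(rotation_deg)

    def to_local(self, gx, gy):
        # Undo the translation, then the rotation about the surface origin.
        dx, dy = gx - self.x, gy - self.y
        c, s = math.cos(-self.rot), math.sin(-self.rot)
        return dx * c - dy * s, dx * s + dy * c

    def contains(self, gx, gy):
        lx, ly = self.to_local(gx, gy)
        return 0 <= lx < self.w and 0 <= ly < self.h

def route_pointer(surfaces_top_to_bottom, gx, gy):
    # Hit-test from topmost to bottommost surface.
    for surf in surfaces_top_to_bottom:
        if surf.contains(gx, gy):
            return surf.name, surf.to_local(gx, gy)
    return None

stack = [Surface("dialog", 100, 100, 200, 150),
         Surface("desktop", 0, 0, 1920, 1080)]
print(route_pointer(stack, 150, 120))   # hits the dialog at local (50, 20)
```

The question in the comment is whether the X server could simply hand this hit-testing step off to the compositing window manager, which already holds the transform data.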

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 10:36 UTC (Fri) by rqosa (subscriber, #24136) [Link]

> delegate this responsibility to the window manager

Maybe that would increase input latency too much. Here's another possibility: have an X server that doesn't need a window manager, because it can do window management itself. (For example, the "XQuartz" X server in Mac OS X is like this.) Maybe even have a plugin API in the server for server-side "window managers".

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 12:24 UTC (Fri) by bluebugs (guest, #71022) [Link] (2 responses)

Just for the record, OpenGL is not always the best way to do 2D graphics. With OpenGL you need to redraw the whole window completely; there is no partial redraw. That means you need a lot more bandwidth and fillrate with OpenGL than you would with a good software engine.

The main issue is that today's graphics toolkits are mostly badly designed; they are only now starting to implement the kind of 2D rendering pipeline that games have been using for years. If toolkits could move away from the direct rendering model and instead implement a canvas that tracks objects, things would really look better.

There is so much room for improvement in both the Qt and GTK toolkits that thinking a switch to yet another window server will be the answer is a waste of resources. With a good software pipeline, framebuffer vs. X means only about a 5% slowdown. That's just nothing compared to the rest of the rendering stack.

Hopefully Qt is starting to get it right with its move to QML and scene-graph rendering, but there is still a lot to do before it really uses your hardware at the maximum possible rendering speed. Sadly, I have not seen such a project in GTK land. And if you want to see code that does what I am describing, look at the Enlightenment Foundation Libraries.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 14:14 UTC (Fri) by rqosa (subscriber, #24136) [Link] (1 responses)

> Just for the record, OpenGL is not always the best way to do 2D graphics.

Yeah, although I said "each OpenGL-using X client", it doesn't have to be OpenGL. The important point is that the client talks directly to the hardware (with the kernel as an intermediary), and the hardware renders offscreen.

(Last time I posted about X here, I mentioned OpenVG for 2D graphics, only to be told that OpenVG will never gain adoption…)

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 15:36 UTC (Fri) by drag (guest, #31333) [Link]

OpenGL is 'teh suck' when doing applications.

What you're looking for is Gallium.

Gallium is a modern graphics stack where you have a single driver that supports multiple graphics APIs. Instead of having separate drivers for OpenGL vs. X Render, you just have application libraries that tie into Gallium 'state trackers'.

Gallium3D has:
* Mesa OpenGL state tracker
* DirectX state tracker
* OpenVG state tracker
* X state tracker
* OpenCL state tracker
etc etc.

The state trackers keep track of what each application is requesting, and the Gallium driver combines those requests into GPU instruction primitives that end up getting piped to the graphics card. Want to support a new GPU-level API? Then write a new tracker. The trackers should be mostly hardware-agnostic, so they work just as uniformly well regardless of what you're using for acceleration: a video card driver, LLVM, or some sort of weird future GPGPU. A write-once, compile-everywhere type of thing.

There really is no such thing as 'OpenGL acceleration' on video cards anymore. Those are long dead. Like pre-ATI-R200 dead. Instead what we have are all-software stacks that are compiled to run on both CPU and GPU, using whichever gives the best performance or is most available. It's complicated, but it works, and the GPU is now a native part of the PC architecture that can be taken advantage of much like a math coprocessor was in the 486DX days. The GPU is just another processor.

That way, instead of forcing everybody to write new APIs by abstracting on top of OpenGL or whatever else is lower, application developers can just use whatever is best for their purposes and the driver takes care of it for them. This is the way things should be, and it's something Wayland is going to need underneath it in order to be better.

Currently working Gallium drivers exist for Nvidia and AMD/ATI video hardware, but it's all a HEAVY work in progress. Only very recently have they gotten OpenGL working on real hardware.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 9:01 UTC (Fri) by AlexHudson (guest, #41828) [Link] (2 responses)

Given the current state of the likes of Compiz/Mutter, KDE, et al., I struggle to think what effects Unity is possibly going to offer that will be obviously and visibly different on Wayland as opposed to X.org.

The network transparency of X has been criticised in places like Slashdot for years, yet I always understood that this criticism was basically misplaced: the protocol itself doesn't particularly impinge on the efficiency of the server at all; the problem is more that the basic X protocol was essentially static for years.

We've finally got an X.org project that is kicking ass, modernising the code and the protocols, simplifying and chopping out the crufty bits, but which is still terribly short on hackers. Splitting the stack in two - X.org for compatibility, Wayland for new hawtness - just seems mad.

This is exactly the kind of thing that ought to get cross-distro agreement. If X.org isn't up to the job long-term, then let's everyone set out a roadmap of where we need to go. Let's not have Xgl/AIGLX all over again.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 12:19 UTC (Fri) by sorpigal (guest, #36106) [Link] (1 responses)

Every time I hear another "let's throw out X!" campaign, I become suspicious. It's not as if X is flawless, far from it, but its detractors seem to harp on petty and irrelevant nonsense more than actual flaws--nonsense like "Remote X is hurting desktop effects."

A new system designed to do what X does (the good parts) in a way which is incompatible with X and builds on what we've learned from X in the last 20 years, especially about what not to do, would be welcome. If the new system is just replicating a small piece of what X does and ignoring the rest then it isn't a replacement for X.

Is it time for X12 or a Z? Probably. But, please, let's not step backwards while we're at it.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 16:09 UTC (Fri) by drag (guest, #31333) [Link]

It would be quite another thing if somebody was actually working on X12.

But nobody is.

People are, however, working on Wayland....

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 11:15 UTC (Fri) by emk (subscriber, #1128) [Link] (5 responses)

What does this mean to me as an occasional user of a tiling window manager? I have nothing against a pure-GL compositing stack, but I'd hate to give up shiny toys like xmonad.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 21:22 UTC (Fri) by ballombe (subscriber, #9523) [Link] (3 responses)

You know, to me, a user of the Linux console, KMS is just a gigantic regression, because it does not implement VGA text mode. The program "sl" is about five times slower on the framebuffer than in VGA text mode.

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 22:35 UTC (Fri) by foom (subscriber, #14868) [Link] (2 responses)

I quite dislike that it defaults to an unreadably small font size, instead of good old 80x25.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 16:34 UTC (Sat) by rleigh (guest, #14622) [Link] (1 responses)

I'm not sure I agree that the console is inferior. I have a 120×37 console with a 16×32 console font, far more readable and useful than 80×25. While "hardware" VGA consoles are indeed faster, they are also far less portable: x86-only and with a lot of restrictions. In comparison, a framebuffer console offers far more features and works on all architectures. Once KMS is ubiquitous, I can't see why the framebuffer console can't take better advantage of the hardware and properly accelerate all the terminal operations, such that the speed is as good as (or better than) a hardware VGA console.

My main beef with the console (of all flavours) is the appallingly bad (simplified) VT100 emulation and terrible Unicode and font support. With KMS, a decent userspace emulator using fontconfig/freetype and OpenGL/Gallium should be able to do an absolutely amazing job compared with the limited kernel emulator.

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 21:05 UTC (Sun) by dtlin (subscriber, #36537) [Link]

bogl-bterm, jfbterm, and fbterm are existing userspace terminal emulators running on the Linux framebuffer, with improved Unicode/font support over the built-in console.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 16:33 UTC (Sat) by andreashappe (subscriber, #4810) [Link]

As long as they use a toolkit that is supported by Wayland: nothing. If Wayland supports some sort of X11 emulation layer (an X11 client), they should also work out of the box.

As far as I remember ion's drawing code, adding an additional drawing engine shouldn't be too hard.

Please, people, before commenting on X performance try NX

Posted Nov 5, 2010 16:20 UTC (Fri) by deepfire (guest, #26138) [Link]

...because the current Xorg stack is just another implementation, and not a particularly efficient one.

It saddens me to see people judging the idea by its most popular implementation.

My experience with NX has been all-around positive (well, except minor glitches), and I consider it indispensable in my daily work.

Please, please, before bashing X /the technology/, make sure you don't actually mean Xorg /the implementation/.

Anyway, for me it raises another question: why are the core people behind Xorg so awfully quiet about NX?

Shuttleworth: Unity on Wayland

Posted Nov 5, 2010 22:58 UTC (Fri) by hschildt (guest, #71034) [Link] (8 responses)

"Is it time for X12 or a Z? Probably. But, please, let's not step backwards while we're at it."

Like some Linux people, you think like a bureaucrat who just wants to get along with your community.

After reading X.org/Gnome blogs it appears Linux is not an open community, but actually a closed one.

Ubuntu is correct in moving forward past the blurry fonts of X.org and the retarded Gnome desktop.

Maybe Ubuntu can get Linux past its 1 percent market share, because it appears to be stuck?

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 10:55 UTC (Sat) by GhePeU (subscriber, #56133) [Link] (7 responses)

Ubuntu is correct in moving forward past the blurry fonts of X.org and the retarded Gnome desktop.

What the hell does X.org have to do with "blurry fonts"? The fonts are rendered by freetype, which is not part of X.org, and nobody is going to touch it. What are you going to blame X for next, the rain on weekend days when it was sunny all the rest of the week? Poverty and famine in Africa?

If the price for "getting past the 1 percent market share" is a new influx of fanboys who talk with arrogance about things they don't know and spout the worst idiocies, even in a place where you could usually find interesting commentary, then I'd really prefer to stay at the current usage level, thank you very much.

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 19:39 UTC (Sat) by alankila (guest, #47141) [Link] (6 responses)

I'm probably indirectly feeding a troll, but I can tell you that the real reason Linux has bad fonts (I won't call them blurry, because I don't find that meaningful criticism) is that too few people understand how fonts should be layered on top of the existing graphics.

To explain it in the simplest terms: a diagonal like \ or / requires drawing partially covered pixels to eliminate the jaggedness of monochromatic rendering. Assuming a white background and black foreground, Linux people seem to think that half coverage of black within a white pixel should result in color 0x80, because that is a 50% blend between 0x00 and 0xff. This is, however, wrong, because of gamma. To render a physical intensity that is close to 50% of the brightness of white, you actually want to use the color value 0xb4; 0x80 is simply way too dark.

I am demoing this problem on this page with the green text on purple background: http://bel.fi/~alankila/lcd/

The reason nobody does this right is that to X, and pretty much everyone else, text is just an ARGB pixmap. Libraries like cairo get the text from freetype (which is not the faulty party) in linear alpha space and turn it into a linear ARGB bitmap with subpixel positioning. That bitmap eventually ends up on an sRGB surface, and the intermediate colors are destroyed, giving the color fringing and darkening artifacts that I try to demonstrate as prominently as I am able.
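The arithmetic behind the 0x80-vs-0xb4 claim can be sketched as follows (assuming a display gamma of 2.0 for illustration, which is what reproduces the 0xb4 figure; real displays are closer to sRGB's roughly 2.2 curve):

```python
# Sketch of why naive alpha blending darkens antialiased text.
# A display gamma of 2.0 is assumed here for illustration; real displays
# follow roughly the sRGB / gamma-2.2 curve, which gives a similar result.

GAMMA = 2.0

def naive_blend(fg, bg, alpha):
    # Blends the encoded 0..255 values directly, which is what most of
    # the stack does today.
    return round(fg * alpha + bg * (1 - alpha))

def gamma_blend(fg, bg, alpha):
    # Decode to linear light, blend there, then re-encode.
    lin = lambda c: (c / 255) ** GAMMA
    enc = lambda v: round(255 * v ** (1 / GAMMA))
    return enc(lin(fg) * alpha + lin(bg) * (1 - alpha))

# A black glyph edge at 50% coverage over a white background:
print(hex(naive_blend(0x00, 0xff, 0.5)))   # 0x80: too dark
print(hex(gamma_blend(0x00, 0xff, 0.5)))   # 0xb4: perceptually correct
```

The naive path produces the too-dark 0x80 the comment describes, while blending in linear light yields 0xb4 under the assumed gamma.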

Shuttleworth: Unity on Wayland

Posted Nov 6, 2010 20:25 UTC (Sat) by quotemstr (subscriber, #45331) [Link] (2 responses)

So where would be the best place to implement gamma-corrected compositing? New XRENDER primitives?

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 14:09 UTC (Sun) by alankila (guest, #47141) [Link]

A solution along those lines has been suggested, at least. Changing the whole texture blending algorithm to do gamma correction would break all applications that assume the current behavior.

Shuttleworth: Unity on Wayland

Posted Jun 8, 2012 7:59 UTC (Fri) by alankila (guest, #47141) [Link]

Old thread but I did actually implement gamma-corrected XRENDER. Not every application is correctly rendered yet, but most are...

http://bel.fi/~alankila/lcd/linuxgc.png

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 3:52 UTC (Sun) by jwb (guest, #15467) [Link] (2 responses)

It's funny you should mention this. I used to complain about the lack of gamma-corrected Freetype and Cairo rendering on mailing lists roughly 5-10 years ago. At that time I was repeatedly assured that of course this would be implemented quite soon. And yet it never was.

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 10:07 UTC (Sun) by HelloWorld (guest, #56129) [Link] (1 responses)

Well, you could open a bug report or ask again :).

Shuttleworth: Unity on Wayland

Posted Nov 7, 2010 14:05 UTC (Sun) by alankila (guest, #47141) [Link]

I already have a bug open with cairo about this. They are aware of the problem and understand this. The folks there seemed sympathetic to my cause and even told me the name of the person who was supposedly working on it... The idea is to support textures in different color spaces, so they are supposedly planning to solve the whole problem at once.

Hopefully they aren't biting off too large a piece. At this point I could live with almost any kind of unspeakable kludge, rather than wait 10 years for the perfect solution.

More important to me than super-smooth graphics.

Posted Nov 8, 2010 5:17 UTC (Mon) by gmatht (subscriber, #58961) [Link] (1 responses)

For me, network transparency is more important than "super-smooth" graphics. However, there are a number of limitations of X that I do care about. These relate to the robustness of X.
1) Would the simplicity of Wayland mean that it would be less likely to crash than an X-server?
2) Would it be possible to reset, for example, the graphics card without killing all Wayland clients (applications). Roughly, I mean could we do the equivalent of Ctrl-Alt-Backspace without losing all our open applications?
3) Would motion of the mouse pointer remain smooth, even when a couple of background tasks are performing heavy IO?
4) Would Wayland improve resource management? For example,
4a) a poorly written application can cause the X-server to allocate large amounts of memory. Would Wayland force or encourage that application to allocate that memory itself (making the culprit clear in top and to the OOM killer)?

More important to me than super-smooth graphics.

Posted Nov 17, 2010 17:10 UTC (Wed) by renox (guest, #23785) [Link]

> 1) Would the simplicity of Wayland mean that it would be less likely to crash than an X-server?

Wayland itself, yes, as it does less than an X server. But if you implement network transparency in the toolkits, for example, then since there can be several toolkits, the duplication of code increases the total probability of failure.

For point 2: perhaps, but AFAIK the issue with restarting X is that clients currently don't reopen their connection after an X failure. So in theory this isn't related to Wayland; in practice it could be, if Wayland clients are written with this possibility in mind.

Shuttleworth: Unity on Wayland

Posted Nov 11, 2010 18:19 UTC (Thu) by cdmiller (guest, #2813) [Link] (2 responses)

"we’re choosing to prioritize the quality of experience over those original values, like network transparency"

Q. Who uses network transparency where I work?
A. UNIX and Linux systems administrators, and some students and faculty.

Q. Who advocates for Linux on the desktop?
A. UNIX and Linux systems administrators, and some students and faculty.

Q. Who else uses Linux on the desktop?
A. Spouses, family, and friends of the Linux on the desktop advocates.

Q. Who will the removal of X "original values" piss off?
A. The first line advocates of Linux desktops.

Duh.

"it’s *possible* to get amazing results with X, but it’s extremely hard, and isn’t going to get easier"

It's painful to get an amazing desktop and somewhat hard to get a nicely working easily manageable desktop, but I don't agree it can't get easier under X.

Prediction: Big win for Xubuntu or similar in the academic arena.

Shuttleworth: Unity on Wayland

Posted Nov 11, 2010 19:57 UTC (Thu) by bronson (subscriber, #4806) [Link] (1 responses)

I'll take that bet.

Prediction: your Unix and Linux system administrators and some students and faculty happily start using Spice, VNC, or some other remote protocol (yes, they can do individual apps and windows just like X). Code size decreases significantly, development pace quickens, and only a few stuck-in-the-90s graybeards lament that the port-6000-over-ssh hack doesn't work anymore and all their hard-fought xauth knowledge is now obsolete.

I haven't actually used wayland yet so I'm just playing devil's advocate here. :)

Shuttleworth: Unity on Wayland

Posted Nov 14, 2010 19:58 UTC (Sun) by cdmiller (guest, #2813) [Link]

Heh, actually X11 forwarding over SSH obsoleted Xauth knowledge long ago. If that doesn't work anymore it's yet another thing to be pissed about.

Truly, we already run XFCE in our LTSP academic computer labs and kiosks, as the Ubuntu GNOME default was not easy to manage in that sort of environment. I can only imagine the pain and unhappiness in trying to manage an environment running a mix of X, Unity/Wayland, SPICE, and VNC apps all at once. It's pretty easy to imagine a big win for Xubuntu and similar. Inevitably, whichever Linux distribution admins find easier to manage is what will be installed, used, and advocated for.


Copyright © 2010, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds