Shuttleworth: Unity on Wayland
"But we don't believe X is setup to deliver the user experience we want, with super-smooth graphics and effects. I understand that it's *possible* to get amazing results with X, but it's extremely hard, and isn't going to get easier. Some of the core goals of X make it harder to achieve these user experiences on X than on native GL, we're choosing to prioritize the quality of experience over those original values, like network transparency."
Posted Nov 5, 2010 0:05 UTC (Fri)
by juanjux (guest, #11652)
[Link] (6 responses)
Posted Nov 5, 2010 0:08 UTC (Fri)
by tzafrir (subscriber, #11501)
[Link]
Posted Nov 5, 2010 1:23 UTC (Fri)
by marineam (guest, #28387)
[Link] (4 responses)
On the flip side though, none of this applies to things still using the traditional driver-in-X model, such as proprietary drivers and all the odds and ends that aren't Intel, nVidia, and ATI. Keith's answer was basically that Nouveau is doing well, which covers the last proprietary driver, and as for the others all he had was "I don't know."
I'm assuming that for the "I don't know" category Ubuntu will still include X and a traditional window manager as an alternative to Wayland, and set up the toolkit to use either backend. The few users in that boat won't have the smooth experience Unity is aiming for, but probably won't see a significant change from their current state.
Posted Nov 5, 2010 9:10 UTC (Fri)
by mjthayer (guest, #39183)
[Link]
I can't say for sure, but I suspect that getting basic (that is, unaccelerated) KMS drivers working for the "odds and ends" would be feasible if someone finds the hardware to test, with the big caveat that it needs people who understand KMS. But if someone were to produce a nice KMS driver stub, which more or less worked except for a few empty functions with big "fill in XXX here" stickers, there would be people able to fill in the gaps and lift things like the actual initialisation and register poking from the DDX drivers. I fear that I may have to port a DDX to KMS myself in the not too distant future, so perhaps I should give that a try.
Posted Nov 5, 2010 13:31 UTC (Fri)
by wookey (guest, #5501)
[Link] (2 responses)
That's a very 'desktop' view of the world.
ARM is rapidly becoming a platform that matters for shiny graphics use, and we have a choice of GPUs: Imagination Tech PowerVR, ARM Mali, Nvidia GeForce, and Broadcom BCM2727(?).
Guess how many of those have free drivers? Exactly zero. So just as we get to a reasonable state on the desktop we have the same old mess all over again for new netbooks, tablets, phones, TVs and general widgetry. Suggesting that this problem has been dealt with is a very long way off the mark - it's about to get a whole pile worse.
Do please heckle at every opportunity you get at conferences, product unveilings, design meetings and so on. The people who matter need to hear over and over that this is going to be a massive PITA and that they need to fix it, as it is being fixed on the desktop. The engineers already know this, but there are lawyers and execs who still don't get it and are afeared of everyone's patents. I'm doing my bit here at ARM every time they come to tell us how cool it all is. I tell them it stinks :-)
Pretending we don't have a big problem does no-one any good. And unfortunately, whilst we have 4 choices all with the same problem, we have limited leverage: we can't pick the good choice because there isn't one. If we can get one manufacturer to open up then the others are likely to follow, as has happened reasonably convincingly on the desktop.
Posted Nov 5, 2010 15:07 UTC (Fri)
by BenHutchings (subscriber, #37955)
[Link] (1 response)
Posted Nov 6, 2010 0:28 UTC (Sat)
by Lennie (subscriber, #49641)
[Link]
But if Rasterman, someone who works on X and graphics and embedded/mobile, does not agree (which he has supposedly said), then I don't agree either.
Posted Nov 5, 2010 0:48 UTC (Fri)
by thoffman (guest, #3063)
[Link] (87 responses)
So, although I'm hopeful and supportive of most anything which would make my Linux desktop more responsive, I hope this Unity on Wayland effort will not prevent new Gnome apps from being usable through a networked desktop.
Posted Nov 5, 2010 1:21 UTC (Fri)
by JohnLenz (guest, #42089)
[Link] (84 responses)
The next generation of GNOME (and KDE) apps seems to be going towards much more use of OpenGL and the graphics card. So even if the desktop is based on X, I don't see what good network transparency at the X level will do. Sure, there is indirect GLX rendering over the network, but that is too slow, so you won't be able to run the program over the network anyway. Instead, the rendering will have to happen on the client using the client's graphics card. I currently use the network transparency of X too, but it looks like the X split of rendering on the server will need to be abandoned in favour of something like VNC, where the rendering is allowed to happen on the client. Perhaps Wayland plus SPICE could be used to replace the network transparency.
Posted Nov 5, 2010 1:43 UTC (Fri)
by gmaxwell (guest, #30048)
[Link] (78 responses)
I'm sure someone loves all this bouncing fading crud that halts your machine for seconds at a time just to play a pretty animation but for me the computer is an important _tool_ and things like speed and network transparency are important for me.
Fortunately there is a whole suite of actively developed toolsets focused on users like me, things like xmonad as well as all the "classic" Unix tools. The biggest downside to using them now is that it means breaking from the norm of your distro's default configuration, and thus losing part of the outsourced systems-administration value distros provide. Perhaps a distro fork will arise targeting people who are technically competent and more interested in productivity.
Posted Nov 5, 2010 2:36 UTC (Fri)
by bjacob (guest, #58566)
[Link] (13 responses)
1. web apps (more generally, in-browser apps --- the CUPS config tool over port 631 now appears like a visionary precursor!)
2. the _clients_ now have insanely powerful graphics hardware, in any case there is no reason anymore for wanting to do graphics on the server side (where I mean "server" in its proper network sense, not in the GPU sense).
And for non-GUI apps, SSH is all you need...
Posted Nov 5, 2010 3:30 UTC (Fri)
by davide.del.vento (guest, #59196)
[Link]
But there will still be the need for "normal" X-forwarding via ssh, so a distro that completely kills it will be a deal breaker for me.
Posted Nov 5, 2010 7:09 UTC (Fri)
by rqosa (subscriber, #24136)
[Link] (10 responses)
The way that Web apps are now (usually), they have a big deficiency in comparison to console or X apps running over SSH: when you use the console/X app, it runs with your UID, but when you use the Web app, it runs with the same UID as it does for everyone else. This is bad for security; it's as though every app were a setuid app. Apache's "mod_suexec" is one solution to this problem, but its limitations (it can only run CGI apps, and it chooses which UID to run as according to the directory where the program resides) make it rather impractical.
Posted Nov 5, 2010 12:51 UTC (Fri)
by alankila (guest, #47141)
[Link] (9 responses)
Thus, all security features must be implemented through other kinds of checking, often manually by comparing the user id on the database row being requested with the user id currently logged in. Not everyone remembers to do this all the time, though: generally, a missing check for things like "is this user authorized to view that page" results in information disclosure bugs.
I think a lot of real-world systems don't run a ton of webapps within one server and one uid. I personally tend to isolate running webapps inside virtual machines and use reverse proxying techniques to expose them where I want them. Virtual machines can be backed up and recreated wherever I want, so they are actually quite convenient to move as black boxes from a system administrator's perspective...
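The easy-to-forget row-level check described above can be sketched in a few lines of Python; everything here (the `pages` table, the function and column names) is illustrative rather than taken from any real application:

```python
import sqlite3

def get_page(db, page_id, logged_in_user):
    """Fetch a page body, enforcing the per-row ownership check by hand."""
    row = db.execute(
        "SELECT owner, body FROM pages WHERE id = ?", (page_id,)
    ).fetchone()
    if row is None:
        return None
    owner, body = row
    # This is the check that is easy to forget: because every request runs
    # under the same server-side UID, the app itself must verify ownership,
    # and omitting this line becomes an information-disclosure bug.
    if owner != logged_in_user:
        raise PermissionError("user is not authorized to view that page")
    return body

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pages (id INTEGER, owner TEXT, body TEXT)")
db.execute("INSERT INTO pages VALUES (1, 'alice', 'private notes')")

print(get_page(db, 1, "alice"))   # the owner can read her own row
try:
    get_page(db, 1, "mallory")    # anyone else is refused
except PermissionError:
    print("denied")
```

Nothing in the operating system enforces this boundary, which is the contrast with per-UID console or X apps being drawn in the thread.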
Posted Nov 5, 2010 14:03 UTC (Fri)
by rqosa (subscriber, #24136)
[Link] (7 responses)
> Webapps actually have far more interesting issue than the one you talk about. The most important problem is that all end-user data generally is accessible from the same server-side UID, as the guy who logs in a web application isn't using a distinct UID on the server side.

Uh, that is the issue I'm talking about. Or at least it's a consequence of it (because the app always runs as the same UID, all of its stored data is accessible to that UID).

> The data is in fact usually written in SQL storage

But, the grandparent post was talking about, essentially, using a Web browser & Web apps to do what we formerly did/still do with an X server & remote X clients. That is to say, to take the kind of apps which now are X clients (e.g. image editor, email user agent, text editor, office suite, RSS feed reader, terminal emulator, XMPP client, etc.) and make them into Web apps. These don't usually use SQL; or if they do, they use a per-user database instance, e.g. an SQLite file owned by the user, or a per-user instance of mysqld (IIRC, KDE's "Akonadi" does this). There's no good reason for these apps to suddenly become dependent on a central RDBMS server, just because they have migrated from one remote-user-interface protocol (X Window System) to another one (HTTP + HTML5 + JavaScript + whatever)!
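The per-user storage model mentioned above (an SQLite file owned by the user) is easy to picture. This sketch uses a temporary directory as a stand-in for $HOME, and the `~/.local/share/myapp/` layout and `feeds` schema are purely illustrative:

```python
import os
import sqlite3
import tempfile

def open_user_db(home_dir):
    # One database file per user, under that user's own home directory and
    # owned by their UID -- no central RDBMS server involved.
    path = os.path.join(home_dir, ".local", "share", "myapp", "feeds.db")
    os.makedirs(os.path.dirname(path), exist_ok=True)
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS feeds (url TEXT)")
    return db

home = tempfile.mkdtemp()  # stand-in for a real user's $HOME
db = open_user_db(home)
db.execute("INSERT INTO feeds VALUES ('https://lwn.net/headlines/rss')")
db.commit()
print(db.execute("SELECT count(*) FROM feeds").fetchone()[0])
```

Because the file inherits ordinary filesystem permissions, the per-user isolation comes for free from the OS, which is the point being made about apps like Akonadi.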
Posted Nov 5, 2010 14:51 UTC (Fri)
by alankila (guest, #47141)
[Link] (6 responses)
Otherwise I am in agreement.
Posted Nov 5, 2010 17:07 UTC (Fri)
by rqosa (subscriber, #24136)
[Link] (5 responses)
> suexec does nothing to solve that problem as far as I know

Well, with suexec, the UID a CGI program runs as is determined by what directory it's in. A typical usage scenario is that each user has a "home" (or "public_html") directory (that is, a directory found at a path like "~user/public_html" or something similar on the machine where Apache runs, which Apache then exposes to HTTP clients as the URL "http://hostname/~user/") which may contain CGI programs, and when one of those programs is executed, suexec will set the UID for its process to the UID of the user who owns the "home" directory it's in. (Or maybe it just picks the UID that owns the program file; I don't remember which way it is, but it doesn't make much difference.)

So, basically, suexec will separate webapps that "belong" to one user from webapps that "belong" to other users. Now, if you take one CGI program and make multiple copies of it, each belonging to a different user (that is, each in a different user's Apache home dir), then the different users of that app are separated from each other. But that is an ugly kludge, necessary only because of the limitations of suexec. So suexec isn't a good solution for this problem.

(Also, suexec is only compatible with CGI programs. CGI has its own problems, the biggest of which is that it requires every webapp process to exit immediately when it finishes generating a response message; that is really bad for performance. There are much better IPC protocols for webapps, such as SCGI, AJP, and FastCGI.)

Here's a suggestion: for "single-user" webapps, the UID to run the app as should be determined by the user specified in the HTTP request, with HTTP authentication (basic or digest).
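The performance point about CGI can be made concrete with a minimal CGI-style handler: each request costs a process startup and exit, which is exactly the overhead SCGI, AJP, and FastCGI avoid by keeping the process alive between requests. This sketch takes the environment as a parameter so it is testable without a web server; a real CGI program would read `os.environ` and be spawned once per request (under suexec, with the owning user's UID):

```python
import io

def handle_cgi(environ, out):
    # One request in, one response out; a real CGI process would exit
    # here and pay the full startup cost again on the next request.
    user = environ.get("REMOTE_USER", "anonymous")
    out.write("Content-Type: text/plain\r\n\r\n")
    out.write("Hello, %s\n" % user)

out = io.StringIO()
handle_cgi({"REMOTE_USER": "alice"}, out)
print(out.getvalue())
```

A FastCGI or SCGI server would keep a loop around `handle_cgi`, amortizing startup across many requests.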
Posted Nov 5, 2010 21:13 UTC (Fri)
by Pc5Y9sbv (guest, #41328)
[Link]
I've frequently considered that it would be nice to have a generalized mod_wsgi like this, and a user-mapping variant that could manage a daemon process pool with each authenticated web user getting his own app instance, which can be reused for many requests and shut down automatically when idle for too long. There is already some basic pool management in mod_wsgi, but it needs more features.
However, other aspects of the security model need to be matured, as web frameworks have such an in-built idea of one app for all users. You'd really need to make all of your server side resources now owned by the individual web users, e.g. good user/group models for files and good multi-role policies for your RDBMS.
Posted Nov 6, 2010 11:33 UTC (Sat)
by alankila (guest, #47141)
[Link] (3 responses)
I don't think anybody is going to actually do desktop apps in the web browser. The most important feature of a web application is probably still the fact that it's accessible anywhere and requires no installation from the user's viewpoint. A local application implemented as a web app is only available locally and needs to be installed, so that advantage is wholly gone.
Posted Nov 7, 2010 0:36 UTC (Sun)
by rqosa (subscriber, #24136)
[Link] (2 responses)
> The model of using CGI programs run by a central apache from user's home directory is probably not in the future.

Indeed, it isn't. Here's how it should be instead: when a person goes to use a remote app, they point their browser at the URL for the host where that app resides and the pathname on that host where the app is installed; for example "http://hostname/bin/my_app.py". Then, the user enters their authentication credentials (use HTTP digest or basic authentication for this) for that remote host. Then, any subsequent HTTP requests from that user will be forwarded (by SCGI or AJP or similar) to an instance of that app running as that user's UID. So, the Web app is installed in just one location, but there will be multiple running instances of the app, one instance per user. (Think about what happens with, for example, a host running an SSH server where many users log in via SSH and then run various console apps and X apps. It's the same principle: apps are installed system-wide, and there's a separate running instance of each app for each user using the app.)

> enable browsers to execute applications without a web server

I think this is already possible. (If you've got the Python documentation installed, try going to "/usr/share/doc/python/html/index.html" in a browser, type something in the search box, and press "Go".) But I wasn't talking about running web applications locally.

> I don't think anybody is going to actually do desktop apps in the web browser.

The trouble is, some people here are saying that we don't need X Window System anymore, because we don't need X's network-transparency anymore, because we have a better way to use apps remotely: Web apps. But, with X, most apps that you can run locally (image editor, text editor, etc.) you can also run remotely, and lots of people use this feature. That won't be possible if those apps migrate to a non-networked UI system (e.g. Wayland).
If we're really going to adopt HTTP + HTML5 + (whatever else) as the replacement for remote X, we've got to have these same kinds of apps available for it!
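The per-user dispatch step proposed above can be sketched as follows: a front end parses HTTP basic authentication and uses the user name to pick which per-user backend instance should receive the forwarded request. The socket-path naming scheme is invented for illustration, and a real deployment would of course verify the password before dispatching:

```python
import base64

def user_from_basic_auth(header):
    # header is the value of an "Authorization:" request header,
    # e.g. "Basic YWxpY2U6czNjcmV0" (base64 of "user:password").
    scheme, _, blob = header.partition(" ")
    if scheme.lower() != "basic":
        raise ValueError("not basic authentication")
    user, _, password = base64.b64decode(blob).decode().partition(":")
    return user, password

def backend_socket(user):
    # Hypothetical layout: one SCGI/FastCGI socket per authenticated user,
    # with the app instance behind it running under that user's UID.
    return "/run/myapp/%s.sock" % user

header = "Basic " + base64.b64encode(b"alice:s3cret").decode()
user, _password = user_from_basic_auth(header)
print(user, backend_socket(user))
```

The front end stays small and unprivileged-facing; spawning the per-user instance (the SSH-like part of the analogy) would be the job of a privileged supervisor.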
Posted Nov 7, 2010 0:38 UTC (Sun)
by rqosa (subscriber, #24136)
[Link]
s/requests from that user/requests from that user to that URL/
Posted Nov 9, 2010 0:29 UTC (Tue)
by alankila (guest, #47141)
[Link]
The most popular X-forwarded application I use personally is xterm and that's mostly because I'm too lazy to open local terminals and use separate ssh connections for them. If I had to choose between X-style vs. VNC-style, I guess I actually prefer VNC-style remoting because of the ability to leave the session running perpetually on the server. Unfortunately, in practice, VNC is not really such a stellar protocol, and I've seen RDP between 2 Windows systems perform better than VNC seems able to, for some reason.
Posted Nov 6, 2010 5:44 UTC (Sat)
by butlerm (subscriber, #13312)
[Link]
Posted Nov 5, 2010 19:46 UTC (Fri)
by misiu_mp (guest, #41936)
[Link]
Posted Nov 5, 2010 3:27 UTC (Fri)
by davide.del.vento (guest, #59196)
[Link]
Posted Nov 5, 2010 3:57 UTC (Fri)
by Kit (guest, #55925)
[Link] (39 responses)
Having a tool be capable of what it's intended to do is important, but would you really want to use an oily screwdriver with no grip? Or a hammer where the metal on the handle is flaking?
That's part of the idea behind this "dancing candy cram-ware", as you call it. Even modern netbooks have MORE than enough power to handle these animations and transitions; it's just that the current stack has a variety of issues (immature drivers, 2D operations that act as basically a worst-case scenario for the graphics accelerator, etc). Animations and transitions can work really well when _done_ well. Any that are showy, flashy, or long are prime examples of _bad_ ones... determining what's good is a bit harder, with subtlety generally being best.
Posted Nov 5, 2010 4:53 UTC (Fri)
by gmaxwell (guest, #30048)
[Link] (14 responses)
So done well, most of them would be invisible. Unfortunately they aren't done nearly that well. My favorite peeve today is the combination of: gnome-screensaver doesn't reliably measure idle status when using the keyboard exclusively, and getting an uninterruptible several-second fade-out animation when it decides to blank.
It's not that I want an oily screwdriver. I want the nano-diamond-tipped tungsten-carbide rocket-powered screwdriver. I want amenities, but they ought to be ones I consider helpful rather than hindrances. If someone wants to paint it pretty colors that's fine, as long as it doesn't damage the atomically sharp pointy end. But absolutely no wind-load-adding spoilers, please.
I fully expect that different people will have different preferences in this regard. Unfortunately I don't feel that there are any good "power user" distros these days which don't leave me playing sysadmin over every piece of minutiae (e.g. Gentoo). Although I feel like the reduction in sysadmin work I get from using Fedora vs Gentoo is constantly decreasing due to "usability improvements", which seem to take the form of making the user's first hour 1% easier at the expense of adding a 10% cost to the user's next 20 years. Things like having to use some opaque "registry editor" in order to set a distinct lock and save time, when almost 20 years ago xscreensaver gave me a perfectly accessible text file (or even a GUI!) with these settings.
Posted Nov 5, 2010 5:55 UTC (Fri)
by sladen (guest, #27402)
[Link] (3 responses)
Posted Nov 5, 2010 9:20 UTC (Fri)
by mjthayer (guest, #39183)
[Link] (2 responses)
I don't think things are quite as simple as you suggest. One, the animations aren't done by the GPU alone, they need support (think loading, preparing, scheduling) from the rest of the system. Two, the GPU draws power to do those animations, which is a cost. And three, perhaps most relevantly, they are not free from a developer time perspective. At least in theory (though this may not apply to software developed by volunteers and/or enthusiasts), that time could have been put into reducing your general setup and teardown time rather than creating animations.
Posted Nov 5, 2010 11:27 UTC (Fri)
by Cyberax (✭ supporter ✭, #52523)
[Link] (1 responses)
Negligible.
"Two, the GPU draws power to do those animations, which is a cost. And three, perhaps most relevantly, they are not free from a developer time perspective."
GPU draws power to draw stuff in any case. And most effects are so simple that from GPU's point of view they are essentially free.
Posted Nov 5, 2010 15:42 UTC (Fri)
by drag (guest, #31333)
[Link]
If you're interested in speed and battery life, then using the GPU to its full extent will get you both faster than trying to depend on the CPU alone.
Using the CPU to do things that the GPU can do faster just means you're wasting cycles and ruining your efficiency and performance.
The GPU is now as much a part of your computer as floating-point processing or DMA. It's no longer possible to treat it like it's some sort of optional add-on or something you only use for games. It's a native part of the architecture and should be possible for application writers to easily take advantage of.
In PCs this has been true for a while, and in the mobile world it is more and more true. After all, you can look at the requirements for Windows Phone 7... they require a DirectX 9-capable GPU.
Posted Nov 5, 2010 9:15 UTC (Fri)
by mjthayer (guest, #39183)
[Link]
And instead of that you got the Tungsten Graphics powered one! (Sorry, couldn't resist there...)
Posted Nov 5, 2010 9:58 UTC (Fri)
by roc (subscriber, #30627)
[Link] (8 responses)
For example, even if the application can move a visual object from point A to point B instantly, an animation can still be a helpful cue to remind the user that motion has occurred. Our brains aren't designed to process objects teleporting around.
Posted Nov 5, 2010 11:47 UTC (Fri)
by orabidoo (guest, #6639)
[Link] (7 responses)
That's fine as a general case, by default. By all means provide a pretty animation, OpenGL-powered or otherwise, to make that window minimize to the taskbar or wherever.
BUT, it so happens that many of us, technically minded users, already know exactly what we expect from our computers when we press a key or click a button.
In those cases, having stuff visibly move around is just a plain distraction. The human eye, like that of most animals, is designed to follow stuff that moves and pay much more attention to it than to the static background.
And if I *know* that the window is going to minimize, my brain is already onto what I want to do with that window out of the way. So to have an eye-catching animation at that point is not just harmless eye-candy. It's actively distracting me from where my mind wants to go.
For that reason, every power-user friendly GUI and desktop should have an option to disable all animations. Current GNOME/Metacity has a gconf key for that (Apps->metacity->general->reduced_resources), which is nice. I sure hope the future GNOME shell(s) also have an equivalent setting.
Posted Nov 5, 2010 13:07 UTC (Fri)
by flammon (guest, #807)
[Link] (6 responses)
Posted Nov 5, 2010 13:26 UTC (Fri)
by dskoll (subscriber, #1630)
[Link] (5 responses)
It depends on which button you clicked. I use XFCE without any animations and I'm never confused about what happens when I do things to windows.
I use network transparency all the time. Eye-candy is great for those who want it, I suppose, but please keep an escape hatch for those of us who like network transparency.
Posted Nov 5, 2010 13:38 UTC (Fri)
by Janne (guest, #40891)
[Link] (4 responses)
Well, duh. But people don't always know which button does what. If the UI can guide them with animations and such, that's only a good thing. An app window that minimizes into the button on the taskbar is a GOOD IDEA. If the window simply vanished, users could be left confused as to what happened. Even you. What if your aim was a few pixels off, and you accidentally closed the window instead of minimizing it? With animations you would instantly know that you closed the window instead of minimizing it.
And I kept on hearing comments about "technically minded people". You do know that those people are in the minority? Most people are NOT "technically minded", they just want to get their stuff done. And if they can get it done elegantly, all the better.
And I find these comments about "eye-candy that freezes the desktop" strange. I have all kinds of animations and the like on my Mac, and the UI does not freeze.
Posted Nov 5, 2010 15:14 UTC (Fri)
by dskoll (subscriber, #1630)
[Link]
But people don't always know which button does what.
A UI that doesn't make that clear is fundamentally broken and no amount of animation can fix that.
To be clear: If people want to implement fancy animations, that's fine. I don't care. Even make it the default if you like. But make it possible to switch them off because I do care if animations are forced on me.
Posted Nov 5, 2010 15:42 UTC (Fri)
by tjc (guest, #137)
[Link] (1 responses)
Well, maybe the first time they don't know what happened. But if someone clicks a button five times, and the same thing happens every time-- and they still don't know what's going on-- then they have issues that can't be addressed by the UI. Everyone is confused from time to time, but it usually passes. There are very few people who live in a state of perpetual confusion, so why target a UI at some imaginary, gormless twit who doesn't even exist?
Posted Nov 6, 2010 10:22 UTC (Sat)
by Janne (guest, #40891)
[Link]
People are not computer wizards. It might be obvious to you and me how and why computers work the way they do, but the rest of the people have no idea. The computer should do everything in its power to help the user. But every time something like that is attempted in Linux, we get whining about "dumbing down" the UI or something. Only in Linux is complexity considered a good thing, and helping the user considered a sign of stupidity.
End result is that Linux on the desktop is something that normal people do not want to use.
And sure, people will learn which button does what. But animations still help. When you have dozen apps in the taskbar, it's useful to have an animation that shows you which of those is the app you just minimized. Sure, you could visually scan the taskbar, but you must admit that animation is a lot faster way to do this.
And there are even studies about this. Researchers set up two functionally identical systems. The difference was that one system looked plain and basic, while the other had nice graphics ("useless eye-candy", as it's called in the Linux community). It was found that people were more productive on the system that looked better. People found the better-looking system more pleasant to use, and that in turn made them more productive. And happy users are a good thing.
Posted Nov 6, 2010 21:21 UTC (Sat)
by orabidoo (guest, #6639)
[Link]
Well duh right back. As I said above, I'm all for having such friendly animations on by default.
I'm just pointing out a good reason why a subset of users find them counterproductive, and pleading that every GUI should have an option to turn animations off. I don't mind if the knob is quite well hidden, like a gconfkey. Just let those of us who like to think ahead of the computer save that 0.10s of time, or feel like we did. Thanks.
Posted Nov 5, 2010 8:36 UTC (Fri)
by janpla (guest, #11093)
[Link] (1 responses)
"Having a tool be capable of what it's intended to do is important, but would you really want to use an oily screw driver with no grip? Or a hammer where the metal on the handle is flaking?
That's part of the idea behind this "dancing candy cram-ware", as you call it. Even modern netbooks have MORE than enough power to handle these animations and transitions ..."
- All this may be true, but there are some (I am one) who avoid this kind of thing because it is too intrusive and too much of a distraction. I am perfectly happy with graphics where relevant and useful, but in my view trying to work in the middle of an advanced light-show will only detract from the real enjoyment of computer programming.
Apart from that, I think it is deeply unfair to compare X to a broken tool. To take you up on the tool-analogy, you may prefer a sleek-looking electric drill with automatic cable roll-up, cool colour and some impressive graphics printed on the body, but if you want to drill a hole, all you need is a hand-cranked drill; and if you know how, you can normally do a much better job faster, because you have far better control over it.
X may be hand-cranked, but it is a very well-designed tool and there is nothing broken about it.
Posted Nov 5, 2010 8:57 UTC (Fri)
by marcH (subscriber, #57642)
[Link]
>X may be hand-cranked, but it is a very well-designed tool and there is nothing broken about it.
Some insiders do not agree: http://lwn.net/Articles/390389/
Posted Nov 5, 2010 8:57 UTC (Fri)
by codefisher (guest, #64993)
[Link] (19 responses)
Posted Nov 5, 2010 13:08 UTC (Fri)
by gmaxwell (guest, #30048)
[Link] (18 responses)
The way I see it if _I_ need animations to tell what the system has done then the system has already failed. The computer's behavior should consist mostly of deterministic direct responses to my actions so that I should almost never need help figuring out what it has done.
In the rare cases of doubt (such as a cat jumping on the keyboard) I should not have to fuddle out what happened from my memories of the computer's graphical interpretive dance, instead a log/history should be provided which I can reference whenever I need to.
There are a great many operations that a computer can conduct which have no intuitive mapping to an animation. We would weaken our computers to uselessness if we constrained their easily accessible abilities to those which could be represented accurately as dance. I could possibly memorize a long list of animations "When the screen vibrates up and down, the system has unmounted a disk and it can be removed." but part of the reason for having _language_ is that you don't need to memorize a unique symbol for every possible idea. Textual communication can provide a precise, searchable, learnable, archivable, and accurate information channel between the computer and the user. Language is a tool which is equally available to all applications, including GUI ones.
Much of the Unix environment already works this way; certainly the CLI itself does. But it seems that many desktop tools come from a different culture where they use things like focus-stealing animated popups with ideograms to inform the user about system activities. When users complain that sometimes the messages disappear, never to be recovered, before they had a chance to see them, the 'desktop' culture seems to think "let's make the animation slower and more intrusive!". If that kind of thing makes someone happy, good for them, but it isn't something that I want.
Posted Nov 5, 2010 15:51 UTC (Fri)
by drag (guest, #31333)
[Link] (16 responses)
Lolz over Lolz. :)
Don't you know that, you know, text scrolling is an ANIMATION?
The way you make it sound, it's like your computer is just something with a big red button on the front that you press and it says "DO WHAT I WANT" and then it plugs into your mind or something.
It's all about information feedback. There are lots of ways to provide information, and lots of different ways to receive it. If you want to live in a weird sort of Max Headroom-type universe where all that exists is just you and your PC, then that's an interesting idea, but I (and most people) want to interact with the real world.
This means things happening outside your control and interacting with you and your computer. GPS, temperatures, news feeds, message notifications, etc etc. All sorts of stuff is going on all the time. We want 'augmented reality', 'feedback', and that sort of thing. It's the dream to be able someday go 'Hello Computer' and have some sort of meaningful response.
Posted Nov 5, 2010 17:41 UTC (Fri)
by gmaxwell (guest, #30048)
[Link] (15 responses)
Please don't be silly. I think I made it amply clear that I don't expect to work with the computer without it communicating with me, but I want it to communicate with me only when required or requested, and I want it to use the high-bandwidth channel of _language_ for that communication, rather than what I characterized as "interpretative dance" or cave drawings, except when communicating things which are intuitively and obviously graphical or when doing so is most efficient.
I certainly do want to interact with the outside world, but I also want to be able to control that interaction: a small status indicator, or additional comments alongside output the computer is already providing. Among humans we generally consider it impolite to interrupt someone with something unless it's urgent or you know that it's something they want to know about. My computer is far too stupid to reliably know when something meets that criterion, so it ought to be especially cautious in its interruptions unless I tell it otherwise.
It's not like you really have a choice in the matter. In environments where the computer is constantly presenting the user with a barrage of focus-stealing choices, users quickly learn to simply confirm everything that comes before them. "Install this?" "Yes." "Remove this?" "Yes." "Transfer your bank account to Nigeria?" "Yes. er. SHIT!". My bandwidth is finite and I'd much rather spend it on the interactions I have initiated.
Not always, but experienced users USUALLY do. Why should I pay the price of an animation every time just to get a small benefit in a small minority of cases? Give me a session history, give me an undo, give me a "WTF just happened" button. These things would all be great. An animation? Without things like undo, an animation just lets you know how screwed you are a bit faster. Without a history/WTF button, an infrequently encountered animation is likely to be inscrutable.
The kinds of events which are likely to confuse me are also unlikely to be representable by an animation. I'm not going to be confused by accidentally dropping a file in the wrong directory; I'm going to be confused by something like a glob having unintended consequences.
Posted Nov 5, 2010 18:20 UTC (Fri)
by drag (guest, #31333)
[Link] (14 responses)
Well you can avoid that just by using software that does not suck.
The only time I want my attention to be stolen from what I am working on is if it's something damn important. Then in those cases I WANT my attention to be stolen.
But really, nobody is advocating that we should have constant big swooping animations that do nothing but get between you and whatever text box you happen to be interacting with at the time.
And these animations don't really cost you anything. If you think that having a translucent notification box pop up to tell you you've received an email is going to take away from whatever you're doing, you're probably very wrong.
Or at least you should be wrong.
We have had hardware since the late 1990s that is perfectly capable of performing the functions needed for what people are trying to get at with things like Unity, GNOME 3, etc. Apple's first OS X desktop ran with no GPU acceleration at all!
It's just that graphics suck on Linux. This is what _may_ get fixed if we can break away from the tyranny of proprietary video drivers and everything-we-use-must-be-X.
Posted Nov 5, 2010 18:44 UTC (Fri)
by gmaxwell (guest, #30048)
[Link] (6 responses)
I'm sure that some other people, perhaps most other people, are completely fine with that sort of thing. I wish you luck in creating software for those people to use. Though I am somewhat skeptical that most people actually prefer this sort of thing: outside of computers, almost nothing else provides indications in that kind of intrusive, "interrupt-driven" way. (My car doesn't overlay a gigantic oilcan on my windshield when the oil pressure is low; it lights up a discreet check-engine light, and I can attach an OBD tool to find out the cause. When my office paper-mailbox has a letter, it's left sticking out where I can see it; no one copies the letter onto a transparency and slams it in my face.)
But even if I really am in the crazy minority here, please don't think that you speak for everyone. You certainly don't speak for me, nor for at least a few other curmudgeons like me, and I've been using computers long enough to have a pretty good idea what works for me. That kind of annoyance isn't how I work; it isn't what I want. I put up with this kind of behavior from my computer only insofar as putting up with it is less costly to my time and endurance than maintaining every part of my systems on my own. But as far as I'm concerned it's a step down from a system that provides no notification at all.
Posted Nov 6, 2010 16:08 UTC (Sat)
by andreashappe (subscriber, #4810)
[Link] (5 responses)
Time to cut back on the hyperbole.
The X protocol is currently getting in the way more than it is helping -- don't take my word for it; Keith Packard's should be enough. There are people trying to improve that: look at the quality of the X stack; it is a long way ahead of the things I had to use in the last millennium.
Animations might be added. So what? Scrolling is an animation; tear-free window movement was made possible through that animation work. Who suffered from that? Wayland might be the way forward, and there's still X11 as a possible client on top of it.
If you don't like them: turn them off. Come on, that would have taken less time than the whining on this forum. So you don't like those transparent popups that disappear after 2-3 seconds and hide some 8 cm^2 in the top right corner of your screen where most of your work seems to happen: disable them. You are able to disable them; some no-clue first-time Linux user surely ain't able to enable them. If you (for some reason) need to reinstall your Linux distribution every year, automate it. Create a package that does all the magic for you -- other people might even like to use it.
Posted Nov 6, 2010 20:19 UTC (Sat)
by gmaxwell (guest, #30048)
[Link] (4 responses)
I run a distribution in order to outsource basic system maintenance. I have more pressing things to do with my time, and I'm willing to tolerate the consequences of system operation that I don't agree with, but that doesn't mean that I don't have preferences. I'm speaking up here because I believe it would be a disservice to everyone who shares my interests for me to sit quietly while the people pushing features harmful to those interests are so vocal.
You make it sound like it's so easy to disable these things. Sadly, it usually is not: in the interest of "usability" the mere option to disable them is often eliminated entirely, or, if it remains at all, it is deeply hidden (often inside some undiscoverable registry tool). Just because I am more capable than joe-random doesn't mean my time is less valuable, that I am more patient, or that I am infinitely capable. In cases where the functionality is eliminated, patching the software breaks updates and leaves me tracking development, which is the work I was hoping to avoid by using a distribution in the first place.
Going back to the subject that started this sub-thread: if network transparency is abandoned in the GNU/Linux desktop infrastructure, I can't simply turn a knob to bring it back! Remote X is functionality I use _every day_. I have three windows open on my laptop right now to a system with a large amount of RAM which is able to work on data sets that I can't reasonably work on locally. It works great. And the notion of it only working via shims or with arcane software which I have to maintain myself troubles me greatly.
I'm certainly not opposed to _performance improvements_. By all means, making it faster has my full support. The discussion here was about tossing functionality (which I find critical) in order to enable performance improvements which are mostly inconsequential to me. I am not comforted by the argument that this change is urgently needed to enable improvements like increasingly intrusive animations.
Janne, I must admit that I'm not quite sure if you're trolling me or not but if you are I guess I'm going to fall for it.
Your market-share strawman is not well supported by the evidence. Systems with clearly superior user experiences have time and time again failed to capture really significant market share (Mac OS for the longest time, BeOS, etc.; even today Mac OS is only at perhaps 7%).
You're also making the erroneous assumption that I care about having 7% market share (like OS X) versus 2% market share(numbers source). I don't. I care about having a usable _computer_ (as opposed to a home entertainment center, which has largely orthogonal usability requirements). I care about having a good option to recommend to other technical people. I care about not having to build my own desktop software stack; even though I could probably create one which met my needs, I have other things to work on. While I'd love to see most people running Free software, 7% wouldn't be much of an improvement against the 85% on Windows for that purpose... even if I believed that we could close the market-share gap with UI improvements.
People use computers for different purposes. Even windows has a small market share if I count televisions and video game systems as "computers". I wonder if we're using 'desktop' market share numbers which are diluted by a great many use cases which would be better served by an appliance? If I were to care about market share I'd want to first care about getting 100% of uses which are best met by powerful computing systems rather than by media players or the like.
I am, and I am not alone. I want a system which is useful for me to run. I also want other people to have systems which are useful for them, even if their needs differ from mine. I feel that none of the major distributions are catering to my interests; I think that's unfortunate, and I hope it changes. The major distributions and major Linux desktop software suites are clearly prioritizing non-technical novice users today. They even say so explicitly. They may actually be failing to satisfy the needs of those users too, but failing to make your target happy isn't the same as having a target which includes other people.
Perhaps animations can play a useful role in a typical user's "next twenty years", but the animations that do probably won't be the same training-wheels animations that you'll create if you're optimizing for the initial impression. I found the example about minimizing pretty humorous. Why would I want that? If I care, it's because I either don't know what I did, or because I wish I hadn't done it. In either case what I need is an undo button, not an animation. An animation might make it a little easier to manually undo my mistake, but that's really a half-step... We have computers to eliminate manual processes. How many significant usability improvements are we missing because everyone focused on usability is primarily focused on newbies and the initial impression?
Posted Nov 6, 2010 21:22 UTC (Sat)
by dlang (guest, #313)
[Link] (3 responses)
it's really hard to put a couple hundred gig of ram into a laptop, but trivial to remote the display from a server that has a couple hundred gig of ram to a laptop that you can carry into a conference room.
you may try to argue that the app could be written to work that way through other means, but that misses the point that with X the app author doesn't have to decide whether the app should be network accessible or not. if app authors have to go to extra effort to make their stuff network accessible, most of them won't go to that effort (after all, nobody needs that anyway; that's why the feature was removed from linux systems to start with, right?) and the only apps that will have the ability to be networked are the ancient X apps (that predate the change), or high-end 'enterprise' commercial apps where someone is paying for the feature.
this leaves out the huge middle ground where the app author never thought about the need to be networked, but that app ends up being the perfect thing to use when backed by the right hardware. Instead someone will have to fork or recreate the app in a networked version.
Posted Nov 7, 2010 9:31 UTC (Sun)
by roc (subscriber, #30627)
[Link] (2 responses)
Maybe true for simple apps, but complex apps are basically unusable over modest-latency links unless they've been significantly optimized to reduce round-trips to the X server. There are a lot of X APIs that you simply cannot use if you want to be fast over the network.
> this leaves out the huge middle ground where the app author never thought about the need to be networked, but that app ends up being the perfect thing to use when backed by the right hardware.
Or just run it under a modern screen-remoting tool.
Posted Nov 9, 2010 2:02 UTC (Tue)
by nix (subscriber, #2304)
[Link] (1 responses)
Posted Nov 9, 2010 6:40 UTC (Tue)
by dlang (guest, #313)
[Link]
Posted Nov 5, 2010 19:30 UTC (Fri)
by dskoll (subscriber, #1630)
[Link] (6 responses)
And these animations don't cost you anything really. If you think that having a translucent notification box pop up to tell you received a email is going to take away from whatever your doing your [sic] probably very wrong.
Are you kidding me? Those notification boxes drive me crazy. There I am, deep in an xterm or an emacs debugging session and some stupid box obscures my text? I want the computer to stay out of my face!
Posted Nov 5, 2010 20:06 UTC (Fri)
by drag (guest, #31333)
[Link] (2 responses)
For example, at my current job some of the emails I get are critical and far more important than anything I happen to be working on, unless I am working on an emergency... at which point I would have 3 phone lines blazing, people talking over everybody else, etc. Then a little pop-up in the corner of my window is going to be the last thing on my mind.
In my old job I couldn't care less. There was no communication that mattered enough to be answered right away.
But now I WANT to see that stuff. I WANT to be interrupted. That's a good thing. Because if I get a notification and act fast enough I can stop those above mentioned emergencies. :)
Posted Nov 6, 2010 10:25 UTC (Sat)
by modernjazz (guest, #4185)
[Link]
The problem is that there just hasn't been enough effort put into open-source drivers until recently, and the quest for "bling" has really ramped up those efforts, just as the commodity/gaming market increased the power and decreased the price of computing for both "serious" and "fluffy" use cases.
So I'm happy about where things have been going, even though it has made X a pain in the neck for the last couple of years. (Fortunately, it seems to be getting better, at least for me.) But I would bemoan the loss of network transparency in situations where I didn't need the absolute highest-performance graphics.
Posted Nov 11, 2010 17:13 UTC (Thu)
by cdmiller (guest, #2813)
[Link]
Posted Nov 6, 2010 22:31 UTC (Sat)
by Wol (subscriber, #4433)
[Link] (2 responses)
Perfect example. KDE. I have a taskbar at the bottom of my screen that currently says "Konsole (2)" - ie I have two Konsoles (currently hidden). Let's say I put my mouse over it - it now displays what those two consoles are. All fine and dandy - UNTIL I actually want to select the upper of the two.
If I don't know which one I want, or I'm slightly hesitant, or I'm not good at moving my mouse, or or or ... the mouse hovers over the FIRST konsole description a tiny moment too long, and the information popup appears, COMPLETELY obscures the second Konsole button that I actually want, and JUST WON'T GO AWAY until I go back to "Konsole (2)", get rid of the whole damn lot, and have to start ALL OVER AGAIN.
Don't forget - these information popup bars have a habit of following the mouse. In other words, if you're slightly unsteady, or can't aim quite right, or anything else where the mouse is wobbly, there's a damn good chance the popup is going to pick a damn inconvenient place to appear.
Quite why the KDE people chose the place they did for the popup I'm moaning about I do not know - it is INCREDIBLY stupid, but hey, I'm sure they have some very clever people who thought it was a good idea ... :-)
Cheers,
Posted Nov 6, 2010 22:43 UTC (Sat)
by dlang (guest, #313)
[Link]
personally, I choose to have it never group, and I set the taskbar tall enough to show enough rows to have a usable amount of text in each of the icons.
Posted Nov 7, 2010 1:32 UTC (Sun)
by boog (subscriber, #30882)
[Link]
Posted Nov 5, 2010 19:30 UTC (Fri)
by rgmoore (✭ supporter ✭, #75)
[Link]
That's only likely to be true for UI limited tasks. I expect my computer to be doing many things at once. Some of those tasks are things that don't and can't happen instantly because they require substantial processing or data retrieval time. Many of them are background tasks that are running without requiring my explicit instructions every time. I want feedback about what's happening with those tasks, and some kind of unobtrusive desktop effect can be a better way of providing it than yet another message window popping up.
And that's just for desktop notification type effects. There are other useful things you can do with graphics. For example, I find that I wind up with overlapping windows fairly regularly, even though I have a very large monitor with multiple virtual desktops. I like some of the eye-candy effects that are used to help out with that problem- translucent window borders, temporarily reconfiguring the desktop so I can see miniature versions of all windows, etc. Those kinds of effects may not be vital, but they make the system easier to use- which should be the single biggest goal of development.
Posted Nov 5, 2010 12:28 UTC (Fri)
by pboddie (guest, #50784)
[Link] (1 responses)
What a flawed analogy! It would be more appropriate to liken a plain, functional desktop to a working screwdriver and a fancy, animation-heavy desktop to a screwdriver with a diamond-encrusted gold handle with a fluffy comfort grip. The former is sufficient to get the job done whereas the latter looks great if you want to give a demo.
Posted Nov 5, 2010 21:58 UTC (Fri)
by Kit (guest, #55925)
[Link]
Your version only further illustrates the point I was making. _BADLY_ done transitions and animations are FAR easier to notice than _well_ done ones, because _well_ done ones you only really notice subconsciously, while badly done ones demand your attention.
Well done transitions must be VERY quick, and completely smooth. They must be over far faster than a person could actually react to them, because they're only supposed to provide a hint at what's going on. People who are used to a system operating a specific way might not like it, because people fear change... but to a user who has to learn both systems, the one where transitions and animations are used well will be far easier to learn. And then, after using the system for a while, once they're used to how it operates, the one with the transitions will _continue_ to be the more pleasant one to use.
Animations and transitions can transform an interface from feeling like a computer, to feeling like an actual physical thing, operating under the normal physical properties.
---
There are other non-animation/transition effects that the system can use to improve usability, such as applying an effect to the windows of an application that appears to be frozen, or very subtly dimming background windows (but it needs to be subtle enough that you wouldn't notice, likely even if someone told you it was happening). Humans notice far more than they're consciously aware of; interfaces should take advantage of that.
Posted Nov 5, 2010 8:08 UTC (Fri)
by laf0rge (subscriber, #6469)
[Link] (4 responses)
And as for making 'the Linux desktop' attractive to end users in an office: I could not care less, personally. I am interested in making technology work for those people who have an interest in technology and want to understand it -- people who use it so much that adapting the human being to the computer results in much more productivity than trying to adapt the computer to the human being.
Posted Nov 5, 2010 8:40 UTC (Fri)
by pheldens (guest, #19366)
[Link] (3 responses)
Posted Nov 5, 2010 9:11 UTC (Fri)
by Karellen (subscriber, #67644)
[Link] (2 responses)
WTF?
Since the Gnome 1.0 release in March '99, there has been *one* incompatible change, when 2.0 was released, in June 2002. Since then, no incompatible changes in over 8 years. Gnome 3 is (currently) scheduled for March 2011, and there's no reason to think that it will last for any less time than Gnome 2 did.
Similarly, KDE 1.0 was in July '98, 2.0 in October 2000, 3.0 in April 2002, and 4.0 in January 2008. While 1 and 2 were fairly short-lived, 3 was a lot more mature, lasting for 6 years; with a lot of the technologies in 4 still being built upon in newer minor releases nearly 3 years in, I predict that 4 will be a longer-lasting base than 3 was.
And, of course, apps written for Gnome 1, and KDE 1, 2 & 3 should all still run fine on any current and future Gnome/KDE desktops. You don't have to rewrite your KDE3 app to KDE4 technology if you don't want to. You can keep developing it against the old libs for as long as you want.
Posted Nov 5, 2010 12:51 UTC (Fri)
by BeS (guest, #43108)
[Link]
Reading this blog post about the plans for Gtk4, I have the feeling that Gtk3 and GNOME3 will have a rather short life:
Posted Nov 8, 2010 18:19 UTC (Mon)
by jond (subscriber, #37669)
[Link]
Posted Nov 5, 2010 9:31 UTC (Fri)
by nhippi (subscriber, #34640)
[Link]
Back to seriousness: is there really much point in investing heavily in desktop window management? Most users end up switching mostly between browser tabs. Just look at how many people skip Evolution or Thunderbird and just use Gmail nowadays. Developers and HC users will still have some xterms open too, but they will probably use something tiling to manage them (like now).
Posted Nov 5, 2010 19:24 UTC (Fri)
by Simetrical (guest, #53439)
[Link] (15 responses)
Posted Nov 5, 2010 20:42 UTC (Fri)
by gmaxwell (guest, #30048)
[Link] (14 responses)
(1) We have GPU acceleration already. Perhaps it could be made to work better. I'm all for that. I protest the idea that we must toss the very useful network transparency just to get the small incremental improvement that might come in cases where network transparency isn't being used.
(2) While I'll concede the power-savings bit, I don't actually believe that we need GPU acceleration. A fairly typical PC can compute and blit out 1080p video at 60 FPS without any "GPU acceleration" at all. The system is already far beyond my reaction time, at least until someone adds a bunch of fancy animations.
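A back-of-the-envelope check of the blit claim above (my arithmetic, not the commenter's; the 32-bit pixel format is an assumption):

```python
# Raw memory bandwidth needed to blit uncompressed 1080p at 60 fps
# on the CPU alone. Assumes a 32-bit (4-byte) pixel format; no
# compression, no GPU involvement.

width, height = 1920, 1080
bytes_per_pixel = 4
fps = 60

frame_bytes = width * height * bytes_per_pixel   # one full frame
throughput = frame_bytes * fps                   # bytes per second

print(frame_bytes)                # 8294400 bytes, ~7.9 MiB per frame
print(throughput / (1024 ** 2))   # ~474.6 MiB/s sustained
```

That is well within the memcpy bandwidth of an ordinary 2010-era desktop, which is the commenter's point: decoding the video is the expensive part, not pushing the pixels.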
Posted Nov 6, 2010 16:17 UTC (Sat)
by andreashappe (subscriber, #4810)
[Link]
Yeah, but the CPU utilization drop from 40% to 4-5% was kinda nice... do you buy CPUs just to have them swamped by tasks they are not suited to?
Posted Nov 6, 2010 16:26 UTC (Sat)
by Darkmere (subscriber, #53695)
[Link] (6 responses)
This may be because the current Linux-based software stack makes it impossible to deal with this properly, but that doesn't really make a difference to the end user.
Posted Nov 6, 2010 19:16 UTC (Sat)
by gmaxwell (guest, #30048)
[Link] (2 responses)
Posted Nov 6, 2010 19:43 UTC (Sat)
by Darkmere (subscriber, #53695)
[Link]
Posted Nov 9, 2010 15:36 UTC (Tue)
by nye (subscriber, #51576)
[Link]
Posted Nov 6, 2010 19:58 UTC (Sat)
by sfeam (subscriber, #2841)
[Link]
Posted Nov 8, 2010 8:26 UTC (Mon)
by buchanmilne (guest, #42315)
[Link] (1 responses)
While my Atom-based netbook can handle some 720p content, it struggles with other content. However, my Atom-based HTPC, which has an ION GPU and runs XBMC on Linux, plays almost all 1080p H.264/5.1 content (with sound going through PulseAudio) without going over 20% CPU utilisation. The difference is that the Nvidia ION chipset has VDPAU with H.264 decoding support, whereas the cheap Intel chipset on my Atom-based netbook doesn't.
I don't think the problem is the rest of the software stack, it's probably that your GPU doesn't have accelerated decoding, or the driver doesn't support it.
Posted Nov 8, 2010 12:47 UTC (Mon)
by Darkmere (subscriber, #53695)
[Link]
Simply the amount of data to shuffle is enough to choke certain machines, and will remain so for quite a while.
(Especially with "slow" graphics memory/shared memory setups which remain common on laptops of the cheaper inclination)
Posted Nov 7, 2010 17:55 UTC (Sun)
by Simetrical (guest, #53439)
[Link] (5 responses)
Well, we aren't, are we? You can still run X on top of Wayland, and Wayland will presumably support other types of networked desktops; every OS does, after all. Why do you think Wayland will wind up being less nice to use over the network than X in the end? I've found even NX to be almost unusably slow at even 50 ms latency, between uptown and downtown Manhattan. Regular old X forwarding didn't even work at that latency, practically speaking (it took minutes just to draw the window).
Posted Nov 7, 2010 18:57 UTC (Sun)
by dlang (guest, #313)
[Link] (4 responses)
different people have different tolerances for the effects of latency. while you consider 50ms unusable, other people have been reasonably happy with X over dialup (~300ms latency)
Posted Nov 7, 2010 23:23 UTC (Sun)
by Simetrical (guest, #53439)
[Link] (3 responses)
The lag I saw in X forwarding is not a question of individual tolerance. When I tried regular X forwarding with Chromium, it took minutes just to draw the window once at startup. It was not usable as an interactive application by any stretch of the word.
With NX, it was usable, but with a lag of a couple of seconds on everything I did. This should not be necessary -- it should take exactly one round trip for my mouse click to get to the other computer and all changes to get back. NX was taking dozens of times that. We live in an era of growing bandwidth and stubbornly constant latency; the X way of doing things no longer makes sense. Pushing around bitmaps is a much better strategy, and will become ever more so with time, as network connections get faster and latency remains constant.
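The arithmetic behind that complaint is simple; this toy calculation (my illustration, not a measurement) shows how perceived lag scales with the number of serialized round trips at the 50 ms RTT described above:

```python
# Toy model: perceived lag for a chatty protocol is roughly
# (serialized round trips) x (round-trip time). Extra bandwidth
# doesn't help; only cutting round trips does.

def lag_ms(round_trips: int, rtt_ms: int) -> int:
    """Total wait before the display settles after an interaction."""
    return round_trips * rtt_ms

RTT = 50  # ms, e.g. between uptown and downtown Manhattan

print(lag_ms(1, RTT))    # 50 ms: the ideal single-round-trip click
print(lag_ms(40, RTT))   # 2000 ms: "dozens" of round trips, ~2 s lag
```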
Unless I'm missing something, which is entirely possible, since I have only the vaguest idea of how anything related to graphics works. In that case, corrections appreciated. :)
Posted Nov 8, 2010 2:49 UTC (Mon)
by dlang (guest, #313)
[Link] (2 responses)
Posted Nov 11, 2010 18:03 UTC (Thu)
by Quazatron (guest, #4368)
[Link] (1 responses)
Posted Nov 11, 2010 18:41 UTC (Thu)
by gmaxwell (guest, #30048)
[Link]
Over slower links VNC will stay usable (if slow) while X becomes useless.
There are various x protocol compressing proxies available for these situations, but I haven't had cause to use them for years. Networks got faster.
Posted Nov 8, 2010 8:10 UTC (Mon)
by nix (subscriber, #2304)
[Link]
Posted Nov 5, 2010 9:06 UTC (Fri)
by modernjazz (guest, #4185)
[Link] (4 responses)
So, are the concerns that Wayland will break network transparency merely "short-term," meaning that the technology is feasible but simply not in place yet? Or are they "long-term," meaning the OpenGL path is basically not amenable to network transparency?
Posted Nov 5, 2010 10:07 UTC (Fri)
by dgm (subscriber, #49227)
[Link] (2 responses)
Posted Nov 5, 2010 22:05 UTC (Fri)
by Kit (guest, #55925)
[Link]
For a normal desktop environment, that shouldn't be a huge deal. The problem is, desktop apps seem a bit hell-bent on making GPUs' lives as difficult as possible, and will frequently do the least efficient thing possible when it comes to drawing. Toolkits/applications are too insistent on throwing everything away and redoing it all from scratch, instead of reusing what has _already_ been uploaded to the GPU, when that is what the application wanted to render anyway.
Posted Nov 6, 2010 10:35 UTC (Sat)
by modernjazz (guest, #4185)
[Link]
3D textures are of course worse, but one could restrict use of them to cases that are strictly necessary so as not to kill network performance.
Posted Nov 8, 2010 18:37 UTC (Mon)
by daniel (guest, #3181)
[Link]
Posted Nov 5, 2010 9:59 UTC (Fri)
by dgm (subscriber, #49227)
[Link]
What's more, this way you don't force applications where network transparency makes no sense -like video players and games- through the X protocol. Probably the desktop itself is a good example of such an application that's better run close to the display hardware.
In the end, what you will get is a (much) simpler X server, focused on the network protocol and independent of the hardware. Good stuff, if you ask me.
Posted Nov 5, 2010 20:07 UTC (Fri)
by jonas.bonn (subscriber, #47561)
[Link]
Posted Nov 5, 2010 2:23 UTC (Fri)
by ringerc (subscriber, #3071)
[Link]
I've been using Ubuntu at work to serve remote X thin clients over LTSP, and overall it works fairly well. I'm not overly attached to X, though; the network "transparency" model falls down on the fact that network round trips add latency, so you have to test your toolkits/apps with network X to find problems. Additionally, shared memory is not available over the network, so anything that uses xshm needs to be able to fall back to protocol requests. Ditto DRI. In other words, network X is not truly transparent. I've had to hunt down several bugs that only affect apps run over network X, including one particularly bad one in Evolution's tooltip handling that made the compose window take *minutes* to appear.
I just hope Wayland can present something a bit better than a plain frame buffer to clients, so it's possible to implement a smart (RDP or ICA-like) network client not just a dumb frame buffer client like VNC.
Posted Nov 5, 2010 2:37 UTC (Fri)
by martinfick (subscriber, #4455)
[Link] (26 responses)
Posted Nov 5, 2010 3:00 UTC (Fri)
by bjacob (guest, #58566)
[Link] (23 responses)
* in the network-transparent X desktop, the graphics are (at least partially) done *server-side*. So we're killing ourselves doing roundtrips between the client and the server (so GUI snappiness is hurt by network latency), and we don't scale as we tax the poor server too much.
* in modern web apps, the graphics are done on the *client side* (in the browser, in JS). No round trips, and newer web standards (canvas! WebGL!) allow web apps to do client-side the same graphics operations that a local application could do, with WebGL even giving fairly direct access to the GPU.
Posted Nov 5, 2010 3:14 UTC (Fri)
by martinfick (subscriber, #4455)
[Link] (22 responses)
Posted Nov 5, 2010 3:33 UTC (Fri)
by bjacob (guest, #58566)
[Link] (9 responses)
Then, about your point that at least the network transparency of X should come for free when the server is local --- no idea; I'll let others reply here. But what's the point of a network protocol if it's going to be used less and less with remote servers?
Posted Nov 5, 2010 11:47 UTC (Fri)
by sorpigal (guest, #36106)
[Link] (8 responses)
Posted Nov 5, 2010 12:58 UTC (Fri)
by ibukanov (subscriber, #3942)
[Link] (7 responses)
Over the network, VNC has been working better for me than X, especially on high-latency links. That tells me that, from a practical point of view, X alone does not provide the right answer.
Posted Nov 5, 2010 15:23 UTC (Fri)
by drag (guest, #31333)
[Link]
VNC is not the only game in town, of course. X Windows networking is, indeed, very very cool. But it's been a very long time since it had any sort of monopoly over remote applications.
Windows users have been enjoying Windows-apps-over-internet for many many years now.
Does anybody have a good idea how many people use 'Go to My PC'? It's a huge number, they all do it over the Internet, and it works far better, and far more easily, than X Windows does.
Posted Nov 5, 2010 15:54 UTC (Fri)
by deepfire (guest, #26138)
[Link] (5 responses)
As I've already said below, if you want an apples to apples comparison see Nomachine's NX. As I said, I use it daily, and my experience is extremely positive.
And yes, it's open source.
Posted Nov 6, 2010 22:17 UTC (Sat)
by ceswiedler (guest, #24638)
[Link] (4 responses)
NX is excellent and I highly recommend it for remote X access, even on a local network, since it provides session restoration and "just works". From what I understand, it compresses extremely well due to the nature of the X protocol, since it can see what actually needs to be sent to the client. A VNC or RDP server, by comparison, has only the final rendered product.
Posted Nov 7, 2010 11:24 UTC (Sun)
by deepfire (guest, #26138)
[Link] (3 responses)
The sources for the core transport libraries are all there.
The missing stuff is the end-user application code, which they make money from.
Posted Nov 7, 2010 20:41 UTC (Sun)
by dtlin (subscriber, #36537)
[Link] (2 responses)
http://www.nomachine.com/redesigned-core.php
Posted Nov 7, 2010 21:27 UTC (Sun)
by rahulsundaram (subscriber, #21946)
[Link] (1 responses)
Posted Nov 8, 2010 17:12 UTC (Mon)
by dtlin (subscriber, #36537)
[Link]
neatx is a wrapper for the 3.x NX core libraries, much like NoMachine's nxserver.
It does not support the NX 4.0 protocol, and never will, because nobody is working on it anymore and the libraries are not open.
Posted Nov 5, 2010 3:42 UTC (Fri)
by davide.del.vento (guest, #59196)
[Link] (8 responses)
Posted Nov 5, 2010 4:17 UTC (Fri)
by dlang (guest, #313)
[Link] (2 responses)
Over high-latency links X performs poorly because it serializes everything, and so you have a huge number of round trips. But since these are very standard messages that have the same answer for all applications, most of this data can be cached and replied to locally, eliminating the network latency. There's still the message-passing and parsing latency, and most of these messages could be combined to save that while still keeping the network transparency in place.
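That caching idea can be sketched abstractly. Everything below is illustrative (there is no such proxy API in X), but it shows why identical, application-independent request/reply pairs need cross the link only once:

```python
# Hypothetical sketch of a client-side proxy that memoizes replies to
# idempotent, application-independent requests (think atom lookups),
# so repeated queries never cross the high-latency link.

class CachingProxy:
    def __init__(self, remote_round_trip):
        self.remote = remote_round_trip   # does the real network round trip
        self.cache = {}                   # request -> cached reply
        self.trips = 0                    # count of actual round trips

    def request(self, req):
        if req in self.cache:
            return self.cache[req]        # answered locally, no latency
        self.trips += 1
        reply = self.remote(req)
        self.cache[req] = reply
        return reply

# Ten applications all interning the same three atoms (values made up):
atoms = {"WM_NAME": 39, "WM_CLASS": 67, "UTF8_STRING": 290}
proxy = CachingProxy(lambda req: atoms[req])
for _ in range(10):
    for name in atoms:
        proxy.request(name)
print(proxy.trips)  # 3 round trips instead of 30
```

The same trick is roughly what NX's "nxproxy" layer does in a far more sophisticated form.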
Posted Nov 5, 2010 13:30 UTC (Fri)
by rgoates (guest, #3280)
[Link] (1 responses)
Posted Nov 6, 2010 3:42 UTC (Sat)
by mfedyk (guest, #55303)
[Link]
please explain why nx hasn't become the wire protocol for x...
Posted Nov 5, 2010 7:32 UTC (Fri)
by kevinm (guest, #69913)
[Link]
> if a user click a button, that info must go to the server to decide what that button does, right? and latency is there to kill us...
Not necessarily. We can solve the problem in the same way that web applications do: provide a lightweight VM on the UI side, and allow the application to push small chunks of bytecode down to the UI to tell it how to respond to things like button-pushes.
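A toy sketch of that model, with a plain Python callable standing in for the pushed bytecode (a real system would run it in a sandboxed VM; all names here are hypothetical):

```python
# The application ("server") ships a small handler down to the UI side
# once; subsequent button presses are handled locally with no round trip.

class RemoteUI:
    def __init__(self):
        self.handlers = {}
        self.round_trips = 0

    def install_handler(self, widget, code):
        self.round_trips += 1          # one-time cost to push the handler
        self.handlers[widget] = code

    def click(self, widget, state):
        return self.handlers[widget](state)   # runs locally, zero latency

ui = RemoteUI()
ui.install_handler("plus_button", lambda s: {**s, "count": s["count"] + 1})
state = {"count": 0}
for _ in range(100):
    state = ui.click("plus_button", state)
print(state["count"], ui.round_trips)  # 100 clicks, only 1 round trip
```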
Posted Nov 5, 2010 16:03 UTC (Fri)
by deepfire (guest, #26138)
[Link] (3 responses)
So, there are these Nomachine people from Italy, and they seem to have done some pretty good open-source work on optimising the hell out of the X protocol implementation.
At least I use NX daily, my experience is very positive and by now I consider it indispensable.
Posted Nov 5, 2010 19:10 UTC (Fri)
by boog (subscriber, #30882)
[Link] (2 responses)
Posted Nov 10, 2010 9:56 UTC (Wed)
by nix (subscriber, #2304)
[Link]
Posted Nov 11, 2010 5:10 UTC (Thu)
by njs (subscriber, #40338)
[Link]
Disclaimer: I wrote it, so any show-stopper bugs affecting me *would* be fixed now, wouldn't they ;-).
Posted Nov 5, 2010 4:41 UTC (Fri)
by jwb (guest, #15467)
[Link] (2 responses)
Another famous one of course was Display PostScript.
Posted Nov 5, 2010 13:14 UTC (Fri)
by vonbrand (subscriber, #4458)
[Link]
Oh, you mean that junk that came with Suns, and made me compile plain X for them on arrival because anything using the display was unbearably slow?
Posted Nov 12, 2010 2:54 UTC (Fri)
by jmorris42 (guest, #2203)
[Link]
And of course it lives. It started at NeXT, evolved into Display PDF, and is still around in the end product of NeXTSTEP, now known as Apple's OS X. If they can separate the display rendering from the application, I really don't understand why this argument is even taking place. If it is possible, the argument should focus on identifying the limitations in X that prevent a similarly performing system, and how they might be corrected.
Posted Nov 5, 2010 13:16 UTC (Fri)
by alankila (guest, #47141)
[Link] (1 responses)
The JavaScript RPC model is actually pretty interesting. The web is evolving to allow the programmer the freedom to select a suitable boundary between client and server. Sometimes you just want to dump data frames from the server as a stream (similar to VNC in a browser; it's doable); sometimes you run almost the entire application on the client and only send data to the server so that it can save the work, which really occurred almost completely on the client. In fact, in the most extreme design the server just feeds the original UI files and afterwards no more interaction with the server occurs.
Your comment also seems to be missing the point that Linux webapps still do run over X. The browser is an X client. It worries me that you seem to jumble everything together here.
Posted Nov 5, 2010 15:53 UTC (Fri)
by lacostej (guest, #2760)
[Link]
and buffered on the client side !
Posted Nov 5, 2010 5:52 UTC (Fri)
by PO8 (guest, #41661)
[Link] (20 responses)
Remoting a window manager, in particular, hasn't been a terribly useful option for a long time.
Posted Nov 5, 2010 7:13 UTC (Fri)
by jzbiciak (guest, #5246)
[Link] (16 responses)
I'm actually quite OK with this, since it plays well into advances we've made (fast CPUs == fast compression, and we have ever increasing bandwidth), and does a better job of tolerating the one major bit that hasn't advanced much: round-trip latency.
I have an X application I sometimes need to run remotely over a VPN link over VDSL. I have gobs of bandwidth, but the RTT sucks. The app is barely usable. In contrast, Windows Remote Desktop and VNC both work just fine over the same link. Both of the latter seem to be more of the "dumb bitmap plus compression" school of thought, and that seems to work pretty well with modern setups.
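Rough, illustrative arithmetic (not measurements of any particular app) shows why the bitmap school wins on such a link:

```python
# Illustrative figures only: a 30 ms RTT VPN over a 10 Mbit/s VDSL line.
rtt = 0.030            # round-trip time in seconds
bandwidth = 10e6 / 8   # link speed in bytes per second

# A chatty client doing 200 serialized round trips of ~32 bytes each:
chatty_time = 200 * rtt + 200 * 32 / bandwidth

# A "dumb bitmap" update: one round trip pushing a 500 kB compressed frame:
bitmap_time = 1 * rtt + 500e3 / bandwidth

print(f"chatty: {chatty_time:.2f} s")   # ~6 s, dominated by latency
print(f"bitmap: {bitmap_time:.2f} s")   # ~0.43 s, dominated by bandwidth
```

The bandwidth term is negligible for the chatty client; it is the serialized round trips that make the app "barely usable", which is exactly the part of the network that is not improving.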
Posted Nov 5, 2010 7:34 UTC (Fri)
by dlang (guest, #313)
[Link] (7 responses)
Posted Nov 5, 2010 12:50 UTC (Fri)
by i3839 (guest, #31386)
[Link] (5 responses)
The main advantage of Wayland is that it simplifies the whole graphics stack enormously. It uses DRI2/KMS, just like X does, so it doesn't give extra possibilities or a magic performance increase.
Posted Nov 5, 2010 15:19 UTC (Fri)
by drag (guest, #31333)
[Link] (4 responses)
Wayland is much simpler because it depends on a modern graphics stack. It'll be faster than X, though, simply because it has much less overhead and a cleaner implementation. It won't be magical, of course. Only modest improvements. It'll probably be better in terms of battery life....
There is also no reason why you need to give up X Windows to use Wayland. I use X Windows just fine in Microsoft Windows. Also lots of people use X Windows just fine in OS X. Given that Wayland is a naturally composited interface, a Wayland-specific X server that draws to off-screen buffers will allow natural integration and backwards compatibility with current applications.
Not that there is a Wayland DDX yet, like the MS Windows DDX and XQuartz DDX, but it's certainly going to be a requirement. It's one of those things that will have to be made before Wayland is usable.
Applications that use Wayland will immediately be able to benefit from being 'native wayland', but X apps won't get left out in the cold.
Posted Nov 6, 2010 4:22 UTC (Sat)
by rqosa (subscriber, #24136)
[Link] (3 responses)
> Applications that use Wayland will immediately be able to benefit from being 'native wayland'
These native Wayland apps use DRI2 to draw to offscreen buffers, right? So, isn't it true that there's no reason why X clients couldn't also be made to use DRI2 to draw to offscreen buffers? And in that case, there's no significant speed penalty to using X (because the X clients are then doing direct rendering without needing to go through the X server, as in AIGLX), right? And if Wayland doesn't have a speed advantage over X, then what is its advantage?
Posted Nov 6, 2010 6:04 UTC (Sat)
by PO8 (guest, #41661)
[Link] (2 responses)
Posted Nov 8, 2010 16:14 UTC (Mon)
by renox (guest, #23785)
[Link] (1 responses)
Posted Nov 9, 2010 0:19 UTC (Tue)
by alankila (guest, #47141)
[Link]
Posted Nov 5, 2010 17:50 UTC (Fri)
by jzbiciak (guest, #5246)
[Link]
You just lose the shared-memory efficiency when you hand over the results, since the client's video card and the compositor's video card are in two entirely different boxes.
Posted Nov 5, 2010 9:07 UTC (Fri)
by marcH (subscriber, #57642)
[Link] (3 responses)
Err... are you seriously expecting network latency to become better? You know that it depends on the speed of light, right?
I know it could be better than Ethernet *on the LAN* but I doubt that X protocols are soooo chatty they would feel any difference.
> I have an X application I sometimes need to run remotely over a VPN link over VDSL.
DSL offers notoriously bad round-trip times (20-30ms) because of the massive amount of Forward Error Correction. You should either look for an ISP that allows you to tune your FEC (as explained here: http://www.dslreports.com/faq/2182), or for an entirely different and better access technology like DOCSIS. Maybe even 3G has better latency than DSL. Does anyone know?
Posted Nov 5, 2010 15:22 UTC (Fri)
by centenary (guest, #71028)
[Link] (2 responses)
I think you're misreading his point. His point *is* the fact that network latencies won't improve (which you're also saying here). Since network latencies won't improve, bitmap-oriented protocols have an advantage since X-forwarding performs poorly under network latency.
Posted Nov 5, 2010 17:58 UTC (Fri)
by jzbiciak (guest, #5246)
[Link] (1 responses)
I didn't call out physics as the cause. I figured the speed of light should be pretty obvious in this crowd. :-) As for using DSL vs. something else: At least I'm not using satellite. *shudder*
In the end it wasn't a technical decision on my part anyway: The state of broadband being what it is around here, my shopping experience for a provider amounted to telling the sales person "I run servers", and seeing what happened. The cable guys told me "have a nice day," whereas the DSL guys asked "with or without static IP?" It may've changed since then, but does it really matter? I'm now spiraling way off topic.
Posted Nov 7, 2010 15:52 UTC (Sun)
by marcH (subscriber, #57642)
[Link]
With your shopping experience, yes, maybe... on the other hand, the latencies of broadband technologies are quite relevant.
Posted Nov 5, 2010 15:48 UTC (Fri)
by deepfire (guest, #26138)
[Link] (3 responses)
Well, this is somewhat of a strawman. The fact that X is not implemented particularly efficiently does not mean that X is not fundamentally better than dumb protocols.
If you want to compare apples to apples, see NX, the heavily-optimised implementation of the X protocol. I use it daily, and FWIW it beats the crap out of the competition.
Posted Nov 6, 2010 6:20 UTC (Sat)
by butlerm (subscriber, #13312)
[Link] (2 responses)
That said, I sure hope someone is looking at a mid-layer API that can be adapted to virtually any combination of user interface toolkit and display communication protocol without having to reduce everything to a bitmap first.
Why should nearly any kind of application be programmed to an API that is designed to be non-remotable? That is the highway to balkanization. What we need is a generic mid-layer API that can reasonably support both scenarios without the historical infelicities of X, so you do not have to re-port your entire application just because you want to run it remotely on occasion, preferably without feeling you are watching satellite tv during a snowstorm.
Posted Nov 6, 2010 9:37 UTC (Sat)
by quotemstr (subscriber, #45331)
[Link]
Posted Nov 6, 2010 13:09 UTC (Sat)
by deepfire (guest, #26138)
[Link]
Posted Nov 8, 2010 8:16 UTC (Mon)
by nix (subscriber, #2304)
[Link] (2 responses)
(oh, wait.)
Posted Nov 12, 2010 3:07 UTC (Fri)
by jmorris42 (guest, #2203)
[Link] (1 responses)
Yea, that is just an extra helping of fail to go along with ditching network transparency. So not only are we supposed to be happy tossing "The Network is the Computer, the Computer is the Network" we also lose "Mechanism not policy" along with it.
After both of those are gone, might as well just buy a Mac and be done with it.
Ya know, one of the attractions of Free Software for me was the hope for freedom from being abandoned by a vendor. But with the lemming-like action of the distributions chasing "The Year of Linux on the Desktop" it looks like we (we the *NIX-loving folk who were the early adopters) are about to be abandoned. Thankfully we will at least have the option to fall back to a distribution preserving the *NIX way.... even if we have to fork it off from an existing one and maintain it. Right up until Firefox and Chromium go Wayland and drop X support, then things might get messy.
Posted Nov 14, 2010 21:07 UTC (Sun)
by nix (subscriber, #2304)
[Link]
Posted Nov 5, 2010 8:59 UTC (Fri)
by rqosa (subscriber, #24136)
[Link] (4 responses)
What about doing this:
Look at mod_wsgi...
The data is in fact usually kept in SQL storage, and there are no explicit security tags that the database server could check on behalf of the application: thus all data is available to every query, at least in principle.
There are good ways to fix that problem, namely "virtual private databases". You can implement them in any database that has update-able views that can filter on session variables.
I have an application that sets the database session state to match the application session when handling each page request. Until that state is set, all the "tables" return zero rows. After it is set, all the virtual tables contain only the rows the user is allowed to have access to, only those rows can be updated, and the application can only insert rows into the same range. Near perfect isolation. Any kind of attack can only affect the data of the logged in user.
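The pattern above can be sketched with SQLite. SQLite lacks real session variables, so this hypothetical `begin_session` helper bakes the session user into a per-connection temp table that a view filters on; databases with genuine session state (e.g. PostgreSQL row security, Oracle VPD) do this more directly.

```python
import sqlite3

# Shared table with rows belonging to different users.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE notes (owner TEXT, body TEXT)")
db.executemany("INSERT INTO notes VALUES (?, ?)",
               [("alice", "a1"), ("alice", "a2"), ("bob", "b1")])

def begin_session(conn, user):
    # Per-connection session state: the temp table holds the session's
    # user, and the view exposes only that user's rows to every query.
    conn.execute("CREATE TEMP TABLE IF NOT EXISTS session (user TEXT)")
    conn.execute("DELETE FROM session")
    conn.execute("INSERT INTO session VALUES (?)", (user,))
    conn.execute("CREATE TEMP VIEW IF NOT EXISTS my_notes AS "
                 "SELECT body FROM notes "
                 "WHERE owner = (SELECT user FROM session)")

begin_session(db, "alice")
rows = [r[0] for r in db.execute("SELECT body FROM my_notes")]
print(rows)  # ['a1', 'a2'] -- bob's row is invisible to this session
```

Any query or injection attack that only sees `my_notes` is confined to the logged-in user's data, which is the isolation property described above.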
ssh -L 5902:localhost:5901
> and network transparency are important for me.
I would say instead that these systems are so fast that they ought to have no time in which to display the bling. Every action should occur so fast that unless the animation is slowing it down you wouldn't be able to perceive the animation. If there are still cycles left over the system should be conserving battery (if it's battery powered) or pre-calculating possible next moves on my part (if it's not).
The value of animations
But people don't always know which button does what....
If the window simply vanished, users can be left confused as to what happened.
Don't you know that, you know, text scrolling is an ANIMATION?
The way it makes it sound is like your computer is just something with a big red button on the front that you press and it says "DO WHAT I WANT" and then it plugs into your mind or something.
Well, duh. But people don't always know which button does what. If the UI can guide them with animations and such, that's only a good thing.
To quote my first message on this thread:
Perhaps a distro fork will arise targeting people who are technically competent [and are] more interested in productivity.
Posted Nov 6, 2010 10:22 UTC (Sat) by Janne (guest, #40891)
With attitude like this, it's no wonder that Linux on the desktop is perpetually stuck at under 1% market-share...
People are not computer-wizards.
The computer should do everything in its power to help the user.
It seems to me that the people carrying the biggest "help people" banner often do the most harm. I too want the computer to help people, even non-technical people. I suspect we have very different ideas of what "help" means. I can assure you that adding more popups and interface interrupting animations will not help _me_ in the slightest. Other folks, perhaps, but I don't intend to speak for anyone else.
And sure, people will learn which button does what. [...] And there are even studies about this. Researchers set up two functionally identical systems. The difference was that one system looked plain and basic, while the other had nice graphics [...]. It was found that people were more productive on the system that looked better.
If you provided citations I would read them. But what concerns me with this is that it seems like an unhealthy obsession with the initial impression. Your first few hours with a system are entirely different than your next twenty years with it. Unless you are worried about every last fraction of a percent of market share I believe you should optimize as much for the 'next twenty years' as is possible without turning people off completely. (For example, I think Blender fails a bit too hard on the initial impression)
> be network accessable or not.
> thing to use when backed by the right hardware. Instead someone will have
> to fork or recreate the app in a networked version.
Wol
KDE obscuring tool tips
The computer's behavior should consist mostly of deterministic direct responses to my actions so that I should almost never need help figuring out what it has done.
Having a tool be capable of what it's intended to do is important, but would you really want to use an oily screw driver with no grip? Or a hammer where the metal on the handle is flaking?
>functional desktop to a working screwdriver and a fancy, animation-heavy
>desktop to a screwdriver with a diamond-encrusted gold handle with a fluffy
>comfort grip. The former is sufficient to get the job done whereas the
>latter looks great if you want to give a demo.
>Gnome 3 is (currently) scheduled for March 2011, and there's no reason to think that it will last for any less time than Gnome 2 did.
Why do you imply GPU acceleration is only useful for "dancing candy cram-ware"? It's useful for mundane things like video playback, not to mention games. Plus for saving power, or so I've been told. Even browsers are all becoming GPU-accelerated these days. GPUs make things faster, it's bad interface design that makes things slower.
Network displays
The irony
I strongly disagree. X does network transparency at the right level, or at least a much better level than any other current system.
* Microsoft RDP
* Redhat Spice
The new core of NX 4.0 is made up of a set of libraries written from the ground up to ensure portability, flexibility and state-of-the art performance. NX 4.0 core libraries will not be made open source. Although NX 3.x core compression technology and earlier versions will remain GPL, NoMachine engineers will not be developing the code further.
Nevertheless, running remote X apps over the network can be painful. I speak from experience: one day we had a dozen laptops, connecting wirelessly to a single router, running interactive X sessions on a system in the basement. It sucked.
I'm sure we could improve the X protocol and make it much faster, but I am not sure we can improve it enough (e.g. if a user clicks a button, that info must go to the server to decide what that button does, right? and latency is there to kill us...)
Now, completely killing network transparency doesn't solve the problem either, or does it? Besides, installing some exotic "high performance" stuff on our "basement machine" would be almost impossible.... Or is Wayland transparent on the "basement machine", requiring only stuff on the laptops?
NX and others
Legacy X and network transparency
So this 'simplicity' isn't very convincing: yes, Wayland itself is simple, but as it's not a complete solution, the result won't be simple!
I'm not convinced that Wayland has any real benefit over doing things that way. This page seems to be saying that Wayland's benefit over X is that, in Wayland, the compositing window manager decides what is the receiving window and window-local coordinate for each pointer event (because it knows the location/size/shape/transformation of every window, whereas the X server doesn't have that information when using a compositing window manager). But, is it really all that difficult to make it possible for the X server to delegate this responsibility to the window manager?
Posted Nov 5, 2010 10:36 UTC (Fri)
by rqosa (subscriber, #24136)
[Link]
> delegate this responsibility to the window manager
Maybe that would increase input latency too much. Here's another possibility: have an X server that doesn't need a window manager, because it can do window management itself. (For example, the "XQuartz" X server in Mac OS X is like this.) Maybe even have a plugin API in the server for server-side "window managers".
Posted Nov 5, 2010 12:24 UTC (Fri)
by bluebugs (guest, #71022)
[Link] (2 responses)
The main issue is that today's graphics toolkits are mostly badly designed; they are only just starting to think about implementing a 2D rendering pipeline the way games have been doing for years. If toolkits moved away from direct rendering mode and instead implemented a canvas that tracks objects, things would really look better.
There is so much room for improvement in both the Qt and GTK toolkits that thinking switching to yet another window server will be the answer is a waste of resources. With a good software pipeline, framebuffer vs. X means only a 5% slowdown. That's just nothing compared to the rest of the rendering stack.
Hopefully Qt is starting to get it right with its move to QML and scene-graph rendering. But there is still a lot to do before it really uses your hardware at the maximum possible rendering speed. Sadly I have not seen such a project in GTK land. If you want to see code that does what I am describing, look at the Enlightenment Foundation Libraries.
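The retained-mode idea described above can be caricatured in a few lines: a toy canvas (all names hypothetical, not EFL or QML API) that tracks objects and accumulates damage rectangles, so a repaint touches only what changed:

```python
# Toy retained-mode canvas: the toolkit owns the scene and tracks damage,
# instead of applications immediately redrawing everything themselves.

class Canvas:
    def __init__(self):
        self.objects = {}     # object id -> (x, y, w, h)
        self.damage = []      # rectangles needing repaint

    def add(self, oid, rect):
        self.objects[oid] = rect
        self.damage.append(rect)

    def move(self, oid, rect):
        self.damage.append(self.objects[oid])  # old position is dirty
        self.objects[oid] = rect
        self.damage.append(rect)               # so is the new one

    def render(self):
        painted, self.damage = self.damage, []
        return painted        # a real canvas would repaint just these rects

c = Canvas()
c.add("button", (0, 0, 100, 30))
c.render()                    # initial paint
c.move("button", (0, 40, 100, 30))
print(c.render())             # only two small rects repainted, not the screen
```

Because the canvas knows every object, it can also batch, cull, and reorder drawing, which is where most of the "5% slowdown" headroom comes from.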
Posted Nov 5, 2010 14:14 UTC (Fri)
by rqosa (subscriber, #24136)
[Link] (1 responses)
> Just for the record, OpenGL is not always the best way to do 2D graphics.
Yeah, although I said "each OpenGL-using X client", it doesn't have to be OpenGL. The important point is that the clients talk directly to the hardware (with the kernel as an intermediary), and the hardware renders offscreen. (Last time I posted about X here, I mentioned OpenVG for 2D graphics, only to be told that OpenVG will never gain adoption…)
Posted Nov 5, 2010 15:36 UTC (Fri)
by drag (guest, #31333)
[Link]
What you're looking for is Gallium.
Gallium is a modern graphics stack where you have a single driver that supports multiple graphics APIs. Instead of having to have separate drivers for OpenGL vs. X Render, you just have application libs that tie into Gallium 'state trackers'.
Gallium3D has a set of state trackers. The state trackers keep track of what each application is requesting, and the Gallium driver combines those requests into GPU instruction primitives that end up getting piped to the graphics card. Want to support a new GPU-level API? Then make a new tracker. The trackers should be mostly hardware-agnostic, so they work just as well regardless of what you're using for acceleration... it could be a video card driver, LLVM, or some sort of weird future GPGPU. A write-once, compile-everywhere type thing.
There really is no such thing as 'OpenGL acceleration' on video cards anymore. That's long dead, as in before-the-ATI-R200 dead. Instead what we have are all-software stacks that are compiled to run on both CPU and GPU, using whichever gives the best performance or is most available. It's complicated, but it works, and the GPU is now a native part of the PC architecture that can be taken advantage of much like a math coprocessor from the 486DX days. The GPU is just another processor.
That way instead of forcing everybody to write new APIs by abstracting on top of OpenGL or whatever else is lower, the application developers can just use whatever is best for their purposes and the driver takes care of it for them. This is the way things should be. This is something that Wayland is going to need to have underneath it in order to be better.
Currently working Gallium drivers exist for Nvidia and AMD/ATI video hardware, but it's all a HEAVY work in progress. Only very recently have they gotten OpenGL working on real hardware.
Posted Nov 5, 2010 9:01 UTC (Fri)
by AlexHudson (guest, #41828)
[Link] (2 responses)
The network transparency of X has been criticised in places like Slashdot for years, yet I always understood that this criticism was basically mis-placed: that the protocol itself doesn't particularly impinge on the efficiency of the server at all, and that the problem is more that the basic X protocol was essentially static for years.
We've finally got an X.org project that is kicking ass, modernising the code and the protocols, simplifying and chopping out the crufty bits, but which is still terribly short on hackers. Splitting the stack in two - X.org for compatibility, Wayland for new hawtness - just seems mad.
This is exactly the kind of thing that ought to get cross-distro agreement. If X.org isn't up to the job long-term, then let's everyone set out a roadmap of where we need to go. Let's not have Xgl/AIGLX all over again.
Posted Nov 5, 2010 12:19 UTC (Fri)
by sorpigal (guest, #36106)
[Link] (1 responses)
A new system designed to do what X does (the good parts) in a way which is incompatible with X and builds on what we've learned from X in the last 20 years, especially about what not to do, would be welcome. If the new system is just replicating a small piece of what X does and ignoring the rest then it isn't a replacement for X.
Is it time for X12 or a Z? Probably. But, please, let's not step backwards while we're at it.
Posted Nov 5, 2010 16:09 UTC (Fri)
by drag (guest, #31333)
[Link]
But nobody is.
People are, however, working on Wayland....
Posted Nov 5, 2010 11:15 UTC (Fri)
by emk (subscriber, #1128)
[Link] (5 responses)
Posted Nov 5, 2010 21:22 UTC (Fri)
by ballombe (subscriber, #9523)
[Link] (3 responses)
Posted Nov 5, 2010 22:35 UTC (Fri)
by foom (subscriber, #14868)
[Link] (2 responses)
Posted Nov 6, 2010 16:34 UTC (Sat)
by rleigh (guest, #14622)
[Link] (1 responses)
My main beef with the console (of all flavours) is the appallingly bad (simplified) VT100 emulation and terrible Unicode and font support. With KMS, a decent userspace emulator using fontconfig/freetype and OpenGL/Gallium should be able to do an absolutely amazing job compared with the limited kernel emulator.
Posted Nov 7, 2010 21:05 UTC (Sun)
by dtlin (subscriber, #36537)
[Link]
bogl-bterm, jfbterm, and fbterm are existing userspace terminal emulators running on the Linux framebuffer, with improved Unicode/font support over the built-in console.
Posted Nov 6, 2010 16:33 UTC (Sat)
by andreashappe (subscriber, #4810)
[Link]
From what I remember of Ion's drawing code, adding an additional drawing engine shouldn't be too hard.
Posted Nov 5, 2010 16:20 UTC (Fri)
by deepfire (guest, #26138)
[Link]
It saddens me to see people judging the idea by its most popular implementation.
My experience with NX has been all-around positive (well, except minor glitches), and I consider it indispensable in my daily work.
Please, please, before bashing X /the technology/, make sure you don't actually mean Xorg /the implementation/.
Anyway, for me it raises another question -- why are the core people behind Xorg so awfully quiet about NX?
Posted Nov 5, 2010 22:58 UTC (Fri)
by hschildt (guest, #71034)
[Link] (8 responses)
Like some Linux people, you think like a bureaucrat who just wants to get along with your community.
After reading X.org/Gnome blogs it appears Linux is not an open community, but actually a closed one.
Ubuntu is correct in moving forward past the blurry fonts of X.org and the retarded Gnome desktop.
Maybe Ubuntu can get Linux past its 1 percent market share, because it appears to be stuck?
Posted Nov 6, 2010 10:55 UTC (Sat)
by GhePeU (subscriber, #56133)
[Link] (7 responses)
Posted Nov 6, 2010 19:39 UTC (Sat)
by alankila (guest, #47141)
[Link] (6 responses)
To explain it in the simplest terms, a diagonal like \ or / requires drawing partially covered pixels to eliminate the jaggedness of monochromatic rendering. Assuming a white background and black foreground, Linux people seem to think that 50% coverage of black within a white pixel should result in color 0x80, because that is a 50% blend between 0x00 and 0xff. This is, however, wrong, because of gamma. To render a physical intensity that is close to 50% of the brightness of white, you actually want to use color value 0xb4. 0x80 is simply way too dark.
I am demoing this problem on this page with the green text on purple background: http://bel.fi/~alankila/lcd/
The reason nobody does this right is that to X, and pretty much everyone else, text is just an ARGB pixmap. Libraries like cairo get the text from freetype (which is not the faulty party) in linear alpha space and turn it into a linear ARGB bitmap with subpixel positioning. However, this bitmap eventually ends up on an sRGB surface, and the intermediate colors are destroyed, giving the color fringing and darkening artifacts that I try to demonstrate as prominently as I am able.
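The arithmetic behind the comment can be checked directly. The sketch below assumes the simple power-law gamma of 2.0 implied by the 0xb4 figure above (exact sRGB piecewise encoding would give roughly 0xbc instead):

```python
# Blending 50% black over white naively in device (sRGB-ish) space gives
# 0x80; blending in linear light and then re-encoding gives a much
# lighter pixel. GAMMA = 2.0 matches the commenter's 0xb4 figure.

GAMMA = 2.0

def to_linear(v):          # 0..255 device value -> 0..1 linear light
    return (v / 255.0) ** GAMMA

def to_device(lin):        # 0..1 linear light -> 0..255 device value
    return round(255.0 * lin ** (1.0 / GAMMA))

def blend(fg, bg, alpha):  # correct: blend in linear-light space
    return to_device(alpha * to_linear(fg) + (1 - alpha) * to_linear(bg))

naive = round(0.5 * 0x00 + 0.5 * 0xff)   # blend device values directly
correct = blend(0x00, 0xff, 0.5)         # blend in linear light

print(hex(naive), hex(correct))  # 0x80 vs 0xb4: the naive pixel is too dark
```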
Posted Nov 6, 2010 20:25 UTC (Sat)
by quotemstr (subscriber, #45331)
[Link] (2 responses)
Posted Nov 7, 2010 14:09 UTC (Sun)
by alankila (guest, #47141)
[Link]
Posted Jun 8, 2012 7:59 UTC (Fri)
by alankila (guest, #47141)
[Link]
Posted Nov 7, 2010 3:52 UTC (Sun)
by jwb (guest, #15467)
[Link] (2 responses)
Posted Nov 7, 2010 10:07 UTC (Sun)
by HelloWorld (guest, #56129)
[Link] (1 responses)
Posted Nov 7, 2010 14:05 UTC (Sun)
by alankila (guest, #47141)
[Link]
Hopefully they aren't chewing too large a piece. At this point I could live with almost any kind of unspeakable kludge, rather than wait 10 years for the perfect solution.
Posted Nov 8, 2010 5:17 UTC (Mon)
by gmatht (subscriber, #58961)
[Link] (1 responses)
Posted Nov 17, 2010 17:10 UTC (Wed)
by renox (guest, #23785)
[Link]
Wayland itself, yes, as it does less than an X server. But if you implement network transparency in the toolkits, for example, then since there can be several toolkits, the duplication of code increases the total probability of failure.
For point 2: perhaps, but AFAIK the issue with restarting X is that current clients don't reopen their connection to X after it fails, so in theory this isn't related; in practice it could be, if Wayland clients are written with this possibility in mind.
Posted Nov 11, 2010 18:19 UTC (Thu)
by cdmiller (guest, #2813)
[Link] (2 responses)
Q. Who uses network transparency where I work?
A. UNIX and Linux systems administrators, and some students and faculty.
Q. Who advocates for Linux on the desktop?
A. UNIX and Linux systems administrators, and some students and faculty.
Q. Who else uses Linux on the desktop?
A. Spouses, family, and friends of the Linux on the desktop advocates.
Q. Who will the removal of X "original values" piss off?
A. The first line advocates of Linux desktops.
Duh.
"it's *possible* to get amazing results with X, but it's extremely hard, and isn't going to get easier"
It's painful to get an amazing desktop and somewhat hard to get a nicely working easily manageable desktop, but I don't agree it can't get easier under X.
Prediction: Big win for Xubuntu or similar in the academic arena.
Posted Nov 11, 2010 19:57 UTC (Thu)
by bronson (subscriber, #4806)
[Link] (1 responses)
Prediction: your Unix and Linux system administrators and some students and faculty happily start using Spice, VNC, or some other remote protocols (yes, they can do individual apps and windows just like X). Code size decreases significantly, development pace quickens, and only a few stuck-in-the-90s graybeards lament that the port 6k-over-ssh hack doesn't work anymore and all their hard-fought xauth knowledge is now obsolete.
I haven't actually used wayland yet so I'm just playing devil's advocate here. :)
Posted Nov 14, 2010 19:58 UTC (Sun)
by cdmiller (guest, #2813)
[Link]
Truly, we already run XFCE in our LTSP academic computer labs and kiosks, as the Ubuntu Gnome default was not easy to manage in that sort of environment. I can only imagine the pain and unhappiness of trying to manage an environment running a mix of X, Unity/Wayland, SPICE, and VNC apps all at once. It's pretty easy to imagine a big win for Xubuntu and similar. Inevitably, whichever Linux distribution admins find easier to manage is what will be installed, used, and advocated for.
Shuttleworth: Unity on Wayland
* Mesa OpenGL state tracker
* DirectX state tracker
* OpenVG state tracker
* X state tracker
* OpenCL state tracker
etc etc.
Please, people, before commenting on X performance try NX
Ubuntu is correct in moving forward past the blurry fonts of X.org and the retarded Gnome desktop.
What the hell has X.org to do with "blurry fonts"? The fonts are rendered by freetype, which is not part of X.org, and nobody is going to touch it. What are you going to blame X for next, the rain on weekend days while it was sunny all the rest of the week? Poverty and famine in Africa?
If the price for "getting past the 1 percent market share" is a new influx of fanboys who talk with arrogance about things they don't know and spout out the worst idiocies, even in a place where usually you could find interesting commentary, then I'd really prefer to stay with the current usage level, thank you very much.
More important to me than super-smooth graphics.
1) Would the simplicity of Wayland mean that it would be less likely to crash than an X-server?
2) Would it be possible to reset, for example, the graphics card without killing all Wayland clients (applications)? Roughly, I mean: could we do the equivalent of Ctrl-Alt-Backspace without losing all our open applications?
3) Would motion of the mouse pointer remain smooth, even when a couple of background tasks are performing heavy IO?
4) Would Wayland improve resource management? For example,
4a) a poorly written application can cause the X-server to allocate large amounts of memory. Would Wayland force or encourage that application to allocate that memory itself (making the culprit clear in top and to the OOM killer)?