LPC: Life after X
Posted Nov 6, 2010 18:51 UTC (Sat) by drag (guest, #31333)
In reply to: LPC: Life after X by dskoll
Parent article: LPC: Life after X
Sending a compressed texture of something like 1024x768 over most networks is not going to be a problem any more in a lot of cases.
A lossless 1280x800 PNG image is only something like 260.1KB, which will transfer over most internet connections in a fraction of a second. High-quality JPEG or WebP is even smaller and compresses much faster, with relatively few discernible image problems.
It's when you run into issues with applications that want something like an animated menu or whatever that takes 100 redraws to go from start to finish. When you're on a local machine something like that is just stupidly fast and it is irrelevant. When you're going over the internet, something that used to take 0.1 seconds now takes 5 due to all the time lost to latency from going back and forth: 'draw', 'finished draw', 'draw again', etc. etc.
When the fastest network most people had was 10 people sharing a single 10Mb/s ethernet on a single hub with all of them sharing the same collision domain... THEN that was when X networking was very troublesome in terms of Mb/s used.
Nowadays even common consumer internet connections are faster than that.
But when you have 128msec latency and it takes 2000 round trips between a server and a client to draw a new web page in your browser... THEN that is when you run into serious performance problems. It does not matter if you're sending tens of KBs of information or just 5 bytes each trip; it's going to create huge delays.
You're far better off just taking an image of a 1024x768 desktop at 15 FPS, sending it over the network, and working on some special protocol to relay input back. (I am not sure about SPICE or ICA, but I am pretty sure their technology is more sophisticated than just that.)
This is why people report VNC working better than X when it's obvious that in terms of actual bandwidth used X is often going to be better.
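To put rough numbers on the trade-off described above (a sketch; the latency, round-trip count, and frame size are the illustrative figures from this thread, not measurements):

```python
# Latency, not bandwidth, dominates a chatty protocol over the internet.
LATENCY_S = 0.128    # 128 ms per round trip, as above
ROUND_TRIPS = 2000   # the hypothetical chatty page draw

# Payload size barely matters: the waiting alone is enormous.
chatty_delay_s = ROUND_TRIPS * LATENCY_S
print(f"chatty protocol: {chatty_delay_s:.0f} s of pure round-trip waiting")

# Versus streaming the whole screen as compressed frames:
FRAME_KB = 260.1     # the lossless PNG figure above; JPEG/WebP would be far smaller
FPS = 15
stream_kb_per_s = FRAME_KB * FPS
print(f"frame stream: {stream_kb_per_s:.0f} KB/s, but zero per-draw round trips")
```

The point the arithmetic makes: 2000 round trips at 128 ms is over four minutes of waiting regardless of how few bytes each trip carries, while the "dumb" frame stream costs bandwidth but no latency per redraw.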
But it's not like VNC or X is even close to the state of the art. Both of them are obsolete with their own set of problems.
Seriously, check out:
http://www.gotomypc.com/remote_access/remote_access
While people have been arguing over the merits of being able to remotely access a single application over X vs. an entire desktop over VNC... the ability to remotely access your GUI over the internet has gone mainstream.
ANY PC, ANY Mac. Over your browser. Very simple to setup, relatively inexpensively, adequately secure, and good enough that the average customer can use it without pulling their hair out.
You can even do it on your iPhone or iPad....
Sure, I am not going to use it, and it's not suitable if you care about your security, but the networking aspect of X is far from unique or special anymore, and its performance in common situations is inadequate compared to contemporary solutions.
Posted Nov 6, 2010 18:55 UTC (Sat)
by drag (guest, #31333)
Many of these applications are, in fact, virtualized and are individually remoted to my desktop. The way it is done is completely transparent, and there is not a single non-technical user on a corporate Windows desktop who will be able to tell you which applications are remote and which ones are local. They will not even be able to tell you that they are using virtualized applications remotely at all.
The experience is completely integrated and there is no discernible way, in terms of performance or image quality, to tell the difference between local and remote apps.
Posted Nov 6, 2010 20:09 UTC (Sat)
by dskoll (subscriber, #1630)
> It's when you run into issues with applications that want something like an animated menu or whatever that takes 100 redraws to go from start to finish.
So it's just bad application design, then. Or if you really want fancy animations, you do as I suggested: Let the server do the fancy effects. The client sends a little bit of information specifying how to do the animation (start, number of steps, time increment, etc.) and the server handles it. Augmenting X or something similar to do that wouldn't be too hard.
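Such a request could be as small as a single message. A minimal sketch in Python (the message fields and the interpolation are hypothetical illustrations of the idea, not any real X extension):

```python
from dataclasses import dataclass

@dataclass
class AnimateRequest:
    """One-shot animation spec the client sends; the server does the rest."""
    window_id: int
    start: tuple       # initial (x, y) position
    end: tuple         # final (x, y) position
    steps: int         # number of redraws the server performs
    interval_ms: int   # time increment between steps

def frames(req):
    """Server-side linear interpolation: N redraws, zero extra round trips."""
    for i in range(req.steps + 1):
        t = i / req.steps
        yield (req.start[0] + t * (req.end[0] - req.start[0]),
               req.start[1] + t * (req.end[1] - req.start[1]))

req = AnimateRequest(window_id=0x2a, start=(0, 0), end=(100, 50),
                     steps=4, interval_ms=16)
print(list(frames(req)))  # five positions, all from a single client message
```

The design point: the client pays one round trip for the whole animation instead of one per redraw, which is exactly where the latency argument above bites.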
Somewhat off-topic: The contraction of "you are" is "you're", not "your". Sorry... just a pet peeve...
Posted Nov 7, 2010 4:39 UTC (Sun)
by drag (guest, #31333)
Sometimes. Or bad toolkit design, or maybe just a different theme that the user selected. And it's only bad design if you're trying to run your software over a high-latency link; otherwise it's probably quite sane.
> Or if you really want fancy animations, you do as I suggested: Let the server do the fancy effects. The client sends a little bit of information specifying how to do the animation (start, number of steps, time increment, etc.) and the server handles it.
Yeah. OpenGL works surprisingly well sometimes. When AIGLX was first supported on my video card I ran Wolfenstein (an improved Quake 3 engine) at 1024x768 over wireless. It worked very well and I got about 48-60 FPS.
It was playable for the most part, except the mouse lagged horribly. Keyboard input was fast and everything rendered fine otherwise.
Cairo and Clutter may help out quite a bit I suppose. I don't know for certain.
> Augmenting X or something similar to do that wouldn't be too hard.
It's called NoMachine NX. ;)
Posted Nov 7, 2010 7:14 UTC (Sun)
by mfedyk (guest, #55303)
That is because mouse processing takes so many round trips. To see this in action, run dstat in a terminal, then move your mouse in a slow, steady circle. Your context switches per second will go up by several hundred.
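You can check this without dstat, too; on Linux, the same context-switch figure dstat reports comes from the `ctxt` counter in `/proc/stat`. A rough sketch of sampling it yourself (the helper names are my own):

```python
import time

def parse_ctxt(stat_text):
    """Total context switches since boot, from /proc/stat's 'ctxt' line."""
    for line in stat_text.splitlines():
        if line.startswith("ctxt"):
            return int(line.split()[1])
    raise ValueError("no ctxt line found")

def ctxt_per_sec(interval=1.0):
    """Sample the counter twice; move the mouse during the sleep to see the jump."""
    def read():
        with open("/proc/stat") as f:
            return parse_ctxt(f.read())
    before = read()
    time.sleep(interval)
    return (read() - before) / interval

if __name__ == "__main__":
    print(f"{ctxt_per_sec():.0f} context switches/sec")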
Posted Nov 7, 2010 15:35 UTC (Sun)
by khim (subscriber, #9252)
Sure. No - because a client-side solution is even simpler to implement. No again. It's harder than just writing the loop in the program. Well - can you do it in such a way that your scheme is easier to use than the naive animation implementation? You cannot.

Applications are the sole justification for the fancy protocols, kernels and computers. Thus application writers dictate the rules. The only time you can impose restrictions is when they are caused by the laws of physics, because that is the only situation where all alternatives impose the same restrictions and application developers have no choice; otherwise the solution with the simplest usage from the coder's POV will always win.

Witness the fate of transputers versus modern CPUs: today we have 16-way SMP on the desktop, while transputers offered 32 or 64 ways twenty years ago. Why did we only switch to SMP on the desktop five years ago? Easy: before that it was possible to create more and more powerful UP machines. Only when UP machines hit the hard limit (the speed of light, essentially) did the direction change.

The same goes for animated menus today: application developers do, and will continue to, write applications that abuse the fast CPU<->GPU connection for as long as it works on the desktop. They will not change their designs to accommodate "network transparency", because their users mostly don't care. Thus all these ideas are not worth even talking about. If you can explain why/when they will stop working even on the desktop, then we can talk about a redesign.
Posted Nov 8, 2010 4:46 UTC (Mon)
by elanthis (guest, #6227)
The drawing primitives suck. You may not realize this; it's probably been a long time since you've seen or worked with a mainstream app that limits itself solely to the X drawing primitives.
We don't want that. Users don't want that. The whole world -- other than a teeny tiny little fraction of people so small in significance as to be entirely irrelevant -- wants pretty UIs. Pretty UIs are actually _more usable_, as tasteful and skillful application of that prettiness results in more easily comprehensible and digestible information display and user focus direction. Put plainly: that shit matters.
If you're really that interested in continuing to use the Xerox PARC UI innovations and nothing else, knock yourself out. Arguing today that the people designing the graphics framework that goes beyond what X is capable of should instead stick to basic rendering primitives is every last bit as stupid as an old-time radio host arguing that radio is the best medium ever, while the Internet has already started killing off TV, which already killed radio dead.
The other problem is that you seem to have no grasp of how modern rendering is done. When I say "the client" does the rendering, what I actually mean is that the client is programming a GPU to do the rendering. What you would end up wanting, to do things your way, is a complete implementation of OpenGL over the pipe (which GLX is NOT in any way). You also completely ignored the parts about using the GPU for more than just graphics, including input handling and other general computation that you absolutely do not want in the display server, at all, period.
Your notion of how the desktop should work is wrong, dated, and (thankfully) totally irrelevant as you're not the one making the development decisions.
Posted Nov 8, 2010 9:27 UTC (Mon)
by quotemstr (subscriber, #45331)
Posted Nov 8, 2010 16:44 UTC (Mon)
by dskoll (subscriber, #1630)
> Your so-called solution goes against X's core design and principles, though. X doesn't do fancy effects, or any effects. X is mechanism, not policy. X is just a dumb rendering and event pipeline, by design.
Yeah, so? Change that aspect of X rather than throwing out the whole thing. Eventually, the best way to do things would be to have toolkits like Qt and Gtk be loadable modules that get installed in the X server rather than in client applications. That could greatly reduce the number of network round-trips required and greatly mitigate the latency problem.
There are plenty of security concerns with this, of course. You wouldn't want to load Gtk or Qt into the X server unless it's running with your UID. But that's a much easier problem to solve and a much smoother transition to the future than throwing out X completely.
Posted Nov 9, 2010 15:23 UTC (Tue)
by khim (subscriber, #9252)
How come? You seem to assume that developers are using X and the only problem here is some shortcomings. Well, newsflash: no, they don't! Most applications today are written for Windows, PS3, Wii, iOS or Android. Not for X. Developers know toolkits (mostly GDI, but sometimes WPF or even Qt) and DirectX/OpenGL. They don't know X and they don't want to know X. This is a fact of life. That's why all these band-aids are doomed: they impose a burden on developers for negligible benefit. X is this thing down there which only exists to make our life miserable - this is the POV of many (most?) developers. That's why it must be removed. But why introduce this stupid layer at all? Give developers the means to run a client app which talks to the GPU and the server - and they'll decide how to split the work. This is how it works on Windows, Xbox 360 and PS3 - and it has certainly attracted significantly more developers than X redesigns ever could.
Posted Nov 17, 2010 7:11 UTC (Wed)
by mcrbids (guest, #6453)
As an application developer with over 10 years of experience, I can say with certainty that this is not true, at least, not for me.
I don't write applications for Windows, PS3, Linux, Android, or *any* of the platforms listed. I write for the web browser! I write complex, data-driven applications, and it's been a very, very long time since I wrote anything that wouldn't easily work on Win/Mac/Lin/Android/iPhone/Xbox and anything else with a reasonable browser.
The browser I most target is Firefox, since it seems to be the most cross-platform, although Chrome is close. I develop on Linux: it runs FF well, I don't worry about viruses and stuff like that, and I can offer excellent compatibility with all my clients.
I don't want to replace X - I get the best of all possible worlds by making the specific rendering requirements of my applications something handled by the context of the user. And I use network transparency all the time - I can run several Firefox instances concurrently, on the desktop, as different users, without any danger of cookie or session interaction between browsers. As a web-based, network application developer, this is so incredibly useful!
Posted Nov 8, 2010 16:48 UTC (Mon)
by dskoll (subscriber, #1630)
> Your notion of how the desktop should work is wrong, dated, and (thankfully) totally irrelevant as you're not the one making the development decisions.
That certainly deserves a *plonk*. How about trying to stay civil?