St. Pierre: The Linux Graphics Stack
Posted Jun 23, 2012 4:50 UTC (Sat) by smoogen (subscriber, #97)
My understanding since the early 1990s has been that the problem of large data blobs going over the network can't be fixed without fundamental rewrites of the X stack itself.
Posted Jun 23, 2012 12:22 UTC (Sat) by daniels (subscriber, #16193)
Posted Jun 24, 2012 13:56 UTC (Sun) by ebassi (subscriber, #54855)
replacing high quality rendering with, what? X drawing primitives? 1987 called, and it wants its graphics system back.
also, for added fun, try to convince people writing web rendering engines to ditch client-side rendering and compositing to implement the W3C spec, and just use X drawing primitives instead.
Posted Jun 24, 2012 21:43 UTC (Sun) by dlang (✭ supporter ✭, #313)
everywhere else the trend is to push more of the display workload towards the edge where there are lots of relatively powerful devices, but for some reason when you start talking about X/Wayland people talk about making the layer closest to the display dumber and doing more of the work on the other end.
Posted Jun 24, 2012 21:47 UTC (Sun) by daniels (subscriber, #16193)
Posted Jun 24, 2012 21:58 UTC (Sun) by dlang (✭ supporter ✭, #313)
In games, this is why cheats make it possible to see through walls: the game has pushed the rendering to the client, so if the client is modified, the player can get information that they could not get if the server only told the client what to display.
but when we start talking about remote desktops, suddenly we are told that it's obvious that the only thing to do is to ship bitmaps around, and that anything else has been proven to be too horrible to tolerate?
If this is true, why is it that the 'modern desktop' somehow has higher requirements than any game? Because the games all want to have the display (or the system the display runs on) do more of the work.
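The see-through-walls point is really about where visibility filtering happens. A server that only tells the client what it may display would do something like this hypothetical interest-management check (the function names and the simple distance rule are illustrative assumptions, not any real game's code):

```python
def within_range(a, b, limit=10.0):
    """Toy visibility rule: straight-line distance only. A real game
    would also test line of sight against level geometry."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 <= limit

def visible_entities(player_pos, entities, can_see):
    """Server-side interest management: filter the world state down to
    what this player is allowed to see before anything hits the wire.
    A modified client then simply never receives the hidden entities."""
    return [e for e in entities if can_see(player_pos, e["pos"])]
```

The point of doing this on the server is exactly dlang's: whatever never crosses the wire cannot be recovered by tampering with the client.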
Posted Jun 24, 2012 22:14 UTC (Sun) by daniels (subscriber, #16193)
Posted Jun 24, 2012 22:22 UTC (Sun) by dlang (✭ supporter ✭, #313)
exactly, which is why so many of us are saying that the Wayland approach of doing remoting at the buffer level is such a bad idea.
I don't think anyone is claiming that X11 is the perfect toolkit for doing remoting, but those of us who do use remoting routinely are concerned that the X replacement doesn't include any concept of remoting at the toolkit level.
Posted Jun 24, 2012 23:20 UTC (Sun) by daniels (subscriber, #16193)
It sounds like we agree that remoting should be done at the toolkit level. Your argument seems to be that Wayland should also be a toolkit, which I think is a terrible, terrible, terrible idea.
Posted Jun 24, 2012 23:55 UTC (Sun) by neilbrown (subscriber, #359)
There are lots of ways to do remoting, so Wayland doesn't provide any. However, any system that tries to add remoting can focus on just moving the images/events across the network, without worrying too much about how to put them on the display - Wayland does that.
With the Wayland approach, each of these can co-exist as equals, can be independently developed and can compete with each other for different market segments.
Wayland does seem to exclude the possibility of running the window manager over the network, but I cannot imagine anyone really wanting to do that. Even the thinnest of thin clients from 20+ years ago had the option of a local window manager.
Posted Jun 25, 2012 16:47 UTC (Mon) by nix (subscriber, #2304)
Posted Jun 25, 2012 0:32 UTC (Mon) by dlang (✭ supporter ✭, #313)
Wayland is claiming that it's the replacement for X11.
If this is true, then Wayland needs to be, or provide, the remoting toolkit. Or the Wayland developers need to be designing Wayland to interact with a separate remoting toolkit.
Instead, Wayland advocates are saying that remoting will be done in Wayland by shipping bitmaps around, similar to how VNC does it. They then go further and say that this is the only sane way to do remoting, and that, by the way, remoting is obsolete and not a desirable feature anyway.
Posted Jun 25, 2012 2:02 UTC (Mon) by Cyberax (✭ supporter ✭, #52523)
Currently X11 is a *failed* remoting toolkit. FTFY.
Posted Jun 25, 2012 8:57 UTC (Mon) by dlang (✭ supporter ✭, #313)
nobody is saying that it's perfect, not by any means; it's just better than anything else currently available (in part due to its large installed base and good backwards compatibility)
Posted Jun 25, 2012 14:25 UTC (Mon) by HelloWorld (guest, #56129)
> nobody is saying that it's perfect, not by any means, it's just better than anything else currently available
Uh, so what?
Posted Jun 25, 2012 2:35 UTC (Mon) by alankila (subscriber, #47141)
Technically the compositor gets a notification from the client about new pixels, so it is in a position to do something like break the updated region into 8x8 tiles and check whether they match previously remoted tiles, sending only a cache key instead of the data. This allows a cheap form of bandwidth reduction that should fit most of our GUIs well. The downside is that it forces the compositor to do pixel scraping to discover what has changed since last time, and that part IMHO may need help to reduce the CPU usage of the Weston server and to improve the efficiency of tile caching (if this is the way it will be done).
Maybe some hints should be provided, like "this region is video and will change every frame, just compress and send it" and "this area was scrolled by (x, y) pixels before updating, so you probably want to apply an offset in your tile-matcher". These should be viewed purely as performance optimizations to improve remoting for specific workloads, such as scrolling a viewport or watching video, both of which are liable to generate a tremendous amount of network traffic almost no matter what.
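A minimal sketch of the tile-caching idea described above, assuming a linear RGBA framebuffer; the 8x8 tile size, the truncated SHA-1 cache key, and the cache structure are all illustrative assumptions, not anything Weston actually implements:

```python
import hashlib

TILE = 8  # tile edge in pixels, as suggested in the comment above

def tile_updates(pixels, width, height, cache):
    """Split a framebuffer (bytes, 4 bytes per pixel) into tiles and
    return (x, y, payload) triples. The payload is the raw tile data
    the first time a tile's contents are seen, and a short 8-byte cache
    key on every later occurrence, which is what saves bandwidth."""
    updates = []
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            rows = []
            for y in range(ty, min(ty + TILE, height)):
                start = (y * width + tx) * 4
                rows.append(pixels[start:start + min(TILE, width - tx) * 4])
            data = b"".join(rows)
            key = hashlib.sha1(data).digest()[:8]
            if key in cache:
                updates.append((tx, ty, key))   # cheap: 8-byte reference
            else:
                cache[key] = True
                updates.append((tx, ty, data))  # full tile payload
    return updates
```

This is also where the CPU cost the comment worries about shows up: every update means hashing every touched tile, which is exactly the pixel scraping a hint protocol would help avoid.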
Posted Jun 26, 2012 4:32 UTC (Tue) by drag (subscriber, #31333)
For example, Spice is able to heuristically determine whether a portion of the screen is displaying video, and will then use MJPEG to transmit that region to the client.
I would expect that, at a minimum, if you expect Wayland itself to be network-transparent, you would transmit only the portions of the visual buffer that change. If a window's contents have not updated, there is no reason to send an update. If just some text or some portion of the window has updated, there is no reason to send the entire buffer again.
If the window portion is not visible, there is no reason to send an update. If you are just moving a window around, there is no reason to send an update.
All sorts of stuff like that.
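As a rough illustration of the send-only-what-changed idea, assuming a simple linear RGBA framebuffer (nothing here reflects the actual Wayland, Spice, or RDP wire formats), a sketch that diffs frames and ships only the damaged rows:

```python
def damaged_rows(old, new, width, bpp=4):
    """Diff two frames row by row and return (row_index, row_bytes)
    pairs for the rows that changed; only these need to cross the
    network. Passing old=None forces a full-frame update."""
    stride = width * bpp
    out = []
    for row in range(len(new) // stride):
        chunk = new[row * stride:(row + 1) * stride]
        if old is None or chunk != old[row * stride:(row + 1) * stride]:
            out.append((row, chunk))
    return out
```

Real protocols track damage rectangles reported by clients instead of diffing pixels, but the effect is the same: an idle window costs nothing, and a blinking cursor costs a few rows rather than a full buffer.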
A lot of this stuff is not theoretical. All you have to do is look at something like ICA or RDP to find something that is much more widely used and performs much better than X ever did.
Posted Jun 25, 2012 16:12 UTC (Mon) by nix (subscriber, #2304)
Posted Jun 24, 2012 21:56 UTC (Sun) by gioele (subscriber, #61675)
Anyway, I do not dislike the idea of having a fancy framebuffer API (Wayland) on top of which to place a more modern declarative graphical API (X12, let's say). The problem with that does not lie in Wayland, but in the fact that we are losing a declarative API to draw things on the screen (we actually lost it years ago, when we started using client-side fonts) in favour of toolkits like Qt and GTK+. Maybe an extended OpenVG will fill this void; maybe HTML will.
Posted Jun 24, 2012 22:02 UTC (Sun) by dlang (✭ supporter ✭, #313)
I see them as being completely the opposite: the equivalent of Wayland would be to stop having the browser do the rendering, and instead have the web server create the bitmaps and send them to the browser.
For Wayland, people are saying that you don't want the smarts at the display; you want to ship bitmaps to the display, with the other end (the server in web terms, the client in X terms) doing all the rendering.
Posted Jun 24, 2012 22:21 UTC (Sun) by raven667 (subscriber, #5198)
Posted Jun 24, 2012 22:45 UTC (Sun) by dlang (✭ supporter ✭, #313)
with local displays, the apps are doing less of the rendering themselves, and offloading more of the work to the video card.
with websites, the apps are doing less of the rendering on the server side and pushing more to the browsers.
with games, the central servers are doing less of the rendering, pushing more of the work to the clients.
with DisplayPort, even the link from the video card to the display is becoming more client-server like, pushing more smarts to the display rather than just providing a stream of pixels as everything before it has done.
everywhere you look, the trend is to segment out the work, doing less of it in what has been the core application and handing as much of the work as possible off to other components. Those components may be separate software processes, or they may be hardware components, but more of the work is being pushed out towards the user.
so I don't believe that remoting a desktop should be an exception to this trend, and I especially don't believe the statements that there is something inherently wrong with a smart protocol compared to shipping bitmaps around.
Posted Jun 25, 2012 8:27 UTC (Mon) by renox (subscriber, #23785)
That's not strictly true; there has been some talk of 'gaming in the cloud', where the servers do all the computation and send bitmaps to the display (like Wayland will do). Of course it doesn't really exist currently, because first one must solve bufferbloat, and even then it's not certain that the result will be good enough.
That said, I agree with you: shipping bitmaps around is OK for a LAN, but will probably suck on a WAN.
Posted Jun 25, 2012 9:52 UTC (Mon) by Jonno (subscriber, #49613)
> That said, I agree with you, shipping bitmap around is OK for a LAN but will probably suck on a WAN.
Well, it depends on the WAN in question: X11 (and even NX) is more latency-sensitive, while Xpra and Wayland are (or will be) more bandwidth-sensitive.
For example, over an 8 Mbps ADSL line (moderate latency) NX performs better than Xpra, but over a 6 Mbps HSDPA connection (high latency) Xpra performs better.
Essentially, as long as the bandwidth isn't saturated, Xpra (and in the future Wayland) will perform better (i.e. be more responsive); once the bandwidth is saturated, it depends on the complexity of the application. In most cases NX will perform better, but in some cases (i.e. when the total texture size >= the final bitmap size) Xpra/Wayland will.
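A back-of-envelope model makes the latency-versus-bandwidth tradeoff concrete. Assume one user interaction costs some number of protocol round trips plus a payload transfer; all the numbers below are invented for illustration, not measurements of NX or Xpra:

```python
def response_time(round_trips, payload_bytes, rtt_s, bandwidth_bps):
    """Rough model of one interaction: protocol chatter costs
    round_trips * RTT, and the drawing commands or pixels cost
    transfer time. A chatty drawing protocol (X11/NX) is dominated
    by the first term; a bitmap protocol (Xpra, remoted Wayland)
    by the second."""
    return round_trips * rtt_s + payload_bytes * 8 / bandwidth_bps

# Hypothetical workloads on an 8 Mbps link:
chatty = response_time(5, 10_000, 0.040, 8e6)    # NX-like, 40 ms RTT
bitmap = response_time(1, 500_000, 0.040, 8e6)   # Xpra-like, 40 ms RTT
```

With these made-up figures the chatty protocol wins at 40 ms RTT, but raising the RTT to 300 ms (a high-latency mobile link) flips the result, matching the NX-versus-Xpra pattern described above.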
Posted Jun 25, 2012 11:44 UTC (Mon) by cortana (subscriber, #24596)
Posted Jun 25, 2012 16:31 UTC (Mon) by Cato (subscriber, #7643)
I used OnLive for a while and it was quite usable, although the graphical quality was fairly poor on a 3-4 Mbps connection. Being able to just play a game for 30 minutes before deciding to buy it is really handy, and generally it's nice to be able to play a game on any (Windows) PC.
Game streaming is just a special case of remote desktop access, of course - OnLive also provides access to Windows desktops in the cloud, which would be useful for Windows-to-Linux migrations if it had a Linux client.
Posted Jun 25, 2012 17:35 UTC (Mon) by butlerm (subscriber, #13312)
Presumably, GTK and Qt could be changed to rely on XRender and any other necessary extensions when they are available, instead of pre-compositing everything themselves.
XRender seems like an adequate, modern alpha-compositing rendering API. If it isn't, it should be fixed. That, plus frame barriers to avoid tearing, should be enough to do it right.
Posted Jun 25, 2012 18:18 UTC (Mon) by ebassi (subscriber, #54855)
> Presumably, GTK and Qt could be changed to rely on XRender and any other necessary extensions when they are available, instead of pre-compositing everything themselves.
this is what happens already.
it turns out, though, that none of the modern GPUs (except older Intel ones) are optimized for XRender's use of trapezoids; as every modern GPU is, essentially, a programmable 3D pipeline, triangles are actually better suited to them. in point of fact, XRender is not at all a good drawing and compositing API: the proper drawing and compositing API is, actually, OpenGL.
that is also why Cairo, which was born as the client-side XRender API, is now moving to GL internally; its internals are in the process of being overhauled for that.
Posted Jun 26, 2012 17:09 UTC (Tue) by butlerm (subscriber, #13312)
Is it really a serious problem to convert trapezoids into a pair of triangles, or is the problem simply that trapezoid-based rendering requires too many trapezoids, period?
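Mechanically, the conversion the question asks about is trivial: an XRender trapezoid is bounded above and below by horizontal edges, so it always splits into exactly two triangles. A sketch (vertices as plain tuples, purely illustrative):

```python
def trapezoid_to_triangles(top_left, top_right, bottom_left, bottom_right):
    """Split an XRender-style trapezoid (bounded above and below by
    horizontal edges) into the two triangles a GPU rasterises natively.
    Shared vertices mean the pair covers exactly the original area."""
    return [
        (top_left, top_right, bottom_left),
        (top_right, bottom_right, bottom_left),
    ]
```

So, as the replies below suggest, the real mismatch is not the geometry conversion but what XRender cannot express about a programmable pipeline.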
Posted Jun 26, 2012 21:56 UTC (Tue) by ebassi (subscriber, #54855)
plus, XRender has nothing to describe vertex shaders or geometry shaders, and even its convolution filters are a poor version of GL's fragment shaders.
in practice, OpenGL provides a better API to talk to modern GPUs than XRender.
Posted Jun 27, 2012 21:15 UTC (Wed) by butlerm (subscriber, #13312)
Using OpenGL to do 2D graphics strikes me as first class overkill. But if it is the only thing that a GPU can actually do well, then fine. I don't know why anyone at the application or toolkit levels would want to write OpenGL code to do 2D graphics rendering though, if there were any possible way they could avoid it.
The issue with remote access is worse: relying on a serialization of OpenGL on the wire to handle 2D over a low-bandwidth channel would be insane, as in not fit for the purpose at all. As an actual wire protocol, XRender ought to be much better, simply because it is much simpler, much more stable, and much less likely to break.
However, if there is some sort of gratuitous impedance mismatch between XRender and the way GPUs like to see things, perhaps the designers of the next-generation remote graphics rendering protocol could take that into account. I am curious what could be done in any case; it isn't at all obvious why using triangles instead of trapezoids for 2D graphics is going to make anything better. The source material is Bezier curves, after all, not some sort of polygonal mesh.
Posted Jun 29, 2012 12:33 UTC (Fri) by ebassi (subscriber, #54855)
> Using OpenGL to do 2D graphics strikes me as first class overkill.
that's because you still assume that: a) there is such a thing as separate 2D and 3D, and b) this fictitious separation applies to the hardware. both assumptions are entirely fallacious.
2D is just a special case of 3D; modern GPUs are just programmable 3D rendering pipelines, and OpenGL is just a vendor-neutral API to program them without using driver-specific APIs: it provides you with the means to compile and upload programs to the GPU, as well as to upload geometry and texture data to GPU memory - that's it.
not only does the separation of 2D and 3D not exist anywhere near the software layer, it's also not in the hardware: there are no modern GPUs with the separate 2D pipelines that existed in the past - except some Intel stuff.
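One way to make the "2D is just a special case of 3D" point concrete: essentially the entire "2D mode" a GL-based toolkit needs is an orthographic projection mapping pixel coordinates onto the GPU's clip space. A hypothetical sketch of that matrix (plain Python lists standing in for what would normally be a shader uniform):

```python
def ortho(width, height):
    """4x4 orthographic projection (written as rows of the mathematical
    matrix) mapping (0,0)..(width,height) pixel space onto GL's [-1,1]
    clip space, with y flipped so the origin is at the top-left."""
    return [
        [2.0 / width, 0.0, 0.0, -1.0],
        [0.0, -2.0 / height, 0.0, 1.0],
        [0.0, 0.0, -1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]

def project(m, x, y):
    """Apply the matrix to a 2D point (implicitly z=0, w=1)."""
    return (m[0][0] * x + m[0][3], m[1][1] * y + m[1][3])
```

Everything else (rectangles as two triangles, text and images as textured quads) goes through the same programmable pipeline that 3D content uses.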
> I don't know why anyone at the application or toolkit levels would want to write OpenGL code to do 2D graphics rendering though
I maintain a toolkit that uses OpenGL. Cairo has a GL rendering surface that allows you to use its device-independent API to draw high-quality 2D content using GL. Qt does the same. ideally, you don't want app developers to fudge around with GL - it's an awful API, with lots of design-by-committee-of-CAD-developers crap - unless you're a game developer. that's why people write toolkits on top of it, exactly like people wrote toolkits on top of Xlib.
this has happened on other platforms as well: CoreAnimation and CoreGraphics on MacOS and iOS are two APIs used to do 2D (and 2D-layers-in-3D-space) UIs based on OpenGL and OpenGLES; CoreAnimation is the base toolkit used to write the whole user interface on iOS. Windows has its own toolkit based on DirectX. Android moved from software rendering to hardware acceleration through GLES.
tl;dr: the world has changed in the past 10 years, both at the hardware and the software level. GPU manufacturers standardised on GL and DirectX as the APIs exposed to control their cards, and platform and UI design moved to those two as the preferred drawing APIs for implementing user interfaces.
Posted Jun 29, 2012 17:27 UTC (Fri) by daenzer (✭ supporter ✭, #7050)
Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds