
Linux and desktop graphics

There is a lot to be said for the X window system. It is, after all, one of the oldest and most successful free software development projects in existence. X helped to pioneer many concepts, including the idea of a graphical display as a network service and the absolute separation of graphical mechanism and policy. Long before Linux began to make proprietary Unix vendors worry, X was pushing aside proprietary desktop implementations.

X has a problem, however: it is very much a two-dimensional system in a three-dimensional world. It was designed around dumb frame buffers, but is now expected to run on graphical adaptors which, in terms of processor performance, far outclass the central processor they serve. As a result, X tends to make poor use of contemporary video hardware; it restricts itself to the hardware's two-dimensional processor (a nearly vestigial afterthought bolted onto the real hardware) and cannot make use of many of the capabilities provided by the 3D processor. X is, essentially, using a legacy interface which is poorly supported now, and which may go away in the near future.

To remain viable, and to help free operating systems develop the best desktop experience possible, X must grow into the current crop of hardware. The X developers have understood this for some time, and have been working in that direction. Events from this week demonstrate, however, that there is a lack of consensus on what needs to be done, and when.

The person driving the debate is Jon Smirl, an active graphics programmer. Frustrations with the X development process have led Jon to write and post a document called The State of Linux Graphics. Regardless of how one feels about Jon's opinions, the document is worth a read; it is a comprehensive overview of the problem and the current body of low-level graphical software. If you've ever wondered what all those acronyms (XAA, EXA, DRI, ...) mean, this document will clarify a number of things.

X developers seem to agree that X needs to make a switch from 2D to 3D hardware. There is less consensus on how the 3D hardware should be made available to user space. One approach is to make OpenGL be the API for next-generation graphics. This interface is relatively well designed, is open, and already has a certain level of support in free software. It is a high-level interface which allows an application to take advantage of the hardware's capabilities. OpenGL supporters see the X of the future as being a sort of management layer around the OpenGL interface.

Jon Smirl is one of those supporters. He has been working on Xegl, a version of the X server which makes the OpenGL interface available. A few weeks ago, however, Jon announced an end to his Xegl work. In his opinion, Xegl is not going to reach a usable state anytime soon, so it is not worth working on.

The problem, it seems, is that Xegl lacks developers and is progressing too slowly. According to Jon, a big part of the problem is that development work in the X community has been spread in too many directions. He is, in particular, critical of an effort called EXA, which is working to integrate drivers using the 3D hardware into the existing X API. EXA may have the effect of extending the life of the current X server, but it does relatively little to make the hardware's capabilities available to applications. As a result, the X server will be faster on supported hardware, but it will still be a 2D server. Says Jon:

End result is that EXA is just a bandaid that will keep the old X server code going another year or two. There is also a danger that EXA will keep expanding to expose more of the chip's 3D capabilities. The EXA bandaid will work but it is not a long term fix. Its existence also serves to delay the creation of a long term fix.

Jon seems to believe that the main thing EXA will accomplish is to push back the date when Xegl will show up as the real solution to the problem. He claims that Linux is already far behind the proprietary platforms in providing a desktop which can take advantage of contemporary hardware, and has little patience for developments which threaten to widen that gap. So Jon has stopped development work on Xegl, and is working for process change instead. His conclusion states:

As a whole, the X.org community barely has enough resources to build a single server. Splitting these resources over many paths only results in piles of half finished projects. I know developers prefer working on whatever interests them, but given the resources available to X.org, this approach will not yield a new server or even a fully-competitive desktop based on the old server in the near term. Maybe it is time for X.org to work out a roadmap for all to follow.

Not all X developers are entirely supportive of Jon's position. The administrator of freedesktop.org, where Jon's document is hosted, posted a dismissive response and promptly shut down Jon's account, making the document unavailable. It has since been restored, but that action (ostensibly taken for other reasons) added an unpleasant note to the debate.

Some developers seem to agree that the OpenGL approach is the right one for the long term, but they never believed that this solution could be implemented in the near future. It is, after all, a complex project. For these developers, EXA makes sense as a short term, relatively easy solution to make X functional on current hardware.

Others seem to disagree with the transition to OpenGL altogether. The current X Render extension makes a number of capabilities available to applications, and it could be extended where needed. Render is seen as a friendlier API for 2D applications than OpenGL. Not moving to OpenGL would mean less disruption for applications and would avoid impacting X performance on older hardware without 3D acceleration.

The discussion, as of this writing, has not reached much in the way of new conclusions. The Xorg project lacks a dictator, and will thus be hard put to pick a direction and expect that the developers will simply follow. What does seem clear, however, is that the developers are determined to bring X forward to where it is, once again, a leading-edge graphical platform. They will probably get there, one way or another.



Linux and desktop graphics

Posted Sep 1, 2005 3:11 UTC (Thu) by rknop (guest, #66) [Link] (14 responses)

Not moving to OpenGL would mean less disruption for applications and would avoid impacting X performance on older hardware without 3D acceleration.

There's another thing to think about: the newest video cards are not supported at all unless you are willing to run a non-free, binary-only driver.

Until we get to a world where 3D programming info is as widely available as 2D programming info, such that we can count on free software developers being able to provide drivers for new cards, it would be a huge mistake to tie the very core of the Unix graphics system to a 3D library.

Unfortunately, the world does not seem to be going that way.

I like and use 3D acceleration, and avoid buying anything newer than a Radeon 9200 for that reason. However, at work, I only use it a little bit. For the most part, I need decent 2D performance. I'd hate to see that sacrificed when, two or three years from now, it's impossible to buy a card that has any free 3D drivers.

-Rob

Linux and desktop graphics

Posted Sep 1, 2005 3:33 UTC (Thu) by eyal (subscriber, #949) [Link] (13 responses)

Two or three years from now you may not be able to buy a video adapter that has any 2D API. What then?

Eyal.

Linux and desktop graphics

Posted Sep 1, 2005 4:05 UTC (Thu) by tjc (guest, #137) [Link] (6 responses)

Two or three years from now you may not be able to buy a video adapter that has any 2D API. What then?

Well, maybe.

There are a lot of people who never use 3D for anything, other than perhaps a screensaver.

I would like to see an updated 2D architecture to replace VGA, released as a fully documented, free hardware specification. The implementation cost would be nominal, but the result would be very useful to a large group of users.

Linux and desktop graphics

Posted Sep 1, 2005 7:06 UTC (Thu) by anselm (subscriber, #2796) [Link] (3 responses)

> There are a lot of people who never use 3D for anything, other than
> perhaps a screensaver.

This is beside the point. Most of these people *will* be using 3D in
the guise of a system like Windows or MacOS that does everything in 3D,
even if it looks 2D.

Anselm

Linux and desktop graphics

Posted Sep 1, 2005 18:09 UTC (Thu) by tjc (guest, #137) [Link] (2 responses)

This is beside the point. Most of these people *will* be using 3D [snip]

Thanks for the prophetic utterance...

Linux and desktop graphics

Posted Sep 2, 2005 6:24 UTC (Fri) by man_ls (guest, #15091) [Link] (1 responses)

Actually, it's not very prophetic. I don't know about Windows, but on Mac OS X it is like that today. Apple calls it Quartz Extreme, and it makes every window an OpenGL object.

Linux and desktop graphics

Posted Sep 8, 2005 21:59 UTC (Thu) by barrygould (guest, #4774) [Link]

Windows Vista will allegedly work similarly.

Linux and desktop graphics

Posted Sep 1, 2005 14:34 UTC (Thu) by smitty_one_each (subscriber, #28989) [Link] (1 responses)

Is a cheap 2D card an impossibility?
Considering what a challenge it is to find a stand-alone (non-WinModem) analog modem, maybe.

Linux and desktop graphics

Posted Sep 1, 2005 18:14 UTC (Thu) by tjc (guest, #137) [Link]

Is a cheap 2D card an impossibility?

I was thinking more along the lines of cheap 2D integrated in the mobo chipset, sort of like serial ports are today. This would be nice for servers, and for troubleshooting desktop systems.

like the text-only console, you mean?

Posted Sep 1, 2005 4:09 UTC (Thu) by xoddam (subscriber, #2322) [Link] (2 responses)

Why on earth would the 2D programming interface disappear?
The VGA text console interface is still available.

like the text-only console, you mean?

Posted Sep 1, 2005 10:18 UTC (Thu) by nix (subscriber, #2304) [Link]

That is also disappearing in high-end adapters. (This is one reason why the Linux kernel has framebuffer support.)

like the text-only console, you mean?

Posted Sep 1, 2005 14:00 UTC (Thu) by smoogen (subscriber, #97) [Link]

Because it is extra bits that have to be kept around, tested, and ported for each card. A lot of the newer cards do not have a 2D API in them because it is a poor fit for the card's processor. I think I heard one engineer put it as: "It's like putting a trailer hitch on a dragster."

Linux and desktop graphics

Posted Sep 1, 2005 12:44 UTC (Thu) by rknop (guest, #66) [Link] (2 responses)

Two or three years from now you may not be able to buy a video adapter that has any 2D API. What then?

Then we're really screwed, because there may be no video adapters for which free drivers of any kind are available.

We'll either have to give in and start using proprietary drivers, or we'll have to keep running Linux only on old, scavenged hardware.

-Rob

Linux and desktop graphics

Posted Sep 1, 2005 16:02 UTC (Thu) by rjw (guest, #10415) [Link] (1 responses)

What is more likely is that "minimal" open source 3D drivers will be released, using just a fixed-function pipeline, with basic performance (equivalent to a 9200). The snazzy features (those most likely to reveal patent-infringing hardware) will be locked up in the proprietary drivers.

Short of getting rid of the patent system altogether, there is very little that can be done to fix this beyond:
* Outlawing pure IP speculation (these are what they are really scared of; there is nothing they can cross-license with these guys, they just want a payoff.)
* Shortening patent terms.

These are pretty unlikely. So what we can hope for is that the APIs (OpenGL & DirectX) become so high-level that they are implemented directly on the card, and the drivers are a very thin shim that just passes data through to them. Then the driver reveals very little, and can be open sourced.

Linux and desktop graphics

Posted Sep 1, 2005 21:59 UTC (Thu) by cventers (guest, #31465) [Link]

Another option is a class action lawsuit against the major GPU
manufacturers for refusing to release the specifications needed to
communicate properly with their hardware that you've paid for. True, you
didn't have to buy it, but if they start getting a market lock, such a
practice could be deemed to be in support of a monopoly.

Linux and desktop graphics

Posted Sep 1, 2005 8:35 UTC (Thu) by jamesh (guest, #1159) [Link]

Given that the major toolkits are moving towards heavier use of the RENDER extension (most through Cairo, and Qt through Arthur), it seems sensible to put time into improving the performance of RENDER.
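
As a rough illustration of what that means for applications: toolkit code increasingly draws through Cairo rather than against RENDER directly, so whichever backend does the compositing underneath (RENDER, glitz/OpenGL, or plain software) is invisible to it. A minimal pycairo sketch of that kind of drawing (illustrative only; the output file name is made up):

    import cairo

    # Draw into an ARGB image surface; under a toolkit, the same Context calls
    # would target an X surface backed by RENDER (or, with Xgl, by OpenGL).
    surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, 256, 256)
    ctx = cairo.Context(surface)

    ctx.set_source_rgba(0.2, 0.4, 0.8, 1.0)   # opaque blue square
    ctx.rectangle(32, 32, 192, 192)
    ctx.fill()

    ctx.set_source_rgba(1.0, 1.0, 1.0, 0.5)   # translucent circle, alpha-composited
    ctx.arc(128, 128, 80, 0, 2 * 3.14159)
    ctx.fill()

    surface.write_to_png("composite-demo.png")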

The Xgl servers (Xegl, Xglx, etc.) do this by implementing the RENDER primitives in the server using the glitz library and OpenGL, but this is certainly not the only path to take. The main downside is that you need good OpenGL drivers to get good performance, and those are in short supply.

The EXA driver architecture is a different approach, providing a way to accelerate the RENDER primitives in a way that lets the driver make use of the 3D hardware without writing a full OpenGL driver. It seems quite sensible to look at both approaches (one long term, and a simpler one that will provide results sooner) -- much more so than putting yourself in a position where you are dependent on Nvidia and ATI for basic graphics.

Linux and desktop graphics

Posted Sep 1, 2005 8:52 UTC (Thu) by cloose (guest, #5066) [Link] (2 responses)

Well, the author of EXA, Zack Rusin, said the following on the x.org mailing list:

"And I'm sure it's a _really_ good article but unfortunately at the moment I don't have any time whatsoever (I appologize for not reviewing some of the recent Exa changes), but as soon as I have a little bit of time I'll finish up Xegl. I said I'll do it and I will do it.

There's too much drama on X.Org lists lately. We're moving forward, we're doing fine. We will rule them all (not necessarily with an iron fist but definitely with a graphics infrastructure of the future).

"Don't think that a small group of dedicated individuals can't change the world; it's the only thing that ever has.". And in case you have forgot that - we're here to prove you wrong.

Just watch me..."

(http://lists.freedesktop.org/archives/xorg/2005-August/009655.html)

Linux and desktop graphics

Posted Sep 1, 2005 14:49 UTC (Thu) by Zenith (guest, #24899) [Link] (1 responses)

These quotes ought to make it into the next edition of LWN.net Weekly ;-)

With an attitude like that, who dares to argue against the man?

Two thumbs up from here to Zack and all the other X developers!

Linux and desktop graphics

Posted Sep 10, 2005 9:20 UTC (Sat) by renox (guest, #23785) [Link]

His sig is nice too, and in the same style; it must have been his "boost my ego" day ;-):
"I don't care WHO you are, you're not walking on the water while I'm fishing."

Open Graphics

Posted Sep 1, 2005 15:00 UTC (Thu) by yashi (subscriber, #4289) [Link] (1 responses)

Does anyone have a comment on Open Graphics?

http://lists.duskglow.com/mailman/listinfo/open-graphics

We didn't have free/open _software_ when RMS started the GNU project. (OK, we did, but...) Is this a good time to start on free/open _hardware_?

Open Graphics

Posted Sep 1, 2005 15:59 UTC (Thu) by rknop (guest, #66) [Link]

We didn't have free/open _software_ when RMS started the GNU project. (OK, we did, but...) Is this a good time to start on free/open _hardware_?

The comparison isn't really 1:1, since you can copy a piece of software without in any way degrading the copy of the person you're copying from. This is obviously not true with hardware. (At least until the nanotech "hardware printers" get working.)

However, there really ought to be open specs and open standards. This has been the de facto case in the past. The problem is that it takes a fair centralization of effort to make hardware that is used by any reasonable fraction of the people out there. Once you have that centralization of effort, you probably have a corporation whose legal and marketing divisions realize that they might be able to profit from lock-in, and encourage proprietary, closed specs.

With software, long-haired birkenstock-wearing geeks working either at companies who want software support, or in their parents' garage, are able to produce things that can get used-- largely because design is mainly what needs to be done, there's no physical construction. That means that a grassroots effort such as Stallman started is possible, and now many people work using 100% free software. (I'm almost there, but not quite. I have to admit to occasionally using IDL, to using Acroread, and to *sometimes* using a binary modem driver on a laptop. Oh, and I have an X-box, but I made peace with the idea that if Microsoft were just a games company, they'd be less of a threat to my work. Call me a rationalizer if you will.)

-Rob

Linux and desktop graphics

Posted Sep 1, 2005 21:55 UTC (Thu) by cventers (guest, #31465) [Link]

s/probably get there/surely get there/;

Linux and desktop graphics

Posted Sep 4, 2005 0:31 UTC (Sun) by njhurst (guest, #6022) [Link] (3 responses)

"run on graphical adaptors which, in terms of processor performance, far outclass the central processor they serve. "

Is this really true? GPUs are certainly faster at doing certain classes of operations (such as drawing textured polygons), but are they in general superior to CPUs? A simple test would be to implement a CPU emulator on top of a GPU and a GPU emulator on top of a CPU. I suspect that for general purpose computing a CPU is a better compromise.

Not that this is really relevant to the thrust of the argument :)

Linux and desktop graphics

Posted Sep 8, 2005 12:38 UTC (Thu) by csamuel (✭ supporter ✭, #2624) [Link] (2 responses)

There's a group of folks who are starting to use graphics cards for
general purpose (HPC) programming because of their massive memory
bandwidth and floating point performance. Of course you need to be able
to frame your problem in terms of polygons, etc, that the cards
understand, but there are people doing it for all sorts of things
(including databases).

Check out http://www.gpgpu.org/ for a quick intro!

Linux and desktop graphics

Posted Sep 10, 2005 9:32 UTC (Sat) by renox (guest, #23785) [Link] (1 responses)

You also need a problem that doesn't require high-precision FP operations: AFAIK, current GPUs don't provide 64-bit floating point numbers; when NVidia talks about 64 bits, it means 4 components times 16-bit FP per component.

So not only do you need algorithms well suited to the GPU's architecture, you also need to be very careful about precision (as a reminder, Intel provides up to 80-bit FP operations). So color me not very impressed by the hype about doing everything on GPUs.

GPUs, precision and general purpose computing

Posted Sep 11, 2005 10:05 UTC (Sun) by csamuel (✭ supporter ✭, #2624) [Link]

People are working on it though..

Accelerating Double-Precision FEM Simulations with GPUs

This paper by Dominik Göddeke, Robert Strzodka and Stefan Turek describes a preliminary algorithm to achieve double precision results by adding a CPU-based defect correction to iterative linear system solvers on the GPU. We demonstrate that identical accuracy as compared to a full CPU double precision solver is possible while still gaining a factor of 2 in speedup compared to a highly tuned cache-aware CPU reference implementation in double precision.
(Accelerating Double Precision FEM Simulations with GPUs. Dominik Göddeke, Robert Strzodka and Stefan Turek. To appear in Proceedings of ASIM 2005 - 18th Symposium on Simulation Technique.)
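
The defect-correction trick described there is essentially classical mixed-precision iterative refinement: do the cheap solves in low precision, and accumulate residuals and corrections in double precision. A generic numpy sketch of the principle (not the paper's GPU code; the float32 solve merely stands in for the GPU solver):

    import numpy as np

    def refine_solve(A, b, iters=5):
        # Low-precision solves (stand-in for the GPU), double-precision residuals (CPU).
        A32 = A.astype(np.float32)
        x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
        for _ in range(iters):
            r = b - A @ x                                    # residual in float64
            d = np.linalg.solve(A32, r.astype(np.float32))   # correction in float32
            x += d.astype(np.float64)
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 200)) + 200.0 * np.eye(200)   # well-conditioned test case
    b = rng.standard_normal(200)
    x = refine_solve(A, b)
    print("residual norm:", np.linalg.norm(b - A @ x))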

Linux and desktop graphics

Posted Sep 4, 2005 16:26 UTC (Sun) by sbergman27 (guest, #10767) [Link] (2 responses)

The article implies that the 3D hardware on cards is so much more powerful than the puny 2D hardware that it makes sense to abandon the 2D approach.

After reading the article, I got curious and ran a little home-cooked benchmark and would be interested in people's responses to the results.

I ran a test video with mplayer:

mplayer -benchmark -nosound testvideo.avi

using various output rendering drivers.

For the test, I used x11 (straight x11 rendering), xv (XVideo rendering), gl (OpenGL rendering), and gl2 (a newer, better OpenGL rendering driver).
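
A small wrapper along these lines reruns the sweep over all four drivers (a convenience sketch only; it assumes mplayer is on the path and uses the same file name):

    import subprocess, time

    drivers = ["x11", "xv", "gl", "gl2"]
    for vo in drivers:
        start = time.time()
        # -benchmark decodes as fast as possible; -nosound avoids audio sync limits
        subprocess.run(["mplayer", "-benchmark", "-nosound", "-vo", vo, "testvideo.avi"],
                       stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        print(vo, "took", round(time.time() - start, 1), "seconds")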

Here are the results:

Driver   Time      CPU Usage   User   System
x11      153 sec   100%        95%    5%
xv       114 sec   100%        95%    5%
gl       241 sec   100%        99%    1%
gl2      255 sec   100%        99%    1%

I started to run the same benchmark using Mesa software rendering, but did not let it finish. It ran in slow motion, at what I'd guess was 1/3 to 1/2 of realtime, and would have taken something like 3000 to 4000 seconds to complete.

This is all on an Athlon64 2800+ with an NVidia GeForce 6800GT AGP 8x with 256MB of DDR3; the GPU is running at 1GHz, with SBA enabled.

So where is the huge performance advantage of rendering 2D through the 3D hardware? Is my card not powerful enough?

Linux and desktop graphics

Posted Sep 4, 2005 23:05 UTC (Sun) by daenzer (subscriber, #7050) [Link]

It's more likely that the GL implementation isn't optimized for this usage pattern yet (you'd probably get different results when comparing performance of the RENDER extension between Xglx and a traditional X server). This would likely change once Xgl becomes widely used.

Also keep in mind that unless mplayer can use some kind of YUV texture extension in the GL implementation, it still has to do the colourspace conversion in software. Again, this would likely become more widely available along with Xgl usage.

Linux and desktop graphics

Posted Sep 8, 2005 5:34 UTC (Thu) by zblaxell (subscriber, #26385) [Link]

"For the test, I used x11 (straight x11 rendering), xv (XVideo rendering), gl (OpenGL rendering), and gl2 (a newer, better OpenGL rendering driver)."

mplayer is not exactly a typical application--it is a video player, and video has very different system requirements than other kinds of graphics, even animated graphics. Video uses color spaces and a small set of scaling functions (which are occasionally implemented with fixed parameters to reduce software or silicon), and advanced implementations feature motion compensation functions that are utterly useless to anyone who is implementing games, CAD applications, word processors, or web browsers (or, for that matter, software video players running on very fast CPUs).

That said, I've done the test with all four of the drivers you mention. For my test I chose a 4m14s MPEG-1 352x240 file so that the decode time is not dominated by the codec. According to mplayer the video decode takes 5.7 seconds. First, let's play the file at 1:1 scaling in a window:

x11: 9.441s
xv: currently broken (see note about fragility below)
gl: 12.291s
gl2: 12.397s

OK, now instead of watching a postage stamp, let's scale those images by an integer factor of 2:

x11 + -zoom: 47.247s
xv: still broken
gl: 5.7s
gl2: 5.16s

Just to be sure I didn't screw up, I ran the benchmarks again at 1:1 resolution. 'gl' really is faster if you ask it to scale up 2:1 while it blits.

When zooming 4:1 things get slower again. At 1408x1056:

x11 + -zoom: 169.34s
gl: 23.37s
gl2: 23.72s

I generally watch most video with 'gl' at 1920x1200, which is the resolution of my laptop display; in this case playback takes 59.71s, or about 23% of the video's running time. When 'xv' was working on this machine, it wasn't able to keep up with realtime decoding, although I can't reproduce the hard numbers now. At 1920x1200, 'x11' is just barely able to keep up with this particular video file, but at 98% CPU utilization between mplayer and the X server, 'x11' leaves less than 2% of the CPU available for video decoding--even an MPEG2 file would be unplayable, and MPEG2 is pretty low-tech by modern video codec standards.

x11 is plain old X--RGBx frame buffers pushed across the video bus by the CPU. You can expect this to be the slowest or most CPU intensive of the four, unless one of the other drivers is insane (software GL is insane, or at least the Mesa libraries are completely unoptimized for simple 2D cases). Also, x11 is only fast when the ratio of video pixels to display pixels is 1:1--otherwise it gets very slow very quickly. Video tends to need weird non-integer pixel scaling to come out right on any computer display, so for real life video watching purposes x11 is useful only a last resort.

I have found 'xv' is what you want for mplayer if and only if high performance GL is unavailable. The problem with Xv is that it is incredibly fragile--it is only usable by one process at a time (including that phantom process that you just can't seem to find or kill, but which currently owns Xv so you can't run kino/xine/ogle/mplayer properly), and it tends to interact badly with text mode, virtual/physical framebuffer size mismatches, and suspend/resume cycles (software suspend2 or ACPI or APM BIOS). It took two years for the radeon driver's Xv implementation to catch up with my previous laptop's video hardware, and frankly it's still a little quirky even a year after that. Xv is also much slower than GL on my ATI Radeon 9700 AGP 8x with proprietary drivers--it just looks wrong most of the time (bad interlacing support? low frame rate? I'm not sure exactly--it was easier to switch -vo than to figure out what was wrong ;-), and needs a lot of CPU and even some framedropping to keep up with many video files. On the other hand, most of the video cards I own have sufficiently broken GL support that Xv is still quite useful.

I've never had 'gl2' work very well. It does work, but it has OSD bugs and I can't see any reason to prefer it over 'gl'. Many of the features that 'gl2' claims to have seem to also be implemented in the 'gl' driver. From what I've seen of it, 'gl2' was a newer driver at one point, and might have been better than 'gl' at the time, but eventually 'gl' got fixed more than 'gl2' did, to the point where 'gl2' is now obsolete. Maybe I don't own the one video card which is better supported by 'gl2' than 'gl' or 'xv'.

'gl' does need the CPU to do YUV->RGB colorspace conversion, but those conversions are generally cheap to do on the host CPU. I'd imagine that the GL driver is also doing DMA from the video card to move the video frames onto the display, so extra CPU on colorspace conversion is traded for less CPU on data transfer (and data transfer generally costs more).
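
For reference, that YUV->RGB step is just a per-pixel multiply-and-offset. One common form (BT.601 full-range coefficients, assumed here; mplayer's exact conversion path may differ) looks like:

    import numpy as np

    def yuv_to_rgb(y, u, v):
        # 8-bit Y'CbCr planes -> packed RGB, BT.601 full-range coefficients.
        y = y.astype(np.float32)
        u = u.astype(np.float32) - 128.0
        v = v.astype(np.float32) - 128.0
        r = y + 1.402 * v
        g = y - 0.344136 * u - 0.714136 * v
        b = y + 1.772 * u
        return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)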

"a leading-edge graphical platform"

Posted Sep 12, 2005 1:43 UTC (Mon) by RiverOfNotMe (guest, #32396) [Link]

When has X ever been considered leading-edge? It's always been behind the pack as far as I can tell.

1983 - Lisa released. First widely-available usable GUI.
1984 - Mac released. First usable GUI under $10,000.
1988 - X still can't draw a circle portably. Standard toolkit (more or less): Motif, a cheapass copy of the Windows look, which couldn't hold a candle to the Mac at that time.
2001 - Mac OS X window server; every window is double-buffered; no update flicker.
2004 - Mac OS X has Quartz Extreme (OpenGL-based window server -- every window is now a texture). Render and Cairo finally bring device-independent rendering to X.
2005 - Xegl dies. EXA on the horizon.

I'm not trolling here, honest. X has some good aspects: it exists, it's a standard on non-OS X
Unix platforms, it's fairly interoperable, it has much software written for it.

But it's not leading edge, dude. At least, the graphics aspects have never been leading-edge.
Network transparency (in 1984) was leading-edge, sure. But we're only discussing the graphics
aspects here.

Maybe it will be leading-edge in the future. Fine. But let's keep the record straight.


Copyright © 2005, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds