Devils are in the details
Posted Mar 2, 2011 9:56 UTC (Wed) by renox (guest, #23785)
In reply to: An Early Look at GNOME 3.0 (Linux.com) by drag
Parent article: An Early Look at GNOME 3.0 (Linux.com)
KDE developers do maintain a 'blacklist' of features, but apparently there still wasn't enough testing (perhaps a lack of communication?), as I remember reading some complaints about HW-acceleration bugs.
So don't claim that 2D acceleration is identical to 3D acceleration even if they use the same hardware and drivers: this is only true if the drivers are flawless.
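To give an idea of the kind of thing I mean (a purely hypothetical sketch in Python; the renderer strings and feature names are invented, this is not KWin's actual mechanism), a compositor can only blacklist the driver/feature combinations it already knows about, which is exactly where the testing gap bites:

    # Purely hypothetical sketch of a per-driver feature blacklist (not KWin's
    # real code). The renderer string is whatever the GL driver reports.
    BLACKLIST = {
        "Mesa DRI R300": {"blur", "lanczos"},   # invented example entries
        "Gallium 0.4 on NV": {"blur"},
    }

    def disabled_features(renderer_string):
        """Return the set of effects to turn off for this driver."""
        disabled = set()
        for pattern, features in BLACKLIST.items():
            if pattern in renderer_string:
                disabled |= features
        return disabled

    # Anything not matched by an entry above stays enabled, so untested
    # hardware still gets the potentially buggy features by default.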
Posted Mar 2, 2011 12:11 UTC (Wed)
by drag (guest, #31333)
[Link] (8 responses)
It's really bad form for application developers to be forced to hack around Linux being shitty. It's better just to fix Linux and let application developers concentrate on using the APIs that are most appropriate for what they want to do.
Otherwise what you're really doing is forcing the application developers to force users to make the choice between something that runs like crap and something that crashes their computer. Choice is not a good thing when all the choices suck.
Posted Mar 2, 2011 13:16 UTC (Wed)
by mjthayer (guest, #39183)
[Link] (4 responses)
Currently that means using proprietary graphics drivers for many people, which is something a lot of Linux users don't feel comfortable with. As far as I can see, the Nouveau people and their colleagues are putting in a heroic effort to "fix" their bits of Linux, but it is hard for them to promise anything on a predictable time scale.
Posted Mar 2, 2011 13:19 UTC (Wed)
by rahulsundaram (subscriber, #21946)
[Link] (3 responses)
Posted Mar 2, 2011 14:14 UTC (Wed)
by mjthayer (guest, #39183)
[Link] (2 responses)
The Nouveau people are still very cautious regarding their 3D support, and warn people not to use it on production systems. Of course delivering more than they promise is much better than promising things and not delivering...
Posted Mar 2, 2011 16:34 UTC (Wed)
by drag (guest, #31333)
[Link]
It's actually faster than the older, more traditional DRI drivers already.
I am cautiously optimistic that we have reached a similar situation with graphics as we saw with wifi... where the introduction of the mac80211 protocol stack actually provided the basis for making it easier to write great, working drivers for the majority of the hardware.
Posted Mar 3, 2011 9:46 UTC (Thu)
by rahulsundaram (subscriber, #21946)
[Link]
Posted Mar 2, 2011 14:01 UTC (Wed)
by renox (guest, #23785)
[Link] (2 responses)
Well, it's called being pragmatic and wanting to have your application available now, not several years in the future.
>It's better just to fix Linux
Except that it's just wishful thinking:
1) some HW don't have open specifications!
2) Application developers may not own the problematic HW
3) plus fixing HW drivers (especially complex ones such as 3D drivers) is a very different skill from developing applications.
>[cut] Choice is not a good thing when all the choices suck.
Welcome to the real world.
Posted Mar 2, 2011 14:40 UTC (Wed)
by mjg59 (subscriber, #23239)
[Link]
No. It may require different knowledge, but the skills are exactly the same.
Posted Mar 2, 2011 16:21 UTC (Wed)
by drag (guest, #31333)
[Link]
So is having a good experience with a desktop that is designed to use both XRender and OpenGL to do the same thing. :P
> 1) some HW don't have open specifications!
Yes. This sucks. But there is plenty of hardware that does have open specifications, or at least is 'open enough' that it works out well. Nowadays we have enough hardware on the market made by Linux-friendly-enough companies that I think it's pretty reasonable to expect that people who want to run Linux with the "Full Desktop" experience can obtain the necessary hardware.
And like I mentioned before: on modern hardware there is no longer a separate 2D engine. It's not that simple anymore. Drivers capable of driving XRender are going to be capable of doing other stuff. I suspect XRender is an API designed for hardware that doesn't exist anymore; I am sure that other people understand the details better.
What we really need is a DE that is willing to depend on a fully functioning subset of a particular popular API; OpenGL is probably just fine in this regard. It should concentrate on just working with that set of assumptions, not on trying to support multiple APIs and forcing users to choose when they have neither the knowledge of the hardware nor the understanding of the environment necessary to make a proper choice.
Then we need a 'lite' fallback for when there is no acceleration available at all, where all you're expected to be using is pure software rendering or a bit-blitted display or something like that. Even if it ends up slow, that is not terribly important, because application developers will learn to depend on a properly configured computer. It just needs to work.
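Just to illustrate what I mean (a rough Python sketch; the renderer names are the usual Mesa software rasterizers, but treat the details as assumptions rather than a real implementation), the DE can look at what the GL stack actually reports and drop to the 'lite' path when it only gets software GL, or no GL at all:

    # Rough sketch: decide between the full OpenGL path and a software-only
    # fallback by looking at the renderer string that glxinfo reports.
    import subprocess

    def gl_renderer():
        try:
            out = subprocess.check_output(["glxinfo"]).decode("utf-8", "replace")
        except (OSError, subprocess.CalledProcessError):
            return None                      # no working GLX at all
        for line in out.splitlines():
            if line.startswith("OpenGL renderer string:"):
                return line.split(":", 1)[1].strip()
        return None

    def pick_rendering_path():
        renderer = gl_renderer()
        if renderer is None:
            return "lite"                    # no acceleration available at all
        if any(s in renderer for s in ("llvmpipe", "softpipe", "swrast")):
            return "lite"                    # GL exists but is software-only
        return "opengl"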
> 2) Application developers may not own the problematic HW
This is what quality assurance and test suites are there to deal with.
Live CDs, USB drive images, automated benchmarks. It's easy nowadays to set up previews and things that people can just download and run, with no compiling or deep understanding necessary. Trace logs and scripts to help provide debug information to developers should be built into the evaluation environments, and nice documentation on what developers want needs to be provided. Information on the tester's experience can be uploaded automatically or sent with the press of a button.
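Something along these lines is all it takes on the tester's side (a sketch only; the tool names are the standard ones, but the output file name is made up):

    # Sketch of a tester-side helper that bundles common graphics diagnostics
    # into one report file the tester can attach to a bug or upload.
    import subprocess

    COMMANDS = {
        "lspci": ["lspci", "-nn"],
        "glxinfo": ["glxinfo"],
        "xrandr": ["xrandr", "--verbose"],
        "dmesg": ["dmesg"],
    }

    def collect_report(path="graphics-report.txt"):
        with open(path, "w") as report:
            for name, cmd in COMMANDS.items():
                report.write("==== %s ====\n" % name)
                try:
                    report.write(subprocess.check_output(cmd).decode("utf-8", "replace"))
                except (OSError, subprocess.CalledProcessError) as err:
                    report.write("failed: %s\n" % err)
                report.write("\n")

    if __name__ == "__main__":
        collect_report()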
It's often very useful to have the folks doing QA be completely different from the folks doing development. Programmers are often 'too close' to the software to see how users will approach it and what issues they will find. Keeping the two separate helps issues get identified more quickly in many cases.