LCA: The X-men speak
Posted Feb 21, 2013 8:59 UTC (Thu) by khim
In reply to: LCA: The X-men speak
Parent article: LCA: The X-men speak
It's not at all uncommon on Windows to have self-written games and graphics apps crash and burn if you don't tell Optimus to use the discrete GPU for that process.
It used to be a common problem, but is it really common here and now? Reviews of the "third generation" HD Graphics (Ivy Bridge) usually highlight that it's an important milestone not because it's especially fast, but because it's finally bug-free enough to run most programs without crashes or artifacts. AMD's integrated graphics have always been quite good in this regard.
Part of the problem is that heuristics and an app list make those decisions, instead of the app saying during device handle creation that it would prefer the more capable GPU over the more power-efficient one. The app knows what it needs out of a GPU, and it should be able to at least hint to the stack about its preferences.
Hint - yes, pick - no. Most GPU-using programs are proprietary, which means they have the least amount of information of anything in the system (the user knows what s/he bought, the OS can be upgraded, but programs are written once and then used for years).
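For what it's worth, the closest thing Windows has to such a hint today is not part of any device-creation API at all: it's a pair of vendor-documented global symbols that the Optimus and PowerXpress driver stacks look for when the executable is loaded. A sketch (the exact names are the ones NVIDIA and AMD document for this purpose; it's a blunt link-time hint, not a per-device-handle preference):

```c
/* Exporting these globals hints the hybrid-graphics driver, before any
 * graphics context exists, that this process prefers the discrete GPU.
 * Guarded so the fragment still compiles with non-Windows toolchains. */
#ifdef _WIN32
__declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;  /* NVIDIA Optimus */
__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;    /* AMD PowerXpress */
#endif
```

Note that this is exactly the kind of all-or-nothing mechanism being complained about: the app can only flag itself, not express a preference when it actually creates a device.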