

Continuous-integration testing for Intel graphics

Posted Oct 14, 2017 10:08 UTC (Sat) by excors (subscriber, #95769)
In reply to: Continuous-integration testing for Intel graphics by JFlorian
Parent article: Continuous-integration testing for Intel graphics

Would that be much better than simply buying a whole new processor? See e.g. https://cdn.arstechnica.net/wp-content/uploads/2015/01/Sc... ("5th Gen Intel Core Processor Die Map") - the GPU itself is about two-thirds of the silicon area when paired with a dual-core CPU, and the CPU and GPU both need the shared L3 cache, memory controllers, etc., so removing just the CPU cores looks like it would save very little cost. (It seems these chips essentially are graphics chips already, with a few minor peripherals like CPU cores stuck on the side.)

Then you'd need to add gigabytes of dedicated VRAM to make it work as a discrete card. And in terms of performance, the current highest-end Intel GPUs would still only compete with perhaps a $70 NVIDIA card, so there doesn't seem to be much opportunity for profit there.



Continuous-integration testing for Intel graphics

Posted Oct 17, 2017 15:13 UTC (Tue) by JFlorian (guest, #49650)

Sure, I could upgrade the processor, but it seems that always requires a new socket type. Thanks for the picture; I hadn't realized how disproportionate the balance has become. That's almost comical.

The NVIDIA cards I do buy are often in the $70 range. I don't play games, so almost anything is overkill.

