
Continuous-integration testing for Intel graphics


Posted Oct 11, 2017 16:45 UTC (Wed) by tialaramex (subscriber, #21167)
In reply to: Continuous-integration testing for Intel graphics by jhoblitt
Parent article: Continuous-integration testing for Intel graphics

My understanding is that there are two related answers:

1. From very early, much earlier than for a comparable commercial system like NT, the Linux kernel's developers subsisted largely and in some cases entirely on dogfood. So all core systems that must work to have an environment in which you can edit text files, compile and link a large project and then ship it somewhere were tested de facto continuously by their developers. If you broke chown() then it didn't fail a unit test that would result in an email and public shame - it broke your computer, and you spent miserable hours figuring out what was wrong. If you broke the filesystem your files got trashed and you didn't have anything to show for it.

2. Almost all drivers and subsystems which aren't actively used by developers rot and die. In some ways this is "better" than a traditional CI because traditional CI causes a bias where the "loudest screams win" - effort may be expended on fixing something that failed unit tests even though it's actually not very important, and that's always effort which could have been directed at something which _is_ important. In Linux the developers were unavoidably focused on making their own computers actually work. When those computers had NE2000 ISA cards they made sure the driver for NE2000 ISA cards worked. Today they have Intel graphics chips.

So: lots of things broke, but relatively few of them were things people cared about.


Continuous-integration testing for Intel graphics

Posted Oct 11, 2017 22:23 UTC (Wed) by roc (subscriber, #30627)

Even Intel graphics don't always work. The i915 driver has never worked reliably on my Dell Skylake laptop. (The actual bugs have varied over time as I update my kernel, but currently it appears to be memory management issues causing my laptop to freeze multiple times a day if I'm running Eclipse.)

Every time my laptop freezes I feel a little bit grumpy about how i915 is the poster child for how Linux kernel graphics should be done.

Continuous-integration testing for Intel graphics

Posted Oct 12, 2017 1:05 UTC (Thu) by ras (subscriber, #33059)

> Even Intel graphics don't always work. The i915 driver has never worked reliably on my Dell Skylake laptop.

Me too. The headline prompted me to read the article. I was hoping to read a mea culpa from the Intel devs, along with how they were addressing their quality issues. Instead I got Intel devs pontificating about testing, and it does not sit well with me.

I can only hope that they are talking about some different driver. The i915 driver was terrible. In fact, maybe that driver's quality issues are what drove them to implement CI. In 4.12 it's not so bad - but it's taken 2 years(!) after the chip was released to get a stable driver. Their first 10 or so releases were so bad people were returning Dell laptops as unusable and subsequently ranting at Dell on every forum they could find. All the noise came from people running Windows - but I suspect only because the people running Linux knew who was really to blame, and trusted it would be fixed. I was one of those. But I never dreamed it would take so long.

Then there's whatever driver xbacklight depends on. Promised for 4.11, but still not delivered in 4.12. https://bugs.freedesktop.org/show_bug.cgi?id=96572#c11 That's been over 2 years(!).

Continuous-integration testing for Intel graphics

Posted Oct 12, 2017 7:32 UTC (Thu) by blackwood (guest, #44174)

Listen to the full talk and you get your apology (the LPC one at least; unfortunately that wasn't recorded). The CI we currently have is still 1-2 years of work behind where it needs to be, and due to corporate reorg shenanigans, 2-3 years were lost (i.e. we would have prevented the last 2 years of fail, except we couldn't because for that time we had effectively zero CI).

If you read the article, it says clearly that 1 year ago the entire board was red, which is around 4.10/11. We're not blind idiots who can only pontificate. The reason we pontificate is that CI actually dug us out, in less than 1 year, of the huge hole we got into over 2-3 years of no testing at all due to reorg madness (you can't see all of that yet because development is 4-6 months ahead of the latest release). So yeah, CI is pretty much the only way to get quality on track.

And yes, skl didn't work on those older kernels. Per CI, it still didn't work well on 4.12, but at least it's better (4.14 should be actually good).

Continuous-integration testing for Intel graphics

Posted Oct 12, 2017 7:46 UTC (Thu) by andreashappe (subscriber, #4810)

> We're not blind idiots who can only pontificate. The reason we pontificate is that CI actually dug us out, in less than 1 year, of the huge hole we got into over 2-3 years of no testing at all due to reorg madness (you can't see all of that yet because development is 4-6 months ahead of the latest release). So yeah, CI is pretty much the only way to get quality on track.

If there were a way of upvoting a comment, I would do that. Thanks for your work.

Continuous-integration testing for Intel graphics

Posted Oct 12, 2017 7:48 UTC (Thu) by ras (subscriber, #33059)

> Listen to the full talk and you get your apology (the LPC one at least; unfortunately that wasn't recorded). The CI we currently have is still 1-2 years of work behind where it needs to be, and due to corporate reorg shenanigans, 2-3 years were lost (i.e. we would have prevented the last 2 years of fail, except we couldn't because for that time we had effectively zero CI).

A rational explanation. It also explains why 4.12 was a marked improvement. Thank $DEITY for that. Well, I guess I should be thanking you guys.

Sounds like you are back on track. It would be interesting to know how an engineering firm like Intel fell off it in the first place, but I guess that explanation will have to wait until someone moves on.

Continuous-integration testing for Intel graphics

Posted Oct 14, 2017 0:45 UTC (Sat) by rahvin (guest, #16953)

He already said how. If you've never been through a major corporate reorganization, thank your lucky stars. Reorgs are poison to getting anything done. They end up shifting people and management around, and it's typically 2-3 years before anyone knows what's going on and effective teams get back to doing work. It's good to hear that this is the reason the Intel driver got so crappy. I've got a Skylake NUC that's been a nightmare, and I thought I was doing something wrong even on the latest kernels; now I know it was the driver and not a misconfiguration on my part.

Continuous-integration testing for Intel graphics

Posted Oct 13, 2017 20:12 UTC (Fri) by JFlorian (guest, #49650)

Wow, this *open* candor is really refreshing! I'll take this any day over the greased pig that is the standard offering from most companies.

I just wish I could buy Intel graphics chipsets on add-in cards. The integrated video easily becomes too dated while the mainboard remains otherwise sufficient.

Continuous-integration testing for Intel graphics

Posted Oct 14, 2017 1:40 UTC (Sat) by jhoblitt (subscriber, #77733)

Once upon a time, you could buy discrete Intel graphics...

Continuous-integration testing for Intel graphics

Posted Oct 14, 2017 10:08 UTC (Sat) by excors (subscriber, #95769)

Would that be much better than simply buying a whole new processor? See e.g. https://cdn.arstechnica.net/wp-content/uploads/2015/01/Sc... ("5th Gen Intel Core Processor Die Map") - the GPU itself is about two thirds of the silicon area when paired with a dual-core CPU, and the CPU and GPU both need the shared L3 cache, memory controllers, etc., so just removing the CPU cores looks like it would save very little cost. (It seems these chips essentially are graphics chips already, with a few minor peripherals like CPU cores stuck on the side.)

Then you'd need to add gigabytes of dedicated VRAM to make it work as a discrete card. And in terms of performance, the current highest-end Intel GPUs would still only compete with perhaps a $70 NVIDIA card, so it doesn't seem there's much opportunity for profit there.

Continuous-integration testing for Intel graphics

Posted Oct 17, 2017 15:13 UTC (Tue) by JFlorian (guest, #49650)

Sure, I could upgrade the processor, but it seems that always requires a new socket type. Thanks for the picture; I hadn't realized how disproportionate the balance has become. That's almost comical.

The Nvidia cards I do buy are often in the $70 range. I don't play games so most anything is overkill.


Copyright © 2025, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds