We still need specialized hardware for this?

Posted Aug 19, 2022 10:53 UTC (Fri) by epa (subscriber, #39769)
Parent article: The growing image-processor unpleasantness

What I find surprising is that hardware acceleration is still needed for this task, and that Intel, of all companies, is admitting that rather than trying to push a solution using the general-purpose CPU. Acres of silicon are used to support vector instruction sets like AVX-512. Surely image processing is just what all that extra CPU muscle was built for?



We still need specialized hardware for this?

Posted Aug 19, 2022 11:26 UTC (Fri) by laurent.pinchart (subscriber, #71290)

Image processing for cameras typically consists of tasks that can be performed very efficiently in hardware but are very costly for the CPU. My favourite example is white balance, which applies a different gain to each colour channel. With a typical Bayer sensor, that's four hardware multipliers in the ISP. Think about what would be needed in terms of CPU time (and memory bandwidth) to do the same in software, for high-resolution images at high frame rates. And this is a simple example; there are typically dozens of processing steps in the hardware pipeline.
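
To make the cost concrete, here is a minimal sketch of that one step in software, assuming 16-bit samples, an RGGB Bayer layout and Q8.8 fixed-point gains (the function name and the formats are illustrative, not taken from any real driver):

    #include <stdint.h>
    #include <stddef.h>

    /* Apply per-channel white-balance gains to an RGGB Bayer frame.
     * Gains are Q8.8 fixed point (256 == 1.0). In an ISP this is four
     * multipliers sitting in the datapath; in software it is a multiply,
     * a shift and a clamp for every pixel of every frame. */
    static void white_balance_rggb(uint16_t *frame, size_t width, size_t height,
                                   uint16_t gain_r, uint16_t gain_gr,
                                   uint16_t gain_gb, uint16_t gain_b)
    {
        for (size_t y = 0; y < height; y++) {
            for (size_t x = 0; x < width; x++) {
                uint16_t gain;

                if (y & 1)
                    gain = (x & 1) ? gain_b : gain_gb;
                else
                    gain = (x & 1) ? gain_gr : gain_r;

                uint32_t v = ((uint32_t)frame[y * width + x] * gain) >> 8;
                frame[y * width + x] = v > 0xffff ? 0xffff : (uint16_t)v;
            }
        }
    }

At 4K and 30 frames per second that inner loop body runs roughly 250 million times per second, and the whole frame is dragged through the CPU caches twice (read and write) for this one step alone.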

We still need specialized hardware for this?

Posted Aug 19, 2022 14:26 UTC (Fri) by fratti (guest, #105722)

Maybe "needed" is the wrong word, "preferred" would be better. I'm not sure how I can word this in a way that isn't several paragraphs but in essence: Von Neumann bad, specialised hardware good because efficient. Going forward there will be a lot more specialised hardware, and that has already been the way the wind is blowing for the past decade or so. (Neural accelerators, network stack offloading to NICs, hardware video encoders and decoders, ...)

We still need specialized hardware for this?

Posted Aug 22, 2022 2:12 UTC (Mon) by dvdeug (guest, #10998)

This is a cycle in computing; functionality has moved from external hardware into the CPU as often as the other way around: floating-point math, sound cards, modem audio processing (the notorious Winmodem), and so on. Many CPUs have integrated GPUs; I suspect the majority of computers use integrated graphics and will continue to do so, simply because most non-gaming computers don't need a graphics card. At one point I might have needed hardware video encoders and decoders, but now my computer can handle full-screen video and audio in real time with the CPU and integrated graphics.

It's not efficient to have an extra part that can fail, requires separate cooling, and sits across an external bus that may be a whole light-nanosecond away. I don't personally expect great growth in specialized hardware; if neural accelerators become something a general-purpose computer needs, they're likely to be folded into the CPU, at the very least in cheap computers.

We still need specialized hardware for this?

Posted Aug 22, 2022 15:48 UTC (Mon) by immibis (subscriber, #105511)

Tangential: what was actually the problem with Winmodems in the end? Couldn't open source developers write DSP code?

We still need specialized hardware for this?

Posted Aug 22, 2022 16:15 UTC (Mon) by farnz (subscriber, #17727)

Not legally. Underlying the WinModem problem was the fact that in many jurisdictions it was outright illegal to write DSP code for a part directly attached to the phone line (you'd have to use an acoustic coupler to avoid the regulations).

Even ignoring the legal issues (which people like Richard Close and Pavel Machek did, and they wrote working DSP code), compute resources of the era weren't great, so the DSP code mostly ended up running in kernel space to meet the required real-time guarantees. It was done, and WinModems were made to work, but they were never great; by the time there was enough compute to run the entire DSP in userspace, people had moved away from dial-up.

We still need specialized hardware for this?

Posted Aug 19, 2022 16:40 UTC (Fri) by excors (subscriber, #95769)

AVX is built for people who want to put some effort into developing a reasonably fast and reasonably power-efficient implementation of their algorithm, but who don't want to spend years and many millions of dollars to build an ASIC that can do it with greater speed or efficiency.

A lot of stuff falls into that category, but many image-processing algorithms have particularly high performance requirements (the IPU6 claims to support 4K video at 90fps, which is a lot of pixels in a thin battery-powered device) and are mature enough that Intel can be confident they'll still be useful several years later, in which case it's worth investing in dedicated hardware.
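
To put a number on "a lot of pixels", a back-of-the-envelope calculation (the 2 bytes per pixel for raw Bayer data is an assumption; actual sensor formats vary):

    #include <stdio.h>

    int main(void)
    {
        /* 4K (UHD) at 90 frames/s, as claimed for the IPU6. */
        double pixels_per_frame = 3840.0 * 2160.0;   /* ~8.3 Mpixel */
        double pixel_rate = pixels_per_frame * 90.0; /* ~746 Mpixel/s */
        double byte_rate = pixel_rate * 2.0;         /* assumed 2 B/pixel raw */

        printf("%.0f Mpixel/s, %.1f GB/s raw\n",
               pixel_rate / 1e6, byte_rate / 1e9);
        return 0;
    }

That's roughly 750 megapixels and 1.5 GB of raw data per second before any processing has happened, and every software pass over the frame adds that much memory traffic again.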

The tradeoff keeps shifting as algorithms change and as general-purpose processors get more efficient and as specialised hardware gets more flexible, but I think the long-term trend is towards increasingly complex combinations of specialised hardware and microcontroller firmware and CPU software, which needs more sophisticated APIs to handle synchronisation between all those components.

