We still need specialized hardware for this?
Posted Aug 19, 2022 10:53 UTC (Fri) by epa (subscriber, #39769)
Parent article: The growing image-processor unpleasantness
Posted Aug 19, 2022 11:26 UTC (Fri) by laurent.pinchart (subscriber, #71290)
Posted Aug 19, 2022 14:26 UTC (Fri) by fratti (guest, #105722)
Posted Aug 22, 2022 2:12 UTC (Mon) by dvdeug (guest, #10998)
It's not efficient to have an extra part that can fail, requires separate cooling, and communicates over an external bus that may be a whole light-nanosecond away. I don't personally expect great growth in specialized hardware; if neural accelerators become something a general-purpose computer needs, they are likely to be folded into the CPU, at least in cheap computers.
Posted Aug 22, 2022 15:48 UTC (Mon) by immibis (subscriber, #105511)
Posted Aug 22, 2022 16:15 UTC (Mon) by farnz (subscriber, #17727)
Not legally. Underlying the problem with WinModems was the issue that it was outright illegal in many jurisdictions to write DSP code for a part directly attached to the phone line (you'd have to use an acoustic coupler to avoid the regulations).
Even ignoring the legal issues (which people like Richard Close and Pavel Machek did, writing working DSP code), you had the problem that compute resources of the era weren't great, so the DSP code mostly ended up running in kernel space to meet the required real-time guarantees. It was done, and WinModems were made to work, but they were never great; and by the time compute resources were good enough to run the entire DSP in userspace, people had moved away from dial-up.
Posted Aug 19, 2022 16:40 UTC (Fri) by excors (subscriber, #95769)
A lot of stuff falls into that category, but many image-processing algorithms have particularly high performance requirements (IPU6 claims to support 4K video at 90fps, which is a lot of pixels for a thin battery-powered device) and are mature enough that Intel can be confident they'll still be useful several years later, in which case it's worth investing in dedicated hardware.
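To put "a lot of pixels" in perspective, a back-of-the-envelope calculation (assuming 4K means the common 3840x2160 UHD resolution, which the comment does not specify) looks like this:

```python
# Rough pixel throughput for 4K (UHD) video at 90 frames per second
width, height, fps = 3840, 2160, 90

pixels_per_frame = width * height          # 8,294,400 pixels per frame
pixels_per_second = pixels_per_frame * fps # ~746 million pixels per second

print(f"{pixels_per_second / 1e6:.0f} Mpixels/s")  # -> 746 Mpixels/s
```

At roughly 750 million pixels per second, even a few dozen arithmetic operations per pixel amounts to tens of billions of operations per second, which helps explain why a fixed-function ISP can be far more power-efficient than general-purpose cores for this workload.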
The tradeoff keeps shifting as algorithms change and as general-purpose processors get more efficient and as specialised hardware gets more flexible, but I think the long-term trend is towards increasingly complex combinations of specialised hardware and microcontroller firmware and CPU software, which needs more sophisticated APIs to handle synchronisation between all those components.
