
The growing image-processor unpleasantness

By Jonathan Corbet
August 18, 2022
There was a time when care had to be taken when buying hardware if the goal was to run Linux on it. The situation has improved considerably in recent decades, and unsupported hardware is more the exception than the rule. That has, for many years, been especially true of Intel hardware; that company has made a point of ensuring that its offerings work with Linux. So it is a bit surprising that the IPU6 image processor shipped with Alder Lake CPUs lacks support in Linux, and is unlikely to get it anytime soon. The problem highlighted here goes beyond just Intel, though.

The IPU6, like most image processors, exists to accept a data stream from a camera sensor and turn it into a useful video stream. These processors can take on a lot of tasks, including rotation, cropping, zooming, color-space conversion, white-balance correction, noise removal, focus management, and more. They are complex devices; the kernel community has responded by creating some equally complex APIs, including Video4Linux2 (V4L2) and media controller, to allow user space to manage them. As long as a device comes with a suitable driver, the kernel can make a camera device available to user space which, with care, can work with it without needing to know the details of the hardware.
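To make that concrete, the first steps of a V4L2 application are straightforward for a device with a conventional driver. The sketch below is only illustrative: the device node and requested format are arbitrary examples, and the buffer handling and media-controller setup needed by more complex devices are omitted.

    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/videodev2.h>
    #include <cstdio>

    int main() {
        // /dev/video0 is just an example node; real applications enumerate devices.
        int fd = open("/dev/video0", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        // Ask the driver what it can do.
        v4l2_capability cap{};
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) { perror("VIDIOC_QUERYCAP"); return 1; }
        printf("driver: %s, card: %s\n", (const char *)cap.driver, (const char *)cap.card);

        // Request a common capture format; the driver may adjust it.
        v4l2_format fmt{};
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 1280;
        fmt.fmt.pix.height = 720;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }
        printf("negotiated %ux%u\n", fmt.fmt.pix.width, fmt.fmt.pix.height);

        // Buffer setup (VIDIOC_REQBUFS/QBUF/STREAMON) and the media-controller
        // topology needed by complex devices are omitted here.
        close(fd);
        return 0;
    }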

As Paul Menzel recently pointed out on the linux-kernel mailing list, there is no such driver for the IPU6, so a mainline Linux kernel cannot drive it. As a result, the kernel lacks support for MIPI cameras on some current laptops, including some versions of the Thinkpad X1 Carbon and Dell XPS 13, which are relatively popular with Linux users (cameras using other interfaces, such as USB UVC, are generally supported). To get around this problem, Dell ships a closed-source, user-space driver in the Ubuntu build it offers on the XPS 13. Lenovo, instead, is not selling the affected systems with Linux preloaded at all at this point.

Laurent Pinchart provided more details on this situation. IPU6 support on Ubuntu is based on a kernel driver that provides a V4L2 interface, but which interfaces with a proprietary user-space driver to actually get the work done. As Ubuntu's Dell page notes, this solution is not without its costs: "CPU performance is impacted by a daemon doing buffer copying for v4l2 loopback device" and the camera can only provide 720p resolution. Pinchart went on to say that the IPU6 will not be the only problematic device out there:

Given the direction the industry is taking, this situation will become increasingly common in the future. With the notable exception of Raspberry Pi who is leading the way in open-source camera support, no SoC vendor is willing today to open their imaging algorithms.
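Returning to the Ubuntu workaround described above: in rough terms it amounts to a copy loop, where a daemon pulls each processed frame out of the proprietary stack and writes it into a v4l2loopback output device, which then appears to applications as an ordinary camera. The following sketch shows the general shape of such a loop; the device node, resolution, and the stand-in frame source are all hypothetical, not taken from the actual Dell/Ubuntu implementation.

    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/videodev2.h>
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Hypothetical stand-in for the closed user-space driver; here it just
    // fills a gray test frame.
    static bool get_frame_from_proprietary_stack(std::vector<unsigned char> &frame) {
        std::fill(frame.begin(), frame.end(), 0x80);
        return true;
    }

    int main() {
        // /dev/video42 is an example v4l2loopback node created beforehand.
        int fd = open("/dev/video42", O_WRONLY);
        if (fd < 0) { perror("open"); return 1; }

        v4l2_format fmt{};
        fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        fmt.fmt.pix.width = 1280;
        fmt.fmt.pix.height = 720;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        fmt.fmt.pix.sizeimage = 1280 * 720 * 2;    // 2 bytes per pixel for YUYV
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

        std::vector<unsigned char> frame(fmt.fmt.pix.sizeimage);
        while (get_frame_from_proprietary_stack(frame)) {
            // This per-frame copy into the loopback device is the extra CPU
            // work that the Ubuntu notes complain about.
            if (write(fd, frame.data(), frame.size()) < 0) { perror("write"); break; }
        }
        close(fd);
        return 0;
    }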

Improving this situation will require work on a couple of fronts. On the user-space side, Pinchart pointed out that the libcamera project was created for the explicit purpose of supporting complex devices like the IPU6; this project was covered here in 2019. Among other things, libcamera was designed to allow the plugging-in of proprietary image-processing code while maximizing free-software choice. It currently supports the earlier IPU3 processor either with or without the proprietary plugin, though not all functionality is available without it.
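A minimal libcamera client, for comparison, only ever talks to the generic API; the pipeline handler (and any proprietary plugin behind it) does the device-specific work. The sketch below is based on libcamera's public C++ API as documented around that time and may need adjusting for other versions.

    #include <libcamera/libcamera.h>
    #include <iostream>
    #include <memory>

    int main() {
        // Start the camera manager and pick the first camera it found.
        libcamera::CameraManager cm;
        cm.start();
        if (cm.cameras().empty()) { std::cerr << "no camera found" << std::endl; return 1; }

        std::shared_ptr<libcamera::Camera> camera = cm.cameras()[0];
        camera->acquire();

        // Ask for a configuration suited to a viewfinder stream; the pipeline
        // handler (and, where one exists, its image-processing plugin) fills in
        // the hardware-specific details behind this generic API.
        std::unique_ptr<libcamera::CameraConfiguration> config =
            camera->generateConfiguration({ libcamera::StreamRole::Viewfinder });
        config->validate();
        camera->configure(config.get());
        std::cout << "configured: " << config->at(0).toString() << std::endl;

        // Buffer allocation, request queueing, and the completion callbacks
        // that deliver frames are omitted here.
        camera->release();
        cm.stop();
        return 0;
    }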

Providing a similar level of support for the IPU6 should be possible, but it will take a fair amount of work. It also is going to need a proper kernel driver for the IPU6, which could be a problem. Evidently, the complexity of this device is such that the V4L2 API cannot support it, so a new API will be required. A candidate API exists in a preliminary form; it is called CAM (or KCAM) and, according to Sergey Senozhatsky, the plan is to get that merged into the kernel, then to add an IPU6 driver that implements this API. Pinchart responded that this merging will not happen quickly because the API, as proposed, is not seen favorably in the kernel community.

The V4L2 API has required extensive development and evolution over many years to reach its current point; it will not be replaced overnight. That is doubly true if the proposed alternative has been developed in isolation and never been through any sort of public discussion, which is the case here. Senozhatsky posted a pointer to a repository containing a CAM implementation, but that code is only enlightening in the vaguest terms. There is no documentation and no drivers actually implementing this API. It is highly abstracted; as Sakari Ailus put it: "I wouldn't have guessed this is an API for cameras if I hadn't been told so".

Creating a new camera API will be, as Ailus described it, "a major and risky endeavour"; it will require a long conversation, and it's not clear that CAM has moved the clock forward by much. According to Senozhatsky, the CAM code will not be submitted upstream for "several months". He later suggested that this plan could be accelerated, but the fact remains that the community has not even begun the process of designing an API that is suitable for the next generation of image processors.

The bottom line is that purchasers of Linux systems are going to have to be more careful for some time; many otherwise nice systems will not work fully without a proprietary camera driver and, as a result, will lack support in many distributions. Making this hardware work at all will involve losing CPU performance to the current workaround. Working all of this out is likely to take years and, even then, it's not clear that there will be viable free-software solutions available. It is a significant step backward for Intel, for Linux, and for the industry as a whole.

Postscript: we were asked to add the following statement by the KCAM developers, who feel that the portrayal of the project in this article was not entirely fair:

Google has been working closely with the community members and hardware vendors since 2018. This includes organising two big workshops (in 2018 and 2020).

The development has been done in public and has been presented in public forums to welcome as many interested parties as possible.

We are looking forward to completing the development of the initial proposal and posting it on the mailing list, to get the opinion of the community. ChromeOS believes in “Upstream first” for our kernel work and will continue to work in that direction.


Index entries for this article
Kernel: Device drivers/Video4Linux2



The growing image-processor unpleasantness

Posted Aug 18, 2022 14:52 UTC (Thu) by fredex (subscriber, #11727) [Link] (4 responses)

One wonders why Intel has left Linux in the lurch, as they commonly seem to provide necessary underpinnings. I didn't see any mention here of whatever their rationale may be.

The growing image-processor unpleasantness

Posted Aug 18, 2022 15:54 UTC (Thu) by pizza (subscriber, #46) [Link] (2 responses)

Just a guess here, but they probably licensed the image processing core from someone else, and don't have the rights to release code/etc. Not unlike their Atom GPUs that were PowerVR-based.

The growing image-processor unpleasantness

Posted Aug 24, 2022 19:01 UTC (Wed) by andy_shev (subscriber, #75870) [Link]

It used to be Silicon Hive. And actually AtomISP2 (the predecessor of IPU3) is gaining momentum (thanks to the community and some code published for Android).

The growing image-processor unpleasantness

Posted Aug 25, 2022 9:18 UTC (Thu) by NRArnot (subscriber, #3033) [Link]

AMD need to develop or buy in one that they can release. That would provide another reason not to buy Intel.

The growing image-processor unpleasantness

Posted Aug 18, 2022 18:07 UTC (Thu) by airlied (subscriber, #9104) [Link]

I think a lot of it is that Intel has split its "Linux" groups out into the product groups in many places. This means more exposure to hardware and Windows, more "code sharing", and less emphasis on upstream and open source. They get that the GPL means kernel drivers must be open, but they are often happy to just share binary Windows code for user space and think that is acceptable, until they find out, after ignoring their internal feedback, that it isn't.

The growing image-processor unpleasantness

Posted Aug 18, 2022 16:07 UTC (Thu) by logang (subscriber, #127618) [Link] (9 responses)

I think Intel has not worked well with Linux in recent years despite their good reputation for it.

I have an office desktop with a UHD Graphics 630 which has always been fussy about booting with working graphics, but more recently I've been stuck on the 5.4 kernel for a long time because newer kernels simply do not work at all. I haven't had time to check in a few months, but I've spent way more time than I'd like trying to get it to work. At one point I thought a BIOS update was the solution, but that did not help.

I have a (now 4 year old) laptop with a microphone (SST based) that has never worked despite diving through a ton of bug report noise and spending way too much time trying to configure the right driver (there's some mess of multiple sound drivers for their hardware) and right special firmware. Still never got it working and gave up.

Just this week, a friend of mine who bought a brand-new Alder Lake laptop has been desperately trying to install the latest available kernel and firmware because the HDMI port does not work, nor does the laptop go to sleep properly. He reads forums with arcane instructions like installing the linux-oem-kernel package (whatever that is) but, as yet, hasn't solved the problem.

Intel may have had a few good years of great support for Linux, but IMO they no longer deserve that reputation.

The growing image-processor unpleasantness

Posted Aug 18, 2022 22:24 UTC (Thu) by marcH (subscriber, #57642) [Link] (5 responses)

> Intel may have had a few good years of great support for Linux, but IMO they no longer deserve that reputation.

Any better alternative?

The growing image-processor unpleasantness

Posted Aug 19, 2022 3:49 UTC (Fri) by xanni (subscriber, #361) [Link] (3 responses)

Personally, I switched to AMD a few years ago and haven't regretted it.

The growing image-processor unpleasantness

Posted Aug 22, 2022 13:23 UTC (Mon) by xophos (subscriber, #75267) [Link]

I can confirm that. At least on the graphics card side AMD is much less trouble than intel or nvidia.

The growing image-processor unpleasantness

Posted Aug 31, 2022 9:00 UTC (Wed) by bingbuidea (guest, #67887) [Link] (1 responses)

Could you share your model number and the ISP version on your AMD system?

The growing image-processor unpleasantness

Posted Oct 4, 2022 8:39 UTC (Tue) by Klavs (guest, #10563) [Link]

I switched to a Lenovo T14s G1 - and am very happy with it. Fantastic battery life (10+ hours of working) - and sleep etc. "just works".
So yes - Intel seems to have dropped the ball, and AMD definitely picked it up.

The growing image-processor unpleasantness

Posted Aug 19, 2022 8:04 UTC (Fri) by opsec (subscriber, #119360) [Link]

As a FreeBSD user, I'm quite happy with AMD.

The growing image-processor unpleasantness

Posted Aug 19, 2022 8:11 UTC (Fri) by linusw (subscriber, #40300) [Link]

> I think Intel has not worked well with Linux in recent years despite their good reputation for it.

This might have coincided with Dirk Hohndel leaving Intel as "open source strategic technologist" in June 2016.

The growing image-processor unpleasantness

Posted Aug 26, 2022 14:24 UTC (Fri) by riteshsarraf (subscriber, #11138) [Link]

I was frustrated with the bugs in the Intel graphics stack, seen so often. Having grown even more frustrated with reporting them to their bug tracker, I bit the bullet and chose AMD for my new machine. And I'm glad to say it was a good decision.

The growing image-processor unpleasantness

Posted Aug 30, 2022 20:25 UTC (Tue) by piexil (guest, #145099) [Link]

The laptop not sleeping properly is 100% Intel's (and Microsoft's) doing. Newer Intel CPUs removed S3 entirely in favor of modern standby with S0ix states, but the UEFI firmware falsely reports that S3 is still supported, so Linux tries to enter it.

My GPD Pocket 3 suffers from this. The current workarounds are to use the s2idle state, which drains quite a bit of power in standby (only a day of standby time), or to hibernate.

The growing image-processor unpleasantness

Posted Aug 18, 2022 16:29 UTC (Thu) by tialaramex (subscriber, #21167) [Link] (14 responses)

So, UVC cameras aren't very expensive and seem pretty good, yet they aren't doing anything V4L2 isn't fine with. Presumably, then, the IPU6 is either doing something much better (higher resolutions? Better quality images from cheaper sensors?) or is itself so much cheaper than what's inside a UVC camera that it's worth it?

If the result isn't that we get cheaper and better cameras, why do this? Intel are of course welcome to ship worse stuff for more money, but there's no reason anybody should buy it.

The growing image-processor unpleasantness

Posted Aug 18, 2022 16:43 UTC (Thu) by farnz (subscriber, #17727) [Link] (4 responses)

UVC pushes all of the sensor-to-useful-format conversion into firmware inside the camera - sensor data goes into the firmware, H.264 or similar comes out. Something like the IPU6 gets raw sensor data, allows you to apply processing steps (under software control), and then lets you do what you like with the resulting stream of data.

There's a lot more control exposed by something like the IPU6, but it's not necessarily useful; your UVC camera has an IPU6 equivalent in it, plus hardware and firmware to encode to H.264, but you can't control the processing the camera does (at an absolute minimum, you need to demosaic the raw sensor data, but you'll also want to do things like dynamic range processing, exposure control and more).

The growing image-processor unpleasantness

Posted Aug 18, 2022 17:57 UTC (Thu) by mss (subscriber, #138799) [Link] (3 responses)

If the whole processing logic happened inside IPU itself there shouldn't be a problem - the required IPU data and code would then be equivalent to a proprietary firmware blob.
IOMMU would then take care of protecting the rest of the system from that blob's actions.

I suspect, however, that due to insufficient IPU capabilities at least part of the proprietary processing must happen on the CPU and that's where things become messy.

Anyway, I think just providing raw sensor output (much like a RAW file from a DSLR) would be a step in the right direction towards developing the required open-source processing.
After all, we already have open-source DSLR RAW file processors.

The growing image-processor unpleasantness

Posted Aug 19, 2022 3:54 UTC (Fri) by xanni (subscriber, #361) [Link] (2 responses)

Additionally, some of the truly groundbreaking new image processing work actually requires the raw image data - see this video for a cutting edge example:

Google’s New AI Learned To See In The Dark! 🤖
https://www.youtube.com/watch?v=7iy0WJwNmv4

I'd rather have a camera that provides access to the raw data so I can implement these new algorithms in open source code and get capabilities a generation ahead, instead of being stuck with whatever they implement in the firmware.

The growing image-processor unpleasantness

Posted Aug 21, 2022 3:15 UTC (Sun) by bartoc (guest, #124262) [Link] (1 responses)

This is neat, it looks like they are basically doing a long exposure image but with multiple really noisy short exposures, so they can figure out the motion between images and do stabilization. Smartphone and webcam sensors are really teensy so this kinda thing is quite useful for them.

Ditto on the depth of field simulation (although I wonder how accurate it is), it's really hard to get a lot of DoF with those tiny sensors given how the lens geometry ends up. I wonder how it compares with light field cameras, which could produce some images that would be really, really, really hard to get with a normal camera.

The growing image-processor unpleasantness

Posted Sep 13, 2022 10:18 UTC (Tue) by nye (subscriber, #51576) [Link]

That video doesn't make the method 100% clear, but it's actually very different from synthesising a long exposure image out of multiple short exposure images.

The kind of view synthesis described in this paper requires images taken from multiple angles, and uses all the combined data to generate a radiance field - basically a model of where the light is in the scene, its direction, and its value. Once you've done that, you can generate a new synthesised view from any angle, including an angle which matches one of the input images.

I'm not sure how the various NeRF methods currently compare to more traditional light field methods, but this is an area of rapid research and I would expect any answers to get stale very quickly.

The growing image-processor unpleasantness

Posted Aug 18, 2022 19:17 UTC (Thu) by excors (subscriber, #95769) [Link] (4 responses)

One significant benefit of having the image processor integrated into the SoC (instead of receiving externally-processed H.264 over USB) is that you can implement some algorithms on the CPU and/or GPU, since they have efficient access to the full-resolution uncompressed video stream. E.g. Dell says they've implemented their own Temporal Noise Reduction (to improve low-light image quality) which is presumably using that flexibility to insert their algorithm into the processing pipeline, and wouldn't be possible with an external ISP. (https://www.dell.com/en-in/blog/innovating-through-the-le...)

It looks like Intel is similarly implementing some of their ISP algorithms on the CPU (as non-open-source libraries), probably because the cost and multi-year lead time of implementing algorithms in hardware is totally impractical when it's a field of active research (as much of this stuff is), and for many algorithms the CPU is efficient enough.

The integrated ISP also saves cost by reusing hardware blocks like the video encoder (which the SoC has to have anyway, for use cases like screen recording), sharing RAM (you need a lot for 4K video, but when the camera's off you want to let other applications use it), letting you share the same ISP hardware between multiple cameras, etc. And there's the general efficiency benefit of having fewer chips.

Combined sensor+ISP packages are easier to use, but I think having raw sensors connected to the SoC's ISP is better in pretty much every other way.

The growing image-processor unpleasantness

Posted Aug 19, 2022 11:52 UTC (Fri) by laurent.pinchart (subscriber, #71290) [Link]

Combining algorithms implemented in dedicated hardware with algorithms implemented on the CPU or in other programmable processors (GPU, DSP, ...) is one of the benefits of the new camera architecture that vendors are shifting to.

It's important here to note that the algorithms, especially when implemented in an ISP, are typically split into two distinct parts. The ISP performs the computations and processing that are very heavy for the CPU. This includes pixel processing (from applying simple colour gains to applying complex spatial and temporal filters), as well as statistics computation (histograms, accumulators, sharpness estimation, ...). That's the processing part of the algorithms. The ISP needs to be configured with a large number of parameters to perform this (gains, filter kernels, look-up tables, ...). Computing those parameters from the statistics is then done in software, running on the CPU in order to provide full flexibility. This is the control part of the algorithms, and it needs to run in real time to compute ISP processing parameters that react to changing scene conditions. This is how auto-exposure, auto white balance and auto-focus are implemented: from an ISP point of view, there is only white balance; the "auto" part is handled by the CPU.
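As a toy illustration of that split, the control step of a "grey world" auto-white-balance loop might look like the sketch below. Everything here is hypothetical: the structures stand in for whatever statistics and parameter buffers a real ISP driver exposes.

    #include <algorithm>

    // Hypothetical structures standing in for what a real ISP driver exposes:
    // per-frame statistics out, processing parameters back in.
    struct IspStats  { double r_avg, g_avg, b_avg; };   // channel accumulators
    struct IspParams { double r_gain, b_gain; };        // white-balance gains

    // Control side of a simple "grey world" AWB loop: it runs on the CPU once
    // per frame and turns the ISP's statistics into the gains the ISP will
    // apply to the next frame. The heavy per-pixel work never leaves the ISP.
    IspParams computeAwb(const IspStats &s) {
        IspParams p{1.0, 1.0};
        if (s.r_avg > 0.0 && s.b_avg > 0.0) {
            p.r_gain = std::clamp(s.g_avg / s.r_avg, 0.5, 4.0);
            p.b_gain = std::clamp(s.g_avg / s.b_avg, 0.5, 4.0);
        }
        return p;
    }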

The growing image-processor unpleasantness

Posted Sep 2, 2022 7:08 UTC (Fri) by mordae (guest, #54701) [Link] (2 responses)

We are slowly entering the PCIe 5.0 era. That's 3.938 GB/s per lane. I doubt that we really need to save bandwidth at this point.

Energy savings could be realized by letting the sensor and GPU share the same buffer, so that the CPU doesn't have to touch it until the GPU has done its job, instead of adding another semi-programmable ASIC or a single-purpose FPGA to the mix.

This only makes sense for embedded applications where the CPU is super super low-performance, GPU is non-existent and the goal is to transmit high quality video data over network. Typically surveillance cameras.

The growing image-processor unpleasantness

Posted Sep 2, 2022 7:20 UTC (Fri) by mordae (guest, #54701) [Link]

> This only makes sense for embedded applications

I forgot to add: from the technical perspective

There is obviously a competition on smartphone image quality going on.

The growing image-processor unpleasantness

Posted Sep 2, 2022 8:46 UTC (Fri) by excors (subscriber, #95769) [Link]

> Energy savings could be realized by letting the sensor and GPU share the same buffer so that CPU don't have to touch it until GPU has done it's job. Instead of adding another semi-programmable ASIC or a single-purpose FPGA to the mix.

I'm not sure if I'm correctly interpreting what you're suggesting, but: The output from the sensor will be something like 10/12-bit Bayer. I don't think it'd be particularly useful to share that directly with the GPU, because GPU memory architectures seem to be optimised for 32-bit-aligned elements with 8/16/32-bit components (since that's what graphics and most GPGPU uses) and they'd be inherently inefficient at processing the raw Bayer data. So at the very least, you need some dedicated hardware to do the demosaicing efficiently, and probably any other processing that benefits from the sensor's 10/12-bit precision (where it'd be wasteful to use the GPU's 16-bit ALU operations), before saving to RAM as 8-bit YUV.

Once you've got YUV, then it's useful to do zero-copy sharing with the GPU, and I think any sensible software architecture will already do that. (Intel's GPU has some built-in support for planar YUV420 texture-sampling to help with that.)

But a lot of the image processing will still be inefficient on a GPU. It's way too expensive to write the whole frame to RAM after every processing step (given you may have a dozen steps, and it's a 4K frame at 90fps) - you want to read a chunk into the GPU's fast local memory and do all the steps at once before writing it out, and use the GPU's parallelism to process multiple chunks concurrently. But Intel's GPU has something like 64KB of local memory (per subslice), so you're limited to chunks of maybe 128x64 pixels. Whenever you apply some processing kernel with a radius of N, the 128x64 of input becomes (128-N)x(64-N) of valid output, and if you do many processing steps then you end up with a tiny number of useful pixels for all that work. The GPU memory architecture is really bad for this. (And that's not specific to Intel's GPU, I think they're all similar.)
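For a back-of-the-envelope feel for that shrinkage, here is a small calculation; the tile size, stage count and radius are illustrative numbers, not measured values.

    #include <cstdio>

    int main() {
        // Illustrative numbers only: a 128x64 tile held in GPU local memory,
        // run through 12 chained filter stages, each with a spatial radius of 2.
        int w = 128, h = 64;
        const int total = w * h;
        const int stages = 12, radius = 2;

        // Each stage needs `radius` pixels of context on every side, so the
        // valid output region shrinks by 2*radius in each dimension per stage.
        for (int i = 0; i < stages; ++i) {
            w -= 2 * radius;
            h -= 2 * radius;
        }

        printf("valid output: %dx%d = %d of %d pixels (%.1f%%)\n",
               w, h, w * h, total, 100.0 * w * h / total);   // 80x16, ~15.6%
        return 0;
    }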

So you still want dedicated hardware (with associated firmware) for most of that processing, with a much more efficient local memory architecture (maybe a circular array of line buffers per processing stage), and just use the GPU and/or CPU for any extra processing that you couldn't get into the hardware because of cost or schedule.

The growing image-processor unpleasantness

Posted Aug 18, 2022 22:27 UTC (Thu) by marcH (subscriber, #57642) [Link]

> Presumably then, the IPU6 is either doing something much better (higher resolutions? Better quality images from cheaper sensors?) or is itself so much cheaper than what's inside a UVC camera that it's worth it?

How do you think smartphone cameras make PC / UVC cameras look 30 years old?

The growing image-processor unpleasantness

Posted Aug 19, 2022 4:20 UTC (Fri) by pizza (subscriber, #46) [Link]

> Presumably then, the IPU6 is either doing something much better (higher resolutions? Better quality images from cheaper sensors?)

A good example of this is Google's Pixel phones -- The Pixel 3/3a, 4/4a, 5/5a, and 6a series use the relatively ancient 12.2Mp Sony IMX363 sensor, but the quality of the resulting images improved with each model as Google threw better algorithms and more processing power at the imaging pipeline.

The growing image-processor unpleasantness

Posted Sep 2, 2022 3:35 UTC (Fri) by dxin (guest, #136611) [Link] (1 responses)

(Cheap) UVC cameras are simply too crude.

Let's look at a very common UVC controller (maybe the only one "modern enough" to support UHD and USB SuperSpeed), the Cypress CX-3. It's more or less an ARM9 that handles the UVC/USB protocol, a DMA engine, and a 512KB RAM image buffer. And no, absolutely no image processing, not even AE/AWB/AF, although the UVC controller can relay commands to the lens motor if the host wants to control the lens.

Worse, noise removal and image enhancement are all left to the application because, as we know, the V4L2 kernel driver also does none of that, which is why images from most UVC cameras look so noisy.

Even worse, if the sensor outputs raw Bayer data (which the majority do now), then the task of converting that raw Bayer data is left entirely to the application (because neither the UVC controller nor the kernel driver will do it), and any application that cannot do that will simply ignore the camera.
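For illustration, even the crudest possible demosaicing step that an application would have to supply looks something like the following: a naive per-quad RGGB reconstruction, where real pipelines use far more sophisticated interpolation.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Naive demosaic of an RGGB Bayer frame into packed RGB, handling each 2x2
    // quad with no interpolation across quads. Real pipelines use far better
    // algorithms; this only shows the kind of work that lands on the application
    // when the camera delivers raw Bayer data.
    std::vector<uint8_t> demosaicRggb(const std::vector<uint8_t> &bayer,
                                      unsigned width, unsigned height) {
        std::vector<uint8_t> rgb(static_cast<size_t>(width) * height * 3);
        for (unsigned y = 0; y + 1 < height; y += 2) {
            for (unsigned x = 0; x + 1 < width; x += 2) {
                uint8_t r  = bayer[y * width + x];
                uint8_t g1 = bayer[y * width + x + 1];
                uint8_t g2 = bayer[(y + 1) * width + x];
                uint8_t b  = bayer[(y + 1) * width + x + 1];
                uint8_t g  = static_cast<uint8_t>((g1 + g2) / 2);
                // Give all four pixels of the quad the same reconstructed colour.
                for (unsigned dy = 0; dy < 2; ++dy)
                    for (unsigned dx = 0; dx < 2; ++dx) {
                        size_t i = (static_cast<size_t>(y + dy) * width + (x + dx)) * 3;
                        rgb[i] = r; rgb[i + 1] = g; rgb[i + 2] = b;
                    }
            }
        }
        return rgb;
    }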

For the most part of Linux history, the camera API, V4L2 is just a buffer management interface. There has never been a userspace middle layer that provides good and refined camera service to the application, nor has there been a good way of implementing it (PulseVideo?).

The camera stack of Android may not be the best design in the world, but I do consider it to be on the leading edge of usefulness to the majority of applications and users. The Chromium people just want to bring some of that back to Linux.

The growing image-processor unpleasantness

Posted Sep 3, 2022 18:37 UTC (Sat) by laurent.pinchart (subscriber, #71290) [Link]

Most, if not all, cameras based on a CX-3 use a "smart sensor" (a.k.a. "YUV sensor") that integrates a raw (Bayer) imaging sensor and an ISP in the same chip. Those tend to target low-cost applications and don't focus on image quality. A UVC camera could, however, be built around a good-quality raw sensor and a powerful ISP (and possibly even an H.264 or H.265 encoder to lower the USB bandwidth at high resolutions and frame rates). High-end webcams have such an architecture, and I wouldn't be surprised if some even ran Linux internally (I know of multiple webcam vendors who prototyped webcams based on an SoC running Linux, but I don't know if any are available on the market today).

> For the most part of Linux history, the camera API, V4L2 is just a buffer management interface. There has never been a userspace middle layer that provides good and refined camera service to the application, nor has there been a good way of implementing it (PulseVideo?).

That was right, and the libcamera project has been created to fill exactly that space.

> The camera stack of Android may not be the best design in the world, but I do consider it to be on the leading edge of usefulness to the majority of applications and users. The Chromium people just want to bring some of that back to Linux.

There seems to be some confusion here. Chrome OS uses the Android camera HAL3 API internally but, as far as I know, hasn't published any plan to push that API towards all Linux platforms. What they refer to as "kcam" today is a kernel API that they plan to use as a V4L2 replacement, and the current design places it at an even lower level than V4L2. The gap between the kernel API and applications would then be larger than with V4L2 (and it's already pretty large there, as indicated by how Linux has no open-source IPU6 support today).

The growing image-processor unpleasantness

Posted Aug 18, 2022 16:32 UTC (Thu) by flussence (guest, #85566) [Link] (37 responses)

Ah but you see, the laws of physics around sound waves and photons change every 5 years, so this hardware upgrade treadmill is totally necessary! Can't just simply put USB UVC and PCI HDA (or god forbid, AC97) in a machine and call it good; that would impair profits somehow.

Friends don't let friends buy bad laptops; I'll be sure to speak up if I ever find a good one but I'm almost certain it won't have Intel inside.

The growing image-processor unpleasantness

Posted Aug 18, 2022 19:42 UTC (Thu) by Sesse (subscriber, #53779) [Link] (34 responses)

Yeah, and cameras totally are not getting any better! Just look at the cell phone cameras 15 years ago and today, you'll see there's _no_ difference. None at all. And any difference certainly does not have anything to do with more sophisticated processing software.

The growing image-processor unpleasantness

Posted Aug 19, 2022 4:07 UTC (Fri) by stephen.pollei (subscriber, #125364) [Link] (29 responses)

Smartphones will kill off the DSLR within three years says Sony : .... stress the role that new hardware will play in lifting phone cameras to new photographic heights. ... While DSLRs and mirrorless cameras will always have an audience among hobbyists and pros due to their handling, creative control, viewfinders and single-shot image quality, the kinds of advances outlined in Sony's presentation show that the next few years are going to be a particularly exciting time for phone cameras.
I'm not an expert, but it sounds like some are claiming that camera hardware in phones is improving.

The growing image-processor unpleasantness

Posted Aug 19, 2022 5:38 UTC (Fri) by rsidd (subscriber, #2582) [Link] (3 responses)

I believe the previous comment was sarcasm.

The growing image-processor unpleasantness

Posted Aug 19, 2022 10:53 UTC (Fri) by Wol (subscriber, #4433) [Link] (2 responses)

Certainly one of the reasons I still stick very much with my DSLR is that, when I press the trigger (for the most part) the camera takes a photo. Not 5 seconds later when the picture has moved and isn't what I was after ...

Cheers,
Wol

The growing image-processor unpleasantness

Posted Aug 19, 2022 12:05 UTC (Fri) by excors (subscriber, #95769) [Link] (1 responses)

Smartphone cameras have supported Zero Shutter Lag for ages, so they'll capture the frame exactly when (or maybe even before) you pressed the shutter button, by storing a few raw frames in a circular buffer in RAM and then (after detecting the button press) looking backwards for the best frame to send to the ISP for full processing.

(And Google uses a similar technique for HDR photos - they take several frames captured before the shutter press, plus one long-exposure frame after the shutter press, and combine them into a single photo, with one of the pre-shutter frames as the reference to realign the others in case of motion. (https://ai.googleblog.com/2021/04/hdr-with-bracketing-on-...))
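The buffering part of that zero-shutter-lag technique is simple enough to sketch; the types and the selection policy below are illustrative only, and real implementations also score frames for sharpness, exposure and motion.

    #include <cstddef>
    #include <cstdint>
    #include <deque>
    #include <utility>
    #include <vector>

    // Sketch of a zero-shutter-lag buffer: keep the last few raw frames in RAM
    // and, on shutter press, pick one captured at (or just before) the press
    // instead of waiting for a new exposure.
    struct RawFrame { uint64_t timestamp_ns; std::vector<uint8_t> data; };

    class ZslBuffer {
    public:
        explicit ZslBuffer(size_t depth) : depth_(depth) {}

        void push(RawFrame frame) {
            if (frames_.size() == depth_)
                frames_.pop_front();          // drop the oldest frame
            frames_.push_back(std::move(frame));
        }

        // Best candidate for "the moment the user pressed the button",
        // or nullptr if nothing suitable is buffered.
        const RawFrame *select(uint64_t shutter_press_ns) const {
            const RawFrame *best = nullptr;
            for (const RawFrame &f : frames_)
                if (f.timestamp_ns <= shutter_press_ns)
                    best = &f;                // frames are kept in capture order
            return best;
        }

    private:
        size_t depth_;
        std::deque<RawFrame> frames_;
    };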

The growing image-processor unpleasantness

Posted Aug 20, 2022 17:57 UTC (Sat) by Wol (subscriber, #4433) [Link]

Then why isn't that easy for the user to find? Certainly our "user experience" - I have a Motorola G8, my wife has a G10 - is that progressive software updates have made the camera experience seriously less pleasant than when the phones were new.

(and trying to fix it by changing the settings has been an exercise in futility.)

Cheers,
Wol

The growing image-processor unpleasantness

Posted Aug 19, 2022 15:16 UTC (Fri) by mss (subscriber, #138799) [Link] (15 responses)

From the linked article:
Image quality from phones will finally trump that of their single-lens reflex rivals by 2024, according to Sony.

That's interesting, considering iPhone 13 Pro has a 44 mm² sensor, while iPhone 14 is supposed to have a main sensor that's 57% larger - so about 69 mm².

That's not even remotely comparable to entry-level APS-C DSLR (332 mm²), not to mention prosumer APS-H DSLR (519 mm²) or professional full-frame ones (~850 mm²).

Image processing can do a lot of perceptual improvement but ultimately can't restore something that the sensor didn't register (or that was hidden in noise) without assuming that the picture data adheres to a certain model.
In that case, however, the missing or corrupted data comes from the model in use rather than from the captured scene.

The growing image-processor unpleasantness

Posted Aug 19, 2022 21:54 UTC (Fri) by marcH (subscriber, #57642) [Link] (11 responses)

Also, DSLRs can and do implement the same algorithms.

Smartphones will not make DSLRs redundant by making better picture than DSLRs. They will make DSLRs "nicher and nicher" because no one will be able to see the difference anymore.

The growing image-processor unpleasantness

Posted Aug 20, 2022 19:26 UTC (Sat) by Sesse (subscriber, #53779) [Link] (10 responses)

DSLRs are already dying. E.g. Nikon has most likely made their last flagship DSLR (the Nikon D6). It's not like the mirror is paramount to image quality anyway; there are positively great pro mirrorless cameras out there now.

The growing image-processor unpleasantness

Posted Aug 20, 2022 22:15 UTC (Sat) by mss (subscriber, #138799) [Link] (9 responses)

It's not like the mirror is paramount to image quality anyway; there are positively great pro mirrorless cameras out there now.

You are right, my comment above should really refer to DSLRs and mirrorless cameras together since they share typical sensor sizes (much bigger than used by smartphone built-in cameras).

The growing image-processor unpleasantness

Posted Aug 22, 2022 13:58 UTC (Mon) by Wol (subscriber, #4433) [Link] (8 responses)

SLRs should have died 60+ years ago :-)

I don't know how old they are, but rangefinder cameras predate me, and they are the film equivalent of mirrorless.

And I remember, as a maybe 20-year-old, seeing all those ads for "top end SLRs can now do 1/250 shutter speed". I had inherited my father's - SLR - that at maybe 30 years old could do 1/500. And its contemporaneous sibling could do 1/1000!

So the technology to outperform your bog-standard SLR has been there for at least 60, 70 years. And of course, it's almost the same technology, only that bit better! Just like inferior VHS beat superior BetaMAX. (I don't know if BetaMAX could have stayed ahead by nicking the same improvements from VHS, the fact is it didn't. Likewise, I don't know if a leaf-shutter rangefinder could have stayed ahead of the SLR - my guess is it could.)

Cheers,
Wol

The growing image-processor unpleasantness

Posted Aug 22, 2022 14:02 UTC (Mon) by Wol (subscriber, #4433) [Link]

Whoops - I meant could flash-sync at 1/250. Okay, they had faster shutter speeds, but unless you used "slow burn" flash bulbs rather than an electronic flashgun, the gun and camera couldn't sync up correctly.

Cheers,
Wol

The growing image-processor unpleasantness

Posted Aug 22, 2022 16:06 UTC (Mon) by anselm (subscriber, #2796) [Link] (2 responses)

The main advantage of the focal-plane shutter as seen in SLRs is that it gives you interchangeable lenses and very fast shutter speeds, at the price of limited flash-sync exposure times. Leaf shutters, which tend to be slower and, for best results, need to be where the aperture is in a lens, are very unusual in interchangeable-lens SLRs, because being part of the – interchangeable – lens makes the individual lenses heavier and bulkier and more complicated and expensive to build (although it was tried, with limited success, in the 1960s), and you also need another mechanism to cover the film inside the camera while you're switching lenses. This is why in the film age, you tended to see leaf shutters in fixed-lens rangefinder-type cameras rather than interchangeable-lens SLRs, certainly for 35mm film.

Today's digital mirrorless ILCs invariably use focal-plane shutters rather than leaf shutters (when they have mechanical shutters at all), so the comparison is a little weak.

The growing image-processor unpleasantness

Posted Aug 23, 2022 3:46 UTC (Tue) by bartoc (guest, #124262) [Link] (1 responses)

I'm not sure any actually have mechanical shutters, they are all digital. Perhaps some of the really huge sensors do.

The growing image-processor unpleasantness

Posted Aug 23, 2022 15:16 UTC (Tue) by anselm (subscriber, #2796) [Link]

My Olympus mirrorless interchangeable-lens cameras certainly have a mechanical as well as a digital shutter. Both have their uses. AFAIK the only current mirrorless ILC that has only a digital shutter is the (very fancy) Nikon Z9.

The growing image-processor unpleasantness

Posted Aug 23, 2022 3:43 UTC (Tue) by bartoc (guest, #124262) [Link] (2 responses)

Well, rangefinders have that annoying offset, but with a mirrorless you are still looking through the lens, you're even looking through the sensor which is better (although a different exposure, the camera will simulate the exposure and aperture settings for you in the viewfinder/screen).

I think DSLRs (as opposed to 35mm SLRs, the last of which was the Nikon F6 which was discontinued in 2020) stayed around for so long because of the phase detect autofocus, go ahead and use a telephoto lens with autofocus on a mirrorless and compare to a DSLR, the DSLR is much, much faster compared to early mirrorless. So, sports photographers and nature photographers still bought them.

I think it was Sony who mostly fixed this with their hybrid AF system

The growing image-processor unpleasantness

Posted Aug 23, 2022 16:01 UTC (Tue) by anselm (subscriber, #2796) [Link] (1 responses)

go ahead and use a telephoto lens with autofocus on a mirrorless and compare to a DSLR, the DSLR is much, much faster compared to early mirrorless.

I think that has changed now that most camera makers support a type of hybrid AF. Modern mirrorless cameras are a lot faster than DSLRs (e.g., an OMDS OM-1 can capture 50 frames per second with continuous AF and no viewfinder blackout), and they also enable nifty features that would be impossible to do on a DSLR, e.g., taking pictures continuously while the shutter release is half-pressed but only storing a number of them around the instant of the actual shutter release (nature and sports photographers love that).

The growing image-processor unpleasantness

Posted Aug 23, 2022 19:30 UTC (Tue) by bartoc (guest, #124262) [Link]

yeah, like I said the hybrid AF systems have pretty much solved this.

The growing image-processor unpleasantness

Posted Aug 24, 2022 17:03 UTC (Wed) by plugwash (subscriber, #29694) [Link]

A camera with an optical viewfinder that does not use the main lens has a few issues.

1. it makes interchangeable lenses awkward as either you need to change two lenses at the same time or changes from swapping your main lens won't be reflected in the viewfinder. It would also make coupling the focusing aid to the lens focus tricky.

2. Similarly for zoom lenses, you would need two seperate zoom mechanisms one for the main lens and one for the viewfinder.

3. The two lenses are in different places; this is a problem for framing macro shots and can also lead to your pictures being ruined by an obstruction that was not visible in the viewfinder.

SLRs (whether film or digital) and digital cameras with digital viewfinders avoid these issues.

The growing image-processor unpleasantness

Posted Aug 20, 2022 14:54 UTC (Sat) by patrakov (subscriber, #97174) [Link] (2 responses)

That's an interesting aspect from the perspective of using smartphone-captured photos as evidence. If they invent detail from the model, not from the original scene, how can the result be trusted?

The growing image-processor unpleasantness

Posted Aug 20, 2022 18:23 UTC (Sat) by Wol (subscriber, #4433) [Link]

The same way they trust prints from old-fashioned silver-based film.

What matters is to know the chain from shutter-press to final image. They've tampered with images from the dawn of photography.

Cheers,
Wol

The growing image-processor unpleasantness

Posted Aug 20, 2022 22:08 UTC (Sat) by mss (subscriber, #138799) [Link]

If they invent detail from the model, not from the original scene, how can the result be trusted?

It's worth noting that lossy compression formats have a similar problem.
For example, the JBIG2 algorithm used in faxes, PDFs and DjVu files is prone to silent number-glyph substitutions.

The growing image-processor unpleasantness

Posted Aug 23, 2022 3:36 UTC (Tue) by bartoc (guest, #124262) [Link] (8 responses)

That presentation seems a little nuts to me.

Firstly, it seems to confuse DSLRs with mirrorless cameras. DSLRs are absolutely going away; I think this is due to improvements in display resolution and (especially) brightness, as well as improvements to autofocus, especially for people who use telephoto lenses a lot. The advantages of mirrorless are huge.

Also, the bit about image sensor size doubling doesn't seem right to me, though it might happen (I think most phones are around 1/3" to 1/2" now; Sony shipped a phone with a 1" sensor, but the lens couldn't cover the whole sensor, making it somewhat pointless). Bigger sensors mean a lens needs a longer focal length to give the same framing (i.e., the smaller sensors are "zoomed in" or "cropped" compared to the bigger ones with the same lens). Lenses with longer focal lengths are bigger, need more elements, and are more expensive to manufacture, to the point that a good ~35mm prime lens for a full-frame camera (I think most smartphone lenses are about equivalent to that) is about as expensive as a mid-range phone. The bigger sensors are more expensive too.

We _have_ been seeing CMOS sensor size in pro cameras get bigger pretty quickly, hell you can even buy a large format digital back now. Medium format digital cameras have been getting very popular over the last few years.

In any case, even if the sensor size goes up to 1", or a bit more there would still be interest in APS-C and full frame ILC cameras, that format is much larger, and you can get physical bokeh/depth of field, which is not possible with phone cameras because the lenses end up with really short focal lengths. Cameras with full-frame or APS-C also still do much better in low light than phone cameras, which is important for both indoor photos and if you are using a telephoto lens or a teleconverter.

The growing image-processor unpleasantness

Posted Aug 23, 2022 6:45 UTC (Tue) by jem (subscriber, #24231) [Link] (1 responses)

>I think this was due to improvements in display resolution and (esp) brightness, as well as improvements to autofocus, esp for people who use telephoto lenses a lot.

One thing that was holding back mirrorless interchangeable-lens cameras was that they need an electronic viewfinder, since there is no mirror. The electronic viewfinders of the not-so-distant past were low-quality, expensive and power-hungry. Some die-hard DSLR camera fans still hate the electronic viewfinder.

>The advantages with mirrorless are huge.

One advantage is that the flange to sensor distance can be made much shorter, since there is no mirror in the way. This relaxes some optical constraints, enabling smaller lenses.

The growing image-processor unpleasantness

Posted Aug 23, 2022 9:21 UTC (Tue) by bartoc (guest, #124262) [Link]

Hell, my mirrorless camera doesn't even have a viewfinder at all! The screen is bright enough that one isn't really necessary, although a viewfinder probably improves color reproduction compared to the screen.

Both improved display tech and the better flange distance were factors too. The display tech goes for both the screen and the viewfinder (if present). Screens have gotten a _lot_ brighter over the last few years and that's important for outdoors use.

The small flange distance also lets you use almost any lens made in the last 100 years with an adapter, which is nice.

Actually, I find it a bit of a shame that we'll probably lose the 1" format compact cameras. Some of them are actually quite lovely. That format is small enough that you don't need an interchangeable-lens system to get a nice wide range of (equivalent) focal lengths with pretty wide apertures, and you don't have to carry around multiple lenses. Sony's is like $1100 though, which is more than the high-end iPhone, and the lens is 1.25 stops slower. The fact that the optical image stabilization in the iPhone is useful on an f/1.8 lens does tell you something about how noisy the sensor is, though.

The growing image-processor unpleasantness

Posted Aug 23, 2022 16:19 UTC (Tue) by anselm (subscriber, #2796) [Link] (5 responses)

I think most phones are like 1/3" to 1/2" now, sony shipped a phone with a 1" sensor

It's probably just as well to mention that the “1"” in “1" sensor” has nothing to do with the size of that sensor. An “1" sensor” is 13.2 mm × 8.8 mm, which is still tiny. Generally, the problem with physically larger sensors in smartphone cameras is that the larger the sensor is, the larger the lens needs to be, and that conflicts with people's desire to have a sleek, thin phone. Various manufacturers have tried to market phones with more serious lenses, but these phones usually had a more or less prominent “bump” on the back to accommodate the optics and didn't prove very popular.

In any case, even if the sensor size goes up to 1", or a bit more there would still be interest in APS-C and full frame ILC cameras, that format is much larger, and you can get physical bokeh/depth of field, which is not possible with phone cameras because the lenses end up with really short focal lengths.

Dedicated cameras can use better lenses than phone cameras and that often makes a difference because clever software and only looking at your pictures on a smartphone screen or (comparatively) low-resolution monitor will only get you so far. Also, dedicated cameras have lots of nifty buttons and dials that are easier to handle – and in particular easier to handle quickly – than touch screen menus. They're not going away anytime soon.

The growing image-processor unpleasantness

Posted Aug 23, 2022 23:30 UTC (Tue) by bartoc (guest, #124262) [Link] (3 responses)

Apparently, the naming is derived from TV camera tubes, where a 1" long tube could capture/scan over an image area 16mm long diagonally. This is some pretty heroic marketing speak work, if I'm being honest.

The growing image-processor unpleasantness

Posted Aug 24, 2022 0:26 UTC (Wed) by anselm (subscriber, #2796) [Link] (2 responses)

The way this works is that a (round) video tube has a useable photosensitive area whose diagonal is approximately 2/3 of the diameter of the tube. This is basically 1930s technology. It turns out that if you have a 1"-diameter video tube, 2/3 of that (16.9 mm) is only a little more than the diagonal of the “1"” sensor, 15.9 mm, so hey presto, the numbers match! What's 6% among friends.

1" sensors make Micro-4/3 sensors, which are frequently – and unjustly – pooh-poohed on the Internet as being utterly unfit for any type of serious photography, look big by comparison. And of course 35mm-type “full frame” sensors are even bigger than that.

The growing image-processor unpleasantness

Posted Aug 24, 2022 8:22 UTC (Wed) by bartoc (guest, #124262) [Link]

Yeah, I really like Micro 4/3rds, Olympus' 4/3rds offerings are weirdly bulky and "classically styled" though.

The growing image-processor unpleasantness

Posted Aug 24, 2022 8:32 UTC (Wed) by bartoc (guest, #124262) [Link]

Hell, you "can" buy a 9"x11" digital sensor from Largesense, for $90,000, and it's black and white, and only 26MP. They sell a color wheel too, so you can get colors, if you want them, and your subject is good at staying still.

The growing image-processor unpleasantness

Posted Aug 23, 2022 23:50 UTC (Tue) by bartoc (guest, #124262) [Link]

I think the iPhone uses fairly good lenses. Lens construction gets easier with smaller sensors; also, all three lenses on the iPhone are primes. Frankly, for day-to-day photography the triplet of a 13mm prime, a 26mm prime, and a 52mm prime is quite sufficient (although personally I would prefer 13, 30, and 52).

The growing image-processor unpleasantness

Posted Aug 19, 2022 12:40 UTC (Fri) by flussence (guest, #85566) [Link] (3 responses)

I've never in my life looked at a laptop with a tiny front-facing webcam and a sound card wired to a plastic analog 3.5mm jack and can-on-string speakers, and lamented that the quality would be acceptable if only they'd compensated for these $0.05 parts with more proprietary preprocessing circuitry - usually the opposite.

The growing image-processor unpleasantness

Posted Aug 19, 2022 21:59 UTC (Fri) by marcH (subscriber, #57642) [Link] (2 responses)

In most cases I can instantly tell whether someone joined a meeting from their laptop or from their smartphone. When it's ugly it's the former. You may not care but many people had their expectations raised by smartphones.

The growing image-processor unpleasantness

Posted Aug 20, 2022 18:07 UTC (Sat) by tialaramex (subscriber, #21167) [Link] (1 responses)

On the rare occasions when a colleague has to join a meeting from their phone it's certainly easy to tell, the sound quality is abysmal, the picture is usually bobbing around madly and generally the impression is complete amateur hour.

I understand from dealing with audiophiles that "better" actually just means "different and I prefer it", but of course with audiophiles all you do about it is filter the audio until it's as "different and I prefer it" as they wanted. It's no problem to smash CD audio until it's as "rich" (ie distorted) as they think it ought to be and perhaps if it's what people want we can do the same for video meetings.

The growing image-processor unpleasantness

Posted Sep 1, 2022 10:34 UTC (Thu) by davidgerard (guest, #100304) [Link]

Tripod mounts for phones are cheap and readily available. I routinely do television using my phone's front-facing camera.

The growing image-processor unpleasantness

Posted Sep 2, 2022 3:27 UTC (Fri) by andyyeh75 (guest, #160594) [Link] (1 responses)

If you consider trying a Chromebook, these two should offer you a good video-call experience, per this YouTuber. Looking at the quality, it should not be achievable with a UVC USB camera.

HP Elite Dragonfly Chromebook review: the best you can get
https://chromeunboxed.com/hp-elite-dragonfly-chromebook-r...

Acer Chromebook Spin 714 Review: a new spin on a fan favorite
https://chromeunboxed.com/acer-chromebook-spin-714-review/

"The 5MP sensor on the HP Dragonfly along with some software tuning by HP and Google has made this my absolute favorite device for video calling. The details, contrast, exposure and colors are so much better than other Chromebooks that I’ve come to the point where I don’t want to take a video call without it. With the standard privacy shutter on board as well, there’s no doubt that this is the best camera setup we’ve seen on a Chromebook, and I really hope it sets the trend moving forward."

The growing image-processor unpleasantness

Posted Sep 3, 2022 18:19 UTC (Sat) by laurent.pinchart (subscriber, #71290) [Link]

There's nothing that would *technically* stop UVC devices from increasing image quality. They can use a good raw sensor and integrate a powerful ISP. This is usually not the case, though, more for marketing and sales reasons (including the fact that UVC chipsets for webcams usually target lower-end devices) than because of technical limitations of the UVC protocol.

The growing image-processor unpleasantness

Posted Aug 18, 2022 22:30 UTC (Thu) by marcH (subscriber, #57642) [Link] (4 responses)

There are Alder Lake Chromebooks. What camera software do they run?

https://chromium-review.googlesource.com/q/alder+lake (I didn't look)

The growing image-processor unpleasantness

Posted Aug 19, 2022 1:59 UTC (Fri) by timrichardson (subscriber, #72836) [Link] (3 responses)

Not all Alder Lake laptops have this camera hardware ... that is why most Lenovo Alder Lake ThinkPads have Linux-friendly configuration choices. Maybe the Chromebook also avoids the more advanced camera.

The growing image-processor unpleasantness

Posted Aug 19, 2022 8:52 UTC (Fri) by marcH (subscriber, #57642) [Link] (2 responses)

True, one Alder Lake chromebook could use the Intel IPU while another sticks to the UVC.

But for sure MIPI cameras have been used on Chromebooks before, proof: https://chromium-review.googlesource.com/q/intel+ipu6

It may be possible to find what a given Chromebook uses thanks to "chrome://system" after a fresh boot (to avoid logs wrapping), "Expand All" and searching for "camera"

The growing image-processor unpleasantness

Posted Sep 2, 2022 4:48 UTC (Fri) by andyyeh75 (guest, #160594) [Link] (1 responses)

reply to marcH:
Interesting question. I just did some research on the web. Since I know that Chromebooks use the open-source coreboot project as their base bootloader, I noticed some clues that the relevant information was left in a Kconfig file in coreboot.

https://review.coreboot.org/plugins/gitiles/coreboot/+/f0...

Find keywords:
select DRIVERS_INTEL_MIPI_CAMERA
select SOC_INTEL_COMMON_BLOCK_IPU

Then we can find that some devices come with a MIPI camera powered by the IPU; for instance, BRYA(s), REDRIX(s), KANO, and VELL. However, we cannot easily tell what those devices are called on the market, whether they have launched or are still in development, or which OEMs built them. I replied earlier with some review articles (https://lwn.net/Articles/906882/); the HP/Acer designs probably match right away.

The growing image-processor unpleasantness

Posted Sep 2, 2022 5:31 UTC (Fri) by marcH (subscriber, #57642) [Link]

> However we either cannot simply tell what those devices being called on the market, launched or in development, nor which OEMs built those.

We can:

https://www.chromium.org/chromium-os/developer-informatio...

https://www.google.com/search?q=redrix+chromeunboxed

The growing image-processor unpleasantness

Posted Aug 18, 2022 23:12 UTC (Thu) by jafd (subscriber, #129642) [Link] (1 responses)

Intel had us almost completely forget how, in days past, having 100% driver coverage of your hardware in Linux was the result of an incredible struggle: hunting for just the right parts, often paying a premium, and fighting through either some poorly written blobs that you had to hack into your kernel with a hammer, or patching some "open source" that had been thrown over the fence and promptly forgotten about.

But now we are starting to remember.

First there were fingerprint readers.

Certain sound chips.

Now there are webcams.

I dread anticipating new GPUs.

Suddenly “buy hardware two generations old” becomes a viable piece of advice again.

The growing image-processor unpleasantness

Posted Aug 19, 2022 2:51 UTC (Fri) by smurf (subscriber, #17840) [Link]

The problem is, "two generations old" no longer cuts it and needs to be replaced with "before 2019", and/or "not from Intel".

The growing image-processor unpleasantness

Posted Aug 19, 2022 1:06 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

> It is highly abstracted; as Sakari Ailus put it: "I wouldn't have guessed this is an API for cameras if I hadn't been told so".

I looked at the API and I don't really see anything that is too scary. It's a fairly normal API to construct a processing pipeline that connects devices together. It's also not that large, around 7k lines in total with plenty of comments.

Adding this API and writing a driver that uses it seems to be a reasonable action.

The growing image-processor unpleasantness

Posted Aug 19, 2022 1:54 UTC (Fri) by pabs (subscriber, #43278) [Link] (5 responses)

Does anyone know why the image processing code needs to remain proprietary? Is it violating patents or something?

The growing image-processor unpleasantness

Posted Aug 19, 2022 8:17 UTC (Fri) by laurent.pinchart (subscriber, #71290) [Link] (2 responses)

Patents are possibly involved, but it's mostly because SoC vendors consider the camera algorithms to be of very high added value and a key differentiator to compete with each other. Investments in this area are usually substantial. Whether the quality of the end result justifies this or not is of course sometimes debatable :-)

The growing image-processor unpleasantness

Posted Aug 19, 2022 10:34 UTC (Fri) by khim (subscriber, #9252) [Link] (1 responses)

Not just SoC vendors. If you look at quality comparisons for different phones, you'll see there are substantial differences in quality even when they use the same sensor and SoC.

So yeah, it's where people actively don't want you to know what they are doing.

The growing image-processor unpleasantness

Posted Aug 19, 2022 11:24 UTC (Fri) by laurent.pinchart (subscriber, #71290) [Link]

While the ISP hardware plays a big role in image quality, the end result also heavily depends on the camera sensor, as well as the tuning process. The ISP control algorithms (the closed source part running in userspace) are also crucial, but OEMs have less control over that, except when they can provide enough monetary incentive to the SoC vendor to improve the algorithms for a specific product. All this can explain the variations you see between devices that use the same SoC.

The growing image-processor unpleasantness

Posted Aug 19, 2022 11:52 UTC (Fri) by excors (subscriber, #95769) [Link]

I think many of the image processing algorithms will be largely hardware-agnostic - if they were open source then it would be relatively easy for AMD/Apple/Samsung/etc to either port the code or reimplement the algorithms for their own SoCs, benefiting from Intel's R&D investment and reducing Intel's competitive advantage in this area, which would substantially harm Intel. (Contrast with the drivers/HALs which are more tightly coupled to the hardware (so it's probably easier to rewrite from scratch than to port another vendor's code), and are basic plumbing rather than a competitive product feature, so there's much less downside to making them open source.)

The growing image-processor unpleasantness

Posted Aug 19, 2022 20:55 UTC (Fri) by rufio (guest, #160379) [Link]

That would be my guess. They've licensed the rights from a patent holder, directly or indirectly.

The growing image-processor unpleasantness

Posted Aug 19, 2022 5:45 UTC (Fri) by wtarreau (subscriber, #51152) [Link] (2 responses)

The only solution against this craziness is not to compare components or find alternatives, but simply to compare final products, if possible with video reviews. I.e. "the new Lenovo/Dell whatever model is unusable for video conferences, due to taking too much CPU, draining the battery like crazy, and making the fan incredibly noisy; model XYZ, on the contrary, is cheaper and works much better". Machine vendors listen to user feedback and don't like their products being shown not working, especially with laptops, because users are more sensitive to annoyances than anything else with devices they bring everywhere and use all day long. And they're not going to sacrifice the one percent of their userbase that will have to go to the competition. Thus they will find a better technical solution in their next models, e.g. switch back to UVC, which nobody complained about, or suggest to their customers that if Linux is desired, an AMD-based model is recommended instead, and Intel will be left with an IP block nobody uses or wants to pay for.

One must understand that the fault lies first and foremost with those who agree to make an end-user product from unfinished components and to sell it in a state that doesn't work. Why would Intel change its practices here if Lenovo and Dell continue to buy just as many CPUs without complaining? Any product's success always includes a significant element of luck, and changing anything in the product, or in the way it's sold, risks losing that luck factor. This is also why many successful products don't change by an iota for a long time (sometimes to the point of being dismissed as outdated).

The growing image-processor unpleasantness

Posted Aug 19, 2022 15:00 UTC (Fri) by khim (subscriber, #9252) [Link]

> And they're not going to sacrifice one percent of their userbase that will have to go to competition.

Depends on how much they can get from the other ninety-nine percent of users.

> This is also why many successful products don't change by a iota for a long time (sometimes to the point of becoming ignored as outdated).

Don't mix up perception and reality. Successful products change all the time; as long as the provided drivers work, very few people will complain.

That's one reason manufacturers tend not to care that much about Linux users: not only do they complain about crazy things, they also tend to become upset if some component is silently replaced with another that is cheaper or just more easily obtainable.

The growing image-processor unpleasantness

Posted Aug 24, 2022 19:17 UTC (Wed) by andy_shev (subscriber, #75870) [Link]

Let's be honest: people who use Android or Chromebook devices do not give a crap about upstream. And the remaining "desktop Linux users" are what? 2%?

The growing image-processor unpleasantness

Posted Aug 19, 2022 8:18 UTC (Fri) by Archimedes (subscriber, #125143) [Link] (3 responses)

"Lenovo, instead, is not selling the affected systems with Linux preloaded at all at this point."

Hmm, it seems I am a victim of a race condition then; I just bought a ThinkPad X1 Carbon Gen 10 with Fedora Linux installed. It has already arrived and <OT> I needed to reinstall Fedora, as they neither have secure boot on, nor is the disk encrypted ... </OT>
As I have already turned off the camera and mics in the BIOS, for me personally it is no problem, but generally I agree that this is an unpleasant step.

The growing image-processor unpleasantness

Posted Aug 19, 2022 9:52 UTC (Fri) by abo (subscriber, #77288) [Link]

If it came with Fedora preinstalled then I'm pretty sure the hardware is supported.

The growing image-processor unpleasantness

Posted Aug 28, 2022 4:51 UTC (Sun) by Conan_Kudo (subscriber, #103240) [Link] (1 responses)

Lenovo laptops with Fedora preloaded do not ship MIPI cameras. They have regular UVC cameras instead.

The growing image-processor unpleasantness

Posted Aug 31, 2022 7:42 UTC (Wed) by nilsmeyer (guest, #122604) [Link]

I recently ordered a ThinkPad P16 Gen 1, and now I'm wondering whether it also has one of those MIPI cameras. There was no option to get Linux pre-installed; however, they were kind enough to sell it without an OS.

The growing image-processor unpleasantness

Posted Aug 19, 2022 10:42 UTC (Fri) by fratti (guest, #105722) [Link] (1 responses)

I'm not convinced a new API is needed. V4L2 is about as flexible a beast as DRM is, in my experience. That's why image processors, encoders, decoders, capture devices, etc. are all supported under the V4L2 umbrella. V4L2 can just throw dmabufs around, from what I know, so I also do not understand why Dell's driver requires a daemon doing buffer copies in user space.

Don't define new APIs when the problem is that vendors are not writing good code.
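
To illustrate what "throwing dmabufs around" looks like at the V4L2 level, here is a minimal sketch of importing an externally allocated dmabuf into a capture queue; the file descriptors and single-buffer setup are assumptions for illustration, not a description of how Dell's driver works:

    /*
     * Sketch of the zero-copy dmabuf path: a buffer allocated elsewhere
     * (e.g. by a GPU or display driver) is handed to a V4L2 capture queue
     * by file descriptor, so frames can move between devices without
     * CPU copies.  Error handling is reduced to the bare minimum.
     */
    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int queue_dmabuf(int video_fd, int dmabuf_fd, unsigned int index)
    {
        struct v4l2_requestbuffers req;
        memset(&req, 0, sizeof(req));
        req.count  = 1;
        req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_DMABUF;
        if (ioctl(video_fd, VIDIOC_REQBUFS, &req) < 0)
            return -1;

        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_DMABUF;
        buf.index  = index;
        buf.m.fd   = dmabuf_fd;   /* buffer owned by another subsystem */
        return ioctl(video_fd, VIDIOC_QBUF, &buf);
    }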

The growing image-processor unpleasantness

Posted Aug 19, 2022 11:59 UTC (Fri) by laurent.pinchart (subscriber, #71290) [Link]

V4L2 is an old API that wasn't designed with the features needed by recent hardware in mind. It models devices at a level of abstraction that corresponds to a TV capture card or a webcam: essentially a black box that performs all low-level configuration of the hardware, implements the imaging algorithms within the device, and exposes high-level controls. Over the years the industry has shifted towards exposing the sensor and ISP to the operating system, starting with embedded devices, phones, and tablets, and, as we see now, moving to laptops.

V4L2 has been extended and retrofitted with APIs such as the Media Controller and the V4L2 Userspace Subdev API, and more extensions are likely possible, but it's becoming increasingly painful to do so within the context of V4L2. There is no clearly defined point that I'm aware of that would signal V4L2's retirement for cameras, but it is certainly looming ahead of us somewhere. That's fine though; I believe a good, modern API can be designed. There are some talented developers in the community who have both an interest in this topic and the experience needed to make it happen, but we're very resource-constrained.
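
As a rough illustration of that level of abstraction, classic V4L2 lets an application treat the whole camera as one device node that negotiates a format and exposes high-level controls; the device path, resolution, and control below are assumptions chosen for the sketch, and error handling is trimmed:

    /*
     * The "black box" model: open one video node, ask for a format, flip
     * a high-level knob.  Everything below that (sensor, ISP, algorithms)
     * is hidden inside the driver and the device.
     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        int fd = open("/dev/video0", O_RDWR);   /* hypothetical device node */
        if (fd < 0) { perror("open"); return 1; }

        struct v4l2_capability cap;
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0)
            printf("driver: %s, card: %s\n",
                   (const char *)cap.driver, (const char *)cap.card);

        /* Ask the black box for 1280x720 YUYV; the driver does the rest. */
        struct v4l2_format fmt;
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 1280;
        fmt.fmt.pix.height = 720;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
            perror("VIDIOC_S_FMT");

        /* High-level control: white balance is implemented in the device. */
        struct v4l2_control ctrl = {
            .id = V4L2_CID_AUTO_WHITE_BALANCE,
            .value = 1,
        };
        if (ioctl(fd, VIDIOC_S_CTRL, &ctrl) < 0)
            perror("VIDIOC_S_CTRL");

        close(fd);
        return 0;
    }

Exposing the sensor and ISP separately, as modern hardware does, is exactly what this model cannot express without the Media Controller and subdev extensions mentioned above.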

We still need specialized hardware for this?

Posted Aug 19, 2022 10:53 UTC (Fri) by epa (subscriber, #39769) [Link] (6 responses)

What I find surprising is that hardware acceleration is still needed for this task and that Intel, of all companies, is admitting that rather than trying to push a solution using the general-purpose CPU. Acres of silicon are used to support vector instruction sets like AVX512. Surely image processing is just what all that extra CPU muscle was built for?

We still need specialized hardware for this?

Posted Aug 19, 2022 11:26 UTC (Fri) by laurent.pinchart (subscriber, #71290) [Link]

Image processing for cameras typically consists of tasks that can be performed very efficiently in hardware but are very costly for the CPU. My favourite example is white balance, which applies a different gain per colour channel. With a typical Bayer sensor, that's four hardware multipliers in the ISP. Think about what would be needed in terms of CPU time (and memory bandwidth) to do the same in software, for high-resolution images at high frame rates. And this is a simple example; there are typically dozens of processing steps in the hardware pipeline.
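
To make that cost concrete, here is a rough sketch of the same per-channel gain step done on the CPU; the RGGB layout, 16-bit samples, and Q8.8 fixed-point gains are assumptions made for the example, and a real pipeline runs this alongside dozens of other stages:

    /*
     * Software white balance on a raw Bayer frame: one multiply, one
     * shift, and one clamp per pixel, plus reading and writing the whole
     * frame from memory, every frame.  The ISP does the equivalent with
     * four multipliers wired into the pipeline.
     */
    #include <stddef.h>
    #include <stdint.h>

    void wb_apply(uint16_t *raw, size_t width, size_t height,
                  const uint16_t gain[4] /* R, Gr, Gb, B gains in Q8.8 */)
    {
        for (size_t y = 0; y < height; y++) {
            for (size_t x = 0; x < width; x++) {
                /* Which of the four Bayer channels this pixel belongs to. */
                size_t chan = (y & 1) * 2 + (x & 1);
                uint32_t v = ((uint32_t)raw[y * width + x] * gain[chan]) >> 8;
                raw[y * width + x] = v > 65535 ? 65535 : (uint16_t)v;
            }
        }
    }

For a hypothetical 4K stream at 90 frames per second, that is roughly 750 million multiplies and about 3 GB/s of memory traffic per second, for this single pipeline stage alone.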

We still need specialized hardware for this?

Posted Aug 19, 2022 14:26 UTC (Fri) by fratti (guest, #105722) [Link] (3 responses)

Maybe "needed" is the wrong word, "preferred" would be better. I'm not sure how I can word this in a way that isn't several paragraphs but in essence: Von Neumann bad, specialised hardware good because efficient. Going forward there will be a lot more specialised hardware, and that has already been the way the wind is blowing for the past decade or so. (Neural accelerators, network stack offloading to NICs, hardware video encoders and decoders, ...)

We still need specialized hardware for this?

Posted Aug 22, 2022 2:12 UTC (Mon) by dvdeug (guest, #10998) [Link] (2 responses)

This is a cycle in computing; stuff has regularly moved from external hardware to CPU as often as the other way around, like floating point math, sound cards, modem audio processing (the notorious Winmodem), etc. Many CPUs have internal GPUs; I suspect the majority of computers use integrated graphics and will continue to do so, just because you don't need a graphics card for most non-gaming computers. At one point in time, I might have needed hardware video encoders and decoders, but now my computer can handle full-screen video and audio in real-time with the CPU and integrated graphics.

It's not efficient to have an extra part that can fail, requires separate cooling, and sits across an external bus that may be a whole light-nanosecond away. I don't personally expect a great growth in specialized hardware; if neural accelerators become something that a general-purpose computer needs, they are likely to be folded into the CPU, at the very least in the case of cheap computers.

We still need specialized hardware for this?

Posted Aug 22, 2022 15:48 UTC (Mon) by immibis (subscriber, #105511) [Link] (1 responses)

Tangential: what was actually the problem with Winmodems in the end? Couldn't open source developers write DSP code?

We still need specialized hardware for this?

Posted Aug 22, 2022 16:15 UTC (Mon) by farnz (subscriber, #17727) [Link]

Not legally. Underlying the problem with WinModems was the issue that it was outright illegal in many jurisdictions to write DSP code for a part directly attached to the phone line (you'd have to use an acoustic coupler to avoid the regulations).

Even ignoring the legal issues (which people like Richard Close and Pavel Machek did, writing working DSP code), you had the problem that the compute resources of the era weren't great, and thus the DSP code mostly ended up running in kernel space to meet the required real-time guarantees. It was done, and WinModems were made to work, but they were never great - and by the time compute resources were good enough to run the entire DSP in user space, people had moved away from dial-up.

We still need specialized hardware for this?

Posted Aug 19, 2022 16:40 UTC (Fri) by excors (subscriber, #95769) [Link]

AVX is built for people who want to put some effort into developing a reasonably fast and reasonably power-efficient implementation of their algorithm, but who don't want to spend years and many millions of dollars to build an ASIC that can do it with greater speed or efficiency.

A lot of stuff falls into that category; but many image processing algorithms have particularly high performance requirements (IPU6 claims to support 4K video at 90fps, which is a lot of pixels in a thin battery-powered device) and are mature enough that Intel can be confident they'll still be useful several years later, in which case it's worth investing in dedicated hardware.

The tradeoff keeps shifting as algorithms change and as general-purpose processors get more efficient and as specialised hardware gets more flexible, but I think the long-term trend is towards increasingly complex combinations of specialised hardware and microcontroller firmware and CPU software, which needs more sophisticated APIs to handle synchronisation between all those components.
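
As a rough sketch of that middle ground, a vectorized gain multiply over an already-deinterleaved colour plane might look like the following; the float format, planar layout, and multiple-of-eight length are assumptions, and a tuned implementation would look quite different:

    /*
     * AVX version of the per-channel gain step: eight multiplies per
     * instruction, no ASIC required, but the CPU still has to stream the
     * whole plane through memory for every stage.  Build with -mavx.
     */
    #include <immintrin.h>
    #include <stddef.h>

    void plane_gain_avx(float *plane, size_t n, float gain)
    {
        __m256 g = _mm256_set1_ps(gain);       /* broadcast the gain */
        for (size_t i = 0; i < n; i += 8) {
            __m256 v = _mm256_loadu_ps(plane + i);
            _mm256_storeu_ps(plane + i, _mm256_mul_ps(v, g));
        }
    }

That is the "reasonably fast, reasonably power-efficient" tier; dedicated ISP hardware does the same work with far less energy and without occupying CPU cores.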

The growing image-processor unpleasantness

Posted Aug 19, 2022 12:07 UTC (Fri) by stefanha (subscriber, #55072) [Link]

Two points the article does not address:

1. Why is V4L incapable of fitting this hardware and why can't the API be extended?

2. Why is the CPU doing so much work that vendors are shipping closed source drivers?

The growing image-processor unpleasantness

Posted Aug 19, 2022 12:18 UTC (Fri) by abo (subscriber, #77288) [Link] (1 responses)

https://libcamera.org/docs.html#camera-stack doesn't mention portals and PipeWire, how do those relate?

They plan an LD_PRELOAD shim for apps that use the old V4L2 interface. It's like the bad old days of /dev/dsp emulation. But it's for the better; we shouldn't need a kernel module to feed virtual loopback camera feeds into apps.

The growing image-processor unpleasantness

Posted Aug 19, 2022 12:37 UTC (Fri) by laurent.pinchart (subscriber, #71290) [Link]

PipeWire already has libcamera support, and we've successfully tested it, integrated into Chromium's webrtc code base, by placing a video call on an Intel IPU3-based device. The webrtc integration was developed by Michael Olbrich from Pengutronix and submitted for review (see https://webrtc-review.googlesource.com/c/src/+/261620 - I'm not sure it's the very latest version though; I haven't followed that development too closely myself).

The LD_PRELOAD trick is still useful in some cases (it is implemented and has been tested with Firefox), but it is not what most systems should use. I expect desktop Linux to use libcamera through portals and PipeWire. For some more advanced use cases the GStreamer libcamerasrc element could also be useful on the desktop, but it will probably see most of its use in embedded devices. Direct use of the libcamera API is also possible for specialized applications, and of course Android devices will go through libcamera's Android camera HAL implementation.

libcamera suffers from the common problem in the free software world of documentation lagging behind code. We need to update the camera stack description. Apologies about that.
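
For the GStreamer path mentioned above, a minimal sketch of driving the libcamerasrc element could look like this; the pipeline string and sink choice are assumptions for illustration rather than a recommended configuration:

    /*
     * Capture from a libcamera-managed camera and display it, using the
     * libcamerasrc GStreamer element.  Build with the flags from
     * pkg-config --cflags --libs gstreamer-1.0.
     */
    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        GError *err = NULL;
        GstElement *pipeline = gst_parse_launch(
            "libcamerasrc ! videoconvert ! autovideosink", &err);
        if (!pipeline) {
            g_printerr("pipeline error: %s\n", err->message);
            g_error_free(err);
            return 1;
        }

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        /* Run until an error or end-of-stream message arrives on the bus. */
        GstBus *bus = gst_element_get_bus(pipeline);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
        if (msg)
            gst_message_unref(msg);

        gst_object_unref(bus);
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }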

The growing image-processor unpleasantness

Posted Aug 19, 2022 16:51 UTC (Fri) by sergey.senozhatsky (subscriber, #91933) [Link] (2 responses)

> Pinchart responded that this merging will not happen quickly because the API, as proposed, is not seen favorably in the kernel community.

Imaginary wars.

We have not proposed any API yet and we have never posted anything upstream, so what exactly "is not seen favorably in the kernel community" will forever remain a mystery.

The growing image-processor unpleasantness

Posted Aug 19, 2022 18:13 UTC (Fri) by jbnd (guest, #160374) [Link] (1 responses)

Well I bought one of the mentioned Lenovo laptops in July preloaded with Ubuntu before all of this blew up and all I am seeing right now is Intel trying to push their proprietary trash onto everyone else.

The growing image-processor unpleasantness

Posted Aug 19, 2022 22:41 UTC (Fri) by ribalda (subscriber, #58945) [Link]

For ChromeOS, the Intel IPU6 HAL is available at https://source.chromium.org/chromiumos/chromiumos/codesea... under an Apache-2 license.

The growing image-processor unpleasantness

Posted Aug 21, 2022 10:27 UTC (Sun) by sadoon (subscriber, #155365) [Link] (2 responses)

Thanks for the article, just wanted to point out a small typo:

> There was a time when care had to be taking when

taken*

The growing image-processor unpleasantness

Posted Aug 21, 2022 13:12 UTC (Sun) by jake (editor, #205) [Link]


>taken*

ouch, indeed ... thanks for the report, but please send them to lwn@lwn.net in the future ...

jake

The growing image-processor unpleasantness

Posted Aug 24, 2022 5:41 UTC (Wed) by sadoon (subscriber, #155365) [Link]

> Please do NOT post typos in the article as comments, send them to lwn@lwn.net instead.

Heh just noticed, my bad.


Copyright © 2022, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds