The growing image-processor unpleasantness
The IPU6, like most image processors, exists to accept a data stream from a camera sensor and turn it into a useful video stream. These processors can take on a lot of tasks, including rotation, cropping, zooming, color-space conversion, white-balance correction, noise removal, focus management, and more. They are complex devices; the kernel community has responded by creating some equally complex APIs, including Video4Linux2 (V4L2) and media controller, to allow user space to manage them. As long as a device comes with a suitable driver, the kernel can make a camera device available to user space which, with care, can work with it without needing to know the details of the hardware.
As Paul Menzel recently pointed out on the linux-kernel mailing list, there is no such driver for the IPU6, so a mainline Linux kernel cannot drive it. As a result, the kernel lacks support for MIPI cameras on some current laptops, including some versions of the Thinkpad X1 Carbon and Dell XPS 13, which are relatively popular with Linux users (cameras using other interfaces, such as USB UVC, are generally supported). To get around this problem, Dell ships a closed-source, user-space driver in the Ubuntu build it offers on the XPS 13. Lenovo, instead, is not selling the affected systems with Linux preloaded at all at this point.
Laurent Pinchart provided more details on this situation. IPU6 support on Ubuntu is based on a kernel driver that provides a V4L2 interface, but which interfaces with a proprietary user-space driver to actually get the work done. As Ubuntu's Dell page notes, this solution is not without its costs: "CPU performance is impacted by a daemon doing buffer copying for v4l2 loopback device" and the camera can only provide 720p resolution. Pinchart went on to say that the IPU6 will not be the only problematic device out there:
Given the direction the industry is taking, this situation will become increasingly common in the future. With the notable exception of Raspberry Pi who is leading the way in open-source camera support, no SoC vendor is willing today to open their imaging algorithms.
Improving this situation will require work on a couple of fronts. On the user-space side, Pinchart pointed out that the libcamera project was created for the explicit purpose of supporting complex devices like the IPU6; this project was covered here in 2019. Among other things, libcamera was designed to allow the plugging-in of proprietary image-processing code while maximizing free-software choice. It currently supports the earlier IPU3 processor either with or without the proprietary plugin, though not all functionality is available without it.
Providing a similar level of support for the IPU6 should be possible, but it will take a fair amount of work. It also is going to need a proper kernel driver for the IPU6, which could be a problem. Evidently, the complexity of this device is such that the V4L2 API cannot support it, so a new API will be required. A candidate API exists in a preliminary form; it is called CAM (or KCAM) and, according to Sergey Senozhatsky, the plan is to get that merged into the kernel, then to add an IPU6 driver that implements this API. Pinchart responded that this merging will not happen quickly because the API, as proposed, is not seen favorably in the kernel community.
The V4L2 API has required extensive development and evolution over many years to reach its current point; it will not be replaced overnight. That is doubly true if the proposed alternative has been developed in isolation and never been through any sort of public discussion, which is the case here. Senozhatsky posted a pointer to a repository containing a CAM implementation, but that code is only enlightening in the vaguest terms. There is no documentation and no drivers actually implementing this API. It is highly abstracted; as Sakari Ailus put it: "I wouldn't have guessed this is an API for cameras if I hadn't been told so".
Creating a new camera API will be, as Ailus described it, "a major and risky endeavour"; it will require a long conversation, and it's not clear that CAM has moved the clock forward by much. According to Senozhatsky, the CAM code will not be submitted upstream for "several months". He later suggested that this plan could be accelerated, but the fact remains that the community has not even begun the process of designing an API that is suitable for the next generation of image processors.
The bottom line is that purchasers of Linux systems are going to have to be more careful for some time; many otherwise nice systems will not work fully without a proprietary camera driver and, as a result, will lack support in many distributions. Making this hardware work at all will involve losing CPU performance to the current workaround. Working all of this out is likely to take years and, even then, it's not clear that there will be viable free-software solutions available. It is a significant step backward for Intel, for Linux, and for the industry as a whole.
Postscript: we were asked to add the following statement by the KCAM developers, who feel that the portrayal of the project in this article was not entirely fair:
Google has been working closely with the community members and hardware vendors since 2018. This includes organising two big workshops (in 2018 and 2020). The development has been done in public and has been presented in public forums to welcome as many interested parties as possible.
We are looking forward to completing the development of the initial proposal and posting it on the mailing list, to get the opinion of the community. ChromeOS believes in “Upstream first” for our kernel work and will continue to work in that direction.
Index entries for this article:
Kernel: Device drivers/Video4Linux2
Posted Aug 18, 2022 14:52 UTC (Thu) by fredex (subscriber, #11727)
Posted Aug 18, 2022 15:54 UTC (Thu) by pizza (subscriber, #46)
Posted Aug 24, 2022 19:01 UTC (Wed) by andy_shev (subscriber, #75870)
Posted Aug 25, 2022 9:18 UTC (Thu) by NRArnot (subscriber, #3033)
Posted Aug 18, 2022 18:07 UTC (Thu) by airlied (subscriber, #9104)
Posted Aug 18, 2022 16:07 UTC (Thu) by logang (subscriber, #127618)
I have an office desktop with a UHD Graphics 630 that has always been fussy about booting with working graphics; more recently I've been stuck on the 5.4 kernel for a long time because newer kernels simply do not work at all. I haven't had time to check in a few months, but I've spent way more time than I'd like trying to get it to work. At one point I thought a BIOS update was the solution, but that did not help.
I have a (now 4-year-old) laptop with a microphone (SST-based) that has never worked despite diving through a ton of bug-report noise and spending way too much time trying to configure the right driver (there's some mess of multiple sound drivers for their hardware) and the right special firmware. I still never got it working and gave up.
Just this week a friend who bought a brand-new Alder Lake laptop is desperately trying to install the latest available kernel and firmware because the HDMI port does not work, nor does the laptop go to sleep properly. He reads forums with arcane instructions like installing the linux-oem-kernel package (whatever that is) but, as yet, hasn't solved the problem.
Intel may have had a few good years of great support for Linux, but IMO they no longer deserve that reputation.
Posted Aug 18, 2022 22:24 UTC (Thu) by marcH (subscriber, #57642)
Any better alternative?
Posted Aug 19, 2022 3:49 UTC (Fri) by xanni (subscriber, #361)
Posted Aug 22, 2022 13:23 UTC (Mon) by xophos (subscriber, #75267)
Posted Aug 31, 2022 9:00 UTC (Wed) by bingbuidea (guest, #67887)
Posted Oct 4, 2022 8:39 UTC (Tue) by Klavs (guest, #10563)
Posted Aug 19, 2022 8:04 UTC (Fri) by opsec (subscriber, #119360)
Posted Aug 19, 2022 8:11 UTC (Fri) by linusw (subscriber, #40300)
This might have coincided with Dirk Hohndel leaving Intel as "open source strategic technologist" in June 2016.
Posted Aug 26, 2022 14:24 UTC (Fri) by riteshsarraf (subscriber, #11138)
Posted Aug 30, 2022 20:25 UTC (Tue) by piexil (guest, #145099)
My GPD Pocket 3 suffers from this. The workarounds currently are to use s2idle state which drains quite a bit in standby (only a day of standby time), or hibernate.
Posted Aug 18, 2022 16:29 UTC (Thu) by tialaramex (subscriber, #21167)
If the result isn't that we get cheaper and better cameras, why do this? Intel are of course welcome to ship worse stuff for more money, but there's no reason anybody should buy it.
Posted Aug 18, 2022 16:43 UTC (Thu) by farnz (subscriber, #17727)
UVC pushes all of the sensor-to-useful-format conversion into firmware inside the camera: sensor data goes into the firmware, H.264 or similar comes out. Something like the IPU6 gets raw sensor data, allows you to apply processing steps (under software control), and then lets you do what you like with the resulting stream of data.
There's a lot more control exposed by something like the IPU6, but it's not necessarily useful; your UVC camera has an IPU6 equivalent in it, plus hardware and firmware to encode to H.264, but you can't control the processing the camera does (at an absolute minimum, you need to demosaic the raw sensor data, but you'll also want to do things like dynamic range processing, exposure control and more).
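The demosaicing step mentioned above can be made concrete with a toy sketch. This is a deliberately naive half-resolution approach for an RGGB pattern, not the interpolation any real ISP or camera firmware uses:

```python
# Toy demosaic: collapse each 2x2 RGGB block of a raw Bayer mosaic into
# one RGB pixel (half resolution). Real ISPs interpolate full-resolution
# RGB with far more sophisticated filters, but the input/output
# relationship is the same: one color sample per site in, RGB out.

def demosaic_rggb(raw):
    """raw: 2D list of sensor values laid out as R G / G B tiles."""
    h, w = len(raw), len(raw[0])
    assert h % 2 == 0 and w % 2 == 0, "expect whole 2x2 tiles"
    rgb = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2  # average the two green sites
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# A uniform gray scene: every sensor site reads 100.
flat = [[100] * 4 for _ in range(4)]
print(demosaic_rggb(flat)[0][0])  # (100, 100.0, 100)
```

Everything downstream of this step (dynamic range processing, exposure control, and so on) operates on the reconstructed RGB data.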
Posted Aug 18, 2022 17:57 UTC (Thu) by mss (subscriber, #138799)
I suspect, however, that due to insufficient IPU capabilities at least part of the proprietary processing must happen on the CPU and that's where things become messy.
Anyway, I think just providing raw sensor output (much like a RAW file from a DSLR) would be a step in the right direction towards developing the required open-source processing.
Posted Aug 19, 2022 3:54 UTC (Fri) by xanni (subscriber, #361)
Google’s New AI Learned To See In The Dark! 🤖
I'd rather have a camera that provides access to the raw data so I can implement these new algorithms in open source code and get capabilities a generation ahead, instead of being stuck with whatever they implement in the firmware.
Posted Aug 21, 2022 3:15 UTC (Sun) by bartoc (guest, #124262)
Ditto on the depth of field simulation (although I wonder how accurate it is), it's really hard to get a lot of DoF with those tiny sensors given how the lens geometry ends up. I wonder how it compares with light field cameras, which could produce some images that would be really, really, really hard to get with a normal camera.
Posted Sep 13, 2022 10:18 UTC (Tue) by nye (subscriber, #51576)
The kind of view synthesis described in this paper requires images taken from multiple angles, and uses all the combined data to generate a radiance field - basically a model of where the light is in the scene, its direction, and its value. Once you've done that, you can generate a new synthesised view from any angle, including an angle which matches one of the input images.
I'm not sure how the various NeRF methods currently compare to more traditional light field methods, but this is an area of rapid research and I would expect any answers to get stale very quickly.
Posted Aug 18, 2022 19:17 UTC (Thu) by excors (subscriber, #95769)
It looks like Intel is similarly implementing some of their ISP algorithms on the CPU (as non-open-source libraries), probably because the cost and multi-year lead time of implementing algorithms in hardware is totally impractical when it's a field of active research (as much of this stuff is), and for many algorithms the CPU is efficient enough.
The integrated ISP also saves cost by reusing hardware blocks like the video encoder (which the SoC has to have anyway, for use cases like screen recording), sharing RAM (you need a lot for 4K video, but when the camera's off you want to let other applications use it), letting you share the same ISP hardware between multiple cameras, etc. And there's the general efficiency benefit of having fewer chips.
Combined sensor+ISP packages are easier to use, but I think having raw sensors connected to the SoC's ISP is better in pretty much every other way.
Posted Aug 19, 2022 11:52 UTC (Fri) by laurent.pinchart (subscriber, #71290)
It's important here to note that algorithms, especially when implemented in an ISP, are typically split into two distinct parts. The ISP performs the computations and processing that are very heavy for the CPU. This includes pixel processing (from applying simple colour gains to applying complex spatial and temporal filters), as well as statistics computation (histograms, accumulators, sharpness estimation, ...). That's the processing part of the algorithms. The ISP needs to be configured with a large number of parameters to perform this (gains, filter kernels, look-up tables, ...). Computing those parameters from the statistics is then done in software, running on the CPU in order to provide full flexibility. This is the control part of the algorithms, and it needs to run in real time to compute ISP processing parameters that react to changing scene conditions. This is how auto-exposure, auto white balance and auto-focus are implemented: from an ISP point of view, there's only white balance; the "auto" part is handled by the CPU.
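The processing/control split described above can be sketched with the classic gray-world heuristic for white balance. The functions here are illustrative stand-ins: the "statistics" function models what an ISP would accumulate in hardware, and the "control" function models the CPU-side loop that turns statistics into gains:

```python
# Gray-world AWB sketch: the "ISP" side accumulates per-channel averages
# (statistics); the "control" side turns them into gains that would be
# programmed back into the ISP's white-balance block for the next frame.

def isp_statistics(frame):
    """frame: list of (r, g, b) pixels -> per-channel averages."""
    n = len(frame)
    r = sum(p[0] for p in frame) / n
    g = sum(p[1] for p in frame) / n
    b = sum(p[2] for p in frame) / n
    return r, g, b

def awb_control(stats):
    """Gray-world assumption: scale R and B so their averages match green."""
    r, g, b = stats
    return g / r, 1.0, g / b  # (r_gain, g_gain, b_gain)

# A scene with a strong blue cast: blue averages double the green level.
frame = [(50, 100, 200)] * 8
print(awb_control(isp_statistics(frame)))  # (2.0, 1.0, 0.5)
```

In a real pipeline the same structure repeats every frame, which is why the control part has to run in real time, as the comment notes.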
Posted Sep 2, 2022 7:08 UTC (Fri) by mordae (guest, #54701)
Energy savings could be realized by letting the sensor and GPU share the same buffer, so that the CPU doesn't have to touch it until the GPU has done its job, instead of adding another semi-programmable ASIC or a single-purpose FPGA to the mix.
This only makes sense for embedded applications where the CPU is super super low-performance, GPU is non-existent and the goal is to transmit high quality video data over network. Typically surveillance cameras.
Posted Sep 2, 2022 7:20 UTC (Fri) by mordae (guest, #54701)
I forgot to add: from the technical perspective
There is obviously a competition on smartphone image quality going on.
Posted Sep 2, 2022 8:46 UTC (Fri) by excors (subscriber, #95769)
I'm not sure if I'm correctly interpreting what you're suggesting, but: The output from the sensor will be something like 10/12-bit Bayer. I don't think it'd be particularly useful to share that directly with the GPU, because GPU memory architectures seem to be optimised for 32-bit-aligned elements with 8/16/32-bit components (since that's what graphics and most GPGPU uses) and they'd be inherently inefficient at processing the raw Bayer data. So at the very least, you need some dedicated hardware to do the demosaicing efficiently, and probably any other processing that benefits from the sensor's 10/12-bit precision (where it'd be wasteful to use the GPU's 16-bit ALU operations), before saving to RAM as 8-bit YUV.
Once you've got YUV, then it's useful to do zero-copy sharing with the GPU, and I think any sensible software architecture will already do that. (Intel's GPU has some built-in support for planar YUV420 texture-sampling to help with that.)
But a lot of the image processing will still be inefficient on a GPU. It's way too expensive to write the whole frame to RAM after every processing step (given you may have a dozen steps, and it's a 4K frame at 90fps) - you want to read a chunk into the GPU's fast local memory and do all the steps at once before writing it out, and use the GPU's parallelism to process multiple chunks concurrently. But Intel's GPU has something like 64KB of local memory (per subslice), so you're limited to chunks of maybe 128x64 pixels. Whenever you apply some processing kernel with a radius of N, the 128x64 of input becomes (128-N)x(64-N) of valid output, and if you do many processing steps then you end up with a tiny number of useful pixels for all that work. The GPU memory architecture is really bad for this. (And that's not specific to Intel's GPU, I think they're all similar.)
So you still want dedicated hardware (with associated firmware) for most of that processing, with a much more efficient local memory architecture (maybe a circular array of line buffers per processing stage), and just use the GPU and/or CPU for any extra processing that you couldn't get into the hardware because of cost or schedule.
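The shrinking-valid-output arithmetic in the comment above can be made concrete. The tile size matches the comment; the pass count and kernel radii below are made-up illustrative numbers:

```python
# How much of a 128x64 GPU tile survives a chain of convolution passes:
# each pass with radius r consumes an r-pixel border on every edge, so
# each pass shrinks both dimensions by 2*r.

def valid_output(w, h, radii):
    for r in radii:
        w, h = w - 2 * r, h - 2 * r
        if w <= 0 or h <= 0:
            return 0  # nothing valid left
    return w * h

tile = 128 * 64                               # 8192 input pixels
passes = [2] * 12                             # a dozen radius-2 stages
print(valid_output(128, 64, passes))          # 1280
print(valid_output(128, 64, passes) / tile)   # 0.15625
print(valid_output(128, 64, [8] * 4))         # 0: wide kernels eat the tile
```

So with a dozen modest stages, less than a sixth of the work produces usable pixels, which is the inefficiency the comment describes; dedicated line-buffer hardware avoids re-reading those borders.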
Posted Aug 18, 2022 22:27 UTC (Thu) by marcH (subscriber, #57642)
How do you think smartphone cameras make PC / UVC cameras look 30 years old?
Posted Aug 19, 2022 4:20 UTC (Fri) by pizza (subscriber, #46)
A good example of this is Google's Pixel phones -- The Pixel 3/3a, 4/4a, 5/5a, and 6a series use the relatively ancient 12.2Mp Sony IMX363 sensor, but the quality of the resulting images improved with each model as Google threw better algorithms and more processing power at the imaging pipeline.
Posted Sep 2, 2022 3:35 UTC (Fri) by dxin (guest, #136611)
Let's look at a very common UVC controller (maybe the only one "modern enough" to support UHD and USB SuperSpeed), the Cypress CX-3. It's more or less an ARM9 that handles the UVC/USB protocol, a DMA engine, and a 512KB RAM image buffer. And no, absolutely no image processing, not even AE/AWB/AF, although the UVC controller can relay commands to the lens motor if the host wants to control the lens.
Worse, noise removal and image enhancement are all left to the application, because as we know the V4L2 kernel driver also does none of that, which is why images from most UVC cameras look so noisy.
For most of Linux's history, the camera API, V4L2, has been just a buffer-management interface. There has never been a user-space middle layer that provides a good, refined camera service to applications, nor has there been a good way of implementing one (PulseVideo?).
The camera stack of Android may not be the best design in the world, but I do consider it to be on the leading edge of usefulness for the majority of applications and users. The Chromium people just want to bring some of that back to Linux.
Posted Sep 3, 2022 18:37 UTC (Sat) by laurent.pinchart (subscriber, #71290)
> For most of Linux's history, the camera API, V4L2, has been just a buffer-management interface. There has never been a user-space middle layer that provides a good, refined camera service to applications, nor has there been a good way of implementing one (PulseVideo?).
That was right, and the libcamera project has been created to fill exactly that space.
> The camera stack of Android may not be the best design in the world, but I do consider it to be on the leading edge of usefulness for the majority of applications and users. The Chromium people just want to bring some of that back to Linux.
There seems to be some confusion here. Chrome OS uses the Android camera HAL3 API internally but, as far as I know, hasn't published any plan to push that API towards all Linux platforms. What they refer to as "kcam" today is a kernel API that they plan to use as a V4L2 replacement, and the current design places it at an even lower level than V4L2. The gap between the kernel API and applications would then be larger than with V4L2 (and it's already pretty large there, as indicated by how Linux has no open-source IPU6 support today).
Posted Aug 18, 2022 16:32 UTC (Thu) by flussence (guest, #85566)
Friends don't let friends buy bad laptops; I'll be sure to speak up if I ever find a good one but I'm almost certain it won't have Intel inside.
Posted Aug 18, 2022 19:42 UTC (Thu) by Sesse (subscriber, #53779)
Posted Aug 19, 2022 4:07 UTC (Fri) by stephen.pollei (subscriber, #125364)
Posted Aug 19, 2022 5:38 UTC (Fri) by rsidd (subscriber, #2582)
Posted Aug 19, 2022 10:53 UTC (Fri) by Wol (subscriber, #4433)
Cheers,
Posted Aug 19, 2022 12:05 UTC (Fri) by excors (subscriber, #95769)
(And Google uses a similar technique for HDR photos - they take several frames captured before the shutter press, plus one long-exposure frame after the shutter press, and combine them into a single photo, with one of the pre-shutter frames as the reference to realign the others in case of motion. (https://ai.googleblog.com/2021/04/hdr-with-bracketing-on-...))
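The benefit of merging a burst of frames (once they have been aligned, as the comment describes) can be sketched numerically: averaging N frames of the same scene keeps the signal while shrinking zero-mean noise by roughly the square root of N. The noise level and frame count here are arbitrary, and this ignores alignment entirely:

```python
# Burst merging sketch: average 16 noisy "frames" of a flat scene and
# compare the RMS error against a single frame. With sigma=10 noise,
# the merged result should be about 4x (sqrt(16)) cleaner.
import random

random.seed(1)
TRUE_LEVEL = 100.0

def noisy_frame(n=1000, sigma=10.0):
    return [TRUE_LEVEL + random.gauss(0, sigma) for _ in range(n)]

def rms_error(pixels):
    return (sum((p - TRUE_LEVEL) ** 2 for p in pixels) / len(pixels)) ** 0.5

one = noisy_frame()
burst = [noisy_frame() for _ in range(16)]
merged = [sum(f[i] for f in burst) / len(burst) for i in range(len(one))]
print(rms_error(one) / rms_error(merged))  # roughly 4
```

The hard part in real pipelines is the alignment, not the averaging: hand shake and subject motion mean the frames do not line up pixel-for-pixel.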
Posted Aug 20, 2022 17:57 UTC (Sat) by Wol (subscriber, #4433)
(and trying to fix it by changing the settings has been an exercise in futility.)
Cheers,
Posted Aug 19, 2022 15:16 UTC (Fri) by mss (subscriber, #138799)
Posted Aug 19, 2022 21:54 UTC (Fri) by marcH (subscriber, #57642)
Smartphones will not make DSLRs redundant by making better picture than DSLRs. They will make DSLRs "nicher and nicher" because no one will be able to see the difference anymore.
Posted Aug 20, 2022 19:26 UTC (Sat) by Sesse (subscriber, #53779)
Posted Aug 20, 2022 22:15 UTC (Sat) by mss (subscriber, #138799)
Posted Aug 22, 2022 13:58 UTC (Mon) by Wol (subscriber, #4433)
I don't know how old they are, but rangefinder cameras predate me, and they are the film equivalent of mirrorless.
And I remember, as a maybe 20-year-old, seeing all those ads for "top-end SLRs can now do 1/250 shutter speed". I had inherited my father's SLR that, at maybe 30 years old, could do 1/500. And its contemporaneous sibling could do 1/1000!
So the technology to outperform your bog-standard SLR has been there for at least 60, 70 years. And of course, it's almost the same technology, only that bit better! Just like inferior VHS beat superior BetaMAX. (I don't know if BetaMAX could have stayed ahead by nicking the same improvements from VHS, the fact is it didn't. Likewise, I don't know if a leaf-shutter rangefinder could have stayed ahead of the SLR - my guess is it could.)
Cheers,
Posted Aug 22, 2022 14:02 UTC (Mon) by Wol (subscriber, #4433)
Cheers,
Posted Aug 22, 2022 16:06 UTC (Mon) by anselm (subscriber, #2796)
The main advantage of the focal-plane shutter as seen in SLRs is that it gives you interchangeable lenses and very fast shutter speeds, at the price of limited flash-sync exposure times. Leaf shutters, which tend to be slower and, for best results, need to be where the aperture is in a lens, are very unusual in interchangeable-lens SLRs, because being part of the – interchangeable – lens makes the individual lenses heavier and bulkier and more complicated and expensive to build (although it was tried, with limited success, in the 1960s), and you also need another mechanism to cover the film inside the camera while you're switching lenses. This is why in the film age, you tended to see leaf shutters in fixed-lens rangefinder-type cameras rather than interchangeable-lens SLRs, certainly for 35mm film.
Today's digital mirrorless ILCs invariably use focal-plane shutters rather than leaf shutters (when they have mechanical shutters at all), so the comparison is a little weak.
Posted Aug 23, 2022 3:46 UTC (Tue) by bartoc (guest, #124262)
Posted Aug 23, 2022 15:16 UTC (Tue) by anselm (subscriber, #2796)
My Olympus mirrorless interchangeable-lens cameras certainly have a mechanical as well as a digital shutter. Both have their uses. AFAIK the only current mirrorless ILC that has only a digital shutter is the (very fancy) Nikon Z9.
Posted Aug 23, 2022 3:43 UTC (Tue) by bartoc (guest, #124262)
I think DSLRs (as opposed to 35mm SLRs, the last of which was the Nikon F6, discontinued in 2020) stayed around for so long because of their phase-detect autofocus; use a telephoto lens with autofocus on an early mirrorless camera and compare it to a DSLR, and the DSLR is much, much faster. So sports photographers and nature photographers still bought them.
I think it was Sony who mostly fixed this with their hybrid AF system
Posted Aug 23, 2022 16:01 UTC (Tue) by anselm (subscriber, #2796)
I think that has changed now that most camera makers support a type of hybrid AF. Modern mirrorless cameras are a lot faster than DSLRs (e.g., an OMDS OM-1 can capture 50 frames per second with continuous AF and no viewfinder blackout), and they also enable nifty features that would be impossible to do on a DSLR, e.g., taking pictures continuously while the shutter release is half-pressed but only storing a number of them around the instant of the actual shutter release (nature and sports photographers love that).
Posted Aug 23, 2022 19:30 UTC (Tue) by bartoc (guest, #124262)
Posted Aug 24, 2022 17:03 UTC (Wed) by plugwash (subscriber, #29694)
1. It makes interchangeable lenses awkward, as either you need to change two lenses at the same time or changes from swapping your main lens won't be reflected in the viewfinder. It would also make coupling the focusing aid to the lens focus tricky.
2. Similarly for zoom lenses: you would need two separate zoom mechanisms, one for the main lens and one for the viewfinder.
3. The two lenses are in different places; this is a problem for framing macro shots and can also lead to your pictures being ruined by an obstruction that was not visible in the viewfinder.
SLRs (whether film or digital) and digital cameras with digital viewfinders avoid these issues.
Posted Aug 20, 2022 14:54 UTC (Sat) by patrakov (subscriber, #97174)
Posted Aug 20, 2022 18:23 UTC (Sat) by Wol (subscriber, #4433)
What matters is to know the chain from shutter-press to final image. They've tampered with images from the dawn of photography.
Cheers,
Posted Aug 20, 2022 22:08 UTC (Sat) by mss (subscriber, #138799)
Posted Aug 23, 2022 3:36 UTC (Tue) by bartoc (guest, #124262)
Firstly, it seems to confuse DSLRs with mirrorless cameras. DSLRs are absolutely going away; I think this is due to improvements in display resolution and (especially) brightness, as well as improvements to autofocus, especially for people who use telephoto lenses a lot. The advantages of mirrorless are huge.
Also, the bit about image sensor size doubling doesn't seem right to me, though it might happen (I think most phones are around 1/3" to 1/2" now; Sony shipped a phone with a 1" sensor, but the lens couldn't cover the whole sensor, making it somewhat pointless). Bigger sensors mean a lens needs a longer focal length to give the same framing (i.e., the smaller sensors are "zoomed in" or "cropped" compared to the bigger ones with the same lens). Lenses with longer focal lengths are bigger, need more elements, and are more expensive to manufacture, to the point that a good ~35mm prime lens (I think most smartphone lenses are about equivalent to that) for a full-frame camera is about as expensive as a mid-range phone. The bigger sensors are more expensive too.
We _have_ been seeing CMOS sensor size in pro cameras get bigger pretty quickly, hell you can even buy a large format digital back now. Medium format digital cameras have been getting very popular over the last few years.
In any case, even if the sensor size goes up to 1", or a bit more there would still be interest in APS-C and full frame ILC cameras, that format is much larger, and you can get physical bokeh/depth of field, which is not possible with phone cameras because the lenses end up with really short focal lengths. Cameras with full-frame or APS-C also still do much better in low light than phone cameras, which is important for both indoor photos and if you are using a telephoto lens or a teleconverter.
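The framing argument in the comments above comes down to the crop factor, the ratio of sensor diagonals. A quick sketch, with illustrative (not vendor-confirmed) sensor and lens numbers:

```python
# Crop factor and "equivalent focal length": a lens on a small sensor
# frames like a proportionally longer lens on a 36x24 mm full-frame
# sensor, scaled by the ratio of the sensor diagonals.
import math

def diagonal(w_mm, h_mm):
    return math.hypot(w_mm, h_mm)

FULL_FRAME = diagonal(36.0, 24.0)  # ~43.3 mm

def crop_factor(w_mm, h_mm):
    return FULL_FRAME / diagonal(w_mm, h_mm)

def equivalent_focal(f_mm, w_mm, h_mm):
    return f_mm * crop_factor(w_mm, h_mm)

# An illustrative phone main camera: a ~6 mm real focal length on a
# ~7.6 x 5.7 mm sensor frames roughly like a 27 mm lens on full frame.
print(round(equivalent_focal(6.0, 7.6, 5.7)))  # 27
print(round(crop_factor(13.2, 8.8), 2))        # the "1 inch" format: ~2.73
```

This is why a bigger sensor needs a physically longer (and bulkier) lens for the same framing, as the comment argues.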
Posted Aug 23, 2022 6:45 UTC (Tue) by jem (subscriber, #24231)
One thing that was holding back mirrorless interchangeable-lens cameras was that they need an electronic viewfinder, since there is no mirror. The electronic viewfinders of the not-so-distant past were low-quality, expensive, and power-hungry. Some die-hard DSLR camera fans still hate the electronic viewfinder.
>The advantages with mirrorless are huge.
One advantage is that the flange to sensor distance can be made much shorter, since there is no mirror in the way. This relaxes some optical constraints, enabling smaller lenses.
Posted Aug 23, 2022 9:21 UTC (Tue) by bartoc (guest, #124262)
Both improved display tech and the better flange distance were factors too. The display tech goes for both the screen and the viewfinder (if present). Screens have gotten a _lot_ brighter over the last few years and that's important for outdoors use.
The small flange distance also lets you use almost any lens made in the last 100 years with an adapter, which is nice.
Actually, I find it a bit of a shame that we'll probably lose the 1" format compact cameras. Some of them are actually quite lovely. That format is small enough that you don't need an interchangeable-lens system to get a nice wide range of (equivalent) focal lengths with pretty wide apertures, and you don't have to carry around multiple lenses. Sony's is like $1100 though, which is more than the high-end iPhone, and the lens is 1.25 stops slower. The fact that the optical image stabilization in the iPhone is useful on an f/1.8 lens does tell you something about how noisy the sensor is, though.
Posted Aug 23, 2022 16:19 UTC (Tue) by anselm (subscriber, #2796)
It's probably just as well to mention that the “1"” in “1" sensor” has nothing to do with the size of that sensor. A “1"” sensor is 13.2 mm × 8.8 mm, which is still tiny. Generally, the problem with physically larger sensors in smartphone cameras is that the larger the sensor is, the larger the lens needs to be, and that conflicts with people's desire to have a sleek, thin phone. Various manufacturers have tried to market phones with more serious lenses, but these phones usually had a more or less prominent “bump” on the back to accommodate the optics and didn't prove very popular.
Dedicated cameras can use better lenses than phone cameras and that often makes a difference because clever software and only looking at your pictures on a smartphone screen or (comparatively) low-resolution monitor will only get you so far. Also, dedicated cameras have lots of nifty buttons and dials that are easier to handle – and in particular easier to handle quickly – than touch screen menus. They're not going away anytime soon.
Posted Aug 23, 2022 23:30 UTC (Tue) by bartoc (guest, #124262)
Posted Aug 24, 2022 0:26 UTC (Wed) by anselm (subscriber, #2796)
The way this works is that a (round) video tube has a useable photosensitive area whose diagonal is approximately 2/3 of the diameter of the tube. This is basically 1930s technology. It turns out that if you have a 1"-diameter video tube, 2/3 of that (16.9 mm) is only a little more than the diagonal of the “1"” sensor, 15.9 mm, so hey presto, the numbers match! What's 6% among friends.
1" sensors make Micro-4/3 sensors, which are frequently – and unjustly – pooh-poohed on the Internet as being utterly unfit for any type of serious photography, look big by comparison. And of course 35mm-type “full frame” sensors are even bigger than that.
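The video-tube arithmetic above checks out; a quick sanity check using the figures from the comment:

```python
# "1 inch" sensor naming: the usable photosensitive area of a video tube
# has a diagonal of roughly 2/3 of the tube diameter, and that figure is
# within a few percent of the actual diagonal of a "1 inch" sensor.
import math

tube_diag = 25.4 * 2 / 3             # 2/3 of a 1-inch tube, ~16.9 mm
sensor_diag = math.hypot(13.2, 8.8)  # "1 inch" sensor diagonal, ~15.9 mm

print(round(tube_diag, 1))                              # 16.9
print(round(sensor_diag, 1))                            # 15.9
print(round((1 - sensor_diag / tube_diag) * 100))       # 6 (percent smaller)
```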
Posted Aug 24, 2022 8:22 UTC (Wed) by bartoc (guest, #124262)
Posted Aug 24, 2022 8:32 UTC (Wed) by bartoc (guest, #124262)
Posted Aug 23, 2022 23:50 UTC (Tue) by bartoc (guest, #124262)
Posted Aug 19, 2022 12:40 UTC (Fri) by flussence (guest, #85566)
Posted Aug 19, 2022 21:59 UTC (Fri) by marcH (subscriber, #57642)
Posted Aug 20, 2022 18:07 UTC (Sat) by tialaramex (subscriber, #21167)
I understand from dealing with audiophiles that "better" actually just means "different and I prefer it", but of course with audiophiles all you do about it is filter the audio until it's as "different and I prefer it" as they wanted. It's no problem to smash CD audio until it's as "rich" (ie distorted) as they think it ought to be and perhaps if it's what people want we can do the same for video meetings.
Posted Sep 1, 2022 10:34 UTC (Thu)
by davidgerard (guest, #100304)
[Link]
Posted Sep 2, 2022 3:27 UTC (Fri)
by andyyeh75 (guest, #160594)
[Link] (1 responses)
HP Elite Dragonfly Chromebook review: the best you can get
Acer Chromebook Spin 714 Review: a new spin on a fan favorite
"The 5MP sensor on the HP Dragonfly along with some software tuning by HP and Google has made this my absolute favorite device for video calling. The details, contrast, exposure and colors are so much better than other Chromebooks that I’ve come to the point where I don’t want to take a video call without it. With the standard privacy shutter on board as well, there’s no doubt that this is the best camera setup we’ve seen on a Chromebook, and I really hope it sets the trend moving forward."
Posted Sep 3, 2022 18:19 UTC (Sat)
by laurent.pinchart (subscriber, #71290)
[Link]
Posted Aug 18, 2022 22:30 UTC (Thu)
by marcH (subscriber, #57642)
[Link] (4 responses)
https://chromium-review.googlesource.com/q/alder+lake (I didn't look)
Posted Aug 19, 2022 1:59 UTC (Fri)
by timrichardson (subscriber, #72836)
[Link] (3 responses)
Posted Aug 19, 2022 8:52 UTC (Fri)
by marcH (subscriber, #57642)
[Link] (2 responses)
But for sure MIPI cameras have been used on Chromebooks before, proof: https://chromium-review.googlesource.com/q/intel+ipu6
It may be possible to find out what a given Chromebook uses via "chrome://system" after a fresh boot (to avoid log wrapping): "Expand All", then search for "camera".
Posted Sep 2, 2022 4:48 UTC (Fri)
by andyyeh75 (guest, #160594)
[Link] (1 responses)
https://review.coreboot.org/plugins/gitiles/coreboot/+/f0...
Find keywords:
Then we can find that some devices come with a MIPI camera powered by the IPU; for instance BRYA(s), REDRIX(s), KANO, and VELL. However, we cannot easily tell what those devices are called on the market, whether they have launched or are still in development, or which OEMs built them. I replied earlier with some review articles (https://lwn.net/Articles/906882/); the HP/Acer designs are probably an immediate match.
Posted Sep 2, 2022 5:31 UTC (Fri)
by marcH (subscriber, #57642)
[Link]
We can:
https://www.chromium.org/chromium-os/developer-informatio...
https://www.google.com/search?q=redrix+chromeunboxed
Posted Aug 18, 2022 23:12 UTC (Thu)
by jafd (subscriber, #129642)
[Link] (1 responses)
But now we are starting to remember.
First there were fingerprint readers.
Certain sound chips.
Now there are webcams.
I dread anticipating new GPUs.
Suddenly “buy hardware two generations old” becomes a viable piece of advice again.
Posted Aug 19, 2022 2:51 UTC (Fri)
by smurf (subscriber, #17840)
[Link]
Posted Aug 19, 2022 1:06 UTC (Fri)
by Cyberax (✭ supporter ✭, #52523)
[Link]
I looked at the API and I don't really see anything that is too scary. It's a fairly normal API to construct a processing pipeline that connects devices together. It's also not that large, around 7k lines in total with plenty of comments.
Adding this API and writing a driver that uses it seems to be a reasonable action.
Posted Aug 19, 2022 1:54 UTC (Fri)
by pabs (subscriber, #43278)
[Link] (5 responses)
Posted Aug 19, 2022 8:17 UTC (Fri)
by laurent.pinchart (subscriber, #71290)
[Link] (2 responses)
Posted Aug 19, 2022 10:34 UTC (Fri)
by khim (subscriber, #9252)
[Link] (1 responses)
Not just SoC vendors. If you look at quality comparisons between different phones, you'll see there are substantial differences in quality even when they use the same sensor and SoC. So yeah, it's an area where people actively don't want you to know what they are doing.
Posted Aug 19, 2022 11:24 UTC (Fri)
by laurent.pinchart (subscriber, #71290)
[Link]
Posted Aug 19, 2022 11:52 UTC (Fri)
by excors (subscriber, #95769)
[Link]
Posted Aug 19, 2022 20:55 UTC (Fri)
by rufio (guest, #160379)
[Link]
Posted Aug 19, 2022 5:45 UTC (Fri)
by wtarreau (subscriber, #51152)
[Link] (2 responses)
One must understand that the fault lies first and foremost with those who agree to build an end-user product from unfinished components and to sell it in a state that doesn't work. Why would Intel change its practices here if Lenovo and Dell continue to buy just as many CPUs without complaining? Any product's success always includes a significant element of luck, and changing anything in the product, or the way it's sold, risks losing that luck factor. This is also why many successful products don't change by an iota for a long time (sometimes to the point of being dismissed as outdated).
Posted Aug 19, 2022 15:00 UTC (Fri)
by khim (subscriber, #9252)
[Link]
Depends on how much they can get from the other ninety-nine percent of users. Don't mix up perception and reality. Successful products change all the time; as long as the provided drivers work, very few people will complain. That's one reason manufacturers tend not to care much about Linux users: not only do they complain about crazy things, they tend to become upset if some component is silently replaced with another that is cheaper or just more easily obtainable.
Posted Aug 24, 2022 19:17 UTC (Wed)
by andy_shev (subscriber, #75870)
[Link]
Posted Aug 19, 2022 8:18 UTC (Fri)
by Archimedes (subscriber, #125143)
[Link] (3 responses)
Hmm, it seems I am the victim of a race condition then; I just bought a Thinkpad X1 Carbon Gen 10 with Fedora Linux installed. It has already arrived and <OT> I needed to reinstall Fedora, as they neither have secure boot on, nor is the disk encrypted ... </OT>
Posted Aug 19, 2022 9:52 UTC (Fri)
by abo (subscriber, #77288)
[Link]
Posted Aug 28, 2022 4:51 UTC (Sun)
by Conan_Kudo (subscriber, #103240)
[Link] (1 responses)
Posted Aug 31, 2022 7:42 UTC (Wed)
by nilsmeyer (guest, #122604)
[Link]
Posted Aug 19, 2022 10:42 UTC (Fri)
by fratti (guest, #105722)
[Link] (1 responses)
Don't define new APIs when the problem is that vendors are not writing good code.
Posted Aug 19, 2022 11:59 UTC (Fri)
by laurent.pinchart (subscriber, #71290)
[Link]
Posted Aug 19, 2022 10:53 UTC (Fri)
by epa (subscriber, #39769)
[Link] (6 responses)
Posted Aug 19, 2022 11:26 UTC (Fri)
by laurent.pinchart (subscriber, #71290)
[Link]
Posted Aug 19, 2022 14:26 UTC (Fri)
by fratti (guest, #105722)
[Link] (3 responses)
Posted Aug 22, 2022 2:12 UTC (Mon)
by dvdeug (guest, #10998)
[Link] (2 responses)
It's not efficient to have an extra part that can fail, requires separate cooling, and talks over an external bus that may be a whole light-nanosecond away. I don't personally expect great growth in specialized hardware; if neural accelerators become something a general-purpose computer needs, they are likely to be folded into the CPU, at the very least in cheap computers.
Posted Aug 22, 2022 15:48 UTC (Mon)
by immibis (subscriber, #105511)
[Link] (1 responses)
Posted Aug 22, 2022 16:15 UTC (Mon)
by farnz (subscriber, #17727)
[Link]
Not legally. Underlying the WinModem problem was the fact that it was outright illegal in many jurisdictions to write DSP code for a part directly attached to the phone line (you'd have to use an acoustic coupler to avoid the regulations).
Even ignoring the legal stuff (which people like Richard Close and Pavel Machek did, and wrote working DSP code), you have the problem that compute resources of the era weren't great, and thus the DSP mostly ended up running in kernel space to meet the required real time guarantees. It was done, and WinModems were made to work, but they were never great - and by the time compute resource was good enough to run the entire DSP in userspace, people had moved away from dial-up.
Posted Aug 19, 2022 16:40 UTC (Fri)
by excors (subscriber, #95769)
[Link]
A lot of stuff falls into that category; but many image processing algorithms have particularly high performance requirements (IPU6 claims to support 4K video at 90fps, which is a lot of pixels in a thin battery-powered device) and are mature enough that Intel can be confident they'll still be useful several years later, in which case it's worth investing in dedicated hardware.
The tradeoff keeps shifting as algorithms change and as general-purpose processors get more efficient and as specialised hardware gets more flexible, but I think the long-term trend is towards increasingly complex combinations of specialised hardware and microcontroller firmware and CPU software, which needs more sophisticated APIs to handle synchronisation between all those components.
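The "a lot of pixels" point above is easy to quantify. A quick back-of-envelope calculation, assuming "4K" means 3840x2160 and (my assumption) 10-bit raw samples:

```python
# Raw pixel throughput for the claimed 4K @ 90fps capability
width, height, fps = 3840, 2160, 90
pixels_per_second = width * height * fps     # ~746 million pixels every second

bits_per_sample = 10                         # common raw sensor bit depth (assumption)
gbits_per_second = pixels_per_second * bits_per_sample / 1e9

print(f"{pixels_per_second / 1e6:.0f} Mpixel/s")
print(f"about {gbits_per_second:.1f} Gbit/s of raw data to process")
```

Sustaining several Gbit/s of image processing on a thin, battery-powered laptop is exactly the kind of workload that justifies fixed-function hardware.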
Posted Aug 19, 2022 12:07 UTC (Fri)
by stefanha (subscriber, #55072)
[Link]
1. Why is V4L incapable of fitting this hardware and why can't the API be extended?
2. Why is the CPU doing so much work that vendors are shipping closed source drivers?
Posted Aug 19, 2022 12:18 UTC (Fri)
by abo (subscriber, #77288)
[Link] (1 responses)
They plan an LD_PRELOAD shim for apps that use the old V4L2 interface. It's like the bad old days of /dev/dsp emulation. But it's for the better: we shouldn't need a kernel module to feed virtual loopback camera feeds into apps.
Posted Aug 19, 2022 12:37 UTC (Fri)
by laurent.pinchart (subscriber, #71290)
[Link]
The LD_PRELOAD trick is still useful in some cases (and it is implemented and has been tested with Firefox), but it is not what most systems should use. I expect desktop Linux to use libcamera through portals and PipeWire. For some more advanced use cases the GStreamer libcamerasrc element could also be useful on the desktop, but will probably see most of its uses in embedded devices. Direct usage of the libcamera API is also possible for specialized applications, and of course Android devices will go through the libcamera's Android camera HAL implementation.
libcamera suffers from the common problem in the free software world of documentation lagging behind code. We need to update the camera stack description. Apologies about that.
Posted Aug 19, 2022 16:51 UTC (Fri)
by sergey.senozhatsky (subscriber, #91933)
[Link] (2 responses)
Imaginary wars.
We have not proposed any API yet and we have never posted anything upstream, so what exactly "is not seen favorably in the kernel community" will forever remain a mystery.
Posted Aug 19, 2022 18:13 UTC (Fri)
by jbnd (guest, #160374)
[Link] (1 responses)
Posted Aug 19, 2022 22:41 UTC (Fri)
by ribalda (subscriber, #58945)
[Link]
Posted Aug 21, 2022 10:27 UTC (Sun)
by sadoon (subscriber, #155365)
[Link] (2 responses)
> There was a time when care had to be taking when
taken*
Posted Aug 21, 2022 13:12 UTC (Sun)
by jake (editor, #205)
[Link]
ouch, indeed ... thanks for the report, but please send them to lwn@lwn.net in the future ...
jake
Posted Aug 24, 2022 5:41 UTC (Wed)
by sadoon (subscriber, #155365)
[Link]
Heh just noticed, my bad.
So yes - Intel seems to have dropped the ball, and AMD definitely picked it up.
IOMMU would then take care of protecting the rest of the system from that blob's actions.
After all, we already have open-source DSLR RAW file processors.
https://www.youtube.com/watch?v=7iy0WJwNmv4
Even worse, if the sensor outputs raw Bayer data (as the majority now do), then the task of Bayer-to-RGB conversion (demosaicing) is left entirely to the application (because neither the UVC controller nor the kernel driver would do it), and any application that cannot do that will simply ignore the camera.
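To make concrete what an application inherits here: in a Bayer mosaic each pixel records only one color channel, and demosaicing reconstructs full RGB. A minimal nearest-neighbour sketch for an RGGB pattern, purely my own illustration (real demosaicers interpolate far more carefully):

```python
def demosaic_rggb(raw):
    """Nearest-neighbour demosaic of an RGGB Bayer mosaic.

    raw: rows of single-channel samples with even dimensions.
    Returns rows of (r, g, b) tuples, one per 2x2 Bayer cell.
    """
    rgb = []
    for y in range(0, len(raw), 2):
        row = []
        for x in range(0, len(raw[y]), 2):
            r = raw[y][x]                            # top-left sample is red
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2  # average the two greens
            b = raw[y + 1][x + 1]                    # bottom-right sample is blue
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# A single 2x2 Bayer cell: R=100, greens 80 and 120, B=60
print(demosaic_rggb([[100, 80], [120, 60]]))  # [[(100, 100.0, 60)]]
```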
Smartphones will kill off the DSLR within three years says Sony : .... stress the role that new hardware will play in lifting phone cameras to new photographic heights. ... While DSLRs and mirrorless cameras will always have an audience among hobbyists and pros due to their handling, creative control, viewfinders and single-shot image quality, the kinds of advances outlined in Sony's presentation show that the next few years are going to be a particularly exciting time for phone cameras.
I'm not an expert, but it sounds like some are claiming that camera hardware in phones is improving.
Wol
Wol
From the linked article:
Image quality from phones will finally trump that of their single-lens reflex rivals by 2024, according to Sony.
That's interesting, considering iPhone 13 Pro has a 44 mm² sensor, while iPhone 14 is supposed to have a main sensor that's 57% larger - so about 69 mm².
That's not even remotely comparable to entry-level APS-C DSLR (332 mm²), not to mention prosumer APS-H DSLR (519 mm²) or professional full-frame ones (~850 mm²).
Image processing can do a lot of perceptual improvement but ultimately can't restore something that the sensor didn't register (or that was hidden in noise), without assuming that the picture data adheres to a certain model.
In this case, however, the missing or corrupted data comes from the model in use, rather than from the captured scene.
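The area figures quoted above can be put side by side; since light-gathering area is what ultimately limits noise performance, the ratios are what matter. Using only the numbers from the comment (the 69 mm² figure is 44 mm² grown by the stated 57%):

```python
sensors_mm2 = {
    "iPhone 13 Pro main":    44,
    "iPhone 14 main":        round(44 * 1.57),   # "57% larger" -> ~69 mm^2
    "entry-level APS-C":     332,
    "APS-H":                 519,
    "full frame":            850,
}

phone = sensors_mm2["iPhone 14 main"]
for name, area in sensors_mm2.items():
    print(f"{name:20s} {area:4d} mm^2  ({area / phone:4.1f}x the phone sensor)")
```

Even the enlarged phone sensor still collects less than a quarter of the light of an entry-level APS-C sensor.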
It's not like the mirror is paramount to image quality anyway; there are positively great pro mirrorless cameras out there now.
You are right, my comment above should really refer to DSLRs and mirrorless cameras together since they share typical sensor sizes (much bigger than used by smartphone built-in cameras).
Wol
Wol
Go ahead and use a telephoto lens with autofocus on a mirrorless camera and compare it to a DSLR; the DSLR is much, much faster than early mirrorless bodies.
Wol
If they invent detail from the model, not from the original scene, how can the result be trusted?
It's worth noting that lossy compression formats have a similar problem.
For example, the JBIG2 algorithm used in faxes, PDFs and DjVu files is prone to silent number glyph substitutions.
I think most phones are like 1/3" to 1/2" now; Sony shipped a phone with a 1" sensor.
In any case, even if sensor sizes go up to 1", or a bit more, there would still be interest in APS-C and full-frame ILC cameras: those formats are much larger, and you can get physical bokeh/depth of field, which is not possible with phone cameras because their lenses end up with really short focal lengths.
https://chromeunboxed.com/hp-elite-dragonfly-chromebook-r...
https://chromeunboxed.com/acer-chromebook-spin-714-review/
Interesting question. I just did some research on the web. Since I know Chromebooks use the open-source coreboot project as their base bootloader, I noticed some clues left in a Kconfig file in coreboot.
select DRIVERS_INTEL_MIPI_CAMERA
select SOC_INTEL_COMMON_BLOCK_IPU
> And they're not going to sacrifice one percent of their userbase that will have to go to competition.
As I have already turned off the camera and mics in the BIOS, for me personally it is no problem, but generally I agree that this is an unpleasant step.
Lenovo laptops with Fedora preloaded do not ship MIPI cameras. They have regular UVC cameras instead.
We still need specialized hardware for this?