16 bits should be enough for anybody?
Posted May 17, 2012 19:41 UTC (Thu) by rgmoore
In reply to: 16 bits should be enough for anybody?
Parent article: LGM: GIMP's new release, new über-core, and future
> Note that your comment assumes that the 16-bit representation is a linear function of brightness, that is, a value such as 12223 will be exactly 12223 times as bright as a value of 1. This would not be true for your typical 8 bits per component pixel formats, although it might be true for 16 bits per component pixel formats. (Is it? I don't know.)
For digital cameras that have high bit depth formats, it usually is. As you suggest, typical 8 bit formats (like JPEG) use a gamma to compress the dynamic range of the camera into the 8 bit per channel format. The higher bit depth formats from digital cameras are usually reserved for raw image formats, which means they're pretty much straight off the ADC and consequently are linear. As far as I can tell, no contemporary camera actually has 16 bits of linear dynamic range, even using a very generous definition of the noise floor. Most of them don't even bother with 16 bit ADCs because they'd just be reading extra noise; 14 bits is still considered a relatively high-end feature.
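The trade-off above can be sketched in a few lines: gamma encoding spends more of the 8-bit code space on dark values, which is exactly what a naive linear requantization of a 14-bit ADC value throws away. This is a simplified illustration, not any camera's actual pipeline; the gamma value of 2.2 is illustrative (real sRGB uses a piecewise curve), and the function names are made up for this example.

```python
# Sketch: compressing a linear 14-bit sensor value into 8 bits,
# with and without a gamma curve. gamma = 2.2 is illustrative;
# real sRGB encoding uses a slightly different piecewise curve.

def encode_gamma(linear, gamma=2.2, in_max=16383, out_max=255):
    """Map a linear value (e.g. from a 14-bit ADC) to an 8-bit code via gamma."""
    return round(out_max * (linear / in_max) ** (1.0 / gamma))

def encode_linear(linear, in_max=16383, out_max=255):
    """Naive linear requantization to 8 bits, for comparison."""
    return round(out_max * linear / in_max)

# Dark values: gamma encoding spreads them across many 8-bit codes,
# while linear requantization crushes them all down to 0 or 1.
for v in (1, 16, 64, 256, 1024, 16383):
    print(v, encode_linear(v), encode_gamma(v))
```

Running this shows why 8-bit formats get away with a gamma curve: the bottom few stops of the sensor's range, which linear requantization collapses into one or two codes, get dozens of distinct 8-bit values instead.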