An "enum" for Python 3
An unexpected perf feature
LWN.net Weekly Edition for May 16, 2013
A look at the PyPy 2.0 release
PostgreSQL 9.3 beta: Federated databases and more
Physics versus perception...
Posted Apr 19, 2012 21:14 UTC (Thu) by Pc5Y9sbv (guest, #41328)
Because our digital color systems encode perceived color and brightness rather than spectral energy distributions, you cannot model effects like selective absorption and re-emission, which are the basis of real-world pigments and filters. The simplistic "red glass" filter might seem to work, but the approach breaks down for the many different lights and glasses we would perceive as red yet which are really mixtures of different wavelengths.
Think of each pixel as a 2D curve of energy over wavelength at that point in the image plane, representing the entire cone of space behind it. Instead of a multi-channel blending function, you would have a transfer function that changes the arriving curve into another curve. However, some light sources and pigments have very narrow emission lines or notch-filtering behaviors, best represented as a variable set of discrete wavelength intervals, while black-body-like radiators have broad spectra best represented as a smoothed curve. Any lossy compression scheme would likely destroy the physical properties you are trying to simulate, unless it were tailored to the specific lights, materials, and effects along the light path.
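To make the idea concrete, here is a minimal sketch of a spectral pixel as energy sampled at discrete wavelengths, with filtering as per-wavelength multiplication instead of per-channel RGB blending. The transmission curve for the "red glass" is made-up illustrative data, not a measured spectrum:

```python
# A "spectral pixel": energy sampled at discrete wavelengths across the
# visible range, filtered by a wavelength-dependent transmission curve.

WAVELENGTHS = list(range(400, 701, 10))  # nm

def filter_spectrum(energy, transmission):
    """Apply a wavelength-dependent filter: per-sample multiplication."""
    return [e * t for e, t in zip(energy, transmission)]

# Flat white light: unit energy at every sampled wavelength.
white = [1.0] * len(WAVELENGTHS)

# A hypothetical "red glass": passes long wavelengths, absorbs most others.
red_glass = [1.0 if wl >= 600 else 0.05 for wl in WAVELENGTHS]

filtered = filter_spectrum(white, red_glass)
```

Two different metameric reds (say, a narrow 640 nm line versus a broad long-wavelength hump) would produce different `filtered` results here, which is exactly the distinction an RGB multiply cannot express.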
A more likely physical simulation approach would be something like Monte Carlo sampled ray tracing or photon mapping: re-evaluating the scene with a large number of discrete wavelength-energy bundles and accumulating a dithered final result in the image plane. As an added benefit, things like refraction would really work right, so you could shine a white light through a prism and get a rainbow out the other side, based on each sample's wavelength-specific interactions with the materials in the scene.
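A toy sketch of that sampling loop, assuming the simplest possible "scene" (a single absorbing filter, the same hypothetical red glass as above) in place of real ray tracing:

```python
import random

# Monte Carlo spectral sampling: each "bundle" carries one wavelength;
# its surviving energy is accumulated into per-wavelength bins at the
# image plane, dithering toward the true transmitted spectrum.

BINS = 30           # wavelength bins across 400-700 nm
N_SAMPLES = 10000

def transmit(wl):
    # Hypothetical filter: passes wavelengths above 600 nm.
    return 1.0 if wl >= 600.0 else 0.05

random.seed(0)
accum = [0.0] * BINS
for _ in range(N_SAMPLES):
    wl = random.uniform(400.0, 700.0)              # emit one bundle
    energy = transmit(wl)                          # interact with the scene
    b = min(int((wl - 400.0) / 300.0 * BINS), BINS - 1)
    accum[b] += energy                             # accumulate at the pixel
```

In a real renderer the `transmit()` call would be a full path trace, and a bundle's wavelength could change along the way (fluorescence), but the accumulate-per-wavelength structure stays the same.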
The conversion from spectral energy into perceived color would have to be delayed until the final accumulated spectral energy plot is available at the image plane. You could imagine either a framebuffer with a 2D curve at each pixel, or perhaps a stack of hundreds or thousands of monochromatic layers each representing one sampled wavelength. Things like absorption and re-emission at different wavelengths would add complication, since a particular "ray" could change wavelength as it traverses the scene.
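That final conversion step amounts to integrating the accumulated spectrum against color-matching functions. A rough sketch, using crude Gaussian stand-ins for the real CIE 1931 observer curves (the constants here are illustrative approximations, not the published tables):

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def cmf(wl):
    """Very rough stand-ins for the CIE 1931 x, y, z matching functions."""
    x = 1.056 * gauss(wl, 599.8, 37.9) + 0.362 * gauss(wl, 442.0, 16.0)
    y = 0.821 * gauss(wl, 568.8, 46.9)
    z = 1.217 * gauss(wl, 437.0, 11.8)
    return x, y, z

def spectrum_to_xyz(wavelengths, energy):
    """Collapse a sampled spectrum into a perceived (XYZ) tristimulus."""
    X = Y = Z = 0.0
    for wl, e in zip(wavelengths, energy):
        xb, yb, zb = cmf(wl)
        X += e * xb
        Y += e * yb
        Z += e * zb
    return X, Y, Z
```

The point is that this projection is lossy and happens only once, at the very end; every operation before it works on full spectra.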
Of course, given all that, we might still wonder why we cannot simulate polarizing filters, iridescence of nanostructures, or constructive/destructive interference of light... that would require an even more complex framebuffer!
Posted Apr 20, 2012 6:23 UTC (Fri) by boudewijn (subscriber, #14185)
Right now, it still compiles, but its practical utility is very limited since the color mixing tool is broken.
Posted Apr 20, 2012 11:19 UTC (Fri) by dgm (subscriber, #49227)
Posted Apr 20, 2012 12:00 UTC (Fri) by boudewijn (subscriber, #14185)
Posted Apr 20, 2012 16:05 UTC (Fri) by Pc5Y9sbv (guest, #41328)
You can experiment with per-channel alpha in GIMP today, splitting your multi-channel layer stack into monochromatic layer stacks and applying per-channel alpha masks on each layer before compositing them back into RGB.
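The arithmetic behind that workaround is simple to sketch. This is not GIMP's API, just a minimal illustration of "over" compositing done independently per channel, each with its own alpha mask:

```python
def over(src, alpha, dst):
    """Standard 'over' compositing for a single channel."""
    return src * alpha + dst * (1.0 - alpha)

def composite_per_channel(rgb, alphas, background):
    """RaGaBa-style: each channel carries its own alpha."""
    return tuple(over(c, a, b) for c, a, b in zip(rgb, alphas, background))

# A layer that is fully opaque in red but mostly transparent in green
# and blue, composited over a mid-gray background:
result = composite_per_channel((1.0, 0.0, 0.0), (1.0, 0.2, 0.2),
                               (0.5, 0.5, 0.5))
```

With a single shared alpha this layer could only tint the gray uniformly; per-channel alphas let it behave a little more like a colored filter, which is exactly what the monochromatic-layer trick exploits.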
While this RaGaBa approach can help one understand RGBA and its limits, it doesn't get you any closer to real color mixing/filtering phenomena. It doesn't free you to get more correct or accurate color mixing without thinking about the limits of color spaces and color rendering; rather, it gives you yet another digital medium in which you have to make hard interpretive decisions that don't translate well to other media. It's a bit like giving an oil painter a different set of paints that behave slightly differently when mixed: he'll have to learn to use them all over again in order to produce the scenes he imagines.
Also, consider scientific imaging, which often has broad spectral information, e.g. images taken with a monochrome sensor and a color filter wheel or a variable-wavelength light source. There are many GIMP-related image processing tasks someone might want to perform when preparing such image sets for final print or web output, whether false-colored or true-to-life. A real spectral-energy pixel and blending system would provide much more meaningful tools for this kind of work.
Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds