Accurate colour correction (for scientific images)
Someone just pointed me at this clip, but as an engineer I now want to know some details about the technique.
A quick search turns up a technical paper as well as this article in Scientific American:
I'm guessing it's not going to really work on, e.g., Simon Brown's photogrammetry data as that's often too deep for enough of the colour to still exist. That and he'd have to retake all the images with a colour chart in shot...
If it's too deep, the photographer probably used lights - whether they were good ones (in terms of quality of light) is another question.
I'm guessing that if they can model the light's absorption from the subject to the lens they could also model it from a light source on the camera to the subject. Mind you, the camera's never going to be very far from the subject if the photographer's using lights or strobes.
The colour chart isn't needed after the training period; there's a thread on Reddit somewhere with the author of the software/algorithm.
My understanding is that it needs to work out the distance from the camera for each pixel. From that it can work out how much haze/back-scatter there is and remove it, and work out how much colour has been absorbed by the water for each pixel and add that back in. So I guess with multiple shots with lights it might be able to handle deep wrecks.
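The two-step idea above (subtract the distance-dependent haze, then undo the distance-dependent absorption) can be sketched with a toy image-formation model. This is only an illustration of the principle, not the published algorithm; all coefficients below are made up, and real water has coefficients that also vary with distance and lighting.

```python
import numpy as np

# Toy underwater image-formation model (a sketch, not the real
# Sea-thru algorithm; every number here is invented):
#   observed    = direct + backscatter
#   direct      = true_colour * exp(-attenuation * distance)
#   backscatter = veil * (1 - exp(-bs_coeff * distance))
# If the per-pixel distance is known, both terms can be inverted.

def restore(observed, distance, attenuation, bs_coeff, veil):
    """Remove the back-scatter haze, then add the absorbed colour back."""
    backscatter = veil * (1.0 - np.exp(-bs_coeff * distance))
    direct = observed - backscatter              # haze removed
    return direct * np.exp(attenuation * distance)  # absorption undone

# Round-trip check on one reddish pixel 5 m from the camera.
true_colour = np.array([0.8, 0.4, 0.2])      # R, G, B
attenuation = np.array([0.40, 0.08, 0.06])   # red absorbed fastest
bs_coeff    = np.array([0.30, 0.10, 0.08])   # hypothetical scattering
veil        = np.array([0.05, 0.25, 0.35])   # blue-green haze colour
distance    = 5.0

observed = (true_colour * np.exp(-attenuation * distance)
            + veil * (1.0 - np.exp(-bs_coeff * distance)))
recovered = restore(observed, distance, attenuation, bs_coeff, veil)
# recovered matches true_colour exactly in this synthetic round trip
```

The round trip only works because the synthetic pixel was generated with the same model; the hard part in practice is estimating the coefficients and the per-pixel distance from the photos themselves.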
To do the distance calculations it either needs more than one photo at different distances or a stereoscopic pair of images.
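For the stereoscopic case, the basic geometry is simple: a feature that appears shifted between the left and right images (the "disparity") is closer the bigger the shift. A minimal sketch, with invented camera numbers:

```python
# Depth from a stereo pair: depth = focal_length * baseline / disparity.
# focal length is in pixels, baseline is the lens separation in metres.
# All values below are illustrative, not from any real camera rig.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    return focal_px * baseline_m / disparity_px

focal_px = 1400.0    # hypothetical focal length in pixels
baseline_m = 0.12    # hypothetical 12 cm between the two lenses

# A feature shifted 20 px between frames is farther away than one
# shifted 80 px.
far = depth_from_disparity(20.0, focal_px, baseline_m)   # 8.4 m
near = depth_from_disparity(80.0, focal_px, baseline_m)  # 2.1 m
```

Real photogrammetry or video-based depth estimation does this matching for every pixel, which is where most of the work (and the error) lives.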
The author also said it will work for video as it can get the distance information from comparing frames.
Found one of the reddit threads: