
Sea-Thru: A Method for Removing Water from Underwater Images - cgranier
https://petapixel.com/2019/11/13/this-algorithm-can-remove-the-water-from-underwater-photos-and-the-results-are-incredible/
======
TeMPOraL
"I really see this as a start of the artificial intelligence boom in marine
science" - the first words of the video in the article.

I don't understand why one would even say something like that. PR value? As I
understood the paper, the presented method is refreshingly free of anything
resembling what we call AI today. It seems to be a combination of good,
old-school photogrammetry and image-processing techniques, which is great,
because with such methods they can actually ensure the result is physically
correct.

~~~
jrpt
Aside from the paper, I think one thing they're referring to is using the
output (color corrected images) as input for AI. One thing they want to do is
count the number of fish in an image, and know which species each fish is. So
you can take pictures of coral reefs and estimate "there's 1,000 species X,
2,200 species Y". With the old images, it's too difficult to determine which
species a fish is. With the new images, it's easier. So Sea-Thru is
preprocessing that'll be useful for AI in marine science.

~~~
DoctorOetker
I don't have the impression it works on moving objects, since it needs
multiple frames from different depths, but it could be used to count static
critters... unless some kind of boom with multiple cameras at different
depths is used.

Also, since the technique removes a foggy haze, it seems like it could be
used for self-driving cars, with multiple cameras along the periphery of the
car, to clean up the image in foggy conditions (fog, smoke, smog, ...).

------
Scaevolus
Neat! This uses Structure from Motion to compute depth to each pixel and
correct backscatter for the specific distance light is traveling from the
subject. Typical flat color correction algorithms can help reduce the blue
tint typical of underwater photography, but it's only physically correct for a
narrow distance band.
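The distance-dependent correction described above can be sketched with a simplified underwater image-formation model: the camera sees the subject's light attenuated by the water, plus a veiling backscatter that grows with range. This is a rough numpy sketch, not the paper's actual algorithm, and the per-channel coefficients below are made-up illustrative values:

```python
import numpy as np

def remove_backscatter(image, depth, beta_d, beta_b, backscatter):
    """Invert a simplified underwater image-formation model:
    I = J * exp(-beta_d * z) + B_inf * (1 - exp(-beta_b * z)),
    where J is the unattenuated scene and z is per-pixel range.
    image: HxWx3 float array in [0,1]; depth: HxW range map in meters."""
    z = depth[..., None]                               # broadcast range over channels
    direct = np.exp(-beta_d * z)                       # attenuation of light from the subject
    veil = backscatter * (1.0 - np.exp(-beta_b * z))   # haze added along the light path
    recovered = (image - veil) / np.maximum(direct, 1e-6)
    return np.clip(recovered, 0.0, 1.0)

# Illustrative per-channel coefficients: red attenuates fastest underwater,
# and the veiling light is bluish.
beta_d = np.array([0.40, 0.12, 0.08])   # direct-signal attenuation (R, G, B)
beta_b = np.array([0.35, 0.15, 0.10])   # backscatter growth
b_inf = np.array([0.05, 0.15, 0.25])    # veiling-light color at infinity
```

The key point is that both the attenuation and the veiling light depend on the per-pixel range z, which is why a flat color shift can only be physically correct for one narrow distance band.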

~~~
perf1
Modern image-editing programs have a dehaze algorithm that can already
estimate a depth map from the haze. I wonder if this is better.
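For reference, single-image dehazing is commonly based on the dark channel prior (He et al.): in haze-free regions, some channel in every local patch is near zero, so the patch minimum approximates the haze density, giving a transmission map that doubles as a coarse relative depth map. A minimal illustrative sketch of the idea, not any particular product's implementation:

```python
import numpy as np

def dark_channel_transmission(image, patch=15, omega=0.95):
    """Estimate a haze transmission map via the dark channel prior.
    image: HxWx3 float array in [0,1], assumed already normalized by the
    estimated airlight color. Returns an HxW transmission map in [0,1]."""
    h, w, _ = image.shape
    dark = image.min(axis=2)                 # per-pixel minimum over channels
    pad = patch // 2
    padded = np.pad(dark, pad, mode="edge")
    out = np.empty_like(dark)
    for i in range(h):                       # per-patch spatial minimum
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return 1.0 - omega * out                 # transmission t ~ exp(-beta * z)
```

Since transmission relates to range as t = exp(-beta * z), a coarse depth map falls out of -log(t), which is presumably what dehaze tools exploit internally.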

~~~
dorkwood
Which programs? I had to generate some depth maps recently and couldn’t find a
good solution.

~~~
bufferoverflow
That's not what the comment above said. They said some imaging programs
create depth maps and process images based on them. It doesn't mean the
program exposes a depth map to the user; it can simply be an internal part of
the whole process.

The Photoshop RAW plugin does dehazing. I don't know how it works.

------
kawsper
A couple of days before I saw the Scientific American article someone dropped
this project in our local development group:
[https://github.com/nikolajbech/underwater-image-color-correction](https://github.com/nikolajbech/underwater-image-color-correction)
the repository contains some test images, and they also provide a website to
test it out:
[https://colorcorrection.firebaseapp.com/](https://colorcorrection.firebaseapp.com/)
that states: "This web-application features a beta version of the Paralenz
color correction algorithm, which features an open-source license".

The two projects seem unrelated at first glance, but the timing is
interesting.

~~~
SimpleMinds
I've skimmed the description of the repo and it does look unrelated. The
paper uses depth information, while the repo is focused on color balance. The
paper mentions that this is an inaccurate way of fixing the colors, as the
color shift changes with distance too.

------
jagged-chisel
I'd love a tool to reverse this process on normal photos so I can put random
family events under water without drowning the family

~~~
soylentcola
You can do it in Photoshop with various color tweaks, blurs, and textures. It
helps a lot to "stage" the original shots for background removal (or to look
like they could already be in an underwater location).

~~~
munificent
_> It helps a lot to "stage" the original shots_

In particular, a fog machine will help emulate the haze and backscatter
produced by water and the particles floating in it.

------
knolax
[https://www.scientificamerican.com/article/sea-thru-brings-clarity-to-underwater-photos1/](https://www.scientificamerican.com/article/sea-thru-brings-clarity-to-underwater-photos1/)

This is the original article linked from the current one; it has more
pictures/content.

~~~
Markoff
The first picture in this article is extremely blurry compared to the
original.
------
weinzierl
While, as explained in the article, the method was developed to help
scientists and not primarily to improve aesthetic qualities, I find most of
the images very pleasing.

The one thing that is disturbing, though, is images with a visible
_"horizon"_ (for lack of a better word). I find the images that look like
they were taken on land but don't have a sky where one would expect it
somewhat uncanny.

------
neovive
These are truly amazing results. Very excited to see some wonderful underwater
photography in the coming years. Also, the name "Sea-Thru" is cleverly
perfect!

------
dylan604
I would be interested in an experiment: export video to a frame sequence,
then let the software process each frame. Would it make the same decisions on
each frame, so that when played back as video there are no visible changes
between frames? While the results are impressive for a single still, using
this to see movement that occurs underwater would be amazing with the
improved color.

~~~
djsumdog
You could also use it to test the accuracy (or at least consistency) of the
algorithm. Sounds like a good follow-up paper.

~~~
dylan604
In the early days of VR, I did the equivalent of this test. The software to
stitch stills was much more robust than the nascent versions of the software
handling video. As an experiment, I ran the video through the video software,
then exported it to an image sequence and had the stills software apply the
same settings to the sequence. When the image sequence was converted back to
video, you could see how the stills software made different decisions for
each frame, resulting in a very psychedelic video. With more tweaking the
trippy effect was reduced but not eliminated, and the video software was
later updated to become much more robust (and impressive).

As a primarily video guy, I always laugh to myself (sometimes not to myself)
at the amount of effort photo editors spend on a single image. I remind them
that the video world has to do that same level of work, except x24 per
second, multiplied by the number of seconds. Photoshop is cool, but Nuke is
mind-blowing.

~~~
666lumberjack
>I remind them that the video world has to do that same level of work, except
x24 per second multiplied by number of seconds.

While this is true, video frames only have to stand up to scrutiny for ~40ms.
You can take a lot of clever shortcuts when an onlooker only has a fraction of
a second to spot the rough edges!

~~~
dylan604
The one crux to my argument. Thanks for knocking me back down from my high
horse

------
pronoiac
I kinda want to see this applied to the diver video. It could surface
artifacts, or look surreal.

------
aitchnyu
I was initially disappointed that the shadow/refraction patterns of the
surface waves are still present in the images. Can we compensate for that?

~~~
bschwindHN
Fun fact, those patterns are called caustics:

[https://en.wikipedia.org/wiki/Caustic_(optics)](https://en.wikipedia.org/wiki/Caustic_\(optics\))

------
TrueDuality
This seems like it's just color correction using stereoscopic distance
modeling and a known reference palette. I'm not sure what is new or novel
about this? Maybe it's just new to oceanographic photography?

~~~
Pfhreak
Inventions often seem obvious or trivial in hindsight. Are you aware of this
technique being used anywhere else? Because this seems pretty novel to me.

~~~
TrueDuality
Yeah, both are common in astrophotography. Color correction using a reference
palette is also pretty common when doing any kind of scientific imaging,
though in a lab setting the distance is usually well established and not
relevant.

~~~
Semaphor
TFA:

> Once trained, the color chart is no longer necessary. As Akkaynak helpfully
> explained on Reddit, “All you need is multiple images of the scene under
> natural light, no color chart necessary.”

On Reddit:

> Just a clarification: the method does NOT require the use of a color chart.
> That part was not clear in the video. All you need is multiple images of the
> scene under natural light, no color chart necessary.

~~~
jsjw7sbw
What annoyed me about that answer is that they didn't explain why they needed
the color chart at all. I would assume it's for training some model, which
would mean the method wouldn't work without the chart out of the box, for
example in muddy waters.

~~~
juliendorra
The chart is needed to validate that the algorithm works: you need known
colors in the image, you need a reference. (The chart is the ground truth.)

Once the chart is back to its exact colors, the image can be considered
corrected (at least for this distance, illumination…).

If the algorithm brings the chart back to its true colors at several distances
and in various conditions, then it can be applied confidently on images
without a chart.
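That validation step could be sketched like this: compare the corrected chart patches against their known colors and accept the correction if the error is small. The reference values below are hypothetical, not the actual chart used in the paper:

```python
import numpy as np

# Known sRGB reference values for a few chart patches (illustrative subset,
# NOT the real chart's values).
REFERENCE = np.array([
    [0.90, 0.90, 0.90],   # light gray
    [0.50, 0.50, 0.50],   # mid gray
    [0.70, 0.15, 0.15],   # red patch
    [0.15, 0.15, 0.70],   # blue patch
])

def chart_error(measured_patches):
    """Mean per-patch RMS error between corrected chart patches and their
    known colors; a small error means the correction recovered true colors."""
    diff = measured_patches - REFERENCE
    return float(np.sqrt((diff ** 2).sum(axis=1)).mean())

def correction_validated(measured_patches, tol=0.05):
    """Accept the correction if the average patch error is below tol."""
    return chart_error(measured_patches) < tol
```

If this check passes at several distances and in various conditions, the algorithm can then be applied with some confidence to images that contain no chart.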

------
Markoff
Is it a physically accurate image, though? I think the untouched photos are
accurate, the same as seen by the naked eye.

Sure, it's more aesthetically pleasing, but it's already distorting reality.

Personally, I think photos should be the most accurate representation of what
a healthy naked human eye sees: no beautification, no extra contrast or
oversaturation, despite that making photos more appealing. If you twist
reality, where will it stop? Where is the border of what is too much?

~~~
leni536
> personally I think photos should be most accurate representation of what
> healthy naked human eye see

Under what lighting?

~~~
Markoff
No artificial lighting that wouldn't be there if not for the photo.

------
johnpowell
I'm not really excited about technology anymore, but I found this absolutely
amazing. The people that made the janky whale pop out of the floor and raised
billions have nothing on this. I'm not joking about how cool I think this is.

------
jansan
Does something like this exist for over-water photography? I remember once
taking a picture of beautiful scenery in the Japanese mountains, and the
result was a very disappointing, almost completely grayed out image.

~~~
grenoire
Shoot in RAW and adjust white balance to your taste, maybe tweak the colour
space too. Professional editing software such as Lightroom, in combination
with high-resolution, high-fidelity RAWs, will allow you to correct even the
_worst_ pictures, as long as it's not pure white or black.

------
ape4
The name is pretty good.

------
fareesh
How does one verify if the output is correct?

~~~
gus_massa
She used a color chart for calibration. I think it should work, but I'm not an
expert in image processing.

An alternative could be to make some similar-looking objects, like fake coral
blocks, use a boat to submerge them in the sea, and compare the corrected
photos with photos taken in air beforehand. (The wet surfaces have a
different look, it would be important to use diffuse light instead of direct
sunlight, and there may be other technical problems with the comparison.)

------
joshdance
The branding of this is on point. Well done.

------
laurent123456
Whether the modified photos look better or not is mostly a matter of taste I
guess. Personally I prefer the non-modified photos for the underwater feeling
they give. It's a cool technique though, and probably useful for scientific
purposes.

~~~
skocznymroczny
I'd be kind of curious about the reverse filter. I wonder how some of my
photos would look if I "underwaterified" them.

------
padwan
r_wateralpha 0.1

