
Algorithm Removes Water from Underwater Images - fortran77
https://www.scientificamerican.com/video/seeing-through-the-sea/
======
ivanech
actual project site with more detailed description and before+after images:
[https://www.deryaakkaynak.com/sea-thru](https://www.deryaakkaynak.com/sea-thru)

paper [pdf]:
[http://openaccess.thecvf.com/content_CVPR_2019/papers/Akkayn...](http://openaccess.thecvf.com/content_CVPR_2019/papers/Akkaynak_Sea-Thru_A_Method_for_Removing_Water_From_Underwater_Images_CVPR_2019_paper.pdf)

~~~
andromeduck
So it seems like it's just MVS -> depth map, plus recovery of haze
characteristics by sampling a color reference at a bunch of distances. That
seems blindingly obvious, unless I missed something.
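For what it's worth, the "sample a color reference at a bunch of distances" step can be sketched roughly like this: a toy, single-channel, backscatter-free fit with invented numbers, not the actual Sea-thru estimation procedure.

```python
import numpy as np

# Toy sketch: if a reference patch of known reflectance J is photographed
# at several known distances z, a simple attenuation coefficient beta in
#   I = J * exp(-beta * z)
# can be recovered by least squares in log space. (Real water also adds
# backscatter, which Sea-thru models separately; this ignores it.)

def fit_attenuation(z, I, J):
    """Fit beta from observations I of a patch with known reflectance J."""
    y = np.log(J) - np.log(I)          # y = beta * z under the toy model
    return float(np.sum(y * z) / np.sum(z * z))

z = np.array([1.0, 2.0, 4.0, 8.0])     # distances in metres (made up)
J = 0.8                                # known patch reflectance (made up)
beta_true = 0.25
I = J * np.exp(-beta_true * z)         # simulated observations
print(round(fit_attenuation(z, I, J), 3))  # → 0.25
```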

~~~
ancarda
This is the most Hacker News comment I've seen in a really long time

Good job, dude

~~~
flir
It's so HN it's almost /.?

------
cetra3
> Sea-thru is protected by a registered patent owned by Carmel Ltd the
> Economic Co. of the Haifa University and its subsidiary SeaErra

Looks like they are going to monetise this technology at some point, given the
disclaimer at the bottom of the page. This is not wrong, but it feels like a
PR exercise dressed up as something academic, which is a little creepy.

~~~
smnrchrds
It is not unusual in academia to patent all innovations with potential
commercial applications. At least in Canada, universities typically have
innovation centres whose main job is encouraging and helping professors,
graduate students, and other researchers patent their innovations and
commercialize them (e.g. by licensing the patent). It is not sinister, it is
normal procedure in academia.

~~~
loeg
And the funding for the research that allowed for this invention was paid for
by? The inventor? Or the government?

~~~
pansa
[deleted]

~~~
mattkrause
Really?

Some professors have “hard money” jobs where the university covers most/all of
their salary; startup packages that are meant to help you get a grant are
pretty common, as are fellowships or TAships for students.

However, I don’t think most universities cover much of the actual research
expenses.

As for the patent, most places offer a split with the inventor, and may not
patent everything; they have a right of first refusal though.

------
chairmanwow1
I hate to say it, but that was actually a really poorly edited and produced
video. It spent way too long on b-roll and did a really poor job framing the
problem.

I would have strongly preferred static images in the article and an interview
video buried below.

~~~
ec109685
The video lets them show a 30-second ad, which monetizes much better than
static ads.

I agree they should have gotten to the punch line and shown results rather
than the doctor swimming.

------
Redoubts
[https://news.ycombinator.com/item?id=21542184](https://news.ycombinator.com/item?id=21542184)

------
a_t48
Some static before/after pictures would _really_ help this article. I get that
it's intended to be consumed as a video, but come on.

~~~
erikig
For anyone that’s interested - skip to the last 30s of the video for the best
before/after examples.

------
wereHamster
Not a single image in the article. An article about images. What a shame.

------
wallflower
Scuba divers already use post-processing in Lightroom or apps like Dive+
([http://dive.plus/](http://dive.plus/)). It will be interesting to see if
this becomes popular in that community. The results are pretty good already
with Dive+.

~~~
peteretep
Looks a lot stronger than the Dive+ images I've seen (and created)

------
blt
This would be a great application for deep learning. Use the authors' method
to generate a lot of uncorrected/corrected pairs, or use a graphics engine to
render realistic underwater scenes with and without water color. Then use a
convolutional neural network to learn to mimic the transformation, so that any
photographer could apply the learned filter without a color card or depth
information.

Edit: already been done:
[https://arxiv.org/abs/1702.07392](https://arxiv.org/abs/1702.07392)
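As a side note, the pair-generation step could be sketched like this: a toy degradation model with invented coefficients, not the method from either paper, just to show how clean/degraded training pairs might be synthesized.

```python
import numpy as np

# Toy sketch: degrade "clean" images with a simple per-channel
# attenuation-plus-backscatter model (made-up coefficients) to produce
# corrected/uncorrected pairs that a network could be trained on.

rng = np.random.default_rng(0)

def add_water(J, z, beta=np.array([0.40, 0.10, 0.05]),
              B=np.array([0.05, 0.20, 0.30])):
    """Simulate underwater degradation of clean RGB image J at depth map z."""
    t = np.exp(-beta * z[..., None])   # per-channel transmission
    return J * t + B * (1 - t)         # attenuated signal + veiling light

clean = rng.random((8, 8, 3))              # toy "clean" image
depth = rng.uniform(1.0, 10.0, (8, 8))     # toy depth map in metres
degraded = add_water(clean, depth)
pair = (degraded, clean)                   # one training pair for a CNN
print(degraded.shape)  # → (8, 8, 3)
```

Since the output is a convex blend of the clean pixel and the background color, values stay in [0, 1] without clipping.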

~~~
aspaceman
You should take a look at the paper itself, which is linked above. It does not
require a color card in all cases, and the only information needed is the
depth, which can instead be estimated from a series of photos if necessary.

------
iicc
What you want is a few lasers of known wavelengths (RGB?) pointing at known
angles such that, from the camera's perspective, they appear as lines (ie they
aren't perpendicular to the plane of the photo sensor).

A calibration image can be made before each shot. Possibly the resulting
image correction could even be integrated into the camera.

The laser wavelengths are a substitute for the color chart. The laser angle
means you get a reading at each distance (i.e. in the image, each point on the
laser "line" corresponds to a distance).

~~~
ynniv
Her algorithm doesn't require calibration. The color chart in many of the
photos is to demonstrate the effect, or was just habit when she took them.

~~~
londons_explore
True, but the method would totally benefit from calibration, because not all
water is made the same: different water has different densities of suspended
particulates.

~~~
nicwest
Also I imagine that depth would be a significant factor

------
ismepornnahi
How is this different from the colour chart idea? If we know how some actual
KNOWN RGB pixels look in a particular setup, we can apply the same filter
across the image. Right?

~~~
Scaevolus
The amount of color shift depends on how much water is between the object and
the camera, so you need to have a depth map to recover the true colors. You
can see how it compares to naive color transforms in the paper.
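For the curious, the distance dependence described here is usually written as an image-formation model, I = J·e^(−βᴰz) + B∞·(1 − e^(−ᴮᵝz)). Below is a toy inversion of that model with made-up coefficients; it is not the paper's estimation procedure, which fits these quantities from the image itself.

```python
import numpy as np

# Toy sketch of the standard underwater image-formation model:
#   I = J * exp(-beta_D * z) + B_inf * (1 - exp(-beta_B * z))
# where z is the object-camera distance. Given z and the coefficients,
# the "true" color J can be recovered by inverting the model.

def restore_color(I, z, beta_D, beta_B, B_inf):
    """Invert the formation model to estimate the water-free color J."""
    backscatter = B_inf * (1.0 - np.exp(-beta_B * z))
    direct = I - backscatter              # remove the veiling light
    return direct * np.exp(beta_D * z)    # undo attenuation with distance

# Per-channel coefficients (R, G, B), invented for illustration;
# red attenuates fastest in water, hence the largest beta.
beta_D = np.array([0.40, 0.10, 0.05])
beta_B = np.array([0.35, 0.12, 0.07])
B_inf  = np.array([0.05, 0.20, 0.30])    # deep-water background color

J_true = np.array([0.8, 0.5, 0.3])       # "true" surface color
z = 5.0                                  # 5 m of water in the path
I_obs = J_true * np.exp(-beta_D * z) + B_inf * (1 - np.exp(-beta_B * z))
J_hat = restore_color(I_obs, z, beta_D, beta_B, B_inf)
print(np.round(J_hat, 3))  # recovers J_true up to float rounding
```

Note how the same observed color maps to different true colors at different z, which is why a single global filter (or a naive white balance) can't do the job.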

~~~
Udik
> so you need to have a depth map to recover the true colors

Isn't the blue shift of a known color already a measure of the amount of water
between the object and the camera, and therefore of its distance? Isn't
knowing the true color of a fish, a seaweed, or the sand already enough to
infer distances and color-correct?

~~~
Scaevolus
Yes, but how do you know the true color of a fish, seaweed, or sand? That's
the unknown part!

~~~
Udik
> how do you know the true color of a fish, seaweed, or sand

I am pretty sure that in most cases those colours are well known.

------
Rainymood
What if I now get a bunch of images before & after and train a neural network
on this to "learn" the mapping. Who would the neural network belong to?

------
lilyball
This looks really cool. She described how she takes photos that include her
color chart, and I'm wondering if that's actually necessary to calibrate the
process, or if that was just done for the purposes of developing it.

~~~
zackangelo
The researcher that authored the paper answered a few questions on Reddit last
year [0]. She explained that the color chart isn’t necessary for every photo.

[0]
[https://www.reddit.com/r/videos/comments/dvts2j/this_researc...](https://www.reddit.com/r/videos/comments/dvts2j/this_researcher_created_an_algorithm_that_removes)

------
stevenjohns
Coincidentally: the researcher's name, Derya, translates to "sea" or "ocean"
in her native Turkish. I wonder if that's something that might have inspired
her as part of this.

~~~
teddyh
[https://en.wikipedia.org/wiki/Nominative_determinism](https://en.wikipedia.org/wiki/Nominative_determinism)

------
nighthawk648
I read this and wonder if the technique could be applied to space. Too bad we
can't take photos of something 50 million light years away at a closing
distance.

I wonder if the algorithm would become better if the author not only swam
closer but also took pictures along different path distances and angles
simultaneously and mapped the images together. Maybe it would reveal that some
of the editing work on hazy objects took a little more liberty, and it would
produce more accurate images.

Great read!

------
vijay_nair
Perhaps a real-time version of this could be embedded into diving goggles (AR)
for murk-free dives.

~~~
dmix
Currently it’s photos only, and it requires a carefully placed colour chart to
sync up the colours.

I don’t think that last bit is automatable with just glasses.

~~~
bradyd
The project website shows a video example and says that a color chart is not
required.

[https://www.deryaakkaynak.com/sea-thru](https://www.deryaakkaynak.com/sea-thru)

------
luxuryballs
I don’t get how it’s not a photoshop, though; it’s just a really specific
photoshop. “See how the stuff would look on land”... but it wouldn’t look like
that on land at all. This is no more real or fake than any other filter
applied to pixels.

~~~
devit
It would be like that without water; that’s the whole point of the algorithm:
based on physics, it computes the (best known approximation to the) way the
scene would look with no water present.

That is, it’s supposed to make the photo look like one you would take if you
were to lift a part of the seafloor onto a boat and photograph it.

~~~
pixel_fcker
Not really: it doesn’t account for specular reflections. It’s more like if you
took the objects out of the water and then used a polarising filter to remove
the highlights.

------
tabtab
NASA and JPL do similar things with Mars surface images to bring out details
and color differences. The orange dusty sky normally washes everything in an
orange tint and softens shading.

------
DocG
An article that is not really an article, but a video about images. Why? :(

------
miguelrochefort
Is anyone aware of something similar that works when the picture is taken
outside the water? Something that could remove reflection and glare.

------
lqet
> It does not use neural networks and was not trained on any dataset.

------
untitled_
That's pretty cool. We could do with a GoPro version of this lol.

------
hinkley
I was watching a blooper reel for an older movie the other day and realized
that a very similar color rectangle appeared in some of the shots.

Made me wonder how unique this technique is, or if they’re just using
bog-standard movie editing tricks.

~~~
_ph_
Using color-correction charts is a very standard photographic technique for
correcting colors in any situation where there is challenging lighting. So
just using one for your underwater photography would give you vastly improved
colors in your pictures. As I understand their work, it uses a correction
model that is distance-dependent, so it corrects the color shift with
distance. In air, changing the distance doesn't change the color appearance,
but under water, the absorption and scattering effects vary with distance.

~~~
ivanhoe
> In air, changing the distance doesn't change the color appearance

Actually it does; it works similarly to water, just with a weaker effect in
air. That's why the sky and distant mountains look blue: the atmosphere
scatters blue light.

~~~
_ph_
Right, with lots of air, the same happens :).

------
thrownaway954
Not trying to be a jerk... but at 2:43 in the video they show footage of fish
swimming in the reef, and those fish are beautiful; their colors aren't
affected by the problem she is describing. So my question is: was that footage
of the fish run through this algorithm? And if not, then what is the actual
point of this groundbreaking technology, when it can already be done without
it?

