
3D Lightning (2013) - fanf2
http://calculatedimages.blogspot.com/2013/05/3d-lightning.html
======
tomn
This is really cool.

When I first saw this, I re-did the 3D reconstruction as a bundle-adjustment
problem, which resulted in this model:

[http://misc.tomn.co.uk/lightning/out.gif](http://misc.tomn.co.uk/lightning/out.gif)

To do this, I found point correspondences between the two images, and set up a
model of the two cameras looking at these points. The camera parameters and
point positions were then optimised to minimise the reprojection error (the
distance between the real positions of the points in the images and the 3D
position projected through the camera models).

This was implemented using ceres, which is worth a look if you're interested
in this kind of thing.
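For anyone curious what this looks like in code, here's a heavily simplified sketch of the same idea using scipy's least_squares instead of Ceres. To keep it short the two cameras are held fixed and only the 3D point positions are optimised, so it's really just noisy triangulation; the real problem also optimises the camera rotations, positions and intrinsics, and all the numbers below are made up.

```python
# Toy reprojection-error minimisation: cameras fixed, points free.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Made-up ground-truth 3D points (the "lightning channel") and two
# camera centres, axes aligned with the world, both looking down +Z.
points_true = rng.uniform(-1, 1, size=(8, 3)) + np.array([0, 0, 10])
cams = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
f = 500.0  # focal length in pixels, assumed known here

def project(points, cam):
    """Pinhole projection of 3D points seen from camera centre `cam`."""
    rel = points - cam
    return f * rel[:, :2] / rel[:, 2:3]

# Observed 2D points: the true projections plus a little pixel noise.
obs = np.stack([project(points_true, c) for c in cams])
obs += rng.normal(scale=0.5, size=obs.shape)

def residuals(x):
    """Reprojection error for every point in both images."""
    pts = x.reshape(-1, 3)
    return np.concatenate(
        [(project(pts, c) - o).ravel() for c, o in zip(cams, obs)]
    )

# Start from a noisy guess and minimise the reprojection error.
x0 = (points_true + rng.normal(scale=0.5, size=points_true.shape)).ravel()
sol = least_squares(residuals, x0)
pts_est = sol.x.reshape(-1, 3)
print("mean reprojection error (px):", np.abs(residuals(sol.x)).mean())
```

With only two views and a handful of points this is about as small as the problem gets; once the camera parameters are unknowns too, the "gets lost easily" behaviour mentioned above shows up quickly.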

It took quite a bit of fiddling to make this work, mostly estimating the
camera parameters manually by hunting around Google Maps for landmarks.
There's not very much data to work with, so the optimisation tends to get lost
quite easily without doing this.

The results are kind of similar, but it's impossible to say which is closer to
the real thing. The reprojection error ended up pretty small (mostly a few
pixels), but there were some errors that I was unable to reconcile --- likely
due to the difficulty of manually estimating point correspondences for a 3D
line in space, possible errors in my constraints, lens distortion etc.

~~~
detaro
This ceres, or something else? [http://ceres-solver.org/](http://ceres-solver.org/)

~~~
mierle
I'm one of the founders of Ceres Solver; let me know if you need any help.

------
dang
Discussed in 2014:
[https://news.ycombinator.com/item?id=7702805](https://news.ycombinator.com/item?id=7702805)

~~~
naibafo
Also top comment in a thread from yesterday:
[https://news.ycombinator.com/item?id=16542395](https://news.ycombinator.com/item?id=16542395)

------
iambateman
Question...I was always told that lightning seeks the highest point on the
ground. If that’s the case, this path doesn’t seem to be very efficient in
getting there.

How does lightning actually choose its path?

Enlighten me. Sorry. Couldn’t resist.

~~~
z2
The explanation I've seen is that multiple leaders actually do a random walk
search algorithm of sorts until one connects.
[https://earthscience.stackexchange.com/questions/580/why-does-lightning-strike-from-the-ground-up](https://earthscience.stackexchange.com/questions/580/why-does-lightning-strike-from-the-ground-up)

------
mirimir
> This means the pair of images are roughly a stereo pair, but with a vertical
> shift instead of a horizontal. This is just like the pair of images you
> would see with your eyes if you had two eyes positioned vertically instead
> of horizontally on your head.

OK, so why can't you just rotate both images 90 deg, and view them as a stereo
pair?
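Something like this is what I mean (numpy arrays standing in for the two photos; which of the rotated frames goes on the left, and whether you view it parallel or cross-eyed, you'd have to work out by trial and error):

```python
# Rotate both frames 90 degrees so the vertical baseline between the two
# cameras becomes a horizontal one, then put them side by side.
import numpy as np

a = np.zeros((300, 400, 3), dtype=np.uint8)  # stand-ins for the two
b = np.zeros((300, 400, 3), dtype=np.uint8)  # lightning frames (H, W, C)

def to_horizontal_pair(a, b):
    # The same 90-degree rotation applied to both images converts the
    # vertical camera displacement into a horizontal one.
    a, b = np.rot90(a), np.rot90(b)
    return np.hstack([a, b])

pair = to_horizontal_pair(a, b)
print(pair.shape)  # (400, 600, 3): two rotated 400x300 frames side by side
```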

~~~
elil17
Like with stereoscopic glasses? Seeing the stereoscopic effect relies on the
displacement between the photos being equal to the distance between human
eyes.

~~~
mikeash
It works with other distances, the perception of depth just scales
accordingly. If you view a pair of images taken with a very wide distance, the
depth will look small, and the scene will look like a model.
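The scaling is easy to see from the usual stereo disparity relation, disparity ≈ f·B/Z (focal length times baseline over depth); the numbers below are purely illustrative:

```python
# Why a wide baseline makes the scene look like a miniature: doubling the
# baseline B produces the same disparity as halving every depth Z, so a
# distant scene shot with a wide baseline "reads" as a nearby model.
def disparity(f_px, baseline_m, depth_m):
    return f_px * baseline_m / depth_m

# An object 1 km away shot with a 10 m baseline shows the same disparity
# as an object 100 m away shot with a 1 m baseline:
print(disparity(500, 10, 1000))  # 5.0
print(disparity(500, 1, 100))    # 5.0
```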

~~~
romwell
This is called "hyper stereo", and is often used when the subject is very
large[1].

To take a 3D photo, say, of a mountain, one needs to space the shots several
meters apart if one wants to get a noticeable stereo effect.

[1]
[https://en.wikipedia.org/wiki/Stereo_photography_techniques#...](https://en.wikipedia.org/wiki/Stereo_photography_techniques#Longer_base_line_for_distant_objects_%E2%80%93_%22Hyper_Stereo%22)
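The usual rule of thumb (mentioned in the article linked above) is a baseline of roughly 1/30th of the distance to the nearest subject; the exact ratio is a matter of taste and lens choice, so treat this as a starting point rather than a formula:

```python
# Hyper-stereo baseline rule of thumb: baseline = nearest distance / 30.
def hyper_stereo_baseline(nearest_distance_m, divisor=30):
    return nearest_distance_m / divisor

# A mountain ridge roughly 3 km away:
print(hyper_stereo_baseline(3000))  # 100.0 metres between the two shots
```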

------
thexa4
He did another one a while later:
[http://calculatedimages.blogspot.nl/2014/07/3d-lightning-2.h...](http://calculatedimages.blogspot.nl/2014/07/3d-lightning-2.html)

------
keyle
Images seem half working? I can't actually see the end result.

