
Neural networks meet space - qubitcoder
http://www.symmetrymagazine.org/article/neural-networks-meet-space
======
hacker_9
_' “The neural networks we tested—three publicly available neural nets and one
that we developed ourselves—were able to determine the properties of each
lens, including how its mass was distributed and how much it magnified the
image of the background galaxy,” says the study’s lead author Yashar Hezaveh,
a NASA Hubble postdoctoral fellow at KIPAC.

This goes far beyond recent applications of neural networks in astrophysics,
which were limited to solving classification problems, such as determining
whether an image shows a gravitational lens or not.'_

Pretty fascinating stuff. Once you think about it, applying NNs to space makes
a lot of sense. There is a ton of data to sift through and find patterns in.
Amazing to think neural nets could crunch through this data in seconds, and
point out areas of interest immediately. I wonder if NNs have been used in the
search for exoplanets yet.

~~~
tachyonbeam
> I wonder if NNs have been used in the search for exoplanets yet.

Might be kind of overkill. The pattern being looked for with exoplanets is
periodic dimming of stars, AFAIK. I don't think you necessarily need a neural
network to sift through that.
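
E.g. a dumb brute-force phase fold already picks out a planet-like dip in a toy
light curve (all numbers below are made up, just to show the kind of signal
involved):

```python
# Toy sketch: spotting periodic dimming in a light curve without any ML.
import numpy as np

rng = np.random.default_rng(0)

# Fake light curve: flat star + noise, with a 1% dip every 3.0 days.
t = np.linspace(0, 90, 5000)                       # days
flux = 1.0 + 0.001 * rng.standard_normal(t.size)
period, depth, duration = 3.0, 0.01, 0.1
flux[(t % period) < duration] -= depth

def folded_depth(t, flux, trial_period, duration=0.1):
    """Phase-fold at a trial period and measure the mean dip depth."""
    in_dip = (t % trial_period) < duration
    return flux[~in_dip].mean() - flux[in_dip].mean()

# Brute-force scan over trial periods; the true period should pop out.
trial_periods = np.linspace(0.5, 10, 2000)
depths = np.array([folded_depth(t, flux, p) for p in trial_periods])
best = trial_periods[np.argmax(depths)]
print(f"best period ~ {best:.2f} d, depth ~ {depths.max():.4f}")
```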

~~~
yh_82367
yes and no. Most exoplanet detections so far have been indirect: people search
for the dimming from an eclipse, or use other indirect methods. So you're right
about that. But recently we've started taking images sensitive enough to see
planets directly (see
[http://planetimager.org](http://planetimager.org)). They aren't using NNs
yet, but there have been discussions of using NNs for various aspects of those
searches.

~~~
sanxiyn
Detecting dimming is actually surprisingly difficult and is currently partly
done by humans using a Mechanical Turk-like platform.
[https://www.planethunters.org/](https://www.planethunters.org/)

------
m3kw9
Wouldn't it take too long to train with that amount of data? I'm thinking the
NN would have to be constantly in training, because you never want to miss
data sets that contain special categories when you get to the inference stage.

~~~
yh_82367
hi, Yashar here (one of the authors). These NNs were really fast to train. A
day or so. We trained them on half a million simulated images. Then they're
good to go for the analysis of any new data. We don't need to keep on training
them as we get more data.
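
In case it helps to see the shape of it: the whole thing is plain supervised
regression on simulated images. This isn't our actual code or architecture -
just a minimal PyTorch-flavored sketch of the idea, with the image size,
parameter count, and batch size picked arbitrarily:

```python
# Minimal sketch (not the real pipeline): a small CNN that regresses lens
# parameters (Einstein radius, ellipticity, ...) directly from an image.
import torch
import torch.nn as nn

class LensNet(nn.Module):
    def __init__(self, n_params=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_params)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = LensNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-ins for the simulated training set: images (N, 1, 96, 96) and the
# true parameters (N, 5) used to generate each one.
images = torch.randn(256, 1, 96, 96)
params = torch.randn(256, 5)

for step in range(100):
    opt.zero_grad()
    loss = loss_fn(model(images), params)
    loss.backward()
    opt.step()

# After training on simulations, analysing a new image is one forward pass.
```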

~~~
jawbone3
How long did it take you to get them to the point where they train in a day?

~~~
yh_82367
surprisingly not that long. We started this in Feb without any expectation
that it would work at all, and after 2-3 weeks of playing with things they
were working great.

~~~
cropsieboss
I love how machine learning went to statistical learning and then back to
machine learning. Playing is exactly the word to describe the discovery
process.

------
hprotagonist
dimensionality reduction is very powerful when you need to traverse a large
configuration space and look for things that seem salient.
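
A toy version of that idea (synthetic data, with PCA as the reduction):
compress everything down to a few components, then flag the samples the
low-dimensional model can't reconstruct - those are the "salient" candidates.

```python
# Toy example: PCA for dimensionality reduction, then flag "salient" samples
# as the ones with large reconstruction error. All data here is synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# 1000 ordinary samples that live near a 5-D subspace of a 200-D space...
X = rng.standard_normal((1000, 5)) @ rng.standard_normal((5, 200))
# ...plus a handful of anomalies that don't share that structure.
X[::250] += 5 * rng.standard_normal((4, 200))

pca = PCA(n_components=5).fit(X)
recon = pca.inverse_transform(pca.transform(X))
error = np.linalg.norm(X - recon, axis=1)
print("most salient samples:", np.argsort(error)[-4:])  # should pick out 0, 250, 500, 750
```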

------
nicklaf
Naive question: is the exponential increase in performance talked about in
this article unique to neural nets? Or are there other techniques for writing
classifiers that yield the same performance increase, given advances in
hardware?

~~~
mholt
Neural networks are function approximators. So if you 1) know an algorithm
that is computationally expensive but not highly random, and 2) have a lot of
inputs and outputs of that algorithm, you can usually train a neural network
to approximate it, almost like a closed-form formula for the algorithm. It
boils down to a bunch of matrix multiplies with some standard non-linear
functions in between.
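
A toy version of that (the "expensive" function here is a made-up stand-in; in
practice you'd log inputs and outputs of the real algorithm):

```python
# Toy illustration of a neural net standing in for an expensive computation.
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_algorithm(x):
    # Stand-in for something slow (a simulation, a numerical solver, ...).
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1] ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(20000, 2))   # lots of recorded inputs...
y = expensive_algorithm(X)                # ...and the matching outputs

# Two hidden layers of matrix multiplies with non-linearities in between.
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000).fit(X, y)

X_new = rng.uniform(-2, 2, size=(5, 2))
print(net.predict(X_new))            # cheap approximation
print(expensive_algorithm(X_new))    # ground truth for comparison
```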

~~~
posterboy
is that anything like polynomial fitting? What about PTIME and NP-completeness?

~~~
mholt
Kind of - but instead of computational complexity in the "NP" sense, you have
lots of data. It's often so hard to get good training data that just waiting
for a big algorithm to finish can be cheaper. So you have to weigh that.

~~~
posterboy
well sure, you said as much before. But I was also thinking: what if P ~= NP,
in the sense that any function can be approximated by a polynomial of
sufficiently high degree?
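
In the approximation-on-an-interval sense that's basically Weierstrass, and
it's easy to poke at with a made-up smooth function:

```python
# Toy check: higher-degree polynomials fit a smooth function on [-1, 1]
# increasingly well (the target function is arbitrary, just for illustration).
import numpy as np

x = np.linspace(-1, 1, 500)
y = np.sin(5 * x) * np.exp(x)

for degree in (3, 9, 15):
    coeffs = np.polyfit(x, y, degree)
    err = np.max(np.abs(np.polyval(coeffs, x) - y))
    print(f"degree {degree}: max error {err:.2e}")
```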

------
pfd1986
Nice. I'm friends with the authors; I'll try to get them to come answer
questions.

~~~
yh_82367
Bob?

------
thearn4
If I understand correctly, it sounds like they used an NN to fit a surrogate
model to the kind of analytical, physics-based pipeline they had been using
before?

