

Neural networks and a dive into Julia - glamp
http://blog.yhathq.com/posts/julia-neural-networks.html

======
dnautics
I am using Julia to casually participate in a bioinformatics contest (I have
no bioinformatics background, but I am pretty darned good at coding and
biochemistry). The technique I'm using involves blasting a subset of the
genome with neural nets, and I'll be damned if Julia isn't fast. I haven't
benchmarked it against Python, but basically the program runs a hybrid
swarm-optimization/gradient-descent technique to find 20:5:1 neural nets; it
is able to find, test, and optimize a swarm of 100 neural nets over 50
iterations in about 2 seconds on my rather slow laptop (2.3 GHz Core i3). A
friend of mine is doing the same contest (using Python), and his eyes popped
out when I told him how efficient Julia was.
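A hybrid like the one described might look roughly like this in Julia. This is a sketch only, not the commenter's actual code: the 20:5:1 network shape and the swarm of 100 particles over 50 iterations come from the comment, while everything else (sigmoid activations, hyperparameters, and a finite-difference gradient standing in for real backpropagation) is invented for illustration.

```julia
# Sketch: hybrid particle-swarm / gradient-descent search over the
# weights of a 20:5:1 feed-forward net. All names and hyperparameters
# here are hypothetical, not from the commenter's program.
using LinearAlgebra

sigmoid(x) = 1 / (1 + exp(-x))

const NPARAMS = 5*20 + 5 + 1*5 + 1   # weights + biases of a 20:5:1 net

function predict(w, x)               # x is a length-20 input vector
    W1 = reshape(view(w, 1:100), 5, 20)
    b1 = view(w, 101:105)
    W2 = view(w, 106:110)
    h  = sigmoid.(W1 * x .+ b1)
    return sigmoid(dot(W2, h) + w[111])
end

loss(w, X, y) =
    sum((predict(w, view(X, :, j)) - y[j])^2 for j in eachindex(y)) / length(y)

# Central-difference gradient; a real implementation would use backprop.
function numgrad(w, X, y; eps=1e-5)
    g = similar(w)
    for i in eachindex(w)
        wp = copy(w); wp[i] += eps
        wm = copy(w); wm[i] -= eps
        g[i] = (loss(wp, X, y) - loss(wm, X, y)) / (2eps)
    end
    return g
end

function swarm_train(X, y; nparticles=100, iters=50, lr=0.1,
                     inertia=0.7, c1=1.5, c2=1.5)
    pos  = [0.1 .* randn(NPARAMS) for _ in 1:nparticles]
    vel  = [zeros(NPARAMS) for _ in 1:nparticles]
    best = copy.(pos)
    bestloss = [loss(p, X, y) for p in pos]
    gbest = best[argmin(bestloss)]
    for _ in 1:iters
        for i in 1:nparticles
            # Standard PSO velocity/position update ...
            vel[i] .= inertia .* vel[i] .+
                      c1 * rand() .* (best[i] .- pos[i]) .+
                      c2 * rand() .* (gbest  .- pos[i])
            pos[i] .+= vel[i]
            # ... followed by a local gradient-descent refinement step.
            pos[i] .-= lr .* numgrad(pos[i], X, y)
            l = loss(pos[i], X, y)
            if l < bestloss[i]
                bestloss[i] = l
                best[i] = copy(pos[i])
            end
        end
        gbest = best[argmin(bestloss)]
    end
    return gbest
end
```

The finite-difference gradient is what makes the sketch so short; swapping in analytic backprop for the gradient step is what would make it fast enough to match the timings claimed above.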

~~~
jamesjporter
This sounds really cool; I'd love to see a blog post about it :)

~~~
dnautics
I would love to, but I got a very late start on this and I'm coming up
against a time crunch... I was without my laptop while traveling because I
left it in the security line... Also, for HIPAA reasons I have to be very
careful about what gets included in the git push, etc. For various reasons
it's a PITA, so I might only do a blog post about it if it's successful, and
after the contest is over.

------
bch
I'm pretty excited about Julia and integrating it into existing
infrastructure[0]. The build process, though -- even the "release" tarball
seems to insist on running git to pull down code/data from the network. If
Julia devs/release-managers are reading: what are the odds of getting
standalone distributions?

[0]
[https://news.ycombinator.com/item?id=7173137](https://news.ycombinator.com/item?id=7173137)

~~~
one-more-minute
We're actually working on static compilation right now, so that you'll be able
to compile Julia code to a small binary and distribute it without the runtime.
It should be completed over the summer.

~~~
bch
Unless I misread, I think you misinterpreted my question: what I want to do
is compile a libjulia.so, link against it, submit code to it, and collect
the results.

Grabbing either the git master tip or the .tgz "distribution" and trying to
compile it results in git being fired up and pulling in _something_ new. Can
that "something" be included in a distribution so I can effectively just:

    ./configure; make; make install

without git, and without hauling down more code/data over the network?

~~~
ihnorton
There are two different questions here. See my post above for the "link
against libjulia" part (possible, not turnkey yet, probably still some warts
to work out).

For compilation without internet access: yes, that is possible. There is a
special "make source-dist" target which will grab everything you need (see
[https://github.com/JuliaLang/julia/blob/master/DISTRIBUTING.md](https://github.com/JuliaLang/julia/blob/master/DISTRIBUTING.md)).
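Concretely, the offline workflow might look something like this (a sketch based on the DISTRIBUTING.md description of `make source-dist`; exact target names and tarball names may differ by Julia version):

```shell
# Sketch of an offline build, assuming `make source-dist` behaves as
# described in DISTRIBUTING.md. Tarball names here are illustrative.

# On a machine with network access:
git clone https://github.com/JuliaLang/julia.git
cd julia
make source-dist        # bundles the source plus all dependency tarballs

# Copy the resulting julia-*.tar.gz to the offline machine, then:
tar xzf julia-*.tar.gz
cd julia-*/
make                    # builds without git or network access
```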

We are not distributing this bundle ourselves right now, but it's worth
considering for the next release version.

~~~
bch
make source-dist sounds promising. I know we've been cross-posting threads,
but I'll just thank you here. :)

------
michaelochurch
Interesting. OP (if you're around): I noticed in the confusion matrix that
everything was classified to the middle classes (5, 6, 7). That makes sense
because the 3s, 4s, and 8s are rare and "true 8s" are still most likely to
have a high probability on the 7 class, because there are far more 7s in the
data. Did you analyze approximate correctness for the probabilities, or
consider sampling from the computed probabilities rather than classifying to
the highest one, to see where that led?
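The sampling idea can be sketched in a few lines of Julia. The probability vector below is a made-up example of a skewed prediction over the classes 3..8 from the confusion matrix, not output from the post's actual model:

```julia
# Sketch: instead of always predicting the argmax class, sample a label
# from the network's predicted class probabilities. `probs` is a
# hypothetical skewed prediction, invented for illustration.
classes = [3, 4, 5, 6, 7, 8]
probs   = [0.02, 0.05, 0.20, 0.35, 0.30, 0.08]

argmax_label = classes[argmax(probs)]   # always predicts 6 here

# Inverse-CDF sampling: rare classes like 8 now get predicted roughly
# in proportion to their probability (about 8% of the time).
function sample_label(classes, probs)
    r = rand()
    c = 0.0
    for (label, p) in zip(classes, probs)
        c += p
        r <= c && return label
    end
    return classes[end]   # guard against floating-point round-off
end
```

Averaged over many draws, the sampled predictions would reproduce the skewed class distribution instead of collapsing everything onto the middle classes.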

~~~
ericchiang
Hi there.

That ML problem is more for example than for rigor. In fact, that particular
problem would probably be better suited to other algorithms (e.g., random
forests).

My background's in biomedical imaging, so I'm quite fond of problems with
skewed class distributions, though I didn't have time to explore this
particular one further.

The code's all openly available if you want to give it a go though :)

