
Show HN: Juggernaut – Experimental Neural Network in Rust - afshinmeh
http://juggernaut.rs/
======
amelius
I'm a bit confused. How useful is this if:

- Rust cannot compile to the GPU

- Neural network programs are usually not large and therefore do not need the
type safety that Rust offers

- All cool neural network research is done on Keras/Tensorflow, so developing
on that platform gives access to new algorithms automatically

- Scripting in Python is in practice at least as fast as anything else,
because you can use Tensorflow, which runs on the GPU

~~~
hobofan
> All cool neural network research is done on Keras/Tensorflow, so developing
> on that platform gives access to new algorithms automatically

I disagree with that assessment. The implementations that come with research
are rarely of a quality that they can just be picked up and used. Yes, the
community is bigger with Python frameworks, which gives you quicker access to
the new stuff, but the effort is about the same as writing a new
implementation in another language.

~~~
danieldk
_which gives you quicker access to the new stuff, but the effort is about the
same as writing a new implementation in another language_

Not really. You can just serialize a Tensorflow graph, freeze the variables as
constants, and then load and run the graph using Tensorflow's small and
convenient C API. There are bindings to the C API for multiple languages (e.g.
Rust and Go).

This is how I use neural networks in my Rust (and formerly Go) programs: I
just build and train the graph in Python and then use it in Rust.

Newer versions of Tensorflow also have the XLA compiler, which compiles a
graph to executable code that you can link directly into an application. I
haven't tried this approach yet, since the C API serves me well, but it looks
promising.

------
yorwba
I'm confused by several of the API choices in the example. Why is the training
set part of the network? I would have expected it to be a parameter to the
train() function. Same for the activation function, shouldn't this be a
property of the layer rather than fixed for the network as a whole?
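For illustration, this is the kind of shape I would have expected (hypothetical Rust, not Juggernaut's actual types; the training loop is elided):

```rust
// Sketch of an API where activation is a per-layer property and the
// dataset is an argument to train(), not part of the network.
#[derive(Clone, Copy)]
enum Activation {
    Sigmoid,
    Relu,
}

struct Layer {
    units: usize,
    activation: Activation,
}

struct Network {
    layers: Vec<Layer>,
}

impl Network {
    fn new(layers: Vec<Layer>) -> Self {
        Network { layers }
    }

    // The training set is passed in here rather than stored on the struct.
    fn train(&mut self, data: &[(Vec<f64>, Vec<f64>)], epochs: usize) {
        // ... backpropagation would go here ...
        let _ = (data, epochs);
    }
}

fn main() {
    let mut net = Network::new(vec![
        Layer { units: 3, activation: Activation::Sigmoid },
        Layer { units: 1, activation: Activation::Relu },
    ]);
    let data = vec![
        (vec![0.0, 0.0], vec![0.0]),
        (vec![1.0, 0.0], vec![1.0]),
    ];
    net.train(&data, 100);
}
```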

I get that this is just in the early stages and more for learning than
anything else, but it doesn't seem very well thought-out IMO.

~~~
afshinmeh
Thanks for your suggestions, yorwba. I do see what you mean about the API and
I will work on this.

For now, if you want to train a model, you need to pass the data to the NN
struct rather than to the train() method, but as you mentioned, it is probably
better to pass it to train().

Sorry for the confusion and thanks for your comment!

------
blahman2
Keep going! I really like being able to follow projects that start small, as
opposed to 'here is my 10,000-line toy project'.

That being said, you will probably get some flak, likely because of the
insane amount of Rust evangelism people on HN have had to deal with.

------
bmh100
What is your goal in writing this project?

------
shadowmint
There's nothing to see here.

Trivial NN implementations are a dime a dozen, and this one is no different.
It's just a partial work in progress; it's not a 'Show HN'; it's just a few
hundred lines of toy code.

...and that's the same feedback it got on /r/rust last week.

I don't see why it's turned up here now.

(Just as a baseline: at this point, if you can't use your NN implementation to
_at least_ run a basic classifier on MNIST, it's probably not worth showing
people.)

~~~
overcast
Basically because it ticks all the buzzword-bingo boxes that work on HN.
Unfortunately, that's how things get to the front page.

~~~
afshinmeh
No, not really! I'm trying to gather information and feedback about my
project. I don't care (at this stage of the project) whether it's on the front
page of HN or not, as long as I get some good feedback to improve it. Thanks
anyway!

