
Neuroph: a lightweight Java neural network framework - chaostheory
http://neuroph.sourceforge.net/
======
viergroupie
Oh God, the horror. Almost every neural network library I've seen manages
to obscure a very simple idea in piles of useless objects.

"Neural networks" are just the composition of several nonlinear regressions.
There's nothing particularly "neural" about them.

Here's a typical 3-layer network:

f(x,Wh,Wo) = tanh(Wo * tanh(Wh * x))

Wh and Wo are the hidden and output weight matrices respectively. Fix some
loss function (e.g., L(x,Wh,Wo,y) = ||f(x,Wh,Wo) - y||^2), get the gradient of
this function, and take a step down the gradient. There's your learning rule.
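To make the point concrete, here's a minimal sketch of that whole recipe in plain Java, with no Connection or Weight objects: the forward pass f(x) = tanh(Wo * tanh(Wh * x)), the backprop deltas for the squared loss, and a gradient step. The class name, layer sizes, learning rate, and initial weights are all arbitrary illustrative choices, not anything from Neuroph.

```java
import java.util.Arrays;

public class TinyNet {
    // elementwise tanh
    static double[] tanh(double[] v) {
        double[] r = new double[v.length];
        for (int i = 0; i < v.length; i++) r[i] = Math.tanh(v[i]);
        return r;
    }

    // matrix-vector product W * x
    static double[] matVec(double[][] W, double[] x) {
        double[] r = new double[W.length];
        for (int i = 0; i < W.length; i++)
            for (int j = 0; j < x.length; j++) r[i] += W[i][j] * x[j];
        return r;
    }

    // forward pass: f(x, Wh, Wo) = tanh(Wo * tanh(Wh * x))
    static double[] forward(double[] x, double[][] Wh, double[][] Wo) {
        return tanh(matVec(Wo, tanh(matVec(Wh, x))));
    }

    // one gradient step on L = ||f(x) - y||^2
    static void step(double[] x, double[] y, double[][] Wh, double[][] Wo, double lr) {
        double[] h = tanh(matVec(Wh, x));   // hidden activations
        double[] o = tanh(matVec(Wo, h));   // outputs
        // output delta: dL/d(pre-activation) = 2(o - y) * (1 - o^2), since tanh' = 1 - tanh^2
        double[] dOut = new double[o.length];
        for (int i = 0; i < o.length; i++)
            dOut[i] = 2 * (o[i] - y[i]) * (1 - o[i] * o[i]);
        // hidden delta: (Wo^T * dOut) * (1 - h^2)
        double[] dHid = new double[h.length];
        for (int j = 0; j < h.length; j++) {
            for (int i = 0; i < o.length; i++) dHid[j] += Wo[i][j] * dOut[i];
            dHid[j] *= 1 - h[j] * h[j];
        }
        // step down the gradient
        for (int i = 0; i < o.length; i++)
            for (int j = 0; j < h.length; j++) Wo[i][j] -= lr * dOut[i] * h[j];
        for (int j = 0; j < h.length; j++)
            for (int k = 0; k < x.length; k++) Wh[j][k] -= lr * dHid[j] * x[k];
    }

    public static void main(String[] args) {
        double[][] Wh = {{0.5, -0.3}, {0.8, 0.2}};  // 2 inputs -> 2 hidden
        double[][] Wo = {{0.1, -0.4}};              // 2 hidden -> 1 output
        double[] x = {1.0, -1.0};
        double[] y = {0.5};
        for (int t = 0; t < 500; t++) step(x, y, Wh, Wo, 0.1);
        System.out.println(Arrays.toString(forward(x, Wh, Wo)));  // close to 0.5
    }
}
```

That's the entire algorithm; everything else a framework adds is bookkeeping around these two functions.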

Now, I understand the desire for flexibility/modularity, but (1) what's the
sense of trying to house supervised and unsupervised methods in the same
hierarchy? and (2) what could possibly justify Connection and Weight objects?

~~~
vicaya
Well, this thing supports GUI/visualization as well, so many of these objects
make sense.

------
Leon
This would be really great to drop into Groovy for whipping up quick ANNs
when prototyping some software.

