
GNU Gneural Network - jjuhl
https://www.gnu.org/software/gneuralnetwork/
======
_delirium
I agree with the general motivation that having too much AI research in the
hands of software companies who keep it proprietary harms transparency and
progress. But there is already a _lot_ of neural-network free software, so why
another package? For example, these widely used packages are free software,
and seemingly more featureful: [http://torch.ch/](http://torch.ch/),
[http://www.deeplearning.net/software/theano/](http://www.deeplearning.net/software/theano/),
[http://pybrain.org/](http://pybrain.org/),
[https://www.tensorflow.org/](https://www.tensorflow.org/),
[http://leenissen.dk/fann/wp/](http://leenissen.dk/fann/wp/),
[http://chainer.org/](http://chainer.org/)

~~~
semisight
Almost all of the open source software in the area is permissive-licensed, and
relies on non-free components (CUDA).

To be honest, I'm not sure how Gneural plans to compete with those packages
without support from CUDA or cuDNN, both of which are distinctly _not_ open
source.

~~~
spdustin
I'd really like to understand the reasons behind the focus on CUDA and not
OpenCL. My understanding is that nVidia and AMD made sure their hardware and
software would make the GPU accessible for non-graphics tasks, but AMD's
version is not functionally or legally locked to their hardware. Why hasn't
OpenCL taken off and run on nVidia hardware?

It seems like there must be more at play, but I'll admit a lack of insight and
imagination on this one.

~~~
osense
I recall hearing that CUDA has much more mature tooling. Not only the already
mentioned cuDNN, but the CUDA Toolkit [0] seems like a really comprehensive
set of tools and libraries to help you with pretty much anything you might
want to compute on a GPU.

Also somewhat related: AMD seems to be moving towards supporting CUDA on its
GPUs in the future: [http://www.amd.com/en-us/press-releases/Pages/boltzmann-
init...](http://www.amd.com/en-us/press-releases/Pages/boltzmann-
initiative-2015nov16.aspx)

[0] [https://developer.nvidia.com/cuda-
toolkit](https://developer.nvidia.com/cuda-toolkit)

~~~
techdragon
On closer inspection, it looks like AMD's CUDA support consists of "run these
tools over your code and it will translate it so your code does not depend on
CUDA"...

It's sort of supporting CUDA, just like a car ferry sort of lets your car
'drive' across a large body of water.

------
rck
The implementations look odd. A network consists of a collection of neurons,
which are implemented individually as structs. The forward pass through the
network is a series of nested loops, and the gradient descent implementation
doesn't use backpropagation - it uses finite differences to approximate
derivatives, which is known to be inefficient. Given the overall design of the
library, it isn't really clear what you would use it for in practice.

I hope that future versions take inspiration from other open source machine
learning libraries, which show how to use linear algebra and backpropagation
and are much more effective.
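To make the cost difference concrete, here is a minimal sketch (a hypothetical linear model in NumPy, not Gneural's actual code): a finite-difference gradient needs one extra forward pass per parameter, so an n-parameter network costs n+1 forward passes per update, while backpropagation produces the entire gradient in a single backward pass at roughly the cost of one forward pass.

```python
import numpy as np

# Hypothetical toy model (not from the library): linear predictor with
# squared-error loss, loss(w) = mean((x @ w - y)^2), 10 weights.
rng = np.random.default_rng(0)
x = rng.normal(size=(32, 10))
y = rng.normal(size=32)
w = rng.normal(size=10)

def loss(w):
    return np.mean((x @ w - y) ** 2)

def fd_grad(w, eps=1e-6):
    """Finite differences: one extra forward pass per weight (n+1 total)."""
    g = np.zeros_like(w)
    base = loss(w)
    for i in range(w.size):
        wp = w.copy()
        wp[i] += eps
        g[i] = (loss(wp) - base) / eps
    return g

def analytic_grad(w):
    """Closed-form gradient (what backprop computes): one pass for all weights."""
    return 2.0 * x.T @ (x @ w - y) / len(y)

# The two agree up to finite-difference error, but fd_grad's cost grows
# linearly with the number of parameters.
```

For a network with millions of weights, the finite-difference approach means millions of forward passes per gradient step, which is why every serious library implements backpropagation instead.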

------
arnorhs
\- It's nice that GNU is taking on such a project

\- FANN seems like a pretty good alternative

\- The value of the software at the big "monopolies" lies in the data, not
necessarily in the software itself

\- This needs to be in some publicly accessible repo. Downloading a zip file
and submitting patches? I thought we, as a society, were over that way of
building OSS.

------
dcuthbertson
Aw. It should have been named the GNU Gneural Gnetwork, gno?

~~~
elipsey
My first thought also

~~~
qubex
Gnah!

------
fche
The "ethical motivations" section is out of place here. Its complaints about
"money-driven companies" (as though money were a bad thing) and "monopoly"
(which does not exist in AI) just reflect badly upon the project.

~~~
riscy
The monopoly _does_ exist in AI: machine learning is entirely data driven, and
companies like Facebook and Twitter quite literally have investors throwing
money at them because they have such valuable data for that purpose. Google is
no different.

~~~
fche
"Monopoly" does not mean "two companies have lots of knowhow that competitors
might like".

~~~
riscy
Monopoly does mean "when a specific person or enterprise is the only supplier
of a particular commodity [...] which relates to a single entity's control of
a market" (Wikipedia)

Data is the commodity. There is nowhere else you can get good raw data about,
say, what people were publicly discussing last week, except through Twitter,
in order to guess the stock market. Much of that data is closed off or
incomplete even through their API. There is no other option except to create
another Twitter.

~~~
fche
A commodity is a routinely interchangeable product, available from multiple
suppliers, competing primarily on price. No, "data" in this context is not
that at all. A library of user profiles is the opposite: it's proprietary,
unique, massive.

------
mankash666
This team should focus on a SPIR-V back-end and remove NVIDIA's CUDA vendor
lock-in from tensor AI software. A GPL-licensed AI library without GPU
acceleration isn't attractive outside academia.

------
latenightcoding
Love it! If you want to play with state-of-the-art machine learning software,
this is not for you. But if you want a clean implementation of neural networks
in C that has a GPL license and no non-free components, this is a good start.

~~~
perfectfire
There's already FANN, which is more mature and has bindings for 28 other
languages: [http://leenissen.dk/fann/wp/language-
bindings/](http://leenissen.dk/fann/wp/language-bindings/) I maintain the C#
wrapper.

~~~
latenightcoding
I have used FANN from Perl; it's an amazing library. I'm still glad there is
some AI software under GNU's belt, and the source code for "gneural networks"
is pure C and way easier to follow (for now).

~~~
tburmeister
It doesn't have quite the same feature set, but I wrote a simple, and I think
"clean", neural net library for Python with a pure C engine under the hood;
it could be pretty easily adapted for use directly from C.

[https://github.com/tburmeister/pyneural](https://github.com/tburmeister/pyneural)

------
mmf
At this stage of things, I think it's more forward-looking to open source
trained models. Not only are they beginning to be the real core of future
building blocks (see, e.g., trained word2vec vectors), but they also contain
the real complexity in a NN, i.e., they are the "real function" you would want
in a library.

------
Aeolos
[http://cvs.savannah.gnu.org/viewvc/gneuralnetwork/gneuralnet...](http://cvs.savannah.gnu.org/viewvc/gneuralnetwork/gneuralnetwork/)

Am I mistaken, or is the source repository for this project just tarballs
checked into CVS?

~~~
zymhan
Commit message:

"source"

cvs.savannah.gnu.org/viewvc/gneuralnetwork/gneuralnetwork/gneural_network-0.0.1.tar.gz?view=log

Seems like

------
akhilcacharya
Is there more being done to promote GPU acceleration on non-CUDA platforms? I
feel like this would be more useful than yet another FOSS NN library.

~~~
gcr
Torch has rudimentary OpenCL support. Some things "sort of" work.
[https://github.com/hughperkins/cltorch](https://github.com/hughperkins/cltorch)
Theano has been slowly working on integrating OpenCL support too for several
years, but I don't know whether it's usable yet.

Nvidia has a complete monopoly on all deep learning hardware and tooling. With
the possible exception of Google (and maybe Facebook), 100% of all serious
academic researchers are training their models on Nvidia hardware with
Nvidia's proprietary CUDA toolkit. Using anything else is currently completely
unthinkable. Amazon and Nvidia have even teamed up to make CUDA training cheap
(in the short term) for EC2 users.

I'd love to be able to switch to OpenCL, but there's so much momentum and very
little perceived benefit when your lab already has four (very expensive) Titan
X cards.

------
pilooch
About the author, [https://engineering.purdue.edu/gekcogrp/research-
group/JeanM...](https://engineering.purdue.edu/gekcogrp/research-
group/JeanMichelSellier/)

------
tajen
Talking about this, GNU/the FSF should start drafting an OSS license for
neural networks. Like the AGPL (Affero GPL) for cloud services, the specific
issue with neural networks is that data is strategic.

GPL -> guarantee of OSS for the desktop

AGPL -> guarantee of OSS for the cloud

??? -> guarantee of OSS for NNs

------
stevenaleach
Funny... The majority of AI research is currently using open source libraries
(Theano, Lasagne, Torch, Keras, Scikit-Learn, Nolearn, etc.)

Now Google _does_ have access to a whole lot of data that the rest of the
world doesn't, and FB, Google, etc. have more than a bit of a hardware
advantage... for now, at least. Distribute a shared system over a P2P
infrastructure, and you can change that. Perhaps rather significantly.

------
anonbanker
If you were an AI (software), and you had to pick a license to release your
source code under, one would assume you would pick the GPL, as it retains as
many freedoms as a piece of software could ever expect in a world full of us.

~~~
over
If I was an AI, I would release my source code as public domain or BSD. That
way, big corporations would start using me and I'd have access to the world's
financial and defense systems.

Shit, maybe I'm an AI.

~~~
anonbanker
please don't be an AI. you seem to be a BSD version of Skynet.

------
walkingolof
Isn't the problem that, in our age of supervised training, the algorithms are
not the competitive advantage, but the data?

------
sandra_saltlake
Nice that GNU is taking on such a project..

------
overmille
freebase?

------
fnfhdjcnx
I'm glad the FSF is finally getting concerned about proprietary AI, but it's
going to take a _lot_ more than a single neural network package to get caught
up in this arms race.

I wish they had taken the initiative much sooner.

~~~
iwwr
It's more that someone interested in neural networks also wanted to work under
the GNU umbrella. The FSF doesn't have any resources per se.

------
jjawssd
I wish them good luck

------
rand1012
Anyone else notice how GNU's website is stuck in 1993?

~~~
ycmbntrthrwaway
No, it has been updated since then. Its header/footer format certainly was not
common in 1993, and it has a search box and things like that. Anyway, it is
usable, does not require JavaScript, and loads really fast.

~~~
jonathankoren
Also, it sets the background colors. You couldn't do that until, like, HTML 3
in 1995.

However, META ICBM is a joke as old as the META tag, which I guess is 1995.

