
Tiny-dnn – A C++11 implementation of deep learning - chang2301
https://github.com/tiny-dnn/tiny-dnn
======
krona
If you're interested in C++ Frameworks that take a similar approach (e.g. a
comprehensive C++ API, statically defined networks etc.) and you aren't afraid
of meta-programming, then I'd also take a look at the dlib implementation:
[http://blog.dlib.net/2016/06/a-clean-c11-deep-learning-
api.h...](http://blog.dlib.net/2016/06/a-clean-c11-deep-learning-api.html)

~~~
hellofunk
I was shocked to read in that article that MS Visual Studio still does not
support the full C++11 spec, which is now over 5 years old. What? I don't use
MS products at all, but this really surprises me. Why are they so far behind?

~~~
fetbaffe
What would that be? Visual Studio 2015 seems to be fully covered except for
the C99 preprocessor, _if_ this table really covers all C++11 features.

[https://msdn.microsoft.com/en-
us/library/hh567368.aspx](https://msdn.microsoft.com/en-
us/library/hh567368.aspx)

Why have they been behind? Wild guess: Microsoft thought that C#/.NET would be
the future for everything and scaled back on C++, but with Windows 10 they
realized that the world still consists largely of C++, so now it is an equal
member again alongside .NET and HTML5/JavaScript, i.e. the Universal Windows
Platform.

~~~
pjmlp
> Wild guess: Microsoft thought that C#/.NET would be the future for
> everything and scaled back on C++, but with Windows 10 they realized that
> the world still consists largely of C++, so now it is an equal member again
> alongside .NET and HTML5/JavaScript, i.e. the Universal Windows Platform.

Actually it is a bit more complex than that.

.NET used to belong to DevTools and C++ to WindowsDev.

WindowsDev lost the political wars to DevTools when MS went full .NET, but
after the technical and political battles that caused the Longhorn failure,
WindowsDev regained ground.

Hence, starting with Vista, the "Going Native" motto was born, and COM gained
ground as the way to expose more OS APIs to user space.

Alongside this "Going Native" wave, Singularity and Midori meant bringing
those efforts back to .NET, thus creating .NET Native.

UWP's original design, WinRT, can actually be traced back to .NET before the
CLR was created, when they were designing the COM 2.0 Runtime as the future
way to use VB and C++ on Windows.

Also, before you rejoice too much about C++'s role, check how many C++-related
talks there are at Build, Ignite and Connect().

C++ is seen as the official systems programming language for kernel
programming, drivers, games and GPGPU. For everything else, the documentation
or conference sessions tend to focus on .NET languages.

~~~
trentnelson
> C++ is seen as the official systems programming language for kernel
> programming, drivers

Kernel and driver development is still very much C.

~~~
Sean1708
I thought Windows did away with C completely and only supports C++ now?

~~~
pjmlp
The official statement on Herb's blog pointed in that direction.

[https://herbsutter.com/2012/05/03/reader-qa-what-about-vc-
an...](https://herbsutter.com/2012/05/03/reader-qa-what-about-vc-and-c99/)

Modern C support has only been implemented to the extent required by ANSI C++,
and by integrating the clang frontend with the VC++ backend (C2).

With C2, Microsoft is doing something similar to LLVM for all their languages
(.NET Native, VC++, clang frontend).

------
fest
When I looked for a small ANN library with few external dependencies that
could be statically linked, I settled on FANN[0].

It worked reasonably and solved my problem as well as I hoped it would. It is
rather limited in features though - no training on GPU, single-threaded by
design, etc.

tiny-dnn appears to have a lot more choices regarding network architecture and
parallelization options. I would definitely have tried tiny-dnn first if I had
known about it.

[0]: [http://leenissen.dk/fann/wp/](http://leenissen.dk/fann/wp/)

~~~
gcp
FANN doesn't do DCNNs, which are arguably the most important architecture
right now.

~~~
hellofunk
> arguably the most important right now.

But for very specific uses, right? DCNNs excel at images and video, but are
they also common for other tasks?

~~~
mark_l_watson
Surprisingly (to me at least) convolutional neural networks (CNN) are also
useful for natural language processing. (I am using CNN for NLP at work.)

~~~
dr_zoidberg
CNNs in general excel at inferring/decoding "context" from the vectors, so it
makes sense to use them for language. But yes, there are cases where they
don't make sense or just add complexity.
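A toy sketch of that intuition (everything here is hypothetical, not from any
particular library): a 1-D convolution slides one filter over a sequence of
word vectors, so each output value summarizes a small window of neighboring
words, i.e. the local context:

```javascript
// Toy 1-D convolution over a sequence of word vectors.
// Each "word" is a 2-dim vector; the filter spans a window of 2 words,
// so every output value summarizes one bigram of the input.
const conv1d = (seq, filter) => {
  const win = filter.length;             // window size in words
  const out = [];
  for (let t = 0; t + win <= seq.length; t++) {
    let sum = 0;
    for (let i = 0; i < win; i++)
      for (let d = 0; d < seq[0].length; d++)
        sum += filter[i][d] * seq[t + i][d];
    out.push(Math.max(0, sum));          // ReLU non-linearity
  }
  return out;
};

const sentence = [[1, 0], [0, 1], [1, 1], [0, 0]]; // 4 "words", dim 2
const filter = [[1, -1], [-1, 1]];                 // spans 2 words
console.log(conv1d(sentence, filter));             // one value per bigram
```

A real NLP model would learn many such filters and pool their outputs, but
the mechanics are the same: each activation only "sees" a few adjacent words.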

------
inglor
"98.8% accuracy on MNIST in 13 minutes training" - that's really slow, isn't
it? I can probably write vanilla JS or Python code that does that in 10 lines
or so and finishes in around 5 seconds.

Why is this taking so long?

~~~
lexy0202
> I can probably write vanilla JS or Python code that does that in 10 lines or
> so and would finish in around 5 seconds.

Really? There are certainly frameworks that you can use to achieve this
performance in that amount of code, but I would be really interested to see
that in vanilla JS/Python.

~~~
inglor
Hey, sure - here are 9 lines of code that implement a perceptron. I get ~95%
precision after a second with 400 samples.

```js
// dotSign takes the dot product of two vectors (the reduce seed of 1
// acts as a fixed +1 bias) and returns its sign as +1/-1.
const dotSign = (v1, v2) =>
  v1.reduce((prev, x, i) => prev + x * v2[i], 1) > 0 ? 1 : -1;

// One training pass of the perceptron rule; labels are expected to be +1/-1.
module.exports = (data, weights = Array(data[0].content.length).fill(0)) => {
  for (const {label, content} of data) {
    const delta = (label - dotSign(content, weights)) / 2;
    weights = weights.map((x, i) => x + delta * content[i]);
  }
  return { perceive: vector => dotSign(vector, weights), weights };
};
```

I can upload an Electron app that does this with MNIST if anyone is
interested.
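For anyone who wants to try the snippet above, here is a self-contained toy
run (the module body inlined as `train`; the dataset is a made-up, linearly
separable 2-D grid rather than MNIST, and labels are assumed to be +1/-1):

```javascript
// The perceptron from above, inlined. The `1` passed to reduce acts
// as a fixed +1 bias term that is never updated.
const dotSign = (v1, v2) =>
  v1.reduce((prev, x, i) => prev + x * v2[i], 1) > 0 ? 1 : -1;

const train = (data, weights = Array(data[0].content.length).fill(0)) => {
  for (const {label, content} of data) {
    const delta = (label - dotSign(content, weights)) / 2;
    weights = weights.map((x, i) => x + delta * content[i]);
  }
  return { perceive: vector => dotSign(vector, weights), weights };
};

// Toy dataset: a 2-D grid labeled by sign(x0 + x1), with a margin
// around the boundary so the classes are cleanly separable.
const data = [];
for (let i = 0; i <= 16; i++)
  for (let j = 0; j <= 16; j++) {
    const x = [i * 0.25 - 2, j * 0.25 - 2];
    if (Math.abs(x[0] + x[1]) > 0.5)
      data.push({ content: x, label: x[0] + x[1] > 0 ? 1 : -1 });
  }

// Each call to train() is one pass over the data; loop for a few epochs.
let model = { weights: Array(2).fill(0) };
for (let epoch = 0; epoch < 50; epoch++)
  model = train(data, model.weights);

const correct = data.filter(d => model.perceive(d.content) === d.label).length;
console.log(`training accuracy: ${correct}/${data.length}`);
```

On separable data like this the perceptron convergence theorem guarantees it
eventually classifies every training point correctly.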

~~~
anentropic
how long does it take to get from 95% to 98.8%, though?

~~~
lunula
This. You can also get near 98% accuracy with vw (Vowpal Wabbit) and one
quadratic interaction over the pixel space. It takes seconds. But that last
0.8%? Forget about ever getting there, even with infinite training time.

------
FraKtus
Sorry if the question is not relevant, but would it be difficult to use this
project to implement something like DeepDream
([https://github.com/google/deepdream](https://github.com/google/deepdream))?

~~~
chaosmail
This should be possible and fairly easy to do, I guess - it's just a forward
and backward pass through the net with your favorite model, collecting the
loss across all octaves. You will have to port a few image utility functions
(like roll, zoom, etc.) if they are not available. I ported the basic Deep
Dream example to JavaScript [1] and it was not that difficult.
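The octave loop described above, sketched as toy JavaScript (all hypothetical:
a real port would run a forward and backward pass through a trained net to get
the gradient, whereas here the "gradient" is a stand-in that just scales
pixels, and the "image" is a 1-D array):

```javascript
// Stand-in for the net's gradient: gradient ascent on a dummy score
// sum(px^2), whose derivative w.r.t. each pixel is simply 2 * px.
const gradStep = (img, lr) => img.map(px => px + lr * 2 * px);

// Crude nearest-neighbor zoom for a 1-D "image".
const zoom = (img, factor) => {
  const out = [];
  for (let i = 0; i < Math.round(img.length * factor); i++)
    out.push(img[Math.min(img.length - 1, Math.floor(i / factor))]);
  return out;
};

// DeepDream outer loop: a few gradient-ascent steps per octave,
// then zoom up and repeat at the next scale.
let img = [0.1, -0.2, 0.3, -0.4];
for (let octave = 0; octave < 3; octave++) {
  for (let step = 0; step < 10; step++) img = gradStep(img, 0.05);
  img = zoom(img, 1.5);
}
console.log(img.length);
```

The real algorithm also re-injects the detail lost by zooming at each octave;
that part (like roll) is one of the utility functions you would have to port.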

When looking through the code I saw that backpropagation for LRN is not
implemented yet, so you cannot pick a model that uses LRN (or you would have
to implement it yourself).

[1]
[https://github.com/chaosmail/caffejs/blob/master/docs/assets...](https://github.com/chaosmail/caffejs/blob/master/docs/assets/scripts/deepdream_worker.js)

~~~
FraKtus
Thank you!

------
EvgeniyZh
Contributor here. Feel free to ask questions. We are working on GPU support
now.

------
mark_l_watson
This looks interesting. I cloned the repository and am checking it out. The
examples all use binary image training data supplied with the repo. It would
be good to document data preparation better (sorry in advance if I just missed
seeing that).

20+ years ago I made my living with C++, but dropped it. Looking at the
example programs using C++11 features, I am motivated to get up to speed on
C++11. I have put writing an NLP example for tiny-dnn and sending a pull
request on my todo list.

------
tinco
Can anyone recommend a book, long article, or tutorial on deep learning? I'm
not in the field but am interested in what particular advances have made deep
learning suddenly so much more attractive than regular neural networks were
back in the 00s.

~~~
chaosmail
Check out this blog post about deep learning for computer vision applications
[1]; it describes the historical development and the current advances very
well. In short, not too much has changed since LeCun's ConvNets in the 80s.
Disclaimer: I wrote this article.

[1] [http://chaosmail.github.io/deeplearning/2016/10/22/intro-
to-...](http://chaosmail.github.io/deeplearning/2016/10/22/intro-to-deep-
learning-for-computer-vision/)

------
philliphaydon
When I saw "tiny-dnn" I thought: Huh? Tiny Dot Net Nuke?? No such thing
exists. DNN is too much bloat!

~~~
philliphaydon
Wow, HN has no sense of humour today.

~~~
hellofunk
HN has lots of humour and an appreciation for clever wit and sarcasm. I'm
afraid that your effort, while noble, failed to reach this bar.

