Hacker News

How does Keras compare to Lasagne [0], which is also Python/Theano based, and which was used with some impressive results [1]?

  [0] https://github.com/benanne/Lasagne
  [1] http://benanne.github.io/2015/03/17/plankton.html

One of the authors of Lasagne here! Lasagne is being built by a team of deep learning and music information retrieval researchers. Keras seems to share a lot of design goals with our project, but there are also some significant differences.

We both want to build something that's minimalistic, with a simple API, and that allows for fast prototyping of new models. Keras seems to be built 'on top of' Theano in the sense that it hides all the Theano code behind an API (which looks almost exactly like the Torch7 API).

Lasagne is built to work 'with' Theano instead. It does not try to hide the symbolic computation graph, because we believe that is where Theano's power comes from. The library provides a bunch of primitives (such as Layer classes) that make building and training neural networks a lot easier. We are also specifically aiming at extensibility: the code is readable and it's really easy to implement your own Layer classes.
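The composition and extensibility ideas above can be sketched in plain Python. This is purely illustrative: the class and method names are invented for this sketch, not Lasagne's actual API, and simple numeric functions stand in for Theano's symbolic expressions.

```python
# Illustrative sketch of layer primitives that build up an expression,
# and how a user-defined Layer class plugs in by overriding one method.

class Layer:
    def __init__(self, incoming=None):
        self.incoming = incoming  # the layer below, if any

    def get_output(self, x):
        # Ask the layer below for its output, then apply this layer's op.
        below = self.incoming.get_output(x) if self.incoming else x
        return self.apply(below)

    def apply(self, h):
        return h  # identity; subclasses override this

class ScaleLayer(Layer):
    """A custom layer: subclass Layer and override apply()."""
    def __init__(self, incoming, factor):
        super().__init__(incoming)
        self.factor = factor

    def apply(self, h):
        return h * self.factor

# Stacking layers composes their operations into one expression.
net = ScaleLayer(ScaleLayer(Layer(), 2.0), 5.0)
print(net.get_output(1.0))  # 10.0
```

In the real library the composed result would be a Theano expression that can be inspected, differentiated, and compiled, rather than a number.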

Another difference seems to be how we interpret the concept of a 'layer': a Layer in Lasagne adheres as closely as possible to its definition in the literature. Keras (and Torch7) instead treat each 'operation' as a separate stage, so a typical fully connected layer has to be constructed as a cascade of a dot product and an elementwise nonlinearity.
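The contrast can be shown with a toy example in plain Python (the names are illustrative, not the real Keras or Lasagne APIs):

```python
def dense(w, h):   # dot product only (scalar stand-in)
    return w * h

def relu(h):       # elementwise nonlinearity only
    return max(h, 0.0)

# Keras/Torch7 style: each operation is its own stage in a cascade.
stages = [lambda h: dense(2.0, h), relu]
out = -3.0
for stage in stages:
    out = stage(out)
print(out)  # 0.0: dense gives -6.0, then relu clips it

# Lasagne style: one layer bundles the dot product and the nonlinearity,
# matching the textbook definition of a fully connected layer.
def dense_layer(w, nonlinearity, h):
    return nonlinearity(dense(w, h))

print(dense_layer(2.0, relu, -3.0))  # 0.0, same computation, one unit
```

Both compute the same thing; the difference is only where the boundary of a 'layer' is drawn.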

Layers are also first-class citizens in Lasagne, and a model is usually referred to simply by its output layer or layers. There is no separate "Model" class because we want to keep the interface as small as possible, and so far we've done fine without it. In Keras (and Torch7), layers cannot function by themselves; they need to be added to a model instance first.
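The "model is just its output layer" idea works because each layer keeps a reference to the layer below it, so the whole network can be recovered by walking back from the output. A minimal sketch, with invented names rather than the library's real API:

```python
class Layer:
    def __init__(self, name, incoming=None):
        self.name = name
        self.incoming = incoming  # the layer below, or None for the input

def collect_layers(output_layer):
    """Walk from the output layer back to the input and list every layer."""
    layers = []
    layer = output_layer
    while layer is not None:
        layers.append(layer.name)
        layer = layer.incoming
    return list(reversed(layers))

# The whole 'model' is referred to by its output layer alone.
net = Layer("dense2", Layer("dense1", Layer("input")))
print(collect_layers(net))  # ['input', 'dense1', 'dense2']
```

No separate container object is needed; the layer graph itself carries all the structure.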

For now, all Lasagne really does in the end is make it easier to construct Theano expressions. We don't yet have any tools for iterating through datasets, for example, but we have plans in this direction and intend to rely heavily on Python generators. The scikit-learn-like "model.fit(X, y)" paradigm, which Keras also seems to use, only really works for small datasets that fit in memory. For larger datasets, we believe generators are the way to go. Incidentally, nolearn ( https://github.com/dnouri/nolearn ) provides a wrapper for Lasagne models with a scikit-learn-like interface. We may also add this to the main library at some point.
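The generator idea can be sketched in a few lines: instead of handing whole arrays to a fit(X, y) call, the training loop pulls one minibatch at a time, so the full dataset never has to sit in memory. This is a generic illustration, not Lasagne's planned API:

```python
def iterate_minibatches(examples, batch_size):
    """Yield successive minibatches from any iterable of examples."""
    batch = []
    for example in examples:
        batch.append(example)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final, possibly smaller, batch
        yield batch

# The training loop consumes batches lazily; 'examples' could just as
# well be streaming records from disk or a database cursor.
for batch in iterate_minibatches(range(7), batch_size=3):
    print(batch)
# [0, 1, 2]
# [3, 4, 5]
# [6]
```

Because the generator is lazy, memory use is bounded by the batch size rather than the dataset size.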

Lasagne has not been released yet: the interface is not 100% stable, and documentation and tests are a work in progress (although both are progressing nicely). Still, a lot of people have started using it already; we've built up a nice userbase, and many have started contributing code as well! We're currently aiming to put out the first release by the end of April.

A non-exhaustive list of our design goals for the library is in the README on our GitHub page: https://github.com/benanne/Lasagne
