
DiffEqFlux.jl – A Julia Library for Neural Differential Equations - ChrisRackauckas
https://julialang.org/blog/2019/01/fluxdiffeq
======
oxinabox
I think it is a truly beautiful thing how Flux and DiffEq are totally
separate, and it takes ~100 lines to make them work together. Now suddenly you
don't just have 2 or 3 ODE solvers, you have dozens.
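A minimal sketch of what that solver composability looks like in practice, using DifferentialEquations.jl's standard `ODEProblem`/`solve` interface (the Lotka-Volterra system here is just an illustrative example, not one from the post):

```julia
using DifferentialEquations

# Lotka-Volterra predator-prey model, written in the
# in-place du/dt = f(du, u, p, t) form the library expects.
function lotka_volterra!(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] = α*x - β*x*y
    du[2] = -δ*y + γ*x*y
end

u0 = [1.0, 1.0]               # initial populations
p  = [1.5, 1.0, 3.0, 1.0]     # model parameters
prob = ODEProblem(lotka_volterra!, u0, (0.0, 10.0), p)

# Any of the dozens of solver algorithms drops in unchanged:
sol1 = solve(prob, Tsit5())         # non-stiff Runge-Kutta method
sol2 = solve(prob, Rosenbrock23())  # stiff solver, same call site
```

Because the solver is just an argument to `solve`, code written against this interface (including the neural-network layers in DiffEqFlux) gets every solver in the ecosystem for free.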

------
xiaodai
I wonder if this type of collaboration is easy to achieve in other languages?
Of course, not many have an awesome ODE ecosystem like Julia's. But it feels
like a unique strength of Julia that it's easy to integrate heavy-duty
libraries and have them work together!

It reminds me of this post:
[http://www.stochasticlifestyle.com/why-numba-and-cython-are-not-substitutes-for-julia/](http://www.stochasticlifestyle.com/why-numba-and-cython-are-not-substitutes-for-julia/)

I think Numba and the Cython ecosystem will struggle to achieve this level of
(relative) ease, although I see the original implementation was done in
PyTorch. The ODE and ML experts probably made it look easy!

Now I wonder about the relative performance of DiffEqFlux vs the original
implementation.

~~~
marmaduke
the numerical performance ecosystem in Python is historically more siloed than
in Julia, due to reliance on C-like languages for performance. Relative
latecomers like Numba and Autograd (compared to NumPy) didn't get enough
traction for it all to cohere nicely.

(I don't use Julia, but I have done several evaluations for our HPC work)

------
rhymer
Fascinating! Such a great demonstration of Julia's strength. I wish more
academic papers were written in this fashion, or were accompanied by a
tutorial-like, reproducible post like this one.

~~~
xiaodai
This is so true! I was struggling with a paper yesterday, hoping it would give
more examples. The paper felt terse, but I think this could be a space
constraint imposed by physical journals. In the internet age, papers should be
longer form, with more examples and clearer explanations. Heck, they shouldn't
even be called papers!

~~~
tnecniv
Even when physical copies aren't being manufactured, length constraints would
still exist. Different publication venues serve different purposes, and those
purposes are best served by articles of a certain length.

The purpose of a short Nature article is very different from that of a journal
with 100+ page submissions.

~~~
xiaodai
Nature is printed. Exactly my point.

------
mark_l_watson
Really nice work. I just started using Julia and the Flux machine learning
library a few months ago. Julia is great for numeric processing - I would say
much better than Python, except for the deep learning libraries like
TensorFlow that I rely on for work. That said, for my personal projects I have
been using Flux and Julia, and I am happy with the ease of setup and the dev
environment.

------
sivakon
Great. It’s already here. I remember Chris had to convince some commenter on
HN about why someone would want to use ODEs and neural nets together. Then the
NeurIPS best paper award goes to Neural ODEs. Great paper, long live Julia.

------
npr11
This looks awesome! Thanks for making a working library and a great
introduction :)

------
pcerdam
I started messing around with Julia again recently after reading about Flux.

So far, it has been amazing, and this article further proves to me that Julia
is an amazing platform for ML, among other things.

------
eigenspace
Very exciting stuff! Keep up the good work!

------
4thaccount
Nice!

