I would say if you are completely unfamiliar with machine learning then this article accomplishes the job of informing you that we've proposed something exciting and a bit different, and gives some flavour for what kind of problems we're interested in (like continuous time medical data).
However, this article assumes essentially zero technical background, so if you're comfortable with concepts from either machine learning (at the level of vaguely knowing some details about how a neural network works) or mathematics (you've taken some calculus or know about differential equations), then you will probably find this article lacking in detail. Again, that's just because you're not in the target audience.
If you don't know where you fall on that spectrum of prerequisite background, I highly recommend taking a look at some of David's responses in the thread I linked, where he gives a pretty good quick primer with a bit more technical detail.
> If your brain hurts (trust me, mine does too), ...
This brings to mind something Ben Goldacre said (sorry, I can't find the exact quote) about the average science journalist being less qualified than the average science article reader.
Let's say I understand DNNs and ODEs well enough. Could you explain like I'm five how ODEnet differs from the more traditional architectures, and what uses it might have?
Have you taken a look at the other HN thread I linked to above?
In which David answers a similar question here: https://news.ycombinator.com/item?id=18677947
If you have any follow up questions there I will try to get to them!
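In the meantime, here's the rough intuition as I understand it, with a toy sketch (the function and names are hypothetical, just for illustration, and the "dynamics" here are a fixed formula rather than a learned network): a residual block computes h ← h + f(h), which is exactly one Euler step of the differential equation dh/dt = f(h). An ODE-net takes that idea to the limit, handing f to an off-the-shelf ODE solver instead of stacking a fixed number of layers.

```python
import math

# Toy dynamics: dh/dt = f(h) = -h, whose exact continuous solution is
# h(t) = h0 * exp(-t). In a real ODE-net, f would be a small neural
# network with learned weights, not a fixed formula.
def f(h):
    return -h

def euler_odenet(h0, t_end, n_steps):
    # A residual-style network: each layer applies h <- h + dt * f(h),
    # which is one explicit-Euler step. An ODE solver effectively takes
    # the limit of many tiny steps (and adapts the step size).
    h, dt = h0, t_end / n_steps
    for _ in range(n_steps):
        h = h + dt * f(h)
    return h

exact = math.exp(-1.0)                    # true h(1) for h0 = 1
coarse = euler_odenet(1.0, 1.0, 4)        # a shallow "4-layer" net
fine = euler_odenet(1.0, 1.0, 1000)       # a very deep (fine-step) net
# The finer discretization lands closer to the continuous solution.
assert abs(fine - exact) < abs(coarse - exact)
```

The point of the paper's framing is that once you view depth as continuous time, you can reuse decades of ODE-solver machinery (adaptive step sizes, error control) and backpropagate through the solve with constant memory via the adjoint method.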
We are also attempting to develop an algorithm that learns without doing backpropagation; more details at http://alpes.ai
And there was a mention on HN about a month ago of a PyTorch implementation of neural ODEs (the name for this new NN design), here: https://github.com/rtqichen/torchdiffeq/blob/master/README.m...