I'm one of the authors. There's more discussion in this other recent HN thread, https://news.ycombinator.com/item?id=18676986, where David Duvenaud (the senior author) answers some questions.
As with all science communication, the article is pitched at a particular audience.
I would say that if you are completely unfamiliar with machine learning, the article does its job: it tells you we've proposed something exciting and a bit different, and gives some flavour of the kinds of problems we're interested in (like continuous-time medical data).
However, the article assumes essentially zero technical background. So if you're comfortable with concepts from either machine learning (at the level of vaguely knowing how a neural network works) or mathematics (having taken some calculus, or knowing about differential equations), you will probably find it lacking in detail. Again, that just means you're not in the target audience.
If you don't know where you fall on that spectrum of prerequisites, I highly recommend taking a look at some of David's responses in the thread I linked, where he gives a good quick primer with a bit more technical detail.
Presumably the target audience, then, is people who got there by accident.
> If your brain hurts (trust me, mine does too), ...
This brings to mind something Ben Goldacre said (sorry, I can't find the exact quote) about the average science journalist being less qualified than the average reader of science articles.
Hi Jesse, amazing work! Even after reading the article I'm still stymied.
Let's say I understand DNNs and ODEs well enough. Could you explain like I'm five how an ODE-net differs from more traditional architectures, and what uses it might have?
It does, but that's just a coincidence. That's a visualization of the latent space for the toy time-series problem in our paper, where the model has to learn spirals of different sizes and orientations.
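For anyone wanting a concrete handle on the question above: the core idea in the paper is that a residual block's update, h_{t+1} = h_t + f(h_t), is one Euler step of the ODE dh/dt = f(h, t). An ODE-net keeps f but hands the integration to an ODE solver, so depth becomes a continuous variable. Below is a minimal PyTorch sketch of that view; the fixed-step Euler loop, and names like ODEFunc and odeint_euler, are illustrative stand-ins, not the paper's implementation (which uses adaptive solvers and the adjoint method for memory-efficient gradients).

    import torch
    import torch.nn as nn

    # A standard residual block: h_{t+1} = h_t + f(h_t).
    # Stacking N of these fixes a discrete sequence of N updates.
    class ResBlock(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(),
                                   nn.Linear(dim, dim))

        def forward(self, h):
            return h + self.f(h)

    # The ODE view: the same kind of f now defines a vector field
    # dh/dt = f(h, t), with depth treated as a continuous variable t.
    class ODEFunc(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.f = nn.Sequential(nn.Linear(dim + 1, dim), nn.Tanh(),
                                   nn.Linear(dim, dim))

        def forward(self, t, h):
            tt = torch.full_like(h[..., :1], t)  # broadcast t over the batch
            return self.f(torch.cat([h, tt], dim=-1))

    def odeint_euler(func, h0, t0=0.0, t1=1.0, steps=20):
        # Fixed-step Euler integration of dh/dt = func(t, h) from t0 to t1.
        # A real ODE-net would use an adaptive solver here instead.
        h, t = h0, t0
        dt = (t1 - t0) / steps
        for _ in range(steps):
            h = h + dt * func(t, h)
            t = t + dt
        return h

    h0 = torch.randn(8, 4)               # batch of 8 hidden states, dim 4
    h1 = odeint_euler(ODEFunc(4), h0)    # a "continuous-depth" forward pass

One payoff, relevant to that spiral visualization: because the hidden state is defined at every t, you can evaluate it at whatever irregular times observations actually arrive, which is what makes the approach natural for continuous-time data like medical records.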