Hacker News
A new neural network design could overcome challenges in AI (technologyreview.com)
143 points by nmstoker 3 months ago | 11 comments



I'm one of the authors. There's more discussion in this other recent HN thread, https://news.ycombinator.com/item?id=18676986, where David Duvenaud (the senior author) answers some questions.


Does this article do a good job of representing your work?


As with all science communication, it is highly specific to the audience.

I would say if you are completely unfamiliar with machine learning then this article accomplishes the job of informing you that we've proposed something exciting and a bit different, and gives some flavour for what kind of problems we're interested in (like continuous time medical data).

However, this article assumes essentially zero technical background. So if you're comfortable with concepts either from machine learning (at the level of vaguely knowing some details about a neural network) or mathematics (having taken some calculus or knowing about differential equations) then you will probably find this article lacking in details. Again, that just means you're not in the target audience.

If you don't know where you fall on that spectrum of prerequisite background, I highly recommend taking a look at some of David's responses in the thread I linked, where he gives a pretty good quick primer with a bit more technical detail.


Presumably the target audience then is people who got there by accident.

> If your brain hurts (trust me, mine does too), ...

This brings to mind something Ben Goldacre said (sorry, I can't quote it exactly) about the average science journalist being less qualified than the average science article reader.


Hi Jesse, amazing work! Even reading the article I'm still stymied.

Let's say I understand DNNs and ODEs well enough. Could you explain like I'm five how ODEnet differs from the more traditional architectures, and what uses it might have?


Thanks!

Have you taken a look at the other HN thread I linked to above? David answers a similar question there: https://news.ycombinator.com/item?id=18677947

If you have any follow up questions there I will try to get to them!


It's great to see a new way of looking at things in AI.

We are also attempting to develop an algorithm that learns without backpropagation; more details at http://alpes.ai


Looking forward to seeing your results on ImageNet!


The paper the article is based on is here: https://arxiv.org/abs/1806.07366

And there was mention on HN about a month ago of a PyTorch implementation of neural ODEs (the name for this new NN design) here: https://github.com/rtqichen/torchdiffeq/blob/master/README.m...
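To make the core idea concrete: a ResNet updates its hidden state in discrete jumps, h <- h + f(h), while a neural ODE treats that update as a continuous dynamic dh/dt = f(h, t) and hands it to an ODE solver. A minimal sketch (not the authors' implementation; the dynamics function `f` and its single-tanh-layer form are hypothetical stand-ins, and plain Euler integration stands in for the adaptive solvers used in the paper):

```python
import numpy as np

def f(h, t, W):
    # Hypothetical dynamics function: a single tanh layer.
    return np.tanh(W @ h)

def resnet_forward(h, W, n_blocks=10):
    # A residual network applies h <- h + f(h) at a fixed number
    # of discrete layers.
    for t in range(n_blocks):
        h = h + f(h, t, W)
    return h

def odenet_forward(h, W, t0=0.0, t1=10.0, steps=1000):
    # A neural ODE instead defines dh/dt = f(h, t) and integrates it;
    # here a fixed-step Euler loop plays the role of the solver.
    dt = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        h = h + dt * f(h, t, W)
        t += dt
    return h

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 4))
h0 = rng.standard_normal(4)
print(resnet_forward(h0.copy(), W))
print(odenet_forward(h0.copy(), W))
```

Seen this way, a ResNet is just one fixed Euler discretization of the ODE; the paper's contribution is to let the solver choose the discretization (and to backpropagate through it efficiently).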


Is it just me, or does the picture opening the article look like a Lorenz attractor (http://mathworld.wolfram.com/LorenzAttractor.html), which is a set of three differential equations?
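For anyone unfamiliar, the Lorenz system really is just three coupled ODEs, and a few lines are enough to trace its butterfly shape. A quick sketch (classic parameter values; simple Euler integration, not anything from the article):

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # The three coupled ODEs of the Lorenz system, with the
    # classic chaotic parameter values.
    x, y, z = state
    return np.array([sigma * (y - x),
                     x * (rho - z) - y,
                     x * y - beta * z])

def integrate(state, dt=0.01, steps=5000):
    # Fixed-step Euler integration: enough to trace the attractor.
    trajectory = [state]
    for _ in range(steps):
        state = state + dt * lorenz(state)
        trajectory.append(state)
    return np.array(trajectory)

traj = integrate(np.array([1.0, 1.0, 1.0]))
print(traj.shape)
```

Plotting `traj[:, 0]` against `traj[:, 2]` gives the familiar two-lobed spiral.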


It does, but that's just a coincidence. That's a visualization of the latent space for the toy time-series problem in our paper that has to learn to model spirals of different sizes and orientations.





