Yagrad – 100 SLOC autograd engine with complex numbers and fixed DAG (github.com/noway)
33 points by noway421 10 months ago | hide | past | favorite | 4 comments



Elegant. I want to review this more. Could __slots__ work here? I always compulsively try that to save memory. Keep it up.


Great idea!

I'm testing it on a 3-layer perceptron, so memory is less of an issue, but __slots__ seems to speed up the training time by 5%! Pushed the implementation to a branch: https://github.com/noway/yagrad/blob/slots/train.py

Unfortunately it extends the line count past 100 lines, so I'll keep it separate from `main`.
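For anyone unfamiliar with the trick: `__slots__` replaces each instance's `__dict__` with fixed-size storage, which saves memory and slightly speeds up attribute access. A minimal sketch of what this looks like on an autograd value node (the class and attribute names here are illustrative, not copied from yagrad):

```python
class Value:
    # With __slots__, instances have no per-object __dict__;
    # attributes are stored in fixed slots instead.
    __slots__ = ("data", "grad", "_parents")

    def __init__(self, data, parents=()):
        self.data = data          # complex-valued payload
        self.grad = 0j            # accumulated gradient
        self._parents = parents   # nodes this value depends on

v = Value(1 + 2j)

# A side effect: attributes not listed in __slots__ now raise
# AttributeError, so typos are caught early.
try:
    v.extra = 1
except AttributeError:
    print("no dynamic attributes with __slots__")
```

One caveat: every class in the inheritance chain has to declare `__slots__` for the memory savings to hold, otherwise a `__dict__` reappears.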

I have my email address on my website (which is in my bio) - don't hesitate to reach out. Cheers!


What are some common examples of complex numbers in these sorts of applications?


Here complex numbers are used for an elegant gradient calculation: you can express all sorts of operations through just three functions, `exp`, `log` and `add`, defined over the complex plane. This simplifies the code!
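To illustrate the idea (this is a sketch of the math, not yagrad's actual code): multiplication falls out of `exp(log a + log b)`, and negation comes from adding iπ inside the log, since exp(iπ) = -1. Subtraction is then just addition of a negated value:

```python
import cmath

# The three primitives, defined over the complex plane.
def c_exp(z): return cmath.exp(z)
def c_log(z): return cmath.log(z)
def c_add(a, b): return a + b

# Multiplication: a*b = exp(log a + log b)
def mul(a, b):
    return c_exp(c_add(c_log(a), c_log(b)))

# Negation: -b = exp(log b + i*pi), because exp(i*pi) = -1
def neg(b):
    return c_exp(c_add(c_log(b), cmath.pi * 1j))

# Subtraction: a - b = a + (-b)
def sub(a, b):
    return c_add(a, neg(b))

print(mul(3, 4))  # ≈ (12+0j), up to floating-point error
print(sub(7, 2))  # ≈ (5+0j)
```

The identities only need the principal branch of the complex log; the one input they can't handle is zero, since log(0) is undefined.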

An added benefit is that all the variables become complex. As long as your loss is real-valued, you should be able to backprop through your net and update the parameters.

PyTorch docs mention that complex variables may be used "in audio and other fields": https://pytorch.org/docs/stable/notes/autograd.html#how-is-w...



