ahdsr's comments

NotepadNext


Will try, thanks!


> Some fractals -- for instance those associated with the Mandelbrot and quadratic Julia sets -- are computed by iterating a function, and identifying the boundary between hyperparameters for which the resulting series diverges or remains bounded. Neural network training similarly involves iterating an update function (e.g. repeated steps of gradient descent), can result in convergent or divergent behavior, and can be extremely sensitive to small changes in hyperparameters. Motivated by these similarities, we experimentally examine the boundary between neural network hyperparameters that lead to stable and divergent training. We find that this boundary is fractal over more than ten decades of scale in all tested configurations.

Reading this gave me goosebumps


ZSA uses this for their keyboards. The Moonlander I'm typing this on uses a headphone jack for all of the above. [1]

[1] https://www.zsa.io/moonlander/


This is pretty cool.

