Convolutional Neural Networks in APL (in 10 lines) [pdf] (acm.org)
3 points by eggy on Dec 12, 2020 | 2 comments



How many lines of code would the equivalent NumPy, MATLAB, R, Julia, or Fortran 2018 be?


A lot more. Maybe they could be made similarly concise with more libraries, but the fact is that APL/J have the array as their native unit, and all of the functions in the example come from the base language. NumPy and Pandas are the progeny of APL/J, and GPUs became the vector/matrix-processing machines that APL and J had been waiting on for decades: the perfect match. I study APL/J because they are like math: you learn the symbols, and then you can abstract your thoughts into a program the same way you convey them in mathematical notation, succinctly. The interpreted speed is not stellar, but it allows for quick learning and experimenting before optimizing. Interpreted APL code can run as fast as C, though in this paper they stay away from functions that are not vanilla APL. I have been reading books on neural networks since the late 80s, yet I never fully grasped some concepts until I saw them expressed this succinctly rather than buried in pages of a book or hundreds or thousands of lines of code. To me it is yoga for the mind!
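
To make the comparison concrete, here is a rough NumPy sketch (not taken from the paper) of two of the building blocks a CNN needs: a "valid" 2D convolution and average pooling. The names conv2d and avgpool2d are illustrative; the paper expresses the analogous operations directly with APL's base array operators.

  # Illustrative NumPy sketch, not the paper's code: a naive "valid" 2D
  # convolution (really cross-correlation) and non-overlapping average pooling.
  import numpy as np

  def conv2d(image, kernel):
      """Single-channel 'valid' convolution: no padding, stride 1."""
      kh, kw = kernel.shape
      oh = image.shape[0] - kh + 1
      ow = image.shape[1] - kw + 1
      out = np.empty((oh, ow))
      for i in range(oh):
          for j in range(ow):
              out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
      return out

  def avgpool2d(x, size=2):
      """Average over non-overlapping size-by-size blocks (edge rows/columns that don't fit are dropped)."""
      h, w = x.shape
      x = x[:h - h % size, :w - w % size]
      return x.reshape(h // size, size, w // size, size).mean(axis=(1, 3))

  # Tiny usage example: 6x6 image, 3x3 box filter, then 2x2 pooling.
  img = np.arange(36, dtype=float).reshape(6, 6)
  k = np.ones((3, 3)) / 9.0
  print(avgpool2d(conv2d(img, k)))   # conv gives 4x4, pooling gives 2x2

Even this toy version runs to a couple dozen lines before any training code, which is the commenter's point: APL's base array primitives express each of these steps in roughly a line.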



