A History of APL in Fifty Functions (2016) (jsoftware.com)
109 points by lelf on Nov 21, 2019 | 23 comments



I wish the "Iverson bracket" (number 0 in this list) had caught on in a bigger way. It can really simplify writing conditional expressions, as opposed to e.g. a case expression.


It's also an exceptionally useful technique in mathematics when you have a few nested sums with dependent indices. By writing the condition as an Iverson bracket, it becomes trivial to change the order of summation or substitute one of the variables.
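The standard textbook example (not from this comment): moving the condition into a bracket lets both summation orders be read off the same expression,

    \sum_{j=1}^{n} \sum_{k=1}^{j} a_{j,k}
      = \sum_{j,k} a_{j,k}\,[1 \le k \le j \le n]
      = \sum_{k=1}^{n} \sum_{j=k}^{n} a_{j,k}.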


I don't know if Knuth popularized it, but it's somewhat common in combinatorics. Similar idea to using indicator functions when doing change of variables in calculus.
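For instance (the usual calculus version, not from this comment): substituting u = 2x,

    \int_0^1 f(2x)\,dx
      = \int_{\mathbb{R}} f(2x)\,[0 \le x \le 1]\,dx
      = \tfrac{1}{2} \int_{\mathbb{R}} f(u)\,[0 \le u \le 2]\,du
      = \tfrac{1}{2} \int_0^2 f(u)\,du,

so the bracket carries the limits through the substitution.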

https://arxiv.org/abs/math/9205211


You can do that easily in NumPy: a comparison like (x > 0) already yields booleans, so (x > 0) * 1 gives zeros and ones, while ((x > 0) - (x < 0)) gives the sign (-1, 0, 1).
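For concreteness (a small sketch; the array contents are just for illustration):

    import numpy as np

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])

    bracket = (x > 0).astype(int)                       # Iverson bracket [x > 0]: array([0, 0, 0, 1, 1])
    sign = (x > 0).astype(int) - (x < 0).astype(int)    # array([-1, -1, 0, 1, 1])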


I think the most interesting bit of trivia here is the discussion of Bertelsen's Number: 50847478, which is famous as being a published but incorrect count of the primes less than 1e9. 50847534 is the correct answer, but the incorrect number was derived in the 19th century and has persisted in textbooks through the 20th.

In some ways, with computers, this is pretty trivial; what's impressive is that this was computed long before computers were available, and although incorrect, was remarkably close. Computers have spoiled us.
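For the curious (not from the thread, and assuming SymPy is installed), re-checking the count today is a one-liner, though it still takes a little while to run:

    # Count the primes below 10**9; the corrected value is 50847534.
    from sympy import primepi
    print(primepi(10**9))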


Cool, but I was disappointed to find that it assumes you already know enough APL to understand the extensional definitions or analogies it gives. I was hoping it would be more of a “bootstrapping APL from a simple, explicitly-defined-in-the-text set of operators” kind of thing.


APL is not so much a model of computation as a new notation for semantics which already exist. The real semantic explanation of APL is set theory and traditional mathematics.

Probably the best deep ground-level explanation of APL is Iverson’s paper “Notation as a tool of thought.” [0]

The bootstrapping explanation you describe sounds a lot like what Paul Graham did in “On Lisp” [1] and in a much more complex fashion, Queinnec in “Lisp in Small Pieces” [2], both highly recommended.

[0] https://www.jsoftware.com/papers/tot.htm

[1] http://www.paulgraham.com/onlisptext.html

[2] https://www.amazon.com/dp/0521545668


It takes a fair amount of work on top of "set theory and traditional mathematics" if you want to actually state a semantics for APL.


If anyone is willing to help make APL more popular, feel free to send a pull request to the Learn X in Y Minutes site [1].

[1] https://github.com/adambard/learnxinyminutes-docs/issues/358...


I learned APL in the mid-'70s in college but never got to use it for much (I do remember writing a table tennis game, which was pretty bizarre). But I always remember how amazing it was to learn the "combos" that did so much work in so few symbols. It was like learning to master a video game like Street Fighter. But APL was always a mostly write-only language: unless you used it every day, your code quickly became WTF-is-that.


APL really is a wonderful core notation, even though the full language is rather crufty. I've long been saddened that its promises of parallelism never seemed to work out.


Modern APL incarnations (kdb+, Dyalog APL) are quite damn parallel.


I've never used K, but isn't Dyalog mostly parallel through isolates? Last I checked, basic array operations were not automatically executed in parallel. Did this change?


Dyalog launches multiple threads for certain operations if the arrays are large enough to justify the overhead.


That seems like the right approach for CPU-targeted code. Has an APL descendant ever been created to target a GPGPU compute-kernel, or even to compile to an FPGA netlist description of a DSP?


Aaron Hsu’s co-dfns is a compiler of a subset of APL, written in APL, which compiles to GPU code.

Do an HN Algolia search for his username "arcfide" to find a lot of discussion, and there are a couple of YouTube video talks: one of him talking through the codebase on a livestream, another a recording of a conference-style talk introducing it to people.

It needs Dyalog APL, but that's now easily available for non-commercial use.



kdb+/q supports massive parallelism.


Interesting. I hope it can be written for APL novices.


Disappointing they didn't webfont the thing. ⍢


Did anyone else also think about dry-aged steaks when reading this title? If you're in LA, you'll know what I mean :D


Haha so apparently you can get downvoted for humor in HN #whysoserious


Because HN is not Reddit. The value of HN comes from good and insightful comments. Humor is noise, and noise drowns the signal. EDIT: Humor isn't a problem per se if it accompanies useful comments.



