
A History of APL in Fifty Functions (2016) - lelf
https://www.jsoftware.com/papers/50/
======
Y_Y
I wish the "Iverson bracket" (number 0 in this list) had caught on in a bigger
way. It can really simplify writing conditional expressions, as opposed to
e.g. a case expression.
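For instance, a quick Python sketch (the `bracket` and `sign` names are my own; Python's `True`/`False` already behave as 1/0 in arithmetic, which is exactly the Iverson convention):

```python
# Iverson bracket: [P] = 1 if the proposition P is true, else 0.
def bracket(p):
    """[p] -- 1 if p holds, 0 otherwise."""
    return 1 if p else 0

# A three-way case expression like
#   sign(x) = -1 if x < 0, 0 if x == 0, 1 if x > 0
# collapses into a single arithmetic expression:
def sign(x):
    return bracket(x > 0) - bracket(x < 0)
```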

~~~
contravariant
It's also an exceptionally useful technique in mathematics if you've got a few
nested sums with dependent indices. By writing the condition in an Iverson
bracket it becomes trivial to change the order of summation or change one of
the variables.
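A small sketch of that summation swap (the value of `n` and the summand are arbitrary choices of mine): the dependent double sum over j ≤ i becomes a sum over the full i×j grid weighted by the bracket [j ≤ i], after which the two loops commute.

```python
n = 6

def a(i, j):
    return 10 * i + j  # arbitrary summand, just for illustration

# Dependent indices: the inner index j only runs up to i.
nested = sum(a(i, j) for i in range(1, n + 1)
                     for j in range(1, i + 1))

# Iverson bracket [j <= i] over the full grid; (j <= i) evaluates
# to 0 or 1, so the summation region now lives in the summand and
# the two loops can be exchanged freely.
swapped = sum((j <= i) * a(i, j) for j in range(1, n + 1)
                                 for i in range(1, n + 1))

assert nested == swapped
```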

~~~
dubya
I don't know if Knuth popularized it, but it's somewhat common in
combinatorics. Similar idea to using indicator functions when doing change of
variables in calculus.

[https://arxiv.org/abs/math/9205211](https://arxiv.org/abs/math/9205211)

------
andrewla
I think the most interesting bit of trivia here is the discussion of
Bertelsen's Number: 50847478, which is famous as being a published but
incorrect count of the primes less than 1e9. 50847534 is the correct answer,
but the incorrect number was derived in the 19th century and has persisted in
textbooks through the 20th.

In some ways, with computers, this is pretty trivial; what's impressive is
that this was computed long before computers were available, and although
incorrect, was remarkably close. Computers have spoiled us.
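For scale, a minimal Sieve of Eratosthenes (my own sketch; the function name and the 10^6 bound are mine) reproduces π(10^6) = 78498 in well under a second, which underlines how impressive a 19th-century hand computation up to 10^9 was:

```python
def prime_count(limit):
    """pi(limit): how many primes are <= limit (Sieve of Eratosthenes)."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            # Cross out the multiples of p, starting at p*p.
            sieve[p * p :: p] = bytearray(len(range(p * p, limit + 1, p)))
    return sum(sieve)
```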

------
derefr
Cool, but I was disappointed to find that it assumes you already know enough
APL to understand the extensional definitions or analogies it gives. I was
hoping it would be more of a “bootstrapping APL from a simple, explicitly-
defined-in-the-text set of operators” kind of thing.

~~~
tumba
APL is not so much a model of computation as a new notation for semantics
which already exist. The real semantic explanation of APL is set theory and
traditional mathematics.

Probably the best deep ground-level explanation of APL is Iverson’s paper
“Notation as a tool of thought.” [0]

The bootstrapping explanation you describe sounds a lot like what Paul Graham
did in “On Lisp” [1] and in a much more complex fashion, Queinnec in “Lisp in
Small Pieces” [2], both highly recommended.

[0]
[https://www.jsoftware.com/papers/tot.htm](https://www.jsoftware.com/papers/tot.htm)

[1]
[http://www.paulgraham.com/onlisptext.html](http://www.paulgraham.com/onlisptext.html)

[2]
[https://www.amazon.com/dp/0521545668](https://www.amazon.com/dp/0521545668)

~~~
kd0amg
It takes a fair amount of work on top of "set theory and traditional
mathematics" if you want to actually state a semantics for APL.

------
xvilka
If anyone is willing to contribute to making APL more popular - feel free to
send a pull request to Learn X in Y minutes site[1].

[1] [https://github.com/adambard/learnxinyminutes-docs/issues/3580](https://github.com/adambard/learnxinyminutes-docs/issues/3580)

------
coldcode
I learned APL in the mid 70's in college but never got to use it for much (I
do remember writing a table tennis game which was pretty bizarre). But I
always remember how amazing it was to learn the "combos" that did so much work
in so few symbols. It was like learning to master a video game like Street
Fighter. But APL was always a mostly write-only language: unless you used it
every day, your code quickly became WTF-is-that.

------
Athas
APL really is a wonderful core notation, even though the full language is
rather crufty. I've long been saddened that its promises of parallelism never
seemed to work out.

~~~
lelf
Modern APL incarnations (kdb+, Dyalog APL) are quite damn parallel.

~~~
Athas
I've never used K, but isn't Dyalog mostly parallel through isolates? Last I
checked, basic array operations were not automatically executed in parallel.
Did this change?

~~~
jharsman
Dyalog launches multiple threads for certain operations if the arrays are
large enough to justify the overhead.
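That threshold-dispatch idea can be sketched in Python (the cutoff value, names, and worker count are my assumptions, and I'm using threads purely for shape — Dyalog's implementation uses native threads without Python's GIL, so it actually gains CPU parallelism where this sketch would not):

```python
from concurrent.futures import ThreadPoolExecutor

PARALLEL_THRESHOLD = 50_000  # hypothetical cutoff

def _square_chunk(chunk):
    return [x * x for x in chunk]

def square(xs, workers=4):
    # Small arrays: spinning up workers costs more than it saves.
    if len(xs) < PARALLEL_THRESHOLD:
        return _square_chunk(xs)
    # Large arrays: split into roughly equal chunks and fan out.
    step = -(-len(xs) // workers)  # ceiling division
    chunks = [xs[i:i + step] for i in range(0, len(xs), step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        out = []
        for part in pool.map(_square_chunk, chunks):
            out += part
        return out
```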

~~~
derefr
That seems like the right approach for CPU-targeted code. Has an APL
descendant ever been created to target a GPGPU compute-kernel, or even to
compile to an FPGA netlist description of a DSP?

~~~
jodrellblank
Aaron Hsu’s co-dfns is a compiler of a subset of APL, written in APL, which
compiles to GPU code.

Do a HN Algolia search for his username “arcfide” to find a lot of discussion,
and there’s a couple of YouTube video talks, one of him talking through the
codebase on a livestream, the other a recording of a conference-style talk
introducing it to people.

It needs Dyalog APL, but that’s now easily available for non-commercial use.

------
ngcc_hk
Interesting. Hope it can be written for APL novices.

------
contingencies
Disappointing they didn't webfont the thing. ⍢

------
aloukissas
Did anyone else also think about dry-aged steaks when reading this title? If
you're in LA, you'll know what I mean :D

~~~
aloukissas
Haha so apparently you can get downvoted for humor on HN #whysoserious

~~~
FullyFunctional
Because HN is not Reddit. The value of HN comes from good and insightful
comments. Humor is noise, and noise drowns out the signal. EDIT: Humor isn't a
problem per se if it accompanies useful comments.

