
The Problem With APLs - mr_golyadkin
https://www.hillelwayne.com/post/the-problem-with-apls/
======
In my experience the APL family of languages is extremely hostile to the kind
of exploratory coding that people do in e.g. Python, JS or even C.

You do APL like you do mathematics: sit down with pen & paper and think. And
yes, many times it’s hit-n-miss, and someone walking by will tell you how it’s
trivial with technique X you were unaware of. But again, this is how people do
maths too: Maxwell’s original set of equations for electromagnetism were 20
differential equations; it was only later that Heaviside came along and
rewrote them in the concise vector form we all know. (And those four equations
can still be simplified to a single equation via geometric algebra!
[https://en.m.wikipedia.org/wiki/Mathematical_descriptions_of_the_electromagnetic_field](https://en.m.wikipedia.org/wiki/Mathematical_descriptions_of_the_electromagnetic_field)
)

~~~
olzd
I disagree: it's not that hard to gradually get to a solution via the REPL.

~~~
ken
I've played around with APL a bit, and I agree that it's not that hard to
gradually get to a solution in the repl, but...

The problem I have is that when I gradually get to a solution in
Lisp/Ruby/Python, I end up with something which is also close to lots of other
problems I want to solve (or that my manager or users ask for). In solving the
problem, I've naturally built up tools for that entire problem space.

In APL, I end up building a solution which is not at all close to other
problems I might want to solve. I'll take an identity, flip this, zip it with
that other array, reduce it with this function, and then poof, the final step
slides all the puzzle pieces together and it's perfect (and only 17
characters!), but only for that problem exactly as stated. When I get a
request for something slightly different, I almost have to start from scratch.

I assume this is what the "beautiful diamond" / "ball of mud" dichotomy
referred to.

~~~
sheepmullet
> When I get a request for something slightly different, I almost have to
> start from scratch.

Do you start over from scratch in terms of your mental model or just in terms
of the actual code?

I’ve found using APL helps me to generalize a problem but the underlying code
will often look very different.

I have found I waste a lot of time in other languages trying to re-use
components that aren’t necessarily a good fit just because I have already
written them.

~~~
girvo
Your last point is something I fight my hardest to avoid. It's hard, but
leaning on small generic functional primitives and building solutions out of
those has helped a lot (see: Ramda, fp-ts, and others).

------
untangle
I'm most familiar with Dyalog APL so my comments are based on that. I believe
that Dyalog has about 100 primitives. (But APL's relatively challenging
semantics are somewhat offset by its truly simple syntax.)

As in most language learning, one starts with a small number (20?) of words
and expands from there. APL also requires the neophyte to develop a mental
mapping from abstract symbol to function. This may double the initial learning
time but I don't think that it's much worse than that. Mnemonics exist for
many of the symbols.

But more fundamental is learning to map the problem space into an array
formulation. This, not symbol application, is where a white board may come in
handy.

Dyalog provides a free web-based tutorial that is quite comprehensive. [1] I
daresay that spending a couple of hours with this tool will give you a great
taste for APL. Spend a couple of days and you'll be quite proficient in both
array-think and APL semantics.

Other useful tools include the wiki [2], a cloud server [3], and a neat
examples wiki [4]. BTW, the "library of useful components" that the OP desires
already exists as a compilation of useful idioms (code fragments) [5].

[1] [https://tutorial.dyalog.com/](https://tutorial.dyalog.com/) [2]
[https://aplwiki.com/](https://aplwiki.com/) [3]
[http://www.aplcloud.com/](http://www.aplcloud.com/) [4]
[http://www.jsoftware.com/papers/50/](http://www.jsoftware.com/papers/50/) [5]
[https://aplwiki.com/FinnAplIdiomLibrary](https://aplwiki.com/FinnAplIdiomLibrary)

~~~
fusiongyro
What I really want (and I will annoy you by talking about J instead of Dyalog)
is resources for intermediate users, because there is a significant amount of
literature for beginners and experts but I agree with Hillel that the road
between the two levels seems to be unmitigated struggle. The Finn idiom
library may be useful for APL users but even looking at it, I don't see what
the purpose of it is, unless perhaps you're just supposed to read the
expressions and learn novelties about how to use the built-ins.

The approach I've taken, which I think is common, slow, and easy to fall off
of, is to write up my solutions on my blog or email them to the J mailing
list and see if the experts can improve on my code. They
always can, if they care enough to show me, and they usually do. But as I
mentioned in my other comment, I think one of the wonders of APL/J/etc. is
that they share even less with conventional programming languages than they
appear to at first, because the way APL programmers decompose problems is
radically different. I often find myself trawling through the dictionary
looking for things that I wouldn't need if I didn't break the program down
into such small pieces. Choosing the right verb is partly tactical, but the
whole strategy for attacking problems is so unconventional that you are often
looking from the wrong vantage point.

I don't know if there is a royal road to APL that addresses this, but I share
Hillel's desire for one. I suspect there isn't, and the only way to really get
better is: read lots of code, write lots of code, get feedback on your code,
repeat.

------
pavlov
_The biggest problem with APLs, in my opinion, is discoverability: it’s hard
to know what you’re supposed to be writing._

Honestly this is a problem with programming in general. A lot of modern
frameworks actively work to maintain the problem by innovating with layers
that break even the modest discoverability achievements that have been made,
such as IDE autocomplete and "jump to declaration".

This problem is endemic and not due to any particular language or company. To
pick a few examples, Redux, Rails and Angular each offer their own flavors of
this issue. In the name of making things visually simpler or conceptually more
structured, they make it much harder to understand who's calling your code and
what calls are available to you — the very basics of programming
discoverability.

------
fusiongyro
I definitely agree with Hillel's prescription here. In a complex topic, you
need different kinds of literature. For beginners, there is _J for C
Programmers_ , and for experts there is _NuVoc_ and the official dictionary.
But for intermediate users who grasp the basic concepts, J and other APLs are
harder than other languages because there seems to be a whole strategic aspect
separate from the tactical aspect. It seems as though trying to figure out how
to do something in APL based on experience with any non-APL programming
language gives you a huge X-Y problem, where you cannot even trust how
you would decompose the problem. Relearning that after a dozen or so more
conventional languages seems to be a significant cognitive barrier.

This confrontation with an alien culture of computing seems to make us recoil
in horror for the most part. But having been over that barrier with SQL and
Prolog, I find it plausible that the APLs really do represent a more elegant
approach to computation that just requires a much larger up-front investment.

~~~
avmich
We have different perspectives here. I agree that _J for C Programmers_ is
good for novices, but I wouldn't say NuVoc is for experts. Next, J allows
solving a problem with different algorithms, and the choice you make among
primitives reflects more what you want than what you can do, so the problem
the author is discussing seems rather odd: you can go many different ways, so
what do you mean, "which way am I supposed to go"? Yes, decomposing problems
is different with APLs. No, believe it or not: after three days of meditation
I got enough understanding of J to make the road ahead free of major
roadblocks. That wasn't and still isn't the case with supposedly easy Python;
years of conventional languages like C, C++, Java etc. still leave me
struggling with Python, perhaps because I didn't have those days of
meditation with it?

~~~
fusiongyro
Python does encourage the perspective that there is a single right way to do
things. This isn't necessarily true at the high-level, but at the low level of
a handful of lines, it often is true.

Like Hillel, I have often thought that my solution to a problem in J was good,
only to see much shorter solutions from prominent J programmers. What makes it
shorter is usually that there are direct but less obvious ways of doing things
using the primitives. On first encountering something like grade-up, I filed
it away as a way of sorting, but when you see the various ways grade-up is
used in practice, a lot of them have nothing strictly to do with sorting.
This is in contrast to Python, where lists can be sorted and that's all
sorting does for you. I frequently
forgot about the match builtin and would do convoluted things like equals with
rank (0 1) and insert-equals or sum. There's more than one way to do things,
but often beginner code is obviously worse.
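
The grade-up point has a direct analogue outside APL: the argsort pattern. A minimal Python sketch (standard library only; `grade_up` is a hypothetical helper name) of how the sorting permutation is useful for more than sorting:

```python
# "Grade up" (J's /: , APL's monadic grade) returns the permutation that
# would sort a list, not the sorted list itself.
def grade_up(xs):
    """Indices that would sort xs in ascending order."""
    return sorted(range(len(xs)), key=lambda i: xs[i])

xs = [30, 10, 20]
order = grade_up(xs)                        # [1, 2, 0]

# Sorting is just one use: index with the permutation.
sorted_xs = [xs[i] for i in order]          # [10, 20, 30]

# Non-sorting use 1: the grade of the grade gives each element's rank.
ranks = grade_up(order)                     # [2, 0, 1]

# Non-sorting use 2: reorder a *parallel* list by the same key.
names = ["carol", "alice", "bob"]
names_by_value = [names[i] for i in order]  # ["alice", "bob", "carol"]
```

The same trick underlies sorting one column of a table by another, computing ranks, and inverting permutations, which is why grade-up keeps turning up in places that have nothing visibly to do with sorting.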

------
pjc50
One of the ideas that really informs my thinking about comparative programming
languages is "semantic compression":
[https://caseymuratori.com/blog_0015](https://caseymuratori.com/blog_0015)

A programmer has an idea of what they want the program to do, expressed in
human language and concepts. Programming involves refining this idea in
sufficient detail that it can be expressed as a program, replacing the
imprecise human thought with precise symbols that will be mechanically
interpreted. A programming language offers a set of primitive symbols from
which larger concepts can be built.

One reason people mostly avoid programming in assembler is that the primitives
are "too small": they don't relate well to the concepts the programmer is
thinking in and you need a lot of them. I wonder if the problem with APL-style
languages is that, unless you're used to operating in a mathematical domain,
the primitives are "too compressed": a single symbol standing for a large
complex concept.

~~~
cousin_it
For me, the nicest way to program is to use imperative code that executes from
top to bottom and deals with elements one by one. With other approaches, I
don't feel as comfortable growing the program incrementally, because they
require large changes to the program when the problem statement changes
slightly. James Hague said it well in "Puzzle Languages":
[https://prog21.dadgum.com/38.html](https://prog21.dadgum.com/38.html)

~~~
avmich
A relevant quote from _J for C Programmers_ :

Writing code in a scalar language makes you rather like a general who gives
orders to his troops by visiting each one and whispering in his ear. That
touch-of-Harry kind of generalling can achieve victorious results, and it has
the advantage that the orders can be tailored to the man, but its
disadvantages are significant: the general spends much mental energy in
formulating individual orders and much breath in communicating them
individually; more significant, his limited attention is drawn toward
individuals and away from the army as a whole. Even the great Rommel was
overtaxed at times.

The J programmer is, in contrast, a general who stands before his army and
snaps out orders to the army as a whole. Every man receives the same order,
but the order itself contains enough detail for the individual men to act
appropriately. Such a general can command a corps as easily as a platoon, and
always has the 'big picture' in mind.
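
A minimal Python illustration of the contrast the quote is drawing (not APL, just a sketch): the scalar general manages per-element state by hand, while the array general issues one order over the whole collection.

```python
prices = [100.0, 250.0, 40.0]

# Scalar generalling: visit each element, whisper an individual order,
# and manage the running state yourself.
discounted = []
for p in prices:
    discounted.append(round(p * 0.9, 2))

# Army-wide order: one expression applied to the whole collection at once.
# (A comprehension stands in here for an APL/J array expression.)
discounted_all_at_once = [round(p * 0.9, 2) for p in prices]
```

In an array language the second form is the default, which is what lets the programmer keep the whole-army view instead of the per-soldier one.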

------
tluyben2
Like others, I think this is a problem with programming in general. Reading a
lot of code, you will see patterns where people are using the things they
know. I have seen in Lisp, APL (particularly k), Prolog and Forth (but also
Haskell) that people spend far more time thinking and reworking possible
solutions than you see in Python, C# and C++. With k especially, there is
this drive toward Arthur's dream of not having to scroll. I never got the
"write-only" criticism either; once you get used to the symbols and the
terseness, it is really not that hard. I find it a real pleasure not to have
to click through hundreds of files. Especially as I get older and my memory
gets worse, APL/k/J really work for me; you don't have to remember things, as
they are right there in front of you while you work on the next part.

------
RodgerTheGreat
I think that the number of primitive functions in a language like APL should
be compared to the number of functions in, say, Python's standard library,
which is _several_ orders of magnitude larger. In APL, the language is the
library. In more mainstream languages there's a sharper delineation. Longer
and (sometimes) more suggestive names seem to make the task of learning a
large standard library feel less onerous, but the experience of gradually
acquiring more familiarity with the building blocks which are readily at hand
is ultimately the same.

------
aethertron
It seems that proficiency in J demands more memorisation than proficiency in
Python. Some people get on with this memorisation demand better than others;
some prefer more of their program spelled out explicitly, which means bigger
programs, so you'll need to scroll up and down the file to see everything.
Arthur Whitney, creator of the APL dialect K, reportedly hates scrolling [0].
Different people have different preferences because their brains work
differently. There's nothing wrong, no 'problem', with this situation.

[0]:
[http://archive.vector.org.uk/art10501320](http://archive.vector.org.uk/art10501320)

------
patrickg_zill
The Rosetta code entry for comparison :
[https://rosettacode.org/wiki/Averages/Mode#J](https://rosettacode.org/wiki/Averages/Mode#J)
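
For readers who don't know J, a rough Python rendering of the same task (mode of a list, returning every value tied for the highest count) might look like this; it's a sketch for comparison, not the Rosetta entry itself:

```python
from collections import Counter

def mode(xs):
    """Every value tied for the highest frequency, in first-seen order."""
    counts = Counter(xs)
    top = max(counts.values())
    return [v for v, c in counts.items() if c == top]

mode([1, 3, 6, 6, 6, 7, 7, 12, 12, 17])  # [6]
mode([1, 1, 2, 4, 4])                     # [1, 4]
```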

------
IshKebab
It would be a lot easier to find the function you want if it was represented
by a descriptive name rather than an arbitrary symbol. I mean, I'm not saying
you have to write `3 plus 4` but maybe there's a middle ground? I mean... the
middle ground that most normal programming languages are in.

~~~
snaky
There were many 'normal programming languages' in APL's day, but not many of
them are still around. Which ones survive? Lisp with its 'no syntax! so many
brackets!', APL with its 'arbitrary symbols', Prolog, Smalltalk... what else?
All of them are special, and maybe that's not a coincidence.

~~~
IshKebab
Fortran predated APL by 5 years and that had actual function names (even if
they were limited to six characters to fit on punch cards). I'm not sure what
your point is.

~~~
avmich
Read _Notation as a Tool of Thought_ for reasons why APL looks the way it
looks.

It's a pretty common idea "let's rename all those ugly APL primitives with
descriptive names". And it's easy to do. Why do you think the idea doesn't
stick in APL, even though it works in say Matlab? Maybe it is because for APL
programmers \. is perceived on the same level as +, and you don't ask to
rename + to 'plus'.

------
kazinator
Finding mode in TXR Lisp:

      This is the TXR Lisp interactive listener of TXR 197.
      Quit with :quit or Ctrl-D on empty line. Ctrl-X ? for cheatsheet.
      1> [group-by identity '(1 1 1 1 2 2 3 3 3 3 3 3 4)]
      #H(() (1 (1 1 1 1)) (2 (2 2)) (3 (3 3 3 3 3 3)) (4 (4)))
      2> [find-max *1]
      (4 4)
      3> [find-max *1 : len]
      (3 3 3 3 3 3 3)
      4> (car *3)
      3
    

All together:

      5> (car [find-max [group-by identity '(1 1 1 1 2 2 3 3 3 3 3 3 4)] : len])
      3
    

Build an anonymous function out of these steps using the opip macro:

      6> (opip (group-by identity) (find-max @1 : len) car)
      #<intrinsic fun: 0 param + variadic>
    

Invoke it on the sequence:

      7> [*6 '(1 1 1 1 2 2 3 3 3 3 3 3 3 4)]
      3
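
For comparison, the same group-by / max-by-length / take-the-key pipeline can be sketched in plain Python (`group_by_identity` and `find_max_by_len` are hypothetical names mirroring the TXR calls; standard library only):

```python
from collections import defaultdict

def group_by_identity(xs):
    """Like [group-by identity ...]: map each value to its list of occurrences."""
    groups = defaultdict(list)
    for x in xs:
        groups[x].append(x)
    return dict(groups)

def find_max_by_len(groups):
    """Like [find-max ... : len]: the (key, group) entry with the longest group."""
    return max(groups.items(), key=lambda kv: len(kv[1]))

def most_common(xs):
    """car of the winning entry: the mode of the sequence."""
    key, _group = find_max_by_len(group_by_identity(xs))
    return key

most_common([1, 1, 1, 1, 2, 2, 3, 3, 3, 3, 3, 3, 3, 4])  # 3
```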

------
de_Selby
It seems like the main issue is that the author doesn't know the language yet.

It would probably take a beginner Python programmer 40+ minutes to write the
mode function too.

J might have a lot of primitives, but, for a more familiar example, so does
assembly, and people get along just fine there. There is an investment in
getting the hang of the primitives, but there aren't a huge number of them,
and once you do, a problem like this becomes much easier.

The other thing that's really nice is that you start to see how to solve it in
different ways using completely different sets of primitives.

~~~
IshKebab
No, that isn't the problem. There are essentially 200 _unnamed_ functions
that you have to learn. The problem isn't that he hasn't learned them; it's
that there are 200 of them to learn in the first place!

Normal languages only have about 20 unnamed functions (i.e. operators) and
they're mostly things we've already learnt at school.

~~~
avmich
> There are essentially 200 unnamed functions that you have to learn.

Is % an unnamed function? Or {. ? Then you have lots of unnamed things in,
say, Java, which are "public", "static", "char" etc...

You don't need to learn them all before you can start doing something useful.
_J for C Programmers_ provides a good course, gently introducing the simple ones
first.

------
userbinator
_I’m sure a J expert can look at this and say “no you’re supposed to use the
Foobar primitive which makes it trivial”, but that’s my whole point: to find
the right primitive I have to review 200 of them._

There's a good analogy here with human languages --- APL is like Chinese,
whereas Python is closer to English.

~~~
yorwba
That analogy breaks down as soon as you know Chinese. Python's syntax is
intended to look as if it could be English (from ... import ... as ...), but
you could build a Chinese Python based on the same principle of choosing a few
phrases and turning them into templates for abstract syntax. It would look
very unlike APL. (Convince yourself:
[http://www.chinesepython.org/](http://www.chinesepython.org/))

The whole point of APL is that it doesn't try to appear like an existing human
language, so the concepts relevant for programming can be expressed with more
concision.

~~~
avmich
If you read the history of how Iverson's notation became the APL programming
language, you might agree that APL is more natural, as it grew out of the
mathematical notation that accumulated over centuries. In fact, APL was
initially used to teach the basics of programming; it was considered easier
for many people, especially non-programmers. Funny that Python now takes on a
lot of that role.

------
calcifer
Yet another article about a three letter acronym without mentioning what it
stands for. It's infuriating, even Wikipedia [1] doesn't seem to know what it
is.

[1] [https://en.wikipedia.org/wiki/APL](https://en.wikipedia.org/wiki/APL)

~~~
schindlabua
[https://en.wikipedia.org/wiki/APL_(programming_language)](https://en.wikipedia.org/wiki/APL_\(programming_language\))
There you go!

~~~
calcifer
The article talks about APLs, plural. If APL is _a_ programming language, how
can it be plural?

~~~
detaro
Multiple implementations, derived languages with similar concepts

~~~
schindlabua
Exactly. You more commonly hear this when talking about Lisp: Racket is a
Lisp, and so are Scheme and Common Lisp; what we mean by that is dialects. In
the article he mentions J, K and Dyalog as APLs.

