
APL and J (2015) - kick
https://crypto.stanford.edu/~blynn/c/apl.html
======
xelxebar
Always excited to see APL/J content on the top page.

Happily, this coincides with a brand new release of the J interpreter, namely
J901. It's so new that the homepage[0] still claims 901 is in beta. The
install page[1], however, does have up-to-date information.

If you would like to taste a bit of the difference before deciding to jump in,
Aaron Hsu has a really interesting talk[2] that compares APL and "other"
languages via the lens of Human-Computer Interfaces.

I am just beginning my journey into J, but so far it has been extremely
rewarding. Head on over to #jsoftware on Freenode. The channel is super small,
but has some really friendly and helpful people on it!

[0]:[https://jsoftware.com/indexno.html](https://jsoftware.com/indexno.html)
[1]:[https://code.jsoftware.com/wiki/System/Installation](https://code.jsoftware.com/wiki/System/Installation)
[2]:[https://www.youtube.com/watch?v=v7Mt0GYHU9A](https://www.youtube.com/watch?v=v7Mt0GYHU9A)

~~~
gmfawcett
I'm a J newbie, but I've been using J901 to do the Advent of Code challenge
this year. I'm surprised at how pleasant it's been to work with. Some of the
AoC solutions have been so straightforward that I wonder if the AoC author
used an array language to design the puzzles in the first place. :)

It's not all roses! The error messages can be quite inscrutable sometimes, and
I can't seem to get the J901 debugging tools working (except for old-fashioned
print-debugging). But overall J is a charming (if quirky) language, and well
suited for a bit of low-risk exploratory programming.

~~~
pmoriarty
I remember a year or two ago I was struggling with an AoC problem, and the
best I could come up with in Python was a one- or two-hundred-line program.
Then I saw an answer for it in about 20 characters of J. That was impressive,
though I do still wonder how readable the language is in practical everyday
programming, as opposed to toy problems in a programming contest.

~~~
gmfawcett
True. I'm not sure how I'd feel about maintaining a 2,000+ line J codebase. It
definitely feels like a "programming-in-the-small" kind of language.

It is impressive how much logic you can fit into so few characters. Even other
terse languages (e.g. Haskell) seem wordy in comparison. Of course this is
especially true when the problem has a fairly natural array representation,
and that might not be true for many real-world problems.

I haven't tried it, but I feel that a literate-programming style might work
well for J. The code itself is very dense, so I think it might work well to
have a long-ish, plain-text explanatory document with bits of J interspersed
throughout it.

~~~
chongli
Perhaps the approach to maintaining a large codebase of terse code (such as J)
is to keep the files very small and narrow in focus. This could end up very
similar to a Unix userland code tree.

Of course, lots of documentation could also help. Maybe a style similar to
R/Python notebooks would work.

~~~
gmfawcett
That's an interesting point. I wonder if the small, focused approach is (or
was) common in professional APL/J/K development.

It turns out that there is a Jupyter notebook implementation for J [1]. I
might play with that! I suspect that you're right, notebooks would be a good
fit for terse languages.

[1]
[https://code.jsoftware.com/wiki/Guides/Jupyter](https://code.jsoftware.com/wiki/Guides/Jupyter)

------
pavlov
_" J might overly encourage programmers to use arrays when another data
structure would be a better fit."_

I feel this is actually a distinct advantage of the APL/J model: it eliminates
boring data structure bikeshedding.

If I were Supreme Dictator of Tech, I'd mandate APL as the language for all
whiteboard interviews. Everyone would have to learn it, but at least it would
be far more interesting to study than the "Leetcode / Cracking the Coding
Interview" type material that's currently being prescribed.

~~~
dalke
I deal a lot with sparse graph data structures, in the form of molecular
graphs with a maximum degree of about 5. While graphs can be represented as an
adjacency matrix, many of the algorithms become suboptimal in that
representation; e.g., finding the list of 1-away neighbors becomes an O(n)
operation rather than O(1), where n is the number of nodes.
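To make the cost difference concrete, here is a minimal Python sketch (the
names and the toy graph are illustrative, not from any of the linked
implementations):

```python
# Neighbor lookup in the two graph representations.

# Adjacency matrix: finding the neighbors of one vertex scans a whole row,
# which is O(n) in the number of vertices, even for a sparse graph.
matrix = [
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
]

def matrix_neighbors(v):
    return [w for w, edge in enumerate(matrix[v]) if edge]  # O(n) scan

# Adjacency list: for a sparse graph (max degree ~5), the neighbor list is
# already materialized, so lookup is O(1) plus the handful of neighbors.
adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}

def list_neighbors(v):
    return adjacency[v]  # O(1) dictionary lookup

assert matrix_neighbors(2) == list_neighbors(2) == [1, 3]
```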

Now, I'm _not_ saying that these graphs or algorithms can't be represented in
APL. But, as an example, consider the problem of how to find strongly
connected components in a graph, which is something that is not an
unreasonable interview topic for an intermediate developer in my field.

An implementation for Dyalog APL is at
[https://dfns.dyalog.com/c_scc.htm](https://dfns.dyalog.com/c_scc.htm) with
commentary at
[https://dfns.dyalog.com/n_scc.htm](https://dfns.dyalog.com/n_scc.htm) . It
requires about 30 lines of code, starting:

    
    
      scc←{⎕ML←1                                  ⍝ Strongly connected components.
                                                  ⍝ (Tarjan)
        loop←{                                  ⍝ for each vertex in graph G
            vert←{⍺ conn⍣(0=X ⍺⊃⍵)⊢⍵}           ⍝ connection of unvisited vertex ⍺
            ⊃vert/(⌽⍳⍴G),⊂⍵                     ⍝   for each vertex in G
        }                                       ⍝ :: T ← ∇ T
    
        conn←{v←⍺                               ⍝ connection of vertex v
            T0←v trace ⍵                        ⍝ optional tracing
            T1←x1 v push v Lx v Xx T0           ⍝ successor state for x S L and X
            T2←↑{w←⍺                            ⍝ edge v → w
                min_L←{(⍺ w⊃⍵)⌊@(⊂L v)⊢⍵}       ⍝ L[v] ⌊← ⍺[w]
                0=X w⊃⍵:L min_L w conn ⍵        ⍝ w not connected: depth-first trav
                X min_L⍣(w∊S⊃⍵)⊢⍵               ⍝ low-link if w on stack
            }/(⌽v⊃G),⊂T1                        ⍝ for each edge from vertex v
            root←(L v⊃T2)=X v⊃T2                ⍝ is a root vertex?
            v comp⍣root⊢T2                      ⍝ new component if root
        }                                       ⍝ :: T ← v ∇ T
         ...
    

The pseudocode for Tarjan's strongly connected components algorithm on its
Wikipedia page (not counting the "end if", "end for", etc. lines) is shorter.

One Python implementation is at
[https://codereview.stackexchange.com/questions/46832/strongly-connected-components-algorithm](https://codereview.stackexchange.com/questions/46832/strongly-connected-components-algorithm)
where you can see it's slightly shorter than the APL version and - I argue -
easier to understand.
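For a sense of scale, a condensed recursive Tarjan in Python (a sketch in the
same spirit as the linked Code Review answer, not that code itself) fits in
about thirty lines:

```python
import sys

def tarjan_scc(graph):
    """Tarjan's SCC algorithm over an adjacency-list graph {v: [w, ...]}."""
    sys.setrecursionlimit(10000)
    index = {}                # discovery order of each vertex
    low = {}                  # lowest index reachable from each vertex
    stack, on_stack = [], set()
    components = []
    counter = 0

    def visit(v):
        nonlocal counter
        index[v] = low[v] = counter
        counter += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph[v]:
            if w not in index:            # tree edge: depth-first traversal
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:           # back edge to a vertex on the stack
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:            # v is the root of a component
            component = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                component.append(w)
                if w == v:
                    break
            components.append(component)

    for v in graph:
        if v not in index:
            visit(v)
    return components

# Two 2-cycles joined by a one-way edge give two components.
g = {0: [1], 1: [0, 2], 2: [3], 3: [2]}
assert sorted(map(sorted, tarjan_scc(g))) == [[0, 1], [2, 3]]
```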

Now, the APL commentary page points out:

> Nick Nikolov provides this alternative one-liner, which uses the transitive
> closure of the adjacency matrix (see →Graphs←).
    
    
        scc←{(∪⍳⊢)↓∧∘⍉⍨∨.∧⍨⍣≡i∘.∊⍵,¨i←⍳≢⍵}
        ⍝    ·     ·   ·     ·   ·  i←⍳≢⍵   vertex indices
        ⍝    ·     ·   ·     ·   ⍵,¨i   ·   consider each vertex a SCC by itself
        ⍝    ·     ·   ·     i∘.∊   ·   ·   neighbour lists to adjacency matrix
        ⍝    ·     ·   ∨.∧⍨⍣≡   ·   ·   ·   transitive closure: g[x;y] ←→ path x → y
        ⍝    ·     ∧∘⍉⍨ ·   ·   ·   ·   ·   ... and from y to x
        ⍝    (∪⍳⊢)↓ ·   ·   ·   ·   ·   ·   renumbering of component numbers
    

> The version is very good for small graphs but its space and time
> requirements grow rapidly as the size increases.

Which, I believe (knowing only a trivial amount of APL), is a consequence of
using an array data structure rather than something that is a better fit.
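A rough Python sketch of what the one-liner's transitive-closure step amounts
to (illustrative only; the function and variable names are my own) shows where
the cost comes from:

```python
def transitive_closure(adj):
    """adj is a dense n*n boolean matrix; returns the reachability matrix.

    This is the array-style approach: a boolean matrix product iterated
    to a fixed point, roughly what the APL one-liner does with a
    power-limit over an or.and inner product.
    """
    n = len(adj)
    reach = [row[:] for row in adj]
    changed = True
    while changed:                        # iterate to a fixed point
        changed = False
        for i in range(n):
            for j in range(n):
                if not reach[i][j]:
                    # boolean "or of ands" inner product
                    if any(reach[i][k] and reach[k][j] for k in range(n)):
                        reach[i][j] = changed = True
    return reach

# Even for a sparse graph, the matrix itself is O(n^2) space and each pass
# is O(n^3) time -- fine for small graphs, painful as n grows.
adj = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]
closure = transitive_closure(adj)
assert closure[0][2]  # 0 reaches 2 through 1
```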

~~~
robomartin
One of the problems I was constantly trying to avoid when using APL
(professionally, for real-world applications, not academics) was the explosion
of arrays.

You have to understand what the interpreter is doing. Just because you can
write a cool one-liner that gets you an answer with one turn of the crank
doesn't mean it's the best solution.

When I started using APL I was using seriously resource-constrained machines
of the time (think ~1 MB of RAM rather than 64GB). This makes you far more
aware of what's going on behind the curtains.

In many ways this is one of my pet peeves with today's programmers: I think I
can say the majority have not been exposed to what might actually be happening
at the processor/memory level. This leads to such things as OOP being their
default level of abstraction, and to the explosion of classes and layers. Very
soon a simple addition of two numbers takes a thousand clock cycles rather
than one.

Anyhow, love APL, I really do, but I have no clue why it shows up so much on
HN. These days I would not use it (or much less J) if my life depended on it.
Learn it, yes. Definitely. Write some non-trivial stuff with it. Seriously
consider using it in a business? No way.

I'd say the same thing about Forth, BTW. Love both languages. No longer good
choices for anything other than learning about different ideas.

~~~
dalke
Taking the tangent further, you write "one of my pet peeves with today's
programmers".

As someone who started programming in the 1980s, I've observed both sides of
the issue. When I started, the earlier generation complained that 'today's
programmers' didn't know anything about the hardware, like being able to
re-wire the computer to add new hardware capabilities, or to debug the machine
by attaching probes to the bus. And they were right ... and mostly irrelevant,
as hardware complexity and reliability increased. E.g., it's much harder to
hand-solder surface-mount parts than discrete components.

On the other hand, I've had to re-learn programming for modern hardware. I
remember when I implemented a string upper-case function like:

    
    
        while (*s) {
            *s = toupper(*s);   /* unconditional store: every byte is written back */
            s++;
        }
    

only to find that

    
    
        while (*s) {
            if (islower(*s)) {  /* skip the store for bytes that are already upper-case */
                *s = toupper(*s);
            }
            s++;
        }
    

was faster for my use case. Someone had to explain to me the difference
between read and write performance - on the machines I learned on, they were
both one cycle.

Similarly, I've had to learn (poorly, in an ad hoc fashion) about instruction
pipelining and prefetching.

I've therefore concluded that talking about problems with "today's
programmers" is more like voicing the age-old complaint about "kids these
days" than expressing something more fundamental.

Regarding APL, I like the comment at
[https://prog21.dadgum.com/122.html](https://prog21.dadgum.com/122.html) - "I
encourage learning J, if only to make every other language seem easy."

Regarding Forth, see
[http://yosefk.com/blog/my-history-with-forth-stack-machines.html](http://yosefk.com/blog/my-history-with-forth-stack-machines.html)
and commentary at
[https://news.ycombinator.com/item?id=1680149](https://news.ycombinator.com/item?id=1680149)
.

~~~
robomartin
The exception I'll take, with regard to my "today's programmers" comment, is
that, again, in my opinion, things are not getting better.

The simplest example of this I have is iOS. I wrote an app years ago that
required me to implement a genetic solver. Simple enough. Well, not so.
Objective-C is so thick and heavy that this thing was a complete dog. This was
back in the iPhone 3 days. I needed this solver to produce results in real
time, defined as "as quickly as a user could touch a button". This thing was
at least an order of magnitude slower than what the app required.

So, I re-coded the thing in plain procedural C++, not OO. Clean, simple, fast
code. With that change the code ran faster than real time, to the point that I
could then add features.

Anyone trained in languages like Objective-C, Python, etc. sees the world
through a single OO lens and "programs" by threading together libraries and
really chunky, slow code. They lack the benefit of understanding that the same
or more can be achieved by leaving that baggage behind. This is how we end up
with machines operating at GHz that actually slow down despite having massive
amounts of memory and resources.

I am not anti-OO, but OO seems to be an inextricable part of bloated code
these days.

~~~
dalke
For my part, I don't believe it's getting worse.

Objective-C dates from the 1980s, and I'm surprised you needed anything from
C++ which wasn't available in the C that Objective C supports.

Python _doesn't_ encourage a single OO lens. I've been working with Python
since the 1990s, and don't use OO that much. When I teach Python to
computational chemists, I deliberately don't touch on "class" or other OO
aspects because it doesn't seem that useful for what most people need to do.

I double-checked using the text of "Automate the Boring Stuff" at
[https://automatetheboringstuff.com/](https://automatetheboringstuff.com/) ,
which is also for beginners. I found no description of making classes. So I do
not accept the idea that Python programmers "[see] the world through a single
OO lens".

Certainly there are people with bad habits. I wrote about one of these in a
scientific methods paper published just a couple of weeks ago, at
[https://jcheminf.biomedcentral.com/articles/10.1186/s13321-0...](https://jcheminf.biomedcentral.com/articles/10.1186/s13321-019-0398-8)
:

> Many search implementations interpret Eq. 1 literally, and represent
> fingerprints using a set data type and compute the Tanimoto using set
> operations. This approach often uses a large number of temporary set
> instances. By comparison, an implementation which represents a fingerprint
> as a byte string or sequence of machine words uses less memory, has less
> memory management overhead, and can implement Eq. 2 with a handful of fast
> bit and arithmetic operations.
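A toy Python sketch of the two representations (the fingerprints here are made
up, and real implementations use machine-word popcounts in C rather than
Python integers):

```python
# Tanimoto similarity two ways: T = |A intersect B| / |A union B|.

# Set representation: builds temporary set objects for each comparison.
def tanimoto_sets(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Bit-string representation: the same ratio via popcounts on integers,
# with no temporary collections.
def tanimoto_bits(a, b):
    return bin(a & b).count("1") / bin(a | b).count("1")

# Toy fingerprints given as lists of on-bit positions.
on_bits_a, on_bits_b = [1, 3, 5, 7], [3, 5, 7, 9]
int_a = sum(1 << i for i in on_bits_a)
int_b = sum(1 << i for i in on_bits_b)

# 3 bits in common, 5 bits in the union -> 0.6 either way.
assert tanimoto_sets(on_bits_a, on_bits_b) == tanimoto_bits(int_a, int_b) == 0.6
```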

But it's not specific to this generation: I read essentially the same
complaints - scratching off "OO" - back around 1991 or so (either in CACM or
Dr. Dobb's).

Looking now, I found
[https://www.drdobbs.com/are-the-emperors-new-clothes-object-orie/184408251](https://www.drdobbs.com/are-the-emperors-new-clothes-object-orie/184408251)
complaining about the slowness of OO software back in 1989.

And "threading together libraries" is, as you certainly know, the goal of
software re-use. Consider Jon Bentley's classic 1986 "Programming Pearls"
column, in which Donald Knuth and Doug McIlroy tackle the same small problem,
at
[https://dl.acm.org/citation.cfm?id=315654](https://dl.acm.org/citation.cfm?id=315654)
.

> Read a file of text, determine the n most frequently used words, and print
> out a sorted list of those words along with their frequencies.

As McIlroy comments, "A wise engineering solution would produce—or better,
exploit—reusable parts. ... The simple pipeline given above will suffice to
get answers right now, not next week or next month. It could well be enough to
finish the job. But even for a production project, say for the Library of
Congress, it would make a handsome down payment, useful for testing the value
of the answers and for smoking out follow-on questions."
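McIlroy's published answer was a six-command shell pipeline. The same
reusable-parts spirit in Python is a few standard-library calls (a sketch, not
McIlroy's actual solution):

```python
import re
from collections import Counter

def top_words(text, n):
    """The n most frequent words with their counts, most frequent first."""
    words = re.findall(r"[a-z]+", text.lower())   # split on non-letters
    return Counter(words).most_common(n)          # count and sort, reused parts

text = "the quick brown fox jumps over the lazy dog the fox"
assert top_words(text, 2) == [("the", 3), ("fox", 2)]
```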

------
JaumeGreen
I'd love to learn APL; all the symbols give it a mystic aura that makes you
feel as if you were reading a magic scroll, rather than mundane source code.

OTOH, J is great because it's easier to dive into: no strange keyboard needed.
It might also be one of the best languages to use on a phone. With its own
keyboard and a nice app, it's the first time, while working on Advent of Code
problems during my commute, that the interface was not the problem.

But the language has one big problem: it is not search friendly. Not on the
internet, and not even on a page with solutions in different languages.

~~~
dancek
I'd love to have a language that combines the notation of APL with the
ergonomics of J (i.e., hooks and forks enabling functional-ish programming, a
de facto open-source implementation, and a working Android "IDE").

~~~
shrubble
There's the fully open-source GNU APL, which compiles on ARM Linux in addition
to Linux, OpenBSD, and macOS.

I don't know what is needed to get from Arm Linux to Android.

~~~
mlochbaum
dzaima/APL ([https://github.com/dzaima/apl](https://github.com/dzaima/apl))
has an Android version. I would recommend that over GNU APL which makes a
number of questionable language choices.

Many modern APLs—Dyalog APL, NARS2000, ngn/apl, and dzaima/APL—include support
for forks. They're usually called function trains in APL.

~~~
shrubble
Seeing as I am not an APL guru, I would appreciate your comments on the
language choices that you don't agree with.

~~~
mlochbaum
The one that should stand out to programmers in other languages is its use of
dynamic scope instead of lexical scope for anonymous functions, even though
they were based on Dyalog's dfns, which are lexically scoped. There are other,
more APL-centric issues, such as even more questionable use of square brackets
than is typical for APL.
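For readers unfamiliar with the distinction, here is a Python sketch of
lexical versus dynamic scope. Python itself is lexically scoped, so the
dynamic behavior is simulated with an explicit environment dict:

```python
# Lexical scope: a closure captures the x from its defining scope.
def make_adder_lexical(x):
    return lambda y: x + y

add5 = make_adder_lexical(5)
x = 100                            # a different x at the call site
assert add5(1) == 6                # lexical: the definition-site x wins

# Dynamic scope (simulated): the function looks x up at call time.
env = {}

def adder_dynamic(y):
    return env["x"] + y            # whatever "x" means when called

env["x"] = 5
assert adder_dynamic(1) == 6
env["x"] = 100
assert adder_dynamic(1) == 101     # dynamic: the call-time x wins
```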

I work for Dyalog which is a "competitor" (in practice, the APL market share
is small enough that any APL doing well helps us all). We have a lot of
respect for other APL implementations and frequently use their choices as
reference points for our own decisions. GNU is the only exception.

