APL in its modern state (2020) (sacrideo.us)
95 points by akbarnama 41 days ago | 90 comments



Conor Hoekstra's code_report[0] YouTube channel is a great way to learn about APL. The "1 problem N languages" series of videos are especially good.

[0] https://www.youtube.com/c/codereport/featured


Gilad Bracha is working on ShapeRank, an APL-inspired language for reactively computing with multi-dimensional arrays, i.e. vectors and matrices. https://twitter.com/Gilad_Bracha/status/1450149734325256193


Obviously this is relevant to machine learning and data analytics. On github: https://github.com/f5devcentral/shapeRank


I like the confluence of ML and array languages happening right now. I know Pandas and NumPy are their offspring, but I like the array-oriented PLs. An ELM (Extreme Learning Machine) in J [1], CNNs in APL [2], and APL-to-GPU toolchains like Futhark/apltail [3,4]. I'll have to look at shapeRank.

[1] https://github.com/peportier/jelm

[2] https://dl.acm.org/doi/10.1145/3315454.3329960

[3] https://futhark-lang.org/blog/2016-06-20-futhark-as-an-apl-c...

[4] https://github.com/melsman/apltail/


Interesting how reactive graphs are spreading everywhere :)


Can you explain/expand on this?


Was it ever alive, outside some tiny academic niche? You could hardly even use it without getting a custom keyboard for it.


I was a co-op student at IBM Canada in 1986-87. I got sorted (think sorting hat - hiring was generic) into the APL cohort. We had a fun course at the beginning and then I was shipped off into ... get this ... a marketing group.

So here I sit in front of a then very fancy 3279 terminal writing my little APL programs to do what they wanted - mostly extract/reformat/summarize data from one sort of file format to another. Was this a particularly good fit for APL? Not really, but if all you have is a hammer...

Anyway, one day I poked around in other APL code that was on the system and visible to me, and ran one program and wham... the screen on the 3279, which to that point I'd only known as a nice colour text terminal, exploded into fancy graphics. Mind blown. How did they do that? I never found out, because the program that did that was, as you say, "write only" - it looked like screenfuls of random noise.

But yes, APL was used in the mainstream, at least in the IBM internal world, as late as the mid 1980s. Maybe the use was artificial - you know, train new cohorts of co-op students in it and make them use it for real work - as a sort of dogfooding, don't know.


> You could hardly even use it without getting a custom keyboard for it.

Not true. I can still touch-type APL, decades after my last use of the language and I don't have an APL keyboard.

Those complaining about typing being difficult or the funny symbols are not thinking this through.

What's the difference between typing English (or another language) on a computer keyboard without looking at it and typing APL?

None. Exactly zero.

If you don't learn how to type without looking at the keyboard, typing any spoken language is a slow grind.

How long does it take to learn to touch-type, say, English? Not too long. It takes effort and dedication that is well within the skills and mental capacity of 99.999% of the people who use computers. APL is no different.

Of course, you can't go from touch-typing English to touch-typing Greek or Japanese after a few sessions with a card in front of you. It takes work. You have to learn it. And then it's easy.

When I got started, in the early 80's, we would put stickers on the keys, buy a set of replacement keycaps or a ready-made APL keyboard. After not too long the keyboard no longer mattered. I, and everyone else I knew who actually used APL for more than a curious exploration, could touch-type APL on a normal keyboard without any custom labels or keycaps at all. As I said above, I can still do it, decades later.


If you don't touch-type, typing English is terrible; you might reach 24 wpm. The words just have so many letters! By contrast, ιρρω ("generate the indices for the dimensions of the right-hand argument") is four keystrokes on three keys, like BASIC on a Sinclair ZX-81. That seems like it would make APL a lot more usable for a non-touch-typist than English. With the stickers, anyway.

(I do keep forgetting which keys I have ψ and ξ on in Greek: the illogical C and J. Maybe I should make stickers.)

For a period of time, custom font ROMs or typeballs were harder to improvise than custom stickers; although the PLATO IV and V terminals had both softfonts and overstrike, very few were ever made, and both features were entirely missing from more common hardware, like the IBM 2741, Diablo daisy-wheel printers, Epson MX-80, ADM-3A, VT-100, VT-220, Heath H-19, IBM MDA, Hercules, and even the CGA. In 01986, in the Microsoft shill magazine PC Magazine, Charles Petzold touted this killer new feature of the US$524 EGA: "Even with 64K [video RAM], the normal font is replaceable by software." (March 25, 01986, p. 115.) The TI-99/4A and Nintendo Famicom did of course already use programmable "character generators," but I don't think anyone ever offered APL for them. Eventually, of course, we all moved to framebuffer displays backed by cheap semiconductor RAM, so custom fonts were no longer a premium feature.

I think the "font problem" really was a significant issue for APL adoption during the period 01968–01988, and as it turned out, that was a crucial formative period for computers; that was when we got, among other things, the PDP-8 (and thus process control computers), Unix, TCP/IP, C, the Macintosh, the IBM PC, MS-DOS, spreadsheets, object-oriented programming, semiconductor RAM, computer animation, digital music synthesizers, TeX, Intel and its 8008/8080/8086/80386 line, the primacy of ASCII, the 68000, ARM, RISC in general, and the modern IDE. Of these, only MS-DOS and the 68000 have really fallen out of favor.

There are a lot of path-dependent things in computing that we can attribute to standards established during that time. If IBM hadn't had their head so far up their ass, or if VisiCalc hadn't been written until 01986, things might have turned out very differently for APL.


> custom font ROMs or typeballs

Yup. Used those. I had a PC rigged with a toggle switch and a custom little wire-wrapped board to be able to switch character ROMs. I even wrote printing utilities to be able to pause and swap out the IBM printer's type ball when printing documents that required a mixture of APL code segments and regular text. I wrote a custom hybrid text editor in APL for precisely this purpose, to be able to do application notes that included both character sets.

> I think the "font problem" really was a significant issue for APL adoption during the period 01968–01988

Yes, agreed. The bit of APL history casual observers miss is that Iverson's motivation for transliterating APL symbols into J was exactly this problem. He tried to solve a business/financial/adoption problem. As a result, he bifurcated and confused the APL world. He was wrong to make this decision. And, what ended up happening was that both languages became oddities rather than what APL could have become with the capabilities of next-generation hardware.

That and the cost of licenses. As a student I got free licenses but it was hard to justify what some of these packages cost. STSC's was, I think, the lowest-cost and most popular version, used by most of the university types I used to engage with. IBM's version had penetration into business because of their position with mainframes. Once again, splitting the ecosystem did not do anyone any favors. J is an abomination (it defecates all over Iverson's own seminal paper on the power and value of notation as a tool for thought).

APL had many issues that truly needed resolving. Simple things like object-oriented programming and heterogeneous data types would have been very interesting to explore. The other thing would have been providing official means for avoiding O(n^2) issues, where just a few innocent-looking operators would result in converting vectors into matrices or >2-dimensional arrays and then having to do all that processing, when a simpler procedural option that does not expand to consume all available memory would have been great. In some ways this is the world of Python and numpy today: you can work at various levels of abstraction and be reasonably aware and in control of resources and computational complexity.
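
To make the resource point concrete, here is a rough NumPy sketch (my own illustration with made-up data, not anything from the original applications): the "array-style" formulation silently materializes an n×m boolean matrix before reducing it, while an equivalent sort-and-binary-search formulation of the same computation stays linear in memory.

    import numpy as np

    x = np.random.rand(10_000)
    y = np.random.rand(10_000)

    # Array-style one-liner: the broadcasted comparison builds a
    # 10,000 x 10,000 boolean intermediate (~100 MB) before the reduction.
    counts_outer = (y[None, :] <= x[:, None]).sum(axis=1)

    # Same result without the blow-up: sort y once, then binary-search,
    # which is O((n+m) log m) time and O(n+m) memory.
    counts_sorted = np.searchsorted(np.sort(y), x, side="right")

    assert np.array_equal(counts_outer, counts_sorted)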


Indeed, when APL was first available, you couldn't use it without a custom machine: an IBM system/360 mainframe. The keyboard was the least of your worries. And APL was distributed for free! Why charge for software when it's written in 360 assembly?

APL was near-mainstream in the 1970s thanks to time-sharing[0] (the article mentions this starting with "APL thrived in the 70's and 80's"). In fact, time-shared APL was significantly easier to use than most languages because it was used from an interactive session rather than punch cards. With results typed onto paper tape! It was used for a lot of practical tasks like administration where Excel would now be the most common tool. One book on APL had sales in the hundreds of thousands, and conferences[1] had attendance of over a thousand for a few years, despite competing with additional vendor-specific conferences.

[0] https://aplwiki.com/wiki/Time-sharing

[1] https://aplwiki.com/wiki/APL_conference


Ironically with the rise of touchscreens, it's easier than ever to have an APL keyboard.


Finally a great use for the MacBook touchbar!


I don't know which would be more painful, trying to program on a touchscreen, or trying to program in APL.


Actually, Dzaima's APL app for Android has a very intuitive keyboard that, coupled with APL's terseness, allows me to do pretty intricate stuff on my phone that I can pick up later at my desktop if it gets bigger. A lot of the resistance to the array languages is one of familiarity and aversion to anything new or different. Just try it. Math scared you as a kid too most likely, and even though you may not be a master mathematician in your adult life, I am sure you cleared some seemingly obtuse hurdles in your youth.


Honestly I think you'd have to be pretty proficient at APL for the touchscreen typing to become a problem. When I dabble in APL, I use the mouse to click on the buttons in the Dyalog language bar[0]. It doesn't matter -- it takes me much longer to think of what to type than it does to pick with the mouse.

[0] http://help.dyalog.com/14.0/Content/UserGuide/Images/Languag...


Typing speed is not the limiting factor when programming in APL. The limiting factor is the speed of your thoughts.


That and finding the symbol on the keyboard if it lacks the APL labels. I learned it with a small reference card I placed on top of a standard IBM PC keyboard.


Why not both?


I mean, given the number of new implementations of APL that are being actively worked on, there is clearly some interest.

The most prominent such dialect is BQN, which I definitely recommend to anyone interested in APL-like languages. https://mlochbaum.github.io/BQN/

APLwiki has a list of other languages: https://www.aplwiki.com/wiki/Language_overview#Dialects

Disclaimer, I'm the developer of one of these dialects so of course I'm biased.


In the 01950s through the 01970s you almost couldn't use a computer at all without getting a custom keyboard for it, so that wasn't an extra obstacle for APL. (You couldn't connect your Smith-Corona or Selectric to a computer, and until 01963, teletypes spoke 5-level "Baudot" code.)

One of Stallman's first jobs was writing a text editor in APL in the 01960s.


Why the 0xxxx dates?


he is from the future


The fintech company SimCorp was built on APL and their backend still has tons of APL.


Anyone using APL or J for work on HN? I've used it as a hobby and think it's really cool, but haven't ever used it professionally.

Edit: this has been posted on HN before.


I haven't written in APL professionally (other than fixing up existing code) but I have been responsible for an APL system and for its replacement.

I'm sure that writing a greenfield APL program is a lot of fun. Initially.

Or if you're just writing vignettes to do some temporary data wrangling then it's fine.

As a language for writing applications that do proper work and need maintaining? Wouldn't be in my top 50 choices of language. And I don't think I've worked with 50 languages yet.


Does trying to build a Youtube channel around J count as professional?

I've been livestreaming a presentation tool / time-travelling REPL for J for a few months now:

https://github.com/tangentstorm/j-prez

It hasn't paid anything directly yet, but my videos probably helped me land my current job.

At work, I use another APL-inspired language called K.


For finance or something else?


Yes, on 3 May 2020:

https://news.ycombinator.com/item?id=23055793

"Is APL Dead? Not anymore"


Believe it or not, the APL world's changed a lot since then. I still consider APL dead, but there are strong signs of rebirth.

APL dialects April[0] and KAP[1] are improving rapidly, and my own offshoot BQN[2] has gone from prototype to a full language. All of these are open source and made to work with the Unix ecosystem. While the K language isn't as close to APL, ngn/k[3] is following a similar trajectory.

This year we created a Discord/Matrix forum[4] (bridged together) for array languages, which has hundreds of members and a few conversations per day at the slowest. The Array Cast was featured prominently here when the first episode aired[5] and is also worthy of note: they say they now have thousands of subscribers.

[0] https://github.com/phantomics/april

[1] https://github.com/lokedhs/array

[2] https://mlochbaum.github.io/BQN/

[3] https://codeberg.org/ngn/k

[4] https://mlochbaum.github.io/BQN/community/forums.html

[5] https://news.ycombinator.com/item?id=27209093


Thanks for all you do! I can't wait until one of these languages reaches a 1.0 milestone.


Search for Aaron Hsu; he has given plenty of talks about APL dialects.

If you like terse generic code you'll be fed for a while (too much, even: Aaron's two-page parser/compiler was somehow too cryptic even for my tastes :)


Ohh yeah. I've enjoyed talking with him on HN. Probably on a previous account as I never enter my email and get locked out all the time :)

That parser was mind bending.


He also wrote TFA.



q and k are still actively used in many parts of finance.


Indeed, and if you have a community of people using them, you can be very productive.

But those languages are astonishingly intimidating for new programmers -- reading from right to left, with implicit variables, different usage of brackets, types that are hard to discern, overloading of every possible bit of punctuation, etc. makes it more like translating Latin than writing code.

So while a line such as

    .[`:/data/raw; (); :; 1001 1002 1003]

is very succinct, the skill and concentration necessary to write that line is not something that lends itself to widespread adoption.


I've been learning APL for a bit, and have tried q too, and I disagree with the idea you need more skill and concentration to write either. Personally, for problems that suit the domain, I've found it much easier than equivalents in other languages like numpy and pandas. Q-sql in particular seems extremely convenient to have built into a language, and probably requires less concentration to write than an equivalent using manual iteration/filtering.


Yes, sorry for not mentioning that. I'm aware of kdb+ in finance (never used it personally, but wish I could).


I use J's Jd columnar database to munge data for academic research. Much, much easier than SQL. The models, I do in R because packages.


I do. But I don't really write or maintain large systems, only various tools and interfaces.


Nothing dies.

Like, if it has actual uses and implementations on modern machines and isn't abandonware, someone is going to be using it somewhere.

But I would say it was niche.


And it's not hard to play with GNU APL + Emacs apl-mode.


Thanks for the recommendation!


Few languages ever die completely. Even Jovial is still used in some places (I learned it in 1982). I have not seen any mention of PL/1 in decades, however. I learned APL in 1979 and found it an amazing language at the time, though I never used it again. Every other language I learned in my life is still in use in some form.


Ha! You inspired me to look up FOCUS, a language I once wasted a summer on. I guess it is still limping along out there, if somewhat unrecognizable.


Or perhaps somewhat un-FOCUS-ed.


I will note that the article doesn't contain the word "readability". APL is the classic example of a "write-only" language.


"Language X is a write-only language" seems to be code for "it doesn't look like C."

I've also heard Forth and Lisp described this way. And yet I find both readable since I have experience using them. I wonder if APL is similar: It's only unreadable to people who don't know the language. Well of course it is.


APL is different from Forth and Lisp, in that the 'write-only' reputation is supported by its programmers.

Take the famous 'game of life' APL example:

    life ← {⊃1 ⍵ ∨.∧ 3 4 = +/ +⌿ ¯1 0 1 ∘.⊖ ¯1 0 1 ⌽¨ ⊂⍵}

It's quite logical, when you walk through it.[0] But it's harder to read 'back to front', which is what code readability is about.

[0] https://www.youtube.com/watch?v=a9xAKttWgP4
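
For anyone who reads NumPy more easily, here is a rough translation of the same idea (my own sketch, following the walkthrough in the linked video; the function and variable names are mine): sum the nine shifted copies of the board, then keep cells whose neighbourhood sums to 3, or to 4 when the cell itself is alive.

    import numpy as np

    def life(board):
        # Sum the 3x3 neighbourhood *including* the cell itself, with
        # wrap-around edges (the role of the ⌽/⊖ rotations in the APL line).
        neighbourhood = sum(np.roll(np.roll(board, dy, axis=0), dx, axis=1)
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        # Next generation: neighbourhood sum of 3, or sum of 4 for a live cell
        # (the "3 4 =" comparison and the "1 ⍵ ∨.∧" combination).
        return ((neighbourhood == 3) | ((neighbourhood == 4) & (board == 1))).astype(board.dtype)

    glider = np.zeros((8, 8), dtype=int)
    glider[1, 2] = glider[2, 3] = glider[3, 1] = glider[3, 2] = glider[3, 3] = 1
    print(life(glider))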


I would also recommend Conor Hoekstra's version of that video for a bit more explanation:

https://www.youtube.com/watch?v=pMslgySQ8nc&t=45s&ab_channel...

And he even shows how you can animate it in the editor window.


I can say confidently I couldn't figure out what a line of APL code did 2 hours after writing it myself. I had to write comments at almost a 5:1 ratio.

In other languages it’s easier to communicate “why” you are doing something, while in APL it’s the “how” that’s easiest.


C isn't very readable either which is why commenting is encouraged. But the Obfuscated C Contest is a competition whereas APL is literally the punchline for illegibility.


As I asked in another thread about APL readability a few weeks ago[1], would you be surprised to learn this is valid Dyalog APL?

    result←findMax data

    max←0
    :For i :In data
        :If i>max
            max←i
        :EndIf
    :EndFor

    result←max
then

          findMax 5 1 2 3 5 6 3 1
 
    6
Writing it more neatly as findMax←⌈/ isn't mandatory any more than `reduce(max, numbers)` is mandatory in Python.

[1] https://news.ycombinator.com/item?id=28092097


The terseness of “vintage” APL is what makes it hard to read and reason about. This example highlights how a little less terse code can be much more readable.

But I remember what confused me the most was trying not to use loops to sum arrays and using vector ops like +/A instead.


I'm not surprised at all. Good/bad, un/readable code can be produced in any language. Ruby fans crowed about its beauty then created incomprehensible DSLs of awkward Yoda code. But norms and values do differ between language communities, and APL is not exactly noted for a strict emphasis on readability.


As one who took APL as an "introduction to programming" course in college: APL practically encouraged unreadable implementations. Students would literally and frequently challenge others with "what does this program do?" with the intent of eliciting "I have no idea, it's unreadable."

Forth and Lisp were odd, like a native English speaker learning Russian or Korean. APL is like writing a novel directly into encrypted form.


I think readability has as much to do with the language culture and the writer's taste as language features. I've definitely read Lisp code which is approaching unreadable due to excessive use of macros, for instance.


It really isn’t. Thousands of APL programmers read it perfectly well. I can’t read Japanese, but that doesn’t make it lacking in “readability”.


You could argue just that. Japanese is surely more difficult to learn to read, with its mix of Japanese and Chinese symbols, than, say, Russian or Greek.


Languages have varying costs of entry and mastery. The relative cost depends on what languages one already knows. But some language features are rarer and more difficult (declensions, conjugations, symbols, tones, etc). The size of the active vocabulary matters a great deal. But the availability of the language matters perhaps most of all.


It's a language using a graphical alphabet of built-ins. When you know them, you know them. When you understand how their position on the line creates an algorithm, you can see the algorithm. And then it's read-write.


Do you have any examples?


There was a good Co-Recursive episode I just listened to last week that was partially about APL. https://corecursive.com/065-competitive-coding-with-conor-ho...


Context: Professional APL user for ten years back in the '80's

I firmly believe languages like APL, Forth and LISP should be taught in a single quarter course on programming languages. The perspective you get is invaluable. These ideas help you think about computational problem solving in a different way.

That said, attempting to use any of these languages today for anything other than a fun hobby would be a mistake. While APL isn't dead --paid and free distributions are still actively maintained--, it is, in my opinion, deader than a doornail when it comes to the reality of using it for any real work, particularly at scale. In this sense it isn't any different than Forth, LISP, COBOL and FORTRAN. Can you imagine Facebook announcing a move to FORTRAN? Neither can I.

I often find the comments on HN about APL are terribly misinformed. Things like "read only language", "need a custom keyboard", "need a custom machine", etc. are, from the perspective of someone who actually knows APL, just silly. People truly should stop for a second and think about whether their opinions about anything are based on enough data to actually support even having an opinion. Simple parallel example:

Dabbling in music does not make you a musician. Declaring that you need a custom machine to type musical notation and that this notation is impossible to read would sound terribly ignorant to someone who devoted sufficient time to actually learning and internalizing this.

I can, still, to this day, decades later, touch-type APL. Do you look at your keyboard when you type anything in your spoken language/s? No? Same with APL. The learning curve isn't any worse than learning to type on an ASCII keyboard. Do you have to look at the keyboard when you type any of the shifted symbols on the top row? No? Well, imagine that's APL. Different symbols. No problem.

Yes. APL is dead as a sensible informed choice for non-trivial projects.

No. APL is not dead as it pertains to learning some amazing things about what computing could --and arguably, should-- look like at some undefined point in the future.

I have always thought that the future of AI will require us to be able to communicate with the computer in code in a form far closer to the symbolic APL approach rather than typing words in English. I can't put my finger on what form that would take. Iverson's "Notation as a Tool of Thought" paper goes over the reasons that support this idea. I just can't offer anything other than to say I believe this to be true based on ten years using an amazing symbolic language for real work at scale. One of my APL applications was part of the human genome decoding project. It helped analyze, classify and visualize sequencing data.


> The perspective you get is invaluable. These ideas help you think about computational problem solving in a different way.

This, a million times. It’ll open young minds like nothing else.


> In this sense it isn't any different than Forth, LISP, COBOL and FORTRAN.

Given that Fortran is used all the time for numerical computations, especially in science, and this very site runs on LISP, I'm not sure you're making the point you think you are making.


> I'm not sure you're making the point you think you are making

Not counting "it would be nice if you had exposure to these languages."

A job where you are required to primarily develop software using these languages. That's the criterion.

-----------------------------------------------------

Monster.com

"apl software engineer" results: 0

"fortran software engineer" results: 0

"lisp software engineer" results: 0

-----------------------------------------------------

Linkedin:

"apl developer" results: 1

"fortran developer" results: 0

"lisp developer" results: 0

"python developer" ~50,000 results (no, did not bother to filter through the list to get down to actual python jobs vs. mentions)

"c++ developer" ~100K (same comment)

-----------------------------------------------------

Oh, no, I am making precisely the right point. HN running on LISP is a rounding error. FORTRAN for scientific computing is also a rounding error. These things are dead.


You're not going to find scientific positions advertised on LinkedIn, and I don't know what other blind spots that selection makes. I would concede if you compared StackOverflow questions. That would also reflect the definition of "alive" from the article better ("actual usage").


Oh, please!

https://stackoverflow.com/questions/tagged/apl 266 questions

https://stackoverflow.com/questions/tagged/fortran 11,855 questions

https://stackoverflow.com/questions/tagged/forth 262 questions

https://stackoverflow.com/questions/tagged/python 1,817,529 questions

https://stackoverflow.com/questions/tagged/c++ 741,496 questions

LISP: 6,615

javascript: 2,285,941

java: 1,805,413

c#: 1,503,129

php: 1,417,981

etc.

If "underwater_basket_weaving" was a tag it would likely get more questions than APL, Forth and LISP combined.

Let it go...


I feel like we're quibbling about semantics.

Is APL an interesting language that most people would benefit from picking up and building something with? Sure.

Is there a small and passionate community around it? Absolutely.

Is it still possible to get up and running with APL in 2021? Yes.

Given the variety of choices in the developer ecosystem is APL the best choice for the types of problems the vast majority of developers are solving today? No.

And, maybe most interestingly (to me), this conversation highlights how important ecosystems, frameworks and communities are to modern development, over and above pure language-semantics benefits.


Given the variety of choices in the developer ecosystem, is APL the best choice for ANY type of problem one can realistically have to solve today?

I.e., is it still the best tool for the job it was designed for, but on modern hardware/OSes, provided you don't mind the special charset and the low availability of APL programmers to maintain your code?


Even when it first appeared, APL was an incomplete programming language.

It had much better facilities than any other language for working with arrays of numbers, but other tasks were awkward, e.g. handling strings, input/output, program control structures, partitioning a large program into separate modules and so on.

So going back to use the original APL is not a solution.

On the other hand, having to work with arrays of numbers in any language that does not include APL-like expressions is tedious. Having to write explicit loops when better solutions existed more than half a century ago seems extremely stupid.

With Unicode, providing the APL operators in a programming language is trivial.

There is however one APL feature that prevents the simple extension of any current programming language to just allow you to write APL expressions without changing the language otherwise.

APL had a different operator-precedence rule from most popular programming languages: all operators have equal precedence, and the right-hand operand of an operator is everything to its right. So the operators are evaluated from right to left, unless there are parentheses to change the order.

This rule was a very important innovation of APL. While it may seem weird for those who do not have experience with it, it is actually much more convenient than the usual multi-level precedence rules.
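
A tiny illustration of the difference (my own sketch; Python is used only to show the resulting groupings, since Python itself keeps conventional precedence):

    # Conventional precedence: * binds tighter than +, so 2*3+4 groups as (2*3)+4.
    print(2 * 3 + 4)    # 10

    # APL's single-precedence, right-to-left rule: 2×3+4 groups as 2×(3+4).
    print(2 * (3 + 4))  # 14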

Just adding APL operators to a language without also using the APL rule of evaluation order would lose a good part of the APL advantage and simplicity.


(Meta comment)

Am I the only one who finds this "[question]? [answer]." comment format hard to read/digest?

Would it be less awkward to simply write a statement as a sentence? Yes. :-)


The article describes interest in APL, and the construction of tools for it, but does not note any modern practical/commercial use of APL. As such, it seems little more than academic interest, akin to Latin or Esperanto, where even Klingon gets more actual use (https://smile.amazon.com/s?k=klingon+shakespeare).


APL is alive and well and widely used. It’s just evolved into more verbose forms known as NumPy, R, and other Iverson Ghosts [0].

Turns out people love array programming but hate terse syntax.

[0] https://dev.to/bakerjd99/numpy-another-iverson-ghost-9mc


"Turns out people love array programming but hate terse syntax."

People hate Java and COBOL and XML and PowerShell and ActionScript and SQL for their verbose syntax. People adore `x ? y : z` while complaining about unreadable terse syntax with symbols that don't say what they do in English. Why do people put up with data[4] to get an item out of an array, with its special single-purpose symbol-heavy syntax, instead of index(data, 4)? How come the symbols are never the problem when people are familiar with them?

Can it just as easily be explained by "people like what they're used to, people hate change"?


> People adore `x ? y : z`

That’s not APL. It’s all ASCII.

One of the issues I experienced in college was being unable to verbally communicate APL code to colleagues. We called the comment symbol “finger”.


The ASCII standard RFC depicts ":" and "?" in the "symbol" column of this table: https://datatracker.ietf.org/doc/html/rfc20#section-4.2

If you don't know the name of ":" then you can't verbally communicate it to a colleague, and saying "it's ASCII" doesn't mean you magically know its name. If you know that ⍝ is called "lamp" then you can verbally communicate it.

"It's bad because I don't already know it" feels like a weak kind of criticism.


So, instead of reading a keyword, you had to memorise the names of a couple dozen symbols and their meanings. Not impossible, but not great.


I don't want to repeat myself here so I'll link a previous comment https://news.ycombinator.com/item?id=28094051

The tl;dr is that while you have to learn a few more symbols, the benefit of that is they're so composable you don't have to learn anywhere near as many keywords, because they can be defined in just a few characters.


You still need to learn their names and meanings instead of learning names/keywords and their meanings, minus symbols.

I'm repeating myself in hopes it becomes obvious that learning the symbols is not needed in other languages because the keywords are already the names we would need to learn anyway and, usually, also make it easy to derive their meaning.

Think of how you read code in an unknown language - you look for patterns you usually see and use the names of keywords and variables to understand both what's being done and why. With "vintage" APL, the symbols are opaque and you are left looking for patterns you probably won't be able to identify without first understanding what the symbols mean, because both syntax and alphabet are unknown.


The core point of the parent comment was that the symbols become more composable, so you end up with vastly fewer of them to learn than the equivalent set in a traditional language, which accumulates shorthand functions because spelling things out like "reduce(+,x)" (or worse, if you need a lambda around plus) is too unwieldy and hard to read.

APL has like 80 builtin glyphs, compared to hundreds or thousands of functions in stdlibs of traditional languages, and when you learn those 80, you can read & write all of APL (and the terseness means you can consume information much faster, and knowing all of the language vastly simplifies the writing process too - you might not need to learn all of a traditional language to read it, but writing one well still needs a very big coverage of knowledge, or constant docs lookup).

Sure, for someone who knows C-like syntax, Java is infinitely more readable than APL, but if you bother to learn APL, it's pretty simple.


Terse is generally better than verbose but it has 2 problems:

1. Learning curve - most people immediately dismiss what they can't intuitively understand straight away, even if it requires just a few minutes to grok and a cheat sheet during the first days of usage. People need at least a very basic clue immediately to get interested and feel motivated to invest further attention.

2. Terse syntax/vocabulary encourages packing overly complex logic into one-liners which become brainfuck to read and reason about, even for yourself, shortly after. I believe it wouldn't be hard to develop automatic decoders for such expressions that split them into multiple lines, introduce intermediate variables, structure them with indentation and/or highlight corresponding elements, but people probably prefer to just read the code directly rather than use advanced tooling even to read it.

I indeed adore `x ? y : z` and similar things and use them heavily, but I always split the expressions into separate indented lines every time I nest them.

Sometimes I also use ReSharper to convert verbose C# code I wrote into something much shorter (LINQ or similar), but in not-so-rare cases I then fail to understand the result and have to remember/comment what it does. I often revert to the verbose version unless I'm sure that part of the code will never need to be debugged.


People like typing code on their keyboard, so it's mainstream hardware that doesn't care about APL.


There's a balance here between terse and inscrutable.

The line is fine.


I think people actually prefer terse syntax, but only for things that are already familiar to them. Most languages use "+" instead of "plus", because the audience knows it already.

Some languages (I'm thinking of Rust's Result/try/? syntax) have gradually evolved some parts of themselves from verbose to terse as people became more familiar with the concepts, so I would not be surprised if Python/NumPy follows a similar path.

Python only added a symbol for matrix multiplication ("@") in version 3.5, probably because many of its _current_ users are already familiar with the concept.

When Python was first introduced, it seemed to be more of a Perl or Bash replacement, so dedicated syntax for matrix multiplication would have been weird.
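
A minimal sketch of that terse-vs-verbose pair (my own example; the array shapes are arbitrary):

    import numpy as np

    A = np.arange(6).reshape(2, 3)
    B = np.arange(6).reshape(3, 2)

    C = A @ B             # terse infix operator, added in Python 3.5 (PEP 465)
    D = np.matmul(A, B)   # the equivalent verbose spelling
    assert np.array_equal(C, D)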


One man's “terse” is another man's “cryptic”.

Languages like R are both easier for the average programmer to read and (much) easier to type.


Anyway, the iota operator isn't wholly dead. It gets name-checked in libraries for other languages, including recently C++. Which isn't dead.



