APL – A Glimpse of Heaven (2006) (vector.org.uk)
196 points by lelf 18 days ago | 93 comments



An anecdote about my first and only brush with an APL programmer:

Towards the end of the nineties I was a young computer science student standing on an almost deserted suburban above-ground Underground station, waiting for the next train into the centre of London.

A very old but brisk and snappy gentleman in traditional gentleman's clothes came onto the platform, and for some reason we nodded genially as you do when you pass strangers on country lanes; in London you ignore absolutely everyone, so it was strange we did this.

And there we stood, waiting for the train. He offered me a peppermint from a small metal tin. Of course my parents had taught me never to take sweets from strangers, so I instinctively declined, even though I'd now left home and thought of myself as a grown-up.

Anyway, we got talking. At first it was 'what are you studying?' kind of polite interest. He was intrigued that I was a computer science student.

I've gotta underline again how unusual this was, for anyone in London to ever talk to another stranger, but this really happened. I don't think I ever acknowledged there was anyone else in London, ever. It wasn't even normal to know the names of your neighbours.

And so we sat beside each other on an underground train and he told me about his early years as a programmer. Did I know APL, he asked? I had never heard of it! His veiled references suggested he worked with but not for Cheltenham.

And he delighted in telling me programmer war stories from when he did commercial stuff, like how a vendor had encrypted their APL programs and he had reverse-engineered it, waved the decrypted source code in front of the firm's rep, and complained about shoddy code and bugs! At the time I was tackling Simon Singh's Code Book contest, so that kind of story caught my imagination, and I began to wish I'd have that kind of chance in my programming future.

Now, in his old age, he went from his home in the north of England down to London once a week for a charity meeting he chaired. That's the kind of thing respectable old gentlemen do to give their lives meaning, I guess.

Anyway, afterwards I searched the web for APL (this was before Google) and was, of course, gobsmacked.


For non-Brits, "Cheltenham" = GCHQ


For non-Brits, GCHQ = the British intelligence service, roughly equivalent to the US NSA. (This non-Brit had to Google.)


My hobby :

Showing people in the bay area this photograph and asking them if they've seen the new Apple HQ

https://en.wikipedia.org/wiki/Government_Communications_Head...


a.k.a "The Doughnut" because of the shape of the building: [0]

// For non-Brits, 'doughnut' = 'donut'

[0] https://en.wikipedia.org/wiki/The_Doughnut


Ah, sorry. I figured that after all the Snowden leaks the intelligence service acronyms were widely known on HN. GCHQ was particularly prominent in many of the leaks.


Speaking of gobsmacked, check out this use of APL notation to formally describe the IBM System/360. https://web.archive.org/web/20060813132807/https://www.resea...


If you're interested in APL to describe hardware, perhaps you're interested in a language based on APL to actually build hardware.

Many decades ago there was a hardware description language based on APL.

https://en.wikipedia.org/wiki/AHPL

I studied it as part of a university course circa 1975. Wiley published a typo-infested book about it.

Using Google verbatim for "a hardware programming language" provides a plethora of hits.

But then VHDL and Verilog happened and the rest, as they say, is history.


I, too, studied this in the late 70s. Interesting stuff.

Thanks for linking this. Looks very cool.


Your mint-taking joke made me laugh so hard I tipped my whole cup of tea onto my lap. I need gravity-proof teacups now.


There are interesting histories, like this one, that are apparently very little known in today's UK computer industry: https://books.google.com/books?id=Dhk9wHXfQMkC


A few years ago, discovering J and playing a bit with it, I went through phases: "this looks square and clean, a nice exercise in language design" -> "those guys take it way too seriously" -> "people seem to be using it for serious stuff, they are simply insane".

Coming back to it every now and then for kicks slowly showed me a few strengths. First, with the short verb names and clear semantics, J is the only thing I've found that brings, for me, the comfort of a specialised handheld calculator to the PC. That realization, quite independently of its array capabilities, is what got me really going with it, on a gentle learning curve. And you get it on your smartphone too; I eventually sold my good old HP48 friend.

From there I started using it for small things. When I was building a lift for my brother's Warré beehives, J let me scribble a few beam-resistance equations and use its solver and plotter to get good results easily. So I started using it for things that were too big for spreadsheets, but that I was too lazy to deal with in a "proper" language.

That requires a change of mindset regarding how you write and maintain programs. When I want J to do something in a program, I fiddle with it until I get the "sentence" right, and that's it. That sentence will never be modified again. If I need it to work differently, I'll rewrite it from scratch. Then find a good name for your sentence so that you can combine it with others, and suddenly you discover that you can write J like a nice DSL that can very satisfyingly read like plain spoken language.

I wouldn't stretch it though. I'm not passionate enough about it to invest what's needed for mastery. 80 lines of thin J is my largest. Beyond that, switch to something that helps with maintainability, but beware of the sudden and amazing code-size inflation !


> When I want J to do something in a program, I fiddle with it until I get the "sentence" right, and that's it. That sentence will never be modified again. If I need it to work differently, I'll rewrite it from scratch. Then find a good name for your sentence so that you can combine it with others, and suddenly you discover that you can write J like a nice DSL that can very satisfyingly read like plain spoken language.

I had exactly the same experience with Forth. It is very cool when you define a word as an almost-English sentence and all the data flows effortlessly through the stack, but I have to admit it can take a lot of time to get it right. I came to the conclusion that Forth is a contemplative language where you spend a lot more time thinking than typing.


I once got talking with an anesthesiologist who was taking care of a family member. Upon hearing I was a developer, he told me he was an APL programmer who spent most of his computer time developing Bible software. He regaled me with a few APL war stories, about how he could get so much more done with so much less code...

I was intrigued, and still am. But one thing I didn't catch -- how do you write all those non-ASCII symbols using a standard OS and text editor? Or does it require special tools?


Edit: btw, on https://tryapl.org «`» is the prefix key. `i10 ⟶ ⍳10

> how do you write all those non-ASCII symbols using a standard OS and text editor?

Modifier+KEY or PrefixKey KEY

All symbols are in Unicode. No special tools required.

        ]KEYB
  US Keyboard Layout:

  ╔════╦════╦════╦════╦════╦════╦════╦════╦════╦════╦════╦════╦════╦═════════╗
  ║ ~  ║ !⌶ ║ @⍫ ║ #⍒ ║ $⍋ ║ %⌽ ║ ^⍉ ║ &⊖ ║ *⍟ ║ (⍱ ║ )⍲ ║ _! ║ +⌹ ║         ║
  ║ `◊ ║ 1¨ ║ 2¯ ║ 3< ║ 4≤ ║ 5= ║ 6≥ ║ 7> ║ 8≠ ║ 9∨ ║ 0∧ ║ -× ║ =÷ ║ BACKSP  ║
  ╠════╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦═╩══╦══════╣
  ║       ║ Q  ║ W⍹ ║ E⋸ ║ R  ║ T⍨ ║ Y¥ ║ U  ║ I⍸ ║ O⍥ ║ P⍣ ║ {⍞ ║ }⍬ ║  |⊣  ║
  ║  TAB  ║ q? ║ w⍵ ║ e∈ ║ r⍴ ║ t∼ ║ y↑ ║ u↓ ║ i⍳ ║ o○ ║ p⋆ ║ [← ║ ]→ ║  \⊢  ║
  ╠═══════╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩═╦══╩══════╣
  ║ (CAPS   ║ A⍶ ║ S  ║ D  ║ F  ║ G  ║ H  ║ J⍤ ║ K  ║ L⌷ ║ :≡ ║ "≢ ║         ║
  ║  LOCK)  ║ a⍺ ║ s⌈ ║ d⌊ ║ f_ ║ g∇ ║ h∆ ║ j∘ ║ k' ║ l⎕ ║ ;⍎ ║ '⍕ ║ RETURN  ║
  ╠═════════╩═══╦╩═══╦╩═══╦╩═══╦╩═══╦╩═══╦╩═══╦╩═══╦╩═══╦╩═══╦╩═══╦╩═════════╣
  ║             ║ Z  ║ Xχ ║ C¢ ║ V  ║ B£ ║ N  ║ M  ║ <⍪ ║ >⍙ ║ ?⍠ ║          ║
  ║  SHIFT      ║ z⊂ ║ x⊃ ║ c∩ ║ v∪ ║ b⊥ ║ n⊤ ║ m| ║ ,⍝ ║ .⍀ ║ /⌿ ║  SHIFT   ║
  ╚═════════════╩════╩════╩════╩════╩════╩════╩════╩════╩════╩════╩══════════╝


Unfortunately, HN's markdown (or maybe my Firefox's rendering of it) is messing up your beautiful ASCII graphic. I copy/pasted it in my editor and it looks fine. Thank you for taking the trouble to post this.


It works just fine for me with Firefox. You may want to check the monospaced font in preferences (General tab -> Language and Appearance). HN just uses the default monospaced font, and swapping through my fonts I found a decent number of fonts that did not render it correctly.

You'll need to click OK to dismiss the settings popup to apply the change, but the page doesn't have to be reloaded.

You may also want to un-check the "Allow pages to choose their own fonts, instead of your selections above" box, my experience is that just using a fixed sans + serif font (GNU Freefont for me, though Apple's San Francisco is also good for sans) makes many websites look better, makes very few websites look worse, and makes everything look consistent.


I'm reminded that it's possible to be quietly weird and brilliant, in the world that moves on in ignorance of the latest JavaScript shit, kubernetes shit, database shit, ... Thanks for sharing!


Most of the vendors (Ex: Dyalog) sell keyboards with the markings on them. It isn't necessary though as you'll actually pick them up decently fast. There is so much less typing that you can also just pick them out of the IDE's toolbar kind of like you do on the tryapl.org website.


You use a font that has those symbols and then Alt key combinations to make the characters.


Haha, this brings back memories. In college days, this was our way to hide files: we'd name the files with these symbols, so you had to know the key combination to open them. Note that this was the pre-internet, DOS-prompt era; no Windows, so you couldn't just click on the file.


Wow... How about on a Mac? Option and Shift-option keystrokes?


Yes. I have the free but non-open-source APLX running on my iMac, and have the image of the keyboard map they supply loaded in Preview.

Many of them are mnemonic, such as iota being (key)-I.


Free because the APLX vendor eventually decided to focus on their legacy software business and get out of selling an APL interpreter. I think they essentially needed a rewrite or just a lot of work on the product and couldn't justify it. Dyalog agreed to host the old and unsupported APLX interpreter on their site if anyone wants it.


I'm wondering what about APL made it a good choice for "Bible software", or in fact what such software even does, other than searching?


He said he did a lot of data extraction, processing, and exporting. Sort of an ETL process that pulls stuff together from different sources and exports to different formats (I think).

For each verse in the Bible there's a ton of different types of data that can be pulled together and provided in a UI - stuff like translator notes, Strong's numbers and lexical parsing of source words (i.e. part of speech, tense, voice, person, declension, mood), a lexicon entry for each word, cross-references, ... and that's just at the translation level. Bible software also usually provides multiple translations, pre-built search indexes, commentaries, study notes, theological texts, and collected sermons and books by historical luminaries.


Wouldn't Bible software be mostly concerned about strings and/or UI? Putting scripture into arrays and matrices sounds a bit... TempleOS-ish.


APL (or at least J) actually works quite well for string processing.


What's Bible software?


Software to assist people with Bible study. It does things like search, cross referencing, showing different translations in parallel, annotations, study notes, suggesting passages with similar themes etc.


I'd like software like that for reading classic literature. It often has many translations, study notes, and commentaries, and is available in multiple languages, etc.

If it were easy to do, you could use it to learn new languages by reading in multiple languages simultaneously.


APL and Prolog/Datalog are local optima in programming languages. Once you enter their domains, you are limping in every other language. It's almost better not to know about them, so you can avoid the constant "there is a better way" in the back of your head.

I think we should just bite the bullet and make them into standard domain specific languages available in every language and environment.


What domains are they used in? I've learned a little bit of J but got put off because most people involved with J seem more interested in using J for the sake of using it rather than actually doing anything substantive with it. My impression is that APL is similar (at least currently) but I'd love to be wrong.


I got into J in the '00s; it was interesting. The community was making substantive stuff, but they were making it for themselves more than for anyone else. I noticed a lot of teachers on the mailing lists. They were making tools that helped them track grades and attendance. They were generating quizzes and worksheets from question banks that they had produced over the years. They were creating puzzles for their students. All of this was substantive, but it was focused on their own needs.

J (and APL) really seemed to be an extension of what computing could've been. As a computer scientist and programmer, any language and OS works for me. I want something, I make it. I can write an app in C or C++ or Java or Lisp or Erlang or... because I have the background for all of that. Most other users have to find a program that accomplishes their need, or maybe they do it in Excel or Google Sheets. But in the past we had an idea of interactive languages that were the main interface for the user. The user could then produce the things they needed to meet their needs. They didn't need me, the CS guy; they could produce at least a functional prototype that met the need for the day. Then over the years they'd tweak it, add more, grow it into something comprehensive, or it reached a natural limit (due to complexity, or to meeting requirements).

J and APL are from that branch of computing. We've hidden it in our OSes (to varying degrees), encouraging people to buy programs that accomplish X rather than providing them an environment to produce their own programs to do X. The APL family of languages makes it front-and-center about the user accomplishing their needed computations and activities.


I think that we actually already have something much like the alternate world you describe - linux distros.

The posix shell language is amazing at what it does. There is no language in existence in which it is as easy to fork and exec two programs with different collections of arguments and connect their standard streams together. In fact, it's so easy to do those things that these languages have been universally adopted, bare and raw, as both the first and last user interface that experts reach for when operating a computer. I log in to a posix shell, I use it to find files and launch programs and deploy software and talk to people, and I depend on it when civilization has fallen and I need to re-establish operation using only what I can scavenge in the howling emptiness of my bootloader's recovery console.

At the same time, these shell languages are in many ways objectively terrible. They're practically made of subtle unexpected behaviors, their provisions for modularity are terrifying, and they're so deeply baked into so many things that they're past ossified and well on their way to fossilization. Google's style guide for bash [1] draws the line at 100 lines of code - if you have that much logic you need to switch to a better-suited language now.

But there are situations where people couldn't follow that rule. As pointed out by the Oil Shell project [2], most linux distros are almost entirely sh codebases. And I think that these give us a good view into the kind of alternate world you're looking at. They grew that way because they are the result of hundreds or thousands of people over the course of decades using the language as their main interface, building functional prototypes interactively and then tweaking and packaging and adding functionality until they became comprehensive or capped out. And then people adopted those tools and they became part of the operating system. I guarantee you that /etc/rc can trace its origins back to some angry sysadmin deciding that this whole "booting the operating system" thing was too much work and needed to be automated.

Personally, I think that there's a reason we went the direction we did. In some ways, languages like APL and sh are neat. I can, in fact, use bash as a core interface for sysadmin tasks. But at the same time, I've seen some pretty horrifying code in my time as a programmer, and I find shell scripts in /etc/ that near the top of the list. These hyper-expressive interactive languages are fantastic for their purposes at the cost of their performance in other domains. It might take me an hour to get twenty lines of bash working safely and reliably, and every moment of it I'm thinking to myself, "there's a better way to do this, I should just use $other_lang".

1: https://google.github.io/styleguide/shell.xml?showone=When_t... 2: http://www.oilshell.org/blog/2018/01/15.html


APL is for numerical programming with vectors, matrices and so on.

Prolog or Datalog is best when you deal with logic, complex rules, deduction, configurations, graph manipulation, or search. For example, Microsoft used to have embedded Prolog that chose the right combination of libraries and drivers when you installed their OS.


Yep, Prolog is pretty great for that kind of thing. I'm working on a project right now where I'm using Prolog (via tau-prolog) in the browser to determine permissible input options based on the user's existing selections. I'm also using it (via https://github.com/mndrix/golog) to apply those restrictions on the server, and to calculate the pricing for a configuration. In most cases I'm generating Prolog code from a collection of tick boxes, but for complex campaigns I can dig in and write a specific implementation of the required rules. Very flexible.


When I was in engineering school an APL programmer friend came over to where I was working and looked at my MATLAB homework. She eventually stated that MATLAB was a very bad APL. She then proceeded to solve my homework in a single line of MATLAB.


MATLAB doesn't deserve the moniker of a "bad APL". It's a bad FORTRAN.


In old computer magazines from the 80's and before, it was a trope to have an article about some technique or algorithm, and then show it implemented in one line of APL.


> She then proceeded to solve my homework in a single line of MATLAB

To be fair, though, was that single line as long as a normal screen of code? I have not decided whether it's a good thing or a bad thing, but APL/J/K programmers have some very interesting ideas about how much is acceptable to put on a single line ;)


For anyone interested in APL, take a look at K, which is essentially APL restricted to the ASCII character set. There's also q (aka kdb), which adds some database-like functionality (among other things); however, it's proprietary and "free".


There’s also J, also created by Kenneth Iverson.

http://www.jsoftware.com/


Quote about motivations behind J, found in Kenneth E. Iverson's Wikipedia page:

"""

When I retired from paid employment, I turned my attention back to this matter [the use of APL for teaching] and soon concluded that the essential tool required was a dialect of APL that:

• Is available as "shareware", and is inexpensive enough to be acquired by students as well as by schools

• Can be printed on standard printers

• Runs on a wide variety of computers

• Provides the simplicity and the generality of the latest thinking in APL

The result has been J, first reported in [the APL 90 Conference Proceedings]

"""

This is how you'd compute a moving average (taken from its Wikipedia page [0]):

    avg=: +/ % #
    avg 1 2 3 4
  2.5
    v=: ?. 20 $100     NB. a random vector
    v
  46 55 79 52 54 39 60 57 60 94 46 78 13 18 51 92 78 60 90 62
    avg v
  59.2
    4 avg\ v            NB. moving average on periods of size 4
  58 60 56 51.25 52.5 54 67.75 64.25 69.5 57.75 38.75 40 43.5 59.75 70.25 80 72.5
Pretty cool.

[0] https://en.wikipedia.org/wiki/J_(programming_language)
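For readers without a J interpreter handy, here is a rough Python/numpy sketch of the same computation (my translation, not from the Wikipedia article):

```python
import numpy as np

# avg =: +/ % #   (sum divided by count)
def avg(v):
    return np.sum(v) / len(v)

print(avg(np.array([1, 2, 3, 4])))  # 2.5

# 4 avg\ v   (moving average over windows of size 4),
# here done with a uniform convolution kernel
def moving_avg(n, v):
    return np.convolve(v, np.ones(n) / n, mode="valid")

print(moving_avg(4, np.array([46, 55, 79, 52, 54])))  # [58. 60.]
```

Note the difference in composition: J applies the same avg verb over windows with `\`, while the numpy version reaches for a different mechanism (convolution) to get the sliding window.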


If you want to try kdb, there are two free noncommercial versions:

- 64-bit/16-core, sends "I'm alive" packets back to Kx: https://ondemand.kx.com/

- older 32-bit, standalone, very small executable: https://kx.com/download/

The Wikipedia page has history and links to various docs: https://en.m.wikipedia.org/wiki/K_(programming_language)


Last time I searched for kdb, I found that it's heavily used in finance. Is that still the case?


Speaking as a kdb programmer in finance (although for an unusual use case), most big (and some small) banks have an Enterprise license and have one or more teams that maintain a market data store and/or an order reporting/reconciliation/PNL stack.

There are a decent number of hedge funds that use it for more or less the same purpose but work with other data sets in addition to market data

For storing and exposing market data, you need very little additional code. Maintenance is a different story.


What's the unusual use case?


Doing analytics on streaming packet-capture data. Probably not as unusual as the Red Bull F1 racing team using it for wind-tunnel analytics.


It really depends on what you're emphasizing in the question.

Are a sizeable number of Kdb's clients in the finance industry? Yes, it seems they are

Is Kdb common or widely used in finance? No


It has a niche (time series analysis of equity data) where its performance is valued. Kdb developers are rare and expensive so it gets used for specific tasks rather than as a general purpose db.


Not just equities but other asset classes too. I've seen it used for persisting Eurex order book state for futures contracts. London contract rates for KDB devs seem to be 1000GBP/day.


In my experience it is rarely used on the buy-side but often used on the sell-side: most investment banks have at least a couple of people using it.


BitMEX, the cryptocurrency derivatives exchange, has its trading engine written in kdb+/q.


https://www.dyalog.com/uploads/documents/MasteringDyalogAPL....

I knew the text felt familiar: the same author wrote the above-linked book, and the intro covers the same material. If you're interested in APL, that book was really useful to me last year (NB: not a professional, just liked playing with it).


“This example shows clearly that there are ways of reasoning other than those which have dominated information processing for 40 years but they are, alas, still extremely rare. This difference and originality, introduced by APL, are major features. They typify the open and welcoming intellectual spirit of the people who practise it.” — FWIW, this example (+/ Salaries × Rates[Categories]) can be rewritten nearly word for word in R (sum(Salaries*Rates[Categories])), so at least some of the ideas still have some currency.
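For comparison, the same expression sketched in Python with numpy (hypothetical arrays, just to illustrate the index-then-multiply-then-reduce shape):

```python
import numpy as np

# Hypothetical data: one salary and one category index per employee,
# and one commission rate per category.
salaries = np.array([1000.0, 2000.0, 1500.0])
categories = np.array([0, 1, 0])      # index into rates
rates = np.array([0.10, 0.20])

# APL: +/ Salaries × Rates[Categories]
total = np.sum(salaries * rates[categories])
print(total)  # 1000*0.10 + 2000*0.20 + 1500*0.10 = 650.0
```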


> (sum(Salaries*Rates[Categories]))

This actually looks like Excel / 1-2-3, which is an array-based programming language that people look down on because it enables non-experts to build programs and therefore the programs are bad.


No, I do not look down on Excel because it enables non-programmers to program. I look down on it because Excel programs become an unmaintainable mess. They are hard to reason about because they do not separate code and presentation. You can never get a nice overview of your Excel programs.


The example you gave was overly simplistic for a comparison. It's like running a single test as a benchmark. Go look at Aaron Hsu's Co-Dfns parallel APL compiler and his discussions on HN and YouTube. I don't think you could easily do an R translation that isn't 10x longer.


It was the example given by the original author. I probably should have fitted it into the quote.


Gotcha. I apologise btw if I sounded like a jerk (hard to tell tone on a forum). I was just trying to relay what little I've determined by playing with these languages.


I would say that pandas and R can pretty much express everything that APL/J/Q can. They are a lot slower, though (than Q at least; I haven't used J or APL).


Basic operations, yes. Composability, not quite.

I don’t remember enough APL, so my examples are K:

Maximum subarray sum: |/(0|+)\

Flatten: ,//

Rank: <<

In these examples, every single character is its own operation, and they compose to give another useful operation.

Any Turing complete language can express what any other Turing complete language can (by definition). And pandas/r do provide APL’s vector and matrix operations. But not the composability, which IMHO is just as important to what this family brings to the table.
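For those who don't read K, here is my reading of those three expressions sketched in Python; the point is how much scaffolding appears once the operations no longer compose as single characters:

```python
from functools import reduce
import numpy as np

# |/(0|+)\  -- maximum subarray sum (Kadane's algorithm):
# scan with acc = max(0, acc + x), then max-reduce the scan
def max_subarray_sum(xs):
    acc, best = 0, 0
    for x in xs:
        acc = max(0, acc + x)
        best = max(best, acc)
    return best

print(max_subarray_sum([1, -2, 3, 4, -1]))  # 7 (subarray [3, 4])

# ,//  -- flatten: concat-reduce, repeated until nothing changes
def flatten(xs):
    while any(isinstance(x, list) for x in xs):
        xs = reduce(lambda a, b: a + (b if isinstance(b, list) else [b]),
                    xs, [])
    return xs

print(flatten([1, [2, [3, 4]], 5]))  # [1, 2, 3, 4, 5]

# <<  -- rank: grade-up (argsort) applied twice
def rank(xs):
    return np.argsort(np.argsort(xs))

print(rank(np.array([30, 10, 20])))  # [2 0 1]
```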


Well every pandas operation returns a DataFrame so that’s pretty composable.

No doubt APL-likes are much nicer than pandas, but in terms of operations, I believe they support the same operations.


, is a function, not dissimilar to pd.concat and indeed, x,y is very similar to pd.concat([x,y])

/ is an operator. It's similar to functools.reduce.

,/ is a function, not dissimilar to pd.concat, ,/x is pd.concat(x) which is a little weird. It's tricky, but not impossible to implement functools.reduce(lambda x,y: pd.concat([y,x]), x) but I might've gotten the order of the arguments wrong.

,// is also a function. I think you can do pd.concat(x).values.flatten(), but this obviously doesn't compose. I don't see how you can implement it with another functools.reduce but maybe functools.reduce operates on dataframes after all, so maybe it's just functools.reduce(lambda x,y: pd.concat([y,x.values.flatten()]), functools.reduce(lambda x,y: pd.concat([y,x]), x)) but I have no idea: pd.concat(x).values.flatten() is shorter so I can't imagine anyone would prefer trying to make this monstrosity work.

Being "much nicer" is everything. If the syntax and the rules for the syntactic elements mean nothing, and you are fine with merely the ability to write the transformations, then you haven't thought about this clearly: VB6 is the same as pandas, since it too supports the same operations (or they can be implemented, as they can be in any Turing-complete, and many non-Turing-complete, languages). But I find this line of thinking quite insulting: pandas is much better for data programming than VB6, and, whether you know it (or like it) or not, APL and its ilk are much better than pandas for data programming.


The second / is converge, rather than reduce - I don’t think python (or pandas) has it in a common library - and even if it did, usage would be ultra clunky. Your description is excellent, just wanted to make it more precise.
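Converge is simple enough to sketch in Python, which also shows why ad-hoc usage there is clunky compared to a one-character operator:

```python
def converge(f, x):
    # Apply f repeatedly until the result stops changing (a fixed point)
    while True:
        y = f(x)
        if y == x:
            return y
        x = y

# e.g. repeated integer halving converges to 0
print(converge(lambda n: n // 2, 100))  # 0
```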


I think you more eloquently stated what I tried to do below.


This is like saying any Turing-complete language can simulate everything another Turing-complete language can. A decent-sized idiomatic APL program will be very different from the same program written in Python using only numpy, even if the outputs are the same.


Do you mean pandas? numpy isn't very similar to APL-like functionality at all. On the other hand, pandas is very similar, though much, much less elegant.


Yes and no. Pandas does the data-frame stuff and various calcs, but you need numpy to do things like invert a matrix, unless pandas supports that as well. APL combines a lot of that in a single coherent package.


Another downside, I guess, is that they are wildly inelegant. R is inelegant at its core, despite all the nice functionality, and pandas, IMO, is a rather ugly addition to regular Python syntax, despite all its nice functionality. So people still yearn for things pure and beautiful.


Pandas looks like R from around 2010.

Now, granted, I can see how R is perceived as inelegant, but I strongly believe that this is driven by unfamiliarity (for many people).


I'm never quite sure what to make of APL. It feels fantastically advanced for its time. Hardware keyboards? Performance?


When I was a graduate dev I worked for a company whose primary product was an APL interpreter. So it was an interpreted language, but it may well be possible to compile or JIT it these days. The interpreter was written in 68K assembler and was pretty fast, I suppose, but then I'm not aware of any benchmarks to compare it with, since APL is not like anything else (except, as someone pointed out, maybe MATLAB).


It seems to me APL and family are mostly useful for numerical work on data that is possibly queried (efficiently, at that) from a database. A very naïve way to put it would be to say it's a glorified Excel. Is there something I'm missing? Are there other domains one could use it for?


Although the "glorified Excel" statement makes me cringe, I would agree that APL's strong point is numerical work on data. I wouldn't use APL to write, for example, an operating system.

When Ken Iverson wrote APL in the sixties, numerical work was one of the big uses of computers (after the typical finance applications such as billing, inventory, accounts payable/receivable, general ledger), and APL filled that niche quite well. When I was working at IBM in the nineties investment banks were still using APL for some of their modeling.

APL also took a different evolutionary path from many of today's languages; operator precedence, for example. APL's rule was simple: evaluate right-to-left, with each function taking everything to its right as its right argument. I didn't need to remember that exponentiation has a higher precedence than multiplication, which in turn has a higher precedence than addition (those were the easy ones to remember; the hard ones were bit operations and logical operations; some languages had both high-precedence logical operations ("&&" and "||") and low-precedence ones ("and" and "or"), which further muddied the waters). I found Reverse Polish Notation (RPN) the most elegant approach to operator precedence, with APL's a close second.
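APL's single rule is easy to demonstrate: each function takes the value of the entire expression to its right as its right argument. A toy Python sketch (hypothetical token-list representation, + and × only):

```python
# Evaluate a flat expression APL-style: no precedence table;
# the right argument of each function is everything to its right.
def apl_eval(tokens):
    if len(tokens) == 1:
        return tokens[0]
    left, op, rest = tokens[0], tokens[1], tokens[2:]
    right = apl_eval(rest)          # evaluate the whole right side first
    return left * right if op == '×' else left + right

print(apl_eval([2, '×', 3, '+', 4]))  # 14: APL reads 2×3+4 as 2×(3+4)
```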

Another difference was the way code was maintained. For example, my father, who was also an APL developer, often said, "APL is write-only code". Rarely would one modify someone else's code; instead, you'd rewrite the function entirely. Thus the most important aspect of writing APL code was documenting what the function took as input and what it spat out.

But I'm glossing over one of the most important aspects of APL: it made you think about the problem space differently: instead of nested for-loops, you'd think of vectors and matrices. The manner in which you thought about the problem was incredibly succinct, similar to APL's vocabulary.


There was an extremely impressive video a year or two back of someone who had built a compiler in APL. While I don't code in it, it changed my view of APL and its power.

HN was fairly caught up with it for a month or so, and there was a flood of APL posts.

Someone may have a link to it.


Just look for Aaron Hsu Co-Dfns on GitHub, talks on YouTube, and on HN...I think his username here is arcfide. Look for his submissions. He's probably 100x better at coding than me and never leaves notepad.


Historically it was spreadsheets that killed (financier) interest in APL (which was going gangbusters at the time); they were much easier for that crowd. APL does actually work for non-numerical stuff. There is the famous example of the IBM System/360 being formally described in a few pages of APL. The model was actually used at IBM, where APL was heavily used at the time. Iverson went on to shepherd J, a much more sophisticated language. K, another more modern offshoot, has in the meantime taken over the financial (trader) world.


Those who have experience with array languages, is it fair to say that something like numpy captures most of the benefits of APL/K?

If Python allowed “/” as a reduce operator, numpy would be able to get even closer to APL, correct?
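For what it's worth, numpy already exposes reduction as a method on its ufuncs, which gets fairly close in spirit to APL's `+/` even without operator syntax (a sketch; the variable names are mine):

```python
import numpy as np

v = np.array([1, 2, 3, 4])

# APL's  +/v  (sum-reduction) maps onto a ufunc's .reduce method
total = np.add.reduce(v)         # equivalent to v.sum() -> 10

# APL's  ×/v  (product-reduction)
product = np.multiply.reduce(v)  # equivalent to v.prod() -> 24
```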


In terms of semantics, sure. However, a real part of the charm of APL (and J, and K) is in the syntax, and in how notating your program causes you to think about it differently. Something like the or/and outer product in the game-of-life one-liner is very straightforward APL, but is much clunkier to write in numpy.
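For comparison, here is one common numpy rendering of the same idea (a hedged sketch, not taken from the APL one-liner: it counts neighbours by summing shifted copies of the board rather than via an outer product, and `life_step` is a name I've made up):

```python
import numpy as np

def life_step(board):
    """One Game of Life generation on a toroidal (wrapping) boolean grid."""
    # Count live neighbours by summing the eight shifted copies of the board
    neighbours = sum(np.roll(np.roll(board, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    # A cell is alive next step if it has exactly 3 neighbours,
    # or is currently alive and has exactly 2
    return (neighbours == 3) | (board & (neighbours == 2))
```

Perfectly serviceable, but it reads as a recipe; the APL version reads as a single notated thought.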


I just found that there's a Julia package for at least some APL syntax, perhaps an easy way to play. [3] is a fork updated for v1.0:

[1] https://www.youtube.com/watch?v=XVv1GipR5yU [2] https://nbviewer.jupyter.org/gist/shashi/9ad9de91d1aa12f006c...

[3] https://github.com/MasonProtter/APL.jl


I still see two things missing from currently available APL dialects: being able to compile programs, and being able to create a control loop (which is required for continuous access to data input).

I wonder how much effort would be required to use, say, GNU APL for compiling the language?


As someone who actually used APL professionally for a decade I am always surprised by how often the language surfaces on HN. I used it in a wide range of domains, from business and engineering applications to DNA sequencing.

The main comment I want to make is that anything that abandons notation is an abomination. The power of APL comes from the use of notation as a tool for thought and the expression of ideas. Notation is crucially important in this regard. Language transliterations like J were huge mistakes in APL’s historical timeline.



In college I learned APL. Most of my colleagues complained that we were learning it instead of something with more practical use or market value.

APL taught me it was possible (and remarkably easy) to write "write-only" code and how delicate legibility can be. Most of my APL code had 4 lines of comments for each line of code.


When I learned APL (which I never used afterwards), I found that when there was a bug, it was faster to rewrite the whole line a different way than to debug it.

A truly 'write-only' language...

It shows in the article: the 'min' operation works both with an array and a scalar. Nice when you mean it, not so nice when you don't.
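numpy's `minimum` behaves analogously, which makes for a quick illustration of that double-edged sword (a sketch of my own, not from the article):

```python
import numpy as np

prices = np.array([12.0, 7.5, 9.0])

# Scalar argument: clamp every element to a cap -- handy when intended
capped = np.minimum(prices, 10.0)   # -> [10.0, 7.5, 9.0]

# Array argument: elementwise minimum -- silently accepted even when you
# meant to pass a scalar, which is where the surprises come from
floors = np.array([11.0, 8.0, 8.5])
lowest = np.minimum(prices, floors)  # -> [11.0, 7.5, 8.5]
```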


Every time APL comes up, I — as a non-APL programmer — am compelled to post this fun video about implementing Conway's Game of Life in APL: https://www.youtube.com/watch?v=a9xAKttWgP4


Quoting from the article:

>> To get a similar result by means of a traditional computer language requires many instructions, which hides the object of the calculation behind arcane programming. Here, for example, is what one would write in PASCAL

It seems that the article emphasises how APL can get a lot done with little code. I know of a programming language that can help you do that: Rust.

Rust has two powerful features:

1- Traits: You basically declare a "struct" type that holds your data, implement the Mul/Add traits for it, and then, boom, you can multiply these seemingly complicated objects with *.

2- Macros: You can abstract repetitive code with macros. For example, if your code has to display a table on the terminal over and over, you can either create a function to display your data or have a macro do it.


Is there a good open-source APL-like language somewhere?


GNU APL is a perfectly reasonable APL implementation.

There are open source implementations of some K versions - Kona, which is K3, and oK, which is ~K5.

J is available under GPL3.


So, is there a Machine Learning Docker with APL pre-installed?



