APL deserves its renaissance too (wordsandbuttons.online)
182 points by okaleniuk on May 28, 2018 | 118 comments



Neat little tribute article.

APL will not be reborn. If that was going to happen, this century's embrace of analytics and linear-algebra-rich machine learning would have propelled the upswell. It didn't and it won't.

Why? It's not the symbols. Not the keyboard. Nor the learning curve. Nor the lack of standardization, libraries, or GPU support. These are collateral damage, not primary drivers.

APL will not flourish because it is effectively a DSL with archaic I/O capability. It is a nearly-pure functional programming language. Its primary domain is mathematics. Native GUIs, networking, the web -- all uncompetitive add-ons.

In this sense, APL and Haskell share some DNA.

For dynamic systems modeling for my thesis, I reached for APL and Matlab. I still code Dyalog APL for recreation. Love it. But I wouldn't build a web app with it and I wouldn't run my company on it.


I like your take, and Haskell is an apt comparison in this sense.

Another sense in which I think APL is... un-competitive is that its strengths (terse, mathematical, matrix-centered) are the easy parts of programming, i.e. the parts that are the most objectively quantifiable and verifiable (you just use math).

Accounting for numbers is trivial, accounting for human factors is a proverb.


I agree that Haskell is a good comparison. I think matrices and math are too closely focused on in array languages (and dismissals of array languages), though.

For fun I'm writing a "spreadsheet programming language" that borrows a little from array languages, and I think there's useful stuff to take away from them.

They make you focus on the data, and the transformations you want to effect on the data.

They make you think, "where there's one, there's probably many."

The implicit mapping over arrays for most functions and operators is a massive ergonomic win over "a formula for each row" in the spreadsheet model. (Excel "array formulas" but not shit.)
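
(If it helps to see the idea outside spreadsheets, here's a rough sketch of the same "one expression covers every row" behaviour in Python with Numpy -- illustrative only, not my language:)

    import numpy as np

    prices = np.array([10.0, 12.5, 7.0])
    qty = np.array([3, 1, 2])

    # One expression applies to every row at once -- no per-row formula,
    # no explicit loop or map call.
    totals = prices * qty        # array([30. , 12.5, 14. ])
    with_tax = totals * 1.2      # a scalar broadcasts over the whole array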

Some of the simple little abstractions are beautiful. Seriously, as someone who didn't know anything about array languages three weeks ago, I say go spend a day reading the first two parts of the J book (http://www.jsoftware.com/help/learning/contents.htm) up to "Rank". It's this humble little operator that makes the whole language sing and fit together masterfully, and I'm heartbroken my language's data model (and programming style) isn't a good fit for it.

Really, though, for me being able to have my (still hypothetical) non-programmer users type

  organisations[users.orgId]
instead of

  users.map(({ orgId }) => organisations[orgId])
is a no-brainer. Ditto just about any place I might want to use an anonymous function in JS, really:

  sort(people, by: people.age)
vs

  sort(people, ({ age }) => age)
So simple. Pass an array of things to use as the sort key, not "a function to call on each item-to-sort".
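
(A rough Python/Numpy rendering of "sort by a parallel array of keys", with made-up data, in case the spreadsheet syntax is too hypothetical:)

    import numpy as np

    ages = np.array([42, 19, 31])
    names = np.array(["carol", "alice", "bob"])

    # argsort gives the permutation that sorts the key array;
    # indexing any parallel array with it reorders that array by the key.
    order = np.argsort(ages)
    print(names[order])    # ['alice' 'bob' 'carol']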

Forget the cryptic syntax, "operator ambivalence" and math/matrix-focus, there's a lot of useful stuff in there that should be better known in language circles.

As for the grandparent's complaint of being "nearly-pure functional", that's the zeitgeist now -- functional transformation of data, with big ugly explicit data-replacement as the mutation story. That's not an impediment to use, it's an encouragement to use responsibly while not being too much of an encumbrance if you want to walk off the beaten track.


I'm not massively keen on the former (IMO having to write .map etc helps you get a better "feel" for what parts will have the biggest impact on performance), but the latter looks an awful lot like lenses, which are pretty great.

Of course, a lens is really just a pair of functions in disguise, so it wouldn't prevent you from sorting by more complex functions if needed.


You can do that last one pretty easily in JS with a helper function.

    function by (key) {
      return function (obj) {
        return obj[key]
      }
    }

    // sort(people, by('age'))


Not a bad solution. Having a top-level `by` function isn't a bad tradeoff if it's easily understood. For me there are two problems, though. Most importantly:

1. It doesn't generalise past lookups. Say I want to sort a list of numbers by their absolute value. In my spreadsheet language:

  sort(numbers, by: abs(numbers))
or filtering by some threshold:

  filter(numbers, by: numbers > threshold)
Both are "repetitive" -- we might prefer something with first-class functions to get closer to a point-free style, but my users won't care. And the repetition isn't actually useless, there are times you might really want that flexibility:

  filter(users.id, by: users.age > 21)
2. My language isn't big on first-class functions. One day it might get support for them for some use-cases where you just need that flexibility[1], but in the meantime they don't provide a lot of value, and they're not a good fit for my hypothetical future users.

If I told my hypothetical users that `by` returns a function, their response would probably be, "What is a function?" The unit of logic abstraction in my spreadsheet language is... slightly weird from a traditional programming perspective.

Thanks again for commenting constructively, though -- seeing a new ergonomic way to approach this sort of problem (or even just "tricks that could become idioms") is really useful.

[1]: If you needed to provide a 2-arg comparator to `sort` then a lambda would be much better -- my scheme would require you to evaluate the cross-product of comparisons, so sorting would have to take time O(n^2), which is no good. Thankfully single-item key functions seem sufficient for most purposes.


If you think about it, a decent stack is composed of multiple languages. That's often already true today, e.g. the use of IDL languages like protobuf/flatbuffers. Some languages offer an integrated IDL, like Kotlin (data classes). Another example is HTML, where the UI is described using a DSL to specify elements and CSS for layout & appearance.

This made me think: shouldn't a stack also be made of other languages? Is there a place for a completely pure functional language? I mean, Haskell is nice, but gets a bit awkward with I/O. Same for APL. C# has nicely integrated data querying, but from a distance it's actually at least somewhat awkward. C# seems to be optimized for mutable domain objects; everything else can be done nicely but falls somewhat outside its "identity".

I would love to be able to express functions, functors and math using a terse math-y language. Whether that be APL or some sort of blend of Haskell and APL (Haspell? spelling pun intended), I don't care, but it would be great if we can have nice integrations of these languages in a full stack. Sort-of like how TypeScript and JavaScript have dialects to enable React syntax to express HTML within it.

The same thought experiment can be applied to SQL. Can we have a data-querying language integrated right into, say, C# or Java?


The language Racket (https://racket-lang.org) sort of approaches this. It allows you to quickly switch which compiler a given source file is actually sent into, the goal being that you solve each problem with sub-languages particularly suited to the task at hand. For instance, the webserver language has HTML built in as a primitive.


C# kinda does have a data querying language built right in with LINQ query expressions, although they seem to have rather fallen out of favor.


Indeed. But you can still write functional style data querying languages using LINQ extension methods. IMO, LINQ based DB querying is revolutionary in terms of the flexibility and capabilities it provides. For complex Relational DB querying it's pretty great.

I tend to write raw SQL using Dapper these days, because hey, scalability and such, but for business apps and non-cloud apps, EF and other LINQ-based DB tools are great.


Re: The same thought experiment can be applied to SQL

I am a fan of "Table Oriented Programming" and believe OOP forced an unfortunate shift from data-oriented programming to code-oriented programming. I hope the pendulum swings back. APL is intended for mostly scientific applications, but it has some nice lessons for other domains, and RDBMS may be a door into common data-oriented programming.

I also believe that "Dynamic Relational" is needed to improve data-oriented mock-ups, experiments, and small-scale projects. The existing RDBMS are too "stiff" for some needs. We have dynamic programming languages, so why not for databases? Yes, there are some dynamic query languages, but they require throwing out SQL and starting from scratch, which cranks up the re-learning curve. Dynamic Relational only requires minor changes to SQL. (SQL ain't perfect, but so far a decent replacement has yet to gain traction. I'm personally a SMEQL fan.)


The ML family (SML, Ocaml, F#) is also pretty good with the mixture of functional and imperative constructs.

Having a part of the program known to be pure at compile-time, but another part effectful is something you see in the language F* (http://fstar-lang.org) - here you get monads, dependent types, a proof system, and the ML module system neatly packed into a general-purpose programming language.

Not as widely popular as OCaml, F# or Haskell, but both as practical as OCaml and as researchy as Haskell.

Also there's a free tutorial/book on the website.


The heart of my argument is mostly that integration in main languages should be pursued. F# can be mixed on assembly (project) level, but not on source level (like css in html or html in js/ts).


Is there a reason one wouldn’t just build the core engine of SIMD math software in APL, much like people build the core engines of medical-system software in Prolog?


Is Prolog really used for medical-system software? It seems to me that it turned out to be the wrong path taken by Japan's Fifth Generation Project.

In the early 80's I experimented with Prolog. Simple ideas were simply coded into programs; it was really very interesting. However, the need to understand the compiler's inner workings to get programs that ran efficiently by manually inserting "cuts" to limit backtracking ruins Prolog's claim to being simple to translate requirements into programs.

An even more serious problem with Prolog is the simplicity of its model for logic. It is very easy to end up with requirements in the real world that don't easily fit the model. See for example the SHOOT problem in [1], where the concluding sentence is:

> As such, the claim implicit in the development of nonmonotonic logics--that a simple extension to classical logic would result in the power to express an important class of human nondeductive reasoning--is certainly called into question by our result.

Don't get me wrong, I was the Chief Scientist at a company with one Prolog-based product. I still think it's an interesting language, but I predict that it will remain eclipsed by much more general-purpose programming languages.

[1] https://www.aaai.org/Papers/AAAI/1986/AAAI86-054.pdf


The popular narrative that 5th-generation computing, expert systems, etc. failed to deliver because they were the "wrong path" to take is BS. Basically, hardware designers weren't used to non-Turing architectures, it was hard to find (and hire) programmers who fully understood both non-imperative paradigms and hardware integration, and government-funded science education and research (outside of black-budget defense and intelligence projects) started taking a nosedive during the 80s.


It is! You don't use it for diagnosis; you use it to formalize treatment guidelines (which are effectively 'expert systems' even on paper.) It runs alongside more ML-based models, glued together under business-logic written in something like Java.


You have got to give more details about this :)


>> In the early 80's I experimented with Prolog. Simple ideas were simply coded into programs; it was really very interesting. However, the need to understand the compiler's inner workings to get programs that ran efficiently by manually inserting "cuts" to limit backtracking ruins Prolog's claim to being simple to translate requirements into programs.

Where does this claim come from? Prolog is an automated theorem prover for first-order logic theories. There have certainly been various arguments about the pros and cons of it made by many different people (not all of whom were close to the development of the language), but the fact remains that its main characteristic is the automation of a proof process.

You can certainly find many flaws in Prolog if you look hard enough and choose your requirements carefully (for example- it has no functions! Why has it no functions? It should have functions! Therefore, it sucks).

I think a lot of the bad rep Prolog's got over the years is the result of promises by various sources that had nothing to do with its actual goal, which was to create a language with the syntax and semantics of FOL- which it achieves not perfectly, but way, way better than anything else out there.

>> I still think it's an interesting language, but I predict that it will remain eclipsed by much more general-purpose programming languages.

That's probably true, though :)


Yes, thanks for the correction. You are right, it is possible to learn Prolog without understanding the inner workings of the compiler or even the Warren Abstract Machine, but that is how I understood why I had to put up with the cut operator in my programs.

I was originally sold on the idea of Prolog because of the many beautiful declarative examples found in introductory tutorials. The sad truth is that Prolog isn't magic and just like Functional Programming, Object-Oriented Programming, or even Structured Programming, there are wrinkles that one encounters that require a less than pure language semantics. In Prolog's case the semantics must be understood procedurally by understanding the order of goal searching and backtracking.

From Clocksin and Mellish [1]:

> So the moral is: If you introduce cuts to obtain correct behaviour when the goals are of one form, there is no guarantee that anything sensible will happen if goals of another form start appearing. It follows that it is only possible to use the cut reliably if you have a clear policy about how your rules are going to be used. If you change this policy, all uses of the cut must be reviewed.

The cut operator in Prolog is used to make programs more efficient or to make the language more expressive[2]. So there is little alternative but to accept its use along with its confusing and error-prone semantics, which have meaning only in terms of the procedural semantics of Prolog[3]. From Clause and Effect, section 4.3 titled "Taming Cut"[4]:

> It is not easy to understand the full implications of using cut.

Prolog looks like it allows programming by specifying what is wanted rather than by specifying how to get to what is wanted. This appearance might even be partly justified, and it is certainly touted by Prolog's promoters. From the preface to Prolog for Programmers[5]:

> ... Prolog can be classified as a descriptive programming language, as opposed to prescriptive (or imperative) languages such as Pascal, C and Ada. In principle, the programmer is only supposed to specify what is to be done by his or her program, without bothering with how this should be achieved. ... In practice, however, Prolog can be treated as a procedural language.

I think this is misleading. In my mind it should end with the sentence: In practice, Prolog must be treated as a procedural language in addition to a descriptive language.

In their chapter devoted to cuts and negation, Sterling and Shapiro[6] say:

> With ingenuity, and a good understanding of unification and the execution mechanism of Prolog, interesting definitions can be found for many meta-logical predicates. A sense of the necessary contortions can be found in the program for same_var(X,Y), ...

This expects programmers to have a non-trivial understanding of Prolog's implementation. I was wrong to say that an understanding of the compiler was needed, but programmers do have to understand unification and the order of evaluation used by the language.

Finally, I believe that the cut operator also makes parallel processing implementations of Prolog very difficult. In this post-Moore's-law era this will cause problems for Prolog's performance without making more of the implementation leak through to the language semantics.

Logic and constraint programming are interesting techniques, and it's difficult to say where they will lead. Prolog is an important language in the history of programming languages. Perhaps Logic Programming will make a comeback aided by the new advances in automated theorem proving.

[1] Clocksin, W. F. and Mellish, C. S. Programming in Prolog. Springer-Verlag, 1981. p. 78.

[2] Bratko, Ivan. Prolog Programming for Artificial Intelligence. Addison-Wesley, 1986. p. 136.

[3] Roger, Jean B. A Prolog Primer. Addison-Wesley, 1986. p. 115.

[4] Clocksin, William F., Clause and Effect, Springer, 1997. p. 50.

[5] Kluźniak, Feliks and Szpakowicz, Stanisław, Prolog for Programmers, Academic Press, 1985. p xi.

[6] Sterling, Leon and Shapiro Ehud, The Art of Prolog, 2nd Ed. The MIT Press, 1994. p. 201.


>> This expects programmers to have a non-trivial understanding of Prolog's implementation. I was wrong to say that an understanding of the compiler was needed, but programmers do have to understand unification and the order of evaluation used by the language.

Actually, I'd go one step further and say that in order to make full use of Prolog you need to understand SLD resolution and definite clauses, otherwise you'll never really get what the hell is going on in there, or why the interpreter works the way it works. To program in Prolog you do need a non-trivial understanding not only of its implementation but also of the substantial theoretical work that led to its implementation as the prototypical logic programming language (e.g. see J. W. Lloyd, Foundations of Logic Programming).

The thing is, there is a lot more going on with Prolog than "here is a language that can be used to control a digital computer". Logic programming is the continuation of the work of the logicians of the 19th and early 20th centuries, the evolution if you like of mathematical logic. It has to be understood in that context- or not at all.

You make the point that there is an operational, or imperative, reading of Prolog code, that must be understood alongside the declarative one. That is correct, in my opinion, but I don't see why it's a problem. For one thing, programmers taught on imperative languages (i.e. the majority) should not really have a problem with the operational semantics of Prolog (e.g. what you see when you trace Prolog code in the four-port tracer, the order in which goals are called, succeed, fail and are retried etc). Understanding the semantics of the cut is maybe not trivial, but again that is a problem only if someone tried to sell you Prolog as an "easy" or "friendly" language; which it isn't.

More importantly, Prolog has its declarative reading also, which is something almost unique to it. So maybe Prolog is not a perfect, pure implementation of the declarative paradigm- so what? Should we throw the baby out with the bathwater and say, hey, Prolog is only 99% declarative, so I'll just stick with Java, which is 0% declarative? That doesn't make much sense! At least not if you somehow want to use declarative programming- if not, then Prolog is not an option anyway.

>> Finally, I believe that the cut operator also makes parallel processing implementations of Prolog very difficult.

You mean parallel execution of Prolog code? Not parallel implementations of the language? I don't see that the cut is a problem there, but then again I don't have any experience trying to parallelise Prolog code. There is again a promise that Prolog's execution scheme is uniquely disposed to parallel execution. Maybe it is, maybe it isn't. But that is not the most important characteristic of Prolog.

The most important characteristic, like my previous comment says, is that Prolog is an automated theorem prover for first-order logic theories. Anything else that it may do is by the by- and, conversely, any compromise it has to do on the way there, provided we're talking about pragmatic compromises necessary to get the language to work on limited resources (i.e. like the cut), is worth it.

Of the sources you quote I'm familiar with Clocksin & Mellish, Bratko and Sterling & Shapiro. Those are good books, although Bratko probably tries a bit too hard to make Prolog sound a bit like Pascal. I would also recommend Richard O'Keefe's The Craft of Prolog and George Luger's "AI Algorithms, Data Structures, and Idioms in Prolog, Lisp, and Java", online here:

http://wps.aw.com/wps/media/objects/5771/5909832/PDF/Luger_0...

>> Perhaps Logic Programming will make a comeback aided by the new advances in automated theorem proving.

Nah, I don't think so. I think it will remain a niche thing, of interest primarily to academics and programmers who like obscure languages. But that's OK. Not every language needs to be Java or Python.


APL has flourished, as a DSL embedded in a language with excellent I/O. It's just got weird syntax and is called Numpy. Or Tensorflow. Or PyTorch...


Morgan Stanley used APL successfully in their bond trading business for many years... it's by no means an insurmountable hurdle. Last I heard they were using Java tho', lol. From one end of the verbosity spectrum to the other!


MS used APlus (http://www.aplusdev.org/index.html), a derivative of APL. I was there during the transition and a common attitude in the Fixed Income group was "You'll pry APlus out of my cold dead keyboard".


It will not become popular for one simple reason: you have to be smart to understand it. Want to be popular? Write a python^Wlanguage for people who shouldn't even be programming in the first place.


I've felt that high-DPI touchscreens are the opportunity for an APL renaissance, if there's ever going to be one.

Once hardware keyboards standardized around ASCII in the '70s, it cramped the promise of APL's custom notation. On touchscreen it's trivial to render a software keyboard that contains the entire symbol set (perhaps even adapted to the user's experience level to make it less scary initially?). Increased resolution could be used to make the programs easier to parse visually by e.g. scaling symbols based on their context and nesting level, just like we do in math equations.

I'd like to work on an open source APL variant that runs on iPad and/or touchscreen Windows. It should be aimed at entry-level programmers because they don't have the imperative mindset strongly ingrained yet. It would be interesting to see if "touch APL as first language" could be a viable approach to teaching programming. Just need to find the time somehow... :/


What's the intersecting region in the Venn diagram between "people who will program in APL" and "people who will program on a touchscreen" though? They're small circles to begin with— most programmers seem to have an allergic reaction to coding on tablets and APL is kind of a hard sell even notwithstanding the input issue— and so I can't imagine the overlap is many people at all.


Like I said, it would be an open source project aimed at new learners who haven't yet built up those preconceptions. It's ok if it doesn't take off since it's not a business.

(If there's one thing I've learned in my career, it's that you can't change developers' minds about anything fundamental. We are extremely stuck in our ways, myself included, and all too ready to pretend that tiny incremental changes in syntax or tooling are revolutions.)


A language like APL would make it much more attractive to program on tablets or even phones, though.


> e.g. scaling symbols based on their context and nesting level, just like we do in math equations

That's a great idea. I can see it working in other languages as well, e.g., what Guy Steele and others were trying to do in Fortress.


This is such a transparently good idea that I don't know why someone hasn't already done it.

APL programming on a touch screen sounds like it would be better than regular programming on a touch screen because the terseness in that context is much more appealing -- it's a lot nicer to enter a few symbols from a software keyboard than a few words on a touch screen, which isn't at all the case for programming on a regular keyboard.

(searching for 'apl iphone' is going to suck, though)


If you want APL to be reborn make a version that supports GPU acceleration. That just might give it momentum enough to be used, a language that runs with the native speed of the GPU and that is easy to compose without having to fiddle with the details could be a winner.

There is this:

https://github.com/Co-dfns/Co-dfns

But it's not quite it.

As for the word 'deserve': computer languages don't really deserve anything, they get adopted, or not. APL is a beautiful language and it gives me immense pleasure when I 'get' a program, but the barrier between 'the problem' and 'the solution' hinges on libraries, ecosystems and communities. I suspect that APL's window of opportunity has passed, and even if Clojure became 'a modern lisp' and Elixir became 'a modern Erlang', I don't really see a big driver behind 'a modern APL'. What would it do that can't be done in other ways?


How is Co-dfns not what you want? It is by far the best attempt I know of for bringing the full power of APL to accelerators like GPUs[0].

...however, it is not easy. Full APL has a lot of features (like nested arrays) that are incredibly difficult to map to GPU execution. And in general, APL is not easier to map to GPUs than your bog standard Numpy-style matrix library (in fact it's harder, since Numpy only has regular arrays and simple operations).

[0]: I was involved in a project that targeted only a small well-behaved subset of APL, which was hard enough: https://github.com/henrikurms/tail2futhark


It would be a great DSL for tensor comprehensions. Also it's a better notation for much (most?) of the math of data science. I wish papers used it. Then the math would be unambiguous and testable and maybe even correct.
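
(The closest everyday analogue I know is Numpy's einsum, which already reads like a tiny index-notation DSL -- a sketch, not a claim about what an APL-flavoured one would look like:)

    import numpy as np

    A = np.random.rand(3, 4)
    B = np.random.rand(4, 5)

    # "ij,jk->ik" reads like a tensor comprehension: C[i,k] = sum_j A[i,j]*B[j,k]
    C = np.einsum('ij,jk->ik', A, B)

    # sum of outer products over a batch axis: G[i,j] = sum_b x[b,i]*y[b,j]
    x = np.random.rand(10, 3)
    y = np.random.rand(10, 4)
    G = np.einsum('bi,bj->ij', x, y)   # shape (3, 4)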


No experience, and not APL; but [0] pops up on my radar now and then.

[0] https://futhark-lang.org/


I have used and loved apl\11 and its successors for many years.

It could not be compiled on 64-bit machines, because of some ancient programming practices that were used throughout the code.

I took some time to update apl\11 so that it compiles and runs on modern machines.

Did a few other things as well, such as replacing linear algorithms in the symbol table code with O(log N) algorithms.

I would refer to this project as "heirloom software". It is touchingly old-fashioned in some ways, a bit dusty, and still fully serviceable.

It is a bit like owning and flying an Ercoupe. A bit quirky, an ongoing effort of love, but a joy to fly.

Edit your apl with vim, run apl files like scripts from the shell, pipe inputs and outputs, copy an APL binary or build it anywhere with a C compiler, run it immediately and conveniently.

My take on the 'funny characters problem' was to go with touch typing, and stick with ASCII. If you can write APL with your eyes closed, you can use APL-touchtype. "rho" is "R", "iota" is "I", etc. Trig is "O". Log is "O@*". (APL-touchtype uses "@" instead of backspace, and with that trivial substitution, all of the APL overstrikes become APL-touchtype ascii.)

It is ABSOLUTELY a work-in-progress, but anyone who wants to is welcome to give it a try:

    https://github.com/gregfjohnson/aplette


I keep hearing APL is great, but like everyone else I find the cryptic symbols very off-putting. I haven’t seen a really compelling explanation for why it’s great.

It looks like a bunch of matrix and vector functions with very terse names. Is it significantly different from just taking GLSL, say, and #define-ing one-character names for all the built-in functions? What is APL’s secret sauce?

This article suggests that a lot of the fun is just the cryptic symbols themselves. That’s absolutely fine, but it doesn’t seem like something that “deserves” a renaissance, compared to, say, Lisp, which really does have a uniquely productive idea at its core.


If you don't like the symbols then there are J and K/Q, which are ASCII-based but retain many of APL's other characteristics.

https://en.wikipedia.org/wiki/J_(programming_language)

https://en.wikipedia.org/wiki/K_(programming_language)


I just don't think the symbols are all that important. I'm curious about the key ideas behind the language, but I haven't seen a really good explanation of them.

Is it just a set of well-chosen matrix operators? I would have thought that could be done in a library, rather than a language.


There are two key ideas: one is notation as a tool of thought, and thus the symbols are incredibly important (or, if not using symbols, to use similarly lightweight notation as in j/k).

In a semantic sense, the key idea is that you compose operations on array indices. A good example of this is k's where operator, &, which converts an array of booleans into an array of indices where it is true. It's very common to take & of some predicate on your inputs, manipulate that for a while, and then index into something else at the end.

There are also ideas of rank/shape ambivalence, which naturally encompasses broadcasting (so scalar + vector broadcasts the scalar, vector + matrix broadcasts the vector row-wise, etc).

You certainly can implement something with APL's semantics as a library - Numpy does this. The magic of using APL comes from having all these features at once.
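
(If you know Numpy, the two ideas map roughly like this -- a loose sketch, not a one-to-one translation of the APL/k semantics:)

    import numpy as np

    a = np.array([3, 8, 1, 9, 5])

    # k's "where" (&): turn a boolean array into the indices where it's true,
    # manipulate those indices, then use them to index something else.
    idx = np.flatnonzero(a > 4)      # array([1, 3, 4])
    print(a[idx])                    # [8 9 5]

    # rank/shape ambivalence via broadcasting:
    v = np.array([1, 2, 3])
    m = np.ones((2, 3))
    print(10 + v)                    # scalar broadcast over the vector
    print(v + m)                     # vector broadcast row-wise over the matrix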


Great, thank you! Food for thought and some concrete stuff to research further.


If you lose the symbols you lose the terse nature of the language, which as far as I can tell seems to be one of the most alluring parts of the language (the author of the blog post shows game of life as essentially a one-liner).


Did you read any J programs? It is just as terse as APL. (Probably K is pretty terse too, but I won't comment on it as I only know J.)


In J, the parens, brackets and braces aren't balanced because those 6 symbols, i.e. `(`, `)`, `[`, `]`, `{` and `}`, are used as standalone symbols for other syntactic stuff, making J code very hard to read.


No, parens are parens, used for grouping. The rest is true, though: brackets `[`, `]` are essentially two flavors of identity function (dyadic) and `{`, `}` are indexing operators (not sure I remember right, but I think `(<1;2) { A` would be `A[1][2]` in more common languages).


Parentheses are paired and have their usual function of grouping. You are correct about brackets and braces though. Personally I do think J is often hard to read, but not because of the unbalanced brackets and braces.


J and K are extremely terse, though. They just chose other ASCII symbols that exist on everyone's keyboard.


J seems to have its fans ( http://www.jsoftware.com/ and https://en.wikipedia.org/wiki/J_%28programming_language%29 ) and uses ascii instead of all the different symbols.

Some of the code CAN be quite terse (and cryptic) like

quicksort=: (($:@(<#[), (=#[), $:@(>#[)) ({~ ?@#)) ^: (1<#)

I spent a small amount of time learning the basics a while ago, but never really used it for anything. But it was interesting and I'd have to spend a lot more time with it to get into the mindset. Popular with some Code Golfers as well.


> quicksort=: (($:@(<#[), (=#[), $:@(>#[)) ({~ ?@#)) ^: (1<#)

Please, no. Glyphs / characters are not a scarce or constrained resource in software development. The major resource constraint is developer time spent writing, reading and debugging code. APL (and its descendants) are specifically designed to make these challenges far more difficult than they need to be.


I think it's worth suspending judgement and opinions about what you think a programming language should be and evaluate it on its own terms. The preceding is a terse form of the code, you can write it more verbosely. But just like regex, terse forms can be very useful once you understand them.


I could also see either the existing chain of ascii glyphs becoming more legible with some wisely-chosen syntax highlighting, or an editing environment where the ascii glyphs are used to produce something distinctive, and that might change how developers relate to the symbols.


Funny that you should say that. APL was designed to be easy to read, like how mathematicians can look at an equation and immediately get a feel for it.

"Notation as a Tool of Thought": http://www.jsoftware.com/papers/tot.htm


Also, another argument for this notation, is that you condense patterns to be very close to one another. Notice in Java and C# when you start using language features that support something like 'fluent' notation where you just append operators and the data 'flows' through. Eventually you'll start seeing obvious improvements, for instance a 'map' followed by a 'concat' might be better off as a 'collect'. APL and J focus on the idea of making very high level things accessible with symbols, and since the underlying structure is always an array, it's easier to judge what things are doing and when the operations can combine. What's more, once you combine them, you give clues to the compiler about your intent, and often further optimizations can be done.

It reminds me a bit of SQL Databases, in that the more indexes and constraints you add, the more the query execution can draw from to change the route a calc takes in order to try and do it more efficiently.


The MCM/70 https://en.wikipedia.org/wiki/MCM/70 , another early microcomputer which failed to thrive, was also an APL machine. In fact, the computer pictured at the bottom of the article is an MCM/70. Interesting to consider what maybe could have been.

(One reason the MCM/70 wouldn't have had BASIC is that its development predates BASIC Computer Games https://en.wikipedia.org/wiki/BASIC_Computer_Games , apparently the original, 1973 publication of BASIC listings. Those were all games written on and for the Dartmouth BASIC system or other DEC minicomputers. I think (I am not an expert) it was probably this corpus, and its publication in a readily-available book, that originally drew microcomputer systems like the ALTAIR towards supporting BASIC.)


I will support this, but only if IBM brings back the beam-spring keyboard.


The beam-spring keyboards are still being made - someone bought the patents, tooling, and supplies from IBM and their main supplier 22 years ago, and has been making them ever since.

http://www.pckeyboard.com/page/category/UKBD


Nope. These are the buckling spring ones. I have one (http://www.pckeyboard.com/page/category/PC122) and it's awesome.

But the beam spring was awesomer ;-)


They even sell an APL keycap set: http://www.pckeyboard.com/page/product/USAPLSET


These guys are pretty amazing. I can't recommend their keyboards enough.


Yeah, he's speaking of the Model F variant of keyboards from IBM rather than the Model M (the better-known buckling-spring variant). There are a ton of Model F's online for sale because they're not natively compatible with regular computers and thus need an adapter (I believe), and they're just not as widely known.

Also the Model F is a monstrous beast of a keyboard, that would make even a regular model M seem compact.

I just want an industrial (gray) Model M Space Saver. I'd get one but I still need to think of a good argument to use with my wife over a $600 keyboard...


Beam-spring > buckling-spring Model F > buckling-spring Model M.

The guys at Unicomp were musing about the Space Saver layout. Not sure if they have the machinery, or the funds, to do the tooling to build it.


APL looks like the best programming language I've ever seen. I wish it could be extended with SQL and web-service mappings to input-output data conveniently and to use functions written in other languages (like C and Python) for acceleration and rapid logic prototyping.


Well, there's K, along with its flagship database KDB which lets you write SQL-like queries and, from what I've heard, integrates very nicely with the language. idk how good their FFI story is. Unfortunately, you need to pay for commercial use.


You can extend kdb with C functions, but they need to take/return their internal K structures, so you'll need to wrap functions. `2:` is the thing to look at.

Otherwise there are IPC clients, and a very nice native Python integration now.


Not exactly what you want, but you may want to check out F# type providers..


Dyalog has a web server and a fully documented SQL library.


I remember being floored by APL a few months ago when I learned about Co-dfns and watched the stream by Aaron Hsu [1], with quotes like "This code is perfectly readable, it just isn't in English".

Thing is, when I was thinking about trying it, I never figured out a good toy project. Like, when I wanted to try anything else, most of the time I would know I would be able to cobble together a web-service. Haskell, Erlang, even Prolog.

I understand Co-dfns to be self-hosting, so maybe a toy programming language?

[1] https://www.youtube.com/watch?v=gcUWTa16Jc0


Do some deep learning. It's very straightforward, and you only need some trivial IO.

Writing a programming language in APL is not for the faint of heart - Co-dfns is, alongside its performance goals, meant to show that compilers aren't necessarily a bad fit for APL.


Incidentally, I wouldn't be shocked if you reached out to Aaron Hsu and he actually helped you. I sent him a related email and found him to be extremely cordial and informative.


Could Numpy be considered the APL renaissance? Not in syntax, but conceptually?


No, not for terseness. Numpy is nice because it extends Python in a reasonably good and efficient way but standalone it doesn't do much of anything and Python is so much more verbose than APL. That's the beauty of APL in a way: there really is nothing left to remove.


It can be both! https://github.com/baruchel/apl

What I'm craving is APL for TensorFlow...


If we're going that way, I'd say Julia is closer.


In fact, it can get extremely close:

https://www.youtube.com/watch?v=XVv1GipR5yU

And since you can define your own operators, you could just go all of the way.


You can't define operator precedence though!! (That's a good thing). There was once a Julia talk about a really old programming language which disallowed any operator groupings without parens except for the basic seven or eight.

https://gist.github.com/ityonemo/6512f3b4fd02b9a9cc292cf0f88...

also string macros are cheating!


I'd love for it to go more mainstream with a high performance open source implementation. Dyalog is pretty good, and I like J, but feel the missing symbols leave something out. K/Q with kdb+ is too expensive, and the other open source array language implementations (Qnial, Klong, Kona, GNU APL) are just not there.


How is GNU APL not there? It is standards compliant, 99.9% pure APL2 (per ISO standards), and is actively maintained (bugs and issues are usually turned around in a day or two). It performs quite well, and can be easily extended. I have even added memmap vectors to it to achieve even more performance (and to begin GPGPU integration).

GNU APL also supports a library version (libapl.so) which allows APL2 to be incorporated into other applications via C ffi. This is my primary use case -- interactive development followed by delivery via libapl.so.


I think it requires a more expert user. I couldn't get the manual install process to work on Ubuntu and I believe it requires MinGW or Cygwin to work on Windows (not for a beginner either). Don't get me wrong, I'm really really glad it is there.


Kx could be much bigger if they open sourced the q interpreter, tapped into a larger developer pool who could write libraries and add ons. There's more money in selling consulting services than licenses...


As much as I would love that, I do not think getting bigger is an issue for Kx. It seems to me they are doing quite well with their current model.

I have read before (sorry, I do not have the link) that Arthur Whitney does not really believe in open source. That is unfortunate (and a mistake, if you ask me). Even if he did not release a complete product, I would like to study his code. Reading and trying to understand kparc.com/b/ is a very enlightening experience. I wish k.c was there too.


From what I understand, they've made millions off of the big banking clients and that is probably quite straightforward as opposed to making money off open source.


Here's a partial implementation of APL running on Common Lisp: https://github.com/phantomics/april


No, it does not. I used APL professionally for about ten years back in the 80's. I love the language. It is incredibly powerful. Once you internalize it, it's like playing the piano: you don't think about the mechanics, you play music.

However, the language did not stand the test of time for far more important issues than the inconvenience of the character set and the keyboard.

And, no, J is not a successor to APL, even though Iverson created it. J is an abomination. He made a mistake. He thought that abandoning notation --which is incredibly powerful-- would solve the APL popularity problem. What he ended-up creating was a royal mess of the first degree. It's garbage.

APL could be very useful today but someone with the time and context needs to organize an effort to evolve it into a modern language that retains the power of what got branded as a "tool for thought" while adding layers of functionality that are sorely missing. I wish I had the time to embark on this journey. I would love to do something like that, but I can't.

Again, the character set and keyboard are not the problem. I used to touch type APL. Didn't take that long to get there. People learn to drive vi/vim. It's a matter of having to have a reason to make the effort.

And the ecosystem. That's another huge issue.

This has two aspects:

Finding qualified programmers and having access to libraries so you don't reinvent the wheel.

Back in the day I used to do a lot of work with Forth as well. Great language for the right applications, but finding qualified Forth programmers was difficult when the language was popular and it became nearly impossible with the passage of time.

APL suffers from the same problem, a seriously limited talent pool.

I probably don't need to explain the value and power of having libraries to support a wide range of applications. Python is a good example of this today. You can find a library to do just about anything you might care to approach with Python, from desktop through embedded and web. In many ways the breadth and depth of available libraries can be far more important than language capabilities and sophistication. After all, if you had to write OpenCV from scratch there's no amount of APL magic that is going to make you more efficient and effective than a 15 year old kid with Python and OpenCV.

I see APL mentioned on HN with some frequency. I feel that some here are in love with the idea of APL rather than understanding the reality of APL. Again, I love the language, but there's a reason I stopped using it about 25 years ago.

What's interesting is that C, which I started using way before APL, is still around and is a very solid language (with lots of libraries) for the right applications.


> Finding qualified programmers and having access to libraries so you don't reinvent the wheel.

Lots of niche languages have the same problem, notably lisp, but it doesn't do to say they aren't popular for those reasons. It's circular reasoning. Languages get those things by being popular. They get popular by having those things.

Every current "popular" language with good libraries and a large userbase started with no popularity, no libraries, and no users. They built these things over time.

The problem is these languages can't create a robust community. They are powerful, so people don't need large teams to do what they want. They are different, so it is a bigger investment to understand them. The combination means they attract the kind of elitists who are not willing to help newcomers or write basic libraries, the kind of people who are perfectly capable of reinventing every wheel and doing it better than last time.

No one teaches these languages. How popular could they get if companies and universities spent millions of hours collectively drilling even the most marginal programmer on how to use them like they do for C++ and Java?

They would never do it though. Large companies don't want more powerful languages. They will take the productivity loss for fungible employees. It's part ego. Middle managers look much more important if they have 20 programmers write 1,000,000 lines of code over 5 years than two programmers write 10,000 over six months even if functionality is equivalent. It's part bargaining and risk. If you only have a few programmers, the individual programmer is worth a lot more. It is also riskier to employ one because she could leave or get hit by a bus at any time.


> It's circular reasoning. Languages get those things by being popular.

Maybe it is but it's reality. Also, there's the other kind of reality: Languages don't matter. Solving problems is what matters.

I've programmed in everything from Machine Language (note I did not say "Assembler") to APL, passing through languages like Forth, C, C++, FORTRAN, Objective-C, Lisp, PHP, JS, Python, etc. At the end of the day the ONLY thing that matters --if it isn't a hobby-- is solving problems computationally. I have no cult adherence to any language whatsoever. They are tools, that's all.

My best example of this was making tons of money solving a problem using Visual Basic for Applications, which allowed me to use Excel to automate a time consuming task in a CAD program. It just so happened that this CAD program could be automated using VB. Put the two together and several months of work and we had a tool worth quite a bit of money.

APL still has lots of value...in the right circles. I believe it still sees professional usage in the finance industry.


> No one teaches these languages. How popular could they get if companies and universities spent millions of hours collectively drilling even the most marginal programmer on how to use them like they do for C++ and Java?

Schools and universities have been teaching Pascal/Lisp/Caml/Scheme for decades, yet (almost) nobody used those languages to produce actual software, neither as a job nor for free software side projects; and a majority of jobs implied the use of a member of the large C family (or VB at the time).


Weren't some APL-related languages (to keep it on topic), like A+ and K, created at or under contract with a large company?


I think most of the users here realize the reality (lack of users and libraries), but also recognize the power you described. I'd love for a simple open-source & multi-platform interpreter with a built in keyboard and package manager. Simplicity is key here. J basically has this, but I agree the syntax is difficult for me.



I find both APL and Forth fascinating. I would not try to promote them as a replacement for newer, more approachable languages, but I think that learning them gives you different points of view, and they are worth learning for every programmer (same with FP, for example).


I still think Forth beats C as a language for microcontroller programming. I haven't tried it myself yet, but I think a built-in REPL, nearly 1-1 correspondence with ASM, and programmable compilation (among other things) can sum up to a really nice programming environment.


Yes, APL and Forth are excellent in this regard. They give you a very different set of mental tools with which to solve problems computationally.


Could you go into more detail about your opinion of J? I have not seen any negative opinion about it before and I am very curious as to what issues you believe it to suffer from.


One of the most powerful aspects of APL is its notation. Ken Iverson himself wrote a paper titled "Notation as a Tool of Thought". Here it is:

http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pd...

I remember watching Iverson deliver a presentation in person about this very topic.

Anyone familiar with fields such as mathematics or music understands the power of notation. The integral symbol conveys information and allows you to think about the problem rather than the mechanics.

APL in early times suffered from a unique problem: You had to physically modify your computer and printer to be able to do APL. You had to remove and replace the character generator ROM from your graphics card (who remembers those cards?). You had to get a new keyboard or put stickers all over your standard keyboard. And you had to change the print wheel or print ball (IBM printers) to be able to see, type and print APL characters.

It was a pain in the ass. Only the most interested cult members endured that level of pain for an extended period of time.

Years later Iverson decided to transliterate APL symbols into combinations of standard ASCII characters. This was a knee-jerk reaction to the above stated problem. What he did not have was the vision to recognize that technology would take care of this on its own. Not long after the introduction of J everyone could display and print graphics of any kind. The APL character set, the symbols, ceased to be a problem in that regard.

Iverson took the wrong road with J out of --conjecture on my part-- commercial interest rather than language interest. He violated something he personally talked about: The value of notation as a tool for thought.

J doesn't need to exist. If we are to evolve APL and move into a world where symbolic programming is a reality (something I think would be very powerful) we need to move away from typing ASCII characters into a keyboard and move into a paradigm where advanced software engineering has its own notation that can be used to describe problems and create solutions with the kind of expressive power we have not seen in mainstream computing in years.


Learned APL in a college linear algebra class long ago and loved it (still have the books). Best story: circuit synthesis class assignment to perform matrix exponentiation -- two lines of APL while the rest of the class was struggling with FORTRAN and the like.


Thought this very thing a few minutes ago, only to sit down and see this.


Really accessible tutorial format here. More of this could get more people into APL.

(Although truthfully I don't think that char set will ever fly in a broad commercial sense)


Oh god no. This would kill programming productivity. Why do people like these complicated programming languages? Seriously...


Your comment reminded me very much of a paragraph in "Beating the Averages"[1]:

"As long as our hypothetical Blub programmer is looking down the power continuum, he knows he's looking down. Languages less powerful than Blub are obviously less powerful, because they're missing some feature he's used to. But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub."

[1] http://www.paulgraham.com/avg.html


I would love APL with types.


Have you looked at Perl 6? It's optionally typed, and can be nearly as terse as APL with some of its higher-order operators.


You'd have to essentially embed a slow, unoptimized APL in it, which you can do. Add to that the fact that P6 is already slow at the moment, and performance would be an issue. Also, unlike APL, your P6 APL operators would probably be only known to you.

It would definitely work though.


Unicode has APL characters, so create the operators using them.

If there is a conflict with an existing feature you could always create a Slang. A Slang is a module that changes the parser. (It could be argued that with this feature all programming languages are a subset of Perl 6)


I've heard that quote, but realistically, that would be a huge project to write an efficient APL in P6 and as I already said, the performance wouldn't make you happy.


Take a look at this paper that explores the idea:

https://www.cs.ox.ac.uk/people/jeremy.gibbons/publications/a...



noob question: how do I paste those symbols in say Notepad++?


Just copy and paste. They're normal Unicode characters.

If you mean "type" rather than "paste", well that's more complicated.


I'm more worried about how to type them with a standard QWERTY keyboard.


Well, I can't speak for Windows, but on Linux APL is one of the standard keyboard layouts in basically any install. In KDE you can just go in and enable it; in GNOME/Xfce/Cinnamon you have to tell the system to actually show you all the layouts. Running

  gsettings set org.gnome.libgnomekbd.desktop load-extra-items true
in a terminal will make it show up, then you just set it to toggle with some button.

https://www.dyalog.com/apl-font-keyboard.htm


When I used the Dyalog APL free trial on Windows it installed a new keyboard mapping as part of the process. I think the keymapping is also on their site. It uninstalled as well I think when I was finished and uninstalled.


I want to draw APL. Like with a stylus.

That's pretty much how I'd want to use it, and the only way I'd bother. It's cool tech but its general-purpose appeal is nonexistent, and there are already Julia and Matlab in the technical computing space.

If I could draw APL symbols and have some common libraries for JSON and Http and stuff like that, I could see a renaissance having enough steam to do something.

Otherwise, it's only the top gun high-level nerds that already use it anyway that would want to work in non-ASCII. Modern developers seem to have an ever-decreasing attention span these days.

How many developers under 25 use vim? It's always less than it was last year.

:(


Remember the Palm Pilot? Drawing the ASCII-ish symbols in the little box? While I did memorize the strokes, I never found the process efficient. I don't think that drawing APL runes would be any better -- certainly not better than the key-combos of Dyalog's bespoke APL editor (free for personal use, BTW).


Wow, actually I do remember that, and it sucked!

So, I've never written apl, and I'm glad I read this comment because that's an excellent point you raise.

To expand a bit, I was thinking about being able to notate computation graphically. That's really what I want. Visual programming has existed and has sucked forever, but those thoughts don't go away that there might be a way to do it, if I just had the missing piece.

I need to sit down with dyalog and find out what I'm missing, it sounds like.

Thanks!


To be fair, owning an iPad Pro, I can say first hand that the handwriting recognition possible with the stylus is actually quite remarkable and a great improvement over the old Palm Pilot days.


With that new interface, could you see yourself jotting down a "sentence" of APL that maybe you think of in a meeting or something?

I mainly am interested in apl as a notation, there's a certain lispy zen to a language that unifies notation and code into the same thing.

Maybe there are better handwriting notations to explore for what I'm thinking.


https://www.sacrideo.us/paper-is-dead-long-live-paper-progra...

I think one thing to consider would be actually using paper. The terseness means you don't really pay a 'retyping' penalty.


> With that new interface, could you see yourself jotting down a "sentence" of APL that maybe you think of in a meeting or something?

If you've got an Android device you could give MyScript Calculator a go, if writing APL was as easy as writing equations on that app then it would be a lovely experience.



