
"Fast" is not relevant - the runtime would be just as fast if the code was written in plain words. Code size is also irrelevant unless you're called Elon Musk. The only claim left is that it's a productivity boost, but I just can't see how a language that needs a special keyboard to be written can be written more productively than actual words, especially with modern IDEs.



Speaking for myself, I find that whenever I think "I just can't see how..." it makes me want to figure out what it is that others can see that I can't. This is the reason I learnt APL, actually, after seeing half a page of k code and feeling annoyed that I couldn't read it. k subsequently led me to APL.

You can try it for yourself -- NumPy is a fair "Iverson ghost", that is, APL without the symbols: it has a similar enough array model, and most of the APL primitives exist as functions or methods. APL lets you express linear algebra in a very natural way; doing the same in NumPy is much more convoluted.
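
For a rough taste of the difference, assuming NumPy is installed (the APL shown in the comments is just the standard spelling of each operation, for comparison):

    import numpy as np

    prices = np.array([3.0, 1.5, 4.0, 2.5])
    qty = np.array([2, 10, 1, 4])

    # APL: prices +.× qty   (inner product: sum of products)
    total = (prices * qty).sum()

    # APL: prices ∘.× qty   (outer product / multiplication table)
    table = np.multiply.outer(prices, qty)

    # APL: (+/÷≢) prices    (the classic "average" train)
    avg = prices.sum() / prices.size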

Or try Rob Pike's (of Go fame) "ivy" -- his attempt at an APL-with-words (https://github.com/robpike/ivy).


Sorry, I'm not embarking on a quest to learn APL because somehow the idea can't be explained.


The claim isn't that writing is faster; the claim is that there's less pointless boilerplate to write, read, or think about. You didn't write your comment as:

"quote fast unquote is not relevant hyphen the runtime would be just as fast if the code was written in plain words period code size is also irrelevant unless you apostrophe are called initial capital elon musk period the only claim left is that it apostrophe s a productivity boost comma but i just can apostrophe t see how a language that needs a special keyboard to be written can be written more productively than actual words comma especially with modern initialism ide s period"

because symbols are useful. And people don't have a problem with peppering code with chorded symbols like () ++ {} : <> ? without demanding that they be turned into words because words are easier. In fact, people designing new languages (Rust, newer versions of C#) struggle to find unused symbol combinations, reaching for things like Python's triple-quoted strings """ """, ::, ""u8, and so on.

There's nothing inherently better about shift-2 shift-2 shift-2 shift-2 shift-2 shift-2 than about AltGr+i.

Why aren't the common operations for transforming collections short, sharp, and out of your way? Why are they a ceremonious "map a lambda which takes a parameter with a name and ..." when you don't need any of that?
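
A minimal sketch of that ceremony in Python terms, with the usual APL equivalent noted in a comment for contrast (not taken from anything above):

    xs = [3, 1, 4, 1, 5]

    # "map a lambda which takes a parameter with a name and ..."
    doubled = list(map(lambda x: x * 2, xs))

    # the comprehension flavour of the same ceremony
    doubled = [x * 2 for x in xs]

    # APL: 2×xs   -- the operation itself, nothing else to read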


I type each APL symbol with a single chord, e.g. ∊ is AltGr+e and ⍷ is AltGr+Shift+E. Each keystroke is basically instant.


You don't worry about repetitive strain injury from all that chording? I try to avoid that sort of thing as much as possible; even though I use emacs, I do it with evil bindings. CapsLock rebound as Control is useful, but the bottom-row modifiers seem problematic. I can't easily reach those modifier keys while keeping my fingers on the home row; I have to either contort my thumb or pinky in a bad way, or move my entire hand (which is usually how I use those keys). Either way, keeping my hands straight and my fingers on the home row seems much safer and more comfortable. I just can't imagine using a language where I have to AltGr for every character I type.


Hasn't really been an issue. Occasionally, I map CapsLock to AltGr so I have a left-side APL shifting key too — it doesn't see much use, though. Also remember that it isn't "every character I type". Actual APL primitives only comprise a relatively small fraction of the total code. I just did a rough computation on four things I've been working on recently and they had 4%, 5%, 6%, and 7% non-ASCII APL glyphs, respectively.
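
Not the computation used above, but a rough sketch of how such a count can be done, assuming UTF-8 source files ("prog.apl" is a placeholder name):

    # Rough count: fraction of non-ASCII characters (roughly, the APL glyphs)
    # in one source file. "prog.apl" is a placeholder, not a file from this thread.
    with open("prog.apl", encoding="utf-8") as f:
        src = f.read()

    chars = [c for c in src if not c.isspace()]
    non_ascii = sum(1 for c in chars if ord(c) > 127)
    print(f"{non_ascii / len(chars):.1%} non-ASCII glyphs")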



