Hacker News

I agree #5 may not apply to some people. If a small community of people is able to use APL to do things that can't be done by other means, that would be an effective refutation of this argument, or at least an important caveat to the scope of its claims. Maybe APL is truly an effective tool of thought for at least this small group of people.

I would be interested in your response to this post [1], which is on that point.

I noted the Ackermann function and inverted-index results in Hui's response to Dijkstra. I can't read the APL, so I can't judge how novel or significant these results are, but I would predict that such properties of the Ackermann function and of inverted tables/indices were known long before these derivations. They may just be examples of pre-existing knowledge being encoded into APL logography.
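For readers who haven't met it, the function under discussion has a short textbook definition; this Python rendering is mine, not taken from Hui's paper, and the closed forms noted in the comments are standard results:

```python
def ackermann(m, n):
    """Péter–Ackermann function, straight from the usual recursive definition."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

# Well-known closed forms for small m, e.g. A(2, n) = 2n + 3 and
# A(3, n) = 2**(n + 3) - 3, are the sort of "property" at issue here:
# they predate any particular notation used to derive them.
```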

[1] https://news.ycombinator.com/item?id=28967730




I don't claim that APL is necessarily useful to everyone. I think the fraction of people who could use array-based notation effectively is not small, but there's no hard evidence for this.

APL has not to my knowledge been used to prove novel mathematical results. Has any programming language? It's not very interesting to note that a language that's unknown to working mathematicians doesn't find much use in cutting-edge mathematics. However, APL is closer to the notation that is used for these discoveries than nearly any other programming language (Mathematica is the exception that comes to mind).

There are some historical APL successes to point to. It was used to design IBM's highly regarded System/360[0]; as the first APL implementation ran on the 360, this design work was of course done entirely on paper. I believe some early time-sharing systems were written in APL as well, though I don't have a source. STSC's Mailbox[1] was one of the earliest email systems[2]. More recently, Aaron Hsu has used APL to create what seems to be the first data-parallel compiler[3]. While "data-parallel" is rigorously defined (polylogarithmic time given unboundedly many processors), the vagueness of "compiler" means this isn't a result of interest to mathematicians.
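To make the parenthetical definition concrete, here is a minimal sketch of my own (not Hsu's code, and sequential Python rather than APL): a pairwise reduction whose dependency depth is O(log n), so each pass could in principle run simultaneously across processors.

```python
def tree_sum(xs):
    """Sum a sequence via pairwise reduction: O(log n) passes over the data.

    All additions within one pass are independent of each other, which is
    what would let parallel hardware execute a whole pass at once.
    """
    xs = list(xs)
    while len(xs) > 1:
        # Combine adjacent pairs; carry a lone trailing element forward.
        pairs = [xs[i] + xs[i + 1] for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2:
            pairs.append(xs[-1])
        xs = pairs
    return xs[0] if xs else 0
```

The same shape (a balanced combining tree) is the standard way any associative reduction meets the polylogarithmic-depth bar; a compiler built entirely from such primitives is what makes the result unusual.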

I myself have used APL to make what I consider a significant discovery about sequences of natural numbers[4]. It unifies methods that I'd come up with ad hoc in the past, and I've since used this framework dozens of times to solve a variety of problems more quickly. I don't expect you to understand the APL-heavy description, but isn't that what you asked for? I and many others think in APL. It just seems natural to us. Here's a quote along those lines[5]:

> And I looked at the code that I’d written, and it’s about 150 lines or so, and it was complicated, complicated stuff. I went to sleep, and I had a dream, and in the dream, it told me how to approach this in a whole different way that I had done before. So I woke up, I sat down at the computer at six o’clock in the morning, and by noon I’d rewritten the whole thing from scratch, pretty much. I kept 7 lines of code that were tangential to that, and it was all correct. It passed the QAs like that. So sure, you can write for loops in your sleep, but I can write entire correct programs in my sleep!

[0] https://dl.acm.org/doi/10.1147/sj.32.0198

[1] https://forums.dyalog.com/viewtopic.php?f=30&t=1629&p=6415

[2] https://en.wikipedia.org/wiki/History_of_email

[3] https://scholarworks.iu.edu/dspace/bitstream/handle/2022/247...

[4] https://aplwiki.com/wiki/Partition_representations#Unificati...

[5] https://www.arraycast.com/episode-0-transcript



