
Handwriting Programs in J - qznc
https://www.hillelwayne.com/post/handwriting-j/
======
svat
The article went in a different direction from what I was expecting, but,
leaving aside J, the original motivation — writing programs by hand (I mean
with pen/pencil on paper) — is a topic that interests me.

The only person I know to regularly write even large programs by hand is
Donald Knuth. This includes the early heroics of his college days
([1][2][3][4]). It includes TeX and METAFONT which he wrote in the 1980s over
several months, before typing them in. (Corroborated right here on HN [5] by
David Fuchs [6] who worked very closely with Knuth on TeX.) I'm pretty sure
from looking at some of his recent programs [7] that he still writes by hand.

Of course as a programmer Knuth is _sui generis_ and has developed his own
idiosyncratic style of programming over the decades, different from the rest
of the world (consider his idea of Literate Programming). He thinks
differently, tends towards monolithic rather than modular, and is always in
book-author mode even when writing programs. He has some abilities and
disciplines that more than make up for what some others would consider
defects. What works for him may not work for others. Still, it would be
interesting to hear of more people's experiences writing by hand.

A couple of months ago I wrote a small program in an assembly-like language
for fun, and I felt that writing out the first couple of drafts by hand was
quite helpful, and prevented me from jumping to an implementation too quickly.

[1]: [http://ed-thelen.org/comp-hist/B5000-AlgolRWaychoff.html#7](http://ed-thelen.org/comp-hist/B5000-AlgolRWaychoff.html#7)

[2]:
[http://archive.computerhistory.org/resources/text/Knuth_Don_...](http://archive.computerhistory.org/resources/text/Knuth_Don_X4100/PDF_index/k-8-pdf/k-8-u2779-B5000-People.pdf#page=8)
(Note "half-true" on the cover page is penciled by Knuth!)

[3]: [http://www.metafilter.com/122701/STORIES-ABOUT-THE-B5000-AND...](http://www.metafilter.com/122701/STORIES-ABOUT-THE-B5000-AND-PEOPLE-WHO-WERE-THERE-By-Richard-Waychoff)

[4]:
[http://archive.computerhistory.org/resources/text/Knuth_Don_...](http://archive.computerhistory.org/resources/text/Knuth_Don_X4100/PDF_index/k-2-pdf/k-2-c1039-ALGOL-B205.pdf)

[5]:
[https://news.ycombinator.com/item?id=10172924](https://news.ycombinator.com/item?id=10172924)

[6]:
[https://www.tug.org/interviews/fuchs.html](https://www.tug.org/interviews/fuchs.html)

[7]:
[http://www.cs.stanford.edu/~knuth/programs.html](http://www.cs.stanford.edu/~knuth/programs.html)
(I've generated PDF versions: [https://github.com/shreevatsa/knuth-literate-programs/tree/1...](https://github.com/shreevatsa/knuth-literate-programs/tree/1b86fdc/programs))

~~~
coliveira
Knuth is from a time when professors had secretaries who would type their
handwritten manuscripts (maybe he still has one). That's why he thinks it is
OK to handwrite all his papers and even his code. Normal people (us) may well
find handwriting easier than typing, but very few of us are willing to spend
the time it takes to both handwrite and later type the same document. So, in
my opinion, the choice to type directly is pragmatic, given the way we have to
prepare software.

~~~
melling
Ideally, you should be able to scan in what you wrote using your phone, then
have it OCR’ed. With the recent advances in deep neural nets, this should be
close to possible.

~~~
coliveira
Maybe this will be easier in the future, but nowadays there is still the issue
of scanning, running OCR, and fixing the many errors that the OCR process
generates.

------
Jtsummers
I spent some time about 10 years back learning J; next to Forth and its kin,
it was a great way to learn about tacit programming. However, I found the
programs being discussed on the J discussion lists completely impenetrable. I
liked it for what I could express, but I hated that I couldn't understand what
others had done (which made sharing work very challenging). Still, due to its
brevity, it is rather useful for hand-notated code.

I actually thought this was going in a different direction:

[http://www.jsoftware.com/papers/tot.htm](http://www.jsoftware.com/papers/tot.htm)

This is Iverson discussing APL (the predecessor of J) and its utility and
advantages as a notational tool. One thing I find interesting to see here,
having just reread (most of) it, is its mention of ad hoc notations introduced
as needed by the user — what today we would call domain-specific languages.

~~~
kd0amg
_However, I found the programs being discussed in the J discussion lists
completely impenetrable._

I would argue that (a big) part of the issue is that proponents often
overstate the readability of such code. My usual experience from the J mailing
list is that people "read" code by looking at the prose description of what it
does, puzzling through how they would do that themselves, and then looking for
common structure between the original code and their own reimplementation.
It's a good learning experience as far as solving that particular problem is
concerned (as there's often more than one way to do it), though it definitely
runs counter to the meme that comments and meaningful names are superfluous.
The other usual excuse that "the alphabet just isn't the one you're familiar
with" isn't really true of mailing list regulars.

~~~
hwayne
That's one of the major reasons I think handwriting J is more interesting and
understandable than typing it. You can more clearly mark which pieces are
conjunctions and what they operate on, you can present the explicit expansion
tree as a reference, etc.

------
quickben
The comments don't look well planned. First they double down on brevity, then
they go with 'NB.' as the comment marker instead of a quote character or
something.

~~~
throwaway7645
It makes perfect sense. A symbol is way too valuable.

------
rprospero
I've been a bit surprised that array based languages like APL and J haven't
taken off with GPU programming. It feels like it would be a natural fit for
the domain, but I'm probably missing something.

~~~
Buttons840
The Futhark language is the closest thing to "APL for the GPU" I'm aware of.

[https://futhark-lang.org/](https://futhark-lang.org/)

------
throwaway7645
I posted this on here yesterday...wonder why it didn't go through. Neat
article and beautiful language. I just wish it were easier to train my brain
to use it, and easier to get data in.

------
postfacto
“Everyone knows that debugging is twice as hard as writing a program in the
first place. So if you're as clever as you can be when you write it, how will
you ever debug it?”—Brian Kernighan.

~~~
Avshalom
The APL lineage is generally extremely clever in the micro but only normally
clever in the macro.

And at the individual-line level, it's as easy as in any dynamic language to
work your way through a head-scratcher by testing progressively larger chunks,
or by stepping through with a debugger, until you find where it starts
breaking from your expectations.
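That chunk-at-a-time approach isn't specific to the APL family; in any REPL
language it looks roughly like this (Python used here purely for illustration,
with a made-up pipeline):

```python
# A made-up one-liner whose result surprised us:
data = [3, 1, 4, 1, 5, 9, 2, 6]
result = sum(sorted(set(data))[:3])

# Work inward by evaluating progressively larger chunks at the REPL
# until the point where the value diverges from your expectations:
step1 = set(data)       # drop duplicates -> {1, 2, 3, 4, 5, 6, 9}
step2 = sorted(step1)   # [1, 2, 3, 4, 5, 6, 9]
step3 = step2[:3]       # [1, 2, 3]
step4 = sum(step3)      # 6

print(result, step4)
```

In J the chunks happen to be shorter, but the procedure is the same: evaluate
the rightmost piece, then widen leftward one verb at a time.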

------
NHQ
Next, train a neural network to read your handwriting and spit out the code.

~~~
dang
Please don't post drive-by dismissals to HN. We're hoping for better than that
here.

Unorthodox creative approaches are particularly vulnerable to this kind of
thing and deserve much better.

------
feelin_googley
There is a third freely available ASCII character APL in addition to K and J.

I have not seen it mentioned on HN before.

Have I missed the discussion?

It is called ELI.

I find it very sad that bloggers and commenters who are presumably employed as
software engineers are continuing to discount the value of these languages
simply because someone finds them incomprehensible. Failure of someone to
comprehend is not the fault of the language. Persons having no prior knowledge
of programming do not make complaints about APL versus more verbose languages.
It is all the same to them, all incomprehensible in the beginning. To learn,
work is required. And generally, more volume means more to learn. More work.

~~~
hwayne
> The proper way to critique these languages IMO is to do benchmarks. For
> example, the author of the blog post could state specifically the problem he
> is trying to solve and then perform the solution in two or more languages,
> including an APL, reporting both the code size and the timings.

Here's the hobby thing I'm doing with this. I have an initial investment in
Vanguard and contribute X additional per year. I'm at a 90/10 stocks/bonds
ratio and plan, between now and retirement, to gradually shift to a more
conservative ratio. I have a rough distribution of the rate of return (RoR)
for both stocks and bonds. With all of these constraints in mind, I want to
run a few thousand simulations of expected savings at retirement. I want to
easily compare these simulations against more pessimistic RoRs.

(No real reason, just seems fun.)

I'm pretty far along with the J version. I've also written a bit of this in
Python as a comparison, and from what I can tell so far it's an
order-of-magnitude difference. I don't know numpy, though, which probably
makes a big difference.

In the end, though, the benchmarks don't matter as much to me as the sheer
delight in snapping my brain in two.
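For reference, a rough numpy sketch of the simulation described above might
look like the following. Every number in it — the return distributions, the
contribution, the horizon, the 50/50 endpoint of the glide path — is a made-up
placeholder, not my actual figures:

```python
import numpy as np

rng = np.random.default_rng(0)

years = 30              # horizon until retirement (placeholder)
n_sims = 5_000          # "a few thousand simulations"
initial = 10_000.0      # initial investment (placeholder)
contribution = 5_000.0  # the "X additional per year" (placeholder)

# Rough return distributions for stocks and bonds (made-up numbers)
stock_mu, stock_sigma = 0.07, 0.17
bond_mu, bond_sigma = 0.03, 0.05

# Glide path: drift linearly from 90/10 stocks/bonds toward 50/50
stock_w = np.linspace(0.90, 0.50, years)

balances = np.full(n_sims, initial)
for y in range(years):
    # One draw per simulated path; the whole batch is vectorized
    stock_r = rng.normal(stock_mu, stock_sigma, n_sims)
    bond_r = rng.normal(bond_mu, bond_sigma, n_sims)
    r = stock_w[y] * stock_r + (1.0 - stock_w[y]) * bond_r
    balances = (balances + contribution) * (1.0 + r)

# Distribution of savings at retirement across all simulated paths
print(np.percentile(balances, [10, 50, 90]).round())
```

Rerunning with lower `stock_mu`/`bond_mu` gives the pessimistic-RoR
comparison.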

~~~
noobermin
I don't program in J, but I breathe numpy. I can tell you that numpy broadcast
operations are probably an order of magnitude faster than _explicit for loops
in Python_...simply because numpy is, of course, written in C.
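A minimal illustration of the gap — the exact timings depend entirely on your
machine, so no specific speedup is claimed here:

```python
import time
import numpy as np

n = 1_000_000
xs = np.arange(n, dtype=np.float64)

# Explicit Python-level loop over every element
t0 = time.perf_counter()
loop_result = [x * 2.0 + 1.0 for x in xs]
t_loop = time.perf_counter() - t0

# Broadcast version: one expression, the per-element loop runs in C
t0 = time.perf_counter()
vec_result = xs * 2.0 + 1.0
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.4f}s, broadcast: {t_vec:.4f}s")
```

Both produce the same million numbers; only where the looping happens differs.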

------
vforgione
> 6 - 1 = 5 > 6

I stopped right after the author explained that line. What benefits does this
offer over left-to-right (LtR) programming languages? I would consider
debugging this a form of punishment.

~~~
hwayne
Most languages aren't LtR, either! For example, what does this evaluate to in
Python?

    2 + 2 * 4 / 1 + 1

In J, that's very obviously 6, as it's 2 + (2 * (4 / (1 + 1))). In Python it's
11, since there's operator precedence to think about: 2 + (2 * 4 / 1) + 1.
Strict single-direction evaluation is not always preferable, but when you're
pipelining a lot of tricky array operations it makes the code much easier to
understand.
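To make the comparison concrete, here is that expression in Python, evaluated
once with Python's own precedence rules and once parenthesized the way J would
group it:

```python
# Python's normal precedence: * and / bind tighter than +,
# then evaluation proceeds left to right
standard = 2 + 2 * 4 / 1 + 1            # i.e. 2 + (2 * 4 / 1) + 1

# The same tokens grouped the way J reads them: strictly right to left
j_style = 2 + (2 * (4 / (1 + 1)))

print(standard, j_style)  # 11.0 6.0
```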

~~~
vforgione
For simple arithmetic I find that extremely confusing. I also would have
guessed that the equals sign was an assertion about the addition.

