
Masterminds of Programming: Chuck Moore (2009) - chrispsn
https://www.oreilly.com/library/view/masterminds-of-programming/9780596801670/ch04.html
======
quantified
I would like to find the motivated time to analyze how Forth and lambda-
calculus functional languages compare/connect. Each Forth word seems to be
effectively a lambda that must be connected to a correctly-typed stack, and
words can themselves manipulate the stack as its own structure.
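Concretely, that "word = function from stack to stack" view can be sketched in a few lines of Python (purely illustrative, in the spirit of Joy; none of these names come from a real Forth system):

```python
# Sketch: each Forth word modeled as a function from stack to stack,
# so a program is just function composition (concatenation = composition,
# as in Joy and Cat). Illustrative only, not a real Forth.

def dup(s):  return s + [s[-1]]               # ( a -- a a )
def swap(s): return s[:-2] + [s[-1], s[-2]]   # ( a b -- b a )
def add(s):  return s[:-2] + [s[-2] + s[-1]]  # ( a b -- a+b )
def lit(n):  return lambda s: s + [n]         # push a literal

def run(program, stack=None):
    """Running a program is composing its words left to right."""
    stack = stack or []
    for word in program:
        stack = word(stack)
    return stack

# "3 4 + dup +"  ->  (3+4)*2
print(run([lit(3), lit(4), add, dup, add]))  # [14]
```

The lambda-calculus connection shows up in `run`: a concatenative program has no variable binding at all, only composition.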

~~~
kragen
Manfred van Thun’s Joy, Christopher Diggins’s Cat, Henry Baker’s Linear Lisp.

~~~
quantified
Thanks for the pointers!

------
kjs3
I keep reading papers on stack-based vs register-based architectures, and it's
clear there's no "conclusive" right answer. But I think it's pretty telling
that very little looks like a B5000 and an awful lot looks like a
PDP-11/S360. Maybe the Forth faithful are right and the rest of us are too
dumb to 'get' stack-based machines, but it's pretty clear what the market decided.

~~~
RodgerTheGreat
The B5000 does not closely resemble a Forth machine. It is stack-oriented, but
does not contain separate parameter and return stacks which may grow and
shrink independently. You could perhaps argue that the JVM and its peers are
the modern-day successors to the B5000.
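The two-stack distinction is easy to show with a toy sketch (Python, illustrative only; the word names `>R` and `R>` follow common Forth usage, but this is not a real implementation): a word can park a value on the return stack, work on the data stack, and restore it later, because the two stacks grow and shrink independently.

```python
# Sketch of Forth's two independent stacks: the data (parameter) stack
# and the return stack. Not a real Forth; names are illustrative.

data_stack = []
return_stack = []

def push(n):  data_stack.append(n)                   # push a literal
def to_r():   return_stack.append(data_stack.pop())  # >R : stash a value
def r_from(): data_stack.append(return_stack.pop())  # R> : bring it back
def add():    data_stack.append(data_stack.pop() + data_stack.pop())  # +

# "1 2 3 >R + R>" : hide the 3, add 1+2, then restore the 3
push(1); push(2); push(3)
to_r()     # 3 moves to the return stack
add()      # data stack: 1 2 -> 3; return stack untouched
r_from()   # 3 comes back
print(data_stack, return_stack)  # [3, 3] []
```

A single-stack machine like the B5000 (or the JVM) has no equivalent of that independent scratch stack.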

~~~
kjs3
That wasn't the point. The point is that most of the field abandoned stack
hardware for register hardware, upon which all sorts of software paradigms are
implemented.

~~~
kragen
Sounds like somebody hasn't read Koopman yet.

~~~
kjs3
I've read Koopman; fascinating stuff. Really solidified my understanding of
the use of multiple stacks in Forth-like environments.

But I think it's instructive that none of the commercial architectures he
describes (from 1989) have a contemporary descendant (although I think you can
still get the RTX 2000 as a space-rated special order at an astronomical unit
price). I think that tends to support my premise.

~~~
kragen
That's reasonable. I thought you were talking about things like the B5000 and
the 8087.

~~~
kjs3
Well...I can.

The 8087 was basically obsoleted by register-based SSE2 and SSE3, and as I
understand it, on x86_64 the vestigial 8087 opcodes are decoded into
microinstructions executed by the SSEx unit. And the descendant of the B5000
(now called Unisys ClearPath) is a VM running on Xeons; they stopped making
stack-based MCP processors in 2015 or so.

So I think the original point still stands.

------
kick
These types of books have always been a bit frustrating to me. _Coders at
Work_ is another one.

Moore is a genius, and his inclusion is absolutely for the better.

However, if it had come just 5 years earlier, it would have had _so_ much
more value. Falkoff's inclusion was great, and he contributed greatly to the
APL ecosystem, but interviews with Iverson are scarce and hard to come by,
and he had an amazing view of the big picture for these things. His books are
probably the most valuable I have on my shelves.

Also, it leaned far too hard on ALGOL's many derivatives. Roger Hui and Arthur
Whitney would have been more valuable than most of the people included. Even
Forth got three people interviewed, which is admittedly pretty cool!
(PostScript is Forth.)

~~~
yesenadam
>PostScript is Forth

Last year I got into Forth, then PostScript, which seems a kind of dumbed-
down, simplified Forth. On PostScript's Wikipedia page, Forth is not mentioned
as an influence. So I went to change that, and _in the page html_ there's a
comment saying if you came to add Forth, see the talk page. I looked into it a
bit. The previous couple of languages by (one of?) PostScript's creators were
based on Forth (Wikipedia admits that), but then PostScript itself was somehow
not influenced by it?! That actually made me very angry! The PostScript
reference manual reads as if written by lawyers, which maybe sheds some light
on it. They don't admit it's influenced by Forth, but say of course it has
influences, and the next sentence is about Forth. As if they couldn't not
mention Forth, but legally didn't want to spell anything out. Pretty
disgusting treatment of Chuck Moore, seems to me. HN, help me right the wrong!

~~~
kragen
PostScript is really a lot more like Lisp or Smalltalk than Forth. It has GC,
blocks for control flow, local variable binding, symbols, mandatory bounds-
checking, type-safety, and exception handling. Forth conscientiously abjures
such things, and uses compile-time metaprogramming instead in some cases,
which is possible in PostScript (and used routinely for raster image data) but
less flexible.

However, PostScript not only passes arguments on the stack, it exposes things
like the compiler machinery in ways that are more typical of Forth than of the
high-level languages it draws so much from. So I feel like it has some Forth
inspiration, in the same way Python is influenced by SNOBOL or BASIC.

Nowadays I've mostly given up PostScript in favor of SVG and Reportlab, sad to
say.

~~~
DonHopkins
>Nowadays I've mostly given up PostScript in favor of SVG and Reportlab, sad
to say.

JavaScript and the canvas 2d api are the moral equivalent of PostScript, these
days.

~~~
kragen
Yeah, but the canvas API sucks badly enough (mostly due to the constraints of
JS) that PostScript is an appealing alternative. It's super verbose and can't
even produce a PDF; if you print it, it's all pixelated.

------
markus_zhang
Can anyone elaborate on the bottom-up method with a small but concrete
example? Say I want to parse a special format of strings, read part of each
line, and dump it into, say, CSV; how would a Forth programmer approach the
problem?

~~~
mikekchar
It's been ages since I did any Forth programming, but at the start of my
career I spent about 2 years writing Forth code. I have always found that
bottom-up programming is very similar in nature to test-first TDD. You start
implementing one specific part, then you implement another specific part. You
check to see if there is a better representation for it and refactor if you
can. Rinse and repeat.

This is _very_ different from outside-in GOOSE-style development, though. You
start at the lowest level of abstraction and work your way up. It requires you
to do some analysis of the problem up front so that you can see that low level
of abstraction.

WRT your example, I can't really work through it without a real problem,
because the devil is in the details. But generally, if I have to parse some
strings, I'll start with parsing an example of a string in the simplest way I
can. Then I'll expand that code to include another string, refactoring the
code as I go. Once I have some idea what the parsing interface is going to
look like, I'll write some code to read a line from a file. Then I'll work my
way up to call the code that reads the line and sends it to the parser. Then
I'll take a look at the representation for the parsed string and implement a
single CSV output. Then back up to hook that up. Finally, I'll move back down
the abstraction layers and work on more strings. You want to move from that
low level gradually up, refactoring as you go, and then dive back down
repeatedly. Or at least that's how I do it. I don't pretend to be as skilled
at it as Chuck Moore, who is famous for discovering amazing abstractions.
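That workflow can be sketched in Python (the input format here, "key=value;..." lines, is invented for illustration since the original question didn't specify one): the lowest-level words are written and tested first, and the top-level glue is written last.

```python
# Bottom-up sketch of the strings-to-CSV problem, Forth-style:
# tiny "words", lowest level of abstraction first, glue last.
# The "key=value;..." input format is invented for illustration.

def parse_pair(field):
    """Lowest level: one 'key=value' field."""
    key, _, value = field.partition("=")
    return key.strip(), value.strip()

def parse_line(line):
    """Next level up: a whole line of ';'-separated fields."""
    return dict(parse_pair(f) for f in line.split(";") if f.strip())

def csv_row(record, columns):
    """Output side, also built and tested in isolation."""
    return ",".join(record.get(c, "") for c in columns)

def convert(lines, columns):
    """Top level, written last: hook the lower words together."""
    yield ",".join(columns)
    for line in lines:
        yield csv_row(parse_line(line), columns)

lines = ["name=ada; lang=forth", "name=chuck; lang=colorforth"]
print("\n".join(convert(lines, ["name", "lang"])))
# name,lang
# ada,forth
# chuck,colorforth
```

Note the order the functions were conceived in is the order they appear: each one was usable and testable before the layer above it existed.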

I should note that not all problems fit this style of development. Sometimes
you have to work from top down. Normally you can discover this fairly quickly
if you start chasing your tail. You'll implement something, work your way up,
then move down only to discover that you have to change something fundamental.
You do it, moving back up, only to discover that the first bit doesn't work
any more. However, once you discover the top level abstraction you need, you
can go back to a bottom up technique because the lower level abstractions will
be more obvious.

~~~
markus_zhang
Thanks, this is really interesting. I'm definitely not experienced enough to
classify problems as top-down or bottom-up, though I guess that takes years of
experience.

------
dang
The book discussed in general at the time:
[https://news.ycombinator.com/item?id=583164](https://news.ycombinator.com/item?id=583164)

------
shrubble
In reading this interview I couldn't help but think that what he was saying
about his parallel computing chips ended up applying to GPUs.

~~~
agumonkey
I wonder if GPU core arrays can share ~high-level code like the GA144. Passing
bits of Forth thunks onto neighbors felt immensely powerful (and that much
more rope to tie yourself into knots).

------
dewster
If you actually look at the way Forth works, you'll see that every stack
manipulation wastes code space and execution time. Since there is only one
data stack, there are a lot of stack manipulations going on. Forth programmers
are aware of this and do their best to minimize them, which tends to make their
incredibly cryptic code even more cryptic.
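The cost is easy to see on a small example (a Python simulation of a single data stack; the word names `2dup` and `-rot` follow common Forth usage, but this is illustrative only): computing (a+b)*(a-b) takes five words, two of which are pure shuffles that do no arithmetic at all.

```python
# Sketch of the single-stack overhead: to compute (a+b)*(a-b) the
# classic Forth phrase is "2dup + -rot - *", where 2dup and -rot
# exist only to rearrange the stack. Illustrative simulator, not Forth.

def two_dup(s):   return s + s[-2:]                    # 2dup ( a b -- a b a b )
def minus_rot(s): return s[:-3] + [s[-1]] + s[-3:-1]   # -rot ( a b c -- c a b )
def add(s):       return s[:-2] + [s[-2] + s[-1]]      # +
def sub(s):       return s[:-2] + [s[-2] - s[-1]]      # -
def mul(s):       return s[:-2] + [s[-2] * s[-1]]      # *

stack = [7, 3]                                   # a=7, b=3
for word in (two_dup, add, minus_rot, sub, mul): # 2dup + -rot - *
    stack = word(stack)
print(stack)  # [40]  == (7+3)*(7-3); 2 of the 5 words were pure shuffles
```

On a register machine the same expression needs no shuffling, since a and b can simply be read twice.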

If the definition of a low-level language is one that bedevils the programmer
with minutiae, then Forth is the lowest of the low. I don't understand the
fascination others have for it, and I don't understand how anyone can like it
after actually programming with it. It's horrible.

~~~
rabidrat
It teaches you to set things up so that the code doesn't _have_ to do stack
manipulations and other minutiae in the typical case. Most other languages
seem to encourage modules with general APIs and hard boundaries so that the
caller has to unpack/repack/rearrange the data as it enters and leaves. Forth
very deliberately encourages developing a holistic system, and it discourages
wholesale code reuse from other projects and systems, which gives you the
power and flexibility to refactor relentlessly, until only the essence of the
computational solution remains.

Forth is definitely a difficult language to work with, particularly in a
professional environment where managing turnover is massively important. When
I dive into some Forth code that I've written, to make even the smallest of
changes, my brain has to be fully engaged, and that's a non-starter in most
environments (Chuck probably thinks this is a good thing; why are we making
changes to code we don't understand?). But I am still an avid proponent of
learning and applying the principles of Forth, because of the results that it
makes possible. It is quite eye-opening to see directly how a system can
become 10x as powerful, with 1/10th of the code, if you are willing to do the
work and embrace the "minimalist" (I would call it "essentialist") mindset.

