
Levels of code in Forth programming (2002) - pointfree
http://www.ultratechnology.com/levels.htm
======
elcritch
Intriguing article. I just updated some Forth code for interfacing with an ADC
on a sensor I’m building. Dealing with SPI/I2C and sensors/ADCs in Forth
really is fantastic. It results in much more succinct hardware code, IMHO,
than C or even higher-level languages. Chuck Moore really seems spot on when
it comes to dealing with specific hardware.

One example: a simple Forth word (macro) that converts 3 bytes into one 32-bit
number integrates well into code for dealing with an ADC chip.
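
A minimal sketch of what such a word might look like (standard ANS Forth; the
name, the MSB-first byte order, and the spi@ fetch word are my assumptions,
not elcritch's actual code):

    \ Pack three bytes, MSB first, into one cell
    : 3bytes>u  ( b2 b1 b0 -- u )
       swap  8 lshift or     \ merge middle byte into low byte
       swap 16 lshift or ;   \ merge high byte on top

    \ Reading a 24-bit ADC sample over SPI might then be just:
    \   spi@ spi@ spi@ 3bytes>u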

However, I wouldn’t want to write whole applications in Forth, as dealing with
stack swaps becomes annoying (see the sketch below). Still, writing your own
Forth is pretty fun too. I did mine by basing it on C compiler XMacros, which
made porting to an Itsy M4 trivial (about 3-4 of work) [1].
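
To illustrate the stack-swap annoyance above, a toy definition of my own
(standard Forth, not from the linked projects): once a word takes three
inputs, shuffling words start to crowd out the actual logic:

    \ Linear interpolation: x0 + (x1-x0)*t/100, with t in percent
    : lerp  ( x0 x1 t -- x )
       >r over - r> 100 */ + ;   \ >r, over, r> are pure stack traffic

(The word */ multiplies and then divides with a double-width intermediate, a
standard Forth idiom.)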

Still, there are a few Forths for Arduinos, Itsys, and ESPs [2] which are
really fun, as they enable REPL-style interactive programming with hardware
while still being blindingly fast!

1:
[https://github.com/elcritch/forthwith](https://github.com/elcritch/forthwith)
2:
[https://github.com/zeroflag/punyforth](https://github.com/zeroflag/punyforth)

------
haolez
Forth is pretty amazing. It really delivers on that old promise of “a language
that will make you a better programmer in other languages”.

However, from my experience, since there is basically no syntax, all Forth
programs tend to be a DSL for the problem at hand. It’s almost like having to
learn a new language on each new project. It’s the complete opposite of what
makes Go great.
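
A toy sketch of my own (not from the article) of how quickly definitions turn
into a per-problem vocabulary:

    \ A tiny "units" DSL in two words:
    : feet    ( n -- inches )         12 * ;
    : inches  ( inches n -- inches )  + ;

    5 feet 3 inches .   \ prints 63

Reading the call site means knowing the vocabulary, which is exactly the
new-language-per-project effect.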

~~~
adestefan
That is exactly what Chuck Moore wanted when inventing Forth.

~~~
haolez
I know. It’s a feature, but it makes it harder to share code (and Chuck Moore
believes that sharing code is rarely worth it). Maybe he is right. His
productivity is unbelievable :)

~~~
mr_crankypants
The more time I spend dealing with blowback from excess complexity being
imported in the form of 3rd-party libraries that offer complicated solutions
to simple problems, the more I think that Chuck Moore was very, _very_ right
on that point.

~~~
rauhl
I agree with you, but I wonder what his answer to stuff like GUIs would be.
There’s a tremendous amount of complexity and domain knowledge in stuff like
drawing fonts, and in cryptography, and so forth — and very very few of us
have the time to become competent in even one of those, let alone all of them.
Then consider the amount of work necessary to have a modern browser: text
parsing of not one but three languages, language interpretation, more
graphics, more cryptography.

It would be _awesome_ to get back to first principles, but modern systems try
to do so much that I wonder how practical it would be to reinvent them, and I
wonder how practical it is to say, ‘well, don’t do that then.’

~~~
mr_crankypants
I don't know what Moore would say. Personally, I've retreated to the back end
- used to be full stack, but I'm just sick to death of how overcomplicated
front-end work has become.

I'm inclined to say that, e.g., the modern browser is a cautionary tale that
complements the Chuck Moore approach to things: By forever piling thing on top
of thing in an evolutionary way, you end up with a system that ultimately
feels more and more cobbled together, and less and less like it ever had any
sort of an intelligent designer. Perhaps the lesson is that it can be
worthwhile to occasionally stop, take a real look at what things you really do
need, aggressively discard the ones you don't, and properly re-engineer and
re-build the system.

Obviously there are issues of interoperating with the rest of the world to
consider there, and Moore has made a career of scrupulously avoiding such
encumbrances. But a nerd can dream.

~~~
abstract7
Also consider that all we know is a world that has become more global, open,
and relatively peaceful since the 1970s. If collaboration were to slow or
decline, open source would be harmed. And/or if Google and Facebook lose their
dynamism to politics, regulation, and maturity, corporate-sponsored open
source could be shaken. Google could become like AT&T, and Facebook like
Ericsson, in some way.

Once-unstoppable sectors like aerospace (to mix comparisons) began to reverse
and decline in the early '70s. No one really saw it coming. I can't think of
one publicly known or credible person who called it in 1969, shortly after the
moon landing, at least on record. An oversupply of engineers in the US and the
West became a thing. And engineering still suffers here because of aerospace's
decline. Forth began to lose steam around then, right? Forth, hardware, and
Cold War (barriers) politics are inextricably linked, perhaps. And then
GNU/Linux and BSD saw their high-collaboration paradigm birthed around that
time. The Nixon/Kissinger talks with a closed China began around then too, and
now relations are breaking down with a more open China today.

Look how Lua scripting came about, not so terribly long ago. Some parallels:
Brazilian trade barriers. Now half believe Huawei is evil. The cross-hardware
story may be cracking. Many believe Google is evil. Open software may be
cracking. And there are rifts between the US, EU, and China on how to regulate
the internet. A new Cold War may be brewing. It's a nerd's nightmare.

If anyone can tie in distributed ledger and specialized AI coder productivity
tools, or something to counter this argument or round it out, that would be
awesome.

EDIT: I was mistaken. Forth caught on with personal computer hobbyists in the
1980s, per Wikipedia. However, as a career or industry, slowdowns in NASA and
Cold War spending seemed to take some wind out of Forth's sails. I've noted
that a lot of that type of work was what paid people to write Forth. And the
open-source paradigm with C/C++ and GNU/Linux was even more limiting, I
believe.

------
codr7
Before I learned Forth, I was very comfortable in Common Lisp. Now I miss the
convenience when writing Forth and the raw simplicity when writing Lisp.

I realize the problem isn't Forth; some people (such as the author, for
example) are capable of pulling off amazing feats with such a primitive tool.

But as a result, the programming languages [0] I've designed since have all
been part Forth and part Common Lisp.

[0] [https://github.com/codr7/cidk](https://github.com/codr7/cidk)

------
theamk
> Portability is not possible. Real applications are closely coupled to
> hardware. Change the platform and all the code changes

Some lessons simply did not age well... Even in the world of today's
microcontrollers, which often have kilobytes of RAM and very different CPU
styles, people still write mostly hardware-abstracted code.

~~~
astrobe_
It's the usual game of MPUs becoming cheaper or more stuffed for the same
price, because demand allows manufacturing bigger batches, and the demand is
for more portability by means of more stuffed MPUs that allow more
abstractions...

But it's a miniature of the same tragedy as Node.js applications that consume
ten times the resources needed, just because that allows people to do more
with more... in a logarithmic way.

------
bcherny
This is a really interesting read, and as someone who’s almost exclusively
programmed in high level languages, this approach seems alien to me.

A couple of questions:

1. Is it possible to write complex, modern applications (things like
browsers, photo editors, etc. — things that would take millions of lines of
Java or JS) using this style of programming?

2. What is “sourceless programming”? Where is a good place to learn more
about it?

~~~
madhadron
1\. "This style" is kind of hard to pin down. If you mean Chuck Moore's
dramatic minimalism, then, yes, but it won't resemble what most people in
computing expect from a browser or photo editor. If you mean expressing the
abstractions you want directly in the primitives you have without bothering
about layers or even accepting the idea of higher vs lower levels, then, yes,
certainly. It requires a lot of unlearning, though.

2. Sourceless programming was something Chuck Moore tried for a while where
he designed a machine that was the virtual machine he wanted to program,
implemented it in hardware, and then edited byte code directly for it. Later
he stepped back and went to colorForth, which has the source/binary separation
we are all accustomed to.

~~~
bcherny
Thanks for the reply!

> but it won't resemble what most people in computing expect from a browser or
> photo editor.

In the sense that an end user would interact with some low level API
primitives, rather than a full GUI? I’d love more examples or metaphors, or
maybe a link where I could learn more.

~~~
snazz
Chuck Moore’s software is somewhat famous for reducing the complexity of both
the software itself as well as the requirements. In developing a web browser,
he would probably eliminate all of JavaScript, the user interface, and most of
CSS, and leave it to run only on a chip he designed for the purpose. His
ideology is super cool, but isn’t what is expected of software like this.

For instance, OKAD (chip design and simulation CAD package) is 500 lines of
colorForth. Although it includes all sorts of fancy tools, he also applied his
ruthless minimalism to the requirements.

~~~
brokenkebab
I would welcome all of these except maybe a separate chip. E.g., having 3
separate languages built into a browser does look like an overcomplication.

------
jwilliams
Eons ago, as an embedded programmer, I came to respect Forth. I encountered
numerous situations where using Forth led to a much smaller footprint (code
size in particular). Why? For exactly the reasons that Chuck Moore espouses
here: you are writing a purpose-built VM from the hardware up.

Even then I don't agree that portability/abstraction isn't important - it's
got the potential to be an extremely reductionist position. Instead I'd argue
it's incredibly expensive and should be treated as such.

~~~
nickpsecurity
"Instead I'd argue it's incredibly expensive and should be treated as such."

Using a cross-platform framework isn't incredibly expensive in time or
performance cost. It's done by one-person projects and large businesses alike.
There are issues, but that's way overstating it.

Then there were 4GLs like Lansa and Windev that made it easier than creating
non-portable, native applications. Those weren't used for
performance-sensitive code, though. Mostly business apps.

~~~
jwilliams
I wasn't being specific enough - I meant expensive in the most general sense,
e.g., an abstraction is a cost not only in terms of (potential) performance,
but also developer headspace, etc.

The right number of abstractions is very powerful. Too many and you'll sink
under the weight of them.

------
stallmanite
This is really captivating. I think I’m inspired to finally dust off my copy
of TI-Forth for the 99/4a.

------
wwweston
> I was also researching AI in Forth, implementing ideas from LISP examples
> and doing expert systems and neural nets and mixing them and building
> robots. In the software I added a layer for an inference engine for English
> language descriptions of rule sets and a layer for the rules. I wrote a
> learning email report and conversation engine AI program and had it running
> for a few months. My boss could not distinguish it from me. That was my idea
> of AI, smart enough to do my job for me and get paid at my salary while I
> took a vacation.

Is the author exaggerating here, or did they actually succeed at writing
something that could pass whatever Turing-test level their boss could offer?

If it's the latter, what then-current knowledge would they likely have
sourced?

~~~
brokenkebab
I can't know what the author means, but what is now known as chatbots is in
fact very old tech:

[https://en.m.wikipedia.org/wiki/ELIZA](https://en.m.wikipedia.org/wiki/ELIZA)
(1966)

------
MrEldritch
Chuck Moore has always struck me as some kind of alien, not unlike the way
stories about Von Neumann do - that this is a person who is, in his
specialized field, capable of thinking in ways that I just can't, and
achieving things that seem practically magical with it.

~~~
gaze
You might be right... but I have the impression that if you talked to him,
he'd tell you that you'd get similar mileage by following his philosophy of
program design. It's just that most people can't stomach it.

------
pilmihilmipilmi
Does anybody know what Chuck Moore is doing right now? I was following his
posts on the patent lawsuit for a while, but then he went quiet.

------
nine_k
To sum up Chuck Moore's quotations: you write code that uses all of the
machine, and you've got to write all of the code. If you have this, you can
squash out all abstraction and build the ideal solution directly.

This may hold true for small-scale hardware like controllers. They have a
well-defined set of tasks, small enough to fit in your head.

This means that you have a certain trouble sharing the code with your
colleagues, making the bus factor of your project closer to 1, and lowering
the usefulness of code reviews.

This means that you have trouble sharing code with yourself in your next
project.

You become tightly coupled to the machine. This is, on one hand, liberating:
you can do anything easily. But it is also limiting, because you spend your
mental resources on optimizing for this particular machine.

I personally think that deep optimization is something the machine should do;
machines are better than humans at this most of the time. And humans should do
what machines currently can't.

~~~
astrobe_
> This may hold true for small-scale hardware like controllers. They have a
> well-defined set of tasks, small enough to fit in your head.

People usually insist that software should be modular, so that you don't have
to have millions of lines of code in your head when making a local change.
That's what drove the procedural evolution and later the OOP evolution.

So if you're a good boy/girl/etc., you write your million-LOC PC application
as modules that are manageable by (ideally) a single person. Then you need an
extra programmer to glue the modules together.

> This means that you have a certain trouble sharing the code with your
> colleagues, making the bus factor of your project closer to 1, and lowering
> the usefulness of code reviews.

Where does Forth prohibit peer reviews and pair programming? If you have a bus
factor of 1, it is because you don't want to pay the price of increasing it.
It has nothing to do with Forth; plenty of projects in super-high-level,
hyper-readable languages have a bus factor of 1.

> This means that you have trouble sharing code with yourself in your next
> project.

Not really. It's easier to copy/paste/hack Forth code. The code is more
compact for various reasons: point-free style makes it less verbose, you tend
to factor more intensively, and you code exactly what you need.
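
A small illustration of that compactness (my own toy code): point-free
definitions compose with no named variables, so factoring a new word out is
mostly cut-and-paste:

    : square       ( n -- n*n )  dup * ;
    : sum-squares  ( a b -- n )  square swap square + ;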

> You become tightly coupled to the machine. This is, on one hand, liberating:
> you can do anything easily. But it is also limiting, because you spend your
> mental resources on optimizing for this particular machine.

No, it's the other way around. When you code for, e.g., a little-endian,
two's-complement CPU, you don't have to worry about big-endian and
sign-magnitude. You are actually optimizing the programmer's cycles too.

Being tightly coupled to the machine is what embedded programming is really
about. Embedded programming is often about writing esoteric values at occult
addresses in order to bang out bits on an SPI bus. Running a Python program on
Debian on a Raspberry Pi is not really embedded programming.
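
For flavor, a sketch of that register-banging style; the addresses and the
status bit here are invented for illustration and match no real chip:

    hex
    40010000 constant SPI-SR   \ status register (imaginary address)
    40010004 constant SPI-DR   \ data register   (imaginary address)
    decimal

    : spi-ready?  ( -- f )  SPI-SR @ 2 and 0<> ;   \ poll a "TX empty" bit
    : spi-send!   ( b -- )  begin spi-ready? until  SPI-DR ! ;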

~~~
rcombine
It kinda sounds to me like Forth wants to be used in the context of embedded
programming, then.

~~~
astrobe_
Not only. Check out Forth, Inc.'s projects from last year [1]. In one of them,
Forth is used everywhere from microcontrollers to the monitoring PCs.

[1] [https://wiki.forth-ev.de/doku.php/events:ef2018:forth-in-that](https://wiki.forth-ev.de/doku.php/events:ef2018:forth-in-that)

