
A Forth Story... - pmarin
http://groups.google.com/group/comp.lang.forth/browse_thread/thread/4e15a2197c0aaafe/95c3f82d1c681296?#95c3f82d1c681296
======
jacquesm
Quite a story, and it seems it does not have a happy ending, but for all the
wrong reasons.

About 2/3rds in (past the 'read more') the Novix rates a mention. I think that
it is possibly the most underrated CPU design that ever saw the light of day;
at the time it was so out of the ordinary that only a very few people knew
what to do with it.

A high-level language (Forth is by most definitions a high-level language)
directly executed by the CPU.

At the time I worked for a company called dadadata in the Netherlands; they
were in the process of developing a bunch of real-time software to process
video images for classification and recognition (1988 or thereabouts). The
Novix chip arrived and would have definitely blown the socks off anything that
I could have done with a regular PC at the time, if not for one small problem:
someone dropped a dime into the power supply and the whole development kit was
toast. A couple of all-nighters later we had a working reproduction of the
code in 'C' and that was used for the customer demonstrations. Still, the
power of that little chip is something I'll never forget and it is a pity that
the Novix and its successor (which IIRC was called the ShBoom) never made real
headway. The Forth code for the Novix was probably only about 10% of the size
of the equivalent C code. The head Forth honcho there sneeringly called C a
'great' language.

There is still hope, even today, that one day that power will finally be
available in some package that sees wide distribution; the heritage of the
Novix lives on in the GreenArrays line (which is now shipping dev boards and
chips).

<http://www.greenarraychips.com/>

To kill Forth and the concepts behind it permanently will take a lot of garlic
and stakes; I think it will always have its champions.

~~~
msutherl
I took a close look at GreenArrays recently and found a few blogs from
engineers who were beginning to play with the chips. At first they seemed too
good to be true, but it appears that the architecture is only a good match for
tasks that can easily be parallelized by hand. Computer vision would be one of
those tasks, but then is rewriting CV code from scratch on a new architecture
something you can viably do anymore?

Anyway, I hope these find a niche so that this kind of architecture might one
day make a comeback. The advantages are staggering.

~~~
gruseom
_the architecture is only a good match for tasks that can easily be
parallelized by hand_

Could you expand on that?

~~~
tern
It forces you to structure your application as tiny communicating processes
which you route by hand. You're actually coding quite close to the hardware.
When you want a process to communicate with another – somebody correct me if
I'm wrong – it can only talk to processes that are physically adjacent on the
chip, and you actually write something like "the one to the left/right". Each
process is very small, and one of the blogs I looked at was very much about
optimizing code size to fit into such a small space. The result is that if
your problem does not easily map to this kind of architecture, you have no way
around it.
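
To give a feel for it (this is not real arrayForth, just a sketch in ordinary
Forth, with made-up words LEFT@ and RIGHT! standing in for the chip's port
reads and writes, which I believe block until the neighbour is ready):

    \ hypothetical sketch: LEFT@ reads a value from the left-hand
    \ neighbour, RIGHT! writes one to the right-hand neighbour.
    \ Both are assumed to block until the other node is ready,
    \ which is how the nodes stay in sync without any scheduler.
    : relay ( -- )  begin  LEFT@     RIGHT!  again ;  \ pass values through
    : scale ( -- )  begin  LEFT@ 2*  RIGHT!  again ;  \ transform on the way

A whole application ends up being a grid of tiny loops like these, wired
together purely by who sits next to whom.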

But the same thing can be said of programming at a low level on a von Neumann
architecture. Such architectures are an exceptionally bad match for what the
GreenArrays (dataflow) architecture is good for, namely highly concurrent
applications that process flows of data, such as sensor networks, computer
vision, and all manner of routers.

The way von Neumann systems handle this problem is by having high-level
programming languages (like C), which give you some level of abstraction from
the hardware so that you can model your program however you like and force it
to map onto the architecture, even if the problem intrinsically doesn't map
well. This is why parallel programming is a hard problem. It's not
intrinsically a hard problem (well, parts of it may be); it's just a hard
problem on architectures that were not designed for it. On a GreenArrays chip,
parallelism is the most natural thing in the world.

Currently GreenArrays has no equivalent of C, and as I understand it, this is
unlikely to change, as the philosophy of Forth is to stay close to the metal
and make that an enjoyable experience. That's probably a good thing, actually:
it's more efficient to design problems to fit the hardware than to use one
architecture for absolutely everything, as we've been doing for the past 40
years or whatever (with notable exceptions like the Transputer, the Connection
Machine, XMOS XCore chips, and FPGAs to a certain extent).

~~~
RodgerTheGreat
Your description of inter-core communication in the GA144 is on the money. In
addition to the challenge of mapping your problem to the grid, you have the
interesting problem of _initializing_ all those cores with their programs,
since it's only possible to do serial communication with a few cores on the
edges of the chip. In one of his recent "fireside chats", Chuck discussed how
he first fills the chip with a zigzag pattern of programs that act as a
pipeline, then pumps the desired binary along it, replacing the scaffolding
backwards toward the serial port he's using. It reminds me a bit of the design
of biological systems: not only do you have to write a program, you have to
make it self-assemble.

If anybody's interested in the GA144, GreenArrays recently put up a series of
free self-paced courses about the architecture:
<http://school.arrayforth.com/>

~~~
msutherl
That sounds awesome – reminds me of cellular automata – but also, um, I just
can't see Joe Engineer figuring that out. My first thought when I started
researching GreenArrays was that this was an amazing idea that I would love
to see take off, but their UX and marketing will really get in the way. Not
that they have much competition, though – the entire microcontroller world is
hopelessly obscure, with the exception of Arduino and all of the projects it
inspired.

~~~
RodgerTheGreat
Fortunately Chuck's already done the nasty parts and you can (presumably)
leverage his upload code if you use GreenArrays' tools, but there's always the
possibility that your code will require some very specific sequence of node
initialization and you'll have to wade in and do it manually. I agree that
this tech will probably never be mainstream. If I had an unlimited supply of
free time I would be very tempted to develop some third-party devtools for GA
chips: I enjoy programming in Forth and I could get used to the F18
instruction set, but using a custom keyboard layout and ditching my text
editors and source control systems in favor of a block editor is pretty hard
to swallow.

------
mark_l_watson
I got one of the first Apple II computers (serial number 71) and after I wrote
the silly little Chess program that Apple provided on a demo cassette tape, I
was considering getting FIG Forth running on the Apple II.

One Saturday morning I was in the local ComputerLand store in San Diego,
talking about the side project of porting Forth that I was just beginning. One
of the new sales guys got really pissed off and started ranting about why I
would waste my time on that, lots of people were doing it, etc. The owners of
the store, Dan and Dave, were friends, and they glared at their sales guy, and
the conversation turned to more pleasant topics.

I decided that afternoon that the obnoxious guy was perhaps right and instead
started spending time with Bill Budge's cool 3D library for the Apple II.

A few months later someone came out with an inexpensive product that was Forth
ported to the Apple II and I always wondered if the sales guy who got so mad
that Saturday had any involvement with that product.

~~~
rbanffy
> A few months later

Unlikely. Porting Forth is not a few-months-long project ;-) Most of Forth is
usually written in Forth anyway - all you have to do is port some primitives
and you're set.
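
Roughly (just a sketch, not FIG Forth's actual source): once a handful of
primitives like DUP, SWAP, DROP, ROT, @, !, >R and R> exist as machine code
for the target, everything above them is ordinary colon definitions that carry
over unchanged:

    \ high-level words built only from a few machine-coded primitives;
    \ these port to a new machine without touching any assembly.
    : OVER ( a b -- a b a )    >R DUP R> SWAP ;
    : NIP  ( a b -- b )        SWAP DROP ;
    : 2DUP ( a b -- a b a b )  OVER OVER ;
    : +!   ( n addr -- )       DUP @ ROT + SWAP ! ;

(Real systems code some of these in assembly too, for speed, but nothing
forces you to before the system is up and running.)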

In any case, I did a lot of interesting stuff with Paul Lutus' GraFORTH and
TransFORTH. Much more mileage than I ever got out of Aztec C.

BTW, I probably shouldn't say this, but that chess program was most likely the
one that came on the boot disk of my first Apple II clone (I live in Brazil
and it was impossible - as in "a felony" - to get a computer from abroad).
Thanks for the many hours of challenging chess. I never had the patience to
play against humans, however. Meat is too slow. ;-)

~~~
jacquesm
He didn't say it took that person a few months to do a port; he said that
person launched their product a few months later.

~~~
mikeash
I think he's saying it was unlikely that the sales guy was involved, given the
time interval.

------
codgercoder
One of the problems with our field is the fixations that programmers, and
organizations, develop on particular solutions. These form a really bad
alliance with language/platform specialization, leading to a reinforcing
spiral that shuts experienced, adaptable programmers out of the market. If
Asia were not available to pick up the slack, the problem might be more
noticeable.

~~~
marshray
In some alternate universe, the Forth/Forth++ programmers are sneering at the
silly die-hard C programmers. :-)

Back in the time when most of the story takes place, there was not the glut of
well-developed, thoroughly documented, cross-platform tools with helpful user
communities that we have today. Every machine had its own set of 2 or 3 first-
class development packages (e.g., Pascal, COBOL, BASIC, FORTRAN: choose any
two).

So when a small, elegant cross-platform development solution like Forth or C
came along, folks would get really excited about it. If you found something
you could be productive with, and could take it from job to job, it was hard
not to. It just didn't make sense to use the best tool for that particular
machine when you knew there was a good chance you would never use that
vendor's hardware for any other project.

------
kolev
Forth is as simple and beautiful as Lisp and Smalltalk are. It's, in fact,
widely used... in the form of PostScript, which is based on Forth. Back in the
Apple ][ days, Forth became popular as GraFORTH. As a language, it has its
limitations today, but it's not impossible that Forth will get the Clojure
that Lisp got and start its second life.

~~~
RodgerTheGreat
You might argue that Factor is "the new Forth".

~~~
kolev
You're right!

------
grout
Forth's requirement that you carry the stack in short-term memory while coding
and debugging has seemed to me its deepest flaw as a tool for mortals.

~~~
RodgerTheGreat
Honestly, maintaining the stack in your head is a skill that can be learned
with practice. In my own experience it was very hard at the beginning, but
over the course of a few weeks it became second nature. You get used to
idiomatic ways of doing things: arranging expressions so that they don't
become deep, factoring words apart to reduce the number of elements you care
about in a given context, knowing when to use a variable or two to untangle a
complex expression.
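
For what it's worth, a tiny made-up example of the kind of factoring I mean:

    \ a one-liner like   : ring-area ( r R -- n )  DUP * SWAP DUP * - ;
    \ works, but you replay the whole stack in your head while reading it.
    \ Factored, each word keeps at most two items in play:
    : squared   ( n -- n*n )  DUP * ;
    : ring-area ( r R -- n )  squared SWAP squared - ;

    \ and sometimes parking a value in a variable beats OVER/ROT acrobatics:
    VARIABLE scale
    : *scale     ( n -- n' )           scale @ * ;
    : scale-pair ( a b f -- a*f b*f )  scale !  *scale SWAP *scale SWAP ;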

From the debugging side, well-factored code consists primarily of short pure
functions with a very small number of code paths. This couldn't be a more
perfect scenario for TDD. In my own projects I use an extremely simple unit
testing setup based on the Test Anything Protocol (TAP). Here's an example of
a fixture:

<https://github.com/JohnEarnest/Mako/blob/master/lib/Test/testMath.fs>
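
(For anyone curious what TAP boils down to: it's just "ok"/"not ok" lines on
standard output plus a plan line. This isn't the set of words from that file,
but a toy version in plain Forth would look something like this:)

    \ toy TAP emitter, not the actual Mako test library:
    VARIABLE test#
    : plan  ( n -- )     ." 1.." . CR  0 test# ! ;   \ announce how many tests
    : ok?   ( flag -- )  1 test# +!  test# @ SWAP
                         IF ." ok " ELSE ." not ok " THEN . CR ;

    \ usage:
    \   2 plan
    \   3 4 +   7 =  ok?    \ prints "ok 1"
    \   10 2 /  4 =  ok?    \ prints "not ok 2"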

------
ianterrell
What happened to the first three?

