

Just give me a simple CPU and some I/O ports - jgrahamc
http://www.jgc.org/blog/2009/08/just-give-me-simple-cpu-and-few-io.html

======
abalashov
I've been having that exact thought for quite some time now.

It seems that the art of programming - or at least the practice of it, what
passes for "proficiency" or "competency" - has shifted from the fundamentals
of the machine and the language's intrinsic semantics, which let you actually
create something new, to spending 90-95% of the time trying to figure out
someone else's API. In other words, the real skill now is reading reference
material and trying to figure out how to fit a bunch of opaque, prebuilt Lego
blocks together.

It's just no fun that way, and it relies on very different mental faculties
than the ones fundamentals-based programming draws on. Writing low-level
backend processes in C may have taken a lot more work for boilerplate and for
nontrivial data structures, but it was real creation -
using the actual power of the language to get something done. Now it's all
about figuring out how the SynchronousIOGoatSoapBubbleVectorManagerFactory
interacts with the
SynchronousIOGoatSoapBubbleVectorReflectorParserTransformer.

~~~
mechanical_fish
I know where you're coming from, but I wonder if this observation is just a
classic language rant waiting to be born. In other words: I wonder if the
problem is _not_ that all higher-level APIs necessarily suck, but that the
ones you _have_ suck, and your language platform is too impoverished to
profitably work around that.

API design is _hard_. It's like composing poetry, or trying to design the game
of chess. ("I know, the queen should be able to jump like a knight if she's on
a white square!") It is bound to take thousands of little trials and errors in
order to get things right. And my suspicion is that certain languages and
platforms make it especially difficult to tinker with existing APIs, or wrap
them while minimizing the leaks, or replace one part of them and leave the
others alone, or build them in a way that makes it easy to compose complex
higher-order objects and behaviors out of little pieces. When working with
those systems, the most efficient way to solve any given problem is to just
slog through, trying to make lemonade out of slightly-spoiled lemons by hand-
sorting and hand-squeezing each lemon. And that can be effective -- at least
it's more efficient than trying to use genetic engineering to construct a
lemon out of raw amino acids -- but it's not always fun.

There are high-level APIs that are a joy to use. I love jQuery, for example,
which is a layer of Javascript atop more Javascript atop an absolutely _scary_
C program atop Unix or Windows and yet somehow manages to feel lightweight and
elegant and composable. But for every elegant API there are dozens of clunky
ones.

The lovely thing about low-level C or assembly is that the building blocks
tend to be simple, understandable, and composable -- and, if they aren't,
it's not so much work to rewrite them. (Though, as jgc points out, in modern
times even assembly is hardly immune to complexity. The idiosyncrasies of
modern processor architectures are so baroque that it takes years for
optimizing compilers to be refined to take advantage of them.)

~~~
jacquesm
jQuery is a good example to use. I've only been into it for a couple of days
now (after asking on HN what would be a good JS library), and I already like
what it does:

It hides the mess!

Most software is so absolutely messy. Just the other day there was an argument
that it is perfectly OK for the user interface that results from interpreting
a spec'd document to be only an approximation. Computers are supposed to be
deterministic: you're supposed to get _exactly_ the results you want and not
something 'good enough for government work'.

Elegance breeds excellence; bloat breeds badness. Cellphones that crash (who
would ever have accepted that?), computers that need to be periodically
reinstalled, and APIs that have manuals 10 times the size of the operating
system components they interface to.

Complexity is a given, so we should strive to make the complex simple instead
of more complex.

------
pmorici
My opinion may be in the minority here, but I think the crux of this article
is misguided. It's like a farmer saying, "I long for the days before the
combustion engine because I loved planting 3 acres by hand".

I also think the two basic premises of the article are just plain wrong. There
are plenty of examples of people writing their own operating system from
scratch for the x86; there have been articles posted on HN describing as much.
So to say that today's processors are "too complex to understand" is just
wrong. As for the assertion that programming has devolved into "learning
another man's APIs" - that's just a fact of engineering. I suspect the Z80 had
a thick manual describing all of its interfaces and inner workings. The AVR
microcontrollers I've worked with, the same ones on the Arduino the author
says is still "fun", have a manual over 320 pages long describing, in effect,
the processor's API.

~~~
Goladus
It's quite probable that x86 processors are more complex than they need to
be -- at least, the instruction set is more complicated than necessary. CISC
architectures are meant to make life easier for assembly language programmers,
letting you do things like load, multiply and store in a single instruction.
In practice that hasn't been all that valuable, because programmers just use
higher-level languages like C when they need that much expressive power. From
a compiler perspective, it's easier (or at worst about the same) to generate
code for RISC than CISC, and RISC has much more flexibility with regards to
automatic optimization, pipelining, etc.
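
To make the contrast concrete, here's a minimal C sketch; the assembly in the
comments is illustrative of the two styles, not the output of any particular
compiler:

    /* One C statement, two code-generation styles (illustrative only). */
    void bump(int *p, int k) {
        /* CISC (x86) can load, add and store in one instruction:
         *     add  DWORD PTR [rdi], esi
         * A classic RISC (e.g. MIPS) does it in separate steps:
         *     lw   $t0, 0($a0)
         *     addu $t0, $t0, $a1
         *     sw   $t0, 0($a0)
         */
        *p += k;
    }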

~~~
anamax
> RISC has much more flexibility with regards to automatic optimization,
> pipelining, etc.

Folks designing processors haven't believed that since before the Pentium was
introduced. (I went to dinner with some of the MIPS principals right after the
first Pentium tech talk. Their conclusion was that everyone would finally
figure out that the RISC/CISC wars were over and they'd lost.)

There are lots of things that go into designing a high performance processor.
Instruction decoding and its consequences have little effect/cost compared to
everything else.

~~~
Locke1689
RISC won. Intel continues to ship with CISC-style artifacts, but that is only
because _Intel has almost never removed a feature from its microprocessors._
However, the actual processor implementation is designed with CPU microcode,
which is a RISC architecture. Even when you think you are doing something
"CISCy" in an Intel proc, it's translated to RISC behind the scenes.

~~~
anamax
> RISC won.

Yup, MIPS and SUN are thriving companies and Intel shut down.

> However, the actual processor implementation is designed with CPU microcode,
> which is a RISC architecture. Even when you think you are doing something
> "CISCy" in an Intel proc, it's translated to RISC behind the scenes.

Do you really think that RISC machines don't have microcode? (They also have
multi-cycle instructions and the like.)

The claim was that RISC ISAs had inherent advantages that would make CISC
ISAs uncompetitive. That claim was wrong.

------
jacquesm
Thank you for that; it echoed my own thoughts better than I could ever have
put them into words.

My frustration with the amazing amount of bloat that you have to deal with in
order to do the simple stuff knows no bounds.

A PIC chip has more power than the machines that made Apollo 11 possible. I
wonder if, with today's technology, we'd be tempted to go for some 'high tech'
solution and mess it up because of that.

In the mid-80s I worked for a Dutch artist on a project called 'SonoMatrix',
a room full of speakers with a bunch of computer-controlled tape recorders,
amplifiers and channel switches attached to it.

The whole thing ran off a Beeb, via the user & printer ports. We designed the
hardware, wrote the software (both the controller in 6502 assembler and the
user interface) and built the whole thing.

If I had to do that today I wouldn't even know where to begin...

~~~
HeyLaughingBoy
You'd get an old PC from your closet, install Linux (text only) and gcc and
get the whole thing working in less time and for less cost, probably.

...at least that's what I'd do :-)

------
mhb
A CPU built from 74-series TTL chips, running a web server:
<http://www.homebrewcpu.com/>

~~~
jacquesm
you ought to post that individually, that's a really neat hack.

~~~
mhb
Didn't get any love: <http://news.ycombinator.com/item?id=741639>

~~~
jacquesm
I think a big factor in that is when something gets posted and whether or not
it gets traction immediately. If that doesn't happen, it's gone before anyone
notices.

I completely missed it yesterday. The 'new' page fills up so fast sometimes
it's not even funny. And that's ignoring the spam. One thing that would help
here is a minimum delay before the same user can post another link.

Your comment here got more 'points' than the original posting.

~~~
jgrahamc
I have reposted it here: <http://news.ycombinator.com/item?id=743583>

------
scott_s
The fact that a single person can understand an ever-smaller percentage of the
whole of a computer system indicates _progress_.

There was a time when any given physicist probably had a good grasp on most of
their field. I doubt that's still the case. I do sympathize with the romantic
notion of knowing it all, but let's not confuse this notion with a call to
action.

------
wwalker3
I think the completely comprehensible system has always been an illusion.
There's always some cutoff level below which people don't understand (or don't
care to understand) the system they're working in.

The old Apple II hackers had a great understanding of assembly, but that's
because it was the top level of the system to them, the stuff they worked with
daily. I doubt most of them understood the PLA that decoded the instructions,
or the behavior of the dynamic NMOS logic inside the 6502.

The 6502 instruction set was essentially the API of the processor, and it
wasn't above reproach any more than current software APIs are. Many people
wished for different addressing modes or additional instructions to simplify
common coding tasks.
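
For instance (a hypothetical C rendering, not an example from the thread): the
6502 has no 16-bit add, so bumping a 16-bit value means chaining 8-bit adds
and propagating the carry by hand - on the real chip, the familiar
CLC / ADC #1 / ... / ADC #0 sequence:

    #include <stdint.h>

    /* What 6502 programmers did by hand: a 16-bit increment built
     * from 8-bit adds, with the carry propagated manually. */
    uint16_t inc16(uint8_t lo, uint8_t hi) {
        uint16_t low   = (uint16_t)lo + 1;       /* ADC #1 on the low byte    */
        uint8_t  carry = (uint8_t)(low >> 8);    /* the carry flag, in effect */
        uint8_t  hi2   = (uint8_t)(hi + carry);  /* ADC #0 on the high byte   */
        return (uint16_t)((uint8_t)low | (hi2 << 8));
    }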

~~~
limmeau
However, the ratio between how much there is to know (increasing) and how much
fits in a head (constant) is growing, which is sad; whether it has ever been 1
seems rather unimportant to me.

~~~
neilc
The fact that the ratio continues to grow seems like a fundamental property of
technology -- personally I don't find it sad. In exchange for more powerful
tools, you inevitably need to accept more levels of abstraction and more
underlying complexity.

------
irrelative
_I can no longer understand the computer. I am forced to spend my days in the
lonely struggle against an implacable and yet deterministic foe: another man's
APIs._

Having worked as an electrical engineer creating hardware, I find this
strange. Obviously the computer's designers had to create an API of some sort,
even if they did it in transistors. Computers aren't given to us from above --
there are people creating them too.

~~~
asciilifeform
There is a fundamental difference between a hardware "API" and a software one.
See <http://www.loper-os.org/?p=37>

~~~
arakyd
It's not a fundamental difference, it's a consequence of the fact that
hardware people are at the bottom of a very big stack and have a massive
financial incentive to be as solid and predictable as possible. Higher up the
stack everyone prefers to use relatively cheap programmers and build stuff
quickly.

The problem is not having to deal with software APIs, the problem is the sheer
size of the stack and the sheer number of accumulated assumptions that are
built into it. Moving more pieces into hardware might improve the stack's
overall integrity and reduce bugs, but it won't do much to reduce the size.

The real issue, IMHO, is that no one wants to admit that the general reuse
problem is hideously, horrifyingly difficult. The biggest problems it causes
are diffuse and long term, and in the short term everyone can do things faster
by hacking together their old code with someone else's 95% solution library,
so that's what everyone does. Putting enough thought into each new application
to really do it right tends to be hard to justify on a business level, and
most programmers have neither the inclination nor the skill to do it anyway.
It's so ingrained that even people who are frustrated with the way things are
think that a different operating system or language would solve the problem.
It wouldn't - it would only start the process again, with at best a percentage
reduction in stack size and corresponding percentage increase in time to
frustration. I think it boils down to the fact that code reuse is basically a
2^n problem, and the bigger and more opaque the stack gets the harder it is to
cheat that 2^n.

The only potential solution I've seen is what Chuck Moore is doing with Forth
chips. He's now at the point where he can design and manufacture chips that
are cheap and simple in design but are very good at running Forth very
quickly. Of course the tradeoffs are (perhaps necessarily) as horrifying as
the reuse problem: it demands a lot more from programmers in general, and in
particular requires them to learn a radically different way of doing things
than they are used to, while at the same time strongly discouraging reuse at
the binary and source levels. In other words, he's spent decades
designing a small core language and hardware to run it, and that's really all
you should be reusing (along with data transfer protocols). Needless to say,
no desktop or web or server programmer (or said programmer's boss or company)
is ever going to go for this unless problems with reuse become far worse than
they are now. (Even then the supply of cheap programmers might grow fast
enough to keep masking the problem for a long time.) Most programmers are not
very good, managers like it that way, and most of the smarter programmers are
nibbling around the edges or looking for a silver bullet.

In short, there are no easy solutions. If you don't like the direction
software is going, think about becoming an embedded systems programmer.

------
cesare
I've felt exactly the same way in the last few years.

A complete understanding of the whole system in every detail is not so
necessary for me. I just want to focus on the things I'm trying to accomplish
without constantly looking up API documentation and writing glue code.

That's why (for my own projects) I always end up making my own tools and
coding almost all the stuff I need by myself.

But our culture is going in the opposite direction
(<http://www.wisdomandwonder.com/link/2110/why-mit-switched-from-scheme-to-python>).

------
Mintz
_"Don was responsible for the LM P60's (Lunar Descent), while I was
responsible for the LM P40's (which were) all other LM powered flight". Two
men were able to write all that code and understand its operation._

Is this perhaps why we don't use more advanced technology for space flight
today - it's too complex? I've always wondered just how much more we could
accomplish if we used modern computing technology in space shuttles, but if
safety is of the utmost importance, maybe the complexity is a bad thing?

~~~
rbanffy
I like to draw a distinction between advanced technology and complicated
technology. A 45nm AGC would be more advanced. An AGC running Windows CE would
just be more complicated.

------
brianobush
Love the article and, like many others here, enjoy the simple. I still have
the pleasure of coding in C at work, and my fun work is currently writing an
app for the Nintendo DS. Truly a simple system.

A while back I was interested in designing/constructing a simple 4-bit CPU;
here is one that was actually completed: <http://www.vttoth.com/vicproc.htm>

------
billswift
Off-topic, but someone mentioned Forth chips in a comment here, and I have
read about Lisp machines, and C is often referred to as high-level assembly
language - has anyone tried making a computer that actually runs C, or a
strong subset of it, as its actual assembly language?

------
giardini
There's always the language Forth.

~~~
FraaJad
That was my first thought too when I read the title. Chuck Moore's site is a
good place to start - <http://colorforth.org/>

------
yan
I can definitely relate. I'm building a monome(.org) clone with an Arduino and
having a blast.

------
DanielBMarkham
I've felt this way for several years -- we've gone from engineering to
something more like witch doctoring.

The problem is, of course, that all abstractions are leaky, and we've piled
abstraction after abstraction on top of the hardware. So it's not unusual to
have four or five levels of stuff between you and the machine. Adding to that
is the problem of multi-cores: it's not just one machine operating anymore.
Each piece is deterministic, sure, but as all the pieces interoperate in real-
time that determinism can be so hidden as to be effectively non-existent.
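
A minimal sketch of that hidden nondeterminism, assuming POSIX threads (build
with -pthread): each loop is deterministic on its own, but the final count
depends on how the cores interleave the increments:

    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;

    static void *bump(void *arg) {
        (void)arg;
        for (int i = 0; i < 1000000; i++)
            counter++;              /* unsynchronized read-modify-write */
        return NULL;
    }

    int main(void) {
        pthread_t a, b;
        pthread_create(&a, NULL, bump, NULL);
        pthread_create(&b, NULL, bump, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        printf("%ld\n", counter);   /* rarely 2000000; the schedule decides */
        return 0;
    }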

Languages are going to be able to help to some degree, but at the end of the
day machines are just going to keep getting more and more complex and our
understanding of them shallower and shallower.

It'll be interesting to see if there is a major hardware refactoring that
takes place anytime soon. I imagine for AI to work we're going to need it.

------
joshu
beagleboard?

------
torpor
One thing that I have found helps with exactly this situation - and I have
suffered it as well over the years - is to move your focus from one end of the
spectrum to the other. That is, if you have to spend your daylight hours
grokking another man's API, then spend a few hours in the evening, or during
off-time, hacking on your own embedded project with the Arduino. I've got tons
of projects around, all of them admittedly making slow progress, with the
purpose of getting me out of my funk... I don't actually code my hobby
projects for any other reason than to make my professional work (embedded
systems for safety-critical applications) a little more enjoyable.

It sure is fun to dive into Android, for example, and get some context, and
then another week spend some time with the AVR compiler, then the Beagleboard,
etc.

