
How I coded in 1985 - jgrahamc
http://blog.jgc.org/2013/04/how-i-coded-in-1985.html
======
edw519
That headline got me to thinking...

    
    
      Date:     Monday, April 29, 1985
      Age:      29
      Location: Santa Ana, California
      Company:  electronics manufacturer
      Setup:    dumb black & green 24x80 14" CRT terminal
      Hardware: Honeywell mini
      OS:       Honeywell proprietary
      DBMS:     Pick (Ultimate Flavor)
      Language: BASIC
      App:      Work Order Processing (I wrote from scratch.)
    
      Date:     Monday, April 29, 2013
      Age:      57
      Location: Miami, Florida
      Company:  aerospace manufacturer
      Setup:    Windows PC, 19" & 32" flat-screen monitors
      Hardware: who knows
      OS:       Unix
      DBMS:     Pick (Unidata flavor)
      Language: BASIC
      App:      Work Order Processing (I'm writing from scratch.)
    

That's right, everything I wrote in 1985 would compile and run perfectly in
the environment I'm connected to in my other session right now. And everything
I'm writing today would have also worked on that mini-computer back then. I
wonder how many other programmers could make the same claim.

I have learned and continue to use many other client/server and web-based
technologies since then but somehow, I always come back to "old faithful".

~~~
pinaceae
but isn't it questionable that you're (seemingly) writing the same stuff, from
scratch, for 28 years now? shouldn't there be a work order processing standard
solution for aerospace by now?

like an engineer re-inventing the process to build bricks anytime a new
building is requested.

industry observation, not personal.

~~~
edw519
I'm not writing the same stuff. I've written over a thousand different custom
applications _using the same technology_. That was my point. This environment
is so tried and true and so effective, it still kicks ass after 28 years.
Pretty amazing when you think about how much has changed.

 _shouldn't there be a work order processing standard solution for aerospace
by now?_

I'm sure there is. Just like there are blog, website, & CMS standard
solutions. And plenty of people using them. I never see them because those
people would never call me. Like many other programmers here, I imagine, we
only see the customers who need something custom, almost always for good
reason.

~~~
seanwoods
+1 for Pick. I've met several people who were/are Pick devotees. Also heard
stories about how Dick Pick was kind of crazy.

~~~
joe_bleau
My mentor knew Dick, and I also heard stories!

~~~
ianmcgowan
A shameless plug for people wondering "what is this PICK thing?" I host
<http://www.pickwiki.com>, in an effort to capture some of those fascinating
(to me) stories. The guy that created it really was called Dick Pick.

I wouldn't choose PICK for a brand new project today, but as Ed says, there's
life in the old dog yet...

------
robomartin
This is neat stuff, but it also shows how far behind some schools can be in
technology. Yes, I did the hex-keypad-assembler-in-your-head thing, but, by
1985 there were far better options. One word:

    
    
      Forth
    

Funny enough, I built an eleven processor parallel-ish computer in 1985 to
control a large (three feet high) walking robot. That's in the days when you
couldn't buy kits for this stuff. You had to design all of the mechanical
bits, machine the parts, design the electronics, build the boards and then
program it yourself.

The machine had eleven 6502's plus an additional one acting as the supervisor
and comms link to a PC. The entire system was programmed in Forth, which I
mostly rolled out from scratch. This included writing my own screen-based text
editor. The PC ran APL.

The project was a bit more elaborate than that. As a very --very-- green
college student I ended up presenting a paper on this project at an
international ACM conference.

Once you bootstrap an embedded system with Forth the world changes. In '85
there were few reasons to slog it out using hand-coded assembler to automate a
labeling machine. My guess is that he and/or his school did not have any
exposure to the growing Forth embedded community and so they ended up doing it
the hard way. That's still good. You learn a lot either way.

~~~
kragen
My experience so far is that my Forth is shorter, but my assembly has fewer
bugs. Maybe this means that Forth wins for larger projects, because it's
harder to read but easier to write?

~~~
robomartin
Perhaps your last few words reveal the reason for the bugs? Forth --or any
other language-- is hard to read only when someone doesn't know it or does not
use it frequently enough. For example, I programmed in Lisp nearly every day
for a couple of years. I could read Lisp without even thinking about it. That
was fifteen years ago. Today I would struggle for a couple of weeks to make
sense of a non-trivial Lisp program.

If you use Forth as intended I can't see how you would introduce more bugs
when compared to anything. Few languages are inherently buggy. Bugs can nearly
always be traced to the programmer, not the language.

~~~
kragen
All bugs can be traced to the programmer, of course. I find Forth error-prone
because it's untyped (not even dynamically typed) and you have parameter-
passing errors, where you pass a parameter or get a return value from the
wrong place. Traditionally, too, you don't have local variables, but I think
that's not actually nearly as big a problem as it sounds.

That said, my last few words were exactly backward from what I meant to say.
Harder to write than assembly, because you have to take more care to avoid
stack-effect bugs and you don't have local variables; but easier to read,
because your Forth isn't full of arbitrary noise about which register you're
using for what.

~~~
robomartin
You are not looking at Forth correctly. It isn't C, C++ or Java. It's a "raw"
language that you should use to write a DSL. It's just above assembler yet
many orders of magnitude faster to develop with.

Even with your comment backwards it makes little sense. I have developed in
nearly every language between typing raw machine code with a hex keypad and
Objective-C (not to imply that O-C is at the opposite end of the scale). I did
years of Forth. Not one thing you are saying rings true. If you know a
language you speak it. It's as simple as that.

~~~
kragen
Oh, I know that about the EDSL nature of Forth, and it's entirely possible
that if I really knew Forth, my experience would be different.

On the other hand, if you were right that Forth is many orders of magnitude
faster to develop with than assembler, we'd see people writing HTML5-compliant
web browsers in Forth in a few hours. (I'm figuring: 100 person-years for e.g.
Chromium, multiplied by ten for assembly, divided by six orders of magnitude
(because five isn't "many") gives you about 8 hours.) So I suspect you're
living in some kind of a fantasy world.

~~~
robomartin
Nope. Wrong again. Writing a browser isn't the right fit for Forth. Could you?
Sure. Should you? Nope.

Forth is great to develop with but comes with liabilities if used on the wrong
project. I remember a product a friend of mine did using Forth. High end
embedded system with custom hardware all selling for tens of thousands of
dollars per unit. When he started to look into selling the business Forth
became a liability. Nothing wrong with it. The code worked well and was mostly
bug-free. The company bidding to acquire his business did not do Forth and saw
it as an impediment. They, like a million others, did C. Ultimately he sold
the company for $16 million with a requirement to translate the code base to C
under contract.

Today I use C for any time I need to do embedded work. Having a foundation in
a diverse range of languages means that you tend to use a language like C as a
toolset to create an efficient development environment through domain specific
libraries, data structures and data representation.
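
To make that concrete, here is a minimal sketch of such a domain layer in C;
the stepper-axis type and every name in it are invented for illustration, not
taken from any real project:

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical domain layer for a stepper axis: application code
       reads in domain terms instead of raw register pokes. */
    typedef struct {
        volatile uint32_t *step_reg;  /* memory-mapped step-pulse register */
        volatile uint32_t *dir_reg;   /* direction register */
        int32_t position;             /* current position, in steps */
    } axis_t;

    static void axis_step(axis_t *a, int dir) {
        *a->dir_reg = (dir > 0);
        *a->step_reg = 1;             /* real code would time this pulse */
        a->position += dir;
    }

    /* A domain verb built on the primitive: absolute positioning. */
    static void axis_move_to(axis_t *a, int32_t target) {
        while (a->position != target)
            axis_step(a, target > a->position ? 1 : -1);
    }

    int main(void) {
        uint32_t step = 0, dir = 0;   /* stand-ins for hardware registers */
        axis_t x = { &step, &dir, 0 };
        axis_move_to(&x, 5);
        printf("x is at step %d\n", (int)x.position);
        return 0;
    }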

I rarely find the need to get down to Forth these days. It made a lot of sense
with 8 bit resource-restricted microprocessors. Today you can have a 16 or 32
bit embedded processor for a few bucks and have it run at 25, 50 or 100MHz.
Different game.

~~~
kragen
I'm still skeptical about this "many orders of magnitude" nonsense. Even on an
8-bit resource-restricted microprocessor, what could you write with Forth in a
day that would take half a century with a team of 60 people in assembly? Or
what could you write with Forth in ten minutes that would take you 60 years in
assembly? I think you're talking nonsense and being condescending to try to
cover it up.

And if almost all projects are "the wrong project" for Forth (which seems to
be what you're saying, although I don't agree with it) then does it make sense
to compare it to a general-purpose language like assembly? You make it sound
like a DSL like SQL, not like a framework for all DSLs.

~~~
robomartin
You know. You are right. I don't know what I am talking about. Thanks for
helping me sort that out. Good day.

~~~
kragen
I'm not accusing you of ignorance; I'm accusing you of talking nonsense, which
is to say, writing a bunch of bullshit without regard to whether it's true or
false. I'm sure you know perfectly well that you're talking nonsense, as does
anybody else who's read this thread, and that's why you're responding with
this kind of diversionary aggressive bluster.

But I find it disappointing, and I wish you'd engage the conversation at a
rational level instead of bullshitting and sarcasm. It's a missed opportunity
to share your knowledge — not with me, but with whoever else reads the thread
in the future.

~~~
robomartin
"I'm sure you know perfecly well that you are talking nonsense, as does
anybody else who's reading this thread"

How do you expect me to answer something like that?

You don't know the language and are intent on having an argument with me about
this?

When Forth is used correctly productivity gains become exponential (the limit
being application dependent). So, yes, when used correctly you can go 10, 100
and even 1,000 times faster than assembler.

A good comparison would be to look at the productivity gains from using a
library such as SciPy versus writing raw Python. The gains are orders of
magnitude over the raw language. And even raw Python is orders of magnitude
faster than writing assembler.

What I said was that [Forth] "It's just above assembler yet many orders of
magnitude faster to develop with," and you seem to think this is nonsense.
Well, you are wrong. What else do you want me to say? Go learn the language
and we can have a conversation. Using it casually doesn't produce the same
level of understanding you get from dedicated non-trivial usage. This is true
of any language or technology.

I am not insulting or diminishing you. Perhaps you are taking it that way.
Relax. It's OK to not know everything, I certainly don't and I've been
designing electronics and writing software for a long time.

Here's a place to start:

<http://www.ultratechnology.com/forth.htm>

Here's a good Q&A:

<http://www.inventio.co.uk/forthvsc.htm>

If I have the time later tonight I might post a follow-up with an example of
what development flow might be for, say, developing code for a multi-axis CNC
machine.

~~~
robomartin
The key argument behind this thread centered on my assertion that Forth can
be orders of magnitude faster than assembler. That means 10, 100, 1000 --or
more-- times faster to code a solution in Forth than in assembler.

I haven't done any serious work in Forth in about ten years, so coming up
with an example for this thread would have consumed time I simply don't have
right now.

The reality is that ANY language is orders of magnitude faster than assembler.
The idea that this assertion is being challenged at all is, well, surprising.

I happen to be working on a project that, among other things, makes extensive
use of nested state machines. The main state machine has 72 states and some of
the child FSMs have up to a dozen states. Writing this in C, it takes mere
minutes to lay down a bug-free structure for the execution of the entire FSM
set. It should go without saying that doing the same in assembler would take
far longer and result in a mess of code that would be difficult to maintain.
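
A minimal sketch of that structure in C, assuming a switch-based encoding
with a child FSM run to completion inside one parent state (all state names
are invented for illustration):

    #include <stdio.h>

    typedef enum { CHILD_START, CHILD_WORK, CHILD_DONE } child_state_t;
    typedef enum { MAIN_IDLE, MAIN_RUN_CHILD, MAIN_FINISH,
                   MAIN_HALT } main_state_t;

    /* One step of the nested (child) FSM; real code would test inputs. */
    static child_state_t child_step(child_state_t s) {
        switch (s) {
        case CHILD_START: return CHILD_WORK;
        case CHILD_WORK:  return CHILD_DONE;
        default:          return CHILD_DONE;
        }
    }

    /* One step of the parent FSM; one state runs the child to completion. */
    static main_state_t main_step(main_state_t s) {
        switch (s) {
        case MAIN_IDLE:      return MAIN_RUN_CHILD;
        case MAIN_RUN_CHILD: {
            child_state_t c = CHILD_START;
            while (c != CHILD_DONE)
                c = child_step(c);      /* the nested FSM */
            return MAIN_FINISH;
        }
        case MAIN_FINISH:    return MAIN_HALT;
        default:             return MAIN_HALT;
        }
    }

    int main(void) {
        main_state_t s = MAIN_IDLE;
        while (s != MAIN_HALT)
            s = main_step(s);
        puts("FSM set ran to completion");
        return 0;
    }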

Assembler has its place. I have written device drivers, disk controllers,
motor controllers, fast FIR filters, pulse/frequency/phase measurement and a
myriad of other routines across a number of processors all in assembler. These
days I'd venture to say the vast majority of embedded systems are done in C.
Coding is faster, far more maintainable and embedded optimizing compilers do
an excellent job of producing good, tight and fast machine code. As much as I
love and enjoy Forth this is one of the reasons I rarely use it these days.

I still think it is important to learn about TILs, as they add a way of
thinking outside the box that one would not otherwise have.

~~~
kragen
I agree that learning about threaded interpretive languages is important,
despite the fact, which you allude to, that Forth compilers (at least those
I'm familiar with) produce deeply suboptimal code.

I agree that writing in a high-level language is faster — quite aside from the
availability of library functionality like hash tables and heaps and PNG
decoders and whatnot, you have less code to write and to read, and fewer
fiddly decisions about register assignment and memory allocation to make,
possibly
get wrong, and have to debug.

But that difference is a constant factor — it's not going to be even 1000, and
it's never going to approach "many orders of magnitude", which was the
nonsense claim I took most issue with. (Two or three isn't "many" in my
vocabulary.) Typically I think it's about 10 to 30, which is nothing to sneeze
at in the real world, but which would be insignificant compared to your absurd
earlier claims.

You _can_ get a "many orders of magnitude" boost — rarely — from libraries or
static checking. Neither of these is Forth's strong suit.

Nested state machines are actually an interesting case — often by far the most
convenient way to write them is with one thread per state machine. C by itself
doesn't give you the tools to do that predictably, because you can't tell how
much stack space you're going to need — it depends on your compiler and
compilation options; assembly and Forth do. So it's one of the rare things
that might actually be easier to do in assembly than in C. (But if you have
some memory to spare, maybe setcontext() is plenty good enough.)
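
A hedged sketch of that one-thread-per-machine style using the POSIX ucontext
calls (the fixed stack size below is exactly the guess the paragraph warns
about, and the task body is a stand-in):

    #include <stdio.h>
    #include <ucontext.h>

    static ucontext_t main_ctx, fsm_ctx;
    static char fsm_stack[64 * 1024];   /* how much is enough? that's the rub */

    /* The state machine as a coroutine: it yields after each state. */
    static void fsm_task(void) {
        for (int state = 0; state < 3; state++) {
            printf("fsm: in state %d\n", state);
            swapcontext(&fsm_ctx, &main_ctx);   /* yield to the scheduler */
        }
    }

    int main(void) {
        getcontext(&fsm_ctx);
        fsm_ctx.uc_stack.ss_sp = fsm_stack;
        fsm_ctx.uc_stack.ss_size = sizeof fsm_stack;
        fsm_ctx.uc_link = &main_ctx;        /* return here if the task ends */
        makecontext(&fsm_ctx, fsm_task, 0);

        for (int tick = 0; tick < 3; tick++) {
            printf("sched: tick %d\n", tick);
            swapcontext(&main_ctx, &fsm_ctx);   /* resume the state machine */
        }
        return 0;
    }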

------
danso
It's kind of useful to write out code by hand, or at least pseudo-code...I'm
not a neuroscientist but I wouldn't be surprised if it turns on a different
part of the brain. Last week I was sitting in the park with nothing to do and
I reflexively pulled out my iPod to play Sudoku on it. Since it was one of our
first nice days I felt bad about that so I just wrote out the Sudoku grid on a
pad of paper and solved it from there. It was surprisingly difficult, even
though it was an easy puzzle (for starters, the iOS version highlights which
minigrids on the main grid a given number appears in).

Anyway, anytime I think about what a pain in the ass writing, or even just
typing out code can be, I refer to the chapter in iWoz in which, lacking
development software at the time, he "wrote out the software [for Little Brick
Out] with pen and paper and then hand assembled the instructions into 4096
bytes of raw machine code for the 6502"

<http://en.wikipedia.org/wiki/Integer_BASIC>

~~~
SoftwareMaven
I agree with you. I think the physical motion of writing is involved in the
process, activating neurons that just flicking fingers doesn't.

Would be fascinating to see it in an fMRI.

~~~
marshray
There is a psychiatric condition known as 'graphomania' in which one has an
obsessive compulsion to write, the content being secondary in importance. So
it's conceivable that such a structure in the brain exists and can become
overstimulated.

<https://en.wikipedia.org/wiki/Graphomania>

------
buro9
Brain debugger.

I honed mine early enough that the most I need from a program is a few Sprintf
statements just to give the brain debugger some context when the application
is dealing with external data or data transformation.

I personally find it hard to turn off the brain debugger and rely on software.
Originally this was because you'd so often find the software debugger to be
wrong (as in, "Ah, but in this case the debugger thinks X but Y is the
case"), and later just because the brain debugger had become so effective
that not using it seemed like a lot of hard work for little extra gain.

How other programmers visualise or handle code in their brain, and how they
put together new functionality in their head, is deeply fascinating. Even if
you have self-awareness enough to grasp the basics of how you do it, you can
be reasonably sure that each of us does it slightly differently.

~~~
glhaynes
I can often do that on code I've written or am intimately familiar with, but
tons of the code I work on is code that someone else wrote and that I rarely
interact with. As a result, I _love_ my debugger.

------
lucb1e
Let's put the code to text and test if it runs?

I've made a start, but it's taking a little longer than I'd thought (the
handwriting is often hard to read). Let's do some crowdsourcing?

[https://docs.google.com/document/d/192TW6ghnmMFnVaNvIe98aqTG...](https://docs.google.com/document/d/192TW6ghnmMFnVaNvIe98aqTGkY6gkZF4CbzMaQPxOlo/edit?usp=sharing)

Edit (+12mins): I see lots of people viewing, but not a single edit over the
past 10 minutes.

Edit (+60mins): One other anonymous user helped at last and _the document is
complete_, but didn't leave his username in the contributors list yet.
Anyway, thanks!

Now the big question is how to run this. _If you have any idea how to run it,
let us know!_

~~~
elwin
There are manuals and an emulator here:
<http://www.floodgap.com/retrobits/kim-1/>

------
bri3d
The mental debugger is definitely one of the tools I'm glad I developed early.

For me, it was by reverse engineering. I started cracking software and
eventually moved on to white-hat black box security auditing, and that quickly
taught me how to evaluate execution flow mentally.

I find that even though I'm writing high-level Ruby web apps now, my ability
to rapidly follow code around in my head lets me debug more quickly and
effectively than many of my co-workers.

I firmly recommend trying reverse-engineering for anyone who hasn't - it will
forcibly provide a lot of the same mental execution mapping abilities while
feeling more relevant than writing machine code or assembler out on a piece of
paper. And once you learn the basics, everything transfers back up to higher-
level languages pretty well (with the exception of mental hex arithmetic,
which will still come in handy as soon as you segfault your high-level
language's runtime). Plus, when reverse-engineering, you can't fall back or
get frustrated and use a compiler - unless you have an IDA + HexRays license
for some reason, you're stuck figuring things out yourself.

As a side-note, it's always fun watching my dad work - he's an oldschool
mainframe guy and he'll sometimes solve common setup issues using JCL or REXX
stuff with a modified date sometime in the late 1980s.

------
smoyer
This brings back a lot of memories ... Hand-compiling code for a COSMAC ELF
(<http://www.cosmacelf.com/docs.htm>) and punching it into memory on a hex
keyboard (the original had a switch for each bit and 4 push-buttons for load,
run, etc).

By the time I was coding for 6502 and 8088 processors (still in assembly
language - I was after all an embedded engineer), I had assemblers and an
80-column by 43-line text editor.

Aren't we spoiled today? I wouldn't want to go back, but I've also found that
the low-level experience with machine code is something many "newer engineers"
are missing ... it's an appreciation of the hardware that you can't get any
other way.

~~~
zerohp
Computer Engineering still offers this experience. It's the main reason I
picked this major when I returned to school after many years as a programmer.
Most of the graduates from here go on to software development. The low level
experience makes them well suited to work on embedded systems, device drivers,
and operating systems.

------
lake99
I did my bachelor's (in India) in the mid-90s, and we all coded this way in our
8085 labs. Many of us even did final-year projects on similar-looking 8085
kits.

Given that such kits are still sold [1] in India, I guess quite a few
engineering students still learn to code like that.

My bosses, though, had all worked on punched cards.

[1]: [http://www.dynalogindia.com/products/education-
solutions/808...](http://www.dynalogindia.com/products/education-
solutions/8085-microprocessor-262.html)

~~~
kamaal
>>Given that such kits are still sold [1] in India, I guess quite a few
engineering students still learn to code like that.

Did my bachelor's (in India) in the 2010s, and yes, even we coded this way in
our 8085 labs!!! Nothing has changed. Talked to my cousin who is now in his
3rd semester; they too start out coding the same way. It's 2013 and NOTHING
has changed.

------
kps
Not to go all Yorkshire, but I used to _dream_ of having a KIM-1. My first
computer — the best thing I could afford — was a Quest Super Elf [
<http://www.oldcomputermuseum.com/super_elf.html> ] with 256 bytes of RAM, and
a processor on which a recursive subroutine call took 16 instructions.

On the other hand, by 1985 I was doing QA for a 64-bit Unix environment.

~~~
jekub
You're joking, this was a wonderful machine. Especially the CDP1802 processor
with its ability to use any register as a PC. I have very nice memories of
this chip.

~~~
kps
In some ways it was nice. For those unfamiliar:

The 1802 had sixteen general purpose 16-bit registers. Any one of these could
be selected as the program counter. In a tiny embedded system (which is what
the part was meant for), you might choose to designate one as the ‘normal’ PC
and reserve a few others for important subroutines, which could then be
invoked with a one-byte ‘SEP _n_’ instruction. Similarly you could implement
coroutines or simple task switching by switching the selected PC between a
pair of registers.

On the other hand, there was no conventional call instruction. The SCRT
(“Standard Call and Return Technique”) for recursive or reentrant subroutines
essentially involved defining (non-reentrant) coroutines to perform the
‘recursive call’ and ‘recursive return’ operations.

------
DanielBMarkham
I remember playing around with machine instructions on the 6502. I wouldn't
call it hand assembly; it was more like copying and groping around. I probably
wrote less than 200 lines in my life.

But what a neat feeling! There's something about putting in these numerical
codes and seeing a graphical result that I haven't experienced since then.
Yes, higher-level languages rock, but dang, this is coding _next to the
metal_.

After learning a bit about logic gates and playing around with the theory,
there was an incredible feeling of discovery. Somehow I had deciphered the
code of computers, and was finally speaking their native language. This
brought back great memories. Thank you.

~~~
pm90
I know that feeling. My undergrad CS course had a rigorous electronics
component where we coded in ASM for the 8086. One of my fondest moments of
that time was when I wrote a program that would split the terminal screen into
two halves of different color, and display what you typed on one side and the
inverse case on the other.

This could probably be done with fewer than 50 lines in Python, but doing it in
assembly, calling all those interrupt routines, making sure your registers
contained all the right values, man, when it finally worked, it was the
greatest feeling in the world!

~~~
DanielBMarkham
Wasn't it? And for that moment, you could look out and imagine "Damn, I could
code the whole freaking thing like this!"

Then you realize it would take around 30 years, but still -- it was doable.
You finally figured out how it all worked. _You could make the computer do
anything it was capable of doing_. There was no more mystery there.

Fun stuff.

~~~
kragen
Do you really think it would take 30 years? The NAND-To-Tetris kids do most of
it in a semester, and they're starting at the level of individual gates.

I mean, Chuck Moore has been doing it for about 30 years, but he had a working
top-to-bottom system after less than five years.

Here's what I think it would look like:

Week 0: build a working interactive Forth system that can compile itself, in
machine code. This is assuming you already have hardware and a way to boot it
into some code you wrote. If not, that could be a matter of minutes or of
months.

Weeks 1-3: build an interactive high-level language interpreter on top of it.
Say, JS.

Week 4: enough of TCP/IP to do HTTP GET.

Week 5: write a bitmap font editor and a minimal filesystem.

Weeks 6-8: parse HTML (more or less), lay it out with a simplified box model,
and draw it to pixels on a canvas, using a pixel font.

Week 9: cookies, HTTP POST, XMLHttpRequest.

Week 10: a more elaborate filesystem, maybe based on Git.

Weeks 11-12: some more HTML5 features, maybe including WebGL.

Weeks 13-14: TLS.

This is sort of cheating because you're taking advantage of our previous 70
years of experience to figure out what's worth building, and you're likely to
end up spending a fair bit of time going back and retrofitting stuff onto
previous iterations — JIT compilation, say, and multitasking — but it still
seems like rebuilding the personal computing environment ought to be more like
a one-semester project than a 30-year project.

You could argue that you get a huge speedup by not writing stuff in assembly,
and I agree, but I think that's only a constant factor. You could easily write
assembly code that maps directly onto JS semantics:

    
    
            ;;; invoke getContext("2d") on the canvas object
            .data
        sgetContext: .asciz "getContext"
        s2d: .asciz "2d"
            .text
            mov $sgetContext, %eax
            call GetProp
            mov $s2d, %edx
            call InvokeMethod
    

That's more code to write and read, but only by a linear factor. So it might
take you two semesters, or four, but not 60.

------
sramsay
Dude wrote _in pen_. We are not worthy, brothers and sisters. We are not
worthy.

~~~
gmrple
As did most of my EE brethren in school in 2009. Trust me, writing code in pen
does not a master programmer necessarily make.

~~~
edran
Jokes aside, writing down something like OP's software in pen from scratch
would require some degree of proficiency and confidence. I can still see some
pencil around the paper, in any case :) (Not to diminish the achievement - As
a young programmer, I'm always impressed when I see these things and I'd love
to see more and more of them!)

------
peteri
That's late for hand coding assembler (and very late for a KIM 1). Last time I
did any hand assembly was 81/82 for a ZX81. Graduated to an Apple //e where I
wrote my own assembler using the mini-assembler in the Integer BASIC ROM and
Apple Writer I as a text editor. By 1984 I had gotten my hands on the Apple
ProDOS assembler so my bodged mini-assembler got put out to pasture.

Mind you, I've just got an Atari 65XE from eBay today which I will now have to
explain to my wife (I always thought the Atari graphics were neat and wanted
to play with the display list stuff but never did so at the time).

------
ErikRogneby
The comments made me go look for Compute! There are 168 issues available on
archive.org:
<http://archive.org/details/compute-magazine>

------
tterrace
That's really fantastic that he was able to save his work from 28 years ago. I
really wish I had the foresight to do the same from only five years ago,
though I expect the result would be less like unearthing buried treasure and
more like finding what stinks in the fridge.

~~~
jgrahamc
I saved a lot of stuff: [http://blog.jgc.org/2009/08/in-which-i-switch-
on-30-year-old...](http://blog.jgc.org/2009/08/in-which-i-switch-on-30-year-
old.html)

~~~
lifeisstillgood
I am just beginning to worry about this sort of thing (#). What age were you
in 1979/82 with your Sharp computer? I have assumed I will be trying to guide
my son / daughter onto emulators (nand2tetris etc.) rather than real physical
machines - or are there real physical machines which are simple enough, and
which you have root on, to be early hacking machines?

(#)<https://github.com/lifeisstillgood/importantexperiments4kids>

~~~
salgernon
I gave my daughters TRS-80 Model 100 computers for their tenth birthdays. It
has an 8085 processor and a really quite powerful BASIC (the last software
Bill Gates worked on, and it's not bad) with a bit-mapped display. They run
for weeks on 4 AA batteries and have a better keyboard than most modern
laptops. There is an emulator available, but since they go for about $30 on
eBay, I like having the real thing.

~~~
tluyben2
I bought a box of these recently and they are really great; I used to have a
TRS-80 Model III and 4, so when I saw a box (don't hit me) of Model 100s for
10 euros I bought it. They're a very nice addition to my museum and the
batteries last forever. The box had 1 Sharp PC-1211 too. Talk about battery
life :)

What always amuses me is that my computers from around 2000 are not working
anymore (heck, most laptops I have from the past 10 years are not even
booting anymore) while computers from my parents' basement which are around
30 years old work like they just came from the shop. Even the Philips
computers from that time, which had known capacitor issues in the power
circuitry, work as if time didn't happen.

------
italophil
I used an East German version of the KIM (LC80) for lab experiments. In 1999!
Typing in code was slow and debugging even slower, but the hardware I/O was
easy, no fussing around with RS232. Ahhh, good times.

------
brudgers
I spent several weeks during the Summer of '82 living in dorms and taking
college courses at UCF. My suitemate, John, was writing Petman in machine
code, on and for his PET 16. An enduring memory for me and one which reminds
me that there are programmers who are 100 times more productive than others -
I was
at the stage of typing in code from magazines.

------
jgrahamc
Wow. I should really dig out more stuff from my box of old programs. This one
blog post is at almost 100,000 page views.

------
SoftwareMaven
And this was just a small utility. Then think about things like the Apollo
program being coded the same way (or more obliquely on punch cards).

There is a finite and limited amount of complexity this allows for. We've
been given quite the catch-22: the ability to build increasingly complex
systems while needing to maintain those increasingly complex systems.

On whatever project you are currently working on, how much less code do you
think you'd be writing if you had to hand write it? Where would the project be
better for it and where would it be worse?

------
tinco
This weekend, for the Ludum Dare, someone made an implementation of Pong on
his C64. In a weird turn of events he had to resort to printing out
screenshots of memory dumps and running those through OCR. But the OCR was
buggy and he had to check every byte by hand.

[http://www.ludumdare.com/compo/2013/04/29/ponkmortem/#more-2...](http://www.ludumdare.com/compo/2013/04/29/ponkmortem/#more-250547)

Reminded me a bit of this story...

------
verandaguy
As a relatively new (5 or so years experience) programmer, 1985 sounds like
hell.

~~~
rayiner
Software development basically peaked in the mid-1980s.

Macintosh Common Lisp circa 1987:
[http://basalgangster.macgui.com/RetroMacComputing/The_Long_V...](http://basalgangster.macgui.com/RetroMacComputing/The_Long_View/Entries/2013/2/17_Macintosh_Common_Lisp.html),
specifically:
[http://basalgangster.macgui.com/RetroMacComputing/The_Long_V...](http://basalgangster.macgui.com/RetroMacComputing/The_Long_View/Entries/2013/2/17_Macintosh_Common_Lisp_files/inspect%20and%20describe.jpg)

Firebug circa 2013: <http://getfirebug.com/logging>, specifically
<http://getfirebug.com/img/logging/consoleDir.png>

~~~
svachalek
In the 70s and 80s programmers were working on fixing the problems that
programmers faced in the 70s and 80s: memory management, GOTO spaghetti,
efficient assembly etc. In response we got heaps, garbage collectors,
functional languages, virtual machines, compilers, debuggers, IDEs.

Now most programmers spend most of their time dealing with problems of
persistence, networking, parallelism, security, and massive codebases. In
response we are creating scripting languages with slightly different syntax,
adding closures to Java, and arguing about vim vs emacs. We're so obsessed
with solving the problems of the 80s, my way, that we've just gotten trapped
there.

------
zafka
I have fond memories of programming my "Arnesh" 68K board around 1995. We
used them in our microprocessor course, and since I had so little time in the
lab, I bought my own board. After the class ended I would lend it to friends
until it came back fried. One of these days I need to dig it out and try to
repair it, now that I feel a little more secure with a soldering iron.

------
alan_cx
Finally, some programming this old fart recognises. Why did it all get so
complicated?

Yeah, I know. Rose tinted and all that.

~~~
jacquesm
> Why did it all get so complicated?

Two words: 'the web'.

Slightly longer: we're re-inventing the wheel through a very roundabout path.
I think right now we've gone back in time to roughly the mainframe era; it
won't be long before someone invents the minicomputer all over again, only
this time it will be a local cluster in a box. Next after that the PC, a
'personal cluster' with a few hundred nodes the size of the desktop machine
you had last year.

~~~
greenyoda
I don't think it was the web. The pre-web applications with GUI clients
running on PCs and servers on mainframes were already pretty complicated
(remember Microsoft's ever-changing object APIs, or CORBA?). What made things
increasingly complicated was the ever-growing power of computer and
communication hardware, which made it possible to create larger and more
complex systems. If you're limited by having to run your software on a machine
with a total of 256K of memory (like the first machine I used Unix on in
1980), you just can't build something as complex as what you can build today
on a cheap laptop.

~~~
jacquesm
We were building clusters of PCs in the 1980's using arcnet. Those were pretty
large and complex systems, they used a very elegant message passing protocol.
And that definitely wasn't the only game in town for large and complex systems
(but it was one of the most productive ways of constructing those systems with
a small team).

------
cimnine
nice 'syntax highlighting' ;)

~~~
simmons
I was thinking the same thing. I, too, wrote many sheets of hand-coded 6502
assembly back then, but I didn't think to use a multi-colored pen.

------
segmondy
I wrote code like this in 1997. First year in college, microprocessor course.
I wrote a simulator for the 6800 chip, and an assembler, so I didn't have to
wait until Saturday at 8:30am to run my code. The professor wasn't impressed
at all when I showed him my program.

------
quattrofan
This made a big wave of nostalgia wash over me, I learnt to write 6510
assembler on my C64.

------
bwang8
Articles like these make me realize I am completely spoiled. I am in no
position to complain about debugging when modern technology has afforded me a
much easier time than 30-40 years ago.

------
ericssmith
I had a KIM-1 in the late seventies and for several years I would run into
other people who used one, but 1985 seems really late for programming on one
of these. For me, the KIM-1 was my first experience with programming and a
computer of any sort. It influenced my taste for low-level concepts, not only
through x86 assembly programming for graphics in the 90s but also, much
later, in the lambda calculus and combinatory logic. I'm tickled that the
KIM-1 is seeing
such a resurgence of interest.

~~~
jgrahamc
It was late to be programming one but it was (a) what was available where I
was and (b) it had good I/O for the interfacing.

~~~
jacquesm
But that FSK demodulator was incredibly finicky. I recall more than one tape
that I had to try to read many times while tweaking the playback speed on the
recorder!

------
pjmlp
Oh, the pleasure of writing hex dumps into monitors....

------
treerex
I was in high school in 1985 and our computer club built a Heathkit HERO-1,
which had a 6808 processor and also had to be programmed through the hex
keypad. After a few months of hand-writing machine code and typing the
opcodes into the machine, I wrote a simple assembler on the Apple //e that
would then
download the assembled programs to the robot through the RS-232 interface. So
much fun.

~~~
seanmcdirmid
My dad had one of those. So cool, I enjoyed keying in "programs" on the head
keypad to make it move and speak (I was around 9 or 10); I would have loved
your tool (if I could get it for the Osborne we had then...).

------
andrewcooke
heh. i also programmed a stepper motor in ~1985 (after building the driver
board). but it was z80-based (and it never ran smoothly - when i left i think
they were going to try adding some noise to the timing, but that didn't sound
likely to me - more likely i had a crossed wire in the hardware...). anyway, i
am pretty sure i had an assembler (and this was working for a one-man company
in the backwoods of n yorkshire). why were you hand assembling?

------
bgruber
one of the good things about a formal CS education (well, mine anyhow) is that
they force you to do things like write in assembly. obviously there's far less
immediate practical application these days, but the exercise is definitely
valuable, and probably not something I would have done voluntarily. Actually,
it feels like something I would have thought sounded cool, tried, and then
given up on when it got annoying.

~~~
tomjen3
We had to make adders, flip-flops, etc (on paper, not physically) and those
were used to build an ALU and (I think) a memory-switch bank.

It was very cool to know that when the world ends I can build a new computer,
from scratch.

~~~
kragen
What are you going to make the logic devices from? I've speculated about this
extensively in [http://lists.canonical.org/pipermail/kragen-
tol/2010-June/00...](http://lists.canonical.org/pipermail/kragen-
tol/2010-June/000919.html) but I still don't have a working scratch-built
computer.

~~~
reeses
You can make transistors with some...hazardous components, but you may want to
start with relays or tubes and resistors (which could, theoretically, just be
varying lengths/gauges of wire) and build NAND gates. It would be big, hot,
noisy, slow, and a huge waste of your time, but there were computers before
semiconductors.

~~~
Someone
For the curious: here's how you make your own tubes from glass, metal wire and
metal sheets: <http://m.youtube.com/watch?v=EzyXMEpq4qw>.

------
smrtinsert
Best coding I do is with a pen and paper.

------
VinzO
That makes me wonder what we will think 30 years from now, when we look back
at how we develop software today.

------
eksith
Articles like this inspire awe and shame in me at the same time. Is there an
emotion called Aweshame?

------
Tomis02
"...typically I reach for the brain debugger before gdb".

Gdb, wow. So he's still coding like in 1985.

