
Ask HN: What was programming like before internet? - shivekkhurana
I recently talked to someone who helped a company with their mainframe architecture in the 80s.

As a 90s kid, I grew up with the internet, GitHub and Stack Overflow, so it can be argued that I got everything easily.

But I'm eager to know what it was like before the internet or personal computers were a thing.

Who hired you? What did you work on? How did you learn? How did you fix issues? How did you find talent? How did news spread? Do you have any war-time stories?

Thanks
======
runjake
- Books. Usually just one book for a given language. There wasn't a
breadth of material available, so you had to rely on your critical thinking
and reasoning skills much more.

- Paper programming magazines that came out monthly. These normally had
articles that centered around a program you would type in manually from the
pages of the magazine.

- In-person user group meetups. These were pretty entertaining and full of
brilliant basement dwellers with colorful, non-normal personalities.

- BBSes (social networking)

~~~
jerrytsai
Ah, the 1970s and 1980s. Imagine a world with no cell phones or Internet. How
are you going to get your debugging help now? StackOverflow doesn't exist.

To add to this list:

- Manuals. Reading carefully through the documentation was necessary for
mastery. Both hardware and software. These were often closely associated --
you couldn't always just concentrate on one and abstract away the other.

~~~
DrScump

      How are you going to get your debugging help now? StackOverflow doesn't exist.
    

Real Men and Real Women read core dumps. And they actually communicated with
each other.

Plus, turnover was less, so you had institutional knowledge over long periods
maintained by peers and clients alike.

In the mainframe environment in which I worked, we had a lot of company-
specific and proprietary systems and tools that outsiders couldn't have helped
with, anyway. And they beat the crap out of IBM's packages.

And having no live Internet connection didn't preclude dial-up.

And IBM tended to document the hell out of their stuff.

------
AnimalMuppet
We got hired through want-ads in the local papers, or through headhunters
(once we became known). We read paper documentation. That was slower.

I learned C by reading K&R. I learned Pascal by reading an equivalent book,
though I can't remember who wrote it. I learned Basic by reading the TRS-80
Basic manual. I learned 68000 assembler by reading Motorola's 68000 book. And
so on.

I remember the day the hard drive on the development system died. How did we
find a replacement? Yellow pages to find companies that were plausible, then
calling them to find out if they had what we needed.

Everything moved slower. But the people you competed against were also moving
slower...

~~~
jonjacky
_I learned Pascal by reading an equivalent book ..._

Jensen and Wirth, _Pascal User Manual and Report_. Springer. Silver cover.
Typeset in typewriter font. Syntax presented both in BNF and graphic 'railroad
track' diagrams. Very short, even though it was two books in one: the tutorial
User Manual and the reference Report. Very very clear and well-written.

The ideal in those days was _one book_ that was complete and authoritative and
also instructive - if you knew that book, you knew the language. K&R (1978) is
the famous example but Jensen and Wirth preceded it (1974).

~~~
AnimalMuppet
Sounds likely. I don't remember the cover, but I remember the "railroad track"
diagrams.

------
acrophiliac
You didn't ask about this, but a salient point to me: when programs were
created on punched cards in the early 70's, and you had to wait 15 minutes to
an hour for a single compile, you spent a lot of time proofreading your source
code before submitting it. You designed and coded the entire program in
advance, none of this programming by accretion that is so common now. And we
became quite skilled at finding our own errors before submitting code to the
compiler. The current approach of letting the compiler lead you by the nose
from one error to the next seems horribly inefficient to me, and results in a
higher defect rate in the finished product.

~~~
afarrell
It seems like that to me as well, but what is an engineer to do if he feels
confused about what he's building on top of and he gets told that he's just
feeling impostor syndrome or needs to stop "trying to understand the universe"
or that his boss trusts that he _does_ actually understand things?

It has been true my whole career and I can't wait for the moment when I can
retire and then have the time to learn enough to feel like I know what I'm
doing.

~~~
Delk
The "universe" is also, in some sense, much more complex than a couple of
decades ago.

In most cases we don't need to deal with as much low-level complexity or with
cumbersome tooling, but "understanding the universe" would mean understanding
pretty much all the levels of abstraction we have (and we have a _lot_), a
diverse stack of technologies with rather different paradigms, a security
landscape that has become much more complex with all the code we're running in
a browser, and so on.

I don't know what one is to do, but I somehow feel that "understanding the
universe" is much more difficult nowadays than it perhaps was a few decades
ago. There were probably fewer levels of abstraction, less diversity of
technology, UIs might have been programmed in the same language as the rest of
the program so there might not have been a need to juggle in a few languages
at the same time, etc.

Not to mention that since it's nowadays possible to program reasonably without
having all that lower-level understanding, unlike perhaps a few decades ago,
lots of people do that and (immediate) productivity expectations naturally
grow to match.

I feel the pain, though.

~~~
alisonatwork
I agree that there were fewer levels of abstraction, but I think the domain in
which each individual developer operates isn't that much different.

In my first "serious" programming job, I was hired to be a C programmer. But
just to get C building, you also need to understand Makefiles, and to
understand those you need to understand shell script. We printed stuff to
console in color, so we needed to understand ANSI escape sequences. We also
printed to printers, so we needed to understand HP PCL. Some of our systems
needed to talk to dedicated hardware attached on the serial port, so we had to
learn proprietary protocols. Newer devices were attached via TCP, so we had to
learn sockets and those protocols too. We had our own home-built database, but
sometimes we needed SQL. We used Informix, later Postgres. Each had a
different stored procedure language and C bindings. We deployed on UNIX, so we
needed to understand files and pipes and processes and semaphores. We used a
modem to dial out to our customer sites, so we needed to know about a tool called
Kermit. There were many different flavors of UNIX back then, and they all had
slightly different behaviors. We had to learn all the quirks that applied to
the APIs we were using.
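Of the stack above, the ANSI escape sequences are one of the few pieces that haven't changed since: terminals still honour the ECMA-48 "SGR" codes. A minimal sketch of colour output (the error message is invented):

```python
# ANSI SGR escape sequences have the form ESC [ <code> m.
# 31 = red foreground, 0 = reset; these codes come from the
# ECMA-48 standard and still work in most terminals today.
RED = "\x1b[31m"
RESET = "\x1b[0m"

def colorize(text, color_code):
    """Wrap text in an ANSI colour escape followed by a reset."""
    return f"\x1b[{color_code}m{text}\x1b[0m"

# Renders in red on any ANSI-capable terminal.
print(colorize("error: serial device not responding", 31))
```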

Later on we started to build front ends in Java AWT and Python GTK, and that
brought in a slew of new tech. Much of this tech was still pre-2000.

In my current job, I am mostly working in Java and Python, but now they are
backend languages. The services are built using Gradle, which needs knowledge
of Groovy, or setuptools which is just more Python. The packages are bundled
in containers and pushed to the cloud. There is a lot of engineering that
enables this stack, but the only part we need to know about is the format of a
Dockerfile (so, basic shell scripting) and various YAMLs for CloudFormation
and the CI/CD system. We talk to lots of other services, but we don't need to
know anything about TCP or sockets, we just need to use a library to manage
the HTTP connections and another library to convert the wire format to
objects. We don't care about presentation logic or reporting because someone
elsewhere in the stack is going to do that. We define our contracts in
Protobuf, JSON or XML and move on. Some services need to keep state, and for
almost all use cases the file system is sufficient - in AWS that means S3.
Sometimes we send or receive messages - in AWS that means SNS/SQS. We also
push metrics and logging and distributed trace context, but that's just more
HTTP abstracted away into a library. We build dashboards in Grafana using
OpenTSDB query language. We check our logs in Kibana using Lucene query
language.

There is a lot to learn. I could spend even more time learning about
Kubernetes or Kafka or Go or Redis or a hundred other components required by
software outside my current domain, but none of that is necessary in my
current role. Back in the day I could have spent time learning about OS/390 or
Syslog or Perl or Berkeley DB too.

So, although I agree that "the universe" in general is much more complex, I
haven't found that to be true about the part individual developers are
expected to understand.

~~~
Delk
> So, although I agree that "the universe" in general is much more complex, I
> haven't found that to be true about the part individual developers are
> expected to understand.

That's probably true. I think the person I was responding to felt like they
would _want_ to understand more than they're necessarily expected by others
to, though. I kind of share that sentiment, and they also mentioned impostor
syndrome, so I wanted to express my feeling that if you want to understand
"the universe" (and not just what's necessary in order to fulfill the what's
expected of you), that's probably more complex nowadays than it maybe was back
when the field was even younger.

You have a longer perspective on that development than I do, though (I started
in the early 2000's or so), so thanks for sharing.

------
roelschroeven
Both software and hardware tended to come with much better and more complete
documentation than today (things didn't change as fast as they do now, so
printed documentation stayed relevant much longer than it would do now). As a
teenager I learned a lot from the pretty detailed books on MS-DOS and GW-BASIC
that came with our PC XT clone, for example.

Then there were 3rd party books, sometimes very good ones. And magazines, some
of which were very good too.

------
staycoolboy
Books, magazines (BYTE, Nibble[0], Dave Ciarcia's Circuit Cellar), computer
camps, BBS. But because there were so few resources, they tended to be FAR
higher quality than googling or Stack Overflow. Plus things weren't as complex
back then: you could memorize the 6502 instruction map in an afternoon.

The only way I really learned was by saving my money (returning soda cans and
mowing lawns) and buying MULTIPLE books on the same subject because the
examples or explanations differed enough to triangulate WTF was going on.

Nibble magazine had source code in it every month for the Apple that explained
how stuff worked. I once typed in 14 pages of assembly to write a Space
Invaders game. Most of it was bitmaps.

When I was 13 I won a scholarship to a computer camp and spent a month with
experts who answered every question I had and taught me how to debug. It
expanded my knowledge 1e6%.

I'm a late-70's kid. I grew up near Yale University and bought programming
books from the Yale Co-op bookstore with allowance money.

I started with BASIC on a relative's C64, then got into 6502 assembly[1] on
the school Apple computers[2], then bought a book on 286 protected mode when I
got to highschool. Then I worked for a machinist who taught me C in the
mid-80's.

It was really hard without a network of experts to reach out to.

I still have the latter two books, below, with my big, goofy-kid handwriting
in them:

[0] Nibble mag ...
[https://www.nibblemagazine.com/](https://www.nibblemagazine.com/)

[1] 6502 Software Gourmet Guide & Cookbook, Robert Findley ...
[https://www.amazon.com/6502-software-gourmet-guide-
cookbook/...](https://www.amazon.com/6502-software-gourmet-guide-
cookbook/dp/B0006EAKSM)

[2] Beneath Apple DOS, by Don Worth and Pieter Lechner ...
[https://www.amazon.com/Beneath-Apple-DOS-Don-
Worth/dp/091298...](https://www.amazon.com/Beneath-Apple-DOS-Don-
Worth/dp/0912985003/ref=sr_1_1?dchild=1&keywords=Beneath+Apple+DOS%2C+by+Don+Worth+and+Pieter+Lechner&qid=1590876003&s=books&sr=1-1)

~~~
shivekkhurana
14 pages of assembly!

Do you still work in the tech domain? Do you have any guidance, or words of
wisdom, for people with less experience?

~~~
jlokier
14 pages is pretty small for an assembly program.

After all, to do anything useful it's often 10 times longer than doing the
same thing in C, which is itself longer than doing the same thing in, say,
Python.

~~~
jlokier
To the sibling [dead] comment:

> You are probably thinking of assembly code generated by a compiler, not by a
> human

An understandable response, but no, as it happens, I'm not.

At various times I've been an expert-level assembly language programmer and
have programmed at least 20 architectures; I forget exactly, as some are quite
obscure, especially nondescript Qualcomm DSPs :-) At one time I won an
informal competition to make the smallest ELF executable, and I used to write
game engines back when every cycle counted: artfully hand-tuned, clever
texture-mapping techniques, tuned for each generation of processor, alongside
higher-level mathematical algorithms to get the best out of the low-level
loops, were the sort of thing needed. So I'm familiar with Michael Abrash, and
"human-optimised" dense assembly.

I've written 100% assembly programs on the order of 10,000s of instructions
long, which corresponds to 100s of "pages".

I wouldn't do that these days without a good reason, in fact I'm into very
high level architecture and optimisation methods, and designing languages
around problems, but I do know my way around an assembler when needed.

I've also reverse engineered a number of things from machine code back to C,
sometimes to fix bugs in the machine code and create binary patches, sometimes
to port it, and sometimes to reverse engineer how to operate a device. So I've
read and analysed large amounts of assembler too, but to be fair that code
does tend to be mostly compiler generated.

------
gregjor
I had a dumb terminal with a dial-up modem in 1976. I learned from books,
classes at colleges, mentors. Employers recruited through newspapers and
recruiters, I got jobs through word of mouth and asking everyone I knew. Nerds
would meet up to exchange information. We had magazines like Computerworld.

------
Rochus
I worked on VAX in the eighties and got my knowledge from hundreds of the
famous VAX manuals. They must have cleared whole forests for these manuals.
Tons of paper that filled all the walls and rooms. And of course a lot of
paper books.

And the keyboards were big and loud. And there were only a few lines on the
terminal screen.

And when you printed out a listing, you had to walk a few hundred meters to
the data center (called "Rechenzentrum") to pick up the stack of paper at the
counter.

Even today, in the age of the WWW, I still have thousands of books as PDF with
a local search engine. There is still plenty of (important) information that
you can't just access over the Internet.
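A local search engine over a pile of extracted book text can be as little as an inverted index. A toy sketch of the idea (the document names and contents below are invented, and real PDF search would first need a text-extraction step):

```python
from collections import defaultdict

def build_index(documents):
    """Map each word to the set of document names containing it."""
    index = defaultdict(set)
    for name, text in documents.items():
        for word in text.lower().split():
            index[word].add(name)
    return index

def search(index, query):
    """Return documents containing every query word (AND search)."""
    sets = [index.get(w.lower(), set()) for w in query.split()]
    return set.intersection(*sets) if sets else set()

# Hypothetical extracted text from two scanned manuals.
docs = {
    "vax_manual_7": "qio system service reference",
    "vms_internals": "system data structures and internals",
}
index = build_index(docs)
print(search(index, "system reference"))  # {'vax_manual_7'}
```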

------
znpy
This isn't about "pre-internet" but one of the things I miss about developing
Java SE / J2ME code is that as long as you had a local copy of the Java API
javadoc you could work offline.

I really miss that.

------
chrismcb
When you say internet, I'm going to say you really mean the world wide web?
Before college it was books, magazines, and BBSes. I almost got an internship
at CompuServe, but it was across town and I couldn't afford the gas. In
college it was Usenet. At one of my first jobs, I called the help line for the
mini we had more than I would have liked. I asked stupid questions that I
would find on Stack Overflow today.

------
jl2718
> Who hired you? Pre-internet, nobody. Professional computer nerds did
> hardware. I was a screwup kid only interested in bad ideas from other
> screwup kids with modems.

> What did you work on? The internet. We had dial-up BBSs and some pay
> services, mostly to distribute text files. I spent most of my time writing
> peer-to-peer networking, distributed file storage, and p2p content
> discovery.

> How did you learn?

The book that came with the compiler and/or IDE was usually excellent and all
you needed. I tried reading Dr Dobbs, but that was usually closer to hardware,
and way over my head. There were all these books full of prose that I didn’t
care about. I just wanted code to get things done.

> How did you fix issues? Change. Compile. Error message. Repeat.

> How did you find talent? Kids that swapped floppies at school. Started with
> txtz, then pr0n, then warez, then c0dez.

> War stories? Lots of screwup kid stuff learned from txtz like making bombs
> and drugs. Password stealing worms. Eventually pioneered the click fraud
> worm at the beginning of the internet and that was the end of it for me.

------
LocalMan
Writing code in a commercial IT shop in the late seventies, the manager
thought he was progressive since he guaranteed that we could get as many as
one compilation per day. We'd submit punched cards and get back the card deck
and a compile listing when it was done.

Hardware was less reliable, occasionally coming up with the wrong results. One
time the result was wrong because the printer had printed the wrong digit! I'd
spent hours desk checking the code only to find the code was right. So I ran
it again.

It was much more normal for managers to not really know what a computer is.
This included programming and IT managers. One very accomplished boss
asked me, in a meeting, how long it took to write a program. What kind of
program? I asked. Oh, a general program. That put me in a no-win situation. If
I said a year, he'd pop a cork and get all red-faced. If I said two weeks,
he'd hold me to it no matter what the program was supposed to do.

------
joe202
> I was hired as a physicist with requirement for some programming. Almost
> immediately I was a programmer with a bit of physics.

> I first worked on surface acoustic waves for signal processing - the most
> computationally intensive part of the business. Other teams were laying out
> chips with Rubylith.

> Learnt by doing, occasional mentoring from more experienced team member,
> reading the manual.

> A lot of issue fixing was done by poring over printouts. A lot of trial and
> error.

> I didn't do talent hunting in those days.

> News was in journal articles, 'trade' magazines, user groups (proceedings
> distributed on magnetic tapes).

We had terminals connected to our main computer, a connection to a remote
university computer (slow) and networks to other sites within the business.
New software would be delivered on magnetic tape, also used for backup. Some
measurement data was transferred via paper tape.

Data was mostly measured, processed and printed out. Archive versions of code
were stored on printout.

------
foolmeonce
I'd just like to mention that the time between the internet and the web was a
few decades for some people, and it was quite different from "the Internet" today.

You had a terminal, then maybe a workstation on your desk, and lastly a PC and
modem at home... Newsgroups and IRC, FTP and MUDs over telnet, then eventually
Gopher... If you had no academic affiliations you might use independent BBSes
or communicate over an alliance of them that exchanged mail.

Many things could be downloaded eventually, but large things like compilers
and OSes and their updates might come on tape...

Even once the web was running, it was not evenly distributed, even in
corporations, and we would fax FAQ contents for common OS problems to workers
who had large disconnected networks. These still exist for security reasons
today, but those workers at least have web access when they leave their sites.

------
ipnon
The first computer program was transmitted through the postal network.

[https://www.bl.uk/collection-items/letter-from-ada-
lovelace-...](https://www.bl.uk/collection-items/letter-from-ada-lovelace-to-
charles-babbage)

------
jll29
Programming before the internet required tenacity/grit. You had no Stack
Exchange sites and no Google to find solutions to known problems; you had to
solve everything yourself. You could visit a friend and show your code,
usually as a print-out or by loading it from a floppy disk. You'd learn
programming by reading a book, by subscribing to magazines and by hanging out
with people who could program better than you.

Software would be distributed in print magazines for you to key in, either in
a language like BASIC or Turbo Pascal, or as many pages of hexadecimal numbers
that represented the binary executable (sometimes with a checksum, sometimes
without).
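Those per-line checksums existed so you could catch typing errors before running anything. A sketch of how such a check works (the line format and the sum-mod-256 scheme here are illustrative inventions, not any particular magazine's convention):

```python
def parse_line(line):
    """Parse one printed line: hex byte pairs followed by a checksum byte.
    The format is invented here for illustration."""
    parts = line.split()
    data = [int(p, 16) for p in parts[:-1]]
    checksum = int(parts[-1], 16)
    return data, checksum

def verify(line):
    """Checksum = sum of the data bytes modulo 256, a common simple scheme."""
    data, checksum = parse_line(line)
    return sum(data) % 256 == checksum

# One line of a hypothetical type-in listing: four bytes plus checksum.
# 0xA9 + 0x01 + 0x8D + 0x20 = 0x157, and 0x157 mod 256 = 0x57.
print(verify("A9 01 8D 20 57"))  # True
```

A single mistyped digit flips the check, which is exactly what you wanted after keying in fourteen pages of hex.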

------
quaffapint
I used to bring the C64 Programmer's Reference Guide to our community pool to
read in between the pool, games of Zaxxon, and eating Combos.

------
jimmyvalmer
No stackoverflow meant deciphering manpages or asking the guy sitting next to
you. You can imagine how ugly things got.

~~~
mangamadaiyan
Well, to be fair, you also had W. Richard Stevens if the guy sitting next to
you behaved like a certain bodily orifice, or if the man pages were either too
dense or too sparse. The key was to be able to read source and build a mental
model of the system you were working on. That usually took a few months to a
year depending on the system.

------
chrisbennet
I met my wife in college, where we were the only ones who taught ourselves 'C'
from K&R. (Mid 80's)

------
chrisbennet
You couldn't Google the answer, so you needed (and were allowed) to try stuff.
I think this exploration was an invaluable part of my experience. I learned so
much from what _didn't_ work. We don't have that luxury anymore - you just
Google the best answer.

~~~
markus_zhang
But you didn't have (to use) huge libraries and just a few days to complete a
requirement either...

Nowadays I think a systems programmer fits the role of the prehistoric
programmer more closely. You need a manual, a couple of cups of coffee and
that's it. Stack Overflow probably doesn't have a lot of answers for systems
programming, I assume. Plus a systems programmer usually has more skills out
of the box.

------
plerpin
I taught myself programming.

* Lots of grit. You needed it, to stay in the game when your shit didn't work.

* Type-in programs from books, magazines (3-2-1 Contact).

* Later on, buying big reference books.

* Trial and error, lots of it.

------
karmakaze
I discovered computers when starting high school. We had computers that used
floppy diskettes: 5 1/4" TRS-80s or 8" Wangs (yes, like in the Simpsons).

We also had a computer not in our building where we would code programs in
pencil onto cards (like multiple-choice answers) that would be taken away and
run overnight. You received your syntax errors the next day so there was
incentive to double-check.

That pretty much sucked, so I got an Atari 8-bit computer with the Basic and
Asteroids cartridges. The Basic book came with simple instructions - enough
that I could write my own lunar surface generator (2D, using graphic
characters) and lander game.

Over the years learned from magazines like Creative Computing, Compute!,
Antic, Byte, Dr Dobbs.

My big break was getting a floppy drive to save programs (before that I had to
turn off the computer, losing my work, to play Asteroids, which invariably
happened regardless of how many days I'd spent writing the current program). I
also got Galactic Chase and Macro Assembler (with Medit, a full-screen editor).
Before that I was making strings of extended characters in Basic that
contained manually translated machine instructions to execute. You allocate a
string that ends in an "RTS" (return from subroutine), get its address and
jump-subroutine to it.

After getting bits and pieces of hardware graphics info from magazines, I got
my hands on the legendary "De Re Atari", which pretty much explained
everything, or at least started to, so you could follow up. After that my
friends and I made an endless number of games. Even then I found making the
game-building tools (character set editors, sprite editors, level editors)
more fun than playing the games.

Then I got paid to write a stock-keeping program for a stereo/jewellery store
in the mall with CompuPlace where I hung out. That was CP/M running dBase II.
Wrote an inventory program for CompuPlace with machine-code compressed in-
memory sort with memory bank switching (so like a merge sort). Automated some
record keeping for an Italian immigration assistance shop run by a priest,
discovering espresso. Then I started university co-op. Wrote some mini,
mainframe, Vax assembler routines, COBOL, microcode macros for terminals.
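A bank-switched in-memory sort presumably works like the textbook external merge sort: sort each bank-sized chunk independently, then k-way merge the sorted runs. A toy Python sketch of that idea (the bank size and data are invented; real banks were tens of kilobytes):

```python
import heapq

BANK_SIZE = 4  # toy "memory bank" capacity

def banked_sort(records):
    """Sort more records than fit in one bank: sort each bank-sized
    chunk independently, then k-way merge the sorted chunks."""
    chunks = [sorted(records[i:i + BANK_SIZE])
              for i in range(0, len(records), BANK_SIZE)]
    return list(heapq.merge(*chunks))

data = [9, 3, 7, 1, 8, 2, 6, 4, 5]
print(banked_sort(data))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The machine-code version would switch a bank in, sort it in place, and only touch one bank at a time during the merge, but the control flow is the same.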

My real education didn't happen until I worked as a co-op for a
communications/graphics company called Eicon Technology in Montreal. Read the
K&R C book, wrote programs using MS C Compiler v3 and v4 (woohoo CodeView). I
wrote an interpreter for the HP PCL language used by LaserJets to be firmware
for Eicon's laser engine products.

I have a few good stories. I was working late at the office one night and the
office got a call. We had phones at our desks with all the line buttons so I
answered it. One of our developers in the Vancouver office doing a high
profile project with IBM had just somehow lost the hard drive partition with
all their latest work. There were lots of undelete tools at the time, but this
was an IBM project--they were using OS/2 with HPFS formatted drives. I figured
that the machines were identically configured, so I instructed the guy, over
the phone, to use the copy of the DOS Norton Utility that I sent him, again
over phone modem, to copy some sectors from any random computer with the same
OS and hard disk size. While working late at the office, I would often read
the books on our shelves, one of them happened to detail the internals of HPFS
and I'd recently read it. HPFS has multiple boot sectors at the beginning of
the disk, then central directory in the middle of the disk (to save on seek
time), then data bands with local allocation info. I just wanted to copy the
initial boot and central directory from one machine to the other. Of course
all the filenames and allocation would be garbage, but it would at least know
it had a filesystem and the size of it. There were some "systeminternals" tools
that could be downloaded from various BBSes and one of them happened to be an
HPFS directory rebuilder. I think we had to boot OS/2 off of several floppies
to get to a command prompt to run that. After maybe an hour of running, the
data was back. Got the news, was surprised to hear it worked as well as it
did. They owed me beer and Mongolian grill when I came to visit.
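The sector-copying step of that rescue can be sketched generically. The offsets below are placeholders, not real HPFS locations, and the "disk images" are tiny byte strings rather than raw devices:

```python
SECTOR = 512

def copy_sectors(donor, target, start_sector, count):
    """Copy `count` sectors starting at `start_sector` from one disk
    image into another. The recovery described above copied the boot
    area and the mid-disk directory band from an identically
    configured machine before running a directory rebuilder."""
    lo = start_sector * SECTOR
    hi = lo + count * SECTOR
    return target[:lo] + donor[lo:hi] + target[hi:]

donor = bytes(range(256)) * 8   # 2048-byte toy "healthy" image
target = bytes(2048)            # wiped image, all zeroes
patched = copy_sectors(donor, target, 0, 1)  # restore "boot" sector 0
print(patched[:SECTOR] == donor[:SECTOR])    # True: sector 0 restored
```

The point of the story stands: the copied structures only told the filesystem driver "there is an HPFS here, this big"; a separate rebuilder tool had to reconstruct the actual directory entries.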

------
jlokier
A small number of books, lots of magazines, and simply spending time with
machines just trying things and figuring them out.

As a teenager, I had about 2-3 people to talk to about computers, and no-one
who could program like I could.

At 10 I got to play with a BBC micro, then at 11, I was lucky to get one of my
own. Around the same time, I discovered the BBC Micro Advanced Users Manual, a
book which was banned in the school computer room for unclear reasons.

It was chock full of commands for the ROM, things like how to draw,
programming the sound controller, how to work with the video chip, the A/D
converters, etc. Looking back that was a very good ROM and a very good book
for someone like me learning.

In the back was a two page table listing the 6502 mnemonics and opcodes. But
it didn't say what any of the instructions did!

It took me a while to guess what things like ADC, LDX, TXA etc. meant and it
was really exciting to have the ideas fall into place, as if by magic as the
guesses worked out.
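For the curious, a toy interpreter for a few of those mnemonics shows what the guesses resolve to. Real 6502 semantics include carry and status flags, which this sketch deliberately ignores, and the example program is invented:

```python
def run(program):
    """Interpret a tiny subset of 6502 mnemonics: LDA/LDX load
    registers, ADC adds to the accumulator (carry ignored here),
    TXA copies X into A, RTS returns. Returns the accumulator."""
    a = x = 0
    for op, *arg in program:
        if op == "LDA":    a = arg[0]                # LoaD Accumulator
        elif op == "LDX":  x = arg[0]                # LoaD X register
        elif op == "ADC":  a = (a + arg[0]) & 0xFF   # ADd with Carry
        elif op == "TXA":  a = x                     # Transfer X to A
        elif op == "RTS":  break                     # ReTurn from Subroutine
    return a

# LDX #5 ; TXA ; ADC #3 ; RTS  -> accumulator ends at 8
print(run([("LDX", 5), ("TXA",), ("ADC", 3), ("RTS",)]))  # 8
```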

The BBC was unusual in having a built-in assembler, embedded into BBC BASIC,
so it was easy to try instructions.

This was awesome for learning assembly as a child, and it meant I ended up
learning assembly less than a year after touching a computer for the first
time.

There were absolutely no books from the local library that helped with
computers. And no bookshops. It was that manual (as well as the non-advanced
manual), a couple of Ladybird books about the basics of how computers work and
logic gates, and... lots and lots of magazines. Computer magazines were almost
everything, I had boxes of them by the end. I'm sure I learned a lot from
them, especially C++, although I didn't have anything powerful enough to
compile C, let alone C++. The low-level technical stuff wasn't really covered;
I had to figure that out myself. But there were a lot of game listings you
could type in and learn from.

(There was one book from the library about PAL colour TV that I kept taking
out though, because PAL encoding is so fascinating and subtle, and it
described it both mathematically and in great detail the analogue circuits
that implemented it all.)

I wrote a lot of games, clever demos, and several little OSes as a teenager
(for different machines). Since there was no internet (that I knew about) and
I had no modem, and couldn't have afforded the phone calls if I did anyway,
for the most part I'm the only person who used my work at the time, but some
of the games, demos and cracks were used by others.

Much of my time was spent cracking games, and some of them had fancy copy
protection schemes and strange tape modulations (we loaded things from
cassette tape), so there were a lot of puzzles to solve.

I ported a few games from machine to machine, just for interest. But a couple
in particular I put a lot of effort into. Starquake was a fun Spectrum game
that I reimplemented on the BBC, but ultimately didn't share with anyone.
Impossaball nearly earned me some cash, but it didn't work out as the company
decided they were leaving the BBC market, shortly after they'd agreed to buy
it from me :/

In those days, chips in computers did just what they needed to when programmed
just right, and did all sorts of strange things when programmed differently.
So another bit of fun, that demoscene folks will recognise, is that if you
poke video chips in various "unspecified" ways or at critical times to exact
clock cycle, they malfunction in interesting ways.

On the BBC that got me a high-res full-colour display which was "impossible"
(therefore good for the Starquake port), smooth scrolling which was also
"impossible", and mode 7 multi-colour text without gaps between letters -
that's enough to do syntax highlighting which I never saw anyone do on it :-)

One of the most memorable things for me from that time, was printing out the
full disassembly of Elite (Disk version) on dot matrix, and poring over it to
completely understand how the game worked. I'm not sure how many pages that
was, but the stack of paper was about an inch thick and filled my bedroom when
unfolded. I learned a lot from that; it was a rich game, full of tricks.

It turns out I couldn't actually play the game properly until I modified it
anyway, as I had a "Torch" BBC Micro, which had a different keyboard mapping
than the standard BBC, so I had to modify the game before I could play it as
it was meant to be played.

My "not quite BBC" machine had a 6502, Z80 and 68000 CPU attached, so in a
peculiar way I got to write for all three CPUs and do some things that were,
in retrospect, very unusual for their time. At various times I was running ZX
Spectrum (Z80) and Atari/Amiga (68000) game code on my BBC (6502), in a way
that is akin to emulation, and remarkably for some things it sort of worked. I
also built a crude floppy disk controller in discrete logic (74LS devices) on
a breadboard so I could read Amiga floppies, because ordinary floppy
controllers couldn't read its unusual format.

Then I went to university at 18, and within a month discovered the internet
using the Sun machines there.

I say within a month, because it was a _secret_ internet: Undergraduates were
not supposed to have access, and we were not even told the internet existed.

I found the internet _entirely by accident due to running an Emacs command_...

And when administration found out, they took it away... but felt pity and let
me have access again as long as I didn't tell other people. It was read-only,
I wasn't allowed to post anything.

That's when I started to learn Lisp, C and Unix on Sun workstations...

The main source of knowledge was Emacs' own built-in documentation, Sun man
pages, GNU info, and other people's source code (usually GNU) because I could
download packages over FTP, even if I couldn't talk to anyone writing it.

Eventually we were all allowed to use the internet, so Usenet was all the
rage, people joined MUDs, and a couple of years later this thing called NCSA
Mosaic came out.

About then, some guy called Linus posted on comp.os.minix that he was writing
a toy kernel...

------
SigmundA
MSDN Library on CD anyone?

~~~
kjaftaedi
Yes, but this was after the internet.

