
The IBM PC, Part 1 (2012) - Stratoscope
http://www.filfre.net/2012/05/the-ibm-pc-part-1/
======
kpgraham
My company sent me to Boca Raton in 1983 to learn how to use the PC. It was a
week-long class with hands-on experience configuring PCs and BAT file
programming using "EDLIN".

I had PC serial #7 in the class and serial #1 was at the front of the
classroom.

The instructors were very excited that they could use BAT files and were
bragging that the whole class roster was entered and managed on the PC using
only the tools that came with PC-DOS.

The course set me on the path of systems and programming and kept me fat and
happy for the next 35 years.

~~~
mschaef
> the whole class roster was entered and managed on the PC using only the
> tools that came with PC-DOS.

Having once written a depth-first directory tree traversal using edlin and
batch files, I can only imagine what this must have entailed.

------
forinti
The first time I used an IBM PC I thought it was awful. I was used to 8-bit
micros that had colour, an OS in ROM, BASIC in ROM, and sound.

You couldn't do anything with it without using at least two disks: one for
DOS and the other with an application.

I thought business people liked them because they wanted boring machines on
purpose.

And then came the 16-bit machines: the Amiga, the ST, etc. And then the Acorn
Archimedes. And IBM PCs were still awful. They started to make sense to me
when the 386s came about.

~~~
bluedino
The 386s weren't even that great for games/graphics. Most games were still
286-compatible, and if you had a 386SX you might as well have had a 286. I
remember being blown away by a friend's Tandy because of the graphics and
sound card it had, even though it was basically an IBM PC underneath.

The CGA graphics options on the IBM really didn't do it any favors. EGA
wasn't much better, but at least you could do games like Commander Keen. I
remember playing games like Prince of Persia and Battle Chess and thinking,
"Finally. My friends with Atari and Amiga (and even Apple) computers have had
these games for years."

It really took a 486 and a VGA card to let the PC start to shine. And then
once the 3D games started coming out (Doom, Descent), you didn't even need a
console anymore.

~~~
Zardoz84
Correction: a 386 with an SVGA card. I played Doom, Descent, Wolfenstein 3D,
Dune, Dune 2, Indiana Jones and the Fate of Atlantis, Warcraft II, SimCity
2000 and Hexen on a simple AMD 386 at 40 MHz with 4 MiB of RAM and a 1 MiB
OAK SVGA card that my father built himself.

~~~
mschaef
The 386/40s were definitely fast enough to be pushing into the 486's
performance envelope (at least for integer work).

------
walterbell
_> Rather than talking of an open architecture, we might do better to talk of
a modular architecture. The IBM would be a sort of computer erector set, a set
of interchangeable components that the purchaser could snap together in
whatever combination suited her needs and her pocketbook._

Has this been achieved with software components in any niche? Other than early
IETF protocols, software mutation tends to challenge interoperability.
Containers are making progress in this direction.

On the hardware front,
[http://www.opencompute.org](http://www.opencompute.org) has increased choices
for those who can buy servers in large quantities. But small businesses have
less choice these days. Even Dell, the pioneer of build-to-order PCs, now
favors pre-configured PCs.

On a positive note, it is now possible to buy some PCs without a Windows tax.
Dell offers Linux and HP offers Linux/FreeDOS on select models. Is there room
for PC OEM/ODMs to innovate in (free, non-bloatware) software components to
increase the value of their mix-and-match hardware components?

~~~
badsectoracula
> Has this been achieved with software components in any niche?

Isn't this what "programs" and "files" do? There used to be a time when
computers were really one-trick (or few-trick) machines, but ever since anyone
could install any program on their computer and have programs work with any
file (regardless of which program originally made it), software on the whole
has been as modular as (if not more modular than) hardware.

(which is also why I dislike the trend that iOS created, with software being
isolated both in terms of execution and file access, except for a few
preapproved holes here and there)

~~~
jacquesm
> There used to be a time when computers were really one (or few) trick
> machines

You mean the time when programming was done with plug boards?

Computers have been more or less universal since the time RAM was used to
execute the programs; the fact that files are a convenient way to organize
data does not detract from the fact that the internal structure of a file
needs to be known if you want another program to make sense out of it.

In that sense the text file is the most enabling element here, and binary
files with unknown structure the least.

The ability to load a program quickly (rather than to load it in through
elaborate means such as plugboards, switches, punched cards, paper tape or
magnetic reel-to-reel tape) definitely has had a lot of effect on the
universal application of computers, but that doesn't mean they weren't
universal machines.

And even in the time of plug boards there were tricks that would keep multiple
programs active by means of switching out an entire bank or segment of
configured boards for another.

Software is as modular as the hardware it runs on permits, but it can also
very effectively restrict or expand that modularity; a nice example of this is
virtual machines, which present a machine entirely different from the one that
the underlying hardware embodies.

~~~
laumars
> _Computers have been more or less universal since the time RAM was used to
> execute the programs; the fact that files are a convenient way to organize
> data does not detract from the fact that the internal structure of a
> file needs to be known if you want another program to make sense out of it._

> _In that sense the text file is the most enabling element here, and binary
> files with unknown structure the least._

Text files haven't always been universal. There used to be different text
encodings (ASCII hasn't always been universal) and even different sizes of
byte (e.g. some systems would have 6 or 7 bits to a byte). And even if two
machines you wanted to share data between were both 8-bit ASCII systems,
there's no guarantee that they would share the same dialect of BASIC, LISP,
Pascal or C.
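
To illustrate with a quick bit of Python (the choice of EBCDIC code page here
is just an example):

    # The same byte string means different things under different encodings.
    # EBCDIC (IBM code page 037) vs. ASCII/Latin-1 is the classic clash.
    data = "HELLO".encode("cp037")    # EBCDIC bytes: b'\xc8\xc5\xd3\xd3\xd6'
    print(data.decode("cp037"))       # -> HELLO
    print(data.decode("latin-1"))     # -> gibberish on an ASCII-ish machine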

~~~
jacquesm
ASCII isn't universal even today (the page you are reading is UNICODE), but
text was always much more readable and in general easier to process with
ad-hoc filtering or other manipulation than binary formats.

This is what underpins most of the power of UNIX, but at the same time it is
something of a weak point: good and error-free text processing is actually
quite hard.

~~~
laumars
> _ASCII isn't universal even today (the page you are reading is UNICODE)_

That's not really a fair point, because Unicode and nearly all of the other
extended character sets (if not all of them) still follow ASCII for the lower
ranges. That includes the Windows code pages, the ISO-8859 family (which
itself contains more than a dozen different character sets) and the various
Unicode encodings.
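
A quick Python illustration of what that compatibility looks like in practice
(the specific encodings are just examples):

    # For pure ASCII text, the "extended" encodings all agree byte-for-byte.
    s = "plain ASCII text"
    assert s.encode("ascii") == s.encode("utf-8") \
           == s.encode("latin-1") == s.encode("cp1252")
    # They only diverge once you leave the 0-127 range:
    print("é".encode("utf-8"))    # b'\xc3\xa9'
    print("é".encode("latin-1"))  # b'\xe9'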

> _but all text was much more readable and in general easier to process with
> ad-hoc filtering or other manipulation than binary formats._

Text is still a binary format. If you have a different byte size or a
significantly different base character set then you will still end up with
gibberish. This is something we've come to take for granted in the "modern"
era of ASCII, but back in the day "copying text files" between incompatible
systems would produce more garbage than just a few badly rendered characters
or the carriage-return issues you get when switching between UNIX and Windows.

~~~
jacquesm
So, essentially you are trying to make the point that even ASCII has its
problems and that all data has to be encoded somehow before processing can be
done on it. The latter seems to be self-evident and UNICODE is a response to
the former.

~~~
laumars
That's not what I'm saying at all. ASCII is just a standard for how text is
encoded into binary. However, it hasn't been around forever, and before it
there were lots of different, incompatible standards. It was so bad that some
computers would have their own proprietary character set. Some computers also
didn't even have 8 bits to a byte, and since a byte was the unit for each
character (albeit ASCII is technically 7-bit, but let's not get into that
here), it meant systems with 6 or 7 bits to a byte would be massively
incompatible: you're off by one or two bits on each character, which
multiplies up with each subsequent character. This meant that text files were
often fundamentally incompatible across different systems (I'm not talking
about weird character rendering; I'm talking about the file looking like
random binary noise).
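
A toy Python sketch of that drift (the 6-bit character code here is made up,
purely to show the effect):

    # Hypothetical 6-bit character code (A=0 .. Z=25), packed bit-by-bit, then
    # read back as ordinary 8-bit bytes: the alignment drifts with every
    # character, so an 8-bit system sees noise rather than "HELLO".
    def pack6(codes):
        bits = "".join(f"{c:06b}" for c in codes)
        bits += "0" * (-len(bits) % 8)              # pad to whole octets
        return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

    codes = [ord(ch) - ord("A") for ch in "HELLO"]  # [7, 4, 11, 11, 14]
    print(pack6(codes))                             # b'\x1cB\xcb8' - not text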

ASCII changed a lot of that and did so with what was, in my opinion at least,
a beautiful piece of design.

I did a quick bit of googling around to find a supporting article about the
history of ASCII:
[http://ascii-world.wikidot.com/history](http://ascii-world.wikidot.com/history)

There were also some good discussions on HN a little while ago about the
design of ASCII; I think this was one of the referenced articles:
[https://garbagecollected.org/2017/01/31/four-column-ascii/](https://garbagecollected.org/2017/01/31/four-column-ascii/)

The history of ASCII is really quite interesting if you're into old hardware.
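
A quick Python sketch of the bit layout that second article describes (just to
show why the design is considered elegant):

    # ASCII's columns line up so that simple bit operations are meaningful:
    print(chr(ord("a") & ~0x20))   # 'A' - clearing bit 5 upper-cases a letter
    print(chr(ord("A") | 0x20))    # 'a' - setting it lower-cases again
    print(ord("A") & 0x1F)         # 1   - the Ctrl-key trick: Ctrl-A is code 1
    print(ord("5") & 0x0F)         # 5   - a digit's low nibble is its value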

~~~
jacquesm
ASCII wasn't nearly as universal as you think it was.

And a byte never meant 8 bits, that's an octet.

ASCII definitely was - and is - a very useful standard, but it does not have
the place in history that you assign to it.

In the world of micro-computing it was generally the standard (UNIX, the PC
and the various minicomputers also really helped). But its limitations were
apparent very soon after its introduction and almost every manufacturer had
their own uses for higher order and some control characters.

Systems with 6 or 7 bits to a byte would not be 'massively incompatible';
they functioned quite well _with their own software and data encodings_. That
those were non-standard didn't matter much until you tried to import data from
another computer or export data to another computer made by a different
manufacturer.

Initially, manufacturers would use this as a kind of lock-in mechanism, but
eventually they realized standardization was useful.

Even today such lock-in is still very much present in the world of text
processing: in spite of all the attempts at making characters portable across
programs running on the same system and between various systems, formatting
and special characters are easily lost in translation if you're not extra
careful.

Ironically, the only thing you can - even today - rely on is ASCII7.

Finally we've reached the point where we can drop ASCII with all its warts
and move to UNICODE. As much as ASCII was a 'beautiful piece of design', it
was also very much English-centric, to the exclusion of much of the rest of
the world (a neat reflection of both the balance of power and the location of
the vast majority of computing infrastructure for a long time). If you lived
in a non-English-speaking country, ASCII was something you had to work with,
but probably not something that you thought of as elegant or beautiful.

~~~
laumars
With the greatest of respect, you don't seem to be paying much attention to
the points I'm trying to raise. I don't know if that is down to a language
barrier, my explaining things poorly, or whether you're just out to argue for
the hell of it. But I'll bite...

> _ASCII wasn't nearly as universal as you think it was._

I didn't say it was universal. It is _now_, obviously, but I was talking
about __before__ it was even commonplace.

> _And a byte never meant 8 bits, that's an octet._

I know. I was the one who raised the point about the differing sizes of byte.
;)

> _ASCII definitely was - and is - a very useful standard, but it does not
> have the place in history that you assign to it._

On that we'll have to agree to disagree. But from what I do remember of early
computing systems, it was a bitch working with systems that weren't ASCII
compatible. So I'm immensely grateful, regardless of its place in history.
However, your experience might differ.

> _In the world of micro-computing it was generally the standard (UNIX, the PC
> and the various minicomputers also really helped). But its limitations were
> apparent very soon after its introduction and almost every manufacturer had
> their own uses for higher order and some control characters._

Indeed, but most of them were still ASCII-compatible. Without ASCII there
wouldn't even have been a compatible way to share text.

> _Systems with 6 or 7 bits to a byte would not be 'massively incompatible';
> they functioned quite well with their own software and data encodings. That
> those were non-standard didn't matter much until you tried to import data
> from another computer or export data to another computer made by a different
> manufacturer._

That's self-contradictory. You literally just argued that differing bit sizes
wouldn't make systems incompatible with each other because they work fine on
their own; they just wouldn't be compatible with other systems. The latter is
literally the definition of "incompatible".

> _Initially, manufacturers would use this as a kind of lock-in mechanism, but
> eventually they realized standardization was useful._

It wasn't really much to do with lock-in mechanisms - or at least not on the
systems I used. It was just that the whole industry was pretty young, so there
was a lot of experimentation going on and different engineers with differing
ideas about how to build hardware / write software. Plus the internet didn't
even exist back then - not even ARPANET. So sharing data wasn't something that
needed to happen often. From what I recall, the biggest issues with character
encodings back then were hardware-related (e.g. teletypes), but the longevity
of some of those computers is what led to my exposure to them.

> _Finally we've reached the point where we can drop ASCII with all its warts
> and move to UNICODE. As much as ASCII was a 'beautiful piece of design', it
> was also very much English-centric, to the exclusion of much of the rest of
> the world (a neat reflection of both the balance of power and the location
> of the vast majority of computing infrastructure for a long time). If you
> lived in a non-English-speaking country, ASCII was something you had to work
> with, but probably not something that you thought of as elegant or
> beautiful._

I use Unicode exclusively these days. With Unicode you have the best of both
worlds - ASCII support for interacting with any legacy systems (ASCII
character codes are still used heavily on Linux, by the way, since the
terminal is just a pseudo-teletype) while having the extended characters for
international support. Though I don't agree with all of the characters that
have been added to Unicode, I do agree with your point that ASCII wasn't
nearly enough to meet the needs of non-English users. Given the era, though,
it was still an impressive and much-needed standard.

A side question: is there a reason you capitalise Unicode?

------
bogomipz
Interesting, I had no idea about this passage:

>"In September of 1975 the company introduced the IBM 5100, their first
“portable” computer. (“Portable” meant that it weighed just 55 pounds and you
could buy a special travel case to lug it around in.)"

Six years later the "luggable" from Compaq became a real thorn in IBM's side.
If you enjoy this type of computer history I highly recommend watching
"Silicon Cowboys", a documentary about the rise of Compaq but also a bit about
the history of the IBM PC and "clones". It's pretty fascinating, and it's
available on Netflix as well as free on YouTube:

[https://www.youtube.com/watch?v=oylmFdZswCM](https://www.youtube.com/watch?v=oylmFdZswCM)

------
amelius
Computing was more fun when hardware and software were separated.
Unfortunately the trend is now going in the opposite direction.

~~~
mschaef
How far back do you want to go with that? I remember assembling computers out
of video boards, motherboards, etc., but there are some people who remember
soldering chips together... or transistors... or tubes... or relays. The trend
towards higher levels of integration in computing extends back to the '50s (or
maybe the '40s).

> Computing was more fun

Fun is where you choose to find it. Modern machines are so good that they can
be fun too.

------
walshemj
Interesting - the 8-inch floppies look similar to the ones used in the IBM
Displaywriter: hard-sectored, and they made a distinctive "graunching" sound
when the drives were accessed.

