
Six works of Computer Science-Fiction - smcgivern
http://blog.fogus.me/2015/04/27/six-works-of-computer-science-fiction/
======
kens
The architecture manual for the Intel iAPX 432 processor reads like alternate-
world science fiction. [1] This processor came out in 1981 and was supposed to
be the revolutionary new thing in computers. It failed and was mostly
forgotten, but the world would be very different if it had replaced x86.

The 432 had incredible hype: "The vacuum tube, the transistor, the
microprocessor - at least once in a generation an electronic device arises to
shock and strain designers' understanding. The latest such device is the iAPX
432 micromainframe processor, a processor as different from the current crop
of microprocessors (and indeed, mainframes) as those devices are from the
early electromechanical analog computers of the 1940's." [2]

This 32-bit machine had some very unusual features. It implemented support for
objects at the hardware level, with access protection on a per-object basis.
Even the kernel didn't have access to everything. The world would be much more
secure with no more buffer-overflow exploits.
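A rough software sketch of the idea (hypothetical Python, of course - the 432
enforced this in hardware): every reference to an object is a capability that
carries its own access rights, and every access is checked against them.

```python
# Toy capability-style references: a sketch of per-object access
# protection, not the 432's actual mechanism.

class Capability:
    """A reference to an object bundled with the rights it grants."""

    def __init__(self, obj, rights):
        self._obj = obj
        self._rights = frozenset(rights)  # e.g. {"read", "write"}

    def read(self, field):
        if "read" not in self._rights:
            raise PermissionError("capability lacks read right")
        return getattr(self._obj, field)

    def write(self, field, value):
        if "write" not in self._rights:
            raise PermissionError("capability lacks write right")
        setattr(self._obj, field, value)

    def restrict(self, rights):
        # Derive a weaker capability; rights can only shrink, never grow.
        return Capability(self._obj, self._rights & set(rights))

class Account:
    def __init__(self, balance):
        self.balance = balance

acct = Account(100)
full = Capability(acct, {"read", "write"})
readonly = full.restrict({"read"})

assert readonly.read("balance") == 100
try:
    readonly.write("balance", 0)
except PermissionError:
    pass  # even "privileged" code can't exceed the rights it holds
```

Because rights can only shrink when a capability is derived, code can never
forge a stronger reference than the one it was handed - a hardware version of
this check is what would make classic memory-corruption exploits so much
harder.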

This chip was started before the 8086 and included a virtual address space of
2^48 bytes. It was designed to be programmed entirely in high-level languages.
The processor also included garbage collection in hardware, and it supported
floating point and multi-processor operation well before x86 did. Part of the
operating system was built into the chip; the policies were defined in
software, but the implementation was on the chip.

It's interesting to think what computers and programming would be like if the
Intel 432 had succeeded instead of x86. We'd probably have super-secure
computers and be programming in Ada.

[1] [http://bitsavers.org/pdf/intel/iAPX_432/171821-001_Introduction_to_the_iAPX_432_Architecture_Aug81.pdf](http://bitsavers.org/pdf/intel/iAPX_432/171821-001_Introduction_to_the_iAPX_432_Architecture_Aug81.pdf)

[2] [https://archive.org/stream/Intel-AR-166UnderstandTheNewestProcessorToAvoidFutureShockOCR#page/n3/mode/2up](https://archive.org/stream/Intel-AR-166UnderstandTheNewestProcessorToAvoidFutureShockOCR#page/n3/mode/2up)

~~~
bcantrill
The iAPX 432 failed for a very good reason: its performance was infamously
abysmal. The performance of the iAPX 432 is actually the subject of one of my
favorite systems papers of all time: Robert Colwell's "Performance Effects of
Architectural Complexity in the Intel 432"[1] -- a paper that I love so much
that I wrote a reasonably detailed review of it decades after it was
published.[2]

[1] [http://us-east.manta.joyent.com/bcantrill/public/colwell-432.pdf](http://us-east.manta.joyent.com/bcantrill/public/colwell-432.pdf)

[2] [http://dtrace.org/blogs/bmc/2008/07/18/revisiting-the-intel-432/](http://dtrace.org/blogs/bmc/2008/07/18/revisiting-the-intel-432/)

~~~
rodgerd
> The iAPX 432 failed for a very good reason

From my point of view it actually seems like a terrible reason _in the long
run_, given that even something as widely derided as the x86 architecture has,
over time, been made to perform.

~~~
sitkack
Ideas fail and succeed for reasons that are totally unrelated to their
"goodness": x86, JavaScript, trains, the bicycle.

~~~
efnx
Please expand on the bicycle.

~~~
sitkack
The bicycle is an excellent technology that has largely failed in the western
world, but not because it is a bad idea. The car economy has more money and
more political power, allowing it to displace the bicycle.

~~~
jacquesm
The bicycle is doing fine in the western world, it's the transportation of
choice for just about every kid and plenty of adults too.

~~~
sitkack
I wasn't making a value judgement on bicycles. Look at the growth curve and
adoption rate of bicycles in the western world. The bicycle _paved_ the way
for the car: tubed tires, chains, sealed roads - all originally for the
bicycle. Cities could have been denser, cleaner, quieter, and vastly safer
with bicycles. Places like Amsterdam weren't always a cycling paradise; canals
were being paved over to make streets for cars. And China, which has had great
bicycle adoption, is seeing a state-sponsored push to switch over to a
car-based consumption economy.

Only recently has the bicycle seen a resurgence in the west. A small uptick
doesn't mean bicycles as a technology have succeeded to the level they should
have compared to the alternatives.

[http://planyourcity.net/2013/03/15/amsterdam-the-bicycling-capital-of-europe/](http://planyourcity.net/2013/03/15/amsterdam-the-bicycling-capital-of-europe/)

------
BoppreH
I hoped for actual fiction, like "The Laundry Files" (where computation can
summon demons), "The Lifecycle of Software Objects" (about raising child AIs),
"The Last Question" (a short story by Asimov), or "The Nine Billion Names of
God" (about monks buying a computer for religious purposes).

All of the above are highly recommended, by the way.

~~~
Freaky
Greg Egan's "Diaspora"[1] (where most of us literally live in computers) and
"Permutation City"[2] (living in a cellular automaton).

Similar themes in Roger Williams's "The Metamorphosis of Prime Intellect"[3]
(an AI accidentally takes over the universe).

1: [http://gregegan.customer.netspace.net.au/DIASPORA/DIASPORA.html](http://gregegan.customer.netspace.net.au/DIASPORA/DIASPORA.html)

2: [http://gregegan.customer.netspace.net.au/PERMUTATION/Permutation.html](http://gregegan.customer.netspace.net.au/PERMUTATION/Permutation.html)

3: [http://localroger.com/prime-intellect/](http://localroger.com/prime-intellect/)

~~~
frikk
Six tabs later, with a $4 copy of Permutation City on its way to me from
Amazon, I had to go back and find what started me down that rabbit hole.
Thanks for the recommendation; that sounds fascinating.

~~~
ngoldbaum
If you don't mind highly technical hard SF, Schild's Ladder is also _very_
good. It's not explicitly spelled out in the text, but it serves as a good
pseudo-sequel to Diaspora.

Egan's publisher recently ran off a new printing of many books in his back
catalog that were hard to find in the US.

[http://www.amazon.com/Schilds-Ladder-Novel-Greg-Egan/dp/1597805440](http://www.amazon.com/Schilds-Ladder-Novel-Greg-Egan/dp/1597805440)

~~~
frikk
Awesome, thanks for sharing. I'm a big hard scifi fan, although I tend to
stick to the middle half of the last century (there's just so much good
stuff!)

------
AKrumbach
"these are books of computer science and/or programming that when you read
them you can’t quite believe that what they claim is reality."

I feel that part of why these books seem so alien is that most people are
taught programming as if it were two different disciplines: "low level"
algorithms, with fixed data types, and big-O complexity theory; and "high
level" systems design, with type abstraction and object patterns. While a
truly skilled programmer must understand both worlds, this sort of model has
them separated in the same manner physicists seem to separate general
relativity and quantum chromodynamics (or oil and water).

Books on Smalltalk and Forth, like those listed in the article, frequently
reveal a mode of programming which is neither purely "high" nor "low" level.
Yet despite their non-conformity, neither language is haphazard or capricious
in design. Instead, they both seem to embody the unofficial motto of the US
Army Engineers: "The difficult we do immediately. The impossible takes a
little longer."

------
radiowave
I was thinking about this just recently, but instead of likening these kinds
of things to sci-fi, I was thinking more along the lines of non-Euclidean
geometry, i.e. what if we take something that is _considered_ to be axiomatic
and change it - a whole different world emerges.

For example, throw out the notion that memory is volatile - or slightly more
practically, what if the price we pay for automatic memory management in our
programming languages also bought us abstraction over the volatility of
memory? How different would our systems look? For one thing, switching things
off and back on again wouldn't be the "cure-all" that it mostly is today.
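A toy sketch of what such an abstraction might feel like, using Python's
standard-library shelve module (the file path here is made up for
illustration - a real orthogonally persistent system would hide all of this):

```python
# Program state that lives in a persistent store rather than volatile
# memory: each "run" picks up exactly where the previous one left off.

import os
import shelve
import tempfile

# Hypothetical location for the persistent image of our state.
path = os.path.join(tempfile.mkdtemp(), "state")

def bump_counter():
    # Every call sees whatever earlier runs left behind.
    with shelve.open(path) as state:
        state["counter"] = state.get("counter", 0) + 1
        return state["counter"]

# Simulate two separate runs of the "program": the state persists.
first = bump_counter()
second = bump_counter()
assert (first, second) == (1, 2)
```

With state like this, "switching it off and on again" changes nothing: the
counter keeps counting across runs - which is exactly why it would no longer
be a cure-all, for better or worse.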

The fact that we _can_ build systems like Smalltalk tells us that much of our
current notions of computing are merely convention, not axiomatic at all.

Smalltalk and Forth are definitely "different convention" things, while SICP
and CTM are more like detailed examinations of things that might _really_ be
axiomatic, giving us the means of combination, and hopefully the means to
imagine building things beyond what our mindset of present conventions would
allow.

~~~
groby_b
Quite a bit of memory _used_ to be non-volatile.[1] These days there's
FeRAM[2], but it's not widely used. Power-off not wiping memory wasn't really
an issue back then - you'd just manually key in a bootloader of ~30 machine
words that you knew by heart :)

[1] [http://en.wikipedia.org/wiki/Magnetic-core_memory](http://en.wikipedia.org/wiki/Magnetic-core_memory)

[2] [http://en.wikipedia.org/wiki/Ferroelectric_RAM](http://en.wikipedia.org/wiki/Ferroelectric_RAM)

------
spdegabrielle
'The New Turing Omnibus', along with 'The Magic Machine' and 'The Armchair
Universe' by A. K. Dewdney, really appealed to the young (science-fiction-
reading) me.

The current (older) me just had their mind blown by 'Self' - but there is no
book.

------
abecedarius
The Connection Machine. Nanosystems. The Humane Interface. Cellular Automata
Machines.

------
carlosgg
The author of the 3rd book in the list, Peter Van Roy, teaches a course on
edX. :) The course is archived but I think people can access all the videos
and notes.

[http://www.edx.org/course/paradigms-computer-programming-louvainx-louv1-1x](http://www.edx.org/course/paradigms-computer-programming-louvainx-louv1-1x)

[http://www.edx.org/course/paradigms-computer-programming-louvainx-louv1-2x](http://www.edx.org/course/paradigms-computer-programming-louvainx-louv1-2x)

------
waterlesscloud
Thanks for this list. I'm always interested in the might-have-been worlds of
computer science. The Smalltalk and the Oberon books sound particularly
interesting.

------
jonathanhefner
At least two of these are available for free download (as in "not pirated"):

[http://thinking-forth.sourceforge.net/](http://thinking-forth.sourceforge.net/)

[http://www.projectoberon.com/](http://www.projectoberon.com/)

------
brudgers
I've been watching the _SICP_ lectures again and rereading the book in light
of what Abelson and Sussman emphasize. Their restaurant runs specials, but
they ain't free. It's only because the wizards are so junior that they don't
immediately see Hogwarts' dark side.

~~~
abecedarius
Can you say how the reading changes in light of the videos? I skipped the vids
because I'm hard of hearing, but _maybe_ it'd be worth the trouble to look
again with auto-transcription.

~~~
brudgers
_SICP_ became less a book about programming techniques; I now see its central
theme as sound engineering design, in the vein of McConnell's _Code Complete_.

------
bane
Does the movie "Sneakers" count?

------
zem
I've made two attempts to work through CTM, and fell by the wayside each time.
I should pick it up again and just read it through without trying to do the
exercises.

------
DannoHung
I like the idea of matching reference-y books on computer subjects with
speculative fiction works.

Anyone have an idea about what would pair well with Neil Gaiman?

------
wiggumz
Is computer science really a science? Is there any part of the computer
science community (academia or otherwise) that is using the scientific method
and experimentation?

~~~
Apocryphon
Computer science was a fraud. It always had been. It was the only branch of
science ever named after a gadget. He and his colleagues were basically no
better than gizmo freaks. Now physics, that was true science. Nobody ever
called physics “lever science” or “billiard ball science.”

The fatal error in computer science was that it modeled complex systems
without truly understanding them. Computers simulated complexity. You might
know more or less what was likely to happen. But the causes remained unclear.
- Bruce Sterling, The Zenith Angle

~~~
kleer001
> named after a gadget

I disagree.

Firstly "computer" used to mean the people that did computations, so, that
untethers computation from the material doing those computations. And
secondly, if you buy into the philosophy of it, computation is all around us,
binds us in a way like the force in Star Wars.

~~~
JadeNB
> Firstly "computer" used to mean the people that did computations, so, that
> untethers computation from the material doing those computations.

But it _is_ called _computer_ science, who / whatever is doing the
computation, not _computation_ science, so I'm not sure that (either part of)
your response applies.

Actually, I disagree with the grandparent (hence, I suppose, with Sterling)
differently: I think that it's rather common to name sciences after gadgets,
depending on how flexible you are about what is called 'science'. The first
example that came to mind, just because I have a colleague who works on it, is
cryo-electron microscopy ([https://en.wikipedia.org/wiki/Cryo-electron_microscopy](https://en.wikipedia.org/wiki/Cryo-electron_microscopy)).
It's fair to argue whether that's really a 'science' as opposed to just a
'technique', but I'm confident that there are other examples that are more
clearly 'sciences'.

------
eli_gottlieb
If you think SICP is about as scifi-flavored as actual scifi, you have
seriously narrow horizons.

