Six works of Computer Science-Fiction (fogus.me)
164 points by smcgivern on Apr 27, 2015 | 84 comments



The architecture manual for the Intel iAPX 432 processor reads like alternate-world science fiction. [1] This processor came out in 1981 and was supposed to be the revolutionary new thing in computers. It failed and was mostly forgotten, but the world would be very different if it had replaced x86.

The 432 had incredible hype: "The vacuum tube, the transistor, the microprocessor - at least once in a generation an electronic device arises to shock and strain designers' understanding. The latest such device is the iAPX 432 micromainframe processor, a processor as different from the current crop of microprocessors (and indeed, mainframes) as those devices are from the early electromechanical analog computers of the 1940's." [2]

This 32-bit machine had some very unusual features. It implemented support for objects at the hardware level, with access protection on a per-object basis. Even the kernel doesn't have access to everything. The world would be much more secure, with no more buffer-overflow exploits.
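
(To make the idea concrete, here is a rough software sketch of per-object access rights. The names and types are hypothetical, and the real 432 enforced this in hardware through access descriptors rather than anything like C; this is just to illustrate the idea.)

  /* Illustrative sketch only, not the 432's actual mechanism: the chip
     enforced rights in hardware via access descriptors. The point is that
     every reference to an object carries its own rights, and even
     privileged code has no way around the check. */
  #include <stdbool.h>
  #include <stdint.h>

  typedef enum { RIGHT_READ = 1u << 0, RIGHT_WRITE = 1u << 1 } right_t;

  typedef struct {
      void    *object;  /* the protected object */
      uint32_t rights;  /* operations this particular reference permits */
  } access_descriptor;

  /* All access goes through a descriptor; there is no "kernel bypass". */
  static bool can_access(const access_descriptor *ad, right_t needed) {
      return (ad->rights & (uint32_t)needed) == (uint32_t)needed;
  }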

This chip was started before the 8086 and included a virtual address space of 2^48 bytes. It was designed to be programmed entirely in high-level languages. The processor also included garbage collection in hardware, and it supported floating point and multi-processor operation well before x86 did. Part of the operating system was built into the chip; the policies were defined in software, but the implementation was on the chip.

It's interesting to think what computers and programming would be like if the Intel 432 had succeeded instead of x86. We'd probably have super-secure computers and be programming in Ada.

[1] http://bitsavers.org/pdf/intel/iAPX_432/171821-001_Introduct...

[2] https://archive.org/stream/Intel-AR-166UnderstandTheNewestPr...


The iAPX 432 failed for a very good reason: its performance was infamously abysmal. The performance of the iAPX 432 is actually the subject of one of my favorite systems papers of all time: Robert Colwell's "Performance Effects of Architectural Complexity in the Intel 432"[1] -- a paper that I love so much that I wrote a reasonably detailed review of it decades after it was published.[2]

[1] http://us-east.manta.joyent.com/bcantrill/public/colwell-432...

[2] http://dtrace.org/blogs/bmc/2008/07/18/revisiting-the-intel-...


I love studying failure, or at least when systems break. Please do a follow-up that addresses Itanium and the P4.

The failures you describe in the post were technical manifestations of an operational failure in bringing an engineering project to completion. The scope was too large, with too many unknowns. Had they scheduled their project relative to Moore's Law, they could have kept the team smaller until they had the transistor budget to ship the chip they designed. You should read about how the Aztek was made. [1]

[1] http://www.roadandtrack.com/car-culture/a6357/bob-lutz-tells...


> The iAPX 432 failed for a very good reason

From my point of view I guess it actually seems like a terrible reason in the long run, given that even something as widely derided as the x86 architecture has, over time, been made to perform.


Ideas fail and succeed for reasons that are totally unrelated to their "goodness". X86, JavaScript, trains, the bicycle.


Please expand on the bicycle.


The bicycle is an excellent technology that has largely failed in the western world, but not because it is a bad idea. The car economy has more money and more political power, allowing it to displace the bicycle.


The bicycle is doing fine in the western world; it's the transportation of choice for just about every kid and plenty of adults too.


I wasn't making a value judgement on bicycles. Look at the growth curve and adoption rate of bicycles in the western world. The bicycle paved the way for the car. Tubed tires, chains, sealed roads: all for the bicycle. Cities could have been denser, cleaner, quieter, and vastly safer with bicycles. Places like Amsterdam weren't always a cycling paradise; canals were getting paved over to make streets for cars. China, which has had great bicycle adoption, is seeing a state-sponsored push to switch over to a car-based consumption economy.

Only recently has the bicycle seen a resurgence in the west. A small uptick doesn't mean bicycles as a technology have succeeded to the level they should have compared to the alternatives.

http://planyourcity.net/2013/03/15/amsterdam-the-bicycling-c...


Interesting, looking forward to reading them.

But certainly processors could do more than they do. For example, couldn't x86 offer a memcpy or memmove instruction and just decode it the right "fast" way, instead of making people update their stdlib with complicated code?

I don't really know what I'm talking about, but it seems I often run across long discussions where people are trying all sorts of instruction sequences and they're model-specific and it just seems like there should be a few more higher-level instructions exposed. (Well perhaps that's the point of all the SSE and AVX instructions.)
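
(For what it's worth, I gather x86 does have a string-copy instruction, REP MOVSB, and on CPUs with the "enhanced REP MOVSB" feature the microcode turns it into a wide, fast copy. A minimal sketch of a hypothetical helper using it, assuming GCC/Clang inline asm on x86-64; real memcpy implementations still special-case sizes and alignment:)

  #include <stddef.h>

  /* Hypothetical helper: copy n bytes with the CPU's own "memcpy
     instruction". RDI = destination, RSI = source, RCX = byte count. */
  static void *copy_rep_movsb(void *dst, const void *src, size_t n) {
      void *ret = dst;
      __asm__ volatile("rep movsb"
                       : "+D"(dst), "+S"(src), "+c"(n)
                       :
                       : "memory");
      return ret;
  }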


Talking of alternative-world: I think anything by Chuck Moore falls in its own world (http://www.colorforth.com).

He even wrote an (unpublished) book about programming whose third basic principle is "Do It Yourself!" (a.k.a. "Invent The Wheel!"; http://www.colorforth.com/POL.htm). That alone moves it into its own universe. If you don't think so, read on to "Make variables as GLOBAL as possible. Why not? You can save some space and clarify your requirements".


But programs in the Chuck Moore universe have a total size measured in hundreds of words; there, globals are like function-level scope. Going in the other direction is code in a purely https://wiki.haskell.org/Pointfree style.


  > It implemented support for objects at the hardware level,
  > with access protection on a per-object basis.
Periodic reminder that Henry Levy's Capability-Based Computer Systems is available in PDF form for free¹.

The 432 was slow, but the i960² was a clean RISC that could have had a future.

¹ http://homes.cs.washington.edu/~levy/capabook/

² https://en.wikipedia.org/wiki/Intel_i960


Allow me to add the http://en.wikipedia.org/wiki/Rekursiv

Another weird attempt from the early '80s. Similar fate. Very surprising company, though.


The 8086 processor was multi-processor capable!


I hoped for actual fiction, like "The Laundry Files" (where computation can summon demons), "The Lifecycle of Software Objects" (about raising AI children), "The Last Question" (a short story by Asimov) or "The Nine Billion Names of God" (about monks buying a computer for religious purposes).

All of the above are highly recommended, by the way.


Greg Egan's "Diaspora"[1] (where most of us literally live in computers) and "Permutation City"[2] (living in a cellular automaton).

Similar themes in Roger Williams's "The Metamorphosis of Prime Intellect"[3] (an AI accidentally takes over the universe).

1: http://gregegan.customer.netspace.net.au/DIASPORA/DIASPORA.h...

2: http://gregegan.customer.netspace.net.au/PERMUTATION/Permuta...

3: http://localroger.com/prime-intellect/


Greg Egan is fantastic. Too many authors just magic stuff into existence all over the place and it leads to large, annoying inconsistencies. Egan's stuff mainly seems to follow the rule of only using magic once, and having the rest of the storyline come from that and make sense/be consistent.

I wish more stories would start off with one magic point, cast one universe-altering spell, suspend my disbelief once, and then just deal with the consequences.

Yeah, I hear people call this "hard sci-fi", but that's not really fitting; it can apply to any fiction. There's fantasy like Harry Potter where the world is just unbelievably inconsistent (as HPMOR loved to point out). Compare that to, say, Mistborn (I don't read a lot of fantasy), which introduces its restricted magic system and more or less deals with it from there.

And the one big change can be huge and unrealistic, too! Like the Culture books: posit that we've got hyperintelligent, friendly AI that can warp many dimensions at will, and the rest fits in more or less from there; but no one would call the Culture books hard sci-fi.


+1 for Diaspora. If you have any interest in transhumanism, this book has one of the most plausible/believable post-singularity worlds I have encountered.


Six tabs later, with a $4 copy of Permutation City on its way to me from Amazon, I had to go back and find what started me down that rabbit hole. Thanks for the recommendation; that sounds fascinating.


If you don't mind highly technical hard SF, Schild's Ladder is also very good. It's not explicitly spelled out in the text, but it serves as a good pseudo-sequel to Diaspora.

Egan's publisher recently ran off a new printing of many books in his back catalog that were hard to find in the US.

http://www.amazon.com/Schilds-Ladder-Novel-Greg-Egan/dp/1597...


Awesome, thanks for sharing. I'm a big hard scifi fan, although I tend to stick to the middle half of the last century (there's just so much good stuff!)


Or Wizard's Bane by Rick Cook, where a programmer gets summoned to a fantasy world and writes a magic compiler in Forth.


Here is one of my favourites: http://www.nature.com/nature/journal/v402/n6761/full/402465a...

(Despite the domain name, it is a science fiction piece. Nature publishes short science fiction under their 'Futures' column.)


And how could I forget: David Langford, "BLIT"[1], and "Different Kinds of Darkness"[2].

> BLIT (which stands for Berryman Logical Image Technique) is a short science-fiction story written by author David Langford. It features a setting where highly dangerous types of images called "basilisks" have been discovered; these images contain patterns within them that exploit flaws in the structure of the human mind to produce a lethal reaction, effectively "crashing" the mind the way a computer program crashes when given data that it fails to process.

1: http://www.infinityplus.co.uk/stories/blit.htm

2: http://www.lightspeedmagazine.com/fiction/different-kinds-of...


Like in "Snow Crash".


Yes, except BLIT was published in 1988 and Snow Crash in 1992.


Or any of Stanislaw Lem's books of "fables for robots" like "The Cyberiad".


> I hoped for actual fiction, like "The Laundry Files" (where computation can summon demons),

Does Stross actually manage to use real theoretical CS in that series?


Words are used in a manner that indicates he clearly knows what they are (which is not surprising given his background), but, generally, I'd say no. There's never been a point in the series where I feel like I super-extra understand something because I have a computer science background. But it's also at least plausible enough that I don't have to turn that part of my brain off.


Another of Stross's short stories, Antibodies [0], examines the consequences if P = NP, a key problem in theoretical computer science.

[0] http://www.antipope.org/charlie/blog-static/fiction/toast/to...


Reminds me of Traveling Salesman (2012) [1], which is also set in a world where P = NP and discusses the ethics of selling the algorithm to the government. (Literally discussing: it's a low-budget four-men-in-a-room movie.)

I can't say the whole film is worth the time, but I really loved how, in the first minutes, it establishes its alternate history with a single sentence: introducing a scientist who "in 2008 was awarded ... the Fields Medal for his proof of the nonexistence of one-way functions".

[1] http://www.travellingsalesmanmovie.com/


Whoa, that was a good read! Thanks, going to buy some books by Stross :-)

[when I said "reminds me of Traveling Salesman (2012)" I just meant examining the consequences if P=NP; that was before I followed your link and was reminded how good sci-fi _should_ be — it absolutely pales in comparison to Antibodies.]


He implies that such things exist. I think he cites something Turing was supposed to have written (in-universe, I mean) that crossed over between CS and demonology. It doesn't go beyond plausible-sounding titles and breezy one-sentence synopses though.

You can read a bit early on in the first book of the series where he talks a bit about this stuff, on Google books, page 17: https://books.google.ca/books?id=GfSGzhDcU2UC&lpg=PP1&dq=atr...


Stross mentions a suppressed volume of Knuth; I can't remember in which book.


I'd forgotten about that. Stross's book shows its age here: it's volume 4. The bit is on page 134: https://books.google.ca/books?id=GfSGzhDcU2UC&lpg=PP1&dq=atr...


> He implies that such things exist.

This looks pretty damning: http://en.wikipedia.org/wiki/Petersen_graph

(Don't email it to cstross, I already did ;-)


There are actual fiction books in the small italics at the bottoms of the blurbs.


Just finished The Atrocity Archives recently. Very entertaining.



> I hoped for actual fiction, like "The Laundry Files" ...

I think this subthread fills that desire nicely. :)


You are like an urban gardener ...


Ken Macleod's The Restoration Game has some fun CS plot twists, especially at the end of the novel...


Yes, it feels much more like Computer-Science Fiction.


I've been looking for good hacker-fi!! Danke!


"these are books of computer science and/or programming that when you read them you can’t quite believe that what they claim is reality."

I feel that part of why these books seem so alien is that most people are taught programming as if it were two different disciplines: "low level" algorithms, with fixed data types, and big-O complexity theory; and "high level" systems design, with type abstraction and object patterns. While a truly skilled programmer must understand both worlds, this sort of model has them separated in the same manner physicists seem to separate general relativity and quantum chromodynamics (or oil and water).

Books on Smalltalk and Forth, like those listed in the article, frequently reveal a mode of programming which is neither purely "high" nor "low" level. Yet despite their non-conformity, neither language is haphazard or capricious in design. Instead, they both seem to embody the unofficial motto of the US Army Engineers: "The difficult we do immediately. The impossible takes a little longer."


I was thinking about this just recently, but instead of likening these kinds of things to sci-fi, I was thinking more along the lines of non-Euclidean geometry, i.e. what if we take something that is considered to be axiomatic and change it? A whole different world emerges.

For example, throw out the notion that memory is volatile - or slightly more practically, what if the price we pay for automatic memory management in our programming languages also bought us abstraction over the volatility of memory? How different would our systems look? For one thing, switching things off and back on again wouldn't be the "cure-all" that it mostly is today.

The fact that we can build systems like Smalltalk tells us that much of our current notions of computing are merely convention, not axiomatic at all.

Smalltalk and Forth are definitely "different convention" things, while SICP and CTM are more like detailed examinations of things that might really be axiomatic, giving us the means of combination, and hopefully the means to imagine building things beyond what our mindset of present conventions would allow.


Quite a bit of memory used to be non-volatile.[1] These days, there's FeRAM[2], but it's not widely used. Power-off not clearing memory wasn't really an issue - you'd just manually key in a bootloader of ~30 machine words that you knew by heart :)

[1] http://en.wikipedia.org/wiki/Magnetic-core_memory

[2] http://en.wikipedia.org/wiki/Ferroelectric_RAM


http://en.wikipedia.org/wiki/KeyKOS was an OS along those lines (but on conventional hardware, treating RAM as cache). It was very different also in being built around capabilities -- a related decision.


'The New Turing Omnibus', along with 'The Magic Machine' and 'The Armchair Universe' by A. K. Dewdney, really appealed to the young (science-fiction-reading) me.

The current (older) me just had their mind blown by 'Self', but there is no book.


The Connection Machine. Nanosystems. The Humane Interface. Cellular Automata Machines.


The author of the 3rd book in the list, Peter Van Roy, teaches a course on edX. :) The course is archived but I think people can access all the videos and notes.

http://www.edx.org/course/paradigms-computer-programming-lou...

http://www.edx.org/course/paradigms-computer-programming-lou...


Thanks for this list. I'm always interested in the might-have-been worlds of computer science. The Smalltalk and the Oberon books sound particularly interesting.


I like the idea of matching reference-y books on computer subjects with speculative fiction works.

Anyone have an idea about what would pair well with Neil Gaiman?


At least two of these are available for free download (as in "not pirated"):

http://thinking-forth.sourceforge.net/

http://www.projectoberon.com/


I've been watching the SICP lectures again and rereading the book in light of what Abelson and Sussman emphasize. Their restaurant runs specials, but they ain't free. It's only because the wizards are so junior that they don't immediately see Hogwarts' dark side.


Can you say how the reading changes in light of the videos? I skipped the vids because I'm hard of hearing, but maybe it'd be worth the trouble to look again with auto-transcription.


SICP became less a book about programming techniques; I now see the central theme as sound engineering design, in the vein of McConnell's Code Complete.


Does the movie "Sneakers" count?


I've made two attempts to work through CTM, and fell by the wayside each time. I should pick it up again and just read it through without trying to do the exercises.


Is computer science really a science? Is there any part of the computer science community (academia or otherwise) that is using the scientific method and experimentation?


This tension derives from the fact that Computer Science is an umbrella term for a set of topics including mathematics, computer systems engineering, and the generally-agreed-to-be-boring, but absolutely necessary, clerical studies of information technology (which have some overlap with the similarly awkwardly named Library Science).

I think we'd all welcome a new, more accurate term, but no one has really come up with a good one. Maybe Applied Computational Philosophy?


Computing Studies


Computer science was a fraud. It always had been. It was the only branch of science ever named after a gadget. He and his colleagues were basically no better than gizmo freaks. Now physics, that was true science. Nobody ever called physics “lever science” or “billiard ball science.”

The fatal error in computer science was that it modeled complex systems without truly understanding them. Computers simulated complexity. You might know more or less what was likely to happen. But the causes remained unclear. - Bruce Sterling, The Zenith Angle


> named after a gadget

I disagree.

Firstly "computer" used to mean the people that did computations, so, that untethers computation from the material doing those computations. And secondly, if you buy into the philosophy of it, computation is all around us, binds us in a way like the force in Star Wars.


> Firstly "computer" used to mean the people that did computations, so, that untethers computation from the material doing those computations.

But it is called computer science, not computation science, whoever or whatever is doing the computation, so I'm not sure that (either part of) your response applies.

Actually, I disagree with the grandparent (hence, I suppose, with Sterling) differently: I think that it's rather common to name sciences after gadgets, depending on how flexible you are about what is called 'science'. The first example that came to mind, just because I have a colleague who works on it, is cryo-electron microscopy (https://en.wikipedia.org/wiki/Cryo-electron_microscopy). It's fair to argue whether that's really a 'science' as opposed to just a 'technique', but I'm confident that there are other examples that are more clearly 'sciences'.


> The fatal error in computer science was that it modeled complex systems without truly understanding them.

And yet, mechanical, civil, chemical, and electrical engineers have made great strides without such understanding, in fields where it is truly impossible to fully understand or model the forces at work, the materials in play, the structures in shape, or the physics in motion.

They manage to get by on approximations, so that's not it.


> Nobody ever called physics "lever science"

Mechanics?


Political Science and Social Science seem to be more spurious than Computer Science. At least areas of computer science deal with physics and/or mathematics (electron tunneling, the dielectric effect, etc.).

I agree that the term is used too broadly. I don't regard myself as a "Computer Scientist". I'd argue that "Software Engineer" is another term that is too loosely applied, although it does have a definition.


"Computer Science is no more about computers than astronomy is about telescopes." -- Edsger W. Dijkstra (origin disputed)


Yeah, well, astronomers don't call it telescope science.


> The fatal error in computer science was that it modeled complex systems without truly understanding them.

It's a nice quote, but science is all about modeling complex systems as a way of understanding them... arguably, human "understanding" of systems is just modeling of them.


In other words, the science part is bogus. It should be called Computing Studies. That way they could keep the acronym CS which everyone seems to like.

Computer science is about as useful a term as polymorphism (borrowed from biology). It serves to put lipstick on a pig.


To counter obvious retorts:

What if the science of physics started with the Grand Unified Theory, knowing what the fundamental properties and rules of the universe were? Would it still be science as we then worked from both ends of knowledge (basics up and complexity down)? How then would it be different from "computer science", where we start both from the basics (0, 1, NOR) and from the complexity of the behavior we see and desire, working from both ends toward the middle for it all to come together and function?


https://www.cs.mtu.edu/~john/jenning.pdf

I'd say yes, and this article is representative of my thinking on the subject (easier than trying to write an essay in a HN comment).


Computer science is a science in the same sense that political science is.


Politics is not defined by logic. Computers are (although "computer scientists" may not be).

You can argue that they're both not sciences, but not in the same sense.


Computer Science borrows methodologies from a variety of disciplines. It certainly uses empirical methods as well as proof due to its roots in mathematics & logic. I don't think you can get much more 'sciency' than that. Sub-disciplines like HCI use methodologies validated in psychology and social sciences.

If you have any doubts please browse a few back issues of 'Communications of the ACM'.


The only disciplines that are supposed to be called sciences are those that use the scientific method.


Is math a science?


No


So, if it's used in physics, do you define it as a tool for applying the scientific method?

If so, do those methodologies in physics also stop being "science" and start being a tool as well?

If you use a Bunsen burner in a method according to the scientific method, does it count as "science"? Or is it just setting fire to things?

Not sure where you're drawing the boundaries. Science is done with tools and methods, and mathematics is one of the most important tools of science, used to define some of the most important methodologies.


[deleted]


The article isn't actually about science fiction.


If you think SICP is about as scifi-flavored as actual scifi, you have seriously narrow horizons.



