
Systems Past: The software innovations we actually use - vilhelm_s
http://davidad.github.io/blog/2014/03/12/the-operating-system-is-out-of-date/
======
tikhonj
> _FORTRAN’s conflation of functions (an algebraic concept) and subroutines (a
> programming construct) persists to this day in nearly every piece of
> software, and causes no end of problems._

This is exactly what Haskell solves. Not by _eliminating_ subroutines, but by
separating the two concepts out again. In particular, functions _are_
functions in the algebraic sense, and subroutines just become values in the IO
type.
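To make that concrete, here is a minimal sketch (the names `square` and `greet` are made up for illustration): `square` is a function in the algebraic sense, while `greet` is a subroutine, i.e. an ordinary first-class value of type IO ().

    module Main where

    -- A pure function: same input, same output, no effects.
    square :: Int -> Int
    square n = n * n

    -- A "subroutine": a first-class value describing an effect.
    -- Nothing happens until the runtime executes it.
    greet :: String -> IO ()
    greet name = putStrLn ("hello, " ++ name)

    main :: IO ()
    main = do
      print (square 7)   -- applying a function
      greet "world"      -- executing a subroutine value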

> _Tracing compilers scratch the surface of reversing this mistake, but so far
> I know of no programming languages that are specifically designed around
> such a mechanism._

I'm not sure why he thinks tracing compilers help rectify the issue. Perhaps
they can claw back some performance gains from knowing a function is pure, but
static compilers can do that too.

And I think it's fair to say that Haskell was basically designed around "such
a mechanism"; it's a shame the author doesn't know about it.

> _ISWIM (which some programming language histories identify as the “root” of
> the ML family) is based on ALGOL 60 (source), which of course is based on
> FORTRAN_

While ISWIM was _influenced_ by ALGOL, it was _based_ on the λ-calculus. And
the λ-calculus, of course, precedes Fortran—and, in fact, computers in
general—by quite a margin! It was originally developed in the 1930s.

Many modern functional languages are just thin layers over a typed λ-calculus:
Haskell's Core intermediate representation is actually _exactly_ that extended
with some low-level primitives. This means that they are far closer to formal
logic than they are to Fortran!
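If you want to see this for yourself, a hedged sketch: compiling any small module with GHC's `-ddump-simpl` flag prints the Core it actually compiles, a small, explicitly typed λ-calculus (System FC) plus primitives. The module and file name below are just for illustration.

    -- Compile with: ghc -ddump-simpl Double.hs
    module Double where

    double :: Int -> Int
    double x = x + x
    -- In the dump, `double` appears as an explicitly typed lambda
    -- term, with the Num dictionary and type applications spelled
    -- out; the exact shape varies by GHC version and -O level.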

~~~
cscheid
> functions are functions in the algebraic sense

This is a minor nit, but there are effects in pure Haskell functions, namely partiality and non-termination. (In other words, the sense in which "functions are functions" is actually a deep question.)

There's plenty of academic discussion of how to solve this problem. See, for example: [http://lambda-the-ultimate.org/node/2003](http://lambda-the-ultimate.org/node/2003)
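To make the nit concrete, a small illustrative sketch (the function names are made up for the example): both definitions below are well-typed, "pure" Haskell, yet one is partial and the other never terminates.

    module Partial where

    partialHead :: [a] -> a
    partialHead (x:_) = x     -- no clause for []: partial

    spin :: Int -> Int
    spin n = spin (n + 1)     -- well-typed, but diverges on every input

The usual semantic fix is to read Haskell types as domains with a bottom element; total languages like Agda or Idris instead reject such definitions outright.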

~~~
efnx
I always thought that algebraic functions are not guaranteed to be defined for every input. It's just that perfectly algebraic functions don't throw errors; they silently return ±infinity, like at the asymptotes of `tan x`.

~~~
anon4
To be pedantic, tan doesn't have a value at tau/4 (or pi/2, if you swing that way). Also, algebraic functions don't return; they simply are: cos(0) is 1. It doesn't return 1, it doesn't compute 1, it is not a kind of computer, nor a kind of program, nor any kind of thing that consumes resources and time and returns a value; it really, literally, is 1.

Algebraic functions are just syntactic notation. You can sit down and convert
from one notation to another, like how cos(5) is a number quite close to
0.28366218546322625, and deriving one representation from the other does take
resources and time, because it's a physical process performed by a person or
computer.

But sin, cos, tan, cotan, log and all their friends by themselves don't
compute, they are just a different kind of notation for numbers.

Which is why I find the desire to make functions in programming like algebraic
functions silly - by definition they are two completely different things. One
is a specification for a process that produces binary-encoded numbers, the
other is a syntactic notation for real numbers.

~~~
theoh
In a purely mathematical sense, if tan(pi/2) doesn't denote a number, what is
its type as a value?

~~~
lutusp
tan(pi/2) is undefined, like 1/0.

[http://math.stackexchange.com/questions/189621/is-tan-pi-2-undefined-or-infinity](http://math.stackexchange.com/questions/189621/is-tan-pi-2-undefined-or-infinity)

------
StandardFuture
Most of the comments in this discussion are missing the point of this article almost entirely. I am seeing everything from "Haskell is the most mathy of the languages and resolves an approach to algebraic concepts!" to "This article is bullshit!" ... Sad, really.

An easy way to summarize what this article is trying to convey can be derived
from the title of the article: Systems Past. The next chapter would simply be:
Systems Future. And this is what the author is trying to get across.

There is nothing wrong with languages or OSes. What's wrong is a seemingly pervasive attitude throughout the hacker community of never wanting to improve on foundational concepts. This is usually argued as: 'if it ain't broke, don't fix it'.

One critique of this article I will give is: these software innovations are
dependent on the hardware architectures used. And we have been using the same
basic computer architecture for decades. So maybe it is not fair to assume
revolutionary systems innovations should happen before we have revolutionary
hardware systems to program?

~~~
davidad_
I am very happy that you get the point! :-)

To address your criticism: Internetworking fundamentally required new
hardware, Interactivity and Hypermedia depended on advances in display
technology, and Virtualization and Transactions benefit substantially from
hardware acceleration. However, the OS, the PL, and the GC were all
independent of any new developments in hardware. Our display technologies are
already way ahead of the computational features they should be able to
support. Same goes for telecommunications. And the hardware acceleration that
powers virtual memory and memory locking is versatile enough to be applied to
more advanced abstractions as well (although the advanced abstractions might
later benefit from more advanced acceleration).

I spent a few years at MIT trying to design revolutionary hardware systems and
left with a deep respect for Intel. Much as I'd like to have a PC based on the
Lisp Machine or the Connection Machine, I've come to believe it's we software
folks who really aren't keeping up, rather than any kind of stagnation in the
hardware world.

In fact, Intel comes out with a whole pile of new machine instructions every
other year, and they probably never get invoked once on most PCs: most
binaries are effectively compiled for AMD Opteron (the first x86_64 processor,
released in 2003) so that they'll run seamlessly on anything since then.

------
gumby
This is an outstanding essay -- more fields need this kind of thinking. Plus
it's doubly ironic having it in computation, a field that has always seemed
determined to ignore history, and reinvent it.

A nano nit: Yes, GC came from Lisp, but its first mention was in AI Memo #1
(the first MIT AI Lab working paper) by Minsky. I have a copy someplace -- it
was only a few pages long.

~~~
grinich
I would love to read that if you could post it.

~~~
DougMerritt
I found a PDF (page images):

[http://www.softwarepreservation.org/projects/LISP/MIT/AIM-00...](http://www.softwarepreservation.org/projects/LISP/MIT/AIM-001.pdf)

------
netdog
See David Wheeler's page, The Most Important Software Innovations: [http://www.dwheeler.com/innovation/innovation.html](http://www.dwheeler.com/innovation/innovation.html).

The page has been online and refined for 12 years. It lists things such as the
Stack, Packet-Switching Networks, Spelling Checker, Relational Model and
Algebra (SQL), and quite a few other useful and important software
innovations.

~~~
MaysonL
[http://www.dwheeler.com/innovation/innovation.html](http://www.dwheeler.com/innovation/innovation.html)

------
mneary
Considering the idea of a programming language a fundamental innovation derived from FORTRAN disregards earlier concepts, like Gödel numbering from 1931, which already exhibit language interpretation. I guess what I'm trying to say is that every good idea is closely related to countless others; pigeonholing them into only 8 categories and naming a "first" doesn't do justice to all of the interesting ideas of computation.

------
zanny
> Virtual memory should have been extended to network resources, but this has
> not really happened.

I get that we are already operating in heterogeneous virtual-memory worlds, but network transactions are _so_ slow. I can't see it being useful to have them as virtual addresses if random reads and writes to network space take literal seconds of round trip. That is so much worse than even disk; there's a reason networking is, at most, bound to a virtual filesystem, and otherwise just its own thing above that via URIs.

It really pokes holes in the von Neumann model of memory when memory has wildly disparate access times. You can have networked devices behind drivers (like printers) that do have those huge round-trip times, or you can have cache hits that give you single-digit-cycle retrieval. It is NUMA before you even get to the hardware version.

> Reject the notion that one program talking to another should have to invoke
> some “input/output” API. You’re the human, and you own this machine. You get
> to say who talks to what when, why, and how if you please. All this software
> stuff we’re expected to deal with — files, sockets, function calls — was
> just invented by other mortal people, like you and I, without using any
> tools we don’t have the equivalent of fifty thousand of. Let’s do some old-
> school hacking on our new-school hardware — like the original TX-0 hackers,
> in assembly, from the ground up — and work towards a harmonious world where
> there is something new in software systems for the first time since 1969.

So if you had a machine executing code - without an operating system - it would need to pull in functions from disk or somewhere whenever they get invoked by another program? And it would need some means to deduce which function that is, via some mechanism that scans the filesystem to find it. You need to discretize out "programs" because each one is inherently insular in its world view. So you just execute dynamic code that invokes other dynamic code.

That sounds like a real big performance hit, though: an indirection on every function call to check whether it's actually resident in memory or just a placeholder - at least a conditional every time saying "is this function at 0x00? then it needs to be looked up!".

~~~
mcguire
Virtualization and network resources have a long and glorious research
history, although most, if not all, of the approaches are not currently in
fashion. (And maybe there's a good reason for that?[1])

"Distributed memory" is one example. Once upon a time it was a big deal. I
suppose that things like iSCSI could be regarded as an application, but that
is the only use I know about that's at all recent.

"Remote procedure calls", "distributed objects" and stuff like that could be
seen as "virtualization", if you like. I suspect all of the major advocates of
these have either recanted or died off---at least I hope so. When I'm wearing
my network-protocol-guy hat, I hate these things with the fiery passion of a
thousands suns.

[1] See [http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.7628&rank=1](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.7628&rank=1), "A Note on Distributed Computing" by Jim Waldo, Geoff Wyant, Ann Wollrath, and Sam Kendall, from Sun ca. 1994, for some early, clear, and correct reasons why making remote things look like local things doesn't work as well as you might think.

------
ariwilson
Haven't people been communicating information to other people as text for
thousands of years? I'm not so convinced that there's a universally better
mechanism for communicating with machines.

~~~
Millennium
Text isn't as important as language itself, but there's the rub: no one has
yet devised a better way to program computers than language. Even the so-
called "visual environments" we see today ultimately boil down to recognizable
linguistic concepts: they do it by arranging shapes in space rather than
writing text, but you can pick out the same nouns, verbs, and other parts of
speech. And that's the problem. Visual environments are trying to represent
language, and they're just not as good at it as text, which is the closest
thing to a native format yet devised for the stuff. Eventually the
inefficiencies start to get annoying, people go for the greater efficiency of
text, and the visual environment languishes.

The bottom line is that visual environments will never be as good as text for
representing programming languages. Rather than trying to do language better
than text, they need to find a paradigm that's better than language, and
implement that. I don't have any clue what something like that would even look
like; to be honest, I'm not sure it can even be done. But the greatest
victories against the most hopeless-seeming odds are won by changing the whole
nature of the fight, and that's what visual programming is going to have to
do.

~~~
andrewflnr
What, precisely, do you mean by language? Machine language doesn't exactly
have grammar the same way that human language does, and I can't interpret a
concatenative language that way either (which is much higher level). We're not
going to get away from the idea of a linear sequence of bits/symbols, but
beyond that almost anything goes, and indeed probably has gone.

------
owenversteeg
> Every programming language used today is descended from FORTRAN

In fact, not everything descended from FORTRAN. COBOL was heavily influenced by FLOW-MATIC, which descended from the A-0 System. All three of these were created largely through the work of Grace Hopper, commonly called "the mother of COBOL."

~~~
beagle3
APL did not descend from Fortran either; it wasn't even designed as a programming language at first, but rather as a standard notation for algorithms.

------
sanxiyn
For some truly innovative operating system and programming language, I
recommend everybody to go learn about Urbit.

[http://www.urbit.org/](http://www.urbit.org/)

~~~
kabdib
Urbit made my head hurt. Interesting, but in an "I don't want to go there and
do real work with it at all _ever_ " way.

~~~
urbit
Edith Piaf said it best:
[https://www.youtube.com/watch?v=fFtGfyruroU](https://www.youtube.com/watch?v=fFtGfyruroU)

------
ATLobotomy
Reminds me of the "Future of Programming" presentation given by Bret Victor
[0]. Bret's talk is much more focused on the concepts that were created during the early period of CS but abandoned (more or less) over the years, rather than on the major concepts that have persisted.

[0]: [http://vimeo.com/71278954](http://vimeo.com/71278954)

------
TrainedMonkey
"ARPAnet is the quintessential computer network. It was originally called “the
Intergalactic Computer Network” and ultimately became known as simply “the
Internet”."

Awesome, I did not know that.

~~~
fjarlq
It was something J.C.R. Licklider envisioned in the early 1960s when he was
the first director of the Information Processing Techniques Office at ARPA.

[https://en.wikipedia.org/wiki/Intergalactic_Computer_Network](https://en.wikipedia.org/wiki/Intergalactic_Computer_Network)

M. Mitchell Waldrop delves into the history in his wonderful book about
Licklider titled "The Dream Machine". Here's chapter one:

[http://www.nytimes.com/2001/10/07/books/chapters/07-1st-waldr.html](http://www.nytimes.com/2001/10/07/books/chapters/07-1st-waldr.html)

Here's an example of a memo written by Licklider discussing the Intergalactic
Computer Network:

[http://worrydream.com/refs/Licklider-IntergalacticNetwork.pdf](http://worrydream.com/refs/Licklider-IntergalacticNetwork.pdf)

------
HaloZero
Shared memory seems dangerous because you assume that both actors are behaving correctly. I always thought the reason programs are isolated wasn't that programmers were lazy, but to ensure that a malicious or badly written program can be contained.

~~~
marcosdumay
Well, if that's the goal, it failed completely. If you run a malicious program once, it can spread to all your data.

Isolation was a protection against badly written programs, but it was mainly about simplifying things and increasing the (virtual) memory available.

------
teddyh
Is it just me, or is #2 (Operating System – running separate programs
concurrently independent of one another) effectively the same as #6
(Virtualization)? It is the same idea – the programmer can pretend that the
program has a machine all to itself.

~~~
thepicard
A modern OS is definitely a virtual machine, where each process perceives that
it is running on a single CPU with its own single contiguous bank of memory.
Threads are a bit of a leaky abstraction but whatever.

What is interesting is that the operating system virtualizes a machine that
doesn't actually exist: fake "hardware" that can execute syscalls like
read/write/exit. A VM in the contemporary sense has the exact same
functionality, with a different interface. Rather than read/write as syscalls,
you have to send SATA commands to disk, or commands to a network card, or
whatever. Instead of an exit system call as an interface you work with a
hardware interface that powers down the physical machine.

Containerization is actually a logical next step from this. Why virtualize a
REAL hardware interface only to virtualize a fake one on top of it? The only
reason to do that is if you want multiple fake interfaces, e.g. Linux and
Windows. When virtualizing a bunch of Linux machines, mostly you really just
want isolation of your processes. Virtualizing real hardware is a hack because
Linux was not capable of isolating processes on its own, so you had to run
multiple copies of Linux! Now with cgroups and other resource namespacing in
the kernel, it can isolate resources by itself.

~~~
teddyh
The fact that an OS supplies system calls is mostly irrelevant – it is a
separate concept (not listed in the original article) which we usually call
“Software Libraries”. But innovation #2 did not list the standard libraries as
a point of an Operating System – the process isolation is the point. Libraries
had been in use long before.

I definitely agree that hardware virtualization is going the long way around,
and that more refined process isolation is the way to go. The Operating System
was made for this, and it should continue to do this; there is no
architectural need for an additional level of isolation.

------
leggo2m
Didn't know what to expect going into this with a title like that, but was
pleasantly surprised. A legitimate list.

------
pinealservo
I wholeheartedly agree with the sentiment expressed by the introduction to
this article. We really do seem to have got stuck in a deep rut, where we can
make progress laterally but can't seem to come up with anything truly novel to
move the state of the art dramatically forward.

I have some issues with the style of the rest of the article, though. It contains a lot of very interesting, thesis-supporting facts, but they are couched among a lot of arbitrary claims ("only 8 software innovations...") that don't seem very well supported on their own.

I mean, yes, you say there are eight and then list eight, but I am not left
convinced that those are the ONLY eight. You say that all languages (aside
from a bit of backpedaling in the footnotes) are descended from FORTRAN, which
is a pretty bold claim to make, but the justification you provide seems to
reduce "descended from" to a mostly meaningless "probably borrowed some ideas
from" that is hard to base any value judgement on. Surely not all ideas in
FORTRAN were misguided!

The whole rest of the article continues in this pattern, distracting from basically good points with brash and sometimes bizarre (from my perspective, at least) statements that seem to betray a wide but very spotty understanding of computing history. Granted, that history has been chaotic and not terribly well documented, but that ought to give one second thoughts about writing with such a definitive and dismissive tone.

I want to repeat that I agree with the general premise, and I think that it's
unfortunate that I came away from the article feeling like I _disagreed_ with
it due to the problems noted above. I had to re-read the intro to remember the
intent. Hopefully this criticism is accepted in the constructive sense in
which I offer it, as I think that there's some great insight there that could
be more effectively conveyed.

------
mwcampbell
On the one hand, the article challenges us to question established ways of
doing things. On the other, the first footnote correctly points out some
projects that were economic failures because they were technology for its own
sake rather than providing something of value to people.

Some of us may have the freedom and the desire to hack on things that are
destined to be economic failures. But for the rest of us, I think it's more
important to err on the side of technologically conservative but economically
successful projects. So, most of us, myself included, will continue to work
within the context of established programming languages, operating systems,
and other groundwork that has already been laid for us.

------
cliveowen
A very well-written article that promotes thinking, or better yet, re-thinking. The main point to take away is this: don't take for granted the current, commonly used constructs and architectures; they're the result of decades of tradeoffs designed to tiptoe around the technology constraints of their day. Today most of those constraints are long gone and the assumptions don't hold anymore. If we could just forget the bad parts instead of accepting them as gospel, and use these five decades of experience to build something new and better, maybe we could finally stop our current methodologies from curbing our progress.

Kudos to the author.

------
nzp
Interesting article, but I think that the basic premise, that it's "bizarre" that we're still using concepts developed 50+ years ago, is a bit naive. To give an analogy from a different engineering field, the basics of rocketry and space travel were conceptually almost completely worked out nearly 100 years ago: multistage rockets, orbital stations, and so on. Should it be considered bizarre that we still use those same concepts and mechanisms to fly into space? I don't think so. People figured out an optimal (sometimes the best or only possible) method of doing something, and we're using it. Sometimes a better idea isn't possible because a better method can't exist. Some ideas are simply timeless.

One of those ideas, I believe, is the expression of programs as text. It wasn't even a distinct idea; it's just the most efficient, natural way to express algorithms. You can't get around the fundamental mathematical fact that you need a formal symbolic system to express algorithms, i.e. you need a language. Until we gain the ability to directly interface our brains with computers we'll need to express language in written symbols, and even then I doubt we could get away without text for cognitively expensive activities such as programming (because of the limitations of our working memory, etc.). "Lines of text" are anything but limiting.

Some of what the article presents as drawbacks of operating systems I don't think are drawbacks at all. Having the OS lie to programs so that they don't have to know irrelevant details of the machine is a really _good_ thing.

> But when it comes to what the machine is actually doing, why not just run
> one ordinary program and teach it new functions over time?

What?! You mean like one monolithic piece of code doing everything ranging
from memory management to email and multimedia? I must be missing something,
am I stupid and just don't understand the proposal?

> Why persist for 50 years the fiction that every distinct function performed
> by a computer executes independently in its own little barren environment?

Because it's a good idea, it reduces complexity for the function (program) in
question.

> A righteous operating system should be a programming language.

Like we had with some early PCs where you essentially had a BASIC interpreter
for an OS? That concept got replaced because it was a horrible way for humans
to do actual work instead of dicking around all day with toy programs.

> Let’s do some old-school hacking on our new-school hardware — like the
> original TX-0 hackers, in assembly, from the ground up — and work towards a
> harmonious world where there is something new in software systems for the
> first time since 1969.

While I have nothing against assembly (to quote Michael Abrash: "I happen to
like heroic coding."), first, I find the idea of regressing to old methods of
producing programs to yield new ways of computing a little strange, and
second, there's a good reason assembly isn't used unless necessary--it's a
horribly unproductive way to solve problems. Unless the assembly in question
is Lisp. ;) Or Haskell. So if we're dreaming, let's dream all the way--we need
pure functional computing machines, not just "better mouse traps".

~~~
marcosdumay
>> A righteous operating system should be a programming language.

>Like we had with some early PCs where you essentially had a BASIC interpreter
for an OS?

I think that line relates to the graphical programming language mentioned at the beginning of the article. If so, no, it's nothing like the old BASIC interpreters; it's more like making GUIs out of interoperating modules.

------
merak136
Am I the only one impressed by how long some of these technologies have been around? I knew about most of the common ones, such as the internet and FORTRAN, but I did not realize how long markup languages have existed. I'm also struck by how much the technology improved in such a short amount of time. That must have been an exciting time to work in the field.

------
sixdimensional
I find that in the pursuit of the latest and greatest, a pattern we so often see in society, it is easy to forget to let our experience of the past inform our current and future thinking. There is a place for both: pure forward-thinking creation as well as innovation inspired by the past.

------
akkartik
Only? No mention of viewport clipping? Compression? Encryption? Bittorrent?

~~~
TheZenPsycho
would the way we use computers be completely unrecognisable without those
things?

~~~
jagger27
What would the internet look like without encryption?

~~~
TheZenPsycho
so wait. are you saying encryption… _encryption_ was invented after 1970?

~~~
Pitarou
_Asymmetric_ encryption (public-key cryptography) was developed in the 1970s.
The internet would be very different without it.

------
agumonkey
This reminds me of golang vs brand-x
[http://cowlark.com/2009-11-15-go/](http://cowlark.com/2009-11-15-go/)

------
pdonis
If this writer is so convinced that we need a new way of interacting with
computers, why isn't he building it, instead of just writing about it?

~~~
TheZenPsycho
Because software actually is kind of hard, and to get anywhere you need to
convince a somewhat larger group than one to all work towards the same goal.

And how do you know he is not?

~~~
davidad_
Yes. I am working on it, but it is "kind of hard" to do by oneself. Also,
writing about it is a good step, regardless of who might be convinced or not,
simply because it forces me to get my ideas more straightened out.

~~~
andrewflnr
I'm quite curious where you intend to go with the "mesh" project on Github.
"An operating system with the heart of a database" sounds like some of my
ideas, as does your doc/index.md, but it seems to stop there.

------
javajosh
You want to throw out the OS and programming languages? You think that text is
a poor interface for specifying machine behavior? Then show me. Show me
something real. Perhaps not text (since text is a dirty word, right?) but
something I can install on my machine.

If you're going to pull an Emperor's New Clothes, then it's not enough to
loudly (and snarkily) proclaim that the emperor has no clothes on - you need
to produce a naked emperor.

Heck, it should be easy, right? If you think that the conventions of the last 50 years are all shit, then how hard could it be to come up with some new ones? If you want to shake the pillars of computer programming, you need to be able to do more than say "everything is crap," and if you can't, then you sound like nothing more than a surly, precocious, ignorant teenager.

~~~
secstate
I'm sad that you got downvoted, though I have a feeling it's simply your choice of language, as your points are spot on. This is a wonderfully curated list of where computers have come from. But to go further and claim that we're all somehow fools for not exploring new ideas, without offering any overview of what a new idea might look like, is disingenuous.

Plenty of artists can dream up interfaces like in Minority Report, Tron, or
Hackers, but to actually build something that achieves a human goal is much,
much more difficult.

For what it's worth, I think the Self object environment, Squeak Smalltalk, and Flow programming are beginning to approach the paradigm shift the author hopes for. But in the meantime, I'm communicating with all of you in words, so I might as well communicate with my computer in them as well.

~~~
gfodor
Honestly, I feel like this is a viewpoint that almost every programmer arrives at after about 10 years of doing real work. The difference is tact. Some hack out experiments in new programming paradigms, some post detailed ideas or concepts for how things could be improved, and others just try to gain street cred by declaring that everything is obviously shit and we are all fools for not "fixing" things.

~~~
secstate
Well said, sir. Worth noting that this overview of where we've come from and how far we haven't come was posted by a fellow who is clearly a genius and whose most popular GitHub project is written in x64 assembly. So I'd argue that even he sees the merit in the past when attempting to blaze a trail into the future.

For one thing, as long as our computers are binary, they are going to require instructions in a very specific form, and anything we put on top of them will expose that architecture to one degree or another.

~~~
davidad_
My entire article is about things that happened between 1955 and 1969. The
part where I state my thesis literally concludes "With a solid historical
perspective we can dare to do better." I'm not sure how much more obvious I
could make it that I "see the merit in the past when attempting to blaze a
trail into the future".

It's the _present_ whose merit I find lacking.

------
wes-exp
The author forgot node.js! It has... modules!

------
greatsuccess
Quite a load of shit. Those innovations are now taken for granted, and the "this was then", "this is now" comments are where you see the leakage in the arguments.

I'm not going to quote anything from this to dismiss it. It's simply wrong, a bit extreme, and out of context. You would have to be an idiot to buy the entire premise.

~~~
TheZenPsycho
This comment is a load of shit, and since you're happy to dismiss an entire essay without any backup, evidence, or argument, I can just as easily dismiss your comment. Hitchens's razor.

------
mildtrepidation
_[Transactions] enabled the development of systems called databases, which can
reliably maintain the state of complex data structures across incessant read
and write operations as well as some level of hardware failures._

This is interesting. It suggests to me that either the author is using a nonstandard or outdated definition of "database", or that transactions, a tenet fundamental to the very definition of "database", have been left for dead by the side of the road by many modern "database" projects.

Regardless, this is a great and informative piece.

~~~
ArkyBeagle
How is transactional integrity an outdated concept? And you are actually
telling me that there are "databases" that _don't_ have transactional
integrity?

~~~
eropple
MySQL with MyISAM tables.

~~~
ArkyBeagle
Ah. Thanks.

