

Talk with Rich Hickey: Time is the New Memory - alrex021
http://www.artima.com/articles/hickey_on_time.html

======
vdm
I think the Date class is an unfortunate example to use; it's too prone to
being confused with the idea of a timeline of mutable state. A better example
might be a document, parts of which don't change while you're reading it. On
the other hand, his audience will always be familiar with the Date class...

I've been following Rich's work for a while and I felt myself almost slip into
confusion.

~~~
thunk
I think he chose it specifically because of the pleasing self-similarity --
it's sort of a joke. After all, what kind of problem is a mutable date? A time
problem, of course.

------
gruseom
What does this add to the discourse about mutable state? Is there any genuine
insight here, other than "hey, it's kind of like time"? If so, it ought to
give rise to some new construct or technique that helps solve the problem, or
even just a simplification of some well-known example.

I guess he's trying to popularize the first principle of functional
programming, but he's doing it in a confusing way that will inevitably breed
further confusion. Look at the absolute silliness of the title for evidence of
that.

~~~
fauigerzigerk
What's your issue with the title? We did manual memory management in the past
and we do manual time management now.

I agree with him completely that managing the timing of parallel processes is
the most pressing complexity issue in computing today.

~~~
gruseom
Absolutely no one uses the phrase "time management" in this way. "I found a
bug in my time management." <- Huh?

That mutable state has something to do with time is a tautology. Nothing in
the recent discussion shows it to add anything significant to what we do or
think about the problem. The interview title takes this doodling and amplifies
it into some sort of grand theory. I'd be embarrassed.

I'm not criticizing Clojure at all, I'm criticizing this way of drawing
attention to it (if that's what it is).

~~~
richhickey
Yes, there is a tautology, and I'm glad you understand it, but I assure you
such understanding is not widespread.

You are right, no one says "time management". I don't think they said "memory
management" as often either, until there was _automatic_ memory management
(GC). But one thing is certain, most OO languages don't yet offer automatic
_____ management, whatever it is to be called.

People do say they have a "race condition" or a "deadlock", or simply "data
corruption" often without connecting it at all to the unmanaged mutation of
their objects. And before there was GC there was just "heap corruption",
"double deletes" and "memory leaks". Management implies a manager, and there
isn't one yet, so why would anyone say it? What I am advocating is that
languages provide automatic management of "time". It's not a novel idea, yet
it is still missing, and I think it is missing because there is no place to
put it in OO strategies which conflate identity with state, with weak or
missing notions of object values.

But the phrase "mutable state" is broken, in a way that keeps people from
understanding the problem and especially the potential solution. As soon as
you consider state the value of something at a point in time, then mutation
isn't the correct term, succession is, and the connection to time becomes
clearer.
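A small Java illustration of succession versus mutation, using the JDK's mutable java.util.Date and the immutable java.time.LocalDate as stand-ins (my example, not one from the interview):

```java
import java.time.LocalDate;
import java.util.Date;

public class Succession {
    public static void main(String[] args) {
        // Mutation: one identity whose state is changed in place.
        // Any reader holding d sees its value change underneath it.
        Date d = new Date(0L);
        d.setTime(86_400_000L);

        // Succession: values are immutable; "change" produces a new
        // value, and the old one remains a valid snapshot of a point
        // in time.
        LocalDate day = LocalDate.of(2009, 9, 22);
        LocalDate next = day.plusDays(1);
        System.out.println(day);   // 2009-09-22 (unchanged)
        System.out.println(next);  // 2009-09-23
    }
}
```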

FYI, I neither wrote nor uttered the title. The interview was a recording of a
casual, no prep conversation following my keynote. It was not meant to be
profound, nor an advertisement.

~~~
gruseom
I agree with much of this, but "time management" still seems like a half-baked
metaphor for FP immutability and not a clarifying model, at least not yet.
It's easy to explain the X to which "memory management" refers: allocating and
freeing memory. Surely that was easy to understand even before the advent of
GC. It's also easy to explain what memory is and point to where and how it is
manipulated in code. I don't see how time is a programming construct in the
way that memory or state is.

~~~
cschep
Isn't time a product of memory and state?

------
reginaldo
Funny that Rich used the number 42 as a recurring example in the interview.
After all, 42 is "the answer to life, the universe and everything" (really,
just search on google), which basically is what he is trying to find.

------
10ren
I wonder if there is an entirely different way to reason about computation,
other than mutable state, which coincides with our intuitions based on
everyday experience of the physical world; and immutable state, which
coincides with the powerful and abstract approach of our mathematics.

Alas, mutable and immutable would appear to cover all the cases.

~~~
lucifer
The issue isn't computation, it is modeling. The next order of evolution for
computation is quantum computation. What R Hickey is addressing is modeling.

I think the modeling problem could be alternatively addressed by unlearning
the notion (conceit?) that every program is a representation/simulation (which
is where (domain) state creeps in) and instead embrace a
descriptive/measurement approach. Or more precisely, to accept that a program
is always orthogonal to the domain model and give up chasing after the holy
grail/folly of seeking a paradigm to express "reality" in computer
instructions.

One could argue that this is what FP is all about, but the suggestion here is
that the issue is fundamentally the conceptual view of the programmer, and it
can apply equally to OO or any other programming paradigm.

For example, the "Date" case he outlines conceptually works if one is to
suspend disbelief and accept the notion that a "Date" type object is really a
point in time. "How can September 22, 2009 change?" But at some level this is
just semantics. Would calling the "Date" object "DateDatum" cause a similar
conceptual problem?

    DateDatum lastUpdate = now();
    // ...
    doSomething();
    lastUpdate = now();

~~~
richhickey
> DateDatum lastUpdate = now (); //... doSomething(); lastUpdate = now();

Yes, this highlights the crux of the argument:

    MutableDate lastUpdate = now();
    doSomething();
    lastUpdate.changeInPlace();

versus:

    ImmutableDate lastUpdate = now();
    doSomething();
    lastUpdate = changedVersionOf(lastUpdate);

In the first case, when someone reads lastUpdate they get a chunk of memory
that could change right out from under them, with no clear recipe as to how to
obtain a consistent value. In the second case, they get a consistent value
they can use without worrying or synchronization. The first conflates identity
and state, the second separates them.

Now, all is not right with the second case, as there still aren't coordination
semantics for putting new values in lastUpdate:

TransactionalRef<ImmutableDate> lastUpdate = now();

or

AtomicRef<ImmutableDate> lastUpdate = now();

or something similar would fix that. It's not a matter of removing mutable
memory, but one of using it in a way with clear coordination semantics, other
than "good luck with that".
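A minimal, runnable Java sketch of that third form, using the real java.util.concurrent.atomic.AtomicReference in place of the hypothetical AtomicRef, and the immutable java.time.LocalDate in place of ImmutableDate (assumed stand-ins, not the API being proposed):

```java
import java.time.LocalDate;
import java.util.concurrent.atomic.AtomicReference;

public class AtomicSuccession {
    public static void main(String[] args) {
        AtomicReference<LocalDate> lastUpdate =
                new AtomicReference<>(LocalDate.of(2009, 9, 22));

        // Readers get a consistent immutable value; no locking, no
        // risk of it changing out from under them.
        LocalDate snapshot = lastUpdate.get();

        // Writers install a successor value atomically; updateAndGet
        // retries on contention, which is the coordination semantics
        // a "manager" supplies.
        lastUpdate.updateAndGet(d -> d.plusDays(1));

        System.out.println(snapshot);          // 2009-09-22, still valid
        System.out.println(lastUpdate.get());  // 2009-09-23, the successor
    }
}
```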

------
radu_floricica
I just started reading Norvig's PAIP, and one of the things that surprised me
was seeing lisp code written in a non-functional manner. Of course the code in
PAIP evolves and the early iterations are "bad" on purpose, but it's amazing
how obviously wrong some early patterns seemed just because I was used to
Clojure's more functional style.

~~~
lispm
No, they are not 'bad on purpose'. Peter Norvig shows how to get from quick
bottom-up prototypes to optimized code. The purpose is to get a first version
going as quickly and cleanly as possible, and then improve it.

Common Lisp programs make heavy use of side effects, from imperative code and
destructive operations to the CLOS object system.

~~~
radu_floricica
I haven't programmed in Lisp at all, and less than a year in Clojure, so my
opinion doesn't carry much weight, but I did notice that a) imperative code is
harder to read and b) later versions look more functional. So for what it's
worth, I'm happy Clojure makes functional style and immutable data the
defaults.

------
tybris
I'm sure he's a really smart guy, but he sounds like a Computer Science
freshman who is just beginning to grasp the problems of concurrency and
consistency (could be his way of explaining). We already know these things are
hard. No reason to give it a new, vague name like "Time management".

~~~
gizmo
A computer science freshman? I'd rather argue the opposite -- that he's one of
the few guys in Computer Science who comes up with something original AND
builds a language that supports his ideas to see if they work in practice.

Even if you don't think Hickey is on the right track, how can you not like
his combination of hands-on implementation and search for theoretical
soundness?

~~~
jrockway
_that he's one of the few guys in Computer Science who comes up with something
original AND builds a language that supports his ideas to see if they work in
practice._

Not to discount Rich's work, but all of the ideas in Clojure have been well-
known forever, and Haskell and ML implemented these ideas many, many years
ago. Clojure lets you use these ideas with some Java libraries; that's its
innovation.

Also, I don't really get the whole time analogy, but I can't help but think he
wants Comonads, which Clojure does not have. (And due to its type system, will
never be able to support well.)

~~~
richhickey
Yes, Clojure is full of old ideas. I've never claimed otherwise.
Unfortunately, they have yet to gain significant traction as currently
embodied. Clojure is just an attempt to make them palatable and practical (and
fun). I would never argue against someone who preferred Haskell.

As far as comonads, no - monads and comonads are essentially threaded through
a single computation. The reference types I'm talking about are like, and for
similar situations as, Haskell's MVars and STM TVars, which wouldn't exist if,
e.g., state monads were sufficient.

~~~
jrockway
OK, now that you mention MVar/TVar, I think I understand your argument better.
I think MVars and TVars are mostly a performance hack (like most other uses of
state in general, and ST or IO in Haskell specifically). Theoretically, many
programs written in terms of MVars could be written in terms of spawning a new
thread for each read/write pair. This would be easier to reason about, but
probably slower.

A common use of MVars:

    f x = pure computation
    g h m = forever $ hGetChar h >>= putMVar m

    do mvar <- newEmptyMVar
       forkIO $ forever $ liftM f (takeMVar mvar)
       forkIO $ g a mvar
       forkIO $ g b mvar
       <wait for exit condition>

All we do is read from two handles, concurrently, and eval f for every byte on
either handle. The MVar is a natural way of expressing this, but it does lead
to potentially undesirable side effects.

Writing it in a functional way would probably be better:

    f x = pure computation
    g h f = forever $ liftM f (hGetChar h)

    do forkIO $ g a f
       forkIO $ g b f
       ...

There is now no shared state, since the problem doesn't actually require any.
(We could use continuation passing style if we needed to keep state.)

Anyway, I am still not sure how Clojure encourages the reference-less style.

