Simplicity Matters – Rich Hickey (2012) [video] (youtube.com)
126 points by hharnisch on Aug 28, 2015 | 69 comments



I was at this talk and I disagree with his fundamental statement that simple + simple = simple. I program in Ruby, and one of the biggest mistakes beginners make is not creating complex data structures where they are needed. Instead they pass around hashes of hashes of hashes. Why? Hashes are simple; they're easy to understand and work with. Unfortunately this initial simplicity introduces unexpected complexity: things like deep dup and deep merge are now needed. Every part of your system must now know the hash structure you're using, and logic often gets reproduced in many places, like dealing with optional keys that may be missing. By not isolating the complexity to one part of the app, it must now be shared by the entire app. simple + simple !(always)= simple.
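To make the deep-merge point concrete, here's the kind of helper that hash-of-hashes code ends up reinventing (a hand-rolled sketch; Rails ships a similar `Hash#deep_merge` in ActiveSupport):

```ruby
# A hand-rolled deep_merge: recurse whenever both sides hold a sub-hash,
# otherwise let the override win.
def deep_merge(a, b)
  a.merge(b) do |_key, old_val, new_val|
    if old_val.is_a?(Hash) && new_val.is_a?(Hash)
      deep_merge(old_val, new_val)
    else
      new_val
    end
  end
end

defaults  = { server: { host: "localhost", port: 3000 }, log: { level: :info } }
overrides = { server: { port: 8080 } }

config = deep_merge(defaults, overrides)
# => { server: { host: "localhost", port: 8080 }, log: { level: :info } }
```

A plain `merge` would silently clobber the whole `:server` subtree, which is why every corner of a hash-heavy app ends up needing to know this helper exists.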

If I had a time machine and could make one change in an open source project, it would be to go back and remove the design choice of using hashes as internal config objects in Rails. It makes understanding all the possible edge cases almost impossible unless you're really familiar with the project.

The second is the claim of speed that this "simplicity" buys you. I agree that functional programming is extremely fast when it's parallelized and it's extremely easy to make a functional program parallelizable. When we're dealing with sequential tasks, mutating data is much faster than allocating new data structures. I think clojure helps with the speed of allocating new immutable structures by using copy on write strategies behind the scenes.

I think Rich is an extremely smart and very accomplished programmer. I think functional programming is really good at some things, I don't feel like many people talk about the things it's not good at. To me, if we're not embracing and exploring all of a new concept/language/paradigm's strengths and weaknesses, we're not growing by being exposed to it.


> I think Rich is an extremely smart and very accomplished programmer. I think functional programming is really good at some things, I don't feel like many people talk about the things it's not good at.

As someone who used to swear by Ruby but now prefers FP langs, part of learning FP is as much about learning what FP is good at as it is about learning what FP isn't good at. Given FP's predilection for declarative style and immutability, clearly FP isn't good at domains where imperative styles are important, or at really, really, really[0] CPU-intensive work where mutability is needed.

Although, thanks to compiler research (which is mostly done by FP language users, ahem) languages like OCaml are bridging this gap when you take a look at projects like MirageOS. You get to write your code in a declarative, immutable style that then gets compiled to very fast native code. It's having your cake and eating it, too.

As far as your original complaint of everything in Ruby either being a hash of hashes or a class that just wraps a hash of hashes, I completely agree. That is what drove me away from the language. I'd rather just blow that hash + class relationship up completely. Not to mention, in ML-dialect languages you can use features like Enum/Sum/Product types to define the configuration/syntax of your programs which is a lot better than reading arbitrary keys in a hash, as you have stated.

0 - No, I mean really, really, really, really. "I think I need C to do this fast enough for my use case" is the "I need Cassandra for my 2GB database of 'Big Data'" of the FP world.


>I think clojure helps with the speed of allocating new immutable structures by using copy on write strategies behind the scenes.

No, it's not copy on write in the sense that COW means in other languages. Data in clojure is persistent, so usually even an altered piece of data is not actually copied, only the tiny bit that changed is added, if necessary. This is very different than, for example, Swift that implements copy on write, but the whole data structure is copied even if the change is minor.
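The distinction can be mimicked by hand in Ruby: a "path copy" builds a new outer hash while reusing the untouched subtrees, which is roughly the flavor of a persistent update. (Clojure's actual tries are far more sophisticated; this is just an illustrative sketch.)

```ruby
# Path copying by hand: update one nested key without touching the
# original. Only the hashes along the updated path are new objects.
old_cfg = { x: { big: "unchanged subtree" }, y: { k: 71 } }

new_cfg = old_cfg.merge(y: old_cfg[:y].merge(k: 72))

old_cfg[:y][:k]                  # => 71 -- original untouched
new_cfg[:y][:k]                  # => 72
new_cfg[:x].equal?(old_cfg[:x])  # => true -- :x subtree shared, not copied
```

The `:x` subtree is never copied at all, which is the "zero cost copy" being described: most of the structure is shared between versions.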

Additionally, you say:

>When we're dealing with sequential tasks, mutating data is much faster than allocating new data structures

That would be true in cases where the entire piece of data is new; but so much of what you do in any language involves iterative changes over existing data, and again, because of Clojure's persistence, new allocations are often not happening at all.

...However, I do agree with most of your other points :)


I was introduced to http://hypirion.com/musings/understanding-persistent-vector-... not too long ago on HN. I don't use clojure so I was paraphrasing my somewhat limited understanding. Thanks for the clarification.

That part about speed was more about my general frustration with the "immutable is fast" and "mutable is slow" meme that I hear too frequently. While that can be the case, it isn't always 100% true. Your language and how you use it can play a huge part.

I'm hoping to learn more about clojure data structures in the coming months, so much good stuff.


My guess would be that immutable clojure is still faster than say mutable ruby though.

I don't think anybody is arguing that trie lookup is faster than array lookup. But when it comes to passing data between parts of the system, zero cost copies are much faster than copying buffers or objects.


> Unfortunately this initial simplicity introduces unexpected complexity, things like deep dup and deep merge are now needed.

Not if your data structures are immutable, which is exactly what he is advocating.


Excellent point, I honestly hadn't considered that. I've spent more time dealing with functional programming concepts than with actual languages that have immutable structures. Thanks for pointing that out.

I think my other point still stands, things like optional keys, or missing or required keys still must be shared throughout the app. Everything that touches that hash needs to know its structure.


Having to know the structure of a hash/map/dict is strictly superior to having to know the methods and fields of an object. For any given task the two are equivalent, except that one is easily modified/reused/passed to other callers without creating an extremely brittle class hierarchy you're only going to have to rearchitect when your requirements change. As an IBM researcher put it, looking at Watson waaaay before Watson actually existed: "OOP requires you to treat all extrinsic information as if it were intrinsic to the object itself." This is tantamount to saying you must have a complete and consistent programme design before you write a single line. Which is crazy (and probably the reason UML exists). But don't take my word for it; think it through. It's the only conclusion you can reach, I think.


Yeah, but that structure is also easy to explore and understand.

And in case you want some safety, you can use libraries like prismatic/schema that allow you to declaratively describe your expected data structures, akin to a type system.
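For readers who haven't seen prismatic/schema, a toy Ruby equivalent gives the flavor (`check!` and `UserSchema` are hypothetical names for illustration, not the library's API): declare the expected shape once instead of re-checking keys at every call site.

```ruby
# A toy schema checker: a schema is a hash mapping keys to either an
# expected class or a nested schema. Raises on the first mismatch.
def check!(schema, data)
  schema.each do |key, expected|
    raise "missing key #{key}" unless data.key?(key)
    if expected.is_a?(Hash)
      check!(expected, data[key])
    else
      raise "#{key} is not a #{expected}" unless data[key].is_a?(expected)
    end
  end
  data
end

UserSchema = { name: String, address: { city: String, zip: String } }

check!(UserSchema, name: "Ada", address: { city: "London", zip: "N1" })
```

The win is that the structure is written down in exactly one place, so the rest of the app can stop defensively probing for optional keys.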

The web is built on JSON and not CORBA because plain maps and vectors are far easier to work with than domain-specific objects.


There is a discussion to be had about deep vs. shallow hashes. Take Datomic or DataScript, for instance: very shallow data structures, with a defined schema, and indexes to make access fast.

There are tradeoffs. Shallow data structures are under-used.

But OOP objects share fundamental flaws with deep hashes. (And add some more.)


Exactly; the whole idea of "deep dup" doesn't even exist in Clojure. It's a non-issue. Some languages go so far as to encourage actually serializing and then reading back in a data structure as a form of doing a deep duplication! (I've done this in Objective-C, based on advice in popular books.) This is an extreme example of the pitfalls of mutable data in many circumstances.


There is actually a lot of complexity to hashes in Ruby.

The two major sources of complexity are the fact that they are mutable, and the fact that keys are mutable while their hash codes are computed when the key is first added.

So if you construct a valid hash, then call three functions passing them the hash, you have no idea if the hash is still valid for the second and third functions without reading the code for all the preceding functions.

What this tends to mean in practice is you are always having to check all over the place that your hash is valid.

They might look the same as clojure hashes but in practice they are used very differently.
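The key-mutation gotcha described above is easy to demonstrate. (Array keys are used here because Ruby dups and freezes string keys on insertion, which papers over the problem for strings.)

```ruby
# Ruby computes a key's hash code at insertion time. Mutating the key
# afterwards strands the entry: it's in the hash but can't be looked up.
key = [1, 2]
h = { key => "value" }   # hash code of [1, 2] recorded here
key << 3                 # mutate the key in place

before   = h[[1, 2, 3]]  # => nil -- stale hash code, lookup misses
stranded = h.keys.first  # => [1, 2, 3] -- yet the key is right there
h.rehash                 # recompute every key's hash code
after    = h[[1, 2, 3]]  # => "value" -- found again
```

Clojure sidesteps this entire class of bug because both keys and maps are immutable; there is nothing analogous to `rehash` to forget to call.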


> There is actually a lot of complexity to hashes in Ruby.

I agree here

> So if you construct a valid hash, then call three functions passing them the hash, you have no idea if the hash is still valid for the second and third functions without reading the code for all the preceding functions.

I don't find that to be true, even in the Rails codebase. You pass a hash to a function (method) and expect that it won't be mutated. If you're writing a method and you mutate a hash, you're expected to dup the argument so the original won't be mutated. This is the convention. There are times when hashes are mutated, but generally that's reserved for methods whose purpose is to mutate their arguments.
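A minimal sketch of that convention (`with_defaults` is a hypothetical method name):

```ruby
# The convention: a method that wants to modify its hash argument dups
# it first, so the caller never observes the change.
def with_defaults(options)
  opts = options.dup   # shallow copy -- caller's hash stays intact
  opts[:format] ||= :html
  opts
end

params = { controller: "posts" }
result = with_defaults(params)
# result => { controller: "posts", format: :html }
# params => { controller: "posts" } -- unchanged
```

Note that `dup` is shallow, so any nested hashes are still shared with the caller -- which is exactly why `deep_dup` exists and why the convention alone isn't airtight.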

Where Rails gets into trouble with its passing around of hashes isn't in the mutability of the hashes; it's in the composability of the functions. I've never written about this problem before, so I'm not sure I have a great example, but it's definitely a problem.

One example in Rails is `url_for`; it takes a hash argument. The problem is that there are multiple `url_for` methods that all do slightly different things. Some are needed for generating links in email, some work in your views, and others are designed for you to use programmatically outside of a view context. One of the hardest things about this method is that one `url_for` can call another `url_for`. Since we can never be guaranteed the order of the calls, things like default values or optional keys become really hard. You have to replicate logic in different functions, since some may never be called in the order you might expect. This significantly impacts our ability to refactor, which in turn impacts our ability to make performance improvements.

I recently did a bunch of perf work in https://engineering.heroku.com/blogs/2015-08-06-patching-rai... and some of my biggest perf improvements came from getting rid of duping and merging of hashes. If Ruby had an immutable and performance-efficient hash it would have helped a bunch; however, I don't think it would make the general awfulness of an entirely hash-based API to a very complex action (such as url_for) significantly better to work with.


Take a look at ring for clojure, which is a web server interface very similar to ruby rack.

Because it passes every request as a map you can hook functions (middleware) in between that change the behaviour of the request handling, for example they could add a field for passed params or fields for user authentication.

This is possible because maps are easily extensible and functions down the line don't need to know about additional keys. You can't do that properly with OO.
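Rack, mentioned above, actually works the same way: the request is a plain hash (`env`), and middleware can add keys that downstream handlers are free to ignore. A stripped-down sketch (`AuthMiddleware` and the `myapp.user` key are made up for illustration):

```ruby
# Rack-style middleware: wrap an app, enrich the env hash, pass it on.
class AuthMiddleware
  def initialize(app)
    @app = app
  end

  def call(env)
    # add a field downstream handlers may use -- or freely ignore
    env["myapp.user"] = "alice" if env["HTTP_AUTHORIZATION"]
    @app.call(env)
  end
end

handler = ->(env) { [200, {}, ["hello #{env["myapp.user"] || "guest"}"]] }
stack   = AuthMiddleware.new(handler)

authed = stack.call({ "HTTP_AUTHORIZATION" => "token" })
# => [200, {}, ["hello alice"]]
anon   = stack.call({})
# => [200, {}, ["hello guest"]]
```

The handler needed no changes to tolerate the extra key; that open-endedness is exactly what a fixed object interface makes awkward.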


You are sort of arguing a strawman. First you say that hashes are simple, but then go on to argue how they are not. Maybe they were not simple in the first place? Simplicity does not mean easy. It was "easy" to use a hash. Simplicity can actually be incredibly hard to design properly. I think this is what Rich alludes to in the video.

Edit: It's one of the first thing he says in the video actually - "It's about the interleaving, not the cardinality."


Actually, parent gets it and is neither arguing a strawman nor mistakenly labelling hashes as simple rather than easy.

Rich absolutely argues that basic data structures like hashes are simple -- something that is generally agreed upon. They are simpler than 'objects' because you can use basic comparison operators on them, and you can operate on them with higher-order functions etc.

The question which parent is digging at is whether a larger, complex app, that leans heavily on hashes actually results in an app that is on the whole simpler, i.e. does 'a simple thing plus a simple thing equal a simple thing'. This is something I've heard discussed well on the Ruby Rogues podcast -- I think that either David Brady or Josh Susser may have a good blog post on the subject of 'simple + simple != simple' but I'm struggling to track it down.

Whilst I love Rich Hickey's talks I do find myself coming round to the same conclusion as parent -- if you forgive my possibly incorrect interpretation of their argument -- that the idea that simple data structures are simpler is fairly useless if programmers use that fact naively.

PS. Everyone should watch the linked talk, it's brilliant, and one of my favourite programming talks ever. I recommend it to every programmer I meet.


I think you nailed my original intent quite well, thanks for the post.


Why ignore the message of simplicity in abstractions and design only to nitpick on hashes? I am not a fan of using hashes either. I also disagree with Rich's stance on static typing. But still think the video has a great message that is being lost here.


> I think functional programming is really good at some things, I don't feel like many people talk about the things it's not good at.

This is a wonderful tenet in general: let's think about what X is bad at, rather than what it's good for.

While I'm not much of a Rubyist (I've hacked a little in my time, but I fall on the Python side of the divide), working with trees (hashes of hashes are a kind of tree) has never been easier than in Clojure in my experience. I suspect this is true of any lisp.


Could you expand on how Clojure makes it easier to deal with the frustrations your parent comment mentions (deep merges, shared reliance on hash structure, reproduced logic, and handling of optional values)? I've only used Clojure a bit, but I'm not sure how it solves any of these problems (which I also find frustrating in Ruby).


Persistent data structures are part of it, for sure, but on a higher level. On a lower level, if you look at how assoc and assoc-in work in Clojure, the scales will fall from your eyes (they certainly fell from mine). Consider:

user=> (def g {:x "a" :y {:k 71}})

user=> (assoc-in g [:y :k] 72)

{:y {:k 72}, :x "a"}

With this you can walk and modify deeply nested data structures with ease (and others can reason about it well).
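For comparison, the same operation can be written immutably in Ruby (a hypothetical `assoc_in` helper mirroring Clojure's, not a stdlib method):

```ruby
# assoc_in: return a copy of `hash` with the value at the key path
# replaced; the original is never touched.
def assoc_in(hash, keys, value)
  key, *rest = keys
  if rest.empty?
    hash.merge(key => value)
  else
    hash.merge(key => assoc_in(hash.fetch(key, {}), rest, value))
  end
end

g = { x: "a", y: { k: 71 } }
updated = assoc_in(g, [:y, :k], 72)
# updated => { x: "a", y: { k: 72 } }
# g       => { x: "a", y: { k: 71 } } -- unchanged
```

The difference is that in Clojure this style is the default and the runtime shares structure efficiently, whereas in Ruby it's an opt-in discipline.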


I may be misunderstanding your point, but that's just a well-done function which could be written in virtually any language, the advantage isn't inherent to Clojure.


The difference is that in Clojure, `assoc-in` is effectively a deep copy as well since the data structures are immutable. Most of the complexity involved with nested maps is about the effects of modifying them. In practice, this doesn't exist in Clojure.


> which could be written in virtually any language

What natrius says is completely right on this point (and it explains why those functions don't commonly exist outside of Lisp), but it should by now make you wonder: if they could be written in virtually any language, why haven't they been?


It's not true. ImmutableJS is becoming very common, especially for React users: https://github.com/facebook/immutable-js/


People like to talk about immutable.js. It's not common.


8k stars and counting. We used it at the last place I worked (a YC startup) and several of my friends report the same.


OH, I see your point. Thanks for the enlightenment.


Hickey has stated elsewhere that optional values don't really make sense in a dynamically-typed language.


Immutability is a big part of it, as is the wide array of operators for maps and map-like data structures.


> Every part of your system must now know the hash structure you're using and often reproduce logic in many places like dealing with optional keys that may be missing.

Why not just write accessors and mutators for your hashes? Also, Python has a function dict.get(key, default). Does Ruby not have this?
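It does: `Hash#fetch` takes a default just like `dict.get`, and `Hash#dig` (Ruby 2.3+) walks nested keys without blowing up on a missing intermediate.

```ruby
config = { server: { port: 8080 } }

timeout = config.fetch(:timeout, 30)    # => 30 -- default, like dict.get
port    = config.dig(:server, :port)    # => 8080
host    = config.dig(:database, :host)  # => nil -- no error on missing path
```

These help with lookups, but they don't address the parent's deeper complaint: every caller still has to know which paths and defaults exist.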


Your comment seems more about Clojure's type system and fundamental data structures than about functional programming in general. e.g. Haskell solves some/most of the problems you mention, while introducing different ones and presenting other trade-offs to consider.


Using data structures without a schema is user error, and isn't the fault of any particular language.

The rest of your points have some merit. :)


The simplicity culture is what drove me to Python, and the Clojure community's simplicity culture is on a whole other level. It's not just the code and syntax that need to be simple. The abstractions in the program need to be simple as well, and Clojure gives you the tools to make simple abstractions easy to build and understand.

Go learn Clojure.


> Clojure gives you the tools to make simple abstractions easy to build and understand.

I understand if you don't want to take the time, but I would love it if you could give an example of an abstraction that you can build in Clojure that you would consider "easy to understand", yet which couldn't be built just with, say, anonymous functions, structs, arrays, and simple loops?


The simple syntax makes things like Hiccup and macros more convenient. https://github.com/weavejester/hiccup

Macros allow things like core.async (like Go's channels), which is just a normal library; you didn't have to upgrade your Clojure version or anything.

Immutable datastructures make your life simpler because you're not worried about values mutating suddenly. Keeps you from cloning or locking an object. And undo is simpler: you don't destroy old state by mutating it, so you can just hold onto old versions.
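The undo point in particular is easy to sketch: if every "edit" produces a new frozen value, history is just a list of versions (a toy illustration, not a library):

```ruby
# Undo for free with immutable values: never destroy old state by
# mutating it -- each edit appends a new frozen version.
history = [{ count: 0 }.freeze]                 # version 0
history << history.last.merge(count: 1).freeze  # edit -> version 1
history << history.last.merge(count: 2).freeze  # edit -> version 2

history.pop                     # undo: just drop the newest version
current = history.last[:count]  # => 1
```

With mutation, undo means writing inverse operations for every edit; with values, it's a `pop`.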


I think it comes down to expressing your intent in terms of that intent rather than confining it to imperative constructs. Pretty much any work on collections of items is generally simpler and more in line with the semantics of your intentions when working with Clojure or other functional languages or structures, especially when parallel code is involved.

I'd say a specific example would be pmap. It's very difficult to parallelize code as simply as pmap does without functional programming constructs.
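For contrast, a naive `pmap` in plain Ruby threads shows both sides of the argument: the mechanics are short, but nothing stops the mapped function from mutating shared state, which is what Clojure's immutability rules out. (One thread per element here -- a sketch, not how Clojure's chunked `pmap` works.)

```ruby
# A naive parallel map: spawn a thread per element, then join them all
# in order by asking each thread for its return value.
def pmap(enum, &f)
  enum.map { |x| Thread.new { f.call(x) } }.map(&:value)
end

squares = pmap([1, 2, 3]) { |n| n * n }  # => [1, 4, 9]
```

The Clojure version is "simple" in Hickey's sense not because the code is shorter, but because immutable data means the parallelized function can't race on anything.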


Hm. I'm looking at the implementation[1] of pmap. I don't fully understand it (can anyone point me to something that explains what rets is? Googling "clojure rets" doesn't return anything useful), but it appears to just loop through the list and parcel work out to multiple processors. That seems simple enough to do in any language with threads and function references, no?

I'll dig in some more though. Thanks for the reference.

[1] https://github.com/clojure/clojure/blob/master/src/clj/cloju...


To me it comes down to what my intent with the code is.

I could write a few simple routines in Go that certainly get the job done without much code, but when I read the code I have to perform more mental translation from how the code is written to what my intent was.

In contrast, pmap or other constructs such as PLINQ get the same work done with less code that expresses my intent more clearly.


In this code, "rets" is a local variable used in that function, which is why it wouldn't show up anywhere in Google. It's just a variable name.


Zippers.


I just started BraveClojure today. I've said "next week" enough times. http://www.braveclojure.com/


Or learn another Lisp. I recommend Guile Scheme and Racket.


You may get just as much pedagogical advantage from learning one of those, but Clojure will be much more practical with its JVM base.

Besides, I know Lispers hate the idea, and it may not even be necessary, but uniting behind a single good-enough Lisp like Clojure will reap more rewards than advocating for multiple (while still good) Scheme implementations.


I spoke with a "grizzled old Lisp hacker" at my Clojure meetup here in Minneapolis. He let it drop that "If I didn't know the problem in advance, I would choose Racket over any other language. I love Clojure, though. I just don't get as much opportunity to use it." There's room for everyone here.


Here we go again with the "any Lisp that isn't Clojure isn't practical" argument. I disagree. Guile and Racket are plenty practical, and both have dynamic FFIs to any C library you want so I couldn't care less about using Java and its terrible VM boot time and build systems. I'm not interested in "good enough" when I've already been using something better for years.


I'm a Clojure developer and I agree that people shouldn't use the argument that other Lisps aren't practical (I'm just happy people are using lisp based languages). The point we should make is that for any shop that's using the JVM or Javascript already it's easy to introduce Clojure/ClojureScript into that environment without a ton of risk.


Curious if you could tell us more about what you've worked on with Guile/Racket in a production environment?


I'm not persuasive enough to convince the bosses to use Scheme, so I don't have a triumphant success story, just some minor victories. I'm one of the core developers of GNU Guix (a package manager, distro, and associated set of tools), which recently was deployed successfully on a large production HPC cluster in Germany -- a pretty nice achievement, I think. At work, I use Guix and an init system called DMD (also written in Guile) a lot for development, and I'm trying to slowly work them into production via Docker.

It's an uphill battle to advocate for something that isn't the status quo, so I'm proud of the little victories.


If you build systems that are part of an existing ecosystem, specifically an ecosystem built on Java technologies, then Guile or Racket would be terrible choices.


Sure, that's fair, and for that Clojure is a fine choice.


Many choices are 'sacrifices' that bring so much simplicity. Syntax, persistent data structures, ... I hope I can jump back to clojure soon.


Yeah, Python : 1999 :: Clojure : 2015

edit: state of adoption and perceived coolness, not birth year. Coincidentally, Python 1.0 released in 1994 and Clojure 1.0 was released in 2009, and both had a couple years of unstable releases predating 1.0


Clojure is actually from 2006 after two years of planning and development.


Yeah, but Python is not from 1999, so they were referencing something else: either personal adoption or language take-off.


While simplicity may matter, I believe Clojure to be a poor example of it -- it's a lot of functions all thrown together in basically one namespace, with poor error handling, and a tendency to throw a 50-line traceback with a lot of random symbols in it.

Clojure macros are the antithesis of simple, and the need to indent a scope for every new variable actually fights against TDD in my experience.

I recently wrote a good chunk of a RESTful SQL-backed application in Python in two days that took a team of 3 people in Clojure over 2 months to just get to the basic level of library support one would expect from things like sqlalchemy and flask.

Clojure isn't simple -- it's basically a step up from assembler in how little it provides.

Simplicity is having all the power tools and being able to put them together and be instantly productive, and to support programming in multiple paradigms.

While it's not the norm, I sometimes feel many FP purists spend so much time debating purity and giving basic concepts complex names - when they could be using something else and getting much more done.

Side effects aren't the devil and are sometimes necessary to get real work done. Bad code can be written in anything; avoiding it just takes experience.

I'd much rather see a language focus on readability, maintenance, and rapid prototyping than on side effects.

Functional programming concepts have benefits - I love list comprehensions and functools.partial in python is pretty neat, but when you can also have a decent object system, and embrace imperative when steps are truly imperative, you can get a whole lot more done.


Rich uses a very precise and archaic definition of simplicity, which he describes in his "Simple Made Easy" talk. In a nutshell, when Rich talks about making something simple, he's talking about limiting the number of things that can affect it.

For example, under this definition, the simplest thing possible is immutable data, since nothing can affect it. The next simplest things are pure functions, because they're affected only by their arguments. Clojure is a language built around Rich's idea of simplicity, so Clojure prefers data over pure functions, and pure functions over side-effectful functions.

What you're describing is what Rich would likely term "easy". Something is easy if you can do it with little effort. Something is simple if few things affect it.


It's unfortunate that you got downvoted, because while people may not agree with this there is definitely something to this.

My approaches to Clojure have been seriously hampered by the fact that some of the abstractions above those that are "simple" are remarkably complex, and that the tools that surround the ecosystem are still pretty frail.

Macro bugs are certainly something that have scared me away for a while.


Yeah, exactly.

My experiences were around looking for a quality ORM, job scheduler, and web framework - things like korma and ring exist - but they lack a large amount of features compared to equivalents found in /most other/ languages.

I came to the conclusion that Clojure is an acceptable way to call Java SDKs if you want a bit higher Java velocity AND Lisp fits your brain already, but I'd rather pick up Scala or Groovy instead for that purpose.

Libs in pure clojure, for which I tried dozens, were usually incomplete and error-prone even if they were community favorites, which I attribute in part to the fact that it's a small circle of developers using it, and the language is still newish.


Macros are for extending the language. In 2 years of professional Clojure dev, I have never used a single macro in problem-domain code. I've written a few libraries that are supposed to extend the language, but I've only used macros rarely there too, maybe under a dozen times.


Sure, but when there's bugs in the caller of macros I have to figure out what exactly is going on, and that's where the rubber meets the road.


> I recently wrote a good chunk of a RESTful SQL-backed application in Python in two days that took a team of 3 people in Clojure over 2 months to just get to the basic level of library support one would expect from things like sqlalchemy and flask.

Nonsense. Java library interop is great. If you could write it in 2 days in Python then it wouldn't take more than a week in clojure. In the very worst case scenario you could simply write "Java in Clojure" and directly use Java libraries.

> Clojure isn't simple -- it's basically a step up from assembler in how little it provides.

It gives you full access to the JVM and its tens of thousands of man years worth of high quality libraries.

As for the language itself it has macros, first class functions, built in vectors and maps, structural editing, fantastic REPL support, and a top rate concurrency story to name a few features.


you might want to look at clojure again. that comment reads like babble


"talk-transcripts : Rich Hickey "

- Inside Transducers (11/2014)

- Transducers (09/2014)

- Implementation details of core.async Channels (06/2014)

- Design, Composition and Performance (11/2013)

- Clojure core.async Channels (09/2013)

- The Language of the System (11/2012)

- The Value of Values (07/2012)

- Reducers (06/2012)

- Simple Made Easy (09/2011)

- Hammock Driven Development (10/2010)

- Are we there yet? (09/2009)

https://github.com/matthiasn/talk-transcripts/tree/master/Hi...


I love Rich Hickey and I will watch any talk he ever gives. I believe I have seen all of them to date.

Here's what drives me nuts about this one, though - it gets passed around a lot where I work, and people say how strongly they agree with him. There are real, concrete things he claims are not simple here! Things like for loops. And these people I'm talking about, they say they love this talk and then they say they love for loops. Really! I don't get it.


It's perfectly reasonable to agree wholeheartedly with his core idea (easy vs simple) but disagree with his particular pronouncements on what is and isn't simple. Which could involve for loops.

Now, personally, I don't have much to say about for loops one way or the other. But I do disagree with him on types, which can be far less complex than he intimates. System F, say, which largely covers F#, OCaml and Haskell, can be completely defined as a handful of self-evident rules. A single page for both checking and inference, if in somewhat dense notation. That's not complex at all, especially since it follows fairly naturally from the way the lambda calculus works even without types.

To me, that seems like a perfectly consistent view. Nothing forces you to take everything he says or leave it: you can find it accurate piecemeal. Take the mental framework but apply it with your own knowledge and experience and you could very well come up with your own conclusions.

Seems like the perfect way to use ideas like this.


It's one thing to recognize simplicity versus ease of use; it's a wholly different thing to apply it. Ease of use is what is most trivially observable by an end user. Actually understanding the nature of the problem domain, and properly evaluating architectures beyond making sweeping inferences from what the interface looks like, is significantly more difficult. Despite all the lip service that gets paid to simplicity, most people do not want it, and it will be met with derision and scorn if the simple solution imposes a higher learning curve or does not make policy and integration decisions for the user. The culture of coding bootcamps, DevOps, "get shit done", and "move fast and break things" by and large promotes an anti-intellectualism that is in stark opposition to deeply evaluating problem domains so you can come up with simple solutions.


Right, and so what you get is people saying something is "simple" when what they really mean is that they like it. Which is of course not Hickey's fault, it just irks me for some reason.


Carl Sassenrath is another programming language designer who has long been a proponent of simplicity.

Some links:

* Definition of Simple (2011) - http://www.rebol.com/article/0509.html

* Fight Software Complexity Pollution (2010) - http://www.rebol.com/article/0497.html

* Contemplating Simplicity (2005) - http://www.rebol.com/article/0127.html



