
Extremist Programming - achille
http://blog.ezyang.com/2012/11/extremist-programming/
======
tikhonj
I agree with the blog's premise: "extremist" languages are great for learning
and for research. So this whole rant is not directly related to the post's
central thesis. Instead, it's about the assumptions most people have whenever
this topic comes up.

What I'm a little annoyed with can be summed up with a single banal and
overused phrase: "the right tool for the right job".

For one, this phrase really doesn't say all that much--it's basically a
tautology. Yes, using the right tool would be good, but with programming
languages, it's rarely obvious what the right tool _is_! It's just a
specialized version of the advice to "make the right choices", which is not
much advice at all.

Another problem is that people inevitably ignore how much programming
languages overlap. Virtually any languages worth comparing are going to be
_general-purpose_ languages. Choosing between a functional language and an OO
language is not like choosing between a hammer and a screwdriver to pound in a
nail; it's more like choosing between different types of hammer, in a world
where hammers can do anything. (I don't know enough about carpentry to extend
the analogy properly.) There are very few applications where one language
clearly fits and another is clearly unsuited--and if you're in a vertical like
that, the question just won't come up in the first place!

Another thing that comes up is people assuming that a multi-paradigm language
has the benefits of _all_ the paradigms it supports. I've found this is never
the case. Even very multi-paradigm languages tend to favor one paradigm over
the others. And even if they didn't, there are benefits to being consistent.
You can do much more by being functional _everywhere_ than you can by merely
supporting functional programming in some places. Any mix of paradigms is
necessarily going to be a compromise, and the advantages of prioritizing one
main paradigm can outweigh the flexibility of supporting more than one to any
large extent. Doing one thing, and doing it well, is a powerful idea that
doesn't stop applying in designing programming languages.
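
To make this concrete with a hedged sketch (Python here is just an
illustrative multi-paradigm language, not one the comment singles out): a
language can support a functional style while still clearly favoring its
imperative side.

```python
from functools import reduce

# The functional style is available...
total = reduce(lambda acc, x: acc + x, range(100), 0)
print(total)  # → 4950

# ...but the bias shows: CPython has no tail-call optimization, so a
# purely recursive version of the same fold blows the stack.
def rec_sum(n, acc=0):
    return acc if n == 0 else rec_sum(n - 1, acc + n)

try:
    rec_sum(100_000)
except RecursionError:
    print("deep recursion fails; the idiomatic fix is an imperative loop")
```

Supporting a paradigm and being designed around it are different things.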

Now, I'm not leaning one way or the other here in any comparison of languages
(I'm sure my biases are pretty evident and show through, they're just not
germane to this comment); I just think that summarily dismissing a language
for being too focused or too "extremist" or not multi-paradigm is rather
short-sighted. Also, unless you've tried doing something in a language
yourself, don't assume it's more difficult than in the ones you already know.
There is much "common wisdom" about (like "functional programming is bad for
GUIs") which is often more "common" than "wisdom".

~~~
papsosouid
>Choosing between a functional language and an OO language is not like
choosing between a hammer and a screwdriver to pound in a nail, it's more like
choosing between different types of hammer

Take it one step further and ditch the terrible "tool" analogy altogether. It
isn't like choosing between two hammers. Nothing we do is like driving a nail.
We can't mix and match the way a carpenter can, taking hammer A for pounding
these big nails, hammer B for the small ones, 4 different saws, a lathe,
planer, etc. You can't use Java's classes with Perl's regexes and Haskell's
laziness. You have to pick one whole toolbox and take it as it is.

~~~
bosie
wouldn't using a popular VM (e.g. JVM) give you quite a few of those choices
though?

~~~
jeremyjh
It helps with things like library, tool or framework selection because you end
up evaluating the language itself more so than the ecosystem. I don't think
many people are actually writing their applications in more than two different
general purpose languages at the same time and place. Maybe you inherit some
crusty code you just wrap up and re-use but you probably aren't doing new
development in that old code-base at the same time.

~~~
gerts
In the Unix environment, you may often write Java, Python, and Shell, all in a
day's work on one project.

~~~
jeremyjh
And that would be two general purpose programming languages :)

~~~
sparkie
Shell is general purpose (perhaps even more so than Java or Python), when you
look at it from the perspective that programs are simply functions you call
and get a result from, or which perform additional computation.
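
A rough way to see this (sketched in Python rather than shell, purely as an
illustration of the "programs are functions" view): any external program can
be wrapped and called as if it were a function, with arguments in and a
result out.

```python
import subprocess
import sys

def run_program(code: str) -> str:
    """Call an external program (here, the Python interpreter itself,
    since it is guaranteed to be installed) the way a shell does:
    arguments in, exit status checked, stdout returned as the result."""
    result = subprocess.run([sys.executable, "-c", code],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

print(run_program("print(2 + 3)"))  # → 5
```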

~~~
jeremyjh
Sure.

------
jonsen
Another extreme direction to try out is the direction toward the machine. The
value of trying out assembler programming may not have similarly direct
benefits. But I personally find it a great general advantage to have detailed
knowledge of the practical conditions under which your program must run. To know
that whatever fancy high level constructs you are making use of, you are
always building a giant state machine where space is traded for time.
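
One classic place to see the space-for-time trade directly is memoization; a
minimal Python sketch (my example, not the commenter's):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # spend memory on a table of past results...
def fib(n: int) -> int:
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# ...to turn an exponential-time recursion into a linear-time one.
print(fib(30))  # → 832040
```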

~~~
PieSquared
To expand on your suggestion a little, I would go even further:
learn how to design your own processors and figure out how the low-level
details _really_ work. With hardware description languages such as Verilog,
programmers can apply a lot of their knowledge to hardware engineering. I've
found that a lot of things carry over from conventional programming languages
to HDLs, and that it's incredibly easy to get started. The computer
engineering mindset is pretty similar to low-level programming, and really
helps you understand how your code runs, even more so than assembly.

~~~
jules
One surprise that you find out when you do this is that languages like C don't
efficiently map to hardware at all. Hardware is inherently massively parallel,
whereas C is completely serial. What modern hardware is doing to be fast is
trying to recover as much fine-grained parallelism from a sequential C program
as possible using pipelines and out of order execution. We are now at a point
where that has been mostly milked out, so explicit parallelism is necessary to
gain performance, like SIMD and multiple threads.
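
A toy model of what an out-of-order core does (a sketch under my own
simplifying assumptions: unlimited issue width and unit-latency operations):
independent instructions can execute in the same cycle, so execution time is
bounded by the dependency chain, not the instruction count.

```python
def schedule_depth(deps):
    """deps maps each instruction to the set of instructions it reads from.
    Returns the critical-path length: the minimum number of cycles on an
    idealized machine that issues every ready instruction at once."""
    depth = {}
    def d(i):
        if i not in depth:
            depth[i] = 1 + max((d(j) for j in deps[i]), default=0)
        return depth[i]
    return max(d(i) for i in deps)

# a*b + c*d: the two multiplies are independent; the add waits for both.
deps = {"m1": set(), "m2": set(), "add": {"m1", "m2"}}
print(schedule_depth(deps))  # → 2 cycles for 3 instructions
```

Real hardware has to rediscover this structure at runtime from a serial
instruction stream, which is exactly the overhead being described.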

It's interesting to consider how you can exploit parallelism more easily, for
example from going from a sequential instruction set and language to an
inherently parallel instruction set and language. Nobody has found the
ultimate answer to that yet. GPUs execute thousands of sequential threads in
parallel, and while that works for problems with massive and regular
parallelism, it does not work for irregular parallelism, parallelism that
requires fine-grained communication, short-lived parallelism, or not-so-massive
parallelism. FPGAs do work well for those types of parallelism, but they have
other problems for general purpose computing. With hardware trends, it's
inevitable that we'll see more and more parallelism and eventually a paradigm
shift to inherently parallel architectures. Interesting times ahead.

~~~
PieSquared
The point you make is really valuable. A few days ago, I found myself
explaining to someone why custom hardware and GPUs could so easily outperform
processors, and I realized that most programmers have no concept of how much
overhead the general nature of a processor entails. (Although, I don't think
most programmers really _need_ to know this.)

For instance, let's take the problem of multiplying ten numbers. In a normal
processor, you have a loop of instructions, each instruction has to go through
a "fetch" stage (to load it from memory), a "decode" stage, to figure out what
the instruction is, an "issue" stage, to figure out which processor pipeline
can best execute this instruction, an "execute" stage, to finally execute the
instruction, and maybe a "commit" stage to write the outputs back to memory.
(The exact number of stages and amount of parallelism depends on the
microarchitecture and pipeline depth, of course). What if we wanted to just
build a chip that did this? We could put ten multipliers on the chip, and then
do the exact same operations in just a few clock cycles, since we would have
no instruction fetch or decode, no commit, no loops, and so on. This is a
contrived example, but my point is that general-purpose processors are
incredibly slow compared to dedicated hardware, precisely because the extra
transistors necessary to make processors general purpose also take a large
portion of the computing time.
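
The dedicated-chip version of the ten-numbers example can be modeled as a
reduction tree; a hypothetical sketch (the cycle count assumes one multiply
per tree level, which is my simplification):

```python
def tree_product(xs):
    """Multiply adjacent pairs each "clock cycle", as ten hardware
    multipliers wired into a tree would, until one value remains."""
    cycles = 0
    while len(xs) > 1:
        pairs = [xs[i] * xs[i + 1] for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2:            # an odd element passes through unchanged
            pairs.append(xs[-1])
        xs, cycles = pairs, cycles + 1
    return xs[0], cycles

value, cycles = tree_product(list(range(1, 11)))
print(value, cycles)  # → 3628800 4: ten numbers in ceil(log2(10)) cycles
```

No fetch, no decode, no commit: the whole "program" is wiring.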

I find the idea of FPGAs reconfigured per-application to be really
interesting. Celoxica (<http://www.celoxica.com>) seems to do some sort of
FPGA-based software acceleration for trading software, for instance. I wonder
if it's possible to do something like this for a more general market...

------
jacques_chester
Incidentally, this is a formula for coming up with PhD projects: take a common
comp sci primitive and then remove it or shift it to someplace else in the
life cycle or stack.

<http://chester.id.au/2009/10/21/upsetting-the-natural-order/>

------
timbaldridge
This is why I use Clojure. I can do Functional, OOP, logic, or any of the
dozens of other programming styles in one language. And since it's a lisp I
don't have to worry about having to need extra syntax from the language
writers to get what I want.

Pragmatic languages FTW!

~~~
spicyj
I interpreted the OP as suggesting that you not use a single multiparadigm
language because then you _won't_ be forced to follow the new principle
everywhere. Of course, it's possible to do (for example) OOP in a lisp-like
language, but you'll really be forced to work with it in a language like Java,
which may give you a deeper understanding.

~~~
jfb
Well, avoid Java, which is just C++--, and think Smalltalk instead.

~~~
klez
Therefore Java == C?

~~~
evincarofautumn
I don’t think programming languages have inverses, so in general _λ_ \+ 1 − 1
≠ _λ_.

------
drbawb
>what if we made an OS where everything was a file?

Shameless Plan 9 plug.

[http://en.wikipedia.org/wiki/Plan_9_from_Bell_Labs#Design_co...](http://en.wikipedia.org/wiki/Plan_9_from_Bell_Labs#Design_concepts)

~~~
cms07
Or, you know, Unix.

Edit: Which came from Multics, I know.

~~~
jacques_chester
Plan 9 is the spiritual successor of Unix, because in Unix: Everything Is A
File (except for the many, many things which are not files after all).

------
nickbarone
So, could we start a listing of things learned through the application of
extremist programming?

Or better yet, a listing that shows where a given principle hasn't been
extremified, so we can go try it out and see what happens?

------
rizzom5000
Sure, you could try to treat everything like an object and then find out if an
integer was an object (the hard way) or you could just RTFM (the easy way).
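
For instance (Python as the example language, since the comment doesn't name
one): one line of experiment, or one line of the manual, settles the integer
question.

```python
# In Python the manual's answer is "yes, everything is an object",
# and the experiment agrees:
x = 42
print(isinstance(x, object))  # → True
print(x.bit_length())         # → 6; even plain ints carry methods
```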

Don't get me wrong, experimentation is great for learning about limitations
and capabilities; but I personally wouldn't use it as my primary means for
learning about the design of something (unless it was very poorly documented,
in which case I would try to avoid using it at all).

------
liquidise
Awesome article. I would argue this goes for practices as well. Automated
Testing vs TDD and the like.

------
w0utert
Nice article and interesting viewpoint, taking principles to the extreme for
learning purposes seems very useful.

The thing that impressed me most isn't the article though, but the amazingly
beautiful clean look of that blog. Really a pleasure to look at and read on
the iPad :-)

------
wissler
He underrates the power of principle.

"Mass is awesome. What if every object in the Universe had mass?"

"Liberty is awesome. What if there should be no such thing as slavery and
every human being should be free?"

If you pick the wrong principle and take it to an extreme, then yes, it'll
lead to undesirable results, but that means you should throw bad principles
out, not all principles.

~~~
jerf
I have no idea what you are trying to say. You take a point that is explicitly
written about programming languages, then I can make out that you think you're
saying something by applying it to... physics? and then political philosophy?
What? This is not a sensible line of thought on any level I can find, not
metaphorically or literally. The post is about trying out programming
languages, not slavery.

~~~
nickbarone
The post may be about programming rather than politics, but it is essentially
recommending that we explore the consequences of extremes, and that operation is
valuable in (heh - taking it to the extreme!) all places. For example: 1984 -
extremes of medication, Player Piano - extremes of automation.

You can also notice this technique in debate, when someone takes the emotional
stakes to an extreme: while it's usually an attempt to convince you, it can be
a useful exercise, because it magnifies otherwise overlooked elements of the
positions, opinions, and principles involved.

~~~
derleth
> 1984 - extremes of medication

The only medication I remember from _Nineteen Eighty-Four_ was Liberty Gin. I
think you're thinking of _Brave New World_ with soma.

~~~
nickbarone
Ah! I am, sorry, and thanks.

------
10098
That's the right idea for a hobby or research project, but please don't do
this in production code. Think about people who have to maintain it after you.
I've seen my colleagues wade through a swamp of completely unnecessary C++
metaprogramming madness left by someone who apparently learned about templates
yesterday, and it wasn't very nice.

~~~
shusso
I have to disagree with your first sentence, though with the rest I do agree.
I think maintainability of the software should be categorized under "the right
tool for the right job". If your company has dozens of skilled "extremists",
then why not use it in production? On the other hand, if not...

