

Why functional programming matters (aka MapReduce for humans) - paulmillr
https://gist.github.com/2011876

======
SeoxyS
I hardly see this as an example of what's great about map/reduce or functional
programming… It illustrates neither the core concept of map/reduce nor that of
functional programming.

I think this is a better basic example (in Clojure) of what makes functional
programming and map/reduce powerful (skipping Hadoop, while we're at it):


    (require '[clojure.string :refer [split]])

    ;; map step: emit a [word, seq-of-1s] pair for each distinct word
    (defn wc-map [s]
      (let [words (split s #"\s+")]
        (map
          (fn [p] [(first p) (map (fn [_] 1) (second p))])
          (group-by (fn [x] x) words))))

    ;; reduce step: sum each word's seq of 1s
    (defn wc-reduce [h]
      (map (fn [p] [(first p) (reduce + (second p))]) h))

A simple word count, completely functional, and using the map/reduce concept.

These functions let you parallelize word counts arbitrarily: run `wc-map` over
each chunk of a file, possibly on multiple servers, merge the intermediate
results with `(merge-with concat)`, and run the merged result through
`wc-reduce`, ad infinitum.
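A runnable sketch of that fan-out (the chunk strings here are made-up stand-ins for pieces of a file that might live on different servers):

```clojure
(require '[clojure.string :as str])

;; the map and reduce steps, with clojure.string/split qualified
(defn wc-map [s]
  (map (fn [p] [(first p) (map (fn [_] 1) (second p))])
       (group-by (fn [x] x) (str/split s #"\s+"))))

(defn wc-reduce [h]
  (map (fn [p] [(first p) (reduce + (second p))]) h))

;; two chunks that could be mapped on separate servers
(def chunk-a "to be or not to be")
(def chunk-b "to be sure")

;; merge the per-chunk intermediate results, then reduce once
(def counts
  (into {}
        (wc-reduce
         (merge-with concat
                     (into {} (wc-map chunk-a))
                     (into {} (wc-map chunk-b))))))

(get counts "to")  ;; => 3
(get counts "be")  ;; => 3
```

Because each `wc-map` output is just data (a word mapped to a seq of 1s), merging and re-reducing can be repeated at any level of a tree of machines.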

~~~
adeelk
You can use `identity` instead of `(fn [x] x)`.
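For instance, the two are interchangeable in the `group-by` call:

```clojure
;; identity is the built-in (fn [x] x)
(= (group-by identity ["to" "be" "to"])
   (group-by (fn [x] x) ["to" "be" "to"]))
;; => true
```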

------
alenahemkova
Google also does this, I think. I remember reading a presentation about
FlumeJava somewhere, but I couldn't find it again. See pg 20 of
http://www.slideshare.net/greenwop/expressiveness-simplicity-and-users or
http://web.eecs.utk.edu/~dongarra/ccgsc2010/slides/talk28-konerding.pdf
for an example.

------
georgieporgie
What am I looking at?

~~~
metra
This is Hadoop MapReduce code written using the Crunch library, which sits on
top of the Hadoop API with the intent of simplifying it. On top of that, the
gist author wrote this bit of code in Scala (hence, Scrunch), to emphasize how
much easier it is to write functional code in Scala than in Java.

~~~
georgieporgie
Thanks for the explanation.

