
Functional Programming with Python - llambda
http://kachayev.github.io/talks/uapycon2012/index.html#/
======
dev360
So I felt really torn with this. I applaud the author for introducing Python
devs to FP; I love functional programming. Python was one of the things that
opened my eyes to FP with its cute little m/r syntax and for that I'm forever
grateful.

Over the years, I have found that the Python code people write to solve
problems falls into roughly four categories: a) completely ignorant and
inefficient code; b) concise, idiomatic Python code with a focus on
readability; c) fancy functional code where the author is showing off their FP
savvy (peppered with recursion and m/r); and d) raw, boring, C-influenced code
that looks primitive to read, but is incredibly efficient and algorithmically
sound. As much as I used to love going for b), I then prided myself on writing
c). But in the last few years I have realized how poorly c) performs for most
serious work, so I tend to avoid it and try to strike a balance between d) and
b), but preferably the latter. To write code in c) and tell yourself, well,
the data size will never be so great that the stack will blow, or memory will
suffer, is in many ways just like avoiding null checks altogether and telling
yourself your data will always have perfect integrity... i.e., any sane person
will tell you that you cannot rely on this mindset in the long run.

So, for me at least it comes down to this: to pretend that you can write FP
code effectively in Python is an exercise in futility. FP is more than writing
a map and a reduce. If that's your basis of reasoning, yeah, I agree, take
that and run with it - I find m/r a neat syntax to use here and there in a
program, but it's really challenging to write a whole Python program composed
of several chained map/reduce functions, to a large extent because things are
not oriented around generators by default. I'm not saying it's impossible,
just that it goes against the grain. I also argue that recursion and guard
expressions (and lazy evaluation) are at the heart of the FP paradigm, and if
you want to do it right, it's just misguided to sit and do this in Python when
there is no tail call elimination. How are you going to appreciate the concept
when it's so limited? If you want to amuse yourself and marvel at pretty code,
write it like this and have fun with it, but please pick another language if
you want to actually ship production code.

And ... so sorry to piss on the parade.

~~~
sixbrx
Yeah, I gave up on doing FP (or any mathematics) in Python when I got
surprised by this:

>>> monomials = [lambda x: x**i for i in [1, 2, 3]]

Those all end up being the _same_ monomial because the closure captures the
variable i itself (late binding) rather than being given a new binding for
every iteration. Very surprising, and an accident waiting to happen for
anyone used to dealing with integers by value (hard to say that with a
straight face).

The hackish workaround is "lambda x, i=i: x**i" but I'd forget it all the
time.

~~~
zanny
This is just like how Python default values are global variables: if you
modify them over time (e.g., def foo(x=[]): x.append(0)), then on the second
call x would be [0, 0], because the default is only initialized once.
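Spelled out (a small sketch; the [] default is created once, when the def
executes):

```python
def foo(x=[]):       # [] is evaluated once, at definition time
    x.append(0)
    return x

print(foo())  # [0]
print(foo())  # [0, 0] -- the same list object, mutated twice

def bar(x=None):     # the idiomatic fix: sentinel plus a fresh list per call
    if x is None:
        x = []
    x.append(0)
    return x

print(bar())  # [0]
print(bar())  # [0]
```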

~~~
njharman
Not global. Scoped with the "def".

~~~
tome
I think zanny meant "static" rather than "global".

------
d0m
Look, I love functional programming. I use it all the time and I procrastinate
in the shower about the best way to add currying to all my favorite languages.
That being said, even after all those years, I still find this:

    
    
        expr, res = "28+32+++32++39", 0
        for t in expr.split("+"):
            if t != "":
                res += int(t)
        
        print res 
    

SO MUCH more readable than this:

    
    
        from operator import add
        expr = "28+32+++32++39"
        print reduce(add, map(int, filter(bool, expr.split("+"))))
    

A couple years ago I created a library called Moka in python to make
functional and nesting easier. That would be written this way:

    
    
        (Moka(expr)
           .do(string.split, '+')
           .keep(lambda x: x != "")
           .map(int)
           .reduce(moka.add))
    

Still, the imperative version, although it has state and feels hacky to a
functional hacker, is pretty damn clear and easy to read.

~~~
ak217

        expr = '28+32+++32++39'
        print sum(int(i) for i in expr.split('+') if i)

~~~
llimllib
It's really weird to me that the author prefers map(add, ...) to sum(generator
expression).

~~~
rbanffy
Because with reduce you can use any function, while sum is hard-wired to
__add__. It also illustrates passing functions as arguments.

More subtly, a generator (or list comprehension) implies the order in which
the results are generated and an implicit but observable changing state. If
I'm trying to write (or, at least, express, since we are talking about Python)
parallelism, map would be a better choice.
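To illustrate the difference: sum folds with + only, while reduce folds with
any binary function (a quick sketch; reduce lives in functools in Python 3,
and is a builtin in Python 2):

```python
from functools import reduce
import operator

nums = [1, 2, 3, 4]
print(sum(nums))                   # 10 -- addition, and only addition
print(reduce(operator.mul, nums))  # 24 -- any binary function will do
print(reduce(max, nums))           # 4
```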

~~~
Locke1689
_More subtly, a generator (or list comprehension) implies the order in which
the results are generated and an implicit but observable changing state. If
I'm trying to write (or, at least, express, since we are talking about Python)
parallelism, map would be a better choice._

This is definitely not true -- generator expressions can be expressed using
lazy list semantics. By your logic every Haskell list comprehension would be
stateful.

I would encourage you to write the denotational or operational semantics of
your expressions in these matters rather than simply going by intuition.
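Generator laziness itself is easy to observe directly, without appealing to
intuition (a minimal sketch):

```python
log = []

def square(x):
    log.append(x)   # record when each element is actually computed
    return x * x

g = (square(x) for x in range(5))
assert log == []    # creating the expression evaluates nothing
assert next(g) == 0
assert log == [0]   # exactly one element has been forced
assert list(g) == [1, 4, 9, 16]
```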

~~~
rbanffy
> generator expressions can be expressed using lazy list semantics

That does not change the fact that the results are given in a certain order.
Lazy list semantics means the data will be calculated as it's needed;
therefore, the calculation cannot be parallelized.

~~~
Locke1689
Not true.

See
[http://hackage.haskell.org/packages/archive/parallel/3.1.0.1...](http://hackage.haskell.org/packages/archive/parallel/3.1.0.1/doc/html/Control-Parallel-Strategies.html#v:parList)

~~~
rbanffy
I was talking about Python. A list comprehension is defined as having the
behavior of a loop.

~~~
Locke1689
Once again, that doesn't semantically make a difference, unless the loop is
side-effecting. However, even without side effects, Python cannot necessarily
make any kind of guarantee about such computations since arrays are mutable
and can be modified out from under the map.

This is why parmap equivalents take iterables in Python, not arrays. It
doesn't matter if you have lazy or eager semantics, only if your semantics are
side-effecting.
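That hazard is directly observable in Python 3, where map is lazy (a small
sketch):

```python
a = [1, 2, 3]
m = map(lambda x: x * 10, a)  # lazy: no element computed yet
a[0] = 100                    # mutate the list out from under the map
print(list(m))                # [1000, 20, 30] -- the mutation is visible
```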

------
Locke1689
The missing slide: profiling.

I love functional programming. Nonetheless, I strongly believe in using the
natural idioms of your language: they are more widely used, more widely
optimized, and more often thought about.

I think you would find that the functional style is _not_ idiomatic in Python
most of the time. This can hurt readability, performance, and
interoperability.

Overall, I would encourage people to do what is _natural_ -- it is often also
what's correct. For example, Python generator expressions are lazy sequence
constructions, functional in semantics and _better_ than the equivalent
map/reduce code.

~~~
b0rsuk
Good points. Python lists are surprisingly slightly faster than tuples (which
are immutable). That's because lists are extremely common, and more effort
went into optimizing them.

------
GeneralMayhem
So, basically, with lots of work, you can make something that's technically
functional but with none of the beauty of a language that's meant for it.

I understand the appeal of functional programming, really I do, but why can't
we accept that sometimes just because you can doesn't mean you should?

------
goatslacker
I used to dream about FP and Python; then I read about Guido's plans for
removing `lambda` and `reduce`
([http://www.artima.com/weblogs/viewpost.jsp?thread=98196](http://www.artima.com/weblogs/viewpost.jsp?thread=98196))
and that is when I stopped writing Python.

~~~
hat_matrix
That disappointed me also, but the FP voice from the community is strong
enough. Reduce lives on in functools and the functional package also adds
additional functions, like compose. It's one of the packages I install first
in a new Python installation. But true, Python will never grow into Haskell.

~~~
hmsimha
Also, lambda doesn't appear to be going anywhere. It's one of my favorite
features.

~~~
hat_matrix
Yes, not going away, but not growing beyond the crippled, single-expression
version it is now...

------
dickler
There's little reason to use python instead of haskell these days.

- compiled code, fast

- cleaner, nicer, more concise syntax

- type safety built in

- pattern matching

- numpy equivalents are now available

- purity (see John Carmack talking about how it helps when codebases get
large)

To switch from python to haskell easily within minutes, realize that a python
function is simply do notation in haskell. So replace assignment (=) with
arrow (<-) and write with "func v1 v2 = do".

Then pure functions are those without do and without assignment, written
lisp/scheme style. "$" means you can get rid of the parentheses.

After that, declaring types is equivalent to expressing how the syntax tree
can be written. You can use these to plan out how you attack your problem and
then let the compiler help you when writing out your functions. You could even
just write out the types and outsource the function writing to someone else
(similar to how your OO architects can parcel out code writing).

So, all python, golang, and ruby programmers - you really should check out
haskell and realize its superiority. They've taken your niche.

(If anyone wants to pay me to code in haskell, send me an email at
haskellpostgresprogrammer@gmail.com, looking for telecommute work)

~~~
pekk
It's a completely different language with different goals. It's like you said
"there's little reason to eat eggs instead of drinking wine these days."
Python was expressly NOT intended to be a compiled language with heavy
emphasis on type safety.

------
tome
If you're forced to write Python and you like FP I think it makes sense to try
to twist the Python you write towards FP as much as possible. The advice "Stop
writing classes" is particularly helpful in my experience.

However the "What did we miss?" slide tells it like it is. FP in Python is
never going to be like FP in, e.g., Haskell. Moreover, the libraries you use
are going to have objects and mutable state everywhere which ends up being
very frustrating.

If you're free to choose, I recommend just going with Haskell (or OCaml or
Scala or whatever according to your taste).

~~~
ams6110
_I recommend just going with..._

If you're writing some kind of server, I'd nominate Erlang for serious
consideration. The OTP framework is fantastic and rock solid.

------
westurner
Great deck about functional programming in Python. Also:

* `operator.attrgetter`, `getattr()`, `setattr()`, `object.__getattribute__` [1][2]

* `operator.itemgetter`, `object.__getitem__` [3][4]

* `collections.abc` [5][6]

[1]
[http://docs.python.org/2/library/operator.html#operator.attr...](http://docs.python.org/2/library/operator.html#operator.attrgetter)

[2]
[http://docs.python.org/2/reference/datamodel.html#object.__g...](http://docs.python.org/2/reference/datamodel.html#object.__getattribute__)

[3]
[http://docs.python.org/2/library/operator.html#operator.item...](http://docs.python.org/2/library/operator.html#operator.itemgetter)

[4]
[http://docs.python.org/2/reference/datamodel.html?#object.__...](http://docs.python.org/2/reference/datamodel.html?#object.__getitem__)

[5]
[http://docs.python.org/2/library/collections.html#collection...](http://docs.python.org/2/library/collections.html#collections-abstract-base-classes)

[6]
[http://docs.python.org/3/library/collections.abc.html](http://docs.python.org/3/library/collections.abc.html)

------
_ZeD_
every time I see

    
    
        reduce(operator.add, ...)
    

and variations like that I think GvR was right to relegate reduce to the
functools module... it's just that

    
    
        sum(...)
    

covers the most common (90%? only?) usage...

~~~
ak217
Not just that. More generally, comprehensions cover the most common use cases
and are tremendously more readable.

~~~
tome
Comprehensions are more readable in the sense that they read more like
"English" or "natural language". Unfortunately they're inherently
non-composable and make refactoring harder.

d0m's Moka example
([https://news.ycombinator.com/item?id=6150944](https://news.ycombinator.com/item?id=6150944))
is closer to composable. I would suggest the following modification. Instead
of chaining methods, have a function "chain" which composes functions. We
could then write

    
    
        chain(do(string.split, '+'),
              keep(lambda x: x != ""),
              map(int),
              reduce(moka.add))
    

If I decided that removing blanks and converting to int was a concept that was
worth capturing in its own function, all that I need to do is splice out part
of the chain as is.

    
    
        non_blanks_to_int = chain(keep(lambda x: x != ""),
                                  map(int))
        
        chain(do(string.split, '+'),
              non_blanks_to_int,
              reduce(moka.add))
    

No modifications were required. It's more complicated with comprehensions. Of
course it's still _possible_, but it's one more unnecessary grain of sand in
the mechanism. Doing the same with the imperative version would be nigh-on
impossible.
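Such a chain needs only a few lines of plain Python. In this sketch the stage
names (keep, fmap, fold) are invented to stand in for Moka's methods:

```python
from functools import reduce
import operator

def chain(*fs):
    """Compose left to right: chain(f, g)(x) == g(f(x))."""
    return lambda x: reduce(lambda acc, f: f(acc), fs, x)

# Curried stages -- hypothetical stand-ins for Moka's methods.
keep = lambda pred: lambda xs: [x for x in xs if pred(x)]
fmap = lambda f: lambda xs: [f(x) for x in xs]
fold = lambda f, init: lambda xs: reduce(f, xs, init)

# Splicing out a sub-chain works exactly as described:
non_blanks_to_int = chain(keep(lambda x: x != ''), fmap(int))

total = chain(lambda s: s.split('+'),
              non_blanks_to_int,
              fold(operator.add, 0))('28+32+++32++39')
print(total)  # 131
```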

Come to my FP days talk in Cambridge, UK, Thursday 24th October, 2013 if you
want to hear more about this!

[http://lanyrd.com/2013/fpdays/sckzkz/](http://lanyrd.com/2013/fpdays/sckzkz/)

~~~
maxerickson
If you take the awful step of naming things, comprehensions are easy enough to
chain. That is (in untested python...)

    
    
        parts=s.split('+')
        keep=(s for s in parts if s)
        ints=(int(s) for s in keep)
        sum(ints)
    

Hiding the middle steps there in a function is easy enough.

That split returns a list, but the other steps are lazy.

~~~
tome
But naming your intermediate variables means it's not possible to splice parts
out.

~~~
maxerickson
You did call it a grain of sand, but I suppose I'm responding to where you
call it complicated. Moving a couple of the names into a separate function
means passing in one of them and editing the last line (at least in python
with explicit returns). That meets the meaning of more complicated, but (in my
opinion) in a sort of hair splitting way.

~~~
tome
Horses for courses then I guess. If the "easier to refactor" argument doesn't
convince you maybe the "easier to read" argument will. Personally I strongly
prefer zero-length variable names, but everyone has their own taste.

------
zipfle
I'm a python-native programmer who just recently picked up SICP. I've been
trying to use some more functional tools; I told a friend at work one day that
I was going to try to do everything with just map and filter.

But I also just had an experience I haven't had much since I was extremely
new: the experience of going back to something I wrote last week and not being
able to decipher it. I'm going to keep it up, partly because I like it and
also because I have some hope (unjustified?) that map will give me better
performance than looping. I wish there was a way to enable tail recursion
though--seriously, I promise I'll ask for clear stack traces when I need them.
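For what it's worth, tail calls can be emulated with a trampoline -- a
workaround, not real tail-call elimination, and the names here are purely
illustrative:

```python
def trampoline(f, *args):
    """Call f; while the result is a zero-argument callable (a thunk
    standing for the next tail call), invoke it. The while loop takes
    the place of stack growth."""
    result = f(*args)
    while callable(result):
        result = result()
    return result

def countdown(n, acc=0):
    if n == 0:
        return acc
    return lambda: countdown(n - 1, acc + n)  # return a thunk, don't recurse

print(trampoline(countdown, 100000))  # no RecursionError
```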

~~~
teddyh
@dorolow, your comment is dead because it used the auto-banned word
"mast?rb?tion" (uncensored). FYI.

~~~
lelf
What? We've got auto-banned words?

~~~
teddyh
Try it and see.

------
hcarvalhoalves
Love the video mentioned in the slides:

[https://www.youtube.com/watch?v=o9pEzgHorH0](https://www.youtube.com/watch?v=o9pEzgHorH0)

------
gpsarakis
Great presentation! Just a question/remark: partial function application seems
handy, but doesn't using functools.partial to create a new function become
obsolete when you can use non-keyworded variable-length argument lists with
*args in your function definitions?
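They address different needs, though: *args widens what a function accepts at
its definition, while functools.partial specializes an existing function from
the caller's side without touching its definition. A small sketch:

```python
from functools import partial

def power(base, exp):
    return base ** exp

# Specialization happens at the call site; power itself is unchanged.
square = partial(power, exp=2)
cube = partial(power, exp=3)

print(square(5))  # 25
print(cube(2))    # 8
```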

------
bcl
The official Python docs also have a section on FP.

[http://docs.python.org/dev/howto/functional.html](http://docs.python.org/dev/howto/functional.html)

------
ivanbrussik
wow, i love your presentation design, type and function :)

~~~
westurner
is this a reveal.js rendering of an IPython notebook ?

    
    
        ipython nbconvert --to slides <notebook.ipynb>
    

[http://ipython.org/ipython-doc/dev/interactive/nbconvert.htm...](http://ipython.org/ipython-doc/dev/interactive/nbconvert.html#converting-notebooks-to-other-formats)

------
ghostdiver
this code is so smart, maybe because it is Python after all

    
    
        from operator import add
        expr = "28+32+++32++39"
        print reduce(add, map(int, filter(bool, expr.split("+"))))
    

but in JavaScript it looks much more readable:

    
    
      "28+32+++32++39".split(/\++/).reduce((r, v) => r + Number(v), 0);

