

Functional Python Made Easy - Suor
http://hackflow.com/blog/2013/10/13/functional-python-made-easy/

======
neur0mancer
What Python really needs to be more useful as a "functional language" is a
consistent way to define algebraic data types and to perform pattern matching.

This is the closest thing i found for that:
[https://github.com/lihaoyi/macropy](https://github.com/lihaoyi/macropy)
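
For context, here's a minimal stdlib-only sketch of what people typically fall back to without real ADT and pattern-matching support: namedtuples as constructors plus isinstance dispatch. (The names here are illustrative, not MacroPy's API.)

```python
from collections import namedtuple

# A rough approximation of an algebraic data type:
# a Shape is either a Circle or a Rect.
Circle = namedtuple('Circle', 'radius')
Rect = namedtuple('Rect', 'width height')

def area(shape):
    # Poor man's pattern matching via isinstance dispatch.
    if isinstance(shape, Circle):
        return 3.14159 * shape.radius ** 2
    elif isinstance(shape, Rect):
        return shape.width * shape.height
    raise TypeError('unknown shape: %r' % (shape,))
```

This works, but nothing stops you from forgetting a case, which is exactly the exhaustiveness checking that real pattern matching would buy you.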

~~~
profquail
Forgive me if this is a silly question -- I'm an F# developer by trade, and I
only know a bit of Python -- but why would you rather add a bunch of features
common to strongly+statically-typed functional programming languages to Python
instead of simply writing your code in a language which already has support
for such features?

For example, you could write code you want to use GADTs and pattern-matching
for in Haskell, then create a Python wrapper for scripting purposes. I
understand that having your solution written in two languages generally
requires more effort to keep things working, and it's more difficult to find
qualified developers to hire. To me, it just seems like it would make more
sense to use each language's strengths where you need them instead of trying
to graft all the features you could ever want into a single language (which,
IMO, is basically what C++ does, to its detriment).

~~~
nnq
> F# developer by trade

I thought F# was quite a niche language and that there weren't any "F#
developers", just .NET/C# developers who also know some F#. How did you end up
on this career path?

About grafting features onto Python: most Python guys come from the Linux land
and have quite a problem with using a .NET/Mono language, and Haskell is/seems
too intimidating and complex. Most also seem to dislike/fear the JVM. So
there's only OCaml left and this is even more of a "niche language" than F#.
The only viable option is extending Python; unfortunately, Python is such a
pain in the ass to properly extend that people just make libraries to "ease
the pain".

~~~
profquail
That was true some years back before F# was released as a "first-class"
language (F# 2.0 in VS2010), and the F# community is still (and probably
always will be) dwarfed by the sheer number of C# programmers in the world. On
the upside, the F# community has grown quite a lot in the last year; there's
even an official F# Foundation now: [http://fsharp.org](http://fsharp.org)

I ended up working with F# through my previous job; I founded a startup whose
product was a .NET -> GPU JIT compiler. The idea was similar to what Xamarin is
doing now with their C# (and now F#!) tools for building iOS and Android apps,
in that we aimed to make GPU programming accessible to the everyday
programmer.

I wrote the first version of our product in C#; it was around 30KLoC, still
missing quite a lot of functionality, and buggier than I was really happy
with. I'd heard that F# was based on ML, and that ML was designed for work
like building compilers and theorem provers, so I decided to take the plunge
and learn it. It was a little difficult to wrap my head around at first, but I
stuck with it, and I ported and simplified our original C# codebase to F#
within a couple of months, ending up with ~5KLoC and a new version of our
product which was much faster, had fewer bugs, and was overall easier to
maintain than before.
It sounds a little cheesy, but making the jump to F# and putting in the effort
to learn it _really_ well was one of the best career decisions I've ever made,
and I haven't regretted it for a second. It's made me a much stronger
developer overall.

I think you're spot on about extending Python with libraries to "ease the
pain". As I've said before, Python is quite a good, useful language; however,
it seems like a good chunk of the core "Python" libraries are actually C
libraries designed for use with a Python wrapper.

And sure, Haskell can be a bit obtuse at first, but it's really not too bad if
you actually take the time to learn it. I think the real problem is that
"senior" developers get comfortable in whatever language they use day-to-day
for a long time, then they try Haskell -- which is probably much different
than other languages they've used before -- and since they don't immediately
get it, they assume it's the language's fault and give up.

------
iamartnez
Guido brought up Python functional programming during the 2012 PyCon keynote:
[http://www.youtube.com/watch?v=EBRMq2Ioxsc#t=44m25s](http://www.youtube.com/watch?v=EBRMq2Ioxsc#t=44m25s)

------
kitanata
I gave a talk at PyOhio this year that covers this topic. There are some
errors in the presentation (it was the first time I gave it), but if you are
looking for more information on the subject, check it out.

[http://www.spkrbar.com/talk/11](http://www.spkrbar.com/talk/11)

------
bcl
This is also helpful -
[http://docs.python.org/2.7/howto/functional.html](http://docs.python.org/2.7/howto/functional.html)

------
jevinskie
I really like how re_find gives you the actual regex match right off the bat,
instead of having to go through match_result.groups()!

~~~
falcolas
Using groups lets you pull out parts of a match more easily. Perhaps it's a
bit of a corner case, but for parsing logs, I find it invaluable.

~~~
Suor
re_find returns a tuple of all captures if there is more than one. See the
parse-ini example.

~~~
falcolas
That's definitely a bonus. But in that case, what about the "entire line"
match that you would get from match.group(0)?

And what if the captures are named? Is there a performance penalty for always
using search instead of match? Can I pass re.* flags to alter the behavior of
the engine? Can I pre-compile regexes that I use frequently?

Please don't get me wrong; I appreciate the effort that went into this, but
there appears to be a lot of flexibility (and performance) lost in the re_find
function.

This is cool:

        dict(imap(re_finder('(\w+)=(\w+)'), ini.splitlines()))

But this would beat the pants off it in performance (and is, to my eyes, more
readable and equally functional):

        dict(line.split('=', 1) for line in ini.splitlines() if '=' in line)

~~~
Suor
You can pass flags and a compiled regular expression. A pattern with named
captures causes re_find to return a dict.

You won't get the entire match if you have several captures, but you can add a
pair of parentheses around your regexp to circumvent this limitation.

As you can see, there's plenty of flexibility here. Also, you can fall back to
naked re once or twice a year; it's not that much trouble.
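
To make the semantics discussed above concrete, here is a rough stdlib-only sketch of a re_find-like helper. This illustrates the described behavior (named captures -> dict, several captures -> tuple, one capture -> string, none -> whole match), not funcy's actual implementation:

```python
import re

def re_find(regex, s, flags=0):
    # Accept either a pattern string or a pre-compiled regex.
    regex = re.compile(regex, flags) if isinstance(regex, str) else regex
    m = regex.search(s)
    if m is None:
        return None
    if m.groupdict():            # named captures -> dict
        return m.groupdict()
    groups = m.groups()
    if len(groups) == 1:         # single capture -> just that string
        return groups[0]
    return groups or m.group(0)  # several -> tuple; none -> whole match
```

For example, `re_find(r'(\w+)=(\w+)', 'a=1')` gives `('a', '1')`, while a pattern with no capture groups returns the entire matched text.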

------
houshuang
Very neat, and great examples. I do a lot of data manipulation myself, and
could see this being very helpful/useful.

------
jnazario
neat, i do a lot of FP idioms in my code.

another option i frequently borrow from is google's "goopy":
[http://goog-goopy.sourceforge.net/goopy.functional.html](http://goog-goopy.sourceforge.net/goopy.functional.html)

------
joshz
btw, Python 3.3 introduced ChainMap for "merging" dicts.

[http://docs.python.org/dev/library/collections#collections.C...](http://docs.python.org/dev/library/collections#collections.ChainMap)
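
For readers who haven't used it, a small sketch of how ChainMap behaves. Note that it is a live view over the underlying dicts, not a merged copy; wrap it in dict() to materialize an actual merge:

```python
from collections import ChainMap

defaults = {'color': 'red', 'user': 'guest'}
overrides = {'user': 'admin'}

# Lookups search the maps left to right, so overrides win over defaults.
merged = ChainMap(overrides, defaults)
assert merged['user'] == 'admin'
assert merged['color'] == 'red'

# Materialize a plain dict if you need a real one-off merge.
flat = dict(merged)
assert flat == {'color': 'red', 'user': 'admin'}
```

Because it's a view, later changes to `defaults` or `overrides` show through `merged` but not through `flat`.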

