
What's Functional Programming All About? - punnerud
http://www.lihaoyi.com/post/WhatsFunctionalProgrammingAllAbout.html
======
_ph_
Functional programming means that your program is composed of functions. A
function depends only on its input parameters, and the only effect of applying
a function is the returned result. By this definition, functions have no side
effects.
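A minimal Python illustration of that definition (the function names here are my own, not from the article):

```python
import math

# Pure: the result depends only on the arguments, and calling it
# changes nothing outside the function.
def circle_area(radius):
    return math.pi * radius ** 2

# Impure: the result depends on (and modifies) external state.
total = 0
def add_to_total(x):
    global total
    total += x
    return total

assert circle_area(2.0) == circle_area(2.0)  # always the same result
```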

Why do functions and functional programming matter?

\- Functions are a very nice abstraction. Looking at the input and output
values gives you the complete information about the data flow. This makes
functions very easy to reason about.

\- With the absence of side effects, you also have no unintended side effects,
which plague all code that modifies state.

\- Functions are easily testable: they do not depend on an environment, only
on their parameters.

\- For the same input, a function necessarily returns the same result; this
also makes reasoning about a program's behavior easier.

\- Because they depend only on their inputs, functions are usually very
composable.
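A quick Python sketch of that composability (all helper names are hypothetical):

```python
def normalize(text):
    return text.strip().lower()

def tokenize(text):
    return text.split()

def count(tokens):
    return len(tokens)

# Because each step depends only on its input, the pieces snap
# together without any shared setup or teardown.
def word_count(text):
    return count(tokenize(normalize(text)))

print(word_count("  Functional Programming Matters  "))  # 3
```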

While for certain tasks, modifying a global state can be the most efficient
way of performing a computation, functional programming does not have to be
slow, and in many cases the resulting program might even be faster. There are
several reasons for this:

\- In the absence of side effects, functional programs can be parallelized
relatively easily.

\- With only local state involved, compilers can optimize the code inside
functions more aggressively.

There are pure functional languages, but most modern languages let you write
your program in a functional style, so it is possible to mix functional and
object-oriented programming.

~~~
wiz21c
>>> functions usually are very composable

This plus the fact that the "function" is also the most basic way to describe
operation on data means that you can build any kind of operations, data
structure, program construct as you please like lego.

This is also, imho, the weakness of the paradigm when used in industry: it
gives so much freedom that two programmers will solve the same problem in
different ways, making it harder to share code. Not everyone has the desire
or patience to understand the many flavors of solutions proposed by their
colleagues.

~~~
_mhr_
What do you think the solution is? Not being snarky here; I'm genuinely
interested in the problem you've described, because this unbounded freedom
bothers me too.

~~~
harrisi
My immediate thought is specifying the requirements for the input and output
desired. This is something I thought a small bit about when learning Haskell.
It's easy enough to search on Hoogle[0] by type signature to at least narrow
it down, but then you have to look through all sorts of functions to see if
the semantics are correct. Then, one may think, perhaps having search
functionality to pass a collection of inputs and outputs and run any relevant
functions that have a valid type for the inputs and outputs. However, that
could take longer to compute than just writing the function in the first
place. It also gets complicated because perhaps what you need is a composition
of functions (map reduce, filter map, etc.), so now how do you know when to
stop composing functions to get the desired output? I'm not smart enough to
know if there's a general solution to this (other than simpler cases of
cycles).

I would also really like more insights into this. I'm working on a toy project
dealing with bytecode and being able to interpret the semantics of a series of
instructions reliably would be very handy. I believe if the above has any
practicality then it would help my problem.

[0]: [https://www.haskell.org/hoogle/](https://www.haskell.org/hoogle/)

------
dahart
> The core of Functional Programming is thinking about data-flow rather than
> control-flow

I've spent a small amount of time trying to convince people that FP is a
byproduct of immutability and not much else. But I like this description
better than that. I like that it captures the idea of dependencies. Data flow,
and FP, are about explicitly passing all the required dependencies into any
step in the process, and getting back all the results, as opposed to having
any implicit inputs or outputs. Making all dependencies explicit will force
you into writing immutable code; likewise, writing exclusively with immutable
data and side-effect-free functions will automatically make you spit out code
with explicit dependencies as a byproduct. These are two sides of the
data-flow coin.

I also like that the focus on data-flow makes it clear you can do FP in any
language. Some languages definitely help you, but you absolutely can practice
data-flow over control-flow in C and assembly almost as easily as in Python.
It's just tempting not to when you don't have to; being more strict about FP
takes more self-control.

~~~
stcredzero
_Data flow, and FP, are about explicitly passing all the required dependencies
into any step in the process, and getting back all the results_

I always knew there was some relationship between FP and dataflow. There is
something profound here. However, I'm not so sure that current FP languages
are capturing this something in the best possible way. To me this:

[http://www.lihaoyi.com/post/BasicFunctionalProgramming/CodeG...](http://www.lihaoyi.com/post/BasicFunctionalProgramming/CodeGraph.png)

Somehow seems far inferior to this:

[http://www.lihaoyi.com/post/BasicFunctionalProgramming/Diagr...](http://www.lihaoyi.com/post/BasicFunctionalProgramming/DiagramGraph.png)

~~~
lilactown
Some syntactic sugar can help with this a bit; here's the same code rewritten
using the pipe operator in Elixir:

    
    
        def make_tiramisu(eggs, sugar1, wine, cheese, cream, fingers, espresso, sugar2, cocoa) do
          beat(eggs)
          |> beat(sugar1, wine)
          |> whisk()
          |> beat(beat(cheese))
          |> fold(whip(cream))
          |> assemble(
            soak2seconds(fingers, dissolve(sugar2, espresso)))
          |> sift(cocoa)
        end
    

It reads a lot closer to the recipe, though the nested multiple operations are
still a bit hard to read. Separating them into their own named functions would
probably be better.

~~~
eyelidlessness
In Clojure:

    
    
        (defn make-tiramisu [eggs sugar1 wine cheese cream fingers espresso sugar2 cocoa]
          (-> eggs
              (beat)
              (beat sugar1 wine)
              (whisk)
              (beat (beat cheese))
              (fold (whip cream))
              (assemble (soak2seconds fingers (dissolve sugar2 espresso)))
              (sift cocoa)
              (refrigerate)))

------
chriswarbo
A very nice overview; whilst the author is right about having to 'linearise'
the block diagram into lines of text, it might be worth mentioning that the
diagram is a perfectly valid way of drawing a tree, and that tree is (as far
as I can tell) _exactly_ the syntax tree of the 'big expression' (the version
which doesn't put intermediate values into variables)!

Another nice point about the 'big expression' version: even hardcore
imperative programmers tend to avoid relying on "horizontal" order of
execution, e.g. in `f(x, y)` it's frowned upon to rely on either `x` or `y`
being evaluated first; the order is even undefined in some languages (e.g. C).
Hence the dependencies are clear from the nesting, whilst adjacent expressions
can be assumed to be independent.

FYI I wrote up my own advice at
[http://chriswarbo.net/blog/2015-01-18-learning_functional_pr...](http://chriswarbo.net/blog/2015-01-18-learning_functional_programming.html)
which echoes some of the points in this article (e.g. FP is a style applicable
to many languages; there's a spectrum from "less functional" to "more
functional", and it's fine to pick and choose for your situation; etc.)

------
blackrock
I love programming in a functional-style in Python.

I create one-way functions. Every input will always give the same output.

Then I build out the test harness for that function, to unit test the
function, and pathway test it. So I make sure that I have automated tests to
capture every single extreme that can go into the function.

I do this for every single function created. Then I string together one
function into another, and I put that into its own function. And I unit test
that larger function. Then sometimes, I even put those 2nd level functions,
into more functions. And I unit test that as well. In the end, I just have a
function that calls a function, that calls a function, and so forth. The
rabbit hole just goes deeper and deeper.

If I ever need to enhance a function, then I make my changes, add more unit
tests, and re-execute it. The impact is always local, and I have full
confidence in my unit tests and pathway tests.
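The workflow described above might look like this in Python (a sketch; the function names are invented):

```python
def parse_price(raw):
    """Pure step: same input, same output, no external state."""
    return round(float(raw.strip().lstrip("$")), 2)

def apply_discount(price, percent):
    """Another pure step, tested on its own."""
    return round(price * (1 - percent / 100), 2)

# Second-level function that just strings the pure steps together,
# and gets its own unit tests as well.
def final_price(raw, percent):
    return apply_discount(parse_price(raw), percent)

# The tests stay local: each function is exercised by itself,
# then the composition is tested the same way.
assert parse_price(" $19.99 ") == 19.99
assert apply_discount(100.0, 25) == 75.0
assert final_price("$100", 25) == 75.0
```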

I'm always incredibly amazed when I execute the program, and it works
flawlessly, every time. I keep iterating through the development cycle this
way. And every version checked in, is fully tested, very accurate, with
detailed test logs captured, and most importantly, it does not crash. It's
beautiful.

With Object-oriented programming, I have to maintain an external state. This
is like modifying a global variable within your object. Your methods have to
load and manipulate an external variable. And then another method will modify
it later.

However, I do like the organization and hiding ability of OOP. So in the end,
I just make all my functions as a functional-style, one-way function. And my
methods are just wrappers around it.

My main complaint about Python is that it's just slow. I wish it had a
built-in compile-to-binary feature. I get around that with my heavy unit
tests, but some coding mistakes could be easily caught at compile time.

~~~
FridgeSeal
Check out Nim, it's got python style syntax, but it's statically typed and
compiles down into performant C and then into a binary.

I've been converting a lambda function used at work from Python into Nim[0].
I've had very few issues so far and the performance gains are fantastic
(admittedly my lambda function is pretty straightforward, but still). It's
also got support for functional programming styles, with a function
composition operator, first-class functions, etc.[1] and a pragma to tell the
compiler that a function has no side effects[2].

[0] [https://nim-lang.org/](https://nim-lang.org/)
[1] [https://nim-by-example.github.io/procvars/](https://nim-by-example.github.io/procvars/)
[2] [https://nim-by-example.github.io/procs/](https://nim-by-example.github.io/procs/)

------
bimbossible
I've been paying attention to functional programming for the last 6 or so
months, and coming from a C background, its emphasis on immutability, data
orientation, readability, and safety is really attractive if you're not
worried about performance (which is allowed to take a back seat in most apps
these days). I can't wait for software to improve by harboring the benefits
of functional programming.

~~~
IMTDb
Why should FP be slower ?

The article emphasises that one of the main arguments in favour of FP is that
it's easy to see what parts of the recipe can be done in parallel. Why can't
the runtime automatically leverage multiple cores and speed up the process?

~~~
_greim_
> Why can't the runtime automatically leverage multiple cores and speed up the
> process?

Do any runtimes actually do this? (Honest question)

~~~
jackhack
Erlang/Elixir's runtime, OTP (Open Telecom Platform), will do this. It can
scale to multiple cores, or across multiple machines in a network, almost
effortlessly.

[https://en.wikipedia.org/wiki/Erlang_(programming_language)](https://en.wikipedia.org/wiki/Erlang_\(programming_language\))

~~~
klibertp
It doesn't happen automatically on BEAM - you still need to explicitly spawn
your processes.

------
msla
Functional programming makes compilers nicer to work with, because you can
tell them explicitly when locality can be assumed.

All compilers which do any kind of optimization have to reason about the code
as if it were functional. Otherwise, they couldn't rearrange mathematical
expressions and do things like common subexpression elimination or constant
folding or even simple algebraic rearrangements we expect to happen because
we, as humans, know that mathematical operators are functions and therefore
side-effect-free, and can therefore be switched around arbitrarily as long as
the end result is correct, because the end result is all that matters.

But a compiler which could only optimize arithmetical and algebraic code would
be worthless. How about memory accesses? How about function calls in tight
loops? And that's where you run into problems in a language like C, where
strong assumptions about who gets to look at what generally don't hold.
Optimizing memory accesses is hard because you don't know who's looking at the
memory in a global sense. Spooky action-at-a-distance crops up, the kinds of
things humans don't cope with well because the effect could be many pages of
code removed from the cause. C has the volatile keyword for some instances of
this, which basically kills optimizations for certain variables, but the
general problem stands: C is unphysical. Locality cannot be assumed. Humans
don't deal with that very well.

In a language like C, therefore, there's a tension between generating
optimized code and generating predictable code, and that tension comes from
the fact the compiler doesn't know which code could have side-effects and,
therefore, invoke spooky action-at-a-distance if it's optimized too
aggressively.

In Haskell, or any other language where side-effects must be explicitly
marked, that problem goes away. Most code is purely local in its effects, and
only special code, written inside a special monad, can invoke nasal demons if
the optimizer gets too frisky. The compiler can trace its data-flow graphs and
do all kinds of tricks secure in the knowledge that nobody outside can peek in
to see how the sausage is being made. Only in special, marked locations do the
values get exposed to the outside world, which must be done in an orderly
fashion.

So all code is potentially functional. Functional languages just make it
possible for the developer and the compiler to agree about precisely _which_
code that is.

~~~
benlorenzetti
> ...Optimizing memory accesses is hard because you don't know who's looking
> at the memory in a global sense...In a language like C, therefore, there's a
> tension between generating optimized code and generating predictable code,
> and that tension comes from the fact the compiler doesn't know which code
> could have side-effects and, therefore, invoke spooky action-at-a-distance
> if it's optimized too aggressively.

I get this argument for functional languages, and Haoyi's description of
functional style (writing down the expression tree, not just one possible
serialization of the expression tree) was excellent; both the freedom to
access memory and the lack of richer information about possible execution
orders make it more difficult for the compiler to optimize automatically. I
have definitely bought into writing C in a functional way when using local
variables.

However, the lack of some compiler optimizations due to the absence of memory
aliasing guarantees is not always a disadvantage: think of bootstrapping a
compiler, taking advantage of new processor features, doing an inherently
sequential task like state-machine parsing, or simply enjoying thinking about
data and algorithms at a low level.

What imperative languages need to remain competitive is richer type systems
for objects with pointers into the heap. And this is no secret; languages
have been experimenting with this forever. C++'s attempt with object-oriented
programming was a good stab in the dark but clearly has limitations, with the
explosion of type information regarding things like exceptions and wart
covers like move semantics.

What I would like to see is a language with weak typing and free pointers like
C, but with a more capable, generic type system with concepts, like that
described by Alexander Stepanov in Elements of Programming. Also object
orientation as a language feature should be replaced by [some syntactic sugar
+ generic functions + function overloading + linearish? types] to make OO
style easier than it is in C. Things like memory aliasing optimizations would
be managed by generic functions and type concepts provided by STL like
libraries.

Perhaps it makes me a slower, less productive programmer but I like imperative
programming and access to memory. That is the part I enjoy.

~~~
msla
I fully agree that there's a time and place for explicitly sequential code. I
just think that it's safer for everyone if it's explicitly _marked_ as being
sequential, so compilers know that most optimizations are now off-limits.

~~~
benlorenzetti
This seems like a reasonable compromise if people are interested in a language
that can do both. Using C's syntax, a language could modify the semantics of
terminators , ; : to indicate strict/total ordering of statements.

------
carapace
"Can Programming Be Liberated From the von Neumann Style?"

John Backus's Turing Award Lecture [http://www.cs.tufts.edu/~nr/backus-lecture.html](http://www.cs.tufts.edu/~nr/backus-lecture.html)

(This is the origin of FP.)

edit: One of the things that is central to FP is the idea of being able to do
_algebra_ to your programs.

~~~
kristianp
Not really. McCarthy's Lisp paper was 1959, 18 years before Backus' lecture.
Also, ML appeared in '73.
[https://en.wikipedia.org/wiki/ML_(programming_language)](https://en.wikipedia.org/wiki/ML_\(programming_language\))

~~~
carapace
Eh, er, em, I kinda want to concede the point, but no.

I'm trying to draw attention to the _algebra of programs_. The subtitle of
Backus' paper is "A Functional Style and its Algebra of Programs". I just
skimmed the Wikipedia article on functional programming, and as far as I
could see it's not really mentioned at all.

Everyone seems to leave it out, but it's arguably the most important part of
FP.

You can do wonderful things in Joy (Manfred von Thun's language) which is very
close to Backus' FP system. This algebraic stuff is possible in most languages
that are considered Functional, but it's much easier in a language designed
for it.

~~~
kristianp
I was just pointing out that it wasn't "the origin of FP" as you stated.

~~~
carapace
My whole point is that defining FP as "programming with functions" or
"programming with immutable data" or whatever is incomplete without the idea
of being able to do algebra to derive programs.

Now, you can kinda do algebra with LISP and ML but they do not lend themselves
to the algebraic approach with quite the ease of (Backus') FP or (Von Thun's)
Joy. Backus introduced the terminology and the "higher-order" functions in
that paper. I don't think people were talking about "functional programming"
prior to that. To me, with my admittedly limited understanding and knowledge,
it feels a little "revisionist" to go back and retroactively anoint LISP as
the origin of FP.

(LISP and ML have their own greatness, they don't need to steal John Backus'
glory.)

If FP style becomes popular but everyone omits the algebra then it's not going
to fulfill its potential.

~~~
kazinator
A higher order function is a function that takes one or more functions as
arguments, or returns functions, or does both.

The 1960 Lisp manual already describes a LAMBDA operator and functions that
take functions, such as APPLY.

This paper about FP by Backus appeared in the mid-to-late 1970s.

"Higher order function" is a name given to something that some programmers had
already been taking for granted.

(And I don't think Backus was trying to take credit for inventing anything. Of
course he knew all that.)

~~~
carapace
Read the paper.

The _algebra_ of _programs_ is the thing that Backus' FP emphasizes that's new
relative to LISP et al.

Let me try another tack: Why do you suppose Backus _didn't_ write the paper
in terms of LISP or ML? He knew all that, as you say, and this was his _Turing
Award_ lecture. Are we all really going to try to maintain that _John Backus_
didn't have something new and interesting to talk about on this occasion?

------
dizzystar
I'm always surprised at the continued confusion of what FP is. I think that a
lot of it is defensiveness about not wanting to lose whatever someone's
favorite OO feature is.

In FP, nearly everything you like about OO exists except for one critical
thing: global mutation. Even James Gosling, the creator of Java, encourages
programmers to use mutation as a last option.

In other words, if you are programming well, you are nearly doing FP anyways.

~~~
nlawalker
>> In FP, nearly everything you like about OO exists

Is there an FP pattern that's analogous to an immutable object? The reason I
like objects has nothing to do with mutability or state, it's the convenience
of organizing a bunch of related functionality into a named class. When I
instantiate it, I effectively get a named value through which I can access a
bunch of functions that have all been closed over the same immutable values.
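One common FP pattern for this is a closure (or a record of functions) closed over the same immutable values. A hedged Python sketch (the names are mine, not from any library):

```python
def make_rectangle(width, height):
    # width and height are captured once; nothing outside can rebind
    # them, so the result behaves like an immutable object.
    def area():
        return width * height
    def perimeter():
        return 2 * (width + height)
    def scaled(factor):
        # "mutation" returns a new value instead of changing this one
        return make_rectangle(width * factor, height * factor)
    return {"area": area, "perimeter": perimeter, "scaled": scaled}

r = make_rectangle(3, 4)
print(r["area"]())               # 12
print(r["scaled"](2)["area"]())  # 48
```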

------
telesphore
I dunno, I agree that data flow is important, and maybe the most important
part of functional programming. But the ability to sling program fragments
(partial application, point-free functions, etc.) around as if they were data
seems pretty important to me. I think of it as Lego blocks for programs. If
the program parts snap together real easy and the data moves seamlessly (or
seamlessly-ish) from end to end, then you're on the right track.
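That fragment-slinging is easy to show with `functools.partial` in Python, for example:

```python
from functools import partial

def scale(factor, xs):
    return [factor * x for x in xs]

# Program fragments built from scale, passed around as ordinary data.
double = partial(scale, 2)
triple = partial(scale, 3)

print(double([1, 2, 3]))  # [2, 4, 6]
print(triple([1, 2, 3]))  # [3, 6, 9]
```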

But I suppose there are many paths to functional programming... I have only
followed one so far.

------
flavio81
Best article on FP I've ever read; this deserves a bookmark.

Although to me it's not only about FP but also about data-centric development.

------
zeratax
There actually are people who do program in such a 2D diagram style; it's
called LabVIEW.

I recently had to work with it and while it was a pain to drag around these
blocks instead of just using my keyboard, it really did give you a better idea
of the data flow and made it incredibly easy to execute things in parallel.
Just place two blocks on top of each other.

------
sagitariusrex
Also a great read: "Why Functional Programming Matters"
[https://www.cs.kent.ac.uk/people/staff/dat/miranda/whyfp90.p...](https://www.cs.kent.ac.uk/people/staff/dat/miranda/whyfp90.pdf)

John Hughes is kind of a functional programming god.

------
quadcore
The core of Functional Programming is thinking about a succession of
expressions instead of a succession of instructions.

------
didibus
It's really just a practical application of the lambda calculus, a different
model of computation proven to be equivalent in power to the Turing model.

Its requirements are to have functions from input variables to output
variables, where variables can themselves be functions, and the ability to
apply functions to values.

It contrasts with the Turing model, which says instead that you have infinite
variables that can be uniquely addressed, and a set of instructions which can
read the value of a variable at a given location, transform it, and write it
back to a given location.
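For instance, the lambda-calculus requirements above (functions whose variables can themselves be functions, plus application) can be sketched directly in Python with Church numerals:

```python
# Church numerals: the number n is the function that applies f n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Convert a Church numeral back to a Python int for inspection."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```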

------
cdevs
Functional programming made me start using static classes for manipulation
and instances for basic array containers and cleaners. Beyond that, my job
involves memory-heavy processing that we try to keep low, so to this date I
can't see how immutable data wouldn't be more of a burden, with lingering
objects that are no longer used or need to be cleaned out. I did like how
this article showed how returning a new variable represented the variable's
manipulation through a name change.

~~~
tyurok
I can speak for Erlang when it comes to immutable data structures and the
memory issue. Since an Erlang process is basically a recursive function, once
the process dies or calls itself (with tail-call optimization), unused
structures die out quickly. Also, changing an immutable data structure does
not make a whole copy but rather a "diff".

And since processes are completely isolated, garbage collection is easy (and
doesn't hang the whole runtime), so the memory argument has been solved for
some time, not by the language, but by the runtime.
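The "diff" behaviour can be sketched in any language with a persistent cons list; a minimal Python illustration (not how BEAM is implemented, just the idea of structural sharing):

```python
# A persistent linked list: "changing" it builds a new head that
# shares the entire tail with the old version.
def cons(head, tail):
    return (head, tail)

old = cons(1, cons(2, cons(3, None)))
new = cons(0, old)  # prepend without copying anything

assert new[1] is old  # the tail is shared, not duplicated
```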

------
b4ux1t3
I liked this article, but I feel like it's just a bit too... matter-of-fact.
"FP _will_ make your linter do your work for you", things like that.

("Even if you haven't already made a mistake, and are just thinking of making
a change to a codebase, the # FP version of the code is a lot easier to think
about than the # Imperative version" was particularly bad about this. I have
never had a problem parallelizing a procedural program that would have been
solved by using FP.)

It is _much_ better at being inclusive and not preachy than other articles of
similar scope. But I would prefer if posts like this tried to explain
functional programming without feeling the need to put down other paradigms.
And, to be clear, I'm a big fan of FP, and have been using many of its tenets
for a long time now.

Functional programming is just a different way to structure a code base. Some
people are better able to grasp how data flows through a maze of functions.
Some people are better able to grasp how an object interacts with other
objects.

In the end, it's all just a sequence of instructions executed, in order, by a
machine. The true correct way of writing code is: However you can get that
machine to do what you want it to do, while still being able to make changes
to its behavior.

------
hashmal
The recipe example is expressed even better using concatenative programming.

~~~
pmarreck
You might be right... Write a blog post about it!

------
ianai
It helps ensure software doesn't degrade to a chaotic state.

------
cwyers
The Python example of make_tiramisu really illustrates, I think, why I wish
Python had a pipe-forward operator, like |> in F#/Elixir or %>% in R.
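For what it's worth, a tiny pipe helper can be faked in plain Python; a sketch (the `pipe` name is mine):

```python
from functools import reduce

def pipe(value, *funcs):
    """Thread value through funcs left to right, like |> in F#."""
    return reduce(lambda acc, f: f(acc), funcs, value)

result = pipe(
    "  Hello World  ",
    str.strip,
    str.lower,
    str.split,
    len,
)
print(result)  # 2
```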

~~~
brightsize
If a compile-to-python functional language might work for you, there's
[http://coconut.readthedocs.io/en/master/DOCS.html#pipeline](http://coconut.readthedocs.io/en/master/DOCS.html#pipeline)

------
EmTekker
Interesting perspective. After reading this article, if you want to use FP in
your Python projects, definitely check out the toolz library. It's got a great
API and really good documentation.

------
crimsonalucard
How come immutability is never mentioned? The only technical requirement for
your program to be functional is for every variable to be immutable. That's
really it. Every other functional concept is derived from this one rule.
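A small Python illustration of that derivation: once no variable may be rebound, an accumulating loop has to become a fold (or recursion):

```python
from functools import reduce

# Imperative: accumulates by rebinding a variable.
def total_mutable(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc

# Functional: no variable is ever rebound; reduce threads the
# running total through a pure function instead.
def total_immutable(xs):
    return reduce(lambda acc, x: acc + x, xs, 0)

assert total_mutable([1, 2, 3]) == total_immutable([1, 2, 3]) == 6
```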

------
Eire_Banshee
Its about feeling smarter than the guy next to you.

~~~
nv-vn
If the whole point were to seem smarter than others, what would be the point
of writing articles inviting people to learn it?

~~~
Eire_Banshee
internet street cred

~~~
eropple
So let's think about the two possibilities.

1) People feel, through experience, that writing code that minimizes mutable
state and side effects leads to building systems that are more easily reasoned
about, composed, and debugged, which is why people think about FP in the first
place.

2) Those people just want to look smart.

Geez, I know which of these makes _a heck of a lot_ more sense.

------
rsrsrs86
Silly article, functional programming is about purity. All else is a
consequence.

------
snambi
Went through this post. Oh god... what kind of nonsense... Here is the
definition of FP: making simple things complicated.

~~~
mbroncano
Oddly enough, I got the opposite impression: FP is about making complicated
things simple.

There's some sort of epiphany that naturally occurs after fiddling with FP for
a certain amount of time. After it winds down, your perception of how to solve
problems changes drastically.

The increased awareness alone is worth the trouble of playing with it a bit.
I sincerely encourage you to do so.

~~~
b4ux1t3
Frankly, it's just a different way to think about structuring a code base.

Some people think better functionally, some people think better procedurally,
but, in the end, it's all just a list of instructions, executed in order by a
machine.

I'll never understand the need to poopoo all over someone else's way of
thinking (not you, the guy you're replying to).

~~~
kmicklas
That's like saying monarchy and democracy are just different ways of
structuring a society. Some people think better monarchically, some people
think better democratically. Why poopoo all over someone else's way of
thinking?

~~~
0xCMP
Isn't that what respecting a country's sovereignty is? Are societies not
organized differently?

~~~
b4ux1t3
Honestly, I wouldn't respond to anyone who compares political systems with
software development paradigms. Realistically, it's just really hard to
compare the two directly, and anyone who says otherwise is trying to sell you
something.

