
Functor-Oriented Programming - szatkus
http://r6.ca/blog/20171010T001746Z.html
======
harveywi
Seems to be similar to the direction taken by John De Goes (using algebras to
describe your problem domain, building programs from those algebras via free
monads/applicatives/etc., interpreting those programs using natural
transformations). I think there is definitely something to this kind of
thinking.
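
The approach described above can be sketched in a few lines of Haskell. This is a toy illustration, not an example from either author's posts: a key-value-store algebra expressed as a functor, programs built with a hand-rolled free monad, and a pure interpreter. (A fuller treatment would interpret via a natural transformation `forall x. StoreF x -> m x`.)

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- A toy algebra for a key-value store, expressed as a functor.
data StoreF k
  = Put String String k
  | Get String (Maybe String -> k)
  deriving Functor

-- The free monad over an arbitrary functor.
data Free f a = Pure a | Free (f (Free f a))

instance Functor f => Functor (Free f) where
  fmap g (Pure a)  = Pure (g a)
  fmap g (Free fa) = Free (fmap (fmap g) fa)

instance Functor f => Applicative (Free f) where
  pure = Pure
  Pure g  <*> x = fmap g x
  Free fg <*> x = Free (fmap (<*> x) fg)

instance Functor f => Monad (Free f) where
  Pure a  >>= g = g a
  Free fa >>= g = Free (fmap (>>= g) fa)

liftF :: Functor f => f a -> Free f a
liftF = Free . fmap Pure

put :: String -> String -> Free StoreF ()
put k v = liftF (Put k v ())

get :: String -> Free StoreF (Maybe String)
get k = liftF (Get k id)

-- A program over the algebra, written without committing to an interpreter.
program :: Free StoreF (Maybe String)
program = do
  put "name" "Albert"
  get "name"

-- One possible interpreter: a pure one over an association list.
runPure :: Free StoreF a -> [(String, String)] -> a
runPure (Pure a)              _ = a
runPure (Free (Put k v next)) s = runPure next ((k, v) : s)
runPure (Free (Get k next))   s = runPure (next (lookup k s)) s

main :: IO ()
main = print (runPure program [])  -- prints: Just "Albert"
```

The point is that `program` is just data; swapping `runPure` for an `IO`-backed interpreter leaves it untouched.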

Both authors argue that modern programming languages do not have the best
ergonomics for this style of programming, although it is at least possible to
do most of it in languages such as Haskell and Scala.

1. Data Structures Are Antithetical to Functional Programming
[[http://degoes.net/articles/kill-data](http://degoes.net/articles/kill-data)]

2. A Modern Architecture for FP
[[http://degoes.net/articles/modern-fp](http://degoes.net/articles/modern-fp)]

3. Modern Functional Programming: Part 2
[[http://degoes.net/articles/modern-fp-part-2](http://degoes.net/articles/modern-fp-part-2)]

~~~
shalabhc
> Data Structures Are Antithetical to Functional Programming

Interestingly, I've heard a similar point of view from the 'pure OO'
(Smalltalk-style) world as well: specifically, avoiding 'data structures' and
embracing late binding.

~~~
jcelerier
And both styles lead to equally unreadable code. If you can't take an average
new graduate, put them in front of your code, and have them be productive in
your codebase in a few days, it's no good.

~~~
dmit
An average new graduate wouldn't be productive in any codebase in a few days.
And even if they were, that's a terribly low bar to set. Software development
education sucks and most people have to learn on the job. So why not teach
them ways to write more reliable and maintainable code?

~~~
Tarean
That is probably good advice, but this style of coding (which comes down to
lawless type classes without inherent utility) does make it much more
difficult to read unknown code.

Usually documentation isn't fleshed out in internal modules, so I end up
flipping between various instances and call sites just to figure out what the
type class is supposed to do. This doesn't mean that there aren't valid use
cases for this, but please don't use it as the default.

------
fmap
There is an easy design choice - which Haskell didn't take - which would allow
us to have transparent Identity functors, associative Compose and many others:
Add a conversion rule to your language and don't eagerly expand definitions at
the type level.

For example, in Coq you would write

    
    
      Definition Identity a := a.
      Definition Compose F G a := F (G a).
    
      Instance: Functor Identity. Proof[...]
    

and so on. This works, since explicit conversion during type checking allows
you to associate type classes to otherwise transparent definitions.

Of course, this is no silver bullet and complicates other things. Everything
works out beautifully though, if you combine conversion with bidirectional
type checking, at the expense of less powerful type inference.
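
For contrast, a minimal Haskell sketch of the status quo: `Identity` and `Compose` must be newtypes, because type synonyms cannot carry instances, so the wrapping is visible at every use site and `Compose` is only associative up to explicit rewrapping.

```haskell
-- These mirror Data.Functor.Identity and Data.Functor.Compose from base,
-- redefined here so the sketch stands alone.
newtype Identity a = Identity { runIdentity :: a }

newtype Compose f g a = Compose { getCompose :: f (g a) }

instance Functor Identity where
  fmap f (Identity a) = Identity (f a)

instance (Functor f, Functor g) => Functor (Compose f g) where
  fmap f (Compose fga) = Compose (fmap (fmap f) fga)

-- The wrapper is not transparent: [Maybe Int] and Compose [] Maybe Int
-- are distinct types, unlike Coq's transparent definitions above.
example :: Compose [] Maybe Int
example = fmap (+ 1) (Compose [Just 1, Nothing])

main :: IO ()
main = print (getCompose example)  -- prints: [Just 2,Nothing]
```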

~~~
hyperpape
Interestingly, the author did a dissertation using Coq, so I wonder if he
thinks that it's not really suitable.

~~~
fmap
I don't know, but Russel O'Connor got his PhD in 2008, the same year that type
classes were introduced in Coq.

There has been a lot of development on the Coq proof assistant since then. I
don't imagine that working with Coq in 2008 was particularly pleasant. :)

~~~
auggierose
I don't think working with Coq has ever or will ever be particularly pleasant
:-)

O'Connor gave a nice talk at TPHOLs which I attended:
[https://arxiv.org/abs/cs/0505034](https://arxiv.org/abs/cs/0505034)

Keeping up with Coq shouldn't have been too difficult for him.

------
dbranes
This is great. It's essentially asking you to work in a 2-category[1], which,
once you get used to it, is an amazingly intuitive way to make sense of
constructions in ordinary 1-categories.

[1][https://ncatlab.org/nlab/show/2-category](https://ncatlab.org/nlab/show/2-category)

------
platz
Gabriel Gonzalez: The functor design pattern

[http://www.haskellforall.com/2012/09/the-functor-design-
patt...](http://www.haskellforall.com/2012/09/the-functor-design-pattern.html)

is also a well-known post.

------
vog
Pardon my ignorance, but can't this be more easily translated into terms of
(modern) object oriented programming, where you avoid mutating state?

Here, the "natural transformation" is a function that takes an input object
which adheres to a specific interface (i.e. provides a set of specific
methods), operates only on that input interface, and returns another object
that conforms to the expected output interface.

Avoiding mutating state means that either those objects have no attributes, or
that the state can only be set on construction. (In FP speak, this corresponds
to currying of common parameters, or to an internal abstract type, if the
constructor is private and you can only create it via a specific set of static
methods.)

Is this modern style of OOP equivalent to "Functor-Oriented Programming", or
am I missing something?

~~~
lmm
> Avoiding mutating state means that either those objects have no attributes,
> or that the state can only be set on construction. (In FP speak, this
> corresponds currying of common parameters. Or to an internal abstract type,
> if the constructor is private and you can only create it via a specific set
> of static methods).

I don't think the style you're talking about is something that most people
would call "object oriented programming" ("modern" or otherwise). Indeed the
usual name for the programming style in which you avoid mutating state is
"functional programming".

~~~
vog
Maybe, but the terms "class", "method" and "interface" are more commonly
understood than "functor" or "natural transformation".

So for better understanding by the general public, it may make sense to offer
an explanation in terms of OOP, even if what is described is more FP style
than OO style.

~~~
lmm
Functors are functors, natural transformations are natural transformations;
neither has much in common with classes, methods, or interfaces. Functors are
being used to achieve composition here, but the use of functors is an
essential part of the pattern: talking about "composition-oriented
programming" would be vague to the point of meaninglessness (every programmer
believes they're being compositional), and as far as I know there simply isn't
an OO equivalent of open recursive types.
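
For reference, a natural transformation in the post's sense is just a fully polymorphic function between functors; the polymorphism guarantees it can reshape the container but never touch the elements. A small sketch:

```haskell
{-# LANGUAGE RankNTypes #-}

-- A natural transformation between functors f and g.
type Nat f g = forall a. f a -> g a

-- maybeToList is natural: being polymorphic in a, it can only
-- rearrange structure, never inspect or alter the elements.
maybeToList :: Nat Maybe []
maybeToList Nothing  = []
maybeToList (Just a) = [a]

main :: IO ()
main = print (maybeToList (Just 3))  -- prints: [3]
```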

------
vittore
I really want to see examples of this.

~~~
danidiaz
I've seen mentioned the "streaming" package as an example of this:
[http://hackage.haskell.org/package/streaming](http://hackage.haskell.org/package/streaming)

"streaming" provides a Stream type which is parameterized by a Functor. The
main module in the package provides very general functions which work for any
functor and often involve natural transformations and functor compositions and
sums.
[http://hackage.haskell.org/package/streaming-0.1.4.5/docs/St...](http://hackage.haskell.org/package/streaming-0.1.4.5/docs/Streaming.html)

The "Streaming.Prelude" instantiates the functor with a pair type and provides
the more concrete API that one would expect from a streaming library.
[http://hackage.haskell.org/package/streaming-0.1.4.5/docs/St...](http://hackage.haskell.org/package/streaming-0.1.4.5/docs/Streaming-
Prelude.html)
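
Roughly, the core idea looks like this (a simplified sketch, not the library's exact definitions): a stream of steps shaped by a functor `f`, interleaved with effects in a monad `m`, ending in a result `r`.

```haskell
{-# LANGUAGE DeriveFunctor #-}
import Data.Functor.Identity (Identity (..))

-- Simplified version of the streaming package's core type.
data Stream f m r
  = Step (f (Stream f m r))
  | Effect (m (Stream f m r))
  | Return r

-- Streaming.Prelude instantiates f with a pair-like functor:
data Of a b = a :> b deriving Functor

-- A concrete element stream, with Identity as a do-nothing effect layer.
example :: Stream (Of Int) Identity ()
example = Step (1 :> Step (2 :> Effect (Identity (Step (3 :> Return ())))))

-- A fold over the stream's structure.
sumStream :: Num a => Stream (Of a) Identity r -> a
sumStream (Return _)            = 0
sumStream (Effect (Identity s)) = sumStream s
sumStream (Step (a :> rest))    = a + sumStream rest

main :: IO ()
main = print (sumStream example)  -- prints: 6
```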

------
AlexCoventry
Anyone got some good examples of this?

> With functor oriented programming, polymorphism is not necessarily about
> using functions polymorphically. Instead, polymorphism provides correctness
> guarantees by ensuring that a particular function can only touch the
> specific layers of functors in its type signature and is independent of the
> rest of the data structure. One benefits from polymorphism even when a
> function is only ever invoked at a single type.

~~~
lmm
Imagine you have two functions that you only call with lists of integers:

    
    
        def summarize1[A](list: List[A]) = ...
        def summarize2(list: List[Int]) = ...
    

Even though you don't call summarize1 polymorphically, the signature tells you
that it's only looking at the structure of the list rather than what's inside
it - a reader can immediately see that it's going to do something like return
the length of the list. Whereas summarize2 could do other things, e.g. sum the
elements of the list, add and subtract alternating elements...
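
The same distinction in Haskell terms (a sketch; by parametricity the polymorphic version can only use the list's shape, so the bodies below are essentially the only interesting `Int`-valued choices):

```haskell
-- summarize1 cannot inspect the elements at all; any Int it returns
-- must be a function of the list's structure alone.
summarize1 :: [a] -> Int
summarize1 = length

-- summarize2 may depend on the element values themselves.
summarize2 :: [Int] -> Int
summarize2 = sum

main :: IO ()
main = print (summarize1 [10, 20, 30], summarize2 [10, 20, 30])  -- prints: (3,60)
```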

A list is a pretty simple structure; these kinds of guarantees become more
useful when the structure itself is complicated. E.g., in the project I'm
currently working on, I have a tree-like datatype that only ever contains
Strings, but I've written it generically, so I can be confident that my
structural functions only work on the structure, and I have a clear separation
between functions that operate on the structure of the tree and functions that
operate on the content of the tree.

------
runeks
It would be nice to know what kind of code OP writes. I.e., are we talking
application code or library code, and if it's libraries, are we talking
networking libraries, graphics libraries or math libraries?

Usually I find that Haskellers write very "low-level" code, i.e., implementing
math abstractions or the like, while I prefer to write applications (in
Haskell), where this abstraction technique may not be suitable.

~~~
kmicklas
This style is plenty useful in "higher level" stuff. Consider Reflex:
[http://docs.reflex-frp.org/en/latest/](http://docs.reflex-frp.org/en/latest/)

~~~
runeks
I don’t think I understand the reference. Reflex is a library, used to create
applications.

The actual application code doesn’t seem to adhere to any type of functor-
oriented programming, as far as I can see: [http://docs.reflex-
frp.org/en/latest/architecture.html](http://docs.reflex-
frp.org/en/latest/architecture.html)

~~~
kmicklas
> I don’t think I understand the reference. Reflex is a library, used to
> create applications.

Yes, but typical uses would be mobile apps and web apps, pretty "high level"
stuff.

> The actual application code doesn’t seem to adhere to any type of functor-
> oriented programming, as far as I can see: [http://docs.reflex-
> frp.org/en/latest/architecture.html](http://docs.reflex-
> frp.org/en/latest/architecture.html)

[https://hackage.haskell.org/package/reflex-0.4.0/docs/Reflex...](https://hackage.haskell.org/package/reflex-0.4.0/docs/Reflex-
Class.html)

The entire library abstracts over its implementation via the "Reflex" class.
This is not _exactly_ the style advocated in the original post, but it's
basically the same spirit.

------
highmastdon
It's not the same thing, but it triggered something. A related programming
language is Koka (see: An overview of Koka: [https://koka-
lang.github.io/koka/doc/kokaspec.html#sec-an-ov...](https://koka-
lang.github.io/koka/doc/kokaspec.html#sec-an-overview-of-koka)).

It's basically a functional programming language with "dot syntax": any
function can be called as if it were a method on its first argument, so
instead of `afunction(param1, param2)` you can write
`param1.afunction(param2)`.

Example: ([https://koka-lang.github.io/koka/doc/kokaspec.html#sec-
struc...](https://koka-lang.github.io/koka/doc/kokaspec.html#sec-structs))

    struct person(age: int, name: string)
    
    fun birthday(p: person) { return p(age = p.age + 1) }
    
    val me = Person(27, "Albert")          // { age: 27, name: "Albert" }
    
    val meWithBirthday = me.birthday()     // { age: 28, name: "Albert" }

~~~
gnaritas
That's just OO without encapsulation.

~~~
wtetzner
No, it's just some sugar over plain old functions.

~~~
gnaritas
Sure, and functions are just sugar over goto's.

~~~
wtetzner
Are you suggesting that the difference between a.foo(b) and foo(a, b) is
analogous to the difference between goto and functions? Functions give
structure and restrictions to goto. Method syntax is just a different way to
write the exact same thing.

~~~
gnaritas
I'm suggesting all of programming is syntactic sugar. Calling something
syntactic sugar is not a valid critique.

~~~
wtetzner
> I'm suggesting all of programming is syntactic sugar.

I wasn't critiquing anything, just suggesting that moving the first argument
of a function call to come before a . isn't enough to consider something OO.

------
z3t4
A strict type system makes sure your program is statically correct, and
catches many bugs caused by the programmer. The real world, though, is dynamic,
and I prefer to check all input data at runtime, e.g. each function checks
whether what's passed into it is correct. It not only catches a lot of bugs,
both those caused by the programmer _and_ the ones caused by interacting with
the outside world, it also makes the code easier to test. This probably sounds
like I'm swearing in church to a Haskell programmer, though.

~~~
thinkpad20
I'm not sure what you mean, or perhaps you have an incorrect understanding of
how Haskell works. Of course data is still checked at runtime; for example, a
parser will check that a string is formatted in a valid way, or a lookup in a
hash table will check that a key is valid. This is done at runtime as it would
be in virtually any language.

However, there's no way in Haskell (short of low-level type system tricks) to
check that a function with an argument of type `Int` actually receives an
`Int` when it is called. There's no need to, because it is guaranteed by the
type system. The strength of these guarantees, coupled with the ability to
define rich, expressive types, is what makes Haskell programs very robust.
For example, you can represent all possible results of a valid JSON parse:

    
    
        data Value
          = Object (HashMap Text Value)
          | Array (Vector Value)
          | String Text
          | Number Float
          | Bool Bool
          | Null
    

Which variant of this type is actually encountered will not be determined
until runtime, so the program is required to respond to the different cases;
however a function which takes type `Value` will never need to worry that it
doesn't actually receive one of the above variants at runtime.
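
For instance, any function consuming `Value` must handle every variant, and nothing outside those variants can reach it at runtime. A small sketch (with a simplified `Value` over base types, so it stands alone without the HashMap/Vector/Text dependencies above):

```haskell
-- A simplified stand-in for the Value type above, using only base types.
data Value
  = Object [(String, Value)]
  | Array [Value]
  | String String
  | Number Float
  | Bool Bool
  | Null

-- The case analysis must cover every variant; the type system guarantees
-- no other shape of input can arrive at runtime.
describe :: Value -> String
describe (Object _) = "object"
describe (Array _)  = "array"
describe (String _) = "string"
describe (Number _) = "number"
describe (Bool _)   = "bool"
describe Null       = "null"

main :: IO ()
main = putStrLn (describe (Array [Number 1, Bool True]))  -- prints: array
```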

------
Fire-Dragon-DoL
Any chance someone can provide code examples, maybe in a bunch of different
languages (Elm, Haskell, and even functional JS)? Would love to understand
this more.

------
dmitriid
> Instead of writing transformations between data structures, one writes
> natural transformations between functors, where a natural transformation
> between functors F and G is a polymorphic function of type forall a. F a ->
> G a.

There's nothing _natural_ about this.

;)

