
Mastering Time-to-Market with Haskell - Tehnix
https://www.fpcomplete.com/blog/2016/11/mastering-time-to-market-haskell
======
dorfsmay
I think this thread, which I started last weekend, about Haskell
productivity vs Rust might be relevant here:

[https://m.reddit.com/r/rust/comments/5dtfp2/haskell_more_pro...](https://m.reddit.com/r/rust/comments/5dtfp2/haskell_more_productive_than_rust/)

(Tried to ask on HN first, which once again proves the value of subreddits, as
small focused communities:
[https://news.ycombinator.com/item?id=12989041](https://news.ycombinator.com/item?id=12989041))

------
douche
Every couple of months, I think to myself that I ought to buckle down and make
an effort at really learning Haskell. And I go through some books and some toy
projects, and then I hit the wall. The problem is, I can't actually use
Haskell in anything production, because nobody else that I work with is going
to be able to figure out how it works, and if I move on to some other job, my
employer is never going to find somebody with Haskell experience in this area
- at least for what they'd be willing to pay.

F# looks like a more pragmatic choice, and I've been looking hard for a good
place to bite off to use it going forward. But again, it'll be bad for the bus
factor.

So the best option I've found is to just start using more functional concepts
and patterns in Java/C#. C# in particular seems to be leaning in this
direction with the features added in the latest language versions.

------
kreetx
The take-away for Haskell tends to be that it works out if there is some kind
of commitment to it at the "core" level (i.e. the long-term members of the
tech team, CTO, etc.). If there is none and people move in and out of the
company, then it's more practical to use a more mainstream language, which is
easier and less far-out to learn.

------
mirekrusin
Why comparisons to C#/Java are mentioned so many times but not a single
mention of F#?

Maybe with F# the author would see a decrease in development time compared to
Haskell?

~~~
vog
Or OCaml, which is kind-of the original language F# was copied from.

------
kriro
The article assumes an existing team which is a bit problematic when talking
about time to market. If you start the analysis earlier (two people discussing
some ideas in a coffee shop) then I'd argue that a language like Haskell can
be problematic if your metric is time to market. You might very well make that
up later by having a more robust code base or reaping any of the other
asserted benefits but the existing gallery of premade and tested building
blocks in other languages seems to be richer. It's probably also going to be
harder to add people to your team (on average).

I would have liked to see a comparison to other functional languages (say
Elixir or OCaml) and not just Java and C#. I'd also argue that picking Java
instead of a more agile environment (there are some cool lightweight Java
frameworks but most people will associate it with the rather heavy enterprise
stack) when comparing time to market is a bit odd. Granted, I'm mostly
thinking about webapps (but the article mentions Yesod).

Still a nice article (since my post sounds overly negative upon rereading).

~~~
lmm
> The article assumes an existing team which is a bit problematic when talking
> about time to market. If you start the analysis earlier (two people
> discussing some ideas in a coffee shop) then I'd argue that a language like
> Haskell can be problematic if your metric is time to market.

Well, the best language for an early-stage business is the one the founder
knows, but that's always going to be true. There's no reason that language
can't be Haskell.

------
greydius
> Haskell developers are self-selecting

This is not going to change.

I love Haskell, and I appreciate the effort many are making to evangelize the
language, but I am experienced (ie. cynical) enough to believe that it's never
going to become truly mainstream.

~~~
kreetx
That's my experience as well. As a Haskell user I once felt that I had a
secret, that it's so much better than the rest. Now, years later, it seems
many people pretty much can't tell the difference between a good and a bad
language. Or rather, many pick a language by its spread rather than its
properties.

------
kinkdr
Although I'm very fresh to the Haskell world myself, I tend to agree with the
author that, when I know what I am doing, my Haskell code is usually written
in less time and has fewer bugs.

Having said that, Time-to-Market is only partially influenced by my-code, the
biggest part is the code that I don't have to write, i.e. third-party
libraries.

In my Haskell adventures I am having trouble finding third-party libraries for
even the most popular things, e.g. Cassandra. As far as I can tell there are
two libraries, the `cassandra-cql` and the `cql-io`, the first hasn't been
updated for a year now, and the second has only 3 stars, which makes me
uneasy.

So, although I can see where the author is coming from, I don't think you can
beat Java, Ruby, JS or Python in that sense. Unless of course your
code/project doesn't have a lot of dependencies.

~~~
eklavya
Mine has a single star but you may find it useful.
[https://github.com/eklavya/hascas](https://github.com/eklavya/hascas)

~~~
kinkdr
Thanks. I love that you have a fair-sized concrete example for usage. I am
giving it a try now.

------
stanislavb
And I believe that, based on the listed properties/features, Elixir may
qualify pretty well too. What is more, Elixir may be a better choice in
regard to time-to-market + developer happiness.

~~~
hellofunk
Yes, but that's a dynamically typed language and everything changes when
comparing it. There are so many tradeoffs.

Clojure is quite quick to write and time to market can be very fast, as with
most lisps, but you pay the price at debug time. We recently shipped a
production app that had a small typo that the compiler would have easily
caught, but instead it crashed the site when this one particular task was run.

~~~
kazinator
Your experience with Clojure might not in fact generalize to "most lisps",
which have very good compilers that catch all kinds of errors. Certainly the
"low hanging fruit" ones like references to unbound variables or functions and
such.

What was the typo?

> _it crashed the site when this one particular task was run._

Might that also be because that was the _first_ time the task was run at all,
since the code was written or altered?

Code that compiles (such that we are confident it is probably free of silly
typos, other than those somewhat rare typos which play out in some way that
still compiles), might not be correct; it still requires testing.

~~~
hellofunk
The typo had to do with accidentally naming a variable in a let binding to the
same name as a built-in function. It happily compiled but when that code
actually ran in one particular use-case, it crashed due to the way the binding
and the scoping worked in that case.
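
Since the thread is about Haskell, here is a hedged sketch (hypothetical, not
from the discussion) of how a statically typed language tends to catch this
particular slip: rebinding a built-in name is still legal, but GHC's
`-Wname-shadowing` warning flags it, and actually *misusing* the shadowed
binding fails to compile instead of crashing at runtime.

```haskell
{-# OPTIONS_GHC -Wname-shadowing #-}

-- Hypothetical sketch of the Clojure incident above, in Haskell.
-- If we shadowed the Prelude function, e.g.
--
--   let length = 5 in length xs
--
-- GHC would reject the program (an Int cannot be applied as a
-- function) and also warn about the shadowing itself.

example :: [Int] -> Int
example xs =
  let n = 5 :: Int   -- a fresh name avoids the problem entirely
  in n + sum xs

main :: IO ()
main = print (example [1, 2, 3])  -- prints 11
```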

~~~
kazinator
> _accidentally naming a variable in a let binding to the same name as a
> built-in function._

Not even a problem in a Lisp-2 like ANSI CL!

    (let ((list '(1 2)))
      (list list list)) -> ((1 2) (1 2))

Facepalm. We could argue you were burned by the stupid Lisp-1 namespacing.
Under the separate function and variable namespace of a Lisp-2, you would have
to bind a local _function_ in order to shadow a global one.

Even without a compiler, we can implement a warning for this. The code walker
which expands macros is aware of lexical environments and can issue
diagnostics when suspicious-looking shadowing is going on, or unbound
variables are referenced and such.

Not only is this not the fault of the language being dynamic, but the problem
could exist in a static language, like, oh, C:

    #define DECLARE_MY_PRINTF int (*printf)(const char *, ...) = my_printf;

    {
      DECLARE_MY_PRINTF
      printf("hello, %s\n", "world");  // goes to my_printf via shadowing local var
    }

ISO C does not require a diagnostic for this. Gcc has -Wshadow. ISTR -Wshadow
is _not_ turned on by -Wall or -Wextra; you have to use it explicitly.

Of course, the above depends on the local printf pointer actually having the
right type so that the call is well-formed. If we just have "int printf" or
whatever, the type system will catch it.

------
pron
> In summary we've seen that: Haskell decreases development time...

Have we actually _seen_ that or have you just _asserted_ that? Is this really
true, and if it is, by how much? Haskell has been around for a couple of
decades now, and has had at least two hype cycles (I remember that when I was
in university in the late '90s, Haskell was the next big thing). It does not seem
to expand significantly even within organizations that have tried it (and
that's a very negative signal), with at least one notable case where the
language has been abandoned by a company that was among the flagship adopters.

In general, we know that often linguistic abstractions that seem like a good
idea in theory -- or even seem to work nicely in small programs -- don't end
up having a significant effect on the bottom line when larger software is
concerned. People say that scientific evidence of actual contribution is hard
to collect, but we don't even have well-researched anecdotes. Not only do we
not have strong evidence in favor of this hypothesis, but there aren't even
promising hints. All we do have is people who really like Haskell based on its
aesthetics and really _wish_ that the nice theoretical arguments
translated to significant bottom-line gains.

This blog post by Dan Ghica, a PL researcher, really addresses this point:
there is nothing to suggest that aesthetically nice theory translates to
actual software development gains, and wishful thinking (or personal affinity)
simply cannot replace gathering of data:
[http://danghica.blogspot.com/2016/09/what-else-are-we-gettin...](http://danghica.blogspot.com/2016/09/what-else-are-we-getting-wrong.html)

~~~
psibi
It's hard to measure these types of things. But if you are interested, there
is a related paper regarding that: [http://haskell.cs.yale.edu/wp-content/uploads/2011/03/Haskel...](http://haskell.cs.yale.edu/wp-content/uploads/2011/03/HaskellVsAda-NSWC.pdf)

Although the study in the paper isn't very practical, it's still an
interesting experiment.

~~~
pron
> It's hard to measure these types of things.

I'll settle for well-researched case studies.

> there is a related paper regarding that

That's a step in the right direction, but the paper doesn't discuss software
development, but prototyping. We know that "theoretically aesthetic" languages
do well in specification and prototyping.

~~~
Tehnix
>I'll settle for well-researched case studies

Don't know if it helps, but they have some case studies here:
[https://www.fpcomplete.com/case-studies](https://www.fpcomplete.com/case-studies).

~~~
pron
That would have helped a lot if those really were case studies. Unfortunately,
they're just marketing material for FP Complete (with statements like "In
Haskell, Acme Inc. found the perfect solution!"). There's nothing wrong with
marketing material, but that's not what I meant by case studies (I meant
actual technical reports).

------
tmptmp
Warning: be warned before you commit to Haskell. Not all is rosy about
Haskell. You may find yourself in a quagmire if you don't know for sure what
you are going to get from Haskell, especially from the libraries. Although
this is true for other languages too, the library support for Haskell is
still _far from satisfactory_ compared to that for Python/Java. The Haskell
community seems to be divided over it.

Not so long ago there was some discussion about "batteries" included with
Haskell. [1] It compared the situation of Haskell with that of Python/Java
etc., and is worth reading if you are about to go the Haskell route.

It seems the priorities (academic, commercial, library support and so on) of
the members of the Haskell community are at a crossroads, and they cannot
seem to resolve them very well, IMHO.

My take: Haskell is good for learning some really deep concepts, but maybe
not so good when it comes to commercial projects, unless you are a Haskell
veteran and also have an army of Haskell veterans with you.

[1] [http://osdir.com/ml/haskell-cafe@haskell.org/2016-10/msg0001...](http://osdir.com/ml/haskell-cafe@haskell.org/2016-10/msg00014.html)

~~~
cageface
You would think that a really rigorous language like Haskell would foster a
better ecosystem of libraries than ad hoc languages like Python or JS but that
has not been the case at all so far. I'm not sure exactly why but it's one of
the main reasons I'm taking a wait and see attitude to Haskell.

~~~
oblio
> a really rigorous language

Is it rigorous in theory or in practice?

An academic veteran might want to better express his ideas in code while a
professional veteran might want more readable code with better test coverage.
And that's just 1 axis where the two viewpoints might diverge or even come in
conflict.

~~~
paulajohnson
I'm trying to imagine what "rigorous in practice but not theory" might look
like. It seems to me that rigor derives from theory and can then be applied in
practice. Haskell is rigorous in both.

~~~
oblio
I'm not talking about Haskell "the language", but about Haskell "the
ecosystem".

Industrial rigor regarding code means:

* consistency in using a coding style

* having adequately named modules, functions and variables

* having adequate comments

* having an adequate level of code coverage through automated tests

* having performance and regression tests

* having good release notes

* etc.

Many of those things, required for high quality libraries, are often skipped
for the academic projects Haskell is known for.

So that's why Python, or even PHP or JavaScript, as theoretically less
"rigorous" languages, can have more "rigorous" libraries in practice.

~~~
paulajohnson
So, like anything, the available libraries vary, but the important ones score
pretty highly. Haddock documentation (like Doxygen) is considered a basic part
of the job. "cabal test" will run embedded test harnesses. Stackage contains a
set of libraries considered to be stable and of sufficient quality that most
projects don't need to have qualms about using them. See for instance
[https://www.stackage.org/haddock/lts-7.10/http-client-0.4.31...](https://www.stackage.org/haddock/lts-7.10/http-client-0.4.31.1/Network-HTTP-Client.html)

------
almata
If you were a developer with 10 yoe looking for something new to get into,
what would you choose at this moment and thinking about the near future:
Haskell, Scala or F#?

~~~
LeonidasXIV
It depends what you are looking for. If you are looking for work, then Scala
beats the other two hands down; then comes F#, with Haskell somewhere at the
end.

If you're looking for enlightenment, pick up Haskell, maybe OCaml and skip F#.

~~~
pjmlp
Well, F# beats both OCaml and Haskell in industrial support and tooling,
especially if we take .NET libraries into account.

------
millstone
IME Haskell development has a sort of bell-curve to it. Initially, you're
spending a lot of time prototyping, fumbling around trying to find the right
abstractions. Here Haskell mostly gets in the way: you have to declare up-
front which functions do I/O, etc.

But once the core abstractions are settled, you start to reap its power. The
type system catches tons of potential errors. Combinators allow for enormous
expressiveness. Here you're rolling: Haskell is in its zone!

But then you hit a wall. Laziness makes for brutal debugging. Singly linked
lists actually suck. Performance optimization is a black art. You find
yourself longing for a language with simple semantics and mechanical sympathy.
Now Haskell is bumping up against the real world.

Haskell has its sweet spot somewhere between "bang this out by 5pm" and "ship
this to a million users". (No surprise it's popular in academia.)
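
The "laziness makes for brutal debugging" wall above has a canonical instance:
the lazy `foldl` space leak. A minimal sketch (my illustration, not from the
comment) of the trap and its standard fix:

```haskell
import Data.List (foldl')

-- The classic laziness trap: lazy foldl builds a chain of
-- unevaluated (+) thunks across the whole list before forcing
-- anything, which can exhaust memory on large inputs.
lazySum :: Integer
lazySum = foldl (+) 0 [1 .. 1000000]

-- foldl' forces the accumulator at every step and runs in
-- constant space: same result, very different runtime behavior.
strictSum :: Integer
strictSum = foldl' (+) 0 [1 .. 1000000]

main :: IO ()
main = print strictSum  -- prints 500000500000
```

Both definitions denote the same number; only profiling (or a crash) reveals
the difference, which is exactly why this class of bug is hard to debug.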

~~~
vog
_> But then you hit a wall. Laziness makes for brutal debugging. Singly linked
lists actually suck. Performance optimization is a black art_

To be fair, these issues are more or less solved by newer functional
languages, such as OCaml or Rust.

~~~
amelius
You mean non-purely functional languages.

~~~
vog
I assume you mean strict (the opposite of lazy) instead of non-pure (which
means imperative).

Well, even then, Rust does not only give you control over execution order, but
also tight control over memory management, while keeping it safe.

Haskell allows for good reasoning about correctness. In OCaml, you can
additionally reason about execution order, but memory management and garbage
collector are still somewhat "unpredictable" (i.e. hard to reason about). On
top of that, in Rust, you can reason clearly about memory usage and aliasing.

These new, modern languages follow a clear path: keeping quality and
correctness (hence safety) while gaining more and more control over
performance, without going back to a low level where you would lose clarity
about correctness and safety.

(Note, however, that the distinction between correctness and performance is
over-simplified here, because performance can also be part of correctness,
e.g. for real-time systems, or in systems where scalability is an important
requirement.)

~~~
pron
> pure (the opposite of imperative).

Pure is not quite the opposite of imperative; declarative is the opposite of
imperative, although in practice it is true that pure functional languages
tend to be more declarative than imperative functional ones.

You can have pure (AKA referentially transparent) imperative languages if they
are synchronous[1]. The synchronous style is especially well suited to
reactive/interactive applications, and quite popular in hard realtime
applications.

Also, we don't really have exact definitions for any of these terms
(imperative, pure, functional, declarative). Here is my attempt at
approximate, problematic definitions:

* Functional -- a language that models most/all computations as (possibly partial) mathematical functions, and constructs a program by assembling those functions.

* Imperative -- a language that models computation as state transitions, specifying what state the computation should have at each step.

* Declarative -- a language that describes what result the program should give (as a function, a relation or a behavior) rather than what happens at each step.

* Pure -- a language where the semantic _value_ of the composition of syntactic terms can be determined by the semantic value of each of the component terms and no others.

[1]:
[https://en.wikipedia.org/wiki/Synchronous_programming_langua...](https://en.wikipedia.org/wiki/Synchronous_programming_language)

~~~
catnaroek
You've just defined “pure” as “can be given a denotational semantics”. I have
news for you: even ALGOL can be given a denotational semantics.

\---

@pron

Nowadays I prefer the terms “effect-free” and “effectful”, rather than “pure”
and “impure”, since I don't want to suggest that effects are somehow “wrong”
or “evil”. By definition, values are effect-free. An effect is anything that
invalidates some equational law that holds for values.

For example, integer equality is decidable, and `x == x` evaluates to `true`
when `x` is any integer value. However, if we substitute `x` with the
expression `foo()`, where:

    
    
        int counter = 0; // or any other initial value
        int foo() { return counter++; }
    

Then `foo() == foo()` doesn't evaluate to `true` anymore. Hence `foo` is an
effectful procedure.

Of course, nontermination is an effect too. If we had defined `foo` as:

    int foo() { while(true); return 0; }

Then `foo() == foo()` would similarly fail to evaluate to `true`.
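
As a point of comparison (my sketch, not part of the original comment), the
same counter transcribed to Haskell has to wear its effect in its type: `foo`
is `IO Int`, not `Int`, so writing `foo == foo` at type `Int` is rejected by
the compiler, and we must explicitly run the effect twice to compare results.

```haskell
import Data.IORef

-- The C counter example above, in Haskell. The effect shows up
-- in the type: `foo :: IO Int`, so it cannot be compared as a
-- plain Int; we run it twice and compare the two results.
main :: IO ()
main = do
  counter <- newIORef (0 :: Int)
  let foo = atomicModifyIORef' counter (\n -> (n + 1, n))
  a <- foo
  b <- foo
  print (a == b)  -- False: each run of the effect yields a new value
```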

~~~
pron
Yeah, I know my definitions are problematic. Remember that the term
"referential transparency" was introduced to computer science by Strachey in
his _Fundamental Concepts in Programming Languages_ as a property of
_procedural, imperative_ languages. How would you define pure?

\---

> An effect is anything that invalidates some equational law that holds for
> values.

Ah, I can certainly accept that as a definition, except that I'm not sure how
different it is from mine, because the equational law you refer to applies to
syntactic terms (`foo` in your example) and the language's equality operator
(`==` in your example). It's easy to come up with an equivalence relation that
would hold for `foo`, but not at the syntax level (e.g., equality that takes
`foo` to mean `foo`'s definition and the content of the heap, where the value
of `foo` would not be its return value but its behavior -- i.e., the behavior
of `foo` for equal heap contents is always the same).

~~~
catnaroek
> except that I'm not sure how different it is from mine, because the
> equational law you refer to applies to syntactic terms

An effectful language can be given a denotational semantics. At least from
what I've seen, when most people say “pure”, they mean “effect-free”.

> It's easy to come up with an equivalence relation that would hold for `foo`,
> _but not at the syntax level_ [emphasis mine]

Yep, indeed, that's the whole point.

~~~
pron
> when most people say “pure”, they mean “effect-free”

But saying "effect free" doesn't mean much unless you define what an effect
is, and what constitutes an effect depends on the language. I meant to define
the same thing without referring to another vague definition.

~~~
catnaroek
I already defined above what an effect is. My definition is technically
precise and language-independent.

~~~
pron
OK, so you define effect-freedom using the language's equality operator. I
agree it's a better definition than mine, but I think assembly language and
BASIC would still qualify as effect-free (provided there's no concurrency)
even though some people would not consider them pure.

~~~
catnaroek
> OK, so you define effect-freedom using the language's equality operator.

Not the runtime equality testing operator. Rather, the language's static
notion of contextual equivalence.

It just happens to be the case that most high-level languages provide a
built-in equality testing operator that works on primitive types, returning
`true` iff its operands are contextually equivalent values.

> I think assembly language and BASIC would still qualify as effect-free

I don't even know what a good notion of contextual equivalence for any
assembly language would be.

As for BASIC, I haven't used the old ones, so I can't comment on them. I've
used Visual Basic, which most certainly has effectful procedures.

~~~
pron
> Not the runtime equality testing operator. Rather, the language's static
> notion of contextual equivalence.

How is that different from denotational semantics?

> The ones that can be effectful or effect-free are specific computations, not
> whole languages.

Well, we could define an effect-free language as one where all programs are
effect-free. But anyway, what is the "static notion of contextual equivalence"
in assembly (or BASIC), and how can you write an effectful program in such a
language?

