
Confessions of a Used Programming Language Salesman (2007) - alokrai
https://dl.acm.org/doi/pdf/10.1145/1297027.1297078
======
clarry
Speaking of Haskell and laziness.. I found myself wondering last night, what
was the issue with laziness again? I think it was that it's making it
difficult to parallelize things. Why would that be? Is it just an
implementation artefact, or are the rules of lazy evaluation actually
disallowing the compiler/runtime from batching things appropriately for
parallel computation?

EDIT: Guess I was thinking of this:
[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.36.3...](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.36.3806)

> we have shown that lazy languages, even when implemented using a framework
> well-suited for parallelism, can generate only small amount of parallelism.
> [..] this result, when combined with the result concerning the low
> parallelism for really lazy programs, can also be interpreted as saying that
> there is not much hope for lazy languages as far as parallelism is
> concerned: in effect, they are saying that we can hope to obtain parallelism
> from programs written in a lazy language only when the programs _do not
> need_ to be written in a lazy language!

That's a really old paper though (1995), if there are more recent relevant
studies or results, I'd be interested..

~~~
lmm
The big problem with laziness is that it becomes impossible to reason about
performance, because performance becomes noncompositional. If y = g(x) then we
expect the computation time of f(g(x)) to be the computation time of g(x) plus
the computation time of f(y), but in the presence of laziness that's no longer
true. Any sufficiently large codebase eventually hits a performance problem
(usually a trivial one that ought to be easy to resolve), and at that point
you lose the fearless refactoring that is the great advantage of Haskell-like
languages.
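
For instance (a contrived sketch of my own, not from the thread): under call-by-need the application of g below is just a thunk, and the summing work is only paid when f forces its argument, so a profiler can end up blaming f for g's cost:

```haskell
g :: Int -> Int
g n = sum [1 .. n]   -- the application (g n) is a thunk until demanded

f :: Int -> Int
f y = y + 1          -- forcing y here is where g's O(n) work actually runs

main :: IO ()
main = print (f (g 1000000))
```

The wall-clock total is the same as under strict evaluation; what breaks is the attribution of that time to f versus g.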

~~~
tome
Hmm, I'm not sure that's quite right. The time taken by f(g(x)) under lazy
evaluation is known to be no more than the time taken under strict evaluation.
It's the _space_ usage that can be hard to predict.
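
A standard illustration of that (my sketch): foldl and foldl' compute the same value, but lazy foldl builds an O(n) chain of unevaluated (+) thunks before collapsing it, while strict foldl' runs in constant space:

```haskell
import Data.List (foldl')

-- Lazy left fold: the accumulator grows into a chain of (+) thunks,
-- so space usage is O(n) even though the final result is one Int.
lazySum :: Int -> Int
lazySum n = foldl (+) 0 [1 .. n]

-- Strict left fold: the accumulator is forced at every step, O(1) space.
strictSum :: Int -> Int
strictSum n = foldl' (+) 0 [1 .. n]

main :: IO ()
main = print (lazySum 1000 == strictSum 1000)  -- same value, different space
```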

~~~
lmm
> The time taken by f(g(x)) under lazy evaluation is known to be no more than
> the time taken under strict evaluation.

True but irrelevant (particularly given that idiomatic Haskell involves a lot
of constructions that would take infinite time under strict evaluation). If x
< a, y < b, z < c and a + b = c, that doesn't give you any useful relationship
between x, y and z.

~~~
tome
A fair point. An example would be something like

    take 10 (map (+1) [1..])
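
Spelled out as a runnable sketch: under lazy evaluation only the first ten elements are ever produced, whereas a strict language would diverge trying to materialize the infinite list first:

```haskell
main :: IO ()
main = print (take 10 (map (+1) [1 ..]))
-- prints [2,3,4,5,6,7,8,9,10,11]; the infinite list is never fully built
```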

------
tannhaeuser
> _To appreciate the difficulties of XML, let’s make a short excursion to the
> world of XML schema [...], which must be one of the most complex artifacts
> invented by mankind. The complexity of XSD is especially baffling since the
> existing solution, DTDs, is already a perfectly fine solution._

Ah, the period when people tried to integrate XML as a first-class
type/literal into programming languages. I guess the conclusion can only be
that what SGML/XML is designed to represent (namely, semistructured text
documents with regular content models) was fundamentally misunderstood by
folks wanting to represent (relatively benign) business records-like
structures for service-oriented apps. The complexity results for statically
type-checking programming languages having XML as literals were disappointing,
and the whole thing got out of fashion. Then, shortly after Mozilla dropped
E4X (XML literals in JavaScript), JSX and React happened. I honestly don't
know what conclusion should be drawn here :)

~~~
lmm
My conclusions would be: usability in the small matters. Overengineering is
the natural fate of standards and must be fought at every turn. Insist
ruthlessly on conciseness at the cost of virtually anything else (but not
consistency; remember Perl).

~~~
tannhaeuser
Yes, and it's all the more ironic that XML started out as a simplified subset
of SGML - only to turn into a monstrosity shortly after with W3C's XML Schema
and WS-* "death star" series of standards.

------
okareaman
Orthogonal to the topic but Erik Meijer is hilarious

------
Rochus
The author compares himself with the prophet and apparently believes that the
acceptance of functional programming languages requires intensive missionary
work - just like with an ideology or religion. However, the history of science
shows that the right knowledge always prevails in the end. If it is so
difficult to get functional programming languages to market, either the time
is not yet ripe, or there are better solutions.

~~~
goto11
Programming languages are seldom chosen because of the merit of their core
semantics. Successful programming languages are coupled to some platform or
ecosystem and it is the platform that is chosen. For example JavaScript could
have been functional or imperative or whatever - it wouldn't have mattered for
its success. Objective C was obscure until the success of the iOS platform,
C# was pushed by MS as the way to write Windows apps, VB was the way to script
office applications. None of these languages became successful due to their
core semantics.

Haskell will always remain niche since it is not coupled to a successful
platform. But compared to the thousands of other "second tier" languages it is
doing pretty well.

~~~
Icathian
I believe that you're right about why languages aren't chosen, but miss the
alternative explanation. Languages are chosen for speed of deployment and ease
of use, almost nothing else. From the dawn of time, when C and Unix beat out
Lisp, all the way to modern JavaScript, the tool that solves a given set of
problems quickest and easiest wins regardless of 'merit'. Worse is better, all
the way down.

~~~
AnimalMuppet
I agree, but there's one more twist: Problems are different. Languages are
chosen for ease of writing _specific programs_, not for "general
programming".

I think, then, that FP is great for certain kinds of problems, and not so good
for others. If you've got an FP problem, reach for an FP language. If you
don't, don't.

What's an FP problem? Someone here (I wish I remembered who, I'd give credit
where due) said "if you can think of your program as a pipe", then FP is
appropriate. If the nature of your problem has a lot of state, FP probably
isn't the answer. (Note well: _problems themselves_ can have a lot of state.
It isn't always just the implementation - the problem itself may demand it.)
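
A tiny Haskell sketch of what "program as a pipe" means here (example and names are mine): each stage is a pure function and the whole program is just their composition:

```haskell
import Data.Char (toLower)
import Data.List (group, sort)

-- lowercase -> split into words -> sort -> group duplicates -> count each
wordFreq :: String -> [(String, Int)]
wordFreq = map (\ws -> (head ws, length ws)) . group . sort . words . map toLower

main :: IO ()
main = print (wordFreq "To be or not to be")
```

No stage holds mutable state, which is exactly why pipe-shaped problems fit FP so naturally; a stateful simulation or interactive editor has no such clean decomposition.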

------
Bellamy
That is one misleading title.

