
The perfect programming language - furtively
https://cygni.se/the-perfect-programming-language/
======
nickserv
Interesting perspectives on a variety of languages, at first.

Then, when he said that XSLT is the best language, I figured we are very
differently minded. So I suppose it makes sense that the Tailspin language is
completely incomprehensible to me.

~~~
gmueckl
For me it was the blatant rejection of C++ without any further discussion. C++
has a lot of baggage from C, but it also sports powerful metaprogramming, even
though that is flawed (the necessity of SFINAE speaks volumes...). There are
newer languages that improve on that a lot. The author doesn't even care to
mention them by name.

To me, it is pretty clear that the author has a very specific world view,
probably defined by the set of problems he's solving. I can't quite tell what
that perspective is. But this makes his claims a lot less universal than he
admits.

~~~
kick
You just acknowledged that C++ was not the perfect programming language in
your (rather biased for it) comment.

~~~
gmueckl
That is not what I wrote. Dismissing C++ summarily as this article does skips
over a few points that are worth talking about. Generics and templates are not
discussed. Memory management approaches of different languages are not
discussed. The world of programming languages is huge. The author seems to
ignore most of it, yet claims to search for a "perfect language" while
clinging to the nonsensical illusion that a single language can be perfect
for all use cases.

------
adgasf
Sad to see no mention of the impure ML languages (OCaml, F#, PureScript, etc).
For me these strike a great balance between powerful abstractions and the
ability to throw things together.

~~~
hhas01
The article’s title says “ _perfect_ programming language”.

“Impure” means the language design is by definition compromised. The moment
you allow imperative behaviors to leak into a declarative system it loses its
ability to reason _reliably_ about operations over time; and you’re back to
flying seat-of-your-pants a-la C &co. Haskell at least has the good grace to
firewall any imperative crap so that the remainder of the program can still be
reasoned about.

That doesn’t mean OCaml and F# can’t still be useful, in the same way that C
is still useful, but they cannot be more than the sum of their own
limitations; they are already evolutionary dead-ends. The article is thus
correct not to expend any space considering them.

~~~
abathologist
> The moment you allow imperative behaviors to leak into a declarative system
> it loses its ability to reason reliably about operations over time; and
> you’re back to flying seat-of-your-pants a-la C &co

By your (imo, extremist) line of reasoning, wouldn't the existence of
`unsafePerformIO` mean Haskell itself "loses its ability to reason reliably
about operations over time"? I think a more reasonable position allows for
gradations of reliability, and appreciates the benefits of encouraging pure
computations while allowing room for judicious use of benevolent side effects.

> they are already evolutionary dead-ends

That is patently false, imo. OCaml has a number of interesting features which
most statically typed FP languages are yet to explore (e.g., polymorphic
variants, (typed) named and optional arguments, first-class modules, binding
operators). More to the point, ongoing development on a novel approach to
multi-core, native support for algebraic effects, and modular implicits,
testify to it being a live evolutionary branch. OCaml aside, innovative
languages and approaches like F*, 1ML, Frank, BER MetaOCaml, and Rust seem to
give clear evidence that we still have much to learn from _impure_ dialects
of ML.

~~~
tome
> By your (imo, extremist) line of reasoning, wouldn't the existence of
> `unsafePerformIO` mean Haskell itself "loses its ability to reason reliably
> about operations over time"?

Well no, because for some reason, be it social or technical, people simply
don't use unsafePerformIO in a way that is actually unsafe.

~~~
abathologist
Yeah, that's my point. Contrary to the claim in the post I'm replying to, it
seems clear that having the ability to do unsafe impure things in a language
doesn't mean users of that language "lose all ability to reason reliably". To
the extent that the libraries and the ecosystem don't make inappropriate use
of unsafe behavior, you can have your impurity and reliable referential
transparency too.

~~~
Quekid5
I definitely agree that "lose all ability[...]" is probably a _tad_ too strong
:).

> To the extent that the libraries and the ecosystem don't make inappropriate
> use of unsafe behavior, you can have your impurity and reliable referential
> transparency too.

The key qualitative difference between Haskell and e.g. O'Caml/Scala/whatever
here is that it's quite trivial to verify that your program doesn't use side
effects inappropriately (i.e., that it stays referentially transparent): just
make sure it contains no unsafeFoo calls. There's no such mechanism in
O'Caml/Scala, etc. (I'm not sure, but I imagine it might even be undecidable
unless you restrict to a teeny tiny subset of the language, e.g. just the pure
arithmetic expression language.)

The qualitative difference in how _confident_ you can be that moving a bit of
code around won't change the semantics of your program is profound (or:
bonkers) -- and before I'd experienced it myself I could have (rightly!) been
extremely skeptical of such claims, yet here I am making that claim.
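
A toy sketch (in Python, with made-up names) of the kind of refactor that is
safe under referential transparency but silently changes behavior once hidden
state is involved:

```python
calls = []

def log_and_double(x):
    calls.append(x)  # hidden side effect: mutates module-level state
    return 2 * x

a = log_and_double(3) + log_and_double(3)  # calls is now [3, 3]

# An innocent-looking refactor: hoist the common subexpression.
calls.clear()
tmp = log_and_double(3)
b = tmp + tmp                              # calls is now [3]

# The values agree, but the observable side effects differ.
assert a == b == 12
```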

Yes, unless you really want to fully vet _all_ of your transitive dependencies
you will have quite a large TCB, but I find that it's actually really rare for
libraries to use unsafeFoo willy-nilly. Some of the low-level stuff like Text,
Vector, etc. do, but AFAICT it's mostly for performance reasons... and DGMW
quite serious bugs _have_ been found in these. However, because these are
isolated data structures these bugs get fixed once and they are immediately
fixed for all users.

Another wart wrt. a large TCB in Haskell is IO. Unfortunately, it has a very
large surface area, but there's not really much that can be done about that at
this point. Even still, as a programmer you can choose to use e.g.
polysemy/fused-effects/etc. to define the semantics of effectful code in a
much more fine-grained way such that the only point of contact between your
code and IO is in a tiny "interpretation of semantics" layer.

------
atilaneves
> but as soon as you enter a physics-institution anywhere in the world,
> FORTRAN likely reigns supreme

Not at CERN, but it used to be the case. It's C++ now.

> It turns out that FORTRAN is a very good match for the way physicists think
> about their work

I don't even know what this means and I have a PhD in Physics.

> You simply specify what fields you have and what pattern they are written in
> and the computer takes care of all the reading and writing for you

This is possible in any language with decent reflection.
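
In Python, for instance, a rough sketch (with a hypothetical record type):
declare the fields and the pattern they are written in, and derive the reading
and writing from the declaration:

```python
import struct
from dataclasses import dataclass, astuple

# Hypothetical record: two doubles and an int, written little-endian.
@dataclass
class Particle:
    x: float
    y: float
    charge: int

FMT = "<ddi"  # the on-disk pattern for the fields above

def write(p: Particle) -> bytes:
    return struct.pack(FMT, *astuple(p))

def read(raw: bytes) -> Particle:
    return Particle(*struct.unpack(FMT, raw))
```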

> and layers upon layers of unnecessary complexity

None of the complexity in C++ is unnecessary. There were, as there always
are, trade-offs.

> Go was created explicitly to replace C++, it turned out that the programmers
> who love C++

IMHO, this is regurgitated nonsense from Rob Pike. What I got from what he
wrote on the subject is that he doesn't grok C++ at all. If he did, he
wouldn't be surprised that C++ developers didn't flock to Go.

> efficient development of efficient programs

I'm not aware of Go programs that are particularly efficient. I'd also argue
against "efficient development".

> there is an elegant practical solution in Go

Elegance is in the eye of the beholder. I don't think pretty much anything in
Go is elegant nor possible to be because of the lack of generics. It forces
people to write for loops, which is just... mind boggling to me. That alone
makes me wonder what kind of Python devs are taking it up.

~~~
lllr_finger
> efficient development

I think the rationale for that is twofold: that, as a statically typed
language that prefers stack allocation, you can get some very good
performance without putting much effort into it, and that it's usually clear
what the idiomatic way to accomplish something is. On the latter point, I
appreciate golang's
simplicity when I need to review other people's code or dig into third party
libraries.

For the record, I agree with most of what you're saying. I also don't find it
elegant, and most developers I know are eager to work in a different language
after spending any amount of time in golang. Its benefits come at the expense
of expressiveness, and I never thought I'd be doing things like code gen to
work around a lack of generics in 2019.

~~~
atilaneves
> you can get some very good performance

Compared to... ?

> I appreciate golang's simplicity when I need to review other people's code
> or dig into third party libraries.

See, I don't. Every for loop has to be inspected to see if it's one of the
99.9% cases where it's a map, reduce, or filter. For loops can't be chained.
Error handling takes up over half of a function. An algorithm library can't be
written because of the lack of generics.

Switching to Python, I'd much rather read:

    
    
        [x for x in xs if x % 2 == 0]
    

Than:

    
    
        res = []
        for x in xs:
            if x % 2 == 0:
                res.append(x)
    

3 lines of boilerplate to do 1 line of work. Pass.
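
And the chaining point: in Python the filter/map/reduce steps compose lazily,
with no intermediate loops (a small sketch):

```python
xs = range(1, 11)
evens = (x for x in xs if x % 2 == 0)  # filter
squares = (x * x for x in evens)       # map, chained lazily onto the filter
total = sum(squares)                   # reduce
```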

Then there's the fact that everything is mutable in Go, so I have to track
any and all variables in case they change.

All in all, for me, reviewing code written in it is harder, not easier.

~~~
RodgerTheGreat
In K, I'd much rather read

    
    
        (~2!)#x
    

(Not (~) modulo 2 (2!) filter (#) of x.)

Or perhaps more idiomatically,

    
    
        x@&~2!x
    

(x indexed (@) where (&) not (~) modulo 2 (2!) of x.)

Reading a python comprehension requires awkward skipping around, and composing
them together is a mess. APL-style uniform precedence is something I wish more
languages imitated.

Still, anything's better than an endless sea of for loops.

~~~
yellowapple
I'm disappointed that these are not valid Perl programs (the first is a syntax
error, and the second complains about an array where an operator is expected).

------
classified
That Tailspin language looks even worse than Perl. Given that the author
didn't understand his own Perl code after a while, this seems like a lesson
not learned. Creating a good language really is hard, let alone a "perfect"
one.

------
moksly
When a newspaper headline asks a question that seemingly can be answered with
a yes or no, the answer is always “no”.

When a programming blog has a headline which teases knowledge of the perfect
technology, the answer is always “I don’t know”.

Both baits work on me, apparently, but I sure wish they didn’t.

------
hhas01
The article’s title is itself nonsense: the `imperative <> declarative`
disjoint alone means it’s logically impossible for a single language ever to
satisfy all programming needs; and that’s even before you start decomposing
them into specific problem spaces (kernel vs system vs application
programming; functional vs logic vs pipeline programming; single-threaded vs
massively distributed; rapid development vs pedal-to-the-metal execution;
etc).

(Article’s author probably realizes this and merely chose that title to
increase clickbaitiness; but given the continued creation and persistence of
kitchen-sink mountains like C++ and Apple Swift it’s clear that plenty still
don’t.)

------
bwanab
I worked at one of those other companies that designed Ada specs that didn't
get selected, but we did do the first official ALS (Ada Language System).

"first Ada was so complex that even the compilers had performance problems, so
there was a bit of a delay getting it out"

This was largely true. The complexity of compilation really bit. But,

", and then, second, C and Unix happened"

This is silly. Unix and C had been around for years before Ada's design was
accepted. I'd been working on C in a Unix environment for five years when I
started working on Ada. In fact, that's why I was hired, since one of our
first target ALS implementations was for Unix and included several C
system-level drivers.

~~~
pjmlp
Maybe he means, "happened across the industry".

Until the mid-90's in the Portuguese computing world, UNIX was only seen at
some universities and eventually at government-level businesses.

Everyone else was using some kind of mainframe, or the typical 8 and 16 bit
micros, mostly networking over Novell NetWare.

I only bothered to learn C when we had an introduction to OSes, where our
teacher would carry a single 486 PC from a university lab into the classroom,
running Xenix, to be shared across the whole set of students.

Each of us prepared our samples on Atari, Amiga and MS-DOS systems, then
copied the stuff onto a floppy and was allowed a short timeslot to try it
out.

So maybe that was where he was coming from, or then again, maybe not.

------
capableweb
Disappointed to see that Scheme is the only lisp mentioned, and that the
author spent too little time to be able to see past the parentheses and
realize the model lisps represent. Interesting to see the different
combinations he thought about, but I really thought lisps would be mentioned
more.

------
eapriv
Their discussion of Haskell starts with the statement that it is "very
abstract" and continues with "Do you really need six different ways to
implement factorial?". Well, you have six (in fact, many more that six) ways
to implement factorial in any (reasonably expressive) programming language.
That's like looking at language that has both if/then/else statements and
switch statements, and asking "do we really need both?". The first three ways
to implement factorial demonstrated in the article amount to minor differences
similar to those between "if/then/else" and "switch", and the last three rely
on various standard library functions, which technically doesn't say much
about the language itself.
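
For instance, a quick sketch of four of those ways in Python:

```python
import math
from functools import reduce

def fact_loop(n):
    # imperative loop
    r = 1
    for i in range(2, n + 1):
        r *= i
    return r

def fact_rec(n):
    # plain recursion
    return 1 if n <= 1 else n * fact_rec(n - 1)

def fact_reduce(n):
    # fold over a range
    return reduce(lambda a, b: a * b, range(1, n + 1), 1)

fact_lib = math.factorial  # lean on the standard library
```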

~~~
cousin_it
Here's another list of factorial implementations in Haskell, which I think
does say something about the language and its mindset. Be sure to scroll to
the end:
[https://www.willamette.edu/~fruehr/haskell/evolution.html](https://www.willamette.edu/~fruehr/haskell/evolution.html)

Here's another illustration. Since values in Haskell are immutable,
implementing something like foo.bar.baz+=1 becomes a bit clunky, so they wrote
a library to make it easier. Check it out, especially the diagram that says
"simplified":
[http://hackage.haskell.org/package/lens](http://hackage.haskell.org/package/lens)
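
To illustrate the clunkiness (a Python sketch with hypothetical frozen
records, since the same pain shows up with any immutable nested data):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Baz:
    n: int

@dataclass(frozen=True)
class Bar:
    baz: Baz

@dataclass(frozen=True)
class Foo:
    bar: Bar

foo = Foo(Bar(Baz(41)))

# The immutable counterpart of foo.bar.baz.n += 1: rebuild every layer.
foo2 = replace(foo,
               bar=replace(foo.bar,
                           baz=replace(foo.bar.baz, n=foo.bar.baz.n + 1)))
```

Lenses exist precisely to abstract away that rebuild-every-layer boilerplate.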

So I agree with the article that Haskell can be dauntingly abstract. (Though
you can write code that's perfectly straightforward if you want.)

~~~
imglorp
I would like to see a concerted effort from the Haskell community to set aside
the self-gratifying maps and factorials, and show us real work.

The OP article, too, focusing on reducing errors, went into the factorial
weeds here. How about presenting the air traffic control example in Haskell to
contrast with the Ada? Real concurrency, real I/O, and real error cases.

~~~
cousin_it
I think real I/O and error handling will be about the same in any language.
People fall in love with Haskell for other reasons:

It's very nice for some algorithms. For example, see Appendix A of this paper:
[https://pdfs.semanticscholar.org/b47b/41a9b2dcf6b3ffe3ce7a04...](https://pdfs.semanticscholar.org/b47b/41a9b2dcf6b3ffe3ce7a04c49958122b2060.pdf)
An implementation of red-black trees in only 60 lines, including deletion,
which alone can take hundreds of lines in a curly brace language.

It's also very nice for some APIs. For example, software transactional memory
fits the Haskell type system very well, because it can ensure that data can't
be modified outside a transaction and can be rolled back when needed. A team
at Microsoft, including some well-known folks, tried to port it from Haskell
to C# and failed due to the type system:
[https://www.infoq.com/news/2010/05/STM-Dropped/](https://www.infoq.com/news/2010/05/STM-Dropped/)

Haskell has a bunch of non-mainstream features that work together unusually
well. It enables programmers to think a certain kind of thoughts that wouldn't
naturally arise in many other languages. It's like painting with oil when the
rest of the world uses watercolor: in theory they can draw the same things,
but in practice they'll guide your hand toward different things, so having
both makes the world richer.

------
daotoad
All that and his code examples are bizarre trash.

Despite the way he tries to crap on Perl (5) by referring to a Perl 6 (now
Raku) example, I find a Raku FizzBuzz far easier to read than his mess:

    
    
        multi sub fb( Int $n where * %% none(3,5) ) { return $n }
        multi sub fb( Int $n where * %% all(3,5) )  { return 'fizzbuzz' }
        multi sub fb( Int $n where * %% 3 )         { return 'fizz' }
        multi sub fb( Int $n where * %% 5 )         { return 'buzz' }
    
        .say for (1..15).map({ fb($_) } );
    

Or if you want to name your specialized types:

    
    
        subset Fizzy of Int where * %% 3;
        subset Buzzy of Int where * %% 5;
        subset FizzBuzzy of Int where * %% (3&5);
        subset Otherwyzy of Int where not * %% (3|5);
    
        multi sub sfb( Otherwyzy $n ) { return $n }
        multi sub sfb( FizzBuzzy $n ) { return 'fizzbuzz' }
        multi sub sfb( Fizzy $n ) { return 'fizz' }
        multi sub sfb( Buzzy $n ) { return 'buzz' }
    
        .say for (1..15).map({ sfb($_) } );

------
thekingofh
There seems to be a happy point in languages where it's easy enough to get up
and running, but complex enough to come back through and tighten things up.
Like how Python allows a secondary stub file for parameter types in function
signatures, or how tools like Flow for JavaScript bolt on type checking. To
me, the perfect language seems to be one that does the annoying
things like that for you and lets the programmer focus on the actual logic.

Languages like Rust seem to be a reaction against the problems caused by
dynamic typing, but I'm not sure that forcing all that burden onto the
programmer is the right answer yet. The languages that ease newcomers into the
language tend to do well. Rust is like a brick to the face.

Does that mean Rust won't take over the world? Who knows. I do know that what
they're doing with it does seem to be the future, but the language is
difficult, even for experienced programmers to learn.

~~~
zozbot234
Rust eases newcomers into the language a _lot_ more than C++ does. As there
are plenty of novices learning C++ as their first(!) programming language, I'm
not sure why newcomers would be expected to have more of an issue with Rust.

~~~
thekingofh
I'd consider C++ as the worst of the worst when it comes to ease of entry.

Compared to, say, JavaScript, Ruby, or Go, Rust has a secondary set of rules
that programmers have to learn before they can even do the most simple things.
As an experienced programmer, I look at most Rust code with bewilderment. This
is after a week of studying the language.

I can look at Python and instantly understand what's happening, without any
knowledge of the language. I can look at Go's standard library code and
instantly know what it's doing, without any previous knowledge of the
language. That seems to me to be the goal of a language. I don't know much
about those languages and if they scale or whatever, but they look like they'd
be productive.

I look at most C++ and it's on the edge of being incomprehensible if anyone
does anything remotely abstract. And that's with 10+ years of experience in
the language.

Rust, to me, seems like it chooses to prioritize memory correctness and speed
over everything else, including productivity. Maybe I'll hack away at learning
it for another few weeks and something will click, but for now it's an uphill
battle. That to me is a barrier to entry that will not help it gain traction.

~~~
zozbot234
> Rust has a secondary set of rules that programmers have to learn

The rules are not that hard _if_ you're somewhat familiar with the constraints
of functional programming. Rust requires you not to rely on shared, mutable
state, unless specifically enabled - thus, by default, you're essentially
writing FP code using a 'procedural' paradigm and syntax.

It _is_ also possible to do "quick, exploratory" programming in Rust; it just
requires somewhat liberal use of language facilities such as .clone() and
RefCell<>, to specify things that would mostly be implicit in other memory-
safe languages. (And because these somewhat problematic features are so
clearly marked in the resulting code, it becomes fairly easy to directly
refactor it into better-performing, idiomatic Rust.)

~~~
thekingofh
> thus, by default, you're essentially writing FP code using a 'procedural'
> paradigm and syntax

I've not heard it described this way. It's an idea I can get behind. A lot of
the native code I write tends to try and be as functional as possible, or at
least avoiding relying on shared mutable state.

And don't get me wrong. I think that Rust is a significant upgrade to the
status quo. Perhaps it's just time to keep hacking away at it until it clicks.

------
dustingetz
clojure is very close:

\- declarative data programming

\- embraces immutability

\- hosted (jvm, js)

\- host ecosystem compatibility

\- principled culture

~~~
fnordsensei
Possibly with regards to simplicity as well, depending on whether the author
means "not (objectively) complex" or "not (subjectively) hard."

The "ability to know everything about the language" seems to (possibly) point
to simplicity as "not complex", but the discussion around "what you already
know" seems to point to the interpretation of "not hard."

I'm not sure I understood the paragraph as it was intended.

------
derekp7
One language that seldom gets mentioned is PostScript. People tend to think
of it as just a page description language, but really that is just the
built-in graphics functions.

So we have a postfix (RPN, similar to HP calculators) expression style, which
is really a mirror image of Lisp. You have genuine first-class functions,
introspection, and, something I don't see that often, first-class
environments. That is, you can, within any part of the code, activate an
environment (they call it a dictionary, which is very similar to a C
structure). And they stack, so that something not in the active environment
can be automatically looked up in a parent environment.
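
Python's ChainMap gives a rough sketch of that dictionary-stack lookup
behavior:

```python
from collections import ChainMap

# Two stacked "dictionaries": lookups fall through to the parent,
# roughly like PostScript's dictionary stack.
graphics_state = {"linewidth": 1, "color": "black"}
local_state = ChainMap({"linewidth": 3}, graphics_state)

assert local_state["linewidth"] == 3    # found in the active dict
assert local_state["color"] == "black"  # looked up in the parent
```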

~~~
dannas
I remember being surprised when I read an interview with Robert Sedgewick
where he listed the PostScript Language Reference Manual on his list of top
ten programming books. I've never read it though.

~~~
derekp7
They are also available free online -- "PostScript Language Tutorial &
Cookbook", also known as the PostScript Blue Book, is at
[https://www-cdf.fnal.gov/offline/PostScript/BLUEBOOK.PDF](https://www-cdf.fnal.gov/offline/PostScript/BLUEBOOK.PDF).
There is also the Red Book (useful to fully understand the language, to the
point of writing a PostScript interpreter), and the Green Book (more of a
reference book than a cookbook).

------
michelpp
> I used to have a copy of "SQL for Dummies" at hand whenever I, with dread in
> my heart, needed to write some SQL, especially if there was any funky stuff
> like joins and such going on.

SQL isn't so much a programming language as a query language. It never claims
to be an efficient way to write a factorial function.

And why do people freak out when they see a JOIN? It's one of the simplest
concepts in databases: intersecting two sets. Everything in column A that
matches everything in column B. You don't even have to concern yourself with
the most optimal way to do it, just say it!

~~~
irishsultan
> And why do people freak out when they see a JOIN? It's one of the simplest
> concepts in databases: intersecting two sets. Everything in column A that
> matches everything in column B.

Actually no, that's not what a join is. A join is a cartesian product (not an
intersection), followed by a WHERE clause (which doesn't have to be an exact
match of a value between columns). This is further complicated by outer joins
where empty rows are added to one or both of the sets involved in the
cartesian product.
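
That semantics is easy to sketch in Python with toy tables:

```python
users = [(1, "ann"), (2, "bob")]
orders = [(1, "book"), (1, "pen"), (3, "mug")]

# INNER JOIN: cartesian product restricted by the join predicate.
inner = [(u, o) for u in users for o in orders if u[0] == o[0]]

# LEFT OUTER JOIN: additionally keep unmatched left rows, padded with NULL.
unmatched = [(u, None) for u in users
             if not any(u[0] == o[0] for o in orders)]
left = inner + unmatched
```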

~~~
michelpp
Only a CROSS join is a Cartesian product, and it is rarely used.

------
skocznymroczny
For me Dart comes very close to the perfect set of features and the overall
feel of the language. I just wish it was more low-level, with pointers and
native code rather than VM.

~~~
pjmlp
Dart 2 compiles to native code.

------
otabdeveloper4
> The foremost is simplicity, that a language should be simple enough that the
> programmer should be able to know everything about it.

Well, no. Translated - "a language should serve the lowest possible
denominator".

Apart from the obvious fact that it is impossible, it's also definitely not
the way to go if you want quality software.

~~~
G4BB3R
There is no point in having a language so complicated - like C++ or Java -
that the majority of programmers don't even know 100% of the language. I
think Elm (and somehow Go) fits this category: everyone can learn every
aspect of it in less than a week. Another benefit of having a very simple
language is the compile time: C++ has an awful compile time, 100kloc can take
half an hour, while in Elm it takes less than 5 seconds (incremental compile)
vs 30 seconds (full recompile).

~~~
Gibbon1
> c++ has a awful compile time

I have a theory that C++ compiler writers' obsession with pointless
micro-optimizations is because C++'s compilation model scales horribly. Thus
they desperately want their compilers to be faster. But they are fucked,
because more optimizations just slow down the compile times even more.

~~~
MaxBarraclough
You can disable optimisation if you want faster builds.

The existence of optional optimisations isn't a problem.

~~~
Gibbon1
Of course not; the problem is that C++'s compilation and linking model is
fundamentally broken from a performance standpoint.

~~~
pjmlp
Hence C++20 modules.

------
zyxzevn
For fun: The purrfect programming language is

C@ - [http://www.reddit.com/r/C_AT](http://www.reddit.com/r/C_AT)

~~~
pjmlp
As per my sibling comment, the language was called C+@, formerly Calico.

[https://www.drdobbs.com/cpp/the-c-programming-language/184409085](https://www.drdobbs.com/cpp/the-c-programming-language/184409085)

------
mosselman
Wait a minute, Ruby is not on here...

~~~
thekingofh
Ruby hits that sweet spot for me. Have you seen Crystal? Ruby, but native.
Amazing stuff.

~~~
mosselman
Yes, Crystal is pretty cool, though I have found that it is hard to compile
for different platforms. If you need speed and concurrency, JRuby is worth a
shot. It speeds up your code both by being fast and by having 'real'
concurrency.

~~~
thekingofh
What do you think of the reliance on the JVM? I've tended to shy away from
using it.

~~~
capableweb
JVM can be kinda neat if you're doing server deployments and might use
different languages like Kotlin, Java, Clojure and JRuby, as the deployments
are mostly the same and the same inspection tools can be used for the
runtime.

However, for CLIs or standalone binaries, it wouldn't be so neat. Ruby
wouldn't be either, but GraalVM might be able to provide something good for
that in the future as well.

~~~
pjmlp
You could always compile Java to straight native code since the early 2000's,
just not for free.

------
Patient0
He praises XSLT and says that all data should be stored as XML!? This article
is surely a troll...

