
General purpose programming languages' speed of light - ltratt
http://tratt.net/laurie/tech_articles/articles/programming_languages_and_the_speed_of_light
======
StefanKarpinski
This "Magellanic view" of programming language exploration doesn't seem quite
right. Programming language design is mostly not about finding virgin
territory by inventing brand new ideas (which was admittedly easier in the era
when high-level languages were brand new – i.e. in the 1950s). Rather, it's
mostly about finding unexplored folds hidden nearby in the vast, combinatorial
manifold of ways to combine existing ideas in a single, coherent language. In
my experience, people who have spent a lot of time designing languages are the
_most_ sympathetic to people trying new permutations – precisely because they
are so painfully aware of all the awful compromises they were forced to make
in their own designs, and because they understand that much better ways of
combining those features might be tantalizingly close.

Scala is a great example: the innovation of the language is not so much in new
language features, but rather in its ingenious combination of so many powerful
features into a single, coherent system. Of course, some may argue that Scala
has too many features (I'm a bit terrified of it), but it's indisputable that
putting all those pieces together in a way that works is a tour de force of
language design.

------
guard-of-terra
Am I the only one who sees no clothes on this article?

I mean, it seems that the person who wrote it spends a lot of time thinking
about programming and much less time programming; that's how he ends up with
meta-ideas that are interesting but also mostly wrong.

Languages are substantially different. You don't (and can't) understand every
concept under the hood to drive the thing. You can't just add and subtract
features; they're interdependent.

~~~
kd0amg
The comment about the last 20 years also makes me twitch a bit. That sort of
thing always makes me wonder if it's supposed to be about the state of actual
usable programming languages or of recent PL research. Either way, the first
two things I think of when asked about recent advances are substructural types
(maybe stretching the 20-year limit) and higher-order contracts, and these are
both available in things intended as general-purpose languages (Rust and
Racket).

~~~
jes5199
Yeah, it's almost like he's making a definition that's impossible to satisfy:
the changes in mainstream languages (since 1993!) don't count because the
features already existed in academic languages, but the new features in
academic languages don't count because they aren't mainstream.

------
rockmeamedee
A bit off topic, but I've been learning languages like ML and Scala, and I
have to ask: what does everybody have against static typing? I feel like if we
used type systems better, we'd have a lot fewer problems. You can prove that
your programs have no bugs! That's much stronger than unit testing.

When they hear 'static types', does everybody just think C/C++/Java? Is it
the upfront costs? For my first ML program, I took half an hour to write a
function that outputs all the words in a trie. It gets easier and more
interesting afterward, but I might have given up had it not been for a class.
Are static types too rigid for prototyping? Scala, Haskell and OCaml have
REPLs.
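
For concreteness, that trie exercise comes out to something like this (a
Haskell rendering rather than my ML original, with made-up names):

    import qualified Data.Map as Map

    -- A trie: a flag marking "a word ends here", plus child tries keyed
    -- by character.
    data Trie = Trie { isWord :: Bool, children :: Map.Map Char Trie }

    -- Enumerate every word stored in the trie, depth-first.
    allWords :: Trie -> [String]
    allWords (Trie end cs) =
      [ "" | end ] ++
      [ c : rest | (c, sub) <- Map.toList cs, rest <- allWords sub ]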

Some of the "experimental features of static type systems" like dependent
types are really powerful; you can make some strong proofs about the logic of
your program. I'd be willing to give up (or at least try going without) duck
typing for that.
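
To give a flavor of that, here is a minimal sketch in Haskell, which only
approximates the idea with GADTs (a real dependently-typed language goes much
further):

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    -- Peano naturals, promoted to the type level.
    data Nat = Z | S Nat

    -- A list whose length is tracked in its type.
    data Vec (n :: Nat) a where
      VNil  :: Vec 'Z a
      VCons :: a -> Vec n a -> Vec ('S n) a

    -- Total by construction: the type forbids calling this on an empty
    -- vector, so "head of empty list" becomes a compile-time error.
    vhead :: Vec ('S n) a -> a
    vhead (VCons x _) = x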

~~~
jes5199
Yeah, a lot of people's only experience with static types is C or Java. (A
programming language theory PhD student friend of mine was first exposed to C
while writing her dissertation. "This isn't a type system at all!" was the
diagnosis.)

Also, sometimes in Haskell it's so hard to write a function that mostly works
(because the type system sees your mistake but you don't) that you can't get
enough of your half-baked idea down to see it clearly enough to figure out how
it's really supposed to work.
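
(The usual workaround, for what it's worth, is to write the type signatures
first and stub the bodies with 'undefined', so the half-baked idea at least
type-checks. A tiny sketch with made-up names:)

    data Ast = Ast  -- placeholder

    -- The stubs let the overall shape type-check before any piece works.
    parse :: String -> Either String Ast
    parse = undefined

    eval :: Ast -> Int
    eval = undefined

    run :: String -> Either String Int
    run s = fmap eval (parse s)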

Also, I've seen plenty of programs that provably have no bugs, but still grow
to exhaust all system memory or occasionally take several hours to return from
a function call for no apparent reason. And those are a major pain to
diagnose.

~~~
papsosouid
>Also, I've seen plenty of programs that provably have no bugs, but still grow
to exhaust all system memory or occasionally take several hours to return from
a function call for no apparent reason

I think you mean programs that type checked, not programs that provably had no
bugs.

~~~
jes5199
What if those programs were guaranteed to do the right thing, if only they had
enough time or space to finish? Maybe they're still "correct".

Or, to put it another way: our existing type systems don't make code "provably
correct", and when people say that they do (like the commenter above), they're
trying to talk you into something.

~~~
papsosouid
>our existing type systems don't make code "provably correct"

That is what I am telling you.

>and when people say that they do (like the commenter above)

They said no such thing.

~~~
jes5199
Yes, you're telling me something that I was already trying to demonstrate with
my example.

~~~
papsosouid
I can't speak to your motive; all I can do is read what you posted. What you
posted was wrong, and I explained that.

~~~
jes5199
What I posted was obviously self-contradictory, which is not the same as
wrong.

~~~
papsosouid
What you posted was wrong, like I said. There are programs that have proofs
demonstrating they are free of bugs. You have not seen them do either of the
things you claim to have seen. You were stating something false.

------
NyxWulf
Interesting concept, but the flaw at the heart is the presupposition that lack
of progress right now means we are at the ultimate limit of what can be
accomplished. Imagine if cavemen learning to paint on walls had said, "Well,
we haven't improved in a few millennia, so this is probably the most complex
thing that can be represented by drawings." Or what about math stopping with
Euclid? It was thousands of years later that progress happened.

Technology comes in fits and starts. A lot of new things happened in the 50s
and 60s, and we are still trying to figure out ways to use and apply them. Just
because someone thought about and prototyped something then doesn't mean it's
not new when that feature goes mainstream (e.g. garbage collection in Java,
channels for concurrency in Go, etc).

For a long time we couldn't break the sound barrier; that was a limit, but not
a speed-of-light limit. Just because progress is stalled now doesn't mean
there will never be progress in the future.

~~~
groovy2shoes
The author addresses this in the post, first by comparing major technological
advances to earthquakes: "they occur at unpredictable intervals, with little
prior warning before their emergence." At the end of the post he says that he
doesn't think we've reached an ultimate limit, and that the language design
space needs to be explored more fully.

~~~
Dylan16807
Which would mean even he doesn't agree with his ridiculous main point. But I
think you're being too generous.

~~~
groovy2shoes
I think it's more of a thought experiment; just an interesting idea to
entertain even if we haven't reached the limit. The more immediately
applicable point is the cognitive limit, more so than the suggestion of a
technological limit.

------
Zak
_But when our hypothetical Blub programmer looks in the other direction, up
the power continuum, he doesn't realize he's looking up. What he sees are
merely weird languages. He probably considers them about equivalent in power
to Blub, but with all this other hairy stuff thrown in as well. Blub is good
enough for him, because he thinks in Blub._

<http://www.paulgraham.com/avg.html>

------
altcognito
We went backwards in development tools when we went to the web. It was
necessary, but RAD tools (Delphi, Visual Basic, yes that VB) were far ahead of
where we are today in the ability to put basic pieces together to make an
application. I've not used Visual Studio in a long time, but I would hope they
retained their core philosophy of reuse.

~~~
jes5199
A symptom of this: we have Object Oriented languages that can't produce UI
elements that are objects. And that's because we can't figure out how to let
objects live from one web request to another? And _that_ 's because despite
all the new database paradigms, nobody has made a widget-object model that can
be saved to the database and reloaded transparently, and _that's_ because our
models of serializability of objects are brittle and unsafe (see the recent
YAML bugs), and _that_ 's because we have no good systems for seamlessly
migrating data structures from one version to another.
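
To make that last piece concrete: the state of the art is mostly hand-rolled
migrations, something like this sketch (Haskell for brevity, hypothetical
types, not anyone's actual system):

    -- Old and new shapes live side by side as separate types, and an
    -- explicit function carries saved data across versions. Every version
    -- bump needs one of these, written and maintained by hand.
    data WidgetV1 = WidgetV1 { labelV1 :: String }
    data WidgetV2 = WidgetV2 { labelV2 :: String, enabledV2 :: Bool }

    migrate :: WidgetV1 -> WidgetV2
    migrate w = WidgetV2 { labelV2 = labelV1 w, enabledV2 = True }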

~~~
Zak
Objects do too much to live in a heterogeneous world. We'd do better
serializing maps (JSON is popular, of course, but EDN is better) and operating
on them using generic functions rather than trying to bind functionality to
data. It's easy enough to serialize the properties of a UI widget when the
functionality is data-driven and lives elsewhere.
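
A minimal sketch of the idea, plain data plus a separate function over it
(Haskell here for the sake of types, but the Clojure version would just be a
map and a function; the names are made up):

    -- The widget is plain data with no behavior attached, so it
    -- round-trips through a serializer (here 'show'/'read') trivially.
    data Button = Button { label :: String, pos :: (Int, Int) }
      deriving (Show, Read, Eq)

    -- The functionality is data-driven and lives elsewhere.
    render :: Button -> String
    render b = "[" ++ label b ++ "] at " ++ show (pos b)

    main :: IO ()
    main = do
      let saved    = show (Button "OK" (10, 20))  -- serialize
          restored = read saved :: Button         -- reload transparently
      putStrLn (render restored)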

~~~
jes5199
I think you're probably right? I mean, with the tools available in 2013, I
certainly agree that the Clojure way is the least crazy. But I miss some OOP
stuff when I'm doing that - like, I kinda like the organizational qualities of
OOP; lots of the time I wish I had ruby-style polymorphism (on immutable
structures?) instead of having to put all the logic in one function. Maybe
that means I should just learn how to use defprotocol. But I guess that still
doesn't give you an event model - what's the EDN equivalent of onClick?
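
(Roughly the kind of thing I'm wishing for, as a typeclass-style sketch in
Haskell with made-up names; a typeclass is the closest analogue I know to
defprotocol:)

    -- The protocol groups the operations; each type supplies its own
    -- implementation, and updates stay immutable.
    class Clickable w where
      onClick :: w -> w  -- returns an updated copy, not a mutation

    newtype Counter = Counter { clicks :: Int } deriving Show

    instance Clickable Counter where
      onClick c = Counter (clicks c + 1)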

~~~
Zak
You should learn how to use records, protocols and hierarchies if you want to
organize things that way in Clojure. You'll have to be a bit more ad-hoc about
the same kinds of things when you're sending them to JavaScript or storing
them in a database.

------
melloclello
I've always had a kind of feeling that there is an upper bound on the rate at
which a human can articulate an (original) idea/program/function in explicit
enough terms for a computer to then run it.

Even to do this in the first place will always require a bare minimum
understanding of the language of logic: basic control flow statements,
variables, and so on.

This is why I feel things like Bret Victor's idea of the 'MathKiller' (which,
as I understand it, is his general term for a hypothetical universally
intuitive computing environment that can model _anything_) are goals we can
only ever approach asymptotically - there will always be some uncharted waters
where the only option available to those who want to explore further is simply
to straight up write some code.

I guess the point I'm trying to make is that improving the sophistication of
the programming languages we use, or changing the core paradigms they are
based on, will not help the situation; improving the sophistication of the
tools we write them with will.

~~~
VLM
The author's argument is a classic literature criticism / philosophy argument
and I'm kicking myself trying to remember which specific person/theory from
philosophy / lit crit he is more or less paraphrasing.

Note that I'm not by any means accusing the original author of plagiarism or
of trying to pull off a stunt. Convergent evolution makes perfect sense.
Although comp sci languages are a bit more rigid than other human languages,
trying to express creative / complicated stuff in a human created language is
an old, heavily discussed problem, even if the original author doesn't know
it. Does this go all the way back to Plato? I can't remember; it's just too
early in the morning.

The bright side is "we" as a species have been churning out new lit more or
less continuously for a couple millennia, so even if lit production dies off
at some point in the future (and I don't think it will), that means that in
computer languages we "only" have a couple more millennia of productive
programming left.

------
kyllo
I agree in the sense that I don't think there's going to be a killer new
programming language or language paradigm that's going to overturn the
existing languages within the problem domains they're suited for.

Since the current high-level languages are so extensible, the new paradigm
seems to be moving beyond programming languages to a higher level of
abstraction based on frameworks and DSLs. We're not just "writing code," we're
always writing code that writes code (that writes code, and so on, of course,
until it's a stream of 0s and 1s). We do this because working at a higher
level of abstraction is usually more productive. The recent crop of
programming frameworks is just another layer on top of this. Next I guess we
will have some sort of meta-frameworks on top of those. So the specific
language choice will only matter insofar as it's a part of the framework
stack.
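
In that spirit, a minimal embedded-DSL sketch (Haskell, hypothetical, not any
particular framework): the "language" is just data, and an interpreter gives
it meaning.

    -- A tiny expression language, embedded as plain data.
    data Expr = Lit Int | Add Expr Expr | Mul Expr Expr

    -- The interpreter: one layer of "code running code".
    eval :: Expr -> Int
    eval (Lit n)   = n
    eval (Add a b) = eval a + eval b
    eval (Mul a b) = eval a * eval b

    -- eval (Add (Lit 1) (Mul (Lit 2) (Lit 3)))  ==  7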

------
binarymax
I like to envision something like 'augmented coding', where the concepts are
driven by the programmer, and some friendly bot fills in the nitty gritty. I'm
not talking about something like a GUI-driven language, more like autocomplete
- just that the autocomplete is doing way more than calling up a list of
method names.

~~~
seanmcdirmid
Sounds like AI and maybe machine learning. Once we get to that point, though,
why do we need a person telling the bot what to do?

~~~
VLM
If you don't intentionally give it a god, the repeated experimental evidence
is that it'll invent (at least) one for itself, so you'd best stay in the
loop, unless you really trust evolution or don't really care what happens.

------
mej10
A post on the future of programming languages without mentioning Agda, Coq, or
OMeta?

I think we are just now getting beyond the "low-hanging fruit" era of
programming languages.

------
tel
Haskell's type system can be unwieldy (and at the experimental edge, it often
is), but that's because it's not so different from a programming language
itself. That said, both the type system and the language itself are built from
incredibly simple pieces.
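
A small illustration of "the type system as a language": type-level addition
over Peano naturals, evaluated entirely by the type checker. A sketch using
standard GHC extensions:

    {-# LANGUAGE DataKinds, TypeFamilies #-}

    -- Peano naturals, promoted so 'Z and 'S exist at the type level.
    data Nat = Z | S Nat

    -- A type family is a function the type checker evaluates:
    -- Add ('S 'Z) ('S 'Z) reduces to 'S ('S 'Z) at compile time.
    type family Add (m :: Nat) (n :: Nat) :: Nat
    type instance Add 'Z     n = n
    type instance Add ('S m) n = 'S (Add m n)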

------
narag
How would the invention of scissors fit into that reasoning?

I vote for the tooling option or, to be more precise, for a different
combination of tooling and features.

------
papsosouid
I hope the author spends some time learning about programming languages and
programming language research. The core point is simply false, and the
evidence to support it ranges from flimsy to nonsense.

"The plain truth about programming languages is that while there have been
many small gains in the last 20 years, there have been no major advances"

There have been many major advances. The fact that they were not included in
java does not mean they do not exist.

>There have been no new paradigms

Arguably there have only ever been two paradigms: imperative and functional
(object oriented and procedural simply being minor variations of the
imperative paradigm). It is not reasonable to expect entirely new paradigms to
be discovered on anything other than an incredibly rare basis.

>I'm not even aware of major new language features

The first thing this should do is trigger your "I should research new language
features" instinct, not your "I should assume there are none" instinct.

>beyond some aspects of static type systems

So, he does know some, but chooses to ignore them because why exactly?

>The core of virtually every extant programming language is largely similar.

No. And it is entirely possible for new things to replace old things;
programming languages are not required to take the C++ approach of
accumulating every possible feature that has ever existed.

>Some of the things Haskell and Scala's type systems can express are
astonishing; but I have also seen each type system baffle world-renowned
experts

That is an awfully bold statement to just pull out of nowhere with nothing to
back it up. Who are these experts, and what baffled them exactly? Haskell has
been a hotbed of programming language research in the last decade, with a
large number of advances being made and being put into actual use, then more
advancements being built on top of those. Dismissing the entire concept of
type systems based on an unnamed "expert" who was somehow "baffled" by some
unmentioned aspect of the language is insane.

>Verification techniques and tools have made major advances

Yeah, like those crazy type system things you just dismissed as being
unconvincing, oversold, and baffling to experts. Go learn Agda and then tell
me nothing new has happened in 20 years.

~~~
munificent
> I hope the author spends some time learning about programming languages and
> programming language research.

The author is a programming language academic. See:
<http://tratt.net/laurie/research/pubs/>

~~~
papsosouid
I am aware. And as I said, I hope he spends some time learning about
programming languages and programming language research. Not just a tiny
subset of virtually identical languages that are not used for research. Look
at what he has published. His work has been entirely in the world of unityped
languages, and his characterization of type system research is that "some
stuff has happened but I don't know what it is, and it's types, so who cares".
Making claims about programming languages as a whole requires a far broader
knowledge base than he appears to have acquired. Very deep knowledge in a very
small space is certainly very useful, but not for making general statements
about broad topics.

