
What to know before debating type systems (2010) - davedx
http://blog.steveklabnik.com/posts/2010-07-17-what-to-know-before-debating-type-systems
======
jeffbush
"For most computing environments, performance is the problem of two decades
ago."

I've been hearing that performance isn't a problem for the last 20 years, yet
it seems like I still spend a lot of time waiting for computers and devices to
respond. I've worked in mobile and embedded (arguably not a small segment) for
a while, and every major product I've released has had a last-minute
optimization scramble. The growth of these spaces (and now wearable devices)
is only making this worse.

~~~
kaeluka
When people tell me they "don't worry about performance", I ask them how long
their smartphone battery lasts.

------
Aqueous
Informative article but I take issue with the following:

'It doesn't make sense to do the same kinds of exhaustive unit testing in
Haskell as you'd do in Ruby or Smalltalk. It's a waste of time. It's
interesting to note that the whole TDD movement comes from people who are
working in dynamically typed languages.'

Unit testing in dynamically typed languages serves two purposes: 1) to prove
type correctness, and 2) to prove behavioral correctness and completeness, i.e.
that the result of a function is correct for most or all input parameters. In
statically typed languages, 1) is already taken care of by the type system: if
your code compiles, there are no glaring type errors in it. However, you still
have 2) to worry about.

I work in Scala a lot, where there is a static type system. I still have
extensive unit tests, and I can't tell you how many times they've saved me
when I've changed code in one area of the codebase, thought it was sound and
complete, and then been immediately informed that it caused tests to fail,
even though it compiled correctly. For instance, I'll change code in a model
and think that the controller will still behave the same way, and unit testing
demonstrates immediately that it doesn't. This is especially true when I
didn't write the controller that depends on the model. Testing also helps you
rigorously exercise corner cases: when does this code throw an exception, when
does it return a negative number, etc.? It really helps you get a good sense
of when a piece of code is _done,_ and move on to the next piece.
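
A minimal TypeScript sketch of that situation (the `User`/`canEdit`/`dashboardFor` names are hypothetical, not from the thread): a refactor that flips a boolean keeps the same signature, so everything still type-checks, and only a behavioral test notices.

```typescript
interface User { name: string; admin: boolean; }

// "Model": suppose a refactor accidentally negated this check.
// The signature is unchanged, so every caller still compiles.
function canEdit(user: User): boolean {
  return !user.admin; // bug: was `user.admin` before the refactor
}

// "Controller" code depending on the model.
function dashboardFor(user: User): string {
  return canEdit(user) ? "admin-dashboard" : "read-only";
}

// A unit test catches what the compiler cannot: alice is an admin,
// so this assertion now fails at test time, not in production.
const alice: User = { name: "alice", admin: true };
console.assert(dashboardFor(alice) === "admin-dashboard",
  "regression: admins should see the admin dashboard");
```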

Unit testing is indispensable in both statically and dynamically typed
languages, and I don't think extensive coverage is a waste of time in
statically typed ones.

~~~
stephencanon
Unit tests almost never prove correctness or completeness. For a non-trivial
program, the best you can hope to demonstrate is correctness for the arguments
tested in the state determined by the test environment. If you want to prove
correctness, you need proofs.

By and large, unit tests exist to quickly find bugs.
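
A small TypeScript illustration of that limit (the `clamp` example is mine, not from the thread): the tests below all pass, but they only demonstrate correctness for the three inputs actually exercised.

```typescript
// A buggy clamp: it enforces the upper bound but never the lower one.
function clamp(x: number, lo: number, hi: number): number {
  if (x > hi) return hi;
  return x; // bug: should return `lo` when x < lo
}

// These assertions pass -- yet they prove nothing about inputs below `lo`.
console.assert(clamp(5, 0, 10) === 5);
console.assert(clamp(15, 0, 10) === 10);
console.assert(clamp(0, 0, 10) === 0);
// Untested case: clamp(-3, 0, 10) returns -3 instead of 0.
```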

~~~
mrbrowning
As I like to put it: unit tests don't affirm the correctness of code with
respect to a specification; they merely affirm that certain implications of
the specification hold for the code under test. In that sense, the general
tendency to assume correctness in the face of a passing test suite is
basically an instance of affirming the consequent on a huge scale.

------
eatonphil
"One reason is that the average skill level of programmers who know Ruby is
higher than those who know Java, for example."

Had to double-check the year on this one. Things must certainly have
changed... For instance, Java is a staple of most CS curricula. If not Java,
then often Python. /Never/ have I heard of Ruby being the primary language of
choice of a CS department. (Nothing against Ruby, just an observation.)

On the other hand, what is the most popular language/framework for a beginning
programmer to learn today? Ruby/Rails or Node? Whereas Java is the
quintessential Corporate language.

I'm obviously making some generalizations here. But things certainly seemed to
have changed a lot!

~~~
eru
I'm not sure if the author is just quoting the `reason' here, or actually
asserting it.

~~~
eatonphil
Certainly! I noticed the assertion and was confused - until I checked the year.
Then I listed some of the points relating to my confusion. What is apparent is
that I am just missing some historical context. At some point, as the author
asserts, Ruby appears to have been a more academic/corporate(?) language than
Java. I did not know that.

------
platz
Guy Steele: "A dynamic language is one that defers as many decisions as
possible until runtime."
[https://youtu.be/agw-wlHGi0E?t=24m11s](https://youtu.be/agw-wlHGi0E?t=24m11s)

------
trejitus
Thanks for posting this; I was thinking about this topic recently while at
Angular U Conference, seeing how TypeScript is becoming the de facto
language in that community for Angular 2 development.

------
capicue
_From a theoretical perspective, preventing infinite loops is in a very deep
sense the most basic possible thing you can do with static types! The simply-
typed lambda calculus, on which all other type systems are based, proves that
programs terminate in a finite amount of time._

Is this not claiming that static typing (and/or lambda calculus) solves the
halting problem?

~~~
tikhonj
No. Rather, some typed variants of the lambda calculus _are not Turing-
complete_. They prevent infinite loops without running into the halting
problem by being strictly less powerful than normal programming languages.

It's worth noting that this is not an either/or problem. Some modern
dependently typed languages take a more nuanced approach: they have a less-
powerful core with enforced termination¹ and a way to write potentially non-
terminating code that's isolated from everything else by the type system. It's
the same approach that Haskell uses successfully for IO—it allows IO, but only
in a controlled and explicitly delimited part of the program.

In a very real sense, we can treat potential non-termination as yet another
effect and manage it accordingly.

¹ Actually, they're even more nuanced than this: programs either have to
provably terminate or be provably "productive", which means they produce some
new output from new input in finite time. This allows languages to support
beneficial infinite loops like the event loop in an operating system while
preventing purely harmful busy loops that never get anywhere.
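
The "productive" idea can be sketched in TypeScript with a generator (my illustration, not from the comment): the loop never terminates, but each step produces new output in finite time, so a consumer can always make progress.

```typescript
// A productive infinite loop: like an event loop, it never terminates,
// but every `next()` call yields a new value after finite work.
function* counter(): Generator<number> {
  let n = 0;
  while (true) yield n++;
}

// Any consumer can safely take a finite prefix of the infinite stream.
function take(gen: Generator<number>, k: number): number[] {
  const out: number[] = [];
  for (let i = 0; i < k; i++) {
    const step = gen.next();
    if (step.done) break;
    out.push(step.value);
  }
  return out;
}
```

A busy loop that never yields anything, by contrast, would be rejected by such a termination checker: it is infinite without being productive.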

------
kazinator
Claim: _Dynamic and static type systems are two completely different things,
whose goals happen to partially overlap._

Immediately contradicted: _A static type system is a mechanism by which a
compiler examines source code and assigns labels (called "types") to pieces of
the syntax, and then uses them to infer something about the program's
behavior._

This is possible to do with a program that is understood as being dynamically
typed.

Moreover, whenever a type _can_ be assigned to a piece of syntax, it will
agree with the dynamic type.

I.e. not "completely different things".
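
A tiny TypeScript sketch of that agreement (my example): the compiler assigns static labels to code written in a dynamic style, and where it can, the label matches the runtime tag.

```typescript
// TypeScript statically narrows `v` using the *dynamic* `typeof` check:
// inside each branch, the static label agrees with the runtime type.
function describe(v: number | string): string {
  if (typeof v === "number") {
    return `number:${v.toFixed(0)}`; // statically `number` here
  }
  return `string:${v.toUpperCase()}`; // statically `string` here
}
```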

~~~
AnimalMuppet
I think you're ignoring the rest of the article, which says that static and
dynamic types can do some overlapping things. So:

> This is possible to do with a program that is understood as being
> dynamically typed.

Well and good, _but it usually isn't_. So the article's point is valid.

------
pron
A few comments:

1\. This debate is not (or is no longer) binary. There is a whole spectrum of
static-typing richness, from dynamic typing to well beyond Idris. The debate,
then, is not whether we should be at either extreme, but where along the
spectrum (including, maybe, the extremes). The interesting question to pose to
proponents of very rich type systems, then, is "should we always use the
richest type system possible?", and if not, why not?

The whole debate -- which, again, is not binary -- hides in two of the
questions the author lists at the end: "How easy can it be made to program in
a language... ?" Nobody argues that the richer the type system, the more
properties can be proven (except for one major caveat I mention in item 3).
The debate isn't about which character strings containing programs can be
rejected or accepted. Programming language design is 5% theory and 95%
psychology. Which strings the compiler accepts is a question for PL theorists.
Which language is "better" is a question that is 100% psychological, and has
absolutely nothing to do with one theory or another. It's a question PL
researchers are simply not qualified to answer.

(Also, even before psychology, it may well be that moving from one point along
the spectrum towards richer typing has a cost -- a simple, economic cost in
development time or in performance -- that is higher than the benefit of the
extra correctness. Most programs need to be correct only up to a point. Why
should they pay for a type system that will make them more correct than they
need to be?)

2\. The question, "How close can we bring those test suites to the
unattainable ideal of never accepting a broken program?", directed at dynamic
typing, can also be directed at static typing. True, a statically typed
language could meet that ideal trivially by rejecting _all_ programs, but
practical type systems will accept broken programs _as well as_ reject correct
ones, simply because there are properties the type system won't verify.
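
For example (my sketch, not from the comment), a program can be perfectly well-typed and still broken, because the relevant property lives outside the type system:

```typescript
// Both parameters have type `number`, so swapping them still type-checks.
function withdraw(balance: number, amount: number): number {
  return balance - amount;
}

// A caller that swaps the arguments compiles cleanly but is wrong:
const oops = withdraw(20, 100); // meant withdraw(100, 20); yields -80
```

A richer type system (e.g. distinct unit-like types for balances and amounts) could catch this particular bug, which is exactly the "where along the spectrum" trade-off in item 1.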

3\. Testing isn't the only way to verify dynamically typed programs -- or
_any_ program along the typing spectrum -- for properties that aren't enforced
by its type system. There are quite good formal-verification methods that
don't rely on types. True, the author uses a confusing, self-referential
(undecidable? :)) definition of types -- "a type is a label used by a type
system to prove some property of the program's behavior" -- that is hard to
argue with ("a type is something the type system uses"). But there are
"property annotations", in the form of the pre- and post-conditions used by
verifiers, that aren't normally regarded as types (they don't form a
particularly nice algebra) and that may be used at any point along the
spectrum to prove properties beyond the reach of the particular type system.
The implication that types are the _only_ way to prove correctness is wrong.
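
A contract-style sketch of such annotations in TypeScript (checked here at runtime; real verifiers discharge conditions like these statically -- the `integerSqrt` example is hypothetical):

```typescript
// Pre- and post-conditions expressed as contracts, not as types.
function integerSqrt(n: number): number {
  // Precondition: n is a non-negative integer.
  if (!Number.isInteger(n) || n < 0) {
    throw new RangeError("precondition violated: n must be a non-negative integer");
  }
  const r = Math.floor(Math.sqrt(n));
  // Postcondition: r is the largest integer with r*r <= n.
  console.assert(r * r <= n && (r + 1) * (r + 1) > n, "postcondition violated");
  return r;
}
```

No mainstream type system assigns `n` a type like "non-negative integer whose square root we return", yet a verifier can prove exactly that property from these annotations.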

