
Covariance and Contravariance: a fresh look at an old issue [pdf] - bassislife
https://www.irif.univ-paris-diderot.fr/~gc/papers/covcon-again.pdf
======
hardmath123
The way I explain variance is with a "bakers and chefs" example. Chefs turn
ingredients into food and bakers turn dough into bread. Since dough is an
ingredient and bread is a food, you may think all bakers are chefs. But this
isn't true, because a chef should be able to handle _any_ ingredient, and a
baker may not necessarily know what to do with broccoli.
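In function-type terms: argument types are contravariant and return types
are covariant. Here is a minimal Java sketch of the analogy (all the names
here are mine, purely for illustration):

    import java.util.function.Function;

    interface Ingredient {}
    class Dough implements Ingredient {}
    interface Food {}
    class Bread implements Food {}

    class Kitchen {
        // Anyone who can turn Dough into Bread will do: the argument is
        // contravariant (? super), the result covariant (? extends).
        static Bread bake(Function<? super Dough, ? extends Bread> baker, Dough d) {
            return baker.apply(d);
        }

        public static void main(String[] args) {
            // A chef who makes bread out of *any* ingredient counts as a baker...
            Function<Ingredient, Bread> breadChef = i -> new Bread();
            bake(breadChef, new Dough());
            // ...but a Function<Dough, Bread> would not count as a
            // Function<Ingredient, Food>: it can't handle broccoli.
        }
    }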

------
kbenson
Prior submission a year ago, not much in the way of comments at the time:
[https://news.ycombinator.com/item?id=9771718](https://news.ycombinator.com/item?id=9771718)

------
venning
A shorter, possibly easier to understand writeup from six months ago:
[https://www.mikeash.com/pyblog/friday-qa-2015-11-20-covariance-and-contravariance.html](https://www.mikeash.com/pyblog/friday-qa-2015-11-20-covariance-and-contravariance.html)

I believe it was written by an HN commenter, and accumulated 67 comments:
[https://news.ycombinator.com/item?id=10606355](https://news.ycombinator.com/item?id=10606355)

------
bassislife
A very nice explanation about what covariance and contravariance mean,
especially in the context of subtyping (with potential parametric
polymorphism).

And implementation details are provided. An undergrad should be able to
understand this.

------
btilly
Covariance and contravariance are wonderful ways to notice key OO design
problems. Unfortunately it is very hard to learn to think that way.

Consider an OO language with classes and inheritance. Each class is a type.
(There may be types that aren't classes; for example, in Java an interface
also represents a type. The equivalent in a dynamic language with duck typing
is "all objects that satisfy this contract".) An object belongs to the type of
its class, and to that of every class it inherits from. (For instance an
integer is an integer, a number, and an object.)

So far, so good.

Covariance occurs when we can use objects of any subtype freely. For instance
we can insert integers into a list of numbers.

Contravariance occurs when we can freely treat an object as belonging to some
supertype. For instance it is safe to assume that the numbers in our list of
numbers are objects.

The problem is that we can almost never do both. For example in Java you can
put integers into a list of numbers, but you can't read numbers out of a list
of numbers and assume that they will be integers! (Not even if you only put
integers in - the type system won't let you do it.)
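To make this concrete, here is a minimal Java sketch (the names are mine, not
btilly's) showing both directions, including the wildcard forms Java uses to
pick one variance at a time:

    import java.util.ArrayList;
    import java.util.List;

    public class ListVariance {
        public static void main(String[] args) {
            List<Number> numbers = new ArrayList<>();
            numbers.add(42);               // fine: an Integer is a Number
            Number n = numbers.get(0);     // fine: elements come out as Numbers
            // Integer i = numbers.get(0); // compile error: might not be an Integer

            // Wildcards let you pick one direction at a time:
            List<? extends Number> readable = numbers;
            Number m = readable.get(0);    // covariant view: reading is safe
            // readable.add(42);           // compile error: writing is forbidden

            List<? super Integer> writable = numbers;
            writable.add(7);               // contravariant view: writing is safe
            Object o = writable.get(0);    // reads only come back as Object
        }
    }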

So, which do we want? Well, sometimes one, and sometimes the other. For
example the Liskov substitution principle says that an object of a subtype
should be usable anywhere we can use an object of the original type. Which
means that if we override a method in a subclass, we are OK if we change the
method's signature to accept a supertype, but are breaking the rule if we
change it to require a subtype of the original.
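Concretely, a small sketch with made-up names: return types may safely vary
covariantly when overriding, and parameter types contravariantly. Java only
enforces the first half; a changed parameter type is silently treated as an
overload rather than an override.

    class Animal {}
    class Dog extends Animal {}

    class Shelter {
        Animal adopt() { return new Animal(); }
    }

    class DogShelter extends Shelter {
        // Covariant return type: every caller expecting an Animal
        // still gets one, so substitution is preserved.
        @Override
        Dog adopt() { return new Dog(); }
    }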

Unfortunately it sometimes makes sense to have a subtype override a method and
require a subtype be passed in. The paper offers a graphics example involving
colors.
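A hypothetical Java rendering of that kind of problem (the names are
illustrative, not the paper's):

    class Point {
        int x, y;
        boolean sameAs(Point other) { return x == other.x && y == other.y; }
    }

    class ColorPoint extends Point {
        String color;
        // We want to compare colors too, which only a ColorPoint has. But
        // narrowing the parameter breaks substitution: code holding a Point
        // could no longer pass an arbitrary Point. Java sidesteps the issue
        // by treating this as an overload, so a ColorPoint reached through a
        // Point reference silently gets the color-blind comparison instead.
        boolean sameAs(ColorPoint other) {
            return super.sameAs(other) && color.equals(other.color);
        }
    }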

When we have this, we have three options:

1) Disallow it, because the type system can't easily guarantee that things
won't break. (Static languages like Java mostly do this.)

2) Assume the programmer isn't an idiot, and throw run-time errors if the
programmer was. (Most dynamic languages do this.)

3) Build a sophisticated type system that can reason out the problem cases in
a clever way. (This is what the author would like language designers to do.)
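Java itself ships a well-known instance of option 2: arrays are covariant,
and the assumption is checked at run time. A minimal demonstration:

    public class ArrayStore {
        public static void main(String[] args) {
            Object[] objects = new String[1];  // covariant: String[] <: Object[]
            objects[0] = 42;                   // compiles, but throws
                                               // ArrayStoreException at run time
        }
    }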

Unfortunately for the author, there is a chicken and egg problem here. Few
programmers understand the sophisticated type system required for the
reasoning solution, or can understand the weird errors that the type system
can give you to say why it won't let you do something stupid. So developers
shy away from languages with such type systems, and therefore there is little
demand for them.

As a result, language designers have little reason to build anything other
than either simple type systems with easy-to-understand checks, or dynamic
dispatch with run-time errors. This is a frustration to people who have put
effort into designing clever type systems that provide programming
flexibility and automatically catch common classes of errors.

~~~
continuational
Is there really enough value in subtyping to keep it around?

Sure, lots of things in the real world are naively "is-a" relationships, e.g.

    
    
      interface Fruit {
        boolean isSoft();
      }

      class Apple implements Fruit {
        public boolean isSoft() { ... }
      }

      class Banana implements Fruit {
        public boolean isSoft() { ... }
      }
    

But this is both less explicit and less flexible than modelling it as a
"has-a" relationship, eg.

    
    
      interface Fruit { 
        boolean isSoft();
      }
    
      class Apple {
        Fruit asFruit() { ... }
      }
    
      class Banana {
        Fruit asFruit() { ... }
      }
    

The latter example needs no subtyping, and thus no covariance and no
contravariance. All for the price of an explicit .asFruit() here and there
instead of an implicit upcast.
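For a caller the difference really is just that conversion. Assuming the
Fruit interface above and a hypothetical describe helper (mine, not from
the comment):

    static void describe(Fruit f) {
      System.out.println(f.isSoft() ? "soft" : "firm");
    }

    // is-a version:  describe(apple);            // implicit upcast
    // has-a version: describe(apple.asFruit());  // explicit conversion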

~~~
catnaroek
Inheritance isn't the same thing as subtyping:

(0) Inheritance is a (rather undisciplined) form of code reuse - it's
literally automation for copying and pasting part of an existing definition
into the body of another. It doesn't presuppose a notion of type.

(1) Subtyping is a semantic relationship between two types: all terms of a
subtype also inhabit its supertype(s).

There's nothing _too_ wrong with inheritance as long as you're aware that it
doesn't always lead to the creation of subtypes. This is, for example, the
case in OCaml.

Sadly, Java, C# and C++ confuse matters by conflating classes with types
(which is tolerable) and subclasses with subtypes (which is a logical
absurdity and leads to painful workarounds, I mean, design patterns, as we all
have learnt the hard way).

~~~
continuational
Java-style (nominal) subtyping is what most people are familiar with, and it's
the most common reason why people think subtyping is necessary, so let's not
stray into other kinds of subtyping until we can agree on this kind.

~~~
catnaroek
I never said subtyping isn't necessary, and if you read my reply to btilly,
you'd see that I actually suggested otherwise: subtyping is basic, natural and
necessary, so languages should do it right.

Also, as I said previously, nominal typing and even nominal subtyping
are fine (well, I said “tolerable”, since they have downsides for modularity,
but that's a topic for another day), but _conflating inheritance with
subtyping_ is a problem. To put it in Java terms, a subclass should only be
considered a subtype if:

(0) The subclass doesn't override any methods that aren't abstract in the
superclass. A subclass can do whatever its implementor wishes, but a _subtype_
can't behave differently from a supertype.

(1) The subclass doesn't directly mutate any inherited fields from the
superclass - this destroys inherited invariants. OTOH, reading inherited
fields is just fine in a subtype.

In other words, a subclass is a subtype if and only if the type-checker has
enough information to tell that the Liskov substitution principle actually
holds.
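In Java terms, here is a sketch of both conditions failing (the Counter names
are mine, just for illustration):

    class Counter {
      protected int count = 0;            // intended invariant: never decreases
      void increment() { count++; }
      int value() { return count; }
    }

    // Violates (1): directly mutates an inherited field, destroying the
    // "never decreases" invariant. A subclass, but not a subtype.
    class ResettableCounter extends Counter {
      void reset() { count = 0; }
    }

    // Violates (0): overrides a method that isn't abstract in the
    // superclass, so it can observably behave differently.
    class DoubleCounter extends Counter {
      @Override void increment() { count += 2; }
    }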

