Covariance and Contravariance: a fresh look at an old issue [pdf] (univ-paris-diderot.fr)
62 points by bassislife on May 17, 2016 | 26 comments



The way I explain variance is with a "bakers and chefs" example. Chefs turn ingredients into food and bakers turn dough into bread. Since dough is an ingredient and bread is a food, you may think all bakers are chefs. But this isn't true, because a chef should be able to handle any ingredient, and a baker may not necessarily know what to do with broccoli.
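
Concretely, in Java terms (a rough sketch with made-up Ingredient/Dough/Food/Bread types and a hypothetical Kitchen class), a baker is not usable where a chef is required, but someone who accepts any ingredient and always produces bread is:

  import java.util.function.Function;

  class Ingredient {}
  class Dough extends Ingredient {}
  class Food {}
  class Bread extends Food {}

  class Kitchen {
    // A "chef" must accept any ingredient and produce some kind of food.
    static void hireChef(Function<? super Ingredient, ? extends Food> chef) {}

    static void demo(Function<Dough, Bread> baker,
                     Function<Ingredient, Bread> versatileBaker) {
      // hireChef(baker);       // rejected: a baker only promises to handle Dough, not broccoli
      hireChef(versatileBaker); // accepted: parameter is contravariant, result is covariant
    }
  }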


Prior submission a year ago, not much in the way of comments at the time: https://news.ycombinator.com/item?id=9771718


A shorter, possibly easier to understand writeup from six months ago: https://www.mikeash.com/pyblog/friday-qa-2015-11-20-covarian...

I believe it was written by an HN commenter, and accumulated 67 comments: https://news.ycombinator.com/item?id=10606355


A very nice explanation of what covariance and contravariance mean, especially in the context of subtyping (with potential parametric polymorphism).

And implementation details are provided. An undergrad should be able to understand this.


Covariance and contravariance are wonderful ways to notice key OO design problems. Unfortunately it is very hard to learn to think that way.

Consider an OO language with classes and inheritance. Each class is a type. (There may be types that aren't classes; for example, in Java an interface also represents a type. The equivalent in a dynamic language with duck typing is "all objects that satisfy this contract".) An object belongs to the type of its class, and of every class it inherits from. (For instance an integer is an integer, a number, and an object.)

So far, so good.

Covariance occurs when we can use objects of any subtype freely. For instance we can insert integers into a list of numbers.

Contravariance occurs when we can allow an object to be of some supertype freely. For instance it is safe to assume that numbers from our list of numbers are objects.

The problem is that we can almost never do both. For example in Java you can put integers into a list of numbers, but you can't read numbers out of a list of numbers and assume that they will be integers! (Not even if you only put integers in - the type system won't let you do it.)
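
A minimal sketch of that asymmetry (hypothetical VarianceDemo class, plain generics, nothing clever):

  import java.util.ArrayList;
  import java.util.List;

  class VarianceDemo {
    public static void main(String[] args) {
      List<Number> numbers = new ArrayList<>();
      numbers.add(42);                // fine: an Integer is a Number
      Number n = numbers.get(0);      // fine: get() only promises a Number
      // Integer i = numbers.get(0);  // won't compile, even though we only ever added Integers
      System.out.println(n);
    }
  }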

So, which do we want? Well, sometimes one, and sometimes the other. For example the Liskov substitution principle says that an object of a subtype should be usable anywhere we can use an object of the original type. Which means that if we override a method in a subclass, we are OK if we change the method's signature to accept a supertype, but are breaking the rule if we change it to require a subtype of the original.

Unfortunately it sometimes makes sense to have a subtype override a method and require a subtype be passed in. The paper offers a graphics example involving colors.
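
I don't have the paper's exact code handy, but the shape of the problem is roughly this (hypothetical Point/ColoredPoint names):

  class Point {
    boolean sameAs(Point other) { return true; /* compare coordinates */ }
  }

  class ColoredPoint extends Point {
    // Tempting: require another ColoredPoint so we can compare colors too.
    // But narrowing the parameter to a subtype makes this an overload, not an
    // override, so code that treats a ColoredPoint as a Point silently skips
    // the color check.
    boolean sameAs(ColoredPoint other) { return true; /* compare coordinates and color */ }
  }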

When we have this, we have 3 options:

1) Disallow it because the type system can't easily guarantee that things won't break. (Static languages like Java mostly do this.)

2) Assume that the programmer isn't an idiot, then throw run-time errors if the programmer was. (Most dynamic languages do that.)

3) Build a sophisticated type system that can figure things out and reason about problem cases in a clever way. (This is what the author would like language designers to do.)

Unfortunately for the author, there is a chicken and egg problem here. Few programmers understand the sophisticated type system required for the reasoning solution, or can understand the weird errors that the type system can give you to say why it won't let you do something stupid. So developers shy away from languages that provide such types. Therefore there is little demand for languages that provide it.

As a result, language designers have little reason to do anything other than either simple type systems with easy-to-understand checks, or dynamic dispatch with run-time errors. This is frustrating for people who have put effort into designing clever type systems that provide programming flexibility while automatically catching common classes of errors.


> Unfortunately for the author, there is a chicken and egg problem here. Few programmers understand the sophisticated type system required for the reasoning solution, or can understand the weird errors that the type system can give you to say why it won't let you do something stupid. So developers shy away from languages that provide such types. Therefore there is little demand for languages that provide it.

Somehow Haskell and OCaml programmers manage to get by! OCaml has proper variance management built into the core language. Similarly, in GHC Haskell with Rank2Types (or anything that subsumes it) enabled, this is what lets you say things like “every Lens is a Traversal”: Lens (resp. Traversal) has a Functor (resp. Applicative) constraint in contravariant position in what's otherwise the same type, and Applicative is a subclass of Functor, so Lens is a subtype of Traversal.

The notions of covariance and contravariance are too natural and useful to get rid of. If your type system doesn't have them, people will work around it to express as much variance as they need. Except the workarounds will be clumsy, ad hoc and most likely incorrect.


Most programmers shy away from Haskell and OCaml. :-P

Seriously, the average programmer trying to learn Haskell starts with wanting to print "Hello, world", eventually winds up at a tutorial about monads, then retires with their head spinning. Haskell stays on the "I should learn that some day" bucket list and remains unlearned.

This is not to say that you don't have plenty who do learn them. But now we have another problem. One of the biggest reasons to use a language is available libraries. Because of the initial barriers to entry for these more sophisticated languages, there is a smaller pool of people writing useful libraries. Which means, in the real world, that when you want to get something done, you'll be more likely to find what you need pre-written if you use a more mainstream language.

Just to get a sense, in the (admittedly highly flawed) TIOBE index, the top language with a strong inference system is Scala, and the next is F#, then Haskell, and nothing else is in the top 50. The sum of popularities for these three would tie with Groovy at #18.

I have never written anything more than a toy program in any of these languages. I doubt I ever will.


> The sum of popularities for these three would tie with Groovy at #18

You said "highly flawed TIOBE index". Look at http://www.tiobe.com/tiobe_index?page=Groovy and notice the sudden jump from 0.33% in Nov 2015 to 1.2% a month later. I would say grossly unbelievable, even massively fabricated. I suspect Groovy's backers pulled off this trick by having "Gradle DSL" added to their Groovy keyword list, but with the announcement that Gradle is adding Kotlin to its supported DSL languages, Groovy's top 20 listing won't last.


Unfortunately, I think this speaks more to the failing humans have in assessing risk in complex systems. The cheaper-to-learn (considering time and effort required) languages that provide fewer safeguards routinely get far more new users over time. These languages can be useful in their ease of use (hey, I'm a Perl programmer, so I can't knock them entirely), but they have a far wider share of the market than I think is warranted.


ahem

PHP


Indeed.[1]

There's definitely a trade-off between on-boarding new programmers and teaching good software engineering practices. I think that programming language mind-share is so overvalued at this point that to some degree we are shooting ourselves in the foot. There's some point at which the number of new users you can get based on how easy your language is to learn and the average quality of the software produced in that language are at an equilibrium[2], and I think we've seen a few languages which have blown straight past that equilibrium.

1: As I suspect most Perl programmers feel when they've had to spend time writing PHP, the experience can be infuriating. Here's a language that purported to take a lot of ideas from Perl, and all you can conclude is that they took all the wrong things.

2: Obviously affected by other language traits as well. I think Python keeps a higher average quality than you would think likely through strict dogmatism, so that works well for them, even if it doesn't attract me.


Is there really enough value in subtyping to keep it around?

Sure, lots of things in the real world are naively "is-a" relationships, eg.

  interface Fruit { 
    boolean isSoft();
  }

  class Apple implements Fruit {
    boolean isSoft() { ... }
  }

  class Banana implements Fruit {
    boolean isSoft() { ... }
  }
But this is both less explicit and less flexible than modelling it as a "has-a" relationship, eg.

  interface Fruit { 
    boolean isSoft();
  }

  class Apple {
    Fruit asFruit() { ... }
  }

  class Banana {
    Fruit asFruit() { ... }
  }
The latter example needs no subtyping, and thus no covariance and no contravariance. All for the price of an explicit .asFruit() here and there instead of an implicit upcast.
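
At the call site the trade looks something like this (continuing the hypothetical Apple/Fruit sketch):

  List<Fruit> basket = new ArrayList<>();
  basket.add(new Apple().asFruit());  // has-a design: explicit conversion
  // basket.add(new Apple());         // is-a design: the upcast would be implicit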


Inheritance isn't the same thing as subtyping:

(0) Inheritance is a (rather undisciplined) form of code reuse - it's literally automation for copying and pasting part of an existing definition into the body of another. It doesn't presuppose a notion of type.

(1) Subtyping is a semantic relationship between two types: all terms of a subtype also inhabit its supertype(s).

There's nothing too wrong with inheritance as long as you're aware that it doesn't always lead to the creation of subtypes. This is, for example, the case in OCaml.

Sadly, Java, C# and C++ confuse matters by conflating classes with types (which is tolerable) and subclasses with subtypes (which is a logical absurdity and leads to painful workarounds, I mean, design patterns, as we all have learnt the hard way).


Java-style (nominal) subtyping is what most people are familiar with, and the most common reason why people think subtyping is necessary, so let's not stray into other kinds of subtyping until we can agree on this kind.


I never said subtyping isn't necessary, and if you read my reply to btilly, you'd see that I actually suggested otherwise: subtyping is basic, natural and necessary, so languages should do it right.

Also, as I again previously said, nominal typing and even nominal subtyping are fine (well, I said “tolerable”, since they have downsides for modularity, but that's a topic for another day), but conflating inheritance with subtyping is a problem. To put it in Java terms, a subclass should only be considered a subtype if:

(0) The subclass doesn't override any methods that aren't abstract in the superclass. A subclass can do whatever its implementor wishes, but a subtype can't behave differently from a supertype.

(1) The subclass doesn't directly mutate any inherited fields from the superclass - this destroys inherited invariants. OTOH, reading inherited fields is just fine in a subtype.

In other words, a subclass is a subtype if and only if the type-checker has enough information to tell that the Liskov substitution principle actually holds.
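
A made-up illustration of both conditions (hypothetical Counter/SkippingCounter, not from the paper):

  class Counter {
    protected int count = 0;
    void step() { count += 1; }   // implicit invariant: each step adds exactly 1
    int total() { return count; }
  }

  class SkippingCounter extends Counter {
    @Override
    void step() { count += 2; }   // overrides a concrete method *and* mutates an
                                  // inherited field; code written against Counter's
                                  // contract breaks, so this subclass isn't a subtype
  }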


This can be summed up with "favor composition over inheritance". :-)

Indeed it is a good idea to use composition whenever feasible. But your problems aren't over. Suppose you write a method that can accept anything that implements the Fruit interface. You've got covariance again. Suppose you have a dictionary whose values are of type Apple. You can pass those values into that method. That's contravariance again. And so it goes.

It doesn't matter whether you're defining types by classes, or what interfaces you implement. You will have types of some sort, and as soon as you do, you have covariance and contravariance as concepts again.


You can't have co/contravariance without subtyping. Remove the extends and implements keywords from Java, and you're rid of it. You can still create instances of Fruit using anonymous classes.


Simple things like adding a list of Apples to a list of Fruits become non-trivial.


Non-trivial? In Scala it would be:

    fruits ++= apples.map(_.asFruit)


That part is fine. The difficulty arises when you want to take out an apple from a list of fruits.


Why should you be able to do that?


Right, if you take type-safety first, you shouldn't be able to. But a list of abstract fruits has very limited usage without the ability to access concrete fruit instances. The adoption of downcasting in some OO languages came out of such needs, given that they lacked generics and/or algebraic types. And in that regard, I thought your solution didn't address the original covariance/contravariance problem (that is, wanting a heterogeneous list of fruits while still being able to access the concrete types of individual elements).

There are type-safe ways, like making a fruit a sum type of apple and banana, or using traits or type classes, etc. But the is-a/has-a discussion seems a bit beside that point.


> I thought your solution didn't address

FWIW, I didn't propose the original Scala snippet, so it isn't really “my” solution.

> the original covariance/contravariance problem

The problem is precisely preventing what you're trying to do, because it's unsafe.


Ah, ok, I mixed up you with continuational.

For this context, what I'm saying is that if you don't need to extract an apple from a list of fruits, then you can do it safely with inheritance, so whether you use inheritance or delegation is irrelevant. Isn't it?


If you want to extract an apple from a list of fruits, you need runtime type information (so that you can test whether a particular fruit object happens to be an apple), which has nothing to do with whether your language has inheritance or not. Using runtime type information is a symptom of bad design, though, since it lessens the extent to which you can reason about programs (as pieces of text) by just looking at their types.
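
In Java terms (assuming the earlier is-a style Fruit/Apple design), that runtime test is the familiar instanceof-plus-downcast:

  for (Fruit f : basket) {
    if (f instanceof Apple) {   // runtime type information
      Apple a = (Apple) f;      // a downcast the compiler can't verify statically
      // ... apple-specific code ...
    }
  }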


I agree with you all. All I'm saying is that suggesting delegation in place of inheritance to address the contravariance issue seems off the mark.



