Right, that's what type inference is all about. It's impressive that they can take it that far without any 'input' from the programmer; that's something that should make its way into other languages.
Hindley–Milner type inference is a mathematically sound type inference system. Given certain not-entirely-unreasonable constraints, it is always correct: for a program written entirely within those constraints, HM will infer the correct (indeed, the most general) type for everything, no matter how large you make the program.
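For example (a toy sketch of my own, not anything from a real codebase): you can leave off every signature and GHC will still work out fully general types.

    -- A couple of definitions with no type signatures at all.
    -- Hindley–Milner inference works out the most general types:
    --   compose :: (b -> c) -> (a -> b) -> a -> c
    --   firstOf :: (a, b) -> a
    compose f g x = f (g x)
    firstOf (x, _) = x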
Unfortunately, the constraints are indeed a little limiting. Haskell offers many advanced features that technically break HM type inference. Thus, in Haskell it is considered good practice to always label your functions with their type signatures. Human-written signatures are often better anyway: the compiler may infer something more generic than you actually want (though still correct), or you may be able to "spell" the signature better using type synonyms that mean something to humans. Then, even if some function manages to confuse the type inferencer, the damage is contained.
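A made-up illustration of both points (the synonym and function names here are hypothetical):

    -- A synonym that means something to a human reader.
    type Celsius = Double

    -- Left unannotated, GHC would infer the more generic (but still correct)
    --   freezes :: (Ord a, Fractional a) => a -> Bool
    -- The written signature pins it down to the type we actually intend:
    freezes :: Celsius -> Bool
    freezes t = t <= 0.0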
It's fairly easy to do the type annotation, too; nobody has to know that what you did was leave it off, load it into the compiler, and ask the interpreter what the inferred type was. :) I'm sure Haskell experts don't ever have to do that, but I'm still learning, and there have definitely been times when I could write a function that worked as I expected but couldn't quite follow the type. So far I'm not confusing the type inferencer very often; mostly it just manifests as types I'd "spell" differently but that are equivalent.
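Roughly like this in GHCi (the function here is just a toy of mine, and the exact type-variable names can differ between GHC versions):

    Prelude> let apply2 f x = f (f x)
    Prelude> :type apply2
    apply2 :: (t -> t) -> t -> t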
Thus, in Haskell it is considered good practice to always label your functions with their type signatures.
Haskell is 20 years old, and this has been made 'good practice' only very recently -- I'd date it to dons's revival blitz a few years ago. A professor of mine who'd been using Haskell from the beginning was annoyed by their use.
that's something that should make its way into other languages.
How do you think javac knows when you get the types wrong? It's inferring types internally; it just makes you repeat yourself out of fealty to Bondage and Discipline.
"that's something that should make its way into other languages"
It's starting to, to some extent. C#, for example, has had type inference for local variables since version 3.0, though it still requires the "var" keyword when declaring them.
Thanks for the detailed explanation.