It's hard to tell with just the slides (and no talk), but it seems like this isn't presenting the complete/correct picture. Pip can use distribute or setuptools - it's not exactly an alternative to them, as slide 31 makes it seem from the text alone.
On a higher level, while it's interesting to know the evolution that things have taken, it's really not that complicated anymore - I haven't run into issues with Python packaging in ages. Pip is essentially fully featured (even including uninstallation) and is capable of installing from source, so I don't run into any issues when I use virtualenv (as everyone should be doing - and as has been integrated into CPython since 3.3 anyway, as venv).
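For concreteness, a minimal sketch of that workflow - the package name and file names here are just illustrative, and `venv` only ships with CPython 3.3+ (older versions need the third-party `virtualenv` package):

```shell
# Sketch of the virtualenv + pip workflow (package name is illustrative)
python3 -m venv .venv            # stdlib since 3.3; 'virtualenv .venv' on older Pythons
. .venv/bin/activate             # subsequent python/pip commands use this environment
pip install requests             # installs from a wheel or from source as needed
pip freeze > requirements.txt    # record the exact installed versions
pip uninstall -y requests        # uninstallation works too
```

Everything happens inside `.venv`, so deleting that directory throws the whole environment away without touching the system Python.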
Python packaging has experimented with several different approaches, but it seems to have found the magical combination - in fact, I often miss the virtualenv + pip approach when working with other packaging systems in other languages (such as Haskell's cabal).
Depends on your use-case. I wasn't at the talk (and haven't watched the video, yet), but apparently there was a bit of a 'heated' debate during the question portion of the talk. It sounds like some people from the scientific Python community have given up waiting for their issues to be addressed and are working on their own packaging solution to meet their needs.
If most of your modules are pure-Python or maybe a little bit of C, then you might not be running into the warts that the current packaging systems have.
Also, running 'pip install package' hides the lengths that developers/package maintainers have to go to to get things working correctly. That's also part of the discussion when talking about a decent packaging system.
> I often miss the virtualenv + pip approach
Perl's "perlbrew + cpanm" approach is more fully featured. I miss having a 'requirements.txt' type capability, but perlbrew downloads and compiles Perl itself, so each environment can have a different version of Perl - something that's a little more complex to do with virtualenv.