I believe the use cases are orthogonal at best. To distill it: packages are great for dependency management and application installs (read many Dockerfiles and you'll see the common approach is to have the package manager do most of that work), while Docker is great at combining those technologies and providing two types of experience:
1) Developer intent. It is up to the developer to specify that the application receives traffic on specific ports, that it stores persistent data in a specific location, and that it should run as a specific user.
2) Fulfillment (sysops). This is a prod environment? Let's put that storage on an NBD instead of local disk. Need static port allocation? Map it at run time. Host-based routing? Run time. (A quick sketch of the split follows below.)
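To make the split concrete, here's a minimal sketch. The image name, package, user, and paths are made up; the point is that the Dockerfile captures developer intent, while the run command is where ops fulfills it differently per environment.

    # Developer intent, baked into the image (names and paths are hypothetical)
    FROM debian:stable-slim
    RUN apt-get update && apt-get install -y myapp && useradd -r myapp
    # the app receives traffic on this port
    EXPOSE 8080
    # persistent data lives here
    VOLUME /var/lib/myapp
    # run as this user, not root
    USER myapp
    CMD ["myapp"]

    # Fulfillment at run time, decided by ops per environment:
    # static host port mapping, data on an NBD-backed mount instead of local disk
    docker run -d -p 443:8080 -v /mnt/nbd0/myapp:/var/lib/myapp myapp:1.4

Same image either way; a dev laptop and a prod box just pass different flags at run time.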
I've found that the duality of the roles here can be quite powerful. And I believe it can only get better.
As a developer you could build something, install it on a machine, and that machine could run multiple versions of the same application at the same time, even with different ABIs. There were build servers, and all the build scripts were automated and VCS-managed. You could package config changes or applications. You could go back and rebuild old crap nobody had looked at in three years, and have it actually work. Ops and devs could both use it independently, with ops having the ability to overwrite dev changes. It was slightly clunky, but the functionality was beautiful.
Decentralized, distributed, automated, auditable, and able to support maintenance of pretty much any kind without interrupting existing services. It was fucking sweet, and I've never seen another tool that could match it.
It all depends on the use case. Having everything is pretty much the same as having nothing.
Ahh yes, grasshopper. But neither situation is the pure folly of being attached to the idea of such possession!