When you have no third-party dependencies, adding the first one requires picking among trade-offs and doing a lot of work. A subset of the choices: using virtualenv, using pip, using higher-layer tools, vendoring the code into the project, using a Python distribution that bundles it, writing code to avoid needing the dependency at all ...
* You have to document to humans and to the computer which of the approaches is being used
* Compiled extensions are a pain
* You have to consider multiple platforms and operating systems
* You have to consider Python version compatibility (e.g. the third-party package could support fewer Python versions than the current code base)
* And the version compatibility of the tools used to reference the dependency
* And a way of checking license compatibility
* The dependency may use different test, documentation, type-checking, etc. tools, so those may have to be added to the project workflow too
* It makes things harder for collaborators, since there is more complexity than "install Python and you are done"
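Even the plainest of those choices (virtualenv + pip) hands every collaborator a checklist rather than a single step. A minimal sketch, with `requests` standing in as the hypothetical first dependency:

```shell
# One-time setup by the person adding the dependency (macOS/Linux)
python3 -m venv .venv              # create an isolated environment
. .venv/bin/activate               # activate it in this shell
pip install requests               # install the first dependency
pip freeze > requirements.txt      # pin the full resolved set

# Every collaborator now has to know to do this, instead of just "install Python":
python3 -m venv .venv
. .venv/bin/activate
pip install -r requirements.txt
```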
I stand by my claim that the first paragraph (adding another dependency) is way less work than the rest, which is adding the very first one.
* https://github.com/jazzband/pip-tools
* https://docs.pipenv.org/en/latest/
* https://tox.readthedocs.io/en/latest/
I do think there's a lot of low-hanging fruit here: Python could bake something in to auto-set-up a virtualenv for a script entrypoint, have the developer just list the top-level dependencies, and keep the frozen dependency list version controlled too (and if the virtualenv & the frozen version-controlled dependency list disagree, rebuild the virtualenv).
I used it to distribute dependencies to Yarn workers for PySpark applications and it worked flawlessly, even with crazy dependencies like tensorflow. I'm a really big fan of the project; it's well done.