
Hypermodern Python - BerislavLopac
https://cjolowicz.github.io/posts/hypermodern-python-01-setup/
======
jalons
This highlights the main issue I have with python today, and that's
running python apps.

Having to ship a compiler to a host or container to build python with pyenv,
needing all kinds of development headers like libffi for poetry. The hoopla
around poetry init (or pipenv's equivalent) to get a deterministic result for
the environment and all packages. Or you use requirements files, and don't get
deterministic results.

Or you effectively use an entire operating system on top of your OS to run a
conda derivative.

And we still haven't executed a lick of python.

Then there's the rigmarole around getting one of these environments to play
nice with cron, having to manipulate your PATH so you can manipulate your PATH
further to make the call.

It's really got me questioning my assumptions about what language "quick
wins" should be written in.

~~~
doorstar
I decided to drag myself kicking-and-screaming to the 21st century and start
writing my handy-dandy utility scripts in python instead of bash. All was well
and good until I made them available to the rest of my team, and suddenly I'm
in python dependency hell. I search the internet and there are a lot of
different solutions but all have their problems and there's no standard
answer.

I decided "to heck with it" and went back to bash. There's no built-in JSON
parser but I can use 'grep' and 'cut' as well as anyone so the end result is
the same. I push it to our repo, I tell coworkers to run it, and I wash my
hands of the thing.

~~~
zxexz
jq has been a lifesaver for me parsing json in bash. Of course, it's an
external utility not present by default in most systems.

Another thing to consider is more of a middle-ground approach. Most systems do
have a python interpreter, so you can use a lot of base python without
worrying about dependency hell. I use inline python in bash all the time, e.g.

    
    
      ls | python -c 'import sys,json;lines=sys.stdin.read();print(json.dumps(list(filter(bool,lines.split("\n"))),sort_keys=True,indent=2))'
    

You can even use variable substitution, if you surround the python code in
double quotes, and even mix f-strings and bash substitution:

    
    
      python -c "print(f'Congrats, ${USER}, you are visitor number ${RANDOM}. This is {__name__}, running in $(pwd)')"

~~~
moreaccountspls
Great trick with using the python standard lib! Thanks for posting that.

edit: You probably already know this, but for anyone reading along, piping
`ls` is unsafe if you plan to use the paths for anything except printing
them out. A path on linux can contain any byte except NULL, so when `ls`
prints them out, you can get broken behavior if you try to split on newlines.
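A sketch of the safe alternative with only the standard library: ask the OS for the entries directly instead of parsing `ls` output, so a filename containing a newline survives intact (the temp directory and filenames are purely illustrative):

```python
# Safer counterpart to `ls | python ...`: list entries via os.listdir so
# filenames containing newlines (legal on Linux) are not mangled.
import json
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # A filename with an embedded newline would break any newline-splitting pipeline.
    open(os.path.join(tmp, "evil\nname.txt"), "w").close()
    open(os.path.join(tmp, "normal.txt"), "w").close()
    print(json.dumps(sorted(os.listdir(tmp)), indent=2))
```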

------
cs702
It's evident from reading the OP and previous similar posts on HN that many
developers find it difficult to specify and replicate deterministic Python
environments for their applications. Personally, I have found it best to use
(a) a virtualenv or conda environment, with (b) a requirements file that
specifies fixed version numbers for packages (e.g., `pandas==1.0.3`). Only
very rarely have I run into issues doing this; it works quite well for me.

\--

That said, from a security standpoint, I'm not sure it's a good idea to run a
script downloaded from the web, without verification, on your _local_ command
line:

    
    
      curl https://pyenv.run | bash
    

If that URL ever gets hijacked, you would be running malicious code. At a
minimum, you may want to take a look at the script before running it, or
otherwise verify that you're downloading what you actually want to run.

~~~
tyrion
I completely agree. I have no idea why many people seem to be fans of this
`curl x | bash` approach.

For pyenv, all that's needed is to clone their git repository and add a couple
of lines to your .bashrc.

~~~
dahfizz
I think `curl | bash` is treated unfairly. Whether you `git clone` or `curl` a
script, you are fundamentally doing the same thing: downloading and executing
code from the internet. `git clone` just feels safer because it is hiding that
fact under layers of abstraction.

If I want to run pip, I need to trust PYPA. It's their code I want to run, and
I need to download it one way or another. If I don't trust them to keep their
domain secure, I don't see why I would trust them to keep their github repo
secure.

And the whole point of pip is to download code from PYPI and run it. pip, git,
curl|bash, all do the same exact thing in this case. curl|bash just smells
funny because it makes it more plainly obvious what is going on.

~~~
dtech
git clone will not execute arbitrary code from the internet without
inspection.

~~~
hagy
I assume the cloned git repo includes arbitrary code that will be executed
with the same security profile as a downloaded shell script.

------
totalperspectiv
I think that at this point all the ceremony around setting up and deploying a
python project outweighs its 'easy to read and use' aspects. Unless there is
a library you can't live without or rewrite, it seems like a language with
better tooling and the real benefits of a type system is a better choice.

If you're mainstream: Go or Java. If you're edgy: Nim, Scala, or Crystal

All of those have much more sane type, build, and packaging systems.

@perl-people, was this a solved problem when Perl was big? Or is python
walking the same roads?

~~~
tinco
It most definitely was not a solved problem when Perl was big, as far as I
know Ruby is the language that finally solved it with bundler, which was
released after Rails was, so that's like 2007 or something.

The recipe for the 'solved' deployment:

    
    
      - a compiler that ships with a build tool (so building is homogeneous in the community)
      - a centralized or at least uniformly accessible package repository (so dependency acquisition is homogeneous in the community)
      - a common file format for describing dependencies (so dependency resolution is homogeneous in the community)
      - a common file format for locking dependency versions (so deployment can be done reliably without vendoring dependencies)
      - optional but very nice: a tool for managing compiler versions so it's easy to switch/upgrade projects
    
    

Any programming language that has all of these boxes ticked is a modern
programming language in my book. As far as I know Ruby is the first that
ticked all of them, but other ones I've used that have this: Node.js, Go,
Rust, Haskell, Python (though it's a bit messy). I'm pretty sure C# checks
them as well nowadays, but I haven't used it in over 5 years so I'm not sure.
Same for Java.

~~~
sansnomme
Somebody needs to tell the Racket people about this. Dependency management in
Racket is still C-style no-version-pinning-anything-goes. The third party
dependency managers (see below) are all primitive and do not support multiple
versions of the same libraries, which is a standard feature in
Go/Node.js/Rust/Java class loaders.

[https://github.com/Bogdanp/racksnaps](https://github.com/Bogdanp/racksnaps)

------
tgb
I have to say, at first blush I would not choose click over argparse. I do
have to look up the argparse docs every time I use it, but I like that it
just gives me the args and lets me structure the program flow how I want,
which I think is more natural.

And then I can do things like import a big package (pandas) only after parsing
the args, which is highly convenient for users who want to check the argument
options without a five-second lag.
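The deferred-import pattern above is easy to sketch with argparse; the CSV-summarizing command and its flags here are hypothetical:

```python
import argparse

def build_parser():
    # Cheap to construct: only argparse (stdlib) is imported at module level.
    parser = argparse.ArgumentParser(description="summarize a CSV file")
    parser.add_argument("csv_path")
    parser.add_argument("--head", type=int, default=5)
    return parser

def main(argv=None):
    args = build_parser().parse_args(argv)
    # Deferred import: --help and argument errors never pay the multi-second
    # cost of importing a heavy package like pandas.
    import pandas as pd
    print(pd.read_csv(args.csv_path).head(args.head))
```

Invoked as `python summarize.py data.csv --head 3`, the script would only import pandas after a successful parse.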

~~~
jobeirne
I recently wrote this
([https://github.com/jamesob/clii](https://github.com/jamesob/clii)) because I
can't stand click and got sick of having to check the argparse docs every time
I wanted to write a CLI. I guarantee you'll spend a tenth of the time
figuring out how to use this thing; it has no dependencies and is
implemented in a single vendor-friendly file.

~~~
fastball
There is also Typer[1]

[1] [https://github.com/tiangolo/typer](https://github.com/tiangolo/typer)

~~~
jobeirne
Similar interface, different design goals. This lib has 6x the code,
dependencies, and isn't as easy to vendor/audit.

~~~
fastball
Yep, good work! Though I do think the name "clii" could generate some
confusion. Maybe not though.

~~~
jobeirne
Thanks!

------
nemetroid
I have a really high usefulness threshold for adding external dependencies to
Python projects. If you can get away with never getting into the virtual
environment mess, distribution/installation/development becomes _so_ much
simpler.

Of course, sometimes you can't avoid external dependencies (personally, this
often involves pandas). But the standard library gets you really far. And even
though urllib.request is clunky, I will only use Requests if something else
is already forcing me to add external dependencies.

~~~
dahfizz
I definitely agree for production-grade applications. Most of the time I'm
using python, though, it's to script a task I need to do or throw together a
small app on a raspberry pi at home. In those cases, I have my own "standard
library" of packages I always have installed system-wide, like requests.

Unless I am making a "real" application, I do my best to avoid virtualenvs
altogether.

~~~
BiteCode_dev
You don't have to. Use zipapps: they let you create a project with as many
deps as you want, but use it as if it were a single python module.

It's an official feature
([https://docs.python.org/fr/3/library/zipapp.html](https://docs.python.org/fr/3/library/zipapp.html))
that lets Python execute an entire project bundled into a zip file.

If you don't want to create it by hand, use shiv:
[https://pypi.org/project/shiv/](https://pypi.org/project/shiv/)

It's the best of both worlds.
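The whole flow can be sketched with the stdlib alone; the module name and entry point below are illustrative:

```python
# Build a tiny runnable .pyz with the stdlib zipapp module, then execute it.
import subprocess
import sys
import tempfile
import zipapp
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "src"
    src.mkdir()
    # The archive root is the contents of `src`, so `cli` imports top-level.
    (src / "cli.py").write_text("def main():\n    print('hello from a zipapp')\n")
    target = Path(tmp) / "myapp.pyz"
    # main="module:function" makes zipapp generate __main__.py for us.
    zipapp.create_archive(src, target, main="cli:main")
    result = subprocess.run([sys.executable, str(target)],
                            capture_output=True, text=True)
    print(result.stdout.strip())  # hello from a zipapp
```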

~~~
dahfizz
That's really cool! I can't believe I've never come across this, thanks for
sharing!

~~~
BiteCode_dev
You are not alone. __main__.py in a zip has been supported literally since
Python 2.6, but very few people knew about it. Eventually, in 3.5, the zipapp
module was released to make the feature more discoverable, but again it has
been mostly ignored.

I think the reason is that something like shiv was missing: it streamlines the
process, shows a finished product instead of just telling people all the
things they can do, and its automatic unzipping solved plenty of problems that
you had with alternatives like Pex, especially on Windows or with static
resources.

So now you have it. Spread the word.

------
russellallen
Hypermodern Python seems to have a lot more steps to it than Ye Olde Python,
where you put a hello.py file somewhere and did 'python hello.py'.

Yes, yes, I know. But...

~~~
luch
I genuinely thought this was a satire post about over-engineering Python
projects.

~~~
ra
I think it's a very enthusiastic post. Anything can be over-engineered, and
generally young enthusiastic programmers are keen to learn about the options.

------
Kovah
I am not into Python, but the article lost me before it even started: you are
required to install a bunch of compiler tools on your device to be able to
proceed. Excuse me? Is this really what hypermodern Python looks like?

~~~
tyrion
Those are needed to compile different versions of Python with pyenv.

Usually distributions bundle one or two specific versions of Python. Pyenv
makes it super easy to install and use all the versions of Python that you
want.

Even though for most people it might be enough to just use whatever version of
Python comes installed with your system, for a team it might be important that
everyone has the exact same version.

Moreover, pyenv-virtualenv makes it painless to use virtualenvs and so I
recommend you give pyenv a try even if you do not need additional Python
versions.

~~~
simiones
Again, do these tools really require a C compiler toolchain? If so, they are
completely out of the question - they add much more complexity than they could
possibly fix.

Why not just install the required version of Python, maybe from a 3rd party
repo if not available in the main repos? Why would I ever want to get a
compiler toolchain to get my Python interpreter?

~~~
tyrion
Could you explain why it is such a deal-breaker in your opinion?

It is a single `apt-get install` (which probably downloads fewer MBs than our
typical `npm/yarn` install).

Consider that if pyenv came pre-packaged for ubuntu/debian, those packages
would be runtime dependencies and then you would just need to `apt-get install
pyenv`.

~~~
simiones
What if I already have a different C toolchain installed? Would I now need
"cenv" to keep the pyenv C toolchain and my other C toolchain separate?

------
languagehacker
Great read! I see a few approaches I'd be interested in adopting. Nox sounds
particularly useful for tooling consistency across development environments.

It's worth noting that this is a rather opinionated toolset, and that we
shouldn't mistake opinionated for hypermodern. You could replace poetry with
pipenv (particularly now that it's being maintained again...), and I tend to
prefer unittest to pytest. The further into it you get, the more opinionated
it is -- for instance, code coverage support and CI/CD is not one-size-fits-
all, but the choices of CodeCov and GitHub Actions are nicely illustrative.

~~~
fastball
Please no pipenv. A horribly managed project with worse design choices than
Poetry. And a confusing name to boot (it makes it sound like a clean bridge
between pip and pyenv, but it's definitely not). On top of everything you can
throw in Ken Reitz's ego. Literally the only reason pipenv gained traction is
because there are still an unnerving number of people in the Python community
that think Ken Reitz shits gold, when he is clearly more of a one-hit-wonder
(love requests btw).

Even the testimonials are self-important.

> Pipenv is finally an abstraction meant to engage the mind instead of merely
> the filesystem.

Like, really? I want to interpret that as a joke, but I don't think it is.

Poetry should be the future of Python dependency management. I'd like to
forget pipenv ever happened. Unfortunately since PyPA took pipenv under its
wing, that might not happen.

------
darkerside
Wouldn't hypermodern Python rely on containerization for isolation and
portability? Poetry and pyenv seem like incremental improvements rather than a
qualitative leap forward.

~~~
jacobush
Heh, but that's one modern thing I don't like. Someone complained about click
adding to startup time; running a new container is certainly "modern" but
tiring.

~~~
darkerside
It's certainly overkill for a one man side project. But if you have a team and
deploy to clustered servers, virtual environments sure start looking like a
horse and buggy.

------
king_magic
This feels like what will happen when the Node.js world gets bored with
JavaScript and sets their sights on Python.

~~~
jennasys
Yes, this! I'm not against the language evolving, but why do so many people
seem to want to turn it into JavaScript? I've been using Python for the last
10 years, and when learning it I went out of my way to make sure I was using
the idioms of the language and not just using it as a different language with
Python syntax.

------
BiteCode_dev
I don't see what's modern about it.

Those tools have existed for years, and as always with Python, they serve
only a subset of Python users.

Not that it's a bad article. If you don't know about these tools, give it a
read; it's good to know they exist, and the post is well written.

But don't think it's the ultimate stack or whatever.

And if you learn Python, you should always strive to master the basics first:

\- installing from python.org on Windows and Mac, or via apt/yum on Linux
(those ones are tricky: it's more than just the "python" package to get pip
and venv)

\- don't try to get the latest and greatest version (which is 3.8, or 3.9
alpha today). 3.6 is great already and is widely available. I personally
strive for 3.7 now, but I'm happy with 3.6 if I don't have to use asyncio. So
get the most modern version that is easy to install for you.

\- be comfortable with "-m", the "py" command on windows, "pip" (-m, --user,
install, install -r, freeze), and "-m venv".

\- make sure you understand what the PYTHONPATH is and how it works with
regard to the import system

\- know how to configure your favorite editor to work with those
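Two of those basics can be explored from Python itself: the list the import system actually searches (which PYTHONPATH feeds into), and the programmatic equivalent of `python -m venv` via the stdlib `venv` module. A small sketch:

```python
import sys
import tempfile
import venv
from pathlib import Path

# Imports are resolved by scanning sys.path in order; PYTHONPATH entries
# are inserted near the front of this list at interpreter startup.
print(len(sys.path) > 0)

# Programmatic equivalent of `python -m venv .venv` (with_pip=False keeps
# this sketch fast; real environments usually want pip available).
with tempfile.TemporaryDirectory() as tmp:
    env_dir = Path(tmp) / ".venv"
    venv.create(env_dir, with_pip=False)
    print((env_dir / "pyvenv.cfg").exists())  # every venv carries this marker
```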

Once you have those basics, you can go and try whatever have you: poetry,
pyenv, pew, virtualenvwrapper, pipenv and so on. It will be easy because you
will have a solid understanding of where each tool comes from and how it plugs
into the ecosystem. You will be able to choose whether they are worth it or
not for you. And more importantly, when you have to move away from them
(because of work, because they don't suit you, because they are
abandoned...), you will always be able to fall back to basics.

But frankly, there are so many things that you should be learning before
those extras: pdb, black, pylint, zipapps... Or even concepts like generators,
decorators, unpacking, etc. They all have a better value for the time you
spend on them than learning a new "modern" stack.
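For anyone who hasn't met those concepts yet, a compact illustration of a generator, a decorator, and unpacking:

```python
def countdown(n):
    """Generator: produces values lazily, one per `yield`."""
    while n > 0:
        yield n
        n -= 1

def shout(func):
    """Decorator: wraps a function to post-process its result."""
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return f"hello {name}"

first, *rest = countdown(3)          # unpacking an iterable
print(list(countdown(3)))            # [3, 2, 1]
print(greet("world"))                # HELLO WORLD
print(first, rest)                   # 3 [2, 1]
```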

Not that it's not useful to have a modern stack. I update my stack all the
time.

I'm currently using pip + pew + doit + nox + pytest, and I'm experimenting
with dephell.

But it's my full time job. I'm an expert.

Most Python devs have a much more limited amount of resources to spend on
learning stuff. They have deadlines and other things to care about besides
Python.

I know because I go from companies to administrations to students to train
them, and it's always the same: they chose Python because it's an efficient
use of their time.

~~~
greenshackle2
Context is important. Developing applications vs libraries vs scripts is
different, developing internal company software vs external software is
different.

If you're shipping a package on pypi, you will want to develop and
automatically test using multiple Python versions and maybe a range of
versions of your dependencies. Having good tools to manage multiple sets of
Python versions and dependencies and parallel automated testing is a godsend.

If you're writing an internal or web application that will run only in a
single well-known environment, pip + venv + pytest might well be good enough.

~~~
BiteCode_dev
Sure, but again, there is nothing modern about this. And certainly not
hypermodern.

Besides, if you want to develop and automatically test using multiple Python
versions and a range of versions of your dependencies, none of the tools from
the article will do it.

At best they are one of the ways to get dependencies before you use a tool to
do it. You can do that with raw pip and venv as well.

What you need, though, is something like tox, which has existed for years.

As mentioned in my comment, I personally use nox for this, as it is,
ironically, more modern to my taste.

Nox doesn't care if you use poetry, pew, virtualenv wrapper or venv manually.
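For reference, a minimal `noxfile.py` sketch; the Python versions and the pytest install are illustrative, and this is configuration for the nox tool rather than standalone runnable code:

```python
# noxfile.py -- nox discovers decorated sessions and runs each one in its
# own fresh virtualenv, once per Python version listed.
import nox

@nox.session(python=["3.7", "3.8"])
def tests(session):
    session.install("pytest")
    session.run("pytest")
```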

I would certainly advise learning something like nox or doit before learning
poetry, for example.

Now I don't advise against learning poetry, mind you. It's a good tool, well
written.

Pyenv is another matter entirely. The day someone really needs what pyenv
gives you, to the point that it's worth investing in, they can discard all my
advice entirely, as they will have the skill and experience not to need my
explanations.

~~~
greenshackle2
Not sure if you've read just the first article. It's a whole series; they
cover nox later on.

I don't see how this stack is "hypermodern" either, mind you.

My point was just that these articles and others like them tend to omit
context completely. What is this stack good for? Who should use it? Who
shouldn't? Without answering these questions the articles are not all that
useful, and could even be harmful if a dev who doesn't know better apes a
complicated stack for no good reason.

~~~
BiteCode_dev
I missed that it was a series, indeed.

------
maliker
My team has struggled a lot with how to do python deployment.

We know we can isolate the interpreter and modules via venv/docker/etc. and
get repeatable and reliable deployments. However, because we're a utility
module, we like to allow users to import our code freely, and that's a lot
harder when that code is isolated and bundled with very specific requirements.

Seems like the only perfect solution is to just support every conceivable
version of python and our required modules. Which is of course very hard. It
would probably require greatly reducing our usage of tensorflow and some other
packages which change a lot and are trickier to use on Windows.

Good old dependency hell.

------
i3winner
I think poetry is great and I hope it becomes more widespread.

~~~
matanrubin
Totally agree. Poetry is a joy to work with

------
thmzlt
I wrote a tutorial the other day on using Nix to handle the Python package
situation. No need to install Python or even touch pip:

[https://thomazleite.com/posts/development-with-nix-
python/](https://thomazleite.com/posts/development-with-nix-python/)

~~~
silviogutierrez
+1 for nix. Nothing comes close. Pain to first learn it though.

But now you have truly deterministic builds.

Well, almost. Don't forget to pin channels.

------
neurostimulant
What's the benefit of using pyenv? I've been using `python -m venv .venv` to
create a virtualenv within the project directory for a while now, and VS Code
recognizes the .venv directory as a python virtualenv and applies it to the
workspace automatically. So far it's been quite painless for me.

~~~
jhrmnn
They serve a different purpose. Pyenv enables you to maintain several Python
versions. You can then use the venv module of each respective Python to create
a virtual environment with a Python at a given version.

~~~
neurostimulant
I see, I think I confused it with pyenv-virtualenv. I've never needed to use
multiple versions of python, so I haven't tried it so far. I always install
the latest version globally and use a python 3 docker image as the base
deployment image.

~~~
thijsvandien
If you have multiple projects, it may be undesirable that you're forced to
have them all on the same Python version. Of course it helps that Python has
fairly good backward compatibility, and even using a newer version to develop
and test than the one you'll deploy with is not the end of the world, but
pyenv is easy enough that I see no reason not to do it properly.

------
fabioz
I guess that there are many tools for environments now... my personal choice
is using conda (I actually start with miniconda and download what I need from
conda-forge).

It provides the needed isolation, you can get lots of packages without having
to compile from conda-forge and it still works with pip for the odd cases
where a given package is not directly available (as a bonus, it works for
other native toolchains, not only Python, which is a huge plus for me).

As a note, I personally use it with `conda devenv` to be able to replicate
installs for a given project -- i.e.: [https://conda-
devenv.readthedocs.io/en/latest/](https://conda-
devenv.readthedocs.io/en/latest/).

~~~
tyrion
The only thing I never understood about conda is why it does not support
packages from pypi.

I mean, the Python community has a standard repository for packages (pypi).
You cannot even name a library $something if $something is not available on
pypi. Why would I use something hosted by a private company that does not even
interface with the rest of the community?

This is not an attack on conda. I just cannot understand its rationale.

~~~
fabioz
The real problem (which is solved by conda) is that pypi doesn't deal well
with non-python packages (say, compiling scipy and making all related native
packages communicate well). This is a huge issue for people dealing with
scientific packages (and the main pain point that pypi, being python-focused
and installing only into site-packages, doesn't solve well enough IMHO).

Also, while it was initially done by a private company, I'd say it's
definitely a community effort right now (it's also the reason I tend to use
[https://conda-forge.org/](https://conda-forge.org/), which is community
driven and not anaconda).

As for dealing with pypi, it does integrate well enough for me (given that you
can just pip install packages on the python for which conda is managing the
env), but yes, the other way around isn't true (conda solves a bigger problem
than pypi up to the point that it's possible to even have non-python tools
available -- one real use case example I have here is having innosetup
binaries as a tool in the PATH in some conda env for doing builds).

Note: I don't have any affiliation with any of that, these are just my
preferences for managing python envs (when I'm developing pydevd, which is the
debugger engine used in pydev/pycharm/vscode, many times I need to reproduce
some weird env and before using conda that was pretty annoying).

------
crad
This is so painful to me. So much reinventing of the wheel for things that
are not broken. :(

------
manojlds
OK, it talks about some alternatives like Poetry and pyenv, which have their
own issues.

pyenv is a poor cousin of rbenv: it has a poor CLI API and a leaky
abstraction over virtualenv (via its plugin for virtualenv).

~~~
hprotagonist
it’s not an abstraction over virtualenv _at all_; pyenv installs python
interpreters, and those interpreters should use _-m venv_ instead of
_virtualenv_ anyway.

did you mean pipenv instead? we suck at naming things (and pipenv has its
share of issues, no arguments there).

~~~
manojlds
No, I meant the pyenv virtualenv plugin. I've been using it for so long that I
forgot that there's a plugin involved.

~~~
hprotagonist
Oh! Yeah, I don't use it at all. pyenv-installer drags that plugin in, and I
cleverly fail to source it since it's totally useless for my purposes.

------
dilandau
This post is a snapshot of one webdev's toolbox. It certainly is indicative of
webdev's predilection for increasingly complex and opaque tooling, mixed-
markup formats, and over-engineering (if it can even be called engineering).

You don't really need any of that stuff. That whole tooling ecosystem
undergoes a rewrite every couple years, anyways, so I predict this post will
not age well.

Just stick to setup.py, pip, and virtualenv. It's sufficient into all thy
needs.

~~~
carapace
I was with you up until your last sentence. pip and virtualenv are hacky
kludges that grew legs and lungs. That's how old python's deployment issues
are: there's an era before pip and virtualenv.

There should be an abstraction for it (deployment, versions, import hacking)
like what pathlib did for file path manipulation.

We have a classic _lava-flow_ pattern.
[http://mikehadlow.blogspot.com/2014/12/the-lava-layer-
anti-p...](http://mikehadlow.blogspot.com/2014/12/the-lava-layer-anti-
pattern.html)

~~~
raverbashing
What makes you think the newer tools are better than pip and virtualenv?

(They most likely use some of those in the background)

Edit: the link you quoted is very relevant

> We developers should recognise that we suffer from a number of quite harmful
> pathologies when dealing with legacy code:

> We are highly (and often overly) critical of older patterns and
> technologies.

> We think that the current shiny best way is the end of history; that it will
> never be superseded or seen to be suspect with hindsight

~~~
carapace
> What makes you think the newer tools are better than pip and virtualenv?

I don't. Heck, maybe they are. I don't use them.

My point is that rather than being the beneficiary of Python's excellent
abstraction powers, distribution et al. is and has been a mess for decades.

Here's a history of the sordid mess starting from 1998: "The distutils-sig
discussion list was created to discuss the development of distutils."
[https://www.pypa.io/en/latest/history/](https://www.pypa.io/en/latest/history/)

FWIW, Since pip and virtualenv became relatively stable and "blessed" and PyPI
has matured the only new thing I've tried is Anaconda.

Unfortunately my main Python project right now uses Tkinter and the folks who
make Anaconda have a circular dependency in their build system such that you
need python to build Freetype so their Python/Tkinter/TCL/Tk has gruesomely-
bad support for fonts, so my project looks like a potato.
[https://github.com/ContinuumIO/anaconda-
issues/issues/6833](https://github.com/ContinuumIO/anaconda-
issues/issues/6833) Someone has put in a PR to hopefully fix it, and-- ah!
--it seems to have gotten some attention earlier today:
[https://github.com/conda-forge/tk-
feedstock/pull/40](https://github.com/conda-forge/tk-feedstock/pull/40)
Fingers crossed for great good!

Anyhow, things are rough all over, eh?

------
meddlepal
This is why I gave up on Python.

~~~
panarky
What do you use now, and how is it better?

~~~
meddlepal
I work mostly in Go but personal stuff is a mix of Go, Java, or Kotlin.

Go's only flaw is the module system sucks. Java is rock solid and rolls up
nicely into a fat jar.

------
cpburns2009
All of these virtual environment managers overcomplicate dependencies with
Python. Using a simple virtual environment is quite easy. All you need is
Python, its built-in _venv_ module, _setuptools_ and _pip_. I've found
_pip-tools_ helps with generating/merging comprehensive _requirements.txt_
files.

------
albi_lander
> This article series is a guide to modern Python tooling with a focus on
> simplicity and minimalism

I'm not sure using pyenv + poetry + click qualifies as a minimalist setup. I
have never had to use pyenv, as I figured out it was more robust to install
the python versions directly from python.org and create the appropriate
symlinks. Then I use the venv module, which has been included in python since
3.3. The argparse module is not that bad once you are used to it.

I'm not saying the tools and libs mentioned in the article shouldn't be used,
it's just that they're not mandatory for whoever wants to stay close to the
bare minimum.

~~~
skrtskrt
I would use pyenv locally so I can easily switch versions, but dev and prod is
a Docker container with one python version, then poetry installs the virtual
environment. You don't even really need to install into a venv, since you're
already in an isolated container, but going straight for a venv is kind of a
reflex for most Python devs.

------
triztian
This is exactly what I needed as a somewhat on/off longtime user of Python.

It basically boils down to using pyenv, poetry and setting up the
pyproject.toml and project src layout.

I read the comments before the article and was expecting MUCH more
complicated content and intricate bash, tool, or config setups, but it seems
straightforward.

I feel everything else is part of getting started in professional software
development (git, modern OS, env vars setup, bash or shell scripts, etc.),
which is in its own right complicated for newcomers.

I think deployment and packaging vary a lot depending on the target deploy env
which is why I'm fine with it being left out of the article.

------
nojito
I’ve moved away from requests to httpx.

~~~
darkteflon
I’ve had my eye on httpx for a while. Could you elaborate on your reasons for
changing and your experience with it?

~~~
nojito
Primarily async and it works extremely well alongside FastAPI.

------
euler_angles
One thing that I haven't been able to figure out with the 20 minutes or so of
reading about poetry that I just did -- does poetry support editable installs
akin to pip install -e .?

~~~
npage97
So you shouldn't need to use pip install -e . as poetry already configures
your package to be installed as editable.

For adding a separate package in your filesystem, I believe [path dependencies](https://python-poetry.org/docs/dependency-specification/#path-dependencies) are what you are looking for:

```
poetry add ../path/to/package
```

------
mrlonglong
I loved the wonderful Victorian drawings of air-fishes with people in them.
Who did those and where can I find out more about these?

~~~
imadethis
Google Reverse Image search gave me [https://publicdomainreview.org/collection/leaving-the-opera-in-the-year-2000](https://publicdomainreview.org/collection/leaving-the-opera-in-the-year-2000)

------
ggregoire
Reading this article (which asks me to install 18 apt packages on my system)
and the top comments here (complaints about developing, running and deploying
apps, replicating deterministic Python environments and so on) makes me wonder
why Python developers don't use Docker.

------
woile
Great resource. For the typing section, I'd add the usage of a `py.typed` file.

It'd also be worth sharing the alternative minimalistic approach:

```
python3 -m venv .venv
. .venv/bin/activate
pip install -e .
```

But the article is good for people who are willing to install those tools to simplify their work. I use pyenv and poetry when I can.
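
For reference, the `py.typed` marker (PEP 561) is just an empty file placed inside the package so type checkers know to read your annotations. A sketch with an assumed package name:

```
src/
└── myproject/
    ├── __init__.py
    └── py.typed      # empty marker file, PEP 561
```

With a src layout under poetry the marker ships with the package; with plain setuptools you'd also need to list it, e.g. via `package_data={"myproject": ["py.typed"]}`.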

------
mark_l_watson
Poetry looks good, I haven't used it. It reminds me of the way Quicklisp
really revolutionized how to maintain and use packages in Common Lisp.

I don't do too much Python now, but I bookmarked this article to possibly use
as a future reference.

------
seemslegit
> This article series is a guide to modern Python tooling with a focus on
> simplicity and minimalism

No it isn't: it introduces at least two external tools of questionable necessity, pyenv and poetry, when Python comes with venv and pip natively.

~~~
jacobush
Yeah, but pip and venv don’t leave me impressed, so please elaborate why those
are better.

~~~
smcl
I think that given pip and venv are bundled with Python, then anyone who wants
to introduce new dependencies to replace them needs to justify themselves. I
haven't used either pyenv or poetry so I have no opinion on them.

~~~
dhagz
The benefit of pyenv seems clear to me: it's a simpler way to have multiple Python installs and to manage which one is your "default" Python at the project, directory, user, system, and global levels (they can even all be different).
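
For illustration, that layering looks roughly like this (version numbers are made up; these are real pyenv subcommands, shown here as usage rather than something to copy-paste blindly):

```
pyenv install 3.8.2      # build and install an interpreter
pyenv global 3.8.2       # user-wide default
pyenv local 3.7.7        # per-directory override; writes .python-version
python --version         # resolved via shell > local > global precedence
```

The `.python-version` file can be committed, so everyone who cds into the project gets the same interpreter without thinking about it.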

Poetry...not so much. requirements.txt is good enough for me most of the time.

~~~
akdor1154
Poetry gives you exactly what pip and venv give you, with the two pretty much
perfectly integrated, in a way that is pleasant to use and not something that
you'll grudgingly migrate your project to after your fifth dependency.

------
icedchai
Seems overly complicated. I'll just stick with pip and virtualenv for now.

------
kissgyorgy
If it were really hypermodern, it would use httpx instead of requests:
[https://github.com/encode/httpx](https://github.com/encode/httpx)

------
orsenthil
The problem with this article is that it will take me a day or probably more to get started with my application.

A newcomer will give up before the "hello world" program.

~~~
wegs
I don't think this is for a newcomer. Newcomer, here you go:

```
print("Hello World!")
```

This is for someone who has done Python for many years, but might have fallen
behind on the latest and greatest trends. It collects a series of promising,
bleeding-edge tools and gives an overview of what makes them clever and how to
use them.

~~~
airstrike
> It collects a series of promising, bleeding-edge tools and gives an overview
> of what makes them clever and how to use them.

"Promising" being the operative word here

------
andrewshadura
With setup.cfg supporting nearly everything setup.py needed to include, I fail
to see a reason to use poetry's own configuration file.
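
For comparison, a declarative setup.cfg covers most of the same ground as poetry's `[tool.poetry]` table. A hypothetical minimal example (names and pins invented; keys per the setuptools declarative-config docs):

```ini
[metadata]
name = myproject
version = 0.1.0
description = Example package

[options]
package_dir =
    = src
packages = find:
install_requires =
    requests

[options.packages.find]
where = src
```

What setup.cfg doesn't give you is a lockfile or dependency resolution, which is the main thing poetry adds on top.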

------
gbasin
Excited for 2030 when this is reduced to one click

------
wingi
This is hypermodern, but a security nightmare.

curl something | bash

~~~
kevsim
This pops up occasionally on HN but I don't get it. Are you reading every line
of every installation script you run normally? What about the things they
download and execute? Are you never running a binary installer where you don't
have access to the code?

Genuinely curious what the nightmare is.

------
ra
Hypermodern? is that like the bleeding edge?

------
Uninen
Sad thing is `poetry init --no-interaction` fails if your name happens to have
non-ascii characters :(

------
deegles
This is great! Is there something like this for Typescript?

------
sabujp
s/hypermodern/annotations

------
davv
hm

------
tempotemporary
Want a real hypermodern Python? Try Haskell. (As someone who spent 10 years with Python.)

~~~
ca_parody
Nim feels more like hyper modern python to me. Haskell is an entirely
different paradigm.

