Hacker News

A lot of people are mentioning that Apple is removing scripting languages at the same time as Windows is adding one. The stub is nice for learners, but ultimately I like Apple's decision to remove them. You should not be relying on system-installed interpreters, because you end up with an ancient, unmaintained version, and you are commingling all your dependencies with whatever else is installed. Everyone should be using virtualenv. And it should be easier to use virtualenv.

It depends on who you're talking about. There is an attitude of, "if you can't install a virtualenv as step 1 in learning how to do Python, you don't deserve to learn Python". Not, I would hope, consciously thought of in that way, but that's the practical effect.

I use virtualenv. But when I was first learning to program (and I believe Python is a better choice for a first language than PHP or JavaScript), it would have been a substantial obstacle to have to go the virtualenv route first. Yeah, it's easy if you've done it before and know how to figure out what the problem is if something goes wrong. It's an obstacle if you're just starting out, and I think keeping the on-ramp simple is very important for the long-term health of a language community.

Which suggests that, because Python's on-ramp just got easier, its long-term prospects just got better.

If you're just learning, you don't need to mess around with virtualenvs; you can install Python via the official Windows installer and install packages globally. I don't feel like installing something is a very big initial hurdle to overcome, as opposed to having it be pre-installed (but potentially very outdated in the future).

On the contrary, installing something is the biggest hurdle after perhaps coming up with the idea that there's something here to be learned at all. And sometimes it's part of the latter.

Well, I don't have that attitude. Lately, I push beginners towards repl.it so that they don't need to install anything or set up an IDE. There, each script/program is completely separate, so it avoids this issue entirely. Then, when they're a bit more familiar with programming, they can transition to a good local environment like PyCharm or some other setup that manages Python versions, virtualenvs, etc.

What I'm saying is that bundling a specific Python interpreter and separate package environment should be the default. I'm not sure what it'd take to get there for a vanilla python install. But more system-level interpreters is not it.

Currently, the way I know of to do that is pyenv [0]. And I know PyCharm lets you pick a Python version when you start a project (though I don't use PyCharm, so that may not be true anymore).

[0] https://github.com/pyenv/pyenv

I use pyenv (mostly because it's the first one I found).

Though ironically, in a very unpythonic way, there are a bunch of these Python environment managers (conda, virtualenv, pipenv...).

PyCharm lets you select your favorite, which is pretty nice.

...which drives me nuts. I need the equivalent of a dummies guide to tell me which one to pick. obxkcd: https://xkcd.com/1987/

`python3 -m venv` is the obvious true way
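For anyone following along, a minimal sketch of that workflow, assuming a `python3` with the stdlib `venv` module is on your PATH:

```shell
# Create an isolated environment in ./myenv (stdlib venv module, Python 3.3+)
python3 -m venv myenv

# Activate it (POSIX shells; on Windows it's myenv\Scripts\activate)
. myenv/bin/activate

# From here, `pip install <package>` goes into the venv,
# not the system site-packages. When you're done:
deactivate
```

The activation step just prepends the venv's bin directory to PATH; you can also skip it and call `myenv/bin/python` directly.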

> I use virtualenv. When I was first learning to program, and I believe python is a better choice for a first language than PHP or Javascript

I’m a seasoned programmer and I absolutely hate how you’re forced to use virtualenv with Python. It's one of the biggest warts of Python's already-poor package management.

It makes everything that should be simple into something you suddenly need to handle. Atrocious.

I much prefer the way .NET, Node, or Rust (and many others) handle it, where dependencies are managed entirely within a project, completely obsoleting the need for a hack like virtualenvs in the first place.

I always felt that virtualenv is vastly superior to most other package managers. I think comparing it to Rust and .NET is a little unfair, as those are compiled languages and have different deployment models.

When I was first learning, and there was a system Python, I would install packages, then six months to a year later find that something had gone stale. Because the OS came with Python and some number of libraries, and because I had modified that somehow, I couldn't get back to a known state without reinstalling my OS or updating/removing things, likely causing problems. Everything I did was a separate test project, so it never made sense to install libraries globally.

Even now, for many languages, I shy away from installing globally (I usually install to user if I'm lazy). I feel like in almost any language a "project" (venv) should be taught early on and should be simple to use: basically, the first time you need a third-party module. I feel like newcomers should be shown the REPL, how to save code to a file and run it, the standard library, then a workspace and how libraries work. Python could be a great example because, with the batteries-included model, I don't need external libraries for many tasks.
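To illustrate that batteries-included point, a quick sketch of a task that would need third-party libraries in many languages but uses only the stdlib in Python; `json`, `sqlite3`, and `statistics` all ship with CPython (the data here is made up for the example):

```python
# Parse JSON, store it in SQL, and compute stats using only the stdlib.
import json
import sqlite3
import statistics

data = json.loads('[{"name": "a", "score": 3}, {"name": "b", "score": 5}]')

db = sqlite3.connect(":memory:")  # in-memory SQL database, no server needed
db.execute("CREATE TABLE scores (name TEXT, score REAL)")
db.executemany("INSERT INTO scores VALUES (?, ?)",
               [(d["name"], d["score"]) for d in data])

total = db.execute("SELECT SUM(score) FROM scores").fetchone()[0]
average = statistics.mean(d["score"] for d in data)

print(total)    # 8.0
print(average)  # 4
```

HTTP, email, CSV, argument parsing, and unit testing are similarly covered out of the box, which is why a beginner can get surprisingly far before ever touching pip.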

The first step of creating a hello-world Node.js application is to write a package.json file, which is the rough equivalent of a virtualenv. Nobody ever complained about that. Isolated packages per project become the default rather than an obscure extra command. Same with Rust, Gradle, and many others.

I would like to reverse the question. If you can't install a virtualenv as step 1, the process of creating a virtualenv is simply too difficult and should be streamlined.

But at the same time, Microsoft is actually putting people on the task of maintaining it, while Apple is letting its interpreters languish. Why discourage it if there are active maintainers? Are you saying that Linux is also doing it wrong? Python packages often come from distro maintainers, not Python devs.

Yeah, Linux is doing it wrong. I hate that I can't get a new version of python on my machine because the distro maintainers haven't gotten around to it.

The way Linux package managers have handled multiple versions of runtimes in the past is definitely wrong.

There is a lot of work in this area going on now, though. I think RHEL 8's application streams concept is pretty good: https://developers.redhat.com/blog/2019/05/07/what-no-python...

Fedora has an analogue to this as well. It seems like a good move in the right direction.

Docker solves that. Yeah, you then have to deal with container problems, but it's the best packaging solution we have at the moment.

Maybe one day we'll take the lessons learned and natively build them into the OS, but distros are still obsessed with distro-specific kludges that require extra packaging work.

We have great universal packaging when it comes from the community at large, but not from single organizations. Pip is a platform-independent package manager! Why don't we have that for Linux software in general? Because each distro already has their own solution, and they won't work together to build a common one.

Well, there is an effort to solve this with Flatpak and Snap.

Both of which will be dead in 10 years: the former because it's solely focused on the desktop, the latter because it's a vendor initiative, not a community one.

> [Flatpak will be dead in 10 years] because it's solely focused on the desktop

By this logic, GNOME should have died a long time ago.

Focusing on the desktop makes Flatpak better at managing desktop apps. For servers we have, well, Docker.

> latter because it's a vendor initiative not a community one.

You may be right, but I don't think you should be quite so cynical.

18.04 has a ten year Long Term Support, and Ubuntu was first released in 2004.

You can just compile your own from source. Python plays nicely with older versions of itself. Use GNU Stow as a secondary package manager to make it easier to remove or replace.

You can use, today, the same mechanisms you would force everyone to use to find/install Python (virtualenv, Docker, pipenv, conda, whatever is hot this week) in a world without a distro-provided Python.

If you can't figure out "./configure && make && make install" I think you're doing Linux wrong.

And then halfway through the compilation you find out you are missing fifteen different dev packages, and the only way to figure out which is by googling obscure error messages. Oh, it can't find include efglx.h? You obviously have to apt install libstdglx4-dev. (This was probably written deep on some established build-environment wiki page that's impossible to find, but since you are on Ubuntu 18 instead of 14 it wouldn't have worked anyway, because now the package has been renamed and you need apt install glx-dev. Besides, they also forgot to list half of the dependencies required on that page.)

Figuring out how to establish a build environment for any program you want to install really can't be the best way to do things.

You could use Guix or Nix to get a newer version.

You can.

Updating and replacing the system Python takes tremendous blood and treasure. Compiling Python from scratch, updating symlinks, resolving dependency issues, and finally debugging the whole thing is no small endeavour. Been there, done that; never again.

(This was on an Ubuntu LTS system.)

I've done this dozens of times; it didn't seem like that big of a deal? I think the place this might have fallen over is the "updating symlinks" part. Don't replace the system Python; install yours alongside it. They don't need to know anything about each other: aliases in developers' profiles, and full paths in services/scripts.

Yup, stick it in /usr/local and place that in your PATH and Python search path before other dependencies. If you want to be fancy, put it somewhere in your home directory so other users on your same machine cannot be affected.

It's really not hard. I do this with other compilers and don't really have issues, though I tend to package it up into a container if I'm going to ship it (e.g. a build environment for a team, an interpreter for a production service, etc.).

Or you could, just like on other platforms and with other languages, use virtual environments with a version manager such as pyenv for your development environment. Combine that with pipenv and you get pretty painless python development. To be fair, some of this stuff is more recent though.

Here is an excellent short blog describing all this and how pyenv and pipenv click together to create a virtual Python environment, that is completely independent of your OS vendor's runtime: https://gioele.io/pyenv-pipenv

This stuff is definitely quite new, and I guess the need for it has risen as a consequence of all the great new features that have shipped in Python during the last few years.

I recently had to write a utility script in Python. I chose to use Python 3 and type-hint syntax, requiring at least Python 3.6. This was a problem when I realized some of my team use Ubuntu and Debian versions that ship with Python 3.5 (not to speak of Macs with Python 2.x, but those guys use Homebrew anyway). Plenty of time was spent researching virtual environments and writing deployment instructions, and I ended up using pip in pyenv (but stopped short of using pipenv, since creating private distributable packages with it seemed convoluted).
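For context, a sketch of the kind of 3.6-only syntax at issue; f-strings (PEP 498) and variable annotations (PEP 526) both landed in 3.6, so a 3.5 interpreter rejects this file at parse time (names here are made up for illustration):

```python
import sys

retries: int = 3                  # PEP 526 variable annotation, 3.6+
host = "example.org"              # hypothetical host name
message = f"connecting to {host} ({retries} retries)"  # PEP 498 f-string, 3.6+

# Note: on 3.5 this file never runs at all; the f-string above is a
# SyntaxError at parse time, so a runtime version check can't save you here.
assert sys.version_info >= (3, 6)
print(message)  # connecting to example.org (3 retries)
```

This is why the failure mode on an older interpreter is a cryptic SyntaxError rather than a helpful "please upgrade" message, which makes the deployment-instructions problem worse.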

You could just use Anaconda or a similar system.

Go there once, script it, and never do it manually again.

Oh, you mean all that work that package maintainers do at no cost to you?

Not to mention there are other distribution tools to acquire a pre-built python release other than installing a .deb.

It's hard if you're trying to update the system Python and need to upgrade every dependency for that, but if you just want to have, say, "python3.7" in your path, it's just a couple of commands:


    ~ $ docker run -it --rm ubuntu:xenial
    root@b8cda311d766:/# apt-get update -qqy
    root@b8cda311d766:/# apt-get install -qqy software-properties-common
    root@b8cda311d766:/# add-apt-repository --yes ppa:deadsnakes/ppa
    gpg: keyring `/tmp/tmpe354n0av/secring.gpg' created
    gpg: keyring `/tmp/tmpe354n0av/pubring.gpg' created
    gpg: requesting key 6A755776 from hkp server keyserver.ubuntu.com
    gpg: /tmp/tmpe354n0av/trustdb.gpg: trustdb created
    gpg: key 6A755776: public key "Launchpad PPA for deadsnakes" imported
    gpg: Total number processed: 1
    gpg:               imported: 1  (RSA: 1)
    root@b8cda311d766:/# apt-get update -qqy
    root@b8cda311d766:/# apt-get install -qqy python3.7
    debconf: delaying package configuration, since apt-utils is not installed
    Selecting previously unselected package libpython3.7-minimal:amd64.
    (Reading database ... 7603 files and directories currently installed.)
    Preparing to unpack .../libpython3.7-minimal_3.7.3-1+xenial1_amd64.deb ...
    Unpacking libpython3.7-minimal:amd64 (3.7.3-1+xenial1) ...
    Selecting previously unselected package python3.7-minimal.
    Preparing to unpack .../python3.7-minimal_3.7.3-1+xenial1_amd64.deb ...
    Unpacking python3.7-minimal (3.7.3-1+xenial1) ...
    Selecting previously unselected package libpython3.7-stdlib:amd64.
    Preparing to unpack .../libpython3.7-stdlib_3.7.3-1+xenial1_amd64.deb ...
    Unpacking libpython3.7-stdlib:amd64 (3.7.3-1+xenial1) ...
    Selecting previously unselected package python3.7-lib2to3.
    Preparing to unpack .../python3.7-lib2to3_3.7.3-1+xenial1_all.deb ...
    Unpacking python3.7-lib2to3 (3.7.3-1+xenial1) ...
    Selecting previously unselected package python3.7-distutils.
    Preparing to unpack .../python3.7-distutils_3.7.3-1+xenial1_all.deb ...
    Unpacking python3.7-distutils (3.7.3-1+xenial1) ...
    Selecting previously unselected package python3.7.
    Preparing to unpack .../python3.7_3.7.3-1+xenial1_amd64.deb ...
    Unpacking python3.7 (3.7.3-1+xenial1) ...
    Processing triggers for mime-support (3.59ubuntu1) ...
    Setting up libpython3.7-minimal:amd64 (3.7.3-1+xenial1) ...
    Setting up python3.7-minimal (3.7.3-1+xenial1) ...
    Setting up libpython3.7-stdlib:amd64 (3.7.3-1+xenial1) ...
    Setting up python3.7-lib2to3 (3.7.3-1+xenial1) ...
    Setting up python3.7-distutils (3.7.3-1+xenial1) ...
    Setting up python3.7 (3.7.3-1+xenial1) ...
    root@b8cda311d766:/# python3.7
    Python 3.7.3 (default, Mar 26 2019, 01:59:45) 
    [GCC 5.4.0 20160609] on linux
    Type "help", "copyright", "credits" or "license" for more information.
    root@b8cda311d766:/# exit
    ~ $

Use pyenv.

> Are you saying that Linux is also doing it wrong?

I'd say yes, the coupling of operating system and distribution of software applications has always seemed very wrong to me.

Software should run on the operating system, which functions as a platform, not be coupled with the operating system. Linux package management essentially works like an app store with the additional burden of maintenance loaded onto the distribution owners. I'd be very happy if Flatpaks and Snaps take off quickly, because distribution-agnostic, up-to-date software maintained by the developer should be the norm.

> I'd say yes, the coupling of operating system and distribution of software applications has always seemed very wrong to me.

I agree with this in principle, but how far down the chain do you go? Obviously, you need to have some base libraries included. You just need to accept that some of those libraries will be old, and that they aren't going to change so rapidly that this becomes a problem.

A nice boundary would be the interoperability layer: libraries and applications needed for application-agnostic protocols are included; others are not.

Apple is removing them. Good choice I think.

I believe it's the Python Software Foundation that controls the install and maintains updates.

But more broadly speaking, I'm not sure that's going to be enough. Even if you're using virtualenvs, having something installed at the system level means it's shared by all the software you're using. There's bound to be some incompatibility eventually. And then you're stuck waiting for one piece of software to support the latest version of Python, so you have to keep putting off updates.

I think a better solution is to use something to set the python version for each program (ex. pyenv or docker).

I was going to agree with you.

But then I remember one scene in high school. I was talking to a friend who coded in IB American History and asked him how to get started with Python. He said just go to this website and download this tarball. I had no idea what a tarball was and was afraid it contained something bad. So I didn't download it and didn't play with Python until a few years later in college. If I had started playing with Python earlier, my life could be a lot different. I was not a guy back then who made programs on TI-84 calculators; I thought that was magic. I was planning on econ/finance. I used Windows (!) and didn't know anything about what Unix was, and barely knew about OS X.

It's hard to see why some decisions pan out and some don't; if you install a system version of Python how would the general public / PSF know how many people use it? But I don't think that's a good enough reason to remove system installs entirely. I think my younger self would have liked it.

I fully agree. I started programming in JavaScript because I only had access to public computers, where I couldn't install or even download binaries, but I could write a .html file and open it in the browser.

Ubiquitous access to these tools provides wonderful opportunities, and the drawbacks are minimal.

> If I had started playing with Python earlier, my life could be a lot different. I was not a guy back then who made programs on TI-84 calculators; I thought that was magic

How come you didn't start playing with the TI-84 programming language instead?

There's no more "batteries included" than that, and it's not only a pun: you can program and run right on the calculator (not a GREAT way to program it, but quite a nice way to spend your time instead of paying attention in class :^) ). If you bought it new, I guess you had a paper manual for the language (I had one for my TI-89), and there's so little you need to learn and take care of: libraries, dependencies, versions, execution bits, paths... none of these are an issue the novice user will be confronted with. It's like a C64 all over again.

A number of good points. At the time I was taking (I think) 5 IB higher-level courses as a junior/senior, as well as a pretty tough art curriculum, with economics competitions as extracurricular activities. There wasn't a lot of energy I could afford for something that didn't require my primary attention and time. Another big reason is the inability to quantify my fears and to divide and conquer them, which is a problem I still have.

I'm in the US, not sure if you are. I think I had a similar experience. Didn't get into programming until much later in life and kicked every wall once I realized I love it.

People wondering why you didn't just power through your doubts may be forgetting what the world was like not too long ago. 'Nerds' weren't cool. If you chose to be a nerd it meant _offering_ yourself up as the _loser_ in every sitcom, movie, or book. Playing video games was something you were supposed to grow out of. No one was throwing 80k at a graduate just because they could pronounce 'WWW'. Perhaps most importantly, _every_ school was pushing the race to nowhere. If you were thinking about a 'trade' it meant you had given up and would be fixing sinks for the rest of your life. At _best_, computers were taught as something you'd use in other careers. Not a career itself, really.

* Please note the last line is less an insult to plumbers and more a commentary on how the perception of success is hammered into you as a very white collar thing.

I appreciate the honest and personal answer. What I meant was... considering you had access to a programming language (albeit very limited) in your TI but you didn't use it, why do you think it could have been different with python?

That's a great anecdote, and I see how learning Python might have been harder for someone if it was not included.

However, the software distribution model right now is fundamentally broken. And I do not think it is worth leaving broken simply because it makes learning easier for some people.

In general, I think the decision to push people to tie their app less to the system configuration and rely on virtualenv, containers etc more is a good move in the long run.

I very much disagree. The whole point of Python is to make programming as easy as possible, so that it can reach as many people as possible. Python literally gave up design optimizations like tail recursion elimination so that beginners wouldn't be confused with stack traces.

What about people who dip their toes in Python, then switch to another language? What about people who want to just replace their VBScript integrations with Excel and don't care about Python beyond 1-2 dependencies on their local machine? You'd just push them back towards a less optimal situation.

Oh, and I just remembered, Windows 10 Home Edition does not ship with a hypervisor. You can't run Docker on Windows without Windows 10 Professional. That screwed my team over at a data science hackathon once.

If you're at the point where you're worrying about distribution environments, you're probably ahead of most people already, and you have the wherewithal to figure out what the state-of-the-art is in packaging and distribution, whatever that may be and regardless of how broken it is.

But the problem is that a Python installed this way can be abused as a dependency for application delivery.

> You should not be relying on system installed interpreters, because you end up with an ancient, unmaintained version.

That's how Apple does it. Ubuntu LTS, Debian Stable, CoreOS/Red Hat, etc., are useful because the old versions of the software are maintained. Though Windows has not traditionally bundled GNU or open-source packages, it is also built with a strong commitment to long-term maintenance: consider Notepad, MS Paint, and .NET.

Microsoft is including Python 3.7, which has planned support until at least 2023 and possibly 2026: https://www.python.org/dev/peps/pep-0537/#lifespan

Additionally, this is an "App" Store release that can update independently of the OS (and at least at this point in time Windows itself doesn't have a dependency on it).

Also, as the post sort of points out, this is being built by the Python community itself (rather than by Microsoft directly, contrasting with the distro model), using mostly the same CI/build infrastructure already used for the builds at python.org, and presumably it can be as up to date as the Python team wishes it to be.

This will pull Python down from the app store, where it's a community-maintained package installed just like any other store app.

It has nothing to do with the Apple scenario, where the system-included interpreter is vendor-supported and used by other OS components. This is just a built-in shortcut to the store version if Python is not installed.

Removing them from macOS was the right decision for macOS, but only because Apple never maintained them. If Microsoft can provide a good distribution channel and maintain it, this is a fantastic move. The Windows Store is a fine distribution channel, and Microsoft has recently been very on top of this kind of thing.

It is important to note that the Windows version is just a link to the latest version. They aren't shipping it.

Yup, the Store page is coming from MSIX packages being built from most of the same CI/build systems that produce the official builds on python.org.

> A lot of people are mentioning Apple removing scripting languages at the same time as Windows is adding it.

I think a big difference here is expectations: the Apple way is to have the tool itself installed, which creates various expectations with respect to presence and capabilities (or the lack thereof).

Because Microsoft only adds a hook/stub, there's no expectation that the tool will be present (as by default it's not); it just makes it easier to enable the tool.

> You should not be relying on system installed interpreters, because you end up with an ancient, unmaintained version

What? The reason to use the distro-provided python package is because it's maintained by someone and gets you security updates. If Apple is just shipping a version and never bug fixing that version, well Apple sucks.

As it happens, Python does micro releases, which other upstream projects might not. But those releases might take longer to get fixes out to users than, say, a Linux distro would. So while developing on an upstream codebase might be nice, what you need to focus on is the deployment lifecycle of your project.

Agree completely. It would be interesting to see how many installs there are of n/nvm/rvm/rbenv/virtualenv with only one version of the scripting language installed. Often the version bundled with the system is too out of date, or too locked down in root directories, for users who just want something simple.

Updating Python once a year shouldn't be too much to ask of a company like Apple, and is adequate for 90% of users. I shouldn't need to install pipenv and virtualenv if I want to drop into a scripting environment once a month.

These tools aren't (or at least shouldn't be) only meant for professional engineers.

That's not the case here, according to the link. There is not, and will not be, a system-installed interpreter in Windows. It's just directing people who enter 'python' or 'python3' on a CLI to a package on the Microsoft Store.

> you end up with an ancient, unmaintained version

Only if your system isn't getting any updates. It would be up to Apple to provide a recent version.

But if Apple updated the version, it would break scripts that were relying on a particular version's features.

> You should not be relying on system installed interpreters

No one seems to have a problem with that if it's bash.

You definitely shouldn't be willy-nilly modifying the environment of system interpreters, but relying on them for ops-like uses and bootstrapping dev environments is sensible.

My understanding is that Python breaks backwards compatibility (or forwards compatibility) comparatively more often than bash. The problem with OS interpreters is entirely linked to stability and portability; indeed, regarding bash, many suggest avoiding bashisms as much as possible and staying POSIX-compliant.
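For illustration, a few of the usual substitutions when avoiding bashisms; each section below shows a POSIX construct that runs under any /bin/sh (a sketch, not an exhaustive list):

```shell
#!/bin/sh
# POSIX-portable alternatives to common bashisms.

x="hello"

# bashism: [[ $x == h* ]]   POSIX: a case pattern
case "$x" in
    h*) starts_with_h=yes ;;
    *)  starts_with_h=no ;;
esac

# bashism: echo -n / echo -e (behavior varies by shell)   POSIX: printf
printf '%s\n' "$starts_with_h"

# bashism: arrays, $RANDOM, ${var^^}   POSIX: stick to $(( )) arithmetic
n=$((3 + 4))
printf '%s\n' "$n"
```

Tools like shellcheck will flag bashisms automatically if the script declares `#!/bin/sh`.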

Apple is only removing Python 2, not Python 3.

I've seen lots of companies do what Apple is doing, and it always makes the platform less valuable.

Why can't Apple just install a few languages and keep up with them? Not being in common use will basically kill them on the platform. I suspect the people making these decisions believe it's good for Apple's brand, but I think it will lead to an inward-looking, insular culture.

I'm desperately in need of a guide on how to un-fuck my Python setup. On my Mac I have several versions of the default shipped Python, brew-installed versions of 2 and 3, and Anaconda-installed versions of 2 and 3, and I have no idea which I'm getting when I call 'python3' or 'ipython', or which should be being used. Substitute apt for brew and it's the same situation on Ubuntu. It's really confusing and means I have to troubleshoot dependencies for any Python project I try to run.

Disagree... there are tons of VBScript scripts around. Why? It's the default. I fought with a bank for months to use Python but ended up with Perl scripts on Solaris back in the day. Why? Perl came with Solaris. Having a trusted base and easily auditable scripts is good from an infosec perspective.

> Finally, with the May 2019 Windows Update, we are completing the picture. While Python continues to remain completely independent from the operating system, every install of Windows will include python and python3 commands that take you directly to the Python store page.

I guess Microsoft agrees with you.

> You should not be relying on system installed interpreters, because you end up with an ancient, unmaintained version.

On the other hand, if you can bundle dependencies with your code, you are sure that a certain version of the language will be present and to some degree, you can just rely on that.

I also agree with Apple's decision, and I was a bit surprised that Microsoft would go the other direction until I read the article: they're also not bundling Python with the OS; they're just putting it in the store and adding some stubs.

IDK, we were looking at using Python in our .pkg's postinstall script instead of bash. I guess now we're going to have to use something compiled.

So - on my Linux machine, I shouldn't trust /usr/bin/python?

There goes probably 10% of the software I can apt-get install.

Honest question: I just use `python -m venv my_env` and then I use source to activate it.

Well, to be fair, using the system interpreter is exactly what we do on Linux.

Therein lies Python’s biggest weakness. You effectively have to containerize the runtime to be sure your code will work. This is why languages like C++ are compiled to platform executables, and why Java runs bytecode.

To be fair, even with C++ you have to distribute some dependencies, e.g. the MSVCRT. It's just that that is a lot easier and more efficient than distributing a copy of Python with every program that uses it, which is what currently happens on Windows.

There's no need to distribute the MSVCRT dependency if the CRT is linked statically (the /MT or /MTd compiler options, versus /MD or /MDd).

Maybe we can get them to include virtualenv by default too :)

It's included by default since 3.3 <https://docs.python.org/3/library/venv.html>.
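Right; and beyond the `python3 -m venv` command line, the same stdlib module can be driven from Python code, which is handy for tooling (a sketch using only stdlib modules):

```python
# Create a virtual environment programmatically with the stdlib venv module.
import os
import sys
import tempfile
import venv

target = os.path.join(tempfile.mkdtemp(), "demo-env")
venv.create(target, with_pip=False)  # with_pip=True would also bootstrap pip

# A venv gets its own interpreter plus a pyvenv.cfg marker file.
bin_dir = "Scripts" if sys.platform == "win32" else "bin"
print(os.path.exists(os.path.join(target, "pyvenv.cfg")))  # True
print(sorted(os.listdir(target)))
```

`venv.EnvBuilder` exposes the same machinery with more knobs (symlinks, prompts, upgrade behavior) for anyone building their own environment tooling.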

