Apple removes Python 2.7 in macOS 12.3 beta (developer.apple.com)
681 points by tosh 5 months ago | 408 comments



Finally! I’m a big fan of Python, but I hope they remove ALL Python from the OS itself. I’ve spent many hours over the years messing with setups to resolve bugs caused by Python scripts picking up the macOS Python install rather than the user-installed version. They really shouldn’t be bundling any Python version with the OS - the user should choose the versions installed. It’s much cleaner that way.


Or at least, they shouldn't be putting it in the $PATH.

Whether you're using Linux, macOS, or BSD, OS-packaged language runtimes exist for the OS itself to use, in OS-internal scripts and underpinning OS services. They shouldn't be considered "part of" the userland, as this invites userland applications to rely on them. These runtimes are actually something more like an "implementation detail" of the OS — the kind of thing whose binaries should live in $PREFIX/libexec rather than $PREFIX/bin.

(Mind you, if you're a system software developer, whose software gets packaged into Operating Systems, then you do have to rely on this "implementation detail." Which often means targeting very old versions of runtimes and libraries, compared to the versions that application developers get to target.)

Also, of course, there are "standard" language runtimes that should be part of the userland — e.g. /bin/sh. But these are literally standard — i.e. part of some standard like POSIX, that an OS can be tested for conformance against. Which is important, because it means that developers can write to a stable target of "POSIX compatible", assuming the existence of (and particular stable semantics of) particular standard runtimes.

But none of the "modern" language runtimes are part of any such standard. The version OSes ship should be assumed to not exist. (And major third-party applications do already do the right thing, in not making any assumptions of existence, but rather installing their own vendored versions of the runtimes that they can upgrade as needed.)


I’ve always thought it strange that most Linux distributions and other *nixes don’t put up barriers, not only between system and userland, but also between “essential” userland (i.e. DEs) and other user-installed packages. The lack of separation seems like a setup for failure and complicates package manager design considerably.


This is where the excitement of using Linux for day-to-day tasks comes from, right? Install Steam and suddenly your DE gets uninstalled [1], for example.

[1] https://www.reddit.com/r/linuxmemes/comments/qhjhf2/linus_tr...


But it really isn't. I never have problems on Linux with Python apps that are prepackaged by the distro. It's up to you to keep things separate if you install your own apps that require Python. You really can't blame it on distro maintainers, as they're doing just fine with it. What happens is that people won't take the time to understand what they're running, and instead randomly type a CLI command or click on the app icon. Linux doesn't hold your hand (and really it can't! there are simply too many possible combinations) unless you use what comes with the distro.


> unless you use what comes with the distro.

And, even then, it's not the best choice for users that need hand-holding.


Can you come up with a solid definition to separate "essential" userland from other user-installed packages?

You cite DEs as essential, but maybe I want to be using my own Enlightenment install rather than whatever the system has installed (true story!)


Well I mean, there’s no reason why the user shouldn’t be able to control what’s installed as essential userland. It just needs to be managed entirely separately from the system and “normal” userland. It could be as simple as a flag on apt, dnf, etc.


This is the reason I really, really love Nix. I have been fighting this same battle for years and Nix finally fixed it.


I'd define it as "never have zero packages that provide a DE". This could be a global setting in the package manager which you would be able to set to `false` if so desired.


You can always build escape hatches for that kind of use case. Some more stringent separation between essential and installed components could have prevented Linus from removing his DE.


What's a DE?


Short for Desktop Environment. You can choose yours in Linux.


By the same token, they should remove bash and libc from $PATH as well, as these are also runtimes that the OS itself uses. I think for a long time Python was just as much part of Ubuntu's intended user-facing API as those two or GNU ls. Until Python 3 came along and split the Python community in two, there were few reasons to complain. If you want to package your own Python with your app, nothing stops you from doing so and calling it explicitly.


> I think for a long time Python was just as much part of Ubuntu's intended user facing API as those two or GNU ls.

To be clear, it doesn't really matter what Ubuntu's user-facing API is, because nobody writes a software package for Ubuntu specifically. They write it for, at least, Linux; or more likely, the de-facto subset of POSIX that includes at least Linux, macOS, and Windows MinGW. And that "Linux" target, almost always implies intended portability to "weird" distributions like CoreOS or Alpine Linux.

When you create a portable upstream software package that doesn't make any assumptions about where it's going to be installed, you can still rely on both the existence and the (POSIX subset) semantics of ls! But Python? No way.

In fact, you can't actually rely on libc being any particular way, either — but that's what autoconf is for. (Which in turn relies on the existence of, and standard POSIX semantics for, particular tools like m4.) Given that you have those POSIX-standard tools, you can generate a configure script. But you don't even need those tools to run the configure script. It just relies on, IIRC, POSIX-standard compatibility-mode cc(1). That's everything the configure script needs to probe the behavior of your system's libc (and kernel library headers, and a bunch of other things.) As such, a configure script, once generated by autoconf on some system somewhere, is portable to literally any POSIX with a compiler (which is an incredible feat requiring some horribly ugly hacks.)

As for bash: it's not something you can assume. When writing portable code, you can assume (re: POSIX) that whatever shell is at /bin/sh is a compatible implementation of the Bourne Shell; but you cannot assume that that shell is specifically the Bourne Again Shell, or that it can parse Bourne Again Shell syntax/semantics. Under Busybox, for example, /bin/sh is a compatibility mode of https://en.wikipedia.org/wiki/Almquist_shell, and bash-isms don't work. (Ever written a script to run on a Synology NAS? Ash is all you get.)


The moment you start talking about C compilers you've stopped talking about most software running today on Linux systems. The vast majority of packages are delivered as apt/rpm/apk etc installers, which are overwhelmingly binary or interpreted code. If you're delivering an apt package, you will want to assume (or ensure through declared dependencies) some things about the target system, such as bash or Python.

And you'll find numerous bash-only scripts online, whether as examples, helpers, Stack Overflow answers, or elsewhere. Comparatively few bother to support POSIX sh - though that trend is somewhat reversing with the rise of containers and of Alpine Linux and BusyBox with them, plus Apple's moves against bash.

Just like you will find no tutorials telling you to edit a file with ed, even though that is the standard text editor.

Finally, if you think it's in any way common to write software to support any POSIX-compatible system, or even any Linux system, you live in a much nicer corner of the world than I do.


Not the same token: bash and libc are backwards compatible.


I already acknowledged the misguided Python 3 split made this into a problem. But that is not the reasoning offered by the GP - they were claiming that Python was only an implementation detail of the OS and shouldn't have been exposed in the first place, which makes no sense to me.


"They shouldn't be considered "part of" the userland; as this invites userland applications to rely on them."

Or should they? Windows applications rely on the .NET runtime that is part of the OS, and have for years now. It's one of the reasons why application development on Windows can be so laid back and stable.

I really, really don't understand why people in Unix-land feel the need to move fast and break things.


My understanding is that once they remove python2, there won't be any Python bundled. There will be a python3 handler that just prints a message asking you to use `xcode-select` to install the Xcode command line tools, and that'll include the actual python3... but that's a fairly up-to-date version these days. (Right now it's 3.8.9.)


I fully agree with removing Python from OSes, but I don't know that it's reasonable for the average user to manage their own versions (I wish I didn't have to manage my own versions and I've been developing in Python for the last 15 years). The alternative is to ship the runtime, stdlib, and dependencies with each application, but that seems pretty sucky for an interpreted language and especially one that leans so hard on random native extensions (I've seen relatively simple applications pull in 250MB of dependencies in a compressed tarball, never mind runtime and stdlib).

Really I don't think Python (and probably other interpreted languages) is very well suited for developing end-user applications--although some people have fewer scruples than I do about wasting user resources.


Please rest assured that this is not a flamey / troll comment, it's just that your first paragraph to me sounds like "Python is just not worth the trouble" which you kind of allude to in your second paragraph after.

I mean, at what point should we start asking ourselves fundamental questions like "what is so heavenly about Python that we have to carry it everywhere and ensure its installation on every possible platform"?

Example #1: I know a guy who regularly writes scripts in OCaml and laughs at everybody else, saying he gets static types and mega-fast binaries (faster than Golang and slower than Rust) and a very terse syntax (he says it's often as short or shorter than bash even).

Example #2: I recently finally had enough of bash/zsh scripting weirdness for a few tasks I wanted done on my home server so I just learned enough Lua for an hour to make a fairly decent script that did exactly what I expected immediately after I understood a few of Lua's quirks (which are at least consistent and easy to remember, unlike the endless escaping and quoting rules of the scripting shells).

Example #3: I know a few people using a Rust library/tool that makes scripting with it much easier and almost painless. They swear by it.

--

I am not even mad at Python for potentially putting hundreds of megabytes or gigabytes of baggage on people's machines -- if we start having each app carry its own Python runtime. Meh. Many people would never care (me included). I do get ticked off by it being everywhere however; especially having in mind what an awful language for long-term projects it is (said to me by at least 5x senior Python devs at 40+ of age). And how often it messes up with other work you do on your computer, too -- which is the cherry on the cake and has been guilty of tons of broken Debian system-wide upgrades (among many others).

I wish people stopped getting tempted by "easy to start with" languages and just start seeing the forest and not only the trees... :(


I generally agree. I don’t think Python is even “easy to start with” these days. I think it got that reputation in the early 00s, but since then the competition has surpassed it.

Languages like Go are arguably easier to use, and getting a basic project up and running and sharing it with others is far easier than with Python. If you really care about a scripting workflow, ‘go run main.go’ is as easy as ‘python main.py’ for projects without dependencies; and for projects with dependencies, it’s a lot easier to just do ‘go run main.go’ than to configure a venv, install your dependencies, and then run ‘python main.py’.

Nit: OCaml is about on par with Go. Some programs will be faster and others will be slower, but OCaml is very much in the same performance tier as Go, Java, and C# (the benchmarks game prohibits idiomatic Go code for certain benchmarks, so it appears artificially slow).


> … prohibits idiomatic Go…

No it does not.


Per the sibling thread, I think you’re mistaken, but that’s okay. It’s a widely written about topic and anyone can read up on it. We don’t need to hash it out here.


> We don’t need to hash it out here.

We don't: as you have shown absolutely nothing to back up your claim, it can be flatly rejected.


Please show that Go is being made to appear artificially slow, as you claim.


It’s not allowed to preallocate memory in certain benchmarks, so it has to do allocations in a tight loop. Go’s allocations are more expensive by design because idiomatic Go doesn’t create as much garbage as other GC languages.

If you’re here for a language slap fight, you’ll have to find someone else to talk to.

EDIT: good grief, the overwhelming majority of your recent comments are “debates” about the benchmark game. Do you have an alert for references to this phrase or something?


> They really shouldn’t be bundling any Python version with the OS

I fully agree, Operating Systems bundling Python and Ruby has caused me many headaches in the past and I always perceived it as a downside, never an advantage.


I mean the advantage is that you can run vanilla python on most things by default, which is how Ansible pretty much "just works".


Or rather, people should stop using newest and shiniest version of their interpreted language and stick with what most users have.

Python developers also should stop introducing breaking changes to their language with every release, like it is some toy.


> They really shouldn’t be bundling any Python version with the OS - the user should choose the versions installed.

Why? Having a bundled Python version - any version - means that you can ship Python scripts to others. Not having a Python version bundled means you can't ship Python scripts at all.

Sure, you can still technically ship them, but now you need to tell users "just run this simple script!... after you install Python X from place Y, and now you'll have multiple Python versions installed, so here is how to keep them separate, and [...]". Or, you can bundle your script with a whole Python installation!


It’s only really feasible to ship APIs in an end-user operating system that can make forward binary compatibility guarantees.

Python doesn’t make a sufficient guarantee of forward compatibility for either scripts or its C-linkable libraries. This means that, unless an OS vendor does a bunch of extra maintenance work on the Python that they ship, an application built in or making use of the Python shipped in one version of an OS may not run on a subsequent version of that OS.


When I have to install a CLI tool and it's in python I sigh loudly. Happy that many CLIs have moved or are moving to golang


I use pipx for this. It isolates each tool in its own virtualenv. You might still have to install a specific Python version (although the latest 3.X works fine for me), but it takes some of the pain out of the process and makes sure you don't pollute your global/system install.

On a Mac, it could be as simple as (assuming you already use Homebrew):

    brew install python
    brew install pipx
    pipx ensurepath # adds ~/.local/bin to $PATH
    pipx install <tool>
    <tool>


Pipx is really one of the best things that has happened to Python recently; it takes away all the headaches of manually managing venvs. They should integrate it into mainline imo.


That is all nice. And then python updates from 3.9.n to 3.9.n+1 and nothing works anymore.

brew sucks with python.


I've only run into issues like this when upgrading from Python 3.X to 3.Y, which is to be expected, and Homebrew lets you install a specific 3.X (e.g. brew install python@3.9) if you don't need/want be on the latest version.


Thanks, I didn’t know about pipx. If it works as advertised that will be a huge improvement for me!


pipx looks fantastic, thanks for sharing.


As a non-Python native, I sigh whenever I have to install anything in Python. There's always a dependency I can't find, and I never know why. I never understand why the dependencies aren't just resolved by an application and kept local to that application. I mostly end up with something working. Sometimes I just create a VM for that single application.


I once really desperately needed a Python tool. Wasted an hour and getting more and more enraged trying to install it, overwriting hell-knows-what and ruining god-knows-how-many other programs in the process. Eventually it did occur to me to just use Docker and that did it. <facepalm>

Every time I am forced to do something with Python I end up hating it. It seems those who write it are just not interested in anyone who doesn't have a complete dev environment setup for it. :(


> It seems those who write it are just not interested in anyone who doesn't have a complete dev env...

And that's probably what makes Python so popular, and so great. I get it. It's attractive to people who aren't developing software for other developers.

But what you described is surely the result of that atmosphere.


> And that's probably what makes Python so popular, and so great. I get it.

I don't follow because that's actually a problem. Imagine having to install Haskell compiler -- exactly one particular version of it with the exact right combination of flags (and there are hundreds of them) -- for each project... while also having the whole thing kind of compile from source.

And that for people who just want to do something like `brew install a_valuable_tool` and have no patience (and certainly not the time) to tinker with Python's dev environment just so they can install a tool they need.

So maybe I am misunderstanding but no, that's not what makes Python popular or great, at all. It makes it a legendary pain for anyone who doesn't code it every day.

If I can't do `brew install your_tool` but have to fiddle with the intricacies of your language's ecosystem then you have failed as a tool author.

Contrast this with Golang tools (disclaimer: I dislike Golang) which in the worst case scenario involve 1-2 commands to download and compile+install the tool but usually don't involve even that because Golang's process is self-contained and that makes it super easy for all package managers to distribute it pre-compiled for their platform.


The distro should be packaging software for end users, relying on the python version they maintain. The problem is that Apple isn’t doing that at all, and brew seems to have issues getting all formulas to converge on known-good dependencies.


Try Anaconda (or miniconda/mamba): like virtualenvs but reproducible, dependency pinning, and a great solver. All in one handy tool


Thank you.


https://xkcd.com/1987/ says it all.


Some of this is brew-specific. The way it handles ownership isn't great.


I'm not sure if it resolves your ownership issues, but I am glad they went with `/opt/homebrew` in ARM Macs to at least keep homebrew in its own directory.


What they should have done from day one. The kludge they used in /usr was broken-by-design.


The way it handles ownership (or at least used to) is completely broken with screwed up stuff happening inside /usr land.

Macports puts everything under /opt/local and you add /opt/local/bin to your PATH.

I've never understood the preference for Homebrew. When I moved to macOS (from Linux) five years ago as my working development environment, I read about both, and Homebrew seemed like a complete kludge, whereas MacPorts worked the way it should.

SIP just made it even more relevant. MacPorts deals with specific macOS-related integration (e.g. Java) using the Apple-blessed mechanisms while still providing all of the flexibility of its many, many packages.


That’s hilarious. It’s why I love RStudio... I don’t have to deal with any of that as someone who is just trying to prototype and get something to work.


nobody has ever shipped R as part of the base OS


Except Ubuntu DSVM


I feel like in general Python on Mac OS is messy compared to Linux and even Windows.


There are a lot of good things to say about Python, which is a fairly nice language, but I feel it is messy on every platform. Tooling makes the situation bearable nowadays, but that's definitely not one of Python's strong points.


But it's so easy to set up the path to Python. I hear people complain about this all the time, but I never encounter it because I'm careful and I know which interpreter I'm running ALWAYS. It really isn't a hard task. Having Python in the OS is fine. I wish Windows had a default version, and I love that it's available on Linux in most distros.


Apple ships with a number of utilities installed that require Python, like smtpd.py.


Where does this utility exist? I can't find it on my macOS installation.


A lot (all?) has been cleared out in Monterey. I can’t find smtpd or xattr’s scripts that required the previous version. I think whatever remained before 12.3 were Automator’s py scripts. But I don’t know what scripts those were.


Mine does seem to contain `/usr/bin/smtpd.py` with a modified date of January 22nd. It's an installation that's been updated through a number of versions of macOS, though.


Ah, I see...

It contains the following shebang:

  #!/System/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python -R

Which continues to work since it's a system framework.


I'm running Monterey on a brand new machine (no updates from older versions of macOS) and mine is in /usr/bin/.


It seems that Spotlight doesn't index that folder, which is why I didn't find it in my quick search.

I see it now, but it uses a shebang that points to the system framework python:

  #!/System/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python -R


Having some Python always installed can be nice for use as an advanced calculator or for short ad-hoc scripts.

My non-tech mom needed help wrestling a large set of semi-structured data into a CSV to use with Excel. On my dev machine I would have solved the problem in a minute with 5 lines of Python. On her machine, without any coding environment, I was completely stuck. Eventually I figured out you can use the developer console in Chrome to run some interactive JavaScript, and problem solved.
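For the curious, the kind of five-liner meant here looks something like this. The input format and field names below are invented for illustration, assuming blank-line-separated "key: value" records:

```python
import csv
import io

# Made-up sample of "semi-structured" input: blank-line-separated records
raw = "name: Alice\nage: 30\n\nname: Bob\nage: 25"

# Parse each block into a dict of its "key: value" lines
records = [dict(line.split(": ", 1) for line in block.splitlines())
           for block in raw.split("\n\n")]

# Write the records out as CSV, ready for Excel
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=records[0])
writer.writeheader()
writer.writerows(records)
print(out.getvalue())
```

Swap `raw` for the contents of a file and `io.StringIO()` for an `open(..., "w", newline="")` handle and it's the real thing.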


Maybe just call it `system-python`.


That’s basically what RHEL does: /usr/libexec/platform-python.


The only reason I'm using Python is because it's available on most Linuxes and macOS by default. If it were removed from macOS, I'd never use it.


macOS (a certified UNIX) has bundled multiple open-source projects since its initial OS X release, just like many other *nix OSes. It's so easy to install your own version and override it that this is not an issue. Simple response: use brew.


PSA: don't use brew to manage your development python, unless you like having the carpet pulled out from under you by brew deciding to change 3.8 to 3.9 on you.

Use brew to install pyenv (or bootstrap it from github). Use pyenv to manage python versions, and then use a virtualenv.

Or conda, if that is your jam.

https://github.com/pyenv/pyenv


Don’t use brew to manage your development anything. Carpet-pulling is pretty much intrinsic to its design.

“hey I upgraded openssl for you and trashed the old one, hope you hadn’t built anything against it!”


Yes, this is what I do now, and since moving to pyenv I haven't had any real issues. The only glitch was when I had to upgrade pyenv from 2.2.3 to 2.2.4 and it just told me 2.2.4 was already installed while I was running 2.2.3. I'm guessing that's a Homebrew thing?


>It's so easy to install your own version and override it that this is not an issue.

It's easy to override it? Really? Don't forget that macOS comes with bash, zsh, ksh, tcsh... You have to override it in each of those!


You can easily set your $PATH so that your preferred tooling is first in the search order.


There's a bunch of places that can still get tripped up:

- scripts with `#!/usr/bin/python` instead of `#!/usr/bin/env python`

- shell scripts which call /usr/bin/python directly instead of the one in your path

- shell scripts which intentionally set a prefix in $PATH to some vendor directory with a custom python, but the vendoring is incomplete and leaky and it implicitly loads packages from your system python anyway

plus a hojillion other things I'm not thinking of...
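When one of these bites, a quick sanity check is to ask the interpreter itself which binary ran. A tiny diagnostic sketch, nothing macOS-specific assumed:

```python
#!/usr/bin/env python3
# Print which interpreter actually executed this script -- handy when a
# hardcoded shebang, a vendored $PATH entry, and your shell disagree.
import sys

print(sys.executable)                            # absolute path to the running interpreter
print(".".join(map(str, sys.version_info[:3])))  # version, e.g. "3.8.9"
print(sys.prefix)                                # install prefix (reveals venvs / vendored copies)
```

Run it via the same shebang or command the misbehaving script uses, and the mystery interpreter identifies itself.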


But scripts calling /usr/bin/python want to use the "system" / "bundled" Python - why would you want those to run with your version of Python???


Because in 120% of situations, that "want" is actually "I had no clue how *nix works and just pasted in the result of running `which python` on my system".


Do you think the same is true of #!/bin/bash? Why not?


GUI apps also have their own special $PATH set via launchctl.


macOS*

I don’t get your point. Are you agreeing?


You should absolutely avoid overriding the default Python installation, particularly if you're on a distribution that ships python2 by default, because that is a good way to kill a good chunk of your system.


Everyone should depend on “python3” or “python2”, never just “python”. They aren’t really the same language. python3 can’t run python2 scripts, nor vice versa, so no matter what you expected there will be systems on which you won’t get it.
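One tiny illustration of the divergence, for anyone who thinks of them as one language: the same source line quietly means different things, with no error in either interpreter.

```python
# In python2, 1/2 floor-divides to 0; in python3 it true-divides to 0.5.
assert 1 / 2 == 0.5   # holds only under python3
assert 1 // 2 == 0    # explicit floor division behaves the same in both
```

That's the insidious kind of incompatibility: `print` being a statement vs. a function at least fails loudly, while division just gives you different numbers.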


100% this.


Thank you, I struggled with that problem for years. Happy that I am not the only one.


Not much of a choice anyway. Choosing to keep an unsupported piece of software in the system is a security liability.


It’s also about (damn) time.

I know it shouldn’t, but the python community’s bizarre behavior during the 2.X -> 3.X move honestly made me think less of the language.

Tech is supposed to just be tech, but when the community behaves this badly about adopting improvements how can that not influence your decision to invest in that tech? If 2.X -> 3.X was such a drama fest what’s gonna happen next time they need to upgrade? Etc


While I happily use Python 3 now, the transition was very difficult, and I was grumpy about it at the time.

I work with code that uses a lot of files in a lot of different text encodings. Some are XML, some plain text, and some binary. Coming from other languages, Python 2's Unicode support was very difficult to work with, and my team was excited to move to Python 3, until we actually had to do the work.

Long story short, Python 3's str/bytes separation was a nightmare, especially when having to deal with third-party libraries that were expecting one or the other. This was especially true for libraries that came from Python 2, and were still expecting strs as function parameters when they should have switched to bytes. There was so much encode() and decode()ing going on that we occasionally caught ourselves getting them backwards in code reviews. We took a step back to see if we were just architecting things poorly, but no, we were doing things the best we could for our problem domain.

In the end, we traded one set of Python 2 text encoding footguns for a mostly different set of Python 3 text encoding footguns. It's not like Python 2 was great: having a single str type instead of the str/bytes separation is worse in theory. But Python 3 didn't design the separation of those types well enough, and still has a ton of text encoding footguns. It wasn't better enough to really justify all the work that went into the conversion.


Transitioning was hard, especially with all the libraries in various states of support between 2 and 3; I had a similar experience.

But I disagree with you about the separation of bytes and strings and the current state of the language. I write a lot of Python that deals with bytes and text encoding, and now that all the libraries have caught up with 3, the situation is way better than it ever was. Encoding, decoding, and byte manipulations are way less prone to errors.


I'm glad it's improved. I moved jobs a couple years ago, and since then, have only used Python for a few small personal projects. When I left, there were still some very rough edges around some very popular (and some not so popular) libraries.

And, as I said, I do prefer the str/bytes split over the unicode/str split, but I also wasn't doing a lot of raw byte manipulation. I agree that it was harder to do with the old str than bytes. I was mostly doing string operations on the grapheme cluster level, and then writing everything out as UTF-8, so I didn't see as much of the benefit.


> Python 3 didn't design the separation of those types well enough

I'm curious to know what you mean. How would you improve the design?


That was probably an inartful way of saying it on my part. Echoing what I said in another comment, I think the API for converting between bytes and str types is bad. .encode() and .decode() are bad names for what these functions do, and led to occasional incorrect code in both Python 2 and 3. I would have preferred something akin to:

    b = b"some bytes"
    s = u"some str"
    foo = b.to_string(encoding)
    bar = s.to_bytes(encoding)

or

    foo = str(b, encoding)
    bar = bytes(s, encoding)

I also mentioned that I think there shouldn't be any mechanism to read or write files to/from a string without specifying an encoding. locale.getpreferredencoding() is a mistake.

But one thing I didn't mention that I think would improve the design would be to make it harder to treat str as the bag-of-bytes type it was before. Swift took some steps down this path, and honestly, it made it much less pleasant to work with strings, but a bit safer. I would hope that a better design could be found.

This may be controversial, but I don't think you should be able to just subscript a str type. Instead of s[1], you should be writing s.codepoints[1] or s.graphemeclusters[1], etc., depending on what you actually want. If str is truly a string type and not a bag of bytes, it should deal with the extra complexities that being a string brings to the table.
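For anyone who hasn't hit this: the code-point vs. grapheme-cluster gap is easy to demonstrate in today's Python. The decomposed string below is standard Unicode, nothing hypothetical:

```python
import unicodedata

# One visible character, two code points: 'e' plus a combining acute accent.
s = "e\u0301"          # renders as é
assert len(s) == 2     # len() counts code points, not graphemes
assert s[0] == "e"     # subscripting silently splits the accent off its base

# Normalizing to the composed form (NFC) collapses this one to a single
# code point, but not every grapheme cluster has a composed form.
assert len(unicodedata.normalize("NFC", s)) == 1
```

An API that made you say which unit you meant would turn that silent `s[0]` surprise into an explicit choice.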


Oh, I see what you mean. Yeah, I have gotten tripped up wondering whether I should be calling encode() or decode() many times. Using the str() or bytes() constructor would be more intuitive for sure.

Interesting thought on subscripting strings. It would be hard to make it fly after all these years of habits, but I see your reasoning.


What do you think the current text encoding footguns are?

In a different direction, I don't know what your problem domain is/was, but in general when I'm dealing with UTF8, I don't need to convert back to bytes very often. Was the need for conversion mostly due to the libraries that still expected strings instead of bytes?


It's been quite a few years since we went through the conversion, and at this point, working in Python 3 is natural to me, so I may not be able to recall all the footguns. I can say a lot of the difficulty was due to libraries, both third-party and standard, and that hasn't improved very much. I don't want to single anyone out here, because it's pervasive. In Python 2, str was the bag of bytes type. I think a lot of libraries didn't want to change to accepting bytes types instead, because it broke API compatibility, but it caused a lot of issues.

I should also say that we were working with files in tons of encodings, not just UTF-8. We had UTF-16 and UTF-32, both little and big endian, with and without BOMs, but we also had S-JIS and a bunch of legacy 8-bit encodings. Often we wouldn't know what encoding a file was in, so we'd have to use the chardet library, along with some home-grown heuristics to guess.
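For the BOM cases specifically, Python's bare `utf-16` codec (as opposed to `utf-16-le`/`utf-16-be`) writes and consumes a byte-order mark for you; a small sketch:

```python
# Round-trip through UTF-16 with a byte-order mark.
data = "héllo".encode("utf-16")                 # native byte order, BOM prepended
assert data[:2] in (b"\xff\xfe", b"\xfe\xff")   # LE or BE BOM

text = data.decode("utf-16")                    # the codec consumes the BOM
assert text == "héllo"
```

Files with no BOM and no declared encoding are the hard case, which is where chardet-style guessing comes in.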

Off the top of my head, the two biggest footguns are:

- There should be no way to read or write the contents of a file into a str without specifying an encoding. locale.getpreferredencoding() is a mistake. File operations should be on bytes only, or require an explicit encoding.

- .encode() and .decode() are very poorly named for what they do, and it wasn't that uncommon for someone to get them backwards. Sometimes, exceptions aren't even thrown for getting them wrong; you just get incorrect data.

Both of which were still issues with Python 2. There's a valid architectural argument to be had between the Python 2 way, where str was a bag of bytes, and the unicode type was for decoded bytes, and the Python 3 way, where the bytes type is your bag of bytes and str holds your decoded string. I favor Python 3's way of doing it, but it's almost six of one, half a dozen of the other. The advantages of one over the other are slight, and given how many library functions relied on the old behavior, it was probably a mistake to change it like that, rather than continuing the Python 2 way, and fixing issues like those above that caused problems.
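A minimal sketch of both points in Python 3 terms (the directionality of encode/decode, and passing an explicit encoding to open() instead of relying on the locale default):

```python
import os
import tempfile

text = "héllo"
data = text.encode("utf-8")            # str -> bytes
assert data.decode("utf-8") == text    # bytes -> str

# Getting the direction backwards at least fails loudly in Python 3:
assert not hasattr(data, "encode")     # bytes has no .encode
assert not hasattr(text, "decode")     # str has no .decode

# File I/O with an explicit encoding, never locale.getpreferredencoding():
path = os.path.join(tempfile.mkdtemp(), "out.txt")
with open(path, "w", encoding="utf-8") as f:
    f.write(text)
with open(path, encoding="utf-8") as f:
    assert f.read() == text
```

In Python 2, str had both methods and implicit ASCII coercion, which is how backwards calls could silently corrupt data instead of raising.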


I haven't done a ton of work with Python recently, but the problems I remember encountering came from the fact that python doesn't try to have encodings in any other part of the basic type system. So like, if you have an int or a float, you can pass those to any interface that takes a 'number-y' value and it will mostly work like you expect. That's also how strings worked in P2 - you could pass them around and things would accept the values (though you might get gibberish out the other side). Now, in P3, things will blow up (which is helpful for finding where you went wrong ofc - I understand the utility), but it means that your code handling things-that-might-be-strings-or-bytes often needs to have a different structure than the rest of your code.

I think the P3 string/byte ecosystem was made substantially weaker by P3 deciding not to lean more into types (something I have complained about on here before!). Like...they are the only values where the stdlib is extremely specific about you passing a value that has the exact right type, but the standard tools for tracking that are pretty poor.


> but it means that your code handling things-that-might-be-strings-or-bytes often needs to have a different structure than the rest of your code.

Isn't that the point? String and bytes are different beasties. You can often encode strings to bytes and just about anything accepting bytes will accept it, but the converse is not true. Bytes are more permissive in that any sequence of any 0x00-0xff is acceptable, but str implies utf8 (not always guaranteed, I've seen some shit), meaning e.g. you can dump it to json without any escaping/base encoding.
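For example, a decoded str drops straight into JSON, while raw bytes have to be given some representation first (a sketch):

```python
import base64
import json

s = "héllo"
print(json.dumps(s))   # strings serialize directly (escaped as needed)

b = b"\x00\xff"        # arbitrary bytes are not valid JSON values
# json.dumps(b) raises TypeError; you must choose a representation:
encoded = base64.b64encode(b).decode("ascii")
assert base64.b64decode(json.loads(json.dumps(encoded))) == b
```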


Sounds like you should have moved on from Python to something else altogether?

Reading your comment feels like "I don't like Python for being Python", more or less. Apologies if I misread.


I actually like Python a lot. Although I'm no longer using it professionally, it's my first choice for personal projects.

I'm also an advocate for using the right tool for the job, and in this case, Python may not have been the right tool for this job, but this was only one component in a much larger system. Sometimes you have to be suboptimal locally to be optimal globally.

And it's not like Python couldn't handle it, it was just that it had some design decisions that made things a bit harder than it would have been in some other languages. We got it working pretty well in Python 2, then the Python 3 transition happened, and it was a lot of work to get everything working as it had been, for only a small benefit to our team, but we got it working in 3 as well, and to my knowledge, it's still humming along.


> the python community’s bizarre behavior during the 2.X -> 3.X move honestly made me think less of the language.

There weren't that many people who were outright against Python 3, other than Zed Shaw for a few years. It just took until Python 3.4 for the language to really be usable in production, and after that it took a couple more years for every library to be updated. But the community has been pretty unified for 5+ years at this point.


If by unified you mean no longer arguing, sure. I switched from Python 2 to Go. (I will concede that if I am doing exploratory stuff, I will start new repos in Python 3, e.g. for quick boto exploration.) I am still bitter about the fine Python 2 code I supported and extended that is now all Go or Java.


This is my experience as well. There were plenty of people online vocally against Python3, but once we hit 3.4 I never worked with anyone IRL that was against it. The mindset was almost always "this is the way forward", so that's how we moved with our projects.


I used it in production just fine before that. What didn't work for you? The unicode support was shoddy, but lots of other stuff worked fine.

The problem as I remember it was that there were some critical tools that just were not making the switch. The last one I remember having to keep a separate runtime for was graph-tool, but there were just a ton of them in the scientific computing community. Also, I don't think making print a function was worth it. So many people had muscle memory on that that personally I think it was worth just special-casing it.


> What didn't work for you?

Weren't there a lot of major performance regressions in the first couple versions?


There definitely were, at least in the early versions. It would have been fine (I guess) if those early versions had been short-lived, so to speak, and a stable, definitely faster version (compared to 2.7) had been made available shortly after, but afaik that wasn't the case.

If it matters I've been writing Python professionally for more than 15 years and I started my career by seeing the (less famous) Zope2 to Zope3 botched migration. When Python3 was first announced I had hoped that the devs behind the project had learned from that related experience, guess I was wrong.


> But the community has been pretty unified for 5+ years at this point.

Sure, but they accomplished this mostly by displacing anyone who disagreed. That's fine, and Python 3 in a vacuum is a better language than 2. But I'd never make the mistake of choosing Python for a serious project going forward.


The "active community" is unified, I know a bunch of people with some personal piles of Python 2 who I don't expect to upgrade until they have no other choice. I also don't see why they should -- their code is running fine.


They forced incompatible changes on their users, all in the name of a Unicode fetish. They burned their credibility. The users acted rationally, it's the developers of Python who were abusive.


I think it’s less of a fetish and more of a “we made a huge mistake in the initial design of the language and confused text and the encoding of that text, making a giant footgun that was hard to avoid”.

The migration was difficult because you had to actually think about this stuff instead of it working by accident and living as a landmine if you encountered non UTF-8 text.

The fact that you actually have to decode your bytes and specify an encoding makes it really obvious when you’ve just assumed the world is UTF-8 without really backing it up.


What I don't get is if they broke the language anyway, why they didn't fix classes. Most of Python is pretty beautiful, but classes are so ugly, even basics like the syntax for instance vs. static vars, self/this, setting up constructors, calling inherited constructors. Like come on, how is it that they are uglier than Java in this regard.


I really don't know what you're talking about. I've written a lot of object-oriented code in Python and Java and there isn't a single thing I like better in Java (that doesn't come from the typing system). I don't know about 3.0, but it's been great for years. Take a look at some Django code.


super() is a little cleaner to call now at least.
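For what it's worth, a minimal sketch of the cleanup (the Python 2 spelling shown only in the comment):

```python
class Base:
    def __init__(self, name):
        self.name = name

class Child(Base):
    def __init__(self):
        # Python 2 required: super(Child, self).__init__("child")
        # Python 3 infers the class and instance:
        super().__init__("child")

assert Child().name == "child"
```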

The print() change drives me nuts: many scripts use the old form, and Python forces you to change it despite recognizing the old functionality. They could've simply allowed print expr to work as is and accepted it as a wart.

The async stuff feels a bit like a bridge too far for me WRT the language. Simplicity would be standardizing on a function color (as it were).

I don't envy their position of preserving backwards compat while evolving the language.


> They could've simply allowed print expr to work as is and accepted it as a wart.

Old print isn't an expr at all, that was a major point of the change.


I think the person you're replying to was saying that there was no reason to remove the statement form. The only thing that removing the statement form of print did was break existing code. You could easily allow print to act as a function by removing it as a reserved word in expression contexts.

They made print a function out of a misplaced sense of language "purity", where they made a random ideal and broke code to support it.

Let's compare to JS: 20 years of new features and old code still works. Keywords were added, and old code still works. Even code that uses those keywords still works. It took more effort from the language implementers, but that's as it should be: it took a bit more effort for a few teams (who make the engines), instead of a large amount of effort for tens of thousands of groups.
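The one transitional idiom that did exist was the `__future__` import, which enables the function form on Python 2.6+ and is a no-op on Python 3, so a single codebase could use function-style print on both:

```python
# On Python 2.6+, this import disables the print statement and enables
# the function form; on Python 3 it is a harmless no-op.
from __future__ import print_function

print("hello", "world", sep=", ", end="!\n")
```

That eased migration for code you controlled, but unlike the JS approach, it still required touching every print in every codebase.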


They made substantial changes to classes in Python3.


If the reactions seem bizarre, maybe you're missing something.

For me it was more than "just the tech". Python 3.x was not the Python 3000 I was promised for a long time: many of the additions did not warrant breaking changes (as demonstrated by the backports) and many of Python's deeper warts still remain unaddressed, even today.

I think the deeper issue is that at the rate that 2.x got popular, the original motivation, principles and process behind the language's development got diluted. The orchestration of the push for 3.x adoption was not technically motivated in a convincing manner and signaled a change in the process. I stopped trusting the process.


"Bizarre behavior" like not immediately throwing away hundreds of thousands of hours of work product because you said so?


I saw the drama and ignored it. Transition was seamless for 99% of the stuff I used, and the 1% I left behind I don't miss.


My company started life in about 2014 and for python is only on 3. And python 3 is great. Having written python 2 before and also perl, I really appreciate UTF-8 just works.

And I hope that python learned from 2 -> 3 and that in the future it will be better


The fact that you think this way shows why that community is so broken about things like this. You think the user community behaved badly? It was the developers.

For the vast majority of companies, python 3.x is not better enough for the cost. That is what happened. You may think it's awesome. It may be technically better it, etc. But the cost of moving was much more for most companies than the productivity gains. We've even measured it before. That is entirely on the python folks for not providing something people want enough to move to. All CSAT surveys of python i've seen at my company, and that my counterparts had ever seen, said literally the same thing. So it seems like something the community could have discovered pretty easily in a rigorous way if they wanted. I used to spend a lot of time on python-dev. It's a group of good people and engineers for sure. There was always good rigor around technical reasons to do something, and around performance benchmarking. But i don't remember seeing a lot of productivity research or CSAT or .... It's fair to point out that most OSS communities don't do that, but that's a bad thing, not a good one, and one of the things that often leads to community trouble

One of the other major factors, besides the productivity issue, that makes it not better enough is the migration cost. Software of this kind needs to be built to be migrated from and to. If not, it's broken and you deserve all the animosity you will surely get from your users. It doesn't matter if you screwed up the language, etc. Tough crap.

In python, for lots of folk with substantial stacks, the amount that could be auto-migrated by the tooling provided was low - 40-50%. Yes, simple stuff could be auto-migrated, but not most real stuff.

We actually ended up building our own tooling to get closer to 70% because of how expensive the overall thing was. It processed many millions of lines of python code, automatically kept track of which things would be best to rewrite/finish migrating first (because it would unblock other things), and continuously tried to auto-migrate any non-migrated changed code in a loop to maximize automation. The total engineer cost of moving was enormous. We are talking well over a thousand SWE years.

For what, exactly? As I said, we already measured (because plenty of new projects were python-3 only) and determined that all the improvements and hullaboo did not meaningfully improve productivity for people. Not enough. It was small, at best. It also didn't reduce the rate of production bugs, etc.

So all this work, for no meaningful value.

Because not enough were willing to do it, the result was to force it. Damn the users, full steam ahead. The users are just lazy.

I talked with lots of counterparts at other companies - most felt the same way. The result for us is lots of teams gave up python forever. The only reason it's growing at all is because of ML.

IMHO, python2->3 is an object lesson in doing it wrong and then victim blaming.


"We actually ended up building our own tooling to get closer to 70%. It processed many millions of lines of python code, automatically kept track of which things would be best to rewrite/finish migrating first (because it would unblock other things), and continuously tried to auto-migrate any non-migrated changed code in a loop to maximize automation. The engineer cost of moving was enormous."

This was definitely an opportunity for commercial software; I am not sure why companies worth hundreds of billions can't sort this out themselves, yet expect a small nonprofit to provide it.


FWIW, I don't disagree, and I don't expect anything in the end; it's open source. People can do what they want. I was more pointing out that the opinion given, of how the user community sucked, is very broken.

What the "larger user" (however you want to frame it) community would have preferred (IMHO) is much better migration tooling, even if it came at a cost of improvements.

The dev community chose to spend time on improvements instead. That's their choice, but it doesn't make the community wrong to be upset about it.


> We are talking well over a thousand SWE years.

Why didn't you just take over support of python 2.7?


We did, internally, for a bit, but since they encouraged every library to drop support, and most did, it was not a tractable long term answer.


Inertia seems to best describe the hesitancy to change. The language improvements stand on their own I would think.


That's the problem - they literally do not. Technically better is not a good enough reason to change something. It has to provide some actual greater user value along axes that matter to users, and make the cost worth it. What you are seeing is that python3, well, doesn't. The options in that case are reduce cost or improve along axes that matter enough that cost is worth it.


> Inertia seems to best describe the hesitancy to change.

I'd have gone for 'rigor mortis', but hey, we can't agree on everything ... like, say, a very minor change to string encoding with a 12-year managed rollout.


like, say, a very minor change to string encoding with a 12-year managed rollout

That's exactly the point: it required a massive amount of effort for relatively minor benefits.


I did a bit of contract work for a place with over a million lines of custom python 2.x scripts that ran inside of a python interpreter embedded in a proprietary product which made it difficult to do things like write and execute unit tests for the code. I think they were still writing new python 2.x code in 2018.

A lot of the scripts were supporting processes that had a finite lifespan during the initial stages when the company infrastructure was being built out, rather than ongoing operational processes that needed to be performed indefinitely, so hopefully they'll be able to set a lot of that codebase on fire instead of maintaining or porting it to python 3.


> Tech is supposed to just be tech, but when the community behaves this badly about adopting improvements how can that not influence your decision to invest in that tech?

What you say was bad about it? And who were the bad people specifically? The people who were using python 2 or python 3?

For what it's worth, python3 >= 3.0 && python3 <= 3.2 were hideously broken in their unicode support. Arguably they had worse/unusable unicode relative to python 2.6 or 2.7.

So there was a huge failure to launch type of problem, especially given how long python3 had been development.

It most definitely left a very sour taste in many people's mouth that didn't start dissipating till 3.5 or 3.6 when enough "killer" features had accumulated.

Even then, for a lot of usages, python 2.7 'just works'.


>Choosing to keep an unsupported piece of software in the system is a security liability.

It is the OS design that doesn't make it possible to support the principle of least privilege that is the security liability. Blaming users for wanting their working software to just keep working the same way is abuse!

If they want to run a piece of code written in HyperCard for the original Macintosh, they should be able to do so. This push towards ever more complexity while failing to fix the inbuilt security fault inherited from Unix is insane.


A purely Python-contained vulnerability would not be mitigated in any way by OS features.


Why not? If you have a fault in a lamp switch, the circuit breaker keeps it from burning down your house.

Why can't an OS keep a rogue program from taking down your machine? You could specify what to run, and with what resources, and then even if Satan himself wrote the code, it wouldn't ever corrupt anything other than the resources specified.


This has to be one of the dumbest arguments i've ever seen on this hellsite.

Firstly, a circuit breaker does nothing but protect the wiring in your house. A faulty lightswitch can still throw sparks and start a fire as long as it never draws more than 10A or whatever your circuit breaker is rated for.

Similarly, operating systems can prevent a rogue process from taking too much CPU or RAM, but can't prevent a process from ransoming your files, at least not without a substantially different sandboxing regime to what exists in mainstream operating systems today. Why we don't/can't have this is an interesting discussion that is much larger than the python2/3 question.


If that was the argument they'd have replaced it way sooner, because 3.x came out 14 years ago, 3.4 (when the 3.x branch could be called "actually ready for use") came out 8 years ago, 2.7 was announced to go EOL "soon" 5 years ago, and was properly killed off almost 2 years ago now.

We've seen an entire major version release of MacOS come out since Python themselves went "look, we're killing this thing off, deal with it", at which point Apple should have dealt with it, but didn't.

This is a replacement for the convenience of their own staff working on the OS, nothing more. It's good that it's finally happened, but it's a good five years overdue, and there's no motivation beyond "we no longer use it" at play here.


Agreed on this one, a great move tbh.


> Not much of a choice anyway. Choosing to keep an unsupported piece of software in the system is a security liability.

Meanwhile, OSes like NixOS can easily deal with different versions of the same package (with security backports) ... Apple seems lightyears behind.


Except there are no official security backports for Python 2 anymore.


it's not just nix/NixOS, macports on macOS deals with different versions of the same package just fine, I have a few pythons installed.


What? In a point release?

We all knew this was coming, but doing it in a point release seems like exceedingly bad form, particularly as Apple does major releases annually. I never expect point releases to break stuff, at least not on purpose.


The risk of unpatched vulnerabilities between now and the next "major release" far exceeds the risk of inconveniencing anyone foolish enough to rely on the system Python 2.7 installation.


> The risk of unpatched vulnerabilities

If a company as 'small' as (pre-IBM) Red Hat can afford to support Python 2.7, then so can Apple:

* https://developers.redhat.com/blog/2018/11/14/python-in-rhel...

* https://access.redhat.com/solutions/4455511

Further, if Debian has the resources to support Python 2.x, then so does multi-trillion dollar company of Apple:

* https://packages.debian.org/search?keywords=python2

If they wanted to announce at the June 2022 WWDC that macOS 13 wasn't going to have it, that's fine. But we're several months into macOS 12.x and such a large change is bad form.


They literally did that, in 2019: https://developer.apple.com/documentation/macos-release-note..., and specifically mentioned *then* that Python2.7 was not recommended.


Yes, and I was surprised when Monterey launched with Python 2 still included.

But I really did think that since it was in Monterey, it was going to stick around until at least macOS 13.


Let's face it: at this point macOS version numbers are a marketing tool, and not much more. It's only 2 major releases ago that a "x.y" release *would* have been considered "major".


I don't think that's particularly relevant. It was pretty clear that between 10.0 and 10.15, the second number was the "major" version for all intents and purposes, and now it's the first number.


They probably had a major customer who needed it for some wacky reason.


Maybe they can afford it, but why should they?


Bingo. This falls under the scope of support for an enterprise Linux company: orgs pay them to keep stuff backwards compatible long after upstream deprecates it. Apple is decidedly not an enterprise support organization. RHEL also doesn't want you using the system Python as your dev Python. RHEL 8 uses 'platform-python' for the bits the system needs, and if you install 3.6, they share the common bits. If you're using a Mac as a dev box and really need Python 2.7, you still have options.


To offer a good service. To avoid breaking something the users depend on.


Apple supported Python 2 for two years after it was officially sunset by the Python Foundation. They've been advising that they will drop support for longer.

How long is long enough to avoid breaking something the users depend on? One more year? Five years? Fifty? I think we can all agree that it's reasonable to drop support at some point, making it just a question of when and at what cost.


"How long is long enough to avoid breaking something the users depend on?"

10 years. A piece of software compiled with 'supported' libraries, SDKs and what have you should function for 10 years. An appliance or a service should function for 10 years and have spare parts available for that long.

That is the guideline I use as a consumer: if I can see that something won't last a decade, I won't be spending money or time on it if I can avoid it. At that point it's no longer a durable good, it's a consumable.


Python2.7 was supposed to be EOL in 2015(!). It was released in 2010. It had 10 years of support mainline.


There would still be Python 2.7 software being produced up to EOL, so you have to start counting from 2015.


Again, the issue is "when", not "whether".

They could have dropped it with v12, or waited for v13; both would have been fine. Dropping it in the middle of a release cycle is what people are objecting to.


Apple isn't in the business of spending excess money to avoid breaking things. If anything apple is the opposite.


If there are users that still depend on python 2.7, they can install it manually?

I mean I already think it's... interesting that an operating system comes with (multiple) programming language interpreters, available to the end-user.


That’s true if they’re using it from the command-line, but not if they’re using apps that linked against the system SDK’s libpython2.7.dylib (and assumed PyObjC would be present)


Is there any significant software still using that?

Apple is pretty aggressive about breaking backwards compatibility, as they did with x86, so it's no real surprise for unmaintained software to become completely unusable.


Based on what they’ve done, I guess the answer is no.

My company was shipping (somewhat niche) software linking with the built-in python2 up until last year though, and it’s been a scramble since Monterey to rebuild things to avoid the warning message. Definitely thought we would have python2 until 13.0 though (and part of me wonders whether this is just a “brown-out” for the beta and 12.3 release will be back to normal)


Notably, however, Apple didn't drop x86 support in a minor update.


And kept the support present for I think a year and a half alongside a runtime warning message (and removed SDK support earlier than that)


> programming language interpreters, available to the end-user.

Yeah, fuck those end-users who might want to automate things with a script.


I think an end user that wants to automate things with a script can be expected to install their interpreters themselves or use an interpreter that builds on the OS's shell.

I am not too familiar with MacOS, but I imagine it comes with a bash/sh terminal? If so, then that is something that should be first choice for automation.

Everything else, be it python, perl, ruby or whatever should be installed by a user as needed.


What do you think “the OS’s shell” is?


And I think that's totally reasonable—as long as Apple makes the change in a major update, when users can expect (some) things to break and should be encouraged to make time to prepare.

Part of it is that it's just a mental burden to keep this kind of compatibility information in my brain. I have VMs set up of past macOS releases in case I need to run something old. I don't have VMs set up for earlier versions of a release, and I don't even know where I'd get an installer. Admittedly, in this case it would just be a matter of installing Python, but I don't necessarily know that as a user, or the software could need to be linked to Cocoa or some such.


That's what Windows is for.


Yep. Comparison is not good since Red Hat et al. focus a lot on server side.


Supporting stone age system administration tools is basically Red Hat's core business though. That doesn't really seem like a fair comparison.


Debian drops the support of Python 2 starting with Bullseye ([0] and the supported versions list below):

NOTE: Debian 11 (bullseye) has removed the "python" package and the '/usr/bin/python' symlink due to the deprecation of Python 2.

[0] https://wiki.debian.org/Python#Python_in_Debian


Python 2 is still available in Bullseye:

* https://packages.debian.org/bullseye/python2

There has been no dropping of support: if you want Python 2.7 you can install it.


They don't bundle Python, it has to be installed separately.

macOS is going the same way, except that Apple doesn't have a package manager so it has to be a manual installation.


This is false. Homebrew is no different from any other open source package manager.

There is no need for a manual installation.


>> except that Apple doesn't have a package manager

> This is false. Homebrew is no different from any other open source package manager.

Just to quote the literal slogan of homebrew:

>The Missing Package Manager for macOS (or Linux)

I think it is fair to say that Apple does in fact not have a package manager. Unless they do bundle homebrew with their OS nowadays?


You quote a marketing slogan. “Missing” in the sense that Apple doesn’t ship it.

> Apple doesn't have a package manager so it has to be a manual installation.

The quoted statement is clearly false. To maintain that python has to be installed by hand is incorrect.

Apple the corporation doesn’t have a package manager.

MacOS, the ecosystem does.

No different from Linux. There is no Linux package manager. There are numerous open source package managers for Linux.


> Apple the corporation doesn’t have a package manager.

> MacOS, the ecosystem does.

> No different from Linux. There is no Linux package manager. There are numerous open source package managers for Linux.

macOS the OS has a package manager in terms of a tool to install packages. It does not have the ability to find/download packages to install though.

In the sense of a package manager being able to download and install additional software, macOS has several choices available but none of them are included with the OS, or have the ability to update the OS.

There is arguably no single "Linux ecosystem". There are numerous Linux-based distributions, which are largely comparable to macOS as an OS. One of the very key differences between each family of distro's is the package manager the OS ships with by default and uses to update itself.

So no. macOS does not "have" a package manager that's comparable to those included by default with linux distributions, but there are several third party package managers available for it.


> or have the ability to update the OS

A red herring that is nothing to do with what we are discussing.

> It does not have the ability to find/download packages to install though.

Another false statement. Homebrew is just as capable of searching as any Linux package manager.


> Another false statement. Homebrew is just as capable of searching as any Linux package manager.

I wasn't talking about homebrew.

It helps if you understand the comment before you reply to it and make accusations.

Edit to add, because fuck the HN timeouts. Heaven forbid someone make 6 comments in two fucking hours:

Try reading the whole fucking comment again, with the points being made really hammered home for you:

> macOS the OS has a package manager in terms of a tool to install packages. It does not have the ability to find/download packages to install though.

This is about the built in macOS package manager, that the OS itself uses to install packages. It's invoked via `installer` on the command line, or Installer.app in the GUI. It does not expose the ability to find/fetch packages from remote URLs.

> In the sense of a package manager being able to download and install additional software, macOS has several choices available but none of them are included with the OS, or have the ability to update the OS.

This is referring to homebrew, macports, etc.

We certainly do see things differently, because only one of us is repeatedly calling the other a liar due to poor reading comprehension.


If you don’t include homebrew, as a package manager for MacOS, then you aren’t having a serious conversation.

This whole line of conversation is about whether it’s necessary to manually install python.

If you claim it is you are lying, because it can easily be installed via homebrew, and you are clearly aware of that.

If you have some need to claim that homebrew isn’t a MacOS package manager or that MacOS doesn’t “have” homebrew, because Apple doesn’t ship it. Go right ahead, but it’s a red herring.


MacOS the thing that comes preinstalled when you buy a computer from Apple doesn’t have a package manager. You have to install one yourself, manually.


By that logic MacOS doesn’t have any third party software at all.


No, it means that macOS and its third-party software aren't one and the same thing.

Is faker.js a macOS vulnerability because there were people using Macs who installed npm and then installed faker.js? This isn't a theoretical question—whenever you install something, you need to consider where it's coming from and whether they can be trusted.


MacOS has third party software.

MacOS has homebrew as an open source package manager.


It has a third party package manager, yes. I'm pretty sure your parent meant that there isn't a first party one.

On Debian, packages in the official repositories are considered to be literally part of the Debian project. The maintainers are considered Debian maintainers, and bugs go into the Debian bug tracker.

This matters due to the trust issues I alluded to above. It also means that on Debian, there is one definitive version of Python (or at least one per Python version). The Homebrew, MacPorts, and Python.org Python binaries are all slightly different, such that it's possible for software to work with one but not the other.
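As a quick illustration of the "slightly different Pythons" problem, a script can at least report which interpreter it actually ended up running under (nothing here is macOS-specific; the paths in the comment are just examples of what you might see):

```python
# Print which of the possibly-many installed Pythons is actually running.
# On a Mac this might be /usr/bin/python3 (Apple), a Homebrew build,
# a MacPorts build, or a python.org framework build.
import sys

print(sys.executable)          # full path of the running interpreter
print(sys.version.split()[0])  # e.g. "3.10.2"
print(sys.prefix)              # where its standard library lives
```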


No two Linux package managers are identical either, and not all Linux distributions work like Debian. Debian is a collection of volunteers. The choice to trust it is no different to the choice to trust Homebrew. Considering the package manager to be part of Debian doesn’t change this.

However none of this has any relevance to the claim in question, which is that python must be installed manually.

MacOS certainly has a package manager that can install and maintain python.


Except that most other open source package managers strive to be good at what they do.

Brew has a long history of treating any criticism of their truly woeful approach to security, dependency management, etc. with "well fuck you, we don't want to hear your opinion".


There were other better macOS package managers before it; macports was sort of even Apple sponsored.

Everyone switched to brew even though it was the most poorly designed because it came from the Rails hype era where everyone wanted all their tools to be written in Ruby and didn’t care about anything else.


I used and still use macports, but there were valid reasons to switch to homebrew: macports did not have precompiled packages, did not use system libraries, and machines were not as powerful as they are now.

This meant if you wanted to install, say, ImageMagick, you would spend one day compiling stuff.

Also contributing a brew recipe was (and probably still is) easier than contributing to macports, and brew casks are pretty convenient.

I deeply dislike some of the choices of homebrew, but I can understand why it was popular, and it wasn't just because of the ruby hype.


Yeah no kidding.


Bad form is coding against system binaries...


Just because someone has the resources to do something, it doesn't mean that doing so is the right {business, technical} decision. You're conflating the two.


{business, technical}

Very HN: Business || technical. No concession for "Customer service."


Customer service is still a business reason. It serves to keep your customers happy, and the calculus here is probably that there aren't enough customers to keep happy by maintaining python2.7 that would justify the costs and headache.

You can't please everyone


Python 2 EOL was 2 years ago. 20 months before Monterey was released.

Edit: And it hardly came out of the blue. Python 2.7 was 10 years old then and people had known they needed to get off it for years.


As a counter-point, my app was able to rely on the system Python 2.7 for FIFTEEN YEARS because it was an extremely stable component of the OS (and it allowed the app binary to be literally 10 megabytes).

Now? I honestly can’t think of a way to release without adding something like 100-200 MB to the app bundle, consisting of an entire Python installation. With a user-facing app it is not ideal to try to say something like “just download Python and tell me where it is”.


The risk of unpatched vulnerabilities between now and the next "major release" far exceeds the risk of inconveniencing anyone foolish enough to rely on the system Python 2.7 installation.

You assume that your system and way of working are the same as everyone else's on the planet. They are not.

I have two tools that I use monthly which require python 2.7. The people who made the tools never updated them to work with 3. For this reason, I have a work box that I cannot upgrade.

I know that Macs are aimed at end users, and not developers, but it really is getting harder and harder for me to use my Macs for the kind of work I do. Each major software update borks my virtual hosts. PHP has been removed, and because of code signing requirements, adding it back is so complicated it takes an entire day and filtering through a dozen half-baked online tutorials. No, the ones that say "Just use brew!" are not the answer unless you have a very simple job. I don't use Docker, but from what I read on HN, that's six miles of bad road, too. But as I understand the situation, that's Docker's fault, not Apple's.


The doctor says to the patient, "What seems to be the trouble?"

The patient says, "It hurts when I do this."

The doctor says, "Don't do that."


I've previously worked at a vendor using Python2. Someone brought up this concern. I looked up the number of security releases we had cut in the past due to Python2 vulnerabilities -- it was zero.


This is the reason they are referred to as 0-day exploits, and believe me, given enough time they will begin to appear.


We’re talking about a Python interpreter here. It already executes arbitrary input with the ability to write to files and make network connections.

What type of vulnerability with it just sitting there as an executable do you expect?


Doesn't need to be an RCE to be considered an exploit of the user's intentions. I can't fathom how people are still trying to use an EOL language or why they would even want to. Learn some new syntax, you'll live and the writing has been on the wall for like 20 years.


I don't think a whole lot of people are writing new software in Python 2, but I think a lot of people are still running old software via Python 2. Software they didn't write themselves and wouldn't know how to modify.

And that's not to say those users won't have to find something new eventually (or migrate the software to a contained environment). It just shouldn't happen in an automatically installed update.


Maybe? The timeline here is over a decade. It’s hard to imagine RCE without network facing services.


>The risk of unpatched vulnerabilities between now and the next "major release"

His point doesn't necessitate waiting. You could just classify it as a major release instead. That is not to say it should be a major release necessarily.


Right, Apple could have removed Python 2 last fall with the release of Monterey, and I would have been neither surprised nor unhappy. I was actually kind of surprised when it was still in Big Sur.


> I was actually kind of surprised when it was still in Big Sur.

Removing support in the middle of 2020 would have been very unpopular.


Then they could have done it in 2021 with the release of Monterey. Heck, Monterey was released before Omicron, so if we're concerned about COVID, that should have been less disruptive than doing it now.


This wouldn't have worked either. Lots of countries, where vaccines weren't widely available yet, were going through a huge Delta wave before and after WWDC.

Now that most places have vaccines, it probably doesn't make sense to wait for the next major release because this has already been long delayed.

Other factors could have also been at play - perhaps Apple Inc had a lot of internal tools on 2.X and has only recently gotten around to migrating all of those.


What server-oriented GNU/Linux distributions do is to keep patching those vulnerabilities themselves.

If they need to do it for a year, it's not that hard.


But is it worth it actually? I doubt that this MacOS version is used on servers…


Define server.

Public webserver? Unlikely for sure.

Office fileserver? Absolutely.

Build server? Sure.


If we go down that road, how often do you need Python 2 for a fileserver or build server?

Now, what is the percentage of that among all MacBooks on the latest macOS out there?


I think this mindset is dangerous. There's a really long tail of niche things people do on their computers, such that I suspect a majority of users have at least one niche need. Especially today, when Chromebooks and iPads are available for anyone who only needs basic web browsing.

If you stop paying attention to the niches, they add up to a lot of people.


You are right, but I would say that this is quite an extreme case. This is a programming language whose support ended 2 years ago. The only reason to use it is to avoid porting an old code base to a new one. There are some serious questions about whether that is really a good approach. And if it is, why do you need to use the pre-installed Python 2 instead of installing it yourself?


I expect that there are users, who are not themselves developers, who are using either apps or scripts which rely on the system-installed Python 2. They likely aren't aware that they're using Python, and they may not even know what Python is. It could be one line in one shell script block in one automator action that a coworker put together five years ago.

It's okay for that stuff to break. (I wish Apple would put more effort than they do into making it break less often, but that's neither here nor there.) But, it should break in a major release, when the user is prepared for things to break. It should not break in an update that was installed automatically overnight.


If we go down that road, how often do you need Python 2 for a fileserver or build server?

Monthly, for me. Or more often if, like last week, a new customer required it.


It's funny how people think old, robust software is suddenly going to spring a bunch of vulnerabilities while sitting there, static, unchanging. It's far more likely the bleeding edge python(s) are going to be getting new bugs.

Old software isn't inherently more risky.


You're leaving out a bunch of scenarios:

* One of the libraries that old code depends on has a vulnerability discovered. They provide a patch, but it's only for the current version rather than the one used 10 years ago, which means that it's not as simple as just rebuilding Python 2.

* A new attack on a cryptographic component, network protocol, etc. is developed and suddenly defaults previously considered safe are no longer considered safe. As a great example, how many old things broke when major sites and services like PyPI or GitHub deprecated SSLv3, TLS 1.0, etc.?

* One of the Python libraries you use has had a vulnerability which was only recently discovered. The maintainer quite reasonably does not provide unpaid support for free software which they updated years ago.

What all of those have in common is that the problem is less the vulnerability than the fact that you are likely to need significant amounts of unrelated work getting other components updated to the point where you can install the patch. Given how often vulnerabilities are discovered which have been present for many years, this is likely to happen for any sufficiently large code base.
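On the TLS point specifically: a quick, hedged way to check whether the interpreter you're on can still talk to services that dropped SSLv3 and early TLS (these `ssl` attributes exist on Python 3.7+; older interpreters linked against old OpenSSL builds were exactly what broke against PyPI and GitHub):

```python
# Inspect what the running interpreter's ssl module was built against.
import ssl

print(ssl.OPENSSL_VERSION)  # e.g. "OpenSSL 1.1.1..." or a LibreSSL string
print(ssl.HAS_TLSv1_2)      # False here is what made old pips unable to install
print(ssl.HAS_TLSv1_3)
```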


Of course it's not going to get new vulnerabilities, but that doesn't mean hackers won't find previously unknown vulnerabilities that have always been there. Looking through the CVE database, core python alone has had 15 CVE reports in 2020 and 2021, and for old pythons those will never be patched.

I'll never understand why people think they can write some software and it'll stay pristine forever. Even if it had no known bugs at some point in the past, the world is a harsh place and your program, its dependencies, its programming language or even its OS can be found to be subtly broken at any moment. Even if it is not outright buggy, the world will change around you until the interfaces you depend on are no longer there. (looking at you, software program at $WORK that only works with PS/2 keyboards but not USB keyboards)


In the old days, software was like math - it had immutable, provable qualities. In the last several decades software has gradually become more and more like biology, where the core mechanisms are shrouded in mystery and we rely on observation and experiments to understand. The Star Trek game in Basic I typed into a computer is eternally true, a platonic form in a way software now no longer even aspires to be.


I'm sure it is a fine program, don't get me wrong. But citing a BASIC program as something that is eternally true is exactly what I mean when talking about the world moving on. Does it even run on 64-bit operating systems? Will it keep working correctly after the unix timestamp goes beyond its signed 32 bit maximum and overflows? I'll also eat my hat if there is anything you can prove about it in the modern meaning of formal verification.
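The 32-bit rollover mentioned above is easy to pin down, for what it's worth:

```python
# The signed 32-bit time_t maximum and the moment it overflows
# (the "Y2038 problem").
from datetime import datetime, timezone

t_max = 2**31 - 1                                   # 2147483647 seconds
rollover = datetime.fromtimestamp(t_max, tz=timezone.utc)
print(rollover)                                     # 2038-01-19 03:14:07+00:00
```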

There is plenty of software being written to very high standards of quality (formally verified real-time control software for rockets/power plants/etc comes to mind), but most of that is not available for free on the internet (or for $5/month as a SAAS) because you get what you pay for.


NASA software is very good, but not in general formally verified, I think (based mostly on reading postmortems on various NASA failures).

Even TLA+ analysis and proof of correctness doesn't stop things like "new zookeeper clients are pushing a lot of new writes => disk space needed for logs increases => across all the nodes in the zookeeper cluster => the entire formally correct cluster keeps falling over now but it didn't last week"

I meant more in the sense of the proofs in TAOCP - these are programs that don't interact with anything outside of the program itself, no non-specified input, so a lot simpler, no dates, no actual machine memory, just a little synthetic world defined by the language itself. Even TeX is imagined to be unchanging after Knuth's passing, and is basically unchanged since the 70s.

For sure the world has moved on, but we have lost the ability to have confidence in the software and have had to develop tools and ways of thinking inspired by biology.


If it is dynamically linked (probably the case, as that’s the norm on Mac OS), it might even get new vulnerabilities.

The combination (old Python, new shared library) could have vulnerabilities that neither (old python, old shared library) nor (new python, new shared library) has.


Many of those CVEs will not impact your application code, though. This is just a side effect of Python having a huge and somewhat crusty standard library.


I think striving for software not to need maintenance when no new features have to be implemented is sound. I am very well aware that the complexity of anything but a trivial piece of software on a trivial piece of hardware is beyond human comprehension.

The general approach of most projects and vendors at the bottom of the stack, of not accepting full responsibility for their part, is striking. The amount of lying, faking, leaky abstractions, adapting-but-not-quite-100%, etc. is so immense that it is a wonder we get anything done at the higher levels of the stack at all.

I cannot comprehend how it is possible to hammer the RAM in such a way that you affect the charge of neighboring cells until the information interpreted from that charge is changed. This should never, ever be possible, and I haven't seen any disclaimer that it was possible on the package or in the data sheet of any RAM module. The same goes for changing the kind of magnetic recording in the same hard drive model without notification (a change in model name would be more honest).

And then there are "subtle" problems, like coil whine on mainboards or PSUs for no reason, even though we know how to fix this. These high-pitched sounds can give you headaches without you hearing much of anything, yet there is no indication of it on the package and almost no review tests for it. The whole Meltdown and Spectre class of bugs, with the major performance regressions that came with it, is just the cherry on top.

So how are you supposed to write reliable software on top of all this, when just a tight loop might change the state of the hardware in a way that becomes a problem? I mean, this is just crazy.


Vulnerabilities are going to come up, absolutely. Unless you believe that the older version is somehow 100% bug free and vulnerability free just by virtue of it being old. Being old doesn’t grant it any magic powers.

What’s old now is what was bleeding edge years ago - it hasn’t been updated (otherwise it wouldn’t be old), it has just gotten older.

And the problem with keeping old versions around is that when (not if) someone does discover a vulnerability, what are you going to do then?

If your answer is “we’ll patch it then”, then that’s not going to be an old version anymore, is it?

If your answer is “I guarantee it will 100% never have any vulnerabilities ever and so it will never need to be updated”, I don’t think we’re living in the same reality.


You leave out that even if software A itself doesn't change:

1. The OS and hardware under it can change and lead to new vulnerabilities.

2. Dependencies of Software A can change, which can lead to new vulnerabilities in Software A.

3. It's easier to find more and more vulnerabilities on a target that does no longer move. And once they are found there will be fewer experts to detect/fix/report it as time goes on.


There was some recent news about a vulnerability in pkexec that's been around for 12 years. Who knows what Python 2 exploits are still undiscovered in a similar manner? Since it's no longer actively supported, attackers have all the time they want to find those problems.


If you’ve made Python setuid, your security model is already garbage.


Exactly. This is why I refuse to upgrade from Windows 98


Why is this so downvoted? I really miss the days of python 2.7; I spent far less time debugging breaking version changes and more time on my actual work.

For security critical work, like web services, it makes more sense to constantly upgrade to the newest version. But for scientific work, which is a huge audience of python users, who often use obscure poorly supported libraries there is a lot of value in a stable foundation.


> I really missed the days of python 2.7, I spent far less time debugging breaking version changes and spent more time on my actual work.

This has been the Python 3 experience for years. If you had lots of previously ignored bugs in your Python 2 around string handling, it took longer but on a clean codebase it was often only a matter of minutes.
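For anyone who never hit it, the "string handling" class of bug boils down to Python 3 making bytes and text distinct types, so latent Python 2 confusion turns into an immediate TypeError (a minimal sketch):

```python
# Python 2 silently mixed byte strings and unicode, which masked
# encoding bugs; Python 3 makes the mix an error you have to fix.

data = "café".encode("utf-8")   # bytes

try:
    data + "!"                  # bytes + str: fine on 2, TypeError on 3
except TypeError:
    print("mixing bytes and str now fails loudly")

# The fix is to be explicit about the boundary:
print(data.decode("utf-8") + "!")
```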


And yet just the other day, with the Python 3.10 migration in Arch, stuff broke and I wasted a good day hunting the error down. The solution was installing Python 3.6, which should hopefully work for a year or two until THAT is left unsupported.


Kind of like how various Python 2 releases broke stuff and required time to fix? It's easy to forget that if it wasn't memorable but, for example, I remember the specific 2.7.9 release for that reason.


Agreed, Python 3 seems to have lost the plot.


Sibling post mentions the kext breakage as another example, but I feel like they've been slipping on this as they've had increasing difficulty matching major features to their unbending yearly release schedule. So more and more (and this is the case with macOS 12 also, see Universal Control) we've seen big headline features announced at WWDC keynotes fail to make the .0, instead trickling in over the .1 to .3 releases. Once that Rubicon was crossed, having other major/breaking changes start sliding into point releases doesn't seem too surprising, I guess, even if the cultural slippage is irritating.

Apple probably has to either give up strict semantic versioning for breaking changes, give up their strict release schedule, or be willing to defer announced features that don't make one major version all the way to the next. Something needs to give. Looks like they've decided to go with a "semantic-lite" where breaking changes and major features aren't the rule with point releases but can still happen in old enough/deprecated areas.

It does seem more needless in cases like this, though; granted, unlike the other examples it's not really clear why they'd do this now rather than at 12.0. Not as if Python 2.7 was a spring chicken last year either; support ended in 2020, and that was like 11 years after release already. Heck, why not 11.0?


What am I missing? When did Apple ever state they used semantic versioning in the first place?


It's not necessarily strict semantic versioning. However, Apple has always had a distinction between major and minor updates. Minor updates are installed on most Macs automatically, whereas major ones are not. Major updates have longer developer preview and public beta periods. Major releases continue to receive security patches for a couple of years.

If Apple isn't going to follow any types of rules, then what's even the point?


I kinda view them as using a modified version of semantic versioning where, if we take A.B.C, the normal roles of A and B are combined into B, and A is the big annual release with the media hype and advertising and shiniest features at the forefront.

Meaning A and B bumps could be breaking, but C is still just for bugfixes.

Sure it's not a great system from the technical side of things, but it shouldn't be shocking that Apple puts marketing first.


> It's implied in the fact that Apple has different types of updates in the first place.

As far as I understand, Semantic Versioning is not a synonym for "versioning scheme", but refers to a specific scheme (https://semver.org). So I don't understand how Apple having any versioning scheme implies that it must be semver.


I edited my post slightly. I never meant to imply that Apple follows strict semantic versioning—and in fact, they clearly don't, since they stayed on one major release for well over a decade.

It's okay if Apple's "minor releases" occasionally have breaking changes. But I don't think "wait until the next major release to remove an entire language runtime which has been built-in for 15+ years" is too much to ask.


Semver is a particular set of rules for defining what is a patch level, what is a minor update, and what is a major update. Before semver, these ideas still existed, but different groups had different ideas of what they meant.

That is perhaps still true in some large companies.


> it's not really clear why they'd do this now rather than 12.0. Not as if Python 2.7 was a spring chicken last year either, support ended in 2020 and that was like 11 years after release already? Heck, why not 11.0?

I don't think introducing breaking changes in 2020 (with the world turned upside down) for 11.0 would have been a good idea. The same reasoning applies to 12.0, many countries were going through a Delta wave without having much vaccine availability this past summer.

I think Apple has been very reasonable about this - they've been clear that 2.x is going away, they have been extremely accommodating of changing circumstances, and 2.x can still be installed manually after this change.


> What? In a point release?

See also the breaking change of 12.3 kernel extensions with Dropbox and Microsoft OneDrive:

* https://www.macrumors.com/2022/01/25/macos-12-3-cloud-storag...

* https://help.dropbox.com/installs-integrations/desktop/macos...


> I never expect point releases to break stuff

Apple doesn't claim to follow semantic versioning, which is where your expectation comes from.


Frankly, I think the fact that it breeds this sort of expectation is a knock against semantic versioning. See also the faker/colors dumpster fire.


There's no law saying you cannot install an old version of Python 2.7 manually is there? I assumed this was just removing it from being bundled.


Yeah, and it’s been years, if not decades, since you wanted to use the bundled software on MacOS for anything important. Removing Python 2 will probably fix more problems caused by multiple Python installs than it breaks. I tend to get a lot of support questions on “how to make this Python thing work on my Mac laptop” and I have seen some screwed-up things.


Of course.

Brew or Docker; no idea if they have a working release on the site, but that’s another possible way.
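One concrete route is pyenv (a sketch; assumes pyenv itself is already installed, e.g. via Homebrew, and that you want 2.7.18, the final 2.x release from April 2020):

```shell
# Rebuild a user-managed Python 2.7 after the OS copy is gone:
pyenv install 2.7.18    # compiles and installs the final Python 2 release
pyenv global 2.7.18     # or `pyenv shell` / per-project `pyenv local`
python --version        # Python 2.7.18
```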


It’s not semver, but I’m willing to bet (although I haven’t checked) that there were probably a few leftover system apps that hadn’t been moved to Python 3 before 12.0 (many of the features were delayed; I’m guessing their work schedule has been disrupted by the pandemic ever since Big Sur was feature-frozen).


But see, if that is the case... Apple is a large company with lots of resources and knowledge of their internal roadmap. If it took Apple's own internal developers until now to remove their dependence on Python, then third-party developers and their users should get at least until next fall anyway.

(I'm not defending the use of Python 2. I'm arguing that, as a general rule, if something takes the vendor X time, users should be given X + Y time.)


As far as I know:

Apple OS naming isn't semver. They really have two OS major releases a year, Spring and Fall trains.


It was deprecated back in 10.15, so this didn't come completely unexpectedly. Using it since then always carried the risk of it getting removed in a major or a point release.

https://developer.apple.com/documentation/macos-release-note...


Major release are already such a pain in terms of just how many dev ecosystem things break every time. Doing it in a minor release is really a blessing in disguise -- all you have to deal with is this.


To be fair, it's more annoying to have python default to 2.7; I rewrote everything in Python 3 years ago. So... yay, and good riddance.


Well, anyone who still has critical stuff on Python 2, really has nobody but themselves to blame.

Bad form? Maybe. Good riddance? Yes.


It's not really about Python itself.

Let's say I have a little utility shell script I wrote ten years ago. I basically haven't touched it ever since. Inside the script is one line of python for fetching the current time with more precision than `date`, or something, which I forgot about a long time ago.

When I install a major update, I test all my scripts to make sure they still work. I do not test after every point release, and I shouldn't have to!
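Concretely, the failure mode looks something like this (a hypothetical script of the kind described; `python` here is whatever the OS puts on $PATH, which is exactly the problem):

```shell
#!/bin/sh
# A utility script with one buried Python line: sub-second timestamps,
# which plain `date +%s` can't portably provide.
#
# The Python 2 spelling was often:
#     ts=$(python -c 'import time; print time.time()')
# which is a SyntaxError on Python 3 -- so the script breaks the day
# the system `python` changes out from under it.

# Python-3-compatible form, pinned to an explicit interpreter name:
ts=$(python3 -c 'import time; print("%.6f" % time.time())')
echo "timestamp: $ts"
```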


You should be testing early and often, not relying on arbitrary OS version numbers (Apple disagrees with you regarding the utility) to provide some kind of API stability or assurance.

This is why docker and other hermetically sealed packaging systems and languages (golang) work: by avoiding the OS implementations, which must change and update as quickly as possible.

The fact anyone would rely exclusively on macOS-level implementations and somehow expect them to remain stable and backwards compatible for any stretch of time blows my mind.


I have a few PHP ones. I just took the opportunity to rewrite them when it was removed.

The fact it worked so well for so many years is a testament that it’s needed, and that it should be rewritten. All software should be kept up to date.

The same way software might have security issues and fixed, it can have other issues that need fixing.


> I do not test after every point release, and I shouldn't have to!

It's not every point release - it's just this one, and it's because this change was (very reasonably) delayed from 2020.

2.X can still be installed manually, so you have that workaround.


macOS doesn’t follow SemVer. Major OS releases are often point versions.


It’s a mid-year release for Apple, even if marketing doesn’t indicate it in the version number.


macOS isn't quite semver unfortunately. The mid year releases are usually quite significant.

While I'm not a fan of removing Python 2 mid major cycle, it's not without precedent, and they've had popups since the first Monterey release anytime anything linked against the system Python 2


Until recently, point releases were major releases.


Apple doesn’t treat point releases that way. They have, for decades, been making breaking changes in late cycle point releases in macOS.


They should have removed it in Sierra, if not El Capitan, in my opinion.

