“Python's batteries are leaking” (pyfound.blogspot.com)
552 points by narimiran 7 months ago | 409 comments

> Brown called out the XML parser and tkinter in particular for making the standard library larger and harder to build, burdening all programmers for the sake of a few

Tkinter needs to go... There is very little reason, legacy aside, why it still needs to be there...

If Tkinter goes, then Python is dead to me. I use Python for small dep-free single file GUIs for small projects meant to be used by people who have Python installed but can't be bothered to go through an installation checklist. Tkinter is great for that and is the only reason I bother using Python for anything.

"pip install --user -r requirements.txt" is not a terribly odious "installation checklist", and I don't think it's the start of a slippery slope towards one either.

If you're worried about people not RTFM when using your projects, you could always start your scripts with the standard try/import/except wrapper around the required package, and tell them to run pip install when it's not found--or (and this is a terrible idea) run it in a subprocess for them.
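The wrapper the parent describes can be sketched roughly like this (the auto-install branch is the "terrible idea" variant; the `require` helper name is just an illustration, not a standard API):

```python
import importlib
import subprocess
import sys

def require(package, module_name=None):
    """Import module_name (defaults to package), falling back to a
    user-level pip install in a subprocess if it's missing."""
    name = module_name or package
    try:
        return importlib.import_module(name)
    except ImportError:
        print(f"{name} not found; installing with: pip install --user {package}")
        # The "terrible idea" variant: run pip for the user.
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--user", package])
        return importlib.import_module(name)

json = require("json")  # already in the stdlib, so no install happens
```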

Asking my users to use the command line is odious.

Pardon me, but why wouldn't you be able to just run an old Python if tkinter came out of the stdlib, or even just package it with whatever you are distributing?

The point was that it’s much easier to distribute a single file app than it is to “correctly” distribute a full app with dependencies bundled.

Could you use cython to accomplish this? I personally haven’t used it, but looking at some examples it looks really straightforward.

Tkinter is awesome, I hope it isn't jettisoned. I haven't found anything that easy to use with a similar Canvas.

Being able to add items to a canvas, attach callbacks to them - without tracking them yourself, setting up the infrastructure for indexes, bounding boxes, all that stuff - I haven't found anything else as easy as Tkinter's Canvas.

If there is something better, I'd love to hear about it.
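For reference, the pattern being praised looks roughly like this (a minimal sketch; the coordinates, colors, and tag names are arbitrary):

```python
import tkinter as tk

def canvas_demo():
    """Canvas items with per-item click callbacks: no manual
    hit-testing or bounding-box bookkeeping needed."""
    root = tk.Tk()
    canvas = tk.Canvas(root, width=200, height=120, bg="white")
    canvas.pack()

    # Each create_* call returns an item id; the canvas tracks its
    # position and bounding box internally.
    canvas.create_oval(20, 20, 80, 80, fill="steelblue", tags="clickable")
    canvas.create_rectangle(110, 20, 180, 80, fill="seagreen", tags="clickable")

    def on_click(event):
        # The canvas does the hit-testing for us.
        item = canvas.find_closest(event.x, event.y)[0]
        canvas.itemconfigure(item, fill="tomato")

    # One binding covers every item carrying the "clickable" tag.
    canvas.tag_bind("clickable", "<Button-1>", on_click)
    root.mainloop()
```

Calling `canvas_demo()` opens the window and starts the event loop.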

Come over to the PyCairo side of The Force! It will draw you in and fill your path with beautiful pixels.


You do have a good point about the Tk canvas's easy input handling, though, which is something Cairo doesn't do. Cairo is just an immediate-mode drawing API, like PostScript's rendering model or the HTML canvas 2D context.

You can pass a pycairo context into C++ Python extensions, and they can go to town with it quite efficiently.



I love Tkinter, but I could accept letting it be an installable package rather than built-in. I can certainly see its size and complexity being a burden on smaller installations. I think making "pip install tkinter" bulletproof would go a long way. You're not going to start using Tkinter unless you're reading a tutorial anyway.

There's always going to be a good debate about which batteries should be included. I'd almost put numpy ahead of Tkinter, but maybe that reflects my bias for scientific programming.

There is the expectation of IDLE, no? Isn't that built on tkinter?

This question suggests a criterion for which batteries should be included: platform independence. What both Tkinter and IDLE do is ensure that there is a basic GUI library, and a code editor, that can run on any platform.

Coming from Visual Basic and similar things (HyperCard, Turbo Pascal), being able to run a program with a GUI on both Windows and Linux with no code changes was nothing short of amazing to me. So maybe that's one way to draw the line with batteries: make it so a beginner can get started with Python with no regrets about which OS they're using. This requires some out-of-the-box hardware abstraction, where the keyboard, mouse, and display all effectively count as hardware.

But this alone can't justify tkinter being part of the STANDARD library.

Taking tkinter out of the standard library won't stop IDLE being built with it, if they chose to do so.

Though OS packaging may frequently obscure this because OS packagers break up the language distribution, IDLE is part of the language distribution, so (unless you expand the distribution to include core packages outside the stdlib, as with Ruby's gemification effort) it has to be built with the stdlib alone. Tkinter, therefore, needs to be part of the stdlib.

Taking IDLE out of the standard distribution and making a separate download link for it isn't in any way going to hurt Python at this point.

Just moving tkinter out, like what they did with the typing package, would be good enough to address everyone's needs.

Depends: do you consider IDLE part of the standard library? (It's separated out in many distributions but documented in the standard library documentation, so it's a bit mixed; it's probably okay to consider it not part of the library, but just the standard distribution.)

It is.

Tkinter is used by matplotlib's show() function. Without tkinter, you would somehow need to replace it with some other platform-independent GUI library.

If it weren't in the stdlib, could it not be installed as a dependency when you install matplotlib?

Why does it need to go?

It uses a non-trivial amount of resources for (nearly) every Python install. Therefore, if it isn't of substantial use, it should be removed to free those resources.

It does, but it's hardly an egregious amount (my 3.7 install on Win10 has it at 800 KB... though I guess Tcl takes up 10 MiB =P). For reference, the built-in venv module (which I never use because I'm all about that virtualenv life) is 1 MB. The Python tests are 20 MB, and those ship by default on every Python installation.

Like, if you were in a resource-constrained situation, you would be customizing your Python install anyway (if nothing else, getting rid of the docs, the tests, maybe stripping out all but the compiled code...).

Maybe just use KitDLL instead of a full-blown Tcl/Tk distro to provide the dependency; that would shrink the size a lot.


I clicked on this link thinking "uh oh, standard ill-informed rant post."

However, Brown has solid points.

The brilliant selling point of Python is the massive standard lib. If the quality of the libraries falls, then Python's use as a tool drops dramatically.

One of Node's massive failures is that it has no standard lib.

Coupled with weak infrastructure around third-party library selection. npms.io has a "quality" score (which npm pulled in recently), but that magic number includes things like download counts, whether the homepage is on a custom domain (!?), and whether the readme has badges (!!?). It doesn't include code quality metrics, whether the package or git repo uses GPG signing, or clear signs of abandonment, like N ignored PRs or N,000 ignored open issues.

I've been astounded how few non-trivial packages are actually in a consumable state, and how many seemingly-simple packages have N dependencies that pull in M more. By and large it's a zombie wasteland of cruft.

I wouldn't really trust only-crowdsourced ratings, but I think that might be a nice component for npms.io to include, perhaps. Stackoverflow answers, for example, seem to be directionally correct if you sort by upvote count.

> The debate at this point became personal for Van Rossum, and he left angrily.

Guido should stop acting like a child. Listening to people, hearing them out - even when it’s uncomfortable - is the mark of a good leader.

I tell new PMs “this is the best job in the world 90% of the time but the other 10% is eating shit with a smile”

He is not. He has been working on it for 25 years and he is just fed up with having to explain the same things again and again. He doesn't want to see the careful, long work he did dragged down to a standard he considers lower.

There is always a new person coming with a new idea. It's exhausting, because it's required to make the language evolve, but it's also a new opportunity to screw things up every time.

And if he had let most people get their way during the last two decades, Python would have ended up just meh.

Of course, every time a new debate starts, everybody thinks that this time, just this time, he is wrong and they are right. I did too. We are all part of it.

I get the reaction. There is a limit to what a person can take, and it's why he stepped down as BDFL.

But seeing your baby and your reputation at stake is hard.

I dunno, Python is the epitome of a "meh" language. Guido has held Python back with his antiquated "get off my lawn" attitude since he created it.

The language itself is inferior in expressiveness and performance to almost any other modern language. The only reason anyone uses it anymore is that the network effects of the libraries are very strong, especially in fields relating to ML and data science.

Those libs did not come out of nowhere, and the language did not rise from 1991 to today without inherent qualities that draw people to it.

Python didn't have any specialty like PHP, or an accidental monopoly like JS. It didn't come with a killer app like Ruby. It hasn't been made by a giant company like Go.

It's pretty much a self-made language.

I'm not sure if I'd agree. Python was pretty lucky to have numpy/pandas when "data science" started taking off a few years ago. It easily could have been another language.

Numpy arrived in Python because the language allowed mathematicians who were not programmers to get their job done, and then do things that were not math-related as well.

They don't have the desire to learn what a monad is, they don't want to type variables when exploring a badly formatted heterogeneous data dump, and they do want to be able to read the code of their intern, fresh out of school, once he leaves, without his having written any docs.

Pretty lucky to be an overnight success, 25 years in the making.

There are other languages that fit the same criteria and are much better: Scala, OCaml, Racket, or Haskell.

The inherent qualities that draw people to Python are that it’s so inexpressive and crippled that you can learn it in 30 minutes.

What you call crippled, some call sculpted. What you call inexpressive, some call readable.

You have a lot of contempt to offer. I'll leave you to it and get work done.

Agreed. Most people who like Python seem to never have had experience with functional languages.

Of course we do. We just happen to value things you don't. When you don't work with 10x programmers and experience the corporate world, you suddenly see purity in a very different light.

Most people don’t have much experience. Any more middlebrow dismissal to offer?

Isn't part of the issue mixing general purpose code that doesn't change very often (the kind that belongs in a standard library) with code that changes all the time (the kind that belongs on PyPI)?

I remember one of Go's core devs voicing some of the same concerns regarding the SMTP-library [0] a while back.


I may be terribly wrong about this, but I would think that in general, if someone makes an improvement to something you make, you would want to integrate that person and their improvement in some way. Treat it like any other update: mention it in the version notes and warn about compatibility for code using the previous version. I know that renovation can suck, but it's something we need to be doing. A comprehensive stdlib means that once you've got it, you don't need to worry about being able to access packages along the way: how are you going to download a package if you can't connect to the repository? How much can you trust a third-party dev vs. the core team?

If a package is really niche, it may not make sense to put in the integration work. But for a package that is used by a significant majority in a general application - why would you want to keep it separate if it is so much better?

I am ignoring human interaction here; there are probably dozens of answers to that question if you count personal motivations.

> How much can you trust a third-party dev vs the core team?

Seems like a false dilemma to me. The core team could still maintain "blessed" packages that don't ship with the default installation.

> why would you want to keep it separate if it is so much better?

This is addressed in the article, most of the 5th paragraph is dedicated to it.

I've wondered about standard libraries for a while now. What happens if you discover a security vulnerability in your stdlib? Presumably you'd have to bump the language version, deploy it, and beg users to upgrade. Except users don't upgrade stuff. Whereas if you version the standard library, every new project will get the newer version of it. Sure, there are space tradeoffs, though you could offer a manual linking option. But at the very least, the number of new projects with the vulnerability will be next to nil.

And what if the standard library just gets dated? Take Node for instance. The fs module has a whole bunch of outdated callback based functions. Sure, you can wrap them in promisify, but it sucks that we have these outdated functions stuck around forever.

There are definitely tradeoffs with package/dependency multiplication, but I don't think standard libraries are as clear-cut as people make them out to be.

You're only trading which thing needs updating. Either the bug is in the standard library and you need to update the standard library, or the bug is not in the standard library and you need to update not-the-standard-library. Either way, the maintainer of the library can't get everybody to upgrade.

> I've wondered about standard libraries for a while now. What happens if you discover a security vulnerability in your stdlib?

That depends on what you are writing.

For an application that is deployed stand-alone, you'll likely fat-package it with python and all the libraries. In case of a security issue, you create a new version of your application that bundles the fixed python.

For an application that is deployed on the system python (more typical on Linux), it's the system admin's task to update the system python.

I'm talking from a language maintainer perspective. From a user perspective, sure, you can upgrade. But that doesn't mean everybody will upgrade. In general, people don't like upgrading. That's why browsers all automatically update these days.

> What happens if you discover a security vulnerability in your stdlib?

As the user of the stdlib, it really depends on what the security issue is. For most users, if they can verify they're not using module X, then there's no problem and no rush for them to upgrade. In a language like Python that's very dynamic, you also might be able to download a hotfix .py file provided by the language maintainers (or any other party) as an alternative to upgrading, or as something for the interim while you wait for the upgrade to get released, backported and released for older major versions, and available for the deploy environment -- this might even take the form of an iptables rule, depending on the issue. In short my point is that there are alternatives besides bumping one release's release version and hoping people upgrade.
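A hotfix of that kind can be a plain .py file that monkeypatches the affected stdlib object at application startup. This is a hypothetical sketch: the "vulnerability" and the size limit are invented purely for illustration, not a real issue in http.cookies.

```python
# hotfix.py -- interim patch, imported once at application startup.
import http.cookies

_unpatched = http.cookies.SimpleCookie.load

def _hardened_load(self, rawdata):
    # Hypothetical mitigation: reject oversized cookie headers
    # before handing them to the stock parser.
    if isinstance(rawdata, str) and len(rawdata) > 8192:
        raise ValueError("cookie header too large")
    return _unpatched(self, rawdata)

http.cookies.SimpleCookie.load = _hardened_load
```

Normal callers are unaffected; only inputs that trip the new check behave differently.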

> Except, users don't upgrade stuff.

Actually users do. Not all users, but your sweeping statement isn't true of all users either. Experience and reading reports tell me that many if not most users actually upgrade, especially when there's a significant security issue and especially when upgrading is easy. 2.7.14 to 2.7.15 was a problem for no one.

For the users who don't upgrade, they're not likely to upgrade a third party / split off thing any more than the core language.

There's also another group of users worth mentioning: those who don't upgrade specific things because past upgrades have been a terrible experience. This is a reputation issue.

> And what if the standard library just gets dated? Take Node for instance. The fs module has a whole bunch of outdated callback based functions. Sure, you can wrap them in promisify, but it sucks that we have these outdated functions stuck around forever.

Sucks for whom? I've been out of the Node ecosystem for several years, is there a pfs module that uses promises (either built-in or as an external lib)? If so users can use that if they want. Does it suck for the maintainer? I don't see how if those functions are small and don't really need any maintenance.

What would suck is if all the existing programs that wrote some fs-using code a long time ago and haven't had to touch it now suddenly need to go update it to satisfy people's fashions.

You might be interested in this talk that makes the case to stop breaking your API especially if you're requiring more and providing less. https://www.youtube.com/watch?v=oyLBGkS5ICk

> Actually users do. Not all users, but your sweeping statement isn't true of all users either. Experience and reading reports tell me that many if not most users actually upgrade, especially when there's a significant security issue and especially when upgrading is easy. 2.7.14 to 2.7.15 was a problem for no one.

Sure, some people definitely upgrade. But there's a big difference between requiring someone to actively hear and heed your warning and having them automatically upgrade whenever they create a newer project. Or even just prompting them to upgrade like NPM does for package vulnerabilities.

> For the users who don't upgrade, they're not likely to upgrade a third party / split off thing any more than the core language.

That's not the point. If your standard library is a package, any new project will automatically download the latest version of the standard library. If I have a vulnerability in an npm package, I simply push a new version and people will automatically download that new version when they create a new project. If there is a vulnerability in the Python standard library, users will continue to have that vulnerability until they actively patch it.

> Sucks for whom? I've been out of the Node ecosystem for several years, is there a pfs module that uses promises (either built-in or as an external lib)? If so users can use that if they want. Does it suck for the maintainer? I don't see how if those functions are small and don't really need any maintenance.

Not a canonical one. They're starting to add a new promise based API, but now we have two different APIs doing the same thing. What if we come up with a newer way to do async? Do we add yet another API?

> What would suck is if all the existing programs that wrote some fs-using code a long time ago and haven't had to touch it now suddenly need to go update it to satisfy people's fashions.

That's a bit of a straw man. With packages, you can still support multiple versions. It's just that fs would no longer be tied to Node. You could make a newer fs package, fs 2.0 or whatever, that uses promises. The older one would still be supported, of course, but it wouldn't be stuck in Node.

Wouldn't new projects also be using the latest stable released version of the language (and hence standard lib) that's shipped with their distro or wherever they're getting it from, especially in the deployment environment? Or at least I think they'd be as likely to do so as getting the newest versions of libs from the lib repository...

Unless, and maybe this is a source of disagreement, it's still the standard in Node land to specify dependencies as "whatever the latest version is, I don't care"? That approach has teeth. After you've been bitten a few times you learn to say "no more" and version pin. There are other nice benefits to version pinning too (like reproducible builds). It's pretty common in Java land, even though some people still specify "version x or higher". It's a tradeoff.
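In Python land, the pinning style being described is exact versions in a requirements.txt (the package versions below are illustrative, not recommendations):

```text
# requirements.txt -- exact pins for reproducible installs
requests==2.22.0
urllib3==1.25.3      # pin transitive deps too
# vs. the "x or higher" style, which trades reproducibility for freshness:
# requests>=2.20
```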

One downside is that if you don't occasionally check maven central or wherever for new versions you might miss a security issue, and if you're copying dependency specifying files over to new projects the issue will persist. This is solvable with tooling -- http://www.mojohaus.org/versions-maven-plugin/ comes to mind and I was pleasantly surprised to get an email from Github that even a toy project of mine pointed to a version of a JSON parser with a known issue.

> What if we come up with a newer way to do async? Do we add yet another API?

Sure. Especially if people were just fine using the older API. For the fs situation, putting it in its own package would be great, up until the point I have to modify years-old code that's been working just fine through upgrades until someone decided to break things... for JS that's even more intolerable since nothing in the type system forbids the function updating to accept a promise or a callback. That (as would be a new API for async) would be a non-breaking change, it only provides more and the requirements for old code are the same.

If it's possible to break up the core in a way that preserves legacy code, I'm all for it. Java's Project Jigsaw did this to an extent with its module system. Another way to help that is to separate what's standard (such as with an official ANSI or ISO standard) for the language and what a particular implementation of the standard bundles with its releases on top of the standard. I recall at one point the old IO fork of Node promised to never break "core javascript APIs", but the industry consortia that defines what that means makes me think it was more a hollow promise.

To take an example from a language that has official standards, C: across decades of versions of C compilers and standard libraries from various vendors, if my code is C89 then it compiles with compilers that support C89. People might be offended that a function like strcpy exists, but it's there, I can use it if I want, or a separate library like bstring, or maybe the compiler provides its own __builtin__strcpy_chk that I can use directly or switch between with a flag (_FORTIFY_SOURCE). This might end up being less work for the compilers, too, since they don't need a security bulletin saying "some uses of strcpy can lead to a buffer overflow, removing it in v2.1.2, update immediately to strcpy2". They can (and did) just provide a new strcpy2, make a seamless upgrade available (I believe even Node has done this with some shims for deprecated APIs) if you don't want to change the source text, and warn people about the old one.

From a different perspective: servers usually have a maintenance cycle for updating system packages, during which the standard library could be updated. User-installed packages, on the other hand, have to be updated by the users themselves (and they can stick with an obsolete version if they want to, and can have per-project package versions); there is no convenient way for the user to update all the versions.

> there is no convenient way for the user to update all the versions.

Perhaps I'm misreading this, but isn't that the purpose of package managers? I can run an `npm install foo` and update it at will.

There is no such thing as "npm update" to update all packages, as far as I'm aware, in the Python world (at least as far as pip is concerned). Plus, user-deployed packages wouldn't be covered by updates performed by the system administrator.
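pip can be scripted into a rough equivalent, though. `pip list --outdated --format=json` is real pip behavior; the upgrade loop around it is just a sketch, with no handling of pinned or conflicting versions:

```python
import json
import subprocess
import sys

def upgrade_all_outdated():
    """Rough stand-in for `npm update`: list outdated distributions
    via pip's JSON output, then upgrade each one in turn."""
    out = subprocess.check_output(
        [sys.executable, "-m", "pip", "list", "--outdated", "--format=json"])
    for dist in json.loads(out):
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--upgrade", dist["name"]])
```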

IMO, Golang does the best job of maintaining a high quality standard library. I disagree that modules should be moved into the external package ecosystem. However, Go isn't preinstalled on most systems like Python.

I have to develop enterprise software that runs across a wide range of platforms, and being able to take advantage of the fact that Python is pre-installed on all of these systems with its standard library is a godsend.

Go programs are statically built as fat binaries. That basically solves the problem in itself. Python scripts rely on a (very) large shared constellation of things being present on the host system. Change one of these and you might break some stuff.

It’ll be interesting to see how Go (Rust, and other new languages) evolve and if they can avoid some level of package decay when they reach the age of Python, Java, etc.

Notably, a number of packages in the Go standard library have officially become "frozen":


I've been working with Go a lot lately and they seem really focused on not letting this happen. Everything in the language and standard library seems completely focused on minimalism and compile times. The standard lib is unlikely to change all that much, and people are not picking the language for a bunch of convenience features.

Third-party package problems will be an issue at some point, but that's more due to them being so focused on minimalism that they don't have clear guidance on setting up and maintaining packages.

Third-party packages are already a problem because a GitHub repo shouldn't be treated as a dependency source. Go modules solve some problems but still use git repos as the source.

The primary reason Go can get away with this strategy is that the Go community actively promotes fewer dependencies = better. So if you write Go, you often have to accept the fact that the second you add a third-party dependency, you're officially on your own if that dependency breaks or becomes unsupported.

This is not necessarily a bad thing. But in order to move software forward I still think we can do better than to push this responsibility to all individual end users.

This is one area where I feel most popular languages today still fail compared to CPAN. CPAN's value was not just packaging and distribution; it was an integrated test-report pipeline and infrastructure, active management and gatekeeping of library maintainers, CPAN mirroring functionality, and easy acceptance of bug reports and user feedback against a library.

I think this is mostly due to the 1.x guarantee that anything old will compile on anything new. I don’t know if Python or others formalized the same guarantee, but I don’t remember any issues 2.2 - 2.15.x (of course 2-3 is infamous). However 3.3+ (when I joined the py3 crew from py2) I don’t recall any either.

My experience with Go has been equally pleasant (though it does require more boilerplate for commonly accepted reasons in the community), but in no way unique.

> Golang does the best job of maintaining a high quality standard library.

Why only Go? Clojure does this well too. Clojure's standard library is incredibly consistent and stable. The documentation is pretty dry, though; that's the only complaint I can think of.

Some of her concerns do make sense, but using "a lot of our users still use Python 2.x" as a justification, in 2019, is ridiculous. Those same users have had years to adopt/change their code base.

Don't put all the blame on users. There's a ton of software that is Python 2-only, for example gyp.

I think it is important to notice here that a lot of the tension here is also Twisted vs. asyncio.

I've never liked asyncio, while Twisted felt natural to me. So I would agree that an inferior solution has been pushed heavily in the stdlib and also to the syntax level.

Moving the entire stdlib to PyPI is of course entirely foolish and would destroy Python.

I have mixed feelings about this because I've seen both parts of the same situation.

In certain situations, I've been working with a Python interpreter on a RHEL machine where pip was not installed (and I was not allowed to install it, or make other modifications: the machine was owned by the client and I had to work with what I had available).

- having some basic functionality in the core libraries was a godsend because I could work with that, even though it was not "ergonomic"

- not being ergonomic, it was a "poor experience" (and certainly not optimized or anything nice to see).

> Standard Library Modules Crowd Out Innovation

This heading is the essential problem in innovation writ large: some giant can ignore you and squash you without any effort at all, without even considering your existence.

There's an important tradeoff between library code that is (in theory) trusted and library code that is less trusted but has other advantages, like solving a problem better or being a solution to problems that the most trusted code can't solve.

Right now the equilibrium in this tug-of-war is that a certain set of functionality comes by default in the python standard library and everything else is just a package that you can install.

Obviously, from a dev point of view, it's a hassle to have to decide which of two or more packages for X is best; the pythonic way would be to have one and only one package for X. Of course, at the cutting edge there have to be competing packages, because there needs to be room for innovation.

But obviously not everything can be in the python standard library.

Not really sure what the solution is, but maybe there should be tiers of packages, with "Tier 1" being the standard library, and "Tier 2" having some kind of official stamp that it has been security audited to a certain standard, that Python has some control over who gets to modify it and why, etc. Then maybe "Tier 3" could cover everything else, i.e. any random Bob can go make a package on PyPI and it's Tier 3.

In addition, the process of going from Tier 3 to Tier 2 would give people a chance to winnow libraries down to one way of doing each thing at the Tier 2 level.

This might not be realistic but it's what my gut is telling me. C&C welcome.

Wow this is not the conduct I expect from a language creator. I don’t care if you’re Albert Einstein. Humility and being able to take criticism is far more admirable to me.

He asked her twice to be more specific about what she was arguing, which seems fair after a long tirade of shitting on anything and everything non-optimal about the stdlib without any specific point. And then he left during Q&A, which is totally fair: he's his own person and it wasn't his Q&A. People are blowing this out of proportion.

We weren't there. He might've needed to be someplace else.

People leave all the time during talks, and I've been to talks where most people left as soon as Q&A started.

I think he was probably frustrated by Python 2 support. If Twisted has 50% of its userbase on Python 2 that won't upgrade, then requiring more volunteer time for Python 2 just handcuffs language development on Python 3. Python 2 has been deprecated for a long time, and if people still want free Python 2 support, I think it's just unfair to ask the language contributors to spend their already constrained free time on Python 2.

What "conduct"? It's a simple disagreement, not some big dramatic fight.

> Brown went further adding that because few Python core developers are also major library maintainers, library authors’ complaints are devalued or ignored.

PHP has a similar issue to this. The people writing C were not using the language. The best example is PDO. A lot of C was written, but it was essentially abandonware because the PHP users could not make any changes without getting the C maintainers to both agree and have the time.

A lot of this is a side effect of the Python 2 vs 3 schism, imo. If it weren't for that situation, practically everyone would be on 3.x, and supporting older versions wouldn't be important, so package maintainership wouldn't be as difficult.

Put another way, the whole Python universe from my point of view has become a cautionary tale about breaking changes. Given Python's popularity, this might be an unpopular opinion, but I have yet to find someone who loves Python who still loves it as much when they discover other newer languages (I'm sure you exist, I just haven't met you!).

Python is having its time in the sun really because it is a default install for most unix distributions, so even people stuck in government labs can use it because 2.7 is already installed. Even apt depends on it via the debian software-properties package, so it isn't going anywhere any time soon.

The real question is how many people would use Python if it was as little known as, say, Elixir.

Yep, Python is just a plain, boring interpreted language from the 80s that was designed as a reaction to Perl's syntax, and is unfortunate enough to contain a lot of dynamic properties that designing a JIT compiler for it is a massive amount of work. It can be seen in context with interpreted languages during that time such as Perl, Tcl, awk, sed, etc. It became popular because it was baby's first language taught in universities to both CS and non-CS majors. Python is simply not competitive in terms of language features with modern programming languages. I tend to assume most of the people praising Python are amateur or new programmers.

The stdlib acts as the foundation upon which a lot of the third-party stuff is built; it's a feature to have it move slowly and not break often.

What's the point of saying that a standard library feature was not added soon enough? Nobody can go back in time and add it earlier. I can complain to death about features missing from C++11 or I can start using C++14.

I think the point is that if those things in stdlib were external packages then older versions of the language would be easier to support because you could just update the package. I think the way it was phrased made it really easy to misunderstand.

I think this goes to her point of Twisted wanting to support really old versions of Python and they would be a lot more comfortable not supporting really old versions of packages.

So they can keep python 2 on life support longer? That is not a good idea.

If that happens regularly, it makes stdlib less useful for other libraries to build upon, forcing them into (sometimes awkward) workarounds, and is a sign of the stdlib not getting enough maintenance.

I got lost on the "we have to back port to python 2.7" argument.

Force the upgrade already. Code still on 2.7 if still useful can be upgraded.

I think the best model i have seen for a lean stdlib has been that of Golang. You have the standard lib and then you have the packages under Golang.org/x/ which are experimental packages that sometimes end up being merged into the core language. The stuff which is not in the stdlib (TOML, yaml etc.) have been supported very well by community packages.

Good points; some things in the standard library are just painful.

My "favourite" library quirk: socket.fromfd was only available on "Unix" in Python 2.x; that was fixed in Python 3.x.
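For the curious, here's a tiny sketch of what the call does (POSIX assumed):

```python
import socket

# Rebuild a socket object from an existing file descriptor. (On Python 2
# fromfd was Unix-only; Python 3 fixed that, per the quirk above.)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
dup = socket.fromfd(s.fileno(), socket.AF_INET, socket.SOCK_STREAM)
fds_differ = dup.fileno() != s.fileno()  # fromfd dup()s the descriptor
print(dup.family == socket.AF_INET, fds_differ)  # True True
s.close()
dup.close()
```

Handy when a descriptor is handed to you by another process (e.g. via socket activation) and you want a real socket object back.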

The worst offender is the logging library. It's the least Pythonic thing in the whole standard library (OK, maybe ABC is worse, but oh well).
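For context, the complaint is mostly about the camelCase API inherited from the log4j design the module was modelled on; a minimal self-contained example of the style:

```python
import io
import logging

# getLogger, StreamHandler, setFormatter, setLevel: the camelCase names
# come from logging's log4j heritage, which is the "unpythonic" part.
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(logging.Formatter("%(levelname)s:%(name)s:%(message)s"))
log = logging.getLogger("demo")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("hello")
print(buf.getvalue().strip())  # INFO:demo:hello
```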

Most of this post seems to revolve around the idea of py2 having an outdated standard library.

https://pythonclock.org/ has Py2 reaching EOL in 7 months. Realistically I'd say the time passed years ago.

py3 is over ten years old now. We're not talking about some new unstable piece of kit, I'd imagine that a large percentage, perhaps even 50%, of the HN audience started their career after the transition had already started.

You read that wrong. Most of the points apply to Python 2 and 3 equally, or even only to Python 3. (Obviously with the exception of the point about Python 2 not having gotten some useful bits, but the pattern continues.)

Why, time and time again, does Guido seem incapable of reasonable debate, or ideas that challenge his own? It's completely rude to interrupt a presenter with 'what is your point?'

Years ago I was in contact with the author of Nuitka, who was very excited to share his work thus far. During his presentation, Guido kept huffing and making snide comments under his breath. All because he disagrees with the premise behind Nuitka.

I like Python, and can appreciate his work and contributions. That said, I can't help think the community can become less toxic without him as BDFL.

Guido resigned as BDFL in July of 2018[0].

[0] https://mail.python.org/pipermail/python-committers/2018-Jul...

Apologies if it wasn't clear, but that's what I was alluding to. A lot of folks, and to an extent myself, feel that it will become a worse, design by committee language. I was just trying to point out the upside.

Maybe he’s simply burnt out? The pressure of being in charge of something as big as Python must be intense. People make demands on his time and expect the “benevolent dictator” to cunningly solve everything like a modern-day Solomon.

I know he gave up the title, but as he personifies Python, I imagine the influx of requests for his attention may not have subsided much.

Nobody required him to be in charge of Python. Nobody required him to personify Python. Plenty of languages have succeeded without their creators handling every little detail, from C to JavaScript to PHP to Java.

A core skill of any open-source project maintainer is recognizing when you're burnt out enough that your continued presence isn't helping the community and is just getting in the way of others, and stepping aside. If you can't do this, you're not a good maintainer. I'm happy to have sympathy for your personal problems, but you're still failing at your job. Here, for instance, is a short but effective way to do it: https://mail.mozilla.org/pipermail/rust-dev/2013-August/0054... (That was almost two years before Rust reached 1.0, and the language hasn't suffered one bit for it.)

And part of Amber's point is that core Python development is taking on too many burdens for no actual benefit. By bundling so much in the standard library and expecting the standard library to be usable for real work without installing additional packages, the core team (and Guido in particular) increases their own workload, impedes the ability of others to contribute, and produces worse results for end users than they would otherwise.

> Plenty of languages have succeeded without their creators handling every little detail, from C to JavaScript to PHP to Java.

Sure, if your definition of success is being used by as many people as possible, but there are other (more) important criteria for asserting the quality of a language.

When you have a vision for your project, you might be afraid that other people are going to ruin it, because they don't understand, or they don't have good taste, etc.

In that case the right thing to do is to make it clear that you're not prioritizing number of users, you're not here to solve problems for users other than yourself at least in the short term (because you're interested in pursuing your vision, not implementing feature requests from people without good taste), and that you're not trying to take everyone's feedback into account. Then people stop expecting you to do more than you can, and you can slowly and quietly pursue your vision and perhaps find a small number of collaborators who share your taste.

Clojure sort of had this moment recently, where various people in the community were unhappy with the language direction and Rich Hickey (who is a person who cares about vision and taste in a language) made it quite clear that addressing their problems directly wasn't his definition of success. He was building a language for himself / his company to use, and if it worked for other people, great, and if they wanted to contribute to his vision, great, but if it didn't work for them, they should not expect Clojure to change. https://gist.github.com/richhickey/1563cddea1002958f96e7ba95... That, at least, is clear, and it means that authors of significant third-party libraries that are clashing with the vision of the language (and its community) can make an informed decision to spend time elsewhere, avoiding frustration on all sides. https://twitter.com/cemerick/status/1067111260611850240

I don't think Guido is/was actually trying to do this, and I think it's unfair to say that was his goal. If it was, then he was deliberately tricking people by having a core team, BDFL-Delegates, a language summit, etc. If it was, then he was being rude by asking her to come to the language summit instead of saying "Amber, Twisted is very good but your vision for Python is not my vision for Python." Guido, as far as I can tell, built Python to be a widely-used language, not a language following any sort of vision he started with. Guido does want Twisted and other Twisted-scale projects around, and does want Python to be a useful language for them. That's why I say that if he's burned out, the right way to execute his vision (which is exactly what he's doing, in fact) is to step aside graciously.

I think the tone of your post implies that writing languages in some quasi-democratic (rather than oligarchic/dictatorial/elitist) way is or should be the default. In fact, that is not the default (I can't think of any languages that are truly democratic; certainly most big ones aren't). Nor should it work like that, IMO; we don't design power plants by referendum. We put experts in a room, probably with a dictatorial decision maker, and everyone else can whistle (or at least would really have to shout loud and be clearly right to be heard).

Also, if you're going for max language usage through design, how do you do that? By trying to perfect your design, so it's kind of an irrelevant motivator. Rich got a ton of Clojure users (relative to his resources), and he did that by having strong (reasoned) beliefs, not by opening the floor to a vote. Rich did /not/ say that he was building Clojure only for his company; that's a very misleading statement that you've made.

On the flip side, it must be a dream to have an enormous team of hundreds, a bug tracker where new issues are triaged by eager volunteers within minutes and expertly resolved usually without your needing to be involved, a whole organization that puts on conferences for you, sysadmins creating and maintaining your entire development, testing, and release infrastructure, and big companies supporting you with substantial donations. Just saying: Guido has a spectacular organization of hundreds of people, which must be really nice to have. That you aren't directly responsible for every single detail helps a lot.

Was he like that before? Some say he was always balanced, even when Python 3 came out.

let's hope everything settles smoothly :)

My only experiences with Guido when he was a Googler were incredibly unpleasant.

(This was years ago, and i don't have the email exchange anymore, so i'm doing my best to describe it from memory)

I reported a bug I had debugged pretty heavily and believed was likely a bug in the appengine datastore (this was before it was publicly available, IIRC), with a fairly detailed repro recipe, etc.

(I was not on the appengine team, just building an app)

I had debugged all the client side code all the way down to the rpc to the datastore server and was positive there was nothing weird going on there at all.

Within a minute or two of me sending the email to the email alias, he replied with "This must be a bug in your code, that can't happen." He didn't even look at it (I looked at the logs).

I replied with "I agree it should be impossible, but if you could look at it for a second, I think you'll see that it's not, and it is actually happening. The code is very simple," etc.

He says "I don't have time to fix your code".

So I spent the time and reduced it to a simple, 20-line piece of completely obvious code (IIRC it had no real code except to instantiate the class and store it in the datastore) with no dependencies,

and said, "Here, I took the time to make it as clear and obvious as possible that this is really not a bug in my code, because there is no real code here."

He replied with something else abrasive and dismissive.

Then, about 20 minutes later, one of his teammates replies with basically, "oh shit, this is bad".

(Because what I had discovered turned out to have caused data loss that they couldn't automatically fix. They could get the data back from restores, but it wasn't clear what to do with it; you needed intervention from a user.)

Honestly, it would have been better for him to not respond at all. (he was not the on-call team member at the time anyway)

We all have our bad days/times, of course (I definitely did!), so I do hope he's found more inner peace than he had back then. But yeah.

Whether you gave him the most succinct demonstration up front or not, it doesn't excuse Guido's behavior and dismissiveness towards you at all. Guido should have found a way to be patient with you and ask for additional information if he felt he didn't have a clear enough explanation, and should have taken the time to prove his hunch (which turned out to be wrong) that the problem was on your end.

I did observe that you first provided a "fairly detailed repro recipe", and then after he rebuffed you, you provided a "simple, 20 line piece of obvious code". What Guido should have done was ask you to make the extra effort to provide the latter case if the "recipe" you provided was in fact too much effort for him to look at, for something that didn't seem to be a bug to him. That is, he wanted you to "spend the time" as you said, and if he had more experience with this kind of thing perhaps he would have known to just ask for it rather than dismissing the whole thing.

What happens to me a lot, at least, is I get bug reports that are like "here, just unzip this 10M attachment, install these libraries and data files, and then watch the log output for the thing I spent five paragraphs not really describing." These are not coworkers or customers of mine, for whom I might be obligated to go through all those steps; they are regular users who have downloaded my software for free (as they should), likely saving their company thousands or even millions of person-hours of work by doing so. The thing I ask of these people in return is that they A. report issues to me and B. do as much work as they can to help me fix the problem. I'm not a concierge; the help process is part of the give and take of how open source software is supposed to work between the parties.

I encourage these people to please pass along an MCVE, i.e. the most succinct demonstration script possible, and quite often, when I get the sense that they don't really have the experience to know what I'm looking for (despite my sending them the link explaining what an MCVE is), I will read their verbal description, then write my own MCVE in about one minute that shows that what they are asserting is not true, and paste that into the issue; I have a Python fragment script that I use as a starting point for writing 90% of these test cases. I then ask them to please modify the MCVE to show the thing they are actually trying to do. That's how I get them to send me a succinct problem description that isn't a huge waste of my time.

I understand this is likely not at all what happened in your case as you were both at Google and I'm sure folks there are more sophisticated than this. I just had the thought based on how you had sent two versions of the issue.

I don't disagree with your assessment at all. There is never a reason to act like that, and I didn't take it as "the normal course of business."

FWIW: the detailed repro recipe was hermetic and single-binary (and guaranteed to work on his machine for various reasons), it just had more code than strictly necessary (It was still <500 lines, it was a very simple webapp).

The actual recipe was closer to "run this binary, click on new record button, click save, observe results in datastore".

I actually reduced it not because the recipe was too detailed, but to remove the argument that it was my code.

Your experience was unfortunate, but it sounds like ultimately it just wasn't the right job for him. Which happens to everyone at some point or another.

It's safe to say that he did an outstanding job for the first 27 years.

But everyone has their limits.

Has DHH, Matz, Ryan Dahl, the Allaires, Larry Wall, or anyone else who founded a language or related-project been known for behaving like this?

Ryan Dahl didn't invent a new language, just a pragmatic async server runtime based on CommonJS and inspired by Ruby/Rack. He walked away just about two years after having launched node.js to focus on what he studied in the first place (math, ML), annoyed by his celebrity status [1].

[1]: https://mappingthejourney.com/single-post/2017/08/31/episode...

that's what I meant by "a language or related-project"

Once he chose to step away, he truly stepped away, which is a good model for any open source maintainer.

Linus Torvalds has been known to, yes! He even took a little break to deal with it.


How is that an excuse for being uncivil? If you're burnt out, then don't attend the talk. Simple.

From my own past experience with depression, it’s not as simple as “I’ll just stop doing all the things that have kept me in motion until now.”

This is not meant to speculate on van Rossum’s health, just trying to point out that there are situations where apparent rudeness may be out of the person’s immediate control.

If you're sufficiently in control of your actions to keep developing a major programming language, you're sufficiently in control of your actions to either not be rude while doing so or be aware that you're going to be rude and need to find a way of dealing with the situation (avoiding it, taking extra time before speaking, working with a therapist, whatever). Mental health issues don't excuse bad behavior.

(Also, personally, if I were behaving poorly I would much prefer people to say "hm, 'geofft is being rude but I know he's a better person than that and can improve" and not "hm, 'geofft is being rude, that's just the way he is, I wonder if he's got mental health problems.")

> Mental health issues don't excuse bad behavior.

Basically agreed.

> If you're sufficiently in control of your actions to keep developing a major programming language, you're sufficiently in control of your actions to [...] not be rude while doing so [...]

That's a lovely thought, but no. When I've had bad times, my social skills and emotional cope would sometimes go to shit and leave my technical skills intact.

You can say the same thing about anyone doing anything. It's just giving excuses for bad behavior.

Yes. It is an excuse, even for poor behavior.

There's galaxies of distance between "this is acceptable" and "this is understandable".

People are screwed up. You are, too. I hope your business partners and allies don't hold you to the standard you're bearing when your time to behave badly comes.

It seems that programming language communities with a more "social" aspect also tend to lead to more associated drama like this.

Seems like there might be some other confounding factors here:

- A language that is simply less active or whose users feel less ability to have a say in the language's development is going to be both less social and less dramatic, but also carries a high chance that existing users are only there for legacy reasons and are always considering moving to another language better suited for their purposes.

- A language run by a corporation is less "social," and has the drama play out in business negotiations, closed-door committee meetings, and lawsuits instead of on blogs and public mailing lists and (in this case) closed-door meetings with an expectation of public reporting.

I'm not aware of any drama in the ASP community, but that probably also had something to do with many users having picked Python for their next project instead of ASP. I'm aware of some drama in the Clojure community, the conclusion of which seems to have been that Clojure is their corporate sponsor's language and if it doesn't work for you you should find something else.

It's one of the things I really like about the R community. It's incredibly friendly and without egos, and has lots of people who actively work to make the community as safe and welcoming for everybody. I can't say I've ever heard about drama like this from there.

R is one of the few languages that I actually use a little bit, yet know absolutely nothing of the community and history. I mean I could tell you all about a dozen niche languages from Lisp to Haskell, but nothing of this masterpiece. Time to go read up on R :).

Any chance you could give a short little blurb for how the community is organized and works? I'm curious how different it is to Python. I also sometimes worry about the future of R where you have Python-Pandas coupled with Spyder IDE and Julia-DataFrame library with Atom-Juno as IDE as well as JuliaDB. It seems like a lot of communities are moving in on what R does best.

Reading the article, she seemed to handle it well, and he stormed off.

What’s your point? Because it is easy to criticize an important work. Not so easy to help improve it or leave it be due to historical and compatibility concerns.

It may not make complete sense, unless you are Dutch.

Yeah, it's disappointing, he seems like a bit of a dick TBH.

(Though in nothing on the level of some in open source).

At this point my MVP version of python is Anaconda-latest.

For me it is Miniconda + a list of package requirements. Then you can just create an environment for each project and install new dependencies as you go.

It seems weird to me that they specifically call out only Guido's behavior. I feel like there was probably a lot of context lost in "he left angrily," and it's a bit unfair, bordering on disingenuous.

It's never too late to rewrite Python.

I like Python; it's a nice, simple language that you can use to pump out a proof of concept real quick with little hassle, but for full-on production I avoid it. I think this is part of a larger trend in programming: in my opinion, the majority of programmers are super lazy. Everyone is in a mad dash to get the cool new thing out, so they just slap a bunch of dependencies on it and damn the consequences of technical debt down the road. More people would rather roll with an MVP as the final product than build something from scratch that's more robust, resilient, and efficient. Then after a while you have this huge mess of old broken code that can't be fixed anymore and just needs to be redone from scratch.

Sure, you are going to need to redo code from scratch eventually. But programmers like the ones I mentioned, which is surprisingly a huge chunk, make problems worse for themselves throughout the lifetime of the code by being short-sighted and stamping their approval on code they're too lazy to rewrite, because their boss doesn't know any better.

It's important to use dataclasses/attrs, pytest, mypy, pylint. Those tools transform python from an ad hoc language to a robust one.
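A minimal sketch of the dataclasses-plus-type-hints pattern (the class and names here are illustrative; mypy would check the annotations statically, and pytest would run asserts like the one at the bottom):

```python
from dataclasses import dataclass

# @dataclass generates __init__, __repr__, and __eq__ from the annotated
# fields, and the type hints give mypy concrete structure to verify.
@dataclass(frozen=True)
class Point:
    x: float
    y: float

    def scaled(self, factor: float) -> "Point":
        return Point(self.x * factor, self.y * factor)

p = Point(1.0, 2.0).scaled(3.0)
print(p)  # Point(x=3.0, y=6.0)
assert p == Point(3.0, 6.0)  # __eq__ comes for free
```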

For speed, pypy is the way to go.

I think there’s another common reason to ship the buggy mvp: you launch, get promoted, and find a new job. All the glory and none of the maintenance.

I may have had to clean up in the aftermath a couple of times.

> More people would rather roll with an MVP as the final product than build something from scratch that's more robust, resilient, and efficient.

Well, yeah, of course. It's expensive to reinvent the wheel.

It's more expensive to fix broken code and to work around other workarounds that should not be in production. That slows you down, bloats your code, makes it harder for new employees to learn the system when they come on board, etc.

Is the quality of the dependencies you use really so bad that it'd take you more time to fix them than to write your own code and then fix that?

Like I can see where I'd make that tradeoff, but it'd have to be a small function in a really niche use case, which isn't really that common. (I work on the JVM though and don't know how good/bad Python libraries would be in general.)

Well, it matters when you roll it out at scale. I suppose client-side you don't have to worry too much; I think it's still a bit lazy, but it doesn't have much impact on small projects. But, as an example, at the last company I worked at we built a neural network engine from scratch because we were just not happy with the available frameworks, their heavy dependency usage, and their very specific requirements for OS and language versions. In our tests while I was there, we got a 3x speedup over the fastest framework we tested, which was TensorFlow. A lot of the speedup came from just taking good programming practices into account and knowing the system inside and out, including its potential weak points. That's how much the dependencies were killing the speed. Since I've left, that's one of their main things now: making high-quality machine learning modules that get the best bang for the buck.

So yeah, I'd say the dependencies are pretty bad. It's a process outside of your control, and you kind of just have to deal with whatever is under the hood. More importantly, what's under the hood was likely meant to be general-purpose, and there is more than likely a better way to do it for your particular situation. And as she mentions, a lot of the bugs are there indefinitely, so you just have to keep permanent workarounds, which is never good.

This is in no way specific to Python, and I think happens eventually in most languages. Only the heavier formal systems remain manageable.

Oh definitely. I think though, for whatever reason, python has quite a bit more of it than most. I think that's probably partly because of how fast it has been adopted and rolled out.

Python was one of the first languages I learned.

Recently started using Go a little, and it does feel better but it just takes longer for me to write (probably because it's newer to me too). At work, people want things quickly. I'd love to be able to spend my time writing things properly in Go, but it's just not as quick and easy as Python is.

This is entirely anecdotal, but my experience has been that many people espouse the speed of Python development when in fact they put out sloppy, non-production ready code that needs a ton of extra review, effort, and sprints to get up to snuff. I often find myself producing higher quality, production ready Python code than those who cling to it like a safety blanket and complain non-stop about how "slow" languages like C#, Go, and TypeScript make them.

On top of that, I believe that I am able to produce production-ready code in other languages even faster than in Python, particularly when it comes to refactors for exploratory architecture in an early POC/MVP; I love the tooling assists, so I'm not drowning in "is not a function" errors and other dumb stuff while I'm hammering out interfaces, abstractions, and other structure.

Granted, I have almost certainly not worked with the best Python programmers, or the best programmers that happen to only use Python to put it another way. But looking at Python itself, the languishing ecosystem and stdlib, and every large OSS Python project, what does that even look like?!

So, I would say that yes it has a lot to do with familiarity and comfort... I'm very suspicious of people who insist on using Python and have little familiarity with anything else.

Sounds like you are blaming Python because it is easy to use, therefore enables poor programmers to work.

I’d rather have the former and risk the latter than the alternative.

"Laziness" is a terrible explanation for broad system characteristics, such as the fact that many software people end up building messy MVPs.

If an explanation boosts your ego, there is a decent chance it is flawed or incomplete.

You can have a messy MVP; I always start with some half-baked MVP. But if you roll with that as your product, you're shooting yourself in the foot a million times down the line. The MVP should be the discovery phase, and then, after you know and understand the problem, you should build the thing from scratch. It saves you a bunch of pain down the road. If it took you one or two months to build the MVP, it will take you half that time or less to recreate it from scratch. You understand the problem completely, and most of it will be you just retyping from memory, say in data-oriented design rather than object-oriented. So really there is not too much effort required after you have defined the problem and solution. So yeah, it has nothing to do with ego; it's just a reality of working on projects other people's companies will be relying on.

If a programmer is lazy, forcing him to use Java or C++ isn't going to fix it. And I've seen good developers ship plenty of quality code in Python or Ruby.

That's not what I'm saying. What I'm saying is that if there is a known bug in a Python library, don't use it; code it from scratch, especially if your job is to roll it out at a large scale. You could save your company tons of money just by taking the time to do it right. You can ship Python and Ruby all you want; that's fine. But if you are using buggy or broken libraries because you don't want to code one up yourself, then that is in fact not quality, and is indeed lazy.

Me personally, for my company, I would not hire any contractor writing code for me in Python. Not because Python is inherently bad or unoptimized in all cases, but because I know the general tendency is to use lots of dependencies. Python's motto above all is easy over hard, and that has really spread through the community in a not-so-good way. Like I said, though, I use it, I like it, it is awesome for banging out prototypes, but I avoid it for production. There are just too many potholes, to the point that I would rather code something from scratch closer to the hardware, which I will admit I do not enjoy xD, but if it's what needs to be done, then so be it. There are just a lot of things that I work on where even 5% efficiency could save you thousands.

This is a strange grab bag of ideas. You take Python’s strengths at prototyping and wide support and make them sound like weaknesses. Right tool for the job has been a saying for a while.

Why is it so difficult to admit Node.js did the package thing right, by keeping a local folder just for the app, isolation from other apps with zero effort?
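For comparison, Python's closest analogue is a per-project virtual environment: one extra command, rather than the zero-effort default that node_modules gives you. A throwaway sketch (POSIX paths assumed) that confirms the venv's interpreter reports the project-local directory as its prefix:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Create a disposable venv and ask its interpreter where its prefix is:
# each project gets its own site-packages, isolated from every other.
with tempfile.TemporaryDirectory() as tmp:
    env_dir = Path(tmp) / ".venv"
    subprocess.run(
        [sys.executable, "-m", "venv", "--without-pip", str(env_dir)],
        check=True,
    )
    out = subprocess.run(
        [str(env_dir / "bin" / "python"), "-c", "import sys; print(sys.prefix)"],
        capture_output=True, text=True, check=True,
    )
isolated_prefix = out.stdout.strip()
print(isolated_prefix.endswith(".venv"))  # True
```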

I think people conflate NPM-as-package-installation-system, which I would agree works very, very well (though a lot of that is the JavaScript module/import system in general and not NPM specifically), with two other things: NPM the web platform (which has had a lot of pretty severe security/community issues), and the JavaScript community's tendency to proliferate lots and lots of modules, many in competition, to solve problems that other languages' communities tend to solve either via reimplementation or via the standard library.

I think any discussion about NPM or JS packaging compared to other package managers needs to discuss those things as orthogonal, largely unrelated concepts. Otherwise everyone just picks a favorite punching bag (e.g. left-pad) and talks past each other.

It's worth noting that npm was developed independently of node.


The right thing is a machine-global package cache that can hold multiple versions of each package, and loader machinery that can pick out the right ones. That means you only have to download and store each package once, and you never get interference between projects.

NPM installs (and downloads?) a copy of each package for each project. This is the wrong thing to do.

Your comment has nothing to do with the article. Also, I believe there’s a PEP in progress for that, but again, that has nothing to do with what does and does not go into the standard library.

The node way is certainly not the right way to handle dependencies, but it is laboring with a language that has absolutely zero batteries built in. Micropackages and sprawling dependency trees is a disaster.

I'm cautiously hopeful on the direction that .Net Core is taking, with the major pieces of the old framework modularized as nuget packages.

It's not only that Python stdlib libraries are getting old and sometimes appear to be unmaintained; some of the more recent additions and changes are lacking.

I was rather shocked to find that Python didn’t have a full-featured crypto library included in its standard lib. The alternatives all ended up being unmaintained or maintained by small groups (which makes trust in their soundness difficult). I tapped our security team, who were in disbelief, but ultimately they gave up too, and I wrote the software in Go instead.
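For context, the standard library isn't crypto-free; it ships solid primitives (hashing, MACs, a CSPRNG). What's missing is the high-level "just encrypt this safely" layer. A short illustration of where the stdlib line falls:

```python
import hashlib
import hmac
import secrets

# What the stdlib covers: hashes, MACs, and secure random generation.
key = secrets.token_bytes(32)                      # CSPRNG key material
msg = b"payload"
digest = hashlib.sha256(msg).hexdigest()           # hashing
tag = hmac.new(key, msg, hashlib.sha256).digest()  # message authentication
assert hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).digest())

# What it doesn't cover: there is no authenticated-encryption recipe
# (the role pyca/cryptography's Fernet fills), so actually encrypting
# data safely always means reaching for a third-party package.
```

So the complaint is really about the absence of vetted high-level recipes, not of primitives.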

You want pyca/cryptography. The last thing in the world you want is a standard crypto library that no experts are enthusiastic about maintaining. Golang had an unfair advantage here, because the language team included cryptography engineers. It would be weird if most languages had the same kind of crypto in their stdlibs.

I think there is in general nothing wrong with a language ecosystem where key parts of the whole platform are in well-maintained third-party libraries rather than the standard library. Which is also something Amber Brown is saying here.

The problem with third-party libraries is that it's impossible to know which ones are "standard" and which ones are copycat hobby projects in various states of disrepair. And in the worst case you can pick a library, get a day or two into using it, and discover it has a show-stopper limitation.

I recently hacked together some Python to process MIDI files. Should I have used this:


or this:


or this:


or this?


A well-maintained and thoughtfully curated stdlib makes these choices for you - which is one less thing to worry about, and can be a significant time saver.

In a perfect world everything would be in the stdlib, everything would be well-integrated, everything would serve every use case, and everything would be well-maintained by engaged and motivated maintainers.

For some kinds of libraries, you can sacrifice a whole bunch of those constraints and still have it make sense to host it in the stdlib. I'm fine if the JSON library is slow or inflexible if it covers a bunch of use cases and doesn't impede people from writing better ones. You can see this to some extent with the state of "router" HTTP libraries in Go.
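The stdlib `json` module is a decent example of "good enough without impeding better": it's not the fastest parser, but its `default` hook means an inflexible core doesn't block you, and drop-in replacements like ujson can coexist. A small sketch of that escape hatch:

```python
import json
from datetime import date

# The stdlib encoder doesn't know about dates, but `default` is called
# for any object it can't serialize, so you can extend it in place.
def encode(obj):
    if isinstance(obj, date):
        return obj.isoformat()
    raise TypeError(f"not serializable: {obj!r}")

print(json.dumps({"released": date(1991, 2, 20)}, default=encode))
# → {"released": "1991-02-20"}
```

That's a design you can afford to freeze in a stdlib; a crypto API, where "slow but extensible" can mean "insecure," is not.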

But for some things, most notably cryptography, it's worse to have a suboptimal version in the stdlib than to have none at all.

To add to net/http not being very competitive performance-wise within Go's language peer group: the stdlib file-system-walking methods baked in assumptions, without escape hatches, that unnecessarily nerf performance for a lot of heavy-lift use cases. Go definitely has a few shortcomings in its otherwise fantastically useful stdlib.

You can have curation outside the standard library, without incurring all the costs of moving libraries into the standard library.

When you find a coworker trying to use this module...

> cryptography/src/cryptography/hazmat/__init__.py

  Hazardous Materials
  This is a "Hazardous Materials" module. You should ONLY use it if you're
  100% absolutely sure that you know what you're doing because this module
  is full of land mines, dragons, and dinosaurs with laser guns.

Also I expect that a) pyca/cryptography is maintained by as many people as Go's cryptography libraries are, so the worry about maintenance by small groups doesn't argue in favor of Go here b) the number of people maintaining any particular module in the Python standard library is very small (e.g., IIRC there's one maintainer for `ssl`).

I would be surprised if either 'kentm or their security team wanted the rest of the Python standard library developers (or the Go ones, for that matter) to mess with cryptography modules when it's not their area of expertise.

I think we're saying basically the same thing.

Sure, but in Python's case, what's wrong with bundling a light wrapper around a well maintained C library?

That it would be inferior to pyca/cryptography?

Honestly, I've no idea.

For all I know some Python devs are even more sophisticated about security than the C developers who are maintaining libraries for Linux and *BSD distros.

I had to code in Python recently and had PTSD over all the syntactically significant whitespace. I have no idea how I ever found productivity in the language coming back to it now with fresher eyes. Refactoring, editing, and writing new code feels like such a drag.

I don't think Python is worth using for most new non-data science projects nowadays. The language is ugly and hard to work with. Also the performance is terrible.

There’s no accounting for poor taste.
