Tkinter needs to go... there is very little reason, aside from legacy ones, why it still needs to be there.
If you're worried about people not RTFM when using your projects, you could always start your scripts with the standard try/import/except wrapper around the required package, and tell them to run pip install when it's not found--or (and this is a terrible idea) run it in a subprocess for them.
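A minimal sketch of that wrapper -- the `require` helper and its error message are my own invention, not a standard idiom:

```python
import importlib
import sys


def require(module_name, pip_name=None):
    """Import a module, or tell the user how to install it and bail out."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        sys.stderr.write(
            "Missing dependency %r. Run: pip install %s\n"
            % (module_name, pip_name or module_name)
        )
        sys.exit(1)


# A stdlib module is used here so the example runs anywhere.
json = require("json")
print(json.dumps({"ok": True}))
```

The "terrible idea" variant would replace the `sys.exit(1)` with something like `subprocess.check_call([sys.executable, "-m", "pip", "install", pip_name])` -- with all the caveats that implies.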
Being able to add items to a canvas, attach callbacks to them - without tracking them yourself, setting up the infrastructure for indexes, bounding boxes, all that stuff - I haven't found anything else as easy as Tkinter's Canvas.
If there is something better, I'd love to hear about it.
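For reference, here's roughly what that looks like: items get ids when created, and callbacks attach per item via `tag_bind`, with the canvas doing the hit-testing itself. The demo is opt-in via an environment variable so merely running the file doesn't block on a GUI loop:

```python
import os
import tkinter as tk


def demo():
    root = tk.Tk()
    canvas = tk.Canvas(root, width=300, height=200)
    canvas.pack()

    # Each created item returns an id; geometry is tracked by the canvas.
    rect = canvas.create_rectangle(20, 20, 120, 80, fill="steelblue")
    oval = canvas.create_oval(150, 60, 250, 140, fill="tomato")

    def on_click(event):
        # The canvas does its own hit-testing; no manual bounding boxes.
        item = canvas.find_closest(event.x, event.y)[0]
        canvas.itemconfigure(item, width=3)

    # Callbacks attach directly to items (or to shared tags).
    canvas.tag_bind(rect, "<Button-1>", on_click)
    canvas.tag_bind(oval, "<Button-1>", on_click)
    root.mainloop()


# Opt-in guard: set RUN_TK_DEMO=1 to actually open the window.
if __name__ == "__main__" and os.environ.get("RUN_TK_DEMO"):
    demo()
```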
You do have a good point about the Tk canvas's easy input handling, though, which is something Cairo doesn't do. Cairo is just an immediate-mode drawing API, like PostScript's rendering model or the HTML canvas 2D context.
You can pass a pycairo context into C++ Python extensions, and they can go to town with it quite efficiently.
There's always going to be a good debate about which batteries should be included. I'd almost put numpy ahead of Tkinter, but maybe that reflects my bias for scientific programming.
Coming from Visual Basic, and similar things (HyperCard, Turbo Pascal), being able to run a program with a GUI on both Windows and Linux with no code changes was nothing short of amazing to me. So maybe that's one way to draw the line with batteries, namely to make it so a beginner can get started with Python with no regrets about which OS they're using. This requires some out-of-the-box hardware abstraction, where the keyboard, mouse, and display all effectively count as hardware.
Taking tkinter out of the standard library won't stop IDLE from being built with it, if they choose to do so.
Just moving tkinter out, the way they handled the typing package, would be good enough to address everyone's needs.
Like if you were in a resource constrained situation, you would be customizing your python install anyways (if nothing else, getting rid of the docs, the tests, maybe stripping out all but the compiled code...).
However, Brown has some solid points.
The brilliant selling point of Python is the massive standard lib. If the quality of the libraries falls, then Python's use as a tool drops dramatically.
One of Node's massive failures is that it has no standard lib.
I've been astounded how few non-trivial packages are actually in a consumable state, and how many seemingly-simple packages have N dependencies that pull in M more. By and large it's a zombie wasteland of cruft.
I wouldn't really trust only-crowdsourced ratings, but I think that might be a nice component for npms.io to include, perhaps. Stackoverflow answers, for example, seem to be directionally correct if you sort by upvote count.
Guido should stop acting like a child. Listening to people, hearing them out - even when it’s uncomfortable - is the mark of a good leader.
I tell new PMs “this is the best job in the world 90% of the time but the other 10% is eating shit with a smile”
There is always a new person coming along with a new idea. It's exhausting: it's required to make the language evolve, but it's also a fresh opportunity to screw things up every time.
And if he had let most people get their way during the last two decades, Python would have ended up just meh.
Of course, every time a new debate starts, everybody thinks that this time, just this time, he is wrong and they are right. I did too. We are all part of it.
I get the reaction. There is a limit to what a person can take, and it's why he stepped down as BDFL.
But seeing your baby and your reputation at stake is hard.
The language itself is inferior in expressiveness and performance to almost any other modern language. The only reason anyone uses it anymore is that the network effects of the libraries are very strong, especially in fields relating to ML and data science.
Python didn't have any specialty like PHP, or an accidental monopoly like JS. It didn't come with a killer app like Ruby. It hasn't been made by a giant company like Go.
It's pretty much a self-made language.
They don't have the desire to learn what a monad is, they don't want to type variables when exploring a badly formatted heterogeneous data dump, and they do want to be able to read the code of their intern, fresh out of school, after he leaves without having written any docs.
The inherent qualities that draw people to Python are that it’s so inexpressive and crippled that you can learn it in 30 minutes.
You have a lot of contempt to offer; I'll leave you to it and go get work done.
I remember one of Go's core devs voicing some of the same concerns regarding the SMTP library a while back.
If a package is really niche, it may not make sense to put in the integration work. But for a package that is used by a significant majority in a general application - why would you want to keep it separate if it is so much better?
I am ignoring human interaction here - there are probably dozens of answers to that question if you count personal motivations.
Seems like a false dilemma to me. The core team could still maintain "blessed" packages that don't ship with the default installation.
> why would you want to keep it separate if it is so much better?
This is addressed in the article, most of the 5th paragraph is dedicated to it.
And what if the standard library just gets dated? Take Node for instance. The fs module has a whole bunch of outdated callback based functions. Sure, you can wrap them in promisify, but it sucks that we have these outdated functions stuck around forever.
There are definitely tradeoffs with package/dependency multiplication, but I don't think standard libraries are as clear-cut as people make them out to be.
That depends on what you are writing.
For an application that is deployed stand-alone, you'll likely fat-package it with python and all the libraries. In case of a security issue, you create a new version of your application that bundles the fixed python.
For an application that is deployed on the system python (more typical on Linux), it's the system admin's task to update the system python.
As a user of the stdlib, it really depends on what the security issue is. For most users, if they can verify they're not using module X, then there's no problem and no rush to upgrade. In a language as dynamic as Python, you also might be able to download a hotfix .py file provided by the language maintainers (or any other party) as an alternative to upgrading, or as a stopgap while you wait for the fix to be released, backported to older major versions, and made available in your deploy environment -- it might even take the form of an iptables rule, depending on the issue. In short, my point is that there are alternatives besides bumping a release version and hoping people upgrade.
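As a sketch of what such a downloadable hotfix could look like -- the "vulnerability", the module chosen, and the mitigation here are all invented for illustration, not a real advisory:

```python
# hotfix.py -- import this at startup (e.g. from sitecustomize.py), before
# application code runs. We pretend mimetypes.guess_type mishandles very
# long crafted inputs, and wrap it defensively until a fixed release ships.
import mimetypes

_original_guess_type = mimetypes.guess_type


def _patched_guess_type(url, strict=True):
    # Hypothetical mitigation: refuse pathological inputs outright.
    if isinstance(url, str) and len(url) > 10_000:
        return (None, None)
    return _original_guess_type(url, strict)


mimetypes.guess_type = _patched_guess_type
```

Because Python resolves attribute lookups at call time, every later caller of `mimetypes.guess_type` picks up the patched version without any code changes.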
> Except, users don't upgrade stuff.
Actually users do. Not all users, but your sweeping statement isn't true of all users either. Experience and reading reports tell me that many if not most users actually upgrade, especially when there's a significant security issue and especially when upgrading is easy. 2.7.14 to 2.7.15 was a problem for no one.
For the users who don't upgrade, they're not likely to upgrade a third party / split off thing any more than the core language.
There's also another group of users worth mentioning: those who don't upgrade specific things because past upgrades have been a terrible experience. This is a reputation issue.
> And what if the standard library just gets dated? Take Node for instance. The fs module has a whole bunch of outdated callback based functions. Sure, you can wrap them in promisify, but it sucks that we have these outdated functions stuck around forever.
Sucks for whom? I've been out of the Node ecosystem for several years, is there a pfs module that uses promises (either built-in or as an external lib)? If so users can use that if they want. Does it suck for the maintainer? I don't see how if those functions are small and don't really need any maintenance.
What would suck is if all the existing programs that wrote some fs-using code a long time ago and haven't had to touch it now suddenly need to go update it to satisfy people's fashions.
You might be interested in this talk that makes the case to stop breaking your API especially if you're requiring more and providing less. https://www.youtube.com/watch?v=oyLBGkS5ICk
Sure, some people definitely upgrade. But there's a big difference between requiring someone to actively hear and heed your warning and having them automatically upgrade whenever they create a new project. Or even just prompting them to upgrade like NPM does for package vulnerabilities.
> For the users who don't upgrade, they're not likely to upgrade a third party / split off thing any more than the core language.
That's not the point. If your standard library is a package, any new project will automatically download the latest version of the standard library. If I have a vulnerability in an npm package, I simply push a new version and people will automatically download that new version when they create a new project. If there is a vulnerability in the Python standard library, users will continue to have that vulnerability until they actively patch it.
> Sucks for whom? I've been out of the Node ecosystem for several years, is there a pfs module that uses promises (either built-in or as an external lib)? If so users can use that if they want. Does it suck for the maintainer? I don't see how if those functions are small and don't really need any maintenance.
Not a canonical one. They're starting to add a new promise based API, but now we have two different APIs doing the same thing. What if we come up with a newer way to do async? Do we add yet another API?
> What would suck is if all the existing programs that wrote some fs-using code a long time ago and haven't had to touch it now suddenly need to go update it to satisfy people's fashions.
That's a bit of a straw man. With packages, you can still support multiple versions. It's just that fs would no longer be tied to Node. You could make a newer fs package, fs 2.0 or whatever, that uses promises. The older one would still be supported, of course, but it wouldn't be stuck in Node.
Unless, and maybe this is a source of disagreement, it's still the standard in Node land to specify dependencies as "whatever the latest version is, I don't care"? That approach has teeth. After you've been bitten a few times you learn to say "no more" and version pin. There are other nice benefits to version pinning too (like reproducible builds). It's pretty common in Java land, even though some people still specify "version x or higher". It's a tradeoff.
One downside is that if you don't occasionally check maven central or wherever for new versions you might miss a security issue, and if you're copying dependency specifying files over to new projects the issue will persist. This is solvable with tooling -- http://www.mojohaus.org/versions-maven-plugin/ comes to mind and I was pleasantly surprised to get an email from Github that even a toy project of mine pointed to a version of a JSON parser with a known issue.
> What if we come up with a newer way to do async? Do we add yet another API?
Sure. Especially if people were just fine using the older API. For the fs situation, putting it in its own package would be great, up until the point I have to modify years-old code that's been working just fine through upgrades until someone decided to break things... for JS that's even more intolerable since nothing in the type system forbids the function updating to accept a promise or a callback. That (as would be a new API for async) would be a non-breaking change, it only provides more and the requirements for old code are the same.
To take an example from a language that has official standards, C, across decades of versions of C compilers and standard libraries from various vendors, if my code is C89 then it compiles with compilers that support C89. People might be offended that a function like strcpy exists, but it's there, I can use it if I want, or a separate library like bstring, or maybe the compiler provides their own __builtin__strcpy_chk that I can use directly or switch between with a flag (_FORTIFY_SOURCE). This might end up being less work for the compilers, too, since they don't need a security bulletin saying "some uses of strcpy can lead to a buffer overflow, removing it in v2.1.2, update immediately to strcpy2". They can (and did) just provide a new strcpy2, made available a seamless upgrade (I believe even Node has done this with some shims for deprecated APIs) if you don't want to change the source text, and warn people about the old one.
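The same "provide the new thing alongside the old, warn, don't remove" pattern translated into Python terms (the function names here are invented for the example):

```python
import warnings


def parse_csv_line_v2(text):
    """The improved API: strips whitespace around each field."""
    return [field.strip() for field in text.split(",")]


def parse_csv_line(text):
    """The original API: still works, but points new code elsewhere."""
    warnings.warn(
        "parse_csv_line() is superseded by parse_csv_line_v2(); it keeps "
        "working, but new code should migrate.",
        DeprecationWarning,
        stacklevel=2,
    )
    return text.split(",")
```

Old callers keep running unchanged, new callers get the better behavior, and nobody is forced to touch years-old working code.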
Perhaps I'm misreading this, but isn't that the purpose of package managers? I can run an `npm install foo` and update it at will.
I have to develop enterprise software that runs across a wide range of platforms, and being able to take advantage of the fact that Python is pre-installed on all of these systems with its standard library is a godsend.
Third-party package problems will be an issue at some point, but that's more due to them being so focused on minimalism that they don't have clear guidance on setting up and maintaining packages.
The primary reason Go can get away with this strategy is because the Go community actively promotes fewer dependencies = better. So if you write Go you have to often accept the fact that the second you add a 3rd party dependency that you're now officially on your own if that dependency breaks or becomes unsupported.
This is not necessarily a bad thing. But in order to move software forward I still think we can do better than to push this responsibility to all individual end users.
This is one area where I feel like most popular languages today still fail compared to CPAN. CPAN's value was not just packaging and distribution; it was an integrated test-report pipeline and infrastructure, active management and gatekeeping of library maintainers, CPAN mirroring functionality, and easy acceptance of bug reports and user feedback against a library.
My experience with Go has been equally pleasant (though it does require more boilerplate for commonly accepted reasons in the community), but in no way unique.
Why? It's not only Go; Clojure as well. Clojure's standard library is incredibly consistent and stable. The documentation is pretty dry, though. That's the only complaint I can think of.
I've never liked asyncio, while Twisted felt natural to me. So I would agree that an inferior solution has been pushed heavily in the stdlib and also to the syntax level.
Moving the entire stdlib to PyPI is of course entirely foolish and would destroy Python.
In certain situations, I've been working with a Python interpreter on a RHEL machine where pip was not installed (and I was not allowed to install it or make other modifications: the machine was owned by the client and I had to work with what I had available).
- having some basic functionality in the core libraries was a godsend because I could work with that, even though it was not "ergonomic"
- not being ergonomic, it was a "poor experience" (and certainly not optimized or anything nice to see).
This heading is the essential problem in innovation writ large: some giant can ignore you and squash you without any effort at all, without even considering your existence.
Right now the equilibrium in this tug-of-war is that a certain set of functionality comes by default in the python standard library and everything else is just a package that you can install.
Obviously from a dev point of view it's a hassle to have to decide which of two or more packages for X is best, the pythonic way would be that there should be one and only one package for X. Of course at the cutting edge there have to be competing packages because there needs to be room for innovation.
But obviously not everything can be in the python standard library.
Not really sure what the solution is, but maybe there should be tiers of packages, with "Tier 1" being the standard library, "Tier 2" having some kind of official stamp that it has been security audited to a certain standard, that Python has some control over who gets to modify it and why, etc. Then maybe "Tier 3" could cover everything else, i.e. any random Bob can go make a package on PyPI and it's Tier 3.
In addition, the process of going from Tier 3 to Tier 2 would give people a chance to winnow libraries down to one way of doing each thing at the Tier 2 level.
This might not be realistic but it's what my gut is telling me. C&C welcome.
People leave all the time during talks, and I've been to talks where most people left as soon as Q&A started.
PHP has a similar issue to this. The people writing C were not using the language. The best example is PDO. A lot of C was written, but it was essentially abandonware because the PHP users could not make any changes without getting the C maintainers to both agree and have the time.
Put another way, the whole Python universe from my point of view has become a cautionary tale about breaking changes. Given Python's popularity, this might be an unpopular opinion, but I have yet to find someone who loves Python who still loves it as much when they discover other newer languages (I'm sure you exist, I just haven't met you!).
Python is having its time in the sun really because it is a default install for most unix distributions, so even people stuck in government labs can use it because 2.7 is already installed. Even apt depends on it via the debian software-properties package, so it isn't going anywhere any time soon.
The real question is how many people would use Python if it was as little known as, say, Elixir.
I think this goes to her point of Twisted wanting to support really old versions of Python and they would be a lot more comfortable not supporting really old versions of packages.
Force the upgrade already. Code still on 2.7, if still useful, can be upgraded.
My "favourite" library quirk: socket.fromfd was only available on "Unix" in Python 2.x; that was fixed in Python 3.x.
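For anyone hitting this: on Python 3, `socket.fromfd` duplicates an existing descriptor into a fresh socket object. A quick Unix-flavored sketch (`socketpair` here is just the easiest way to get a descriptor to clone):

```python
import socket

# fromfd builds a new socket object from an existing file descriptor,
# duplicating the descriptor in the process.
a, b = socket.socketpair()
dup = socket.fromfd(b.fileno(), socket.AF_UNIX, socket.SOCK_STREAM)

# Data written through the duplicate arrives at the original peer.
dup.sendall(b"ping")
print(a.recv(4))  # b'ping'

for s in (a, b, dup):
    s.close()
```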
The worst offender is the logging library. It's the least Pythonic thing in the whole std library (ok, maybe ABC is worse, but oh well).
https://pythonclock.org/ has Py2 reaching EOL in 7 months. Realistically I'd say the time passed years ago.
py3 is over ten years old now. We're not talking about some new unstable piece of kit, I'd imagine that a large percentage, perhaps even 50%, of the HN audience started their career after the transition had already started.
Years ago I was in contact with the author of Nuitka, who was very excited to share his work thus far. During his presentation, Guido kept huffing and making snide comments under his breath. All because he disagrees with the premise behind Nuitka.
I like Python, and can appreciate his work and contributions. That said, I can't help think the community can become less toxic without him as BDFL.
I know he gave up the title, but as he personifies Python, I imagine the influx of requests for his attention may not have subsided much.
A core skill of any open-source project maintainer is recognizing when you're burnt out enough that your continued presence isn't helping the community and is just getting in the way of others, and stepping aside. If you can't do this, you're not a good maintainer. I'm happy to have sympathy for your personal problems, but you're still failing at your job. Here, for instance, is a short but effective way to do it: https://mail.mozilla.org/pipermail/rust-dev/2013-August/0054... (That was almost two years before Rust reached 1.0, and the language hasn't suffered one bit for it.)
And part of Amber's point is that core Python development is taking on too many burdens for no actual benefit. By bundling so much in the standard library and expecting the standard library to be usable for real work without installing additional packages, the core team (and Guido in particular) increases their own workload, impedes the ability of others to contribute, and produces worse results for end users than they would otherwise.
Sure, if your definition of success is being used by as many people as possible, but there are other (more) important criteria for assessing the quality of a language.
When you have a vision for your project, you might be afraid that other people are going to ruin it, because they don't understand, or they don't have good taste, etc.
Clojure sort of had this moment recently, where various people in the community were unhappy with the language direction and Rich Hickey (who is a person who cares about vision and taste in a language) made it quite clear that addressing their problems directly wasn't his definition of success. He was building a language for himself / his company to use, and if it worked for other people, great, and if they wanted to contribute to his vision, great, but if it didn't work for them, they should not expect Clojure to change. https://gist.github.com/richhickey/1563cddea1002958f96e7ba95... That, at least, is clear, and it means that authors of significant third-party libraries that are clashing with the vision of the language (and its community) can make an informed decision to spend time elsewhere, avoiding frustration on all sides. https://twitter.com/cemerick/status/1067111260611850240
I don't think Guido is/was actually trying to do this, and I think it's unfair to say that was his goal. If it was, then he was deliberately tricking people by having a core team, BDFL-Delegates, a language summit, etc. If it was, then he was being rude by asking her to come to the language summit instead of saying "Amber, Twisted is very good but your vision for Python is not my vision for Python." Guido, as far as I can tell, built Python to be a widely-used language, not a language following any sort of vision he started with. Guido does want Twisted and other Twisted-scale projects around, and does want Python to be a useful language for them. That's why I say that if he's burned out, the right way to execute his vision (which is exactly what he's doing, in fact) is to step aside graciously.
Also, if you're going for max language usage through design, how do you do that? By trying to perfect your design, so it's kind of an irrelevant motivator. Rich got a ton of clojure users (relative to his resources), he did that by having strong (reasoned) beliefs, not by opening the floor to a vote.
Rich did /not/ say that he was building Clojure only for his company; that's a very misleading statement that you've made.
let's hope everything settles smoothly :)
(This was years ago, and I don't have the email exchange anymore, so I'm doing my best to describe it from memory.)
I reported a bug I had debugged pretty heavily and believed was likely a bug in the appengine datastore (this was before it was publicly available, IIRC), with a fairly detailed repro recipe, etc.
(I was not on the appengine team, just building an app)
I had debugged all the client side code all the way down to the rpc to the datastore server and was positive there was nothing weird going on there at all.
Within a minute or two of me sending the email to the alias, he replied with "This must be a bug in your code, that can't happen". He didn't even look at it (I looked at the logs).
I replied with "I agree it should be impossible, but if you could look at it for a second, I think you'll see that it's not, and is actually happening. The code is very simple, etc."
He says "I don't have time to fix your code".
So I spent the time and reduced it to a simple, 20-line piece of completely obvious code (IIRC it had no real code except to instantiate the class and store it in the datastore) with no dependencies,
and said, "Here, I took the time to make it as clear and obvious as possible that this is really not a bug in my code, because there is no real code here."
He replied with something else abrasive and dismissive.
Then, about 20 minutes later, one of his teammates replies with basically, "oh shit, this is bad".
(Because what I had discovered turned out to have caused data loss that they couldn't automatically fix. They could get the data back from restores, but it wasn't clear what to do with it; you needed intervention from a user.)
Honestly, it would have been better for him to not respond at all.
(he was not the on-call team member at the time anyway)
We all have our bad days/times of course (I definitely did!), so I do hope he's found more inner peace than he had back then.
I did observe that you first provided a "fairly detailed repro recipe", and then after he rebuffed you, you provided a "simple, 20 line piece of obvious code". What Guido should have done was ask you to make the extra effort to provide the latter case if the "recipe" you provided was in fact too much effort for him to look at, for something that didn't seem to be a bug to him. That is, he wanted you to "spend the time" as you said, and if he had more experience with this kind of thing perhaps he would have known to just ask for it rather than dismissing the whole thing.
What happens to me a lot, at least, is I get bug reports that are like "here, just unzip this 10M attachment, install these libraries and data files, and then watch the log output for the thing I spent five paragraphs not really describing". These are not coworkers or customers of mine, for whom I might be obligated to go through all those steps; they are regular users who have downloaded my software for free (as they should), likely saving their company thousands or even millions of person-hours of work by doing so. The thing I ask these people in return is that they A. report issues to me and B. do as much work as they can to help me fix the problem -- I'm not a concierge; the help process is part of the give-and-take of how open source software is supposed to work between parties.
I encourage these people to please pass along an MCVE, e.g. the most succinct demonstration script possible, and quite often when I get the sense that they don't really have the experience to know what I'm looking for (despite my sending them the link to what an MCVE is), I will often read their verbal description, then write my own MCVE in about one minute that shows what they are asserting is not true, and then I paste that into the issue; I have a Python fragment script that I use as a starting point for writing 90% of these test cases. I ask them to please modify the MCVE to show the thing they are actually trying to do. That's how I get them to send me a succinct problem description that isn't a huge waste of my time.
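The kind of fragment script I mean looks something like this -- my own starting point, nothing official:

```python
"""MCVE template: the smallest runnable demonstration of a reported bug.

Rules of thumb: no external files, no extra dependencies, one operation
under test, and an assertion stating the expected behavior -- so the
report either reproduces or it doesn't, with no interpretation needed.
"""


def reproduce():
    # 1. Smallest possible setup.
    data = {"key": "value"}

    # 2. The single operation the reporter says misbehaves.
    result = data.get("key")

    # 3. Expected vs. actual, as a runnable assertion.
    assert result == "value", f"expected 'value', got {result!r}"
    return result


if __name__ == "__main__":
    reproduce()
    print("no repro: behavior matches expectation")
```

I paste a filled-in version of this into the issue and ask the reporter to modify it until it actually shows their problem.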
I understand this is likely not at all what happened in your case as you were both at Google and I'm sure folks there are more sophisticated than this. I just had the thought based on how you had sent two versions of the issue.
FWIW: the detailed repro recipe was hermetic and single-binary (and guaranteed to work on his machine for various reasons), it just had more code than strictly necessary (It was still <500 lines, it was a very simple webapp).
The actual recipe was closer to "run this binary, click on new record button, click save, observe results in datastore".
I actually reduced it not because the recipe was too detailed, but to remove the argument that it was my code.
But everyone has their limits.
Once he chose to step away, he truly stepped away, which is a good model for any open source maintainer.
This is not meant to speculate on van Rossum’s health, just trying to point out that there are situations where apparent rudeness may be out of the person’s immediate control.
(Also, personally, if I were behaving poorly I would much prefer people to say "hm, 'geofft is being rude but I know he's a better person than that and can improve" and not "hm, 'geofft is being rude, that's just the way he is, I wonder if he's got mental health problems.")
> If you're sufficiently in control of your actions to keep developing a major programming language, you're sufficiently in control of your actions to [...] not be rude while doing so [...]
That's a lovely thought, but no. When I've had bad times, my social skills and emotional cope would sometimes go to shit and leave my technical skills intact.
People are screwed up. You are, too. I hope your business partners and allies don't hold you to the standard you're bearing when your time to behave badly comes.
- A language that is simply less active or whose users feel less ability to have a say in the language's development is going to be both less social and less dramatic, but also carries a high chance that existing users are only there for legacy reasons and are always considering moving to another language better suited for their purposes.
- A language run by a corporation is less "social," and has the drama play out in business negotiations, closed-door committee meetings, and lawsuits instead of on blogs and public mailing lists and (in this case) closed-door meetings with an expectation of public reporting.
I'm not aware of any drama in the ASP community, but that probably also had something to do with many users having picked Python for their next project instead of ASP. I'm aware of some drama in the Clojure community, the conclusion of which seems to have been that Clojure is their corporate sponsor's language and if it doesn't work for you you should find something else.
Any chance you could give a short little blurb for how the community is organized and works? I'm curious how different it is to Python. I also sometimes worry about the future of R where you have Python-Pandas coupled with Spyder IDE and Julia-DataFrame library with Atom-Juno as IDE as well as JuliaDB. It seems like a lot of communities are moving in on what R does best.
(Though in nothing on the level of some in open source).
Sure, you're going to need to redo code from scratch eventually anyway. But programmers like I mentioned, which is surprisingly a huge chunk, make problems worse for themselves throughout the lifetime of the code by being short-sighted and stamping their approval on code they're too lazy to rewrite, because their boss doesn't know any better.
For speed, pypy is the way to go.
I may have had to clean up in the aftermath a couple of times.
Well, yeah, of course. It's expensive to reinvent the wheel.
Like I can see where I'd make that tradeoff, but it'd have to be a small function in a really niche use case, which isn't really that common. (I work on the JVM though and don't know how good/bad Python libraries would be in general.)
So yeah, I'd say the dependencies are pretty bad. It's a process outside of your control and you kind of just have to deal with whatever is under the hood. More importantly though, what's under the hood was likely meant to be general purpose and there is more than likely a better way to do it for your particular situation. And as she mentions a lot of the bugs are indefinitely there so you just have to have permanent workarounds, which is never good.
Recently started using Go a little, and it does feel better but it just takes longer for me to write (probably because it's newer to me too). At work, people want things quickly. I'd love to be able to spend my time writing things properly in Go, but it's just not as quick and easy as Python is.
On top of that, I believe I am able to produce production-ready code in other languages even faster than in Python! Particularly when it comes to refactors for exploratory architecture in early POC/MVP work; I love the tooling assists, so I'm not drowning in "is not a function" errors and other dumb stuff while hammering out interfaces, abstractions, and other structure.
Granted, I have almost certainly not worked with the best Python programmers, or the best programmers that happen to only use Python to put it another way. But looking at Python itself, the languishing ecosystem and stdlib, and every large OSS Python project, what does that even look like?!
So, I would say that yes it has a lot to do with familiarity and comfort... I'm very suspicious of people who insist on using Python and have little familiarity with anything else.
I’d rather have the first and risk the latter than the alternative.
If an explanation boosts your ego, there is a decent chance it is flawed or incomplete.
Me personally, for my company I would not hire any contractor writing code for me in Python. Not because Python is inherently bad or unoptimized in all cases, but because I know the general tendency is to use lots of dependencies. Python's motto, overall, is easy over hard, and that has spread through the community in a not-so-good way. Like I said though, I use it, I like it, it is awesome for banging out prototypes, but I avoid it for production. There are just too many potholes, to the point that I would rather code something from scratch closer to the hardware -- which I will admit I do not enjoy xD -- but if it's what needs to be done, then so be it. There are just a lot of things I work on where even 5% efficiency could save you thousands.
I think any discussion about NPM or JS packaging compared to other package managers needs to discuss those things as orthogonal, largely unrelated concepts. Otherwise everyone just picks a favorite punching bag (e.g. left-pad) and talks past each other.
NPM installs (and downloads?) a copy of each package for each project. This is the wrong thing to do.
I'm cautiously hopeful on the direction that .Net Core is taking, with the major pieces of the old framework modularized as nuget packages.
I think there is in general nothing wrong with a language ecosystem where key parts of the whole platform are in well-maintained third-party libraries rather than the standard library. Which is also something Amber Brown is saying here.
I recently hacked together some Python to process MIDI files. Should I have used this:
A well-maintained and thoughtfully curated stdlib makes these choices for you - which is one less thing to worry about, and can be a significant time saver.
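For a format as simple as MIDI's header chunk, the DIY end of that tradeoff is genuinely small -- here's a sketch using nothing but the stdlib's `struct` (a toy, not a replacement for a maintained MIDI library):

```python
import struct


def read_midi_header(data: bytes):
    """Parse the 14-byte MThd header of a Standard MIDI File."""
    if data[:4] != b"MThd":
        raise ValueError("not a Standard MIDI File")
    # Big-endian: 4-byte chunk length, then format, track count, division.
    length, fmt, ntracks, division = struct.unpack(">IHHH", data[4:14])
    if length != 6:
        raise ValueError("unexpected MThd chunk length")
    return {"format": fmt, "tracks": ntracks, "division": division}


# A minimal format-0 header: one track, 96 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, 96)
print(read_midi_header(header))  # {'format': 0, 'tracks': 1, 'division': 96}
```

Whether that beats installing a package depends entirely on how much of the format you actually need.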
For some kinds of libraries, you can sacrifice a whole bunch of those constraints and still have it make sense to host it in the stdlib. I'm fine if the JSON library is slow or inflexible if it covers a bunch of use cases and doesn't impede people from writing better ones. You can see this to some extent with the state of "router" HTTP libraries in Go.
But for some things, most notably cryptography, it's worse to have a suboptimal version in the stdlib than to have none at all.
> This is a "Hazardous Materials" module. You should ONLY use it if you're
> 100% absolutely sure that you know what you're doing because this module
> is full of land mines, dragons, and dinosaurs with laser guns.
I would be surprised if either 'kentm or their security team wanted the rest of the Python standard library developers (or the Go ones, for that matter) to mess with cryptography modules when it's not their area of expertise.
For all I know some Python devs are even more sophisticated about security than the C developers who are maintaining libraries for Linux and *BSD distros.