parallel downloads don't need multiprocessing since this is an IO-bound use case. asyncio or GIL threads (which yield the GIL on IO) would be perfectly fine. free-threaded (GIL-less) builds will eventually be the default anyway.
Indeed, but unzipping while downloading does. So does analysing multiple metadata files and exporting lock data.
Now, I believe unzip already releases the GIL, so we could already benefit from threads there, and the rest likely doesn't dominate performance.
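To make that concrete, here's a minimal stdlib-only sketch, with in-memory archives standing in for downloaded wheels: zipfile decompresses through zlib, which releases the GIL, so a plain thread pool genuinely overlaps the decompression work.

```python
from concurrent.futures import ThreadPoolExecutor
import io
import zipfile

def make_zip(payload: bytes) -> bytes:
    # build a small in-memory archive (stand-in for a downloaded package)
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("data.txt", payload)
    return buf.getvalue()

def unzip(blob: bytes) -> bytes:
    # zlib-backed decompression releases the GIL, so threads
    # really do run this step in parallel
    with zipfile.ZipFile(io.BytesIO(blob)) as zf:
        return zf.read("data.txt")

archives = [make_zip(b"x" * 10_000) for _ in range(20)]
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(unzip, archives))
```

In a real installer the `make_zip` step would be the download, which also blocks outside the GIL, so download and unzip can overlap the same way.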
But still, rust software is faster on average than python software.
After all, all those things are possible in python, and yet we haven't seen them all in one package manager before uv.
Maybe the strongest advantage of rust, on top of very clean and fast default behaviors, is that it attracts people that care about speed, safety and correctness. And those devs are more likely to spend time implementing fast software.
Though the main benefit of uv is not that it's fast. That's very nice, and it opens more use cases, but it's not the killer feature.
The killer feature is that, being a standalone executable, it bypasses all Python bootstrapping problems.
Again, that could technically be achieved in python, but friction is a strong force.
> Maybe the strongest advantage of rust, on top of very clean and fast default behaviors, is that it attracts people that care about speed, safety and correctness. And those devs are more likely to spend time implementing fast software.
people who hold this opinion should use Rust instead of Python entirely. if Python code does not have sufficient speed, safety, and correctness for someone, they should not use it. Python's tools should be written in Python.
> The killer feature is that, being a standalone executable, it bypasses all Python bootstrapping problems.
I can't speak for Windows or Macs, but on Linux, system Pythons are standard, and there is no "bootstrapping problem" using well-known utilities that happen to be written in Python.
> And then uv arrived. And they disappeared for those people.
I'm not arguing against tools that make things as easy as possible for non-programmers; I'm arguing against gigantic forks in the Python installation ecosystem. Forks like these are harmful to the tooling. I'm already suffering quite a bit from the flake8/ruff fork: ruff made a much better linter engine but didn't feel like implementing plugins, so everyone is stuck on what I feel is a mediocre set of linting tools. Overall I don't like Astral's style, and I think a for-profit startup forking out huge chunks of the Python ecosystem is going to be a bad thing long term.
As for the 20 ms, if you deal with 20 dependencies, one process each, that's 400 ms just to start working.
Shaving half a second off many things makes everything feel fast.
Although, as we saw with zeeek in the other comment, you likely don't need multiprocessing, since the network stack and the stdlib's unzip release the GIL.
Threads are cheaper.
Maybe if you bundled pubgrub as a compiled extension, you could get pretty close to uv's performance.
For CPU-bound work, at least one worker per virtual CPU core. I got 16 on my laptop. My servers have many more.
If I have 64 cores and 20 dependencies, I do want all 20 of them decompressed in parallel. That's faster, and if I'm installing something, I want to prioritize that workload.
But it doesn't have to be 20. Even with, say, 5 workers and queues, that's 100 ms. It adds up.
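A toy illustration of the arithmetic, with `time.sleep` standing in for a 20 ms IO-bound step per dependency: done sequentially that's roughly 400 ms, while a thread pool collapses it to close to the duration of a single step.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_dep(i: int) -> int:
    time.sleep(0.020)  # stand-in for one 20 ms IO-bound task
    return i

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=20) as pool:
    done = list(pool.map(handle_dep, range(20)))
elapsed = time.perf_counter() - start
# sequentially this would take ~400 ms; threaded it stays near 20 ms
```

The same overlap works for any task that blocks without holding the GIL, which is exactly the download-and-unzip case discussed above.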
- the first and MAIN reason is excellent Windows support. Nobody wants to say this, but it was the first Unix-friendly community around a good scripting language that didn't shit on Windows users.
- batteries included is frowned upon today, but it was a BIG deal back then
- python had the best C integration story after Lua, and got the Numeric lib because of that. It became NumPy.
- it was dynamically typed, but strongly typed. No ===.
- Ruby's optional parentheses and Perl's symbols were a repellent to many C and Java users.
- the error messages were better than the competition.
- a Lisp-style interactive shell, without the Lisp
- docstrings and help()
- the packaging ecosystem that everybody craps on today was actually the best at the time. Ruby's was AWFUL at the time.
- one of the first scripting languages with good namespaces. Ruby could have been that but killed it with monkey patching for years.
- python found ground outside of the web very early, unlike PHP, JS and Ruby
- python devs were willing to work with os devs to improve integration, and so linux and mac got system tools written in python
- django and twisted were amazing at the time. The dev server without Apache was a killer feature
- google advertised that they used Python for the search engine and YouTube. Then came HN successes like Reddit and Dropbox. It became hugely famous.
It was not inevitable at all. Just a collection of random stuff.
I just tried to match a URL against about a hundred patterns of various types (thanks to Claude code), expecting it to be a non-issue.
A hundred regex tests, for example, is generally very fast. A quick Python script made them run in 0.85ms. A hundred Flask router tests is 2.64ms.
So I had no reason to think this API would be slow. Surely matching a URL is a subset of generalized regexes and can only be fast? And given that routing is not an activity you do a lot, why would it matter anyway?
But the performance was atrocious: it took 8 seconds to resolve the worst-case scenario in Firefox, and it locked the entire browser UI.
Ok, note to self, stay away from the URL Pattern API.
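For comparison, here's the kind of quick measurement the timings above suggest, using hypothetical route-style patterns (the comment's actual pattern set is unknown): a hundred compiled regex matches against one URL.

```python
import re
import time

# hypothetical route patterns; the real set from the comment is unknown
patterns = [re.compile(rf"^/api/v{i}/items/\d+$") for i in range(100)]
url = "/api/v99/items/12345"

start = time.perf_counter()
matched = [p for p in patterns if p.match(url)]
elapsed_ms = (time.perf_counter() - start) * 1000
# a hundred simple anchored matches typically finish in well under a millisecond
```

Against that baseline, 8 seconds for a similar-sized workload is a four-orders-of-magnitude gap, which is why it was so surprising.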
For what it's worth, quite a lot of libraries don't use NFA/DFA-style regex engines and instead use something like PCRE, which isn't necessarily linear in the worst case. I'd hope that URL pattern matching wouldn't need recursive backtracking or whatever, but a lot of the time people use the less performant engines without intending to use those features either, so it probably wouldn't be the first time someone accidentally made their matching way slower, if that's what happened here.
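A minimal sketch of the backtracking hazard being described, using Python's `re` (a backtracking engine like PCRE): nested quantifiers such as `(a+)+` force the engine to try every way of splitting the input when the match fails, so runtime roughly doubles with each extra character.

```python
import re
import time

# classic catastrophic-backtracking pattern: nested quantifiers
pattern = re.compile(r"(a+)+$")

def time_failed_match(n: int) -> float:
    subject = "a" * n + "b"  # trailing "b" forces exploring every split
    start = time.perf_counter()
    assert pattern.search(subject) is None
    return time.perf_counter() - start

t_small = time_failed_match(12)
t_big = time_failed_match(18)  # only 6 more chars, roughly 64x the work
```

An NFA/DFA-style engine (RE2, Rust's `regex`) matches the same input in linear time, which is the trade-off the comment is pointing at.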
In the near future I fear there may be laws about "LLMing while drunk," after enough rogue LLM agents, vibe coded while drunk, cause widespread havoc. You know, folks harassing exes or trying to hack military depots to get a tank.
For these reasons, many countries have adopted a point-based system for driving licences. E.g. in France you have 12 points; driving over the speed limit is a fine, but it also removes up to 6 points depending on the speed.
If you go down to 0 points, your licence is suspended.
If you go long enough without a fine, you get points back.
Some countries have fines that depend on how much you make. Some countries will destroy your car if you really behave badly.
New York actually does have a points system, but since they're tied to the driver's license rather than the car itself, you only get them if you're actually pulled over, not from cameras. Within NYC there's a fair amount of camera enforcement, but comparatively very little by the police directly, so drivers whose licenses might otherwise be suspended via points are still driving around.
The mechanisms for keeping people off the road are also just weaker in the US—I believe the penalties for driving with a suspended license are comparatively lighter, plus if your license is suspended you can often still get a "restricted" license that still lets you drive to work.
France gets around that by assuming it's the car owner's fault. If the owner was not driving the car during the infraction, the person who was must fill out a form saying so and take the hit voluntarily.
If the car's owner is a company, the company must declare a default driver for this purpose.
To the proud contrarian, "the empire did nothing wrong". Maybe Sci-fi has actually played a role in the "memetic desire" of some of the titans of tech who are trying to bring about these worlds more-or-less intentionally. I guess it's not as much of a dystopia if you're on top and its not evil if you think of it as inevitable anyway.
I don't know. Walking on everybody's faces to climb a human pyramid, one doesn't make many sincere friends. And one certainly goes, rightfully, down a spiral of paranoia. There are so many people already on the fast track to hating anyone else; if there's a social consensus that someone is indeed a freaking bastard who only deserves to die, that's a lot of stress to cope with.
The future is inevitable, but only those ignorant of our capacity for self-fulfilling predictions think that what's going to populate that future is inevitable.
Still can't believe people buy their stock just because it goes up, given that they are the closest thing to a James Bond villain.
I've been tempted to. "Everything will be terrible if these guys succeed, but at least I'll be rich. If they fail I'll lose money, but since that's the outcome I prefer anyway, the loss won't bother me."
Trouble is, that ship has arguably already sailed. No matter how rapidly things go to hell, it will take many years before PLTR is profitable enough to justify its half-trillion dollar market cap.
It goes a bit deeper than that since they got funding in the wake of 9/11 and the requests for intelligence and investigative branches of government to do better and coalescing their information to prevent attacks.
So "panopticon that if it had been used properly, would have prevented the destruction of two towers" while ignoring the obvious "are we the baddies?"
To be honest, while I'd heard of it over a decade ago and I've read LOTR and I've been paying attention to privacy longer than most, I didn't ever really look into what it did until I started hearing more about it in the past year or two.
But yeah lots of people don't really buy into the idea of their small contribution to a large problem being a problem.
>But yeah lots of people don't really buy into the idea of their small contribution to a large problem being a problem.
As an abstract idea I think there is a reasonable argument to be made that the size of any contribution to a problem should be measured as a relative proportion of total influence.
The carbon footprint is a good example, if each individual focuses on reducing their small individual contribution then they could neglect systemic changes that would reduce everyone's contribution to a greater extent.
Any scientist working on a method to remove a problem shouldn't abstain from contributing to the problem while they work.
Or to put it as a catchy phrase. Someone working on a cleaner light source shouldn't have to work in the dark.
>As an abstract idea I think there is a reasonable argument to be made that the size of any contribution to a problem should be measured as a relative proportion of total influence.
Right, I think you have responsibility for your 1/<global population>th (arguably considerably more though, for first-worlders) of the problem. What I see is something like refusal to consider swapping out a two-stroke-engine-powered tungsten lightbulb with an LED of equivalent brightness, CRI, and color temperature, because it won't unilaterally solve the problem.
Stock buying as a political or ethical statement is not much of a thing. For one, the stocks will still be bought by people with less strong opinions, and secondly, it doesn't lend itself well to virtue signaling.
Well, two things lead to unsophisticated risk-taking, right... economic malaise, and unlimited surplus. Both conditions are easy to spot in today's world.
Damn it, this unicorn farting rainbows and crapping gold is not yet capable of towing another car. I don't know why they advertise it as a replacement for my current mode of transportation.
You can study those kids and compare them to the reported lifestyle of the parents at the time before their conception.