notatallshaw's comments

> b) you can print important travel documents ahead of time c) you can bring a backup device on your travels.

While I remain on the fence about passkeys, when things go wrong I often see advocates make arguments along the lines of: "why didn't you securely store your backups and keep them with you at all times?"

Firstly, it's unhelpful when a user is already in that situation: it does nothing to solve the problem they have right now.

Secondly, it clearly faces a scalability issue. A significant proportion of passkey users right now tend to be well informed about the authentication model and its threats, but when it scales to billions of users many of them will not be well informed, and will not think about how to prepare for when things go wrong, potentially locking themselves out of everything they would need to fix the situation.

While I appreciate the advantage this model gives, I struggle to imagine it will work well on the unsuspecting public.


> I struggle to imagine it will work well on the unsuspecting public.

Right, that's because there's a 0% chance we end up in a world where everyone who uses the internet reliably carries a Yubikey or similar backup on their person. It's wild to me how many people here are seriously proposing this as a solution. That might work fine for a typical HN poster, but expecting it to scale to the general public is some insane tech bubble myopia.

In reality companies will either provide enough workarounds to negate any security benefits, or a whole bunch of people are going to lose access to accounts they own. If passkeys ever become widespread, I'm guessing we'll see more of the former, leading to a lot of unnecessary confusion and very little practical benefit.


You seem to be conflating problems and different groups of people that aren't directly related.

To clarify some different groups:

* The faster-cpython project, headed by Guido and his team at Microsoft, is continuing fine. Most of the low-hanging fruit was accomplished by 3.11; further improvements were moved to medium-term goals, and they were delayed further by the free-threaded project, which broke assumptions that had been made to optimize CPython. They have adapted, but it has pushed big optimizations out to later releases (think probably 3.15ish).

* The free-threaded project, initiated by Sam Gross at Meta, is continuing fine. It was never intended to be ready by 3.13; the fact that there is even a build officially published is very fast progress. There isn't yet a plan to make it the default: depending on the compatibility of the free-threaded build, it could be a quick transition or a 5+ year switch-over.

* The PSF, the Steering Council, and the CoC WG are all different groups with different responsibilities, and they aren't typically involved in day-to-day decisions about committing particular features.

* The release manager is a core developer in charge of making the final call on whether a particular feature is stable or not. It was the 3.13 release manager who decided to revert the new GC, which was intended to generally improve performance for non-free-threaded builds, and which may still land in a future release with sufficient fine-tuning.

Now, there are clearly communication issues in areas of the Python community, but there are also a lot of people getting on with great work and improvements, and communicating fine.


Thanks for the party line, backed by the usual flaggers. This is how Python maintains its market share and popularity. Google though does not seem too impressed by Python any longer. Others will follow.


> Thanks for the party line, backed by the usual flaggers

The previous post was wildly conflating different things, which I think is a real disservice to the actual events that happened.

I think there are systemic issues with the current structure of community governance, but having the events that demonstrate this conflated with completely unrelated work makes those issues more difficult to address: now there is misinformation floating around, and criticism can be dismissed as uninformed.


There was an attempt to make imports lazy: https://peps.python.org/pep-0690/

It was ultimately rejected due to issues with how it would need to change the dict object.

IMO all the rejection reasons could be overcome with a more focused approach and implementation, but I don't know if there is anyone willing to give it another go.
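In the meantime you can approximate lazy imports for individual modules with the stdlib; a minimal sketch using importlib.util.LazyLoader, adapted from the importlib docs ("json" is just an illustrative target):

    import importlib.util
    import sys

    def lazy_import(name):
        # Resolve the module spec now, but defer executing the module
        # body until the first attribute access.
        spec = importlib.util.find_spec(name)
        loader = importlib.util.LazyLoader(spec.loader)
        spec.loader = loader
        module = importlib.util.module_from_spec(spec)
        sys.modules[name] = module
        loader.exec_module(module)
        return module

    json = lazy_import("json")   # cheap: the module body has not run yet
    json.dumps({"a": 1})         # the real import happens here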


> Please consider consolidating python dependency management instead of fragmenting it: https://github.com/mamba-org/mamba

Mamba doesn't even interact with the official Python packaging ecosystem... It is purely a conda replacement, and conda packaging is not Python packaging (it's a more fundamental fragmentation than choosing a particular tool). So it's a weird suggestion for consolidating Python dependency management.

If you depend on both conda and standard Python packaging (e.g. PyPI.org), you can use Pixi: https://github.com/prefix-dev/pixi?tab=readme-ov-file#pixi-p.... Pixi attempts to bridge the conda world and the Python packaging world, so it's possible to rely more deeply on standards-based packaging but still use conda tooling when you need it.


You don't need to have your biz tasks in the same venv as your airflow worker to take advantage of task mapping or XCOM.

You just need a well defined interface on how to call your biz tasks, and then you can use any of ExternalPythonOperator, DockerOperator, DockerSwarmOperator, KubernetesPodOperator, etc. etc. or write your own to pass in values or data to your task however you want.
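For instance, a minimal sketch using the TaskFlow flavor of ExternalPythonOperator (Airflow 2.4+); the venv path and biz_package are hypothetical stand-ins for your own interface:

    import pendulum
    from airflow.decorators import dag, task

    @dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1), catchup=False)
    def biz_pipeline():
        @task.external_python(python="/opt/biz-venv/bin/python")  # hypothetical venv path
        def run_biz_task(customer_id: int) -> dict:
            # Runs in the external venv, not the Airflow worker's environment;
            # imports here resolve against that venv's site-packages.
            from biz_package import process_customer  # hypothetical package
            return process_customer(customer_id)

        # The returned dict round-trips through XCom like any other task result.
        run_biz_task(customer_id=42)

    biz_pipeline()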

Airflow is quite complex, and I don't recommend it as people's go-to, but IMO that's in large part because it is so unopinionated about how you call and run your tasks and leaves the configuration up to you. But this also means it ends up being a lot of people's choice, because they can get it to fit what they need.


Can you please expand on or explain this analogy?

I'm struggling to connect how Google's last major browser-engine competitor potentially losing its funding due to a separate monopoly issue is similar to laws being passed to reduce deaths and hospitalizations.


FOF ¶ 334. Google’s default placements on Firefox generate 80% of Mozilla’s overall operating revenue, demonstrating that the vast majority of query volume on Firefox goes through defaults.

Ref: https://s3.documentcloud.org/documents/25032745/045110819896...


The DOJ is concerned with moving the needle on one specific issue (search monopoly a.k.a. seatbelts) and is not concerned about the side effects on related entities (Firefox a.k.a. undertakers).


The analogy is: Sometimes the one negative outcome (Mozilla & hospitals losing money) is a price well paid for another positive one (regulating monopolies & reduced deaths).


Not using a seatbelt is unhealthy. Using Google search as the default is unhealthy as well.


They are not Google's competitor; they are their partner in crime.


A JIT is now available in CPython main. It's not that performant yet, so it won't be turned on by default for Python 3.13. The informational PEP is here (still being reviewed; check the Discourse thread for more details): https://peps.python.org/pep-0744/


Windows 10 has for some time been nagging me about my local account.

In the start menu it shows a yellow "!" over my profile icon and if I click that it informs me that I should sign into a Microsoft account.

I'm able to dismiss it, but then it reappears a couple of weeks later.


Can you try switching to LTSC/IoT? There are various grey-market (or darker) solutions available.


With all the shenanigans Microsoft is pulling I no longer feel any guilt or trepidation about getting an LTSC key through a gray area.


I've been running LTSC for a while. Just about everything worked, until I downloaded the latest Forza Motorsport: it won't start because all the Xbox "apps" are missing. It's the only game I've tried so far that doesn't work.


Ironically, your comment probably explains why my Forza Horizon 4 fails to launch; thanks for the reply. I'd probably rather not have the game than have the Xbox apps, to be honest. (Though game bar still seems to exist when I search for it...)


> Python distribution and packaging is just fundamentally horribly broken

It's clearly not, because most people use it successfully.

The problem of distribution and packaging is often a matter of user expectations vs. the actual problem.

The user expectation is that Python is a high-level language and will run the same across different machines regardless of OS and hardware.

The actual problem is Python is a glue language often depending on lots of libraries that are sensitive to how they were compiled and what hardware they are targeting. So people can end up moving the glue of the project first and then expecting everything else to just work.

Things are getting better (e.g. https://lukeplant.me.uk/blog/posts/python-packaging-must-be-...), a lot of people have put a lot of work into standards and improving the ecosystem, but it is a hard problem that most other popular languages don't interface with nearly as much.


> It's clearly not, because most people use it successfully.

Well, no, because it's perfectly possible to successfully use a horribly broken system. I use Python "successfully"; it has just meant I have spent probably literal weeks of my life fighting pip and virtualenv and relative imports, and finding I need flags like `--config-settings editable_mode=compat`.

> The problem of distribution and packaging is often a matter of user expectations vs. the actual problem.

Ha yes, I expect it to work reliably and simply and it doesn't!

> The actual problem is Python is a glue language often depending on lots of libraries that are sensitive to how they were compiled and what hardware they are targeting.

That's completely irrelevant to the kind of problems I was talking about. I run into issues with Python failing to compile C libraries relatively rarely! Even compiling Python itself seems to work quite well (maybe not surprising since that's one of the only ways to get a new version on Linux).

It's all the packaging infrastructure that's a mess. Pip, virtualenv, setuptools, and also the module import system are a total disaster. You don't see questions like this for Go:

https://stackoverflow.com/questions/14132789/relative-import...


> It's all the packaging infrastructure that's a mess. Pip, virtualenv, setuptools, and also the module import system are a total disaster. You don't see questions like this for Go:

> https://stackoverflow.com/questions/14132789/relative-import...

I think you'll find the further you delve into it, the less distinct the problems are. A lot of the issues with the module import system, pip, virtualenv, setuptools, etc. are because they are designed to support a vast range of things, from being depended on to interact with system libraries, to downloading sdists and compiling arbitrary languages, etc.

Though the specific example you linked was largely solved with Python 3, there was a lot of confusion during the 2 to 3 transition because people had to support both behaviors, but most people don't have to think about Python 2 any more.
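For reference, a quick sketch of the Python 3 behavior, where relative imports must be explicit and bare names are always absolute:

    # pkg/__init__.py  (can be empty)

    # pkg/helpers.py
    def greet():
        return "hello"

    # pkg/main.py  (run as: python -m pkg.main)
    from . import helpers        # explicit relative import: fine
    from .helpers import greet   # also fine
    import helpers               # ModuleNotFoundError: bare names are absolute,
                                 # implicit relative imports are gone in Python 3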


> Though the specific example you linked was largely solved with Python 3

I can assure you it absolutely was not.

> I think you'll find the further you delve into it, the less distinct the problems are. A lot of the issues with the module import system, pip, virtualenv, setuptools, etc. are because they are designed to support a vast range of things, from being depended on to interact with system libraries, to downloading sdists and compiling arbitrary languages, etc.

Not really. There are plenty of systems that have to "support a vast range of things" that aren't this bad.

In my opinion it's because the core Python devs didn't particularly care about the issue, never really tried to solve it, and as a result we have 10 incompatible half-baked third party solutions.

It's similar to the situation with C/C++ - worse in some ways, better in others (at least there is a de facto package registry in Python). In some ways it's because both languages are very old and predate the idea that packaging should be easy and reliable. That's fine, but please don't pretend that it is easy and reliable now.


> I can assure you it absolutely was not.

The question, as posted, was a confusion about how Python 2 relative importing worked, which was indeed bad. I don't know what you think you are pointing out, as you haven't said, and the question *is* about Python 2.

> Not really. There are plenty of systems that have to "support a vast range of things" that aren't this bad.

> In my opinion it's because the core Python devs didn't particularly care about the issue, never really tried to solve it, and as a result we have 10 incompatible half-baked third party solutions.

I agree that a lot of the solutions were created when there was no real understanding, or thought-out design, of what a good packaging solution would be.

But these ill-thought-out solutions exist exactly because of trying to support this wide range of situations, from working on weird OSes to integrating with strange build systems.

However, what you seem to have missed is that there are now first-party standards on:

* How package installers (pip, poetry, PDM, etc.) should interact with package builders (setuptools, hatchling, etc.)

* How and where build configuration and project metadata should be stored (pyproject.toml)

Almost every popular Python package tool now supports these standards, meaning they all interact with each other pretty well.
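For instance, a minimal standards-based pyproject.toml looks something like this (hatchling is just one interchangeable backend choice, and the project name is hypothetical):

    [build-system]
    requires = ["hatchling"]
    build-backend = "hatchling.build"

    [project]
    name = "example-package"
    version = "0.1.0"
    dependencies = ["requests"]

Any installer that speaks the standards (pip, Poetry, PDM, etc.) can build and install this without caring which backend is declared.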

Dropping all legacy configuration, and updating the standards to support edge cases is still a long road, but it is a road being travelled and things are getting better.


I don't disagree with what you're saying but that article seems odd. It's just a story of someone installing something once that didn't break immediately. Not only is it anecdotal, it doesn't even seem to confirm whether the pip story is getting better or if they just got lucky.


I agree, but there is no large scale study on this.

As someone who has managed Python distributions at a large company, and who triages issues on the pip GitHub issue tracker, my anecdotal experience is that things are getting better.

The only hard statistic I can point to is that the number of top packages on PyPI offering wheels has gone up substantially, and is close to 100% in the top 500.


They use it fine by using Docker. So many major Python repos on GitHub come with a Dockerfile compared to, say, NodeJS. It's unfortunate, but having dealt with Python packages before, I don't blame them.


Lots of analysts, data scientists, traders, engineers, etc. use Python third-party packages successfully, and have never touched, or maybe even heard of, Docker.

And yeah, in general there are significantly fewer NodeJS third-party packages interfacing with things that directly depend on the OS and hardware. Python has many third-party packages older than NodeJS itself that depend on foreign function interfaces, Win32 COM APIs, talking directly to graphics shaders, etc.
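To illustrate the kind of glue involved, here is a minimal ctypes sketch; it assumes a Unix-like system where the C library can be resolved by name (find_library can return None elsewhere, e.g. on Windows):

    import ctypes
    import ctypes.util

    # Resolve a platform-specific name for the C standard library; this is
    # exactly the sort of OS-dependent detail that makes packaging hard.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    print(libc.abs(-5))  # calls C's abs() through the FFI, prints 5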


Python packages are painful even when nothing native is involved. There are NodeJS packages that rely on native code too; the difference is that the packaging system is a lot simpler to use, though it's starting to get worse with the `import` vs `require` nonsense and TypeScript.


> Actually, it only builds it locally if it can't find a pre-packaged version for your system/arch. Admittedly that's most of the recent ones on a Mac, but there is a difference (I've been using pyenv for nearly ten years[1] now).

Really? I regularly use this on Linux exactly because it compiles from source rather than using the system-provided Python, which often has patches that break pip's test suite.

I've never encountered it trying to use the system-provided Python, but maybe there's a weird quirk in the way I am using it.

