
Because Continuum had a loss leader with Anaconda that, in 2010, was solving a ton of packaging problems.

Today I would say Anaconda brings more problems than it solves (https://www.bitecode.dev/p/why-not-tell-people-to-simply-use), but at the time, we didn't have wheels for everything, and it was a lifesaver for anybody who wanted to use C extensions.
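For context on what wheels fixed: a wheel ships the compiled extension inside the archive, with the supported interpreter and platform encoded in the filename, so no compiler is needed at install time. A sketch of what that looks like today (package version and tags are illustrative):

  $ pip download cryptography --no-deps
  ...
  Saved ./cryptography-42.0.5-cp39-abi3-manylinux_2_28_x86_64.whl
  # cp39-abi3: CPython 3.9+ stable ABI
  # manylinux_2_28_x86_64: glibc baseline for Linux x86-64

Before wheels existed for these packages, installing them meant compiling against the right headers on your own machine, which is exactly the step Anaconda let people (especially on Windows) skip.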

So Anaconda first became popular because it solved a real end-user problem, then it moved on to being a corporate provider because it was already well known.

It was a very good strategy.




The issue is there's C extensions, and there's C extensions. Something like cryptography is self-contained (kinda; we're sweeping a lot under the carpet with Rust here), whereas something like pytorch cares much more about the environment it was built in and will run in (and then there are things like mpi4py, which you can't really distribute wheels for on PyPI). conda, by basically distributing a full userland (which has its own issues), can handle all those packages, and even hairier ones (e.g. how would you manage the R packages you use through rpy2?). And because it runs on Windows (similar solutions before and after conda were tied to, or at least started with, unix-based systems), it replaced the less general tools on Windows, which usually only supported a limited set of packages (e.g. ActivePython, Enthought).
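To make the "full userland" point concrete, here's a hedged sketch of a single conda command pulling a Python interpreter, a heavy C++ stack, an MPI implementation, and R into one environment (package names assume the conda-forge channel; versions are illustrative):

  # one solver run covers Python, native libs, MPI, and R
  conda create -n mixed-stack -c conda-forge \
      python=3.11 pytorch openmpi mpi4py r-base rpy2

The MPI library, the R interpreter, and the shared libraries pytorch links against are all ordinary packages to conda's solver, which is the part none of the wheel-based tooling can express.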

PyPI-distributed wheels (you can obviously build wheels however you like, but that doesn't mean they'll run on others' systems) may at some point get close to what conda does (likely by reinventing it in an ad-hoc fashion), but there's enough mindshare (especially around its target of "data science", where all the tutorials are built around conda) that I don't see it disappearing.
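The closest pip gets today is refusing to build anything locally, which works as long as every dependency ships a compatible wheel (a sketch; the package choice is illustrative):

  # fail rather than fall back to compiling an sdist
  pip install --only-binary :all: numpy scipy

That covers the cryptography end of the spectrum, but there's still no standard way to say "this wheel needs the system's MPI" or to ship a non-Python runtime like R alongside it.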




