Hacker News new | past | comments | ask | show | jobs | submit login

I don't think so. You'd have to index every library and every version of it, which is a huge index. Google has the "live at HEAD" [1] mentality, so in practice you don't need as much. I don't remember whether xrefs for old versions of the code were retained.

1: https://abseil.io/about/philosophy#upgrade-support




It would be huge, but storage is cheap, and you could build it in phases based on popularity. Maybe paid subscriptions for the full index?

But yeah, it could get expensive to keep the indexes current as things grow.
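The phased approach above can be sketched roughly. This is a hypothetical illustration, not any real indexer: the package names, download counts, and the `indexing_phases` helper are all made up, and "downloads" stands in for whatever popularity signal a registry actually exposes.

```python
# Hypothetical sketch: index packages in phases, most popular first.
# The (name, downloads) pairs below are invented example data.
packages = [
    ("left-pad", 900_000),
    ("obscure-lib", 1_200),
    ("requests", 5_000_000),
]

def indexing_phases(packages, phase_size=2):
    """Yield batches of package names, ordered by popularity,
    so early phases cover the most commonly used libraries."""
    by_popularity = sorted(packages, key=lambda p: p[1], reverse=True)
    for i in range(0, len(by_popularity), phase_size):
        yield [name for name, _ in by_popularity[i:i + phase_size]]
```

Each phase is a batch you could index and ship before moving down the popularity tail, which is where the cost question above really bites.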


Yeah, but you also have to understand the dependency management system of every open source project on GitHub to resolve the right dependencies, which is a Sisyphean task. Google has the advantage that there is only one way to build stuff.


There are only so many package managers out there, and most languages have a "preferred" one, which makes things even easier. Stuff built with ad-hoc systems might be more difficult, but I don't think that's the norm.




