
> Nothing about your problems had anything to do with git & everything to do with the commercial service you were using for your source code hosting.

All the commercial service providers recommend keeping total repository sizes under 1GB or so. From those who foolishly exceed those limits I hear nothing but performance complaints and how much they miss perforce, even when self-hosting on solid hardware - which is 100% the fault, or at least limitation, of git - I believe you'll agree.

LFS is suggested as an alternative by several commercial service providers, not just one, and seems to be one of the least horrible options with git. You're certainly not suggesting any better alternatives, and I really wish you would, because I would love for them to exist. But LFS means a second auth system on top of my regular git credentials, a recentralization that defeats most of the point of using a DVCS in the first place, and a second set of parallel commands to learn, use, and remember. I got tired enough of explaining to people why their checkout was broken when they cloned an LFS repository before installing the LFS extension that I wrote a FAQ entry somewhere I could link them to. If you don't think these are problems with "git", we must simply agree to disagree, for there will be no reconciling of viewpoints.
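(For what it's worth, that FAQ entry boiled down to roughly this, run from inside the broken checkout once the LFS extension is actually installed:)

    git lfs install   # one-time: registers the LFS smudge/clean filters
    git lfs pull      # downloads the real blobs and replaces the pointer files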

When I first hit the quota limits, I tried to set up caching. Failing that, I tried setting up a second LFS server and having CI pull blobs from that first when pulling simple incremental commits not touching said blobs. Details escape me this long after the fact - I might've tried to redirect LFS queries to gitlab? After a couple of hours of failing to get anywhere with either, despite combing through the docs and trying things that looked like they should've worked, I tried to pay github more money - on top of my existing monthly subscription - as an ugly business-level kludge to a technical issue of using more bandwidth than should really have been necessary. When that too failed... now you want to pin the whole problem on github? I must disagree. We can't pin it on the CI provider either - I had trouble convincing git to use an alternative LFS server for blobs when fetching upstream, even when testing locally.
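(If memory serves, the shape of what I was fighting with looked roughly like the below - the cache host is a made-up placeholder, and the exact details may be off after all this time:)

    # point LFS at a different server while still fetching git objects from origin
    git config lfs.url "https://lfs-cache.example.com/api/lfs"
    # or scoped to a single remote
    git config remote.origin.lfsurl "https://lfs-cache.example.com/api/lfs"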

I've tried gitlab. I've got a bitbucket account and plenty of tales of people trying to scale git on that. I've even got some Microsoft hosted git repositories somewhere. None of them magically scale well. In fact, so far in my experience, github has scaled the least poorly.

> Github the company is not interested in providing you (or anyone else) with free storage for arbitrary data.

I pay github, and tried to pay github more, and still had trouble. Dispense with this "free storage" strawman.

> You were unable to pay for the storage options they do provide because you did not have admin rights to the github account you wanted to work with.

To be clear - I was also unable to pay to increase LFS storage on my fork, because its LFS objects still counted against the original repository's quota. Is this specific workaround-for-a-workaround-for-a-workaround failing github's fault? Yes. When git and git lfs both failed to solve the problem, github also failed to solve the problem. But don't overgeneralize the one anecdote of a failed github-specific solution, out of a whole list of git problems, into the whole problem being github's fault.

> None of this is a problem with git, be it GUI git clients or command line ones.

My git GUI complaints are a separate issue, which apparently I shouldn't have merely summarized for this discussion.

Clone https://github.com/rust-lang/rust and run your git GUI client of choice on it. git and gitk (ugly, buggy, and featureless though it may be) handle it OK. Source Tree hung/paused frequently enough that I uninstalled it, but not so frequently as to be completely unusable. I think I tried a half dozen other git GUI clients, and they all repeatedly hung or showed progress bars for minutes at a time, without ever settling down, during basic local use involving local branches and local commits - no interaction with a remote. Presumably due to insufficient lazy evaluation or insufficient caching. These problems were not unique to that repository either, and occurred on decent machines with an SSD holding both the git GUI install and the clone. These performance problems are 100% on those git GUI clients. Right?
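(The repro is nothing exotic - roughly:)

    git clone https://github.com/rust-lang/rust.git
    cd rust
    gitk --all    # slow, but it settles down; most GUI clients I tried never did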

> This isn’t just "technically correct".

Then please share how to simply scale git in practice. Answers that involve spending money are welcome. I haven't figured it out, and neither has anyone I know. You can awkwardly half-ass it by making a mess with git lfs. Or git annex. Or maybe the third-party git lfs dropbox or git bittorrent stuff, if you're willing to install more unverified, unreviewed, never-upstreamed random executables off the internet to maybe solve your problems. I remember using bittorrent over a decade ago for gigs/day of bandwidth, back when I had much less of it to spare.
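(The git lfs version of that mess is at least short to write down - a minimal sketch, with the file pattern and asset name as stand-ins:)

    git lfs install                          # once per machine
    git lfs track "*.bin"                    # route matching files through LFS pointers
    git add .gitattributes big-asset.bin
    git commit -m "move large assets to LFS"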

> It’s the "a commercial company doesn’t have to provide you with a service if they don’t want to" kind of correct.

If it were one company declining to provide a specific commercial offering to solve a problem, you'd have a point. No company offering to solve my problem for git to my satisfaction - while a few do offer it for perforce - is what I'd call a git ecosystem problem.




No one is saying git doesn’t have problems. It's just weird that you keep on conflating issues with Github with issues with git.


I'm conflating at most one github-specific issue (singular), not "issues". And I'm doing so because it's at best a subproblem of a subproblem of a subproblem.

If my computer caught fire and exploded due to poor electrical design, and in my growing list of frustrations I offhandedly mentioned breaking a pencil tip after resorting to pencil and paper - what with the whole computer being unavailable and all - you wouldn't say "nothing about your problems had anything to do with your computer and everything to do with the specific company that provided your pencils". That would be weird.

Even if we did hyperfocus on that pencil - pretty much every pencil manufacturer is giving me roughly the same product, and the fundamental problem of "pencils break if you grip them too hard" isn't company-specific. It's more of a general problem with pencils.

Github gave me a hard quota error. Maybe Gitlab would just 500 on me, or soft-throttle me to heck to the point where CI times out. Maybe Bitbucket's anti-abuse measures would have taken action and I'd have been required to contact customer support to explain and apologize to get unbanned. git lfs's fundamental problem of being difficult to configure to scale via caching or distribute via mirroring isn't company-specific. It's more of a general problem with git lfs. Caching and mirroring are strategies nearly as old as the internet for distribution - git lfs should be better about using them.

Better caching and mirroring in git lfs would've turned github's hard quota error into a non-event, non-issue, non-problem - just like such errors are with core git. Alternatively, core git should be better about scaling. Or, as a distant third alternative, I could suggest a business solution to a technical problem - GitHub should be better about letting me pay them to waste their bandwidth. Then I could work around git's poor scaling a little bit more, for a bit longer.



