
Does anyone have experience with large repos of, say, 100 GB? Does jj incur performance penalties compared to native git?



It depends on whether you're talking about 100 GB repository size or working copy size.

- Currently no partial/shallow clones, so you need to materialize the entire repo on disk.

- Working-copy status can take a while (see https://github.com/martinvonz/jj/issues/1841 for the tracking issue). At present this can be mitigated by configuring Watchman as a filesystem monitor (a config sketch follows after this list), and work is underway to improve status further.

- No support for Git LFS at present (even in colocated repos). When using the Git backend with jj, expect the same large-file management problems as with plain Git.

- I haven't noticed any particular performance issues when interacting with the object store of a large repository. It should be roughly the same as Git, since jj uses libgit2 for its Git backend.

- jj is faster for operations that it can do in memory but that Git does on disk, such as various rebase operations (see the sketch after this list).
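
For the Watchman point above, the setup is roughly the following in the user config (e.g. opened with `jj config edit --user`). The key name is from memory, so treat this as a sketch and check the jj docs if it doesn't take effect:

    # enable Watchman-based filesystem monitoring for faster status
    [core]
    fsmonitor = "watchman"
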
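
And a rough illustration of the in-memory rebase point (the revision name is a placeholder): jj rewrites the commits and automatically rebases their descendants without checking files out, only re-materializing the working copy if the working-copy commit itself moved.

    jj rebase -s <some-revision> -d main
    jj st   # files on disk are untouched unless @ itself was rebased
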



