
Large files with Git: LFS and git-annex


Posted Dec 12, 2018 17:30 UTC (Wed) by derobert (subscriber, #89569)
In reply to: Large files with Git: LFS and git-annex by gebi
Parent article: Large files with Git: LFS and git-annex

That sounds like you were running git-annex repair, which starts by unpacking the repository. But you only ever need to run that if there is an actual error, which should be extremely rare now that git is quite stable. What you want instead is git fsck (to check the git repository) and git-annex fsck (to confirm that annexed files match their checksums). Neither should appreciably grow the repository (git-annex fsck may store a little metadata, such as the last check time).
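As a rough sketch, the routine checks described above look like this (run from inside an initialized git-annex repository):

```shell
# Check the integrity of the git object store itself.
git fsck

# Check that annexed file contents match their recorded checksums.
# Unlike "git annex repair", this does not unpack the repository.
git annex fsck

# Only if the checks above report actual corruption (rarely needed):
# git annex repair
```

git-annex commands can be invoked either as `git annex fsck` or as the single binary `git-annex fsck`; both forms are equivalent.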



Large files with Git: LFS and git-annex

Posted Dec 12, 2018 19:04 UTC (Wed) by gebi (guest, #59940)

Yes, exactly, but from my reading of the docs it was the only method to check whether the replication count of each object still matched what was configured, so it needed to be run regularly even without errors (e.g. I wanted to run it once per week, just like a zfs scrub).
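A weekly scrub-style run like the one described above could be scheduled with a cron entry along these lines (the repository path /srv/annex is a placeholder, not anything from this thread):

```crontab
# m h dom mon dow  command
# Run git-annex fsck every Sunday at 03:00 on an assumed repo path;
# --quiet limits cron mail to actual problems.
0 3 * * 0  cd /srv/annex && git annex fsck --quiet
```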

Large files with Git: LFS and git-annex

Posted Dec 12, 2018 19:11 UTC (Wed) by derobert (subscriber, #89569)

Pretty sure git-annex fsck does that; my runs of it sometimes report fewer than the desired number of copies. It also checks that the data is correct (matches its checksum), detecting any bitrot, though --fast disables that part.

Note that it only checks one repository at a time (which doesn't have to be the local one; that is especially useful for special remotes). So, to detect bitrot, accidental deletion, etc., you need to run it against every repository you trust to keep copies. And it stores the results locally, so you may need git-annex sync to make them known across the git-annex network.
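A sketch of that workflow, assuming a hypothetical remote named "offsite" (the remote name is an illustration, not from the thread):

```shell
# Check the local repository's annexed contents and copy counts.
git annex fsck

# Check a particular remote's copies (works for special remotes too);
# the results are recorded in the local git-annex branch.
git annex fsck --from=offsite

# Propagate the recorded fsck results (git-annex branch state)
# to the rest of the git-annex network.
git annex sync
```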


Copyright © 2025, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds