
Since I believe there is no way to strip largefiles out of a repository, I'm looking for a way to either:

  • clone to (create) a new repo that contains at least all the same files, even without history (exporting only the tip revision) if necessary, with all largefiles removed.

  • achieve similar results without manually exporting and then reimporting the files, which would be a painstaking manual process.

If there is no such command, I might be able to write the command I want (in Python).

Warren P
  • For anyone wondering WHY I would want to back out of largefiles: once largefiles is in a repo, it cannot be pushed/synced with a Bitbucket repo! – Warren P May 28 '15 at 17:17

1 Answer


Simply use

hg lfconvert --to-normal <old> <new>

This will convert the repository in directory <old> to a repository in directory <new> with all large files turned back into normal files. Revision hashes will change, but otherwise, the revision history should remain intact.
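For example, a run might look like this (the directory names myrepo and myrepo-normal are hypothetical):

hg lfconvert --to-normal myrepo myrepo-normal

Afterwards you may want to disable the extension for the new repository, e.g. by putting largefiles = ! in the [extensions] section of its .hg/hgrc, so that files are not tracked as largefiles again on later commits.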

If you actually want to first strip all large files from the repository and lose all information associated with them (i.e. if your intent is to destroy the large files rather than keep them), first run:

hg convert --filemap <nolf> <old> <new>

where <nolf> is the path to a file containing the single line:

exclude .hglf

and <old> is the original repository and <new> the target directory for the conversion.
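A minimal sketch of the whole step, assuming the filemap is saved as nolf.txt and the repositories are old-repo and new-repo (all three names hypothetical):

echo "exclude .hglf" > nolf.txt
hg convert --filemap nolf.txt old-repo new-repo

Note that hg convert is provided by the bundled convert extension, so it must be enabled in your configuration (convert = under [extensions]) before this works.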

This conversion will exclude the .hglf directory, which contains all the "stand-in" files for large files. Note that such a conversion will also destroy all commits that only changed largefiles, along with their commit messages, since they become empty commits.

You can also use hg convert with an appropriate --filemap after hg lfconvert --to-normal to selectively delete only some large files.
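For instance, once hg lfconvert --to-normal has turned the large files back into normal files, a filemap containing a line like the following (the path media/huge-video.mp4 is hypothetical) would strip just that one file from the converted history:

exclude media/huge-video.mp4

Then run hg convert --filemap with that file on the converted repository, as above.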

Reimer Behrends