I want to use Unison to sync two file systems of about 2.5TB, but every time I try, the host on which I run the job kills it (OOM kill) because Unison uses a ridiculous amount of memory. (The host runs Ubuntu and has 6GB RAM and 2GB of swap.)
Is there any automated way to make Unison use less RAM, or to split the job into multiple tasks, without the risk of missing parts of the data?
I can manually create multiple prf files, each with its own set of paths, but that means I have to make sure the prf files get updated every time ANY person creates a new folder or file in the root. That's a recipe for disaster: you'll always find that exactly the important bit of data was not copied because a path entry was missing.
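To illustrate what I mean, here is a rough sketch of the kind of workaround I'm trying to avoid maintaining by hand: a script that regenerates one small profile per top-level directory before each run. The roots and the /tmp/unison-demo paths are placeholders, not my real setup; I'd copy the generated files into ~/.unison and run each one with unison &lt;name&gt; -batch.

```shell
#!/bin/sh
# Demo root with a couple of top-level folders standing in for the real data.
ROOT_A=/tmp/unison-demo/a
ROOT_B=/tmp/unison-demo/b
PRF_DIR=/tmp/unison-demo/prf
mkdir -p "$ROOT_A/photos" "$ROOT_A/docs" "$ROOT_B" "$PRF_DIR"

# One prf per top-level directory, each restricted to a single path,
# so each Unison run only has to hold one subtree's state in memory.
for d in "$ROOT_A"/*/; do
    name=$(basename "$d")
    printf 'root = %s\nroot = %s\npath = %s\n' \
        "$ROOT_A" "$ROOT_B" "$name" > "$PRF_DIR/$name.prf"
done

ls "$PRF_DIR"
```

The problem is exactly what the loop hides: the script has to be re-run before every sync, and anything created between the scan and the sync (or anything sitting as a loose file in the root rather than a directory) is silently skipped.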
The path parameter is, for unclear reasons, the only Unison parameter that takes path/file information but does not support regexes or wildcards.
(I've tried Google and DuckDuckGo, but was unable to find anything useful.)