
I am trying to work with a file that is around 1 GB. The file is constantly changed, committed to the repository, and updated by other users, and it will only get bigger in the days to come.

The problem is that it takes a long time to commit or update the file since it is 1 GB. I looked up the CVS file size limit and found very little information, other than that it had something to do with swap memory.

Is there a workaround for this? What would be the best possible solution to make such files move around faster? Is it something to do with swap space or the CVS configuration?

Any help/suggestions/directions are much appreciated.

Naveen
  • What is in this file? If it is so big and modified so frequently, is revision control really what you need? I.e., will you ever want to roll back to an earlier version or examine changes made to the file? If you only care about keeping the latest version safe, then a backup might be what you are after. Also note that if the file changes significantly, your repository will grow rapidly as it stores the differences between revisions, and checking out a version will take longer and longer. – Burhan Ali May 26 '12 at 10:08

1 Answer


Adjusting swap space or the CVS configuration will not solve your problem. You would have the same problem with other version control software (e.g. TFS, SVN, SourceAnywhere) as well.

Your file is simply too big; you should find a way to split it into smaller files. By the way, what type of file is it? If it is a compiled binary, consider keeping the source code under version control instead.
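If splitting is an option, one possible approach is to break the file into fixed-size parts and version those instead, so that a change only touches some of the parts. Below is a minimal Python sketch of that idea; the file name `bigdata.bin` and the 64 MB chunk size are illustrative assumptions, not details from the question.

```python
# Minimal sketch: split a large file into fixed-size part files so that
# each part can be versioned separately. The file name and chunk size
# below are hypothetical placeholders.
import os

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB per part; adjust to suit your data
SOURCE = "bigdata.bin"         # hypothetical name for the 1 GB file

def split_file(path, chunk_size=CHUNK_SIZE):
    """Split `path` into numbered part files and return their names."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            part_name = "%s.part%04d" % (path, index)
            with open(part_name, "wb") as dst:
                dst.write(chunk)
            parts.append(part_name)
            index += 1
    return parts

if __name__ == "__main__":
    for name in split_file(SOURCE):
        print(name, os.path.getsize(name))
```

The resulting parts could then be added to CVS as binary files (e.g. `cvs add -kb bigdata.bin.part0000` followed by `cvs commit`); on later commits, parts whose contents have not changed will not get new revisions, which keeps the per-commit cost down.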

Robby Shaw