I've been tasked with migrating our entire PVCS repository to Git, including all of the history. The only approach I've come up with is to run the PVCS VLOG command to extract the revision history for all files into a text file, and then parse that file (with a C# program) to get the list of revisions for each file.
Then, revision by revision, I GET the given revision of the file from PVCS, ADD the file to Git, and COMMIT. So for each of the ~14,000 files there will be a commit for every revision of that file (and each file could have anywhere from 1 to 100+ revisions). Am I crazy in thinking this will work? Are there just going to be too many commits, making the repo too large and unwieldy?
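For reference, here's a rough sketch of the loop I have in mind. The paths, the PVCS `get` invocation, and the `LoadRevisionsSortedByDate` helper (which would consume my parsed VLOG output) are placeholders, not my actual code:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;

class PvcsToGitMigration
{
    // One parsed entry from the VLOG dump (names are illustrative).
    record Revision(string WorkFile, string RevNumber, string Author, DateTime Date, string Comment);

    static void Main()
    {
        string workTree = @"C:\migration\worktree";   // assumed Git working tree location
        Directory.CreateDirectory(workTree);
        Run("git", "init", workTree);

        // Hypothetical: revisions parsed from the VLOG output, sorted by
        // check-in date so the Git history comes out roughly chronological.
        foreach (Revision rev in LoadRevisionsSortedByDate())
        {
            // Placeholder PVCS get; the exact command and flags depend on the install.
            Run("get", $"-r{rev.RevNumber} \"{rev.WorkFile}\"", workTree);

            Run("git", $"add \"{rev.WorkFile}\"", workTree);

            // Preserve the original author and check-in date on the commit.
            // (Real code would escape quotes in the comment text.)
            Run("git",
                $"commit --author=\"{rev.Author} <{rev.Author}@example.com>\" " +
                $"--date=\"{rev.Date:o}\" -m \"{rev.Comment}\"",
                workTree);
        }
    }

    static IEnumerable<Revision> LoadRevisionsSortedByDate()
    {
        // Hypothetical: parse the VLOG dump produced earlier and yield
        // revisions ordered by check-in date.
        throw new NotImplementedException();
    }

    static void Run(string exe, string args, string workingDir)
    {
        var psi = new ProcessStartInfo(exe, args)
        {
            WorkingDirectory = workingDir,
            UseShellExecute = false
        };
        using var p = Process.Start(psi) ?? throw new Exception($"Failed to start {exe}");
        p.WaitForExit();
        if (p.ExitCode != 0)
            throw new Exception($"{exe} {args} failed with exit code {p.ExitCode}");
    }
}
```

The idea is to keep the C# side dumb (just shelling out to PVCS and Git) so all the versioning logic stays in the tools themselves.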