I've read multiple answers recommending either filter-branch or BFG for this task, but I feel I need further advice because my situation is a bit peculiar.
I have to manage two repositories, one of which is basically a clone of the other, and ideally I'd like to pull the changes from the origin into the clone on a daily basis. However, the origin repo contains very large files in its history, which exceed GitHub's file size limits, so I have to remove them. At the same time, I don't want to harm the existing commit history beyond the changes to those specific files. From what I understand, BFG performs a complete rewrite of the history, which will fool GitHub into thinking that all existing files were deleted and recreated as new files, whereas filter-branch doesn't do that. But filter-branch is also extremely slow by comparison, and my repository is very large, with about 100,000 commits.
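For context, the filter-branch invocation I've been experimenting with looks roughly like this (the paths are placeholders for the actual offending files, not their real names):

```bash
# Rewrite all branches and tags, dropping the offending files from every commit.
# The three paths below are placeholders for the real large files in my repo.
git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch path/to/big-file-1 path/to/big-file-2 path/to/big-file-3' \
  --prune-empty --tag-name-filter cat -- --all
```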
So I'm trying to figure out the best way to go about this. Should I use BFG at certain points and simply accept that I'm going to see ridiculous pull requests as a result of its modifications, or should I use filter-branch in some manner? To clarify, there are only three files causing this grievance.
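In case it helps, the BFG approach I had in mind is roughly the following (repo URL and file names are placeholders; as far as I understand, BFG matches blobs by file name rather than path and leaves the current HEAD commit untouched by default):

```bash
# Work on a fresh mirror clone, as the BFG docs recommend.
# The repository URL is a placeholder.
git clone --mirror git@github.com:example/origin-repo.git

# Delete the three offending files from all of history.
# File names are placeholders for the real ones.
java -jar bfg.jar --delete-files '{big-file-1.bin,big-file-2.bin,big-file-3.bin}' origin-repo.git

# Expire old references and repack before pushing the rewritten history.
cd origin-repo.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```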