I need to run some regression models and descriptive statistics on a large dataset. I have a folder of around 500 files (update: txt files) that I would like to merge; together they total about 250 GB.
I know how to merge all the files in a folder, but even though I am running this on a server with 128 GB of RAM, I keep running out of memory.
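For context, the merge I mean is a straightforward read-everything-then-bind approach along the lines of the sketch below (the path, file pattern and separator are just placeholders), which puts the full 250 GB into one in-memory data frame and so runs out of RAM:

```r
# Illustrative only: read every file, then bind all rows into one data frame.
# The whole dataset ends up in RAM at once, which is where it falls over.
files  <- list.files("path/to/folder", pattern = "\\.txt$", full.names = TRUE)
merged <- do.call(rbind, lapply(files, read.table, header = TRUE, sep = "\t"))
```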
I am looking for any tips/advice on how to load and merge these files in a manageable way (if that is possible) using R. I have been looking into packages such as "ff" and "bigmemory"; would these offer me a solution?
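In case it clarifies what I am after, the kind of approach I was imagining with "ff" is something like the untested sketch below (separator and header settings are placeholders): each file is appended to an on-disk ffdf so that only one file's worth of data is in RAM at a time. Would something like this actually scale to 250 GB?

```r
library(ff)

files <- list.files("path/to/folder", pattern = "\\.txt$", full.names = TRUE)

# Read the first file to create the on-disk ffdf backing files,
# then append the remaining files to it one at a time.
merged <- read.table.ffdf(file = files[1], header = TRUE, sep = "\t")
for (f in files[-1]) {
  merged <- read.table.ffdf(x = merged, file = f, header = TRUE, sep = "\t")
}
```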