I'm encountering problems while trying to read a big .txt file (7.7 GB) into R. The file contains 817426 columns and more than 1000 rows, and all variables are numeric. So far I have tried several packages (data.table, vroom, bigreadr) with the functions fread, vroom, and big_fread2.
With fread, I have been able to read the first 145 rows into my R session, but it crashes once I try to read 146 rows. With the other functions, the session simply aborts after some time with the message:
R session aborted. R encountered a fatal error. The session was terminated
This is the code I have used so far:
library(data.table)
library(vroom)
library(bigreadr)

system.time(dfUga <- fread("CpG_sexageres.txt", nrows = 145, header = TRUE, sep = "\t", colClasses = "numeric"))
system.time(dfUga <- vroom("CpG_sexageres.txt", col_names = TRUE))
system.time(dfUga <- big_fread2("CpG_sexageres.txt"))
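For reference, the row-chunk approach I have been attempting with fread can be sketched as below. The chunk size, the skip bookkeeping, and the final rbindlist are my own illustration (not something I have run successfully on the full file), and this still accumulates everything in memory at the end:

    library(data.table)

    # Illustrative chunk size; the real limit seems to be around 145 rows here
    chunk_size <- 100

    # Read only the header row to recover the column names
    header <- names(fread("CpG_sexageres.txt", nrows = 0, sep = "\t"))

    chunks <- list()
    skip <- 1  # skip the header line on every subsequent read
    repeat {
      chunk <- fread("CpG_sexageres.txt", skip = skip, nrows = chunk_size,
                     sep = "\t", header = FALSE, colClasses = "numeric")
      if (nrow(chunk) == 0) break
      setnames(chunk, header)
      chunks[[length(chunks) + 1]] <- chunk
      skip <- skip + chunk_size
      if (nrow(chunk) < chunk_size) break  # last (partial) chunk reached
    }

    # Combine the chunks; this final step needs memory for the whole table
    dfUga <- rbindlist(chunks)

Note that each fread call re-parses the file from the start to reach the skip position, so with 817426 columns per row this is slow even when it does not crash.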
Any suggestions are highly appreciated. Cheers