
I am trying to load a huge file (>30 GB) into R.

The commands I am running are not only slow, they also exhaust the available RAM:

m = read.table("myfile.txt")
print("Done reading m")
m.row.sums = apply(m,1,sum)
write.table(m.row.sums, file = "myfile_rowsums.txt")
m.col.sums = apply(m,2,sum)
write.table(m.col.sums, file = "myfile_colsums.txt")
m.sub = m[,which(m.col.sums>1000)]
write.table(m.col.sums, file = "myfile_sub.txt")

Is there a more memory-efficient (and faster) way to achieve the same result in R?
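
For reference, here is a rough sketch of one chunked approach I have been considering, in case it clarifies what I am after. It assumes the file is a whitespace-separated numeric matrix with no header; the chunk size and file names are placeholders. Row sums and column sums are accumulated chunk by chunk, so the full matrix is never held in memory, and rowSums/colSums replace the slower apply() calls:

con <- file("myfile.txt", open = "r")
chunk.size <- 10000  # rows per chunk; tune to the available RAM
col.sums <- NULL
row.sums <- numeric(0)

repeat {
  chunk <- tryCatch(
    read.table(con, nrows = chunk.size, colClasses = "numeric"),
    error = function(e) NULL  # read.table errors once the connection is exhausted
  )
  if (is.null(chunk) || nrow(chunk) == 0) break
  row.sums <- c(row.sums, rowSums(chunk))
  col.sums <- if (is.null(col.sums)) colSums(chunk) else col.sums + colSums(chunk)
}
close(con)

write.table(row.sums, file = "myfile_rowsums.txt")
write.table(col.sums, file = "myfile_colsums.txt")

The column subset (columns with sums > 1000) would then need a second pass over the file, for example by setting colClasses = "NULL" for the columns to drop. I am also open to alternatives such as data.table::fread or the ff/bigmemory packages if they handle this better.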
