
I use read.csv.ffdf from the ff package to load an 830MB CSV file with about 8,800,000 rows and 19 columns:

library(ff)
library(ffbase)
green_2018_ff <- read.csv.ffdf(file = "green_2018.csv", header = TRUE)

But when I check the size of green_2018_ff using object_size from the pryr package, the object appears to be about 1.13GB in memory:

library(pryr)
object_size(green_2018_ff)  # 1.13 GB

I thought an ffdf was only a memory-mapping object, so it should take very little RAM, much less than the original CSV. Is there anything wrong with my code or data? Thanks.

Kim.L

1 Answer


It's the call to object_size itself that pulls your data into RAM.
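
The ffdf wrapper in R stays small; pryr's object_size traverses the whole object, which forces the mapped data into memory in order to measure it. As a rough check that avoids this, you can measure the wrapper with base R's object.size (which does not follow the external pointers to the on-disk data) and look at the on-disk footprint via the ff backing files. A minimal sketch, assuming green_2018_ff from the question is already loaded:

library(ff)
library(ffbase)

# Size of the ffdf wrapper alone; base object.size does not follow
# the external pointers to the disk-backed data, so this stays small.
print(object.size(green_2018_ff))

# Total size of the ff backing files on disk, summed per column.
# physical() returns the list of ff vectors underlying the ffdf,
# and filename() returns each vector's backing-file path.
backing_files <- sapply(physical(green_2018_ff), filename)
sum(file.size(backing_files))

The second number should be in the same ballpark as the original CSV (ff's binary encoding can be larger or smaller per column), while the first confirms that the R-side object itself occupies very little RAM.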