
I am using the ff package to do linear regression in R. Here tt is an as.ffdf object. The code goes as follows:

> fit <- lm(ADA ~ DUMMY + NLEAD + BIG4 + LOGMKT + LEV + ROA + ROAL + LOSS +  
+                CFO + BTM + GROWTH + ALTMAN + ABSACCRL + 
+                STDEARN + TENURE + YEAR_FE , data = tt, weight = WEIGHT)

However, I am getting the following error:

Error: cannot allocate vector of size 2.0 Gb

How can I pre-create fit as an ff object, so that fit can hold the entire result being returned to it? Thanks.

Sumit
  • Have a look here: http://www.bytemining.com/2010/08/taking-r-to-the-limit-part-ii-large-datasets-in-r/ – John Mar 10 '14 at 08:52

1 Answer


Can't check this against your data since you don't provide any, but this should get you running.

# ffbase extends biglm's bigglm() to work on ffdf objects, fitting the model
# in chunks instead of loading everything into RAM at once.
library(devtools)
install_github("edwindj/ffbase", subdir = "pkg")

require(ffbase)
require(biglm)  # provides bigglm()
# Note: in bigglm() the weights argument is a one-sided formula, not a column name.
fit <- bigglm(ADA ~ DUMMY + NLEAD + BIG4 + LOGMKT + LEV + ROA + ROAL + LOSS +
                CFO + BTM + GROWTH + ALTMAN + ABSACCRL + STDEARN + TENURE + YEAR_FE,
              data = tt, family = gaussian(), weights = ~WEIGHT)

Alternatively, have a look at the example in the help page for ?chunk.ffdf, which shows how to process an ffdf a chunk at a time.
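To illustrate the chunking idea, here is a minimal sketch (not your data — the ffdf and column names below are made up for demonstration) of iterating over an ffdf with chunk() so that only one RAM-sized slice is materialised at a time:

```r
library(ff)
library(ffbase)

# Hypothetical small ffdf standing in for a large on-disk dataset.
dat <- as.ffdf(data.frame(x = rnorm(1e5), y = rnorm(1e5)))

# chunk() returns a list of row-index ranges sized to fit in memory.
# Each slice is pulled into an ordinary data.frame, processed, and discarded,
# so peak memory use stays bounded regardless of the total number of rows.
total <- 0
for (i in chunk(dat)) {
  block <- dat[i, ]          # only this slice is loaded into RAM
  total <- total + sum(block$x)
}
```

This is the same pattern bigglm() uses internally on ffdf objects: each chunk updates the model's running sufficient statistics, which is why the full design matrix never has to fit in memory.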