
I would like to know how I can solve the following problem. I have a table in PostgreSQL with 100 million rows and 4 columns that I would like to read into R as an ffdf. Here's my code:

library(RPostgreSQL)  # DBI driver for PostgreSQL
library(ETLUtils)     # provides read.dbi.ffdf

query <- "select * from ratings"
drv <- dbDriver("PostgreSQL")
rating.ff <- read.dbi.ffdf(query = query,
                           dbConnect.args = list(drv, user = "postgres",
                                                 password = "mypassword"),
                           VERBOSE = TRUE)  # ETLUtils spells this argument VERBOSE

I get the following error:

    Error in postgresqlExecStatement(conn, statement, ...) : 
     RS-DBI driver: (could not Retrieve the result : 
     out of memory for query result)

Could someone help me solve this?

Jilber Urbina
    Maybe read the documentation for that function and use the arguments that let you specify the chunk size that the results are fetched in...? – joran Feb 08 '13 at 21:39
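As joran's comment hints, read.dbi.ffdf accepts first.rows and next.rows arguments that control how many rows are fetched per batch, so each chunk is appended to the on-disk ffdf rather than the whole result being pulled at once. A minimal sketch of that approach, reusing the connection details from the question (the chunk sizes below are illustrative, not tuned values):

library(RPostgreSQL)
library(ETLUtils)

# Fetch 100,000 rows for the first chunk (used to set up the ffdf's columns),
# then 500,000 rows per subsequent chunk, appending each to the ffdf on disk.
rating.ff <- read.dbi.ffdf(query = "select * from ratings",
                           dbConnect.args = list(drv = dbDriver("PostgreSQL"),
                                                 user = "postgres",
                                                 password = "mypassword"),
                           first.rows = 100000,
                           next.rows = 500000,
                           VERBOSE = TRUE)

This limits how much R holds in memory at any one time; whether it also avoids the driver-level buffering behind the error above depends on the DBI backend, which is what the answer below points at.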

1 Answer


This message comes from the RPostgreSQL package and indicates that the query result is too large to be handled under your current database settings. I don't believe this is an R memory issue; you need to tune your database configuration so that it can handle returning larger result sets.
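If raising the relevant limits isn't an option, one workaround worth mentioning is a server-side cursor, which makes PostgreSQL stream the rows on demand instead of the driver materialising all 100 million at once. A rough sketch with plain RPostgreSQL calls (the cursor name and the 50,000-row batch size are placeholders, not recommendations):

library(RPostgreSQL)

con <- dbConnect(dbDriver("PostgreSQL"), user = "postgres",
                 password = "mypassword")

# Cursors only exist inside a transaction, so open one explicitly.
dbGetQuery(con, "BEGIN")
dbGetQuery(con, "DECLARE ratings_cur CURSOR FOR select * from ratings")

repeat {
  chunk <- dbGetQuery(con, "FETCH 50000 FROM ratings_cur")
  if (is.null(chunk) || nrow(chunk) == 0) break  # cursor exhausted
  # process or append `chunk` here, e.g. into an ffdf
}

dbGetQuery(con, "CLOSE ratings_cur")
dbGetQuery(con, "COMMIT")
dbDisconnect(con)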