
I'm running the following code block eight times, with different queries, across three different R scripts. I have already established the connection to the server.

rs <- dbSendQuery(con, "select owner_name, owner_domain, count(*) as avg
                        from kapsule, recview_history
                        where recview_history.vsrc = 'em'
                        and recview_history.g_conf_id = kapsule.g_conf_id
                        group by owner_name;")

d <- fetch(rs, n = 0)
d$test <- apply(d, 1, function(row) 1)
dp <- ddply(d, .(test), transform, percentile = ecdf(avg)(avg))
write.csv(dp, file = "/tmp/creator_data/embeds.csv")
rm(rs)
rm(d)
rm(dp)

I'm running this on a very large data set, and after the first two CSVs are created in each script, the script gives the following error:

Error in mysqlExecStatement(conn, statement, ...) :
RS-DBI driver: (connection with pending rows, close resultSet before continuing)

When I ran the three scripts earlier on a smaller data set, they worked fine. Is the problem due to the large size of the data in the MySQL server? Any help in this regard will be appreciated. Thank you in advance.

m_amber
  • You never close `rs` with `dbClearResult` – hadley Jun 28 '13 at 14:49
  • I corrected my mistake. dbGetQuery works much better than dbSendQuery – m_amber Jul 01 '13 at 06:58
  • Is this still an ongoing issue? From my experience in dealing with large data. You have to set the timeout to 0. Or infinite. There is a timeout for connections that autocloses it. Hope this helps – Luigi Apr 16 '15 at 09:12
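Based on the comments above, the error means a `dbSendQuery` result set was left with unfetched rows: RMySQL refuses to run a new statement on a connection that still has a pending result. A minimal sketch of the corrected loop body, assuming the same `con` and query as in the question (not a tested fix against this database):

```r
library(RMySQL)  # assumes an already-open connection `con`, as in the question

rs <- dbSendQuery(con, "select owner_name, owner_domain, count(*) as avg
                        from kapsule, recview_history
                        where recview_history.vsrc = 'em'
                        and recview_history.g_conf_id = kapsule.g_conf_id
                        group by owner_name;")
d <- fetch(rs, n = -1)  # n = -1 fetches ALL remaining rows, leaving none pending
dbClearResult(rs)       # release the result set before the next query on `con`

# Or let dbGetQuery do send / fetch-all / clear in a single call:
d <- dbGetQuery(con, "select ... ;")
```

The `dbGetQuery` form is what m_amber switched to; it avoids the pending-rows state entirely because it always clears the result before returning.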

0 Answers