I am baffled as to why a simple query takes so long when executed through RPostgreSQL but completes almost instantly through another client. I have a PostgreSQL table with 500 columns and 100 rows. I would like to read the whole table into a data frame via the dbReadTable function. Failing that, even reading in a single row via dbGetQuery would be fine.
library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, dbname = "MYDBNAME", host = "MYHOST",
                 port = 5432, user = "MYUSER", password = "MYPASSWORD")

# Time fetching a single row
before <- Sys.time()
res <- dbGetQuery(con, "SELECT * FROM detailed WHERE project_code='ABCD1234' LIMIT 1;")
after <- Sys.time()

# Time reading the whole table
before2 <- Sys.time()
res2 <- dbReadTable(con, "detailed")
after2 <- Sys.time()

dbDisconnect(con)  # Close PostgreSQL connection

elapsed <- after - before
elapsed2 <- after2 - before2
elapsed
> Time difference of 50.31242 secs
elapsed2
> Time difference of 50.38002 secs
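To try to narrow down where the time goes, I could time the query execution and the row fetch separately instead of using dbGetQuery. A minimal sketch (same connection details as above, not yet tested) would be:

library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, dbname = "MYDBNAME", host = "MYHOST",
                 port = 5432, user = "MYUSER", password = "MYPASSWORD")

# Send the query, then fetch the result, timing both together
timing <- system.time({
  rs  <- dbSendQuery(con, "SELECT * FROM detailed WHERE project_code='ABCD1234' LIMIT 1;")
  res <- fetch(rs, n = -1)   # pull all rows of the result set
  dbClearResult(rs)
})
print(timing)                # user/system/elapsed breakdown

dbDisconnect(con)
dbUnloadDriver(drv)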
However, I can run SELECT * FROM detailed; in about 0.5 seconds through another client.

Why does this query take so long to run through RPostgreSQL? What can I do to speed it up? Could this be a driver issue with RPostgreSQL?
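If it would help to rule out the driver, I could repeat the single-row query with the newer RPostgres package instead. A rough sketch (untested; assumes RPostgres is installed) would look like this:

# Hypothetical cross-check with the RPostgres driver (not yet tried)
library(DBI)
library(RPostgres)

con2 <- dbConnect(RPostgres::Postgres(), dbname = "MYDBNAME", host = "MYHOST",
                  port = 5432, user = "MYUSER", password = "MYPASSWORD")

t1 <- Sys.time()
res <- dbGetQuery(con2, "SELECT * FROM detailed WHERE project_code='ABCD1234' LIMIT 1;")
Sys.time() - t1   # compare against the ~50 s seen with RPostgreSQL

dbDisconnect(con2)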