Wondering if anyone has successfully managed to serialize and unserialize an R model into a Postgres database. I have tried different approaches (serializing to JSON, raw, etc.) without success. I'm using the RPostgreSQL package.
Pseudo-code, not working:
# SERIALIZE
library(RPostgreSQL)
fit <- lm(reading ~ ., mdata.sel)
pgcon <- mpr.getDBConnection()
on.exit(dbDisconnect(pgcon))
# coerce the serialized raw vector into a one-column data frame named "poly"
df <- data.frame(serialize(fit, NULL))
colnames(df) <- c("poly")
dbWriteTable(pgcon, "ptest",
             value = df, append = TRUE, row.names = FALSE)
# UNSERIALIZE
rows <- dbGetQuery(pgcon, "SELECT encode(poly::bytea, 'escape') FROM ptest")
iter_model <- postgresqlUnescapeBytea(rows[["encode"]])
model <- unserialize(iter_model)
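A purely in-memory round trip works fine, so the corruption has to come from the trip through the database. Quick sanity check (no database involved, using fit from above):

# serialize() gives a raw vector and unserialize() restores a usable model from it
ser <- serialize(fit, NULL)
fit2 <- unserialize(ser)
all.equal(coef(fit2), coef(fit))  # TRUE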
EDIT
Found a sample in How to write and read binary data with RPostgresql, which does store the model; unfortunately, the retrieved object is corrupt when unserialized.
con <- mpr.getDBConnection()
on.exit(dbDisconnect(con))
dbGetQuery(con, "CREATE TABLE byteatable (name text NOT NULL, val bytea, PRIMARY KEY (name))")
# serialize the model to a raw vector and escape it for use in a bytea literal
ser <- serialize(fit, NULL, ascii = FALSE)
iq <- sprintf("INSERT INTO byteatable VALUES ('%s', E'%s');",
              "name1", postgresqlEscapeBytea(con, ser))
dbGetQuery(con, iq)
# read back, unescape and unserialize
rows <- dbGetQuery(con, "SELECT * FROM byteatable")
ser2 <- postgresqlUnescapeBytea(rows[[2]])
unserialize(ser2)  # CORRUPT
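One workaround I am considering, in case the bytea escaping itself is the culprit: skip bytea entirely and store the serialized model as base64 text. A rough sketch only (assumes the base64enc package and the same mpr.getDBConnection() helper; the table modeltable is made up), not what I would prefer long term:

library(RPostgreSQL)
library(base64enc)

con <- mpr.getDBConnection()
on.exit(dbDisconnect(con))
dbGetQuery(con, "CREATE TABLE IF NOT EXISTS modeltable (name text PRIMARY KEY, val text)")

# serialize to a raw vector, then encode it as a plain base64 string
ser <- serialize(fit, NULL)
b64 <- base64encode(ser)
dbGetQuery(con, sprintf("INSERT INTO modeltable VALUES ('%s', '%s');", "name1", b64))

# read back, decode the base64 text and unserialize
rows <- dbGetQuery(con, "SELECT val FROM modeltable WHERE name = 'name1'")
fit2 <- unserialize(base64decode(rows$val[1]))

Still, I would rather understand how to make the bytea route work with RPostgreSQL, so any pointers on where the corruption comes from are appreciated.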