I'm trying to create an SQL database from a set of data.frames in R. These data.frames have the same row names but different columns, so in the SQL database I would like them all in one table: data.frame 1 with n1 columns populates columns 1:n1 of that table, data.frame 2 with n2 columns populates columns (n1+1):(n1+n2), and so on.
For writing the first data.frame (df1 below) to the SQL database table, my code is:
library(sqldf)  # attaches RSQLite, which provides SQLite() and the DBI functions
setwd(db_path)
db <- dbConnect(SQLite(), dbname = db_name)  # open (or create) the database file
dbWriteTable(db, table_id, as.data.frame(df1))  # write df1 as a new table
where db_path is the path to the SQL database, db_name is its name, table_id is the table's name, and df1 is the first data.frame.
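For concreteness, a toy version of the input would look like this (the names and contents of df1 and df2 here are made up for illustration):

df1 <- data.frame(a = 1:3, b = 4:6, row.names = c("r1", "r2", "r3"))
df2 <- data.frame(c = 7:9, d = 10:12, row.names = c("r1", "r2", "r3"))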
Obviously, I can cbind these data.frames and then just dbWriteTable the combined data.frame. Or, for every data.frame I want to add, I can use dbGetQuery to fetch the current contents of table_id, cbind the new data.frame to that, and write the result back to db. My data.frames, however, are quite large, so I was wondering if there's any operation that avoids holding all of them in memory at once before writing them to db. In other words, is there an SQL operation that would append new data.frames directly to table_id as additional columns?
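Something along the lines of the sketch below is what I have in mind: write each data.frame to its own staging table, keeping the row names as an explicit key column, and let SQLite combine the tables with a JOIN so that only one data.frame is in R's memory at a time. The helper write_block, the table names block1/block2/combined, and the key column row_id are all hypothetical, and I'm not sure whether the JOIN itself stays out of R memory:

library(sqldf)

db <- dbConnect(SQLite(), dbname = db_name)

# Hypothetical helper: write one data.frame to its own staging table,
# keeping row names as a key column so the tables can be joined later.
write_block <- function(db, df, tbl) {
  df$row_id <- rownames(df)
  dbWriteTable(db, tbl, df, overwrite = TRUE)
}

write_block(db, df1, "block1")
rm(df1)  # free the first data.frame before loading the next one
write_block(db, df2, "block2")

# Combine the staging tables column-wise inside SQLite itself.
dbExecute(db, "CREATE TABLE combined AS
               SELECT * FROM block1 JOIN block2 USING (row_id)")

Is something like this (or an ALTER TABLE ... ADD COLUMN route) the way to go, or is there a more direct operation?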