The Issue
I am attempting to copy the contents of an R dataframe df to a PostgreSQL table table_name located in schema schema_name. By default, PostgreSQL writes tables to the public schema, and I do not want to change this setting. The two unique aspects of this transfer are:
- Writing to a table under a non-default schema; and
- The dataframe df contains fewer fields than table_name. All of the fields in df, however, do exist in table_name.
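For concreteness, the structures look roughly like this (the column names other than extra_col are placeholders; the real table has at least one column, extra_col, that df lacks):

```r
# Hypothetical setup -- all names except extra_col are placeholders.
# The PostgreSQL table already exists:
#   CREATE TABLE schema_name.table_name (
#     id        integer,
#     value     numeric,
#     extra_col text      -- present in the table, absent from df
#   );

# The R dataframe to append, which lacks extra_col:
df <- data.frame(id = 1:3, value = c(0.1, 0.2, 0.3))
```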
What I've Tried
I first attempted to use dbWriteTable from the RPostgreSQL package, using the workaround of passing the schema and table names as a character vector:

dbWriteTable(con, c("schema_name","table_name"), df, append = T)
resulting in the following exception:
Error in postgresqlgetResult(new.con) :
RS-DBI driver: (could not Retrieve the result : ERROR: missing data for column "extra_col"
CONTEXT: COPY df, line 1: " [removed contents] "
I then attempted to use dbWriteTable2 from the caroline package (a wrapper around the aforementioned dbWriteTable function), but the non-default-schema workaround employed above does not appear to carry over:

dbWriteTable2(con, c("schema_name","table_name"), df, append = T, add.id = FALSE)
which produces the following error:
creating NAs/NULLs for for fields of table that are missing in your df
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not Retrieve the result : ERROR: relation "schema_name" does not exist
LINE 1: SELECT * FROM schema_name ORDER BY id DESC LIMIT 1
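One manual workaround would be to pad df with NA columns (which are written as NULLs) so that it matches the table exactly before appending. A sketch, assuming con is a live RPostgreSQL connection and that information_schema is queryable:

```r
library(RPostgreSQL)

# Fetch the target table's column names from the catalog.
cols <- dbGetQuery(con,
  "SELECT column_name FROM information_schema.columns
   WHERE table_schema = 'schema_name' AND table_name = 'table_name'")$column_name

# Add each column missing from df as NA; COPY will send these as NULL.
for (missing in setdiff(cols, names(df))) df[[missing]] <- NA

dbWriteTable(con, c("schema_name", "table_name"), df,
             append = TRUE, row.names = FALSE)
```

I would prefer a solution that does not require mutating df in this way, if one exists.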
How can I append df to schema_name.table_name, with the columns missing from df filled with NULLs?