I feel like this method has changed multiple times over the years and I haven't been able to keep up. Quite simply: how do I upload a regular data frame/table from R to BigQuery?
Here was my original approach and the error it produced:
job <- insert_upload_job(
  project = "projectid",
  dataset = "dataset",
  table = "tablename",
  values = table_wanting_to_upload,
  write_disposition = "WRITE_TRUNCATE",
  create_disposition = "CREATE_IF_NEEDED")  # push the table to BQ
And this is the error:
Error: billing is not a string (a length one character vector).
In addition: Warning message:
'insert_upload_job' is deprecated.
Use 'bq_perform_upload' instead.
See help("Deprecated") and help("bigrquery-deprecated").
Naturally, I follow R's suggestion and use bq_perform_upload instead:
job <- bq_perform_upload(
  project = "projectid",
  dataset = "dataset",
  table = "table_name",
  values = table_wanting_to_upload,
  write_disposition = "WRITE_TRUNCATE",
  create_disposition = "CREATE_IF_NEEDED")  # push the table to BQ
And I get the following error:
Error in as_bq_table(x) : argument "x" is missing, with no default
I don't understand what "x" refers to, and the documentation around the bq_perform_upload function is thin. I then tried the following to satisfy the missing-"x" error:
job <- bq_perform_upload(x = predictions,
  project = "projectid",
  dataset = "dataset",
  table = "table_name",
  write_disposition = "WRITE_TRUNCATE",
  create_disposition = "CREATE_IF_NEEDED")  # push the table to BQ
And I then get the following error, which doesn't make the issue any clearer:
Error in UseMethod("as_bq_table") :
  no applicable method for 'as_bq_table' applied to an object of class "c('grouped_df', 'tbl_df', 'tbl', 'data.frame')"
The method used to be much simpler; it was just:
job <- bq_perform_upload("project", "dataset", "table_name", dataframe)
Answer
bqr_upload_data() from the bigQueryR package still takes plain project/dataset/table strings. Most of its arguments can be left at their defaults; you only need to load and authenticate the package first:

library(bigQueryR)

bqr_auth()  # authenticate the session

bqr_upload_data(projectId = "project",
                datasetId = "dataset",
                tableId = "table_name",
                upload_data = predictions,
                create = "CREATE_IF_NEEDED",
                autodetect = TRUE,  # infer the schema from the data frame
                wait = TRUE)