We have a local directory with 20,000 images that need to go to GCS. Using R (and googleCloudStorageR), we can loop over each image and upload it to GCS like so:
# setup
library(googleCloudStorageR)
gcs_auth(json_file = 'my/gcs/admin/creds.json')
base_url <- '../path/to/local-images/directory/'  # local directory holding the images
all_png_images <- list.files(base_url)
# and loop
for (i in seq_along(all_png_images)) {
  googleCloudStorageR::gcs_upload(
    file = paste0(base_url, all_png_images[i]),
    bucket = 'my-bucket',
    name = paste0('images-folder/', all_png_images[i]),
    predefinedAcl = 'default'
  )
}
This works perfectly. However, it would be much better if I could simply point at the directory and upload everything at once, rather than having to loop over each file. I have tried the gcs_save_all function, with no success:
googleCloudStorageR::gcs_save_all(
  directory = 'path-to-all-images',
  bucket = 'my-bucket'
)
This throws the error:

2020-10-01 16:23:47 -- File size detected as 377.1 Kb
2020-10-01 16:23:47> Request Status Code: 400
Error: API returned: Cannot insert legacy ACL for an object when uniform bucket-level access is enabled. Read more at https://cloud.google.com/storage/docs/uniform-bucket-level-access
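The error message points at uniform bucket-level access being enabled on the bucket. A quick way to confirm that setting from R is sketched below; it assumes gcs_get_bucket passes the JSON API's iamConfiguration field through in the parsed bucket metadata:

# check whether uniform bucket-level access is enabled on the bucket
bucket_meta <- googleCloudStorageR::gcs_get_bucket('my-bucket')
# assumed field layout, mirroring the Buckets resource of the JSON API
bucket_meta$iamConfiguration$uniformBucketLevelAccess$enabled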
I am trying to find out why gcs_save_all is not working, or if there is another way I can do this in R.
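For reference, the shape I'm after is roughly the sketch below: a single call that walks the directory itself. The upload_directory helper is hypothetical, and predefinedAcl = 'bucketLevel' is an assumption based on the error message and newer gcs_upload documentation, not something I have verified against my installed version:

# hypothetical helper: upload every file in a directory in one call,
# using the bucket-level ACL instead of a legacy per-object ACL
upload_directory <- function(directory, bucket, prefix = 'images-folder/') {
  files <- list.files(directory, full.names = TRUE)
  for (f in files) {
    googleCloudStorageR::gcs_upload(
      file = f,
      bucket = bucket,
      name = paste0(prefix, basename(f)),
      predefinedAcl = 'bucketLevel'  # assumed value for uniform bucket-level access
    )
  }
  invisible(files)
}

upload_directory('../path/to/local-images/directory', 'my-bucket')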