
Google makes it difficult to get your data if you are not experienced in programming. I ran an export of all of our company data using the Google Data Export tool.

It shows the root folder, and to download it, I run this command (the export page fills it in automatically):

gsutil -m cp -r \
  "gs://takeout-export-myUniqueID" \
  .

But I have no idea where it would save the data, since I am not a GCP customer, only a Google Workspace customer. Workspace support won't help because they say it's a GCP product, even though I am exporting from Workspace. Craziness.

Can someone let me know the proper command to run on my local machine with Google's Cloud SDK to download this folder? I was able to start the download yesterday, but it reported an invalid character in some of the file names and killed the export.

Appreciate any help!
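For reference, a minimal sketch (not taken from the export page) assuming the placeholder bucket name above: `gsutil cp` writes into whatever its last argument is, so giving it an explicit local folder instead of `.` makes the destination unambiguous. The `D:\Takeout` path is hypothetical and should exist before the copy runs.

# Sketch, assuming the placeholder bucket name: copy the whole bucket
# into an explicit local folder instead of the working directory ".".
# "D:\Takeout" is a hypothetical destination; create it first.
gsutil -m cp -r "gs://takeout-export-myUniqueID" "D:\Takeout"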

  • Maybe it's just me and I can't see the relation of this to Python, but does this question have something to do with programming? If it doesn't, Stack Overflow isn't the place to ask it. – FourBars Feb 10 '22 at 16:59
  • hey @FourBars, I was told by GCP support that I may be able to get help here. Just went off what they told me. Apologies if this is not the correct place. – BMoreIT Feb 10 '22 at 17:05
  • Please add more clarity to your question. Whenever I've run Google Takeout (is this what you're using?), Google emails a link to a download file (e.g. zip), click that and you're off to the races. Your question suggests (`gsutil`) that, somehow you've asked for (been given) a link to a Google Cloud Storage bucket which is unusual. It's possible that you've been given what's called a "Signed URL" which is a time-bound link to your data (zipped) in a Google Cloud Storage bucket. That link will expire. You should have been given a link (rather than a `gsutil` command) that you click to download. – DazWilkin Feb 10 '22 at 18:28
  • @DazWilkin Thanks for the help. The process you described is for 1 user but I am doing an entire organization. It goes to the data export page above, which provides a link to GCP's "Project" area. I get a root folder that I can drill down into and then download a ZIP file for the data. I was hoping to download the entire root folder, but GCP doesn't allow that and says I have to download via the CLI. It automatically inputs the command above, but I do not know where that downloads the root folder to. Ideally, I would like to download it to my local drive. It is not downloading to my drive. – BMoreIT Feb 10 '22 at 19:16
  • Got it. Assuming you have installed `gcloud` (which includes `gsutil`), then that command will copy your archive to the current working directory (i.e. `.`). If you want it to go elsewhere, you can provide an absolute reference instead of `.`. `-m` is to parallelize (faster), `-r` is to recursively descend through the entire bucket. – DazWilkin Feb 10 '22 at 19:30
  • @DazWilkin Yes, I downloaded the SDK because I read somewhere that you cannot download locally with the CLI in the cloud. I did successfully start some type of export to my desktop BUT... Windows didn't like the characters in some folder and it stopped the export. Do you know of any switch that will bypass that issue, maybe by removing those characters? (See the sketch after these comments.) This is all new to me and I appreciate the help. – BMoreIT Feb 10 '22 at 19:39
  • I'm unfamiliar with Windows. You can safely remove the backslash (\\) characters you included. Those are unnecessary. That *may* help. – DazWilkin Feb 10 '22 at 19:42
  • Does this answer your question? [Error "No URLs matched" When copying Google cloud bucket data to my local computer?](https://stackoverflow.com/questions/49491936/error-no-urls-matched-when-copying-google-cloud-bucket-data-to-my-local-comput) – Osvaldo Feb 14 '22 at 17:35
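As far as I can tell, `gsutil` has no switch that renames objects on the fly, so a hedged workaround for the invalid-character failure, again assuming the placeholder bucket name, is to list the objects and copy the few offending ones individually under Windows-safe names:

# Sketch: list every object in the export bucket so the problem
# file names can be identified.
gsutil ls -r "gs://takeout-export-myUniqueID" > objects.txt

# Then copy an offending object to a sanitized local name, one at a time.
# The source object path here is hypothetical; ":" is one character NTFS rejects.
gsutil cp "gs://takeout-export-myUniqueID/user/mail:archive.mbox" "D:\Takeout\mail_archive.mbox"

The bulk of the archive can still come down with the recursive `cp` command above; only the files Windows rejects would need this per-file treatment.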

0 Answers