
I am trying to use the command line on my local machine (Anaconda Prompt) to download a folder from a Google Cloud bucket, using the gsutil command.

I am first able to log in to the project using gcloud auth login.

Doing so opens a browser, where I log in with the email that has been granted permissions for the project, and then I run gcloud config set project PROJECT_ID.

At this point I think I'm able to run gsutil cp -r gs://{bucket_name}/{folder_name} .

However, when I do so, the CLI simply pauses for a little bit, and then a new line appears. No error messages or any indication of anything going on is printed out, and no data is downloaded.

I'm very confused about what the problem might be. In my previous attempts, I got messages saying that I did not have permissions, which I thought I had fixed by logging in with gcloud auth login. But I cannot find any other documented instance on the web of the particular failure mode I'm in. I would be grateful for any help!

  • The command-line option **-r** means recursive. That option should be applied to prefix (folder) names. You are applying it to an object name. For example, to download all of the objects under the prefix **images** to your local directory: **gsutil cp -r gs://{bucket_name}/images .** – John Hanley Oct 05 '21 at 00:49
  • If possible, can you include the actual error message that you received? What version of `gsutil` are you using? – JM Gelilio Oct 05 '21 at 08:14
  • Thank you! Yes, I am actually applying it to a folder. And I am not getting any error messages. It is just clearly not running. – seeker_after_truth Oct 05 '21 at 16:25
  • And I'm using gsutil version 5.2 – seeker_after_truth Oct 05 '21 at 16:26
  • Please post an image of where you executed the command and there's no downloaded file and no error. If you still have the free trial package, you can contact GCP support to investigate your issue further. Regarding the permission issue, double-check the spelling of the bucket name you are trying to access. – JM Gelilio Oct 06 '21 at 08:41
  • Yes, it turns out my mistake was that I was including an extra forward slash (/). – seeker_after_truth Oct 06 '21 at 23:13
  • Thank you! I had the exact same problem. It turns out the sample command GCP generated had a trailing / after the {folder_name}. Removed it and it worked! – Alexandre Nov 02 '21 at 15:17

2 Answers


I had the exact same issue on Windows (cmd) while running the suggested command that was auto-generated by GCP:

gsutil -m cp -r "gs://{bucket_name}/{folder_name}/" .

It turns out the problem was the trailing / (as suggested by @seeker_after_truth in a comment).

The following command worked:

gsutil -m cp -r "gs://{bucket_name}/{folder_name}" .
Alexandre
  • I've had this same problem several times, and this is the fix. I keep forgetting about it. Very dumb; the / should make no difference. – Pavel Komarov Apr 05 '22 at 20:43
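Since the trailing slash is easy to reintroduce when copy-pasting from the console, here is a minimal shell sketch (the bucket and folder names are placeholders) that strips it before the path reaches gsutil:

```shell
# Strip one trailing "/" from a copy-pasted source path (POSIX parameter expansion)
src="gs://my-bucket/my-folder/"   # hypothetical path with the problematic slash
src="${src%/}"                    # removes a single trailing / if present
echo "$src"                       # prints gs://my-bucket/my-folder
# gsutil -m cp -r "$src" .        # the copy itself needs real credentials
```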

You could consider the following sample command to achieve your task.

Switch to your project of choice and run:

gsutil -m cp "<source, i.e., your bucket path followed by *.*>" <destination, i.e., the local directory where you want to download the folder>

For example:

gsutil -m cp "gs://my-bucket-name/*.*" D:\folder1\folder2
MaartenDev
  • I hope this was of help to you. If it answered your question, click the check mark to accept it so that others know you've been (sufficiently) helped. – Abhishek Kumar Oct 07 '21 at 08:52
  • `gsutil -m cp -r gs://my-bucket-name/folder/*.*` worked for weird file types – Jason Dec 26 '22 at 15:26
  • Good answer, especially with some of the naming generated in exports by GCP. – amac Aug 02 '23 at 04:12
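One more detail worth noting: on a Unix-like shell the quotes in the answer above are load-bearing, because an unquoted wildcard may be expanded against local files before gsutil ever sees it (cmd.exe does not glob itself, but the quotes are still harmless there). A small sketch, with a hypothetical bucket name:

```shell
# Quoting keeps the wildcard literal so gsutil can expand it against the bucket
pattern="*.*"
src="gs://my-bucket-name/$pattern"   # hypothetical bucket; wildcard stays intact
echo "$src"                          # prints gs://my-bucket-name/*.*
# gsutil -m cp "$src" ./downloads    # gsutil would expand *.* remotely
```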