
I am trying to download a folder that is inside my Google Cloud Storage bucket. I read the Google docs for gsutil cp and ran the line below.

gsutil cp -r appengine.googleapis.com gs://my-bucket

But I am getting the error

CommandException: No URLs matched: appengine.googleapis.com

Edit

When I run the command below

gsutil cp -r gs://logsnotimelimit .

I get the error

IOError: [Errno 22] invalid mode ('ab') or filename: u'.\logsnotimelimit\appengine.googleapis.com\nginx.request\2018\03\14\14:00:00_14:59:59_S0.json_.gstmp'

Sudhanshu Gaur

5 Answers


What is the appengine.googleapis.com parameter in your command? Is that a local directory on your filesystem you are trying to copy to the cloud bucket?

The gsutil cp -r appengine.googleapis.com gs://my-bucket command you provided will recursively copy a local directory named appengine.googleapis.com to your cloud bucket named my-bucket. If that's not what you are doing, you need to construct your command differently.

I.e. to download a directory named folder from your cloud bucket named my-bucket into the current location, try running gsutil cp -r gs://my-bucket/folder .

Update: since it appears that you're using a Windows machine (the "\" directory separators instead of "/" in the error message) and the filenames contain the ":" character, the cp command will end up failing when creating those files, with the error message you're seeing.
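
For reference, here is a minimal sketch of one possible workaround using the google-cloud-storage Python client (my own choice, not something from the question): it downloads the same objects but replaces ":" in each path component so the local filenames are valid on Windows. The bucket and prefix names come from the question; the "-" replacement character is arbitrary.

# Minimal sketch: download a prefix from the bucket on Windows, replacing
# ':' (invalid in Windows filenames) with '-'. Assumes the
# google-cloud-storage package is installed and credentials are configured.
import os
from google.cloud import storage

BUCKET = "logsnotimelimit"           # bucket name from the question
PREFIX = "appengine.googleapis.com"  # "folder" (object prefix) to download
DEST = "."                           # local destination directory

client = storage.Client()
for blob in client.list_blobs(BUCKET, prefix=PREFIX):
    if blob.name.endswith("/"):      # skip directory placeholder objects
        continue
    # Object names use '/' separators; sanitize each component for Windows.
    safe_parts = [part.replace(":", "-") for part in blob.name.split("/")]
    local_path = os.path.join(DEST, *safe_parts)
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    blob.download_to_filename(local_path)
    print("downloaded", blob.name, "->", local_path)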

Mihail Russu
  • Can you please take another look at my updated post? I am now getting a new error. – Sudhanshu Gaur Mar 26 '18 at 14:37
  • Are you there?? – Sudhanshu Gaur Mar 26 '18 at 15:00
  • The `gsutil cp -r gs://logsnotimelimit .` command looks fine, assuming you have the rights to read from `logsnotimelimit` and write locally, which it seems you do. I just tried running a similar one right now and it worked as expected, copying the entire bucket into the current directory. Not sure why it's not working for you, but you may want to make sure you're running the latest gcloud suite via the `gcloud components update` command. Also try adding the `-m` param to run in multiple threads and see whether the issue is with this one file, i.e. `gsutil -m cp -r gs://logsnotimelimit .` – Mihail Russu Mar 26 '18 at 15:02
  • Hi Sudhanshu, I am not an expert on Google Storage, but is it possible the complex characters (colons, full stops, etc.) in the file name are causing confusion and giving the "bad filename" error? On the link [here](https://cloud.google.com/storage/docs/naming) there are some guidelines on file naming. [This person](https://stackoverflow.com/questions/44915833/gsutil-will-not-download-files-to-my-windows-machine-from-powershell) had a similar problem and it turned out to be filename related. – Paul Mar 26 '18 at 16:40
  • @MihailRussu I first updated my `gcloud components` and then tried the `gsutil -m cp -r gs://logsnotimelimit .` command, but I am still getting the same error. – Sudhanshu Gaur Mar 26 '18 at 16:56
  • @Paul thanks for helping me out here; the naming is the default naming used by App Engine, e.g. `logsnotimelimit/appengine.googleapis.com/stdout/2018/03/14/14:00:00_14:59:59_S0.json` – Sudhanshu Gaur Mar 26 '18 at 16:59
  • @SudhanshuGaur are you on a Windows machine by any chance (or using a Windows filesystem)? Since the directory separator shown in your error message appears to be `\` instead of `/`, it's probably safe to assume that you are, in which case @Paul is likely right: you cannot have files with the `:` character in their names on a Windows filesystem. Give it a try on a Linux/macOS machine and it will probably work. – Mihail Russu Mar 26 '18 at 17:17
  • Yeah, that solved the problem; it was because the names of the JSON files had `:` in them. Thanks a lot. You can update your answer so that I can accept it. – Sudhanshu Gaur Mar 26 '18 at 17:42
  • @SudhanshuGaur ok, great! Update posted. – Mihail Russu Mar 26 '18 at 19:16

Just wanted to help people out if they run into this problem on Windows. As administrator:

  • Open C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\platform\gsutil\gslib\utils
  • Delete copy_helper.pyc
  • Change the permissions for copy_helper.py to allow writing
  • Open copy_helper.py
  • Go to the function _GetDownloadFile
  • On line 2312 (at time of writing), change the following line
download_file_name = _GetDownloadTempFileName(dst_url)

to the following (the objective is to remove the colons):

download_file_name = _GetDownloadTempFileName(dst_url).replace(':', '-')
  • Go to the function _ValidateAndCompleteDownload
  • On line 3184 (at time of writing), change the following line
final_file_name = dst_url.object_name

to the following (the objective is to remove the colons):

final_file_name = dst_url.object_name.replace(':', '-')
  • Save the file, and rerun the gsutil command
  • FYI, I was using the command gsutil -m cp -r gs://my-bucket/* . to download all my logs, whose names by default contain `:`, which does not bode well for Windows files!

Hope this helps someone. I know it's a somewhat hacky solution, but seeing as you should never have colons in Windows filenames anyway, it's fine to do and forget. Just remember that if you update the Google SDK you'll have to redo this.
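
The patch above boils down to mapping GCS object names onto Windows-safe relative paths. Here is a standalone sketch of that same mapping, applied to the example filename from the question; the helper name, the character set, and the "-" replacement are my own choices, not part of gsutil.

import os
import re

# Characters that are not allowed in Windows file names (':' is the one that
# breaks the App Engine log exports discussed in this question).
_WINDOWS_INVALID = re.compile(r'[<>:"|?*]')

def windows_safe_path(object_name, dest_dir="."):
    # Split on the GCS '/' separator and sanitize each component.
    parts = object_name.split("/")
    safe_parts = [_WINDOWS_INVALID.sub("-", part) for part in parts]
    return os.path.join(dest_dir, *safe_parts)

name = "appengine.googleapis.com/nginx.request/2018/03/14/14:00:00_14:59:59_S0.json"
print(windows_safe_path(name))
# On Windows this prints:
# .\appengine.googleapis.com\nginx.request\2018\03\14\14-00-00_14-59-59_S0.json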

Eugene

This is also gsutil's way of saying "file not found". The mention of URLs is just confusing in the context of local files.
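
If you want a clearer failure for the local-file case, a small wrapper can check the path before invoking gsutil. This is just a sketch; the source path and bucket name below are placeholders.

import os
import subprocess
import sys

src = "appengine.googleapis.com"  # local path you intend to upload (placeholder)
dst = "gs://my-bucket"            # destination bucket (placeholder)

# Surface a plain "not found" instead of the confusing "No URLs matched".
if not os.path.exists(src):
    sys.exit("local path not found: " + src)

subprocess.run(["gsutil", "cp", "-r", src, dst], check=True)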

huoneusto

I got the same issue and resolved it as follows.

  1. Open a Cloud Shell and copy the objects with the gsutil command.

gsutil -m cp -r gs://[some bucket]/[object] .

  2. In the shell, zip those objects with the zip command.

zip [some file name].zip -r [some name of your specific folder]

  3. In the shell, copy the zip file into GCS with the gsutil command.

gsutil cp [some file name].zip gs://[some bucket]

  4. In a Windows Command Prompt, copy the zip file from GCS with the gsutil command.

gsutil cp gs://[some bucket]/[some file name].zip .

I hope this information helps someone. A rough script of the same workflow is sketched below.
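
The sketch below automates the same workaround from Cloud Shell (Linux, where ":" in filenames is fine). The bucket and folder names are placeholders, and the archive name logs.zip is my own choice.

import shutil
import subprocess

BUCKET = "my-bucket"                 # placeholder bucket name
FOLDER = "appengine.googleapis.com"  # folder (prefix) to package up

# 1. Copy the objects from GCS into the current Cloud Shell directory.
subprocess.run(["gsutil", "-m", "cp", "-r", "gs://%s/%s" % (BUCKET, FOLDER), "."], check=True)

# 2. Zip the downloaded folder into logs.zip.
shutil.make_archive("logs", "zip", root_dir=".", base_dir=FOLDER)

# 3. Upload the zip back to the bucket; later, on Windows, download it with
#    gsutil cp gs://<bucket>/logs.zip .  -- the zip name contains no ':'.
subprocess.run(["gsutil", "cp", "logs.zip", "gs://%s" % BUCKET], check=True)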

Delta

Be careful: in this command the file path is case sensitive, so check that the problem is not a capitalization issue.
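
If you are unsure of the exact casing, a quick check with the google-cloud-storage Python client can tell you which variant actually exists; the bucket and object names below are placeholders.

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")  # placeholder bucket name

# Object names in GCS are case sensitive, so only the exact name will match.
for candidate in ("Folder/File.json", "folder/file.json"):
    exists = bucket.blob(candidate).exists()
    print(candidate, "->", "exists" if exists else "not found")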