
I'm using AzCopy to upload local files to Azure Blob Storage.

I'm using this command:

azcopy copy "localpath" "destinationpath(with SAS)" --include="*.csv" --recursive=true

I also tried

azcopy sync "localpath" "destinationpath(with SAS)" --include="*.csv"

The files I'm trying to upload are each 1 GB or larger. When I manually upload a file to the data lake, it takes 40+ minutes per file. With AzCopy it takes 30+ minutes per file and often fails.

Is it normal that it takes this long? Am I doing something wrong or is there a faster way of doing this?

1 Answer


As you may know, AzCopy is already optimized for high performance. Looking at your commands, nothing is missing. If that's the case, there is little to change on the AzCopy side (or maybe you can check whether it's a network issue?).
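One way to check is AzCopy's built-in benchmark, which uploads auto-generated test data to your container and reports the throughput your machine and network path can actually sustain. A sketch, assuming AzCopy v10.2 or later (the account, container, and SAS in the URL are placeholders):

# Upload 10 auto-generated 1-GiB test files and report achieved throughput
azcopy bench "https://<account>.blob.core.windows.net/<container>?<SAS>" --file-count 10 --size-per-file 1G

If the benchmark is also slow, the bottleneck is the network path rather than your command.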

You can also give Azure Data Factory a try; it provides very high performance, with data loading speeds of up to 1 GB/s into Data Lake Storage Gen1.
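Before switching tools, a couple of AzCopy tuning knobs are worth trying as well. A sketch, assuming AzCopy v10 (the flag and environment variable names are from its documentation; the values here are only starting points, not recommendations):

# Raise the number of concurrent requests (by default AzCopy derives this from the CPU count)
export AZCOPY_CONCURRENCY_VALUE=32

# Larger blocks can help with multi-GB files on high-latency links
azcopy copy "localpath" "destinationpath(with SAS)" --include="*.csv" --recursive=true --block-size-mb=100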

– Ivan Glasenberg
Yes, we just found the problem. We have to use a proxy, but apparently the proxy was being throttled and gave us very poor upload speed. We talked to our network team and it is now solved, so it was indeed a network-related problem. – mrdeadsven Oct 03 '19 at 10:37
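For readers who hit the same proxy issue: AzCopy v10 reads the standard HTTPS_PROXY environment variable (on Windows it otherwise falls back to the system proxy settings), so you can point it at a different proxy to compare throughput. A sketch with a placeholder proxy address:

# Route AzCopy's traffic through a specific proxy (placeholder host and port)
export HTTPS_PROXY=http://proxy.example.com:8080
azcopy copy "localpath" "destinationpath(with SAS)" --include="*.csv" --recursive=true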