
I have two Data Lake Gen2 storage accounts and I need to transfer about 250 GB of data from one to the other recursively (the whole file system structure). The file system contains a lot of files (tens of thousands).

I've tried:

  1. Downloading the data with the AzCopy CLI (roughly the command sketched below).

  2. Downloading the data with Azure Storage Explorer.

  3. Transferring the data using Azure Data Factory.

All of them result in the same problem: a timeout.
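For reference, the AzCopy download (attempt 1) was of roughly this form; the source account name and the SAS token are placeholders, not the actual values, and the paths are the ones shown in the error below:

    # Sketch of attempt 1; <source-account> and <sas-token> are placeholders.
    azcopy copy "https://<source-account>.dfs.core.windows.net/dev/PROJECT?<sas-token>" "Z:\XXX" --recursive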

Here is the error:

Transfer of 'dev/PROJECT/' to 'Z:\XXX' failed: 120000 items transferred, 
error: failed to perform copy command due to error: cannot start job due 
to error: error listing the files inside the given source url XXX.
...
RESPONSE ERROR (ServiceCode=OperationTimedOut) =====\\nDescription=500
Operation could not be completed within the specified time., Details: (none)\\n

So I think it's because there are so many files in the file system.

Is there any workaround or other way to do it?

--UPDATE--

I've started manually downloading folders one by one using Storage Explorer and most of them were downloaded successfully.
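A scripted equivalent of this folder-by-folder approach, using AzCopy instead of Storage Explorer, would look roughly like the sketch below; the account name, SAS token and folder names are placeholders:

    # Sketch only: download each top-level folder as a separate AzCopy job.
    # <account>, <sas-token> and the folder names are placeholders.
    $folders = @("NB001", "NB002", "NB054")
    foreach ($f in $folders) {
        azcopy copy "https://<account>.dfs.core.windows.net/dev/XXX/${f}?<sas-token>" "C:\XXX\$f" --recursive
    }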

But I ran into one folder that is only several MB in size and I can't download it whatsoever. The download gets stuck at "Starting transfer of 'dev/XXX/NB054/' to 'C:\XXX' (using name and key)". But when I download the files inside the folder individually, it works fine.

Any ideas?

guderkar
  • It sounds like an issue you should contact Azure support about. – silent Jul 12 '19 at 07:28
  • Just curious, with what you have gone through, do you think the size of the files or the volume of files is triggering this? – HimanshuSinha Jul 16 '19 at 22:10
  • I think it's neither, actually. There is a folder that I can't download with Storage Explorer, which is the cause of the timeout. But I can download the contents of the folder (the files inside it). I really don't know where the problem is... – guderkar Jul 16 '19 at 22:15

0 Answers