I have two Data Lake Gen2 storage accounts and I need to transfer about 250 GB of data from one to the other recursively (the whole file system structure). The file system contains a lot of files (tens of thousands).
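In effect, what I'm trying to achieve is a recursive account-to-account copy along these lines (just a rough sketch; the account names, paths and SAS tokens are placeholders, not my real values):

# placeholder accounts, path and SAS tokens, only to illustrate the intended copy
azcopy copy "https://<source-account>.dfs.core.windows.net/dev/PROJECT?<SAS>" "https://<destination-account>.dfs.core.windows.net/dev/PROJECT?<SAS>" --recursive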
I've tried:
Downloading the data with the AzCopy CLI.
Downloading the data with Azure Storage Explorer.
Transferring the data using Azure Data Factory.
All resulted in the same problem: a timeout.
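For reference, the AzCopy attempt was a download to a local drive, roughly like this (a sketch with placeholder values, not the exact command I ran):

# placeholder account and SAS; the destination matches the local path from the error below
azcopy copy "https://<account>.dfs.core.windows.net/dev/PROJECT?<SAS>" "Z:\XXX" --recursive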
Here is the error:
Transfer of 'dev/PROJECT/' to 'Z:\XXX' failed: 120000 items transferred,
error: failed to perform copy command due to error: cannot start job due
to error: error listing the files inside the given source url XXX.
...
RESPONSE ERROR (ServiceCode=OperationTimedOut) =====\\nDescription=500
Operation could not be completed within the specified time., Details: (none)\\n
So I think it's because there are so many files in the file system.
Is there any workaround or other way to do it?
--UPDATE--
I've started manually downloading the folders one by one using Storage Explorer, and most of them downloaded successfully.
But I ran into one folder of only a few MB that I can't download at all. The download gets stuck at "Starting transfer of 'dev/XXX/NB054/' to 'C:\XXX' (using name and key)". However, when I download the files inside the folder individually, it works fine.
Any ideas?