
As part of migrating data from ADLS Gen1 to ADLS Gen2 using ADFv2, we have the following scenario.

source -> raw/datasourceA/2019/2019-Aug/12-Aug-2019/files
          raw/datasourceA/2019/2019-Aug/13-Aug-2019/files
          raw/datasourceA/2020/2020-Apr/02-Apr-2020/files

target -> raw/eval/datasourceA/12-Aug-2019/files
          raw/eval/datasourceA/13-Aug-2019/files
          raw/eval/datasourceA/02-Apr-2020/files

One option to achieve this is to keep the source-path-to-target-path mappings in a table and read each row using an ADF Lookup activity. However, doing so means one table entry per dated leaf folder, so the table ends up with a very large number of rows.
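
For illustration, this is roughly what a Lookup activity (with "First row only" unchecked) would return from such a mapping table: one row per dated leaf folder, so the row count grows with every day of data. The column names SourcePath and TargetPath are hypothetical.

    {
        "count": 3,
        "value": [
            { "SourcePath": "raw/datasourceA/2019/2019-Aug/12-Aug-2019/", "TargetPath": "raw/eval/datasourceA/12-Aug-2019/" },
            { "SourcePath": "raw/datasourceA/2019/2019-Aug/13-Aug-2019/", "TargetPath": "raw/eval/datasourceA/13-Aug-2019/" },
            { "SourcePath": "raw/datasourceA/2020/2020-Apr/02-Apr-2020/", "TargetPath": "raw/eval/datasourceA/02-Apr-2020/" }
        ]
    }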

Is there any other way to achieve this dynamically in ADF? In the control table I just want to have the source and target paths below, with the rest handled by ADF.

source path -> raw/datasourceA/
target path -> raw/eval/datasourceA/
Ritesh

1 Answer


Because your folders are hierarchical, I support your idea of passing the file path as a parameter to the Copy activity. In ADF it is more convenient to traverse one folder hierarchy at a time. The steps below show the setup; a pipeline JSON sketch putting them together follows the list.

  1. Declare an array-type variable and assign it the value ["2019/2019-Aug","2020/2020-Apr"].

  2. Specify the source file path via Add dynamic content: @concat('raw/datasourceA/', item()).

  3. Then sink to the target folder raw/eval/datasourceA/.

  4. After the pipeline runs, we can see the source folders were copied to the target folder.
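
Putting the steps together, here is a minimal pipeline JSON sketch of the approach. The dataset names (Gen1Binary, Gen2Binary) and the folderPath dataset parameter are assumptions for illustration; the recursive Binary source combined with PreserveHierarchy on the sink is what makes the day-level sub-folders (e.g. 12-Aug-2019) land directly under raw/eval/datasourceA/.

    {
        "name": "MigrateDatasourceA",
        "properties": {
            "variables": {
                "monthFolders": {
                    "type": "Array",
                    "defaultValue": [ "2019/2019-Aug", "2020/2020-Apr" ]
                }
            },
            "activities": [
                {
                    "name": "ForEachMonthFolder",
                    "type": "ForEach",
                    "typeProperties": {
                        "isSequential": true,
                        "items": { "value": "@variables('monthFolders')", "type": "Expression" },
                        "activities": [
                            {
                                "name": "CopyToGen2",
                                "type": "Copy",
                                "description": "Copies one month folder recursively; PreserveHierarchy keeps the day-level sub-folders.",
                                "inputs": [ {
                                    "referenceName": "Gen1Binary",
                                    "type": "DatasetReference",
                                    "parameters": { "folderPath": "@concat('raw/datasourceA/', item())" }
                                } ],
                                "outputs": [ {
                                    "referenceName": "Gen2Binary",
                                    "type": "DatasetReference",
                                    "parameters": { "folderPath": "raw/eval/datasourceA" }
                                } ],
                                "typeProperties": {
                                    "source": {
                                        "type": "BinarySource",
                                        "storeSettings": { "type": "AzureDataLakeStoreReadSettings", "recursive": true }
                                    },
                                    "sink": {
                                        "type": "BinarySink",
                                        "storeSettings": { "type": "AzureBlobFSWriteSettings", "copyBehavior": "PreserveHierarchy" }
                                    }
                                }
                            }
                        ]
                    }
                }
            ]
        }
    }

Note that isSequential is set to true here; see the comment below about lease errors when the loop runs in parallel.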

Joseph Xu
  • Hi @Joseph, it works fine. However, the ForEach loop only works in sequential mode with this approach; I get an error with parallel execution. The same issue is highlighted in this SO post: https://stackoverflow.com/questions/62356952/leasealreadypresent-error-in-azure-data-factory-v2#comment118988889_62356952 – Ritesh May 08 '21 at 06:15
  • Sorry, sir. I haven't reproduced your problem. :( – Joseph Xu May 12 '21 at 06:47
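
For anyone hitting the same LeaseAlreadyPresent error mentioned in the comment above: the loop succeeds in sequential mode, so the workaround is to force the ForEach to run sequentially. A sketch of the relevant setting (monthFolders is the variable from step 1):

    "typeProperties": {
        "isSequential": true,
        "items": { "value": "@variables('monthFolders')", "type": "Expression" }
    }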