
I have a 900 MB zip file in my blob storage. How can I unzip it within the Azure platform itself?

I have tried blob-to-blob unzipping using a Logic App, but the maximum file size there is only 50 MB.

Any input is appreciated.

Shyamlesh

2 Answers


You have the option to go with Azure Data Factory. Azure Data Factory supports decompressing data during copy: specify the compression property in an input dataset, and the copy activity reads the compressed data from the source and decompresses it.

There is also an option to specify the property in an output dataset, which makes the copy activity compress the data before writing it to the sink.

For your use case, you need to read compressed data (for example, GZIP) from an Azure blob, decompress it, and write the resulting data to an Azure blob, so define the input Azure Blob dataset with the compression type set to GZIP.
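As a concrete sketch, the input dataset below uses the ADF v2 Binary dataset JSON. The names `ZippedBlobDataset` and `AzureBlobLinkedService`, the container, and the file name are placeholders; since the question is about a .zip file, the compression type is set to `ZipDeflate` (use `GZip` for .gz files):

```json
{
    "name": "ZippedBlobDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "AzureBlobLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input",
                "fileName": "bigarchive.zip"
            },
            "compression": {
                "type": "ZipDeflate"
            }
        }
    }
}
```

A sink dataset defined the same way but without the `compression` property makes the copy activity write the decompressed files out as-is.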

Link - ADF - compression support

Abhishek
  • But in ADF it asks you to map every dataset with the input and output parameters. If I am unzipping a folder in ADF that has 5 CSV files inside it, do I have to unzip the files inside the folder one by one and map each of them? Moreover, I got a lot of mapping issues, like the input file having 19 rows while the output file has a mismatched row count. So it took a lot of time to unzip even a single file. Then I tried a Logic App, which could only unzip a file of at most 50 MB. – Shyamlesh Aug 09 '18 at 16:30
  • Will it support a folder with multiple files in it? – Shyamlesh Aug 11 '18 at 07:02
  • Definitely it will – Abhishek Aug 11 '18 at 07:28

As Abhishek mentioned, you could use ADF, and the Copy Data tool can help you create the pipeline. For example, if you just want to unzip a file, you could use the following settings. [screenshot: Copy Data tool source settings]
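The Copy Data tool ultimately generates a pipeline with a Copy activity. Below is a minimal sketch of what that pipeline JSON could look like, assuming Binary input and output datasets as in the previous answer; `UnzipPipeline`, `ZippedBlobDataset`, and `UnzippedBlobDataset` are placeholder names:

```json
{
    "name": "UnzipPipeline",
    "properties": {
        "activities": [
            {
                "name": "UnzipBlobToBlob",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "ZippedBlobDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "UnzippedBlobDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": {
                        "type": "BinarySource",
                        "storeSettings": { "type": "AzureBlobStorageReadSettings" }
                    },
                    "sink": {
                        "type": "BinarySink",
                        "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
                    }
                }
            }
        ]
    }
}
```

Because the decompression is driven by the `compression` property on the input dataset, the Copy activity itself needs no special unzip settings.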

Fang Liu