Azure Data Factory supports compressing and decompressing data during copy. When you specify the compression property in an input dataset, the copy activity reads the compressed data from the source and decompresses it; when you specify the property in an output dataset, the copy activity compresses the data and then writes it to the sink.
For example:
Read a .zip file from an FTP server, decompress it to get the files inside, and land those files in Azure Data Lake Store. You define an input FTP dataset with the compression type property set to ZipDeflate.
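As a rough sketch, such an input dataset could look like the following. This assumes the FileShare-style FTP dataset described in the connector documentation; the dataset name, linked service name, folder, and file name are placeholders:

```json
{
    "name": "FTPZipInput",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": {
            "referenceName": "FTPLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "incoming",
            "fileName": "archive.zip",
            "compression": {
                "type": "ZipDeflate"
            }
        }
    }
}
```

With this dataset as the copy source and an Azure Data Lake Store dataset as the sink, the copy activity decompresses the .zip and writes the extracted files to the sink.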
For more details, see Compression support.
There's also a tutorial: Copy data from FTP server by using Azure Data Factory.
Other format dataset
To copy data from FTP in ORC, Avro, JSON, or Binary format, see the properties documented under Other format dataset in that link; a rough example is sketched below.
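For instance, a Binary-format dataset pointing at the FTP server might look roughly like this. This is a sketch based on the Binary dataset type with an FtpServerLocation; all names and paths are placeholders:

```json
{
    "name": "FTPBinaryDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "FTPLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "FtpServerLocation",
                "folderPath": "folder/subfolder",
                "fileName": "data.bin"
            }
        }
    }
}
```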

Tips:
- To copy all files under a folder, specify folderPath only.
- To copy a single file with a given name, specify folderPath with the folder part and fileName with the file name.
- To copy a subset of files under a folder, specify folderPath with the folder part and fileName with a wildcard filter (see the sketch after this list).
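For example, to pick up only .csv files from a folder, the typeProperties of a FileShare-style dataset like the one above could be set as follows (the folder name and pattern are illustrative):

```json
"typeProperties": {
    "folderPath": "incoming/reports",
    "fileName": "*.csv"
}
```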
Hope this helps.