I'm currently using Azure Functions for the following process:
- Download tar.gz archives from a client FTP server. Most of them are daily packages under 20 MB, but a few are over 100 MB (updated once the year ends, replacing all previous packages)
- Extract the archives into blobs (roughly what I mean is sketched after this list)
- Do some further processing on the blobs later
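To illustrate the extraction step, here is a minimal sketch of what I mean (Python, assuming the azure-storage-blob v12 SDK; the connection string and the "extracted" container name are placeholders):

```python
import tarfile
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage connection string>"  # placeholder

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client("extracted")  # hypothetical container

def extract_to_blobs(archive_path: str) -> None:
    # Stream each member out of the tar.gz and upload it directly,
    # so the whole archive is never expanded on local disk at once.
    with tarfile.open(archive_path, mode="r:gz") as tar:
        for member in tar:
            if not member.isfile():
                continue
            data = tar.extractfile(member)  # file-like object for this member
            container.upload_blob(name=member.name, data=data, overwrite=True)
```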
I'm using a general-purpose storage account for the blobs (much cheaper than a hot-tier blob container). My problem is with the archives exceeding 100 MB: even with the 10-minute timeout these can't be extracted. I'm wondering if Azure Functions is the right tool for this, or if I should try Azure Batch instead? Maybe a mix of both: a function that triggers a Batch job for the extraction process (sketched after this paragraph)?
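For the function-triggers-Batch idea, I imagine the trigger side could look something like this (a sketch with the azure-batch Python SDK; the account name, key, endpoint, job id and command line are all placeholders, and it assumes a job and pool with the extraction script already set up):

```python
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
import azure.batch.models as batchmodels

# Placeholder credentials and endpoint for illustration only.
credentials = SharedKeyCredentials("mybatchaccount", "<account key>")
client = BatchServiceClient(
    credentials, batch_url="https://mybatchaccount.myregion.batch.azure.com"
)

def submit_extraction_task(archive_name: str) -> None:
    # Assumes a job "extract-job" already exists on a pool that has the
    # extraction script deployed; the function only enqueues a task.
    task = batchmodels.TaskAddParameter(
        id=f"extract-{archive_name.replace('.', '-')}",
        command_line=f'/bin/bash -c "python3 extract.py {archive_name}"',
    )
    client.task.add(job_id="extract-job", task=task)
```

That way the function stays well under its timeout (it only submits the task), and the long-running extraction happens on the Batch pool.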
Another solution would be to run the extraction of the large archives from my own computer, which should be more powerful than the Functions VM.
Any ideas on how to do this?