
I'm trying to download large files (over 1 GB each), process them, and store the results in a database. I would like to do this as a serverless application that watches the FTP server holding the files, compares the current state against the previous snapshot of those files, and processes only the files that have changed.
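To clarify what I mean by comparing against the previous snapshot, here is a minimal sketch (assuming the FTP server supports the MLSD command; the host, credentials, and snapshot path below are placeholders, not my actual setup):

```python
# Hypothetical sketch of the "snapshot diff" step: list files on the FTP server,
# compare name/size/modify-time against the previously saved snapshot, and
# return only the files that changed.
import json
from ftplib import FTP
from pathlib import Path

SNAPSHOT_FILE = Path("snapshot.json")  # placeholder local snapshot location

def current_listing(ftp: FTP) -> dict:
    """Return {filename: (size, modify)} using MLSD facts, if the server supports them."""
    listing = {}
    for name, facts in ftp.mlsd(facts=["size", "modify", "type"]):
        if facts.get("type") == "file":
            listing[name] = (facts.get("size"), facts.get("modify"))
    return listing

def changed_files(listing: dict) -> list:
    """Diff the current listing against the saved snapshot; new or modified files count as changed."""
    previous = json.loads(SNAPSHOT_FILE.read_text()) if SNAPSHOT_FILE.exists() else {}
    changed = [name for name, meta in listing.items() if previous.get(name) != list(meta)]
    SNAPSHOT_FILE.write_text(json.dumps({k: list(v) for k, v in listing.items()}))
    return changed

if __name__ == "__main__":
    ftp = FTP("ftp.example.com")      # placeholder host
    ftp.login("user", "password")     # placeholder credentials
    to_process = changed_files(current_listing(ftp))
    print("Files to process:", to_process)
    ftp.quit()
```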

I implemented this as 5 different Azure Functions, but it barely works and is extremely slow compared to the local version.

My question is: what is the best technology or framework for building a fast, near real-time serverless program that does this?

sishanov
  • This needs fleshing out, really: what does your function do processing-wise, where is the database, how is it triggered, and what exactly is slow, the processing or the downloading? Bear in mind that if it's the processing, you might have a powerful local machine, while a serverless (Consumption plan) function runs on a much lower-spec machine; also bear in mind memory requirements, as a single instance on the Consumption plan is a single CPU with 1.5GB of memory available, etc. – Simon Feb 13 '20 at 21:39
  • Thanks for your response. I actually realized that Azure Functions are not the proper choice for my purposes, so I chose to go with a virtual machine and take advantage of Windows services instead. – sishanov Feb 16 '20 at 21:52

0 Answers