I am using Azure Function Apps (Python) with a blob trigger to process a CSV and move its records to an Event Hub. I have working code (tested up to 50 rows) after following the standard documentation. However, I want to know what approach to follow when the file is in the range of a few GBs. Will the entire file be sent to the Azure Function in one go? If it needs to be read in fixed-size chunks or line by line, does Azure's trigger concept support that?
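
For reference, the working version (fine for small files) looks roughly like this, simplified; the connection-string and Event Hub setting names are placeholders:

```python
import csv
import io
import os

import azure.functions as func
from azure.eventhub import EventHubProducerClient, EventData


def main(myblob: func.InputStream):
    # Current approach: read the whole blob into memory, then send every row.
    rows = csv.reader(io.StringIO(myblob.read().decode("utf-8")))

    producer = EventHubProducerClient.from_connection_string(
        os.environ["EVENTHUB_CONNECTION_STRING"],  # placeholder app setting
        eventhub_name=os.environ["EVENTHUB_NAME"],
    )
    with producer:
        batch = producer.create_batch()
        for row in rows:
            batch.add(EventData(",".join(row)))
        producer.send_batch(batch)
```
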
I am looking for any approach/code in Python for the above problem that avoids loading the complete file into the Azure Function container's memory.
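
Something along these lines is what I had in mind: use the blob trigger only to learn which blob arrived, then stream the content with the azure-storage-blob SDK's `chunks()` and batch the lines into Event Hub events. This is only a sketch; the app setting names are placeholders, and I'm not sure whether the trigger still pulls the whole blob into the worker regardless. Is this the right direction, or is there a better-supported pattern?

```python
import logging
import os

import azure.functions as func
from azure.eventhub import EventHubProducerClient, EventData
from azure.storage.blob import BlobClient


def main(myblob: func.InputStream):
    # Use the trigger only to learn which blob arrived; re-open it with the
    # storage SDK so the content can be streamed chunk by chunk.
    container, _, blob_name = myblob.name.partition("/")
    blob = BlobClient.from_connection_string(
        os.environ["STORAGE_CONNECTION_STRING"],  # placeholder app setting
        container_name=container,
        blob_name=blob_name,
    )

    producer = EventHubProducerClient.from_connection_string(
        os.environ["EVENTHUB_CONNECTION_STRING"],  # placeholder app setting
        eventhub_name=os.environ["EVENTHUB_NAME"],
    )

    leftover = b""  # partial line carried over between chunks
    with producer:
        batch = producer.create_batch()
        batch_count = 0
        # chunks() yields the blob in fixed-size pieces rather than one buffer
        for chunk in blob.download_blob().chunks():
            lines = (leftover + chunk).split(b"\n")
            leftover = lines.pop()  # last element may be an incomplete line
            for line in lines:
                line = line.strip()
                if not line:
                    continue
                try:
                    batch.add(EventData(line))
                except ValueError:  # batch is full: send it, start a new one
                    producer.send_batch(batch)
                    batch = producer.create_batch()
                    batch.add(EventData(line))
                    batch_count = 0
                batch_count += 1
        if leftover.strip():
            batch.add(EventData(leftover.strip()))
            batch_count += 1
        if batch_count:
            producer.send_batch(batch)

    logging.info("Finished streaming %s to Event Hub", myblob.name)
```
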