I am using DocumentDB with an Azure Function App. I have created a basic JavaScript HTTP-trigger function that stores (inserts) entries in DocumentDB.
The collection's throughput is set to 2500 RU/s.
Here req.body is an array of around 2,500 objects, about 1 MB in total, which I believe is fairly small.
module.exports = function (context, req) {
    context.bindings.document = [];
    var res;
    if (req.body) {
        // if (req.body instanceof Array) { context.log("It is an array"); }
        context.bindings.document = req.body; // "document" is the function's output binding
        res = { status: 200 };
    } else {
        res = {
            status: 400,
            body: "Pass Parameters"
        };
    }
    context.done(null, res);
};
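One thing I have considered instead of pushing the whole array through the output binding in a single call is batching the inserts myself. The sketch below is only an illustration of that idea, not code I have running: `insertOne` is a placeholder for whatever performs a single insert (for example a promisified `client.createDocument(collectionLink, doc, callback)` from the `documentdb` Node SDK).

```javascript
// Sketch only: split the incoming array into batches and insert
// batch-by-batch, each batch in parallel, so at most `batchSize`
// requests hit the collection's RU budget at the same time.
function chunk(items, size) {
    const batches = [];
    for (let i = 0; i < items.length; i += size) {
        batches.push(items.slice(i, i + size));
    }
    return batches;
}

function insertInBatches(docs, batchSize, insertOne) {
    // Batches run sequentially; documents within a batch run in parallel.
    return chunk(docs, batchSize).reduce(
        (prev, batch) => prev.then(() => Promise.all(batch.map(insertOne))),
        Promise.resolve()
    );
}
```

I do not know whether this is what the output binding does internally, which is part of my question.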
Every single POST request to the function app takes on average around 30-40 seconds to execute and store the values in the collection, which is really long, and parallel requests end in connection timeouts.
Is there any performance tweak I can apply to DocumentDB or the Azure Function App to lower the execution time?
How does the Function App handle DocumentDB in the background? Does it follow best practices?
I am familiar with bulk insert/update operations in other NoSQL databases, but I couldn't find anything similar for DocumentDB.
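The closest thing I have found is a server-side stored procedure. The snippet below is a sketch modeled on the bulkImport sample from the Azure documentation; it only runs inside DocumentDB's server-side JavaScript runtime (where `getContext()` exists), so it is illustrative rather than something I have benchmarked. You register it on the collection and execute it with the document array as its argument.

```javascript
// Sketch of a bulk-insert stored procedure (server-side JavaScript,
// modeled on Azure's bulkImport sample). It inserts documents one by one
// and returns the count inserted, so the client can resume if the
// procedure runs out of time or RUs before finishing.
function bulkImport(docs) {
    var collection = getContext().getCollection();
    var collectionLink = collection.getSelfLink();
    var count = 0;

    if (!docs) throw new Error("The docs array is undefined or null.");
    if (docs.length === 0) {
        getContext().getResponse().setBody(0);
        return;
    }

    tryCreate(docs[count], callback);

    function tryCreate(doc, cb) {
        var accepted = collection.createDocument(collectionLink, doc, cb);
        // Not accepted means the procedure is about to be throttled:
        // report how many documents made it so the caller can retry the rest.
        if (!accepted) getContext().getResponse().setBody(count);
    }

    function callback(err, doc) {
        if (err) throw err;
        count++;
        if (count >= docs.length) {
            getContext().getResponse().setBody(count);
        } else {
            tryCreate(docs[count], callback);
        }
    }
}
```

Whether invoking something like this from the function is actually faster than the output binding is exactly what I am unsure about.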