
Hi Team,

Thanks in advance. I would like to know how to solve an Azure Functions timeout error.

I have created a function app with an Azure SQL Database in the Azure environment, and I have a very long-running process that takes more than 15 minutes. After 10 minutes, Azure Functions returns a timeout error. I searched the internet and found 2 solutions.

1. Set the timeout to 00:10:00 in the host.json file.
2. Change the plan from Consumption to an App Service plan.
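For reference, the host.json setting the first solution refers to is `functionTimeout` (a sketch of the relevant fragment; on the Consumption plan the value cannot exceed 10 minutes):

```json
{
  "functionTimeout": "00:10:00"
}
```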

Current Azure Functions settings:
1. Pricing tier: Consumption plan
2. App version: v1.0

Current SQL DB plan/settings:
1. Pricing tier: Standard S0: 10 DTUs

Solution 1) I know that in the Consumption plan the maximum timeout is 10 minutes, while in an App Service plan it can be unlimited.

Solution 2) I want to know: if I upgrade my SQL Database pricing tier from Standard to Premium, will it solve the timeout error?

Solution 3) Can increasing the DTUs in the Standard plan solve my issue? If yes, how many DTUs would I need to add to resolve the timeout error?

I want to know whether there is any other method to overcome this timeout issue besides the three solutions above. I have spent a lot of time searching the internet, but nothing works. Since all of the above services cost more, I would like some expert help before committing to a plan. Your help will save my day.

2 Answers

  1. Correct.
  2. How can we know? We know nothing about your solution. If the DTUs are the bottleneck, it might, but moving the function to a Basic App Service plan is cheaper anyway.
  3. See 2.
4c74356b41

Have you looked into Azure Durable Functions? They can be used to work around the 10-minute timer. When they time out they restart, but they remember their context, so if you set it up properly it's a viable solution.
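To make the "they remember their context" part concrete, here is a minimal sketch (plain Python, not the real Azure Durable Functions SDK) of the replay mechanism durable orchestrators use: completed activity results are persisted in a history, and when the orchestrator restarts, it is re-run from the top with finished steps answered instantly from history instead of re-executed. All names here are illustrative.

```python
# Sketch of durable-orchestrator replay: each yield is a checkpoint.
# If the host restarts, steps before the last checkpoint are replayed
# from the stored history rather than executed again.

def orchestrator():
    data = yield ("load", None)         # checkpoint 1
    converted = yield ("convert", data) # checkpoint 2
    result = yield ("save", converted)  # checkpoint 3
    return result

# Stand-ins for activity functions (illustrative only).
ACTIVITIES = {
    "load": lambda _: [1, 2, 3],
    "convert": lambda xs: [x * 10 for x in xs],
    "save": lambda xs: f"saved {len(xs)} rows",
}

def run(history):
    """Drive the orchestrator, replaying completed steps from history."""
    gen = orchestrator()
    step = 0
    try:
        request = gen.send(None)
        while True:
            name, arg = request
            if step < len(history):
                outcome = history[step]          # replayed: no re-work
            else:
                outcome = ACTIVITIES[name](arg)  # fresh execution
                history.append(outcome)          # persist for next run
            step += 1
            request = gen.send(outcome)
    except StopIteration as done:
        return done.value

history = []
run(history)  # first invocation: all three activities execute
run(history)  # simulated restart: every step replayed from history
```

The key property is that a restarted run reaches the same point without redoing completed activities, which is exactly what makes the restart-on-timeout behavior workable.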

jkc
  • Not sure why this was downvoted. Given the vague details about the problem presented above and the request for an alternate solution, this seems like a decent proposal. Durable Functions will let you get around the 10-minute timeout by implementing an orchestrator. – Lance Nov 29 '19 at 15:49
  • But say you have an import job that loads a lot of data, converts it, and saves it somewhere. If it takes 10 minutes to get 55% through the conversion, wouldn't the durable function just restart and reach the same place again: 55% done with converting? Isn't the correct answer that you should break the job into subtasks, e.g. batches of conversions? That means a lot of reprogramming to restructure the import to fit this model. "Change to durable functions" isn't a silver bullet; you actually need to reimplement the flow, as I see it. – mslot Apr 10 '20 at 08:41
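The restructuring this comment describes can be sketched as follows (plain Python; the function names, batch size, and "budget" mechanism are all illustrative stand-ins for a timeout). Instead of one monolithic conversion, the job processes fixed-size batches and persists a cursor after each, so an invocation cut short can resume where the previous one stopped:

```python
# Sketch: break an import into resumable batches. A persisted cursor is
# checkpointed after every batch, so a timed-out run resumes at the next
# unprocessed batch instead of starting over.

def convert(row):
    return row * 10  # stand-in for the real per-row conversion

def run_import(rows, state, batch_size=2, budget=None):
    """Process rows in batches, resuming from state['cursor'].
    budget caps how many batches this invocation may run,
    mimicking a timeout that cuts the run short.
    Returns True when the whole import is complete."""
    done = 0
    while state["cursor"] < len(rows):
        if budget is not None and done >= budget:
            return False                      # out of time; resume later
        start = state["cursor"]
        batch = rows[start:start + batch_size]
        state["output"].extend(convert(r) for r in batch)
        state["cursor"] = start + len(batch)  # checkpoint after each batch
        done += 1
    return True

rows = [1, 2, 3, 4, 5]
state = {"cursor": 0, "output": []}
run_import(rows, state, budget=2)  # "times out" after 2 batches (4 rows)
run_import(rows, state)            # next invocation finishes the rest
```

Unlike re-running one long conversion from scratch, no completed batch is ever repeated; this is the reimplementation work the comment warns about.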