
I'm getting quite a few timeouts while my blob storage trigger is running. It seems to time out whenever I'm inserting values into an Azure SQL DB. I raised the functionTimeout parameter in the host.json file to "functionTimeout": "00:40:00" before running the storage trigger, yet I'm still seeing timeouts within a couple of minutes. Why would this be the case? My function app is on the ElasticPremium pricing tier.
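
For reference, the relevant part of my host.json (trimmed to a minimal sketch; the "version" line is just the usual v2 value) looks like this:

{
  "version": "2.0",
  "functionTimeout": "00:40:00"
}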

EDIT:

System.TimeoutException message:

Exception while executing function: Functions.BlobTrigger2 The operation has timed out.

My connection to the db (I close it at the end of the script):

import urllib.parse
from sqlalchemy import create_engine

# urllib.parse.quote_plus for python 3
params = urllib.parse.quote_plus(fr'Driver={DRIVER};Server=tcp:{SERVER_NAME},1433;Database=newTestdb;Uid={USER_NAME};Pwd={PASSWORD};Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;')
conn_str = 'mssql+pyodbc:///?odbc_connect={}'.format(params)
engine_azure = create_engine(conn_str, echo=True)
conn = engine_azure.connect()
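
For what it's worth, my understanding is that Connection Timeout=30 in the ODBC string only caps the login/connect step, not how long a query may run. If I wanted to control the per-query limit as well, I believe it would have to be set on the pyodbc connection itself, for example via a SQLAlchemy connect event (a sketch only; the 300-second value is arbitrary):

from sqlalchemy import event

# Sketch: set pyodbc's query timeout on every DBAPI connection this engine opens.
# pyodbc exposes it as the .timeout attribute (seconds; 0 means no limit).
@event.listens_for(engine_azure, "connect")
def set_query_timeout(dbapi_connection, connection_record):
    dbapi_connection.timeout = 300  # arbitrary value to experiment with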

This is the line of code that is run before the timeout happens (Inserting to db):

processed_df.to_sql(blob_name_file.lower(), conn, if_exists = 'append', index=False)
  • What are the connection, command and query timeout settings when you connect to your Azure SQL database? Share more details on the timeout error – code and the exact error message. – Anand Sowmithiran Apr 21 '22 at 17:05
  • @AnandSowmithiran updated with some more info. can provide more if needed – BlakeB9 Apr 21 '22 at 18:13
  • The timeout in the connection string is the 'query' [timeout](https://code.google.com/archive/p/pyodbc/wikis/Connection.wiki#timeout), and you are setting it to 30 seconds. Were any records inserted into your sql tables at all? Can you try with a smaller blob file? How big is your blob file? – Anand Sowmithiran Apr 21 '22 at 18:25
  • @AnandSowmithiran that makes sense, some of the timeouts do happen in 30 seconds. I wonder why some of them go to a couple minutes. And yes I can trace to see if any records made it in, one second. – BlakeB9 Apr 21 '22 at 18:35
  • Ok, another thing: the dataframe to_sql() function takes a `chunksize` argument; you can use it to make batched inserts instead of inserting ALL the rows in the dataframe into your sql table at once. Try setting that to 10 or 100 rows, refer [this](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.to_sql.html) (a sketch is shown below these comments). – Anand Sowmithiran Apr 21 '22 at 18:37
  • @AnandSowmithiran it does not seem that any rows are making it in when it times out. Also, I've found that some timeouts are happening within a few seconds... but I'll try changing the Connection Timeout parameter in the connection and see if that helps. Would you happen to know how high that parameter can go? – BlakeB9 Apr 21 '22 at 18:45
  • @AnandSowmithiran thank you, so if I set that to 100, it would insert 100 rows at a time until it reaches the end? Also that would explain why I didn't see any records in the database from the failed attempts. – BlakeB9 Apr 21 '22 at 18:47
  • Thought I had it sorted out, but I received another timeout within 10 seconds of the function being invoked. – BlakeB9 Apr 21 '22 at 20:08
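
Based on the chunksize suggestion above, a batched version of the insert looks like this (the batch size of 100 is just a first guess to experiment with, not a recommendation):

# Sketch: let pandas issue the INSERTs in batches of `chunksize` rows
# instead of sending every row in a single call.
processed_df.to_sql(
    blob_name_file.lower(),
    conn,
    if_exists='append',
    index=False,
    chunksize=100,  # arbitrary batch size
)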

0 Answers