
I'm trying to insert a single row into a table that has an nvarchar(max) column (say response) using Python's pymssql library.

The other columns are straightforward: one nvarchar(10) column, two nvarchar(30) columns, two date columns, and one bigint column.

I don't really have control over the length of the response strings I get, so they can be of arbitrary length. Everything I've found (e.g. What is the maximum characters for the NVARCHAR(MAX)?) says that nvarchar(max) can hold up to about a billion characters. However, the code I'm using seems to break at around the 130 million character mark.

Code:

import pymssql
from pathlib import Path
conn = pymssql.connect('server', 'user', 'password', 'db')
content = Path('file.txt').read_text()
print(len(content)) # 547031539
with conn.cursor() as cursor:
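    # 133_949_006 characters is where the insert starts failing for this file (found via binary search)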
    cursor.execute("Insert into table_name values (%s, %s, %s, %d, %s, %s, %s)", ('2020-05-21', '2023-03-27T20:51:50.221718', '2023-03-27T19:34:02.103253', 127671, content[:133_949_006], 'New', '2020-05-22'))
conn.commit()
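
One workaround I'm considering is to send the text in pieces instead of as a single statement: insert the row with an empty response, then append chunks with the nvarchar(max) .WRITE clause. This is a rough, untested sketch; request_id is a placeholder for whatever column identifies the row in my actual schema, and the chunk size is arbitrary (just well below the length that fails).

import pymssql
from pathlib import Path

CHUNK = 10_000_000  # characters per UPDATE; arbitrary, well below the failing length

conn = pymssql.connect('server', 'user', 'password', 'db')
content = Path('file.txt').read_text()

with conn.cursor() as cursor:
    # Insert the row with an empty (not NULL) response so that .WRITE can append to it.
    cursor.execute(
        "INSERT INTO table_name VALUES (%s, %s, %s, %d, %s, %s, %s)",
        ('2020-05-21', '2023-03-27T20:51:50.221718',
         '2023-03-27T19:34:02.103253', 127671, '', 'New', '2020-05-22'))
    # response.WRITE(expression, NULL, 0) appends the expression to the end of the existing value.
    for start in range(0, len(content), CHUNK):
        cursor.execute(
            "UPDATE table_name SET response.WRITE(%s, NULL, 0) WHERE request_id = %d",
            (content[start:start + CHUNK], 127671))
conn.commit()

That said, I'd still like to understand why the single insert fails at around 130 million characters.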

Traceback:

---------------------------------------------------------------------------
MSSQLDatabaseException                    Traceback (most recent call last)
File src\pymssql\_pymssql.pyx:461, in pymssql._pymssql.Cursor.execute()

File src\pymssql\_mssql.pyx:1087, in pymssql._mssql.MSSQLConnection.execute_query()

File src\pymssql\_mssql.pyx:1118, in pymssql._mssql.MSSQLConnection.execute_query()

File src\pymssql\_mssql.pyx:1251, in pymssql._mssql.MSSQLConnection.format_and_run_query()

File src\pymssql\_mssql.pyx:1789, in pymssql._mssql.check_cancel_and_raise()

File src\pymssql\_mssql.pyx:1835, in pymssql._mssql.raise_MSSQLDatabaseException()

MSSQLDatabaseException: (20047, b'DB-Lib error message 20047, severity 9:\nDBPROCESS is dead or not enabled\n')

During handling of the above exception, another exception occurred:

OperationalError                          Traceback (most recent call last)
Cell In[126], line 2
      1 with conn.cursor() as cursor:
----> 2     cursor.execute("Insert into table_name values (%s, %s, %s, %d, %s, %s, %s)", ('2020-05-21', '2023-03-27T20:51:50.221718', '2023-03-27T19:34:02.103253', 127671, content[:133_949_006], 'New', '2020-05-22'))
      3 conn.commit()

File src\pymssql\_pymssql.pyx:479, in pymssql._pymssql.Cursor.execute()

OperationalError: (20047, b'DB-Lib error message 20047, severity 9:\nDBPROCESS is dead or not enabled\n')
  • You want to put a 500 MB text into a text column; wouldn't it be easier to save only the file path? It will also take ages to get the data back, and SQL Server is not really the best place to store such a big amount in one column – nbk Mar 28 '23 at 16:21
  • Agreed that this would be better. However, I do want to understand where the issue is here: is it at the FreeTDS layer (which is what pymssql uses), an issue with pymssql, a limitation of SQL Server itself, or maybe something with the database configuration? – wkgrcdsam Mar 28 '23 at 16:28
  • Also, it errors out with only ~130 million characters. I ran a binary search to see where it starts erroring out for that file (a sketch of it is below these comments), and it was at `133_949_006` characters – wkgrcdsam Mar 28 '23 at 16:35
  • It will be one of the timeouts; as I said, that isn't really a good thing for SQL Server, but see https://learn.microsoft.com/en-us/sql/database-engine/configure-windows/configure-the-remote-query-timeout-server-configuration-option?view=sql-server-ver16 – nbk Mar 28 '23 at 16:36
  • The value is set to 600 on the DB server. However, the error on the client side gets thrown in under a minute – wkgrcdsam Mar 28 '23 at 16:41
  • Almost all services make an entry in the event log, so that would be the first place to look; SQL Server also has its own log, which has to be enabled – nbk Mar 28 '23 at 16:56
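
For reference, this is roughly the binary search mentioned in the comments above. It is only a sketch: it opens a fresh connection per attempt (since the DBPROCESS is dead after a failing execute) and rolls back so no test rows are kept.

import pymssql
from pathlib import Path

content = Path('file.txt').read_text()

def insert_ok(length):
    """Return True if inserting the first `length` characters of the file succeeds."""
    conn = pymssql.connect('server', 'user', 'password', 'db')
    try:
        with conn.cursor() as cursor:
            cursor.execute(
                "Insert into table_name values (%s, %s, %s, %d, %s, %s, %s)",
                ('2020-05-21', '2023-03-27T20:51:50.221718',
                 '2023-03-27T19:34:02.103253', 127671, content[:length],
                 'New', '2020-05-22'))
        conn.rollback()  # don't keep the test rows
        return True
    except pymssql.OperationalError:
        return False
    finally:
        conn.close()

# Invariant: `good` succeeds, `bad` fails; narrow the gap to a single character.
good, bad = 0, len(content)
while bad - good > 1:
    mid = (good + bad) // 2
    if insert_ok(mid):
        good = mid
    else:
        bad = mid
print(bad)  # first length that fails; 133_949_006 for this file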

0 Answers