I have PDF files stored in a Sybase database table, in a column of the image datatype (the large-binary type, as described in the docs). I am trying to read one of those files from the database and write it to a file in a local folder with Python and the pyodbc package, like this:
driver = "FreeTDS"
prt = 'port'
db = 'db'
passwd = 'passwd'
usr = 'usr'
serv = 'serv'
conn = pyodbc.connect(driver=driver, server=serv, port=prt, uid=usr, pwd=passwd)
sql_query = (
"SELECT ARCH_DOC_DOC as file_content FROM table_name WHERE ARCH_DOC_ID = id"
)
cursor = conn.cursor()
cursor.execute(sql_query)
pdf_data = cursor.fetchone()[0]
with open('my_test_file.pdf', 'wb') as f:
f.write(pdf_data)
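To narrow down where the bytes go missing, a small check like this (assuming datalength() is available on this Sybase version, which I believe it is for ASE) should show whether the data is already short by the time pyodbc returns it:

cursor = conn.cursor()
# compare the size stored in the table with the size pyodbc hands back;
# doc_id is the same placeholder as above
cursor.execute(
    "SELECT datalength(ARCH_DOC_DOC), ARCH_DOC_DOC "
    "FROM table_name WHERE ARCH_DOC_ID = ?",
    doc_id,
)
stored_size, pdf_data = cursor.fetchone()
print(f"size in db: {stored_size}, size fetched: {len(pdf_data)}")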
I am using the FreeTDS driver and running this code on a Debian GNU/Linux 11 machine. These are the FreeTDS compile-time settings:
Compile-time settings (established with the "configure" script)
Version: freetds v1.2.3
freetds.conf directory: /etc/freetds
MS db-lib source compatibility: no
Sybase binary compatibility: yes
Thread safety: yes
iconv library: yes
TDS version: auto
iODBC: no
unixodbc: yes
SSPI "trusted" logins: no
Kerberos: yes
OpenSSL: no
GnuTLS: yes
MARS: yes
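If it is relevant, the FreeTDS documentation mentions a text size setting in freetds.conf that caps how much of a text/image column is returned. I am not sure whether the absence of it could explain this, but it would look something like this under [global] (the value here is just an example):

[global]
    tds version = auto
    # maximum number of bytes returned for a text/image column
    text size = 2147483647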
The problem is that the file I get at the end is corrupt, and after testing a couple of files I noticed that the output file is always 33 KB. For example, the original file I am testing with is 90 KB in the database, but the file I get out is only 33 KB. So I am wondering whether the issue is in the database/driver configuration or whether there is a limit on the size of data I can read with pyodbc, and how I can fix it.
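One thing I was not sure how to rule out: Sybase has a per-session TEXTSIZE limit, and I do not know whether pyodbc/FreeTDS raises it automatically. If that is what is truncating the data, would issuing it manually before the SELECT on the same connection, roughly like this, be the right fix?

cursor = conn.cursor()
# raise the session limit on returned text/image data (value is an example)
cursor.execute("SET TEXTSIZE 2147483647")
cursor.execute(sql_query, doc_id)
pdf_data = cursor.fetchone()[0]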