I'm running a relatively simple Python script that reads a text file containing a series of stored procedure calls, one per line. The script should run the stored procedure on the first line, then the one on the second line, and so on. Running these procedures should populate a particular table.
My problem is that the procedures aren't populating the table with all of the results they should. For example, if my file looks like:
exec myproc 'data1';
exec myproc 'data2';
where myproc 'data1' should populate the other table with roughly 100 records and myproc 'data2' should add roughly 50 more, I instead end up with only 9 records total: 5 from the first proc and 4 from the second.
I know the procedures themselves work, because when I run the same SQL file (with the procs) through OSQL I get the correct ~150 records in the other table, so it's obviously something to do with my script.
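For reference, the OSQL run is nothing fancy; I just feed it the same file, with something like this (server and credential values here are placeholders):

osql -S myserver -d mydatabase -U myuser -P mypassword -i otherfile.txt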
Here's the code I'm running to do this:
import pypyodbc

# CONN_STR, server, database, user, and password are defined earlier in the script
conn = pypyodbc.connect(CONN_STR.format("{SQL Server}", server, database, user, password))
conn.autocommit = True

procsFile = open('otherfile.txt', 'r')
# loop through each proc (line) in the file
for proc in procsFile:
    # run the procedure
    curs = conn.cursor()
    curs.execute(proc)
    # commit results
    conn.commit()
    curs.close()
conn.close()
procsFile.close()
I'm thinking this has something to do with the procedures not committing... or something? Frankly, I don't really understand why only 5 of 100 records would be committed.
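The only other idea I've had is that maybe the procs return result sets or row counts that my script never reads, and that this somehow interferes with the commit. Would I need to drain whatever the cursor returns before committing, something like this? (This is just a guess on my part; I'm not even sure pypyodbc supports nextset() the same way pyodbc does.)

curs = conn.cursor()
curs.execute(proc)
# step through any extra result sets / row counts the proc emits
while curs.nextset():
    pass
conn.commit()
curs.close()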
I don't know. Any help or advice would be much appreciated.