I am trying to insert Excel data into a Postgres table using a Python script. I am running into an issue where some of the larger numbers get inserted as exponentials. I realized this happens when I convert the file from .xls to .csv (I never open the .xls files in Excel, because I know Excel does some funky stuff where it displays larger numbers in exponential form).
Is there an easy way to ensure the numbers don't get displayed as exponentials?
e.g. 812492400097 is being written as 8.12E+11
Here is the convert to .csv script:
import xlrd
import unicodecsv
import sys
import os
import datetime
def csv_from_excel(xlsfile, csvfile):
    wb = xlrd.open_workbook(xlsfile)
    sh = wb.sheet_by_index(0)
    outputfile = open(csvfile, 'wb')
    wr = unicodecsv.writer(outputfile, quoting=unicodecsv.QUOTE_ALL)
    for rownum in xrange(sh.nrows):
        wr.writerow(sh.row_values(rownum))
    outputfile.close()

def log(s):
    print str(datetime.datetime.now()) + ": " + s

#main
if len(sys.argv) < 2:
    print "Missing parameters: input xls file"
    sys.exit()

sourcefile = sys.argv[1]
destfile = sourcefile.split('.')[0] + '.csv'

log("processing " + sourcefile)
csv_from_excel(sourcefile, destfile)
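One idea I had (just a sketch, not sure it's the right fix) is to convert whole-number floats back to ints before writing each row, since xlrd seems to hand back every numeric cell as a float. Something like this helper (name is mine), used in place of sh.row_values(rownum):

def row_values_fixed(sh, rownum):
    # xlrd returns every numeric cell as a float; turn whole-number floats
    # back into ints so they get written out as plain digits
    values = []
    for cell in sh.row(rownum):
        if cell.ctype == xlrd.XL_CELL_NUMBER and cell.value == int(cell.value):
            values.append(int(cell.value))
        else:
            values.append(cell.value)
    return values

and then wr.writerow(row_values_fixed(sh, rownum)) in the loop. Would that be a sensible way to do it?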
I am also wondering whether, instead of ensuring the .csv doesn't turn numbers into exponentials, I could turn the exponentials back into numbers when inserting into the Postgres table.
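For example, would something like this work with psycopg2 (table/column names and connection details are made up), given that Postgres can cast text in scientific notation to numeric?

import psycopg2

conn = psycopg2.connect("dbname=mydb")  # made-up connection string
cur = conn.cursor()
# cast the text value from the CSV on the way in
cur.execute(
    "INSERT INTO mytable (big_number) VALUES (CAST(%s AS numeric))",
    ("8.12E+11",),
)
conn.commit()

Although I guess if the CSV only contains 8.12E+11, the missing digits are already gone at that point, so fixing it at conversion time might be the only real option?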