It takes a long time to finish inserting 100,000 (user, password) tuples with the code below.

def insertdata(db, name, val):
    i = db.insert()
    i.execute(user=name, password=val)

# ----- main -----
tuplelist = readfile("C:/py/tst.txt")  # parsing the file is fast
mydb = initdatabase()
for ele in tuplelist:
    insertdata(mydb, ele[0], ele[1])

Which function takes more time? Is there a way to find the bottleneck in Python? Can I avoid it by caching the inserts and committing later?
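On the profiling question: the standard-library cProfile module can show where the time goes. A minimal sketch, assuming tuplelist, mydb, and insertdata are the module-level names from the script above:

import cProfile
import pstats

# profile the insert loop and dump the stats to a file
cProfile.run("for ele in tuplelist: insertdata(mydb, ele[0], ele[1])",
             "insert.prof")

# print the ten most expensive calls by cumulative time
pstats.Stats("insert.prof").sort_stats("cumulative").print_stats(10)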

Ben
CodeFarmer

1 Answer

Have the DBAPI handle iterating through the parameters, instead of calling execute() once per row:

def insertdata(db, tuplelist):
    i = db.insert()
    # passing a list of parameter dicts lets the DBAPI use executemany()
    i.execute([dict(user=elem[0], password=elem[1]) for elem in tuplelist])

# ----- main -----
tuplelist = readfile("C:/py/tst.txt")  # parsing the file is fast
mydb = initdatabase()
insertdata(mydb, tuplelist)
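For a self-contained picture, here is a sketch of the same idea with explicit SQLAlchemy Core setup; the engine URL and the users table definition are assumptions, since the question does not show initdatabase(). Passing a list of dicts to execute() makes the DBAPI iterate the parameters, and engine.begin() commits once at the end, which also answers the "commit later" part of the question:

import sqlalchemy as sa

# hypothetical engine and table; stand-ins for the question's initdatabase()
engine = sa.create_engine("sqlite:///test.db")
metadata = sa.MetaData()
users = sa.Table(
    "users", metadata,
    sa.Column("user", sa.String),
    sa.Column("password", sa.String),
)
metadata.create_all(engine)

# tuplelist as produced by the question's readfile()
rows = [{"user": u, "password": p} for u, p in tuplelist]

# one executemany inside one transaction: the DBAPI iterates the
# parameter list and the commit happens once at the end
with engine.begin() as conn:
    conn.execute(users.insert(), rows)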
zzzeek