For example:
books = [
    {'name': 'pearson', 'price': 60, 'author': 'Jesse Pinkman'},
    {'name': 'ah publications', 'price': 80, 'author': 'Gus Fring'},
    {'name': 'euclidean', 'price': 120, 'author': 'Skyler White'},
    {'name': 'Nanjial', 'price': 260, 'author': 'Saul Goodman'},
]
I need to insert each dictionary into an already created table, taking only the 'author' and 'price' keys. I have about 100k records to insert. Right now I loop through the list of dictionaries, pull out the required key/value pairs, and insert the rows one by one:
def insert_books(self, val):
    cur = self.con.cursor()
    # Builds the SQL by string formatting, so every row is a separate
    # statement and a separate commit.
    sql = """insert into testtable values {}""".format(val)
    cur.execute(sql)
    self.con.commit()
    cur.close()

for i in books:
    result = i['author'], i['price']
    db_g.insert_books(result)  # db_g is an instance of the class holding the connection
So is there a faster and easier way to bulk insert the data, say 10k rows at a time?
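One common approach is to build the full list of parameter tuples once and pass it to the cursor's `executemany()` in batches, with a single commit at the end, instead of one `execute()` and one `commit()` per row. The sketch below uses an in-memory SQLite database as a stand-in for the real connection, since the question doesn't say which driver is in use; the placeholder style varies by driver (`?` for sqlite3, `%s` for psycopg2 or MySQLdb). The batch size and table schema here are assumptions for illustration.

```python
import sqlite3

# Sample data mirroring the question's structure.
books = [
    {'name': 'pearson', 'price': 60, 'author': 'Jesse Pinkman'},
    {'name': 'ah publications', 'price': 80, 'author': 'Gus Fring'},
    {'name': 'euclidean', 'price': 120, 'author': 'Skyler White'},
    {'name': 'Nanjial', 'price': 260, 'author': 'Saul Goodman'},
]

con = sqlite3.connect(':memory:')  # stand-in for self.con
cur = con.cursor()
cur.execute("create table testtable (author text, price integer)")

# Extract only the needed keys up front, once.
rows = [(b['author'], b['price']) for b in books]

# Insert in batches of up to 10k rows, committing once at the end.
BATCH = 10000
for start in range(0, len(rows), BATCH):
    cur.executemany("insert into testtable values (?, ?)",
                    rows[start:start + BATCH])
con.commit()
```

Using parameter placeholders instead of `str.format` also avoids SQL injection and lets the driver handle quoting. For PostgreSQL specifically, `psycopg2.extras.execute_values` is typically faster still.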