Most databases nowadays support inserting multiple records in a single statement. That is much faster than inserting records one by one, because only one transaction is needed. The SQL syntax looks like this:
INSERT INTO tbl_name (a,b,c)
VALUES (1,2,3), (4,5,6), (7,8,9);
Right now I'm using Python Scrapy on a small project, and I use its item pipeline to store scraped data in a database. However, an item pipeline's process_item() method is called once per item, so it always inserts a single item at a time. How can I collect, say, 100 items and insert them all in one run?
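For reference, this is roughly the shape of my current pipeline (simplified; sqlite3 and the table/column names a, b, c are just placeholders matching the example above, not my real schema):

import sqlite3


class SingleInsertPipeline:
    """Current approach: one INSERT per scraped item (placeholder names)."""

    def open_spider(self, spider):
        # Open the connection once when the spider starts.
        self.conn = sqlite3.connect("scraped.db")

    def close_spider(self, spider):
        self.conn.commit()
        self.conn.close()

    def process_item(self, item, spider):
        # Scrapy calls this once per item, so every item is its own INSERT.
        self.conn.execute(
            "INSERT INTO tbl_name (a, b, c) VALUES (?, ?, ?)",
            (item["a"], item["b"], item["c"]),
        )
        self.conn.commit()  # one transaction per item -- this is the slow part
        return item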