
I currently have a database with 7 tables, each table with 40k rows. I would like to create a new database with the same 7 table names, containing the data from rows 20k to 40k of each table. How could I possibly achieve that using Python?

Any help is highly appreciated.

Vandan Revanur
    SQLite keeps its data in a single file, so copy this file to a new name and use an SQL query to delete the first 20k rows. – furas Sep 20 '19 at 08:50
  • Yes, I tried doing that using a database explorer, but it consumes a lot of memory and hangs execution. That is why I would like to know if I can perform this action via a Python script. – Vandan Revanur Sep 20 '19 at 10:11
  • You can do this with Python, but I expect it will also use a lot of memory or CPU to read from one file and write to another. You could do it partially - read a few rows from one file and write them to the other file, then read a few more rows and write them, etc. Maybe you could do the same when you delete rows. – furas Sep 20 '19 at 10:15
  • BTW: [Copy table structure to new table in sqlite3](https://stackoverflow.com/questions/12730390/copy-table-structure-to-new-table-in-sqlite3), [SQLite: moving rows from one SQLite database to another SQLite database ](https://gist.github.com/cfmeyers/6347564). All without using Python. – furas Sep 20 '19 at 10:49
  • @Vandan Why do this in a database explorer? Just copy the file using a standard OS function like any other file. Then open the new database, execute the appropriate `DELETE` SQL commands, and VACUUM the database. This should not use lots of memory and hang execution. Something else must be wrong. If you have a more specific question about your hanging code, you need to provide more detail... likely by asking another question with code snippets. – C Perkins Sep 20 '19 at 12:16
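
The copy-then-delete approach described in the comments can be sketched in Python roughly as follows. This assumes each table has an implicit integer `rowid` and that "rows 20k to 40k" means `rowid > 20000`; the function name and table names are placeholders, not from the question:

```python
import shutil
import sqlite3

def copy_second_half(src_path, dst_path, table_names, keep_from=20000):
    # 1. Copy the database file like any other file.
    shutil.copyfile(src_path, dst_path)
    # 2. Open the copy and delete the rows we do not want to keep.
    con = sqlite3.connect(dst_path)
    cur = con.cursor()
    for name in table_names:
        # Table names cannot be bound as SQL parameters, hence the
        # f-string; only use this with trusted, hard-coded names.
        cur.execute(f'DELETE FROM "{name}" WHERE rowid <= ?', (keep_from,))
    con.commit()
    # 3. Reclaim the space freed by the deletes.
    cur.execute("VACUUM")
    con.close()
```

Note that if the tables were ever subject to deletes, `rowid` values may not line up with "row numbers"; in that case a `WHERE` clause on an actual column would be needed instead.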
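
Alternatively, along the lines of the linked "moving rows from one SQLite database to another" approach, the old database can be attached to a fresh one and only the wanted slice copied over. Again the function name, table names, and the `rowid`-based range are assumptions for illustration; note that `CREATE TABLE ... AS SELECT` copies columns and data but not indexes or constraints:

```python
import sqlite3

def copy_row_range(src_path, dst_path, table_names, start=20000, end=40000):
    lo, hi = int(start), int(end)  # trusted ints, inlined into the SQL
    con = sqlite3.connect(dst_path)  # new, initially empty database
    con.execute("ATTACH DATABASE ? AS src", (src_path,))
    for name in table_names:
        # Recreate each table in the new database with the same columns,
        # copying only the requested slice of rows.
        con.execute(f'CREATE TABLE "{name}" AS '
                    f'SELECT * FROM src."{name}" '
                    f'WHERE rowid > {lo} AND rowid <= {hi}')
    con.commit()
    con.execute("DETACH DATABASE src")
    con.close()
```

This avoids copying the full 40k rows per table only to delete half of them afterwards.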

0 Answers