
I have a rather large sqlite3 database that I've built and populated through Python over a few months. I did many inserts/deletes etc. while putting it together, and it now uses the majority of my disk space, so I don't have enough space left to add a required index or do the other things I need to do with it.

I found out about the `VACUUM;` command, which sounds like it would reduce the disk space my database uses. However, because it rebuilds the database into a temporary copy, it needs more spare disk space than I have left (I have already deleted as many other files as I possibly can). Is there any way to run the VACUUM in bits/chunks to avoid needing so much spare disk space? Or is there anything else I could do to reduce the size of my database?
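
For context, a rough gauge of how much space a `VACUUM` could actually give back is to compare the file's free-list page count against its total page count. This is a minimal sketch, assuming the database sits at a hypothetical path `big.db`:

```python
import sqlite3

# Hypothetical path -- adjust to your environment.
conn = sqlite3.connect("big.db")

page_size  = conn.execute("PRAGMA page_size").fetchone()[0]
page_count = conn.execute("PRAGMA page_count").fetchone()[0]
freelist   = conn.execute("PRAGMA freelist_count").fetchone()[0]

# Free-list pages are unused space inside the file that a VACUUM would release.
print(f"file size      : {page_size * page_count / 1e6:,.1f} MB")
print(f"reclaimable (~): {page_size * freelist / 1e6:,.1f} MB")
conn.close()
```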

Emi OB
  • How about `VACUUM INTO SomeTempPath` to get a new and vacuumed db on another internal/external drive? Then replace the original with it. Note, the original db remains _unvacuumed_ this way. See [VACUUM with an INTO clause](https://www.sqlite.org/matrix/lang_vacuum.html) and the sketch after these comments. – dr.null Sep 01 '22 at 11:48
  • @dr.null Thanks for the idea! Unfortunately, I don't think I'll be allowed an external device for security reasons, and the shared drive won't be big enough. – Emi OB Sep 01 '22 at 11:52
  • I see. Then tell the security guy that you can't beat physics. If you don't have enough space to hold the temp db during the `VACUUM` process, then you need some `VOODOO` command rather than `VACUUM`. – dr.null Sep 01 '22 at 12:02
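
Following up on the `VACUUM INTO` suggestion above, here is a minimal sketch of how it might look from Python. The paths are hypothetical placeholders, `VACUUM INTO` requires SQLite 3.27 or newer, and the destination would need to live on a drive with enough free space.

```python
import sqlite3

# Hypothetical paths -- adjust to your environment.
SRC_DB = "big.db"                      # the existing, bloated database
DEST_DB = "/mnt/share/big_compact.db"  # a location with enough free space

conn = sqlite3.connect(SRC_DB)

# VACUUM INTO needs SQLite 3.27+ (check the library version, not the module version).
print("SQLite library version:", sqlite3.sqlite_version)

# Writes a fully vacuumed copy of the database to DEST_DB;
# the original file is left unchanged (and still un-vacuumed).
conn.execute("VACUUM INTO ?", (DEST_DB,))
conn.close()
```

After verifying that the copy opens and contains the expected data, the original file could be deleted and the compacted copy moved into its place.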

0 Answers