
I have a big ol' dbm file that's being created and used by my Python program. It saves a good amount of RAM, but it's getting big, and I suspect I'll soon have to gzip it to lower the footprint.

I guess usage will involve un-gzipping it to disk, using it, and erasing the extracted dbm when I'm done.
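
A rough sketch of that workaround, assuming the dbm lives in a single file (as it does with gdbm); the `lookup` helper is just an illustrative name:

```python
import dbm
import gzip
import os
import shutil
import tempfile

def lookup(compressed_path, key):
    # Extract the gzipped dbm to a temporary file on disk.
    fd, tmp_path = tempfile.mkstemp(suffix=".dbm")
    try:
        with gzip.open(compressed_path, "rb") as src, os.fdopen(fd, "wb") as dst:
            shutil.copyfileobj(src, dst)
        # Read from the extracted copy ...
        with dbm.open(tmp_path, "r") as db:
            return db.get(key)
    finally:
        # ... and erase the extracted dbm when done.
        os.remove(tmp_path)
```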

I was wondering whether there's some nice way of compressing the dbm while still being able to work with it. In my specific use case, I only need to read from it.

Thanks.

Jay

1 Answer


You can gzip the values, or use a key/value store that supports compression, like wiredtiger.
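
For the first option, a minimal sketch of compressing each value on its way into the dbm, using zlib rather than the gzip module since the values are small in-memory strings (the `put`/`get` helpers are just illustrative):

```python
import dbm
import zlib

def put(db_path, key, value):
    # Compress the value before storing it.
    with dbm.open(db_path, "c") as db:
        db[key] = zlib.compress(value.encode("utf-8"))

def get(db_path, key):
    # Decompress on the way back out.
    with dbm.open(db_path, "r") as db:
        return zlib.decompress(db[key]).decode("utf-8")
```

This only shrinks the values, not the keys or the dbm's own overhead, so it helps most when the values dominate the file size.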

amirouche
  • I actually thought LevelDB might work, but I was hoping to stick to the standard library. The values are paths to files, if that's relevant. – Jay Apr 01 '17 at 05:13
  • wiredtiger also has key prefix compression. LevelDB is much slower than wiredtiger. If you can't afford the GPL, try RocksDB. – amirouche Apr 01 '17 at 17:53