I have a large dictionary object dict_tmp that takes 40GB in RAM (the system has 64GB total), with string keys and float values. I use d = shelve.open(fname, protocol=2) and d['dict_tmp'] = dict_tmp to save the dictionary, which produces the following error:
Traceback (most recent call last):
  File "file.py", line 160, in <module>
    d['dict_tmp'] = dict_tmp
  File "/usr/lib/python2.7/shelve.py", line 133, in __setitem__
    self.dict[key] = f.getvalue()
  File "/usr/lib/python2.7/bsddb/__init__.py", line 279, in __setitem__
    _DeadlockWrap(wrapF)  # self.db[key] = value
  File "/usr/lib/python2.7/bsddb/dbutils.py", line 68, in DeadlockWrap
    return function(*_args, **_kwargs)
  File "/usr/lib/python2.7/bsddb/__init__.py", line 278, in wrapF
    self.db[key] = value
TypeError: Data values must be of type string or None.
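For reference, here is a minimal sketch of the save code described above (a tiny toy dictionary stands in for the real 40GB one, and the file path is a throwaway temp path, not my actual fname):

```python
import os
import shelve
import tempfile

# Toy stand-in for the real 40GB dictionary: string keys, float values.
dict_tmp = {'key_a': 1.0, 'key_b': 2.5}

fname = os.path.join(tempfile.mkdtemp(), 'dict_shelf')  # throwaway path
d = shelve.open(fname, protocol=2)
d['dict_tmp'] = dict_tmp  # the entire dict is pickled into a single value
d.close()

# Reading it back
d = shelve.open(fname)
restored = d['dict_tmp']
d.close()
```

Note that the whole dictionary is pickled into one value under the single key 'dict_tmp', so the stored value is roughly as large as the dictionary itself.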
I believe the shelve module is meant for general Python objects, but I think this issue may be related to the anydbm backend. Any help would be really appreciated!
Side question: if not shelve (as answered here and here), what is the best way to save large dictionaries? Thank you so much!
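For the record, one variation I am considering (a sketch only, with my own helper names, and untested at 40GB) is storing each key/value pair as its own shelve entry rather than pickling the whole dictionary into one giant value:

```python
import os
import shelve
import tempfile

def save_dict_per_key(d, fname):
    # Store each key/value pair as its own shelve entry, so no single
    # stored value is larger than one pickled float.
    sh = shelve.open(fname, protocol=2)
    try:
        for k, v in d.items():
            sh[k] = v
    finally:
        sh.close()

def load_dict_per_key(fname):
    # Rebuild the dictionary from the individual shelve entries.
    sh = shelve.open(fname)
    try:
        return dict(sh)
    finally:
        sh.close()

# Tiny demonstration with a throwaway temp path
fname = os.path.join(tempfile.mkdtemp(), 'per_key_shelf')
save_dict_per_key({'x': 1.5, 'y': 2.5}, fname)
restored = load_dict_per_key(fname)
```

This only works because my keys are already strings (shelve keys must be strings), but I do not know whether it is the recommended approach.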