
If a dill file is too large to fit in RAM, is it possible to load it some other way? For example, Python 3 throws a MemoryError when I load a serialized object of about 1.2 GB.

import dill

with open('SerializedData.pkl', 'rb') as file:
    data = dill.load(file)

This fails because the RAM is too small. Is it possible to load it in a different way so that I can extract the data without exhausting the RAM?

Troll_Hunter
  • Load it into _where_? RAM? See the problem... – Boris the Spider Sep 08 '15 at 18:54
  • If you have a single large object as your question implies, I don't think you can do much about it. Though you could try increasing the swap size and pray that your script won't start thrashing too much. – Cristian Ciupitu Sep 08 '15 at 19:42
  • It could also be a bug, either of dill or cPickle like this [one](http://bugs.python.org/issue13555). Addendum for the previous comment: if you're on a Unix like system, you might also need to increase the ulimits. – Cristian Ciupitu Sep 08 '15 at 19:47
  • Related: [Pickle File too large to load](http://stackoverflow.com/q/26394768/12892) – Cristian Ciupitu Sep 08 '15 at 21:26
  • possible duplicate of [Getting Dill Memory Error when loading serialized object, how to fix?](http://stackoverflow.com/questions/32464356/getting-dill-memory-error-when-loading-serialized-object-how-to-fix) – Cristian Ciupitu Sep 09 '15 at 00:52
  • I'm the `dill` author. It depends on what you are trying to load. If you are serializing data, it might be possible to use compressed or reduced format. For example, if it's a `numpy.ndarray`, you could use `numpy` serialization. With `dill`, you have several settings, such as `byref` and `recurse`, which can change the size of what's pickled. However, if you are looking to pickle data, I'd check if the data object has a preferred serialization method. However, for an already built pickle, you are pretty much screwed, I believe. – Mike McKerns Sep 09 '15 at 08:04
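
A minimal sketch (not from the thread) of the numpy route suggested in the last comment, assuming the 1.2 GB object is a `numpy.ndarray`; the file name and array below are placeholders:

import numpy as np

# Small stand-in for the large array that was previously passed to dill.dump().
big_array = np.zeros((1000, 1000))

# Serialize with numpy's native format instead of dill/pickle.
np.save('SerializedData.npy', big_array)

# mmap_mode='r' maps the file read-only; pages are read from disk only
# when slices are actually accessed, so the whole array never sits in RAM.
data = np.load('SerializedData.npy', mmap_mode='r')
chunk = data[:100]  # only this slice is pulled into memory

The `dill` settings mentioned above (`byref`, `recurse`) only change how much is written at dump time (e.g. `dill.settings['byref'] = True` before calling `dill.dump()`); as the comment notes, they cannot shrink a pickle that has already been built.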

0 Answers