
I used setattr to attach some extra information to a PyTorch Tensor, which can be retrieved as expected:

import torch
before_write = torch.Tensor()
setattr(before_write, "features", "my_features")
print(before_write.features)
> my_features

Unexpectedly, when I write the tensor to disk and then read it back in, the attribute has disappeared:

import os

torch.save(before_write, os.path.expanduser("~/my_tensor"))
after_write = torch.load(os.path.expanduser("~/my_tensor"))
print(after_write.features)
> AttributeError: 'Tensor' object has no attribute 'features'

Why is the attribute lost when writing the tensor to disk? Is this expected behaviour? Would there be a workaround to save the features together with the tensor?

NB: setting the attribute on an instance of an empty Python class and writing it with pickle does retain it (a minimal sketch of this is below).
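For reference, a minimal sketch of the plain-class behaviour mentioned in the NB; the class name Empty is just for illustration:

import pickle

class Empty:
    pass

obj = Empty()
setattr(obj, "features", "my_features")

# round-trip through pickle in memory
restored = pickle.loads(pickle.dumps(obj))
print(restored.features)
> my_features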

MartijnVanAttekum
  • Have you tried unpickling the saved tensor using `pickle.load()` instead of `torch.load()`? The `torch.load()` function is documented to have some extra behaviour beyond basic unpickling and it seems likely the attribute gets lost there? https://pytorch.org/docs/stable/torch.html?highlight=save#torch.load – Grismar Feb 20 '20 at 21:12
  • @Grismar: I don't think it makes sense to `pickle.load()` the torch object, right? But I did try to `pickle.dump()` and then read back in with `pickle.load()`, resulting in the same error. – MartijnVanAttekum Feb 20 '20 at 21:16
  • Judging from the similarity of the serialized representations (as shown via `pickle.dumps()`) of a tensor with and without a large attribute, it seems that tensor attributes are just not serialized (a sketch of this check is below the comments). That leaves questions 2 and 3 then... – MartijnVanAttekum Feb 20 '20 at 21:53
  • The reason I thought it made sense is because the documentation for `torch.load()` states that it's using `pickle` (with specific settings), so I figured there might be a clue there whether the unpickling is the problem, or whether the specific settings for `pickle` cause the issue. – Grismar Feb 20 '20 at 22:48
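
A minimal sketch of the size comparison described two comments above (the tensor shape and the size of the attribute are arbitrary choices for illustration):

import pickle
import torch

t_plain = torch.zeros(3)
t_attr = torch.zeros(3)
t_attr.features = "x" * 10_000  # deliberately large attribute

# If the attribute were included in the serialized form, the second payload
# would be roughly 10 kB larger; in practice both sizes come out nearly
# identical, which suggests the attribute is dropped during pickling.
print(len(pickle.dumps(t_plain)))
print(len(pickle.dumps(t_attr)))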

0 Answers