>>> a = torch.arange(12).reshape(2, 6)
>>> a
tensor([[ 0,  1,  2,  3,  4,  5],
        [ 6,  7,  8,  9, 10, 11]])
>>> b = a[1:, :]
>>> b.storage() is a.storage()
False
But the slice clearly aliases the original tensor's memory:
>>> b[0, 0] = 999
>>> b, a # both tensors are changed
(tensor([[999,   7,   8,   9,  10,  11]]),
 tensor([[  0,   1,   2,   3,   4,   5],
         [999,   7,   8,   9,  10,  11]]))
What exactly is the object that stores a tensor's data? And how can I reliably check whether two tensors share memory?
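For context, my guess is that `.storage()` returns a new Python wrapper object on each call, so `is` compares the wrappers rather than the underlying buffer. Comparing the buffer addresses via `data_ptr()` seems to work in this case, but I don't know whether it is the intended or reliable way (e.g. for views with a nonzero storage offset, or across devices):

```python
import torch

a = torch.arange(12).reshape(2, 6)
b = a[1:, :]  # a view into a's buffer

# `.storage()` builds a fresh wrapper object each call, so identity
# comparison is False even though the buffer is shared:
print(b.storage() is a.storage())  # False

# My guess at a check: compare the addresses of the underlying buffers.
shares = a.storage().data_ptr() == b.storage().data_ptr()
print(shares)  # True
```

Note that `b.data_ptr()` itself differs from `a.data_ptr()` here, since `b` starts at an offset into the shared buffer, which is why I compare the storages' pointers rather than the tensors'.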