
I am currently using the tensor.resize() function to resize a tensor to a new shape: t = t.resize(1, 2, 3).

This gives me a deprecation warning:

non-inplace resize is deprecated

Hence, I wanted to switch over to the tensor.resize_() function, which seems to be the appropriate in-place replacement. However, this leaves me with a

cannot resize variables that require grad

error. I can fall back to

from torch.autograd._functions import Resize
Resize.apply(t, (1, 2, 3))

which is what tensor.resize() does internally to avoid the deprecation warning. This seems more like a hack than an appropriate solution to me. How do I correctly make use of tensor.resize_() in this case?

kmario23
LL_
  • Are you sure you want to resize the tensor, not reshape it? And if you do want to resize it, is there anything you can achieve with resize that isn't possible using slicing operations? – MBT Jun 06 '18 at 11:30
  • I think you have a point. Actually, now that you mention it I'm realizing that I probably should have used reshape in the first place. Here's the output of `t.size()` before and after the operation: torch.Size([16, 512, 8, 10, 2]) and torch.Size([16, 512, 8, 20]) – LL_ Jun 06 '18 at 11:46
  • Yes, so I think you can just use `view` or `reshape` (from version 0.4.0). – MBT Jun 06 '18 at 11:51

3 Answers


You can instead choose to go with tensor.reshape(new_shape) or torch.reshape(tensor, new_shape) as in:

# a `Variable` tensor
In [15]: ten = torch.randn(6, requires_grad=True)

# this would throw RuntimeError error
In [16]: ten.resize_(2, 3)
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-16-094491c46baa> in <module>()
----> 1 ten.resize_(2, 3)

RuntimeError: cannot resize variables that require grad

The above RuntimeError can be resolved or avoided by using tensor.reshape(new_shape):

In [17]: ten.reshape(2, 3)
Out[17]: 
tensor([[-0.2185, -0.6335, -0.0041],
        [-1.0147, -1.6359,  0.6965]])

# yet another way of changing tensor shape
In [18]: torch.reshape(ten, (2, 3))
Out[18]: 
tensor([[-0.2185, -0.6335, -0.0041],
        [-1.0147, -1.6359,  0.6965]])
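
As a side note, reshape keeps the result attached to the autograd graph, so gradients still flow back to the original tensor; a minimal check:

```python
import torch

ten = torch.randn(6, requires_grad=True)
out = ten.reshape(2, 3)   # no RuntimeError, and the graph is preserved
out.sum().backward()
print(ten.grad)           # gradient of ones, with shape (6,)
```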
kmario23
    As already noted by @blue-phoenox in the comments to the question, what I tried to achieve was indeed a reshape rather than a resize. I implemented the solution suggested here and it works just fine! – LL_ Jun 06 '18 at 15:53
  • 1
    As a side-note: I used the `tensor.reshape()` function due to its easier readability! – LL_ Jun 06 '18 at 15:54

You can try something like:

import torch
x = torch.tensor([[1, 2], [3, 4], [5, 6]])
print(":::", x.resize_(2, 2))   # shrinks in place, keeps the first 4 elements
print("::::", x.resize_(3, 3))  # grows in place; the new elements are uninitialized
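
Note that this only works because x does not require grad. For context, a small sketch of both resize_ cases (the uninitialized-memory caveat when growing is one reason reshape is usually safer):

```python
import torch

x = torch.tensor([[1, 2], [3, 4], [5, 6]])   # 6 elements in storage
x.resize_(2, 2)        # in place: keeps the first 4 elements, row-major order
print(x)               # tensor([[1, 2], [3, 4]])

# growing past the current number of elements works too, but the new
# slots contain uninitialized memory:
#   x.resize_(3, 3)

# on a tensor that requires grad, resize_ raises the error from the question
g = torch.tensor([1.0, 2.0], requires_grad=True)
try:
    g.resize_(1, 2)
except RuntimeError as err:
    print(err)         # "cannot resize variables that require grad"
```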
Antony Hatchkins
Sunil

Simply use t = t.contiguous().view(1, 2, 3) if you don't actually want to change its data.

Otherwise, the in-place resize_ operation will break the grad computation graph of t.
If that doesn't matter to you, just use t = t.data.resize_(1, 2, 3).
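
Using the shapes from the comment thread above, a minimal sketch showing that gradients still flow through contiguous().view():

```python
import torch

t = torch.randn(16, 512, 8, 10, 2, requires_grad=True)
# contiguous() is a no-op here since t is already contiguous;
# view() reinterprets the same data and keeps the autograd graph intact
out = t.contiguous().view(16, 512, 8, 20)
out.sum().backward()
print(t.grad.shape)    # torch.Size([16, 512, 8, 10, 2])
```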

Daniel