
Now that torch.autograd.Variable has been merged into torch.Tensor and is obsolete, why did they deprecate some functions in torch.nn.functional but not others? Namely, tanh is deprecated, but not sigmoid or relu.

>>> torch.__version__
'1.1.0'
>>> u
tensor(2., grad_fn=<MeanBackward0>)
>>> torch.nn.functional.tanh(u)
C:\Users\mlearning\AppData\Local\Continuum\anaconda3\lib\site-packages\torch\nn\functional.py:1374: UserWarning: nn.functional.tanh is deprecated. Use torch.tanh instead.
  warnings.warn("nn.functional.tanh is deprecated. Use torch.tanh instead.")
tensor(0.9640, grad_fn=<TanhBackward>)   
>>> torch.nn.functional.sigmoid(u)
tensor(0.8808, grad_fn=<SigmoidBackward>)    
>>> torch.nn.functional.relu(u)
tensor(2., grad_fn=<ReluBackward0>)

Is there any difference between torch.nn.functional.relu and torch.relu, or can I use them interchangeably?
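
For reference, a quick sanity check (just a sketch; x is an arbitrary random input) shows the two returning identical values:

>>> import torch
>>> x = torch.randn(5)  # arbitrary example input
>>> torch.equal(torch.relu(x), torch.nn.functional.relu(x))
True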

user31264
    These kinds of design choices are hard to answer here. You're better off asking on GitHub, where the devs will see them more easily. As for usage: there should be no difference, but if one is deprecated and the other is not, it's best to use the non-deprecated one. – Bram Vanroy Jun 23 '19 at 11:58

1 Answer


You can check this thread, where one of the main PyTorch designers (actually one of its creators) set the directive.

You can also check the reasoning behind it there. You may also want to propose the same for the other two functions.

The others should be deprecated as well.
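
As a minimal sketch of the preferred spelling (u here is a fresh scalar tensor standing in for the one in the question), all three activations are available directly in the torch namespace, with no deprecation warning:

>>> import torch
>>> u = torch.tensor(2.0, requires_grad=True)  # stand-in for the question's u
>>> torch.tanh(u)
tensor(0.9640, grad_fn=<TanhBackward>)
>>> torch.sigmoid(u)
tensor(0.8808, grad_fn=<SigmoidBackward>)
>>> torch.relu(u)
tensor(2., grad_fn=<ReluBackward0>)

The values match the torch.nn.functional outputs in the question, which is consistent with the comment above that there is no usage difference.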

prosti