Now that torch.autograd.Variable has been merged into torch.Tensor and is obsolete, why were some functions in torch.nn.functional deprecated but not others? Namely, tanh is deprecated, but sigmoid and relu are not.
>>> torch.__version__
'1.1.0'
>>> u
tensor(2., grad_fn=<MeanBackward0>)
>>> torch.nn.functional.tanh(u)
C:\Users\mlearning\AppData\Local\Continuum\anaconda3\lib\site-packages\torch\nn\functional.py:1374: UserWarning: nn.functional.tanh is deprecated. Use torch.tanh instead.
warnings.warn("nn.functional.tanh is deprecated. Use torch.tanh instead.")
tensor(0.9640, grad_fn=<TanhBackward>)
>>> torch.nn.functional.sigmoid(u)
tensor(0.8808, grad_fn=<SigmoidBackward>)
>>> torch.nn.functional.relu(u)
tensor(2., grad_fn=<ReluBackward0>)
Is there any difference between torch.nn.functional.relu and torch.relu, or can I use them interchangeably?
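For what it's worth, a quick check (a sketch, assuming a scalar tensor like the `u` above) suggests the two entry points produce identical results on the same input:

```python
import torch

# Compare the two relu entry points on the same scalar input.
u = torch.tensor(2.0, requires_grad=True)

a = torch.relu(u)                # tensor-level function
b = torch.nn.functional.relu(u)  # nn.functional variant

print(torch.equal(a, b))  # the forward values match
```

This only demonstrates that the forward values agree; it does not answer why one set of functions was deprecated and the other was not.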