Can I get a Gradient Clipping function in Chainer?
I found the corresponding code in the PyTorch documentation: https://pytorch.org/docs/stable/_modules/torch/nn/utils/clip_grad.html
Is there an equivalent function in Chainer? I found chainer.optimizer_hooks.GradientClipping, but I find it hard to use.
Thanks in advance.