It has been shown that Randomized ReLU (RReLU) and Parametric ReLU (PReLU) perform better than plain ReLU on the CIFAR-10, CIFAR-100, and NDSB datasets. While Keras provides documentation on how to use PReLU, I could not find any for RReLU. Can someone explain how to use the Randomized ReLU activation in Keras?
Below is the description of RReLU from the paper arXiv:1505.00853v2.
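Since I could not find a built-in RReLU layer, here is a minimal sketch of what I assume a custom implementation might look like, based on the paper's description: during training the negative slope is sampled uniformly per element, and at test time it is fixed to the midpoint of the range. The slope bounds `lower=1/8` and `upper=1/3` are my assumption, chosen to match the range used in the paper's experiments; the class name `RReLU` is hypothetical.

```python
import tensorflow as tf

class RReLU(tf.keras.layers.Layer):
    """Randomized leaky ReLU (sketch, not an official Keras layer).

    Training: negative inputs are scaled by a slope drawn per element
    from U(lower, upper). Inference: the slope is fixed to the midpoint
    (lower + upper) / 2, as described in arXiv:1505.00853v2.
    """

    def __init__(self, lower=1 / 8, upper=1 / 3, **kwargs):
        super().__init__(**kwargs)
        self.lower = lower  # assumed slope range, per the paper's experiments
        self.upper = upper

    def call(self, inputs, training=None):
        if training:
            # One random slope per element, resampled every forward pass.
            alpha = tf.random.uniform(tf.shape(inputs), self.lower, self.upper)
        else:
            # Deterministic midpoint slope at inference time.
            alpha = (self.lower + self.upper) / 2.0
        return tf.where(inputs >= 0, inputs, alpha * inputs)
```

It could then be dropped into a model like any other activation layer, e.g. `model.add(RReLU())` after a `Dense` or `Conv2D` layer, though I am not sure whether this matches how an official implementation would behave.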