You can use tf.custom_gradient to define your own forward and backward steps in a single function. Here is a simple example:
import tensorflow as tf

tf.InteractiveSession()

@tf.custom_gradient
def custom_multiply(a, x):
    # Define your own forward step
    y = a * x
    # Define your own backward step
    def grads(dy):
        return dy * x, dy * a + 100
    # Return the forward result and the backward function
    return y, grads

a, x = tf.constant(2), tf.constant(3)
y = custom_multiply(a, x)
dy_dx = tf.gradients(y, x)[0]
# Prints `dy/dx = 102`; without the customized gradient it would be 2
print('dy/dx =', dy_dx.eval())
If you want to customize your own layer, simply replace the core function used in tf.layers.Dropout.call with your own.