
I am trying to implement the BReLU activation function using tensorflow.keras. BReLU applies ReLU to even-indexed features and a mirrored ReLU to odd-indexed features:

f(x_i) =  ReLU(x_i)   if i is even
f(x_i) = -ReLU(-x_i)  if i is odd

Following is the code I wrote for the custom layer:

import tensorflow as tf
from tensorflow.keras.layers import Layer

class BReLU(Layer):

    def __init__(self):
        super(BReLU, self).__init__()

    def call(self, inputs):
        # intended: ReLU on even-indexed features, -ReLU(-x) on odd-indexed ones
        for i, element in enumerate(inputs):
            if i % 2 == 0:
                inputs[i] = tf.nn.relu(inputs[i])
            else:
                inputs[i] = -tf.nn.relu(-inputs[i])

I am trying to test the implementation using the following code snippet:

>>> import warnings
>>> warnings.filterwarnings('ignore')
>>> from custom_activation import BReLU
>>> from tensorflow.keras.layers import Input
>>> from tensorflow.keras.models import Model
>>> inp = Input(shape = (128,))
>>> x = BReLU()(inp)

Upon executing the test snippet, I am getting the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\KIIT_Intern\.conda\envs\style_transfer\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 554, in __call__
    outputs = self.call(inputs, *args, **kwargs)
  File "C:\Workspace\Echo\Echo\Activation\Tensorflow\custom_activation.py", line 308, in call
    for i, element in enumerate(inputs):
  File "C:\Users\KIIT_Intern\.conda\envs\style_transfer\lib\site-packages\tensorflow\python\framework\ops.py", line 442, in __iter__
    "Tensor objects are only iterable when eager execution is "
TypeError: Tensor objects are only iterable when eager execution is enabled. To iterate over this tensor use tf.map_fn.

How do I modify the implementation of the layer to make it work without enabling eager execution?


1 Answer

Assuming that i indexes the last axis (the feature axis):

import tensorflow as tf
from tensorflow.keras import backend as K

def brelu(x):
    # get the static shape of x; we only need the last axis, which is constant
    shape = K.int_shape(x)

    # size of the last axis
    dim = shape[-1]

    # half of the last axis (+1 if the size is odd)
    dim2 = dim // 2
    if dim % 2 != 0:
        dim2 += 1

    # multiplier will be a flat tensor of alternating +1 and -1
    multiplier = K.ones((dim2,))
    multiplier = K.stack([multiplier, -multiplier], axis=-1)  # shape (dim2, 2)
    multiplier = K.reshape(multiplier, (-1,))  # flatten to [+1, -1, +1, -1, ...]
    if dim % 2 != 0:
        multiplier = multiplier[:-1]  # drop the extra trailing element for odd sizes

    # reshape so the multiplier broadcasts against x
    multiplier = K.reshape(multiplier, tuple(1 for _ in shape[:-1]) + (-1,))

    return multiplier * tf.nn.relu(multiplier * x)
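
Why the multiplier trick works: where the multiplier is +1, multiplier * relu(multiplier * x) reduces to relu(x); where it is -1, it reduces to -relu(-x). Those are exactly the two branches of the loop in the question, computed in a single vectorized operation instead of iterating over the symbolic tensor.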

Use it in a Lambda layer:

from tensorflow.keras.layers import Lambda

x = Lambda(brelu)(inp)
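
If you prefer to keep the custom-layer interface from the question, here is a minimal sketch that wraps brelu in a Layer subclass; it assumes the brelu function above is in scope, and the class name just mirrors the question:

from tensorflow.keras.layers import Layer

class BReLU(Layer):

    def call(self, inputs):
        # delegate to the vectorized brelu, so call never iterates
        # over a symbolic tensor and works without eager execution
        return brelu(inputs)

    def compute_output_shape(self, input_shape):
        # elementwise activation: output shape equals input shape
        return input_shape

With this, the test snippet from the question should run unchanged:

>>> inp = Input(shape = (128,))
>>> x = BReLU()(inp)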
  • In the penultimate line of the brelu function, I am getting the error `TypeError: unsupported operand type(s) for +: 'generator' and 'tuple'` after executing your implementation. – Soumik Rakshit Aug 12 '19 at 17:44
    @SoumikRakshit, Use `multiplier = K.reshape(multiplier, tuple(1 for _ in shape[:-1]) + (-1,))` – Daniel Möller Aug 12 '19 at 19:06