
How can I use the PReLU activation function with a variable input size?

from keras.layers import Conv2D, Input
from keras.layers.advanced_activations import PReLU

X = Input(shape=(None,None,1))
conv1Y = Conv2D(filters=49, kernel_size=7, name='conv1')(X)
conv1Y = PReLU()(conv1Y)

The code above yields

TypeError: unsupported operand type(s) for *: 'NoneType' and 'NoneType'

Manuel Schmidt
  • Have you tried `conv1Y = Conv2D(filters=49, kernel_size=7, name='conv1', activation=PReLU())(X)`? – Marcin Możejko May 22 '17 at 08:23
  • Yes: `UserWarning: Do not pass a layer instance (such as PReLU) as the activation argument of another layer. Instead, advanced activation layers should be used just like any other layer in a model.` ... `TypeError: unsupported operand type(s) for *: 'NoneType' and 'NoneType'` – Manuel Schmidt May 22 '17 at 08:52

1 Answer


Use `PReLU(shared_axes=[1, 2])`. By default PReLU learns a separate slope for every element of its input, which requires fixed spatial dimensions at build time; sharing the slopes across the height and width axes leaves only one parameter per channel, so the layer can be built even when those dimensions are `None`.
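
For example, a minimal sketch applying this to the snippet from the question (building the `Model` and calling `summary()` are only added here for illustration):

from keras.layers import Conv2D, Input
from keras.layers.advanced_activations import PReLU
from keras.models import Model

# Height and width stay None so the network accepts variable-sized inputs.
X = Input(shape=(None, None, 1))
conv1Y = Conv2D(filters=49, kernel_size=7, name='conv1')(X)
# shared_axes=[1, 2] ties the learnable slopes across height and width,
# leaving one slope per channel, so the unknown spatial dimensions are fine.
conv1Y = PReLU(shared_axes=[1, 2])(conv1Y)

model = Model(inputs=X, outputs=conv1Y)
model.summary()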

roj4s