
I have two packages I'd like to use: one is written in Keras 1.2, and the other in TensorFlow. I'd like to incorporate part of the architecture that is built in TensorFlow into a Keras model.

A partial solution is suggested here, but it's for a Sequential model. The suggestion regarding functional models - wrapping the pre-processing in a Lambda layer - didn't work.

The following code worked:

from keras.layers import Input, Lambda, Flatten
from keras.models import Model

inp = Input(shape=input_shape)

def ID(x):
    # identity function, just to exercise the Lambda layer
    return x

lam = Lambda(ID)
flatten = Flatten(name='flatten')
output = flatten(lam(inp))
Model(input=[inp], output=output)

But when I replaced flatten(lam(inp)) with a pre-processed output tensor, flatten(lam(TF_processed_layer)), I got: "Output tensors to a Model must be Keras tensors. Found: Tensor("Reshape:0", shape=(?, ?), dtype=float32)"
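For reference, a minimal sketch of the failing pattern (input_shape and the tf.abs call are illustrative stand-ins for the real pre-processing):

import tensorflow as tf
from keras.layers import Input, Lambda, Flatten
from keras.models import Model

input_shape = (8, 8)                # illustrative
inp = Input(shape=input_shape)
TF_processed_layer = tf.abs(inp)    # any raw TF op strips the Keras metadata
output = Flatten(name='flatten')(Lambda(lambda x: x)(TF_processed_layer))
Model(input=[inp], output=output)   # raises "Output tensors to a Model must be Keras tensors ..."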


2 Answers


You could try wrapping your input tensor in a Keras Input layer and carrying on building your model from there, like so:

from keras.layers import Input, Lambda, Flatten
from keras.models import Model

inp = Input(tensor=tftensor, shape=input_shape)

def ID(x):
    return x

lam = Lambda(ID)
flatten = Flatten(name='flatten')
output = flatten(lam(inp))
Model(input=inp, output=output)
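A minimal end-to-end sketch of this idea, assuming Keras 1.2 on the TensorFlow backend (the placeholder and tf.abs stand in for the real TF pre-processing):

import tensorflow as tf
from keras.layers import Input, Lambda, Flatten
from keras.models import Model

raw = tf.placeholder(tf.float32, shape=(None, 8, 8))  # stand-in for the TF part
tftensor = tf.abs(raw)                                # stand-in pre-processing

# Wrapping the TF tensor in Input makes everything downstream a Keras tensor
inp = Input(tensor=tftensor, shape=(8, 8))
output = Flatten(name='flatten')(Lambda(lambda x: x)(inp))
model = Model(input=inp, output=output)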

You are not defining your lambda correctly for Keras. Try something like this:

from keras import backend as K
from keras.layers import Lambda

def your_lambda_layer(x):
    x -= K.mean(x, axis=1, keepdims=True)
    x = K.l2_normalize(x, axis=1)
    return x

....
model.add(Lambda(your_lambda_layer))

Or, if you are using the functional API, like this:

def your_lambda_layer(x):
    x -= K.mean(x, axis=1, keepdims=True)
    x = K.l2_normalize(x, axis=1)
    return x

....
x = SomeLayerBeforeLambda(options...)(x)
x = Lambda(your_lambda_layer)(x)

But even so, the Lambda layer's output may not be flattenable, so print out its shape and see what it is.
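For the shape check, something along these lines (x being the Lambda output from the snippet above):

from keras import backend as K

print(K.int_shape(x))   # static shape as a tuple, e.g. (None, 128)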
