
I just want to implement a function that, given a matrix X, returns the covariance matrix of X (X*X^T), which is just a simple matrix multiplication.

In TensorFlow it's easy: tf.matmul(X, tf.transpose(X))

But I didn't expect it to be such a nightmare in Keras. The Keras APIs such as multiply and dot don't fit my needs. I also tried different approaches (a Lambda layer mixed with TF operations) but still failed with lots of errors.

Hope someone may help. Thanks.
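For reference, a minimal NumPy sketch of the computation I'm after (the shape is just an example):

```python
import numpy as np

X = np.random.rand(4, 5)   # example shape only: 4 rows, 5 columns
cov = X @ X.T              # the product tf.matmul(X, tf.transpose(X)) computes
print(cov.shape)           # (4, 4)
```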

Rui Meng

3 Answers


Actually, there is an analogous operation in Keras: try K.dot(x, K.transpose(x)).

A working example comparing the two platforms follows.

import keras.backend as K
import numpy as np
import tensorflow as tf


def cov_tf(x_val):
    # Pure TensorFlow: x @ x.T, evaluated in a (TF1-style) session
    x = tf.constant(x_val)
    cov = tf.matmul(x, tf.transpose(x))
    return cov.eval(session=tf.Session())

def cov_keras(x_val):
    # Keras backend equivalents: K.constant, K.dot, K.transpose
    x = K.constant(x_val)
    cov = K.dot(x, K.transpose(x))
    return cov.eval(session=tf.Session())

if __name__ == '__main__':
    x = np.random.rand(4, 5)
    delta = np.abs(cov_tf(x) - cov_keras(x)).max()
    print('Maximum absolute difference:', delta)

The maximum absolute difference is printed and gives me something around 1e-7.
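For an independent sanity check outside either framework, the same product can be formed in plain NumPy (a sketch, assuming only NumPy is available); np.einsum spells out the same sum of products index-wise:

```python
import numpy as np

x = np.random.rand(4, 5)
cov = x @ x.T                       # same product as cov_tf and cov_keras above
alt = np.einsum('ij,kj->ik', x, x)  # the same sum written out explicitly
print(np.abs(cov - alt).max())      # tiny float round-off
```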

grovina
    If the goal is to perform a matrix product as a layer of a model then you should not use the backend. keras.backend will just refer the operation to the backend framework, and that causes problems when saving the model. Instead you should use keras.layers.dot which is specifically for performing tensor products in a model layer. https://keras.io/layers/merge/#dot – B Custer Feb 22 '20 at 00:48
  • Hi @grovina, I'm a newbie to TensorFlow and curious about the backend setup above. You already import TensorFlow explicitly, so why do you also need to `import keras.backend as K`? – Jie May 08 '21 at 05:50
  • @Jie, note that TF and Keras merged so you don't need to think of them separately. You could probably do it the Keras way natively in TF. – grovina May 09 '21 at 08:15

You must have a layer, and do the calculation inside that layer.

import keras.backend as K
from keras.layers import Input, Lambda
from keras.models import Model

inp = Input((your input shape))
previousLayerOutput = SomeLayerBeforeTheCovariance(blabla)(inp)

# Lambda wraps the backend operation so it behaves as a proper layer
covar = Lambda(lambda x: K.dot(K.transpose(x), x),
    output_shape = (your known shape of x))(previousLayerOutput)

nextOut = SomeOtherLayerAfterThat(blablabla)(covar)
lastOut = AnotherLayer(bahblanba)(nextOut)

model = Model(inp, lastOut)
Daniel Möller
  • Thanks for this answer. Still hate to use Lambda, but you are absolutely correct ---> "You must have a layer" for your output to be useful in a model. – jsfa11 Dec 22 '18 at 21:14
  • Yes, but the solution you suggest will not save to file, ie you will not be able to serialize the model with keras.Model.save. Though keras.Model.save_weights will probably work if you don't need to save the whole model. Use keras.layers.dot instead of making a lamda layer with a backend product. https://keras.io/layers/merge/#dot – B Custer Feb 22 '20 at 00:55
  • It does save and load. – Daniel Möller Feb 22 '20 at 02:29
  • But today I would recommend `tf.matmul`, it's so much easier to understand how it works. – Daniel Möller Feb 22 '20 at 09:42
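One detail worth noting when adapting this answer: the Lambda above computes K.dot(K.transpose(x), x), i.e. xᵀ·x, while the accepted answer computes x·xᵀ. The two differ in shape as well as meaning; a quick NumPy illustration:

```python
import numpy as np

x = np.random.rand(4, 5)   # e.g. 4 samples with 5 features each
print((x @ x.T).shape)     # (4, 4): sample-by-sample products, as in the accepted answer
print((x.T @ x).shape)     # (5, 5): feature-by-feature products, as in the Lambda above
```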

You can use keras.layers.merge.Multiply()

It takes as input a list of tensors, all of the same shape, and returns a single tensor (also of the same shape).
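Note, however, that Multiply is element-wise, not a matrix product, so it does not compute what tf.matmul does; a quick NumPy comparison (illustration only):

```python
import numpy as np

a = np.array([[1., 2.], [3., 4.]])
print(a * a)     # element-wise: [[1, 4], [9, 16]] -- what Multiply does
print(a @ a.T)   # matrix product: [[5, 11], [11, 25]] -- what the question asks for
```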

The keras documentation

Cheers A.

anthdm