
I feel like I don't really know what I'm doing, so I will describe what I think I'm doing, what I want to do, and where that fails.

Given a normal variational autoencoder:

...
# encoder head: predict a mean and a (pre-softplus) scale for the code
net = tf.layers.dense(net, units=code_size * 2, activation=None)
mean = net[:, :code_size]
std = net[:, code_size:]
# softplus is applied to std internally so the scale stays positive
posterior = tfd.MultivariateNormalDiagWithSoftplusScale(mean, std)
net = posterior.sample()
net = tf.layers.dense(net, units=input_size, ...)
...

What I think I'm doing: let the neural network produce a "mean" and "std" value and use them to create a Normal (Gaussian) distribution. Sample from that distribution and feed the sample to the decoder. In other words: learn a Gaussian distribution over the encoding.
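To make sure I understand the sampling step itself, here is a minimal NumPy sketch of what I believe the snippet above does (the encoder output and shapes are made up for illustration): the network emits 2 * code_size values per example, the second half is pushed through softplus to get a positive scale, and a reparameterized sample is mean + scale * eps.

```python
import numpy as np

def softplus(x):
    # smooth positive transform, as in MultivariateNormalDiagWithSoftplusScale
    return np.log1p(np.exp(x))

rng = np.random.default_rng(0)
code_size = 4
batch = 3

# hypothetical encoder output: 2 * code_size values per example
net = rng.standard_normal((batch, code_size * 2))
mean = net[:, :code_size]
raw_std = net[:, code_size:]

# reparameterized sample: mean + softplus(raw_std) * eps, with eps ~ N(0, I)
eps = rng.standard_normal((batch, code_size))
z = mean + softplus(raw_std) * eps
print(z.shape)  # (3, 4)
```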

Now I would like to do the same for a mixture of Gaussians.

...
# one mean and one (pre-softplus) std vector per component
net = tf.layers.dense(net, units=code_size * 2 * code_size, activation=None)

means, stds = tf.split(net, 2, axis=-1)

means = tf.split(means, code_size, axis=-1)
stds = tf.split(stds, code_size, axis=-1)

components = [tfd.MultivariateNormalDiagWithSoftplusScale(means[i], stds[i]) for i in range(code_size)]
# uniform mixing weights over the components
probs = [1.0 / code_size] * code_size

gauss_mix = tfd.Mixture(cat=tfd.Categorical(probs=probs), components=components)
net = gauss_mix.sample()
net = tf.layers.dense(net, units=input_size, ...)
...

That seemed relatively straightforward to me, except that it fails with the following error:

Shapes () and (?,) are not compatible

This seems to come from probs, which doesn't have the batch dimension (I didn't think it would need one).

I thought that probs defines the mixing probabilities between the components.
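For reference, this is my mental model of what probs should do, sketched in plain NumPy (the parameter values are made up): ancestral sampling first draws a component index according to probs, then samples from that component's Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)
code_size = 4
n_components = code_size  # my code uses code_size components

# mixing weights over the components (what I understand probs to be)
probs = np.full(n_components, 1.0 / n_components)

# hypothetical per-component parameters
means = rng.standard_normal((n_components, code_size))
stds = np.abs(rng.standard_normal((n_components, code_size))) + 0.1

# ancestral sampling: pick a component, then sample its Gaussian
k = rng.choice(n_components, p=probs)
z = means[k] + stds[k] * rng.standard_normal(code_size)
print(z.shape)  # (4,)
```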

If I define a probs that also has the batch dimension, I get the following cryptic error, and I don't know what it is supposed to mean:

Dimension -1796453376 must be >= 0

Do I generally misunderstand some concepts?

Or what do I need to do differently?

Spenhouet