There is an implementation for RBMs. The original RBM implementation is for discrete data such as binarized images, but my data is real-valued rather than binary. Does the code work for real-valued data too? I read somewhere that there is a Gaussian version of the standard RBM that handles that case; is it also implemented in that module?
- Could you please give some more details? Where and what is the implementation you're talking about? Are you talking about the RBM implementation by Geoffrey Hinton and Ruslan Salakhutdinov for their Science paper? – Amir Jan 02 '16 at 00:19
- Thanks for your response. I'm talking about this one: deeplearning.net/tutorial/code/rbm.py. Do you know of a better implementation? – HHH Jan 03 '16 at 05:34
1 Answer
In short, an RBM is simply a Markov random field on a bipartite graph, so you can use any probability distribution to describe the relationship between the visible and hidden nodes.
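Concretely, the binary RBM in the tutorial uses the energy E(v, h) = -b^T v - c^T h - v^T W h. One common parametrization of the Gaussian-Bernoulli variant for real-valued visible units (notation here is mine: b, c for the visible and hidden biases, \sigma_i for a per-unit standard deviation, roughly as in Hinton's practical guide to training RBMs) replaces the visible term with a quadratic one:

E(v, h) = \sum_i \frac{(v_i - b_i)^2}{2\sigma_i^2} - \sum_j c_j h_j - \sum_{i,j} \frac{v_i}{\sigma_i} W_{ij} h_j

With \sigma_i = 1 this gives p(v_i \mid h) = \mathcal{N}\left(b_i + \sum_j W_{ij} h_j,\; 1\right), i.e. each visible unit's conditional is a unit-variance Gaussian centred on its linear activation, which is exactly the part that changes in the sampling code.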
In terms of code, you don't need to rewrite much. The probability distribution you choose only comes into play in the contrastive divergence algorithm, so you should only have to change the way the samples are drawn. The parts of the tutorial code that would need to change are copied below.
def sample_h_given_v(self, v0_sample):
    ''' This function infers state of hidden units given visible units '''
    # compute the activation of the hidden units given a sample of
    # the visibles
    pre_sigmoid_h1, h1_mean = self.propup(v0_sample)
    # get a sample of the hiddens given their activation
    # Note that theano_rng.binomial returns a symbolic sample of dtype
    # int64 by default. If we want to keep our computations in floatX
    # for the GPU we need to specify to return the dtype floatX
    h1_sample = self.theano_rng.binomial(size=h1_mean.shape,
                                         n=1, p=h1_mean,
                                         dtype=theano.config.floatX)
    return [pre_sigmoid_h1, h1_mean, h1_sample]

def propdown(self, hid):
    '''This function propagates the hidden units activation downwards to
    the visible units

    Note that we return also the pre_sigmoid_activation of the
    layer. As it will turn out later, due to how Theano deals with
    optimizations, this symbolic variable will be needed to write
    down a more stable computational graph (see details in the
    reconstruction cost function)
    '''
    pre_sigmoid_activation = T.dot(hid, self.W.T) + self.vbias
    return [pre_sigmoid_activation, T.nnet.sigmoid(pre_sigmoid_activation)]

def sample_v_given_h(self, h0_sample):
    ''' This function infers state of visible units given hidden units '''
    # compute the activation of the visible given the hidden sample
    pre_sigmoid_v1, v1_mean = self.propdown(h0_sample)
    # get a sample of the visible given their activation
    # Note that theano_rng.binomial returns a symbolic sample of dtype
    # int64 by default. If we want to keep our computations in floatX
    # for the GPU we need to specify to return the dtype floatX
    v1_sample = self.theano_rng.binomial(size=v1_mean.shape,
                                         n=1, p=v1_mean,
                                         dtype=theano.config.floatX)
    return [pre_sigmoid_v1, v1_mean, v1_sample]
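
For real-valued inputs, a common choice is the Gaussian-Bernoulli RBM: the visible units are drawn from a Gaussian centred on the linear activation instead of from a Bernoulli. Below is a minimal sketch (not part of the tutorial) of how propdown and sample_v_given_h could be adapted, assuming unit-variance Gaussian visibles, data standardized to zero mean and unit variance, and the same attribute names (W, vbias, theano_rng) as in rbm.py; under these assumptions sample_h_given_v can stay as it is.

import theano
import theano.tensor as T

# Sketch of drop-in replacements for the corresponding methods of the
# tutorial's RBM class, assuming unit-variance Gaussian visible units.
def propdown(self, hid):
    '''Mean of the visible units given the hiddens. For Gaussian visibles
    the mean is the linear activation itself, so no sigmoid is applied;
    the activation is returned twice to keep the same interface as the
    binary version.'''
    mean_activation = T.dot(hid, self.W.T) + self.vbias
    return [mean_activation, mean_activation]

def sample_v_given_h(self, h0_sample):
    '''Sample real-valued visible units from a unit-variance Gaussian
    centred on the linear activation.'''
    pre_activation, v1_mean = self.propdown(h0_sample)
    # theano_rng.normal draws a symbolic Gaussian sample; dtype is set to
    # floatX so the computation can stay on the GPU, as in the tutorial.
    v1_sample = self.theano_rng.normal(size=v1_mean.shape,
                                       avg=v1_mean,
                                       std=1.0,
                                       dtype=theano.config.floatX)
    return [pre_activation, v1_mean, v1_sample]

With non-unit variances you would also have to rescale the visible units in propup and learn the variances, which is why most implementations simply standardize the data instead. Two further caveats: the tutorial's cross-entropy reconstruction cost assumes binary visibles, so you would want to monitor squared reconstruction error instead, and Gaussian RBMs usually need a smaller learning rate to keep contrastive divergence stable.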

Joshua Howard