
I want to use the RBM pretraining weights from the code accompanying Hinton's paper as the weights of MATLAB's native feedforwardnet toolbox. Can anyone help me set up or arrange the pre-trained weights for feedforwardnet?

For instance, I used Hinton's code from http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html

and want to use the pre-trained weights in a MATLAB feedforwardnet:

W = hintonRBMpretrained;
net = feedforwardnet([700 300 200 30 200 300 700]);
net = setwb(net, W);

How do I set up or arrange W so that it matches the feedforwardnet structure? I know how to use a single vector, but I am afraid the order of the weights in the sequence is incorrect.


1 Answer


The MATLAB feedforwardnet function returns a Neural Network object with the properties as described in the documentation. The workflow for creating a neural network with pre-trained weights is as follows:

  1. Load data
  2. Create the network
  3. Configure the network
  4. Initialize the weights and biases
  5. Train the network

Steps 1, 2, 3, and 5 are exactly as they would be when creating a neural network from scratch. Let's look at a simple example:

% 1. Load data
load fisheriris
meas = meas.';
species = species.';
targets = dummyvar(categorical(species));

% 2. Create network
net = feedforwardnet([16, 16]);

% 3. Configure the network
net = configure(net, meas, targets);

Now, we have a neural network net with 4 inputs (sepal and petal length and width), and 3 outputs ('setosa', 'versicolor', and 'virginica'). We have two hidden layers with 16 nodes each. The weights are stored in the two fields net.IW and net.LW, where IW are the input weights, and LW are the layer weights:

>> net.IW
ans =
  3×1 cell array

    [16×4 double]
    []
    []

>> net.LW
ans =
  3×3 cell array

                []               []    []
    [16×16 double]               []    []
                []    [3×16 double]    []

This is confusing at first, but makes sense: each row in both these cell arrays corresponds to one of the layers we have.

In the IW array, we have the weights between the input and each of the layers. Obviously, we only have weights between the input and the first layer. The shape of this weight matrix is 16x4, as we have 4 inputs and 16 hidden units.

In the LW array, we have the weights from each layer (the rows) to each layer (the columns). In our case, we have a 16x16 weight matrix from the first to the second layer, and a 3x16 weight matrix from the second to the third layer. Makes perfect sense, right?
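One detail worth stressing, since the question worries about flipped weights: MATLAB orients every weight matrix as [destination units × source units] (e.g. 16×4 for 4 inputs feeding 16 hidden units). Hinton's RBM code stores a weight matrix such as vishid as [visible units × hidden units], i.e. [source × destination]. If that is the layout of your pre-trained matrices (an assumption you should verify against your own variables), each one needs a transpose before assignment. A minimal sketch, with vishid as a placeholder for your RBM weight matrix:

% net.IW{1,1} must be [firstLayerSize x inputSize].
% An RBM matrix stored as [visible x hidden] (e.g. Hinton's vishid)
% is the transpose of that, so flip it when assigning:
net.IW{1,1} = vishid';

The same applies to each entry of net.LW: check that the matrix size matches the [rows = destination layer, columns = source layer] convention shown in the cell-array printout above, and transpose if it does not.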

With that, we know how to initialize the weights we have got from the RBM code:

net.IW{1,1} = weights_input;
net.LW{2,1} = weights_hidden;
net.LW{3,2} = weights_output;

The biases are stored analogously in the cell array net.b, with one entry per layer.

With that, you can continue with step 5, i.e. training the network in a supervised fashion.
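Alternatively, if you prefer the single-vector setwb approach from the question, the toolbox function formwb packs biases and the IW/LW cell arrays into one vector in exactly the order setwb expects, so you never have to work out that ordering by hand. A sketch, where the weights_* variables are placeholder names for your pre-trained matrices (already transposed to MATLAB's [destination × source] layout):

% Fill the cell arrays, then let formwb build the vector for setwb:
b  = net.b;                  % keep the initialized biases
IW = net.IW;
LW = net.LW;
IW{1,1} = weights_input;
LW{2,1} = weights_hidden;
LW{3,2} = weights_output;
wb  = formwb(net, b, IW, LW);
net = setwb(net, wb);

The companion function separatewb(net, wb) performs the inverse split, which is handy for checking that your vector was assembled correctly.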

hbaderts
  • Thanks for the response. But the problem is I don't know the order of the weights from the pre-training. I am afraid the weights are in decreasing index order, or flipped upside-down or left-right, even though the weight size is the same as the feedforwardnet weights. – Sandi Inspiratips May 11 '17 at 14:53
  • What I did is set 'vishid' as the input weights without changing the order. – Sandi Inspiratips May 11 '17 at 14:57