In my neural network I have combined all of the weight matrices into one large matrix. For example, a 3-layer network usually has 3 weight matrices W1, W2, W3, one for each layer. I have created one large weight matrix called W, where W2 and W3 are appended onto the end of W1: if W1 has 3 columns, W2 has 3 columns, and W3 has 2 columns, my matrix W will have 8 columns.
The number of layers and the number of inputs/outputs are stored as global variables.
This means I can call the feedforward code with only 2 input arguments, as the feedforward code splits W back up into W1, W2, W3, etc. inside the function:
    Output_of_Neural_Net = feedforward(Input_to_Neural_Net, W)
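In outline, the function looks something like this (layer_cols is a placeholder name for the global that stores each block's column count; I am assuming sigmoid activations and that all blocks share a row count, which holds in my 3/3/2 example with 3 inputs):

    function out = feedforward(x, W)
        % Rough sketch: layer_cols holds the number of columns in each
        % block, e.g. layer_cols = [3 3 2] for the example above.
        global layer_cols
        col = 1;
        a = x(:);                                     % activations, starting at the input
        for k = 1:numel(layer_cols)
            Wk = W(:, col : col + layer_cols(k) - 1); % recover layer k's block
            col = col + layer_cols(k);
            a = 1 ./ (1 + exp(-(Wk.' * a)));          % sigmoid layer
        end
        out = a;
    end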
I also store the training data as a global variable, which means I can use the cost function with only one input argument:
    cost = costfn(W)
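The cost function itself is, roughly, a sum-of-squares error over the training set (train_X and train_Y are placeholder names for the global training matrices, one example per column):

    function c = costfn(W)
        global train_X train_Y
        c = 0;
        for i = 1:size(train_X, 2)
            y = feedforward(train_X(:, i), W);
            c = c + sum((y - train_Y(:, i)).^2);  % sum-of-squares error
        end
        c = c / size(train_X, 2);                 % mean over the training set
    end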
The purpose of this is so that I can use built-in MATLAB functions to minimise the cost function and thereby obtain the W whose network best approximates my training data.
I have tried fminsearch(@costfn,W) and fminunc(@costfn,W). Both give mediocre results for the function I am trying to approximate, although fminunc is slightly better.
I now want to try back-propagation to train this network, to see if it does a better job; however, most implementations I have found are for networks with multiple weight matrices, which makes them more complicated to adapt.
My question is: will I be able to implement back-propagation with my single appended weight matrix, and if so, how?
I feel like using a single weight matrix should make the code simpler, but I can't work out how to implement it, as all the other examples I have seen use multiple weight matrices.
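For what it's worth, here is my rough attempt at a sketch, reusing the same splitting as feedforward and under the same assumptions (sigmoid activations, placeholder global layer_cols, sum-of-squares cost, one training example at a time); I am not sure the backward pass is correct, which is really what I am asking:

    function dW = backprop(x, t, W)
        % Split W as in feedforward, do a forward pass storing every
        % activation, then walk backwards, writing each layer's gradient
        % into the matching block of dW (same column layout as W).
        global layer_cols
        nL = numel(layer_cols);
        col = zeros(1, nL + 1);
        col(1) = 1;
        a = cell(1, nL + 1);
        a{1} = x(:);
        Wk = cell(1, nL);
        for k = 1:nL                                   % forward pass
            col(k + 1) = col(k) + layer_cols(k);
            Wk{k} = W(:, col(k) : col(k + 1) - 1);
            a{k + 1} = 1 ./ (1 + exp(-(Wk{k}.' * a{k})));
        end
        dW = zeros(size(W));
        delta = (a{nL + 1} - t(:)) .* a{nL + 1} .* (1 - a{nL + 1});  % output-layer error
        for k = nL:-1:1                                % backward pass
            dW(:, col(k) : col(k + 1) - 1) = a{k} * delta.';     % gradient for layer k
            if k > 1
                delta = (Wk{k} * delta) .* a{k} .* (1 - a{k});   % push error back a layer
            end
        end
    end

If something like this is right, I would average dW over the training set and either take plain gradient-descent steps on W or supply the gradient to fminunc.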
Additional Information
The network will be a function approximator with between 8 and 30 inputs, and 3 outputs. The function it is approximating is quite complicated and involves the inverse of elliptic integrals (and so has no analytical solution). The inputs and outputs of the network will be normalised so that they are between 0 and 1.
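For the normalisation I have in mind a simple min-max rescaling of each input row, something like:

    % X is a placeholder name for the data, one example per column.
    % (Implicit expansion needs R2016b+; older versions can use bsxfun.)
    X = (X - min(X, [], 2)) ./ (max(X, [], 2) - min(X, [], 2));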