
I'm trying to implement a neural network with a sigmoid activation function, but the following code does not work. This is the training part of the neural network; it does not update the weights properly. What is wrong with this code?

clc; clear all; close all;
% load train_data1
train_data1=[-0.498800000000000,-0.257500000000000;-0.492800000000000,-0.274300000000000;-0.470300000000000,-0.282600000000000;-0.427400000000000,-0.474000000000000;-0.420400000000000,-0.518000000000000;-0.326300000000000,-1.13230000000000;-0.317300000000000,-0.875300000000000;-0.295000000000000,-1.02770000000000;-0.267600000000000,-0.882800000000000;-0.260500000000000,-0.976500000000000;-0.216100000000000,-0.970400000000000;-0.207000000000000,-0.813800000000000;-0.164000000000000,-0.696600000000000;-0.159900000000000,-0.793300000000000;-0.122000000000000,-0.764400000000000;-0.0729000000000000,-0.435300000000000;-0.00640000000000000,-0.0546000000000000;0.132200000000000,0.710300000000000;0.137100000000000,0.587000000000000;0.160300000000000,0.819200000000000;0.230600000000000,0.989200000000000;0.286800000000000,0.737700000000000;0.334000000000000,0.943500000000000;0.375200000000000,0.688200000000000;0.429700000000000,0.567800000000000];
train_data1 = sortrows(train_data1);
% normalize data to [0,1]
data1=[train_data1];
max1=max(max(data1));
min1=min(min(data1));
train_data2 = (train_data1 - min1) / ( max1 - min1);

x = train_data2(:,1); % train input data
r = train_data2(:,2); % train output data


hidden_neurons = 2;
maxepochcount = 1000;

datacount1 = size(x,1);
% add a bias as an input
bias = ones(datacount1,1);
% x = [x bias];
% read how many inputs
inputcount = size(x,2);
% ---------- data loaded -----------
% ---------- set weights -----------
% set initial random weights
WI = (randn(inputcount,hidden_neurons) - 0.5)/10;
WO = (randn(1,hidden_neurons) - 0.5)/10;
%-----------------------------------
%--- Learning Starts Here! ---------
%-----------------------------------
eta1 = 0.5;
eta2 = eta1/5;
% do a number of epochs
for iter = 1:maxepochcount
% loop through the data
    for j = 1:datacount1
        % read the current sample
        I = x(j,:);
        D = r(j,1);
        % calculate the error for this sample
        H = (sigmoid(I * WI))';
        O = H' * WO';
        error = D-O;
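        % Note: the updates below rely on the logistic-derivative identity
        % sigma'(z) = sigma(z)*(1 - sigma(z)), which appears as O.*(1-O)
        % and H.*(1-H) in the delta terms.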
        % adjust weight between hidden & output
        delta_i = O.*(1-O).*(D-O); % D actual, O calculated output
        % Calculate error for each node in layer_(n-1)
        delta_j = H.*(1-H).*(WO.'*delta_i); % H.' is the output of hidden layer
        % Adjust weights in matrices sequentially
        WO = WO + eta2.*delta_i*(H.') % H.' is the output of hidden layer
        WI = WI + eta1.*(delta_j*(I))' % I.' is the inputs

%         % adjust weight between hidden & output
%         delta_HO = error.*eta2 .* hidden_val;
%         WO = WO - delta_HO';
%         % adjust the weights between input & hidden
%         delta_IH = eta1 .* error .* WO' .* (1 - (H .^ 2)) * I;
%         WI = WI - delta_IH';

    end
    O = sigmoid(WO*sigmoid(x * WI)');
%     error(iter) =  (sum(error .^ 2)) ^ 0.5;
    if rem(iter,100)==0     % Every 100 epochs, show how training is doing
     plot(x,O, 'color','red','linewidth',2); hold on;    
     drawnow;
     iter

    end

%  return   
end
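For reference, sigmoid is not defined in the listing (base MATLAB has no such function), so the code presumably relies on a helper along these lines; this is a minimal sketch, the original helper is not shown in the question:

% sigmoid.m -- logistic sigmoid, applied element-wise
% (assumed definition; the asker's actual helper is not shown)
function y = sigmoid(z)
    y = 1 ./ (1 + exp(-z));
end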
  • Have you used a debugger to step through the code and make sure that the values at each step are what you expect? – beaker Mar 07 '17 at 21:00
  • There is a plot function at the end of the loop to see the outcome of the net after each epoch. I use the tanh version of this code and it works fine, but the sigmoid version does not work. I suspect the weight-update part of the code. – Hüseyin Kömürcü Mar 07 '17 at 21:22
  • Is it necessary to normalize the input and output values to [0,1] for a network with a sigmoid activation function? Actually my data values lie between -1 and +1. – Hüseyin Kömürcü Mar 07 '17 at 21:33

1 Answer


Only the output values need to be scaled to the range of the activation function: to [-1,1] for tanh, and to [0,1] for sigmoid. The code is working fine, but sometimes it needs more epochs.
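For example, a minimal sketch of scaling only the target column (r_raw is a hypothetical name for the unscaled targets; min1 and max1 are the data extremes, computed as in the question):

% Sketch: scale only the targets to the activation's output range.
% r_raw is a hypothetical name for the unscaled target column.
r_sig  = (r_raw - min1) / (max1 - min1);        % sigmoid: targets in [0,1]
r_tanh = 2*(r_raw - min1) / (max1 - min1) - 1;  % tanh:    targets in [-1,1]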