Questions tagged [backpropagation]

Backpropagation is a method of gradient computation, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
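The description above can be made concrete with the smallest possible network: one input, one hidden neuron, one output neuron, squared error. This is a hedged pure-Python sketch rather than any particular library's implementation; the starting weights, learning rate, and training pair are all made up for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Made-up starting weights and biases for a 1-1-1 network.
w1, b1 = 0.5, 0.1     # input -> hidden
w2, b2 = -0.3, 0.2    # hidden -> output
lr = 0.5              # learning rate (arbitrary)
x, target = 1.0, 0.0  # one training pair: the "teacher" supplies the target

for _ in range(1000):
    # forward pass
    h = sigmoid(w1 * x + b1)
    y = sigmoid(w2 * h + b2)
    # backward pass: chain rule on E = 0.5 * (y - target)**2
    delta2 = (y - target) * y * (1 - y)  # error signal at the output neuron
    delta1 = delta2 * w2 * h * (1 - h)   # error propagated back to the hidden neuron
    # gradient-descent step
    w2 -= lr * delta2 * h
    b2 -= lr * delta2
    w1 -= lr * delta1 * x
    b1 -= lr * delta1

final_y = sigmoid(w2 * sigmoid(w1 * x + b1) + b2)
```

The output starts near 0.5 and drifts toward the target as the propagated error signals shrink — the "backward propagation of errors" the tag name abbreviates.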

1267 questions
0
votes
1 answer

Back propagation with a simple ANN

I watched a lecture and derived equations for back propagation, but it was in a simple example with 3 neurons: an input neuron, one hidden neuron, and an output neuron. This was easy to derive, but how would I do the same with more neurons? I'm not…
Sully Chen
  • 329
  • 2
  • 13
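The 3-neuron derivation asked about here generalizes by replacing scalars with per-layer vectors and weight matrices: the delta of each hidden neuron becomes a weighted sum of the deltas in the layer above. A hedged pure-Python sketch under made-up sizes and learning rate (no framework assumed):

```python
import math
import random

random.seed(0)  # reproducible made-up weights

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(weights, biases, x):
    """Return the activations of every layer, input included."""
    acts = [x]
    for W, b in zip(weights, biases):
        acts.append([sigmoid(sum(w * a for w, a in zip(row, acts[-1])) + bj)
                     for row, bj in zip(W, b)])
    return acts

def backprop_step(weights, biases, x, target, lr=0.5):
    acts = forward(weights, biases, x)
    # output-layer delta: (y - t) * sigma'(z), exactly as in the 3-neuron case
    delta = [(y - t) * y * (1 - y) for y, t in zip(acts[-1], target)]
    for layer in range(len(weights) - 1, -1, -1):
        W, a_prev = weights[layer], acts[layer]
        if layer > 0:
            # delta for the layer below: weighted sum of the deltas above,
            # computed BEFORE W is modified
            prev = [a * (1 - a) * sum(W[j][i] * delta[j] for j in range(len(W)))
                    for i, a in enumerate(a_prev)]
        for j in range(len(W)):
            for i in range(len(a_prev)):
                W[j][i] -= lr * delta[j] * a_prev[i]
            biases[layer][j] -= lr * delta[j]
        if layer > 0:
            delta = prev

# made-up 2-3-1 network trained on a single example
weights = [[[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)],
           [[random.uniform(-1, 1) for _ in range(3)]]]
biases = [[0.0, 0.0, 0.0], [0.0]]
x, target = [1.0, 0.0], [1.0]

before = (forward(weights, biases, x)[-1][0] - target[0]) ** 2
for _ in range(200):
    backprop_step(weights, biases, x, target)
after = (forward(weights, biases, x)[-1][0] - target[0]) ** 2
```

A classic pitfall this sketch avoids: the previous layer's delta must be computed from the weights *before* they are updated.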
0
votes
1 answer

Backpropagation error: conceptual or programming?

I wrote the following backpropagation algorithm to model the two-input identity function: clc % clear nh = 3; % neurons in hidden layer ni = 2; % neurons in input layer eta = .001; % the learning rate traningSize…
Soumy
  • 113
  • 1
  • 12
0
votes
1 answer

MLP: When Reduced # Hiddens Fails for Over Training

I am in an epic debate with a colleague who claims that reducing the number of hidden units is the best way to deal with overtraining. While it can be demonstrated that generalization error decreases with training of such a net, ultimately it will not…
0
votes
1 answer

Thresholds in backpropagation

What is the use of thresholds in the backpropagation algorithm? I wrote Java code for class label identification. I used some random thresholds (0-1) for the neurons. I trained the system and tested it using some data. It worked pretty well. But what…
mRbOneS
  • 121
  • 1
  • 14
0
votes
1 answer

Calculating derivatives with backpropagation using Sutskever's technique

In "TRAINING RECURRENT NEURAL NETWORK" by Ilya Sutskever, there's the following technique for calculating derivatives with backpropagation in feed-forward neural networks. The network has l hidden layers, l+1 weight matrices and b+1 bias…
Paz
  • 737
  • 7
  • 22
0
votes
2 answers

FeedForward Neural Network: Using a single Network with multiple output neurons for many classes

I am currently working on the MNIST handwritten digits classification. I built a single FeedForward network with the following structure: Inputs: 28x28 = 784 inputs Hidden Layers: A single hidden layer with 1000 neurons Output Layer: 10…
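For a 10-class setup like the one in this question, the usual convention is one output neuron per digit: the training target is a one-hot vector (so each output neuron gets its own error signal during backpropagation), and the prediction is the index of the most active output. A small illustration with made-up activations:

```python
# Made-up activations of the 10 output neurons for one MNIST image.
outputs = [0.02, 0.01, 0.64, 0.05, 0.03, 0.08, 0.02, 0.07, 0.04, 0.04]

# Prediction: the digit whose output neuron is most active.
predicted = max(range(10), key=lambda k: outputs[k])

# Training target for the digit 2: a one-hot vector.
target_digit = 2
one_hot = [1.0 if k == target_digit else 0.0 for k in range(10)]
```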
0
votes
0 answers

Understanding Neural network backpropagation using matlab

I am working on the backpropagation algorithm. Can anyone explain how I can plot a performance graph without using MATLAB tools? I mean, please give me detailed information about the performance plot and how to plot it using my input and target sets. Also…
0
votes
1 answer

In neural networks, why is the bias seen as either a "b" parameter or as an additional "wx" neuron?

In other words, what is the main reason for switching the bias between a b_j term and an additional w_ij*x_i in the neuron summation formula before the sigmoid? Performance? Which method is best, and why? Note: j is a neuron of the current layer and i a…
Guillaume Chevalier
  • 9,613
  • 8
  • 51
  • 79
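The two bias conventions asked about here compute exactly the same activation: b_j can be folded into the weight vector as a weight on a constant input x_0 = 1, which lets the same backpropagation update rule train biases and ordinary weights alike. A quick numeric check (all numbers arbitrary):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Arbitrary numbers for the demonstration.
w, b, x = [0.4, -0.7], 0.25, [1.5, 2.0]

# Formulation 1: explicit bias term b_j added after the weighted sum.
a_explicit = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Formulation 2: bias folded into the weights as a weight on a constant x_0 = 1.
w_aug = [b] + w
x_aug = [1.0] + x
a_folded = sigmoid(sum(wi * xi for wi, xi in zip(w_aug, x_aug)))
```

The choice is therefore mostly about implementation convenience, not expressive power.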
0
votes
1 answer

Perceptron learns to reproduce just one pattern all the time

This is rather a weird problem. I have backpropagation code that works perfectly, like this: Now, when I do batch learning I get wrong results, even for a simple scalar function approximation. After training the network…
0
votes
1 answer

MATLAB Neural Network Toolbox BPN

I am using the Iris Data Set to train my NN using Back Propagation. The code is attached. p = [ 5.1,3.5,1.4,0.2; %iris data…
0
votes
0 answers

Couldn't fit the data using NEURAL NETWORKS IN MATLAB

I have been trying to fit the data to a nonlinear model using neural networks in MATLAB. I have several sets of data, and my code works fine for some data sets but not for all of them. For some data sets I am able to fit with good…
Kishore
  • 1
  • 1
0
votes
2 answers

Calculating error for a neural network

I have written a back-propagation MLP neural network and I want training to stop when the error is less than or equal to 0.01. I have split my dataset into 60% training data, 20% validation data and 20% testing data. My main loop for…
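The split and stopping condition described here can be sketched generically. Since the question's network code is not shown, the "training" below is a stand-in that merely shrinks the error; everything else (dataset size, seed) is made up:

```python
import random

random.seed(0)

examples = list(range(100))                # stand-in for the question's dataset
random.shuffle(examples)
n = len(examples)
train = examples[:int(0.6 * n)]            # 60% training
val = examples[int(0.6 * n):int(0.8 * n)]  # 20% validation
test_set = examples[int(0.8 * n):]         # 20% testing

error, epoch = 1.0, 0
while error > 0.01:  # main loop: stop once error <= 0.01
    epoch += 1
    error *= 0.9     # placeholder for one epoch of training on `train`
```

In practice the loop should also cap the number of epochs, since a real network's error may never reach the threshold.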
0
votes
1 answer

Neural network to solve AND

I'm working on implementing a backpropagation algorithm. Initially I worked on training my network to solve XOR to verify that it works correctly before using it for my design. After reading this I decided to train it to solve the AND gate first. I'm…
Alaa
  • 539
  • 3
  • 8
  • 29
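AND is a good sanity check before XOR because it is linearly separable: a single sigmoid neuron suffices, with no hidden layer. A hedged sketch with arbitrary hyperparameters:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# AND truth table
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 0.0), ([1.0, 0.0], 0.0), ([1.0, 1.0], 1.0)]

w, b = [0.0, 0.0], 0.0  # start from zero weights
lr = 1.0                # arbitrary learning rate

for _ in range(2000):   # online gradient descent, one example at a time
    for x, t in data:
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        delta = (y - t) * y * (1 - y)  # squared-error gradient through the sigmoid
        w[0] -= lr * delta * x[0]
        w[1] -= lr * delta * x[1]
        b -= lr * delta

preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
```

XOR, by contrast, is not linearly separable, so this single-neuron version will never learn it; that is exactly why the hidden layer (and full backpropagation) is needed there.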
0
votes
1 answer

How can I add concurrency to neural network processing?

The basics of neural networks, as I understand them, are that there are several inputs, weights and outputs. There can be hidden layers that add to the complexity of the whole thing. If I have 100 inputs, 5 hidden layers and one output (yes or no),…
Shamoon
  • 41,293
  • 91
  • 306
  • 570
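One natural place for concurrency in the forward pass this question describes: within a layer, each neuron's weighted sum depends only on the previous layer's activations, so all neurons in a layer can be evaluated in parallel. A minimal sketch with made-up sizes (note that in CPython the GIL limits speedup for pure-Python arithmetic, so real gains come from vectorized libraries or process pools; only the structure is illustrated here):

```python
import math
from concurrent.futures import ThreadPoolExecutor

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

inputs = [0.5] * 100                        # 100 inputs, as in the question
weights = [[0.01] * 100 for _ in range(8)]  # hypothetical layer of 8 neurons

def neuron(w):
    # independent per neuron: weighted sum of the previous layer, then sigmoid
    return sigmoid(sum(wi * xi for wi, xi in zip(w, inputs)))

with ThreadPoolExecutor(max_workers=4) as pool:
    activations = list(pool.map(neuron, weights))  # results keep neuron order
```

Layers themselves cannot run concurrently in a plain feed-forward pass, since each one needs the previous layer's output.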
0
votes
2 answers

I need a way to train a neural network other than backpropagation

This is an on-going venture and some details are purposefully obfuscated. I have a box that has several inputs and one output. The output voltage changes as the input voltages are changed. The desirability of the output sequence cannot be evaluated…
Bing Bang
  • 524
  • 7
  • 16
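One derivative-free option for a setup like this question's, where the output can only be scored rather than differentiated, is random weight perturbation (hill climbing): perturb the weights slightly and keep the change only when the score improves. A toy OR gate stands in for the box below; every detail (perturbation scale, iteration count, seed) is a made-up illustration, not a tuned recipe:

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x):
    return sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])  # w[2] acts as the bias

def score(w, data):
    # the only feedback available: how bad the output sequence is (lower is better)
    return sum((predict(w, x) - t) ** 2 for x, t in data)

# toy stand-in for the box: an OR gate
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.0, 1.0], 1.0)]

w = [0.0, 0.0, 0.0]
best = score(w, data)
for _ in range(5000):
    candidate = [wi + random.gauss(0, 0.1) for wi in w]  # small random perturbation
    s = score(candidate, data)
    if s < best:  # keep the change only if the score improves
        w, best = candidate, s

preds = [round(predict(w, x)) for x, _ in data]
```

The same accept-if-better loop scales (slowly) to larger networks, and is the simple ancestor of evolution strategies and simulated annealing.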