Questions tagged [backpropagation]

Backpropagation is a method of computing gradients, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
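
As a concrete illustration of "backward propagation of errors", here is a minimal sketch in Python/NumPy, assuming a toy 2-2-1 sigmoid network and a squared-error loss; the shapes, seed and learning rate are illustrative, not canonical:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(2, 2))   # input -> hidden weights
    W2 = rng.normal(size=(2, 1))   # hidden -> output weights

    x = np.array([0.0, 1.0])       # one training input
    t = np.array([1.0])            # desired (target) output from the "teacher"
    lr = 0.5                       # learning rate

    # Forward pass
    h = sigmoid(x @ W1)            # hidden activations
    y = sigmoid(h @ W2)            # network output

    # Backward pass: the chain rule applied layer by layer
    delta_out = (y - t) * y * (1 - y)             # dE/dz at the output
    delta_hid = (delta_out @ W2.T) * h * (1 - h)  # dE/dz at the hidden layer

    # Gradient-descent updates
    W2 -= lr * np.outer(h, delta_out)
    W1 -= lr * np.outer(x, delta_hid)

Repeating this forward/backward pair over the training set is what performs the gradient descent the description refers to.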

1267 questions
0
votes
1 answer

Use number of misclassifications as objective function for backpropagation

I'm new to machine learning (neural networks) and I have a question; please help me understand. In backpropagation, the objective function to be minimized is usually the sum of squared errors between the output and the target. However, in…
lenhhoxung
  • 2,530
  • 2
  • 30
  • 61
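
The standard answer to questions like the one above is that a misclassification count is piecewise constant, so its gradient is zero almost everywhere and backpropagation receives no training signal; squared error serves as a differentiable surrogate. A sketch with made-up numbers:

    import numpy as np

    y = np.array([0.8, 0.3, 0.6])   # network outputs (illustrative)
    t = np.array([1.0, 0.0, 0.0])   # targets

    # Squared error is smooth in y, so dE/dy exists everywhere.
    E = 0.5 * np.sum((y - t) ** 2)
    dE_dy = y - t                    # the signal backpropagation propagates

    # Misclassification count is a step function of y: its derivative is
    # zero wherever it is defined, so gradient descent cannot use it.
    misclassified = np.sum((y > 0.5).astype(float) != t)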
0
votes
1 answer

Backpropagation error doesn't decrease after 3 epochs! Beginner needing help (MATLAB)

Before I begin, I'd just like to preface this by saying that I only started coding in October, so excuse me if it's a little bit clumsy. I've been trying to make an MLP for a project I've been doing. I have the hidden layer (sigmoid) and output…
0
votes
2 answers

Neural network with batch training algorithm, when to apply momentum and weight decay

I built a neural network and successfully trained it by using backpropagation with stochastic gradient descent. Now I'm switching to batch training, but I'm a bit confused about when to apply momentum and weight decay. I know fairly well how…
mp85
  • 422
  • 3
  • 17
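
For batch training, one common convention (not the only one) is to accumulate the gradient over the whole batch and apply momentum and weight decay once per batch update, not once per sample. A hedged sketch; `batch_step`, `grad_sum`, `velocity` and the hyperparameter values are illustrative names and numbers:

    import numpy as np

    def batch_step(W, grad_sum, velocity, lr=0.1, momentum=0.9, decay=1e-4):
        """One batch update. grad_sum is dE/dW accumulated over the batch;
        weight decay enters as an L2 penalty gradient, momentum smooths steps."""
        grad = grad_sum + decay * W
        velocity = momentum * velocity - lr * grad
        return W + velocity, velocity

    W = np.zeros((3, 2))
    v = np.zeros_like(W)
    grad_sum = np.ones_like(W)     # stand-in for a real accumulated gradient
    W, v = batch_step(W, grad_sum, v)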
0
votes
1 answer

Operations on sums inside functions in Maxima

I am trying to compute derivative for something like back-propagation analytically, using Maxima. So I…
alex
  • 256
  • 1
  • 13
0
votes
1 answer

What should the parameter values of a neural network be for a large data sample?

I have coded a multi-layer, feed-forward, backpropagation neural network in Python. In this network structure I have 24 nodes in the input layer, 18 nodes in the hidden layer and 1 node in the output layer. I am getting the good…
lkkkk
  • 1,999
  • 4
  • 23
  • 29
0
votes
1 answer

Neural networks for Farsi OCR

I'm trying to implement a Farsi OCR using neural networks. I am using 5000 training examples, each a 70 * 79 matrix; concretely, I have a 5530-unit input layer, one hidden layer (4000 units) and a 38-unit output. What training algorithm should I…
arian
  • 76
  • 7
0
votes
1 answer

Neural network - insignificant output data for small dataset

So I am working on an implementation of a backprop neural network: I made a 'NEURON' class, as every beginner in neural networks does. However, I am getting weird results: you see, when the dataset is small (like in the case of an XOR function,…
0
votes
1 answer

Is the mini-batch gradient just the sum of online gradients?

I am adapting code for training a neural network that does online training to work for mini-batches. Is the mini-batch gradient for a weight (dE/dw) just the sum of the gradients for the samples in the mini-batch? Or is it some non-linear…
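
For a loss defined as a sum over samples, differentiation is linear, so the mini-batch gradient is exactly the sum of the online (per-sample) gradients; if the loss is defined as a mean, divide by the batch size, which only rescales the learning rate. A numeric check on a one-weight linear model (the values are made up):

    import numpy as np

    # One weight, squared error: E(w) = sum_i 0.5 * (w * x_i - t_i)^2
    w = 2.0
    xs = np.array([1.0, 2.0, 3.0])
    ts = np.array([1.0, 3.0, 6.0])

    per_sample = (w * xs - ts) * xs    # the online gradients dE_i/dw
    batch_grad = per_sample.sum()      # mini-batch gradient: just their sum

    # Cross-check against a finite difference of the summed loss.
    E = lambda w: 0.5 * np.sum((w * xs - ts) ** 2)
    eps = 1e-6
    numeric = (E(w + eps) - E(w - eps)) / (2 * eps)
    assert abs(numeric - batch_grad) < 1e-4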
0
votes
1 answer

Subscript indices must be real positive integers, and they are (Matlab)

I am trying to code a simple backpropagation network in Matlab, and I am getting the following error: Subscript indices must either be real positive integers or logicals. in line 144 of my code, which is during this section: for l =…
David Myers
  • 103
  • 1
0
votes
2 answers

Backpropagation: WHERE is the derivative of the transfer function?

First off: I understand derivatives and the chain rule. I'm not great with math, but I have an understanding. Numerous tutorials on backpropagation (let's use this and this) using gradient descent state that we use the derivative of the transfer…
SilverFox
  • 115
  • 2
  • 10
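
The transfer-function derivative appears as the da/dz factor in the chain rule, where the error is pushed from a unit's activation back to its net input. A one-unit sketch, assuming a sigmoid unit and squared error (the numbers are arbitrary):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    z = 0.4                  # net input to the unit
    a = sigmoid(z)           # activation a = f(z)
    t = 1.0                  # target

    # Chain rule: dE/dz = (dE/da) * (da/dz).
    dE_da = a - t            # from the squared-error loss
    da_dz = a * (1.0 - a)    # THIS is the transfer-function derivative
    delta = dE_da * da_dz    # the "error term" that gets backpropagated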
0
votes
0 answers

Multilayer Perceptron backpropagation

I'm trying to figure out a question that asks why training times in MLP nets increase dramatically if unnecessary additional layers are added between the inputs and outputs. (It's not a homework question.) I guess it's something to do with the…
user1360909
  • 401
  • 1
  • 4
  • 4
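
One standard (hedged) explanation for the question above: with sigmoid units, each extra layer multiplies the backpropagated error by roughly w * f'(z), and the sigmoid's derivative never exceeds 0.25, so with moderate weights the gradient reaching the early layers shrinks geometrically with depth and learning slows accordingly. Illustrative arithmetic, with a made-up per-layer weight magnitude:

    # Upper bound on the sigmoid derivative: f'(z) = f(z)(1 - f(z)) <= 0.25.
    MAX_DERIV = 0.25
    w = 1.0   # illustrative per-layer weight magnitude

    for depth in (2, 5, 10):
        attenuation = (MAX_DERIV * w) ** depth   # best-case per-path factor
        print(f"{depth} layers: gradient scaled by at most {attenuation:.2e}")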
0
votes
1 answer

Perceptron with sigmoid stuck in local minimum (WEKA)

I know that usually you don't have local minima in the error surface using a perceptron (no hidden layers) with linear output. But is it possible to get stuck in local minima with a perceptron using a sigmoid function since it is not linear? I'm…
sven
  • 1
  • 1
0
votes
3 answers

How to choose the number of nodes when using a BP network for face recognition?

I have read some books but still cannot figure out how I should organize the network. For example, I have a PGM image of size 120*100; what should the input look like (a one-dimensional array of size 120*100)? And how many nodes should I use?
litaoshen
  • 1,762
  • 1
  • 20
  • 36
0
votes
1 answer

Using AForge BackPropagation

I am using the AForge framework in Visual Studio. I get no errors, but I am getting the wrong output. My code: public void btn_hesapla_Click(object sender, EventArgs e) { double girdi; girdi = Convert.ToDouble(txt_girdi.Text); …
user3425879
  • 3
  • 1
  • 8
0
votes
1 answer

Neural Network can't learn XOR

I've created a neural network with the following structure: Input1, Input2 - input layer. N0, N1 - hidden layer, 3 weights per node (one for the bias). N2 - output layer, 3 weights (one for the bias). I am trying to train it on the XOR function with the…
jub
  • 213
  • 1
  • 11
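
For reference, a minimal 2-2-1 network that usually learns XOR, assuming an architecture like the one in the question: sigmoid units, with each node's bias folded in as an extra weight on a constant input of 1. For some random initializations it can still get stuck in a plateau, which is itself part of the answer to questions like this:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # XOR inputs with a constant 1 appended for the bias weight.
    X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
    T = np.array([[0.0], [1.0], [1.0], [0.0]])

    rng = np.random.default_rng(1)
    W1 = rng.normal(size=(3, 2))   # 2 inputs + bias -> 2 hidden
    W2 = rng.normal(size=(3, 1))   # 2 hidden + bias -> 1 output

    lr = 0.5
    for _ in range(20000):
        H = sigmoid(X @ W1)                    # hidden activations
        Hb = np.hstack([H, np.ones((4, 1))])   # append bias input
        Y = sigmoid(Hb @ W2)

        d_out = (Y - T) * Y * (1 - Y)
        d_hid = (d_out @ W2[:2].T) * H * (1 - H)  # bias row carries no error back

        W2 -= lr * Hb.T @ d_out
        W1 -= lr * X.T @ d_hid

    print(np.round(Y.ravel(), 2))   # should approach [0, 1, 1, 0]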