Questions tagged [backpropagation]

Backpropagation is a method of gradient computation, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
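The description above can be made concrete with a tiny numerical sketch. This is an illustrative example only (the function and variable names are ours, not from any library): a single sigmoid neuron whose weight and bias are adjusted by propagating the squared-error gradient backward, with a "teacher" supplying the desired output.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, b, x, target, lr=0.5):
    # Forward pass: compute the neuron's output.
    y = sigmoid(w * x + b)
    # Backward pass: propagate the error gradient to w and b.
    # For E = 0.5 * (y - target)^2:  dE/dy = (y - target),  dy/dz = y * (1 - y)
    delta = (y - target) * y * (1.0 - y)
    return w - lr * delta * x, b - lr * delta

w, b = 0.1, 0.0
x, target = 1.0, 1.0
for _ in range(300):
    w, b = train_step(w, b, x, target)

# The output error shrinks toward 0 as training proceeds.
print(abs(sigmoid(w * x + b) - target))
```

The same delta rule, applied layer by layer from the output back to the input, is what the multi-layer questions below are implementing.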

1267 questions
0
votes
1 answer

Appropriate backpropagation parameters

I want to train a neural network to perform signal classification. The network has: 50 inputs in the range [-1 .. 1]; 50 hidden layers (not restricted); 10 outputs, hyperbolic tangent (not restricted). I am restricted to the hnn library to do the…
Boris Mocialov
  • 3,439
  • 2
  • 28
  • 55
0
votes
1 answer

Temperature prediction using artificial neural network

I am using four parameters (temperature, rainfall, humidity and date) for the prediction, and I am trying to predict a single parameter, temperature. I am trying to use the back-propagation algorithm for training. What might be the best network structure for…
0
votes
1 answer

Learning ANN in Matlab (Multi-layer Back-propagation )

I'm writing this code for the learning process of an ANN (multi-layer back-propagation), but the result of learning is very bad; it is never near 1. I know we cannot give any guarantee that learning will succeed, but I want to know if I make…
Samah Ahmed
  • 419
  • 8
  • 24
0
votes
1 answer

Backpropagation neural network, too many neurons in layer causing output to be too high

Having a neural network with a lot of inputs causes my network problems. The network gets stuck, and the feed-forward calculation always gives an output of 1.0 because the output sum is too big, and while doing backpropagation the sum of gradients…
0
votes
0 answers

Neural Network Back Propagation XOR

Recently I've been brushing up on my machine learning, and as such decided to implement a basic neural network in Java using the back propagation algorithm. I've gone over the maths and checked against various other tutorials, but am still having…
0
votes
1 answer

Neural Network fails on MNIST

I coded a neural network in Python to solve the MNIST task. But the error rate changes very little (sixth digit after the decimal point) after one epoch, and the network hasn't learned much after 10000 epochs… Can you tell me what I've done wrong and how to…
Peter234
  • 1,052
  • 7
  • 24
0
votes
1 answer

Adding momentum term in online back propagation weight update?

I have implemented an ANN for a two-layer network, and I need to modify my weight-update code with momentum, but I need to know how I can update it. Below is a code snippet of only the weight update. The code below updates the weights for each example it has seen,…
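The question above asks how to fold a momentum term into an online weight update. A common scheme is classical momentum, which keeps a decayed running sum of past updates; the names and constants below are illustrative, not the asker's code.

```python
def momentum_update(weight, grad, velocity, lr=0.1, momentum=0.9):
    # Classical momentum: the velocity accumulates past gradients,
    # decayed by `momentum`, and the weight moves by the velocity.
    velocity = momentum * velocity - lr * grad
    return weight + velocity, velocity

w, v = 0.0, 0.0
# With a constant gradient, the effective step size grows toward
# lr / (1 - momentum) per update, which is why momentum accelerates
# movement along consistently-signed gradient directions.
for _ in range(5):
    w, v = momentum_update(w, 1.0, v)
```

In an online (per-example) setting, one velocity is kept per weight and updated on every example, alongside the plain gradient step the asker already has.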
0
votes
0 answers

Neural Networks DataSet learning

For a while now, I have been writing my own neural network for recognizing digits. It works perfectly fine for one given input and one expected output: it gets close to the target values until the total error is around 0.00001 or something like…
0
votes
1 answer

Simple backpropagation Neural Network algorithm (Python)

I'm trying to understand back-propagation; for that I am using some Python code, but it's not working properly. When I train with the XOR inputs and outputs, the error does not converge. But if I change the value of the last output of XOR, it converges. If I put…
bottega
  • 123
  • 1
  • 13
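XOR questions like the one above usually come down to the delta computations. As a point of reference, here is a minimal 2-2-1 sigmoid network trained on XOR with plain backpropagation; it is a sketch with our own illustrative names and constants, not the asker's code.

```python
import math
import random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# w1[j] = weights of hidden unit j: [input0, input1, bias]
# w2    = output unit weights:      [hidden0, hidden1, bias]
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w2 = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    y = sig(w2[0] * h[0] + w2[1] * h[1] + w2[2])
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in XOR)

def train_epoch(lr=0.5):
    for x, t in XOR:
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)                      # output delta
        dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):                              # hidden -> output
            w2[j] -= lr * dy * h[j]
        w2[2] -= lr * dy                                # output bias
        for j in range(2):                              # input -> hidden
            w1[j][0] -= lr * dh[j] * x[0]
            w1[j][1] -= lr * dh[j] * x[1]
            w1[j][2] -= lr * dh[j]

before = total_error()
for _ in range(2000):
    train_epoch()
after = total_error()  # typically much smaller than `before`
```

If the error refuses to move at all, the usual suspects are a missing bias term, a sign error in the deltas, or updating weights before all deltas for the example have been computed.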
0
votes
1 answer

How to compute the gradient of the loss with respect to an arbitrary layer/weight in Torch?

I'm transitioning from Theano to Torch, so please bear with me. In Theano, it was fairly straightforward to compute the gradients of the loss function w.r.t. even a specific weight. I wonder how one can do this in Torch. Assume we have the following code…
Amir
  • 10,600
  • 9
  • 48
  • 75
0
votes
1 answer

How to compute an error for neural networks when the ideal output is unknown?

OK, so I set up a neural network through some trial and error, and I am going into backpropagation next. But in order to do that, I need to calculate my error on the outputs. The situation I made for my testing area is the following: I have a car, in the…
0
votes
1 answer

Neural Network bad convergence

I have read a lot about NNs over the last two weeks; I think I have seen pretty much every "XOR" approach tutorial on the net, but I wasn't able to make my own work. I started with a simple "OR" neuron approach, which gave good results. I think my problem is in…
0
votes
1 answer

How does backpropagation work in a Convolutional Neural Network (CNN)?

I have a few questions regarding CNNs. In the figure below, between layer S2 and layer C3, a 5*5 kernel has been used. Q1. How many kernels have been used there? Is each of these kernels connected with each of the feature maps in layer S2? Q2. When using…
Avijoy Chakma
  • 147
  • 3
  • 14
0
votes
1 answer

Backpropagation: Updating the first weight layer

According to Andrew Ng's notes on backpropagation (page 9), the delta values are only calculated for the hidden layers (n-1 down to 2). These deltas are then accumulated and used to update the weight matrices. However, the notes do not mention how to update…
Soubriquet
  • 3,100
  • 10
  • 37
  • 52
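On the question above: in the usual notation, no delta is ever computed for the input layer itself. The gradient of the first weight matrix pairs the first hidden layer's delta with the raw inputs, which play the role of "layer 1 activations". A small sketch, with illustrative shapes and names:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(3, 1))       # input vector = activation of layer 1
delta2 = rng.normal(size=(4, 1))  # delta of the first HIDDEN layer (layer 2)

# Gradient for the first weight matrix W1 (shape 4x3): outer product of
# the hidden layer's delta with the input activations. This is the same
# rule as for every other layer, with the input standing in for a(1).
grad_W1 = delta2 @ x.T
```

So the first weight layer needs no special case: the generic update `W(l) -= lr * delta(l+1) @ a(l).T` covers it once `a(1)` is taken to be the input.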
0
votes
2 answers

Java Backpropagation Algorithm is very slow

I have a big problem: I am trying to create a neural network and want to train it with a backpropagation algorithm. I found this tutorial, http://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/, and tried to recreate it in Java. And…