Questions tagged [backpropagation]

Backpropagation is a method for computing gradients, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".

1267 questions
-1
votes
1 answer

Implementing a conv2d backward in PyTorch

I want to implement the backward function of conv2d. Here is an example of a linear function: # Inherit from Function class LinearFunction(Function): @staticmethod # bias is an optional argument def forward(ctx, input, weight, bias=None): … (a conv2d sketch follows below this entry)
core_not_dumped
  • 759
  • 2
  • 22
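A minimal sketch of what such a conv2d Function could look like, assuming PyTorch's torch.nn.grad helpers (conv2d_input, conv2d_weight) are acceptable; the class and variable names are illustrative, not the asker's:

    import torch
    from torch.autograd import Function

    class Conv2dFunction(Function):
        # Sketch: conv2d with an explicit backward pass.

        @staticmethod
        def forward(ctx, input, weight, stride=1, padding=0):
            ctx.save_for_backward(input, weight)
            ctx.stride, ctx.padding = stride, padding
            return torch.nn.functional.conv2d(
                input, weight, stride=stride, padding=padding)

        @staticmethod
        def backward(ctx, grad_output):
            input, weight = ctx.saved_tensors
            # Gradient w.r.t. the input, via PyTorch's helper.
            grad_input = torch.nn.grad.conv2d_input(
                input.shape, weight, grad_output,
                stride=ctx.stride, padding=ctx.padding)
            # Gradient w.r.t. the weight.
            grad_weight = torch.nn.grad.conv2d_weight(
                input, weight.shape, grad_output,
                stride=ctx.stride, padding=ctx.padding)
            # One return value per forward argument; None for non-tensors.
            return grad_input, grad_weight, None, None

Called as Conv2dFunction.apply(x, w), this behaves like F.conv2d but routes gradients through the explicit backward above.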
-1
votes
1 answer

Is the Adam optimizer updating weights in every layer?

I'm a newbie in neural networks, so I'm a little confused about the Adam optimizer. For example, I use an MLP with an architecture like this: I've used SGD before, so I want to ask whether changing the weights with Adam's optimization is the same as SGD updating the…
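To the question itself: yes, like SGD, Adam computes an update for every trainable parameter in every layer; it just rescales each update using running moment estimates. A sketch of one Adam step for a single parameter array, following the standard Adam equations (variable names are illustrative):

    import numpy as np

    def adam_step(param, grad, m, v, t, lr=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8):
        # One Adam update for one parameter array (any layer's weights).
        m = beta1 * m + (1 - beta1) * grad        # first moment (mean)
        v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (variance)
        m_hat = m / (1 - beta1 ** t)              # bias correction, step t
        v_hat = v / (1 - beta2 ** t)
        param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
        return param, m, v

The same step is applied to the weight and bias arrays of every layer, exactly as SGD would apply param -= lr * grad to each of them.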
-1
votes
1 answer

What is going wrong in my backpropagation implementation?

I'm following the 15 Steps to Implement a Neural Net guide. I'm stuck on Step 12, where the backpropagation implementation is described. Here's the (relevant) code I have written: def feed_forward(inputs, weights, biases): net =…
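Without the guide's own code at hand, a self-contained NumPy sketch of a forward pass plus the matching backward pass (one hidden layer, sigmoid activations, squared error; all names illustrative) looks like:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def feed_forward(x, W1, b1, W2, b2):
        z1 = W1 @ x + b1                  # hidden pre-activation
        a1 = sigmoid(z1)
        z2 = W2 @ a1 + b2                 # output pre-activation
        a2 = sigmoid(z2)
        return z1, a1, z2, a2

    def backprop(x, y, W1, b1, W2, b2, lr=0.1):
        z1, a1, z2, a2 = feed_forward(x, W1, b1, W2, b2)
        d2 = (a2 - y) * a2 * (1 - a2)     # output delta (squared error)
        d1 = (W2.T @ d2) * a1 * (1 - a1)  # hidden delta via chain rule
        W2 -= lr * np.outer(d2, a1); b2 -= lr * d2
        W1 -= lr * np.outer(d1, x);  b1 -= lr * d1
        return W1, b1, W2, b2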
-1
votes
1 answer

Backprop not getting to layers in PyTorch

I had some trouble getting layers in nn.Module to work. I had a bunch of layers that I combined into another layer's input. I combined their inputs this way: previous_out = torch.tensor([previousLayer1Out, previousLayer2Out]) (a fix is sketched below this entry)
brando f
  • 311
  • 2
  • 9
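If the symptom is gradients never reaching the earlier layers, the likely culprit is that torch.tensor() copies its arguments into a brand-new leaf tensor and severs the autograd graph. A sketch of the usual fix (the layer-output names are the asker's; the shapes are made up):

    import torch

    previousLayer1Out = torch.randn(4, requires_grad=True)
    previousLayer2Out = torch.randn(4, requires_grad=True)

    # Breaks backprop: a new leaf tensor with no history.
    # previous_out = torch.tensor([previousLayer1Out, previousLayer2Out])

    # Keeps the graph intact: cat/stack are differentiable ops.
    previous_out = torch.cat([previousLayer1Out, previousLayer2Out])

    previous_out.sum().backward()
    print(previousLayer1Out.grad)   # now populated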
-1
votes
1 answer

Outputs after backpropagation converge to a small value (0.01)

The code below is my main code block: iter_pos=0 max_iter=120 iter_cost=[] parameters=generate_parameters() while iter_pos…
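Reconstructed as a runnable toy (the loop variables come from the excerpt; the model, data, and update rule are stand-ins, since the post's own helper functions aren't shown), the main block would look roughly like:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2, 50))                  # toy inputs
    Y = (X.sum(axis=0) > 0).astype(float)         # toy labels
    W, b = rng.normal(size=(1, 2)), 0.0           # stand-in parameters

    iter_pos, max_iter, iter_cost = 0, 120, []
    while iter_pos < max_iter:
        A = 1 / (1 + np.exp(-(W @ X + b)))        # forward pass
        cost = float(np.mean((A - Y) ** 2))       # quadratic cost
        dZ = (A - Y) * A * (1 - A)                # backward pass
        W -= 0.5 * (dZ @ X.T) / X.shape[1]
        b -= 0.5 * dZ.mean()
        iter_cost.append(cost)                    # watch this curve
        iter_pos += 1

If every output collapses to the same small constant, the first things to check are the learning rate and whether a saturating activation is pinning the outputs.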
-1
votes
1 answer

How does error get backpropagated through pooling layers?

I asked a question earlier that might have been too specific, so I'll ask again in more general terms. How does error get propagated backwards through a pooling layer when there are no weights to train? In the TensorFlow video at 6:36…
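Short answer: a pooling layer has no weights, so nothing in it is updated; the incoming gradient is simply routed back to the inputs that produced each output. For max pooling only the argmax position in each window receives the gradient; for average pooling it is split evenly. A NumPy sketch for 2x2 max pooling with stride 2 (illustrative names):

    import numpy as np

    def maxpool_backward(x, grad_out, k=2):
        # Route grad_out back to the argmax of each k x k window of x.
        grad_in = np.zeros_like(x)
        H, W = x.shape
        for i in range(0, H, k):
            for j in range(0, W, k):
                window = x[i:i+k, j:j+k]
                r, c = np.unravel_index(np.argmax(window), window.shape)
                grad_in[i + r, j + c] = grad_out[i // k, j // k]
        return grad_in

    x = np.arange(16, dtype=float).reshape(4, 4)
    g = np.ones((2, 2))                 # pretend upstream gradient
    print(maxpool_backward(x, g))       # nonzero only at window maxima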
-1
votes
1 answer

Where is the sigmoid derivative used in the backpropagation algorithm, and how are weights updated?

I am a year 10 student trying to learn how a neural network works in Python code. I don't have much calculus knowledge, only a limited understanding of derivatives and how to find them. I have made a simple feed-forward network in… (a sketch follows below this entry)
IRL135
  • 1
  • 1
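The sigmoid derivative appears wherever the chain rule crosses a sigmoid activation: sigma'(z) = sigma(z) * (1 - sigma(z)). A single-neuron sketch of where it sits in the weight update (illustrative, not the asker's code):

    import numpy as np

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    x, y = np.array([0.5, -1.2]), 1.0    # one training example
    w, b, lr = np.zeros(2), 0.0, 0.5

    for _ in range(100):
        a = sigmoid(w @ x + b)           # forward pass
        # Chain rule with squared error: dL/dw = (a - y) * sigma'(z) * x
        delta = (a - y) * a * (1 - a)    # a*(1-a) is the sigmoid derivative
        w -= lr * delta * x              # weight update
        b -= lr * delta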
-1
votes
1 answer

Confusion in Backpropagation

I've started working on forward and backward propagation of neural networks. I've coded it as well, and it works properly too, but I'm confused about the algorithm itself. I'm new to neural networks. So forward propagation of a neural network is finding the…
MBAQ
  • 31
  • 5
-1
votes
1 answer

Write code to calculate backward propagation (Deep Learning course by Andrew Ng)

So I've taken the Deep Learning AI course by Andrew Ng on Coursera. I am currently working on the last assignment in week 2. I reached the part where I have to write the forward and backward propagation functions. I managed to write the fwd_propagate…
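That week-2 assignment amounts to logistic regression, so the forward and backward passes fit in one function. A sketch under the usual notation (m examples as columns of X), without claiming to match the course's exact function names:

    import numpy as np

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    def propagate(w, b, X, Y):
        # Forward and backward pass for logistic regression.
        m = X.shape[1]
        A = sigmoid(w.T @ X + b)         # forward: predicted probabilities
        cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
        dw = (X @ (A - Y).T) / m         # backward: dCost/dw
        db = np.sum(A - Y) / m           # backward: dCost/db
        return {"dw": dw, "db": db}, cost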
-1
votes
1 answer

Incompatible types in assignment of variables in C++

Recently I have been trying to make a neural network with an Arduino library, and I came across one quite literally called Neural Network, by George Chousos. I stumbled upon a couple of errors that I managed to fix quite simply, but…
-1
votes
1 answer

Manually set gradient values in TensorFlow and use them in backpropagation

Just to make sure this is not an XY problem, I'll describe the situation: I'm building a NN with Keras/TensorFlow, and the loss function I'd like to use appears to be non-differentiable to TF. Wrapping it in tf.py_function didn't work, as the… (a tf.custom_gradient sketch follows below this entry)
bernie
  • 59
  • 1
  • 1
  • 8
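One way to hand TensorFlow an explicit gradient for a value it cannot differentiate is tf.custom_gradient; a sketch where the surrogate gradient is a placeholder, not the asker's actual rule:

    import tensorflow as tf

    @tf.custom_gradient
    def weird_loss(y_true, y_pred):
        # Non-differentiable forward value (rounding kills the gradient).
        value = tf.reduce_mean(tf.abs(tf.round(y_pred) - y_true))

        def grad(upstream):
            # Supply whatever gradient TF should backpropagate.
            # Here: the gradient of |y_pred - y_true| as a smooth surrogate.
            g = tf.sign(y_pred - y_true) / tf.cast(tf.size(y_pred), tf.float32)
            return tf.zeros_like(y_true), upstream * g

        return value, grad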
-1
votes
1 answer

In backpropagation, what does it mean when the error of a neural network converges to 0.5?

I've been trying to learn the math behind neural networks and have implemented (in Octave) a version of the following equations, which include bias terms. Back-propagation equations in matrix form: Visual representation of the problem and…
-1
votes
1 answer

How to train rebel neurons?

I'm training a pretty basic NN on the Fashion-MNIST dataset. I'm using my own code, which is not important. I use a rather simplified algorithm similar to Adam and a quadratic formula (train_value - real_value)**2 for training and error calculation. I…
José Chamorro
  • 497
  • 1
  • 6
  • 21
-1
votes
1 answer

Are Back Propagation and Recurrent Neural Networks the same?

I have searched this topic everywhere, but I couldn't find the exact answer I was looking for. So, I am still quite confused about the two terms Back Propagation and Recurrent Neural Networks. I have read that back propagation is used after the… (see the sketch below this entry)
Vivek
  • 336
  • 2
  • 4
  • 18
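They are different kinds of things: a recurrent neural network is an architecture, while backpropagation is the training algorithm, which for RNNs is applied to the unrolled sequence ("backpropagation through time"). A small PyTorch sketch showing that a single backward() call propagates through every time step:

    import torch
    import torch.nn as nn

    rnn = nn.RNN(input_size=3, hidden_size=5, batch_first=True)
    x = torch.randn(1, 10, 3)           # one sequence, 10 time steps

    out, h = rnn(x)                     # forward through all steps
    loss = out[:, -1].sum()             # loss on the final step only
    loss.backward()                     # backprop through time: gradients
                                        # flow back across all 10 steps
    print(rnn.weight_hh_l0.grad.shape)  # recurrent weights got a gradient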
-1
votes
1 answer

How are Weights Changed by Backpropagation in Neural Networks?

So I know that backpropagation uses the gradients and passes them back through the neural network to update the weights. But how exactly are the weights updated for the layers in the middle? Do the non-output layers use the same gradients that the… (see the sketch below this entry)
mg nt
  • 161
  • 3
  • 14
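To the question: no; each layer receives its own delta, produced by pushing the downstream delta back through that layer's outgoing weights and the activation derivative. A sketch of the recurrence (illustrative NumPy, sigmoid activations):

    import numpy as np

    def dsigmoid(z):
        s = 1 / (1 + np.exp(-z))
        return s * (1 - s)

    def layer_deltas(delta_out, weights, zs, dact=dsigmoid):
        # delta_l = (W_{l+1}.T @ delta_{l+1}) * f'(z_l), layer by layer,
        # where weights = [W_2, ..., W_L] and zs = [z_1, ..., z_{L-1}].
        deltas = [delta_out]
        for W, z in zip(reversed(weights), reversed(zs)):
            deltas.append((W.T @ deltas[-1]) * dact(z))
        return deltas[::-1]             # one distinct delta per layer

Each layer's weight gradient is then its own delta times its own input activation (dW_l = outer(delta_l, a_{l-1})), so middle layers never reuse the output layer's raw gradient.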