Questions tagged [backpropagation]

Backpropagation is a method for computing gradients, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".

1267 questions
7
votes
1 answer

Correct backpropagation in simple perceptron

Given the simple OR gate problem: or_input = np.array([[0,0], [0,1], [1,0], [1,1]]) or_output = np.array([[0,1,1,1]]).T If we train a simple single-layer perceptron (without backpropagation), we could do something like this: import numpy as…
alvas
  • 115,346
  • 109
  • 446
  • 738
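
As context for answers: a minimal runnable sketch of a single-layer sigmoid unit trained on the OR data with a plain gradient step. Only or_input and or_output come from the question; the learning rate, epoch count, and initialization are illustrative choices, not the asker's code:

    import numpy as np

    or_input = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    or_output = np.array([[0, 1, 1, 1]]).T

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    W = rng.normal(size=(2, 1))                 # one weight per input
    b = np.zeros(1)

    for _ in range(5000):
        y_hat = sigmoid(or_input @ W + b)       # forward pass
        delta = (y_hat - or_output) * y_hat * (1 - y_hat)   # chain rule through the sigmoid
        W -= 0.5 * or_input.T @ delta           # gradient step on the weights
        b -= 0.5 * delta.sum(axis=0)            # gradient step on the bias

    print(np.round(sigmoid(or_input @ W + b), 2))   # ≈ [[0], [1], [1], [1]]
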
7
votes
2 answers

How to apply Guided BackProp in Tensorflow 2.0?

I am starting with TensorFlow 2.0 and trying to implement Guided BackProp to display a Saliency Map. I started by computing the loss between y_pred and y_true of an image, then finding the gradients of all layers with respect to this loss. with tf.GradientTape() as…
Tai Christian
  • 654
  • 1
  • 10
  • 21
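
The usual recipe, in sketch form (not the asker's code): define a ReLU with a custom gradient that zeroes negative upstream gradients as well as gradients at negative inputs, swap it into the model's activations, and differentiate the top prediction with respect to the input. The saliency helper below assumes the model already uses the guided activation:

    import tensorflow as tf

    @tf.custom_gradient
    def guided_relu(x):
        # backward pass keeps a gradient only where both the upstream
        # gradient and the forward input are positive
        def grad(dy):
            return tf.cast(dy > 0, dy.dtype) * tf.cast(x > 0, dy.dtype) * dy
        return tf.nn.relu(x), grad

    def saliency(model, image):
        # assumes every ReLU in `model` has been replaced by guided_relu;
        # doing that swap is model-specific and not shown here
        image = tf.convert_to_tensor(image)
        with tf.GradientTape() as tape:
            tape.watch(image)
            preds = model(image)
            top = tf.reduce_max(preds, axis=-1)   # top class score
        return tape.gradient(top, image)          # gradient w.r.t. the input pixels
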
7
votes
1 answer

Truncated Backpropagation Through Time (BPTT) in Pytorch

In PyTorch, I train an RNN/GRU/LSTM network by starting the Backpropagation (Through Time) with: loss.backward() When the sequence is long, I'd like to do a Truncated Backpropagation Through Time instead of a normal Backpropagation Through Time…
u2gilles
  • 6,888
  • 7
  • 51
  • 75
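
For reference, a common shape such an answer takes: process the sequence in chunks of k steps and detach the hidden state between chunks, so loss.backward() only unrolls through the current chunk. Everything below (model sizes, loss, optimizer) is a placeholder, not the asker's setup:

    import torch
    import torch.nn as nn

    def train_tbptt(rnn, head, criterion, optimizer, seq, targets, k=35):
        hidden = None
        for t in range(0, seq.size(0), k):
            chunk, tgt = seq[t:t + k], targets[t:t + k]
            optimizer.zero_grad()
            out, hidden = rnn(chunk, hidden)      # out: (k, batch, hidden_size)
            loss = criterion(head(out), tgt)
            loss.backward()                       # unrolls only through this chunk
            optimizer.step()
            if isinstance(hidden, tuple):         # LSTM returns (h, c)
                hidden = tuple(h.detach() for h in hidden)
            else:                                 # RNN/GRU return one tensor
                hidden = hidden.detach()          # cut the graph before the next chunk

    # illustrative usage with throwaway shapes: 100 steps, batch 8, 5 features
    rnn = nn.GRU(5, 16)
    head = nn.Linear(16, 1)
    opt = torch.optim.SGD(list(rnn.parameters()) + list(head.parameters()), lr=0.01)
    train_tbptt(rnn, head, nn.MSELoss(), opt, torch.randn(100, 8, 5), torch.randn(100, 8, 1))
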
7
votes
1 answer

Backpropagation with Momentum

I'm following this tutorial for implementing the backpropagation algorithm. However, I am stuck at implementing momentum for this algorithm. Without momentum, this is the code for the weight update method: def update_weights(network, row, l_rate): …
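
A hedged sketch of how classical momentum is usually bolted onto that tutorial's update_weights. The network-as-list-of-dicts layout (neurons with 'weights', 'delta', 'output', bias stored as the last weight, class label as the last element of row) follows the tutorial; the 'velocity' field and the 0.9 default are assumptions, and the step's sign must match whichever convention the tutorial's plain update uses:

    def update_weights(network, row, l_rate, momentum=0.9):
        for i, layer in enumerate(network):
            # inputs to the first layer come from the row (minus the label),
            # to later layers from the previous layer's outputs
            inputs = row[:-1] if i == 0 else [n['output'] for n in network[i - 1]]
            for neuron in layer:
                # per-neuron velocity, same shape as its weight list
                v = neuron.setdefault('velocity', [0.0] * len(neuron['weights']))
                for j, x in enumerate(inputs):
                    v[j] = momentum * v[j] + l_rate * neuron['delta'] * x
                    neuron['weights'][j] -= v[j]
                v[-1] = momentum * v[-1] + l_rate * neuron['delta']  # bias input is 1
                neuron['weights'][-1] -= v[-1]
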
7
votes
1 answer

keras combining two losses with adjustable weights

So here is the detailed description. I have a Keras functional model with two layers with outputs x1 and x2. x1 = Dense(1,activation='relu')(prev_inp1) x2 = Dense(2,activation='relu')(prev_inp2) I need to use these x1 and x2, merge/add them and…
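
For a fixed weighting, the standard Keras route is compile(..., loss_weights=...) on a two-output functional model. The sketch below uses placeholder shapes, names, and weights, not the asker's model:

    import tensorflow as tf
    from tensorflow.keras import layers, Model

    inp = layers.Input(shape=(8,))
    h = layers.Dense(16, activation='relu')(inp)
    x1 = layers.Dense(1, activation='relu', name='x1')(h)
    x2 = layers.Dense(2, activation='relu', name='x2')(h)
    model = Model(inp, [x1, x2])

    # total loss = 0.7 * loss(x1) + 0.3 * loss(x2)
    model.compile(optimizer='adam',
                  loss={'x1': 'mse', 'x2': 'mse'},
                  loss_weights={'x1': 0.7, 'x2': 0.3})

If the weights must change during training rather than stay fixed, the usual workaround is a custom loss (or a callback) that reads them from variables; that part is not shown here.
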
7
votes
3 answers

Neural network for letter recognition

I'm trying to add to the code for a single layer neural network which takes a bitmap as input and has 26 outputs for the likelihood of each letter in the alphabet. The first question I have is regarding the single hidden layer that is being added.…
dylan
  • 91
  • 1
  • 4
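
For orientation, a minimal sketch of the architecture being discussed: a flattened bitmap as input, one hidden layer, and 26 output scores normalized with a softmax. All sizes here are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    n_pixels, n_hidden, n_letters = 35, 20, 26     # e.g. a 5x7 bitmap
    W1 = rng.normal(scale=0.1, size=(n_pixels, n_hidden))
    W2 = rng.normal(scale=0.1, size=(n_hidden, n_letters))

    def forward(bitmap):
        h = np.tanh(bitmap @ W1)                   # hidden layer
        scores = h @ W2                            # one score per letter
        e = np.exp(scores - scores.max())
        return e / e.sum()                         # softmax: likelihood per letter

    print(forward(rng.integers(0, 2, n_pixels)).shape)   # (26,)
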
7
votes
2 answers

Full-matrix approach to backpropagation in Artificial Neural Network

I have been learning about Artificial Neural Networks (ANNs) recently and have working Python code for one, based on mini-batch training. I followed Michael Nielsen's book Neural Networks and Deep Learning, where there is a step by step…
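
The exercise in Nielsen's book amounts to replacing the per-example loop with one matrix pass over the whole mini-batch. A hedged sketch of that full-matrix update for a one-hidden-layer network with sigmoid activations and quadratic cost (names and layout are mine, not the book's code):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def batch_update(W1, b1, W2, b2, X, Y, eta):
        n = X.shape[0]                       # X: n x d, Y: n x out
        A1 = sigmoid(X @ W1 + b1)            # hidden activations, n x hid
        A2 = sigmoid(A1 @ W2 + b2)           # output activations, n x out
        D2 = (A2 - Y) * A2 * (1 - A2)        # output deltas (quadratic cost)
        D1 = (D2 @ W2.T) * A1 * (1 - A1)     # hidden deltas via the chain rule
        W2 -= (eta / n) * (A1.T @ D2)        # averaged gradient steps
        b2 -= (eta / n) * D2.sum(axis=0)
        W1 -= (eta / n) * (X.T @ D1)
        b1 -= (eta / n) * D1.sum(axis=0)
        return W1, b1, W2, b2
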
7
votes
1 answer

Neural Network settings for fast training

I am creating a tool for predicting the time and cost of software projects based on past data. The tool uses a neural network to do this and so far, the results are promising, but I think I can do a lot more optimisation just by changing the…
danpalmer
  • 2,163
  • 4
  • 24
  • 41
7
votes
2 answers

Unit testing backpropagation neural network code

I am writing a backprop neural net mini-library from scratch and I need some help with writing meaningful automated tests. Up until now I have automated tests that verify that weight and bias gradients are calculated correctly by the backprop…
Paul Manta
  • 30,618
  • 31
  • 128
  • 208
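
The standard meaningful test here is gradient checking: compare the analytic gradients against central finite differences. A sketch, where loss_fn and grad_fn stand in for the library under test:

    import numpy as np

    def check_gradients(loss_fn, grad_fn, params, eps=1e-5, tol=1e-6):
        analytic = grad_fn(params)
        numeric = np.zeros_like(params)
        for i in range(params.size):
            bump = np.zeros_like(params)
            bump.flat[i] = eps
            # central difference approximation of dLoss/dparam_i
            numeric.flat[i] = (loss_fn(params + bump) - loss_fn(params - bump)) / (2 * eps)
        rel = np.linalg.norm(analytic - numeric) / (
            np.linalg.norm(analytic) + np.linalg.norm(numeric) + 1e-12)
        assert rel < tol, f"gradient check failed: relative error {rel:.2e}"

    # sanity check of the checker itself on f(w) = sum(w**2), whose gradient is 2w
    w = np.random.rand(5)
    check_gradients(lambda p: np.sum(p ** 2), lambda p: 2 * p, w)
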
7
votes
2 answers

Neural Network not fitting XOR

I created an Octave script for training a neural network with 1 hidden layer using backpropagation, but it cannot seem to fit an XOR function. x Input 4x2 matrix [0 0; 0 1; 1 0; 1 1] y Output 4x1 matrix [0; 1; 1; 0] theta Hidden / output layer…
7
votes
3 answers

Part 2 Resilient backpropagation neural network

This is a follow-on question to this post. For a given neuron, I'm unclear as to how to take the partial derivative of its error and the partial derivative of its weight. Working from this web page, it's clear how the propagation works (although I'm…
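
For reference, the core of the Rprop update the question is about, in sketch form: each weight carries its own step size, grown when the sign of dE/dw is stable across iterations and shrunk when it flips, and only the sign of the gradient is used. Full Rprop also backtracks the weight on a sign change, which is omitted here; the factors are the commonly cited defaults:

    import numpy as np

    def rprop_step(w, grad, prev_grad, step,
                   eta_plus=1.2, eta_minus=0.5, step_min=1e-6, step_max=50.0):
        prod = grad * prev_grad
        # same sign: grow the per-weight step; flipped sign: shrink it
        step = np.where(prod > 0, np.minimum(step * eta_plus, step_max),
               np.where(prod < 0, np.maximum(step * eta_minus, step_min), step))
        w = w - np.sign(grad) * step     # move by the step, in the downhill direction
        return w, step
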
6
votes
1 answer

Neural network back-propagation: error in training

After reading some articles about neural networks (back-propagation), I tried to write a simple neural network myself. I decided on an XOR neural network. My problem is when I am trying to train the network: if I use only one example to train the…
aliyaho
  • 61
  • 1
  • 2
6
votes
1 answer

Which multiplication and addition factor to use when doing adaptive learning rate in neural networks?

I am new to neural networks and, to get a grip on the matter, I have implemented a basic feed-forward MLP which I currently train through back-propagation. I am aware that there are more sophisticated and better ways to do that, but in Introduction to…
tunnuz
  • 23,338
  • 31
  • 90
  • 128
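
The scheme such books describe is often quoted as the "bold driver" heuristic: increase the rate a little while the error keeps falling, cut it hard when the error rises. A sketch with the commonly cited 1.05/0.5 factors, which are reasonable defaults rather than the only valid choice:

    def adapt_learning_rate(l_rate, error, prev_error, up=1.05, down=0.5):
        if prev_error is None or error < prev_error:
            return l_rate * up       # reward progress with a small multiplicative increase
        return l_rate * down         # punish an error increase with a large cut
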
6
votes
3 answers

Is there a method in Pytorch to count the number of unique values in a way that can be back propagated?

Given the following tensor (which is the result of a network [note the grad_fn]): tensor([121., 241., 125., 1., 108., 238., 125., 121., 13., 117., 121., 229., 161., 13., 0., 202., 161., 121., 121., 0., 121., 121., 242., 125.], …
jwelch1324
  • 61
  • 1
  • 1
  • 3
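
torch.unique itself is not differentiable, so answers usually fall back on a soft estimate. One common trick, sketched below: score each element by how many near-duplicates it has, so a value repeated m times contributes about m * (1/m) = 1 to the total. The Gaussian kernel and its width sigma are assumptions:

    import torch

    def soft_unique_count(x, sigma=0.1):
        diffs = x.unsqueeze(0) - x.unsqueeze(1)      # pairwise differences, n x n
        sims = torch.exp(-(diffs / sigma) ** 2)      # ~1 for near-equal values, ~0 otherwise
        return (1.0 / sims.sum(dim=1)).sum()         # each duplicate group sums to ~1

    x = torch.tensor([1.0, 1.0, 2.0, 3.0], requires_grad=True)
    n = soft_unique_count(x)     # ≈ 3.0, and n.backward() propagates gradients
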
6
votes
2 answers

Matrix dimensions not matching in back propagation

Here I'm attempting to implement a neural network with a single hidden layer to classify two training examples. This network uses the sigmoid activation function. The layers' dimensions and weights are as follows: X: 2X4 w1: 2X3 l1: 4X3 w2:…
blue-sky
  • 51,962
  • 152
  • 427
  • 752
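
A shape audit is the quickest way to debug this class of error: with X of shape (n, d), W1 must be (d, h) and W2 (h, out), and every gradient must have the same shape as the parameter it updates. A self-contained sketch with made-up sizes, not the asker's values:

    import numpy as np

    n, d, h, out = 4, 2, 3, 1
    X = np.random.rand(n, d)
    Y = np.random.rand(n, out)
    W1, W2 = np.random.rand(d, h), np.random.rand(h, out)

    A1 = 1 / (1 + np.exp(-(X @ W1)))      # hidden activations, (n, h)
    A2 = 1 / (1 + np.exp(-(A1 @ W2)))     # outputs, (n, out)
    D2 = (A2 - Y) * A2 * (1 - A2)         # output deltas, (n, out)
    D1 = (D2 @ W2.T) * A1 * (1 - A1)      # hidden deltas, (n, h)

    assert (A1.T @ D2).shape == W2.shape  # each gradient matches its parameter
    assert (X.T @ D1).shape == W1.shape
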