Questions tagged [backpropagation]

Backpropagation is a method of gradient computation, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
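The description above can be sketched in a few lines of NumPy. This is a minimal illustration (a single sigmoid unit with a made-up input and target, not taken from any question below): the forward pass computes the output, the backward pass applies the chain rule to the error, and gradient descent updates the weights toward the "teacher" target.

```python
import numpy as np

def sigmoid(x):
    """Logistic activation."""
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
w = rng.normal(size=2)        # input -> output weights
b = 0.0                       # bias
x = np.array([0.5, -0.2])     # input (illustrative values)
t = 0.6                       # desired output, the "teacher" signal
lr = 0.5                      # learning rate

for _ in range(1000):
    y = sigmoid(w @ x + b)      # forward pass
    err = y - t                 # "backward propagation of errors" starts here
    grad = err * y * (1.0 - y)  # chain rule through the sigmoid
    w -= lr * grad * x          # gradient-descent update
    b -= lr * grad
```

After training, the unit's output is close to the target, which is all backpropagation plus gradient descent promises on a single example.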

1267 questions
0
votes
1 answer

CS231N Lecture 4 Back Prop - Chain Rule

I am sure this has a simple answer! I am asking to improve my understanding. A diagram: a modification of CS231N Back Propagation. If the Chain Rule is applied to get the delta for Y, the gradient will be dy = -4 according to the diagram. Applying…
Rusty Nail
  • 2,692
  • 3
  • 34
  • 55
0
votes
1 answer

How to judge a neural network?

I wrote a neural network; it's mostly based (bug-fixed) on the neural nets from James McCaffrey https://visualstudiomagazine.com/articles/2015/04/01/back-propagation-using-c.aspx I came across various Git projects and books using his code, and as…
Peter
  • 2,043
  • 1
  • 21
  • 45
0
votes
1 answer

Backpropagation doesn't work in TensorFlow

I am new to TensorFlow. Recently I wanted to fit a non-linear function "y = 1 + sin(x * pi/4)" with a two-layer neural network model. The code of the program is the following: #!/usr/bin/python import tensorflow as tf import numpy as np …
WangYang
  • 466
  • 1
  • 5
  • 15
0
votes
2 answers

Is it normal to get a big error in a backpropagation neural network when using the same data for training and testing?

I'm doing some programming with backpropagation neural networks. I have about 90 data points and train on all of them (90 samples), then test on the same 90 samples. I'm using an iteration threshold of about 2 iterations to test…
0
votes
1 answer

Deep Neural Networks: what is the relationship between AlexNet's output and the loss function?

I am trying to understand DNNs with MatConvNet DagNN. I have a question based on the following last two layers of a net, which uses Euclidean loss for regression: net.addLayer('fc9', dagnn.Conv('size', [1 1 4096 1], 'hasBias', true, 'stride', [1,1],…
h612
  • 544
  • 2
  • 11
0
votes
0 answers

Training Backpropagation Neural Network with 150k training pairs

I am currently trying to train my backpropagation network to classify 150k training pairs. Each training pair is a vector of 18 bipolar numbers, and it runs through 2 hidden layers with a final output of 1 number (18-18-18-1). When I feed my neural network…
0
votes
0 answers

Function approximation by ANN

So I have something like…
viceriel
  • 835
  • 12
  • 18
0
votes
1 answer

Bayesian network error in MATLAB

I tried to run the following code to train my Bayesian network: p = [-1:.05:1]; t = sin(2*pi*p)+0.1*randn(size(p)); net = feedforwardnet(2,'trainbr'); net = train(net,p,t); a = net(p); and received the error below: Default value is not a…
0
votes
1 answer

Performance comparison plots for different backpropagation algorithms

I am implementing various backpropagation algorithms on the same dataset and trying to compare their performance. I got help from the following tutorial for the…
mari
  • 167
  • 4
  • 15
0
votes
1 answer

Neural Network for Linear Regression: prediction different every time

I have 200 training examples. I have run linear regression with 6 features on this dataset and it works fine, so I want to run neural networks on it too. Problem: each time I run the program, the prediction (pred) is different, vastly…
0
votes
1 answer

Can someone tell me what is wrong with this backpropagation implementation?

So I am trying to implement a backpropagation neural network in C#, and I've come across a hiccup. When training the network, all the outputs are either 0.49???... or 0.51???... Here's my network class: namespace BackPropNetwork { public class…
0
votes
1 answer

Clockwork neural network (CW-RNN)

Thanks for reading this post! Quick question for RNN enthusiasts here: I know that in backpropagation through time (BPTT), there are at least 3 steps. For each element in a sequence: Step 1 - Compute the 'error ratio' of each neuron, from the upper…
0
votes
2 answers

Multiply vectors with shape (2,) and (3, 1)

I have this code: import numpy as np def sigmoid(x): """ Calculate sigmoid """ return 1 / (1 + np.exp(-x)) x = np.array([0.5, 0.1, -0.2]) target = 0.6 learnrate = 0.5 weights_input_hidden = np.array([[0.5, -0.6], …
VansFannel
  • 45,055
  • 107
  • 359
  • 626
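The excerpt above is truncated, so the exact bug is not visible; but a common cause of this shape error in that exercise is trying an elementwise product of the input (shape (3,)) and the hidden-layer error term (shape (2,)) when forming the weight gradient. A sketch of the usual fix, with made-up values, is the outer product:

```python
import numpy as np

# Hypothetical shapes matching the excerpt: input x has shape (3,),
# the hidden-layer error term has shape (2,); the values are invented.
x = np.array([0.5, 0.1, -0.2])          # shape (3,)
hidden_error = np.array([0.4, -0.1])    # shape (2,)

# x * hidden_error fails: (3,) and (2,) do not broadcast.
# The gradient of a (3, 2) input->hidden weight matrix is the
# outer product of input and error:
grad_w_input_hidden = np.outer(x, hidden_error)   # shape (3, 2)
# Equivalent via broadcasting: x[:, None] * hidden_error
```

Either spelling produces one gradient entry per weight, which is what the update step of backpropagation needs.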
0
votes
0 answers

Prevent network weights from being updated when performing a backward pass

I am trying to perform a backward pass through my network, and I don't want to update the network weights when I do the backward pass. output = net:forward(input) err = criterion:forward(output, label) df_do = criterion:backward(output,…
0
votes
1 answer

Should the output of backpropagation converge to 1, given the output is in (0,1)?

I am currently trying to understand the ANN that I created for an assignment, which essentially takes grayscale (0-150) images (120x128) and determines whether the person is male or female. It works for the most part. I am treating this like a boolean…
Kendall
  • 17
  • 3