Questions tagged [backpropagation]

Backpropagation is a method of gradient computation, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".

1267 questions
0
votes
1 answer

RNN: Back-propagation through time when output is taken only at final timestep

In this blog on Recurrent Neural Networks by Denny Britz, the author states: "The above diagram has outputs at each time step, but depending on the task this may not be necessary. For example, when predicting the sentiment of a sentence we may…
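A minimal numpy sketch of the situation this question asks about (shapes and parameters invented here, not taken from Britz's post): even when the loss is attached only to the final timestep, backpropagation through time still sends gradient through every earlier step via the recurrent weights.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_in, n_h = 5, 3, 4          # sequence length, input size, hidden size

# Hypothetical toy parameters.
Wxh = rng.normal(0, 0.5, (n_h, n_in))
Whh = rng.normal(0, 0.5, (n_h, n_h))
xs = rng.normal(size=(T, n_in))
target = np.ones(n_h)

# Forward pass, storing every hidden state for BPTT.
hs = [np.zeros(n_h)]
for t in range(T):
    hs.append(np.tanh(Wxh @ xs[t] + Whh @ hs[-1]))

loss = 0.5 * np.sum((hs[-1] - target) ** 2)   # loss ONLY at the final step

# Backward pass: the error enters only at t = T-1 but flows to every t.
dWxh = np.zeros_like(Wxh)
dh = hs[-1] - target                          # dL/dh_T
for t in reversed(range(T)):
    dz = dh * (1 - hs[t + 1] ** 2)            # through tanh
    dWxh += np.outer(dz, xs[t])               # every timestep contributes
    dh = Whh.T @ dz                           # pass error back to h_{t-1}

print(np.linalg.norm(dWxh) > 0)  # → True: even t=0 received gradient
```

The only difference from the "output at every step" diagram is that the per-step loss terms are absent, so `dh` is seeded once at the end instead of being topped up at each step.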
0
votes
0 answers

How does cross-entropy speed up backpropagation on the hidden layers?

I am learning from http://neuralnetworksanddeeplearning.com/chap3.html. It says the cross-entropy cost function can speed up the network, because the σ'(z) cancels on the last layer. The partial derivative for the last layer L: ∂(C)/∂(w) =…
Joey
  • 175
  • 7
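The cancellation this question refers to, written out in the book's notation (sigmoid output layer, a = σ(z)); these are the standard formulas, shown side by side:

```latex
% Quadratic cost: the sigma'(z) factor slows learning when the neuron saturates
\delta^L = (a^L - y)\,\sigma'(z^L), \qquad
\frac{\partial C}{\partial w} = a^{L-1}\,(a^L - y)\,\sigma'(z^L)

% Cross-entropy cost: sigma'(z) cancels, so the gradient scales with the raw error
\delta^L = a^L - y, \qquad
\frac{\partial C}{\partial w} = a^{L-1}\,(a^L - y)
```

Note the cancellation happens only at the output layer; hidden layers still carry the factor, δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ'(z^l). The hidden layers learn faster simply because the error signal δ^L being propagated back is larger when the output neuron is saturated.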
0
votes
0 answers

Dot product between two branches of a CNN in TensorFlow

I'm trying to use TensorFlow to implement a CNN that, when given two images, can find the position at which both images are most similar. Something similar to what's described here: Efficient Deep Learning for Stereo Matching. I'm sharing the variables…
Findios
  • 307
  • 1
  • 4
  • 14
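The core of the matching step can be sketched without TensorFlow: slide one branch's feature map over the other and take a dot product at each offset; the argmax is the most similar position. A numpy sketch with invented shapes (one patch is planted inside the larger map so there is a known best match):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical feature maps from two CNN branches (channels last).
f1 = rng.normal(size=(8, 8, 16))   # larger map
f2 = rng.normal(size=(3, 3, 16))   # patch to localise

# Plant f2 inside f1 so the known best match is at (2, 4).
f1[2:5, 4:7, :] = f2

H, W, _ = f1.shape
h, w, _ = f2.shape
scores = np.zeros((H - h + 1, W - w + 1))
for i in range(H - h + 1):
    for j in range(W - w + 1):
        # Dot product of the patch with the window at offset (i, j).
        scores[i, j] = np.sum(f1[i:i + h, j:j + w, :] * f2)

best = np.unravel_index(np.argmax(scores), scores.shape)
print(best)  # → (2, 4)
```

In TensorFlow this whole double loop is typically expressed as a single convolution of one branch's output with the other's, which keeps the operation differentiable so gradients flow into both branches.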
0
votes
0 answers

Neural Network not identifying correct pattern

I have a neural network that contains 2 input neurons, 1 hidden layer containing 2 neurons, and one output neuron. I am using this neural network for the XOR problem, but it does not work. Test results: if you test 1, 1 you get the output of -1…
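A common cause of a 2-2-1 XOR network failing is missing bias terms: without biases the network cannot represent XOR at all. A minimal numpy sketch of the same architecture with biases and tanh units, using ±1 targets to match the question's -1/+1 outputs (initialisation and learning rate are invented here):

```python
import numpy as np

rng = np.random.default_rng(42)

# XOR with targets in {-1, +1}.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[-1], [1], [1], [-1]], dtype=float)

# 2-2-1 network with tanh activations; biases are essential for XOR.
W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros((1, 2))
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros((1, 1))
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, np.tanh(h @ W2 + b2)

_, out0 = forward(X)
loss0 = np.mean((out0 - y) ** 2)

for _ in range(5000):
    h, out = forward(X)
    # Backprop for squared error through tanh units.
    d_out = (out - y) * (1 - out ** 2)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0, keepdims=True)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0, keepdims=True)

_, out = forward(X)
loss = np.mean((out - y) ** 2)
print(loss < loss0)  # the loss drops; typically all four signs come out right
```

With only two hidden units gradient descent can occasionally land in a local minimum, so a second random restart is a reasonable fallback.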
0
votes
2 answers

Back-propagating gradients through a sparse tensor?

I have a normal feed-forward network that produces a vector v. The elements of v are then used as the non-zero entries of a sparse matrix M (assume the coordinates are predefined). The sparse matrix is then multiplied by a dense vector and a loss is…
zergylord
  • 4,368
  • 5
  • 38
  • 60
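The key fact behind this question can be demonstrated without any framework: if the non-zero entries of the sparse matrix come from a vector v at predefined coordinates, the gradient simply flows back into exactly those entries. A numpy sketch with a made-up scatter pattern and a linear loss, checked against a numerical gradient:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 5
coords = [(0, 1), (2, 3), (3, 0)]         # predefined non-zero positions
v = rng.normal(size=len(coords))          # produced by the feed-forward net
x = rng.normal(size=m)                    # dense vector M is multiplied by
w = rng.normal(size=n)                    # reduces the product to a scalar loss

def loss(v):
    M = np.zeros((n, m))
    for k, (i, j) in enumerate(coords):
        M[i, j] = v[k]                    # scatter v into the sparse matrix
    return w @ (M @ x)

# Analytic gradient: only the scattered entries receive gradient,
# dL/dv_k = w[i_k] * x[j_k].
grad = np.array([w[i] * x[j] for i, j in coords])

# Central-difference numerical check.
eps = 1e-6
num = np.array([(loss(v + eps * np.eye(len(v))[k]) -
                 loss(v - eps * np.eye(len(v))[k])) / (2 * eps)
                for k in range(len(v))])
print(np.allclose(grad, num))  # → True
```

In a framework, the same thing holds as long as the scatter/sparse op used is differentiable with respect to its values (the coordinates themselves are not differentiable).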
0
votes
1 answer

Can I view a convolutional neural network as a fully connected neural network?

For example, there is a 3x3 image, and a convolutional neural network that has two 2x2 filters convolves the image; in the end, the output dimension is 2x2x2. Can I view the above procedure as the following? Because of the 2x2 filter, after sliding…
user3094631
  • 425
  • 3
  • 13
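The answer to this question is yes, and the exact setup in the excerpt (3x3 image, two 2x2 filters, 2x2x2 output) is small enough to verify directly: the convolution equals a fully connected layer whose 8x9 weight matrix is mostly zeros, with each row holding one filter at one sliding position. A numpy sketch with random values:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(size=(3, 3))
filters = rng.normal(size=(2, 2, 2))      # two 2x2 filters

# Direct valid convolution (cross-correlation): 2x2 output per filter.
def conv(img, f):
    out = np.zeros((2, 2))
    for i in range(2):
        for j in range(2):
            out[i, j] = np.sum(img[i:i + 2, j:j + 2] * f)
    return out

direct = np.stack([conv(img, filters[k]) for k in range(2)])  # (2, 2, 2)

# Equivalent fully connected layer: one row per (filter, position) pair,
# with the filter's weights placed at that position and zeros elsewhere.
W = np.zeros((8, 9))
row = 0
for k in range(2):
    for i in range(2):
        for j in range(2):
            patch = np.zeros((3, 3))
            patch[i:i + 2, j:j + 2] = filters[k]
            W[row] = patch.ravel()
            row += 1

fc = (W @ img.ravel()).reshape(2, 2, 2)
print(np.allclose(direct, fc))  # → True
```

The difference from a generic fully connected layer is the constraint: the zeros stay zero and the four copies of each filter weight are tied together during training.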
0
votes
1 answer

How can I make a backpropagation algorithm for Titanic?

I'd like to make a backpropagation algorithm for Titanic (Kaggle's competition). It's very easy to do, but I have one problem. The backpropagation algorithm is about numbers, but we have string types in Titanic. For example, we have a column "Sex" (male…
vjg
  • 131
  • 1
  • 13
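The usual answer to this question is to encode the string columns as numbers before training. A minimal sketch (column names follow Kaggle's Titanic CSV; the rows here are made up): map a binary column to 0/1, and one-hot encode a multi-valued column so the network does not see a fake ordering between categories.

```python
# Hypothetical rows in the shape of Kaggle's Titanic data.
rows = [
    {"Sex": "male", "Embarked": "S"},
    {"Sex": "female", "Embarked": "C"},
    {"Sex": "female", "Embarked": "Q"},
]

# Binary column: a simple 0/1 mapping is enough.
sex_map = {"male": 0.0, "female": 1.0}

# Multi-valued column: one-hot encoding avoids implying an order.
ports = ["S", "C", "Q"]

def encode(row):
    onehot = [1.0 if row["Embarked"] == p else 0.0 for p in ports]
    return [sex_map[row["Sex"]]] + onehot

features = [encode(r) for r in rows]
print(features[0])  # → [0.0, 1.0, 0.0, 0.0]
```

Encoding "S"/"C"/"Q" as 0/1/2 instead would tell the network that "Q" is twice "C", which is meaningless here; one-hot columns keep the categories symmetric.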
0
votes
1 answer

Cross entropy applied to backpropagation in neural network

I watched this awesome video by Dave Miller on making a neural network from scratch in C++ here: https://vimeo.com/19569529 Here is the full source code referenced in the video: http://inkdrop.net/dave/docs/neural-net-tutorial.cpp It uses mean…
0
votes
1 answer

Neural network function converges to y=1

I'm trying to program a neural network with backpropagation in Python. It usually converges to 1. To the left of the image there are some delta values. They are very small; should they be larger? Do you know a reason why this converging could…
sezanzeb
  • 816
  • 8
  • 20
0
votes
0 answers

My matlab neural network backpropagation algorithm seems buggy

Here is my code. I think it is wrong because the difference between this computed gradient and my numerical estimate is too significant. It doesn't seem to be due to wrongly inverting matrices, etc. For context, Y is the output layer, X is the input…
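The comparison this question describes — analytic backprop gradient vs. a numerical estimate — is the standard gradient check, and it is worth seeing what "agreeing" should look like. A self-contained Python sketch for a tiny sigmoid layer (the MATLAB code from the question is not reproduced here; shapes and data are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Tiny one-layer network: A = sigmoid(W X), squared-error loss against Y.
X = rng.normal(size=(3, 5))   # 3 features, 5 examples
Y = rng.uniform(size=(2, 5))  # 2 outputs
W = rng.normal(size=(2, 3))

def loss(W):
    return 0.5 * np.sum((sigmoid(W @ X) - Y) ** 2)

# Analytic gradient from backprop: delta = (A - Y) * A * (1 - A).
A = sigmoid(W @ X)
grad = ((A - Y) * A * (1 - A)) @ X.T

# Central-difference estimate, one weight at a time.
eps = 1e-5
num = np.zeros_like(W)
for idx in np.ndindex(*W.shape):
    E = np.zeros_like(W); E[idx] = eps
    num[idx] = (loss(W + E) - loss(W - E)) / (2 * eps)

# A healthy implementation keeps this difference tiny (roughly < 1e-7).
print(np.max(np.abs(grad - num)))
```

A "too significant" difference that is uniform across all weights usually means a wrong derivative formula or a transposed matrix; a difference in only some entries often points at an indexing or bias-handling bug.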
0
votes
1 answer

Backpropagation: network error of one test input rises, the others go down, what's wrong?

I am currently trying to program a neural network... For learning I want to use the backpropagation algorithm! My problem is that I don't know where my error is. I am trying to train it on the logical AND. My network errors after the first round…
praetorianer777
  • 309
  • 3
  • 12
0
votes
1 answer

Encog Backpropagation Error not changing

The total error for the network did not change over 100,000 iterations. The input is 22 values and the output is a single value. The input array is [195][22] and the output array is [195][1]. BasicNetwork network = new BasicNetwork(); …
0
votes
1 answer

Implementation of backpropagation algorithm

I'm building a neural network with the architecture: input layer --> fully connected layer --> ReLU --> fully connected layer --> softmax I'm using the equations outlined here DeepLearningBook to implement backprop. I think my mistake is in eq. 1.…
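For reference, the standard backpropagation equations this kind of implementation follows (Nielsen/Goodfellow-style notation; ⊙ is the elementwise product), so eq. 1 can be checked term by term:

```latex
\delta^L = \nabla_a C \odot \sigma'(z^L)                       % (1) output error
\delta^l = \big((w^{l+1})^T \delta^{l+1}\big) \odot \sigma'(z^l) % (2) propagate back
\frac{\partial C}{\partial b^l_j} = \delta^l_j                 % (3) bias gradient
\frac{\partial C}{\partial w^l_{jk}} = a^{l-1}_k \, \delta^l_j % (4) weight gradient
```

For the architecture in the question (softmax output with cross-entropy loss), eq. 1 simplifies to δ^L = a^L − y; applying the generic ∇_a C ⊙ σ'(z^L) form on top of that simplification is a common source of exactly the kind of eq. 1 mistake described.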
0
votes
1 answer

Siamese Net BackProp, how to effectively update?

How to most effectively update the shared weights of a Siamese Net, given contrastive loss function in Tensorflow?
trdavidson
  • 1,051
  • 12
  • 25
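The short answer to this question is that a shared weight receives the sum of the gradients from both branches, and frameworks like TensorFlow do that automatically when both branches reference the same variable. A numpy sketch of the similar-pair term of a contrastive loss through one shared linear layer (shapes invented), verifying that summing the two branch gradients matches a numerical gradient:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))      # the single shared weight matrix
x1 = rng.normal(size=4)          # input to branch 1
x2 = rng.normal(size=4)          # input to branch 2

def loss(W):
    d = W @ x1 - W @ x2          # both branches use the SAME W
    return np.sum(d ** 2)        # similar-pair term of a contrastive loss

# Each branch contributes a gradient; the shared weight gets their sum.
d = W @ x1 - W @ x2
g_branch1 = 2 * np.outer(d, x1)      # from the e1 = W @ x1 branch
g_branch2 = -2 * np.outer(d, x2)     # from the e2 = W @ x2 branch
grad = g_branch1 + g_branch2

# Numerical check that summing the branch gradients is the right update.
eps = 1e-6
num = np.zeros_like(W)
for idx in np.ndindex(*W.shape):
    E = np.zeros_like(W); E[idx] = eps
    num[idx] = (loss(W + E) - loss(W - E)) / (2 * eps)
print(np.allclose(grad, num))  # → True
```

Practically, this means one set of variables reused for both towers and a single optimizer step; averaging the two branch gradients instead of summing only rescales the effective learning rate.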
0
votes
0 answers

How to use a dataset in Neural Network training

I am trying to implement a neural network. I am currently working on the backpropagation part. I don't need help with the coding. I have written the feedForward part so far with great success, but my question is more related to the dataset I am using.…