Questions tagged [backpropagation]

Backpropagation is a method for computing gradients, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
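
The "backward propagation of errors" idea can be sketched on the smallest possible network: a single sigmoid neuron with a squared-error loss, where the chain rule and the gradient-descent update fit in a few lines. This is an illustrative sketch, not code from any question below:

```python
import math

def forward(w, b, x):
    # Sigmoid neuron: prediction = sigmoid(w*x + b)
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def backward(w, b, x, target):
    # Chain rule: dL/dw = dL/dy * dy/dz * dz/dw
    y = forward(w, b, x)
    dL_dy = 2.0 * (y - target)   # derivative of (y - target)^2
    dy_dz = y * (1.0 - y)        # sigmoid derivative
    return dL_dy * dy_dz * x, dL_dy * dy_dz   # dL/dw, dL/db

# A "teacher" supplies the desired output (1.0) for input x = 1.0.
w, b, lr = 0.5, 0.0, 0.5
for _ in range(200):
    dw, db = backward(w, b, 1.0, 1.0)
    w -= lr * dw                 # gradient-descent update
    b -= lr * db
print(forward(w, b, 1.0))        # approaches the target 1.0
```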

1267 questions
-2
votes
1 answer

Dimension of gradients in backpropagation

Take a simple neural network that takes in data of dimension NxF and outputs data of dimension NxC, where N, F, and C represent the number of samples, the number of features, and the number of output neurons respectively. Needless to say, the softmax function with cross-entropy is used, given we…
VM_AI
  • 1,132
  • 4
  • 13
  • 25
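
On the dimension question above: the gradient of the loss with respect to a weight matrix always has the same shape as that matrix. A minimal NumPy sketch using the question's NxF / NxC shapes (the single-layer softmax model and the specific sizes are assumptions for illustration):

```python
import numpy as np

N, F, C = 8, 5, 3                      # samples, features, output neurons
rng = np.random.default_rng(0)
X = rng.normal(size=(N, F))            # input:   N x F
W = rng.normal(size=(F, C))            # weights: F x C
y = rng.integers(0, C, size=N)         # integer class labels

# Forward pass: logits -> softmax probabilities
logits = X @ W                                            # N x C
exp = np.exp(logits - logits.max(axis=1, keepdims=True))  # stable softmax
probs = exp / exp.sum(axis=1, keepdims=True)

# Backward pass: for softmax + cross-entropy,
# dL/dlogits = (probs - one_hot(y)) / N
d_logits = probs.copy()
d_logits[np.arange(N), y] -= 1.0
d_logits /= N                          # N x C

dW = X.T @ d_logits                    # (F x N) @ (N x C) = F x C
print(dW.shape)                        # (5, 3), same shape as W
```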
-2
votes
1 answer

Parts of Speech classification problem - Neural Network not learning

I am writing a NN to classify Polish parts of speech. When I launch the neural network, I notice that the weights constantly increase and the error (cost) reaches its maximum instead of being minimized. Here is my Network class: import…
-2
votes
1 answer

backpropagation with more than one node per layer

I read this article about how backpropagation works, and I understood everything it said. It says that to find the gradient we have to take the partial derivative of the cost function with respect to each weight/bias. However, to explain this they used a…
-2
votes
1 answer

What machine learning model should I use?

I'm currently making a machine learning model for a student project, and I'm still deciding what model I should use. Here's the brief I was given: Global Terrorism Database (GTD) is an open-source database including information on terrorist events…
-2
votes
1 answer

Prediction always 1 or 0

EDIT: squashing the input between 0 and 1 gives me about 0.5 output per neuron per data set. It seems the output is always 1 with every set of inputs I feed forward after I train. However, if I change the learning rate from positive to negative and vice versa, the…
-2
votes
1 answer

How does backpropagation work?

I created my first simple neural net on paper. It has 5 inputs (data: float numbers from 0.0 to 10.0) and one output, without hidden layers. For example, at the start my weights = [0.2, 0.2, 0.15, 0.15, 0.3]. The result should be in a range like the input…
-2
votes
1 answer

Backpropagation Optimization: How do I use the derivatives for optimizing the weights and biases?

Given the derivative of the cost function with respect to the weights or biases of the neurons of a neural network, how do I adjust these neurons to minimize the cost function? Do I just subtract the derivative multiplied by a constant off of the…
Ron Lauterbach
  • 107
  • 1
  • 1
  • 12
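
On the optimization question above: the basic update is exactly "subtract the derivative multiplied by a constant", where the constant is the learning rate. A minimal sketch on a one-parameter cost (the quadratic is an illustrative assumption):

```python
# Plain gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)   # df/dw

w, lr = 0.0, 0.1
for _ in range(100):
    w -= lr * grad(w)        # subtract derivative times learning rate
print(round(w, 4))           # converges to 3.0
```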
-2
votes
1 answer

What is net.trainParam.mu in the NN toolbox, and should I change this parameter?

When I am using "trainbr", it always finishes with "maximum mu reached". What does this mean? Should I change this parameter?
-2
votes
1 answer

What is Backpropagation?

Can you break down 'Backpropagation' into its simplest form? I'm OK with math, but I'm just trying to get a general idea of the term. I'm reading this article: The mostly complete chart of Neural Networks, explained. In the last paragraph on Feed…
-2
votes
1 answer

Keras - Case wise masking of the output for backpropagation

I am trying to replicate this article https://arxiv.org/pdf/1606.07659v1.pdf in Keras. It uses an auto-encoder as a recommender system. The idea is to mask some of the known values (ratings) in order to teach your network to predict the unknown values…
Cyrilleb
  • 1
  • 1
-2
votes
1 answer

Local minima in Backpropagation algorithm

The addition of an extra term, called a proportional factor, reduces the convergence time of the backpropagation algorithm. So how can local minima be avoided in the backpropagation algorithm?
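
A common mitigation for the local-minima problem asked about above is adding a momentum term to the weight update, so that the step accumulates past gradients and can carry through shallow dips. The sketch below shows only the update mechanics on a simple quadratic; the cost function and coefficients are illustrative assumptions:

```python
def grad(w):
    return 2.0 * (w - 3.0)           # derivative of f(w) = (w - 3)^2

w, v, lr, mu = 0.0, 0.0, 0.05, 0.9   # mu is the momentum coefficient
for _ in range(300):
    v = mu * v - lr * grad(w)        # velocity accumulates past gradients
    w += v                           # step by the velocity, not the raw gradient
print(round(w, 4))
```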
-2
votes
1 answer

Backpropagation algorithm with adaptive learning rate

I searched for resources to learn the backpropagation algorithm with an adaptive learning rate and found a lot of them, but they were hard for me to understand because I'm new to neural networks. I know very well how the standard backpropagation algorithm works. Is…
starrr
  • 1,013
  • 1
  • 17
  • 48
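
One simple adaptive-learning-rate scheme relevant to the question above is the "bold driver" heuristic: grow the learning rate slightly after a step that lowers the cost, and shrink it sharply after a step that raises it. The cost function and the growth/shrink factors below are illustrative assumptions:

```python
def cost(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

w, lr = 0.0, 1.5                 # deliberately too-large starting rate
prev = cost(w)
for _ in range(100):
    w_new = w - lr * grad(w)
    if cost(w_new) < prev:       # step helped: accept it, speed up a bit
        w, prev, lr = w_new, cost(w_new), lr * 1.05
    else:                        # step overshot: reject it, slow down
        lr *= 0.5
print(round(w, 4))
```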
-2
votes
1 answer

Backpropagation algorithm: where did I make a mistake?

Here is my implementation of the BP algorithm. I tested it and found incorrect data after training. So, where did I make a mistake? double OpenNNL::_changeWeightsByBP(double * trainingInputs, double *trainingOutputs, double speed, double…
Robotex
  • 1,064
  • 6
  • 17
  • 41
-3
votes
1 answer

How does backpropagation work for multiple hidden layers?

I have implemented the required equations for backpropagation for the output layer, but for the hidden layers I am getting really confused with the chain rule. When the number of hidden layers is larger, it gets more confusing. How to ease out hidden…
-3
votes
1 answer

Why is my neural network stagnating around a certain cost?

I am making a neural network that is supposed to be capable of identifying handwritten digits using the MNIST database, downloadable here. The network works perfectly with one to five examples, but after 10 it starts to get a bit iffy. Using a standard…