Questions tagged [backpropagation]

Backpropagation is a method of computing gradients, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
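The description above can be made concrete with a minimal sketch of "backward propagation of errors": a tiny 2-2-1 sigmoid network trained on XOR with squared error and plain gradient descent. All choices here (layer sizes, seed, learning rate, epoch count) are arbitrary and purely illustrative.

```python
import numpy as np

# A minimal sketch of backpropagation for a 2-2-1 network trained on XOR
# with sigmoid units and squared error. Sizes, seed, learning rate, and
# epoch count are arbitrary illustrative choices.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # the "teacher" outputs

W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr, losses = 1.0, []
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output
    err = out - y
    losses.append(float((err ** 2).mean()))

    # Backward pass: propagate the error from the output toward the input.
    d_out = err * out * (1 - out)        # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)   # delta at the hidden layer

    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```

The "teacher" here is the target vector `y`: the error is defined only because the desired output for every training input is known in advance.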

1267 questions
0
votes
3 answers

When do I need to update weights in a MultiLayer Perceptron?

I'm researching MultiLayer Perceptrons, a kind of neural network. When reading about the back-propagation algorithm, I see some authors suggest updating the weights immediately after computing all errors for a specific layer, while other authors…
Mr Rivero
  • 1,248
  • 7
  • 17
0
votes
1 answer

Backpropagation problems

I want to make a financial forecasting application that implements the backpropagation algorithm. In this case I am using XOR as an example. I want to find the best weights for the input, bias, and output, and I have done so. But when at…
Ugy Astro
  • 357
  • 3
  • 6
  • 16
0
votes
2 answers

Using Back Propagation Learning in AForge

I'm new to neural networks and I'm using the AForge neural network library for a character recognition task. I want to use back propagation to train my network. Here's the code given in the AForge documentation. // initialize input and output…
kinath_ru
  • 4,488
  • 3
  • 21
  • 26
0
votes
1 answer

AForge.NET - Backpropagation learning always returns values [-1;1]

I have a problem with backpropagation learning using AForge.NET - Neuro Learning - Backpropagation. I am trying to implement a neural network as in the samples (Approximation). My problem is this: 1. input vector {1,2,3,...,19,20} 2. output…
0
votes
1 answer

Python (3.23) back-propagation implementation, type error on list

I get a type error when trying to run back propagation on a neural net while training it on the 'and' pattern. Just to be clear, I'm not requesting anyone read or review my code. I'm just giving a bunch of it, because I'm not really…
0
votes
1 answer

How to select the number of input layers, hidden layers, and output layers using newff in Matlab?

I am using newff for a stock price forecasting project. I am trying to set up a back-propagation feed-forward ANN with 4 inputs, 1 hidden layer, and 1 output layer (4-1-1). I have read many forums to learn how to correctly specify these parameters for…
Shoaib Khan
  • 1
  • 1
  • 1
0
votes
2 answers

How to implement the back propagation algorithm for the following input/output?

I would like to implement a back propagation algorithm in python or C++ for the following input [[11, 15], [22, 17]] [[8, 11], [23, 19]] [[9, 14], [25, 22]] [[6, 9], [17, 13]] [[2, 6], [29, 25]] [[4, 8], [24, 20]] [[11, 15], [27, 24]] [[8, 11], [31,…
0
votes
1 answer

Output value of neural network does not reach the desired values

I made a neural network with back propagation. It has 5 nodes in the input layer, 6 nodes in the hidden layer, and 1 node in the output layer, with random weights, and I use sigmoid as the activation function. I have two sets of data for input. For example…
Arash
  • 3,013
  • 10
  • 52
  • 74
0
votes
2 answers

How to correctly export the weight and bias values of a backpropagation neural network into another programming language (Java)

I created a backpropagation neural network using Matlab. I tried to implement an XOR gate in Matlab, then to get its weights and biases to create the neural network in Java. The network consists of 2 input neurons, 2 hidden layers each using 2 neurons, and 1 output…
justmyfreak
  • 1,260
  • 3
  • 16
  • 30
0
votes
2 answers

Backpropagation learning fails to converge

I use a neural network with 3 layers for a categorization problem: 1) ~2k neurons 2) ~2k neurons 3) 20 neurons. My training set consists of 2 examples; most of the inputs in each example are zeros. For some reason, after the backpropagation training…
Natalia
  • 554
  • 1
  • 6
  • 16
0
votes
1 answer

Parallelizing dynamic arrays

This is part of back-propagation algorithm code for neural networks. In our case we want to parallelize the for( pt=0; pt< N_PT_pair; pt++) loop; the for(epoch=0; epoch< MaxEpoch; epoch++) loop cannot be parallelized. initialize W1[ ] [ ] and W2[ ][ ]…
-1
votes
2 answers

Back propagation algorithm

I found an example online which contains a method that back-propagates the error and adjusts the weights. I was wondering how exactly this works and what weight-update algorithm is used. Could it be gradient descent? /** * all output…
-1
votes
0 answers

Neural network backpropagation dloss/dactivation in hidden layer

I’m trying to create a neural network using numpy to get a better understanding of it. I am trying to calculate dL/da in the hidden layer after the output layer. I wrote a 2-4-4-2 nn and found the formula dL/da_hidden = dL/da_nextLayer *…
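The quantity this question is after is the standard chain-rule step of backpropagation: the loss gradient with respect to a hidden layer's activations is the next layer's deltas propagated back through that layer's weight matrix. A small sketch with hypothetical names and shapes (assuming the convention `z_next = a_hidden @ W_next`, and that `delta_next` is dL/dz of the next layer, already computed):

```python
import numpy as np

# Hypothetical shapes for one hidden layer of a 2-4-4-2 network.
rng = np.random.default_rng(1)
a_hidden = rng.random(4)            # activations of this hidden layer
W_next = rng.normal(size=(4, 4))    # weights from this layer to the next
delta_next = rng.normal(size=4)     # dL/dz of the next layer

# dL/da for this hidden layer: each unit's gradient is the sum of the
# next layer's deltas weighted by its outgoing connections.
dL_da_hidden = W_next @ delta_next

# To continue the backward pass, multiply by this layer's activation
# derivative; for sigmoid that is a * (1 - a).
dL_dz_hidden = dL_da_hidden * a_hidden * (1 - a_hidden)
```

With a different weight-layout convention (`z_next = W_next @ a_hidden`) the transpose appears instead: `dL_da_hidden = W_next.T @ delta_next`.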
-1
votes
0 answers

Please help confirm my understanding of artificial neural networks

I have been hard at work learning how to write a custom neural network without a library of any sort. It is a fairly complicated concept, but I have read enough and watched enough videos to think I get it; however, my network is not…
-1
votes
0 answers

How to turn the softmax Jacobian matrix into a vector gradient so it can be used in backpropagation?

I am learning how to differentiate the softmax function, using the article: https://towardsdatascience.com/derivative-of-the-softmax-function-and-the-categorical-cross-entropy-loss-ffceefc081d1 So in the example the (4x1) matrix is turned into…
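The general answer to this kind of question is a Jacobian–vector product: multiplying the softmax Jacobian by the upstream gradient collapses the matrix into the vector gradient that backpropagation needs. A sketch with a hypothetical 4-element input and upstream gradient (neither comes from the linked article):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # shift for numerical stability
    return e / e.sum()

z = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical (4x1) input
s = softmax(z)
g = np.array([0.1, -0.2, 0.3, 0.5])  # hypothetical upstream gradient dL/ds

# Full Jacobian: J[i, j] = s[i] * (delta_ij - s[j]).
J = np.diag(s) - np.outer(s, s)
grad_full = J @ g                    # the vector gradient dL/dz

# Equivalent shortcut that never builds J:
grad_fast = s * (g - g @ s)
```

When the loss is categorical cross-entropy with a one-hot target `t`, the whole product collapses further to the well-known `s - t`, which is why most implementations never form the Jacobian explicitly.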