Questions tagged [backpropagation]

Backpropagation is a method of gradient computation, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
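For readers new to the tag, here is a minimal sketch of the idea in NumPy: one hidden layer, sigmoid activations, mean squared error, biases omitted for brevity. All sizes and names are illustrative, not taken from any particular question.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy data: 4 examples, 2 input features, 1 target each.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(2, 3))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output weights
    lr = 0.5

    for epoch in range(5000):
        # Forward pass.
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)

        # Backward pass: propagate the output error towards the input.
        err_out = (out - y) * out * (1 - out)        # error at the output layer
        err_hid = (err_out @ W2.T) * h * (1 - h)     # error pushed back to the hidden layer

        # Gradient-descent weight updates.
        W2 -= lr * h.T @ err_out
        W1 -= lr * X.T @ err_hid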

1267 questions
0
votes
1 answer

Python Backpropagation - How to Initialize the starting activation?

I am having some trouble implementing this backprop network. I don't really understand how to start, because in this network my first layer only has 8 nodes, but my prompt gives me 10 in the training set. In the first group, for example,…
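If the confusion is between the number of input nodes and the number of training examples, a common reading (an assumption, since the prompt itself is not shown) is that each of the 10 examples has 8 features, so the starting activation of the input layer is simply one example's feature vector:

    import numpy as np

    # Hypothetical shapes: 10 training examples, each with 8 features.
    training_set = np.random.rand(10, 8)

    for example in training_set:
        a0 = example          # starting activation of the 8 input nodes
        # ... forward pass and backprop for this single example go here ...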
0
votes
1 answer

How to replace a pixel value when using the minMaxLoc function

I am trying to select a 3x3 non-overlapping region of interest from an image, then select the maximum of that 3x3 region and process it. After processing, I want to save the new processed value at the original image pixel location from where the…
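One way to do this (a sketch, assuming a single-channel image; the file names and the process() step are placeholders) is to walk the image in 3x3 strides, find each block's maximum with cv2.minMaxLoc, and write the processed value back at that pixel's original coordinates:

    import cv2
    import numpy as np

    img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)  # single-channel image

    def process(value):
        # Placeholder for whatever processing the question applies.
        return value // 2

    for y in range(0, img.shape[0] - 2, 3):        # non-overlapping 3x3 blocks
        for x in range(0, img.shape[1] - 2, 3):
            block = img[y:y + 3, x:x + 3]
            _, max_val, _, max_loc = cv2.minMaxLoc(block)
            # max_loc is (col, row) inside the block; map back to image coordinates.
            img[y + max_loc[1], x + max_loc[0]] = int(process(max_val))

    cv2.imwrite("output.png", img)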
0
votes
1 answer

How many backpropagation passes on each set of inputs are required?

In a neural network, how many passes over each input should I carry out?
Wightboy
  • 251
  • 1
  • 12
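The usual practice (a sketch of the common pattern, not a universal rule; network, forward and backprop are placeholders for the asker's own code) is one backward pass per example per epoch, repeated for many epochs over the whole training set until a stopping criterion is met:

    import random

    def train(network, training_set, epochs=100):
        for epoch in range(epochs):
            random.shuffle(training_set)       # shuffle between epochs
            for inputs, target in training_set:
                network.forward(inputs)        # one forward pass ...
                network.backprop(target)       # ... and one backward pass per example
        return network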
0
votes
1 answer

Standardising the Training Set in Backpropagation

If I standardise the training data before I train the neural network, do I then de-standardise the training data after training and feed it back into the network to show the final modelled and expected results? Or do I…
obsessiveCookie
  • 1,130
  • 2
  • 18
  • 33
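A common arrangement (a sketch, assuming both inputs and targets are standardised with training-set statistics; the data here is invented) is to keep the training means and standard deviations, feed standardised inputs to the network, and de-standardise only the network's outputs for reporting, never the data itself:

    import numpy as np

    X_train = np.random.rand(100, 3)           # placeholder training inputs
    y_train = np.random.rand(100, 1)           # placeholder training targets

    x_mean, x_sd = X_train.mean(axis=0), X_train.std(axis=0)
    y_mean, y_sd = y_train.mean(), y_train.std()

    X_scaled = (X_train - x_mean) / x_sd        # what the network is trained on
    y_scaled = (y_train - y_mean) / y_sd

    # ... train the network on (X_scaled, y_scaled) ...

    # At prediction time: standardise new inputs with the SAME training statistics,
    # then map the network's outputs back to the original scale.
    def predict_original_scale(output_scaled):
        return output_scaled * y_sd + y_mean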
0
votes
1 answer

Multilayer perceptron with target variable as array instead of a single value

I am new to deep learning and have been trying to use the Theano library to train my data. The MLP tutorial here has a scalar output value, while my use case has an array with a 1 at the position corresponding to the output value. For example (assume…
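If the question is only how to express the scalar target as such an array, the usual approach is one-hot encoding; a minimal NumPy sketch (the class labels here are invented):

    import numpy as np

    labels = np.array([2, 0, 3, 1])             # scalar class labels, as in the tutorial
    n_classes = 4

    one_hot = np.zeros((labels.size, n_classes))
    one_hot[np.arange(labels.size), labels] = 1
    # one_hot is now
    # [[0. 0. 1. 0.]
    #  [1. 0. 0. 0.]
    #  [0. 0. 0. 1.]
    #  [0. 1. 0. 0.]]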
0
votes
0 answers

Python Backpropagation: No output value

So I'm trying to work on backpropagation right now, but for some reason I'm getting an output and an activation, yet nothing for the output value. Right now I'm just working with a one-layer network, just something simple, but it doesn't seem to do…
0
votes
2 answers

How to decide on an activation function in a neural network

I am using a feedforward, backpropagation, multilayer neural network, and I am using a sigmoid activation function with a range of -1 to 1. But the minimum error is not going below 5.8, and I want it much lower; you can see the output…
lkkkk
  • 1,999
  • 4
  • 23
  • 29
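For reference, the standard logistic sigmoid maps to (0, 1), while tanh maps to (-1, 1); the "-1 to 1 sigmoid" in the question is usually tanh or a bipolar sigmoid. A small sketch comparing the two and their derivatives:

    import numpy as np

    def sigmoid(z):               # range (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    def tanh(z):                  # range (-1, 1); the "bipolar" choice
        return np.tanh(z)

    def sigmoid_prime(z):
        s = sigmoid(z)
        return s * (1 - s)

    def tanh_prime(z):
        return 1 - np.tanh(z) ** 2

    z = np.linspace(-5, 5, 11)
    print(sigmoid(z).min(), sigmoid(z).max())   # stays inside (0, 1)
    print(tanh(z).min(), tanh(z).max())         # stays inside (-1, 1)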
0
votes
2 answers

How to automate testing in Weka?

My C# program generates both training and testing data. I need to use a backpropagation neural network / multilayer perceptron in the Weka GUI for classification and testing. Currently I'm supplying the testing data manually. As my C# program generates test…
Aswini V
  • 3
  • 4
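One way to remove the manual step (a sketch, assuming the C# program writes ARFF files and that weka.jar is available; all file names are placeholders) is to skip the GUI and drive Weka's command-line interface, for example from a small Python wrapper:

    import subprocess

    # -t: training file, -T: test file (standard Weka classifier options).
    cmd = [
        "java", "-cp", "weka.jar",
        "weka.classifiers.functions.MultilayerPerceptron",
        "-t", "train.arff",
        "-T", "test.arff",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)     # evaluation summary printed by Weka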
0
votes
2 answers

Early stopping in a neural network using a validation set

I want to use the early stopping method to avoid overfitting in a neural network. I have divided my dataset 60-20-20: 60% training, 20% validation, 20% test. I have a doubt while implementing early stopping: we update weights for one epoch…
alex
  • 1,421
  • 1
  • 16
  • 19
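A typical early-stopping loop (a sketch; train_one_epoch, validation_error and weights are placeholders for the asker's own network code) trains one epoch at a time, checks the validation error after each epoch, and keeps the best weights seen so far:

    import copy

    def train_with_early_stopping(network, train_set, val_set,
                                  max_epochs=1000, patience=10):
        best_error = float("inf")
        best_weights = None
        epochs_without_improvement = 0

        for epoch in range(max_epochs):
            network.train_one_epoch(train_set)          # weight updates use the training set only
            error = network.validation_error(val_set)   # the validation set never updates weights

            if error < best_error:
                best_error = error
                best_weights = copy.deepcopy(network.weights)
                epochs_without_improvement = 0
            else:
                epochs_without_improvement += 1
                if epochs_without_improvement >= patience:
                    break                                # validation error stopped improving

        network.weights = best_weights
        return network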
0
votes
0 answers

Tanh activation function giving higher error and worse output than sigmoid one

I implemented the tanh function as my activation function, but the result is somehow worse than with a sigmoid activation function. Moreover, while checking the error, I see that it goes up and down again and again. Here are…
wendy0402
  • 495
  • 2
  • 5
  • 16
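One frequent cause (an assumption, since the code is not shown) is that the targets and the derivative are left on the sigmoid's (0, 1) scale; with tanh the targets should lie in (-1, 1) and the derivative is 1 - tanh(z)^2:

    import numpy as np

    def tanh_prime(z):
        return 1 - np.tanh(z) ** 2        # NOT s * (1 - s), which is the sigmoid derivative

    # Targets originally coded for a sigmoid output in (0, 1) ...
    targets_01 = np.array([0.0, 1.0, 1.0, 0.0])
    # ... should be rescaled to (-1, 1) for a tanh output unit.
    targets_tanh = 2 * targets_01 - 1     # -> [-1., 1., 1., -1.]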
0
votes
0 answers

Output of neural network backpropagation not accurate

Recently, my partner and I developed a chord recognition tool using a neural network for research. For input, we are using the results from a pitch class profile: there are 12 inputs, one representing each pitch class. The output has 5 nodes. We…
wendy0402
  • 495
  • 2
  • 5
  • 16
0
votes
2 answers

Determining the number of hidden neurons in a neural network

How do we select the number of neurons for the hidden layer (backpropagation network)? Is there any hard-and-fast rule for selecting the number of hidden neurons? I found that it should be nearly equal to the square root of (no_input_neurons * no_output_neurons) in…
Ayam
  • 43
  • 7
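The rule mentioned in the question is only a heuristic, not a hard rule, but it is easy to compute; a sketch with made-up layer sizes:

    import math

    n_inputs, n_outputs = 12, 5                       # example sizes, not from the question
    n_hidden = round(math.sqrt(n_inputs * n_outputs))
    print(n_hidden)                                   # sqrt(60) ~ 7.75 -> 8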
0
votes
1 answer

Error Backpropagation - Neural network

I am trying to write code for error backpropagation for a neural network, but my code is taking a really long time to execute. I know that training a neural network takes a long time, but mine is taking a long time even for a single iteration, as…
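A single iteration that is slow is often a sign of per-element Python loops; the usual fix (a sketch, with invented layer sizes) is to vectorise each layer's forward and backward step as matrix products in NumPy:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.random((1000, 30))        # 1000 examples, 30 features
    W = rng.random((30, 20))          # 30 inputs -> 20 hidden units

    # Slow: explicit loops over examples and units, roughly
    # for i in range(1000):
    #     for j in range(20):
    #         h[i, j] = sum(X[i, k] * W[k, j] for k in range(30))

    # Fast: one matrix product covers the whole batch.
    h = 1.0 / (1.0 + np.exp(-(X @ W)))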
0
votes
0 answers

Deep neural network final output neurons stop at an intermediate point and do not move towards the desired target

I hope you are all well. I have two questions. 1) In my deep network, my desired target output is [1,0] for class 1 and [0,1] for class 2. However, after thousands of epochs (2000, 3000) it reaches an MSE of 0.234 at best, and then it almost stays there…
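One direction worth checking (a suggestion, not a diagnosis of the posted network) is the loss: with [1,0]/[0,1] targets, MSE on saturating outputs plateaus easily, and a softmax output with cross-entropy loss usually gives cleaner gradients. A sketch of that output layer with invented pre-activations:

    import numpy as np

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)       # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    logits = np.array([[2.0, -1.0], [0.3, 0.8]])   # invented pre-activations, 2 samples
    targets = np.array([[1, 0], [0, 1]])           # [1,0] for class 1, [0,1] for class 2

    probs = softmax(logits)
    cross_entropy = -np.sum(targets * np.log(probs)) / len(targets)
    grad_logits = (probs - targets) / len(targets)  # gradient at the output layer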
0
votes
0 answers

Neural Networks: Online learning (one-example-at-a-time)

I have a neural network that performs a classification task, and it works fairly well when the training set is large enough. However, I'm looking for a way to train the NN with one labelled example at a time. That is, I intercept data, one example…
user8472
  • 726
  • 1
  • 8
  • 16
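Training on one labelled example at a time is exactly what stochastic gradient descent does; scikit-learn exposes this as partial_fit on MLPClassifier (a sketch, with invented data and labels, and the class set declared up front as the API requires):

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    clf = MLPClassifier(hidden_layer_sizes=(10,))
    classes = np.array([0, 1])                 # all classes must be declared on the first call

    rng = np.random.default_rng(0)
    for _ in range(200):                       # one intercepted, labelled example at a time
        x = rng.random((1, 4))
        y = np.array([int(x[0, 0] > 0.5)])     # invented labelling rule
        clf.partial_fit(x, y, classes=classes)

    print(clf.predict(rng.random((3, 4))))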