Questions tagged [backpropagation]

Backpropagation is a method of gradient computation, often used in artificial neural networks to perform gradient descent. It led to a “renaissance” in the field of artificial neural network research.

In most cases, it requires a teacher that knows, or can calculate, the desired output for any input in the training set. The term is an abbreviation for "backward propagation of errors".
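
As a rough, editor-added sketch of the idea (the toy data, layer sizes, and variable names are all invented, not taken from any question below): backpropagation pushes the output error backwards through the layers to obtain the gradient of the loss with respect to every weight, and gradient descent then uses those gradients to update the weights.

    import numpy as np

    # Toy problem: learn XOR with a 2-4-1 network.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
    W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    lr = 0.5

    for _ in range(10000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Backward pass: propagate the error layer by layer.
        d_out = (out - y) * out * (1 - out)   # squared-error loss, sigmoid output
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Gradient-descent update of the weights.
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0, keepdims=True)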

1267 questions
-1
votes
1 answer

Neural Network playing Tic Tac Toe doesn't learn

I have a neural network playing tic-tac-toe. (I know there are better methods for this, but I want to learn about NNs.) So the NN plays against a random AI. First, it should learn to make an allowed move, i.e. not choosing a field that is already…
Dan
  • 13
  • 3
-1
votes
1 answer

Training a neural network

I am trying to train a neural network to play a game with a snake chasing a target. It's my first attempt at training a neural network. I am using the Encog framework in Java with backpropagation. To create the training set I record the movements the user…
Student
  • 47
  • 1
  • 7
-1
votes
1 answer

How do I get backpropagation to work for an MLP? (MATLAB)

I am trying to get an MLP to work. My goal is to get the net to predict the output Yt when given Yt-1, Yt-2, ..., Yt-10. I've been using a generated dataset, which should be no trouble. My net always outputs a straight line and will shift that line up…
-1
votes
1 answer

How do I create a backpropagation neural network that has different kinds of output?

I'm sorry, I've only just learned about neural networks and do not yet understand their implementation. Suppose I want to make a backpropagation neural network that accepts multiple real numbers as input and produces two types of output, which…
-1
votes
1 answer

Approximation of best settings for a neural network?

I am a programming enthusiast, so please excuse me and help fill any gaps. From what I understand, good results from a neural network require the sigmoid and either the learning rate or step rate (depending on the training method) to be set correctly, along with…
Catnaps909
  • 117
  • 4
-1
votes
2 answers

Neural Network Training Methodology

I need some help regarding the training of a neural network. To give you the background, I have trained and tested my neural network on AND and OR and it seems to work fine. FYI, I am using a back-propagation neural network. So coming to the problem, I want to…
KungFu_Panda
  • 111
  • 11
-1
votes
1 answer

Using a single weight matrix for Back-Propagation in Neural Networks

In my neural network I have combined all of the weight matrices into one large matrix: e.g. a 3-layer network usually has 3 weight matrices W1, W2, W3, one for each layer. I have created one large weight matrix called W, where W2 and W3 are appended…
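
One common way to keep a single weight object for every layer (a sketch only; the asker's exact layout is cut off above, so the shapes and the pack/unpack helpers here are hypothetical) is to flatten each layer's matrix into one long parameter vector and slice the per-layer matrices back out before each forward and backward pass.

    import numpy as np

    # Hypothetical per-layer weights for a 4-8-8-2 network.
    W1, W2, W3 = np.random.randn(4, 8), np.random.randn(8, 8), np.random.randn(8, 2)
    shapes = [W.shape for W in (W1, W2, W3)]

    def pack(*mats):
        # Flatten every layer's weights into one long vector.
        return np.concatenate([m.ravel() for m in mats])

    def unpack(theta, shapes):
        # Recover the per-layer matrices for the forward/backward pass.
        mats, i = [], 0
        for r, c in shapes:
            mats.append(theta[i:i + r * c].reshape(r, c))
            i += r * c
        return mats

    W = pack(W1, W2, W3)                  # the single combined "weight matrix"
    W1b, W2b, W3b = unpack(W, shapes)     # identical to the originals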
-1
votes
1 answer

Weights of layers in the backpropagation algorithm

I searched the internet a lot but could not work out why we use weights in each layer of the backpropagation algorithm. I know that the weights are multiplied by the output of the previous layer to get the input of the next layer,…
ark
  • 41
  • 1
  • 1
  • 6
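
The relation described in that excerpt, written out as a tiny editor-added sketch (the layer sizes and values are made up): a layer's input, the pre-activation, is the previous layer's output multiplied by that layer's weight matrix plus a bias, and backpropagation later computes how the loss changes with respect to exactly those weights.

    import numpy as np

    a_prev = np.array([0.2, 0.7, 0.1])   # output of the previous layer (3 units)
    W = np.random.randn(2, 3)            # weights connecting it to a 2-unit layer
    b = np.zeros(2)

    z = W @ a_prev + b                   # "weights multiplied by the previous output"
    a_next = 1.0 / (1.0 + np.exp(-z))    # the next layer's output after a sigmoid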
-1
votes
1 answer

Different methods of weight changes in the backpropagation algorithm?

I've been using this tutorial as my reference for coding backpropagation. But today I found another tutorial that uses the same reference as mine but takes a different approach to changing the synapse weights. What's different about the two of…
Pattisahusiwa
  • 205
  • 2
  • 8
-2
votes
2 answers

How to design a neural network that fits a function?

How to "design a back propagation neural network which can fit the function y = 9x + 3x^ + 8x^3 + 2x^4 + 2 with 1 input, 1 output, 1 hidden layer with 4 neurons."?
-2
votes
1 answer

function missing 2 required positional arguments: 'X_train' and 'y_train'

I'm writing Python in a Jupyter notebook, following a reference, and I'm getting this kind of error: TypeError Traceback (most recent call last) in 1 # Section II: First run the…
anisagml
  • 1
  • 1
  • 2
-2
votes
1 answer

Is Gradient Descent always used during backpropagation for updating weights?

Gradient descent, RMSprop and Adam are optimizers. Assume I have chosen the Adam or RMSprop optimizer while compiling a model, i.e. model.compile(optimizer = "adam"). My doubt is: during backpropagation, is gradient descent used for updating the weights…
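
A sketch of the distinction that question is after, using the standard TensorFlow/Keras training-step pattern (the tiny model and data here are invented): backpropagation is what computes the gradients, while the chosen optimizer, Adam in this case rather than plain gradient descent, decides how those gradients are turned into weight updates.

    import tensorflow as tf

    # Invented toy model and data, just to make the two roles explicit.
    model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                                 tf.keras.layers.Dense(1)])
    optimizer = tf.keras.optimizers.Adam()
    loss_fn = tf.keras.losses.MeanSquaredError()

    x = tf.random.normal((8, 3))
    y = tf.random.normal((8, 1))

    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x))

    # Backpropagation: compute dLoss/dWeight for every trainable weight.
    grads = tape.gradient(loss, model.trainable_variables)

    # Optimizer step: Adam (not plain gradient descent) turns those gradients
    # into the actual weight update.
    optimizer.apply_gradients(zip(grads, model.trainable_variables))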
-2
votes
1 answer

Where is the mistake in this neural network implementation?

I recently started to learn deep learning and tried to write forward and backward propagation from scratch, but I think there is a problem with my code. I know it is hard to find the problem, and you may find it silly to code from the…
-2
votes
1 answer

Receiving AttributeError: 'tuple' object has no attribute 'T'

I am new to this website, so sorry if I am not doing this right, but I have a problem. What should I do to fix this? import numpy as np X = (([0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0])) h = (([0, 0], [0, 1], [1, 0], [1,…
hacquerqop
  • 7
  • 1
  • 4
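
For what it's worth, the error in that excerpt is reproducible with plain tuples (a sketch of the likely cause, not the asker's full code): a tuple of lists has no .T attribute, while a NumPy array does, so converting the data with np.array is the usual fix.

    import numpy as np

    X = ([0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0])
    # X.T  ->  AttributeError: 'tuple' object has no attribute 'T'

    X = np.array(X)       # an ndarray does have a transpose
    print(X.T.shape)      # (4, 4)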
-2
votes
1 answer

'GoogLeNet' object has no attribute 'features'

I am trying to use this module (https://github.com/utkuozbulak/pytorch-cnn-visualizations) to visualize what the network looks at in my images, and I edited it a little to suit my needs. However, I get an error and I am not able to solve this…