Questions tagged [mlp]

Tag for questions about multilayer perceptrons (MLPs), a class of feedforward artificial neural network. Multilayer perceptrons are sometimes colloquially referred to as "vanilla" neural networks, especially when they have a single hidden layer.

275 questions
1
vote
1 answer

How to train a model for XOR using scikit-learn?

Is there a magic set of parameters that allows the model to predict correctly on data it hasn't seen before? from sklearn.neural_network import MLPClassifier clf = MLPClassifier( activation='logistic', …
haalcala
  • 63
  • 5
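A minimal sketch of a configuration that usually fits the four-point XOR truth table (not the asker's exact parameters; note that XOR has only four possible inputs, so "unseen data" is limited to those same four points):

from sklearn.neural_network import MLPClassifier

# XOR is not linearly separable, so at least one hidden layer is required.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

clf = MLPClassifier(
    hidden_layer_sizes=(4,),
    activation='tanh',
    solver='lbfgs',        # full-batch solver; tends to behave better than SGD on 4 samples
    max_iter=10000,
    random_state=0,
)
clf.fit(X, y)
print(clf.predict(X))      # expected: [0 1 1 0]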
1
vote
1 answer

Steps to follow before feeding a time series dataset in to neural network in R

Hi, I have a time series dataset as below. When it is plotted, this is what I observed. I'm asked to forecast the time series in this dataset, but I'm not sure what preprocessing steps need to be done before feeding it to a NN. Do I…
user3789200
  • 1,166
  • 2
  • 25
  • 45
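The question uses R, but the usual preprocessing steps are framework-agnostic; a sketch in Python with hypothetical data, assuming a univariate series fed to an MLP:

import numpy as np

series = np.sin(np.linspace(0, 20, 200))   # hypothetical stand-in for the dataset

# 1. Scale to a bounded range so the activations do not saturate.
lo, hi = series.min(), series.max()
scaled = (series - lo) / (hi - lo)

# 2. Convert the series into supervised (lag window -> next value) pairs.
def make_windows(x, n_lags=5):
    X = np.array([x[i:i + n_lags] for i in range(len(x) - n_lags)])
    y = x[n_lags:]
    return X, y

X, y = make_windows(scaled)
print(X.shape, y.shape)                    # (195, 5) (195,)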
1
vote
0 answers

How to clip layer output in MLP with `tf.keras.activations.relu()`?

According to the documentation, tf.keras.activations.relu(x, alpha=0.0, max_value=None, threshold=0) seems to clip x within [threshold, max_value], but x must be specified. How can I use it to clip the output of a layer in a neural network? Or is…
Paw in Data
  • 1,262
  • 2
  • 14
  • 32
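Two ways this is commonly done in TF 2.x (a sketch; the layer sizes are arbitrary): wrap the functional form so Keras supplies x, or use the ReLU layer, which exposes the same max_value argument.

import tensorflow as tf

clip6 = lambda x: tf.keras.activations.relu(x, max_value=6.0)   # Keras passes the layer output as x

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=clip6, input_shape=(10,)),
    tf.keras.layers.Dense(64),
    tf.keras.layers.ReLU(max_value=6.0),    # equivalent clipping as a standalone layer
    tf.keras.layers.Dense(1),
])
model.summary()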
1
vote
0 answers

Why does MLP have better performance compared to CNN and LSTM?

I am comparing the performance of an MLP, a CNN, and an LSTM in the classification of single-word speech. The performance of the MLP is roughly equal to that of the CNN; however, it performs better than the LSTM. Can anyone tell me the reason? The dataset…
MonoLiza
  • 21
  • 2
1
vote
1 answer

How to translate an MLP neural network from TensorFlow to PyTorch

I have built an MLP neural network using TensorFlow, as follows: model_mlp=Sequential() model_mlp.add(Dense(units=35, input_dim=train_X.shape[1], kernel_initializer='normal', activation='relu')) model_mlp.add(Dense(units=86,…
will_cheuk
  • 379
  • 3
  • 12
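A rough PyTorch equivalent of the two Dense layers visible in the excerpt (the rest of the Keras model is truncated, and n_features stands in for train_X.shape[1]):

import torch
import torch.nn as nn

n_features = 10                 # placeholder for train_X.shape[1]

model_mlp = nn.Sequential(
    nn.Linear(n_features, 35),  # Dense(units=35, input_dim=n_features)
    nn.ReLU(),                  # activation='relu' becomes a separate module
    nn.Linear(35, 86),          # Dense(units=86, ...)
    nn.ReLU(),
)

x = torch.randn(4, n_features)
print(model_mlp(x).shape)       # torch.Size([4, 86])

kernel_initializer='normal' has no keyword equivalent in nn.Linear; if it matters, apply torch.nn.init.normal_ to each layer's weight after construction.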
1
vote
1 answer

Keras layer shape incompatibility for a small MLP

I have a simple MLP built in Keras. The shapes of my inputs are: X_train.shape - (6, 5), Y_train.shape - 6. Create the model: model = Sequential() model.add(Dense(32, input_shape=(X_train.shape[0],),…
Qubix
  • 4,161
  • 7
  • 36
  • 73
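The likely culprit is that input_shape uses the sample count (X_train.shape[0] == 6) instead of the feature count (X_train.shape[1] == 5); a sketch with stand-in data:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

X_train = np.random.rand(6, 5)   # 6 samples, 5 features
Y_train = np.random.rand(6)

model = Sequential()
model.add(Dense(32, input_shape=(X_train.shape[1],), activation='relu'))   # features, not samples
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
model.fit(X_train, Y_train, epochs=2, verbose=0)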
1
vote
1 answer

How do I use the MLP score function? Error: shapes (295,1) and (7,450) not aligned: 1 (dim 1) != 7 (dim 0)

I have recently begun coding deep neural networks in Python and have been stuck on this problem for weeks. I have checked other similar questions but could not grasp the solution. I have a feedforward neural network and I am trying to obtain…
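The (7, 450) factor in the error looks like the first weight matrix of a network trained on 7 features, so score() is probably receiving an array with the wrong number of columns; a sketch with hypothetical data, assuming scikit-learn's MLPRegressor:

import numpy as np
from sklearn.neural_network import MLPRegressor

X_train = np.random.rand(295, 7)      # 7 features, matching the (7, 450) weight matrix
y_train = np.random.rand(295)

reg = MLPRegressor(hidden_layer_sizes=(450,), max_iter=200).fit(X_train, y_train)

X_bad = X_train[:, :1]                # shape (295, 1)
# reg.score(X_bad, y_train)           # fails: a feature-count / "not aligned" error, depending on sklearn version
print(reg.score(X_train, y_train))    # works: score(X, y) expects X as (n_samples, n_features)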
1
vote
1 answer

Keras: good result with MLP but bad with Bidirectional LSTM

I trained two neural networks with Keras: an MLP and a Bidirectional LSTM. My task is to predict the word order in a sentence, so for each word, the neural network has to output a real number. When a sentence with N words is processed, the N reals…
pairon
  • 427
  • 1
  • 7
  • 18
1
vote
1 answer

Calculate the parameters per model layer for Keras MLP

I am trying to follow this SO post on how the params are calculated for each layer; can anyone give me a tip? Here is the output of my model.summary(). This is the model: model = Sequential() model.add(Dense(60, input_dim=44,…
bbartling
  • 3,288
  • 9
  • 43
  • 88
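For a Dense layer the count is (inputs + 1) * units, the +1 being the bias term; applied to the first layer in the excerpt, Dense(60, input_dim=44), that gives 44 * 60 + 60 = 2700. A sketch that cross-checks with Keras (only the first layer, since the rest of the model is truncated):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

inputs, units = 44, 60
print(inputs * units + units)            # 2700 = weights + biases

model = Sequential()
model.add(Dense(60, input_dim=44, activation='relu'))
model.summary()                          # the Dense layer should report 2700 params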
1
vote
0 answers

MLP in Keras returns always 0.0 loss

I'm implementing a multilayer perceptron with Keras to predict the correct word order in a sentence. I'm using train_on_batch() because I convert each sentence into a tree and then order each local subtree: when each subtree is ordered, the entire…
pairon
  • 427
  • 1
  • 7
  • 18
1
vote
1 answer

sklearn MLPClassifier - zero hidden layers (i.e. logistic regression)

We know that a feedforward neural network with 0 hidden layers (i.e. just an input layer and an output layer) with a sigmoid activation function at the end should be equivalent to logistic regression. I wish to prove this to be true, but I need to…
aranglol
  • 129
  • 1
  • 9
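The claim being verified is just that a zero-hidden-layer network with a sigmoid output computes p = sigmoid(Xw + b), which is exactly the logistic-regression model; a minimal sketch with synthetic data (it checks the functional form using LogisticRegression's fitted weights, not the MLPClassifier internals):

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) + 0.3 > 0).astype(int)

lr = LogisticRegression().fit(X, y)

def forward(X, w, b):
    # forward pass of an MLP with no hidden layers and a sigmoid output unit
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

p_net = forward(X, lr.coef_.ravel(), lr.intercept_[0])
print(np.allclose(p_net, lr.predict_proba(X)[:, 1]))   # True: identical model family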
1
vote
0 answers

MLP problem, ERROR: Found array with dim 3. Estimator expected <= 2

I'm trying to solve a machine learning problem with an MLP. I have a file with many matrices, each with 21 rows and 43 columns, separated by ********, where the first row is a simulation and the next 20 are the response (each column…
mrpickle
  • 11
  • 1
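scikit-learn estimators only accept 2-D (n_samples, n_features) input, so the usual fix for this error is to flatten each 21x43 matrix into one row; a sketch with hypothetical data and targets:

import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.random.rand(100, 21, 43)     # hypothetical stack of matrices parsed from the file
y = np.random.rand(100)

X_2d = X.reshape(X.shape[0], -1)    # (100, 903): one flattened matrix per row
MLPRegressor(max_iter=50).fit(X_2d, y)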
1
vote
0 answers

The gradients are all 0 during backpropagation and the parameters did not change at all

I implemented the policy gradient method to learn an unknown function (a 10-loop sum function here), but the model did not update. The training data consists of the input and the target. func2 includes the MLP model that predicts the target number. The…
1
vote
2 answers

Forecasting using MLP neural network

I'm trying to write code in R to predict the USD/EUR currency rate using an MLP neural network. I'm facing a problem with the function neuralnet; it shows an error: Error in neurons[[i]] %*% weights[[i]] : requires numeric/complex matrix/vector…
1
vote
1 answer

What is wrong with my approach of using MLP to make a chess engine?

I'm making a chess engine using machine learning, and I'm having problems debugging it. I need help figuring out what is wrong with my program, and I would appreciate any suggestions. I did my research and borrowed ideas from multiple successful…
Lowkey
  • 11
  • 2