
I am trying to create a basic NN using MLPClassifier. When I call the method mlp.fit I get the following error:

ValueError: Unknown label type: (array([

Below is my simple code:

import numpy as np
from sklearn.neural_network import MLPClassifier

# df_train and df_test are pandas DataFrames loaded earlier
df_X_train = df_train[["Pe/Pe_nom", "Gas_cons", "PthLoad"]]
df_Y_train = df_train["Eff_Th"]

df_X_test = df_test[["Pe/Pe_nom", "Gas_cons", "PthLoad"]]
df_Y_test = df_test["Eff_Th"]

X_train = np.asarray(df_X_train, dtype="float64")
Y_train = np.asarray(df_Y_train, dtype="float64")
X_test = np.asarray(df_X_test, dtype="float64")
Y_test = np.asarray(df_Y_test, dtype="float64")

mlp = MLPClassifier(hidden_layer_sizes=(100,), verbose=True)
mlp.fit(X_train, Y_train)

Actually, I don't understand why the fit method does not accept the float type of X_train and Y_train.
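
A quick way to check how scikit-learn interprets these labels (a diagnostic sketch, not part of my original script):

from sklearn.utils.multiclass import type_of_target

# For float targets such as Eff_Th this prints 'continuous', which
# the LabelBinarizer used internally by MLPClassifier rejects.
print(type_of_target(Y_train))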

Just to make everything clear, the matrix dimensions are below:

X_train.shape --> (720, 3)
Y_train.shape --> (720,)

I hope I asked this in the correct way, thank you.

Below is the complete error:

> ---------------------------------------------------------------------------
> ValueError                                Traceback (most recent call last)
> <ipython-input-6-2efb224ab852> in <module>()
>       2 
>       3 mlp = MLPClassifier(hidden_layer_sizes=(100,), verbose=True)
> ----> 4 mlp.fit(X_train, Y_train)
>       5 
>       6 #y_pred_train = mlp.predict(X_train)
> 
> C:\ProgramData\Anaconda3\lib\site-packages\sklearn\neural_network\multilayer_perceptron.py
> in fit(self, X, y)
>     971         """
>     972         return self._fit(X, y, incremental=(self.warm_start and
> --> 973                                             hasattr(self, "classes_")))
>     974 
>     975     @property
> 
> C:\ProgramData\Anaconda3\lib\site-packages\sklearn\neural_network\multilayer_perceptron.py
> in _fit(self, X, y, incremental)
>     329                              hidden_layer_sizes)
>     330 
> --> 331         X, y = self._validate_input(X, y, incremental)
>     332         n_samples, n_features = X.shape
>     333 
> 
> C:\ProgramData\Anaconda3\lib\site-packages\sklearn\neural_network\multilayer_perceptron.py
> in _validate_input(self, X, y, incremental)
>     914         if not incremental:
>     915             self._label_binarizer = LabelBinarizer()
> --> 916             self._label_binarizer.fit(y)
>     917             self.classes_ = self._label_binarizer.classes_
>     918         elif self.warm_start:
> 
> C:\ProgramData\Anaconda3\lib\site-packages\sklearn\preprocessing\label.py
> in fit(self, y)
>     282 
>     283         self.sparse_input_ = sp.issparse(y)
> --> 284         self.classes_ = unique_labels(y)
>     285         return self
>     286 
> 
> C:\ProgramData\Anaconda3\lib\site-packages\sklearn\utils\multiclass.py
> in unique_labels(*ys)
>      94     _unique_labels = _FN_UNIQUE_LABELS.get(label_type, None)
>      95     if not _unique_labels:
> ---> 96         raise ValueError("Unknown label type: %s" % repr(ys))
>      97 
>      98     ys_labels = set(chain.from_iterable(_unique_labels(y) for y in ys))
> 
> ValueError: Unknown label type: (array([1.        , 0.89534884, 0.58139535, 0.37209302, 0.24418605,
>    0.15116279, 0.09302326, 0.23255814, 0.34883721, 0.37209302,
>    ...
>    0.1744186 , 0.11627907, 0.06976744, 0.03488372, 0.15116279]),)
  • Could you please check what `type(X_train)` and `type(Y_train)` returns? Maybe your array is wrapped in a different data type – offeltoffel Jan 09 '19 at 09:32
  • I'm not sure if this is causing your problem, but it might be a problem nevertheless: I think MLPClassifier.fit expects also Y to be rectangular, i.e. in your case have shape (720, 1) and not (720,). You can achieve this easily replacing `df_Y_train = df_train["Eff_Th"]` with `df_Y_train = df_train[["Eff_Th"]]`. – Marco Spinaci Jan 09 '19 at 09:33
  • I think in your question, you wanted to write `Y_train.shape --> (720,)` – Sheldore Jan 09 '19 at 09:34
  • @MarcoSpinaci: That should not cause any trouble, as long as dimensions match. Try `X_train = np.array([[1,2,3],[10,20,30],[100,200,300]])` and `Y_train = np.array([3, 30, 300])` and you'll see that it works. I think the problem is more related to the data types of X_train and Y_train. – offeltoffel Jan 09 '19 at 09:38
  • @offeltoffel Thank you for your answer, both variables X_train and Y_train are dtype('float64') – Pier Jan 09 '19 at 09:41
  • @offeltoffel you're right, I talked too hastily looking at the docs without trying it out myself. Indeed it should work for good numeric data... – Marco Spinaci Jan 09 '19 at 09:42
  • @MarcoSpinaci Hello Marco, thank you. I tried also to reshape Y_train from (720,) to (720,1) but nothing changes. I get the same error. – Pier Jan 09 '19 at 09:42
  • @Pier, could you please add some more details about the error you get, for example the full traceback or at least the full error message? – Marco Spinaci Jan 09 '19 at 09:44
  • Actually, if I convert my matrices from float64 to int, everything works and I don't get the error, but of course the precision is not acceptable. – Pier Jan 09 '19 at 09:45
  • @MarcoSpinaci I added it in the post – Pier Jan 09 '19 at 09:48
  • @Pier look at AI_Learning's answer below, the answer was much easier than what was written here! You just used classifier instead of regressor, classifier needs a discrete output (the class it belongs to), hence it can't accept floating points (if you convert it to integers it interprets e.g. 273 as a completely different class from 274, just as different from -1000 etc.). If you want to predict floats of course you need to use regression not classification, e.g. MLPRegressor... – Marco Spinaci Jan 09 '19 at 09:51

1 Answer


It looks like you need MLPRegressor instead of MLPClassifier, since you say your target variable needs float precision. MLPClassifier expects discrete class labels, while MLPRegressor fits continuous targets.
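
A minimal sketch of the change, reusing the arrays already built in the question (the variable name y_pred_test is illustrative):

from sklearn.neural_network import MLPRegressor

# MLPRegressor fits continuous targets directly, so the float64
# Y_train is accepted without any label binarization step.
mlp = MLPRegressor(hidden_layer_sizes=(100,), verbose=True)
mlp.fit(X_train, Y_train)

y_pred_test = mlp.predict(X_test)  # continuous predictions
print(mlp.score(X_test, Y_test))   # R^2 on the held-out test set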

  • Exactly. Now with the full traceback we can see that the MLPC is trying to learn floating point values as labels. In the original question it looked like the error message ends after `(array([` – offeltoffel Jan 09 '19 at 09:55
  • Thank you to both, with MLPRegressor it is working fine. – Pier Jan 09 '19 at 10:15