
I'm curious whether neural networks (or neurolab in particular) need the target/input data to be in the range [-1, 1]?

I'm trying to train a network to predict water evaporation from my kitchen garden, given these inputs:

  • temperature (C),
  • barometer (mbar),
  • precipitation (mm),
  • wind (m/s) and
  • initial soil moisture (%),

where each row of data represents 1 hour.

The target data is simply the measured change (delta) in soil moisture (%) by the end of the hour. I have collected ~1020 samples - perhaps this isn't enough?

All of these values fall outside the range [-1, 1] (temperature goes down to -5, and the barometer up to 1040). I'm quite new to neural networks - I've only read a bit and done some self-study - but I would have expected it to be fine to use larger numbers?

Code:

import neurolab as nl

# i_data (N x 5 inputs) and t_data (N x 1 targets) are loaded elsewhere
# Min/max ranges for the 5 inputs, in the order they appear in i_data
in_min_max = [
    [0, 100],       # initial soil moisture (%)
    [950, 1050],    # barometric pressure (mbar)
    [0, 40],        # precipitation (mm)
    [0, 100],       # wind (m/s)
    [-5, 40]        # temperature (C)
]
# Create a feed-forward net with 5 inputs, 5 hidden neurons and 1 output neuron
net = nl.net.newff(in_min_max, [5, 1])
net.trainf = nl.train.train_gd   # plain gradient descent
error = net.train(i_data, t_data, epochs=500, show=100, goal=0.01)
print error

Output:

Epoch: 100; Error: 27215.4999985;
Epoch: 200; Error: 27215.4999985;
Epoch: 300; Error: 27215.4999985;
Epoch: 400; Error: 27215.4999985;
Epoch: 500; Error: 27215.4999985;
The maximum number of train epochs is reached
[26831.39953304854, 27190.818210734968, 26736.57181692442, 27215.499998465435, 27215.499998465435, ..., 27215.499998465202]
(the list continues for all 500 epochs with the error stuck at 27215.4999985, changing only in the last few decimal places)

I can also post the input/target data if that would help, but I think I need to understand the basics first - I could be using a completely wrong type of network for this task. Any help/pointers are appreciated.

I don't need a super-precise predictor, as this will only be used to irrigate my kitchen garden, but I assume an error of 27215 will get me into trouble.
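If I understand neurolab correctly, the reported error is the sum of squared errors over all ~1020 samples (an assumption on my part), which would make the per-sample error roughly:

sse = 27215.5            # plateau error reported by net.train above
n_samples = 1020         # approximate number of training rows
mse = sse / n_samples    # ~26.7 (mean squared error per sample)
rmse = mse ** 0.5        # ~5.2 (% soil moisture per hour), if the SSE assumption holds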

  • If I remember correctly, neural networks only calculate with values between -1 and 1, so yes, input values should always be between -1 and 1 (and output values too). – MegaIng Feb 21 '18 at 18:58
  • That's at least the impression I got from looking through examples - I assume it is then valid to transform the inputs/outputs to [-1, 1], then train, simulate, and convert the [-1, 1] output back into a % delta value? It would seem a nice convenience if this were supported out of the box. – Steffen Schumacher Feb 21 '18 at 20:53
  • Yes, that is valid and also the norm. – MegaIng Feb 22 '18 at 14:46

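As suggested in the comments, here is a minimal sketch of the normalize -> train -> simulate -> denormalize workflow. The input ranges are the ones from in_min_max above; the target range (t_min, t_max), the placeholder raw_inputs/raw_targets arrays and the helper names scale/unscale are assumptions for illustration only, not part of the original code.

import numpy as np
import neurolab as nl

# Input ranges from the question: soil %, mbar, mm, m/s, C
in_ranges = np.array([[0, 100], [950, 1050], [0, 40], [0, 100], [-5, 40]], dtype=float)
t_min, t_max = -10.0, 10.0  # assumed range of the hourly delta soil moisture (%)

# Placeholder data so the sketch runs; replace with the real measurements
raw_inputs = np.random.uniform(in_ranges[:, 0], in_ranges[:, 1], size=(1020, 5))
raw_targets = np.random.uniform(t_min, t_max, size=(1020, 1))

def scale(x, lo, hi):
    # Linearly map values from [lo, hi] into [-1, 1]
    return 2.0 * (x - lo) / (hi - lo) - 1.0

def unscale(y, lo, hi):
    # Map values from [-1, 1] back into [lo, hi]
    return (y + 1.0) / 2.0 * (hi - lo) + lo

i_data = scale(raw_inputs, in_ranges[:, 0], in_ranges[:, 1])
t_data = scale(raw_targets, t_min, t_max)

# Every input is now in [-1, 1], so each range passed to newff is simply [-1, 1]
net = nl.net.newff([[-1, 1]] * 5, [5, 1])
net.trainf = nl.train.train_gd
error = net.train(i_data, t_data, epochs=500, show=100, goal=0.01)

# Convert the [-1, 1] network output back into a % delta soil moisture value
prediction = unscale(net.sim(i_data), t_min, t_max)

If I understand newff correctly, its default activation is tanh on every layer, so the raw output can only fall in (-1, 1); targets far outside that range could never be matched, which would also explain the error plateau above. (That is my reading of the library, not something I have verified in its source.)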