I have trained a feed-forward neural network (FFNN) to fit an unknown function with pybrain. I built the FFNN like this:

net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)

I told pybrain to print the params of the net with the command

print net.params

and pybrain returned the params

(1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959)

Now I want to use this fitted function in another script. I tried:

def netp(Q):
    net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)
    net._setParameters = (1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959)
    arg = 1.0 / float(Q)
    p = float(net.activate([arg]))
    return p

The problem is that the values returned by the net make no sense at all. For example:

 0.0749046652125 1.0
-2.01920546405 0.5
-1.54408069672 0.333333333333
 1.05895945271 0.25
-1.01314347373 0.2
 1.56555648799 0.166666666667
 0.0824497539453 0.142857142857
 0.531176423655 0.125
 0.504185707604 0.111111111111
 0.841424535805 0.1

where the first column is the output of the net and the second is the input. The output of the net should be close to the input value. What's the problem? Where am I going wrong? Is it a problem of overfitting, or am I missing something?

  • For which inputs do you get the *nonsensical* output? Please also include package versions. – alko Dec 12 '13 at 14:47
  • I am feeding the function with probabilities, and the net is returning values that are negative or bigger than 1. – emanuele Dec 12 '13 at 14:52

1 Answer

A typo:

net._setParameters = (1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959)

This line effectively replaces the private _setParameters method with a tuple. Try replacing it with

net._setParameters([1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959])

and see if that helps.
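
Put together, a self-contained sketch of the corrected setup (the imports are my assumption of what the question's script already uses; the weights are the ones printed above):

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer

# Rebuild the same 1-2-1 architecture, then load the trained weights
# by *calling* _setParameters instead of assigning to it.
net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)
net._setParameters([1.76464967, 0.46764103, 1.63394395, -0.95327762,
                    1.19760151, -1.20449402, -1.34050959])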

Second, I don't see a reason for the 1/Q operation, so the simpler

>>> inp = [1.0 / q for q in range(1, 11)]  # the inputs from the question
>>> def netp(Q): return float(net.activate([Q]))
>>> for i in inp:
...     print '{}\t{:.5f}'.format(i, netp(i))

yields

1.0      0.97634
0.5      0.46546
0.33333  0.29013
0.25     0.20762
0.2      0.16058
0.16666  0.13042
0.14285  0.10952
0.125    0.09421
0.11111  0.08254
0.1      0.07335
  • No, the problem is the same, and now I am noticing that the output values are random. – emanuele Dec 12 '13 at 15:04
  • I want this repetitive calculation to be done inside the function. – emanuele Dec 12 '13 at 15:07
  • @emanuele No, not why it is inside the function, but why it is there at all. See the edit in my answer. – alko Dec 12 '13 at 15:09
  • Because Q represents the odds. Then the inverse is the probability. – emanuele Dec 12 '13 at 15:12
  • @emanuele You got it wrong: 1/0.01 = 100 can't be a probability. You have linear layers in and out and tanh inside, so if you feed in something as big as 1/0.01 you definitely won't get back something small. It will have a + or - sign depending on the params, but it will never be small in absolute value. – alko Dec 12 '13 at 15:14
  • @emanuele And the random values result from the random net params. You can print net.params inside your old function (before the fix) to be sure. – alko Dec 12 '13 at 15:18
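
To see that last point concretely, here is a minimal sketch of the original bug in isolation (the printed numbers will vary from run to run, because buildNetwork initializes the weights randomly):

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer

net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)

# The buggy line: this assignment merely binds a tuple to the attribute
# name _setParameters; the network's weights are never touched.
net._setParameters = (1.76464967, 0.46764103, 1.63394395, -0.95327762,
                      1.19760151, -1.20449402, -1.34050959)

print net.params  # still the random initial weights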