I need to freeze some of a layer's parameters during training. I tried to set the needs_gradient
attribute with model.L1.b.needs_gradient = False,
but I get the following exception:
AttributeError Traceback (most recent call last)
<ipython-input-57-93ef31fae7d8> in <module>()
----> 1 model.L1.b.needs_gradient = False
/home/aj/anaconda3/envs/cntk-py27/lib/python2.7/site-packages/cntk/cntk_py.pyc in <lambda>(self, name, value)
1263 for _s in [Variable]:
1264 __swig_setmethods__.update(getattr(_s, '__swig_setmethods__', {}))
-> 1265 __setattr__ = lambda self, name, value: _swig_setattr(self, Parameter, name, value)
1266 __swig_getmethods__ = {}
1267 for _s in [Variable]:
/home/aj/anaconda3/envs/cntk-py27/lib/python2.7/site-packages/cntk/cntk_py.pyc in _swig_setattr(self, class_type, name, value)
72
73 def _swig_setattr(self, class_type, name, value):
---> 74 return _swig_setattr_nondynamic(self, class_type, name, value, 0)
75
76
/home/aj/anaconda3/envs/cntk-py27/lib/python2.7/site-packages/cntk/cntk_py.pyc in _swig_setattr_nondynamic(self, class_type, name, value, static)
64 if (not static):
65 if _newclass:
---> 66 object.__setattr__(self, name, value)
67 else:
68 self.__dict__[name] = value
AttributeError: can't set attribute
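For reference, this is roughly what I am doing (a minimal sketch; the layer name 'L1' and the sizes are illustrative, my real model is larger):

import cntk as C
from cntk.layers import Dense

# build a toy model with a named Dense layer so its parameters can be reached by name
x = C.input_variable(10)
model = Dense(5, name='L1')(x)

# attempt to freeze the bias parameter of layer L1
model.L1.b.needs_gradient = False   # raises AttributeError: can't set attribute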
Please help me eliminate the exception, or suggest another way to freeze parameters. Thanks.