
How can one access the learned weights of a DNN saved as follows:

lstm_network_output.save(model_path)

chrisbasoglu

3 Answers


The weights/parameters of a network can be accessed through `lstm_network_output.parameters`, which returns a tuple of `Parameter` objects. The value of a `Parameter` can be read through its `value` property as a NumPy array, and it can be updated by assigning a new NumPy array to that same property (e.g. `p.value = new_array`).

Anna Kim

If you used name= properties in creating your model, you can also identify layers by name. For example:

model = Sequential([Embedding(300, name='embed'), Recurrence(LSTM(500)), Dense(10)])
E = model.embed.E   # accesses the embedding matrix of the embed layer

To know that the parameter is .E, please consult the docstring of the respective function (e.g. help(Embedding)). (In Dense and Convolution, the parameters would be .W and .b.)

The pattern above is for named layers, which are created using as_block(). You can also name intermediate variables, and access them in the same way. E.g.:

W = Parameter((13,42), init=0, name='W')
x = Input(13)
y = times(x, W, name='times1')
W_recovered = y.times1.W

# e.g. check the shape to see that they are the same
W_recovered.shape  # --> (13, 42)
W.shape            # --> (13, 42)

Technically, this searches all parameters that feed y. In a more complex network you may end up with multiple parameters of the same name, in which case an error is thrown because the lookup is ambiguous, and you must work with the `.parameters` tuple mentioned in Anna's answer.


This Python code worked for me to visualize some weights:

import numpy as np
import cntk as C

# load model from the MS ConvNet_MNIST example (raw string avoids backslash escapes)
dnnFile = C.cntk_py.Function.load(r'Models\ConvNet_MNIST_5.dnn')
layer8 = dnnFile.parameters()[8].value()
filter_num = 0
sliced = layer8.asarray()[filter_num][0]  # the filter applied to the input image
print(sliced)

Alexey Birukov