
I have exported my model from PyTorch to ONNX, but I am getting an empty input list when running this code:

    input_names = [input.name for input in onnx_model.graph.input]
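
For context, here is a minimal sketch of how the exported file can be loaded and inspected (the file name below is a placeholder, not my actual path):

    import onnx

    # Load the exported model (placeholder file name)
    onnx_model = onnx.load("tagging_agent.onnx")

    # Optional sanity check on the exported graph
    onnx.checker.check_model(onnx_model)

    # Graph inputs -- this comes back as an empty list for me
    input_names = [input.name for input in onnx_model.graph.input]
    print("inputs:", input_names)

    # Model weights normally appear here as initializers, not as graph inputs
    initializer_names = [init.name for init in onnx_model.graph.initializer]
    print("number of initializers:", len(initializer_names))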

My model is:

    TaggingAgent(
      (_encoder): BiGraphEncoder(
        (_utt_encoder): BiRNNEncoder(
          (_word_embedding): Embedding(2770, 128)
          (_rnn_cell): LSTM(128, 128, batch_first=True, bidirectional=True)
        )
        (_dialog_layer_user): GAT(
          (firstlayer): GraphAttentionLayer (256 -> 256)
          (secondlayer): GraphAttentionLayer (256 -> 256)
          (attention_0): GraphAttentionLayer (256 -> 256)
          (attention_1): GraphAttentionLayer (256 -> 256)
          (attention_2): GraphAttentionLayer (256 -> 256)
          (attention_3): GraphAttentionLayer (256 -> 256)
          (attention_4): GraphAttentionLayer (256 -> 256)
          (attention_5): GraphAttentionLayer (256 -> 256)
          (attention_6): GraphAttentionLayer (256 -> 256)
          (attention_7): GraphAttentionLayer (256 -> 256)
          (out_att): GraphAttentionLayer (2048 -> 256)
        )
      )
      (_decoder): RelationDecoder(
        (_sent_layer_dict): ModuleDict(
          (0): BiLSTMLayer(
            (_rnn_layer): LSTM(256, 128, batch_first=True, bidirectional=True)
          )
          (1): UniLinearLayer(
            (_linear_layer): Linear(in_features=256, out_features=256, bias=True)
          )
        )
        (_act_layer_dict): ModuleDict(
          (0): BiLSTMLayer(
            (_rnn_layer): LSTM(256, 128, batch_first=True, bidirectional=True)
          )
          (1): UniLSTMLayer(
            (_rnn_layer): LSTM(256, 256, batch_first=True)
          )
        )
        (_relate_layer): GraphRelation(
          (_sent_linear): Linear(in_features=256, out_features=256, bias=False)
          (_act_linear): Linear(in_features=256, out_features=256, bias=False)
          (_dialog_layer): GAT(
            (firstlayer): GraphAttentionLayer (256 -> 256)
            (secondlayer): GraphAttentionLayer (256 -> 256)
            (attention_0): GraphAttentionLayer (256 -> 256)
            (attention_1): GraphAttentionLayer (256 -> 256)
            (attention_2): GraphAttentionLayer (256 -> 256)
            (attention_3): GraphAttentionLayer (256 -> 256)
            (attention_4): GraphAttentionLayer (256 -> 256)
            (attention_5): GraphAttentionLayer (256 -> 256)
            (attention_6): GraphAttentionLayer (256 -> 256)
            (attention_7): GraphAttentionLayer (256 -> 256)
            (out_att): GraphAttentionLayer (2048 -> 256)
          )
        )
        (_sent_linear): Linear(in_features=256, out_features=3, bias=True)
        (_act_linear): Linear(in_features=256, out_features=15, bias=True)
      )
      (_criterion): NLLLoss()
    )

What am I doing wrong here? I need the input list to be populated so I can work with the model further. Is the problem with my PyTorch model or with the conversion from PyTorch to ONNX, and what should I change?
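
In case it matters, the kind of export call I would expect to produce named graph inputs looks roughly like the sketch below; the dummy input, its shape, the input/output names, and the opset version are all placeholders, not my exact code:

    import torch

    # Sketch only: the dummy input, its shape, the input/output names and the
    # opset version are placeholders, not the model's real signature.
    dummy_input = torch.randint(0, 2770, (1, 10))   # e.g. a batch of token ids

    torch.onnx.export(
        model,                        # the TaggingAgent instance
        (dummy_input,),               # example inputs used for tracing
        "tagging_agent.onnx",         # placeholder file name
        input_names=["input_ids"],    # placeholder input name
        output_names=["sent_logits", "act_logits"],
        opset_version=11,
    )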
