Problem description
We would like our ONNX models to have some form of description, and ideally some other metadata including our internal version number. Currently we train with PyTorch Lightning and use onnxruntime for inference.
Below is a minimal executable example which assigns a model description via the following steps:
- Export with torch.onnx.export
- Load with onnx.load
- Set model.doc_string
- Save with onnx.save
- Load with onnxruntime.InferenceSession
The question
Using onnx seems unnecessary; is there a way to include a model description directly when using torch.onnx.export?
Reproducible example
>>> import torch
>>> import torchvision
>>> from onnxruntime import InferenceSession
>>>
>>> # get a sample model
>>> dummy_input = torch.randn(10, 3, 224, 224)
>>> _model = torchvision.models.alexnet(pretrained=True)
>>> input_names = [ "actual_input_1" ] + [ "learned_%d" % i for i in range(16) ]
>>> output_names = [ "output1" ]
>>>
>>> # export onnx
>>> torch.onnx.export(_model, dummy_input, "alexnet.onnx", verbose=False, input_names=input_names, output_names=output_names, strip_doc_string=False)
>>>
>>> # read in the exported onnx for inference
>>> sess = InferenceSession('alexnet.onnx')
>>> meta = sess.get_modelmeta()
>>>
>>> # review onnx metadata
>>> meta.description
''
This downloads a demo model, and when we print meta.description we can see that it is blank.
I can set the description by loading the model with onnx and then saving it again:
>>> import onnx
>>> model = onnx.load('alexnet.onnx')
>>> model.doc_string = 'my_description'
>>> onnx.save(model, 'alexnet2.onnx')
>>> sess = InferenceSession('alexnet2.onnx')
>>> meta = sess.get_modelmeta()
>>>
>>> # review onnx metadata
>>> meta.description
'my_description'
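The same onnx round trip also seems to cover the other metadata we want (such as our internal version number): key/value pairs can be written into the model's metadata_props, which onnxruntime exposes through get_modelmeta().custom_metadata_map. Below is a sketch of that variant; the key internal_version, its value and the file name alexnet3.onnx are placeholder names, and it still relies on onnx rather than torch.onnx.export, so the question above stands.
>>> # sketch: attach custom key/value metadata alongside the description
>>> import onnx
>>> model = onnx.load('alexnet.onnx')
>>> model.doc_string = 'my_description'
>>> entry = model.metadata_props.add()  # metadata_props is a repeated StringStringEntryProto field
>>> entry.key = 'internal_version'      # placeholder key
>>> entry.value = '1.2.3'               # placeholder value
>>> onnx.save(model, 'alexnet3.onnx')
>>>
>>> # the custom entries show up in onnxruntime's model metadata
>>> sess = InferenceSession('alexnet3.onnx')
>>> sess.get_modelmeta().custom_metadata_map
{'internal_version': '1.2.3'}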