
I have a model exported from PyTorch that I'll call main_model.onnx. It has an input node, main_input, that expects a list of integers. I can load this in onnxruntime, send it a list of ints, and it works great.

I made another ONNX model, pre_model.onnx, with input pre_input and output pre_output. It preprocesses some text, so the input is the text, and pre_output is a list of ints, exactly what main_model.onnx needs as input.

My goal is, using the Python onnx.helper tools, to create one uber-model that accepts text as input, runs it through pre_model.onnx, possibly through some connector node (Identity, maybe?), and then through main_model.onnx, all in one big combined.onnx model.

I have tried using pre_model.graph.node + an Identity connector + main_model.graph.node as the nodes of a new graph, but the parameters exported from PyTorch are lost this way. Is there a way to keep all those parameters and everything else around, and export this one even larger combined ONNX model?

maccam912
  • Did you try running those onnx models separately on onnxruntime, feeding the output of the first model to the second? If so, did it work? – kiranr Feb 12 '21 at 22:23
  • Yep, it worked. I can run them in sequence, just passing the output of pre_model to the input of main_model, and get the expected output from the main model. – maccam912 Feb 12 '21 at 22:28
  • Why do you want to combine them? Why not just run them separately? – kiranr Feb 12 '21 at 22:45
  • I'd like to just deliver a single .onnx file to someone to deploy. Even if it's simple, delivering two models also requires some communication about what order to run them in, confirming the input doesn't change between models, etc., so it's simpler to share a single ONNX model. – maccam912 Feb 13 '21 at 02:25
  • 1
    @maccam912 I did not find a straight forward solution, but adding nodes is possible to a current graph. You can refer to the code in this PR(https://github.com/onnx/onnx/pull/3264). I have to address the comments on that but it should give you decent idea to add the nodes in an onnx graph. Hope this helps. – Sarthak Agrawal Feb 15 '21 at 20:31
  • @maccam912 do you have an example solution for this? I'm working on the same thing. – Megan Hardy Jun 01 '21 at 20:43

2 Answers


This is possible to achieve, albeit a bit tricky. You can explore the Python APIs offered by ONNX (https://github.com/onnx/onnx/blob/master/docs/PythonAPIOverview.md). They let you load both models into memory and "compose" your combined model: merge the two GraphProto messages into one (easier said than done; you'll have to ensure you don't violate the ONNX spec while doing this), then store the new GraphProto in a new ModelProto, and you have your combined model. I would also run it through the ONNX checker on completion to ensure the model is valid after its creation.

Dharman

If you have static-size inputs, the sclblonnx package is an easy solution for merging ONNX models. However, it does not support dynamic-size inputs.

For dynamic-size inputs, one solution is to write your own code using the ONNX API, as stated earlier.

Another solution is to convert the two ONNX models to a framework (TensorFlow or PyTorch) using tools like onnx-tensorflow or onnx2pytorch, pass the outputs of one network as the inputs of the other, and export the whole network back to ONNX format.
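For the PyTorch route, a hypothetical sketch of the chaining step, assuming the onnx2pytorch package is installed (ConvertModel is its entry point for turning an ONNX ModelProto into a torch.nn.Module; the wrapper class and build_combined helper below are illustrative names, not part of any library):

```python
import torch

class Combined(torch.nn.Module):
    """Chains two converted networks: preprocessing first, then main."""
    def __init__(self, pre, main):
        super().__init__()
        self.pre, self.main = pre, main

    def forward(self, x):
        # the preprocessing net's output feeds the main net's input
        return self.main(self.pre(x))

def build_combined(pre_path, main_path):
    import onnx
    from onnx2pytorch import ConvertModel
    return Combined(ConvertModel(onnx.load(pre_path)),
                    ConvertModel(onnx.load(main_path)))

# After building, re-export the chained module to a single ONNX file:
#   combined = build_combined("pre_model.onnx", "main_model.onnx")
#   torch.onnx.export(combined, example_input, "combined.onnx")
```

Keep in mind the round trip may not preserve every op exactly, so validate the re-exported model against the original two-step pipeline.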

FurkanCoskun