
I have trained my model as a QNN with Brevitas. My input shape is:

torch.Size([1, 3, 1024])

I have exported the .pt file. When I test the model and generate a confusion matrix, I can observe everything I expect, so I believe there is no problem with the model itself.

On the other hand, to export the .onnx file and implement this Brevitas-trained model in FINN, I wrote the code given below:

from brevitas.export import FINNManager
FINNManager.export(my_model, input_shape=(1, 3, 1024), export_path='myfinnmodel.onnx')

But when I run it, I get the following error:

torch.onnx.export(module, input_t, export_target, **kwargs)

TypeError: export() got an unexpected keyword argument 'enable_onnx_checker'

I do not think this is related to the version, but I can check the versions if needed.
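In case it does matter, this is how I would check both versions (a small snippet; importlib.metadata requires Python 3.8 or newer):

import torch
from importlib.metadata import version

print("torch:", torch.__version__)
print("brevitas:", version("brevitas"))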

If you can help me, I would really appreciate it. Sincerely,


1 Answer


The problem is related to PyTorch versions newer than 1.10, where "enable_onnx_checker" is no longer a parameter of the torch.onnx.export function.

This is the official fix from the repository: https://github.com/Xilinx/brevitas/pull/408/files

The fix is not yet released; it is in the dev branch. You need to install Brevitas from source yourself, or simply change the code in brevitas/export/onnx/manager.py following the official solution.

After that, I was able to get the ONNX-converted model.
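If you prefer not to modify the installed package, a stopgap that follows the same idea is to wrap torch.onnx.export yourself so the removed keyword is dropped before the real export runs. This is only an untested sketch, not the official fix, and it assumes Brevitas calls the function through the torch.onnx module attribute, as the traceback in the question suggests:

import torch.onnx

_original_onnx_export = torch.onnx.export

def _patched_onnx_export(*args, **kwargs):
    # Newer PyTorch releases raise TypeError on this keyword, so discard it.
    kwargs.pop('enable_onnx_checker', None)
    return _original_onnx_export(*args, **kwargs)

torch.onnx.export = _patched_onnx_export

# Then run the export from the question as before
# (my_model is the trained QNN from the question):
from brevitas.export import FINNManager
FINNManager.export(my_model, input_shape=(1, 3, 1024), export_path='myfinnmodel.onnx')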
