
Is it possible to run the inference graph on the Jetson board without converting it to TensorRT format, as mentioned in the GitHub repo?

Will we be able to run the TensorFlow Object Detection API on the Jetson board without using TensorRT?

Sachin Mohan

1 Answer

  • Install TensorFlow from the NVIDIA website for the correct JetPack version.

  • For the Object Detection API, execute the first code block under Step 1 in this link:

    # Shell commands (paths assume a Colab-style environment; adjust /content as needed)
    git clone --quiet https://github.com/tensorflow/models.git
    
    apt-get install -qq protobuf-compiler python-pil python-lxml python-tk
    
    pip install -q Cython contextlib2 pillow lxml matplotlib
    
    pip install -q pycocotools
    
    cd /content/models/research
    protoc object_detection/protos/*.proto --python_out=.
    
    # Python (run inside the notebook): put the research and slim dirs on the path
    import os
    import sys
    os.environ['PYTHONPATH'] += ':/content/models/research/:/content/models/research/slim/'
    sys.path.append("/content/models/research/")
    sys.path.append("/content/models/research/slim/")
    
    # Shell: smoke-test the installation
    python object_detection/builders/model_builder_test.py
  • Then follow the steps from this well-known blog post by Gilbert Tanner.
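The `PYTHONPATH` step above is written for a notebook; on the Jetson itself the equivalent shell setup could be sketched as below. `MODELS_DIR` is an assumed location, so adjust it to wherever you cloned `tensorflow/models`:

```shell
# Assumed clone location; change this if you cloned elsewhere
MODELS_DIR="$HOME/models"

# Put the research and slim directories on PYTHONPATH so Python can
# find the object_detection package and its slim dependency
export PYTHONPATH="$PYTHONPATH:$MODELS_DIR/research:$MODELS_DIR/research/slim"

echo "$PYTHONPATH"
```

Adding these lines to `~/.bashrc` makes the setting persist across sessions, which saves re-exporting it every time you open a terminal on the board.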

Note: You might get errors when running the code block above, or while running the Python file below. Keep fixing the errors until TF is properly installed.

python object_detection/builders/model_builder_tf1_test.py

TensorFlow installed successfully, and I was able to run inference as well.
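Before attempting inference, a quick sanity check can confirm that the required packages are importable. This is a minimal sketch using only the standard library, so it reports a missing module instead of crashing if the setup steps above were incomplete:

```python
# Sketch: check that TensorFlow and the Object Detection API are importable
# before running inference. Uses importlib so a missing package is reported
# rather than raising ImportError.
import importlib.util

def is_importable(name):
    """Return True if the named module can be found on sys.path."""
    return importlib.util.find_spec(name) is not None

for module in ("tensorflow", "object_detection"):
    status = "OK" if is_importable(module) else "MISSING"
    print(f"{module}: {status}")
```

If `object_detection` shows as MISSING, the `PYTHONPATH` / `sys.path` step from the installation block has not taken effect in the current session.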
