I'm following Beeren Sahu's guide for using DeepLab in Tensorflow: https://beerensahu.wordpress.com/2018/04/17/guide-for-using-deeplab-in-tensorflow/
I'm trying to use the DeepLab model for semantic segmentation in TensorFlow. I've downloaded the DeepLab code from the TensorFlow models repository: https://github.com/tensorflow/models
After running:
hpcsub -cmd python3.4 model_test.py
I get the following error:
Traceback (most recent call last):
  File "model_test.py", line 20, in <module>
    from deeplab import common
ImportError: No module named 'deeplab'
Basically complaining about line 20 in model_test.py:
from deeplab import common
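As far as I can tell, that import can only succeed if the directory that contains the deeplab/ package (i.e. tensorflow/models/research) ends up on sys.path. Here is a minimal check script I could run on the cluster to see what the interpreter actually searches (the file name path_check.py is just mine for illustration):

# path_check.py -- print the interpreter's search path and whether any
# entry contains the deeplab/ package directory.
import os
import sys

for entry in sys.path:
    print(entry)

# True only if some entry is the directory that holds deeplab/
# (i.e. tensorflow/models/research).
print(any(os.path.isdir(os.path.join(entry, "deeplab")) for entry in sys.path if entry))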
I understand that this is a problem with finding the 'deeplab' package, but I do not know how to resolve it. As Sahu's tutorial recommends, I added both of the paths below to PYTHONPATH:
# From tensorflow/models/research/
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/deeplab
Even with these two export commands in place, I still get the same error.
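Since hpcsub is what actually launches the interpreter, I'm also not sure whether the exported PYTHONPATH even reaches the environment the job runs in. A small script (env_check.py is just a name I made up) that I could submit the same way as model_test.py, e.g. hpcsub -cmd python3.4 env_check.py:

# env_check.py -- print what the submitted job actually sees, to check
# whether the exported PYTHONPATH survives the hpcsub submission.
import os
import sys

print("PYTHONPATH =", os.environ.get("PYTHONPATH", "<not set>"))
print("sys.path:")
for entry in sys.path:
    print("  " + entry)

If PYTHONPATH is empty inside the job, or none of the sys.path entries is the tensorflow/models/research directory, that would at least narrow down where the exports are getting lost.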
I've found others with a similar issue on GitHub, but no solution has been posted yet:
1. https://github.com/tensorflow/models/issues/5214
2. https://github.com/tensorflow/models/issues/4364
If you don't have a solution but can recommend helpful tutorials on using Google's open-sourced DeepLab-v3 for semantic image segmentation, please share!