
If you are getting this error while following the code from this tutorial:

https://pixellib.readthedocs.io/en/latest/image_ade20k.html

ValueError: You are trying to load a weight file containing 293 layers into a model with 147 layers

3 Answers


The issue can be solved by installing these versions:

!pip3 install tensorflow==2.6.0
!pip3 install keras==2.6.0
!pip3 install imgaug
!pip3 install pillow==8.2.0
!pip install pixellib==0.5.2
!pip install labelme2coco==0.1.2
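
After installing, you can quickly confirm that the pinned versions are the ones actually loaded; a minimal check, assuming the packages above installed cleanly:

import tensorflow as tf
import keras
import PIL

# These should print 2.6.0, 2.6.0 and 8.2.0 if the pins above took effect
print(tf.__version__)
print(keras.__version__)
print(PIL.__version__)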
  • I'm trying to downgrade tensorflow version but it is showing :Could not find a version that satisfies the requirement tensorflow==2.6.0 (from versions: 2.8.0rc1, 2.8.0, 2.8.1, 2.8.2, 2.8.3, 2.9.0rc0, 2.9.0rc1, 2.9.0rc2, 2.9.0, 2.9.1, 2.9.2, 2.10.0rc0, 2.10.0rc1, 2.10.0rc2, 2.10.0rc3, 2.10.0, 2.11.0rc0, 2.11.0rc1, 2.11.0rc2) ERROR: No matching distribution found for tensorflow==2.6.0 – Vinod Patidar Nov 10 '22 at 13:16

I got it working by changing some imports in pixellib/semantic/deeplab.py.

You can replace all tensorflow.python.keras imports with tensorflow.keras, except for from tensorflow.python.keras.utils.layer_utils import get_source_inputs.
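
For example, the substitution looks like this (illustrative only; the exact import lines in your copy of deeplab.py may differ):

# Before (original pixellib/semantic/deeplab.py style import)
# from tensorflow.python.keras.layers import Conv2D, BatchNormalization

# After the substitution
from tensorflow.keras.layers import Conv2D, BatchNormalization

# Left unchanged, per the note above
from tensorflow.python.keras.utils.layer_utils import get_source_inputs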


The problem, I feel, is that loading this pretrained model requires a GPU, so as a temporary workaround we can do it in Google Colaboratory.

  • To do this, open

Google Colaboratory > Runtime > Change runtime type > Hardware accelerator > GPU

and then upload the .h5 file to the Colab notebook you are working in (this might take around 30 minutes... I know it's a bit long, but yeah...).
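
Optionally, before loading the model you can confirm that Colab actually attached a GPU; a minimal check, assuming TensorFlow is already installed:

import tensorflow as tf

# Prints a non-empty list (one PhysicalDevice entry) when the GPU runtime is active
print(tf.config.list_physical_devices('GPU'))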

After this, use this code:

import pixellib
from pixellib.semantic import semantic_segmentation
segment_image = semantic_segmentation()
segment_image.load_ade20k_model("deeplabv3_xception65_ade20k.h5")
segment_image.segmentAsAde20k("path_to_image", output_image_name="path_to_output_image")

Run this code line by line. It worked for me, and it will probably work for others as well.
