
I have an image classifier where each image has exactly one of 5 labels [0-4]. I have hit an accuracy wall at ~72% and am looking for a way over it. I have noticed that my classes [in my training set] are quite 'heavy' in 0's, a little less 'heavy' in 4's, and that 1's, 2's and 3's are less common.

So:

1) Is this a likely factor in my inaccuracy problem? 1a) How can I be sure?

2) If so, how can I deal with it?

Here is the model as it stands. I have been tweaking parameters for a while:

Layer (type)                 Output Shape              Param #
=================================================================
conv2d_1 (Conv2D)            (32, 318, 318, 4)         112
_________________________________________________________________
conv2d_2 (Conv2D)            (32, 318, 318, 4)         148
_________________________________________________________________
conv2d_5 (Conv2D)            (32, 318, 318, 4)         148
_________________________________________________________________
conv2d_6 (Conv2D)            (32, 318, 318, 4)         148
_________________________________________________________________
max_pooling2d (MaxPooling2D) (32, 106, 106, 4)         0
_________________________________________________________________
flatten (Flatten)            (32, 44944)               0
_________________________________________________________________
d0 (Dense)                   (32, 16)                  719120
_________________________________________________________________
softmax_d1 (Dense)           (32, 5)                   85
=================================================================
  • First, we should establish which classes the model is having trouble with. Can you print a confusion matrix on the evaluation set predictions and update your post with it? – Andy Carlson Aug 03 '19 at 00:05
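Following up on that comment, here is a minimal sketch of how you could print a confusion matrix on the evaluation set, assuming a trained Keras model and that x_val / y_val hold the evaluation images and their integer labels (those variable names are placeholders):

    import numpy as np
    from sklearn.metrics import confusion_matrix, classification_report

    # Predict class probabilities for the evaluation set, then take the
    # argmax over the 5 outputs to get one predicted label per image.
    probs = model.predict(x_val)
    y_pred = np.argmax(probs, axis=1)

    # Rows are true labels, columns are predicted labels; a class the
    # model struggles with shows most of its row off the diagonal.
    print(confusion_matrix(y_val, y_pred, labels=[0, 1, 2, 3, 4]))
    print(classification_report(y_val, y_pred, digits=3))

If imbalance is the problem, you would typically see the rows for classes 1, 2 and 3 putting much of their mass in the 0 and 4 columns.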

1 Answer


1 - It's possible that this causes inaccuracy if the imbalance is high.
2 - You can use class_weight in model.fit(..., class_weight={0: w0, 1: w1, 2: w2, 3: w3, 4: w4}) to fix this, giving the rarer classes larger weights (see the sketch below).
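A minimal sketch of one way to compute those weights inversely proportional to class frequency, assuming y_train is an array of integer labels (the variable names are placeholders; sklearn's compute_class_weight does the counting for you):

    import numpy as np
    from sklearn.utils.class_weight import compute_class_weight

    classes = np.array([0, 1, 2, 3, 4])
    # 'balanced' weights each class by n_samples / (n_classes * count),
    # so the rarer classes 1-3 get proportionally larger weights.
    weights = compute_class_weight(class_weight='balanced',
                                   classes=classes, y=y_train)
    class_weight = dict(zip(classes, weights))  # {class index: weight}

    model.fit(x_train, y_train, epochs=20, batch_size=32,
              class_weight=class_weight)

Keras then scales each sample's contribution to the loss by the weight of its class, so mistakes on the rare classes cost more than mistakes on the common 0's.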

Daniel Möller