I'm working with OpenCV on hand detection, but I'm struggling when trying to find the contours of the thresholded image: findContours always treats the white areas as the contours.
So basically it works in most cases, but sometimes my thresholded image comes out inverted (the hand ends up black and the background white). The threshold call is:
_, threshed = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY|cv2.THRESH_OTSU)
To make it work, I just need to change the threshold type to cv2.THRESH_BINARY_INV:
_, threshed = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY_INV|cv2.THRESH_OTSU)
And it works well.
My question is: how can I determine when the threshold needs to be inverted? Do I need to always find contours on both thresholded images and compare the results (and in that case, how?), or is there a way to always know whether the contours have been missed entirely?
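For example, would a simple pixel-count heuristic like the sketch below be reliable? If most of the thresholded pixels are white, I assume the background (not the hand) was selected as foreground and invert. The file name and the 50% cutoff are just placeholders, and I'm not sure how robust this is:

import cv2

# Rough heuristic sketch: if more than half of the thresholded pixels are white,
# assume the background (not the hand) came out white and invert the image.
img = cv2.imread("hand.jpg")  # placeholder file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
_, threshed = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)

if cv2.countNonZero(threshed) > threshed.size // 2:
    threshed = cv2.bitwise_not(threshed)

# [-2] keeps this working with both the OpenCV 3.x and 4.x return signatures
contours = cv2.findContours(threshed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]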
EDIT: Is there a way to be 100% sure the contour looks like a hand?
EDIT 2: I forgot to mention that I'm trying to detect fingertips and convexity defects using this method, so I need the defects, which I can't find with the first thresholded image because it is inverted. See the blue points on the first contour image.
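For context, the defect extraction I have in mind is roughly the sketch below. It builds on img and contours from the snippet above and assumes the hand is the largest contour; it is only a sketch, not my exact code:

import cv2

# Sketch only: uses img and contours from the snippet above and assumes
# the hand is the largest contour in the image.
hand = max(contours, key=cv2.contourArea)

hull = cv2.convexHull(hand, returnPoints=False)  # hull as indices into the contour
defects = cv2.convexityDefects(hand, hull)       # rows of (start, end, farthest point, depth)

if defects is not None:
    for start_idx, end_idx, far_idx, depth in defects[:, 0]:
        x, y = hand[far_idx][0]  # farthest point = candidate valley between two fingers
        cv2.circle(img, (int(x), int(y)), 5, (255, 0, 0), -1)  # drawn in blue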
Thanks.