I'm searching for a method to fit a rectangle to an image. The given data is a gradient image of the rectangle that has to be detected, the width and height the rectangle should have, and an approximation of its position and rotation.
The algorithm should find the best fit for the rectangle. It should not(!) find a minimum enclosing rectangle, because outliers in the data would corrupt that result. Instead, the edges of the fitted rectangle should lie on the edges in the gradient image, so the fit should probably use the magnitude of the gradient.
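To make that concrete, this is roughly the kind of score I have in mind (just a sketch, not code from my project): sample points along the rectangle border and sum the interpolated gradient magnitude under them. The names `grad_mag` and the pose parameters `cx, cy, angle, scale` are placeholders I made up for illustration.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def rectangle_outline(cx, cy, w, h, angle, scale=1.0, pts_per_side=100):
    """Return an (N, 2) array of (x, y) points on the rotated, scaled rectangle border."""
    t = np.linspace(-0.5, 0.5, pts_per_side)
    top    = np.stack([t * w, np.full_like(t, -h / 2)], axis=1)
    bottom = np.stack([t * w, np.full_like(t,  h / 2)], axis=1)
    left   = np.stack([np.full_like(t, -w / 2), t * h], axis=1)
    right  = np.stack([np.full_like(t,  w / 2), t * h], axis=1)
    pts = np.concatenate([top, bottom, left, right]) * scale
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s], [s, c]])
    return pts @ rot.T + np.array([cx, cy])

def edge_score(grad_mag, cx, cy, w, h, angle, scale=1.0):
    """Sum of gradient magnitude sampled along the rectangle border (higher = better fit)."""
    pts = rectangle_outline(cx, cy, w, h, angle, scale)
    # map_coordinates expects (row, col) order, i.e. (y, x)
    return map_coordinates(grad_mag, [pts[:, 1], pts[:, 0]], order=1, mode='constant').sum()
```

A score like this is what I imagine a least-squares or gradient-descent scheme would optimize over the pose parameters, but I don't see how to set that up robustly.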
I was thinking of some kind of least-squares matching or gradient descent, but I couldn't find an approach to get started.
The rectangles have rounded corners. This should not be a problem for the detection, but it could be integrated into the model, and a scaling factor would also come in handy because the objects have quite large tolerances.
My current approach is essentially a brute-force method: I draw a rectangle with the given parameters and multiply the array of the drawn rectangle pointwise with the array of the gradient image. Then I sum all the values in the resulting array and keep the position and rotation where the sum is maximal. It works pretty well, but it takes too many iterations and needs a good guess at the start of the search.
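Simplified, what I'm currently doing looks roughly like the sketch below (OpenCV/NumPy; `grad_mag` is the gradient-magnitude image, and the search ranges and step sizes are made-up values for illustration):

```python
import numpy as np
import cv2

def draw_rect_mask(shape, cx, cy, w, h, angle_deg, thickness=3):
    """Render the rectangle outline into a float mask with the same shape as the image."""
    mask = np.zeros(shape, dtype=np.float32)
    corners = cv2.boxPoints(((cx, cy), (w, h), angle_deg))
    cv2.polylines(mask, [np.int32(corners).reshape(-1, 1, 2)],
                  isClosed=True, color=1.0, thickness=thickness)
    return mask

def brute_force_fit(grad_mag, w, h, cx0, cy0, angle0, search=10, step=1.0, angle_range=5.0):
    """Grid search around the initial guess; score = sum of (rectangle mask * gradient image)."""
    best_score, best_pose = -np.inf, None
    for cx in np.arange(cx0 - search, cx0 + search + step, step):
        for cy in np.arange(cy0 - search, cy0 + search + step, step):
            for ang in np.arange(angle0 - angle_range, angle0 + angle_range + 0.5, 0.5):
                mask = draw_rect_mask(grad_mag.shape, cx, cy, w, h, ang)
                score = float((mask * grad_mag).sum())
                if score > best_score:
                    best_score, best_pose = score, (cx, cy, ang)
    return best_pose, best_score
```

Even on a coarse grid like this, that is thousands of rectangle renderings per fit, which is why it is too slow and why it only works when the initial guess is already close. Is there a smarter way to fit the rectangle to the gradient image?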