I have a straight line that intersects a convex polygon in the 2D plane, and a circle of constant radius whose center moves along this line. At first the polygon and the circle do not intersect; as the circle approaches the polygon, the intersection area increases, and then it decreases as they move apart. I want to prove that the area of the intersection of the convex polygon and the circle has no local minima as the circle moves along the line.
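(A numerical sanity check, not a proof: the sketch below estimates the intersection area by Monte Carlo as the center moves along a line, so you can plot the area function and see it rise and then fall. The polygon, line, and radius are illustrative choices, not from the question.)

```python
import math, random

def point_in_convex_polygon(p, poly):
    """poly: counterclockwise list of (x, y) vertices."""
    n = len(poly)
    for i in range(n):
        ax, ay = poly[i]
        bx, by = poly[(i + 1) % n]
        # For a CCW polygon, p is inside iff every edge cross product is >= 0.
        if (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) < 0:
            return False
    return True

def intersection_area(center, r, poly, samples=20000, rng=random.Random(0)):
    """Monte Carlo estimate of area(disk of radius r at center ∩ poly)."""
    hits = 0
    for _ in range(samples):
        # Uniform random point in the disk.
        t = 2 * math.pi * rng.random()
        d = r * math.sqrt(rng.random())
        p = (center[0] + d * math.cos(t), center[1] + d * math.sin(t))
        if point_in_convex_polygon(p, poly):
            hits += 1
    return math.pi * r * r * hits / samples

square = [(0, 0), (4, 0), (4, 4), (0, 4)]   # convex polygon, CCW
r = 1.5
# Center moves along the horizontal line y = 2, from x = -4 to x = 8.
xs = [i * 0.5 - 4 for i in range(25)]
areas = [intersection_area((x, 2.0), r, square) for x in xs]
```

Plotting `areas` against `xs` for any convex polygon and line you try should show a single hump (zero, increasing, decreasing, zero), which is exactly the claim to be proved.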
I tried to show that a local minimum is impossible by moving the center epsilon to the right and epsilon to the left. For the cases of one or two intersection points between the circle and the polygon I could prove it (by enumerating the possible scenarios), but as the number of intersection points grows there are many more cases to consider. – ladan Jun 03 '15 at 12:32
1 Answer
0
Interesting problem. Please post the solution once you find it. My approach would be similar in spirit to Fortune's algorithm for building a Voronoi diagram: consider the "events" that happen as the circle traverses the convex polygon.
To better understand the problem, first ask why the restriction that the circle travels on a straight line is important; look for counterexamples when it is dropped. Then ask where the claim fails if the polygon is not convex.
The events I would consider are the entry/exit of a polygon vertex into/out of the circle, and the entry/exit of a polygon edge into/out of the circle. Between consecutive events the area changes smoothly, so keep track of whether the area is increasing or decreasing through each event, and show that this behavior is necessarily monotonic: increasing, then decreasing.
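(The vertex events above are easy to compute explicitly: with center c(t) = p0 + t·d, a vertex v crosses the circle boundary when |c(t) − v| = r, a quadratic in t. The sketch below is an illustration of that step only, with made-up inputs; edge events can be found similarly from the distance of c(t) to each edge's supporting line.)

```python
import math

def vertex_events(p0, d, r, vertices):
    """Solve |p0 + t*d - v| = r for each vertex v; return the sorted t values."""
    events = []
    for vx, vy in vertices:
        ex, ey = p0[0] - vx, p0[1] - vy
        # |e + t*d|^2 = r^2  =>  a*t^2 + b*t + c = 0
        a = d[0] ** 2 + d[1] ** 2
        b = 2 * (ex * d[0] + ey * d[1])
        c = ex ** 2 + ey ** 2 - r ** 2
        disc = b * b - 4 * a * c
        if disc >= 0:   # the vertex enters and later exits the circle
            s = math.sqrt(disc)
            events += [(-b - s) / (2 * a), (-b + s) / (2 * a)]
    return sorted(events)

# Example: unit-speed motion along the x-axis toward a vertex at (3, 0);
# a unit circle reaches it at t = 2 and leaves it behind at t = 4.
ts = vertex_events((0.0, 0.0), (1.0, 0.0), 1.0, [(3.0, 0.0)])
```

Sorting the vertex and edge events gives the subdivision of the line into intervals on which the area function is smooth, which is where the sign-tracking argument would run.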

Nicko Po