I have a homework problem where I need to find an O(n log n) greedy algorithm for the minimum number of lines required to intersect all circles in the plane, as shown in the example below. All lines start at the origin (0, 0). The set C contains n circles, where each circle c_i is given by its radius r_i and its centre coordinates (x_i, y_i).
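
To make the setup concrete, here is a rough sketch of how I represent a circle and compute the directions from the origin whose rays hit it (Circle and angular_interval are just my own helper names, and I'm assuming no circle contains or touches the origin, so asin is well-defined):

```python
import math
from dataclasses import dataclass

@dataclass
class Circle:
    x: float
    y: float
    r: float

def angular_interval(c: Circle) -> tuple[float, float]:
    """Angles of the rays from the origin that hit circle c.

    Returns (theta - alpha, theta + alpha), where theta is the direction
    of the centre and alpha = asin(r / d) is the tangent half-angle.
    Assumes the circle does not contain the origin (d > r).
    """
    d = math.hypot(c.x, c.y)      # distance from the origin to the centre
    theta = math.atan2(c.y, c.x)  # direction of the centre
    alpha = math.asin(c.r / d)    # half-angle subtended by the circle
    return (theta - alpha, theta + alpha)
```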
I have tried the following greedy rule (a rough code sketch of it is given after the list):

- Iterate over the circles in C and pick a circle c_i.
- Construct 3 candidate lines from the origin to c_i: the two tangent lines, which each touch the circle in exactly one point, and the secant line that goes through its centre.
- Iterate over the remaining circles c_j (j != i) and count how many of them each candidate line intersects.
- Choose the candidate that intersects the most circles as L_i, and remove the circles it intersects from the plane.
- Repeat until the plane is empty.
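
As a sanity check on the rule above, this is roughly how I would implement it, reusing Circle and angular_interval from the sketch earlier (untested; greedy_lines and hits_circle are my own names):

```python
def greedy_lines(circles: list[Circle]) -> list[float]:
    """My greedy rule as a sketch: returns the angles of the chosen lines.

    For each picked circle, try its two tangent lines and the secant
    through its centre, and keep whichever candidate hits the most
    remaining circles. The inner scan over all remaining circles is
    what makes this Theta(n^2) in the worst case, not O(n log n).
    """
    remaining = set(range(len(circles)))
    chosen: list[float] = []
    while remaining:
        i = next(iter(remaining))          # pick some circle c_i
        lo, hi = angular_interval(circles[i])
        candidates = [lo, (lo + hi) / 2, hi]  # 2 tangents + 1 secant
        best_angle, best_hit = None, set()
        for phi in candidates:
            hit = {j for j in remaining if hits_circle(phi, circles[j])}
            if len(hit) > len(best_hit):
                best_angle, best_hit = phi, hit
        chosen.append(best_angle)
        remaining -= best_hit              # remove intersected circles
    return chosen

def hits_circle(phi: float, c: Circle) -> bool:
    """True iff the line through the origin at angle phi intersects c.

    A line covers directions phi and phi + pi, so compare the angular
    distance to the centre direction modulo pi against the tangent
    half-angle alpha.
    """
    lo, hi = angular_interval(c)
    theta, alpha = (lo + hi) / 2, (hi - lo) / 2
    diff = (phi - theta) % math.pi
    return min(diff, math.pi - diff) <= alpha
```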
But I don't think this greedy rule achieves the optimal solution, and its complexity isn't O(n log n): every chosen line requires a scan over all remaining circles, which is Theta(n^2) in the worst case.
Any hints or a full solution are welcome. The problem sheet also mentions that a greedy rule that gives minimum + 1 lines is fine.