Let X be a set of n intervals on the real line. We say that a set P of points stabs X if every interval in X contains at least one point in P. Describe and analyze an efficient algorithm to compute the smallest set of points that stabs X. Assume that your input consists of two arrays XL[1..n] and XR[1..n], representing the left and right endpoints of the intervals in X.
Any suggestions on where to start and how to solve it? A greedy algorithm? Huffman coding?
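To make the question concrete, here is a minimal sketch of the greedy idea I suspect applies: sort the intervals by right endpoint, and whenever an interval is not yet stabbed, place a point at its right endpoint. This assumes closed intervals, and the function name stab_points is just something I made up:

    def stab_points(XL, XR):
        """Greedy interval stabbing: O(n log n) from the sort,
        then a single linear pass over the sorted intervals."""
        # Pair each left endpoint with its right endpoint,
        # then sort by right endpoint.
        intervals = sorted(zip(XL, XR), key=lambda iv: iv[1])
        points = []
        last = float("-inf")  # most recently placed stab point
        for left, right in intervals:
            if left > last:
                # This interval is not stabbed by the last point,
                # so greedily stab it at its right endpoint.
                last = right
                points.append(last)
        return points

For example, on the intervals [1,4], [2,5], [6,9], [8,10]:

    print(stab_points([1, 2, 6, 8], [4, 5, 9, 10]))  # -> [4, 9]

The intuition would be that placing a point at the earliest right endpoint stabs every interval overlapping it, and no single point could stab more of the remaining intervals. Is this exchange-argument reasoning correct, or am I missing a case?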