I am not even sure if this can be done in polynomial time.
Problem:
Given two arrays of real numbers,
A = (a[1], a[2], ..., a[n]) and B = (b[1], b[2], ..., b[n]), with b[j] > 0 for j = 1, 2, ..., n,
and an integer k, find a subset A' of A, A' = (a[i(1)], a[i(2)], ..., a[i(k)]), which contains exactly k elements, such that
(sum of a[i(j)]) / (sum of b[i(j)]), taken over j = 1, 2, ..., k,
is maximized.
For example, if k == 3 and {a[1], a[5], a[7]} is the chosen subset, then
(a[1] + a[5] + a[7]) / (b[1] + b[5] + b[7])
should be at least as large as the ratio obtained from any other combination of three elements. Any clue?