Given an array of size 2n, I need to divide it into two subarrays of n elements each so that the difference of their sums is minimal. The maximum length of the array is 200, and each element is between 1 and 100, but I would prefer a more general solution if possible.

I have been trying to solve this with dynamic programming but haven't come up with anything satisfactory. The problems I am facing:
- I can't figure out a proper memoization scheme. The brute force approach (see the sketch after this list) takes on the order of C(200, 100) operations in the worst case, which is not feasible.
- The problem is not exactly minimizing or maximizing the sum of a subset. For example, say I need to get a sum close to 500 using 40 elements, and suppose I have already taken 25 elements whose sum is 290; now I need the remaining 15 elements to sum as close to 210 as possible. That is neither a minimization nor a maximization over those 15 elements. Trying target values near 210 one by one (209, 211, 208, 212, ...) doesn't seem feasible either in terms of time complexity.
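For reference, this is roughly what I mean by brute force, as a minimal Python sketch (the function name and the sample input are made up for illustration): it enumerates every choice of n indices out of 2n, which is why it blows up to C(2n, n).

```python
from itertools import combinations

def min_diff_bruteforce(arr):
    # Try every way of picking n of the 2n elements for the first half;
    # the remaining n elements form the second half.
    # This is C(2n, n) subsets, so it only works for very small inputs.
    n = len(arr) // 2
    total = sum(arr)
    best = total  # the difference can never exceed the total sum
    for idx in combinations(range(2 * n), n):
        s = sum(arr[i] for i in idx)
        best = min(best, abs(total - 2 * s))  # |s - (total - s)|
    return best

print(min_diff_bruteforce([3, 1, 4, 1, 5, 9]))  # fine for 6 elements, hopeless for 200
```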