I am trying to find the minimum number of entries in a list (or vector) that together give the maximum sum. Is there a way to do this with Z3's Optimize? I am trying the following, but without success:
from z3 import Real, If, Optimize
from random import randint

n = 3  # problem size; I previously used `input`, which shadows the Python builtin and was never assigned

D = [[Real('d%s%s' % (i+1, j+1)) for j in range(n)] for i in range(n)]
dr = [D[i][j] == randint(1, 5) for i in range(n) for j in range(n)]
G = [[Real('g%s%s' % (i+1, j+1)) for j in range(n)] for i in range(n)]
ge = [G[i][j] == randint(1, 5) for i in range(n) for j in range(n)]
dSum = [Real('dSum%s' % (i+1)) for i in range(n)]
gSum = [Real('gSum%s' % (i+1)) for i in range(n)]
B = [Real('b%s' % (i+1)) for i in range(n)]  # was undefined in my original code
sumVal = Real('sumVal')                      # was undefined in my original code
# B[i] is the absolute difference |dSum[i] - gSum[i]|
benefit = [B[i] == If(dSum[i] >= gSum[i], dSum[i] - gSum[i], gSum[i] - dSum[i]) for i in range(n)]
opt = Optimize()
opt.add(dr)
opt.add(ge)
opt.add([dSum[i] == sum(D[i]) for i in range(n)])
opt.add([gSum[i] == sum(G[i]) for i in range(n)])
opt.add(benefit)
opt.add(sumVal == sum(B))
Here is where I need help. I want to minimize how many of the B[i] entries contribute while maximizing the total; is this the right way to state the objectives?

opt.minimize([B[i] for i in range(len(benefit))])  # is this wrong?
opt.maximize(sumVal)