
I am working with maximization problems in GAMS where I choose X = (x_1, x_2, ..., x_n) such that f(X) = c_1*x_1 + ... + c_n*x_n is maximized. The c's are known scalars and I know n (10 in my case). I want my constraints to be such that the first n-1 = 9 x's sum up to one and the last one is less than 10. How do I use sum to do this?
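In other words, the problem is

    maximize    c_1*x_1 + c_2*x_2 + ... + c_10*x_10
    subject to  x_1 + x_2 + ... + x_9 = 1
                x_10 <= 10
                x_i >= 0 for all i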

This is what I have tried:

SET Assets / c1*c10 /;
ALIAS(Assets,i);

Parameter Valuesforc(i) 'C values' /
* (here my values are typed in for all the c's)
/;

POSITIVE VARIABLES
    x(i);

EQUATIONS
    Const1 First constraint
    Const2 Second constraint
    Obj    The objective;
* here comes the trouble:
Const1 ..   x(10) =l= 10
Const2 ..   sum((i-1),x(i)) =e= 1

The code is not complete, but I believe the essential setup is there. How do you write the summation x_1 + x_2 + ... + x_(n-1), and how do you refer to x_10?

Kim

1 Answer


Try this:

Const1 ..   x('10') =l= 10;
Const2 ..   sum(i$(ord(i)<card(i)),x(i)) =e= 1;

Edit: Here are some notes to explain what happens in Const2, especially in the "$(ord(i) < card(i))" part.

The `$` operator is a condition ("such that"): the sum only runs over those elements of i for which the expression in parentheses is true. `ord(i)` gives the position of element i within the set (1, 2, ..., 10), and `card(i)` gives the total number of elements in the set (10 here). The condition `ord(i) < card(i)` is therefore true for every element whose position is strictly smaller than the set size, i.e. for all elements but the last.

So, all in all, there is a condition saying that all elements of i should be included in the sum except for the last one.
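For completeness, here is how these two constraints could sit in a full model. This is only a minimal sketch: the set labels 1*10 (so that x('10') is the last element; with labels like c1*c10 you would write x('c10') instead), the placeholder coefficient values, and the model/solve statements are assumptions filled in around the two constraints above.

SET i / 1*10 /;

PARAMETER Valuesforc(i) 'objective coefficients (placeholder values)';
Valuesforc(i) = ord(i);

POSITIVE VARIABLE x(i);
VARIABLE z 'objective value';

EQUATIONS
    Const1 'last x at most 10'
    Const2 'first nine x sum to one'
    Obj    'objective definition';

Const1 ..  x('10') =l= 10;
Const2 ..  sum(i$(ord(i) < card(i)), x(i)) =e= 1;
Obj    ..  z =e= sum(i, Valuesforc(i)*x(i));

MODEL example / all /;
SOLVE example USING LP MAXIMIZING z;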

Lutz
  • Thanks. This will probably solve my problem. Will you please add some words on why constraint 2 looks the way it does? Why does the `$` sign work in this case? – Kim Sep 28 '18 at 08:52
  • I added some comments to my post. Hope that helps. – Lutz Sep 28 '18 at 09:31