Using pyscipopt in Python, I have set up a continuous variable and a binary variable:

for j in J:
    x[j] = model.addVar(vtype="C", name="x(%s)" % j)
    y[j] = model.addVar(vtype="B", name="y(%s)" % j)

What I am trying to do is get a simple indicator variable of the form:

when x[j] = 0 -> y[j] = 0, and when x[j] > 0 -> y[j] = 1

But I cannot figure out how to set up the constraint. I tried to read up on Big M but still seem to be struggling.

And ultimately, the goal is to get the optimization model to use just one of x[3] or x[7], or neither, but not both in the solution.

So eventually once the indicator variable is working, I was thinking something along the lines of:

model.addCons(y[3] + y[7] <= 1)

Feel free to offer any suggestions on this too if there is a better way to do it.

1 Answer

just one of x[3] or x[7], or neither, but not both in the solution

A big-M approach would be:

x[3] <= b[3]*U[3]      # b is a binary variable
x[7] <= b[7]*U[7]      # U is an upperbound on x (constant)
b[3]+b[7] <= 1
0 <= x[3] <= U[3]
0 <= x[7] <= U[7]
b[3],b[7] ∈ {0,1}

An approach based on indicator constraints would look like:

b[3]==0 ==> x[3]=0   # or x[3]<=0 (this is used in addConsIndicator)
b[7]==0 ==> x[7]=0
b[3]+b[7] <= 1
b[3],b[7] ∈ {0,1}

In PySCIPOpt this can be specified using model.addConsIndicator. Note that this approach can be used when we don't have good upper bounds on x. You may also want to have a look at model.addConsCardinality for a simpler approach.
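For reference, the indicator formulation might be coded roughly as follows. This is only a sketch: addConsIndicator enforces the given linear constraint when its binary variable equals 1, so a complementary binary variable z[j] = 1 - b[j] is introduced here to model "b[j] == 0 ==> x[j] <= 0"; the bounds, objective, and names are illustrative.

from pyscipopt import Model

J = range(10)
U = [10*j for j in J]  # upper bound on x (only used for the variable bounds)

model = Model("Formulation2")

#
# variables: x continuous, b the indicator, z its complement
#
x = [model.addVar(vtype="C", lb=0, ub=U[j]) for j in J]
b = [model.addVar(vtype="B") for j in J]
z = [model.addVar(vtype="B") for j in J]

#
# objective
#
model.setObjective(sum(x), sense="maximize")

#
# constraints
#
for j in [3, 7]:
    # z[j] is the complement of b[j]
    model.addCons(b[j] + z[j] == 1)
    # z[j] == 1 (i.e. b[j] == 0) enforces x[j] <= 0
    model.addConsIndicator(x[j] <= 0, binvar=z[j])

# at most one of b[3], b[7] may be 1
model.addCons(b[3] + b[7] <= 1)

#
# solve
#
model.optimize()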

Coding

Here is the trivial part: converting the mathematical formulation into code. The authors of the Python interface for SCIP have done their best to make this step very easy. So, for the programmers who specialize in copy-paste, the big-M approach can be coded as:

from pyscipopt import Model

J = range(10)
U = [10*j for j in J]  # upper bound on x


model = Model("Formulation1")

#
# variables
#
x = [model.addVar(vtype="C", lb=0, ub=U[j]) for j in J]
b = [model.addVar(vtype="B") for j in J]

#
# objective
#
model.setObjective(sum(x), sense="maximize")

#
# constraints
#
model.addCons(x[3] <= b[3]*U[3])
model.addCons(x[7] <= b[7]*U[7])
model.addCons(b[3]+b[7]<=1)


#
# solve
#
model.optimize()

#
# print solution
#
for j in J:
    print("j=%2s, x[j]=%5s, b[j]=%5s" % (j,model.getVal(x[j]),model.getVal(b[j])))

The solution looks like:

j= 0, x[j]= -0.0, b[j]= -0.0
j= 1, x[j]= 10.0, b[j]= -0.0
j= 2, x[j]= 20.0, b[j]= -0.0
j= 3, x[j]=  0.0, b[j]=  0.0
j= 4, x[j]= 40.0, b[j]= -0.0
j= 5, x[j]= 50.0, b[j]= -0.0
j= 6, x[j]= 60.0, b[j]= -0.0
j= 7, x[j]= 70.0, b[j]=  1.0
j= 8, x[j]= 80.0, b[j]= -0.0
j= 9, x[j]= 90.0, b[j]= -0.0

Indeed, x[3] and x[7] are not both nonzero.

The same result can be obtained by using model.addConsCardinality:

from pyscipopt import Model

J = range(10)
U = [10*j for j in J]  # upper bound on x


model = Model("Formulation3")

#
# variables
#
x = [model.addVar(vtype="C", lb=0, ub=U[j]) for j in J]

#
# objective
#
model.setObjective(sum(x), sense="maximize")

#
# constraints
#
# We use model.addConsCardinality
#    Add a cardinality constraint that allows at most 'cardval' many nonzero variables.
# Only the following two parameters are used:
#       :param consvars: list of variables to be included
#       :param cardval: nonnegative integer
#
model.addConsCardinality([x[3],x[7]],1)


#
# solve
#
model.optimize()

#
# print solution
#
for j in J:
    print("j=%2s, x[j]=%5s" % (j,model.getVal(x[j])))

The results are:

j= 0, x[j]= -0.0
j= 1, x[j]= 10.0
j= 2, x[j]= 20.0
j= 3, x[j]=  0.0
j= 4, x[j]= 40.0
j= 5, x[j]= 50.0
j= 6, x[j]= 60.0
j= 7, x[j]= 70.0
j= 8, x[j]= 80.0
j= 9, x[j]= 90.0

Again, x[3] and x[7] are not both nonzero.

PS. I am confused as to why providing the code was needed. The detailed description in the answer should be more than enough to get you on the correct path. I prefer a carefully crafted and well-stated mathematical model over a bunch of code any time.

Erwin Kalvelagen
  • I think I understand the logic but not how to implement it in pyscipopt and I am unable to find any examples beyond the pyscipopt documentation. Could you provide the code for the constraints (and the accompanying variables if different from what I have)? – placeholder Jun 29 '20 at 15:23
  • Are you saying you are not able to get any of these three approaches working? That would be very unfortunate. – Erwin Kalvelagen Jun 29 '20 at 23:02
  • Correct. I cannot figure out the syntax of model.addConsIndicator and model.addConsCardinality and have not been able to find any examples of either of these two functions being applied anywhere. For Big M I think I get the logic behind it but do not know how to fit it into the pyscipopt model since "if, then" statements are not accepted. – placeholder Jun 29 '20 at 23:49
  • Please note that I have provided a detailed big-M MIP formulation that implements the "if-then" in the form of inequalities. You probably missed this important point. Transcribing my suggested formulation in Python/SCIP should be trivial. – Erwin Kalvelagen Jun 30 '20 at 03:03
  • You are correct, I was focusing too much on addConsCardinality/addConsIndicator. I did get the Big M approach to work. It appears it is specific to not having both x[3] and x[7] in the solution, as it does not fully assign the appropriate 0's and 1's to all b[j] based on whether x[j] was used in the solution. So if I wanted a solution that had the constraint x[1] + x[2] + x[3] == 3 (use all three items), would I need to figure out the specific Big M constraints to proceed with this? I am guessing there is no general form that provides the appropriate indicators corresponding to all x[j]? (A general two-sided linking is sketched after this thread.) – placeholder Jun 30 '20 at 05:10
  • To the person that contributed the coding part of the solution, I was asking only to see something like "model.addConsCardinality([x[3],x[7]],1)" so that I could figure out what it was trying to achieve. But your full solution was not in vain and has provided further insight and greater understanding, even though these insights must now feel mundane to anyone not brand new to this. Thank you. – placeholder Jun 30 '20 at 05:12
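Regarding the question in the last comments about getting indicators for all x[j]: a two-sided big-M linking can do this in general, after which "use all three of items 1, 2, 3" becomes b[1] + b[2] + b[3] == 3. Below is a minimal sketch, assuming a small tolerance eps below which x[j] is treated as unused (a strict x[j] > 0 cannot be modeled exactly for a continuous variable); the bounds and objective are illustrative.

from pyscipopt import Model

J = range(10)
U = [10*j for j in J]  # upper bound on x (constant)
eps = 1e-4             # tolerance: x[j] counts as "used" only if x[j] >= eps

model = Model("TwoSidedLinking")

x = [model.addVar(vtype="C", lb=0, ub=U[j]) for j in J]
b = [model.addVar(vtype="B") for j in J]

for j in J:
    # x[j] > 0 forces b[j] = 1
    model.addCons(x[j] <= b[j]*U[j])
    # b[j] = 1 forces x[j] >= eps, hence x[j] = 0 forces b[j] = 0
    model.addCons(x[j] >= eps*b[j])

# example: require all three of x[1], x[2], x[3] to be used
model.addCons(b[1] + b[2] + b[3] == 3)

model.setObjective(sum(x), sense="maximize")
model.optimize()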