I am trying to get a solution to the Sudoku problem formulated as a binary integer program (BIP). I am modelling it as follows:
- Grid: a 2D 9x9 list, one entry per square, each with 9 binary variables; x[i][j][k] = 1 means cell (i, j) contains the digit k+1 (see the small illustration after this list).
- Constraints: standard Sudoku rules
- Objective: Maximize(1)
- Initial values: none, I just want to get a sample solution.
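To make the one-hot encoding concrete, here is a small illustration (not part of the model; the names d and cell exist only for this example):

import numpy as np

# Illustration of the intended one-hot encoding:
# a cell holding digit d has a 1 at index d - 1 and 0 everywhere else.
d = 5
cell = np.zeros(9)
cell[d - 1] = 1
print(cell)  # [0. 0. 0. 0. 1. 0. 0. 0. 0.]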
Here is the code. The moment I remove boolean=True there is no error, but how can I make it work with the boolean constraint? I do not understand the meaning of "TypeError: G must be a 'd' matrix". Any hint will be helpful.
import cvxpy as cp
import numpy as np
x = [[cp.Variable(9, boolean=True) for j in range(9)] for i in range(9)]
objective = cp.Maximize(1)
constraints = []
cs = constraints
for i in range(9):
    for j in range(9):
        # exactly one value per square
        cs.append(cp.sum(x[i][j]) == 1)
        # unique value in every column (column i, digit j)
        cs.append(cp.sum([x[k][i][j] for k in range(9)]) == 1)
        # unique value in every row (row i, digit j)
        cs.append(cp.sum([x[i][k][j] for k in range(9)]) == 1)
        # unique value in every 3x3 box (box i, digit j)
        cs.append(cp.sum([x[(i//3)*3+k//3][(i%3)*3+k%3][j] for k in range(9)]) == 1)
prob = cp.Problem(objective, constraints)
result = prob.solve()
print("z:", result)
a = [[0 for j in range(9)] for i in range(9)]
for i in range(9):
    for j in range(9):
        for k in range(9):
            if x[i][j].value[k] > 0.5:  # threshold guards against numerical noise
                a[i][j] = k + 1
    print(a[i])
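For completeness, once the solve goes through I could also read the solution out in a vectorized way. A minimal sketch, assuming every x[i][j].value comes back as a NumPy array with (approximately) 0/1 entries; vals and grid are just names for this example:

# Stack the 9x9 grid of length-9 value vectors into a (9, 9, 9) array
# and pick, for each cell, the index of its single 1-entry.
vals = np.array([[x[i][j].value for j in range(9)] for i in range(9)])
grid = vals.argmax(axis=2) + 1  # digits 1..9
print(grid)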