
I am using lpSolve in R. My models (data envelopment analysis) run fine on my Mac, but when I run them on a UNIX cluster many of the models are reported as degenerate. The lp.control options are the same on both systems. I have been able to reduce, though not eliminate, the number of degeneracies by playing with the presolve and anti.degen options (see the sketch below).
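The calls I have been experimenting with look roughly like this (a minimal sketch on a toy model; the particular presolve and anti.degen values are only examples, see ?lp.control.options for the accepted values):

library(lpSolveAPI)
## toy model only to illustrate the lp.control call; the dimensions are arbitrary
lps <- make.lp(nrow = 7, ncol = 301)
lp.control(lps,
           presolve   = c("rows", "bounds"),
           anti.degen = c("fixedvars", "stalling"))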

Note that the exact same problem occurs when I use pre-built R packages (Benchmarking, nonparaeff) to solve the same linear programming models.

Would anyone know why the degeneracy errors occur on the UNIX cluster?

Cheers,

Peter

If anyone is interested, the code is as follows. Basically, it creates a linear programming model for each of three hundred agents and solves it. On my Mac all the problems are solved just fine, but on the cluster 90% of them are reported as degenerate:

library(lpSolveAPI)
set.seed(198302)

##############Create data
x=matrix(rnorm(1200,5,3),300,4)
y1=x%*%c(.4,.2,.7,.8)+rnorm(300,4,.5)
y2=x%*%c(.5,.8,.2,.3)+rnorm(300,4,.5)
y=cbind(y1,y2)

##############Write DEA function
xref=x
yref=y

##Define dimensions
mx<-ncol(xref)
my<-ncol(yref)
nref<-nrow(xref)
nobs<-nrow(x)

##Define empty matrices for efficiency scores, lambdas and slacks
eff<-rep(0,nobs) 
lambda<-matrix(0,nobs,nref)
slacks<-matrix(0,nobs,mx)

##Start the model: one constraint per input and per output plus one constraint on the lambdas, and one decision variable per producer (the lambdas) plus one for the efficiency score
deamodel<-make.lp(nrow=mx+my+1,ncol=nref+1)
    ## For each input and output set the row as a constraint
    for (h in 1:mx) set.row(deamodel, h, c(0, -xref[, h]))
    for (h in 1:my) set.row(deamodel, mx + h, c(0, yref[, h]))
    ##Set a single objective function
    set.objfn(deamodel, 1, 1)
    ##Set the type of constraints.  Inputs and outputs are all greater than, lambdas is equal to
    set.constr.type(deamodel, c(rep(">=", mx + my),"="))
    ##Set another row as a constraint on lambdas
    set.row(deamodel, (mx+my+1), c(0,rep(1,nref)))
    set.rhs(deamodel, 1, mx+my+1)

##In a loop create a lp model for each producer
for (k in 1:nobs){
    ##Set the right-hand side: 0 for the input constraints, the producer's outputs for the output constraints
    set.rhs(deamodel, c(rep(0,mx), y[k, ]), 1:(mx + my))
    ##Set the first column: objective coefficient in row 0 and the producer's inputs in the input rows
    set.column(deamodel, 1, c(1,x[k,]), 0:mx)
    ##Set some presolve options
    lp.control(deamodel, presolve=c("rows", "columns","rowdominate","coldominate","bounds"))
    ##Solve the model
    a=solve(deamodel)
    if (a==0){
        eff[k]<-get.objective(deamodel)
        lambda[k,]<-get.variables(deamodel)[-1]
        slacks[k,]<-get.constraints(deamodel)[1:mx]}
    if (a!=0){
        eff[k]<-NA
        lambda[k,]<-NA
        slacks[k,]<-NA
    }}

eff

1 Answer


It looks like you are solving 300 small problems (301 variables, 7 constraints each) that are inherently highly degenerate. Presolve and anti.degen can only take you so far.

From the lpSolve FAQ:

The main problems with these codes have to do with scaling, use of explicit inverses and lack of reinversion, and handling of degeneracy. Even small problems that are ill-conditioned or degenerate can bring most of these tableau codes to their knees.
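If scaling is part of the issue, lpSolveAPI exposes it through lp.control's scaling option. A hedged sketch, using the deamodel object from the question (the values shown are only examples drawn from the options documented in ?lp.control.options):

## try an explicit scaling mode before solving; adjust the values as needed
lp.control(deamodel, scaling = c("geometric", "equilibrate", "integers"))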

The lpSolve implementation on your Unix cluster (the one flagging the degenerate solutions) appears to differ from the one your Mac is invoking from R.

First test: write out the 300 LPs (using write.lp, as sketched below) and check whether they are identical on your Mac and on your Unix cluster. (You are using rnorm, and even very small rounding differences can throw off some of these highly degenerate problems.)
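A rough sketch of that check, reusing deamodel and the loop index k from the question's code (the file-naming scheme is just an example), would be to add this inside the solve loop:

## dump each producer's model to a file just before solving it, then
## diff the files produced on the Mac against those from the cluster
write.lp(deamodel, filename = sprintf("dea_model_%03d.lp", k), type = "lp")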

If your goal is simply to get rid of the degeneracy, try perturbing the rhs and especially your objective function; one possible approach is sketched below.
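A rough sketch of an rhs perturbation, reusing mx, my, y and the loop index k from the question's code (eps is an arbitrary small constant; in this particular formulation the objective has a single coefficient, so the right-hand side is the natural place to perturb):

## nudge the output right-hand sides by a tiny random amount before solving,
## to break ties between otherwise degenerate basic solutions
eps <- 1e-6
set.rhs(deamodel, c(rep(0, mx), y[k, ] * (1 + eps * runif(my))), 1:(mx + my))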

If you really want to get to the root of why the two systems differ, I suggest compiling the lpSolve *.c files yourself (see here) and calling that build from R on both your Mac and your Unix cluster to see whether the results still vary.

Hope that helps you move forward.

Ram Narasimhan