
EDIT: I found a solution thanks to the paper "A survey of algorithms for calculating power indices of weighted majority games".

I'd like to solve the following problem:

Given a set N={1,..,n} with a vector of weights W={w_1,..,w_n}, sorted in descending order and with total sum 1, find all subsets C (also called "coalitions") of N such that

  1. C is "winning": the sum of the weights of the elements in C exceeds a certain threshold (e.g. 0.5);
  2. C is "minimal": removing any element from C makes the subset no longer "winning".

To clarify, here is an example: N={1,2,3}, W={0.45,0.35,0.2}

  1. The "winning" subsets here are {1,2},{2,3},{1,3} and {1,2,3}, since they all exceed 0.5 total weight
  2. Only {1,2}, {2,3} and {1,3} are minimal: from {1,2,3} you could remove one element and still be left with one of the winning subsets above.
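For small n, the two conditions can be checked directly by brute force. Here is a sketch (the function name `brute_force_mwc` is mine, not from the paper) that enumerates every non-empty subset and tests both conditions literally:

```r
# Brute-force sketch: enumerate all non-empty subsets of N = {1,..,n};
# keep those whose weight sum exceeds the threshold (winning) and for
# which removing any single element drops the sum back to the threshold
# or below (minimal). Only feasible for small n.
brute_force_mwc <- function(w, threshold = 0.5) {
  n <- length(w)
  res <- list()
  for (k in 1:n) {
    for (S in combn(n, k, simplify = FALSE)) {
      winning <- sum(w[S]) > threshold
      minimal <- all(sapply(S, function(i) sum(w[setdiff(S, i)]) <= threshold))
      if (winning && minimal) res <- c(res, list(S))
    }
  }
  res
}

brute_force_mwc(c(0.45, 0.35, 0.2))
# should list {1,2}, {1,3} and {2,3}, matching the example above
```

This is only a reference implementation of the definition: it enumerates all 2^n - 1 subsets, so it is useful for checking faster algorithms on small inputs, not for large n.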

Following the paper cited above, I implemented the code below, which recursively generates the list of all minimal winning coalitions:

MWC <- function(w, threshold = 0.5) {
  n <- length(w)
  l <- list()
  # S1: indices already in the coalition; n1: next index to consider;
  # b1: residual weight still needed to reach the threshold
  enumerate <- function(S1, n1, b1) {
    if (n1 == n) {
      # the last element completes a minimal winning coalition
      return(list(c(S1, n1)))
    } else {
      # branch 1: skip n1, but only if the remaining weights can
      # still cover the residual b1
      if (sum(w[(n1 + 1):n]) >= b1) {
        l <- c(l, enumerate(S1, n1 + 1, b1))
      }
      if (w[n1] >= b1) {
        # n1 alone covers the residual: close the coalition here
        l <- c(l, list(c(S1, n1)))
      } else {
        # branch 2: include n1 and search for the residual b1 - w[n1]
        l <- c(l, enumerate(c(S1, n1), n1 + 1, b1 - w[n1]))
      }
      return(l)
    }
  }
  return(enumerate(c(), 1, threshold))
}

w <- c(0.46, 0.3, 0.19, 0.05)
MWC(w)

The code runs in reasonable time up to around n = 20; after that, the exponential complexity makes everything infeasible.

pier
1 Answer


One solution to your problem is to use a while loop and apply the above constraints within the loop.

Below is the working code:

N <- 1:10
W <- c(0.60, 0.40, 0.33, 0.30, 0.25, 0.20, 0.15, 0.10, 0.05, 0.03)

Winning_Sets <- c()
i <- 1

# Grow subsets level by level: at size i, extend only the size-(i - 1)
# subsets that were not winning ('No_Match').
while (i != (length(N) - 1)) {
  if (i == 1) {
    Subsets <- combn(N, i)
  } else {
    # columns of the previous level that were not yet winning
    Keep_Combs <- Subsets[, which(Intermediate_Sets == 'No_Match')]
    Subsets <- combn(N, i)
    if (i == 2) {
      Subsets <- apply(Subsets, 2, function(x) {
        x <- x[x[i - 1] %in% Keep_Combs]
      })
    } else {
      # keep a size-i subset only if its first i - 1 elements match a
      # retained subset from the previous level
      Subsets <- apply(Subsets, 2, function(x) {
        unlist(apply(Keep_Combs, 2, function(y) {
          if (identical(y, x[1:(i - 1)]) == TRUE) {
            x
          }
        }))
      })
    }
    Subsets <- do.call('cbind', Subsets)
  }
  # label each subset with its set notation if winning, else 'No_Match'
  Intermediate_Sets <- apply(Subsets, 2, function(x) {
    if (sum(W[x]) > 0.5) {
      paste('{', paste(x, collapse = ','), '}', sep = '')
    } else {
      'No_Match'
    }
  })
  Winning_Sets <- append(Winning_Sets, unlist(Intermediate_Sets[Intermediate_Sets != 'No_Match']))
  # continue while some subsets are still losing; otherwise stop
  if (length(which(Intermediate_Sets == 'No_Match')) > 0) {
    i <- i + 1
  } else {
    i <- length(N) - 1
  }
}

Output:

Winning_Sets
 [1] "{1}"          "{2,3}"        "{2,4}"        "{2,5}"        "{2,6}"        "{2,7}"       
 [7] "{3,4}"        "{3,5}"        "{3,6}"        "{4,5}"        "{2,8,9}"      "{2,8,10}"    
[13] "{3,7,8}"      "{3,7,9}"      "{3,7,10}"     "{4,6,7}"      "{4,6,8}"      "{4,6,9}"     
[19] "{4,6,10}"     "{4,7,8}"      "{5,6,7}"      "{5,6,8}"      "{3,8,9,10}"   "{4,7,9,10}"  
[25] "{5,6,9,10}"   "{5,7,8,9}"    "{5,7,8,10}"   "{6,7,8,9,10}"
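A couple of the printed sets can be sanity-checked by hand against W. Note that the loop uses the strict comparison `> 0.5`, which is why, e.g., {2,8,9} appears rather than {2,8}:

```r
# Spot-check two sets from the output against the weight vector W.
W <- c(0.60, 0.40, 0.33, 0.30, 0.25, 0.20, 0.15, 0.10, 0.05, 0.03)

sum(W[c(2, 8, 9)]) > 0.5  # TRUE: {2,8,9} is winning
sum(W[c(2, 8)]) > 0.5     # FALSE: 0.40 + 0.10 is not strictly above 0.5,
                          # so the three-element set is the minimal one
```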
Isa
  • Hi, this code doesn't seem to always work (for example it doesn't work with N = 1:3, W <- c(0.45, 0.35, 0.2)). I noticed that the problem in this specific case is that it stops after the first iteration, since i = 2. Some other times it returns the error "Error in do.call("cbind", Subsets): second argument must be a list". I also noticed that it uses the function combn(N, i), so as the iterations proceed you still generate all the subsets of N and then remove the ones that were generated in excess. This gives problems when N is big and i is sufficiently high. – pier Nov 30 '21 at 09:48
  • But I think it can be fixed by passing to combn a different N that excludes the removed elements... I'll try it! – pier Nov 30 '21 at 09:49
  • My bad, I missed checking those conditions, but by adding some constraints it can be solved. Can you also tell me how big N will be? – Isa Nov 30 '21 at 10:29
  • Ideally around 50, but I don't know if that will be feasible since the possible number of subsets is ( 2^57 - 1)... however the bigger the better; with my code I have serious slowdowns around 20, I'm hoping to get at least to 30... – pier Nov 30 '21 at 10:54
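Following up on the fix suggested in the comments, here is a sketch (the function name `level_mwc` and the details are mine, and it has not been tested at scale) that grows subsets level by level and extends each subset only with larger-indexed elements, so combn over all of N is avoided and nothing is generated twice. Because the weights are sorted in descending order, a subset that first becomes winning at some level is automatically minimal: dropping its last element gives its non-winning prefix, and dropping any earlier element removes at least as much weight.

```r
level_mwc <- function(w, threshold = 0.5) {
  n <- length(w)
  winning <- list()
  frontier <- lapply(seq_len(n), function(i) i)  # level 1: all singletons
  while (length(frontier) > 0) {
    new_frontier <- list()
    for (S in frontier) {
      if (sum(w[S]) > threshold) {
        # first time this chain exceeds the threshold: minimal by the
        # descending-order argument above, and never extended further
        winning <- c(winning, list(S))
      } else {
        last <- S[length(S)]
        if (last < n) {
          # extend only with larger indices, so no subset appears twice
          for (j in (last + 1):n) {
            new_frontier <- c(new_frontier, list(c(S, j)))
          }
        }
      }
    }
    frontier <- new_frontier
  }
  winning
}

level_mwc(c(0.45, 0.35, 0.2))
# handles the N = 1:3 case from the first comment
```

This is still exponential in the worst case, but it prunes every superset of a winning coalition and never rebuilds subsets from scratch, which addresses both issues raised in the comments.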