Note: You don't have to understand Approximation Algorithms to answer this
Hello.
I need to prove an approximation guarantee for an algorithm, using expectation.
The input consists of constants $c_i \in \{0,1,2\}$ for $i \in \{1,\dots,n\}$, and we would like to find an assignment of values $x_i \in \{0,1,2\}$ to variables $x_1,\dots,x_{n+2}$ such that
$$x_i + x_{i+1} + x_{i+2} \not\equiv 0 \pmod 3 \quad \text{for all } i \in \{1,\dots,n\},$$
and such that the number of indices $i$ with $x_i + x_{i+2} \equiv c_i \pmod 3$ is maximized.
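To make the objective concrete, here is a small Python helper I wrote (my own encoding, 0-indexed, not part of the exercise) that checks feasibility of an assignment and counts the satisfied indices:

```python
def is_feasible(x):
    """x is a list of n+2 values in {0,1,2}; feasible iff no window of
    three consecutive values sums to 0 mod 3."""
    return all((x[i] + x[i + 1] + x[i + 2]) % 3 != 0 for i in range(len(x) - 2))

def objective(x, c):
    """Number of indices i (0-indexed) with x[i] + x[i+2] == c[i] (mod 3),
    where c has length n = len(x) - 2."""
    return sum((x[i] + x[i + 2]) % 3 == c[i] % 3 for i in range(len(c)))
```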
The algorithm does the following: choose $x_1, x_2 \in \{0,1,2\}$ independently and uniformly at random, and then, for each $i \in \{3,\dots,n+2\}$, given the values of $x_{i-2}, x_{i-1}$, assign to $x_i$ one of the two values in $\{b \in \{0,1,2\} \mid x_{i-2} + x_{i-1} + b \not\equiv 0 \pmod 3\}$, chosen uniformly at random. (Note that the assignment to each $x_i$ is independent of $x_j$ for all $j < i-2$.)
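The sampling procedure can be sketched in Python as a direct transcription of the steps above (variable names are my own):

```python
import random

def sample_assignment(n, rng=random):
    """Draw x_1,...,x_{n+2}, stored 0-indexed: the first two entries are
    uniform in {0,1,2}; each later entry is uniform over the two residues
    b with x[i-2] + x[i-1] + b != 0 (mod 3)."""
    x = [rng.randrange(3), rng.randrange(3)]
    for i in range(2, n + 2):
        # exactly one residue is excluded, so `allowed` always has size 2
        allowed = [b for b in range(3) if (x[i - 2] + x[i - 1] + b) % 3 != 0]
        x.append(rng.choice(allowed))
    return x
```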
I need to prove that this algorithm gives a $1/3$-approximation to the problem described, using expectation: that is, defining a suitable random variable $X$ for this question and proving that $\mathbb{E}[X] = 1/3$.
I am struggling with defining such an $X$ and with the calculation; I keep getting $2/3$ instead of $1/3$. Can anyone help with the calculation?
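For what it's worth, a quick Monte Carlo sanity check (my own sketch, not part of the exercise) does land near $1/3$ for the expected fraction of satisfied indices, so the target value looks right; it's the proof I'm missing:

```python
import random

def estimate_satisfied_fraction(n, c, trials=20000, seed=0):
    """Estimate E[fraction of indices i with x_i + x_{i+2} = c_i (mod 3)]
    under the randomized assignment described above (0-indexed)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = [rng.randrange(3), rng.randrange(3)]
        for i in range(2, n + 2):
            allowed = [b for b in range(3) if (x[i - 2] + x[i - 1] + b) % 3 != 0]
            x.append(rng.choice(allowed))
        total += sum((x[i] + x[i + 2]) % 3 == c[i] for i in range(n)) / n
    return total / trials
```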