I'm looking for an explanation of why this JavaScript code works.
The idea looks as follows:
* 273             84
*    \           /
*    1 |   2 |   4  =   7
* -----+-----+-----
*    8 |  16 |  32  =  56
* -----+-----+-----
*   64 | 128 | 256  = 448
* =================
*   73   146   292
Every time a player places his mark, the value of that field is added to his score. The numbers around the grid are the winning values (the sums of each row, each column and the two diagonals).
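For example, this is how I understand the scoring itself (a quick sketch of that idea; scoreOf is just a name I made up for illustration):

// A player's score is simply the sum of the values of the fields he occupies.
function scoreOf(fields) {
    return fields.reduce(function (sum, value) {
        return sum + value;
    }, 0);
}

console.log(scoreOf([1, 2, 4]));  // 7  -> the whole top row, the first entry in wins
console.log(scoreOf([1, 16, 4])); // 21 -> no complete line yet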
Now there is this check:
var wins = [7, 56, 448, 73, 146, 292, 273, 84],
    win = function (score) {
        var i;
        for (i = 0; i < wins.length; i += 1) {
            if ((wins[i] & score) === wins[i]) {
                return true;
            }
        }
        return false;
    };
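Calling it looks like this (assuming wins and win are declared exactly as above):

console.log(win(7));  // true  -- 7 is itself one of the win values (the top row)
console.log(win(64)); // false -- only one field of the bottom row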
What I don't understand: if a player places his marks in the fields 1, 16, 4, 2 (the values of the fields, in the order he played them), he ends up with a score of 23. How does the code know that he has three in a row, even though his score is not 7 (which is the top row)? The code only compares the score with the wins, and 23 is not a win!
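Just to make the numbers from my example concrete, here they are in binary (nothing new, only the values from above):

var score = 1 + 16 + 4 + 2;            // 23, the score from my example
console.log(score.toString(2));        // "10111"
console.log((7).toString(2));          // "111"  -- the top-row win
console.log((7 & score).toString(2));  // "111"  -- what (wins[i] & score) evaluates to for i = 0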