Given n variables, I want to create decimals whose binary representations are the columns of a truth table. For example, for a, b and c, let:
a = 11110000 (240)
b = 11001100 (204)
c = 10101010 (170)
For p and q, let:

p = 1010 (10) — wait, keeping the original values:

p = 1100 (12)
q = 1010 (10)
I have worked out a formula that produces the first number of any set:
n = number of variables
x = 2^n
decimal = (2^x) - (2^(x/2))
I have implemented this in JavaScript like so:
var vars = ["a", "b", "c"];
var rows = 1 << vars.length; // 2^n rows in the truth table
var bins = [];
for (var i = 0; i < vars.length; i++) {
    // Only produces the first number, so every iteration yields the same value
    bins[i] = (1 << rows) - (1 << (rows / 2));
    console.log(bins[i].toString(2)); // logs 11110000 three times
}
I can't work out how to calculate the rest of the numbers in a set. Does anyone know a formula for this?
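For reference, here is a brute-force sketch that does produce the full expected output by testing the relevant bit of each row index (`truthColumns` is just a name I made up); I'm hoping there is a closed-form formula like the one above instead of the inner loop:

```javascript
// Brute force: for each variable, walk the 2^n rows of the truth table
// and set a bit whenever that variable is true in the row.
// Row 0 is the top row (all variables true) and maps to the most
// significant bit, matching a = 11110000 above.
function truthColumns(n) {
  var rows = 1 << n;
  var bins = [];
  for (var i = 0; i < n; i++) {
    var value = 0;
    for (var r = 0; r < rows; r++) {
      // Variable i is true in row r when the corresponding bit of the
      // row index is 0 (row 0 is "all true", the last row "all false").
      if (((r >> (n - 1 - i)) & 1) === 0) {
        value |= 1 << (rows - 1 - r);
      }
    }
    bins.push(value);
  }
  return bins;
}

console.log(truthColumns(3)); // [240, 204, 170]
console.log(truthColumns(2)); // [12, 10]
```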