
In a shift-left operation, for example,

5 << 1 = 10

10 << 1 = 20

so a mathematical equation can be written:

n << 1 = n * 2

If there is an equation for a shift-left operation, is it possible that there is also a mathematical equation for an AND operation, or for any other bitwise operator?
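
A quick illustration of that identity as a small C sketch:

```c
#include <stdio.h>

/* Minimal sketch: the identity from the question, n << 1 == n * 2,
 * checked for a few small values (assuming no overflow). */
int main(void)
{
    for (unsigned n = 1; n <= 5; n++)
        printf("%u << 1 = %u,  %u * 2 = %u\n", n, n << 1, n, n * 2);
    return 0;
}
```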

DShah
e19293001
  • You may find the following related question of interest: [Do bitwise operators (other than shifts) make any mathematical sense in base-10?](http://stackoverflow.com/questions/3319974/do-bitwise-operators-other-than-shifts-make-any-mathematical-sense-in-base-10) – Greg Hewgill Aug 26 '11 at 03:11
  • Nothing's coming to mind. Obviously you can use addition for bitwise OR if the operands have no 1 bits in common, and you can loop on arithmetic that manages to isolate a single bit, but I don't know of anything that would do a "parallel" AND, e.g. (And I've invented more than a few such tricks in my time.) – Hot Licks Aug 26 '11 at 03:13
  • It's much easier to define arithmetic operations in terms of bitwise operations. In fact, this is what CPU designers do. (I'm sure that's a gross oversimplification.) – Keith Thompson Aug 26 '11 at 03:24
  • I use this for quick calculations: http://www.bitwiseoperatorcalculator.com – jonprasetyo Sep 20 '14 at 18:41

6 Answers


There is no straightforward single operation that maps to every bitwise operation. However, they can all be simulated through iterative means (or one really long formula).

(a & b)

can be done with:

(((a/1 % 2) * (b/1 % 2)) * 1) +
(((a/2 % 2) * (b/2 % 2)) * 2) +
(((a/4 % 2) * (b/4 % 2)) * 4) +
...
(((a/n % 2) * (b/n % 2)) * n)

where n is 2 raised to one less than the number of bits in a and b (that is, the place value of their highest bit). This assumes integer division (the remainder is discarded).
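
As a rough sketch, the same sum can be written as a loop in C (assuming unsigned integers; the name arithmetic_and is just for illustration):

```c
#include <stdio.h>

/* Minimal sketch of the formula above: rebuild a & b using only
 * integer division, remainder, multiplication and addition. The loop
 * variable p takes the values 1, 2, 4, ..., n from the formula. */
unsigned arithmetic_and(unsigned a, unsigned b)
{
    unsigned result = 0;
    for (unsigned p = 1; p != 0 && (p <= a || p <= b); p *= 2)
        result += ((a / p) % 2) * ((b / p) % 2) * p;
    return result;
}

int main(void)
{
    printf("%u\n", arithmetic_and(12, 10)); /* prints 8, same as 12 & 10 */
    return 0;
}
```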

Variable Length Coder

That depends on what you mean by "mathematical equation". There is no easy arithmetic one.

If you look at it from a formal number-theoretic standpoint you can describe bitwise "and" (and "or" and "xor") using only addition, multiplication and -- and this is a rather big "and" from the lay perspective -- first-order predicate logic. But that is most certainly not what you meant, not least because these tools are enough to describe anything a computer can do at all.

hmakholm left over Monica

Yes, they are sums. Consider a binary word A of length n. It can be written as

A = a0*2^0 + a1*2^1 + a2*2^2 + ... + a(n-1)*2^(n-1)

where each ai is an element of {0,1}.

Therefore, if ai is a bit of A and bi is the corresponding bit of B, then

A AND B = a0*b0*2^0 + a1*b1*2^1 + ... + a(n-1)*b(n-1)*2^(n-1)

and similarly

A XOR B = ((a0+b0) mod 2)*2^0 + ((a1+b1) mod 2)*2^1 + ... + ((a(n-1)+b(n-1)) mod 2)*2^(n-1)

Now consider the identity A XOR 1 = NOT A (applied bit by bit, i.e. XOR with a word of all ones complements every bit).

We now have the three operators we need (bitwise AND, bitwise XOR and bitwise NOT).

From these we can build anything we want.

For example, bitwise OR:

NOT[(NOT A) AND (NOT B)] = NOT[NOT(A OR B)] = A OR B

It's not guaranteed to be pretty, though.
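
A small sketch of these identities in C, assuming 8-bit words; the helper names and_, xor_ and not_ are made up here, and OR is obtained exactly as in the De Morgan line above:

```c
#include <stdio.h>

/* Sketch of the bit-sum identities above, assuming 8-bit words.
 * and_() and xor_() are the sums from the answer; not_() is
 * "XOR with all ones"; OR then follows from De Morgan's law. */
static unsigned and_(unsigned a, unsigned b)
{
    unsigned r = 0;
    for (unsigned p = 1; p < 256; p *= 2)
        r += ((a / p) % 2) * ((b / p) % 2) * p;
    return r;
}

static unsigned xor_(unsigned a, unsigned b)
{
    unsigned r = 0;
    for (unsigned p = 1; p < 256; p *= 2)
        r += (((a / p) % 2 + (b / p) % 2) % 2) * p;
    return r;
}

static unsigned not_(unsigned a)
{
    return xor_(a, 255);                            /* A XOR 11111111 = NOT A */
}

int main(void)
{
    unsigned a = 0xC5, b = 0x3A;
    unsigned or_ab = not_(and_(not_(a), not_(b)));  /* De Morgan */
    printf("%u %u\n", or_ab, a | b);                /* both print 255 */
    return 0;
}
```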

In response to the comment that mod-2 arithmetic is not very basic: that's true in a sense. While it is common nowadays because of the prevalence of computers, the subject we are touching on here is not particularly "basic" either. The OP has grasped something fundamental. There are finite algebraic structures studied in the mathematical field known as abstract algebra, such as addition and multiplication modulo n (where n is some number such as 2, 8 or 2^32). There are other structures built from binary operations (addition is a binary operation: it takes two operands and produces a result, as are multiplication and XOR), such as XOR, AND, bit shifts, etc., that are "isomorphic" to addition and multiplication over the integers mod n. That means they behave the same way: they are associative, distributive and so on (although they may or may not be commutative; think of matrix multiplication).

It's hard to tell someone where to start looking for more information. I guess the best way would be to start with a book on formal mathematics (mathematical proofs); you need that to understand any advanced mathematics text. Then move on to a text on abstract algebra. If you're a computer science major you will get a lot of this in your classes. If you're a mathematics major, you will study these things in depth all in good time. If you're a history major, I'm not knocking history, I'm a history channel junkie, but you should switch majors, because you're wasting your talents!

  • Your response to the mod2 comment does not belong in an answer. Also, I don't believe you're actually answering the question that the OP asked. It is obvious that through some combination of bitwise operators any math can be done, otherwise computers wouldn't work. The question is simply: does `x AND y` map to some mathematical identity like `x LEFT SHIFT 1` maps to `x * 2`, or are there any other similar mappings. – Joey Harwood Feb 08 '18 at 19:20

Except in specific circumstances, it is not possible to describe bitwise operations in terms of other mathematical operations.

An AND with 2^n - 1 is the same as taking the value modulo 2^n. An AND with the complement of 2^n - 1 can be seen as dividing by 2^n, truncating, and multiplying by 2^n again.
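
A quick sketch of those two special cases in C, assuming n = 4 (so the mask is 2^4 - 1 = 15):

```c
#include <assert.h>
#include <stdio.h>

/* Sketch of the two special cases above, taking n = 4:
 * AND with 2^n - 1 is the remainder mod 2^n; AND with its
 * complement is truncation to a multiple of 2^n. */
int main(void)
{
    unsigned x = 0xABCD;
    unsigned pow2 = 1u << 4;                        /* 2^n with n = 4 */

    assert((x & (pow2 - 1)) == x % pow2);           /* low n bits */
    assert((x & ~(pow2 - 1)) == (x / pow2) * pow2); /* high bits  */

    printf("%#x %#x\n", x & (pow2 - 1), x & ~(pow2 - 1));
    return 0;
}
```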

Ignacio Vazquez-Abrams

It depends on what you mean by "mathematical". If you are looking for simple school algebra, then the answer is no. But mathematics is not sacred; mathematicians define new operations and concepts all the time.

For example, you can represent 32-bit numbers as vectors of 32 booleans, and then define an "AND" operation on them that applies the standard boolean "and" to each pair of corresponding elements.
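
A minimal sketch of that view in C; the helper names unpack and pack are made up for illustration:

```c
#include <stdbool.h>
#include <stdio.h>

/* Sketch of the vector-of-booleans view: unpack a 32-bit number into
 * 32 bools, AND the vectors element by element, and pack the result
 * back into a number. */
static void unpack(unsigned x, bool bits[32])
{
    for (int i = 0; i < 32; i++)
        bits[i] = (x >> i) & 1u;
}

static unsigned pack(const bool bits[32])
{
    unsigned x = 0;
    for (int i = 0; i < 32; i++)
        x |= (unsigned)bits[i] << i;
    return x;
}

int main(void)
{
    bool va[32], vb[32], vr[32];
    unpack(12, va);
    unpack(10, vb);
    for (int i = 0; i < 32; i++)
        vr[i] = va[i] && vb[i];     /* element-wise boolean "and" */
    printf("%u\n", pack(vr));       /* prints 8, the same as 12 & 10 */
    return 0;
}
```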

hamstergene

Here is a proof that, for 2-bit values, you cannot describe & with just +, - and * (check this; I just came up with it now, so, who knows):

The question is, can we find a polynomial

x & y == P(x, y)

where

P(x, y) = a0_0 + a1_0*x + a0_1*y + a2_0*x^2 + ...

Here's what it would have to look like:

   0 1 2 3
  --------
0| 0 0 0 0
1| 0 1 0 1
2| 0 0 2 2
3| 0 1 2 3

First, clearly a0_0 == 0. Next you can see that if P is rewritten:

                     |------- Q(x, y) --------|
P(x, y) = xy*R(x,y) + a1_0*x + a0_1*y + ...

If y is held at 0 while x varies over 0, 1, 2, 3, then Q(x, 0) must be 0 for each of those values (and likewise Q(0, y) when x is held at 0 and y varies). So Q(x, y) may be taken to be 0 without loss of generality.

But now P(2, 2) must be 2, yet with Q gone every term of P(2, 2) contains the factor 2 * 2, which is 0 in 2-bit (mod 4) arithmetic; so the polynomial P cannot exist.

And, I think this would generalize to more bits, too.

So the answer is, if you're looking for just +, * and -, no you can't do it.

Owen
  • Um, are you somehow assuming that the basic arithmetic operations must be modulo 2^n? That's not a very elementary "mathematical" assumption, but otherwise your "2*2==0" claim doesn't seem to make sense. – hmakholm left over Monica Aug 26 '11 at 03:58
  • @Henning Oh yes, I am assuming this. I assumed that because that's how they work on a computer. – Owen Aug 26 '11 at 06:07
  • Well, they could mean just mathematical addition, subtraction and so forth. Similarly for the bitwise operators, which one can define for infinite-precision integers in the same way as they work for finite-width ones. – hmakholm left over Monica Aug 26 '11 at 11:57
  • Well, I guess that would be a different proof then :) – Owen Aug 26 '11 at 21:57
  • But (which just now strikes me) actually even simpler for that case: assume that `x & y` were a polynomial in `x` and `y`. Then, in particular, `x & 1` would be a polynomial in `x`. But that cannot be, because it has infinitely many zeroes but is not identically zero. – hmakholm left over Monica Aug 26 '11 at 22:05