
I have a problem with a bitset constructed from a char when I access it with indices.

#include <iostream>
#include <string>
#include <vector>
#include <algorithm>
#include <bitset>

using namespace std;

int main()
{
    char c = 'C';
    bitset<7> b(c);
    cout << b << endl;
    for (int j = 0; j != 7; ++j) {
        cout << b[j];
    }
    return 0;
}

Results:

1000011
1100001

I must have forgotten something very simple. Can someone explain why the output of the bitset is not the same when I print it directly and when I use indices?

  • Unrelated to your question, but that loop condition is not "standard", it makes the meaning of the loop harder to understand, and it is incredibly fragile. I hope you don't use such a construct very often, and especially not if sharing code with others. – Some programmer dude May 08 '15 at 06:25

4 Answers


Your for loop writes the least significant bit first, which is the opposite of the order in which numbers are normally written.
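
For illustration, here is a minimal sketch of my own (not part of the original answer, using only the standard library) that loops from the highest index down so that the indexed output matches the stream output:

#include <bitset>
#include <iostream>

int main()
{
    std::bitset<7> b('C');            // 'C' is 67, i.e. 1000011 in binary
    std::cout << b << '\n';           // operator<< prints the most significant bit first: 1000011
    for (int j = 6; j >= 0; --j) {    // walk from bit 6 down to bit 0
        std::cout << b[j];            // now this also prints 1000011
    }
    std::cout << '\n';
    return 0;
}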

– marom

That is because std::bitset indexing starts with the least significant bit (at position 0).
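
As a small sketch of my own (standard library only, not from the answer), position 0 really is the least significant bit of the value stored from 'C' (decimal 67):

#include <bitset>
#include <iostream>

int main()
{
    std::bitset<7> b('C');              // 67 == 1000011 in binary
    std::cout << b[0] << b[1] << '\n';  // 11 -> the two lowest bits of 67
    std::cout << b.test(6) << '\n';     // 1  -> the highest stored bit
    return 0;
}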

– vsoftco

Read this (from here):

"A basic_string whose contents are used to initialize the bitset: The constructor parses the string reading at most n characters beginning at pos, interpreting the character values '0' and '1' as zero and one, respectively. Note that the least significant bit is represented by the last character read (not the first); Thus, the first bit position is read from the right-most character, and the following bits use the characters preceding this, from right to left."

So the bitset stores the bits of 'c' from right to left (least significant bit at position 0), while your loop accesses the bits starting at position 0, which reverses the printed order.
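
A short sketch of my own (assuming only the standard library, not from the answer) that shows the same right-to-left convention with the string constructor the quote describes:

#include <bitset>
#include <iostream>
#include <string>

int main()
{
    std::bitset<7> b(std::string("1000011"));   // the last character read becomes bit 0
    std::cout << b[0] << b[1] << b[6] << '\n';  // 111 -> right-most '1', the '1' next to it, and the left-most '1'
    return 0;
}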

– user2125722
1000011   
+     +   
|     |   
v     v   
b[6]  b[0]

Do you get it?
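
To make the mapping concrete, here is a small sketch of my own (standard library only, not from the answer) comparing the character positions of to_string() with the bit indices:

#include <bitset>
#include <iostream>
#include <string>

int main()
{
    std::bitset<7> b('C');
    std::string s = b.to_string();                  // "1000011", most significant bit first
    std::cout << s.front() << ' ' << b[6] << '\n';  // 1 1 -> the left-most character is bit 6
    std::cout << s.back()  << ' ' << b[0] << '\n';  // 1 1 -> the right-most character is bit 0
    return 0;
}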

– jfly