
Given `uint32_t header; char array[32];`, how do I copy the data from `header` into `array` in C++? How do I carry out this conversion? I tried type-casting, but it doesn't seem to work.

  • `uint32` is 32 bits. `char[32]` is 32 * 8 bits. So the question is really, what are you **actually** trying to do? How do you imagine such a conversion should take place? – Assimilater Jul 14 '17 at 03:11
  • I actually want to store the uint32 data in the form of a character array – user8158123 Jul 14 '17 at 03:14
  • the size can be adjusted accordingly, right? – user8158123 Jul 14 '17 at 03:14
  • but what does that mean? Do you want to map `array[0]` to the 8 most significant bits of `header` and `array[3]` to the 8 least significant bits of `header` and fill `array[4]` through `array[31]` with 0? ... this is not something so intuitive.... – Assimilater Jul 14 '17 at 03:16
  • it might help to explain **why** you are trying to do this, what you are actually hoping to accomplish (you may be going about it the wrong way). Right now it's just pulling teeth...see [this explanation of the xy problem](https://meta.stackexchange.com/a/66378) – Assimilater Jul 14 '17 at 03:18
  • see, a uint32 will have 32 binary digits, right? So I would like to store these 32 bits in a character array. Is it possible to do that? – user8158123 Jul 14 '17 at 03:36
  • So, you want something like `array[0]` to be `'0'` or `'1'` based on `header` bit 0, `array[1]` to be based on bit 1, and so on? – Remy Lebeau Jul 14 '17 at 05:39
  • @user8158123 it's possible to do anything if you do it the right way...and know what you want. Maybe the thing you lack is understanding that a character is 8 bits. Maybe you just want to access bits individually and a union with a bitfield struct is more what you're after. The point is *we can't read your mind* (and for that matter neither can a compiler, that's why simple type casting failed because there is no obvious conversion for what you want) – Assimilater Jul 14 '17 at 15:27

2 Answers


Use std::bitset to get a binary representation and convert that to a char array:

#include <iostream>
#include <cstdint>
#include <bitset>

int main()
{
    std::uint32_t x = 42;
    std::bitset<32> b(x);
    char c[32];
    for (int i = 0; i < 32; i++)
    {
        // b[i] is bit i (least significant first); adding '0' maps 0/1 to '0'/'1'
        c[i] = b[i] + '0';
        std::cout << c[i];
    }
}

Note that `std::bitset` indexes bits from the least significant end, so this prints bit 0 first and the string reads in the opposite order of the usual binary notation.

Ron

I know this question is a bit old, but I will write an answer that may help someone else. You can use std::bitset, which represents a fixed-size sequence of N bits.

With std::bitset<32> bits(value) you create a sequence of 32 bits representing the 4-byte integer. You can also use the std::bitset::to_string function to convert those bits to a std::string.

However, if you want to have some more sophisticated output you can use the following function:

void u32_to_binary(uint32_t const& value, char buffer[]) {
    std::bitset<32> bits(value);
    auto stringified_bits = bits.to_string();  // MSB first

    // buffer must hold at least 40 chars: 32 bits + 7 spaces + '\0'
    size_t position = 0;
    size_t width = 0;
    for (auto const& bit : stringified_bits) {
        width++;
        buffer[position++] = bit;

        // group the bits into nibbles; skip the space after the last group
        if (width % 4 == 0 && width != 32) {
            buffer[position++] = ' ';
        }
    }

    buffer[position] = '\0';
}

which will create an output like this:

0000 0000 0000 0000 0000 0000 0001 1100

Here is how you can use it:

#include <iostream>
#include <bitset>
#include <cstdint>

// u32_to_binary as defined above

int main() {
    char buff[128];
    u32_to_binary(28, buff);
    std::cout << buff << std::endl;

    return 0;
}
NutCracker