Is there a standard way to encode and decode byte arrays into a sequence of characters that can be stored in a std::string?

The reason for this question is that HyperTable's value type is std::string, so whatever custom type you want to store in the table needs to be serialized into the string and then deserialized when reading it back. (see this thread for details: https://groups.google.com/forum/?fromgroups=#!topic/hypertable-user/_igfvLv9IfA)

    `std::string(byteArray, byteArray+size)`? Are the byte arrays interdependent (do they contain data that must be changed when they are deserialised)? – Mankarse Dec 31 '12 at 08:51
  • @Mankarse, the problem with that simple approach is that, according to the documentation link I posted in the question, the SELECT will un-escape the one-byte versions of \n, \t and \\ into their two-byte versions. So if my binary data contains one of those bytes by chance, it will be unescaped, corrupting the data. My hope was that this problem was so common that a std function would exist somewhere for this. I've since written one and it seems to work, but I'm happy to replace it with something standard; I hate reinventing the wheel – lurscher Jan 09 '13 at 01:15
  • There is no standard way to do this, but if I understand correctly, it could be implemented in less than 10 lines of code. – Mankarse Jan 09 '13 at 01:20
  • Did you try to escape your strings before writing them? If you do it yourself, the HT process combined with your own decode won't corrupt the data – Matthias Kricke Feb 06 '13 at 09:49

0 Answers