
I wanted to write a protocol buffer message to a socket and read it back on the client side. Things didn't work, so I wrote the decoding part in the server itself, right after encoding. Can you please take a look at the code below and tell me what I am doing wrong?

(I had to use ArrayOutputStream and CodedOutputStream so that I could write a size delimiter.)

int bytes_written = tData.ByteSize() + sizeof(google::protobuf::uint32);
google::protobuf::uint8 buffer[bytes_written];
memset(buffer, '\0', bytes_written);
google::protobuf::io::ArrayOutputStream aos(buffer,bytes_written);
google::protobuf::io::CodedOutputStream *coded_output = new google::protobuf::io::CodedOutputStream(&aos);
google::protobuf::uint32 size_  = tData.ByteSize();
coded_output->WriteVarint32(size_);

tData.SerializeToCodedStream(coded_output);

int sent_bytes = 0;
std::cout << buffer << std::endl;
if ( (sent_bytes = send(liveConnections.at(i), buffer, bytes_written, MSG_NOSIGNAL)) == -1 )
    liveConnections.erase(liveConnections.begin() + i);
else
    std::cout << "sent "  << sent_bytes << " bytes to " << i << std::endl;

delete coded_output;



////////////////


google::protobuf::uint8 __buffer[sizeof(google::protobuf::uint32)];
memset(__buffer, '\0', sizeof(google::protobuf::uint32));
memcpy (__buffer, buffer, sizeof(google::protobuf::uint32));

google::protobuf::uint32 __size = 0;

google::protobuf::io::ArrayInputStream ais(__buffer,sizeof(google::protobuf::uint32));
google::protobuf::io::CodedInputStream coded_input(&ais);
coded_input.ReadVarint32(&__size);
std::cout <<" size of payload is "<<__size << std::endl;

google::protobuf::uint8 databuffer[__size];
memset(databuffer, '\0', __size);
memcpy (databuffer, buffer+sizeof(google::protobuf::uint32), __size);    

std::cout << "databuffs " << "size " << __size << "  "<< databuffer << std::endl;
google::protobuf::io::ArrayInputStream array_input(databuffer,__size);
google::protobuf::io::CodedInputStream _coded_input(&array_input);
data_model::terminal_data* tData = new data_model::terminal_data();
if (!tData->ParseFromCodedStream(&_coded_input))
{
    std::cout << "data could not be parsed" << std::endl;     
}
else
{
    std::cout <<" SYMBOL --" << tData->symbol_name() << std::endl;
}
delete tData;

Output of the program:

size of payload is 55
databuffs size 55  C109"056*    BANKNIFTY0���20140915@�J    145406340
data could not be parsed
C109"056*   BANKNIFTY0���20140915@�J    145406340

1 Answer


WriteVarint32 doesn't necessarily write 4 bytes, and ReadVarint32 doesn't read 4 bytes. "Var" stands for "variable", as in "variable length encoding".
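
For example, a size of 55 fits in a single varint byte, while larger values need more. Here is a quick standalone check (hypothetical snippet, not part of your code) using the library's CodedOutputStream::VarintSize32 helper:

#include <iostream>
#include <google/protobuf/io/coded_stream.h>

int main()
{
    // VarintSize32 reports how many bytes WriteVarint32 would emit for a value.
    // A payload size of 55 needs only 1 byte, not sizeof(google::protobuf::uint32) == 4.
    std::cout << google::protobuf::io::CodedOutputStream::VarintSize32(55) << std::endl;    // 1
    std::cout << google::protobuf::io::CodedOutputStream::VarintSize32(300) << std::endl;   // 2
    std::cout << google::protobuf::io::CodedOutputStream::VarintSize32(70000) << std::endl; // 3
    return 0;
}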

When encoding, you write the size (which, for a 55-byte payload, takes only a single varint byte) and then immediately the serialized message. When decoding, you skip ahead by four bytes (sizeof(google::protobuf::uint32)) before reading the message. So you are parsing starting from the wrong offset.

Use CodedInputStream::CurrentPosition() after ReadVarint32 to figure out how many bytes the size prefix actually consumed, and advance by that number of bytes instead.
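
Here is a rough sketch of how the decoding part could look, reusing the buffer, bytes_written, and data_model::terminal_data names from your question (assumptions on my part, since I don't have your full code). One CodedInputStream reads both the size prefix and the message; CurrentPosition() tells you where the payload really starts, and PushLimit keeps the parser from reading past it:

google::protobuf::io::ArrayInputStream ais(buffer, bytes_written);
google::protobuf::io::CodedInputStream coded_input(&ais);

google::protobuf::uint32 size = 0;
if (!coded_input.ReadVarint32(&size))
{
    std::cout << "could not read size prefix" << std::endl;
}
else
{
    // CurrentPosition() is the number of bytes the varint consumed
    // (1 for a size of 55), i.e. the offset where the payload begins.
    std::cout << "payload starts at offset " << coded_input.CurrentPosition() << std::endl;

    // Restrict parsing to exactly `size` bytes, then parse from the same stream.
    google::protobuf::io::CodedInputStream::Limit limit = coded_input.PushLimit(size);
    data_model::terminal_data tData;
    if (!tData.ParseFromCodedStream(&coded_input))
        std::cout << "data could not be parsed" << std::endl;
    else
        std::cout << "SYMBOL -- " << tData.symbol_name() << std::endl;
    coded_input.PopLimit(limit);
}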

Igor Tandetnik