The client application sends two ushort numbers to the server via QTcpSocket:
ushort MessageId = 4;
ushort MessageSize = 0;
socket->write((const char*) &MessageId, sizeof(ushort));
socket->write((const char*) &MessageSize, sizeof(ushort));
socket->waitForBytesWritten();
The server application receives the 4-byte message, puts it into a QByteArray buffer, and then decodes the two numbers:
int bytes = socket->bytesAvailable();
QByteArray buffer = socket->read(bytes);
const char * messageIdBytes = buffer.mid(0, 2);
ushort messageId = (ushort)(*messageIdBytes);
const char * messageSizeBytes = buffer.mid(2, 4);
ushort messageSize = (ushort)(*messageSizeBytes);
qDebug() << QString("MessageId Bits: [%1], Value: [%2].").arg(QString::number(messageId, 2), QString::number(messageId));
qDebug() << QString("MessageSize Bits: [%1], Value: [%2].").arg(QString::number(messageSize, 2), QString::number(messageSize));
This gives the following server output (I added the spaces for readability):
MessageId Bits: [1111 1111 1101 1101], Value: [65501].
MessageSize Bits: [1111 1111 1101 1101], Value: [65501].
- Problem: the server should be receiving MessageId 4 and MessageSize 0, but it isn't.
- Observation: sending different values from the client doesn't even affect the server output. It's always that weird number 65501.
- Interesting: it does work, however, if I write only one number instead of two (see the snippet below)!
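For completeness, this is roughly what the working one-number variant looks like (same socket setup, just a single write on the client and a single read on the server). Client:

ushort MessageId = 4;
socket->write((const char*) &MessageId, sizeof(ushort));
socket->waitForBytesWritten();

Server (decoded the same way as above):

QByteArray buffer = socket->read(socket->bytesAvailable());
const char * messageIdBytes = buffer.mid(0, 2);
ushort messageId = (ushort)(*messageIdBytes);

This prints Value: [4] as expected.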
Any idea what I'm doing wrong?