I'm implementing an HTTPS server using OpenSSL and I've run into some strange behavior while reading data. I think the following code illustrates the issue:
int retval = 0;
char m_readBuffer[READ_BUFFER_SIZE];
std::vector<char> data;
bool readMore = true;
bool isError = false;
do
{
    SSL *ssl = m_sslClient[i];
    retval = SSL_read(ssl, m_readBuffer, READ_BUFFER_SIZE);
    if (retval <= 0)
    {
        int errorCode = SSL_get_error(ssl, retval);
        if (errorCode == SSL_ERROR_WANT_READ)
        {
            readMore = false;
            isError = false;
        }
        else
        {
            readMore = false;
            isError = true;
            CloseClient(i);
        }
    }
    else
    {
        data.insert(data.end(), m_readBuffer, m_readBuffer + retval);
    }
}
while (readMore == true);
if (isError == false)
{
    ProcessData(i, data);
}
The code works as expected, i.e. I read the data that a browser sends to me without any problem. But I arrived at this code more or less experimentally. What bothers me is the SSL_read return value: I can't (or I guess I can't) determine exactly when the data stream has ended. The documentation says that SSL_ERROR_WANT_READ means there is more data and I have to repeat the read. But from experience I've found that when SSL_read returns -1 and SSL_get_error then returns SSL_ERROR_WANT_READ, this indicates that there is no more data.

Ok, that works, but I'm worried that it may only be working by accident.

Is there a right way to determine that I have read all the data?