I have a GZipStream feeding an SslStream.
Today, for the first time, I noticed the error "The gzip stream can't contain more than 4GB data." with this stack trace:
at System.IO.Compression.FastEncoder.GetCompressedOutput(Byte[] outputBuffer)
at System.IO.Compression.DeflateStream.InternalWrite(Byte[] array, Int32 offset, Int32 count, Boolean isAsync)
at System.IO.Compression.DeflateStream.Write(Byte[] array, Int32 offset, Int32 count)
at System.IO.Compression.GZipStream.Write(Byte[] array, Int32 offset, Int32 count)
at ...
The writer to the network is feeding in data faster than the reader is consuming it, so the cause of the error is unclear to me. Is this a limit on the total bytes written through the stream, or is it a backlog issue getting the data out of the GZipStream and into the SslStream?
The reader is able to unzip and use the data before the stream ends, so I never suspected there might be a limit on the total bytes written. There also doesn't seem to be a way to check the stream's length.
Could anybody share examples on how they handled this?
Code outline:
TcpClient network = new TcpClient();
network.Connect(m_config.Address.Host, m_config.Address.Port);
SslStream sslStream = new SslStream(network.GetStream(), true .. ssl bits
Stream outStream = new GZipStream(sslStream, CompressionMode.Compress, true);
try
{
    String nextMessage;
    while (messages.Dequeue(out nextMessage))
    {
        byte[] buffer = Encoding.UTF8.GetBytes(nextMessage + "\n");
        outStream.Write(buffer, 0, buffer.Length);
    }
}
catch
{
    // error handling elided
}
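If the limit does apply to the total bytes written through a single GZipStream instance (which the FastEncoder frame in the stack trace suggests), one workaround I have been considering is to count the bytes myself and start a fresh gzip member on the same underlying stream before reaching the cap. A sketch, assuming the names CompressedWriter and MaxBytesPerStream (both mine), and assuming the reader can handle concatenated gzip members:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

// Sketch: track bytes written and recreate the GZipStream before the
// 4 GB limit is hit. The 3 GB threshold is an arbitrary safety margin.
class CompressedWriter : IDisposable
{
    const long MaxBytesPerStream = 3L * 1024 * 1024 * 1024;

    readonly Stream _inner;   // e.g. the SslStream; left open across resets
    GZipStream _gzip;
    long _written;            // uncompressed bytes written to current member

    public CompressedWriter(Stream inner)
    {
        _inner = inner;
        _gzip = new GZipStream(_inner, CompressionMode.Compress, true);
    }

    public void WriteMessage(string message)
    {
        byte[] buffer = Encoding.UTF8.GetBytes(message + "\n");
        if (_written + buffer.Length > MaxBytesPerStream)
            ResetStream();
        _gzip.Write(buffer, 0, buffer.Length);
        _written += buffer.Length;
    }

    // Finish the current gzip member and start a fresh one on the same
    // underlying stream. Concatenated members are still valid gzip data.
    void ResetStream()
    {
        _gzip.Dispose();      // flushes and writes the gzip trailer
        _gzip = new GZipStream(_inner, CompressionMode.Compress, true);
        _written = 0;
    }

    public void Dispose()
    {
        _gzip.Dispose();
    }
}
```

One caveat I'm aware of: concatenated gzip members are valid per the gzip format, but some GZipStream versions decode only the first member on decompression, so the reader may need to loop over members itself.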