I have a simple application running a TCP server, based on Netty. For the incoming requests I am using ZlibCodecFactory.newZlibDecoder(ZlibWrapper.GZIP), which works perfectly fine, but after some time it stops decoding anything, acting like a black hole for any incoming data: nothing comes out of it.
I dug through the logs and found a strange phenomenon right before the black-hole behaviour starts.
My pipeline looks like this:
pipeline.addLast(new DummyInboundHandler("BEFORE"));
pipeline.addLast(ZlibCodecFactory.newZlibDecoder(ZlibWrapper.GZIP));
pipeline.addLast(new DummyInboundHandler("AFTER"));
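For reference, DummyInboundHandler is nothing more than a pass-through logger; the version below is a simplified reconstruction (the real one only differs in how it formats the log line):

```java
import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

// Simplified pass-through logger; it logs what flows by and forwards it unchanged.
public class DummyInboundHandler extends ChannelInboundHandlerAdapter {
    private final String name;

    public DummyInboundHandler(String name) {
        this.name = name;
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        if (msg instanceof ByteBuf) {
            ByteBuf buf = (ByteBuf) msg;
            System.out.println(name + " - Readable Length: " + buf.readableBytes()
                    + " Class: " + buf.getClass().getSimpleName()
                    + " Capacity: " + buf.capacity());
        }
        ctx.fireChannelRead(msg); // pass the message on to the next handler
    }
}
```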
The DummyInboundHandlers are added purely for logging purposes, to see what goes in and out. Up to some point everything is okay: I can decompress the incoming requests and process them. But then the incoming data is split in two (I am not sure whether the split is random or based on some length criterion), and the decoder stops working. It looks like this in the logs (A, B and C represent byte-array data):
Client Side sent data : AB
Server Side Received&Decompressed Data:
...
<Everything is ok up to this point>
Received : A - Details - Readable Length: 416 Total Readables: 416 Class: PooledUnsafeDirectByteBuf Capacity: 416
Decompressed: C
Received : B - Details - Readable Length: 6 Total Readables: 6 Class: PooledUnsafeDirectByteBuf Capacity: 480
<Nothing decompressed - but I can see incoming data>
...
One more weird thing: both A and AB decompress into C when I decompress manually. As you can see above, the second part (the B part of the sent data) looks odd, as it has a large capacity but only a few readable bytes. Not sure if that gives any clues.
Has anyone had similar issues with Netty and its pipelines? Am I missing some configuration?
UPDATE:
I have made further tests and am now even more confused.
I wrote two methods, one using Netty's ByteBuf and an EmbeddedChannel, and the other one decompressing a plain byte array, as follows:
public static void zlibdecode(String hex) {
    byte[] bytes = hexStringToByteArray(hex);
    ByteBuf data = null;
    ByteBuf buf = null;
    try {
        data = Unpooled.wrappedBuffer(bytes);
        EmbeddedChannel chDecoder = new EmbeddedChannel(ZlibCodecFactory.newZlibDecoder(ZlibWrapper.GZIP));
        chDecoder.writeInbound(data.copy());
        buf = chDecoder.readInbound();
        StringJoiner sj = new StringJoiner("-");
        for (byte b : ByteBufUtil.getBytes(buf)) {
            sj.add(String.format("%02X", b));
        }
        System.out.println("Plain Data\n" + sj.toString() + "\n");
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (data != null) {
            data.release();
        }
        if (buf != null) {
            buf.release();
        }
    }
}
public static void bytedecode(String hex) {
    byte[] compressedRaw = hexStringToByteArray(hex);
    try {
        java.io.ByteArrayInputStream bytein = new java.io.ByteArrayInputStream(compressedRaw);
        java.util.zip.GZIPInputStream gzin = new java.util.zip.GZIPInputStream(bytein);
        java.io.ByteArrayOutputStream byteout = new java.io.ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int len;
        while ((len = gzin.read(buf, 0, buf.length)) > 0) {
            byteout.write(buf, 0, len);
        }
        byte[] decompressed = byteout.toByteArray();
        StringJoiner sj = new StringJoiner("-");
        for (byte b : decompressed) {
            sj.add(String.format("%02X", b));
        }
        System.out.println("Plain Data: \n" + sj.toString());
    } catch (Exception e) {
        e.printStackTrace();
    }
}
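Both methods rely on a hexStringToByteArray helper that I did not paste; it is just the usual hex-to-bytes conversion, along these lines:

```java
public class HexUtil {
    // Plain hex-string-to-bytes conversion; assumes an even-length string
    // containing only valid hex digits.
    public static byte[] hexStringToByteArray(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) ((Character.digit(hex.charAt(2 * i), 16) << 4)
                    + Character.digit(hex.charAt(2 * i + 1), 16));
        }
        return out;
    }
}
```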
For the case above, sending only A to the zlibdecode method works and the decompression succeeds; however, for the same input bytedecode fails with

java.io.EOFException: Unexpected end of ZLIB input stream

which is perfectly understandable, as a proper input stream should contain AB, not only A. But I still can't understand why the first one does not fail.
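Here is a Netty-free sketch of what I suspect is going on: GZIPInputStream throws the EOFException above as soon as the underlying stream ends before the gzip stream is complete, whereas a bare java.util.zip.Inflater (the kind of streaming decoder a pipeline codec sits on top of) inflates whatever it has and then just reports that it needs more input. The payload, the split point, and the 10-byte gzip header offset below are my own assumptions for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.EOFException;
import java.util.Arrays;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
import java.util.zip.Inflater;

public class TruncatedGzipDemo {
    public static void main(String[] args) throws Exception {
        // Build a gzip stream, then drop the last 12 bytes (the 8-byte trailer
        // plus a little deflate data) to simulate receiving only fragment A.
        byte[] plain = new byte[1000];
        Arrays.fill(plain, (byte) 'x');
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(plain);
        }
        byte[] full = bos.toByteArray();
        byte[] partial = Arrays.copyOf(full, full.length - 12);

        // GZIPInputStream fails: it hits end-of-stream while the gzip stream
        // (deflate data + trailer) is still incomplete.
        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(partial))) {
            byte[] tmp = new byte[1024];
            while (in.read(tmp) > 0) { /* drain */ }
            System.out.println("GZIPInputStream: ok");
        } catch (EOFException e) {
            System.out.println("GZIPInputStream: " + e.getMessage());
        }

        // A bare Inflater over the raw deflate payload (GZIPOutputStream writes
        // a 10-byte header) inflates what it can and then simply stops,
        // waiting for more input - no exception for a truncated stream.
        Inflater inf = new Inflater(true);
        inf.setInput(partial, 10, partial.length - 10);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = inf.inflate(buf)) > 0) {
            out.write(buf, 0, n);
        }
        System.out.println("Inflater: recovered " + out.size()
                + " bytes, finished=" + inf.finished());
        inf.end();
    }
}
```

So if the fragment A happens to contain the whole deflate payload and only the gzip trailer lands in fragment B, a streaming decoder can already emit the full C from A alone, while a whole-stream reader on A fails.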