
I am exploring Netty to communicate objects between VMs. I use ObjectEncoder and ObjectDecoder to serialize and deserialize them.

I quickly found out that this setup is limited to objects of at most 1 MB. Since I intend to send larger objects and do not want to impose a size limit, I set the maximum frame length to Integer.MAX_VALUE.
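For reference, this is roughly how I wire up the pipeline (a simplified Netty 3.x sketch; the handler names are arbitrary):

    import org.jboss.netty.channel.ChannelPipeline;
    import org.jboss.netty.channel.ChannelPipelineFactory;
    import org.jboss.netty.channel.Channels;
    import org.jboss.netty.handler.codec.serialization.ObjectDecoder;
    import org.jboss.netty.handler.codec.serialization.ObjectEncoder;

    public class ObjectPipelineFactory implements ChannelPipelineFactory {
        public ChannelPipeline getPipeline() throws Exception {
            ChannelPipeline pipeline = Channels.pipeline();
            // The default maxObjectSize is 1 MB; raise it so larger objects are accepted.
            pipeline.addLast("objectDecoder", new ObjectDecoder(Integer.MAX_VALUE));
            pipeline.addLast("objectEncoder", new ObjectEncoder());
            return pipeline;
        }
    }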

Unfortunately, it looks like this value is picked up to initialize some buffers, which results in unnecessary GC pressure and very likely in an OutOfMemoryError.

Is there a way to create an unlimited ObjectEncoder/ObjectDecoder that uses DynamicChannelBuffers, so that not too much memory is wasted?


1 Answer


ObjectDecoder extends LengthFieldBasedFrameDecoder, which extends FrameDecoder. FrameDecoder manages the decode buffer, and it uses a dynamic buffer with an initial capacity of 256 bytes.

However, once you receive a large object, the dynamic buffer expands itself but never shrinks. If you have multiple connections that exchange large objects, each connection's ObjectDecoder will eventually hold a very large buffer, potentially leading to an OutOfMemoryError.
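To illustrate the behavior, here is a small standalone sketch that uses a dynamic buffer directly (not FrameDecoder itself; the 16 MB write is just an example size):

    import org.jboss.netty.buffer.ChannelBuffer;
    import org.jboss.netty.buffer.ChannelBuffers;

    public class DynamicBufferGrowth {
        public static void main(String[] args) {
            // FrameDecoder's cumulation buffer starts small (256 bytes here)...
            ChannelBuffer buf = ChannelBuffers.dynamicBuffer(256);
            System.out.println("initial capacity: " + buf.capacity());

            // ...but expands as soon as a large frame is accumulated...
            buf.writeBytes(new byte[16 * 1024 * 1024]); // e.g. a 16 MB serialized object
            System.out.println("after large write: " + buf.capacity());

            // ...and clearing the indices does not shrink the capacity, so a decoder
            // that keeps this buffer retains the expanded memory for the whole connection.
            buf.clear();
            System.out.println("after clear: " + buf.capacity());
        }
    }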

This issue was fixed last week, and a new release (3.2.7.Final) will be out this week.

trustin
  • Would that mean that with 3.2.7 we can use Integer.MAX_VALUE for length-based decoders without OOM issues? – Abe Nov 11 '11 at 03:51
  • 1
    Yes, if my analysis on your problem is correct. However, you will still get an OOME if a client sends a very large object that doesn't fit into the VM heap. – trustin Nov 13 '11 at 03:27