We would like to use protobuf-net with very large objects (serialized sizes above 2 GB). Protobuf-net is not designed for this, but in our tests with large objects (serialized to around 100 MB) it performed better than anything else we tried, so it stands to reason that it would work well with very large objects too. However, there appear to be limitations on the size of the stream protobuf-net can work with, due to internal implementation details. A co-worker read in passing that it is possible to inject a custom stream reader/writer to get around this limit, but couldn't find the source. I searched around and could not find anything about this either. My question is: is this possible? If so, could you post a link to documentation or provide a code sample?
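
For context, here is a minimal sketch of the chunked workaround we have been considering in the meantime, using protobuf-net's length-prefix helpers. The `Chunk` type and the field number are illustrative, not our real model:

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

[ProtoContract]
class Chunk
{
    // Illustrative payload; in practice this would be a slice of the real data.
    [ProtoMember(1)]
    public byte[] Payload { get; set; }
}

static class ChunkedSerialization
{
    // Write each chunk as an independent length-prefixed message so that no
    // single protobuf message grows past the internal size limits.
    public static void Write(Stream destination, IEnumerable<Chunk> chunks)
    {
        foreach (var chunk in chunks)
        {
            Serializer.SerializeWithLengthPrefix(destination, chunk, PrefixStyle.Base128, fieldNumber: 1);
        }
    }

    // Read chunks back one at a time; the stream itself can be arbitrarily
    // large because each message is deserialized independently.
    public static IEnumerable<Chunk> Read(Stream source)
    {
        Chunk chunk;
        while ((chunk = Serializer.DeserializeWithLengthPrefix<Chunk>(source, PrefixStyle.Base128, fieldNumber: 1)) != null)
        {
            yield return chunk;
        }
    }
}
```

This works when the data splits naturally into independent pieces, but it forces us to break up the object graph ourselves, which is why we are asking whether a custom reader/writer could handle a single large graph instead.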
Have you read [this](https://stackoverflow.com/questions/15794274/serialize-list-of-huge-composite-graphs-using-protobuf-net-causing-out-of-memory)? – oakman Sep 12 '17 at 19:07