
We are attempting to train a network on knee MRI scans with NiftyNet. We use a spatial window_size = (400, 400, 400) with pixdim = (0.4, 0.4, 0.4). When we run these images with a smaller window size (for example 160, 160, 160) there is no problem and it works quite well; however, when we increase the window_size to achieve higher-resolution outputs we get this error: Cannot serialize protocol buffer of type tensorflow.GraphDef as the serialized size (3459900923 bytes) would be larger than the limit (2147483647 bytes).

This is due to a limit in protobuf: NiftyNet / TensorFlow cap a serialized message at the maximum value of an int32, which is 2^31 − 1 = 2147483647 bytes. At the same time, I have heard that protobuf itself should be able to cope with uint64, which would allow a much larger message. Do you know if this limit can be changed in TensorFlow/NiftyNet?
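For reference, a quick sketch of the arithmetic behind the error. The 2 GiB figure comes from the int32 limit quoted in the error message; the ~15.6× growth factor is an assumption derived purely from the window sizes above, not from inspecting NiftyNet's graph:

```python
# int32 maximum: the serialization limit protobuf enforces on one message
INT32_MAX = 2**31 - 1  # 2147483647 bytes, i.e. 2 GiB minus one byte

# Serialized GraphDef size reported in the error message
graph_bytes = 3_459_900_923
print(graph_bytes > INT32_MAX)  # the graph exceeds the limit

# Going from a 160^3 window to a 400^3 window multiplies the voxel
# count (and anything in the graph that scales with it) by ~15.6x
ratio = (400 / 160) ** 3
print(ratio)
```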

  • a lot of serialization libraries don't attempt to go beyond the 2GiB limit, mostly because it is awkward; you're absolutely right that *in terms of the protocol* there's no reason for this limit - it is purely an implementation thing; however, Google's advice is *way* smaller than this ([source](https://developers.google.com/protocol-buffers/docs/techniques#large-data)). – Marc Gravell Apr 03 '19 at 09:20
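Following the linked advice (split large data into multiple smaller messages rather than raising the limit), here is a minimal, generic sketch of that idea. `chunk_bytes` is a hypothetical helper for illustration only; it is not part of protobuf, TensorFlow, or NiftyNet:

```python
# Split a large payload into pieces that each stay under a size limit,
# so each piece can be serialized as its own message.
def chunk_bytes(data: bytes, limit: int) -> list[bytes]:
    """Split `data` into consecutive pieces of at most `limit` bytes."""
    return [data[i:i + limit] for i in range(0, len(data), limit)]

parts = chunk_bytes(b"x" * 10, 4)
print([len(p) for p in parts])  # [4, 4, 2]
```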

0 Answers