I have a gRPC client integration that receives messages roughly 65 MB in size (mainly date x time tuple arrays). Deserializing the received message appears to allocate about 700 MB of additional unmanaged memory on the initial request. Subsequent identical requests do not increase total process memory consumption any further.
Is that normal? Any ideas on how to figure out what is going on, or better yet, how to control it?
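For reference, this is roughly how I'm measuring the growth: snapshotting peak process RSS before and after the request. The sketch below is a minimal, self-contained illustration in Python (my actual client is not shown here); the 65 MB buffer is a stand-in for the deserialized response, not the real gRPC call.

```python
import resource
import sys

def peak_rss_mb() -> float:
    # ru_maxrss is reported in KiB on Linux and in bytes on macOS.
    raw = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return raw / (1024 * 1024) if sys.platform == "darwin" else raw / 1024

before = peak_rss_mb()

# Stand-in for the ~65 MB deserialized message; in the real client this
# would be the stub call that parses the response into an object graph.
# b"x" * n touches every page, so the allocation shows up in peak RSS.
payload = b"x" * (65 * 1024 * 1024)

after = peak_rss_mb()
print(f"peak RSS grew by ~{after - before:.0f} MB")
```

With the real call substituted in, comparing the growth on the first request versus later identical ones should show whether the extra ~700 MB is a one-time allocation (e.g. pooled/retained buffers) rather than a per-request leak.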