I'm trying to send an enum over the wire and I want to know how big the buffer should be on the receiving end.
I've noticed something. Given some enum:
use serde::Serialize;

#[derive(Serialize)]
enum Message {
    Big(u128),
}

fn main() {
    let big = Message::Big(u128::MAX);
    let big_bytes = bincode::serialize(&big).unwrap();
    println!("{}", big_bytes.len()); // 20

    let size_of_message = std::mem::size_of::<Message>();
    println!("{}", size_of_message); // 16
}
The serialized version is 4 bytes larger than the enum.
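If I poke at the raw bytes (just me experimenting here, and assuming bincode 1.x's defaults of little-endian, fixed-width integers with a u32 variant index), the first 4 bytes look like the variant tag and the remaining 16 are the u128 payload. Adding this inside main in the example above:

    let bytes = bincode::serialize(&Message::Big(u128::MAX)).unwrap();
    // First 4 bytes: the variant index as a little-endian u32 (0 here).
    println!("{:?}", &bytes[..4]); // [0, 0, 0, 0]
    // Remaining 16 bytes: the u128 payload.
    println!("{:?}", &bytes[4..]); // sixteen 255s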
But if you add another variant:
use serde::Serialize;

#[derive(Serialize)]
enum Message {
    Small(u8),
    Big(u128),
}

fn main() {
    let big = Message::Big(u128::MAX);
    let big_bytes = bincode::serialize(&big).unwrap();
    println!("{}", big_bytes.len()); // 20

    let size_of_message = std::mem::size_of::<Message>();
    println!("{}", size_of_message); // 24
}
The enum becomes 4 bytes larger than the serialized version.
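For what it's worth, serializing the other variant (still on what I believe are bincode 1.x defaults) suggests the serialized size depends on which variant is stored, while size_of doesn't:

    let small_bytes = bincode::serialize(&Message::Small(7)).unwrap();
    println!("{}", small_bytes.len()); // 5: 4-byte tag + 1-byte u8
    // size_of::<Message>() is still 24 no matter which variant is stored.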
This post lets me know that the extra 8 bytes come from the tag. Should I feel safe setting the buffer size to size_of::<Message>() + 4, or to size_of::<Message>() + 8? I guess I don't care about the up to 12 extra bytes this apparently might give me.
As I add more complicated struct variants to Message, I'd hope to have some systematic way of determining the maximum size. I want to be confident that it is correct enough, even if a little generous. It may also be that enums get serialized differently on different systems, or that there's something else I'm not taking into account.
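The closest thing to a systematic approach I've found is bincode::serialized_size, but I'd still have to guess which variant is biggest and hand-construct a worst-case value, so I'm not sure it's the right tool (Big(u128::MAX) standing in for my worst case is just an assumption here):

    // Measure a value I *think* is the worst case at runtime.
    let max_size = bincode::serialized_size(&Message::Big(u128::MAX)).unwrap();
    println!("{}", max_size); // 20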
Hoping someone can help me understand better.
Thanks.