I'm trying to send the raw bytes of a Full HD, 32-bit color depth frame (about 3.6 MB) in the payload while emitting an event to the server.
Here's the code handling that. The data_chunk is a UstreamChunk object; its definition is copied below.
self.sio.emit("process", data_chunk.to_json())
from typing import Dict

class UstreamChunk:
    def __init__(self, chunk: bytes, part_number: int):
        self.data = chunk
        self.part_number = part_number

    def to_json(self) -> Dict:
        return {
            "data_chunk": self.data,
            "part_number": self.part_number,
        }
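For reference, here is a minimal, self-contained version of the class together with how I build the payload. The 16-byte frame is just a placeholder; as far as I understand, python-socketio transmits bytes values inside a dict as binary attachments rather than JSON text.

```python
from typing import Dict

class UstreamChunk:
    def __init__(self, chunk: bytes, part_number: int):
        self.data = chunk
        self.part_number = part_number

    def to_json(self) -> Dict:
        # Note: "data_chunk" holds raw bytes, so the result is not
        # strictly JSON; Socket.IO sends bytes as binary attachments.
        return {
            "data_chunk": self.data,
            "part_number": self.part_number,
        }

# Placeholder frame data standing in for a real video frame.
chunk = UstreamChunk(b"\x00" * 16, part_number=0)
payload = chunk.to_json()
```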
The best I could actually do, however, is to send a 360p frame, which is about 0.88 MB. Anything larger, such as a 16-bit HD frame (1.76 MB), is lost on the way to the server: the event/message never even reaches it. Could this behavior be related to a maximum payload size?
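In case it helps, this is the workaround I'm considering: splitting the frame into chunks that each stay below the server's per-message limit. The 1,000,000-byte figure is Engine.IO's documented default max_http_buffer_size; the helper name, margin, and chunk size are my own assumptions.

```python
from typing import List

# Assumed server-side limit: Engine.IO's default max_http_buffer_size
# is 1,000,000 bytes. Halving it leaves headroom for framing overhead.
MAX_PAYLOAD = 1_000_000

def split_frame(frame: bytes, chunk_size: int = MAX_PAYLOAD // 2) -> List[bytes]:
    """Slice the frame into consecutive chunks of at most chunk_size bytes."""
    return [frame[i:i + chunk_size] for i in range(0, len(frame), chunk_size)]

frame = bytes(3_600_000)  # a raw Full HD 32-bit frame is ~3.6 MB
chunks = split_frame(frame)
# Each chunk is at most 500,000 bytes; b"".join(chunks) restores the frame.
```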