What is the best practice for sending OpenGL pixel data across a network and then displaying it on a client as a bitmap image?
What I currently have so far (sketched in the code below) is:
- Get the pixel data using glReadPixels,
- Create a FreeImage object using FreeImage_ConvertFromRawBits
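Simplified, those two steps look something like this (not my exact code: the width/height are placeholders for the real framebuffer size, and I read as GL_BGR because FreeImage stores colour channels in BGR order on little-endian machines):

```cpp
#include <GL/gl.h>
#include <FreeImage.h>
#include <vector>

FIBITMAP* CaptureFrame(int width, int height)
{
    std::vector<BYTE> pixels(width * height * 3);

    glPixelStorei(GL_PACK_ALIGNMENT, 1); // rows are tightly packed, no padding
    glReadPixels(0, 0, width, height, GL_BGR, GL_UNSIGNED_BYTE, pixels.data());

    // 24 bpp, pitch = width * 3 because of the pack alignment of 1 above
    return FreeImage_ConvertFromRawBits(pixels.data(), width, height, width * 3, 24,
                                        FI_RGBA_RED_MASK, FI_RGBA_GREEN_MASK,
                                        FI_RGBA_BLUE_MASK, FALSE);
}
```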
However, I am unable to find any documentation or examples on serialising this into a byte array in bitmap (RGB) format. The byte array will then be sent over the network to a client, de-serialised back into a bitmap image, and rendered. Any help in this area would be greatly appreciated.
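For reference, this is roughly what I am trying to achieve with FreeImage's memory streams (untested; FIF_BMP is just the format I assume matches "bitmap RGB", and the actual network send/receive is omitted):

```cpp
#include <FreeImage.h>
#include <vector>

std::vector<BYTE> SerialiseToBmp(FIBITMAP* dib)
{
    FIMEMORY* mem = FreeImage_OpenMemory();
    FreeImage_SaveToMemory(FIF_BMP, dib, mem, 0);

    BYTE* data = nullptr;
    DWORD size = 0;
    FreeImage_AcquireMemory(mem, &data, &size);     // view into the memory stream

    std::vector<BYTE> buffer(data, data + size);    // copy out before closing the stream
    FreeImage_CloseMemory(mem);
    return buffer;                                  // this is what would go over the network
}

FIBITMAP* DeserialiseFromBmp(const std::vector<BYTE>& buffer)
{
    // FreeImage_OpenMemory only reads from the buffer here, so the cast is safe.
    FIMEMORY* mem = FreeImage_OpenMemory(const_cast<BYTE*>(buffer.data()),
                                         static_cast<DWORD>(buffer.size()));
    FIBITMAP* dib = FreeImage_LoadFromMemory(FIF_BMP, mem, 0);
    FreeImage_CloseMemory(mem);
    return dib;                                     // rendered on the client side
}
```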
Please also provide some advice regarding best practice for this type of networking. The screen will be captured and sent at 30 fps, but at a relatively low resolution. Would a compressed JPEG image be a better approach, or would the cost of compressing each frame outweigh the bandwidth saved?
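If JPEG is the way to go, my understanding is that the only change on the sending side would be the format and a quality flag, along the lines of the sketch below (JPEG_QUALITYNORMAL comes from the FreeImage headers; whether that is the right quality level is part of what I am asking):

```cpp
#include <FreeImage.h>
#include <vector>

// JPEG variant of the serialise step above; the FIBITMAP must be 24 bpp.
std::vector<BYTE> SerialiseToJpeg(FIBITMAP* dib)
{
    FIMEMORY* mem = FreeImage_OpenMemory();
    FreeImage_SaveToMemory(FIF_JPEG, dib, mem, JPEG_QUALITYNORMAL);

    BYTE* data = nullptr;
    DWORD size = 0;
    FreeImage_AcquireMemory(mem, &data, &size);
    std::vector<BYTE> buffer(data, data + size);    // copy out before closing the stream
    FreeImage_CloseMemory(mem);
    return buffer;                                  // client would load with FIF_JPEG instead of FIF_BMP
}
```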