Goal: Streaming the Android camera to a Wowza server in the proper orientation regardless of the device's orientation (i.e. the video is always right side up).
I've looked at all the questions on here regarding camera orientation, and so far they all seem either to change only the preview rendered to the screen or to set a rotation flag in the MP4 file (not appropriate for my use case: streaming).
I'm streaming camera frames to a Wowza server, and on the Wowza server the received video is always landscape. This is fine if the phone is always held in the same orientation, but I can't guarantee my users will do this. From what I've gathered, when you grab frames directly from the camera and feed them to the encoder, you get the natural orientation of the device's camera sensor (which may be mounted landscape, as in my case), and this is completely unaffected by the preview. This is problematic because if the device is rotated during the stream, the image is rotated with it.
I have tried using an OpenGL matrix to transform the preview in a custom GLSurfaceView.Renderer, but all it does is transform the view on the screen, not the frames sent to the encoder.
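To make the attempt concrete, here is a minimal sketch of the matrix math involved (class and method names are mine; plain arrays stand in for android.opengl.Matrix so it runs outside Android). It builds a column-major Z-axis rotation, equivalent to Matrix.setRotateM(m, 0, degrees, 0, 0, 1), and applies it to a corner of the preview quad. This is the transform that, in my renderer, only affects the on-screen view:

```java
public class RotateSketch {

    // Column-major 4x4 Z rotation, equivalent to
    // android.opengl.Matrix.setRotateM(m, 0, degrees, 0f, 0f, 1f)
    static float[] rotateZ(float degrees) {
        double r = Math.toRadians(degrees);
        float c = (float) Math.cos(r), s = (float) Math.sin(r);
        return new float[] {
             c,  s, 0, 0,   // column 0
            -s,  c, 0, 0,   // column 1
             0,  0, 1, 0,   // column 2
             0,  0, 0, 1    // column 3
        };
    }

    // m * v for a column-major 4x4 matrix and a 4-component vector
    static float[] multiply(float[] m, float[] v) {
        float[] out = new float[4];
        for (int row = 0; row < 4; row++) {
            out[row] = m[row]      * v[0] + m[4 + row]  * v[1]
                     + m[8 + row]  * v[2] + m[12 + row] * v[3];
        }
        return out;
    }

    public static void main(String[] args) {
        // Rotating the quad corner (1, 0) by 90 degrees lands it at (0, 1)
        float[] v = multiply(rotateZ(90f), new float[] {1, 0, 0, 1});
        System.out.printf("%.1f %.1f%n", v[0], v[1]);
    }
}
```

Applying this matrix as the MVP in the on-screen draw call rotates the displayed quad, but the camera frames themselves reach the encoder untouched.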
I have read over Grafika's examples and am not sure where in the pipeline I need to rotate frames before they are fed to the encoder. I am using a SurfaceTexture as the camera preview target, which is then rendered to a GLSurfaceView via my own custom GLSurfaceView.Renderer.
How can I rotate the frames to the encoder?
Ideally I would do this in OpenGL. I had thought about rotating the frames on the CPU before filling the buffers returned by MediaCodec.dequeueInputBuffer(), but I'm wary of that because this is a real-time application. Maybe I am overlooking something with regard to the preview. I've seen other broadcasting apps tear down the preview layer in the UI and rebuild it whenever the device is rotated.
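For reference, the CPU path I'm wary of would look something like this (a sketch, not production code; the class name is mine). It rotates an NV21 frame 90 degrees clockwise by copying every byte of the Y plane and the interleaved VU plane, so it touches width * height * 1.5 bytes on the CPU for every frame before the buffer can be queued to the encoder:

```java
public class Nv21Rotate {

    // Rotate an NV21 frame (Y plane followed by interleaved VU at half
    // resolution) 90 degrees clockwise. Output dimensions are h x w.
    static byte[] rotate90(byte[] in, int w, int h) {
        byte[] out = new byte[in.length];
        int i = 0;
        // Y plane: input column x becomes output row x, scanned bottom-up
        for (int x = 0; x < w; x++) {
            for (int y = h - 1; y >= 0; y--) {
                out[i++] = in[y * w + x];
            }
        }
        // VU plane: half vertical resolution, V/U pairs move together
        for (int x = 0; x < w; x += 2) {
            for (int y = h / 2 - 1; y >= 0; y--) {
                out[i++] = in[w * h + y * w + x];     // V
                out[i++] = in[w * h + y * w + x + 1]; // U
            }
        }
        return out;
    }
}
```

At 1080p30 this is roughly 90 MB/s of copying per rotation pass, which is why I'd much rather do the rotation on the GPU.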
References: