I've got some C code that renders with OpenGL ES, and it runs on both Android and iOS. On Android it looks fine, but on iOS the output is flipped vertically.
Here's some simple code to demonstrate (I've only copied the relevant parts, because OpenGL C code is long-winded):
GLfloat vVertices[] = {
     0.0f,  0.5f,   // apex (top of the screen in GL clip space, where +Y is up)
    -0.5f, -0.5f,
     0.5f, -0.5f
};

glViewport(0, 0, context->width, context->height);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(data->programObject);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, vVertices);
glEnableVertexAttribArray(0);
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableVertexAttribArray(0);
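In case it matters, the program behind data->programObject is just a pass-through pair along these lines (an illustrative sketch rather than the literal source; the point is that no transform is applied anywhere):

// Illustrative only: a minimal ES 3.0 pass-through program of the kind I'm using.
static const char *vertexSrc =
    "#version 300 es\n"
    "layout(location = 0) in vec2 aPosition;\n"      // location 0 matches glVertexAttribPointer above
    "void main() {\n"
    "    gl_Position = vec4(aPosition, 0.0, 1.0);\n" // straight pass-through, no matrix
    "}\n";

static const char *fragmentSrc =
    "#version 300 es\n"
    "precision mediump float;\n"
    "out vec4 fragColor;\n"
    "void main() {\n"
    "    fragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"    // solid color
    "}\n";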
On Android it looks like this:
[screenshot: triangle pointing up, as the vertex data suggests]
But on iOS it looks like this:
[screenshot: the same triangle flipped vertically, pointing down]
The only thing that differs between the two platforms is the OpenGL ES initialization code, since all of the rendering code is shared C. However, I can't spot anything obviously wrong with the init code.
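For comparison, the Android side is a bog-standard EGL setup, roughly like this (a simplified sketch from memory, not the literal code; nativeWindow stands in for the ANativeWindow* obtained from the Java surface):

#include <EGL/egl.h>
#include <EGL/eglext.h>

// Simplified sketch of the Android-side setup -- not the literal code.
static void initEGL(EGLNativeWindowType nativeWindow) {
    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(display, NULL, NULL);

    const EGLint configAttribs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES3_BIT_KHR,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_ALPHA_SIZE, 8, EGL_DEPTH_SIZE, 16,
        EGL_NONE
    };
    EGLConfig config;
    EGLint numConfigs;
    eglChooseConfig(display, configAttribs, &config, 1, &numConfigs);

    const EGLint contextAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 3, EGL_NONE };
    EGLContext context = eglCreateContext(display, config, EGL_NO_CONTEXT, contextAttribs);

    // One structural difference: on Android I render straight into the window
    // surface, whereas on iOS (below) I render into a texture backed by a CVPixelBuffer.
    EGLSurface surface = eglCreateWindowSurface(display, config, nativeWindow, NULL);
    eglMakeCurrent(display, surface, surface, context);
}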
And here's the iOS init code (I've removed most of the error handling, because no errors are triggered apart from the one I left in):
- (void)initGL {
    _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
    [EAGLContext setCurrentContext:_context];

    [self createCVBufferWithSize:_renderSize withRenderTarget:&_target withTextureOut:&_texture];

    glBindTexture(CVOpenGLESTextureGetTarget(_texture), CVOpenGLESTextureGetName(_texture));
    glTexImage2D(GL_TEXTURE_2D,        // target
                 0,                    // level
                 GL_RGBA,              // internalformat
                 _renderSize.width,    // width
                 _renderSize.height,   // height
                 0,                    // border
                 GL_RGBA,              // format
                 GL_UNSIGNED_BYTE,     // type
                 NULL);                // data
    // HACK: we always get an "error" here (GL_INVALID_OPERATION) despite everything
    // working. See https://stackoverflow.com/questions/57104033/why-is-glteximage2d-returning-gl-invalid-operation-on-ios
    glGetError();

    glGenRenderbuffers(1, &_depthBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _depthBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, _renderSize.width, _renderSize.height);

    glGenFramebuffers(1, &_frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, _frameBuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(_texture), 0);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, _depthBuffer);

    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"failed to make complete framebuffer object %x", status);
    } else {
        NSLog(@"Successfully initialized GL");
        char* glRendererName = getGlRendererName();
        char* glVersion = getGlVersion();
        char* glShadingLanguageVersion = getGlShadingLanguageVersion();
        NSLog(@"OpenGL renderer name: %s, version: %s, shading language version: %s", glRendererName, glVersion, glShadingLanguageVersion);
    }
}
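To complete the picture, each frame on iOS is then driven roughly like this (a simplified sketch; renderFrame stands in for the shared C entry point, which has a different name in the real code):

// Simplified per-frame flow on iOS (sketch; names are illustrative).
glBindFramebuffer(GL_FRAMEBUFFER, _frameBuffer);
renderFrame(data, context);  // runs the shared C drawing code shown at the top
glFlush();                   // ensure GL writes reach the CVPixelBuffer-backed texture
// _target (the CVPixelBufferRef) is then handed off for display.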
And here's the code that creates the actual texture (backed by a CVPixelBuffer, via the CoreVideo texture cache):
- (void)createCVBufferWithSize:(CGSize)size
              withRenderTarget:(CVPixelBufferRef *)target
                withTextureOut:(CVOpenGLESTextureRef *)texture {
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _context, NULL, &_textureCache);
    if (err) return;

    // An empty IOSurface properties dictionary makes the pixel buffer
    // IOSurface-backed, which the texture cache needs to share memory with OpenGL ES.
    CFDictionaryRef empty;
    CFMutableDictionaryRef attrs;
    empty = CFDictionaryCreate(kCFAllocatorDefault,
                               NULL,
                               NULL,
                               0,
                               &kCFTypeDictionaryKeyCallBacks,
                               &kCFTypeDictionaryValueCallBacks);
    attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                      &kCFTypeDictionaryKeyCallBacks,
                                      &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

    CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                        kCVPixelFormatType_32BGRA, attrs, target);
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 _textureCache,
                                                 *target,
                                                 NULL,              // texture attributes
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA,           // opengl format
                                                 size.width,
                                                 size.height,
                                                 GL_BGRA,           // native iOS format
                                                 GL_UNSIGNED_BYTE,
                                                 0,                 // plane index
                                                 texture);

    CFRelease(empty);
    CFRelease(attrs);
}
Can anyone tell me why iOS is flipped like this? I've since noticed other people with the same problem, such as here, but I haven't found a solution yet.
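For what it's worth, I suppose I could brute-force the two platforms into agreement by negating Y on iOS only, along these lines, but that's a hack and I'd still like to understand the root cause:

// Workaround sketch only, not a fix I'm happy with: flip clip-space Y on iOS
// so the output matches Android.
#ifdef __APPLE__
for (int i = 0; i < 3; i++) {
    vVertices[i * 2 + 1] = -vVertices[i * 2 + 1];
}
#endif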