
I installed SDL2 on macOS via Homebrew (brew install sdl2). Now I'm trying to compile a simple C program, but I'm unable to start it.

I'm trying to compile as follows:

gcc -Wall -std=c99 -I/opt/homebrew/Cellar/sdl2/2.28.2/include/SDL2 -c ../src/main.c -o target/o/main.o
gcc -Wall -std=c99 -I/opt/homebrew/Cellar/sdl2/2.28.2/include/SDL2 -c ../src/glad/glad.c -o target/o/glad.o
gcc -ldl -L/opt/homebrew/Cellar/sdl2/2.28.2/lib -lSDL2 -o target/csdldemo target/o/main.o target/o/glad.o 

Here /opt/homebrew/Cellar/sdl2/2.28.2/include/SDL2 is the path to the SDL2 headers from the Homebrew installation.
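As an aside, SDL ships an sdl2-config helper that emits the right compiler and linker flags, which avoids hard-coding the versioned Cellar path; the commands below are the same build restated with it (assuming sdl2-config is on PATH, which it should be after brew install sdl2):

```shell
gcc -Wall -std=c99 $(sdl2-config --cflags) -c ../src/main.c -o target/o/main.o
gcc -Wall -std=c99 $(sdl2-config --cflags) -c ../src/glad/glad.c -o target/o/glad.o
gcc -o target/csdldemo target/o/main.o target/o/glad.o $(sdl2-config --libs)
```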

The application compiles without any issues. However, when I start it via ./target/csdldemo it is immediately killed by the OS:

 ./target/csdldemo     
zsh: killed     ./target/csdldemo

I can still start the application via lldb ./target/csdldemo and then running it (the r command at the lldb prompt), but it fails later when I try to use OpenGL 3.3 functions.

Running under lldb at least gets me past the kill, but I think the following attributes don't take effect:

// I run the following after SDL_Init(SDL_INIT_VIDEO) but before SDL_CreateWindow:
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

I also tried setting the major/minor version to 3.3 and 4.5, to no avail.

The code that doesn't work on macOS is an attempt to compile a shader.

Shader definition:

#version 330 core
out vec4 FragColor;
void main() {
    FragColor = vec4(1.0, 0.5, 0.2, 1.0);
}

Code that compiles the above:

CHECK(fragmentShader = glCreateShader(GL_FRAGMENT_SHADER));
CHECK(glShaderSource(fragmentShader, 1, &fragmentShaderSource, NULL));
CHECK(glCompileShader(fragmentShader));

Here CHECK is a simple macro that checks the OpenGL error state after executing an OpenGL API call. The error stating that version '330' is not supported shows up after executing glCompileShader:

ERROR: 0:1: '' :  version '330' is not supported
ERROR: 0:1: '' : syntax error: #version

In the above snippet I use glCompileShader from glad, which was generated as follows:

 OpenGL loader generated by glad 0.1.34 on Thu Aug 24 17:59:33 2023.

    Language/Generator: C/C++
    Specification: gl
    APIs: gl=3.0
    Profile: compatibility
    Extensions:
        
    Loader: True
    Local files: True
    Omit khrplatform: False
    Reproducible: False

    Commandline:
        --profile="compatibility" --api="gl=3.0" --generator="c" --spec="gl" --local-files --extensions=""
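Note the mismatch above: the loader was generated for gl=3.0 compatibility, while the shader declares #version 330. A loader matching a 3.3 core context could presumably be regenerated along the same lines (same flags as above, with the profile and API bumped; the exact invocation depends on the glad version):

```shell
# Hypothetical regeneration for a 3.3 core loader (glad 0.x CLI):
python -m glad --profile="core" --api="gl=3.3" --generator="c" --spec="gl" --local-files --extensions=""
```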

The macOS version is 13.4 (Ventura).

The exact same program compiles and works fine on Linux.

I suspect that the above is a combination of two issues:

  1. Inability to run the compiled binary on the Mac, as evidenced by the immediate program termination. Unlike other blocked apps, this one doesn't show up in the security settings, so I'm not sure how to make macOS permit execution of my own compiled binaries. Presumably brew itself doesn't have this issue, since I can run binaries installed via brew.
  2. Various sources state that my version of macOS should support OpenGL 4.1; I can only speculate that the security subsystem somehow interferes with shader compilation?

Tried:

  1. Compiling and running the same program on Linux. I tried using cmake for both Mac and Linux, as well as a custom Makefile on macOS. Everything works as expected on Linux; in all cases the binary is immediately killed on the Mac. As I said, I can still run the program via lldb.
  2. Setting other OpenGL versions via the corresponding SDL_GL_SetAttribute calls: 3.3, 4.1 and 4.5.

I wonder what is missing in my setup that makes glCompileShader fail. Any steps to diagnose/troubleshoot the problem further would be appreciated.

J. A.
  • Possible duplicate of [Unable to use OpenGL 3.3 with SDL2 on M2 Mac OS X](https://stackoverflow.com/questions/76984976/unable-to-use-opengl-3-3-with-sdl2-on-m2-mac-os-x) – genpfault Aug 28 '23 at 05:03
  • Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. – Community Aug 28 '23 at 06:15
  • Add `SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);` before `SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4)`. And remember that max OGL version for Mac is 3.3 (likely 4.1 too, but not for too old hardware) – Ripi2 Aug 28 '23 at 19:05
  • Also use the same version for `SDL_GL_SetAttribute` as for the first line in each shader. Currently you ask for 4.1 but use 3.3 in the shader – Ripi2 Aug 28 '23 at 19:07
  • `SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE)` has solved the issue. Thank you very much! Please, write an answer so I could accept it. – J. A. Aug 29 '23 at 22:28

1 Answer


For those hitting the same issue: the missing part was SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE) before setting all other attributes.
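In context, the attribute setup becomes something like the sketch below. Per Ripi2's comment, the requested context version should also match the shader's #version 330 (macOS core profiles top out at 4.1 on recent hardware), so 3.3 is requested here; the remaining values mirror the question:

```c
/* After SDL_Init(SDL_INIT_VIDEO), before SDL_CreateWindow: */
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
```

Without the core profile mask, macOS hands back a legacy 2.1 compatibility context, which is why the driver rejected #version 330.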

Thanks to user Ripi2 who suggested this.

As for zsh reporting the process killed (more precisely, the OS sending SIGKILL to the process): that was due to the absence of a code signature. This can be diagnosed by looking at the crash reports available in the macOS Console application.
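On Apple Silicon the kernel kills unsigned arm64 binaries outright; Apple's linker normally applies an ad-hoc signature automatically, but if it is missing you can presumably inspect and add one yourself (the path below is the binary from the question):

```shell
# Check whether the binary carries any signature:
codesign --verify --verbose ./target/csdldemo

# Apply an ad-hoc signature ("-" as the identity means ad-hoc):
codesign --force -s - ./target/csdldemo
```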

J. A.