
I'm updating a program from SDL 1 to SDL 2 and need to use color palettes. Originally, I used SDL_SetColors(screen, color, 0, intColors); but that doesn't work in SDL 2. I'm trying to use:

SDL_Palette *palette = (SDL_Palette *)malloc(sizeof(color)*intColors);
SDL_SetPaletteColors(palette, color, 0, intColors);
SDL_SetSurfacePalette(surface, palette);

But SDL_SetPaletteColors() returns -1 and fails, and SDL_GetError() gives me no information.

How can I make a palette from an SDL_Color array and then set it as my surface's palette?

TheCodeMan54

2 Answers


It's hard to tell what your variables are and how you intend to use them without seeing your declarations.

Here's how I set up a grayscale palette in SDL_gpu:

SDL_Color colors[256];
int i;

for(i = 0; i < 256; i++)
{
    colors[i].r = colors[i].g = colors[i].b = (Uint8)i;
}

#ifdef SDL_GPU_USE_SDL2
SDL_SetPaletteColors(result->format->palette, colors, 0, 256);
#else
SDL_SetPalette(result, SDL_LOGPAL, colors, 0, 256);
#endif

The result SDL_Surface already has a palette because it has an 8-bit pixel depth (see the note at https://wiki.libsdl.org/SDL_Palette).
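
If you aren't using SDL_gpu, a minimal SDL2-only sketch of the same idea is below. It reuses the colors array from above; the 320x240 size is just an arbitrary example, and the error handling via SDL_Log()/SDL_GetError() is illustrative rather than required:

#include <SDL.h>

/* An 8-bit (indexed) surface gets a default palette allocated by SDL itself. */
SDL_Surface *surface = SDL_CreateRGBSurface(0, 320, 240, 8, 0, 0, 0, 0);
if (surface == NULL)
    SDL_Log("SDL_CreateRGBSurface failed: %s", SDL_GetError());

/* Fill that existing palette rather than malloc'ing an SDL_Palette by hand. */
if (SDL_SetPaletteColors(surface->format->palette, colors, 0, 256) != 0)
    SDL_Log("SDL_SetPaletteColors failed: %s", SDL_GetError());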

Jonny D

It has been a while since the OP posted the question and there has been no accepted answer. I ran into the same issue while migrating an SDL 1.2-based game to SDL 2.0. Here is what I did, in the hope it helps others who may be facing a similar issue:
Replace:
SDL_SetColors(screen, color, 0, intColors);

With:
SDL_SetPaletteColors(screen->format->palette, color, 0, intColors);
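
If you really do want a separate palette object (as in the question's malloc attempt), a sketch using SDL's own allocator might look like the following. It reuses the question's screen, color, and intColors names, assumes screen is an 8-bit surface, and the error handling is only illustrative:

SDL_Palette *palette = SDL_AllocPalette(intColors);   /* use this instead of malloc */
if (palette == NULL ||
    SDL_SetPaletteColors(palette, color, 0, intColors) != 0 ||
    SDL_SetSurfacePalette(screen, palette) != 0)
    SDL_Log("Palette setup failed: %s", SDL_GetError());
SDL_FreePalette(palette);   /* the surface keeps its own reference */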

David

us_david