I want to render the character's sprite as pure white, with ColorMod and AlphaMod set to some external values, on top of the character's actual sprite at certain frames. Here's the code:
auto spr = m_currentAnimation->getSprite();
bool shining = m_shineLockedTimer.isActive() || m_shineAlphaTimer.isActive();

// Draw the regular sprite first
renderer_.renderTexture(spr, texPos.x + xoffset, texPos.y, camera_, flip);

if (shining)
{
    float alpha = 1 - m_shineAlphaTimer.getProgressNormalized();

    // Create a temporary texture and fill it with white at 0 alpha
    auto pw = renderer_.createTexture(m_currentAnimation->getSize());
    renderer_.setRenderTarget(pw);
    renderer_.fillRenderer({255, 255, 255, 0});

    // Copy the sprite's alpha channel into pw (the RGB stays white)
    auto blendmode = SDL_ComposeCustomBlendMode(SDL_BLENDFACTOR_ONE, SDL_BLENDFACTOR_ONE, SDL_BLENDOPERATION_MAXIMUM,
                                                SDL_BLENDFACTOR_ONE, SDL_BLENDFACTOR_ZERO, SDL_BLENDOPERATION_ADD);
    SDL_SetTextureBlendMode(spr, blendmode);
    renderer_.renderTexture(spr, 0, 0);
    SDL_SetTextureBlendMode(spr, SDL_BLENDMODE_BLEND);

    // Tint the white copy and draw it over the sprite on the default target
    SDL_SetTextureBlendMode(pw, SDL_BLENDMODE_BLEND);
    SDL_SetTextureColorMod(pw, m_colorShine.r, m_colorShine.g, m_colorShine.b);
    SDL_SetTextureAlphaMod(pw, alpha * m_colorShine.a);
    renderer_.setRenderTarget(nullptr);
    renderer_.renderTexture(pw, texPos.x + xoffset, texPos.y, camera_, flip);

    SDL_DestroyTexture(pw);

    std::cout << SDL_GetError() << std::endl;
    SDL_ClearError();
}
Basically, I create a pw (pure white) texture, set it as the render target, fill it with white at 0 alpha, copy the alpha from the actual sprite using a custom blend mode, restore the render target, render this new pw texture and destroy it. This code runs every frame (while the effect is applied) for 2 characters. Just in case, here's the .cpp file with this draw function.
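For reference, this is what the custom blend mode is meant to compute while pw is the render target (my reading of the factor/operation arguments, stated as a comment, not new code in the project):

// dstRGB = max(srcRGB * 1, dstRGB * 1) -> stays 255, since pw was just filled with white
// dstA   = srcA * 1 + dstA * 0         -> copies the sprite's alpha channel into pw
auto blendmode = SDL_ComposeCustomBlendMode(
    SDL_BLENDFACTOR_ONE, SDL_BLENDFACTOR_ONE,  SDL_BLENDOPERATION_MAXIMUM, // color
    SDL_BLENDFACTOR_ONE, SDL_BLENDFACTOR_ZERO, SDL_BLENDOPERATION_ADD);    // alpha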
The result looks like this. However, it causes this artifact. I did some debugging and here are the results:
- The artifact only happens for the character that is rendered first in a given frame, even if the rendering order changes during the effect.
- The artifact depends on the position where the texture is rendered. I've tried rendering pw at a constant position on screen, without any scaling or rotation, and the artifact remained the same. Here's the code of this render function, just in case:
void Renderer::renderTexture(SDL_Texture* tex_, float x_, float y_, float w_, float h_)
{
    SDL_FRect dst;
    dst.x = x_;
    dst.y = y_;
    dst.w = w_;
    dst.h = h_;
    SDL_RenderCopyF(m_renderer, tex_, NULL, &dst);
}

void Renderer::renderTexture(SDL_Texture* tex_, float x_, float y_) // The one I used
{
    int w, h;
    SDL_QueryTexture(tex_, NULL, NULL, &w, &h);
    renderTexture(tex_, x_, y_, w, h);
}
- I can't say for sure whether the artifact depends on the framerate, on a specific sprite of either character, or on their distance. For some reason its behavior changed a lot when I tried changing the framerate (at a lower framerate the pw texture is not visible at all) or made the effect apply constantly (removing SDL_SetTextureColorMod and SDL_SetTextureAlphaMod). In the latter case, for some reason, there was no artifact with some sprites unless the other character used other, unrelated sprites. For example, if the character with the artifact jumps, he has no artifact unless the other character jumps as well, even if that character switches to a different set of sprites afterwards, like some airborne attack; yet there is still no artifact if the second character goes straight into the attack without using the jump animation.
- The artifact remains if I remove the BlendMode, ColorMod and AlphaMod manipulations, and if I don't fill the texture with white, but it disappears if I remove SDL_DestroyTexture(pw);, and I honestly can't understand why. The code above doesn't print anything from SDL_GetError(), meaning SDL_DestroyTexture(pw) works correctly.
As I understand it, SDL_DestroyTexture(pw) causes this artifact, which I can't understand at all: even if I don't destroy it, I lose the pointer to the SDL_Texture, and even if SDL for some reason allocates a new texture at the same address where the previous one was located, I overwrite that texture with fillRenderer anyway. Here are some of the functions that I use:
SDL_Texture* Renderer::createTexture(const Vector2<int>& size_)
{
    return SDL_CreateTexture(m_renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, size_.x, size_.y);
}

void Renderer::setRenderTarget(SDL_Texture* tex_)
{
    int i = SDL_SetRenderTarget(m_renderer, tex_);
    if (i != 0)
        std::cout << i << ": " << SDL_GetError() << std::endl;
}

void Renderer::fillRenderer(const SDL_Color& col_)
{
    SDL_SetRenderDrawColor(m_renderer, col_.r, col_.g, col_.b, col_.a);
    SDL_RenderClear(m_renderer);
}
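One way to check the address-reuse idea would be to log the pointer that createTexture returns each frame while the effect is active, something like this (illustrative only, not part of my draw code):

// Hypothetical check: does SDL hand back the same address right after SDL_DestroyTexture?
auto pw = renderer_.createTexture(m_currentAnimation->getSize());
std::cout << "pw created at " << static_cast<void*>(pw) << std::endl;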
I build in debug mode, so there should be no code-breaking optimizations from the compiler. So, my question is: why does SDL_DestroyTexture(pw) cause this artifact, and how can I fix it?
Also, I have one more question: is it okay to create and destroy a texture every frame, or is it better to create a white version of every animation along with the actual animations and use those?
UPD. I changed the animation class to optionally prerender white sprites, and it seems to work fine, although I still want to know what the reason for that bug was.
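For reference, the prerendering boils down to building a white copy of each frame once at load time, using the same alpha-copying blend mode as above, and then only tinting that copy with ColorMod/AlphaMod at draw time. A rough sketch of the idea (names and structure here are illustrative, not my actual animation class):

#include <SDL.h>

// Illustrative helper: build a white-silhouette copy of an already loaded sprite once.
SDL_Texture* makeWhiteCopy(SDL_Renderer* renderer, SDL_Texture* sprite, int w, int h)
{
    SDL_Texture* white = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888,
                                           SDL_TEXTUREACCESS_TARGET, w, h);
    SDL_SetRenderTarget(renderer, white);

    // Fill with white at 0 alpha
    SDL_SetRenderDrawColor(renderer, 255, 255, 255, 0);
    SDL_RenderClear(renderer);

    // dstRGB = max(srcRGB, dstRGB) keeps the white fill, dstA = srcA copies the sprite's alpha
    SDL_BlendMode copyAlpha = SDL_ComposeCustomBlendMode(
        SDL_BLENDFACTOR_ONE, SDL_BLENDFACTOR_ONE, SDL_BLENDOPERATION_MAXIMUM,
        SDL_BLENDFACTOR_ONE, SDL_BLENDFACTOR_ZERO, SDL_BLENDOPERATION_ADD);
    SDL_SetTextureBlendMode(sprite, copyAlpha);
    SDL_RenderCopy(renderer, sprite, NULL, NULL);
    SDL_SetTextureBlendMode(sprite, SDL_BLENDMODE_BLEND);

    SDL_SetRenderTarget(renderer, NULL);
    SDL_SetTextureBlendMode(white, SDL_BLENDMODE_BLEND);
    return white;
}

At draw time, SDL_SetTextureColorMod and SDL_SetTextureAlphaMod are applied to the prerendered copy, exactly as with pw above, but no texture is created or destroyed per frame.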