I have a question regarding creating textures without a file. My goal is to make a function that takes a vec3 color as input and returns a texture ID for a texture of that color.

This is what I have so far, but it gives odd output with stripes of random, jumbled colors:

unsigned int colorToTexture(glm::vec3 color, const int size) {
    // Create id for texture
    unsigned int tex;
    // generate and bind texture
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // set texture wrap parameters
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    // set texture filter parameters
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // set image data
    unsigned char* data = new unsigned char[size * size * sizeof(unsigned char)];
    for (unsigned int i = 0; i < (int)(size * size * sizeof(unsigned char)) / 3; i ++) {
        data[i * 3] = (int)(color.x * 255);
        data[i * 3 + 1] = (int)(color.y * 255);
        data[i * 3 + 2] = (int)(color.z * 255);
    }
    // set texture data and generate mipmaps
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
    glGenerateMipmap(GL_TEXTURE_2D);
    // free image memory
    delete[] data;
    return tex;
}

Here's an example of what it does (note that it works fine with regular textures generated from loaded files): [screenshot: weird output from the generated texture]

CodeMan
  • Try using `glPixelStorei(GL_UNPACK_ALIGNMENT, 1)` before the `glTexImage2D` call. – G.M. Aug 07 '20 at 08:09
  • What is the point of creating a `size`x`size` texture with a solid color? This will both consume more memory and be slower than just using `1`x`1`. – derhass Aug 07 '20 at 11:56
  • I'm doing this so that I can add functionality to edit the texture later in an editor. I'm also thinking that a 1x1 texture might not work, since its dimensions have to be a power of 2 (while 1 is 2 to the 0th power, it seems a little risky to me). – CodeMan Aug 07 '20 at 16:53
  • 1x1 textures worked right from the beginning, but the "power of two" requirement fell 20 years ago. – derhass Aug 07 '20 at 22:17

1 Answer

You are uploading only 1/3 of the data the texture needs, which is most likely the problem. Try this code:

// set image data
unsigned char* data = new unsigned char[3 * size * size * sizeof(unsigned char)];
for (unsigned int i = 0; i < size * size; i++)
{
    data[i * 3] = (unsigned char)(color.x * 255.0f);
    data[i * 3 + 1] = (unsigned char)(color.y * 255.0f);
    data[i * 3 + 2] = (unsigned char)(color.z * 255.0f);
}

Furthermore, the glTexImage2D call should use GL_RGB for both the internal format and the pixel data format, since only RGB values are put into the data variable:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
Michaelt LoL
  • While this makes the artifacts appear less, the problem still presents itself. However, if I also change the glTexImage2D line to use a format of GL_RGB, everything works. I noticed this before and changed it, but it didn't fix the problem, so I changed it back without thinking about how it would actually fix part of the problem. Thank you for your response, this makes a lot of sense. I've never created a texture like this before, so it was helpful and informative. – CodeMan Aug 07 '20 at 17:02
  • @CodeMan So does that mean you fixed it fully, or does the problem still appear? – Michaelt LoL Aug 07 '20 at 17:55
  • I fixed it fully, I just added the other part to the answer. – CodeMan Aug 08 '20 at 15:14