
I'm doing some work on volume rendering and want to allocate a 3D luminance texture of 1024x1024x1024 uchar. Unfortunately, the allocation always fails.

By adding glGetError() after glTexImage3D(...), I get error code 1285, which means "out of memory".
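
(1285 is 0x0505, i.e. GL_OUT_OF_MEMORY.) A small helper along these lines, just a sketch rather than my exact code, makes the printout readable:

// Sketch of a helper that maps glGetError() codes to readable names.
const char* glErrorName(GLenum err)
{
    switch (err) {
        case GL_NO_ERROR:          return "GL_NO_ERROR";
        case GL_INVALID_ENUM:      return "GL_INVALID_ENUM";
        case GL_INVALID_VALUE:     return "GL_INVALID_VALUE";
        case GL_INVALID_OPERATION: return "GL_INVALID_OPERATION";
        case GL_OUT_OF_MEMORY:     return "GL_OUT_OF_MEMORY"; // 0x0505 == 1285
        default:                   return "unknown GL error";
    }
}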

However, my card is an NV Quadro 4800 with 1536 MB of memory, which is larger than the texture size above (1 GB). The driver version is 296.88, and GLEW is the latest version, 1.8.

The code that allocates the texture is as follows:

// texVoxels is the texture object id; volumeSize is 1024 x 1024 x 1024.
glBindTexture(GL_TEXTURE_3D, texVoxels);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE,
             volumeSize.x, volumeSize.y, volumeSize.z,
             0, GL_LUMINANCE, GL_UNSIGNED_BYTE, voxels);

int err = glGetError();
printf("%d\n", err);   // prints 1285

PS. *glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, dim)* returns 2048, which means my hardware should be able to allocate a 1024³ uchar texture.
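
A GL_PROXY_TEXTURE_3D query checks a bit more than the bare maximum size; roughly like this (untested sketch, using the same parameters as above):

// Ask the implementation whether it would accept this exact texture without
// actually allocating it. If the proxy is rejected, the queried width is 0.
// (A successful proxy still doesn't guarantee the real allocation succeeds.)
glTexImage3D(GL_PROXY_TEXTURE_3D, 0, GL_LUMINANCE8,
             1024, 1024, 1024, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
GLint proxyWidth = 0;
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_3D, 0, GL_TEXTURE_WIDTH, &proxyWidth);
printf("proxy width: %d\n", proxyWidth);   // 0 means the texture was rejected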

PPS. I also do some work in CUDA, and there's no problem allocating a 2048³ uchar texture there.

What's the problem with this OpenGL 3D texture?

rtrobin

2 Answers


Thanks to all the friends who answered my question. I eventually found my mistake.

I had been compiling the code in Win32 (x86) mode. After switching to x64 mode to compile and run, it finally WORKS!! And it works regardless of whether I use GL_LUMINANCE or GL_LUMINANCE8.

It seems that a 32-bit (Win32) build can't address a video memory allocation larger than roughly 600 MB.
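
A quick way to confirm which mode a binary was actually built in (just a sanity-check sketch):

#include <cstdio>

int main()
{
    // A Win32 (x86) build has 4-byte pointers and roughly 2 GB of address
    // space for the whole process; an x64 build has 8-byte pointers.
    printf("pointer size: %u bytes (%s build)\n",
           (unsigned)sizeof(void*), sizeof(void*) == 8 ? "x64" : "x86");
    return 0;
}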

Here is the link I eventually found, which describes the same problem: http://www.opengl.org/discussion_boards/showthread.php/177442-Out-of-memory-problem?highlight=memory

rtrobin
  • Just had the same issue and can confirm that targeting x64 works; in my case the memory limit when targeting x86 seems to be about a 200 MB 3D texture on an Nvidia GTX 970M – Martins.A Nov 14 '16 at 18:59

What's the problem with this OpenGL 3D texture?

You exceeded memory limitations.

Just because your GPU's memory is 1.5 GB doesn't mean that you can allocate a 1 GB chunk as a single texture. Your OpenGL implementation has the right to refuse to do so on memory grounds, regardless of how much GPU memory your card has.

And if it wants to reject this, there's nothing you can do about it.

The best you can do is give your image a sized internal format. That is, use GL_LUMINANCE8 rather than the unsized GL_LUMINANCE. Or better yet, use GL_R8. There is no guarantee that any of these will work, though.
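
For example, a minimal sketch of that suggestion, reusing the names from the question (GL_R8/GL_RED require GL 3.0 or ARB_texture_rg, and the shader would read the value from the red channel):

// Same allocation as in the question, but with the sized GL_R8 internal format.
glBindTexture(GL_TEXTURE_3D, texVoxels);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage3D(GL_TEXTURE_3D, 0, GL_R8,
             volumeSize.x, volumeSize.y, volumeSize.z,
             0, GL_RED, GL_UNSIGNED_BYTE, voxels);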

Nicol Bolas
  • Thanks for the reply. :) I really do know this. Even a few days ago I also thought that my GPU couldn't allocate a 1 GB texture. However, there is an open-source project called **VoReEn** that succeeded in allocating a 1 GB texture on my GPU! It uses **GL_ALPHA8** and works. But in my code, **GL_ALPHA8** and **GL_LUMINANCE8** don't work at all. :( – rtrobin Aug 04 '12 at 10:03
  • Sometimes that's just what happens. You can increase your chances of being able to make a large texture by closing other applications that might be using some (and make sure *your* program isn't using much VRAM either). – geometrian Aug 04 '12 at 15:02
  • My problem is solved. :) I had been compiling the code in Win32 mode. After switching to x64 mode to compile and run, it finally WORKS!! – rtrobin Aug 06 '12 at 07:28