I'm loading a large coloured point cloud into OpenSceneGraph using a Vec4Array for per-vertex colours. This is very memory inefficient, requiring 16 bytes of colour information per vertex where I'd like to store 4 or fewer. The datasets I'm dealing with are huge, so this is a significant problem.

Is there any way I can get OpenSceneGraph to use lower resolution colouring (e.g. 24 bits of color is more than adequate)?

Damon
SmacL

1 Answer

You could possibly use vertex attributes and a shader to encode XYZ and colour into as few bits as you want, then decode them back out in the shader. Beyond that, neither OpenGL nor OSG gives you a lot of flexibility.

A lot of game architectures (consoles and mobile devices) use a 16-bit 5-6-5 RGB coding scheme.

XenonofArcticus
  • Thanks for the reply. The OSG beginner's guide gives a GLSL shader example, so decoding a colour attribute via a shader seems worth investigating (and gives me an excuse to learn more about shaders). – SmacL Jan 26 '13 at 12:54