
I have spent a great deal of time trying to figure out OSG's memory management. I have a scene graph with several children (actually a LOD based on an octree).

However, when I need to reset my scene (I just want to wipe ALL nodes from the scene and also free their memory), I use:

// Clear main osg::Group root node
m_rootNode->removeChildren(0, m_rootNode->getNumChildren());
m_rootNode->dirtyBound();

// Clear Main view scene data from osg::Viewer
m_viewer->setSceneData(nullptr);

BEFORE I do this, I check all my nodes with a NodeVisitor pattern and find that ALL of them have a reference count of 1, i.e., after clearing them from the scene I expect their memory to be freed. However, this does not happen: the scene is indeed reset and all the nodes disappear from the viewer, but the memory remains occupied.
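For reference, here is a minimal sketch of the kind of visitor I use (RefCountVisitor is just an illustrative name; the rest is the standard osg::NodeVisitor API):

#include <osg/NodeVisitor>
#include <iostream>

// Prints the reference count of every node it visits
class RefCountVisitor : public osg::NodeVisitor
{
public:
    RefCountVisitor()
        : osg::NodeVisitor(osg::NodeVisitor::TRAVERSE_ALL_CHILDREN) {}

    virtual void apply(osg::Node& node)
    {
        // referenceCount() is inherited from osg::Referenced
        std::cout << node.getName() << ": "
                  << node.referenceCount() << std::endl;
        traverse(node);
    }
};

// Usage:
// RefCountVisitor rcv;
// m_rootNode->accept(rcv);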

Nonetheless, when I load another scene into my viewer, the memory is reused somehow (i.e., memory usage does not increase, so there is no memory leak, but the used memory never goes down either).

I can't have this behaviour, as I need to control memory usage closely. How can I make OSG actually release the memory?

manatttta

1 Answer


It looks like OSG keeps cached instances of your data, either as CPU-side or GPU-side objects.

You could have a look at osgDB's options to disable caching in the first place (CACHE_NONE, CACHE_ALL & ~CACHE_ARCHIVES), but be aware that this can actually increase your memory consumption, as data will no longer be re-used and may be re-loaded multiple times.
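A minimal sketch of disabling the object cache globally (assuming the osgDB::Registry and osgDB::Options API; adapt to wherever you load your data):

#include <osgDB/Registry>
#include <osgDB/Options>

// Tell osgDB not to cache loaded objects at all
osg::ref_ptr<osgDB::Options> opts = new osgDB::Options;
opts->setObjectCacheHint( osgDB::Options::CACHE_NONE );
osgDB::Registry::instance()->setOptions( opts.get() );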

You could instruct osg::Texture to free the CPU-side texture data after it has been uploaded to OpenGL, in case you don't need it any more. This can be done conveniently via the osgUtil::Optimizer::TextureVisitor, which you would set up to change the auto-unref setting for each texture to true. I think running osgUtil::Optimizer with OPTIMIZE_TEXTURE_SETTINGS achieves the same effect.
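Both variants, sketched (the TextureVisitor constructor flags are taken from the osgUtil headers; double-check them against your OSG version):

#include <osgUtil/Optimizer>

// Variant 1: run the TextureVisitor directly, turning on
// unRefImageDataAfterApply for every texture in the graph
osgUtil::Optimizer::TextureVisitor tv(
    true, true,    // change the auto-unref setting, and set it to true
    false, false,  // leave client image storage untouched
    false, 1.0f,   // leave anisotropy untouched
    nullptr );
m_rootNode->accept( tv );

// Variant 2: let the Optimizer apply the texture settings
osgUtil::Optimizer optimizer;
optimizer.optimize( m_rootNode.get(), osgUtil::Optimizer::OPTIMIZE_TEXTURE_SETTINGS );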

Then, after closing down your scene as you did in your question's code, you can explicitly instruct OSG's database pager to wipe its caches:

// AllYourViews stands for however you keep track of your views,
// e.g. the views of an osgViewer::CompositeViewer
for( osgViewer::View* v : AllYourViews )
{
    v->getDatabasePager()->cancel();
    v->getDatabasePager()->clear();
}

To finally get rid of all pre-allocated GPU-side objects and their CPU-side representations, you would need to destroy your views and GLContexts.
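A sketch of such a teardown for a single osgViewer::Viewer (variable names as in your question; this assumes m_viewer and m_rootNode are osg::ref_ptr members):

// Release GL objects still referenced by the scene, then drop the viewer;
// destroying the viewer also tears down its graphics context(s)
m_rootNode->releaseGLObjects();
m_viewer->setSceneData( nullptr );
m_viewer->setDone( true );   // stop the rendering threads / frame loop
m_viewer = nullptr;          // last ref_ptr gone => viewer and contexts destroyed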

Kento Asashima
  • Thank you for your hints. Can you please give me some extra insight on the following: how do I actually call the osgDB options to deactivate the cache? And how do I destroy the views and GLContexts, specifically? – manatttta Aug 23 '16 at 10:56