
I am running a maya python script that runs through the scene and reduces the polygon count of a mesh object if it is above a certain range.

This runs fine on a small scene, but when I run it on thousands of mesh objects, the memory use of my machine counts up and up until maya freezes at around 80% of the physical memory use.

My question is, can I add some kind of memory flush, other than what I already have, into the python loop that will 'reset' the memory use? Or is this impossible, as the thread is still running?

I am running this:

import maya.cmds as cmds

selectedObjects = cmds.ls(sl=True, fl=True)
for obj in selectedObjects:
    reduceMesh(obj)  # my function
    cmds.flushUndo()
    cmds.clearCache(all=True)
    cmds.delete(obj, constructionHistory=True)  # delete history on this object only

Adding the flushUndo call seems to help a little, but the memory still keeps rising...

anti

1 Answer


You can try reducing the size of Maya's undo cache, or turning the undo queue off altogether, using cmds.undoInfo. You'll need to try a couple of strategies to see what the real culprit is -- but in the end you're changing the contents of thousands of meshes, so the undo stack will be huge: Maya effectively has to keep both the old and new state for every one of them.
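A minimal sketch of that first suggestion: wrap the batch loop so the undo queue is off while you edit and is always restored afterwards. The `cmds` module is passed in rather than imported at the top, purely so the same class can be exercised outside Maya with a stand-in object; inside Maya you'd pass `maya.cmds`.

```python
# Sketch: a context manager that disables Maya's undo queue for the
# duration of a batch edit, then restores it.  Assumes the injected
# object exposes undoInfo(state=...) like maya.cmds does.
class NoUndo(object):
    def __init__(self, cmds):
        self.cmds = cmds

    def __enter__(self):
        # state=False stops Maya from caching pre-edit mesh state at all
        self.cmds.undoInfo(state=False)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # always restore undo for the artist, even if reduceMesh raised
        self.cmds.undoInfo(state=True)
        return False
```

Inside Maya it would look something like this (reduceMesh being your own function):

    import maya.cmds as cmds
    with NoUndo(cmds):
        for obj in cmds.ls(sl=True, fl=True):
            reduceMesh(obj)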

If that becomes impossible, you can trawl the big scene, saving individual objects out into separate files and referencing them back into the original scene. Then you can process each of those files individually. This has some performance overhead, but it will probably let you finish the job.
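One way that workflow could be sketched (the file names, the 'mayaAscii' type, and the namespace scheme here are my assumptions, not part of the answer -- adapt them to your pipeline):

```python
import os

try:
    import maya.cmds as cmds  # only importable inside a Maya session
except ImportError:
    cmds = None  # lets the path logic below be tested outside Maya

def export_path(out_dir, obj):
    # "ns:pCube1" or "|group|mesh" -> filesystem-safe "<out_dir>/ns_pCube1.ma"
    safe = obj.replace('|', '_').replace(':', '_')
    return os.path.join(out_dir, safe + '.ma')

def split_and_reference(objects, out_dir):
    # Export each object to its own file, remove the original, and
    # reference the file back in -- each mesh can then be processed in
    # its own Maya session with a fresh memory footprint.  Maya-only.
    for obj in objects:
        path = export_path(out_dir, obj)
        cmds.select(obj, replace=True)
        cmds.file(path, exportSelected=True, type='mayaAscii', force=True)
        cmds.delete(obj)
        cmds.file(path, reference=True, namespace=obj.replace('|', '_'))
```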

theodox
  • Plus, if you're using 2015 Ext 1 or 2016, use the profiler to see who's making the mess – Achayan Dec 02 '15 at 18:27
  • I don't want to see who is making the mess, I just want to flush it. So far I haven't found a way to do it ... Tested also in Maya 2017. – Romulus Nov 30 '16 at 13:38