
I'm wondering how to cope with the following problem. Inside my C++ class I have an auxiliary PyObject pointer.

class Foo
{
     public:
     // Should I create the dictionary here in the constructor?
     Foo()
     {

     }
     // Must I decrease the reference count or explicitly delete the dictionary?
     ~Foo()
     {
         Py_DECREF(myDictionary);
     }

     void sync()
     {
          myDictionary = PyDict_New();
          for (int i=0; i<myInternalData.size(); i++)
          {
                  PyObject *key =  PyInt_FromLong(i);
                  PyObject *val = PyInt_FromLong(myInternalData.at(i));
                   PyDict_SetItem(myDictionary, key, val);
                  Py_DecRef(key);
                  Py_DecRef(val);
          }
     }

     private:
     PyObject *myDictionary;
     std::vector<int> myInternalData;
};

In my C++ code, the myInternalData structure is occasionally updated or resized, and I want to manage the memory of my Python dictionary correctly.

I don't know how to deallocate the memory associated with it, or how to correctly keep it synchronized with my internal std::vector without corrupting the heap or provoking memory leaks.

Any help with the Python C API? Should I deallocate the PyDict with PyObject_Del and then reallocate it? Or would someone suggest another approach?

linello
  • Are you calling `sync` frequently? From what I can understand this is a possible source of memory leaks, since you are allocating new objects. Also (I might be wrong here), I think that probably the appropriate way to deallocate the dictionary is to decrease the ref counter to zero (http://docs.python.org/2/c-api/refcounting.html). Did you try using valgrind to see where you are getting mem-leaks? – nvlass Jan 07 '13 at 14:11

1 Answer


It's not clear to me why you're using a dictionary in Python, when you index with a contiguous set of integers starting at 0. However: before using it, you'll have to call PyDict_New to create the dictionary. Following that, when you resync, you should clear the dictionary before starting, using PyDict_Clear, rather than reallocate a new one. Nothing else should be necessary. (If you reallocate a new one, as you do in your code, you should decrement the reference count on the old one first. But any code on the Python side which refers to the old one will continue to refer to the old one; PyDict_Clear is probably the better solution.)

Also, pay attention wherever temporary Python objects are involved. For the moment, nothing more is necessary, because you only call Python (and thus C) functions in the loop, and they cannot trigger a C++ exception. Change the code ever so slightly, and this may cease to be the case. As a general rule, I've found that you should wrap the PyObject* in a class whose destructor calls Py_DecRef, rather than call it explicitly, and maybe miss the call due to an exception.

James Kanze
  • I moved the `PyDict_New` in the constructor and now in the `sync` method I only call `PyDict_Clear(myDictionary)`. I've also added a `Py_DECREF(myDictionary)` in the destructor and now all the memory leaks are eliminated! – linello Jan 07 '13 at 16:05