I'm writing an XML parser with libxml2. It's actually finished, but there's a pretty annoying memory problem. The program first fetches some links from my database; each link points to an XML file, which I download with curl. The process is simple: I download a file, then I parse it, and so on...
The problem seems to appear when a parse finishes. curl downloads the next file, but the previous XML document doesn't seem to be freed; I guess libxml2 keeps it in RAM. By the time the last XML is parsed, I'm looking at roughly 2.6 GB of leaked memory (yeah, some of these files are really big...) and my machine only has 4 GB of RAM. It works for the moment, but more links will be added to the database in the future, so I need to fix this now.
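For context, here's a simplified sketch of the download-and-parse loop (the function name `fetch_with_curl` is a placeholder for my real curl download code, not an actual API):

```c
#include <libxml/parser.h>

/* Placeholder for the real curl download code: fetches `url` and
 * writes it to the local file `path`. */
extern void fetch_with_curl(const char *url, const char *path);

void process_links(const char **urls, int count)
{
    for (int i = 0; i < count; i++) {
        fetch_with_curl(urls[i], "data.xml");  /* download to disk */

        xmlDocPtr doc = xmlParseFile("data.xml");
        if (doc == NULL)
            continue;                          /* skip unparsable files */

        /* ... code to parse the file ... */

        xmlFreeDoc(doc);                       /* free this document */
    }
    xmlCleanupParser();                        /* global cleanup, once at exit */
}
```

That is the structure, as far as I can tell: each document is freed with `xmlFreeDoc()` inside the loop, and the global cleanup happens once at the end, yet memory still grows.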
My code is very basic:
#include <libxml/parser.h>

xmlDocPtr doc = xmlParseFile("data.xml");
if (doc == NULL) {
    /* handle parse error */
}
/* code to parse the file... */
xmlFreeDoc(doc);
I tried using:
xmlCleanupParser();
but the docs say: "It doesn't deallocate any document related memory." (http://xmlsoft.org/html/libxml-parser.html#xmlCleanupParser)
So, my question is: does anybody know how to deallocate all this document-related memory?