I have one entity in MarkLogic with around 98k+ documents under it (`/someEntity/[ID].xml`), and I need to add a few new tags to all of those documents.
I wrote a query that adds the child node and ran it against that entity, but it fails with an expanded-tree-cache-full error. I increased the cache by a few GB; the query then completes, but it takes a very long time. I also tried `xdmp:clear-expanded-tree-cache()`, and that didn't help either.
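For reference, this is roughly what my update query looks like (`someEntity` and `newTag` here are placeholders for the real element names); it touches every document under `/someEntity/` in a single transaction:

```xquery
xquery version "1.0-ml";

(: Roughly my current approach: add a child element to the root of every
   document under /someEntity/ in one big transaction.
   Element names are placeholders for the real ones. :)
for $doc in cts:search(fn:doc(), cts:directory-query("/someEntity/", "1"))
return xdmp:node-insert-child($doc/someEntity, <newTag>default-value</newTag>)
```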
Any pointers on how I can fetch the URIs in chunks of 10k and process them batch by batch, so the query doesn't spike memory and doesn't fail partway through?
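Something along these lines is the kind of chunked processing I'm imagining (the batch size, element names, and use of `xdmp:spawn-function` are just my guesses; it also assumes MarkLogic 8+ and that the URI lexicon is enabled), but I'm not sure it's the right way to do it:

```xquery
xquery version "1.0-ml";

(: Sketch of the batching I have in mind: collect the URIs, split them into
   chunks of 10k, and spawn a separate update task per chunk so no single
   transaction has to hold all 98k expanded documents in memory.
   Assumes MarkLogic 8+ (xdmp:spawn-function) and a URI lexicon;
   someEntity/newTag are placeholder element names. :)
declare variable $batch-size as xs:integer := 10000;

let $uris := cts:uris((), (), cts:directory-query("/someEntity/", "1"))
for $i in 1 to xs:integer(fn:ceiling(fn:count($uris) div $batch-size))
let $batch := fn:subsequence($uris, ($i - 1) * $batch-size + 1, $batch-size)
return
  xdmp:spawn-function(
    function() {
      (: each spawned task is its own update transaction over one batch :)
      for $uri in $batch
      return xdmp:node-insert-child(fn:doc($uri)/someEntity,
                                    <newTag>default-value</newTag>)
    },
    <options xmlns="xdmp:eval"><update>true</update></options>
  )
```

The idea is that each spawned task commits its own small transaction, so memory use stays bounded per batch instead of growing with all 98k documents.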