I have a Python script that starts the same function in multiple worker processes. The function creates and processes two counters (c1 and c2). The results of all c1 counters from the forked processes should be merged together, and the same goes for the results of all c2 counters returned by the different forks.
My (pseudo-)code looks like this:
from collections import Counter
import multiprocessing

def countIt(cfg):
    c1 = Counter()
    c2 = Counter()
    # do some things and fill the counters by counting words in a text, e.g.
    # c1 = Counter({'apple': 3, 'banana': 0})
    # c2 = Counter({'blue': 3, 'green': 0})
    return c1, c2
if __name__ == '__main__':
    cP1 = Counter()
    cP2 = Counter()
    cfg = "myConfig"
    p = multiprocessing.Pool(4)  # creating 4 worker processes (forks)
    c1, c2 = p.map(countIt, cfg)[:2]
    # 1.) this only works with [:2], which does not seem like a good idea
    # 2.) at this point c1 and c2 are the (c1, c2) tuples returned by two of
    #     the workers, not merged counters, so the following does not work:
    cP1 + c1
    cP2 + c2
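If I understand the Pool API correctly, p.map returns a list with one (c1, c2) tuple per input item, and since cfg is the string "myConfig", it iterates over the characters of that string, which is probably not what I want. With a made-up list of configs the result would look roughly like this (counts invented for illustration):

results = p.map(countIt, ["cfgA", "cfgB"])  # "cfgA"/"cfgB" are made-up config names
# results is a list of (c1, c2) tuples, e.g.:
# [(Counter({'apple': 3}), Counter({'blue': 3})),
#  (Counter({'banana': 2}), Counter({'green': 1}))]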
Following the example above, I need a result like:

cP1 = Counter({'apple': 25, 'banana': 247, 'orange': 24})
cP2 = Counter({'red': 11, 'blue': 56, 'green': 3})
So my question: how can I count things inside a forked process in order to aggregate each counter (all c1 results and all c2 results) in the parent process?
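For completeness, here is a minimal sketch of the aggregation I am after, assuming p.map returns a list of (c1, c2) tuples; the config names and the counted words are made up, and I am not sure this is the idiomatic way to do it:

from collections import Counter
import multiprocessing

def countIt(cfg):
    # dummy counting logic, just to have something to merge
    c1 = Counter(['apple', 'banana', 'apple'])
    c2 = Counter(['blue', 'green', 'blue'])
    return c1, c2

if __name__ == '__main__':
    cP1 = Counter()
    cP2 = Counter()
    cfgList = ["cfgA", "cfgB", "cfgC", "cfgD"]  # made-up configs, one per task
    p = multiprocessing.Pool(4)
    results = p.map(countIt, cfgList)  # list of (c1, c2) tuples
    p.close()
    p.join()
    for c1, c2 in results:
        cP1 += c1  # Counter addition merges the per-process word counts
        cP2 += c2
    print(cP1)  # e.g. Counter({'apple': 8, 'banana': 4})
    print(cP2)  # e.g. Counter({'blue': 8, 'green': 4})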