I want to write a program that measures the time it takes to read in a folder of .py files and calculate the cyclomatic complexity of each file. I have Radon installed to calculate the complexity, but I also want to implement a distributed system that creates a set of n workers, where each worker is given a separate file from the folder and calculates its complexity using Radon.
I'm using dask for the distributed system and was wondering whether it is possible to achieve what I'm asking above. I.e. if I have a folder of 10 .py files, I can create 1 worker which will read in all the files and calculate their complexity, and my program will log the time that took. Or I could specify 10 worker nodes that will look for work (i.e. the files to process), each taking a file and running concurrently, and the program will log the time that took.
I have the basic program set up using dask, which calls a function, but am unsure whether you can give it a list of items that gets distributed over a set of workers, each of which then calls the function and returns the results.
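For concreteness, here's a rough sketch of the kind of thing I have in mind, assuming dask.distributed's `Client`/`LocalCluster` and radon's `cc_visit`; `my_folder` and the worker count are placeholders:

```python
import time
from pathlib import Path

from dask.distributed import Client, LocalCluster
from radon.complexity import cc_visit


def complexity_of_file(path):
    """Read one .py file and return its per-block cyclomatic complexity."""
    source = Path(path).read_text()
    # cc_visit returns the functions/classes found in the source,
    # each with a .complexity score
    return path, [(block.name, block.complexity) for block in cc_visit(source)]


if __name__ == "__main__":
    n_workers = 10  # set to 1 to run everything on a single worker
    files = [str(p) for p in Path("my_folder").glob("*.py")]  # placeholder folder

    cluster = LocalCluster(n_workers=n_workers, threads_per_worker=1)
    client = Client(cluster)

    start = time.perf_counter()
    # client.map submits one task per file; dask spreads them across workers
    futures = client.map(complexity_of_file, files)
    results = client.gather(futures)  # block until all workers have finished
    elapsed = time.perf_counter() - start

    print(f"Processed {len(results)} files with {n_workers} workers "
          f"in {elapsed:.2f}s")

    client.close()
    cluster.close()
```

With `n_workers=1` this would behave like the serial case, so the two timings should be directly comparable.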
Is this possible using dask?