
I'm trying to parallelize a loop that reads and writes a large number of text files using the multiprocessing module. Here is my problem: I have a class named SWAT which includes several methods. A method called "auto" automates all the other methods; it takes four arguments. What can I do to parallelize the loop?

def auto(self, directory, number_samples, number_trajectories, grid_jump):
    import glob, os
    os.chdir(directory)
    a = SWAT.generate_samples(self, number_samples, number_trajectories,
                              grid_jump)
    b = [i.split(',') for i in SWAT.readlist(self, 'rng.txt')]

    for i in range(len(a)):
        path = directory + '\\TxtInOut{}'.format(i)
        os.mkdir(path)
        sample = open('sample.txt', 'w')
        for j in range(len(b)):
            sample.write('{},{},{},{}\n'.format(b[j][0], b[j][1], b[j][2], a[i][j]))
        sample.close()

        for file in glob.glob('*cio'):
            e = SWAT.file_cio(self, file)
            c = SWAT.fig_fig(self, e[1][6][1])
            SWAT.wwq_write(self, SWAT.basins_bsn(self, e[1][46][1])[1][63][1], path)
            SWAT.basins_bsn_write(self, e[1][46][1], path)
            for j in c[0]:
                d = SWAT.subbasin(self, c[0][j])
                SWAT.pnd_write(self, d[1][30][1], path)
                for k in d[3]:
                    SWAT.hru_write(self, d[3][k][0], path)
                    SWAT.mgt_write(self, d[3][k][1], path)
                    SWAT.gw_write(self, d[3][k][4], path)
                    SWAT.sep_write(self, d[3][k][6], path)
                SWAT.subbasin_write(self, c[0][j], path)
            for j in c[1]:
                SWAT.rte_write(self, c[1][j][0], path)
                SWAT.swq_write(self, c[1][j][1], path)
            SWAT.file_cio_write(self, file, path)
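Since each iteration of the outer `for i in range(len(a))` loop is independent (each sample writes into its own `TxtInOut{i}` directory), the loop body can be handed to a pool of worker processes. Below is a minimal sketch using `multiprocessing.Pool`; `process_sample` is a hypothetical stand-in for the per-sample work, not part of the original class:

```python
import multiprocessing

def process_sample(args):
    """Hypothetical worker: handles one row of the sample matrix `a`.

    In the real class, this would create the TxtInOut{i} directory,
    write sample.txt into it, and call the SWAT *_write methods
    for sample i.
    """
    i, sample_row = args
    # ... per-sample directory creation and file writing goes here ...
    return i  # return the index so the caller can confirm completion

if __name__ == '__main__':
    a = [[0.1, 0.2], [0.3, 0.4]]  # placeholder for generate_samples() output
    with multiprocessing.Pool() as pool:
        done = pool.map(process_sample, enumerate(a))
```

Two caveats with the original loop body if you parallelize it this way: avoid `os.chdir` in the workers and write `sample.txt` inside `path` rather than the shared working directory, otherwise concurrent workers will clobber each other's file; and since the work is largely disk-bound, the speedup from extra processes may be limited, as the comment below points out.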
  • Do you want to parallelize reading from one file, or reading from multiple files simultaneously? Assuming your files are on the same disk, or at least use some sort of RAID, why do you expect to gain speed by reading in different threads? The limiting factor is most likely the disk speed... – AntiMatterDynamite Jan 29 '18 at 13:20
  • Actually, there are more than 1000 text files which I'm going to read once and write 404 times with new values! And each time I write a series of files I need to run a piece of software to calculate an output and ... (I'm doing sensitivity analysis) – Reza Ehsani Jan 29 '18 at 15:34
