I am running the code below, using multiprocessing to push ticker_list through a request-and-parsing program faster. The code works, but it is very slow, and I am not sure this is the correct usage of multiprocessing. If there is a more efficient way to do this, please let me know.
    import csv
    import multiprocessing as mp

    ticker_list = []
    with open('/home/a73nk-xce/Documents/Python/SharkFin/SP500_Fin/SP_StockChar/ticker.csv', 'r', newline='') as csvfile:
        spamreader = csv.reader(csvfile)
        for rows in spamreader:
            pass  # after the loop, rows holds the last row of the file
        for eachTicker in rows:
            ticker_list.append(eachTicker)

    def final_function(ticker):
        try:
            GetFinData.CashData(ticker_list)
        except Exception as e:
            pass

    if __name__ == '__main__':
        jobs = []
        p = mp.Process(target=final_function, args=(ticker_list,))
        jobs.append(p)
        p.start()
        p.join()
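For what it's worth, I suspect the problem is that I only ever start a single Process, which handles the entire ticker_list by itself, so nothing actually runs in parallel. Below is a minimal sketch of what I think a Pool-based version might look like, where each worker handles one ticker at a time. Note that fetch_cash_data is a hypothetical placeholder standing in for my GetFinData.CashData call; the real function is not shown here.

    import multiprocessing as mp

    # Hypothetical stand-in for GetFinData.CashData(ticker); it only
    # lowercases the ticker so the sketch is self-contained and runnable.
    def fetch_cash_data(ticker):
        return ticker.lower()

    def run_all(ticker_list):
        # Pool.map splits ticker_list across worker processes, so each
        # ticker is fetched/parsed in parallel rather than sequentially
        # inside one process.
        with mp.Pool() as pool:
            return pool.map(fetch_cash_data, ticker_list)

    if __name__ == '__main__':
        print(run_all(['AAPL', 'MSFT', 'GOOG']))

Since the work is mostly network requests, I gather a thread pool (multiprocessing.dummy.Pool or concurrent.futures.ThreadPoolExecutor) might be just as fast with less overhead, but I have not measured that.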