I'm using a bash script (called MasterLoader) to run a batch of Python scripts, and I modify Python variables such as amounts or min/max values from the bash script by passing in arguments like so:
A_Num=20
python A1.py "global AMOUNT; AMOUNT = $A_Num" &&
python A2.py &&
#More follows
$SHELL
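For reference, the whole quoted statement arrives in the Python script as a single entry in sys.argv, which is what take_args() below relies on (a quick sanity check, assuming the invocation above):

# inside A1.py
import sys
print(sys.argv)  # ['A1.py', 'global AMOUNT; AMOUNT = 20']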
And within A1 I have a function:
import sys

def take_args():
    for i in range(len(sys.argv) - 1):
        exec(sys.argv[i + 1])
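For context, the rest of A1.py looks roughly like this (simplified; AMOUNT and its default value are just placeholders for my real variables):

# A1.py (rest of the file, simplified)
AMOUNT = 5  # default, overridden by MasterLoader's argument

def main():
    take_args()
    print(AMOUNT)  # prints 20 when launched from MasterLoader

main()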
Now, this works fine. My issue started when I implemented a module hosting all of the common functions for every file (I had ~250 or more lines of redundant code in each module) like so:
import sys

def take_args():
    for i in range(len(sys.argv) - 1):
        exec(sys.argv[i + 1])

def common_function1():
    pass  # code here

#More functions follow
I called this module functions, and now in A1 (any of them, really) I simply do:
from functions import *

def main():
    #<Variables here>
    take_args()

main()
However, this no longer works: the variable in A1 is not being changed. My thinking is that take_args() now runs inside functions, so the exec'd "global AMOUNT" ends up changing a global AMOUNT in the functions module rather than in A1 (see the minimal reproduction at the bottom). Since the shared-module refactor I have a single take_args() and no duplicated globals, and I'd like to keep it that way, so I want to avoid both of the solutions I can currently see:
1. copying def take_args(): into every script and making every variable I might want to adjust a global
2. modifying both the args that MasterLoader passes and A1.py itself, so the related variables are passed on to functions
If possible I'd like the solution to involve simply changing MasterLoader's args, like I had before. Solution 2 doesn't work for me because I have ~60 scripts that I'd have to refactor if I wanted a new variable changed, and solution 1 is clunky from a programming perspective (duplicate code, global variables).
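To illustrate what I think is happening, here is a minimal reproduction (both files simplified; the default value of 5 is just a placeholder):

# functions.py
import sys

def take_args():
    for i in range(len(sys.argv) - 1):
        exec(sys.argv[i + 1])  # globals() here is functions' namespace, not the caller's

# A1.py
from functions import *

AMOUNT = 5  # default I want MasterLoader to override

def main():
    take_args()
    print(AMOUNT)            # still 5 -- A1's own AMOUNT never changed
    import functions
    print(functions.AMOUNT)  # 20 -- the exec'd "global AMOUNT" landed here instead

main()

Running python A1.py "global AMOUNT; AMOUNT = 20" prints 5 and then 20, which is why I think the exec'd statement is landing in the functions module's namespace instead of A1's.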