
I'm trying to use a bash script to run multiple Python scripts at once, and I modify Python variables like amounts or min/max values from the bash script (called MasterLoader) by passing in args like so:

A_Num=20

python A1.py "global AMOUNT; AMOUNT = $A_Num" &&
python A2.py &&
#More follows

$SHELL
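
For concreteness, with A_Num=20 the whole statement reaches A1.py as a single argument. A quick check (assuming the invocation above):

import sys
print(sys.argv)  # ['A1.py', 'global AMOUNT; AMOUNT = 20']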

And within A1 I have a function:

import sys

def take_args():
    for arg in sys.argv[1:]:  # skip argv[0], the script name
        exec(arg)
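
As far as I understand it, this worked because exec() with no explicit namespaces runs against the calling frame, and take_args() was defined in A1.py itself, so globals() there was A1's own module namespace. A stripped-down, self-contained version of that setup (the AMOUNT = 0 default is just for the demo):

# demo.py -- run as: python demo.py "global AMOUNT; AMOUNT = 20"
import sys

AMOUNT = 0  # module-level default, meant to be overridden from bash

def take_args():
    for arg in sys.argv[1:]:
        exec(arg)  # no namespaces passed, so it uses this frame's globals()

take_args()
print(AMOUNT)  # -> 20: the exec'd 'global AMOUNT' targets demo.py's globals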

Now, this works fine. My issue started when I moved all the shared functions into a single module (each script previously duplicated ~250 lines or more of redundant code), like so:

# functions.py
import sys

def take_args():
    for arg in sys.argv[1:]:  # skip argv[0], the script name
        exec(arg)

def common_function1():
    pass  # code here

# More functions follow

I called this module functions, and now in A1 (any of them, really) I simply do:

from functions import *

def main():
    # <variables here>

    take_args()

However, this no longer works: the variable in A1 is never changed. My thinking is that take_args() now runs inside functions, so the exec is changing a global AMOUNT in functions' namespace rather than in A1's. Since the refactor I have a single copy of each function and no global variables, so I'd like to fix this without resorting to either of the solutions I can currently see:

  1. copying def take_args(): back into every script and declaring every variable I might want to adjust as global

  2. modifying both MasterLoader's args and A1.py (and the other scripts) to pass the relevant variables into functions explicitly

If possible I'd like the solution to involve simply changing MasterLoader's args, like I had before. Solution 2 doesn't work for me because I'd have to refactor ~60 scripts every time I want a new variable changed, and solution 1 is clunky from a programming perspective (duplicated code, global variables).
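
To illustrate where I think the assignment is landing, here's a minimal reproduction of the failing case (the getattr check on functions is only there for the demo):

# A1.py -- run as: python A1.py "global AMOUNT; AMOUNT = 20"
import functions
from functions import *

take_args()

# The exec ran inside functions.take_args(), whose globals() is the
# functions module's namespace, so AMOUNT is created there:
print(getattr(functions, "AMOUNT", "not set"))  # -> 20
print("AMOUNT" in globals())                    # -> False: A1 never sees it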

Patrick
  • It's unclear what you mean by "implemented a module hosting all functions for every file" and "A[N]" (or "A[N].py"). Also, you don't show the bash script you're using with this multiple-function module that's giving you problems. – martineau Aug 30 '13 at 14:46
  • Edited it. I'm not sure what you mean by your last sentence; it's MasterLoader, the first block of code. I run that to execute all my scripts at once, and it passes in the args as it runs them. – Patrick Aug 30 '13 at 15:31
  • Clear example of the [XY problem](http://meta.stackexchange.com/questions/66377/what-is-the-xy-problem). You do **not** want to use `exec` like that. You have developed an insecure, buggy method of passing arguments to a program. – Bakuriu Aug 30 '13 at 15:41
  • I can't see a different way that doesn't mean declaring args in the scripts themselves, and suddenly I'm back to manually refactoring a whole bunch of scripts for every new variable I want to modify. If you can explain a proper way to do this (modify the Python variables from the bash script without editing the Python scripts themselves), I'm all ears, but I just can't see one with my level of experience. – Patrick Aug 30 '13 at 15:48
  • Still don't really understand. However, perhaps you can pass `sys.argv` to `take_args()` as an argument instead of having it reference the module attribute directly itself, i.e. `def take_args(argv): ...` and then `take_args(sys.argv)`. That way you can be sure it's getting the current program's arguments. – martineau Aug 30 '13 at 16:06
  • Ok, to break it down as simply as I can: MasterLoader.sh runs; on its first line it passes an argument to A1, namely "global AMOUNT; AMOUNT = $A_Num". A1.py then runs, and the first thing it does is from functions import *. It then runs take_args(). This causes it to exec the argument, so it runs global AMOUNT; AMOUNT = 20. Sadly, this doesn't actually change the global value AMOUNT in A1.py, and I don't know why. It worked before, when take_args() was declared in A1.py and I didn't have functions.py at all. I don't want to revert and gain 10k lines. – Patrick Aug 30 '13 at 17:20

0 Answers