
I know there are some issues with passing more complicated data structures, such as a list of lists, to a Python script via the CLI.

I was wondering if running a python script from node code had any of these same issues.

Basically, say I have the following code in a node app:

const spawn = require("child_process").spawn;
const pythonProcess = spawn('python',["path/to/script.py", arg1, arg2, arg3]);

(This is the question the above code is from.)

Suppose that arg1 and arg2 are lists of lists in the node app. And suppose arg3 is a double.

The corresponding code in my script.py file that is meant to parse and receive these arguments into variables looks like so:

import sys
if __name__ == '__main__':
    oc = sys.argv[1]
    nc = sys.argv[2]
    r = sys.argv[3]

Will oc and nc here be lists of lists in python? Or does something else need to be done to get this working?

sometimesiwritecode

1 Answer


The easiest way to pass complex structures is to serialize them first in some common data format, such as JSON:

const myList = ["foo", "bar", "baz"];
const { spawn } = require("child_process");
const python = spawn('python',["script.py", JSON.stringify(myList)]);

And deserialize on the callee side:

import sys, json

if __name__ == '__main__':
    my_list = json.loads(sys.argv[1])
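
Applied to the question's concrete case (two lists of lists and a double), the Python side might look like this — a sketch, with `parse_args` and the sample values being illustrative, not part of the original code:

```python
import json


def parse_args(argv):
    # sys.argv entries are always plain strings; json.loads turns the
    # serialized arguments back into Python objects (lists stay lists).
    oc = json.loads(argv[1])  # list of lists
    nc = json.loads(argv[2])  # list of lists
    r = float(argv[3])        # a double arrives as a string like "1.5"
    return oc, nc, r


# Simulating what sys.argv would contain after the spawn call:
oc, nc, r = parse_args(["script.py", "[[1, 2], [3, 4]]", "[[5], [6]]", "1.5"])
print(oc, nc, r)
```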

But instead of passing serialized params as command-line arguments, it is better to use the stdin and stdout streams for interchanging data larger than a few hundred bytes:

const { spawn } = require("child_process");

const python = spawn('python', ["script.py"]);
const buffers = [];

python.stdout.on('data', (chunk) => buffers.push(chunk));
python.stdout.on('end', () => {
    const result = JSON.parse(Buffer.concat(buffers));
    console.log('Python process exited, result:', result);
});

python.stdin.write(JSON.stringify(["foo", "bar", "baz"]));
python.stdin.end();

And accept it from sys.stdin via json.load, which accepts a stream instead of a string:

import sys, json

if __name__ == '__main__':
    my_list = json.load(sys.stdin)
    json.dump(my_list, sys.stdout)
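
The `json.load`/`json.dump` stream behavior can be checked without spawning a process — in this sketch `io.StringIO` stands in for `sys.stdin` and `sys.stdout`:

```python
import io
import json

# json.load reads from any file-like object, so a StringIO can
# stand in for sys.stdin when testing the parsing logic.
fake_stdin = io.StringIO('[["foo"], ["bar", "baz"]]')
my_list = json.load(fake_stdin)

# json.dump writes to a file-like object the same way the script
# above writes its result back to sys.stdout.
fake_stdout = io.StringIO()
json.dump(my_list, fake_stdout)
print(fake_stdout.getvalue())
```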

Savva Surenkov
  • Thanks! Quick question: how would I pass multiple different vars that are lists using the stdin and stdout streams you demonstrate as the better way to do this? This example only shows how to pass one list in. – sometimesiwritecode Jan 28 '19 at 05:18
  • @mark_maker, the same way, just wrapping all variables you need to pass in js object: `const passedVars = { arg1, arg2, myList }; python.stdin.write(JSON.stringify(passedVars))`. On python side, such clause will be unwrapped in `dict` instance with corresponding fields: `data = json.load(sys.stdin) # data['arg1'], data['arg2'], data['myList']` – Savva Surenkov Jan 28 '19 at 07:06
  • @mark_maker or you can pass it as array instead of object, and deconstruct it directly into variables: `python.stdin.write(JSON.stringify([arg1, arg2, myList]))` ... `arg1, arg2, myList = json.load(sys.stdin)`. But in that case ordering of deconstructed variables on python side should correspond to order of items in the array on JS side, so I recommend to stick to the first approach with object/dict instances marshalling and depend on names instead. – Savva Surenkov Jan 28 '19 at 07:13
  • how to send python list to node js then? – Luk Aron Jun 20 '21 at 09:21
  • @LukAron the example above was updated with bi-directional communication. Please note that node should explicitly pass EOF to python stdin stream (`python.stdin.end()`), otherwise python process would be stuck indefinitely waiting for more data to come. In complex cases you may want to use other approaches like unix socket communications or full-featured IPCs – Savva Surenkov Jun 23 '21 at 07:27
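
The object/dict approach recommended in the comments can be sketched on the Python side like so — the names `arg1`, `arg2`, and `myList` come from the comment, and `io.StringIO` stands in for `sys.stdin`:

```python
import io
import json

# What node would send:
#   python.stdin.write(JSON.stringify({ arg1, arg2, myList }));
payload = '{"arg1": [[1, 2]], "arg2": [[3, 4]], "myList": ["foo", "bar"]}'

# One JSON object arrives as one dict, keyed by variable name,
# so the ordering of fields never matters.
data = json.load(io.StringIO(payload))
arg1 = data['arg1']
arg2 = data['arg2']
my_list = data['myList']
```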