I have a C++ program which exposes a Python interface to execute users' embedded Python scripts.
The user provides the path of the Python script to run and the command-line arguments; the script is then executed through
boost::python::exec_file(filename, main_globals, main_globals)
To pass the command-line arguments to the Python script we have to set them through the Python C-API function
PySys_SetArgv(int argc, char** argv)
before calling exec_file().
But this requires tokenizing the user's string of command-line arguments into a list of arguments, and then passing them to the Python interpreter through PySys_SetArgv.
And that's more than a mere waste of time: this way the main C++ program has to take responsibility for tokenizing the command-line string without knowing the logic behind it, which is defined only in the user's custom script.
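For reference, the flow described above looks roughly like this (a minimal sketch: it assumes the interpreter has already been initialized with Py_Initialize(), run_user_script is just an illustrative name, the naive whitespace split stands in for whatever tokenization the host performs, and the char**-based PySys_SetArgv is the Python 2 signature; Python 3 takes wchar_t**):

#include <boost/python.hpp>
#include <sstream>
#include <string>
#include <vector>

namespace py = boost::python;

// Runs the user's script with the given argument string.
// The host has to split the string itself before it can call PySys_SetArgv.
void run_user_script(const std::string& script_path,
                     const std::string& command_line_args)
{
    // Crude tokenization: split on whitespace only. This is the part the
    // C++ host should not have to know about, since quoting and escaping
    // rules belong to the user's script.
    std::vector<std::string> tokens{script_path}; // argv[0] is the script name
    std::istringstream iss(command_line_args);
    for (std::string tok; iss >> tok; )
        tokens.push_back(tok);

    // Build the char** that PySys_SetArgv expects (Python 2 API).
    std::vector<char*> argv;
    for (auto& t : tokens)
        argv.push_back(&t[0]);
    PySys_SetArgv(static_cast<int>(argv.size()), argv.data());

    // Now sys.argv is set, so the script can parse its own arguments.
    py::object main_globals = py::import("__main__").attr("__dict__");
    py::exec_file(py::str(script_path), main_globals, main_globals);
}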
A much nicer and cleaner approach would be something like this in metacode:
string command_line_args = '-v -p "filename" -t="anotherfile" --list="["a", "b"]" --myFunnyOpt'
exec_file( filename, command_line_args, ...)
I spent hours looking at the Boost and Python C-API documentation but I did not find anything useful. Do you know if there is a way to achieve this, i.e. passing a whole string of command-line arguments to an embedded Python script from C++?
Update:
As Steve suggested in the comments below, I solved my problem by tokenizing the input string, following https://stackoverflow.com/a/8965249/320369.
In my case I used:
// defining the separators
std::string escape_char = "\\"; // the escape character
std::string sep_char = " "; // a space as the separator
std::string quote_char = ""; // empty string --> we don't want a quote char
boost::escaped_list_separator<char> sep( escape_char, sep_char, quote_char );
because I wanted to be able to parse tuples containing strings as well, like:
'--option-two=("A", "B")'
and if you use:
escaped_list_separator<char> sep('\\', ' ', '"');
as in the original post, you don't get the quoted strings tokenized correctly.
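For completeness, the tokenization step as I use it now looks roughly like this (a minimal sketch; tokenize_args is just an illustrative name, and the resulting tokens are handed to PySys_SetArgv exactly as before):

#include <boost/tokenizer.hpp>
#include <string>
#include <vector>

// Splits the user's argument string with the separators described above.
std::vector<std::string> tokenize_args(const std::string& command_line_args)
{
    std::string escape_char = "\\"; // the escape character
    std::string sep_char    = " ";  // a space as the separator
    std::string quote_char  = "";   // empty string --> no quote character
    boost::escaped_list_separator<char> sep(escape_char, sep_char, quote_char);

    std::vector<std::string> tokens;
    boost::tokenizer<boost::escaped_list_separator<char>>
        tok(command_line_args, sep);
    for (const auto& t : tok)
        if (!t.empty())          // skip empty tokens from repeated spaces
            tokens.push_back(t);
    return tokens;
}

With an empty quote character the double quotes are passed through literally instead of being interpreted by the tokenizer, which is what the example above relies on.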