17

I have a bash script, a.sh, and in it I call a Python script, b.py. The Python script calculates something, and I want it to return a value that will be used later in a.sh. I know I can do this:

In a.sh:

var=`python b.py`

In b.py:

print x  # where x is the value I want to pass

But this is not so convenient, because I also print other messages in b.py.

Is there any better way to do it?

Edit:

What I'm doing now is just

var=`python b.py | tail -n 1`

This means I can print many things inside b.py, but only the last line (the output of the final print, assuming it doesn't contain "\n") will be stored in var.
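
For illustration, a minimal sketch of what b.py could look like under this approach (the messages and the computation are made up):

# hypothetical b.py: progress messages first, the value on the last line
result = 6 * 7  # stand-in for the real calculation

print("starting calculation...")
print("still working...")
print(result)  # the last line is what a.sh captures

With that, var ends up holding 42; note that tail -n 1 discards the earlier messages instead of showing them to the user.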

Thanks for all the answers!

Rivka
  • Well, maybe you could _not_ do the bash script to start with - do everything from Python, where the distinction between data and code is always clear, you don't have to spawn a new process simply to read a scalar value (like the answers using "cat" below), and so on. – jsbueno Nov 23 '10 at 15:22

8 Answers

11

I would print it to a file chosen on the command line, then I'd get that value in bash with something like cat.

So you'd go:

python b.py tempfile.txt
var=`cat tempfile.txt`
rm tempfile.txt
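
For completeness, here is a sketch of what b.py might do on its side, assuming it takes the output filename as its first argument (the filename handling and the computed value are made up for illustration):

# hypothetical b.py: write only the result to the file named on the command line
import sys

result = 42  # stand-in for the real calculation
print("some progress message")  # messages for the user still go to stdout

with open(sys.argv[1], "w") as f:
    f.write(str(result))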

[EDIT, another idea based on other answers]

Your other option is to format your output carefully so you can use tools like head/tail to pipe only the first/last lines into your next program.

Chris Pfohl
  • :-) Although I'd argue it's not ugly at all. Keeping information in files is exactly what files are for. It allows for much easier debugging in a multi-program solution to a problem. – Chris Pfohl Nov 23 '10 at 15:03
  • 1
  • Perhaps "ugly" was too strong... I perceive files to be more for long(er)-term storage of data. Writing something to disk then reading it right back milliseconds later as a means of passing around data in what is conceptually a single program seems inelegant to me (surely this is what RAM is for?). There's also about 10 billion things that can go wrong when doing file I/O (though to be fair, usually things work perfectly). On the other hand, it works! (And the other alternatives are even less savoury (or more complex) -- IPC in Bash, anyone?) – Cameron Nov 23 '10 at 15:27
  • @Cameron: IPC? Named pipes, process substitution and (in Bash 4) coprocesses. – Dennis Williamson Nov 23 '10 at 16:23
  • @Dennis: Thanks for the info (the only one of those I'd heard of was named pipes), but my point was that doing file I/O is easier than any of these techniques (powerful though they may be). – Cameron Nov 23 '10 at 16:37
4

I believe the answer is

numtotext.py:

import sys

a = ['zero', 'one', 'two', 'three']
b = int(sys.argv[1])

# your Python script can still print to stderr if it wants to
print >> sys.stderr, "I am now converting"

result = a[b]
print result

The shell script:

#!/bin/sh 

num=2 
text=`python numtotext.py $num` 
echo "$num as text is $text" 
Jakob Bowyer
2

In your Python script, redirect the other messages to stderr, and print x to stdout:

import sys
...
print >>sys.stderr, "another message"
print x

in the bash script:

...
var=`python b.py 2>/dev/null`

Also, if x is an integer between 0 and 255, you can use the exit code to pass it to bash:

import sys
...
sys.exit(x)

in bash:

python b.py
var=$?

Please note that the exit code is normally used to indicate errors (0 means no error), so this breaks that convention.

khachik
1

In bash, backticks work.

I usually do something like:

PIP_PATH=`python -c "from distutils.sysconfig \
import get_python_lib; print(get_python_lib())"`


POWELINE_PATH=$PIP_PATH"/powerline"
echo $POWELINE_PATH
CESCO
1

I'm not sure about "better", but you could write the result to a file, then read it back in Bash and delete the file afterwards.

This is definitely ugly, but it's something to keep in mind in case nothing else does the trick.

Cameron
0

You can write the output to a temporary file, and have the shell read and delete that file. This is even less convenient, but reserves stdout for communication with the user.

Alternatively, you can use some kind of format for stdout: the first n lines are certain variables, the rest will be echoed by the parent shell to the user. Also not convenient, but avoids using tempfiles.
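
A sketch of the Python side of that convention, assuming the first line of output carries the value and everything after it is meant for the user (the names here are made up):

# hypothetical b.py: machine-readable value first, human-readable messages after
result = 42  # stand-in for the real calculation

print(result)                       # line 1: the value for the shell
print("finished the calculation")   # remaining lines: messages for the user

The shell could then capture the output once, take the first line (head -n 1) as the variable, and echo the rest (tail -n +2) back to the user.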

Martin v. Löwis
0

In the shell script you can use python_ret=$(python b.py). It captures all the print messages from the Python file b.py. Then you can search the captured output for a string you are looking for. For example, if you are looking for 'Exception', you can do something like this:

if [[ $python_ret == *"Exception:"* ]]; then
    echo "Got some exception."
    exit 1
fi
chanduthedev
0

Better to redirect the printed value from the Python script to a temp file before assigning it to a bash variable. I believe there's no need to remove the file in this case.

#!/bin/bash
python b.py > tempfile.txt
var=`cat tempfile.txt`

Then, get the value:

echo $var
  • This answer is essentially the same as the already accepted answer. If this answer is helpful to you, you can share your feedback by voting up! – Register Sole Jan 20 '22 at 08:39