
I'd like to be able to use the result of the last executed command in a subsequent command. For example,

$ find . -name foo.txt
./home/user/some/directory/foo.txt

Now let's say I want to be able to open the file in an editor, or delete it, or do something else with it, e.g.

mv <some-variable-that-contains-the-result> /some/new/location

How can I do it? Maybe using some bash variable?

Update:

To clarify, I don't want to assign things manually. What I'm after is something like built-in bash variables, e.g.

ls /tmp
cd $_

$_ holds the last argument of the previous command. I want something similar, but with the output of the last command.
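For illustration, here is `$_` in action (assuming bash; the directory name is made up for the demo):

```shell
# $_ expands to the last argument of the previous simple command.
mkdir -p /tmp/demo_dir   # hypothetical directory, created just for the demo
ls /tmp/demo_dir
cd "$_"                  # equivalent to: cd /tmp/demo_dir
pwd                      # prints /tmp/demo_dir
```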

Final update:

Seth's answer has worked quite well. A couple of things to bear in mind:

  • don't forget to touch /tmp/x when trying the solution for the very first time
  • the result will only be stored if the last command's exit code was zero (success)
armandino
  • After seeing your edit I thought to delete my answer. I wonder whether there is anything built-in that you are looking for. – taskinoor May 10 '11 at 20:02
  • I couldn't find anything built-in. I was wondering if it would be possible to implement it.. maybe through .bashrc? I think it'd be a pretty handy feature. – armandino May 11 '11 at 03:47
  • I am afraid all you can do is either redirect the output to file or pipe or capture it, otherwise it won't be saved. – bandi May 11 '11 at 08:20
  • This is correct, the only thing that is saved is the return code which you can retrieve with $?. – Bashwork May 19 '11 at 14:19
  • You have the source for `bash` - you can always use it to make your own (slightly modified) version which saves output to a `$` variable :-) – paxdiablo May 23 '11 at 04:23
  • You can't do that without the cooperation of the shell and the terminal, and they generally don't cooperate. See also [How do I reuse the last output from the command line?](http://unix.stackexchange.com/q/9024) and [Using text from previous commands' output](http://unix.stackexchange.com/questions/385) on [Unix Stack Exchange](http://unix.stackexchange.com/). – Gilles 'SO- stop being evil' May 23 '11 at 14:57
  • @Giles Thanks for the links. Learned about tmux which looks really good. – armandino May 23 '11 at 20:01
  • @armandino: I think you need to re-edit the question to explain what you are trying to do that the answers listed below are not satisfying. There are answers which recommend solutions for programming, for saving history for later investigation, and for storing the output of programs in a variable. What is it that you need which these solutions are not providing? – Seth Robertson May 24 '11 at 13:25
  • @Seth Most answers involve manually assigning variables or re-running the command which is not really what I'm after. I'm looking for ideas and other alternatives to what has been contributed so far. – armandino May 25 '11 at 01:06
  • @armandino: Yes, most answers do that. Two answers do not. One assigns the output to a variable as one option and another option and a second answer talks about logging all output using `script` or `| tee` and parsing the output. If you cannot express what you want that *all* of the available answers do not provide, you are unlikely to get it. I have some even more complicated ideas that might work under some narrow conditions, but there is no point exploring them without knowing if your needs match those conditions. – Seth Robertson May 25 '11 at 02:48
  • One of the main reasons why the output of commands is not captured is because the output can be arbitrarily large - many megabytes at a time. Granted, not always that large, but big outputs cause problems. – Jonathan Leffler May 26 '11 at 04:06
  • @JonathanLeffler but it *is* captured! that's why you can scroll up in your terminal and look at the output. So it would really make sense to have a convenient shortcut to access the output instead of having to scroll up and highlight/paste. Yes, for very long output it's maybe not all stored, in this case it would still make sense to have the part that the terminal remembers available. – user313032 Oct 18 '21 at 22:45
  • Since it doesn't seem to be possible with bash, are there other shells that can do this? – user313032 Oct 18 '21 at 22:45

22 Answers

95

I don't know of any variable that does this automatically. To do something other than just copy-pasting the result, you can re-run whatever you just did, e.g.

vim $(!!)

Where !! is history expansion meaning 'the previous command'.

If you expect there to be a single filename with spaces or other characters in it that might prevent proper argument parsing, quote the result (vim "$(!!)"). Leaving it unquoted will allow multiple files to be opened at once as long as they don't include spaces or other shell parsing tokens.
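The effect of quoting can be reproduced without history expansion; here a plain command substitution over a hypothetical file name stands in for `$(!!)`:

```shell
# A file name with a space in it (made up for the demo):
name="my file.txt"

# Unquoted substitution: the result is split into two words.
set -- $(echo "$name")
echo $#    # 2

# Quoted substitution: the result stays one word.
set -- "$(echo "$name")"
echo $#    # 1
```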

Daenyth
  • Copy-pasting is what I usually do in such a case, also because you typically need only a part of the output of the command. I'm surprised you are the first one to mention it. – Bruno De Fraine May 23 '11 at 07:50
  • You can do much the same with fewer keystrokes with: `vim \`!!\`` – psmears Mar 02 '15 at 20:09
  • To see the difference from the accepted answer, execute `date "+%N"` and then `echo $(!!)` – Marinos An Oct 31 '19 at 16:36
76

This is a really hacky solution, but it seems to work most of the time. During testing, I noticed it sometimes didn't behave very well after a ^C on the command line, though I did tweak it a bit to behave better.

This hack is an interactive mode hack only, and I am pretty confident that I would not recommend it to anyone. Background commands are likely to cause even less defined behavior than normal. The other answers are a better way of programmatically getting at results.


That being said, here is the "solution":

PROMPT_COMMAND='LAST="`cat /tmp/x`"; exec >/dev/tty; exec > >(tee /tmp/x)'

Set this bash environment variable and issue commands as desired. $LAST will usually have the output you are looking for:

startide seth> fortune
Courtship to marriage, as a very witty prologue to a very dull play.
                -- William Congreve
startide seth> echo "$LAST"
Courtship to marriage, as a very witty prologue to a very dull play.
                -- William Congreve
Seth Robertson
  • @armandino: This of course causes programs expecting to interact with a terminal on standard-out to not work as expected (more/less) or store odd things in $LAST (emacs). But I think it is about as good as you are going to get. The only other option is to use (type)script to save a copy of EVERYTHING to a file and then use PROMPT_COMMAND to check for changes since the last PROMPT_COMMAND. This will include stuff you don't want, though. I'm pretty sure you are not going to find anything closer to what you want than this, though. – Seth Robertson May 19 '11 at 20:51
  • To go a little bit further: `PROMPT_COMMAND='last="$(cat /tmp/last)";lasterr="$(cat /tmp/lasterr)"; exec >/dev/tty; exec > >(tee /tmp/last); exec 2>/dev/tty; exec 2> >(tee /tmp/lasterr)'` which provides both `$last` and `$lasterr`. – raphink Oct 12 '12 at 09:16
  • @cdosborn: man by default sends the output through a pager. As my previous comment said: "programs expecting to interact with a terminal on standard-out to not work as expected (more/less)". more & less are pagers. – Seth Robertson Feb 21 '16 at 05:13
  • I get: `No such file or directory` – Francisco Corrales Morales Sep 12 '16 at 19:33
  • @Raphink I just wanted to point out a correction: your suggestion should end in `2> >(tee /tmp/lasterr 1>&2)` because the standard output of `tee` must be redirected back to standard error. – Hugues Jan 30 '17 at 06:03
  • Here is my complete version, which also deals correctly with trailing newlines: `: >/tmp/last_stdout 2>/tmp/last_stderr; PROMPT_COMMAND='last_stdout="$(cat /tmp/last_stdout; printf x)"; last_stdout=${last_stdout%x}; last_stderr="$(cat /tmp/last_stderr; printf x)"; last_stderr=${last_stderr%x}; exec >/dev/tty 2>/dev/tty; exec > >(tee /tmp/last_stdout) 2> >(tee /tmp/last_stderr 1>&2)'` However, it still has the problem that `$last_stderr` includes the previous command prompt. – Hugues Jan 30 '17 at 06:06
19

Bash is kind of an ugly language. Yes, you can assign the output to a variable:

MY_VAR="$(find -name foo.txt)"
echo "$MY_VAR"

But you'd better hope that find returned only one result, and that the result didn't have any "odd" characters in it, like carriage returns or line feeds, as they will be silently modified when assigned to a Bash variable.

And be careful to quote your variable correctly when using it!

It's better to act on the file directly, e.g. with find's -execdir (consult the manual).

find -name foo.txt -execdir vim '{}' ';'

or

find -name foo.txt -execdir rename 's/\.txt$/.xml/' '{}' ';'
rlibby
  • They are _not_ silently modified when you assign, they are modified when you _echo!_ You only have to do `echo "${MY_VAR}"` to see this is the case. – paxdiablo May 23 '11 at 04:21
13

There is more than one way to do this. One way is to use v=$(command), which assigns the output of command to v. For example:

v=$(date)
echo $v

And you can use backquotes too.

v=`date`
echo $v

From Bash Beginners Guide,

When the old-style backquoted form of substitution is used, backslash retains its literal meaning except when followed by "$", "`", or "\". The first backquote not preceded by a backslash terminates the command substitution. When using the "$(COMMAND)" form, all characters between the parentheses make up the command; none are treated specially.
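The difference in backslash handling is easy to see with a variable reference (the variable name here is arbitrary):

```shell
x=world

# Inside backquotes, \$ is rewritten to $ before the inner command
# runs, so the variable is expanded:
echo `echo \$x`      # world

# Inside $( ), the backslash is passed through unchanged, so echo
# receives the literal string $x:
echo $(echo \$x)     # $x
```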

EDIT: After the edit in the question, it seems that this is not the thing that the OP is looking for. As far as I know, there is no special variable like $_ for the output of last command.

taskinoor
12

Disclaimers:

  • This answer is half a year late :D
  • I'm a heavy tmux user
  • You have to run your shell in tmux for this to work

When running an interactive shell in tmux, you can easily access the data currently displayed on a terminal. Let's take a look at some interesting commands:

  • tmux capture-pane: this one copies the displayed data to one of the tmux's internal buffers. It can copy the history that's currently not visible, but we're not interested in that now
  • tmux list-buffers: this displays the info about the captured buffers. The newest one will have the number 0.
  • tmux show-buffer -b (buffer num): this prints the contents of the given buffer on a terminal
  • tmux paste-buffer -b (buffer num): this pastes the contents of the given buffer as input

Yeah, this gives us a lot of possibilities now :) As for me, I set up a simple alias: alias L="tmux capture-pane; tmux showb -b 0 | tail -n 3 | head -n 1" and now every time I need to access the last line I simply use $(L) to get it.

This is independent of the output stream the program uses (be it stdout or stderr), the printing method (ncurses, etc.) and the program's exit code - the data just needs to be displayed.

Wiesław Herr
  • Hey, thanks for this tmux tip. I was looking for basically the same thing as the OP, found your comment, and coded up a shell script to let you pick and paste part of the output of a previous command using the tmux commands you mention and Vim motions: https://github.com/bgribble/lw – Bill Gribble Nov 11 '15 at 21:14
  • A great answer that I came across some 9 years later. In the meantime, a few things changed in tmux. First, tmux assigns a default buffer-prefix name of "buffer", so the `tmux showb -b 0` above would be `tmux showb -b buffer0`. But that still doesn't work because the most recent buffer is now the *highest* numbered buffer, rather than 0. So you'll need to use `capture-pane -p buffname`. But the biggest problem that I haven't solved is that tmux seems to return blank lines up to the end of the screen, so this solution only works if the cursor is at the bottom of the screen in the first place. – NotTheDr01ds Jan 15 '21 at 00:58
12

It's quite easy. Use back-quotes:

var=`find . -name foo.txt`

And then you can use that any time in the future

echo $var
mv $var /somewhere
Wes Hardaker
9

I think you might be able to hack out a solution that involves setting your shell to a script containing:

#!/bin/sh
bash | tee /var/log/bash.out.log

Then if you set $PROMPT_COMMAND to output a delimiter, you can write a helper function (maybe called _) that gets you the last chunk of that log, so you can use it like:

% find lots*of*files
...
% echo "$(_)"
... # same output, but doesn't run the command again
Jay Adkisson
7

You could set up the following alias in your bash profile:

alias s='it=$($(history | tail -2 | head -1 | cut -d" " -f4-))'

Then, by typing 's' after an arbitrary command, you can save its output to the shell variable 'it'. (Note that the alias re-runs the previous command, pulled from history, to capture its output.)

So example usage would be:

$ which python
/usr/bin/python
$ s
$ file $it
/usr/bin/python: symbolic link to `python2.6'
Joe Tallett
  • Thank you! This is what I needed to implement my `grab` function that copies nth line from last command to clipboard https://gist.github.com/davidhq/f37ac87bc77f27c5027e – davidhq Sep 11 '15 at 22:46
6

Capture the output with backticks:

output=`program arguments`
echo $output
emacs $output
bandi
6

I just distilled this bash function from the suggestions here:

grab() {
  grab=$("$@")
  echo "$grab"
}

Then, you just do:

> grab date
Do 16. Feb 13:05:04 CET 2012
> echo $grab
Do 16. Feb 13:05:04 CET 2012

Update: an anonymous user suggested replacing echo with printf '%s\n', which has the advantage that it doesn't process options like -e in the grabbed text. So, if you expect or experience such peculiarities, consider this suggestion. Another option is to use cat <<<$grab instead.
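The difference the suggestion addresses can be seen with text that looks like an echo option (using -n as the hypothetical grabbed text):

```shell
grab='-n'

# echo treats the text as one of its own options and prints nothing:
echo $grab

# printf '%s\n' prints the text verbatim:
printf '%s\n' "$grab"    # -n
```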

Tilman Vogel
5

By saying "I'd like to be able to use the result of the last executed command in a subsequent command", I assume you mean the result of any command, not just find.

If that's the case, xargs is what you are looking for.

find . -name foo.txt -print0 | xargs -0 -I{} mv {} /some/new/location/{}

OR if you are interested to see the output first:

find . -name foo.txt -print0

!! | xargs -0 -I{} mv {} /some/new/location/{}

This command deals with multiple files and works like a charm even if the path and/or filename contains space(s).

Notice the mv {} /some/new/location/{} part of the command. This command is built and executed for each line printed by the earlier command; the line printed by the earlier command is substituted in place of {}.

Excerpt from man page of xargs:

xargs - build and execute command lines from standard input

For more detail see man page: man xargs
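A dry run shows how -I{} substitutes each whole input line (echo stands in for mv here, and the file names are made up):

```shell
# Each input line replaces every {} in the template; with -I, a line
# containing spaces stays a single argument.
printf '%s\n' 'a.txt' 'b c.txt' \
  | xargs -I{} echo "mv {} /some/new/location/{}"
# mv a.txt /some/new/location/a.txt
# mv b c.txt /some/new/location/b c.txt
```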

ssapkota
4

I usually do what the others here have suggested ... without the assignment:

$ find . -iname '*.cpp' -print
./foo.cpp
./bar.cpp
$ vi `!!`
2 files to edit

You can get fancier if you like:

$ grep -R "some variable" * | grep -v tags
./foo/bar/xxx
./bar/foo/yyy
$ vi `!!`
halm
2

If all you want is to rerun your last command and get the output, a simple bash variable would work:

LAST=`!!`

So then you can run your command on the output with:

yourCommand $LAST

This will spawn a new process and rerun your command, then give you the output. It sounds like what you would really like is a bash history file for command output. This means you will need to capture the output that bash sends to your terminal. You could write something to watch the necessary /dev or /proc entries, but that's messy. You could also just create a "special pipe" between your terminal and bash, with a tee command in the middle which redirects to your output file.

But both of those are kind of hacky solutions. I think the best option would be terminator, a more modern terminal with output logging. Just check your log file for the results of the last command. A bash variable similar to the above would make this even simpler.

Spencer Rathbun
1

You can use !!:1 (word 1 of the previous command). Example:

~]$ ls *~
class1.cpp~ class1.h~ main.cpp~ CMakeList.txt~ 

~]$ rm !!:1
rm class1.cpp~ class1.h~ main.cpp~ CMakeList.txt~ 


~]$ ls file_to_remove1 file_to_remove2
file_to_remove1 file_to_remove2

~]$ rm !!:1
rm file_to_remove1

~]$ rm !!:2
rm file_to_remove2
rkm
  • This notation grabs a numbered word from the previous command: see the Bash manual's documentation on history expansion word designators. See also here for more examples: http://www.howtogeek.com/howto/44997/how-to-use-bash-history-to-improve-your-command-line-productivity/ – Alexander Bird Oct 09 '15 at 14:11
1

I had a similar need, in which I wanted to use the output of the last command in the next one, much like a | (pipe), e.g.

$ which gradle 
/usr/bin/gradle
$ ls -alrt /usr/bin/gradle

to something like -

$ which gradle |: ls -altr {}

Solution: create this custom pipe. Really simple, using xargs:

$ alias :='xargs -I{}'

It's basically nothing more than a shorthand for xargs; it works like a charm and is really handy. I just add the alias to my .bash_profile file.

eXc
1

It can be done using the magic of file descriptors and the lastpipe shell option.

It has to be done with a script - the "lastpipe" option will not work in interactive mode.

Here's the script I've tested with:

$ cat shorttest.sh 
#!/bin/bash
shopt -s lastpipe

exit_tests() {
    EXITMSG="$(cat /proc/self/fd/0)"
}

ls /bloop 2>&1 | exit_tests

echo "My output is \"$EXITMSG\""


$ bash shorttest.sh 
My output is "ls: cannot access '/bloop': No such file or directory"

What I'm doing here is:

  1. setting the shell option shopt -s lastpipe. It will not work without this as you will lose the file descriptor.

  2. making sure my stderr also gets captured with 2>&1

  3. piping the output into a function so that the stdin file descriptor can be referenced.

  4. setting the variable by getting the contents of the /proc/self/fd/0 file descriptor, which is stdin.

I'm using this for capturing errors in a script so if there is a problem with a command I can stop processing the script and exit right away.

shopt -s lastpipe

exit_tests() {
    MYSTUFF="$(cat /proc/self/fd/0)"
    BADLINE=$BASH_LINENO
}

error_msg () {
    echo -e "$0: line $BADLINE\n\t $MYSTUFF"
    exit 1
}

ls /bloop 2>&1 | exit_tests ; [[ "${PIPESTATUS[0]}" == "0" ]] || error_msg

In this way I can add 2>&1 | exit_tests ; [[ "${PIPESTATUS[0]}" == "0" ]] || error_msg behind every command I care to check on.

Now you can enjoy your output!

montjoy
1

Here's one way to do it after you've executed your command and decided that you want to store the result in a variable:

$ find . -name foo.txt
./home/user/some/directory/foo.txt
$ OUTPUT=`!!`
$ echo $OUTPUT
./home/user/some/directory/foo.txt
$ mv $OUTPUT somewhere/else/

Or if you know ahead of time that you'll want the result in a variable, you can use backticks:

$ OUTPUT=`find . -name foo.txt`
$ echo $OUTPUT
./home/user/some/directory/foo.txt
Nate W.
1

As an alternative to the existing answers: use a while loop if your file names can contain blank spaces, like this:

find . -name foo.txt | while IFS= read -r var; do
  echo "$var"
done

As I wrote, the difference is only relevant if you have to expect blanks in the file names.
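A null-delimited variant (relying on find's -print0 and bash's read -d '') also survives newlines in file names:

```shell
# -print0 separates results with NUL bytes; read -d '' consumes
# one NUL-terminated record at a time.
find . -name foo.txt -print0 | while IFS= read -r -d '' var; do
  echo "$var"
done
```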

NB: the only built-in variable of this kind ($?) holds the status of the last command, not its output.

0xC0000022L
0

I find remembering to pipe the output of my commands into a specific file a bit annoying; my solution is a function in my .bash_profile that catches the output in a file and returns the result when you need it.

The advantage with this one is that you don't have to rerun the whole command (when using find or other long-running commands that can be critical)

Simple enough, paste this in your .bash_profile:

Script

# catch stdin, pipe it to stdout and save to a file
catch () { cat - | tee /tmp/catch.out; }
# print whatever output was saved to the file
res () { cat /tmp/catch.out; }

Usage

$ find . -name 'filename' | catch
/path/to/filename

$ res
/path/to/filename

At this point, I tend to just add | catch to the end of all of my commands, because there's no cost to doing it and it saves me having to rerun commands that take a long time to finish.

Also, if you want to open the file output in a text editor you can do this:

# vim or whatever your favorite text editor is
$ vim <(res)
Connor
0

This is not strictly a bash solution, but you can use a pipe with sed to get the last row of the previous command's output.

First let's see what I have in folder "a":

rasjani@helruo-dhcp022206::~$ find a
a
a/foo
a/bar
a/bat
a/baz
rasjani@helruo-dhcp022206::~$ 

Then, your example with ls and cd turns, with sed and piping, into something like this:

rasjani@helruo-dhcp022206::~$ cd `find a |sed '$!d'`
rasjani@helruo-dhcp022206::~/a/baz$ pwd
/home/rasjani/a/baz
rasjani@helruo-dhcp022206::~/a/baz$

So the actual magic happens with sed: you pipe the output of whatever command into sed, and sed prints the last row, which you can use as a parameter with backticks. You can also combine this with xargs ("man xargs" in your shell is your friend).
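To see what sed '$!d' does on its own: it deletes every line except the last, so it behaves like tail -n 1:

```shell
# '$!d' means: on every line that is NOT the last ($!), delete (d).
printf 'a\nb\nc\n' | sed '$!d'    # c
printf 'a\nb\nc\n' | tail -n 1    # c
```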

rasjani
0
find . -name foo.txt 1> tmpfile && mv `cat tmpfile` /path/to/some/dir/

is yet another way, albeit dirty.

Kevin
0

The shell doesn't have Perl-like special variables that store the output of the last command.

Learn to use the pipe symbol with awk.

find . | awk '{ print "FILE:" $0 }'

In the example above you could do:

find . -name "foo.txt" | awk '{ print "mv "$0" ~/bar/" | "sh" }'
ascotan