
I want to start a bash subshell, (1) run a few commands in it, and (2) then remain in that subshell to do as I please. I can do each of these individually:

  1. Run command using -c flag:

    $> bash -c "ls; pwd; <other commands...>"
    

    however, it immediately returns to the "super" shell after the commands are executed. I can also just run an interactive subshell:

  2. Start new bash process:

    $> bash
    

    and it won't exit the subshell until I say so explicitly... but I can't run any initial commands. The closest solution I've found is:

    $> bash -c "ls; pwd; <other commands>; exec bash"
    

    which works, but not the way I wanted to, as it runs the given commands in one subshell, and then opens a separate one for interaction.

I want to do this on a single line. Once I exit the subshell, I should return to the regular "super" shell without incident. There must be a way!

NB: What I am not asking...

  1. not asking where to get a hold of the bash man page
  2. not asking how to read initializing commands from a file... I know how to do this, it's not the solution I'm looking for
  3. not interested in using tmux or gnu screen
  4. not interested in giving context to this. I.e., the question is meant to be general, and not for any specific purpose
  5. if possible, I want to avoid using workarounds that sort of accomplish what I want, but in a "dirty" way. I just want to do this on a single line. In particular, I don't want to do something like xterm -e 'ls'
SABBATINI Luca
  • I can imagine an Expect solution, but it's hardly the one-liner you want. In what way is the `exec bash` solution unsuitable for you? – glenn jackman Mar 09 '12 at 16:21
  • @glennjackman sorry, I'm not familiar with the jargon. What is an "Expect solution"? Also, the `exec bash` solution involves two separate subshells. I want one continuous subshell. – SABBATINI Luca Mar 09 '12 at 19:18
  • The beauty of `exec` is that it *replaces* the first subshell with the second, so you're only left with 1 shell below the parent. If your initialization commands set environment variables, they will exist in the exec'ed shell. – glenn jackman Mar 09 '12 at 19:45
  • Possible duplicate: http://stackoverflow.com/questions/7120426/invoke-bash-run-commands-inside-new-shell-then-give-control-back-to-user – Ciro Santilli OurBigBook.com Mar 22 '16 at 10:35
  • I gave up, but I didn't see this mentioned here, so FWIW: `RUN_CONDITIONAL=true bash --login` and put your commands in .bash_profile or whatever. No idea on the portability. – Cory Mawhorter May 10 '18 at 18:19
  • And the problem with `exec` is that you lose anything that's not passed down to subshells via the environment, such as non-exported variables, functions, aliases, ... – cjs Sep 15 '18 at 03:49
  • +1 for adding all these clarifications in the question; it heads off a lot of unhelpful replies – phette23 Jul 07 '21 at 18:40
  • Did you find a satisfactory answer? – a06e Oct 16 '21 at 20:47
  • Using the temporary named pipes trick is the only thing that lets me set `trap` calls in the newly-spawned subshell, and then have them still present when the human user gets to type into the subshell. Cheers! – Ti Strga Aug 22 '22 at 15:32
  • Regarding your point #3. It's funny, I found your question while looking for the same thing, but I was looking for it specifically to use it inside screen :) I wanted to start screen split in two, run two shells, and then run two docker containers inside. Then I want to be able to kill one of the containers, and quickly run it again, so I wanted to have it exit back to the shell, ready to re-run. – Gene Pavlovsky Jan 16 '23 at 15:29
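
The two `exec` properties discussed in the comments above can be checked directly. A small illustrative sketch (the variable name `SUBSHELL_DEMO` is arbitrary, not from the thread):

```shell
# exec replaces the shell process but keeps its environment,
# so exported variables survive:
bash -c 'export SUBSHELL_DEMO=bar; exec bash -c "echo \"\$SUBSHELL_DEMO\""'      # bar

# non-exported variables (and likewise functions and aliases) are lost:
bash -c 'SUBSHELL_DEMO=bar; exec bash -c "echo \"\${SUBSHELL_DEMO:-unset}\""'    # unset
```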

11 Answers

137

This can be easily done with temporary named pipes:

bash --init-file <(echo "ls; pwd")

Credit for this answer goes to the comment from Lie Ryan. I found this really useful, and it's less noticeable in the comments, so I thought it should be its own answer.
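
For the curious: `<(...)` is bash process substitution. The shell runs the inner command and replaces the construct with a readable path such as `/dev/fd/63` (the exact path varies by system), which `--init-file` then reads like an ordinary file. A quick way to see what bash actually reads:

```shell
# the substitution expands to a file path (e.g. /dev/fd/63):
echo <(echo "ls; pwd")

# and its contents are exactly the init commands:
cat <(echo "ls; pwd")     # ls; pwd
```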

Jonathan Potter
  • This presumably means that `$HOME/.bashrc` is not executed though. It would have to be included from the temporary named pipe. – Hubro Aug 04 '15 at 21:47
  • To clarify, something like this: `bash --init-file <(echo ". \"$HOME/.bashrc\"; ls; pwd")` – Hubro Aug 04 '15 at 22:02
  • This is so gross but it does work. I can't believe bash doesn't support this directly. – Pat Niemeyer Feb 24 '17 at 16:07
  • @Hubro, what is the purpose of the `.` in the command? Without it, I get a "permission denied" on my `.bashrc`, but I don't understand why, and how the `.` fixes it. – Gus Mar 12 '17 at 02:01
  • @Gus, the `.` is a synonym for the `source` command: https://ss64.com/bash/source.html. – Jonathan Potter Mar 12 '17 at 03:18
  • This does not seem to work on older versions of bash, I've tried it on bash v3.2.57 and although bash does start, the commands inside braces do not get executed. – Ján Lalinský May 16 '17 at 11:07
  • Is there a way to make it work with user switching, such as `sudo bash --init-file <(echo "ls; pwd")` or `sudo -iu username bash --init-file <(echo "ls; pwd")`? – jeremysprofile Aug 21 '18 at 18:52
  • After running the above command as-is, I get: `-sh: syntax error near unexpected token `('` This works: `echo "ls; pwd" > initfile ; bash --init-file initfile` – FractalSpace Aug 27 '18 at 14:41
  • Thanks! I thought it'd be worth to mention that my issue was solved with this solution in combination with exec, i.e.: `bash -c "exec bash --init-file <(echo 'ls; pwd')"` – tor Oct 12 '20 at 13:07
  • @FractalSpace - are you sure you were executing the command from a bash prompt? Looks like a tcsh error to me. – hepcat72 Aug 30 '21 at 14:33
  • Seemingly, no matter how I do this, the shell exits immediately. I'm trying to start a bash shell (from tcsh) that initializes an environment for a project that requires bash. I'd like to create an alias in tcsh that switches to bash, cd's to the project directory, and initializes that environment, but it seems the best I can do is enter bash, then use a bash alias to do it: so 2 commands instead of 1. – hepcat72 Aug 30 '21 at 15:37
  • @hepcat72 I always use bash, sometimes dash but never tcsh. Also, I no longer see that error. – FractalSpace Aug 31 '21 at 18:33
  • Assumes you're starting the new bash shell from a bash shell. Use the answer below if you're not. – Evan Jul 08 '22 at 17:33
  • @Evan has a crucial comment that other commenters may have missed: the `<(...)` syntax is parsed **by the same outer shell that's starting bash**, so if you're running some other shell intending to _launch_ bash, this won't work directly. Either use ExpoBi's answer, or run bash _and then_ use this syntax. – Ti Strga Aug 22 '22 at 15:09
21

Try this instead:

$> bash -c "ls; pwd; <other commands...>; $SHELL"

The trailing $SHELL starts your shell in interactive mode, so the session stays open until you close it with exit.

chicks
ExpoBi
  • FYI, this opens a new shell afterwards, so if any of the commands affect the current shell's state (e.g. sourcing a file) it might not work as expected – ThiefMaster Apr 08 '17 at 14:47
  • @ThiefMaster you can easily circumvent this behavior (if unwanted) by wrapping the init commands in a subshell: `bash -c "(ls;pwd;other commands;);$SHELL"` – gallo Feb 14 '22 at 23:57
  • Nice tip, but only partly true, @gagallo7; environment variables are inherited "down only", and since that will be a child shell, the parent will not see any env var assignments. See comments on original question for use of exec to retain env vars (though it probably loses anything your comment would preserve). – Justin Feb 23 '22 at 18:06
18

You can do this in a roundabout way with a temp file, although it will take two lines:

echo "ls; pwd" > initfile
bash --init-file initfile
Eduardo Ivanec
  • For a nice effect you can make the temp file remove itself by including `rm $BASH_SOURCE` in it. – Eduardo Ivanec Mar 09 '12 at 17:05
  • Eduardo, thank you. That's a nice solution, but... are you saying that this can't be done **without** having to fiddle with file I/O. There's obvious reasons why I would prefer to keep this as a self contained command, because the moment files come into the mix I'll have to start worrying about how to make random temp-files and then, as you mentioned, deleting them. It just requires so much more effort this way if I want to be rigorous. Hence the desire for a more minimalistic, elegant solution. – SABBATINI Luca Mar 09 '12 at 17:20
  • Use the mktemp utility to create a unique temp file. – cjc Mar 09 '12 at 17:39
  • @SABBATINILuca: I'm not saying anything like that. This is just a way, and `mktemp` does solve the temp file issue as @cjc pointed out. Bash *could* support reading the init commands from stdin, but as far as I can tell it doesn't. Specifying `-` as the init file and piping them half works, but Bash then exits (probably because it detected the pipeline). The elegant solution, IMHO, is to use exec. – Eduardo Ivanec Mar 09 '12 at 17:50
  • Doesn't this also override your normal bash initialisation? @SABBATINILuca, what are you trying to achieve with this? Why do you need to spawn a shell, auto-run some commands, and then keep that shell open? – user9517 Mar 09 '12 at 17:58
  • @Lain, yes it would be slightly more complicated to run the original init ("rc") files alongside the few extra commands that I want to run at the beginning. This is why I consider Eduardo's solution, as solid as it is, to be just less than ideal. But like I said earlier, the question is posed as a hypothetical one. I think there ought to be a trivially straightforward way to do this with 3-5 extra characters, but already I count **at least** ~4 lines worth of work: making a temp file, echoing your commands into that file, running bash, removing that file, etc... Nevermind why I want to do it. – SABBATINI Luca Mar 09 '12 at 18:04
  • @EduardoIvanec thanks for the mktemp command, I had forgotten about it. Sorry to press you further (you've been very helpful), but how exactly do you suggest exec should be used? Do you mean as I had written it in my original post? Also, could you tell me why you think it would be the most elegant solution? – SABBATINI Luca Mar 09 '12 at 18:13
  • This is an old question, but Bash can create temporary named pipes by using the following syntax: `bash --init-file <(echo "ls; pwd")`. – Lie Ryan Feb 13 '14 at 14:59
6

Why not use native subshells?

$ ( ls; pwd; exec $BASH; )
bar     foo     howdy
/tmp/hello/
bash-4.4$ 
bash-4.4$ exit
$

Enclosing commands in parentheses makes bash spawn a subprocess to run them, so you can, for example, alter the environment without affecting the parent shell. This is basically a more readable equivalent of bash -c "ls; pwd; exec $BASH".
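
That isolation can be checked directly; a tiny illustration (the variable name is arbitrary, not from the answer):

```shell
# changes made inside ( ... ) stay inside it:
( cd /tmp; SUBSHELL_DEMO=bar; )
pwd                              # still the original directory
echo "${SUBSHELL_DEMO:-unset}"   # unset
```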

If that still looks verbose, there are two options. One is to have this snippet as a function:

$ run() { ( eval "$@"; exec $BASH; ) }
$ run 'ls; pwd;'
bar     foo     howdy
/tmp/hello/
bash-4.4$ exit
$ run 'ls;' 'pwd;'
bar     foo     howdy
/tmp/hello/
bash-4.4$ exit
$

Another is to make exec $BASH shorter:

$ R() { exec $BASH; }
$ ( ls; pwd; R )
bar     foo     howdy
/tmp/hello/
bash-4.4$ exit
$

I personally like the R approach more, as there is no need to play with string escaping.

toriningen
  • I'm not sure if there are any caveats to using exec for the scenario the OP has in mind, but for me this is by far the best solution proposed, because it uses plain bash commands without any string-escaping issues. – Peter Oct 01 '19 at 09:05
5

What you need is to execute a startup script, then proceed with the interactive session.

The default startup script is ~/.bashrc; another script can be given with the --init-file option. Note that if you simply pass an --init-file option, your script replaces the default rather than augmenting it.

The solution is to pass, using the <(...) syntax, a temporary script that sources the default ~/.bashrc followed by any other commands:

bash --init-file <(echo ". ~/.bashrc; ls; pwd; ### other commands... ###")
5

The "Expect solution" I was referring to is programming a bash shell with the Expect programming language:

#!/usr/bin/env expect
set init_commands [lindex $argv 0]
set bash_prompt {\$ $}              ;# adjust to suit your own prompt
spawn bash
expect -re $bash_prompt {send -- "$init_commands\r"}
interact
puts "exiting subshell"

You'd run that like: ./subshell.exp "ls; pwd"

glenn jackman
  • I guess this would have the advantage of registering the commands in the history too, am I wrong? Also I'm curious if bashrc/profile is executed in this case? – muhuk Nov 07 '15 at 10:49
  • Confirmed that this allows you to put commands in the history, which the other solutions don't. This is great for starting a process in a .screenrc file -- if you exit the started process, you don't close the screen window. – Dan Sandberg Oct 29 '18 at 13:56
1

I don't have enough reputation here to comment yet, but I think the various solutions with bash's --rcfile (aka --init-file) are the most practical.

For a POSIX-compatible solution, you could also try something like

sh -si < my_init_file

where my_init_file terminates with

exec </dev/tty

which connects stdin to the terminal.

You could even try

sh -i -c 'my startup commands; exec </dev/tty'

Note that making the shell interactive (with -i) means that certain errors are treated more leniently.

1

If sudo -E bash does not work, I use the following, which has met my expectations so far:

sudo HOME=$HOME bash --rcfile $HOME/.bashrc

I set HOME=$HOME because I want my new session to have HOME set to my user's HOME, rather than root's HOME, which happens by default on some systems.

jago
0

$ bash --init-file <(echo 'ls; pwd')
$ bash --rcfile <(echo 'ls; pwd')

In case you can't use process substitution:

$ cat script
ls; pwd
$ bash --init-file script

With sh (dash, busybox):

$ ENV=script sh

Or:

$ bash -c 'ls; pwd; exec bash'
$ sh -c 'ls; pwd; exec sh'
x-yuri
0

Less elegant than --init-file, but perhaps more instrumentable:

# define whatever functions (or aliases, variables, ...) the session should start with
fn(){
    echo 'hello from exported function'
}

# then act as a minimal read-eval loop in place of the interactive shell
while read -r -a commands
do
    eval "${commands[@]}"
done
0

I accomplish basically the same thing by just using a script, typically for the purpose of setting environment variables for use in a specific project directory;

$ cat shell.sh
#!/bin/bash
export PATH=$PWD/bin:$PATH
export USERNAME=foo
export PASSWORD=bar
export DB_SERVER=http://localhost:6001
bash

$ echo ${USERNAME:-none}
none

$ ./shell.sh

$ echo $USERNAME
foo

This drops you into an interactive bash shell after all the environment adjustments are made; you can update this script with the relevant other commands you want to run.

user5359531