I have a long-running python program that I'm trying to run on a remote server.
I've looked at "How to keep processes running after ending ssh session?", "How to start process via SSH, so it keeps running?", "Run a persistent process via ssh", and a few other topics, but they don't seem to help.
I've tried running the python process with `screen` (via detaching a screen containing a background process) and with `nohup`, but in both cases, when I exit the ssh session, the session hangs. (I'm not sure if this matters, but the ssh session is run with X11 forwarding, since the python program creates some graphics.) The ssh process hangs even if I redirect `stdin`, `stdout`, and `stderr` from/to `/dev/null`.
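Roughly, the attempts looked like the following (the script name `long_job.py` and session name are just stand-ins for my actual program):

```bash
# nohup attempt: detach all three standard streams from the terminal
nohup python long_job.py < /dev/null > /dev/null 2>&1 &

# screen attempt: start a detached session running the script
screen -dmS longjob python long_job.py
```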
Killing the ssh session kills the python process. When I kill the ssh, the following error message is printed on the remote server: `g_dbus_connection_real_closed: Remote peer vanished with error: Underlying GIOStream returned 0 bytes on an async read (g-io-error-quark, 0). Exiting.`
Furthermore, I don't actually want to redirect `stdout` or `stderr` to `/dev/null`; I want to redirect them to a log file. So I didn't try running the python process as a daemon. (Perhaps it's bad that the logging is sent to `stdout`, I guess...)
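Ideally I'd end up with something along these lines (the log path is just an example):

```bash
# what I want: survive logout, append stdout and stderr to a log file
nohup python long_job.py < /dev/null >> ~/long_job.log 2>&1 &
```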
What should I do so that I can (1) keep my process running after logging out, and (2) redirect `stdout`/`stderr` to a log file?
(One thing that "worked" was suspending the ssh process after it hangs and then resuming it in the background, but what if I want to shut off my computer?)
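That workaround amounted to roughly this, in the local shell where the ssh client was hanging:

```bash
# press Ctrl-Z to suspend the hung ssh client, then:
bg %1       # resume it in the background
disown %1   # detach it from this shell, but my local machine still has to stay on
```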