
The main problem is that I have two containers: containerA and containerB. containerB is the container running Portia, which I can't stop for external reasons. From containerA I need to execute a docker exec against containerB.

I've read about two main solutions. The first one, which I already tried and which works, is using ssh to run the script on the host, but I can't have a passwordless user, and hard-coding a password doesn't seem like the best way to do this. The second is mounting docker.sock via a docker compose file, but many people have said in comments that this is not secure.
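For reference, the docker.sock approach looks roughly like this (just an illustration; the image name and command are placeholders, and the same mount can be expressed as a volumes: entry in a compose file):

```sh
# Mount the host's Docker socket into containerA so it can talk to the
# host's Docker daemon and run "docker exec" against containerB.
docker run -d --name containerA \
  -v /var/run/docker.sock:/var/run/docker.sock \
  some-image-with-docker-cli          # placeholder image

# Then, from inside containerA:
docker exec containerB some-command   # placeholder command
```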

Can someone explain another way, or tell me if I'm wrong and why? Thanks for your time.

MarinerZZ
  • Have you gone through all the answers mentioned here https://stackoverflow.com/questions/44446472/docker-run-on-a-remote-host – mchawre Jun 24 '19 at 12:08
  • Hi, the answers mentioned there are all about ssh or docker.sock, except for one that I don't understand very well; it uses docker-machine. Do you mean that one? – MarinerZZ Jun 24 '19 at 12:39

2 Answers


You’ve basically highlighted the only two ways to directly run a command in another container. In particular, allowing docker exec access gives your process unlimited root-level control over the host, and any security issue in your setup opens the very real possibility of compromising the host (I have seen many SO questions with trivial shell-injection attacks on system("docker exec $COMMAND") type calls).

Best practice is to try to avoid docker exec as much as possible. It is a very helpful debugging tool, but it should not be in your core application flow at all. (It is very much the equivalent of “ssh as root to the server and...”, which is never a best practice.) If one container needs to request that another container does something, this is typically done via some sort of HTTP interface.
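For example, here is a minimal sketch of that idea, assuming containerB can run a small HTTP server and both containers sit on a shared Docker network (the network name, port, and endpoint path are made up for illustration):

```sh
# Put both containers on the same user-defined network so they can
# reach each other by container name.
docker network create app-net
docker network connect app-net containerA
docker network connect app-net containerB

# Inside containerB, run a small HTTP server (Flask, a tiny Go binary,
# whatever fits) that performs the task you would otherwise trigger
# with "docker exec".

# containerA then asks containerB to do the work over HTTP:
curl -X POST http://containerB:8080/run-task
```

This keeps the Docker daemon out of reach of your application code entirely; containerA can only do what the endpoint explicitly allows.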

David Maze

As you mentioned in your question, there are two possible ways you found to run docker commands remotely:

  • Using ssh
  • Using docker socket

both of which you think are not secure.

But that's not entirely true. I'm not sure about ssh, but the docker socket can be secured.

Check this out: https://docs.docker.com/engine/security/https/

You have to enable TLS on your docker daemon socket to secure it. Then you can run docker commands remotely in a secure manner.

Quoting the first paragraph from the link mentioned above.

By default, Docker runs through a non-networked UNIX socket. It can also optionally communicate using an HTTP socket.

If you need Docker to be reachable through the network in a safe manner, you can enable TLS by specifying the tlsverify flag and pointing Docker’s tlscacert flag to a trusted CA certificate.

In the daemon mode, it only allows connections from clients authenticated by a certificate signed by that CA. In the client mode, it only connects to servers with a certificate signed by that CA.
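In practice that looks something like the commands below (host name, certificate paths, and the command to run are placeholders; the linked page explains how to generate the certificates):

```sh
# On the Docker host: start the daemon on a TCP port with TLS verification.
dockerd --tlsverify \
  --tlscacert=ca.pem \
  --tlscert=server-cert.pem \
  --tlskey=server-key.pem \
  -H=0.0.0.0:2376

# From containerA (or any remote client) that has the client cert, key,
# and CA cert: talk to that daemon instead of a local socket.
docker --tlsverify \
  --tlscacert=ca.pem \
  --tlscert=cert.pem \
  --tlskey=key.pem \
  -H=your-docker-host:2376 \
  exec containerB some-command
```

Only clients holding a certificate signed by your CA can issue commands, which is what makes this safer than exposing the plain socket.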

Hope this helps.

mchawre
  • Yeah, it seems like that. If you want it done in a secure manner, then go ahead and follow this link https://docs.docker.com/engine/security/https/ to securely configure docker and try running commands remotely as described there. Also please upvote and accept the answer if it helped. :) – mchawre Jun 25 '19 at 07:07
  • So in this case you think using docker.sock is the best approach? I am new to docker and I have seen, for example in this other question: [https://stackoverflow.com/questions/35110146/can-anyone-explain-docker-sock](https://stackoverflow.com/questions/35110146/can-anyone-explain-docker-sock), that it is strongly recommended not to use docker.sock because of the attack surface it opens, as [David Maze](https://stackoverflow.com/users/10008173/david-maze) said in the other answer. – MarinerZZ Jun 25 '19 at 07:10
  • Ok, I will investigate your way a little more, thanks for the help – MarinerZZ Jun 25 '19 at 07:11