
I am currently running a Mac mini that hosts anywhere from 2 to 6 Docker containers at any time, all built from the same image. The containers function as build agents, but every now and then a build fails and does not clean up correctly, leaving behind files that take up space.

I want to set up a cron job that runs nightly, executing a docker exec command against every container to delete my build directory.

I've tried the simple: docker exec -it `docker ps -q` ls /root/build-dir

but this fails with: OCI runtime exec failed: exec failed: container_linux.go:344: starting container process caused "exec: \"ee89958ce4bc\": executable file not found in $PATH": unknown

However this works: docker exec -it ee89958ce4bc ls /root/build-dir

Is there a way to use docker exec against multiple containers without writing a complicated script that loops through the container IDs from docker ps?

NOTE: I do not want to change the Docker containers and add the cron job there; I want this to run on the host machine.

Riaz Abed

1 Answer


You can use a for loop. Your one-liner fails because docker ps -q returns several IDs, and docker exec only accepts a single container: the first ID is used as the container and the next one is treated as the command to run, which is why it complains that "ee89958ce4bc" is not an executable. Looping runs docker exec once per container:

for con in $(docker ps -q); do echo "$con" && docker exec -it "$con" ls -l; done
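
For the nightly cleanup itself, a minimal crontab sketch on the host could look like the following. The 02:00 schedule and the /usr/local/bin location of the docker binary are assumptions to adjust for your setup; the -t flag is dropped because cron provides no TTY, and the rm glob is wrapped in sh -c so it expands inside the container rather than on the host:

PATH=/usr/local/bin:/usr/bin:/bin
# every night at 02:00, clear the build directory inside each running container
0 2 * * * for con in $(docker ps -q); do docker exec "$con" sh -c 'rm -rf /root/build-dir/*'; done

If you prefer to avoid the loop entirely, a one-liner such as docker ps -q | xargs -I{} docker exec {} sh -c 'rm -rf /root/build-dir/*' works as well, and you can switch the command to rm -rf /root/build-dir if you want the directory itself removed rather than just its contents.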
LinPy
    Thank you @LinPy works perfectly. A shame that docker doesn't cater for something like this. – Riaz Abed Aug 06 '19 at 09:44
  • @Linpy do not forget to credit this answer https://stackoverflow.com/questions/56880821/docker-exec-on-all-running-containers/56881399#56881399 – Adiii Aug 06 '19 at 10:16
  • I just wrote the answer on the fly; it did not require too much knowledge to answer it :) – LinPy Aug 06 '19 at 10:20