
I keep getting a connection timeout while pulling an image:

[screenshot: `docker pull` output showing layers downloading and a connection timeout]

First, it starts downloading the first 3 layers; once one of them finishes, the 4th layer tries to start downloading. The problem is that it won't actually start until the two remaining layers finish their downloads, and before that happens (I think) the fourth layer fails to start downloading and aborts the whole process. So I was wondering whether downloading the layers one by one would solve this problem, or whether there is a better way/option to deal with this issue, which can occur when you don't have a very fast internet connection.

Asme Just
  • I don't think you can download docker images from a browser. Maybe there is a command to `resume` the download? – Jatin Dec 23 '16 at 15:15
  • Yeah, I wish/hope there is one. And the biggest problem is that even if 9 of the 10 layers have finished (and even been extracted), if the 10th layer fails you'll have to download all of them again (including the ones that had already finished). – Asme Just Dec 23 '16 at 15:39
  • It seems there is an [open issue](https://github.com/docker/docker/issues/12823) for that. Also, this SO question: [How to resume downloading image when interrupted?](http://stackoverflow.com/questions/35315735/docker-how-to-resume-downloading-image-when-interrupted) – Jatin Dec 24 '16 at 18:59
  • `--max-concurrent-downloads` solved the timeout problem, so I can rely on it until they implement better download management. – Asme Just Dec 24 '16 at 19:33

4 Answers


The Docker daemon has a `--max-concurrent-downloads` option. According to the documentation, it sets the maximum number of concurrent downloads for each pull.

So you can start the daemon with `dockerd --max-concurrent-downloads 1` to get the desired effect.

See the dockerd documentation for how to set daemon options on startup.
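If you want the setting to persist across daemon restarts, a common alternative (assuming your installation reads `/etc/docker/daemon.json`, the default location on most Linux setups) is a minimal config like:

    {
      "max-concurrent-downloads": 1
    }

followed by a daemon restart, e.g. `sudo service docker restart`.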

Harald Albers

If Docker is already running (e.g. as a service on Ubuntu), follow these steps:

sudo service docker stop
sudo dockerd --max-concurrent-downloads 1
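While that foreground daemon is running, pull your image from a second terminal. A minimal sketch (`ubuntu:16.04` is just a placeholder for whatever image you actually need):

    # second terminal: layers now download at most one at a time
    docker pull ubuntu:16.04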

Once your images are downloaded, stop this terminal and start the daemon again as it was before:

sudo service docker start
Pankaj Cheema

There are 2 ways:

  1. Permanent change: edit the Docker settings file:

sudo vim /etc/docker/daemon.json

with the following JSON content:

{ "max-concurrent-uploads": 1, "max-concurrent-downloads": 4 }

After saving the file, run `sudo service docker restart`.

  2. Temporary change

Stop Docker with

sudo service docker stop

then run

sudo dockerd --max-concurrent-uploads 1

At this point, start the push in another terminal; it will transfer the layers one by one. When you have finished, restart the service (or the computer).
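For example, in the other terminal (the registry and image name below are only placeholders):

    docker push registry.example.com/myapp:latest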

Ken

Building on the previous answers: in my case I couldn't do `service stop`, and I also wanted to make sure I would restart the Docker daemon in the same state, so I followed these steps:

  1. Record the command line used to start the docker daemon:

     ps aux | grep dockerd
    
  2. Stop the docker daemon:

     sudo kill <process id retrieved from previous command>
    
  3. Restart the Docker daemon with the `max-concurrent-downloads` option: use the command retrieved in the first step and add `--max-concurrent-downloads 1` (a hypothetical example follows).
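For instance (purely illustrative; the real flags are whatever step 1 printed on your machine), if step 1 showed `/usr/bin/dockerd -H unix://`, the restarted command would be:

    sudo /usr/bin/dockerd -H unix:// --max-concurrent-downloads 1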

Additionally

You might still run into a problem where, even with a single download at a time, your pull is aborted at some point and the layers that were already downloaded get erased. It's a bug, but it was my case.

A workaround in that case is to deliberately keep the layers that have already been downloaded.

The way to do that is to regularly abort the pull manually, NOT by killing the `docker` command, but by killing the Docker daemon.

Actually, it's the daemon that erases already-downloaded layers when the pull fails, so by killing it, it can't erase those layers. The `docker pull` command does terminate, but once you restart the Docker daemon and relaunch your `docker pull` command, the downloaded layers are still there.
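A rough sketch of that workaround (the image name is a placeholder, and `pidof`/`service` are just one common way to find and restart the daemon on a typical Linux install):

    # 1. start the pull as usual
    docker pull some/big-image

    # 2. before the pull fails on its own, kill the daemon (not the docker client)
    sudo kill $(pidof dockerd)

    # 3. bring the daemon back and re-run the pull; already-downloaded layers are kept
    sudo service docker start
    docker pull some/big-image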

Vic Seedoubleyew