
I'm running a jupyter/scipy-notebook Docker container. I have not restricted the memory assigned to the container with the run command.

However, the docker stats command shows that the container is limiting its memory usage to 2 GB (out of 16 GB available!), even when doing complex calculations.

How is this possible?

espogian

3 Answers


Adjust the resource (RAM) settings in Docker Desktop on macOS/Windows.

macOS - Docker Desktop

Preferences --> Advanced --> change the RAM setting

Windows - Docker Desktop

Settings --> Resources --> change the CPU / RAM / swap settings

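To confirm how much memory the Docker Desktop VM actually has after changing the setting, you can query the daemon (a quick sanity check; the exact numbers depend on your allocation):

```shell
# Total memory visible to the Docker daemon (the Docker Desktop VM on
# macOS/Windows; on Linux this is simply the host's RAM), in bytes.
docker info --format '{{.MemTotal}}'

# Live per-container usage; the LIMIT column should match the
# allocation above when no --memory flag was passed to docker run.
docker stats --no-stream
```

If the LIMIT shown by docker stats is still 2 GB, the Docker Desktop allocation (2 GB was the historical default) is the cap, not anything inside the container.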

Reference: compiled from the answers by @samirko and @Andris Birkmanis (Windows steps added).

jona

I am running Docker on macOS, and Jupyter crashed when trying to read a CSV file over 600 MB. Following Andris Birkmanis's instructions and increasing the memory allocated to Docker resolved the issue.

samirko
  • Well, the system does not allow me to comment on others' answers, and I thought it of general benefit to emphasize the importance of adjusting Docker's memory on macOS. – samirko Jun 25 '20 at 05:50

If everything is working as expected, Docker shouldn't limit memory usage at all by default. So the MEM USAGE / LIMIT shown by docker stats [containerid] should equal your total memory (16 GB in your case), although that memory is available rather than free.

Furthermore, there's no way to set a default memory limit when invoking dockerd, so the only thing I can propose is to specify a memory limit in docker run:

  • -m, --memory="" Memory limit (format: <number>[<unit>]). Number is a positive integer. Unit can be one of b, k, m, or g. Minimum is 4M.
  • --memory-swap="" Total memory limit (memory + swap, format: <number>[<unit>]). Number is a positive integer. Unit can be one of b, k, m, or g.
  • --memory-reservation="" Memory soft limit (format: <number>[<unit>]). Number is a positive integer. Unit can be one of b, k, m, or g.
  • --kernel-memory="" Kernel memory limit (format: <number>[<unit>]). Number is a positive integer. Unit can be one of b, k, m, or g. Minimum is 4M.
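For example, the flags above combine like this (the limits and the use of the jupyter/scipy-notebook image are illustrative):

```shell
# Hard-cap container memory at 10 GB, allow 2 GB of swap on top
# (--memory-swap is memory + swap, so 12g = 10g RAM + 2g swap), and
# set an 8 GB soft limit that the kernel reclaims toward under
# memory pressure.
docker run -d \
  -m 10g \
  --memory-swap 12g \
  --memory-reservation 8g \
  jupyter/scipy-notebook
```

Note that on Docker Desktop these limits are still bounded by the memory allocated to the Docker VM itself.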

For more information, see the Docker documentation on runtime options.

Try docker run --memory-reservation=10g ... and see if that changes anything.

Alejandro Galera
  • This answer is very helpful, but unfortunately I don't think this is a Docker configuration issue. As you correctly point out, Docker does not restrict the available memory by default. Even with the parameter --memory-reservation=10g, the container is still using 2 GB of memory... I think this is more something related to the configuration of Jupyter itself inside the container, but I can't find the solution. – espogian Jun 13 '18 at 09:07
  • I don't think it's a Jupyter issue, because I've just tested it on my Debian 9 and the limit goes up to my full 16 GB. – Alejandro Galera Jun 13 '18 at 09:25
  • 1
    There is also an engine-wide limit of resources; e.g., using Docker Desktop on Mac, it can be set in Preferences->Advanced. – Andris Birkmanis Mar 31 '19 at 03:44