
My main question is: does wgMaxShellMemory limit the total memory used by all processes for shell tasks, like ImageMagick thumbnail creation? Or is it rather a per-process limit? The documentation for this setting seems vague.

I have a wiki which is using too much memory on a shared host. I've narrowed the culprit down to ImageMagick using too much memory when converting uploaded images. I've reduced the max upload size in php.ini, which will help, but I was hoping wgMaxShellMemory would serve as a hard limit for total memory usage by ImageMagick and other MediaWiki background processes.

As a side note, I also couldn't figure out from the documentation whether image conversion is part of the job queue, which would let wgJobRunRate slow down thumbnail creation.

gilrain

1 Answer


First, the memory limit only applies to GNU/Linux, not e.g. Windows.

Short answer: It's a per-process limit, set in the wrapper bash script. On systems where ulimit -v also applies to the shell's children (the limit isn't much use otherwise, and Linux is such a system), it constrains the spawned command too. So each ImageMagick call gets its own separate limit, not a share of a common pool.

Thumbnail generation for still images is not part of the job queue (multimedia such as video is a different matter), so wgJobRunRate will not throttle it.


Details:

By default, it works like this:

  1. When wfShellExec is called, it spawns a new bash process with proc_open.
  2. That bash process runs a script called limit.sh.
  3. limit.sh sets the memory limit (wgMaxShellMemory) using ulimit -v.
  4. It then runs the desired command, either under /usr/bin/timeout (if there's also a wall-clock limit) or via eval.
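As a rough sketch of those steps (this is not MediaWiki's actual limit.sh, which also applies CPU-time and file-size limits; MEM_KB, TIME_S, and CMD are illustrative stand-ins), the wrapper's shape is:

```shell
# Illustrative sketch only -- stand-in for MediaWiki's limit.sh.
# MEM_KB plays the role of wgMaxShellMemory (in kilobytes),
# TIME_S the wall-clock limit, CMD the command to run.
MEM_KB=102400
TIME_S=60
CMD='echo "thumbnailing command would run here"'

# Cap virtual memory for this shell and, on Linux, its children:
ulimit -v "$MEM_KB"

if command -v timeout >/dev/null 2>&1; then
    timeout "$TIME_S" bash -c "$CMD"   # wall-clock-limit path
else
    eval "$CMD"                        # plain path
fi
```

Because the cap is set inside this wrapper process, it dies with the wrapper; the next wfShellExec call starts a fresh wrapper with a fresh limit.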

The man page for bash's ulimit says of -v: "The maximum amount of virtual memory available to the shell, and, on some systems, to its children."
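You can check both halves of that sentence from the command line (the 102400 value is arbitrary): a limit set in a shell is inherited by that shell's children on Linux, while a separately spawned shell starts fresh, which is why two concurrent ImageMagick calls don't share one budget:

```shell
# Child processes inherit the limit set by their parent shell (on Linux):
bash -c 'ulimit -v 102400; bash -c "ulimit -v"'   # prints 102400

# A separately spawned shell is unaffected -- it gets its own fresh limit
# (typically "unlimited", unless something else has lowered it):
bash -c 'ulimit -v'
```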

NOTE: There is an option to enforce the limit with cgroups instead (wgShellCgroup), which would give you a shared cap, but it is off by default.
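If what you actually want is one shared cap across all shell commands, the cgroup route is the one to look at. Roughly, and only as a sketch (this assumes cgroup v1, root access, and a www-data web user; the paths and the 100M value are illustrative, not MediaWiki's defaults):

```shell
# Create a memory cgroup and cap it (cgroup v1; requires root):
mkdir -p /sys/fs/cgroup/memory/mediawiki/job
echo 100M > /sys/fs/cgroup/memory/mediawiki/job/memory.limit_in_bytes
chown -R www-data /sys/fs/cgroup/memory/mediawiki/job

# Then point MediaWiki at it in LocalSettings.php:
#   $wgShellCgroup = '/sys/fs/cgroup/memory/mediawiki/job';
```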

Matthew Flaschen