
I've set up some scheduled jobs on GitLab. They run parallel jobs every minute and reach about 500 simultaneous processes. At some point the runner throws the error below and fails.

Running with gitlab-runner 15.5.0 (0d4137b8)
  on XXXX y4s23Bpz
Preparing the "shell" executor
00:00
Using Shell executor...
Preparing environment
00:00
ERROR: Job failed (system failure): prepare environment: failed to start process: fork/exec /usr/bin/su: too many open files. Check https://docs.gitlab.com/runner/shells/index.html#shell-profile-loading for more information

I've increased the open files limit, but that didn't resolve the problem. I can't figure out which limit it is actually hitting (what I've checked so far is shown after the ulimit output below).

~$ ulimit -a 
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 515311
max locked memory       (kbytes, -l) 65536
max memory size         (kbytes, -m) unlimited
open files                      (-n) 999999
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 515311
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
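
For what it's worth, this is how I've been trying to check which limit actually applies to the runner process itself, since the interactive shell's ulimit values don't necessarily apply to the service. This is only a sketch assuming the runner is installed as a systemd service/process named gitlab-runner; the service name and values may differ on other setups.

# Limits of the running gitlab-runner process, not the interactive shell
~$ cat /proc/$(pgrep -o -x gitlab-runner)/limits | grep -i 'open files'

# System-wide file handle usage versus the global limit
~$ cat /proc/sys/fs/file-nr
~$ cat /proc/sys/fs/file-max

# Raise the limit for the service via a systemd override, then restart it
~$ sudo systemctl edit gitlab-runner
#   [Service]
#   LimitNOFILE=1048576
~$ sudo systemctl daemon-reload
~$ sudo systemctl restart gitlab-runner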