
Hello all fellow devs,

I have a problem on a production site (Laravel). Sometimes (around 4-8 requests an hour) the simplest requests cause errors like this:

```
[13-Dec-2020 18:07:07] WARNING: [pool www] child 514732, script '/home/****/****/public/index.php' (request: "GET /index.php") execution timed out (71.782847 sec), terminating
```
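For context on the warning itself: PHP-FPM emits this "execution timed out ... terminating" message when a request runs past the pool's `request_terminate_timeout`. That directive does not appear in the pool config posted below, so the value here is only an assumption to illustrate where the ~71.78 s kill comes from:

```ini
; Hypothetical pool setting (not shown in the posted config) that
; produces the "execution timed out ... terminating" warning once a
; request exceeds it. The actual value on these servers is unknown.
request_terminate_timeout = 70s
```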

When I check my access log I see that it is the simplest of requests. It even happens on requests where no MySQL query is executed at all. When I look at memory usage, about 1 GB is still available when it happens, and CPU tops out around 10%.

  • I use a DigitalOcean load balancer to route traffic to the servers.
  • There are about 60-120 requests per minute per server.
  • It is happening on all my servers (I have 4).
  • PHP is using OPcache.
  • MySQL is hosted on separate servers.
  • Redis is hosted on separate servers.
  • I have tried lowering and raising the number of FPM children.

PHP-FPM settings:

```ini
pm = dynamic
pm.max_children = 40
pm.start_servers = 15
pm.min_spare_servers = 15
pm.max_spare_servers = 25
;pm.process_idle_timeout = 10s;
pm.max_requests = 500
```

I hope you guys can help me; I have searched all over the internet but nothing has worked. Thank you so much.

Timo

  • What about the PHP code that gets executed? – Zoli Szabó Dec 13 '20 at 19:48
  • Sometimes it's a request with queries that normally return a response in 50 ms. But sometimes even this hangs: ```public function keepalive(){ return Array("status"=>200); }``` – Timo Dekker Dec 13 '20 at 19:49
  • Could it be because of MySQL's max_connections? – Zoli Szabó Dec 13 '20 at 19:53
  • I can see a stable 20 connections. I don't think this is the problem because I can't find any errors logged. When I break the connection with MySQL, I get a MySQL timeout error within 10 seconds. – Timo Dekker Dec 13 '20 at 20:02
  • Do you have any auto_prepend_file? Or is the delay really before you execute your first line of PHP? If you have ruled out any delay in code, I'd recommend checking the OS: MySQL sockets, RAM, or maybe max open files: https://stackoverflow.com/questions/3734932/max-open-files-for-working-process – Zach Rattner Dec 13 '20 at 21:34
  • @ZachRattner I do not use an auto_prepend_file. I don't think it is RAM: it just happened and only 0.6 GB of 2 GB was in use, and swap was 0. I did check max open files; my cat /proc/sys/fs/file-max was 9223372036854775807, so I did nothing there. My ulimit -Hn is 1048576, and my ulimit -Sn was 1024. I raised 1024 to 4096; I hope this helps. I use a hosted MySQL instance, so I don't really know how to check the MySQL sockets. – Timo Dekker Dec 14 '20 at 07:46

2 Answers


So it was an issue with the Redis connection.

The solution was to use TLS 1.2.

https://github.com/phpredis/phpredis/issues/1726
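For anyone landing here from Laravel: a minimal sketch of what that fix can look like in config/database.php, assuming the phpredis client; the hostname is a placeholder, and the rest of the connection array follows the stock Laravel layout:

```php
// config/database.php -- a sketch, assuming the phpredis client.
// The key point is the tlsv1.2:// scheme, which pins the PHP stream
// transport to TLS 1.2 instead of letting it negotiate TLS 1.3.
'redis' => [
    'client' => 'phpredis',

    'default' => [
        'host' => 'tlsv1.2://your-redis-domain-here', // placeholder host
        'password' => env('REDIS_PASSWORD'),
        'port' => env('REDIS_PORT', 6379),
        'database' => 0,
    ],
],
```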


The problem comes from the Redis connection using TLS 1.3 (https://bugs.php.net/bug.php?id=79501); Redis connections over TLS 1.3 are not stable. If you connect to Redis with the tls protocol, for example if your Redis host is tls://your-redis-domain-here, PHP will use TLS 1.3 by default. To resolve this issue you have to specify tlsv1.2 in your connection string, for example: tlsv1.2://your-redis-domain-here
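A minimal sketch of that connection string with the raw phpredis extension (hostname and password are placeholders):

```php
<?php

$redis = new Redis();

// With a plain tls:// scheme, PHP may negotiate TLS 1.3, which is
// where the unstable handshakes come from (PHP bug #79501).
// Pinning the stream transport to tlsv1.2:// works around it.
$redis->connect('tlsv1.2://your-redis-domain-here', 6379);
$redis->auth('your-redis-password');

var_dump($redis->ping()); // expect a PONG-style reply if the handshake succeeds
```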

– Tung Nguyen