
In my production environment I'm observing a sporadic issue where pages take a long time to load. In the error logs we are seeing:

PHP Fatal error:  Maximum execution time of 30 seconds exceeded

The affected line is where a session is being created for the user.

The sessions directory is physically local. It contains more than 3.5 million files. PHP's session garbage collection lifetime is set to 31 days.

The issue is sporadic, so I can't reproduce it on demand. The behaviour is consistent in that it is always the session start that takes more than 30 seconds to execute; the lines before it run fine. Listing the contents of the sessions directory (ls /var/www/sessions/) takes more than 45 seconds from the command line. I think application monitoring would help, but this looks like an issue at the system level.

I've looked at the CloudWatch metrics but don't see a bottleneck involving the disk reads there.

Could anyone advise on what issues we might be running into and how to resolve them?

  • How many items are in that directory? Is that directory physically local or a virtual one that is mounted locally? – Chris Haas Dec 21 '21 at 15:44
  • It sounds a bit like PHP is not cleaning out old sessions – RiggsFolly Dec 21 '21 at 15:53
  • Hello Chris Haas, the directories are physical. There are more than 3.5 million files in the directory. Garbage collection is set for 31 days for sessions in PHP. – Santosh Baruah Dec 21 '21 at 16:15
  • Does the 3.5 million align with the number of users you expect in a month? Besides not cleaning up, it is possible that you have something that is creating rogue sessions somehow. If it does align with your expectations, you might want to look into the [`N` parameter](https://stackoverflow.com/a/21452048/231316), although make sure to read all of the instructions and caveats. – Chris Haas Dec 21 '21 at 17:51
  • Bots don't keep sessions, so any bot visiting generates a new file for every page it visits. For application monitoring, is there a tool you'd recommend we use? – Santosh Baruah Dec 22 '21 at 14:00
  • The friendly bots identify themselves, so I would start with those and not bother with a session, if that's an option. [Google](https://developers.google.com/search/docs/advanced/crawling/overview-google-crawlers), [Microsoft](https://www.bing.com/webmasters/help/which-crawlers-does-bing-use-8c184ec0) and [Baidu](https://user-agents.net/browsers/baidu-browser). Longer term, do you need a session on things that a bot can get to, or could the session be moved to only after form interaction? Otherwise, I agree with the Redis or similar for session storage recommendation. – Chris Haas Dec 22 '21 at 14:19

1 Answer


PHP uses session.gc_probability (together with session.gc_divisor) to decide whether to run session garbage collection during a request. Make sure session.gc_probability is set to 0 in production so your API/page calls don't hang while the cleanup runs.
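
For example, here is a minimal sketch of disabling request-triggered garbage collection, assuming your code calls session_start() itself; in most setups you would put this in php.ini or the PHP-FPM pool configuration instead:

```php
<?php
// Sketch: prevent GC from ever running as part of a web request.
// Normally configured in php.ini / the FPM pool config rather than in code.
ini_set('session.gc_probability', '0');
session_start();
```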

I also suggest checking the session.gc_maxlifetime value; it tells you how long session files are kept before they become eligible for garbage collection.
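
A quick way to inspect the relevant settings is a throwaway script like the sketch below (keep in mind the CLI may read a different php.ini than your web server):

```php
<?php
// Sketch: print the session GC-related settings currently in effect.
$keys = [
    'session.save_path',
    'session.gc_maxlifetime',
    'session.gc_probability',
    'session.gc_divisor',
];
foreach ($keys as $key) {
    printf("%s = %s\n", $key, ini_get($key));
}
```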

You can call session_gc() to force the cleanup manually (and probably reproduce your issue); see https://www.php.net/manual/en/function.session-gc.php for details. If it hangs for too long when run from the command line, you might consider deleting the entire sessions folder instead (WARNING: this will destroy all users' sessions).
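
A rough sketch of such a cleanup script (run from the CLI or cron), assuming the default "files" save handler and the /var/www/sessions path from your question:

```php
<?php
// Sketch: manually trigger session garbage collection.
// session_gc() needs PHP 7.1+ and an active session; it returns the number
// of deleted session files, or false on failure.
ini_set('session.save_path', '/var/www/sessions');
ini_set('session.gc_maxlifetime', (string) (31 * 24 * 60 * 60)); // 31 days, as in the question

session_start();
$deleted = session_gc();
session_write_close();

echo ($deleted === false ? 'session_gc() failed' : "Deleted $deleted expired session files"), PHP_EOL;
```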

Note that some distros/packages automatically install a session garbage collection cron job. I ran into issues a long time ago where the folder contained too many files and the cron job simply hung (more details: https://serverfault.com/questions/511609/why-does-debian-clean-php-sessions-with-a-cron-job-instead-of-using-phps-built).

As a long-term solution, I would move away from file-based sessions and use Redis to handle them, especially on AWS where disk performance is not the best. I'm not sure which framework you use (most modern ones have built-in support for this), but you can also find framework-less solutions online.
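
As a sketch of what that can look like with the phpredis extension (assuming it is installed; the endpoint below is a placeholder, on AWS it would be your ElastiCache endpoint):

```php
<?php
// Sketch: Redis-backed sessions via the phpredis extension.
// The host below is a placeholder, not a real endpoint.
ini_set('session.save_handler', 'redis');
ini_set('session.save_path', 'tcp://your-redis-host:6379');

session_start();
$_SESSION['user_id'] = 42; // stored in Redis instead of on the local disk
```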

Paulo H.