
I have a piece of code in most of my functions for logging purposes, now running in production, and we see log files of about 2 GB per day. Is it okay to write directly to log files using file_put_contents? We suspect file_put_contents will become a blocking point as the size and traffic increase. Can someone suggest a proper way to log to physical files without blocking PHP? Any other approach is most welcome.

file_put_contents(
    getcwd() . "/debug_log.txt", // note the "/": without it the file name is glued onto the cwd path
    print_r($updateFieldsArray, true) . "\n",
    FILE_APPEND | LOCK_EX
);

Thanks in advance.

jision
  • Writing logs into a DB could be a solution for your problem – Sfili_81 Mar 13 '19 at 11:18
  • @Sfili_81 yeah that can be an option too – jision Mar 13 '19 at 11:25
  • How about using PHP's own `error_log` function? – Dave Mar 13 '19 at 11:41
  • @dave I just went through http://php.net/manual/en/function.error-log.php and, as per this, it has memory limitations, although they can be configured. Also, what if the code is running in CLI? – jision Mar 13 '19 at 11:48
  • Not sure where you saw memory limitations. How PHP is being used doesn't matter, especially if you use the default message type. – Dave Mar 13 '19 at 11:52
  • If CLI vs. webserver is not an issue, this one makes sense (minimal example below); I was also thinking of using logrotate to keep the file size in control. Thanks @dave – jision Mar 13 '19 at 11:57
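
For reference, a minimal sketch of the `error_log` approach from the comments. With the default message type (0) the message goes to the destination set by the `error_log` ini directive, from both CLI and web SAPIs; the file path and the type-3 variant below are assumptions, not anything prescribed by the thread:

    // Default type (0): send to the destination configured by the
    // error_log ini directive (the SAPI log if unset); works from CLI too.
    error_log("updateFieldsArray: " . print_r($updateFieldsArray, true));

    // Type 3: append the raw message to a specific file instead
    // (no newline is added automatically, so add one yourself).
    error_log(print_r($updateFieldsArray, true) . "\n", 3, "/var/log/app/debug.log");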

1 Answer


Writing logs to files will eventually become a bottleneck if you keep growing (anything becomes a bottleneck at some scale), but we can't tell you whether it's something you should worry about now. You could start using a logging library that lets you change the storage backend through configuration, such as Monolog.
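
A minimal sketch of that idea, assuming Monolog 2.x installed via Composer (monolog/monolog); the channel name and file path are placeholders:

    <?php
    require __DIR__ . '/vendor/autoload.php';

    use Monolog\Logger;
    use Monolog\Handler\StreamHandler;

    // One channel writing to a file; swapping StreamHandler for a syslog,
    // socket or database handler is configuration, not a code rewrite.
    $log = new Logger('app');
    $log->pushHandler(new StreamHandler('/var/log/app/debug.log', Logger::DEBUG));

    $log->info('fields updated', ['fields' => $updateFieldsArray]);

The context array is serialized by the handler's formatter, so the print_r call from the question is no longer needed.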

These libraries are very helpful during development, too. You can do things like enable debug output for only the part of the app you're working on.
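
For example (building on the Monolog sketch above; the channel names and levels are hypothetical), you can give each part of the app its own channel and lower the threshold for just the one you're debugging:

    use Monolog\Logger;
    use Monolog\Handler\StreamHandler;

    // Most of the app only records INFO and above...
    $app = new Logger('app');
    $app->pushHandler(new StreamHandler('/var/log/app/app.log', Logger::INFO));

    // ...while the billing code you're currently debugging also records DEBUG.
    $billing = new Logger('billing');
    $billing->pushHandler(new StreamHandler('/var/log/app/billing.log', Logger::DEBUG));

    $app->debug('dropped: below the app channel threshold');
    $billing->debug('kept: the billing handler accepts DEBUG');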

You can always move the logs to a faster disk or stop logging information you don't need. The harder problem with logging to files is what you will do once you have more than one server (think load balancing and high availability): now you have to read the logs on every server to find anything.

A possible solution is to have all servers send their logs to a centralized log server. This can be done with syslog. Alternatively, each server can run a program that reads the log files as they are produced and stores the information in a central database; this is what Logstash does.
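
A minimal sketch of the syslog route using PHP's built-in functions on a *nix host (the tag and facility are assumptions); the local syslog daemon (rsyslog, syslog-ng) can then be configured to forward that facility to the central server:

    // Connect to the local system logger; rsyslog/syslog-ng decide where
    // messages with the local0 facility end up (a file, a remote host, ...).
    openlog('myapp', LOG_PID, LOG_LOCAL0);
    syslog(LOG_INFO, 'fields updated: ' . json_encode($updateFieldsArray));
    closelog();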

Joni
  • Thanks, that was very informative; as for central logging, we will be going with an ELK stack. – jision Mar 14 '19 at 06:50