
I am running a Debian 10 Buster server on a Google Cloud VM with PHP 7.3, Apache 2, and MySQL 8. On the server, I am running Laravel 8 as an API for two clients. I recently upgraded the VM to an n2-standard-2 (2 vCPUs, 8 GB memory) and wanted to check the throughput my server can handle.

My issue is that my server never reaches 100% CPU; in fact it only ever uses a maximum of about 50%. At first I thought this was because only one core was being used, but if I run top on the server while stress-testing a route, I can see that both cores are active, yet neither goes above 50%, as shown in the screenshot below. What is the problem here? Is there a way I can utilize both cores fully? I seem to have enough available memory, and disk I/O never spikes very high.

Laravel is running behind a reverse proxy on port 9595, and PID 19541 is the parent PID of the php artisan serve command. I don't have queueing on my Laravel app, but from what I can tell queues are mainly useful for longer, more CPU-intensive tasks, which this should not be (the test runs a number of fairly simple POST requests that read from MySQL).
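
For context, a minimal sketch of that kind of proxy setup; the hostname and certificate paths are placeholders, not my real ones, and it assumes mod_ssl, mod_proxy, and mod_proxy_http are enabled:

    <VirtualHost *:443>
        ServerName api.example.com

        # Terminate TLS here; artisan itself only speaks plain HTTP on 9595
        SSLEngine on
        SSLCertificateFile    /etc/ssl/certs/example.pem
        SSLCertificateKeyFile /etc/ssl/private/example.key

        # Hand every request to the `php artisan serve` process
        ProxyPreserveHost On
        ProxyPass        "/" "http://127.0.0.1:9595/"
        ProxyPassReverse "/" "http://127.0.0.1:9595/"
    </VirtualHost>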

[Screenshot: top output (per-core view) while running the stress test on the server]

EDIT: On further investigation I have found the following: I have 1 CPU core with 2 threads, and I can stress-test both threads to 100%. The reason I can never reach 100% with incoming HTTPS requests appears to be that my PHP 7.3 module only uses one thread. Changing MaxRequestWorkers in Apache makes no difference. My best guess is that PHP only uses the one thread. What I don't understand is why Apache does not automatically make use of both cores, as Apache should be able to do this.
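
For anyone looking for the same setting: on Debian, MaxRequestWorkers lives in the prefork MPM config. The sketch below just shows the stock defaults, not values I recommend:

    # /etc/apache2/mods-available/mpm_prefork.conf
    <IfModule mpm_prefork_module>
        StartServers              5
        MinSpareServers           5
        MaxSpareServers          10
        MaxRequestWorkers       150
        MaxConnectionsPerChild    0
    </IfModule>

With mod_php under the prefork MPM, each request is handled by its own Apache child process and the kernel can schedule those processes on either hardware thread, so raising this limit only matters if Apache itself is the component refusing connections.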


1 Answer

After a lot of testing I found the bottleneck: I was serving Laravel via php artisan serve behind a reverse proxy, and it was artisan that was turning away requests once it reached a certain load. php artisan serve wraps PHP's built-in development server, which on PHP 7.3 handles requests in a single process, so it is not meant for this kind of load. I avoided this by serving Laravel directly with Apache. The bottleneck disappeared and I can now use 100% of the CPU if I want.
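
Roughly, the virtual host I switched to looks like the sketch below; the hostname and paths are placeholders rather than my actual ones, and it assumes the Apache PHP module is installed and mod_rewrite is enabled:

    <VirtualHost *:80>
        ServerName api.example.com
        DocumentRoot /var/www/laravel/public

        <Directory /var/www/laravel/public>
            # Laravel's public/.htaccess needs AllowOverride All and mod_rewrite
            AllowOverride All
            Require all granted
        </Directory>

        ErrorLog  ${APACHE_LOG_DIR}/laravel-error.log
        CustomLog ${APACHE_LOG_DIR}/laravel-access.log combined
    </VirtualHost>

With something like this in place (plus a2enmod rewrite and an Apache reload), requests are handled by ordinary Apache worker processes, which the kernel schedules across both threads.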
