I am managing a service that serves JSON files to my users. They are all loading the same cached JSON file, but there are a lot of users: on an average day the file is loaded over 2 million times.
With this amount of traffic I am running off a VPS with only 1 GB of RAM. I've been dipping into swap a lot lately, and I'm getting error messages like this one:
server reached MaxRequestWorkers setting, consider raising the MaxRequestWorkers setting
I have increased the worker count to 250, which is probably a mistake right there. The average Apache process uses 22.5 MB of memory, and I have around 800 MB free for Apache.
Based on that, I assume I should be running about 35 workers (800 MB ÷ 22.5 MB ≈ 35) so that I stay out of swap, which adds extra latency. But would that slow down the total JSON response time? The users never load my webpage at all; in fact they should never see anything to do with connecting to my site, so they won't be refreshing and hammering it with extra requests.
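To make the sizing explicit, here is a sketch of the arithmetic (the 800 MB and 22.5 MB figures are from my measurements above; rounding down is deliberate, since overshooting is what pushes the box into swap):

```python
# Rough capacity math for sizing MaxRequestWorkers on a memory-bound box.
free_mb = 800        # memory left over for Apache after the OS and other services
per_worker_mb = 22.5 # observed average resident size of one Apache process

# Round DOWN: one worker too many is enough to start swapping under full load.
max_workers = int(free_mb // per_worker_mb)
print(max_workers)  # 35
```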
What should I do here? Do I need to upgrade, or just tune Apache better? I'm not sure whether in this case I need more workers or whether I should set it to a low value like 35.
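For reference, this is roughly what I'd be setting if I went with the 35-worker figure, assuming the prefork MPM and Apache 2.4 directive names (the StartServers/spare-server values are just illustrative placeholders, not something I've tested):

```apache
<IfModule mpm_prefork_module>
    StartServers             5
    MinSpareServers          5
    MaxSpareServers         10
    MaxRequestWorkers       35
    ServerLimit             35
    MaxConnectionsPerChild  1000
</IfModule>
```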
And here are some usage graphs from the 4-core, 1 GB VPS (the spike right at the end is when I restarted Apache):