
I have a request that requires my server to make about 3 external requests before it returns to the user, which makes the request take about 0.8 s to execute per user. What tips should one follow to optimize it so that the server can handle several requests per second? I use Apache+PHP+MySQL. Currently, on an AWS micro instance, I can't handle even 5 requests per second!

  • Have you pinpointed where the time is spent? Is your app slow in processing the external results, or are the external servers slow to respond? You need to measure before you can optimize. – Martijn Heemels Nov 09 '11 at 22:06
  • Modify your service to not make external requests, or at least do some caching if possible, so that you don't have to perform them very often. – Zoredache Nov 09 '11 at 23:51
  • I think you are asking (in addition to speeding up the requests) how you can get Apache to handle more requests per second. On your t1.micro, you will max out memory well before you max out CPU for Apache requests. If possible, avoid mod_php (use php-fpm ideally, or mod_fcgid if that isn't possible), and if you don't need Apache, try a lighter server (such as Nginx or lighttpd). Cache (memcached, redis, varnish, etc.) whatever you can (both requests and responses, if they aren't completely individual). Determine the per-process memory usage of Apache, and set MaxClients conservatively. – cyberx86 Nov 10 '11 at 00:01
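A minimal sketch of the kind of tuning that last comment describes, assuming Apache 2.2's prefork MPM with mod_php on a t1.micro (roughly 600 MB of RAM) and about 30 MB per Apache process; all numbers here are illustrative, measure your own per-process memory before copying them:

```apache
# Illustrative prefork values for a ~600 MB t1.micro running Apache+PHP+MySQL.
# Measure the real per-process RSS (ps/top) and leave headroom for MySQL/OS.
<IfModule mpm_prefork_module>
    StartServers          2
    MinSpareServers       2
    MaxSpareServers       4
    # ~10 processes x ~30 MB each still leaves room for MySQL and the OS.
    MaxClients           10
    # Recycle children periodically to limit memory creep from PHP.
    MaxRequestsPerChild 500
</IfModule>

# Short keep-alives so idle clients don't pin scarce worker processes.
KeepAlive On
KeepAliveTimeout 2
```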

1 Answer


I expect much of your time is spent blocked waiting for the responses to come through. Doing these requests serially is not ideal.

You should parallelize. You can use curl_multi_init from the php-curl module to asynchronously connect back to your data stores.
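A minimal sketch of that curl_multi pattern, written against the PHP 5.3-era php-curl API; the endpoints and the 2-second timeout are placeholders, not anything from the question:

```php
<?php
// Placeholder endpoints standing in for the three external requests.
$urls = array(
    'http://api.example.com/a',
    'http://api.example.com/b',
    'http://api.example.com/c',
);

$mh = curl_multi_init();
$handles = array();

foreach ($urls as $key => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_multi_add_handle($mh, $ch);
    $handles[$key] = $ch;
}

// Drive all transfers concurrently; total wall time is roughly the
// slowest response rather than the sum of all three.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running > 0 && $status == CURLM_OK);

$responses = array();
foreach ($handles as $key => $ch) {
    $responses[$key] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

With the three calls overlapped like this, the 0.8 s of serial waiting should shrink to roughly the latency of the slowest single remote call.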

This doesn't fix the underlying issue -- the remote datastores' response speed is out of your control. Then again, you should be caching some of this content (you can use memcache for that if you want), even if it becomes stale in a minute or two.
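A rough sketch of that caching idea, assuming the pecl Memcache extension and a memcached daemon on localhost; the helper function name, cache key, URL, and two-minute TTL are all placeholders:

```php
<?php
// Assumes the pecl "memcache" extension and memcached listening on 11211.
$cache = new Memcache();
$cache->connect('127.0.0.1', 11211);

// Hypothetical helper: return a cached copy if present, otherwise fetch
// the remote URL and cache it for a couple of minutes.
function fetch_with_cache($cache, $key, $url, $ttl = 120) {
    $body = $cache->get($key);
    if ($body !== false) {
        return $body;
    }

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    $body = curl_exec($ch);
    curl_close($ch);

    if ($body !== false) {
        // Third argument is the compression flag, fourth is expiry in seconds.
        $cache->set($key, $body, 0, $ttl);
    }
    return $body;
}

$data = fetch_with_cache($cache, 'remote:a', 'http://api.example.com/a');
```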

Matthew Ife