
I'm currently hosting sites for clients using the following setup:

  • Apache 2.2.16
  • mod_fastcgi 2.4.7
  • PHP 5.3.3

Apache uses the worker MPM and serves PHP through a dynamic FastCGI config:

# suexec lets each account's php-cgi processes run as that account's user
FastCgiSuexec /usr/sbin/suexec
# Dynamic-server defaults: -singleThreshold 0 keeps the last instance of each
# application alive, the Authorization header is passed through to PHP, and
# requests are only aborted after an hour of inactivity
FastCgiConfig -singleThreshold 0 -pass-header Authorization -idle-timeout 3600
# The three lines below live in each <VirtualHost>
SuexecUserGroup user group
Action application/x-httpd-fastphp /cgi-bin/php.fcgi
AddType application/x-httpd-fastphp .php

This works well, but it spawns a php-cgi instance for each VirtualHost (even when the cgi-bin directory is physically the same for a given account), so a client with a few subdomains quickly accumulates many processes, starving the server of memory and sending it into thrashing.

So this brings me to the actual question: is it possible to serve all PHP requests for an account through a single php-cgi instance? Preferably while keeping the suexec bit, to prevent clients from reading files outside their own directory.

hlidotbe

2 Answers


If I may add a personal opinion: use Nginx. It's faster, simpler, and much easier to do what you want (and it uses much less memory).

It's a personal preference of mine. :D

https://calomel.org/nginx.html is a nice detailed example
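
For a rough idea of the shape this takes, here is a minimal sketch of an nginx server block handing PHP to an externally spawned php-cgi pool over a Unix socket; the server name, document root, and socket path are assumptions for illustration, not part of the original setup:

server {
    listen 80;
    server_name client1.example.com;
    root /home/client1/htdocs;
    index index.php index.html;

    # Hand .php requests to this account's php-cgi pool (assumed socket path)
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php-fcgi/client1.sock;
    }
}

Note that nginx has no .htaccess equivalent, so any per-directory rewrite rules have to be translated into the server block, as the comments below point out.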

Arenstar
  • 3,602
  • 2
  • 25
  • 34
  • nginx/lighttpd are an option if: 1. I can easily configure virtual hosts and mass virtual hosts (I don't think that's a problem) and 2. I can host "legacy" PHP applications with htaccess/rewrite rules without having to touch them (this might be trickier) – hlidotbe Nov 10 '10 at 20:27
  • Well, virtual hosts are simple, true. The htaccess rules will need to be converted, which is unfortunately also true. :( FastCGI really suits Nginx and Lighttpd more than Apache; I honestly think they are the better solution. It just depends on your motivation to test it out, I guess. I hope this helps anyway. :D – Arenstar Nov 10 '10 at 20:33
  • Sure it helps. Unfortunately motivation is not really the problem here; having clients and only 24 hours per day, on the other hand... – hlidotbe Nov 10 '10 at 20:40
  • Before you give up: if you're using open-source software you will surely find the equivalent htaccess/rewrite rules with a bit of googling. If it's custom then, yeah, you need to spend some time. I hope you find the time to play with it. – Arenstar Nov 10 '10 at 20:43
  • I'm trying nginx and it looks like it could handle 90% of our hosting needs; the rest could go on a smaller server with a classic configuration. That said, because nginx forced me to spawn php-cgi myself, it also handed me the solution to my problem: FastCgiExternalServer and one spawned instance per account (sketched just below). Thank you – hlidotbe Nov 11 '10 at 13:11
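
To make that resolution concrete, here is a minimal sketch of the mod_fastcgi side, assuming one externally spawned php-cgi pool per account; the account name, paths, and socket names are hypothetical:

# Sockets live in FastCgiIpcDir; the -socket name below is relative to it
FastCgiIpcDir /var/run/php-fcgi
# Map the wrapper path shared by every vhost of this account to one external
# pool, so Apache never spawns php-cgi itself for these vhosts
FastCgiExternalServer /home/client1/cgi-bin/php.fcgi -socket client1.sock
Action application/x-httpd-fastphp /cgi-bin/php.fcgi
AddType application/x-httpd-fastphp .php

The pool itself is spawned outside Apache as the account's own user (so suexec is no longer involved on the Apache side), for example with spawn-fcgi:

# Start 4 php-cgi children for this account, listening on its socket
spawn-fcgi -s /var/run/php-fcgi/client1.sock -u client1 -g client1 -C 4 -f /usr/bin/php-cgi

Since the processes already run as the account's user, file-permission isolation is preserved the same way suexec provided it.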

Do you have some fcgid configuration for each virtual host (apart from the global fcgid config), that is, inside the <VirtualHost> tags?

I suspect this could be one possible cause of the problem, and that it could be solved by adding the fcgid config once, for a filesystem path that is common to all the web accounts, something like this (for example in /etc/apache2/sites-enabled/000-default, if using Ubuntu):

NameVirtualHost SOME-IP-ADDRESS:80

<Directory /some-path/webaccounts/>
    # Serve .php under the shared accounts path through mod_fcgid
    AddHandler fcgid-script .php
    # One wrapper binary shared by every account
    FCGIWrapper /usr/lib/cgi-bin/php5 .php
</Directory>

<VirtualHost SOME-IP-ADDRESS:80>
...

At least we are running fcgid with some 20-30 vhosts on a 2 GB RAM machine (Ubuntu 8.04) without problems, with a setup along the lines of the above.
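
If the concern is fcgid spawning too many processes, mod_fcgid can also cap the totals. Here is a sketch using the pre-2.3 directive spellings that match the FCGIWrapper directive above (mod_fcgid 2.3+ renamed these to FcgidMaxProcesses, FcgidMaxProcessesPerClass and FcgidIdleTimeout); the numbers are assumptions to tune against your RAM:

# Global ceiling on fcgid-managed processes
MaxProcessCount 60
# Ceiling per process class (per wrapper/user combination)
DefaultMaxClassProcessCount 3
# Reap processes idle for more than 5 minutes
IdleTimeout 300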

Samuel Lampa
  • The fcgi handler is indeed per virtualhost (the last three lines of the config in the question live in each vhost; the layout is sketched below). This is because of suexec, which lets me 1. guarantee clients can't read outside their own directory, and 2. let their sites write files without worrying about permissions. – hlidotbe Nov 10 '10 at 22:11
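
For reference, a minimal sketch of one such per-vhost block under the question's mod_fastcgi setup; the account name and paths are hypothetical, and mapping /cgi-bin/ to the account's directory via ScriptAlias is an assumption about how the wrapper URL resolves:

<VirtualHost *:80>
    ServerName client1.example.com
    DocumentRoot /home/client1/htdocs
    # php-cgi runs as this account's user/group via suexec
    SuexecUserGroup client1 client1
    # /cgi-bin/php.fcgi resolves inside the account's own directory
    ScriptAlias /cgi-bin/ /home/client1/cgi-bin/
    Action application/x-httpd-fastphp /cgi-bin/php.fcgi
    AddType application/x-httpd-fastphp .php
</VirtualHost>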