0

I am working for a client running a HostGator Dedicated Server with hundreds of sites on it. Since there will be some common files shared between them, I am contemplating doing the following:

  1. creating a COMMON folder in /home/ that holds the universal files
    /home
    ...site1
    ...site2
    ...site3
    ...site4
    ...COMMON
    ........universal.css
    ........universal.js
    ........universal.php
    ...site999
    ...site1000

  2. Adding a script to every site that uses a PHP include() pointing to the respective file in the COMMON directory.
    <?php
    include("../../../COMMON/universal.php");
    ?>

In this way I will be able to make modifications to code on all sites at once.

Now the core question is: how does something like this impact the server? These sites won't ever have heavy traffic, but there are literally hundreds of them. This is a basic question, but are thousands of requests for the same file the same as thousands of requests distributed across hundreds of files?

filip
  • 125
  • 6

3 Answers

0

Should be OK, especially since this will be a mostly read-only thing. Right? :)

Janne Pikkarainen
  • 31,852
  • 4
  • 58
  • 81
0

include("../../../COMMON/universal.php");

No!!!!

Never use relative paths for include files!

Add COMMON to the PHP include path. You can do this in php.ini, in .htaccess, or at the top of each PHP script, e.g.

<?php
// Append the shared COMMON directory to PHP's include path:
ini_set('include_path', ini_get('include_path') . ':/home/COMMON');
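
With COMMON on the include path, each site's pages can then pull in the shared file without any fragile ../../.. traversal. A minimal sketch of what such an include could look like (universal.php is the file name from the question):

<?php
// /home/COMMON is on the include path, so no relative path is needed:
include 'universal.php';
?>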
symcbean
  • 21,009
  • 1
  • 31
  • 52
0

In reality, what kind of simultaneous access to any given file do you expect there to be? From your description I would be very surprised if any given file is being accessed by more than a couple of instances at any given time. The load will probably be even lower because of the various levels of caching most systems have. In short, I really don't think you would have a problem, even if all the sites were quite busy.

John Gardeniers
  • 27,458
  • 12
  • 55
  • 109
  • That's great to hear. Thank you. A probable maximum would be about 10 thousand simultaneous connections. In theory, how different is it for the server to handle thousands of requests for one file compared to the same number of requests distributed across thousands of files? Not sure where to look to learn about this area of how servers work. – filip Nov 07 '10 at 19:22
  • @filip, how the server handles simultaneous reads of a single file will depend on the operating system and the file system being used. As a general rule the limiting factor will be the amount of memory available for the file handles (internal references to an open file). Also consider that an include file takes only a fraction of a second to read, so the actual number of simultaneous reads is not the same as the number of simultaneous connections. Thousands of simultaneous connections may well result in fewer than a dozen simultaneous reads of any given file. – John Gardeniers Nov 07 '10 at 22:33
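
A quick back-of-envelope sketch may help illustrate the point in the comment above. All numbers here are assumptions for illustration (treating the "10 thousand" figure as requests arriving per second), not measurements:

<?php
// Little's law: average reads in flight = arrival rate x time per read.
$requestsPerSecond = 10000;  // assumed peak across all sites
$secondsPerRead    = 0.0001; // ~0.1 ms to read a small, cached include file (assumption)
echo $requestsPerSecond * $secondsPerRead; // ≈ 1 read of the file in flight at any instant
?>

Even if the per-read time were ten times longer, only around ten reads of the shared file would be in progress at once.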