I am working for a client running a HostGator Dedicated Server with hundreds of sites on it. Since some files will be common to all of them, I am contemplating the following:
Creating a COMMON folder in /home/ that holds the universal files:
/home
...site1
...site2
...site3
...site4
...COMMON
........universal.css
........universal.js
........universal.php
...site999
...site1000

Then, adding a script to every site that uses a PHP include() pointing to the respective file in the COMMON directory:
<?php
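// Load the shared file from COMMON; the number of ../ hops
// depends on how deep this script sits in the site's tree.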
include("../../../COMMON/universal.php");
?>
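Since the number of parent-directory hops would differ from site to site, I could alternatively point at the file by its absolute path. A minimal sketch, assuming COMMON lives directly under /home as in the layout above:

<?php
// Absolute path: no need to count ../ levels per site.
// /home/COMMON is taken from the directory layout above.
include("/home/COMMON/universal.php");
?>

One caveat I'm aware of: if each site's PHP runs with open_basedir restrictions, the shared directory would need to be whitelisted for either approach to work.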
This way I can modify the code on all sites at once by editing a single file in COMMON.
Now the core question: how does something like this impact the server? These sites won't ever see heavy traffic, but there are literally hundreds of them. It's a basic question, but are thousands of requests for the same file equivalent to thousands of requests spread across hundreds of separate files?