
My web server (Apache 2.4.10) is running different virtual hosts for the following domain names:

foo.example.com
bar.example.com
www.example.com
example.com

Here is the configuration file for my vhosts:

<VirtualHost *:80>
        DocumentRoot /var/www/

        Redirect 404 /
        ErrorLog ${APACHE_LOG_DIR}/error.log
        CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>

<VirtualHost *:80>
        ServerName foo.example.com
        DocumentRoot /var/www/foo/

        ErrorLog ${APACHE_LOG_DIR}/foo-error.log
        CustomLog ${APACHE_LOG_DIR}/foo-access.log combined
</VirtualHost>

<VirtualHost *:80>
        ServerName bar.example.com
        DocumentRoot /var/www/bar/

        ErrorLog ${APACHE_LOG_DIR}/bar-error.log
        CustomLog ${APACHE_LOG_DIR}/bar-access.log combined
</VirtualHost>

<VirtualHost *:80>
        ServerName example.com
        ServerAlias www.example.com
        DocumentRoot /var/www/www/

        ErrorLog ${APACHE_LOG_DIR}/www-error.log
        CustomLog ${APACHE_LOG_DIR}/www-access.log combined
</VirtualHost>

I would like to know how to handle the robots.txt file. I want none of my domains to be indexed.

Here's my robots.txt file:

User-agent: *
Disallow: /

I copied it into several directories like this:

/var/www
     |-- foo
     |   |
     |   `-- robots.txt
     |
     |-- bar
     |   |
     |   `-- robots.txt
     |
     |-- robots.txt
     |
     `-- www
         |
         `-- robots.txt

Is this a proper configuration?

Kiwi387

1 Answer


Looks OK to me; why don't you just try it and see? If they are all going to be the same, you may want to consider using links to a single document so you only have to make changes in one place.
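A minimal sketch of the single-document approach: keep one canonical robots.txt and symlink it into each DocumentRoot. The demo below runs against a temporary directory so it is safe to try anywhere; on the real server, ROOT would be /var/www and the subdirectory names match the vhost config above.

```shell
# Sketch: one canonical robots.txt, symlinked into each site's DocumentRoot.
# Using a temp dir here; substitute ROOT=/var/www on the actual server.
ROOT="$(mktemp -d)"
mkdir -p "$ROOT/foo" "$ROOT/bar" "$ROOT/www"

# The canonical file, identical content to the one in the question
printf 'User-agent: *\nDisallow: /\n' > "$ROOT/robots.txt"

# Replace the per-site copies with symlinks to the canonical file
for site in foo bar www; do
    ln -sf "$ROOT/robots.txt" "$ROOT/$site/robots.txt"
done
```

Editing `$ROOT/robots.txt` then changes what every vhost serves, since each site's robots.txt resolves to the same file.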

Also bear in mind that not all robots will honour a robots.txt file; you may end up having to block them by other means.
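One possible "other means" on Apache 2.4 is denying requests by User-Agent. This is only a sketch, not part of the original setup, and "BadBot" is a placeholder pattern — replace it with whatever misbehaving crawler actually shows up in your access logs:

```apache
# Hypothetical example: deny requests whose User-Agent matches "BadBot".
# Put this inside the relevant <VirtualHost> (or server-wide).
<If "%{HTTP_USER_AGENT} =~ /BadBot/i">
    Require all denied
</If>
```

Note that determined bots can spoof their User-Agent, so this is best-effort; blocking by IP range at the firewall is the stronger option.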

user9517
  • Well, as there is some delay between the moment I create the `robots.txt` files and the moment the robots read them, I can't tell. Moreover, I can't tell whether my configuration was done the proper way, or whether this is bad practice. I note your suggestion about using a single file; I hadn't thought about that. – Kiwi387 Aug 09 '15 at 16:40
  • Can't tell what? There are plenty of robots.txt checkers/analysers available; Google even provides one. All you need to do is google for them. – user9517 Aug 09 '15 at 16:42