
I have a development environment under a virtual host on an Apache server.
I'd like to block this virtual host and all of its subdirectories from being indexed by search engines.

I have this code (which I pulled from another question on here), but that question was about the whole server. I'd just like someone to confirm that this will work, and that it will only affect this vhost and its subdirectories (I don't want the whole server affected).

<VirtualHost *:80>
  ServerName d.domain.co.uk
  ServerAlias www.d.domain.co.uk
  DocumentRoot /var/www/html/d.domain.co.uk

  # Block all robots on every path under this vhost by serving a
  # static robots.txt. SetHandler None makes sure no other handler
  # (e.g. a CMS front controller) intercepts the request first.
  <Location "/robots.txt">
    SetHandler None
  </Location>
  Alias /robots.txt /var/www/html/d.domain.co.uk/robots.txt
</VirtualHost>
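
The blocking itself comes from the robots.txt file that the Alias points at; a minimal example that disallows all crawlers from every path:

User-agent: *
Disallow: /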

Regards, Matt

Clarkey
  • If you're giving the `<Location>` and `Alias` lines above a go in your `*.conf`, don't forget to add them to your SSL config as well, and restart your Apache service afterwards: `sudo service apache2 restart` – Jack May 23 '20 at 10:13
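
A minimal sketch of what mirroring the directives in the SSL vhost might look like, assuming a standard mod_ssl setup (the certificate paths are placeholders, not from the original post):

<VirtualHost *:443>
  ServerName d.domain.co.uk
  ServerAlias www.d.domain.co.uk
  DocumentRoot /var/www/html/d.domain.co.uk

  SSLEngine on
  # Placeholder certificate paths; adjust to your setup
  SSLCertificateFile /etc/ssl/certs/d.domain.co.uk.crt
  SSLCertificateKeyFile /etc/ssl/private/d.domain.co.uk.key

  # Same robots.txt handling as the port 80 vhost
  <Location "/robots.txt">
    SetHandler None
  </Location>
  Alias /robots.txt /var/www/html/d.domain.co.uk/robots.txt
</VirtualHost>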

1 Answer


Yes, it will only affect this vhost, since your `<Location>` and `Alias` directives are set inside your vhost declaration.

Anyway, it's easy to test: just request /robots.txt on another vhost and check whether you get that vhost's own file or the contents of /var/www/html/d.domain.co.uk/robots.txt.
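
For example, a quick check with curl (www.otherhost.co.uk is a stand-in for any other vhost on the same server):

curl http://www.d.domain.co.uk/robots.txt
curl http://www.otherhost.co.uk/robots.txt

The first request should return the Disallow rules; the second should return that vhost's own robots.txt (or a 404) if the directives are properly scoped to the dev vhost.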

Capsule