I have to make a robots.txt file for two different websites, but they both share the same root directory.

They are as different as google.com and yahoo.com, but because they are tied to the same place, a change to one affects both. I'm looking for a way to create a single robots file that works for both domains; any help would be greatly appreciated.

Found this other resource but I am not sure if it will apply: Robots.txt for multiple domains


1 Answer


The solution given in the link you mentioned will definitely work; I am using the same setup on an nginx server. As an alternative, I thought you could also try adding the domain to the robots.txt file, like:

Disallow: domain1.com/path/to/disallow
Disallow: domain2.com/path/to/disallow

But I am not 100% sure this will work, since the robots.txt standard expects Disallow values to be paths relative to the host (e.g. /path/to/disallow), so crawlers may ignore entries that include a full domain.
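For reference, the per-domain approach from the linked answer can be sketched in nginx roughly like this. The server names, document root, and per-domain file names below are placeholders, not values from the question:

```nginx
# Sketch: serve a different robots.txt per domain from one shared root.
# domain1.com / domain2.com and the file names are hypothetical examples.
server {
    server_name domain1.com;
    root /var/www/shared-root;

    location = /robots.txt {
        # Internally rewrite the request to a per-domain file in the same root
        rewrite ^ /robots-domain1.txt last;
    }
}

server {
    server_name domain2.com;
    root /var/www/shared-root;

    location = /robots.txt {
        rewrite ^ /robots-domain2.txt last;
    }
}
```

Each domain then gets its own robots file (robots-domain1.txt, robots-domain2.txt) in the shared root, while crawlers still request the standard /robots.txt URL on each host.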