
I have a situation where we have two code bases that need to stay intact.

For example, the old site: http://example.com

And the new site: http://www.example.com

The old site (no WWW) supports some legacy code and has the rule:

User-agent: *
Disallow: /

But in the new version (with WWW) there is no robots.txt.

Is Google using the old (no-WWW) robots.txt file as its rule? And will adding

User-agent: *
Allow: /

to the (WWW) side override this?

Changing robots.txt in the old codebase is not an option at this time.


1 Answer


No. www.example.com and example.com (no subdomain) are separate hosts, and the robots.txt from one of them is not used for the other. Each host is governed only by its own /robots.txt, so the Disallow rule on the old site does not affect the new one. And since the new (WWW) site has no robots.txt at all, crawlers treat that as "everything allowed", so adding an explicit Allow: / is not required, though it does no harm.
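For illustration, here is a minimal sketch using Python's standard urllib.robotparser module, which follows the same per-host convention: rules are read only from that host's own /robots.txt, and a missing file (404) is treated as allow-all. The example.com / www.example.com hosts are just the ones from the question, and the sketch assumes they are reachable; it is only a way to check the behaviour, not anything Google itself runs.

from urllib import robotparser

def can_fetch(host: str, path: str = "/") -> bool:
    # Rules are fetched from the specific host being crawled,
    # never from a sibling subdomain.
    rp = robotparser.RobotFileParser(f"http://{host}/robots.txt")
    rp.read()  # a 404 response is treated as "everything allowed"
    return rp.can_fetch("*", f"http://{host}{path}")

# Expected with the setup described in the question:
#   example.com serves "Disallow: /"        -> False
#   www.example.com has no robots.txt (404) -> True
print(can_fetch("example.com"))
print(can_fetch("www.example.com"))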
