
In a shop built with PrestaShop 1.7.6.1 we have created a resellers "view". At the moment we have the main webshop for B2C on www.domainname.com and a view on reseller.domainname.com for the B2B market.

For SEO reasons (duplicate product sheets, etc.), I would prefer NOT to have the subdomain alias "reseller.domainname.com" indexed at all.

I can NOT simply upload a robots.txt via FTP, as there is no document root dedicated to that alias, so it is impossible to add directives dedicated to that URL (it is not a real subdomain).

Is it possible to do this via the .htaccess file? Is there any other way to prevent indexing of the reseller.domainname.com URLs?

Thank you

1 Answer


Do you mean that both websites share the same document root (a common scenario with third-level subdomain multishops)?

In that case the solution is to edit your .htaccess like this:

 RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]

This way you can have a different robots.txt for each shop named:

    robots/mysite1.com.txt
    robots/mysubdomain.mysite2.com.txt
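
As a point of reference, here is a minimal sketch of how that rule could sit in the shared .htaccess, assuming both shops run on Apache with mod_rewrite enabled and share the same document root. The <IfModule> wrapper and the file-existence check (RewriteCond ... -f) are additions for illustration, not part of the rule above; also note that PrestaShop may regenerate the .htaccess when SEO/URL settings are saved, so a custom rule like this may need to be re-added afterwards.

    <IfModule mod_rewrite.c>
        RewriteEngine On
        # Serve a host-specific robots file when one exists;
        # otherwise the request falls through to the default handling.
        RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
        RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
    </IfModule>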

Most likely you will want to put a

    User-agent: *
    Disallow: /

in the robots.txt of the reseller shop.
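
For the main B2C shop, the corresponding file would normally stay permissive so that it keeps being crawled. A possible example follows; the Sitemap URL is an assumption for illustration, not something taken from the question:

    # robots/www.domainname.com.txt (illustrative)
    User-agent: *
    Disallow:

    # Assumed sitemap location - adjust to whatever your shop actually exposes
    Sitemap: https://www.domainname.com/sitemap.xml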

gennaris
  • Hello and thanks for the reply. "Do you mean that both websites are sharing the same document root (common scenario with third-level subdomain multishops)?" = Yes, exactly. So, with that rule in the .htaccess file, I can create a /robots/ folder and put a robots.txt file dedicated to each domain inside it; in the document root I will then have robots/domainname.com.txt and robots/reseller.domainname.com.txt. Is that right? – martoneg Mar 01 '21 at 14:27
  • Yes, correct. The %{HTTP_HOST} part will be replaced by the domain the visitor requested. – gennaris Mar 01 '21 at 16:42
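
To make the exchange in the comments concrete, the files in the shared document root could end up looking like this (purely illustrative; note that the file name must match the exact Host header, so if the B2C shop is reached as www.domainname.com the file needs the www. prefix):

    .htaccess                            # contains the robots.txt rewrite rule
    robots/
        www.domainname.com.txt           # main B2C shop: allow crawling
        reseller.domainname.com.txt      # B2B view: User-agent: * / Disallow: /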