I am working on a web application that, in turn, allows users to create their own webapps. For each new webapp created by my application I assign a new subdomain, e.g. subdomain1.xyzdomain.com, subdomain2.xyzdomain.com, etc.
All these webapps are stored in a database and are served by a Python script (say default_script.py) kept in /var/www/.

Until now, I have blocked search-engine indexing of the directory (/var/www/) using robots.txt. That essentially blocks indexing of all my scripts, including default_script.py, as well as the content served for the various webapps through that default_script.py script.
But now I want some of those subdomains to be indexed.
After searching for a while, I was able to figure out a way to block indexing of my scripts by explicitly specifying them in robots.txt.
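For reference, the rule I have in mind looks something like this (the path is illustrative, matching where the script lives under the web root):

```
User-agent: *
Disallow: /default_script.py
```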
But I am still doubtful about the following:
Will blocking my default_script.py from indexing also block indexing of all the content that it serves? And if so, then if I let it be indexed, will default_script.py itself also start showing up in search results?
How can I allow indexing of some of the subdomains selectively?
Ex: index subdomain1.xyzdomain.com but NOT subdomain2.xyzdomain.com.
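To frame the question: since crawlers fetch robots.txt separately per host, one approach I'm considering is having default_script.py answer requests for /robots.txt with different rules depending on which subdomain was requested. A minimal sketch of that idea (the whitelist and function names are hypothetical, not my actual code):

```python
# Hypothetical whitelist of subdomains that should be indexed.
INDEXABLE_HOSTS = {"subdomain1.xyzdomain.com"}

ALLOW_ALL = "User-agent: *\nDisallow:\n"   # empty Disallow = allow everything
BLOCK_ALL = "User-agent: *\nDisallow: /\n" # block the whole site

def robots_txt_for(host: str) -> str:
    """Return the robots.txt body to serve for the requesting host."""
    host = host.split(":")[0].lower()  # strip any port, normalize case
    return ALLOW_ALL if host in INDEXABLE_HOSTS else BLOCK_ALL
```

The idea would be to wire this into whatever routing default_script.py already does, so subdomain1.xyzdomain.com/robots.txt allows crawling while subdomain2.xyzdomain.com/robots.txt blocks it.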