I would like to create a publicly accessible Google Apps site (i.e. users do not need to be authenticated to access the content) while keeping crawlers and bots excluded via robots.txt. Does anyone know how to do that?
-
I don't specifically know much about Google Apps – is it stopping you from simply serving a robots.txt "as usual"? – Jakob Borg Jan 18 '10 at 21:32
-
What are you trying to achieve? You can't make your Apps site publicly accessible. – seanl Jan 19 '10 at 11:55
-
seanl, it's very possible to make a Google site publicly accessible; see http://sites.google.com/a/lokad.com/translate/ – Joannes Vermorel Jan 20 '10 at 17:08
1 Answer
1
robots.txt doesn't prevent interactive browsers from using the site. It is only honored by robots such as crawlers, feed readers, and recursive download tools (though the latter usually let the user override it).

Tobu
-
Absolutely, but it does not matter in my case. I just want the results not to show up in Google. – Joannes Vermorel Jan 20 '10 at 11:49
-
Yeah, but now it's obvious you can just write a two-line robots.txt that excludes everyone everywhere (/ and *). – Tobu Jan 20 '10 at 22:17
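As the last comment suggests, a minimal robots.txt that asks all compliant robots to stay away from the entire site would look like this (it must be served at the root of the site, e.g. /robots.txt):

```
User-agent: *
Disallow: /
```

Note that robots.txt is purely advisory: well-behaved crawlers such as Googlebot honor it and the pages should then not appear in Google's index, but it does not technically block access by browsers or misbehaving bots.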