
My application URLs look like the following:

http://example.com/app/1

http://example.com/app/2

http://example.com/app/3

...

http://example.com/app/n

Now I want to block all of these URLs from being crawled, but not http://example.com/app itself. How can I do this using robots.txt?

1 Answer


Add the following to your robots.txt:

User-agent: *
Disallow: /app/

This allows http://example.com/app but blocks http://example.com/app/* (any path under /app/). You can test your robots.txt with Google's robots.txt testing tool: https://www.google.com/webmasters/tools/robots-testing-tool
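If you would rather verify the rule locally before (or instead of) using Google's tool, Python's standard urllib.robotparser can evaluate a robots.txt against sample URLs. Below is a minimal sketch that assumes the two-line rule shown above:

from urllib.robotparser import RobotFileParser

# The rule from this answer, one directive per line
rules = [
    "User-agent: *",
    "Disallow: /app/",
]

parser = RobotFileParser()
parser.parse(rules)

# /app itself stays crawlable, everything under /app/ is blocked
print(parser.can_fetch("*", "http://example.com/app"))    # True
print(parser.can_fetch("*", "http://example.com/app/1"))  # False

Note that actual crawlers may interpret robots.txt slightly differently from this parser, so it is only a quick sanity check, not a guarantee of how a given search engine will behave.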