
I have a website that uses pretty URLs, and I need to block certain parameters from search engines using robots.txt.

My URL structure looks like: http://example.com/vcond/Used/make/mymake/features/myfeatures

How can I use robots.txt to block URLs only when features appears as a segment of the URL? I had read that you can do something like this:

Disallow: *features

and that it will block bots from any URL that has features in it. Is this true? I still need URLs like http://example.com/vcond/Used/make/mymake to remain crawlable.

Thanks.


1 Answer

Disallow: /*/features

Should do the trick. See https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt.

Also, see this thread: Robots.txt: Is this wildcard rule valid?
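
To sanity-check which paths a rule like this catches, here is a minimal Python sketch that mimics Googlebot-style wildcard matching (where `*` matches any sequence of characters and `$` anchors the end of the path). The helper name and example paths are only illustrations, not part of any robots.txt library:

import re

def robots_pattern_to_regex(pattern):
    # Translate a Googlebot-style Disallow path into a regex:
    # '*' matches any run of characters, '$' anchors the end,
    # everything else is matched literally from the start of the path.
    parts = []
    for ch in pattern:
        if ch == "*":
            parts.append(".*")
        elif ch == "$":
            parts.append("$")
        else:
            parts.append(re.escape(ch))
    return re.compile("".join(parts))

rule = robots_pattern_to_regex("/*/features")

# The URL containing /features should be blocked,
# the shorter URL should stay crawlable.
for path in ["/vcond/Used/make/mymake/features/myfeatures",
             "/vcond/Used/make/mymake"]:
    print(path, "->", "blocked" if rule.match(path) else "allowed")

Running this prints "blocked" for the first path and "allowed" for the second, which is the behaviour the question asks for, at least for crawlers that honour the wildcard extension.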

  • Note that this does not work for all robots.txt parsers, as the `*` wildcard in `Disallow` is not part of the original robots.txt specification. – unor Nov 12 '13 at 13:39