I would like to disallow some URLs in the robots.txt file of my website and am having some difficulties.
Right now my robots.txt file has the following content:
User-agent: *
Allow: /
Disallow: /cgi-bin/
Sitemap: http://seriesgate.tv/sitemap.xml
I do not want Google to index URLs like the following:
http://seriesgate.tv/watch-breakingbad-online/season5/episode8/searchresult/
There are 8000 more URLs like this one, so I need a rule in the robots.txt file that blocks all of them.
I also want to block the search pages generated by the site's search box, so that Google does not crawl URLs like this one:
seriesgate.tv/search/indv_episodes/friends/
Any ideas?
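For reference, this is the kind of rule I have been experimenting with, assuming Googlebot honors the `*` wildcard in Disallow paths (wildcards are a Google extension, not part of the original robots.txt standard, so I am not sure this is the right approach):

```text
User-agent: *
Allow: /
Disallow: /cgi-bin/
# Hypothetical: match any URL containing /searchresult/ anywhere in the path
Disallow: /*/searchresult/
# Block everything under the search box's /search/ path
Disallow: /search/
Sitemap: http://seriesgate.tv/sitemap.xml
```

Would these two Disallow lines cover all 8000 episode URLs and the search pages, or is there a better pattern?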