In my robots.txt file I have added the rule /custompages/*, so Googlebot should not crawl any pages matching "/custompages/".
But when I look in Webmaster Tools, I still see error messages for those URLs.
```
User-agent: *
Disallow: /search/application/*
Disallow: /custompages/*
```
The above is my robots.txt file.
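As a sanity check that the rule actually matches the reported URL, here is a minimal Python sketch of Googlebot-style wildcard matching (the rule_matches helper is hypothetical, and I am assuming the reported path is /custompages/savesearch?alid=9161 relative to the site root):

```python
import re

def rule_matches(disallow_pattern: str, path: str) -> bool:
    """Return True if `path` matches a robots.txt Disallow pattern,
    using Googlebot's wildcard semantics: '*' matches any run of
    characters and a trailing '$' anchors the end of the URL."""
    regex = ""
    for ch in disallow_pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    # Disallow rules are anchored at the start of the path.
    return re.match(regex, path) is not None

# The rule from my robots.txt and the URL reported in Webmaster Tools.
print(rule_matches("/custompages/*", "/custompages/savesearch?alid=9161"))  # True
```

This prints True, so as far as I can tell the Disallow rule does cover the URL.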
In Webmaster Tools I see the message "Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request." for the URL

custompages/savesearch?alid=9161
Where might I have gone wrong?