
I have tons of 404 crawl errors (my old URLs). I deleted them via Google Webmaster Tools > Remove URLs tool.

Example: www.mysite.com/page1.html

But some external sites link to my old URLs on their content pages (e.g. www.anothersite.com). Because they still have my old URLs on their pages, my URL removal always fails.

What can I do now? I cannot delete these links; I don't know who owns these websites. And there are tons of external URLs like this; I cannot remove them one by one, pressing the button again and again.

Would robots.txt be enough? Or what more can I do?

computingfreak
user3283814

1 Answer


You don't want to use robots.txt to block the URLs (Google does not recommend it).

404s are a perfectly normal (and in many ways desirable) part of the web. You will likely never be able to control every link to your site, or resolve every 404 error listed in Webmaster Tools. Instead, check the top-ranking issues, fix those if possible, and then move on.

https://support.google.com/webmasters/answer/2409439?hl=en
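Since Google will eventually drop old URLs on its own once they consistently return 404 (or 410), the practical step is just to confirm your removed pages actually return that status. Here is a minimal sketch of such a check in Python; the URL in the list is a placeholder taken from the question, and `check_status` is a hypothetical helper, not part of any Google tooling.

```python
# Sketch: verify that old, removed URLs actually return 404/410,
# so Google can naturally drop them from the index over time.
from urllib.request import urlopen
from urllib.error import HTTPError


def check_status(url):
    """Return the HTTP status code for a URL (including error codes)."""
    try:
        return urlopen(url, timeout=10).getcode()
    except HTTPError as e:
        # urlopen raises HTTPError for 4xx/5xx; the code is on the exception.
        return e.code


# Placeholder list of old URLs (example taken from the question).
old_urls = [
    "https://www.mysite.com/page1.html",
]

# Usage (commented out to avoid live network calls):
# for url in old_urls:
#     print(url, check_status(url))
```

If a URL returns 200 instead of 404/410, that is the page to fix; links from external sites you cannot control are harmless once your side answers correctly.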

venkat