
I used Google Webmaster Tools to "Fetch as Google", but I get a "Partial" response because some resources are blocked. When I check the result, it says Googlebot couldn't get all the resources for this page. I have a list of all the blocked files; the high-priority ones are all images from Google Maps, as you can see in this screenshot: https://cdn.pbrd.co/images/12opwvUd.png. Is there a way to avoid this?

My robots.txt is very basic, so I don't think it is the problem:

User-agent: *
Disallow: /cgi-bin/
Disallow: /sass/
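
To double-check that rules like these don't block anything Googlebot needs, they can be tested locally with Python's standard urllib.robotparser (a minimal sketch; the page URL below is just a placeholder):

from urllib.robotparser import RobotFileParser

# The same rules as the robots.txt above
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /sass/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Placeholder page URL - replace with a real URL from the site
print(rp.can_fetch("Googlebot", "https://example.com/some-page/"))  # True: not blocked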

Any suggestions or advice?

mattia

1 Answer


Don't worry about it.

The Google Maps server is blocking these requests, not your server. You can see this by looking at the Google Maps robots.txt: https://maps.googleapis.com/robots.txt
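
You can confirm this programmatically as well; here is a minimal sketch using Python's standard urllib.robotparser (the static-map URL is only an illustrative example, not taken from your page):

from urllib.robotparser import RobotFileParser

# Download and parse the Google Maps robots.txt
rp = RobotFileParser("https://maps.googleapis.com/robots.txt")
rp.read()

# An illustrative Google Maps request URL
url = "https://maps.googleapis.com/maps/api/staticmap?center=Rome&zoom=12&size=400x400"

# Prints False if that path is disallowed for Googlebot
print(rp.can_fetch("Googlebot", url))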

Presumably Google doesn't consider the map data an important part of your page that is worth indexing, and crawling Google Maps resources embedded on other sites would waste a lot of crawl time.

So this is normal and won't stop Google from indexing your site's content.

It would be nice if Fetch as Google explained this better, especially as both products are from Google! But on the other hand, why should Google Maps get preferential treatment over any other site that serves resources but doesn't want them crawled?

Barry Pollard