
When crawling a completely new URL, the Googlebot crawler always fetches the latest version of the page the first time it is tested.

After this first test of a new URL, however, it keeps loading the same cached copy, even when the “Live URL Testing” option is used.

By going through the HTML, I’ve confirmed that none of the changes made to site elements since the initial crawl of these pages are reflected in GSC.

Regular users, however, receive the updated version of the page immediately.
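To check whether the server itself is serving different content to crawler user agents, the page can be fetched once with a Googlebot user-agent string and once with a normal browser string, then compared (a minimal sketch, assuming Node 18+ with the built-in `fetch`):

```typescript
// Compare what a Googlebot-like request receives vs. a normal browser request.
const url =
  "https://app.aventure.vc/research/companies/bettercloud-new-york-ny-usa";

async function fetchAs(userAgent: string): Promise<string> {
  const res = await fetch(url, { headers: { "User-Agent": userAgent } });
  return res.text();
}

async function main() {
  const asBot = await fetchAs(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
  );
  const asBrowser = await fetchAs("Mozilla/5.0 (Windows NT 10.0; Win64; x64)");
  // If these differ, the server is varying content by user agent;
  // if they match, the stale copy is coming from Google's side.
  console.log("Identical responses:", asBot === asBrowser);
}

main().catch(console.error);
```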

For example, when crawling this URL through GSC:

https://app.aventure.vc/research/companies/bettercloud-new-york-ny-usa

The screenshot provided by GSC:

https://i.stack.imgur.com/9V9zb.png

**The content in the first image has been the same for over 3 days, since changes were made to the website and that page in particular.**

The actual content on the site:

https://i.stack.imgur.com/V7Tgu.png

(Note that the values in the info cards differ between the two images.)

How do I disable this caching and have the crawler always receive the latest version of the site?

These are the relevant meta tags I placed in the header of the website in an attempt to stop the crawler from receiving cached content, but it still receives the same cached pages:

```html
<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Expires" content="0" />
<meta name="robots" content="all">
```
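Crawlers generally act on real HTTP response headers rather than `<meta http-equiv>` tags, so the same directives can also be sent at the server level. A minimal sketch, assuming an Express backend (hypothetical; any server or CDN can set the same headers):

```typescript
// Send the no-cache directives as real HTTP response headers instead of
// relying on <meta http-equiv>, which crawlers may ignore entirely.
import express from "express";

const app = express();

app.use((_req, res, next) => {
  res.set({
    "Cache-Control": "no-cache, no-store, must-revalidate",
    "Pragma": "no-cache",
    "Expires": "0",
  });
  next();
});

app.listen(3000);
```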
