
I created a Gatsby.js website hosted on Gatsby Cloud (Individual Trial plan) and using Contentful. Although I didn't set a robots.txt, the production site automatically had an `x-robots-tag: none` HTTP header. This caused Google Search Console to show the error 'noindex tag is added'.

Could you kindly help me remove `x-robots-tag: none` and set `x-robots-tag: all`?

I have already tried serving a proper robots.txt with gatsby-plugin-robots-txt, but that never solved it.

Ferran Buireu
d.y

2 Answers


Taken from Gatsby.js support:

By default, all Previews, Builds, and PR Builds are set to Public. We set an x-robots-tag: none Header to prevent search engines from crawling your site that is hosted on the gtsb.io domain.

Add your custom domain and run Lighthouse again.
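You can verify which domain is still sending the header by inspecting the HTTP response directly (the domains below are placeholders for your preview and custom domains):

```shell
# The header is expected on the Gatsby Cloud preview domain:
curl -sI https://example.gtsb.io | grep -i x-robots-tag

# After attaching your custom domain, the same check should
# return nothing (no x-robots-tag header):
curl -sI https://www.example.com | grep -i x-robots-tag
```

If the first command prints `x-robots-tag: none`, that confirms the header comes from the gtsb.io hosting, not from your build.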

J. Doe
    I can confirm that it only shows that in the internal Lighthouse report, the external shows me: Page isn’t blocked from indexing – Jannis Hell Jun 20 '21 at 14:23

Have you tried the `noindex` attribute of gatsby-plugin-next-seo?

This property works in tandem with the nofollow property and together they populate the robots and googlebot meta tags.
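A per-page sketch, assuming gatsby-plugin-next-seo is installed (the page path and content are hypothetical):

```jsx
// src/pages/index.js -- controlling indexing per page with the
// GatsbySeo component from gatsby-plugin-next-seo.
import React from 'react';
import { GatsbySeo } from 'gatsby-plugin-next-seo';

const IndexPage = () => (
  <>
    {/* noindex={false} keeps the page indexable;
        set both to true to ask crawlers to skip it. */}
    <GatsbySeo noindex={false} nofollow={false} />
    <h1>Home</h1>
  </>
);

export default IndexPage;
```

Note that this only populates the `robots`/`googlebot` meta tags in the page; it cannot remove the `x-robots-tag` HTTP header that Gatsby Cloud adds on the gtsb.io domain.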

Kelei Ren