
I have thousands of pages on my website: https://twoshot.app

At https://twoshot.app/robots.txt, you get:

# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:

However, Google Search Console reports that pages are blocked, and only ~50 pages are indexed.

I'm using a React single-page app hosted on Netlify.

Interestingly, in the past, Google was able to crawl 500+ pages.


1 Answer


You need to submit a sitemap.xml file in Google Search Console. With a single-page app, Googlebot often cannot discover every route by following links alone, so a sitemap explicitly lists the URLs you want indexed.
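For reference, here is a minimal sitemap.xml sketch. The URLs below are placeholders; your real file would list every page you want indexed:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page; these entries are placeholders -->
  <url>
    <loc>https://twoshot.app/</loc>
  </url>
  <url>
    <loc>https://twoshot.app/example-page</loc>
  </url>
</urlset>

Since the site has thousands of pages, you would typically generate this file at build time rather than writing it by hand. Here is a minimal TypeScript sketch, assuming a Node build step; the routes array and the public/ output directory are assumptions that depend on your setup:

// generate-sitemap.ts - build-time sitemap generation (sketch)
import { writeFileSync } from "fs";

// Placeholder: in a real build, derive this list from your router
// config or data source instead of hard-coding it.
const routes: string[] = ["/", "/example-page"];

const entries = routes
  .map((path) => `  <url>\n    <loc>https://twoshot.app${path}</loc>\n  </url>`)
  .join("\n");

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>
`;

// Assumes static assets are served from public/ (e.g. Create React App)
writeFileSync("public/sitemap.xml", sitemap);

Host the generated file at https://twoshot.app/sitemap.xml, optionally reference it from robots.txt with a "Sitemap: https://twoshot.app/sitemap.xml" line, and submit it under the Sitemaps report in Search Console.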