I have thousands of pages on my website: https://twoshot.app
At https://twoshot.app/robots.txt, you get:

```
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:
```
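To rule out the robots.txt itself, you can parse those exact rules locally with Python's stdlib robots parser (the URL below is just an illustrative page on my site):

```python
from urllib.robotparser import RobotFileParser

# The exact contents served at https://twoshot.app/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# "User-agent: *" with an empty "Disallow:" blocks nothing,
# so every crawler may fetch every path.
print(rp.can_fetch("Googlebot", "https://twoshot.app/some-page"))  # → True
```

Since this prints `True`, the robots.txt allows everything, which makes Search Console's "blocked" report all the more confusing.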
However, Google Search Console reports that pages are blocked, and only ~50 pages are indexed.
I'm using a React single-page web app hosted on Netlify.
Interestingly, in the past, Google was able to crawl 500+ pages: