The default robots.txt file disables search engine indexing, and I can't replace it with my own. What am I missing?
Here is the default, which appears to be fixed:
User-agent: *
Disallow: /
According to Slack, robots.txt files on *.surge.sh subdomains are locked to prevent link farming... sad.
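For reference, the difference between the locked default and a permissive file can be checked with Python's standard `urllib.robotparser`. This is just a sketch of what each file permits; the `example.surge.sh` URL and the permissive variant are illustrative, not anything Surge actually serves:

```python
from urllib.robotparser import RobotFileParser

# The locked surge.sh default: disallow everything for every crawler.
locked = ["User-agent: *", "Disallow: /"]

# A hypothetical permissive robots.txt (an empty Disallow allows all paths).
permissive = ["User-agent: *", "Disallow:"]

def allows_indexing(lines, url="https://example.surge.sh/"):
    """Return True if the given robots.txt lines let a crawler fetch the URL."""
    rp = RobotFileParser()
    rp.parse(lines)
    return rp.can_fetch("Googlebot", url)

print(allows_indexing(locked))      # False
print(allows_indexing(permissive))  # True
```

Since the file on *.surge.sh subdomains can't be overridden, the usual way around this is to serve the site on a custom domain, where no such lock applies.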