
Our two products, www.nbook.in and www.a4auto.com, have been removed from all search engines. Both projects have sub-domains, and links from those sub-domains are still available. The domain is not blacklisted. A custom sitemap was created and is still being indexed. I analyzed the URLs in Google Search Console and they look fine, yet searching 'site:nbook.in' on Google returns no results, even though we previously had more than 75,000 links indexed. Everything was working until last week, and this is affecting our products' Alexa rank and reach.

There is no issue with the robots.txt file in the server's document root, and no rule denies any bot or user agent. It is not just Google; every search engine has removed our links, which is what confuses me. We have other products designed and developed the same way with absolutely no issue, and I don't think there is anything more to do in Google Search Console. I'm ready to provide more data on request. Please help.

Sajan Saj
  • I'm voting to close this question as off-topic because it's not programming related; it may be better suited for https://webmasters.stackexchange.com/ – Linda Lawton - DaImTo Jan 07 '20 at 11:37
  • This question appears to be off-topic because it is about **SEO** which is off-topic at Stack Overflow. Please read ["Which SEO questions should be closed as non-programming/non-admin?"](//meta.stackoverflow.com/a/382618) to better understand when SEO questions are acceptable to ask here (most are not) and where you might be able to get assistance. – John Conde Jan 07 '20 at 12:44

1 Answer


Create a robots.txt file in the document root and add the rules below. Note the empty Disallow value: it allows all crawlers to index the site, whereas "Disallow: /" would block every bot and keep the site de-indexed.

User-agent: *
Disallow:
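
As a follow-up check, you can parse the live robots.txt and confirm that crawlers are actually permitted. A small verification sketch in Python 3 (standard library only; the script is illustrative and not part of the original answer):

# Parse the live robots.txt and confirm a crawler may fetch the site root.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.nbook.in/robots.txt")
rp.read()

# can_fetch() returns True when the parsed rules permit the user agent.
print("Googlebot allowed:", rp.can_fetch("Googlebot", "https://www.nbook.in/"))
print("Any bot allowed:", rp.can_fetch("*", "https://www.nbook.in/"))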
Moh Ali