Can I, for example, include

User-agent: *
Disallow: /

in robots.txt, then list all of the links I want (including the home page) in the sitemap and in multiple other sitemaps, and still have them indexed? Does that work?
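For what it's worth, you can check how a compliant crawler would interpret that rule with Python's standard-library `urllib.robotparser`; the `example.com` URLs below are just placeholders:

```python
from urllib.robotparser import RobotFileParser

# Parse the blanket-disallow rules directly, without fetching a live file.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every path is blocked for every compliant crawler, home page included,
# regardless of whether the URL also appears in a sitemap.
print(rp.can_fetch("*", "https://example.com/"))          # False
print(rp.can_fetch("*", "https://example.com/any/page"))  # False
```

In other words, a sitemap tells crawlers a URL exists, but robots.txt still controls whether they may fetch it.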
I have noticed that one of our competitors uses robots.txt to disallow links, and also puts rel="nofollow" on those exact same links, but compensates with extensive sitemaps, and their pages are indexed. This method seems to work better for certain link-heavy sites that need crawling in a specific order.
Can I do this?