
I have a website with over 400,000 pages. I generate 10 sitemaps dynamically with PHP, each containing 40,000 links, and have submitted them in my Google Webmaster Tools account. I add 50-60 pages to the site daily, and I don't want to keep creating another sitemap after every 40,000 new links. The solution I have in mind is to generate one sitemap dynamically that lists all pages created within the last 30 days and to resubmit it once a day with a cron job. The problem is that pages created before the last 30 days would then not be in any of the sitemaps. So my question is: if those links are already indexed by Google, will they get de-indexed once I resubmit a sitemap that no longer contains them? And if so, I would really like to know the solution for this.
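
Roughly what I have in mind for the rolling sitemap (just a sketch; the database table and column names below are placeholders, not my real schema):

```php
<?php
// Sketch: build a sitemap of pages created in the last 30 days.
// Assumes a "pages" table with "slug" and "created_at" columns (placeholder names).
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

$stmt = $pdo->prepare(
    'SELECT slug, created_at FROM pages WHERE created_at >= :since ORDER BY created_at DESC'
);
$stmt->execute([':since' => date('Y-m-d', strtotime('-30 days'))]);

$xml = new XMLWriter();
$xml->openURI('/var/www/html/sitemap-recent.xml'); // overwritten on every run
$xml->startDocument('1.0', 'UTF-8');
$xml->startElement('urlset');
$xml->writeAttribute('xmlns', 'http://www.sitemaps.org/schemas/sitemap/0.9');

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $xml->startElement('url');
    $xml->writeElement('loc', 'https://www.example.com/' . $row['slug']);
    $xml->writeElement('lastmod', date('Y-m-d', strtotime($row['created_at'])));
    $xml->endElement(); // url
}

$xml->endElement(); // urlset
$xml->endDocument();
$xml->flush();
```

The cron job would then just be something along the lines of `0 3 * * * php /path/to/build-recent-sitemap.php` to rebuild the file every night.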

I am kind of a beginner in SEO, so if this is a bad question I am really sorry, but I searched a lot before posting it and couldn't find any solution.

user2801966
    This question appears to be off-topic because it is about SEO – John Conde Sep 21 '13 at 12:30
  • Not sure what you mean by off-topic? I have added the tag "seo" to it. – user2801966 Sep 21 '13 at 13:02
  • With such a huge website, sitemaps are of no use at all. All you get is a maintenance horror, and most likely even worse indexing than without them. Of course Google is indexing your pages, with or without sitemaps. Sitemaps are not for indexing but for giving search engines detailed instructions about your website structure. – davidkonrad Sep 21 '13 at 13:19
  • OK, then without a sitemap how can I make sure that the Google spider crawls my latest pages? – user2801966 Sep 21 '13 at 13:33

1 Answer


You might want to look at the Sitemap index standard to see whether it can help you break your very large site into more manageable chunks for Google and other search engines to traverse. Since you are already generating the sitemaps with PHP, you can keep the last-modified date and the assigned priority of each sitemap up to date, and those still factor into crawl frequency.
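
A minimal sketch of generating the index file itself with PHP (the file names, URLs, and output path below are placeholders for your own setup):

```php
<?php
// Sketch: write a sitemap index that points at the individual sitemap files.
$sitemaps = [];
for ($i = 1; $i <= 10; $i++) {
    $sitemaps[] = "https://www.example.com/sitemap-$i.xml";
}
$sitemaps[] = 'https://www.example.com/sitemap-recent.xml'; // the rolling 30-day sitemap

$xml = new XMLWriter();
$xml->openURI('/var/www/html/sitemap-index.xml');
$xml->startDocument('1.0', 'UTF-8');
$xml->startElement('sitemapindex');
$xml->writeAttribute('xmlns', 'http://www.sitemaps.org/schemas/sitemap/0.9');

foreach ($sitemaps as $loc) {
    $xml->startElement('sitemap');
    $xml->writeElement('loc', $loc);
    $xml->writeElement('lastmod', date('Y-m-d'));
    $xml->endElement(); // sitemap
}

$xml->endElement(); // sitemapindex
$xml->endDocument();
$xml->flush();
```

You would then submit only the index URL in Webmaster Tools; the individual sitemap files underneath it can change without being resubmitted.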

To answer your question, though, I am fairly sure the answer is "no". Google has no reason to delete a page from its index unless you explicitly tell it to (using the URL removal section in Webmaster Tools, or if your server starts responding with a 301 or 404 HTTP status code for that page).

But I really do think you could benefit from using the sitemap index scheme described above.

  • I am already using that scheme; that's why I have 10 sitemaps for 400,000 pages, since the max number of links per sitemap is limited to 50,000. I just wanted to know whether a page that is already indexed by Google will get de-indexed after I update the sitemap with new links, but I think you answered the question. – user2801966 Sep 22 '13 at 06:36