0

My site's content consists of user-provided issues and comments, so it is dynamic and ever-growing; the database is expected to host millions of records. However, this content is presented to visitors based on their selection of categories and types (via dropdown options), which means it will not be visible to a search-engine crawler. What is the best way to make this content available to crawlers? Should I run batch operations at regular intervals to create static web pages and expose them to crawlers through sitemaps? Please suggest. Thanks.

user1928896
  • 514
  • 1
  • 4
  • 16
  • I suggest having a simple index page that lists all detail pages in order of their database primary key (assuming it's a simple incrementing number) or their creation dates; you would add paging controls to this index page to prevent it growing infinitely, of course (a minimal sketch of such a page follows these comments). – Dai Nov 08 '14 at 11:53
  • Yes, that's what I intend to do; include simple-index page(s) in sitemap. I was wondering if there were other ways. – user1928896 Nov 09 '14 at 13:59
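Here is a minimal sketch of the paginated index page suggested in the comments: a plain HTML list of links that a crawler can follow without JavaScript or dropdowns. It assumes Flask and a SQLite table `issues(id, title)`; the app, table, and URL scheme are illustrative, not from the original question.

```python
# Minimal sketch of a crawler-facing, paginated index page.
# Assumes Flask and a SQLite table issues(id, title) -- both are
# hypothetical names, not part of the original question.
import sqlite3
from flask import Flask, request

app = Flask(__name__)
PAGE_SIZE = 100  # links per index page, to keep each page bounded

@app.route("/issues/index")
def issue_index():
    page = max(int(request.args.get("page", "1")), 1)
    offset = (page - 1) * PAGE_SIZE
    conn = sqlite3.connect("site.db")
    rows = conn.execute(
        "SELECT id, title FROM issues ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    ).fetchall()
    conn.close()
    # Plain <a> links so a crawler can discover every detail page.
    links = "".join(
        f'<li><a href="/issues/{i}">{t}</a></li>' for i, t in rows
    )
    nav = f'<a href="/issues/index?page={page + 1}">Next</a>' if rows else ""
    return f"<ul>{links}</ul>{nav}"
```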

1 Answer

0

Make it available via sitemaps. You can list the same query-parameter URLs that you already serve to users, covering all relevant category/type combinations. Each URL + query-parameter combination should display some unique content from your database.
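As a minimal sketch of this approach, the batch job below regenerates sitemap files straight from the database. The base URL, table, and column names are assumptions for illustration. Note that the sitemap protocol caps each file at 50,000 URLs, so a site with millions of records needs several files tied together by a sitemap index.

```python
# Minimal sketch of a batch job that regenerates sitemaps from the
# database. BASE, site.db, and the issues table are assumed names.
import sqlite3
from xml.sax.saxutils import escape

BASE = "https://example.com"
URLS_PER_FILE = 50000  # per-file limit in the sitemap protocol

def write_sitemaps():
    conn = sqlite3.connect("site.db")
    ids = [r[0] for r in conn.execute("SELECT id FROM issues ORDER BY id")]
    conn.close()

    files = []
    # Split the URL list across as many sitemap files as needed.
    for n, start in enumerate(range(0, len(ids), URLS_PER_FILE), 1):
        name = f"sitemap-{n}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for issue_id in ids[start:start + URLS_PER_FILE]:
                loc = escape(f"{BASE}/issues/{issue_id}")
                f.write(f"  <url><loc>{loc}</loc></url>\n")
            f.write("</urlset>\n")
        files.append(name)

    # Sitemap index pointing at each generated file; submit this one
    # URL to the search engines.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in files:
            f.write(f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

if __name__ == "__main__":
    write_sitemaps()
```

Running this on a schedule (e.g. a nightly cron job) keeps the sitemaps in step with the growing database without generating static pages for every record.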

Jérôme Verstrynge
  • 57,710
  • 92
  • 283
  • 453