
I have finished working on my ASP.NET 4.0 website. Since I am going to publish it within the next few days, I am looking for resources that can help my site rank better on popular search engines. My site displays both static and dynamic content. For the dynamic content I will be generating a dynamic sitemap each week. My problem is that I read on the Google Webmaster site that Google accepts sitemaps only with a .txt extension (https://support.google.com/webmasters/answer/183668?hl=en). The original instructions are quoted as:

For best results, use the following guidelines for creating text file sitemaps:

* You must fully specify all URLs in your sitemap as Google attempts to crawl them exactly as you list them.
* Your text file must use UTF-8 encoding.
* Your text file should contain nothing but the list of URLs.
* You can name the text file anything you wish, provided it has a .txt extension (for instance, sitemap.txt).

As I have mentioned, I will be using C# code to dynamically generate an XML sitemap for my site, but I am not sure I will be able to write XML (using C#) into .txt files. I have very little knowledge of writing XML with C# (such as by utilizing the XmlWriter class). I have found a website (http://www.mikesdotnetting.com/sitemap) whose sitemap file has an .xml extension. Can anybody tell me what I need to do to complete this final step of my project? Another thing I am interested to know: should I re-submit my sitemap to Google every time a link is modified? Google says to submit a sitemap that contains no more than 50,000 URLs and is no larger than 50 MB.
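From what I have gathered so far, a minimal sketch of writing an XML sitemap with the XmlWriter class might look like the following (the example.com URLs and the output file name are placeholders, not my real site):

```csharp
using System;
using System.Text;
using System.Xml;

class SitemapExample
{
    static void Main()
    {
        var settings = new XmlWriterSettings
        {
            Indent = true,
            Encoding = Encoding.UTF8 // the sitemap protocol requires UTF-8
        };

        // Writes sitemap.xml to the current directory
        using (XmlWriter writer = XmlWriter.Create("sitemap.xml", settings))
        {
            writer.WriteStartDocument();
            writer.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");

            // On a real site these URLs would come from the database
            string[] urls = { "http://www.example.com/", "http://www.example.com/donors" };
            foreach (string url in urls)
            {
                writer.WriteStartElement("url");
                writer.WriteElementString("loc", url);
                writer.WriteElementString("lastmod", DateTime.UtcNow.ToString("yyyy-MM-dd"));
                writer.WriteEndElement(); // </url>
            }

            writer.WriteEndElement(); // </urlset>
            writer.WriteEndDocument();
        }
    }
}
```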

John Conde
  • There's something wrong with education these days if they teach you how to write an entire XML file, but fail to teach you to write some text in a plain ol' text file... – Blindy Sep 23 '14 at 19:58
  • 1
    Ok, so Google says it accepts text files. What was your question again? Did you try it? Where is your text file? Is it valid? – Jonathan Wood Sep 23 '14 at 20:01
  • I think there must be some difference between XML and TXT files in terms of the operations we can perform on these two types of files using C#. By the way, I didn't learn coding by attending classes; I taught myself. Any help with the problem? – SaraWelfareOrg SWO Sep 23 '14 at 20:04
  • Jonathan Wood, can you please check this link: http://www.mikesdotnetting.com/sitemap They are using an XML file but are still getting ranked properly – SaraWelfareOrg SWO Sep 23 '14 at 20:05
  • @SaraWelfareOrgSWO: Yes, XML files are preferred, and that's what they are using. So I would expect that to work. I thought you were asking about TXT files. – Jonathan Wood Sep 23 '14 at 20:14

1 Answer


Submitting every dynamic URL isn't necessarily going to improve your ranking. A lot of your ranking depends on your PageRank, not on the number of URLs you have. You need good content, and people who think your content is good and link to your site.

You read the document wrong. The very first line says

In addition to the standard XML format, Google also accepts the following file types as sitemaps:

The .txt file extension is only required when you have a text file that lists nothing but URLs. Notice how even Google's submission example has an .xml extension, since it uses the XML sitemap format.
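For comparison, a text file sitemap really is nothing more than one fully specified URL per line (the example.com URLs here are placeholders):

```
http://www.example.com/
http://www.example.com/donors/profile1
```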

Instead of generating a new sitemap file weekly, it would be much simpler to write a handler that generates the sitemap on request and caches the data for a period of time, so you're not rebuilding it on every request for the sitemap file.
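A rough sketch of that idea in ASP.NET might look like the following. The class name, cache key, and one-hour expiry are illustrative choices, the handler would need to be mapped to the sitemap URL in web.config, and BuildSitemap would pull real URLs from your data source:

```csharp
using System;
using System.Text;
using System.Web;
using System.Xml;

// Hypothetical handler; map it to /sitemap.xml in web.config.
public class SitemapHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Serve a cached copy if we generated one recently
        string xml = context.Cache["sitemap"] as string;
        if (xml == null)
        {
            xml = BuildSitemap();
            // Regenerate at most once per hour
            context.Cache.Insert("sitemap", xml, null,
                DateTime.UtcNow.AddHours(1),
                System.Web.Caching.Cache.NoSlidingExpiration);
        }

        context.Response.ContentType = "text/xml";
        context.Response.Write(xml);
    }

    private static string BuildSitemap()
    {
        var sb = new StringBuilder();
        var settings = new XmlWriterSettings { Indent = true, OmitXmlDeclaration = true };
        using (XmlWriter writer = XmlWriter.Create(sb, settings))
        {
            writer.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");
            // Replace with URLs pulled from your database
            writer.WriteStartElement("url");
            writer.WriteElementString("loc", "http://www.example.com/");
            writer.WriteEndElement(); // </url>
            writer.WriteEndElement(); // </urlset>
        }
        return sb.ToString();
    }
}
```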

If Google knows where the sitemap is, it will check it periodically anyway, so re-submitting it might not gain you anything. You also don't want to submit too often, as Google and other search engines may think you are trying to spam them. That's why the XML sitemap definition has an element for change frequency, so they know how often to re-spider each page.
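For illustration, a `<url>` entry using the optional frequency hint from the sitemap protocol looks like this (the URL and date are placeholders; valid changefreq values are always, hourly, daily, weekly, monthly, yearly, and never):

```xml
<url>
  <loc>http://www.example.com/donors</loc>
  <lastmod>2014-09-23</lastmod>
  <changefreq>weekly</changefreq>
</url>
```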

FYI, don't expect to see good rankings right away. It takes a while, months even, depending on the popularity of your site and the quality of the content. The volume of links won't help, and Google will find them anyway when it spiders the site. It is possible to do too much.

Mark Fitzpatrick
  • Thanks, Mark Fitzpatrick, for your helpful answer. Can you please tell me one more thing? At the time of publishing my website I will have some 100 URLs in my sitemap that I manually submit to Google. Later on, these URLs will increase day by day (public profiles of donors, i.e. links to their profiles which are generated when a new member signs up). So if the number increases to 500, do I need to re-submit it periodically to Google? – SaraWelfareOrg SWO Sep 23 '14 at 20:18
  • Google will keep track of these. Keep in mind, Google and other search engines watch the quality of the pages. It's not always worth putting a page in a sitemap if the content of that page isn't different enough, as they can see it as spamming. Also, they can relate the title to the content to the URL as part of the determination. If these other pages are available anywhere on the site, such as through a grid or list, then Google is still going to find them (assuming it's a list that can be scanned, i.e. not an AJAX-enabled grid). – Mark Fitzpatrick Sep 23 '14 at 20:51
  • Thank you for this helpful information. If you visit Google's webmaster page, they mention: – SaraWelfareOrg SWO Sep 23 '14 at 21:50
  • Once you've made your sitemap, you can then submit it to Google with the Sitemaps page, or by inserting the following line anywhere in your robots.txt file: Sitemap: http://example.com/sitemap_location.xml – SaraWelfareOrg SWO Sep 23 '14 at 21:51
  • If I prepare a robots.txt file and follow the instructions quoted above, I think I do not need to manually submit my sitemap to Google – SaraWelfareOrg SWO Sep 23 '14 at 21:52