
I have to make an API call and get the response. The response contains more than 4,000 URLs. I have to list all of these URLs in a sitemap so that search engines can crawl the site easily, and I need to write a handler to do this. Can someone suggest an example of how to do it?

TJK

2 Answers


I'll assume you are talking about a sitemap in XML format; you didn't specify what the source is other than that you have to make an API call. However, the third or so result from a Google search on "asp.net google sitemap" should give you a perfect starting point:

http://www.mikesdotnetting.com/Article/94/Create-a-Google-Site-Map-with-ASP.NET

I would suggest creating an ASHX handler (File -> New -> Generic Handler in Visual Studio) instead of a page like they do in the example.
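
As a rough illustration, a minimal Sitemap.ashx along those lines might look something like the sketch below. FetchUrlsFromApi is a hypothetical placeholder for whatever API call returns your 4,000+ URLs, and the changefreq/priority values are only examples:

    using System.Collections.Generic;
    using System.Web;
    using System.Xml;

    // Sitemap.ashx code-behind: streams an XML sitemap built from the API response.
    public class SitemapHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "text/xml";

            using (XmlTextWriter writer = new XmlTextWriter(context.Response.Output))
            {
                writer.Formatting = Formatting.Indented;
                writer.WriteStartDocument();
                writer.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");

                foreach (string url in FetchUrlsFromApi())
                {
                    writer.WriteStartElement("url");
                    writer.WriteElementString("loc", url);
                    writer.WriteElementString("changefreq", "daily");
                    writer.WriteElementString("priority", "0.8");
                    writer.WriteEndElement(); // </url>
                }

                writer.WriteEndElement(); // </urlset>
                writer.WriteEndDocument();
            }
        }

        // Hypothetical placeholder: replace with the API call that returns your URLs.
        private IEnumerable<string> FetchUrlsFromApi()
        {
            yield return "http://www.example.com/page-1";
            yield return "http://www.example.com/page-2";
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }

Requesting /Sitemap.ashx will then return the generated sitemap directly, so nothing needs to be written to disk.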

Upload the handler to the website and add the sitemap to e.g. Google by using their Webmaster Tools.

Eirik H
  • If I have many links in the sitemap, can I specify daily and 0.8 once for all of them, or do I have to specify them individually for every link? – TJK Jan 05 '12 at 22:59
  • The priority of a URL is relative to the other URLs on your site, and the default value is 0.5. So just adjust the priority for the URLs that should differ from the default and you can skip setting it for the rest (see the snippet after these comments). I don't think you can set a default changefreq shared between URLs. – Eirik H Jan 06 '12 at 08:15
  • I have generated a sitemap XML with the help of a handler. This sitemap will be updated on a daily basis. Where should the location of this XML be in my project? And how do I confirm whether the search engines actually crawled the links in the sitemap XML? – TJK Jan 09 '12 at 20:59
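
To illustrate the point about defaults from the comments above (the URLs and values here are made up): priority can simply be omitted from the entries that should keep the default of 0.5, whereas changefreq has to be written per url entry:

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/important-page</loc>
        <changefreq>daily</changefreq>
        <priority>0.8</priority>
      </url>
      <url>
        <loc>http://www.example.com/regular-page</loc>
        <changefreq>daily</changefreq>
        <!-- no priority element, so search engines assume the default 0.5 -->
      </url>
    </urlset>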

A quick search on Google resulted in this link:

XML sitemap with ASP.NET

That article should get you most of the way with the handler and with composing the XML.

Michiel van Oosterhout