
I am using the Html Agility Pack to fetch data from a website (scraping).

My problem is that the website I am fetching the data from loads some of its content a few seconds after the page itself has loaded.

So whenever I try to read that particular data from its div, I get null.

In the `page` variable I just don't get the `reviewBox` division, because it isn't loaded yet:

public void FetchAllLinks(string Url)
{
    Url = "http://www.tripadvisor.com/";
    HtmlDocument page = new HtmlWeb().Load(Url);

    // Returns null: the reviewBox divs are injected by JavaScript after the initial load
    var link_list = page.DocumentNode.SelectNodes("//div[@class='reviewBox']");

    foreach (var link in link_list)
    {
        htmlpage.InnerHtml = link.InnerHtml;
    }
}

So can anyone please tell me how to delay the request so that

    HtmlDocument page = new HtmlWeb().Load(Url);

will load the full data into the `page` variable?

unknown6656
BhavikKama

1 Answer


It's not about delaying the request. That node is populated by JavaScript manipulating the DOM, and the Html Agility Pack is the wrong tool for that requirement: it isn't a web engine at all, it only loads the base HTML.

When I need to get at content that requires a full web engine to render, I typically use WatiN. It's designed for unit testing actual web pages, but that means it allows programmatic access to a page through a real browser engine and will load the fully rendered document. It comes with IE and Firefox drivers out of the box, and I vaguely recall that Chrome wasn't hard to use either.
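A minimal sketch of that approach might look like the following, assuming the WatiN NuGet package is referenced and the `reviewBox` class selector from the question is still what the live site uses (both are assumptions here, not verified against the current TripAdvisor markup):

```csharp
using System;
using HtmlAgilityPack;
using WatiN.Core;

// Drive a real IE instance so the page's JavaScript actually runs,
// then hand the rendered markup to the Html Agility Pack for querying.
using (var browser = new IE("http://www.tripadvisor.com/"))
{
    browser.WaitForComplete(); // block until the document and its scripts finish loading

    var page = new HtmlDocument();
    page.LoadHtml(browser.Html); // rendered HTML, not just the base response

    var reviewBoxes = page.DocumentNode.SelectNodes("//div[@class='reviewBox']");
    if (reviewBoxes != null) // SelectNodes returns null when nothing matches
    {
        foreach (var box in reviewBoxes)
        {
            Console.WriteLine(box.InnerHtml);
        }
    }
}
```

If the reviews are injected a few seconds after the load event, `WaitForComplete` alone may return too early; in that case you can wait on the element itself first, e.g. `browser.Div(Find.ByClass("reviewBox")).WaitUntilExists();` before reading `browser.Html`.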

dlh
Jacob Proffitt