
I have a website whose category page builds its product list via JavaScript after the page has loaded. When my crawler visits that page, it finds no products. How can I solve this problem?

        CrawlConfig config = new CrawlConfig();
        config.setCrawlStorageFolder(rootFolder);
        config.setMaxPagesToFetch(100000000);
        config.setMaxDepthOfCrawling(-1); // -1 = unlimited depth
        config.setPolitenessDelay(1);
        config.setUserAgentString("Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.146 Safari/537.36");
        //config.setResumableCrawling(true);
        config.setIncludeHttpsPages(true);

        PageFetcher pageFetcher = new PageFetcher(config);
        RobotstxtConfig robotstxtConfig = new RobotstxtConfig();
        robotstxtConfig.setEnabled(false);
        RobotstxtServer robotstxtServer = new RobotstxtServer(robotstxtConfig, pageFetcher);
        CrawlController controller = new CrawlController(config, pageFetcher, robotstxtServer);

        controller.addSeed(siteDomain);
        // Additional seed URLs are taken from command-line arguments 4..14.
        for (int i = 4; i <= 14; i++) {
            if (i < args.length) {
                controller.addSeed(args[i]);
            }
        }

        controller.start(Crawling.class, numberOfCrawlers);

        List<Object> crawlersLocalData = controller.getCrawlersLocalData();
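The symptom can be reproduced without a browser: the raw HTML the crawler downloads contains only an empty container plus a script tag, so any static parse finds zero products. A minimal sketch (the HTML string and the `class="product"` marker are hypothetical stand-ins for the real category page):

```java
// Demonstrates why a static crawler sees no products: the server-sent HTML
// only contains an empty container; the product list is built client-side.
public class StaticHtmlCheck {
    // Hypothetical raw HTML, as a static fetcher such as crawler4j would see it.
    static final String RAW_HTML =
        "<html><body>"
        + "<div id=\"product-list\"></div>"              // empty until JS runs
        + "<script src=\"/js/load-products.js\"></script>"
        + "</body></html>";

    // Count occurrences of a product marker in the fetched markup.
    static int countProducts(String html) {
        int count = 0;
        int idx = 0;
        while ((idx = html.indexOf("class=\"product\"", idx)) != -1) {
            count++;
            idx++;
        }
        return count;
    }

    public static void main(String[] args) {
        // The static markup contains no product entries at all.
        System.out.println("products found: " + countProducts(RAW_HTML));
        // prints "products found: 0"
    }
}
```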
Muhammet Arslan
1 Answer


Unfortunately, crawler4j only fetches static content; it does not execute JavaScript. For JavaScript and AJAX support, use a crawler such as Crawljax, or Nutch combined with Selenium.
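The Selenium route can be sketched as follows. This assumes the Selenium WebDriver dependency and a matching ChromeDriver binary are installed, and the `.product` CSS selector is a hypothetical placeholder for the site's real product markup (this sketch is not runnable without a browser on the machine):

```java
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class RenderedFetch {
    // Fetches a category page with a real browser so client-side JavaScript
    // runs, then returns the rendered DOM as HTML.
    static String fetchRendered(String url) {
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--headless=new"); // no visible browser window
        WebDriver driver = new ChromeDriver(options);
        try {
            driver.get(url);
            // Wait until the JS-built product list appears; ".product" is a
            // placeholder for the site's actual product selector.
            new WebDriverWait(driver, Duration.ofSeconds(10))
                .until(ExpectedConditions.presenceOfElementLocated(
                        By.cssSelector(".product")));
            return driver.getPageSource(); // DOM after JavaScript has run
        } finally {
            driver.quit();
        }
    }

    public static void main(String[] args) {
        String html = fetchRendered(args[0]);
        System.out.println("rendered length: " + html.length());
    }
}
```

The rendered HTML can then be handed to the same parsing code the crawler4j visitor would have used; Crawljax essentially wraps this browser-driving loop into a full crawler.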