Totally new to Scrapy and CrawlSpider, and I'm stuck on how to define rules for nested crawling. I have a rule defined as:
Rule(LinkExtractor(
    allow=(),
    restrict_xpaths='//div[@class="sch-main-menu-sub-links-left"]'
), callback='parse_item', follow=True)
This crawls the categories page. Then, from each category page extracted by that rule, I have to crawl all the products on the page with this rule:
Rule(LinkExtractor(
    allow=(),
    restrict_xpaths='//div[@class="sch-category-products-item"]'
), callback='parse_product', follow=True)
Also, the category pages are paginated, and I'm not sure how to handle that either. The first rule works, but where do I place the second Rule? Is there any way in CrawlSpider to define rule levels?
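For reference, here's a rough sketch of how I imagine the whole spider fitting together: every rule in the single rules tuple, plus a third rule for pagination. The spider name, domain, start URL, and the pagination XPath ('//a[@class="next"]') are placeholders, since they depend on the actual site markup:

from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor

class ProductsSpider(CrawlSpider):
    name = 'products'
    allowed_domains = ['example.com']      # placeholder domain
    start_urls = ['https://example.com/']  # placeholder start URL

    rules = (
        # Follow category links from the main menu
        Rule(LinkExtractor(
            restrict_xpaths='//div[@class="sch-main-menu-sub-links-left"]'
        ), callback='parse_item', follow=True),
        # Follow pagination links on category pages; the XPath is a guess
        Rule(LinkExtractor(
            restrict_xpaths='//a[@class="next"]'
        ), follow=True),
        # Follow product links and parse each product page
        Rule(LinkExtractor(
            restrict_xpaths='//div[@class="sch-category-products-item"]'
        ), callback='parse_product', follow=True),
    )

    def parse_item(self, response):
        pass  # category-page parsing, if any

    def parse_product(self, response):
        yield {'url': response.url}  # placeholder extraction

From what I understand of the docs, CrawlSpider applies every rule's link extractor to every response it follows, and restrict_xpaths is what scopes each rule to one part of the page, so there's no explicit "levels" mechanism. Is that the right way to think about it?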