How can I send the scraped URLs from one spider to the start_urls of another spider?
Specifically, I want to run one spider that gets a list of URLs from an XML page. After the URLs have been retrieved, I want them to be used by another spider for scraping.
from scrapy.spiders import SitemapSpider


class Daily(SitemapSpider):
    name = 'daily'
    sitemap_urls = ['http://example.com/sitemap.xml']

    def parse(self, response):
        print(response.url)
        # How do I send these URLs to another spider instead?
        yield {
            'url': response.url
        }
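To illustrate what I mean by "used by another spider", here is a rough sketch of the kind of second spider I have in mind, assuming the URLs end up in a hypothetical daily_urls.jl JSON Lines file (e.g. via a feed export) and the second spider is called detail. I am not sure this is the right way to connect the two spiders:

import json

import scrapy


class Detail(scrapy.Spider):
    # Hypothetical second spider; the name and file path are placeholders.
    name = 'detail'

    def start_requests(self):
        # Read the URLs written by the first spider, assumed to be a
        # JSON Lines file with one {"url": ...} object per line.
        with open('daily_urls.jl') as f:
            for line in f:
                url = json.loads(line)['url']
                yield scrapy.Request(url, callback=self.parse)

    def parse(self, response):
        # The actual scraping of each page would go here.
        yield {'title': response.css('title::text').get()}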