
I'm into my 3rd Scrapy project and I'm getting a little bolder. I want to give this program to non-technical users, so either a command-line tool or, preferably, an .exe.

First off, I started using CrawlerProcess; following the documentation, I came up with this:

from scrapy.crawler import CrawlerProcess

# FirstSpider, SecondSpider, etc. are imported from their own .py files
process = CrawlerProcess()
process.crawl(FirstSpider)
process.crawl(SecondSpider)
process.crawl(ThirdSpider)
process.crawl(LastSpider)
process.start()

Each spider is in its own .py file, so I've imported each one into a single runner script and put this block of code at the bottom. If there's a better way, I'm all ears.

I tried running this as-is from the command prompt, and it returns an error saying scraper.list doesn't exist when I try to import the other spiders.

I can run each scraper individually from the VS Code terminal with the usual scrapy crawl xyz... so how do we wrap it up for end users?
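One common way to wrap this up for end users is a single runner script that is then frozen into a standalone executable with PyInstaller (pyinstaller --onefile run_scrapers.py). Below is a minimal sketch, not the project's actual code: the spider names, the KNOWN_SPIDERS list, and the script name run_scrapers.py are all placeholders. Note that with the project's settings loaded via get_project_settings(), CrawlerProcess.crawl() also accepts spider names as strings, so the runner doesn't need to import every spider class.

```python
# run_scrapers.py -- sketch of a runner for non-technical users.
# Spider names here are placeholders for this example.
import argparse

KNOWN_SPIDERS = ["first", "second", "third", "last"]

def pick_spiders(requested):
    """Return the requested spider names, or all of them if none were given."""
    if not requested:
        return list(KNOWN_SPIDERS)
    unknown = [name for name in requested if name not in KNOWN_SPIDERS]
    if unknown:
        raise SystemExit(f"unknown spider(s): {', '.join(unknown)}")
    return requested

def main(argv=None):
    parser = argparse.ArgumentParser(description="Run the scrapers")
    parser.add_argument("spiders", nargs="*",
                        help="spider names to run (default: all of them)")
    args = parser.parse_args(argv)

    # Import Scrapy only when actually crawling, so the argument handling
    # above can be exercised without a Scrapy install.
    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings

    # get_project_settings() reads scrapy.cfg / settings.py, which lets
    # crawl() resolve spiders by their string names.
    process = CrawlerProcess(get_project_settings())
    for name in pick_spiders(args.spiders):
        process.crawl(name)
    process.start()  # blocks until all queued spiders finish

if __name__ == "__main__":
    main()
```

When freezing with PyInstaller, the Scrapy project package still has to be importable from the bundled script, so the runner is easiest to maintain when it sits next to scrapy.cfg at the project root.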

Thanks in advance.

Artie

1 Answer


Thanks furas, I apologize for the omission — I ended up solving my own problem. The script was nested too deep in the file structure and I had to move it further up: it couldn't read the Scraper.items module because Python wasn't traveling up and then back down the file path to resolve the import.
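An alternative to moving the script is putting the project root on sys.path so the Scraper.items import resolves from anywhere. A minimal sketch of that idea — the folder depth (levels_up) and the add_project_root helper are assumptions for illustration, not part of this project:

```python
import sys
from pathlib import Path

def add_project_root(script_path, levels_up=2):
    """Prepend the directory `levels_up` parents above `script_path` to
    sys.path, so imports like `from Scraper.items import ...` resolve
    no matter how deep the runner script lives."""
    root = Path(script_path).absolute().parents[levels_up]
    if str(root) not in sys.path:
        sys.path.insert(0, str(root))
    return root

# In the runner script itself this would be called as:
#   add_project_root(__file__)
# before importing anything from the Scraper package.
```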

I've almost wrapped up this project, but I'm having trouble with the exporter; I've posted that question here: Using Scrapy JsonItemsLinesExporter, returns no value

Artie