
I have created a spider using the Portia UI and deployed and scheduled it on one of my virtual machines using scrapyd. The spider ran fine and scraped the website contents.
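
For reference, the scheduling step is roughly equivalent to a call against scrapyd's `schedule.json` endpoint. A minimal sketch (the project name, spider name and host are placeholders, not my exact setup):

```python
# Minimal sketch of scheduling a spider through scrapyd's HTTP API.
# "myproject" and "myspider" are placeholder names; adjust the host/port as needed.
import requests

SCRAPYD = "http://localhost:6800"

resp = requests.post(
    f"{SCRAPYD}/schedule.json",
    data={"project": "myproject", "spider": "myspider"},
)
print(resp.json())  # a successful run returns something like {"status": "ok", "jobid": "..."}
```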

But when I deploy and schedule the same spider on another, similar virtual machine using scrapyd, the spider runs fine but does not crawl any content.

Both machines have the same configuration, setup, packages, and package versions.

What could be the possible issue?

Edit
I have done the following:
- Installed all Portia packages on my machine using Docker
- Created a spider (say, myspider)
- Deployed and scheduled that spider using scrapyd
- Got extracted content from the spider run
- Cloned the machine and added it to another network with a different ISP
- Deployed the same spider (myspider)
- The spider ran fine, but no website content was extracted (see the check sketched after this list)
- Created a new spider with a different URL, and that spider crawls the website content fine
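
Since the only difference between the two machines is the network/ISP, one check I can think of is comparing what the target site actually returns to each machine. A minimal sketch, run on both VMs (the URL and User-Agent below are placeholders, not the spider's real configuration):

```python
# Hypothetical check: fetch the spider's start URL from each VM and compare
# the status code and body size, to see whether the site serves different
# content (or blocks requests) on the second network.
import requests

URL = "https://example.com/"  # replace with the spider's actual start URL

resp = requests.get(URL, headers={"User-Agent": "Mozilla/5.0"}, timeout=30)
print(resp.status_code, len(resp.text))
```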

Prabhakar