I know this probably seems ridiculous. I have given up on a Windows scrapyd implementation and have set up an Ubuntu machine, where everything works great. I have 3 projects, each with its own spider. I can run my spiders from the terminal using:
curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider2
Everything seems to work in the web UI as well, with the scraped items from the above request showing up in the correct places.
I want to run project 1 every day at 12:00am, project 2 every second day at 2:00am, and project 3 every 2 weeks at 4:00am. Please help me learn how to do this.
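From what I can tell, scrapyd has no built-in time-based scheduler, so my guess is that the usual approach is to drive it from cron. Here is the kind of crontab I imagine (the project and spider names are placeholders for mine, and I'm not sure the "every 2 weeks" case can be expressed exactly in plain cron):

# m h dom mon dow  command
# Project 1: every day at 12:00am (midnight)
0 0 * * * curl http://localhost:6800/schedule.json -d project=project1 -d spider=spider1
# Project 2: every second day at 2:00am (*/2 resets each month, so this is approximate)
0 2 */2 * * curl http://localhost:6800/schedule.json -d project=project2 -d spider=spider2
# Project 3: roughly every 2 weeks at 4:00am (1st and 15th as an approximation)
0 4 1,15 * * curl http://localhost:6800/schedule.json -d project=project3 -d spider=spider3

Is editing something like this into crontab -e the standard way to do it, or is there a better approach?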
Is scrapyd even an appropriate solution for this task?