I have two Scrapy projects with the following configurations:
Project1's scrapy.cfg:
[settings]
default = Project1.settings
[deploy]
url = http://localhost:6800/
project = Project1
[scrapyd]
eggs_dir = eggs
logs_dir = logs
logs_to_keep = 500
dbs_dir = dbs
max_proc = 5
max_proc_per_cpu = 10
http_port = 6800
debug = off
runner = scrapyd.runner
application = scrapyd.app.application
and Project2's scrapy.cfg:
[settings]
default = Project2.settings
[deploy]
url = http://localhost:6800/
project = Project2
[scrapyd]
eggs_dir = eggs
logs_dir = logs
logs_to_keep = 500
dbs_dir = dbs
max_proc = 5
max_proc_per_cpu = 10
http_port = 6800
debug = off
runner = scrapyd.runner
application = scrapyd.app.application
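For completeness, this is roughly how I schedule the spiders, through Scrapyd's schedule.json API (the spider names below are just placeholders):
curl http://localhost:6800/schedule.json -d project=Project1 -d spider=spider1
curl http://localhost:6800/schedule.json -d project=Project2 -d spider=spider2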
But when I look at http://localhost:6800/jobs, I always see only 8 jobs in the Running section, which means my max_proc_per_cpu setting is not being applied (8 is exactly Scrapyd's default of 4 processes per CPU on my 2 cores). I deleted the projects with the following commands:
curl http://localhost:6800/delproject.json -d project=Project1
curl http://localhost:6800/delproject.json -d project=Project2
and then deployed them again to make sure the new configuration is picked up, but the number of running spiders is still 8.
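The redeploy step is a plain scrapyd-deploy (from scrapyd-client) run from each project's directory, roughly like this (the paths are placeholders):
cd /path/to/Project1
scrapyd-deploy -p Project1
cd /path/to/Project2
scrapyd-deploy -p Project2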
My VPS CPU has two cores, which I can confirm with:
python -c 'import multiprocessing; print(multiprocessing.cpu_count())'
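To double-check the running-job count on the daemon side, I also query daemonstatus.json (available in recent Scrapyd versions); as far as I can tell it only reports running/pending/finished counts, not the effective settings:
curl http://localhost:6800/daemonstatus.json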
How can I see the configuration that my deployed Scrapyd is actually using, and how can I set max_proc_per_cpu correctly?