I am trying to run Scrapyd on my Ubuntu server, which has a public IP, using the following config file named scrapy.cfg:
[settings]
default = web_crawler.settings

[deploy:default]
url = http://127.0.0.1:6800/
project = web_crawler

[scrapyd]
eggs_dir = eggs
logs_dir = logs
jobs_to_keep = 5
dbs_dir = dbs
max_proc = 1
max_proc_per_cpu = 4
finished_to_keep = 100
poll_interval = 5.0
bind_address = 0.0.0.0
http_port = 6800
debug = off
runner = scrapyd.runner
application = scrapyd.app.application
launcher = scrapyd.launcher.Launcher
webroot = scrapyd.website.Root
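For context, I run everything from the directory that contains this scrapy.cfg (the path below is just a placeholder for my project directory; scrapyd-deploy comes from the scrapyd-client package):

cd ~/web_crawler          # placeholder: the project directory holding scrapy.cfg
scrapyd                   # start the Scrapyd daemon
scrapyd-deploy default    # deploy the project to the [deploy:default] target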
After starting the Scrapyd server with the scrapyd command and deploying the spider with scrapyd-deploy default, I try to open the Scrapyd web interface in a browser at http://publicip:6800, but it just says "connection refused". What could be the reason for this? There is no firewall and all ports are open. What am I missing?
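For what it's worth, these are the checks I plan to run on the server itself to see what Scrapyd is actually listening on (assuming curl and the ss tool from iproute2 are available, and that my Scrapyd version exposes the daemonstatus.json endpoint):

ss -tlnp | grep 6800                            # which address/port is the scrapyd process bound to?
curl http://127.0.0.1:6800/daemonstatus.json    # does the API answer locally on the server?
curl http://publicip:6800/daemonstatus.json     # does it answer via the public IP?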