
I am trying to put my logs in a "logs" folder, but when I deploy to Scrapy Cloud (Scrapinghub) I get `No such file or directory: '/scrapinghub/sfb/logs/random_log.log'`. I think I am declaring the folder correctly in the setup.py file. What am I doing wrong here?

File structure:

sfbu 
     -- bin
            -- sfbitemcomparer.py
     -- requirements.txt
     -- scrapy.cfg
     -- setup.py
     -- sfb 
            -- logs 
                    -- random_log.log
            -- __init__.py
            -- items.py
            -- middlewares.py
            -- pipelines.py
            -- settings.py
            -- spiders
                       -- spider.py

setup.py:

from setuptools import setup, find_packages

setup(
    name         = 'sfb',
    version      = '1.0',
    packages     = find_packages(),
    scripts      = ['bin/sfbitemcomparer.py'],
    package_data = {
        'sfb': ['sfb/logs/*.log']
    },
    entry_points = {'scrapy': ['settings = sfb.settings']},
)
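For reference, setuptools treats each `package_data` key as a package name, with the listed paths relative to that package's directory (not to the project root). A sketch of the conventional form, untested against this project:

```python
from setuptools import setup, find_packages

setup(
    name         = 'sfb',
    version      = '1.0',
    packages     = find_packages(),
    scripts      = ['bin/sfbitemcomparer.py'],
    # package_data paths are relative to the 'sfb' package directory,
    # so 'logs/*.log' rather than 'sfb/logs/*.log'
    package_data = {
        'sfb': ['logs/*.log']
    },
    entry_points = {'scrapy': ['settings = sfb.settings']},
)
```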
weston6142
  • Do you get above error using `shub deploy` or while running your spider? – gangabass Aug 14 '20 at 02:23
  • `shub deploy` or just deploying on the Scrapinghub website. I ended up just changing all my file handlers to stream handlers and looking at the logs on the Scrapinghub website. Do you have a good log file solution if I am using manual loggers and not the Scrapy root logger? – weston6142 Aug 14 '20 at 02:59
  • Your manual loggers should work without any changes. Don’t they? Are you using Python logging or something else? – Gallaecio Aug 17 '20 at 12:25
  • I actually ditched this method and ended up just using loggly with the loggly handler and it works great! – weston6142 Aug 17 '20 at 14:14
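The workaround described in the comments, replacing file handlers with stream handlers so output lands in the job's log view, could be sketched like this with the standard `logging` module (the logger name and format are illustrative, not from the original project):

```python
import logging
import sys

def get_cloud_logger(name: str) -> logging.Logger:
    """Return a named logger that writes to stderr instead of a file.

    On Scrapy Cloud a hard-coded log-file path may not exist at runtime,
    so a StreamHandler makes the output appear in the job's log view.
    """
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid attaching duplicate handlers on repeat calls
        handler = logging.StreamHandler(sys.stderr)
        handler.setFormatter(
            logging.Formatter("%(asctime)s [%(name)s] %(levelname)s: %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger

# Usage in a spider or pipeline:
logger = get_cloud_logger("sfb.comparer")
logger.info("item comparison started")
```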

0 Answers