
I read some threads about connecting MySQL to a script deployed on Scrapinghub. They recommend changing the *.yml file and adding a requirements.txt. This solution worked a few days ago; now it doesn't.


Here is the error from shub deploy:

 Collecting MySQLdb-python==1.2.5 (from -r /app/requirements.txt (line 1))
   Could not find a version that satisfies the requirement MySQLdb-python==1.2.5 (from -r /app/requirements.txt (line 1)) (from versions: )
 No matching distribution found for MySQLdb-python==1.2.5 (from -r /app/requirements.txt (line 1))
 {"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE     pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "details": {"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE     pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "code": 1}, "error": "requirements_error"}
 {"status": "error", "message": "Requirements error"}

Did Scrapinghub change something, or where am I wrong?

Billy Jhon

2 Answers


1) You should write MySQL-python==1.2.5 in your requirements.txt file; this is what I have, and my project is working fine.
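A common gotcha here, and the likely source of confusion in the comments below: the PyPI distribution name you pin is not the name you import. A minimal illustration (the values are just the ones from this answer):

```python
# The name pinned in requirements.txt is the PyPI *distribution* name;
# the name used in code is the *module* name, and they differ here.
PIN = "MySQL-python==1.2.5"  # install name, goes in requirements.txt
IMPORT_NAME = "MySQLdb"      # module name, used as `import MySQLdb`

# Splitting the pin separates the distribution name from the version.
dist_name, _, version = PIN.partition("==")
print(dist_name, version)
```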

Then I use MySQL like this:

import MySQLdb
import MySQLdb.cursors

conn = MySQLdb.connect(user=DB_CREDS['user'], passwd=DB_CREDS['pass'],
                       db=DB_CREDS['db'], host=DB_CREDS['host'],
                       charset="utf8", use_unicode=True)
conn.autocommit(True)

# A DictCursor returns each row as a dict keyed by column name
cursor = conn.cursor(MySQLdb.cursors.DictCursor)
cursor.execute("SELECT myColumn FROM table_name")

for row in cursor.fetchall():
    print row['myColumn']
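As a sketch (assuming a DB_CREDS mapping like the one used above, which this answer never defines), the connection parameters can be assembled separately, so they are easy to check without a live MySQL server:

```python
# Hypothetical helper: assemble MySQLdb.connect() keyword arguments
# from a credentials mapping like the DB_CREDS used above.
def connect_kwargs(creds):
    return {
        "user": creds["user"],
        "passwd": creds["pass"],   # note: MySQLdb uses 'passwd', not 'password'
        "db": creds["db"],
        "host": creds["host"],
        "charset": "utf8",
        "use_unicode": True,
    }

# Placeholder credentials for illustration only
DB_CREDS = {"user": "scraper", "pass": "secret", "db": "items", "host": "localhost"}
kwargs = connect_kwargs(DB_CREDS)
# Then: conn = MySQLdb.connect(**kwargs)
```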

2) Also remove build/ and project.egg-info/ folders before deploying again.

3) Make sure your scrapinghub.yml file looks like this

projects:
  default: 123213213
requirements_file: requirements.txt
Umair Ayub
  • Then what do you import? MySql or MySQLdb? – Billy Jhon Nov 03 '17 at 14:17
  • Nope, does not make any difference. See screenshot - http://cropme.ru/s/h/b/3/ed12017d.png And yes, I have seen your question and answers before. That is what's strange: it did work 5 days ago for another project. Now none of them works. – Billy Jhon Nov 03 '17 at 14:32
  • @BillyJhon Ah, I didn't notice: you have incorrect settings in your `scrapinghub.yml` file ... please see my updated answer again, point 3 – Umair Ayub Nov 03 '17 at 14:35
  • Update `shub` with `pip install shub --upgrade` ... or try to log out from shub and then log back in – Umair Ayub Nov 03 '17 at 14:42
  • Looks like the settings did the trick. Thanks for all your help. I will play with it a bit more and mark your answer as accepted. Thanks again. – Billy Jhon Nov 03 '17 at 14:47

I got the same issue.

python: 3.9.6, pip: 21.1.3, shub: 2.13.0

# scrapinghub.yml
projects:
  default: 70449

requirements_file: requirements.txt

# requirements.txt
pip==21.1.3
$ python -m shub deploy
Deploying to Scrapy Cloud project "70449"
Deploy log last 30 lines:
 ---> Using cache
 ---> d5f56e9b2c2f
Step 3/12 : ADD eggbased-entrypoint /usr/local/sbin/
 ---> Using cache
 ---> e4326e357a5c
Step 4/12 : ADD run-pipcheck /usr/local/bin/
 ---> Using cache
 ---> 72567751d2ac
Step 5/12 : RUN chmod +x /usr/local/bin/run-pipcheck
 ---> Using cache
 ---> 365b8f799cc4
Step 6/12 : RUN chmod +x /usr/local/sbin/eggbased-entrypoint &&     ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/start-crawl &&     ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/scrapy-list &&     ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/shub-image-info &&     ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/run-pipcheck
 ---> Using cache
 ---> 3b80a05baecd
Step 7/12 : ADD requirements.txt /app/requirements.txt
 ---> Using cache
 ---> 81a2d0e0fc10
Step 8/12 : RUN mkdir /app/python && chown nobody:nogroup /app/python
 ---> Using cache
 ---> 12d562276a09
Step 9/12 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE -E PIP_NO_CACHE_DIR=0     pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Running in 0c8c5e436aea
Collecting pip==21.1.3 (from -r /app/requirements.txt (line 1))
  Could not find a version that satisfies the requirement pip==21.1.3 (from -r /app/requirements.txt (line 1)) (from versions: )

No matching distribution found for pip==21.1.3 (from -r /app/requirements.txt (line 1))

{"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE -E PIP_NO_CACHE_DIR=0     pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "details": {"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE -E PIP_NO_CACHE_DIR=0     pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "code": 1}, "error": "requirements_error"}

{"status": "error", "message": "Requirements error"}
Deploy log location: /tmp/shub_deploy_se518dkv.log
Error: Deploy failed: b'{"status": "error", "message": "Requirements error"}'

Updated

Specifying the Scrapy stack explicitly made the deploy succeed:

projects:
  default: 70449

stack: scrapy:1.3-py3

requirements_file: requirements.txt
$ python -m shub deploy

Packing version v1.0.1-7-g00c7efa-tmp
Deploying to Scrapy Cloud project "70449"
{"status": "ok", "project": 70449, "version": "v1.0.1-7-g00c7efa-tmp", "spiders": 3}
Run your spiders at: https://app.scrapinghub.com/p/70449/
eddie