I am trying to get my Scrapy crawler running again. I have just installed Debian Stretch, 64 bit, and of course installed python and python-scrapy. The virtualenv is set up (pip reports "Requirement already satisfied"), but when I run the crawler I get the following ImportError about libssl.so.1.0.0:

.../env/local/lib/python2.7/site-packages/cryptography/hazmat/bindings/openssl/binding.py", line 13, in <module>
from cryptography.hazmat.bindings._openssl import ffi, lib
ImportError: libssl.so.1.0.0: cannot open shared object file: No such file or directory

Any ideas what I can do?

$ python -V
Python 2.7.13

$ scrapy -v
Scrapy 1.4.0

python-cryptography installed version: 1.7.1-3

$ openssl version
OpenSSL 1.1.0f  25 May 2017

$ which openssl
/usr/bin/openssl

$ pip freeze
acme==0.10.2
asn1crypto==0.22.0
attrs==16.3.0
backports.shutil-get-terminal-size==1.0.0
beautifulsoup4==4.5.3
boto==2.44.0
certbot==0.10.2
cffi==1.10.0
chardet==2.3.0
chrome-gnome-shell==0.0.0
click==6.6
colorama==0.3.7
ConfigArgParse==0.11.0
configobj==5.0.6
constantly==15.1.0
cryptography==2.0
cssselect==1.0.1
decorator==4.0.11
Django==1.10.7
dnspython==1.15.0
enum34==1.1.6
funcsigs==1.0.2
html5lib==0.999999999
idna==2.5
incremental==16.10.1
ipaddress==1.0.18
ipython==5.1.0
ipython-genutils==0.1.0
keyring==10.1
keyrings.alt==1.3
lxml==3.7.1
mock==2.0.0
musicbrainzngs==0.6
mutagen==1.36
mysqlclient==1.3.7
numpy==1.12.1
PAM==0.4.2
parsedatetime==2.1
parsel==1.2.0
pathlib2==2.1.0
pbr==1.10.0
pdfshuffler==0.6.0
pexpect==4.2.1
pickleshare==0.7.4
Pillow==4.0.0
prompt-toolkit==1.0.9
psutil==5.0.1
ptyprocess==0.5.1
pyasn1==0.1.9
pyasn1-modules==0.0.7
pycparser==2.18
pycrypto==2.6.1
PyDispatcher==2.0.5
Pygments==2.2.0
pygobject==3.22.0
PyICU==1.9.5
pyOpenSSL==17.1.0
PyPDF2==1.26.0
pyRFC3339==1.0
pyserial==3.2.1
pytz==2016.7
pyxdg==0.25
queuelib==1.4.2
quodlibet==3.7.1
requests==2.12.4
scour==0.32
Scrapy==1.4.0
SecretStorage==2.3.1
service-identity==16.0.0
simplegeneric==0.8.1
six==1.10.0
sqlparse==0.2.2
traitlets==4.3.1
Twisted==16.6.0
urllib3==1.19.1
virtualenv==15.1.0
w3lib==1.17.0
wcwidth==0.1.7
webencodings==0.5
zope.component==4.3.0
zope.event==4.2.0
zope.hookable==4.0.4
zope.interface==4.3.2

Contents of requirements.txt (worked fine in Debian Wheezy):

Scrapy==1.0.3
ipdb==0.8.1
service_identity==14.0.0
pytest==2.7.2

1 Answer

In Stretch, libssl is only available from version 1.1 onwards, so there is no libssl.so.1.0.0 shared object for the import to find.
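
As a quick sanity check (standard Debian tooling only, nothing specific to your setup assumed), you can list which libssl runtime packages are actually installed and whether the old 1.0.0 runtime is even available:

$ dpkg -l 'libssl*'              # libssl packages currently on the system
$ apt-cache policy libssl1.0.0   # is the 1.0.0 runtime available at all?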

I suggest you move to the most recent version of Scrapy (AFAIK the current stable is 1.4.0; I haven't tested it under Stretch, however).
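
A minimal sketch of that upgrade inside the virtualenv (untested on Stretch; `env` is the virtualenv path from your traceback, and the last step assumes the libssl-dev and libffi-dev headers are present):

$ . env/bin/activate
$ pip install --upgrade Scrapy
# If the import error persists, forcing cryptography to rebuild from source
# against the OpenSSL 1.1 headers that Stretch ships may help:
$ pip install --force-reinstall --no-binary cryptography cryptography

The idea is that a wheel (or cached build) compiled against OpenSSL 1.0.0 keeps referencing libssl.so.1.0.0, whereas a fresh source build links against whatever OpenSSL the system actually provides.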

ZeusInTexas
  • My bad, I just got what you meant. Have you tried checking the version of python-cryptography? – ZeusInTexas Aug 01 '17 at 21:31
  • I have removed scrapy (`apt-get remove --purge python-scrapy` and `pip uninstall scrapy`) and updated scrapy to version 1.4.0 (`sudo pip install scrapy`). I also installed `cryptography-2.0.2-cp27-cp27mu-manylinux1_x86_64.whl`. However, the problem persists. – Katja Eichelberger Aug 01 '17 at 21:47
  • You installed `cryptography` or `python-cryptography`? – ZeusInTexas Aug 01 '17 at 22:13
  • I have installed both `cryptography` and `python-cryptography`. – Katja Eichelberger Aug 02 '17 at 05:28
  • I tried reproducing your issue on a fresh, clean Stretch installation, but `import scrapy` worked. Can you edit your first post and add some of your code? Also, did you update from Wheezy, and what is your version of `python-cryptography`? – ZeusInTexas Aug 02 '17 at 10:37
  • `python-cryptography` installed version is 1.7.1-3. Updated the original question with more information. Did not update, fresh install. Which code would you like to see? – Katja Eichelberger Aug 02 '17 at 13:01
  • Well, that's weird. I'm out of ideas. Maybe try your code in a clean VM to see whether the problem comes from the system or from the code itself? One dirty hack would be to link the lib `ln -s /usr/lib/libssl.so.1.{1.0f,0.0}`, but it's almost guaranteed to fail! Btw, can you post the output of `ls /usr/lib/libssl*`? – ZeusInTexas Aug 03 '17 at 08:45
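
For reference, a sketch of those checks (the multiarch directory is where amd64 Stretch usually puts the libraries; `...` stands for the truncated virtualenv path from the traceback, and the exact filename of the compiled extension may differ):

# which libssl runtimes exist on the system
$ ls -l /usr/lib/libssl* /usr/lib/x86_64-linux-gnu/libssl* 2>/dev/null
# which libssl the compiled cryptography binding actually asks for
$ ldd .../env/local/lib/python2.7/site-packages/cryptography/hazmat/bindings/_openssl.so | grep libssl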