
I'm trying to create an executable of our Scrapy project using cx_Freeze (I also tried py2exe, which seems to run into the same problem), but I keep hitting errors. Currently I'm stuck on the following one; I've searched here and elsewhere for similar problems but haven't found a solution yet. The build of the executable succeeds, but when I try to scrape I get this error:

C:\**\build\exe.win32-2.7>fourmi.exe search Methane
2014-06-04 16:05:31+0200 [-] ERROR: Traceback (most recent call last):
2014-06-04 16:05:31+0200 [-] ERROR:   File "C:\Python27\lib\site-packages\cx_Freeze\initscripts\Console.py", line 27, in <module>
2014-06-04 16:05:31+0200 [-] ERROR:     exec(code, m.__dict__)
2014-06-04 16:05:31+0200 [-] ERROR:   File "fourmi.py", line 92, in <module>
2014-06-04 16:05:31+0200 [-] ERROR:     search(arguments, loader)
2014-06-04 16:05:31+0200 [-] ERROR:   File "fourmi.py", line 78, in search
2014-06-04 16:05:31+0200 [-] ERROR:     setup_crawler(docopt_arguments["<compound>"], settings, source_loader, docopt_arguments["--attributes"].split(','))
2014-06-04 16:05:31+0200 [-] ERROR:   File "fourmi.py", line 41, in setup_crawler
2014-06-04 16:05:31+0200 [-] ERROR:     crawler.configure()
2014-06-04 16:05:31+0200 [-] ERROR:   File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 46, in configure
2014-06-04 16:05:31+0200 [-] ERROR:     self.extensions = ExtensionManager.from_crawler(self)
2014-06-04 16:05:31+0200 [-] ERROR:   File "C:\Python27\lib\site-packages\scrapy\middleware.py", line 50, in from_crawler
2014-06-04 16:05:31+0200 [-] ERROR:     return cls.from_settings(crawler.settings, crawler)
2014-06-04 16:05:31+0200 [-] ERROR:   File "C:\Python27\lib\site-packages\scrapy\middleware.py", line 29, in from_settings
2014-06-04 16:05:31+0200 [-] ERROR:     mwcls = load_object(clspath)
2014-06-04 16:05:31+0200 [-] ERROR:   File "C:\Python27\lib\site-packages\scrapy\utils\misc.py", line 42, in load_object
2014-06-04 16:05:31+0200 [-] ERROR:     raise ImportError("Error loading object '%s': %s" % (path, e))
2014-06-04 16:05:31+0200 [-] ERROR: ImportError: Error loading object 'scrapy.contrib.memusage.MemoryUsage': No module named multipart
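
My guess (and it is only a guess) is that the missing "multipart" refers to the standard library's email.mime.multipart, which scrapy.mail imports and which cx_Freeze apparently didn't pick up. To see which modules loading the MemoryUsage extension actually pulls in, and therefore what I might have to add to the cx_Freeze options, a small diagnostic like the following can be run with the normal (non-frozen) interpreter; it uses the same load_object helper that appears in the traceback above:

# Diagnostic sketch: list the modules that importing the MemoryUsage extension
# drags in, so missing ones can be added to the cx_Freeze options.
# Run with the regular interpreter, not the frozen executable.
import sys

before = set(sys.modules)

from scrapy.utils.misc import load_object  # same helper as in the traceback
load_object('scrapy.contrib.memusage.MemoryUsage')

for name in sorted(set(sys.modules) - before):
    print name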

The setup.py used for the build:

import sys
from cx_Freeze import setup, Executable

# After running the setup file (python setup.py build), the scrapy/VERSION file has to
# be put into library.zip manually. The FourmiCrawler folder also has to be copied to
# both the library and the exe.win32-2.7 folder. After putting the files in the library,
# it has to be zipped again to replace the old library.zip.
# Dependencies are automatically detected, but it might need fine tuning.
build_exe_options = {
    "packages": ["os", "scrapy", "lxml", "w3lib", "pkg_resources",
                 "zope.interface", "twisted.internet"],
    "excludes": []
}

# GUI applications require a different base on Windows (the default is for a
# console application).
base = None

setup(name="Scrapy",
      version="0.1",
      description="My GUI application!",
      options={"build_exe": build_exe_options},
      executables=[Executable("fourmi.py", base=base)])
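
If the missing "multipart" really is email.mime.multipart, then adding the standard library's email package to the packages list might already help. The options below are only a sketch of what I intend to try; the "email" entry and the zip_includes entry are untested assumptions on my part (the latter also assumes the installed cx_Freeze version supports that option, so that scrapy/VERSION no longer has to be copied into library.zip by hand):

# Sketch of the options I intend to try; "email" in packages and the
# zip_includes entry are untested assumptions, not a verified fix.
import os
import scrapy

scrapy_version_file = os.path.join(os.path.dirname(scrapy.__file__), 'VERSION')

build_exe_options = {
    "packages": ["os", "email", "scrapy", "lxml", "w3lib", "pkg_resources",
                 "zope.interface", "twisted.internet"],
    "excludes": [],
    # put scrapy/VERSION inside library.zip automatically instead of by hand;
    # assumes this cx_Freeze version supports zip_includes
    "zip_includes": [(scrapy_version_file, os.path.join('scrapy', 'VERSION'))],
}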

In Can't make standalone binary scrapy spider with cx_Freeze it was suggested to add twisted to the list of packages to include; if I do that, I get the following error:

running build
running build_exe
Traceback (most recent call last):
  File "setup.py", line 20, in <module>
    executables = [Executable("fourmi.py", base=base)])
  File "C:\Python27\lib\site-packages\cx_Freeze\dist.py", line 362, in setup
    distutils.core.setup(**attrs)
  File "C:\Python27\lib\distutils\core.py", line 152, in setup
    dist.run_commands()
  File "C:\Python27\lib\distutils\dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "C:\Python27\lib\distutils\dist.py", line 972, in run_command
    cmd_obj.run()
  File "C:\Python27\lib\distutils\command\build.py", line 127, in run
    self.run_command(cmd_name)
  File "C:\Python27\lib\distutils\cmd.py", line 326, in run_command
    self.distribution.run_command(command)
  File "C:\Python27\lib\distutils\dist.py", line 972, in run_command
    cmd_obj.run()
  File "C:\Python27\lib\site-packages\cx_Freeze\dist.py", line 232, in run
    freezer.Freeze()
  File "C:\Python27\lib\site-packages\cx_Freeze\freezer.py", line 603, in Freeze
    self.finder = self._GetModuleFinder()
  File "C:\Python27\lib\site-packages\cx_Freeze\freezer.py", line 345, in _GetModuleFinder
    finder.IncludePackage(name)
  File "C:\Python27\lib\site-packages\cx_Freeze\finder.py", line 688, in IncludePackage
    self._ImportAllSubModules(module, deferredImports)
  File "C:\Python27\lib\site-packages\cx_Freeze\finder.py", line 324, in _ImportAllSubModules
    recursive)
  File "C:\Python27\lib\site-packages\cx_Freeze\finder.py", line 324, in _ImportAllSubModules
    recursive)
  File "C:\Python27\lib\site-packages\cx_Freeze\finder.py", line 320, in _ImportAllSubModules
    raise ImportError("No module named %r" % subModuleName)
ImportError: No module named 'twisted.conch.ssh.transport'
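
As far as I can tell, cx_Freeze's module finder aborts here because it recursively tries to import every submodule of twisted and trips over twisted.conch.ssh.transport (possibly because one of conch's own dependencies isn't installed). Since the crawler doesn't use SSH, one idea I want to try is simply excluding that subpackage; again, this is just a sketch and assumes scrapy doesn't need twisted.conch at runtime for plain HTTP crawling:

# Sketch: keep twisted in the packages list but stop the module finder from
# recursing into twisted.conch, which it fails to import. Untested assumption:
# scrapy does not need twisted.conch for plain HTTP crawling.
build_exe_options = {
    "packages": ["os", "email", "scrapy", "lxml", "w3lib", "pkg_resources",
                 "zope.interface", "twisted"],
    "excludes": ["twisted.conch"],
}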

Does anybody have suggestions for me to try? Am I forgetting a certain package in the setup.py file? I also couldn't find any mention of a successful Scrapy executable being generated.

Edit: I tried a very ugly workaround by simply copying my Python site-packages over one by one. When I copied the cryptography package this bug was resolved (but replaced with a new one):

C:\**\build\exe.win32-2.7>fourmi.exe search Methane
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\cx_Freeze\initscripts\Console.py", line 27, in <module>
    exec(code, m.__dict__)
  File "fourmi.py", line 26, in <module>
  File "C:\Python27\lib\site-packages\twisted\internet\reactor.py", line 38, in <module>
    from twisted.internet import default
  File "C:\Python27\lib\site-packages\twisted\internet\default.py", line 56, in <module>
    install = _getInstallFunction(platform)
  File "C:\Python27\lib\site-packages\twisted\internet\default.py", line 50, in _getInstallFunction
    from twisted.internet.selectreactor import install
  File "C:\Python27\lib\site-packages\twisted\internet\selectreactor.py", line 18, in <module>
    from twisted.internet import posixbase
  File "C:\Python27\lib\site-packages\twisted\internet\posixbase.py", line 24, in <module>
    from twisted.internet import error, udp, tcp
  File "C:\Python27\lib\site-packages\twisted\internet\tcp.py", line 29, in <module>
    from twisted.internet._newtls import (
  File "C:\Python27\lib\site-packages\twisted\internet\_newtls.py", line 21, in <module>
    from twisted.protocols.tls import TLSMemoryBIOFactory, TLSMemoryBIOProtocol
  File "C:\Python27\lib\site-packages\twisted\protocols\tls.py", line 41, in <module>
    from OpenSSL.SSL import Error, ZeroReturnError, WantReadError
  File "C:\Python27\lib\site-packages\OpenSSL\__init__.py", line 8, in <module>
    from OpenSSL import rand, crypto, SSL
  File "C:\Python27\lib\site-packages\OpenSSL\rand.py", line 11, in <module>
    from OpenSSL._util import (
  File "C:\Python27\lib\site-packages\OpenSSL\_util.py", line 4, in <module>
    binding = Binding()
  File "C:\**\build\exe.win32-2.7\library.zip\cryptography\hazmat\bindings\openssl\binding.py", line 83, in __init__
  File "C:\**\build\exe.win32-2.7\library.zip\cryptography\hazmat\bindings\openssl\binding.py", line 99, in _ensure_ffi_initialized
  File "C:\**\build\exe.win32-2.7\library.zip\cryptography\hazmat\bindings\utils.py", line 72, in build_ffi
  File "C:\Python27\lib\site-packages\cffi\api.py", line 341, in verify
    lib = self.verifier.load_library()
  File "C:\Python27\lib\site-packages\cffi\verifier.py", line 73, in load_library
    self._write_source()
  File "C:\Python27\lib\site-packages\cffi\verifier.py", line 125, in _write_source
    file = open(self.sourcefilename, 'w')
IOError: [Errno 2] No such file or directory: 'C:\\**\\build\\exe.win32-2.7\\library.zip\\cryptography\\hazmat\\bindings\\__pycache__\\_cffi__x969a4f6ex69432c5f.c'
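
If I read this correctly, cryptography's OpenSSL binding is built lazily through cffi's verify step, which wants to write a generated C file next to the package; inside library.zip that is obviously impossible. My current idea (untested) is to trigger that build once with the regular interpreter before freezing, so the compiled extension already exists and can be shipped with the build instead of being generated at runtime:

# Sketch: force cryptography's cffi-based OpenSSL binding to compile once with
# the regular interpreter. The hope (unverified) is that the resulting compiled
# module can then be bundled so the frozen app never tries to write C source
# into library.zip.
from cryptography.hazmat.bindings.openssl.binding import Binding

Binding()  # first instantiation runs cffi's verify/compile step
print "OpenSSL binding compiled"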
You're missing the email package. Please check my solution in [Issue packaging scrapy spider with cx_Freeze or py2exe](http://stackoverflow.com/a/25513413/3975786) – mcandal Aug 31 '14 at 22:21
