Unable to deploy to Scrapinghub

Problem description:

When I try to deploy using shub deploy, I get this error:

Removing intermediate container fccf1ec715e6
Step 10 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Running in 729e0d414f46
Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')

{"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "details": {"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1"}, "error": "build_error"}

{"message": "Internal build error", "status": "error"} Deploy log location: c:\users\dr521f~1.pri\appdata\local\temp\shub_deploy_pvx7dk.log Error: Deploy failed: {"message": "Internal build error", "status": "error"}

Here is my requirements.txt:

attrs==16.1.0 
beautifulsoup4==4.5.1 
cffi==1.8.2 
click==6.6 
cryptography==1.5 
cssselect==0.9.2 
enum34==1.1.6 
fake-useragent==0.1.2 
hubstorage==0.23.1 
idna==2.1 
ipaddress==1.0.17 
lxml==3.6.1 
parsel==1.0.3 
pyasn1==0.1.9 
pyasn1-modules==0.0.8 
pycparser==2.14 
PyDispatcher==2.0.5 
pyOpenSSL==16.1.0 
pypiwin32==219 
queuelib==1.4.2 
requests==2.11.1 
retrying==1.3.3 
ruamel.ordereddict==0.4.9 
ruamel.yaml==0.12.13 
scrapinghub==1.8.0 
Scrapy==1.1.2 
scrapy-fake-useragent==0.0.1 
service-identity==16.0.0 
shub==2.4.0 
six==1.10.0 
Twisted==16.4.0 
typing==3.5.2.2 
w3lib==1.15.0 
zope.interface==4.3.2 

Why can't I deploy?

From the documentation here:

Note that this requirements file is an extension of the Scrapy Cloud stack, and therefore should not contain packages that are already part of the stack, such as scrapy.

As you can see in the error:

Running in 729e0d414f46
Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')

It says Double requirement given: the build appends your requirements.txt to the stack's own pins, and attrs==16.0.0 is already pinned there (line 1 of the combined /app/requirements.txt), so your attrs==16.1.0 (which lands at line 51) duplicates it at a conflicting version.
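Before deploying, you can scan your requirements file for packages the stack already pins. A minimal sketch in Python, assuming you saved the stack's pinned list locally as stack-requirements.txt (a hypothetical filename; the stacks' actual requirement lists are published in the scrapinghub/scrapinghub-stack-scrapy repository):

# find_overlaps.py - flag packages pinned both in your requirements file
# and in the Scrapy Cloud stack's requirements.
import re

def package_names(path):
    # Return the set of package names pinned in a requirements file.
    names = set()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            # Keep everything before the version specifier, normalized.
            name = re.split(r'[=<>!~;\[]', line, maxsplit=1)[0].strip().lower()
            if name:
                names.add(name)
    return names

# stack-requirements.txt is an assumption: save your stack's pin list there.
overlap = package_names('requirements.txt') & package_names('stack-requirements.txt')
for name in sorted(overlap):
    print('already in the stack, drop from requirements.txt:', name)

Any package this prints should be removed from the file you deploy with.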

Use separate requirements files for the full project and for Scrapinghub. I ended up creating a shub-requirements.txt containing just this (and pointing scrapinghub.yml at it, as sketched after the list):

beautifulsoup4==4.5.1 
fake-useragent==0.1.2
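
To make shub actually use that file, point scrapinghub.yml at it. A minimal sketch, assuming a project ID of 12345 (the exact key depends on your shub version — requirements_file as shown here, or a nested requirements: file: entry in newer releases, so check shub's docs for the version you run):

# scrapinghub.yml - deploy with the trimmed requirements file
projects:
  default: 12345  # replace with your Scrapy Cloud project ID
requirements_file: shub-requirements.txt

After that, shub deploy installs only the two extra packages, and everything else comes from the stack itself.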