I installed scrapy_proxies with pip install scrapy_proxies, but every time I run my spider I get the following error log:
scrapy crawl event -o items_new.csv
2018-09-13 01:15:19 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: allevents)
2018-09-13 01:15:19 [scrapy.utils.log] INFO: Versions: lxml 4.2.1.0, libxml2 2.9.8, cssselect 1.0.3, parsel 1.4.0, w3lib 1.19.0, Twisted 17.9.0, Python 2.7.12 (default, Dec 4 2017, 14:50:18) - [GCC 5.4.0 20160609], pyOpenSSL 17.5.0 (OpenSSL 1.1.0h 27 Mar 2018), cryptography 2.2.2, Platform Linux-4.15.0-33-generic-x86_64-with-Ubuntu-16.04-xenial
2018-09-13 01:15:19 [scrapy.crawler] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'allevents.spiders', 'FEED_URI': 'items_new.csv', 'SPIDER_MODULES': ['allevents.spiders'], 'RETRY_HTTP_CODES': [500, 503, 504, 400, 403, 404, 408], 'BOT_NAME': 'allevents', 'RETRY_TIMES': 10, 'CLOSESPIDER_PAGECOUNT': 2300, 'FEED_FORMAT': 'csv', 'USER_AGENT': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/65.0.3325.181 Safari/537.36'}
2018-09-13 01:15:19 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.closespider.CloseSpider',
'scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.corestats.CoreStats']
Unhandled error in Deferred:
2018-09-13 01:15:19 [twisted] CRITICAL: Unhandled error in Deferred:
2018-09-13 01:15:19 [twisted] CRITICAL:
Traceback (most recent call last):
File "/home/hp/.local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/crawler.py", line 98, in crawl
six.reraise(*exc_info)
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/crawler.py", line 80, in crawl
self.engine = self._create_engine()
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/crawler.py", line 105, in _create_engine
return ExecutionEngine(self, lambda _: self.stop())
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/core/engine.py", line 69, in __init__
self.downloader = downloader_cls(crawler)
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/core/downloader/__init__.py", line 88, in __init__
self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/middleware.py", line 58, in from_crawler
return cls.from_settings(crawler.settings, crawler)
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/middleware.py", line 34, in from_settings
mwcls = load_object(clspath)
File "/home/hp/.local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 44, in load_object
mod = import_module(module)
File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
ImportError: No module named scrapy_proxies
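For context, the middleware is enabled in my settings.py the way the scrapy_proxies README describes; the relevant section looks roughly like this (the PROXY_LIST path is a placeholder):

# Retry failed pages, routing each retry through a different proxy
RETRY_TIMES = 10
RETRY_HTTP_CODES = [500, 503, 504, 400, 403, 404, 408]

DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': 90,
    'scrapy_proxies.RandomProxy': 100,
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
}

# Plain-text file with one proxy per line (placeholder path)
PROXY_LIST = '/path/to/proxy/list.txt'
# 0 = pick a random proxy from the list for every request
PROXY_MODE = 0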
But even from inside the spider's project directory I explicitly tried to install scrapy_proxies again, and pip reported that it was already installed.
In addition:
hp@hp-HP-Notebook:~$ pip --version
pip 9.0.1 from /home/hp/anaconda3/lib/python3.6/site-packages (python 3.6)
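Note that pip points at the anaconda3 Python 3.6, while the traceback above shows the spider running under /usr/lib/python2.7, so I suspect pip installed scrapy_proxies into a different interpreter than the one Scrapy runs on. A quick diagnostic like this (just a sketch, not part of my project) should show which interpreter and site-packages Scrapy actually uses:

import sys
import scrapy

print(sys.executable)   # the Python binary actually running this script
print(scrapy.__file__)  # where this interpreter imported Scrapy from

How do I get scrapy_proxies installed for the interpreter that actually runs the spider?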