No module named 'scrapy.contrib' when deploying to Scrapy Cloud - PullRequest
0 votes
/ June 19, 2019

I developed a spider in anaconda3 and I am trying to deploy it to Scrapy Cloud, but I get an error when the crawl starts.

    Traceback (most recent call last):
      File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
        _run(args, settings)
      File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 103, in _run
        _run_scrapy(args, settings)
      File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
        execute(settings=settings)
      File "/usr/local/lib/python3.6/site-packages/scrapy/cmdline.py", line 150, in execute
        _run_print_help(parser, _run_command, cmd, args, opts)
      File "/usr/local/lib/python3.6/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
        func(*a, **kw)
      File "/usr/local/lib/python3.6/site-packages/scrapy/cmdline.py", line 157, in _run_command
        cmd.run(args, opts)
      File "/usr/local/lib/python3.6/site-packages/scrapy/commands/crawl.py", line 57, in run
        self.crawler_process.crawl(spname, **opts.spargs)
      File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 171, in crawl
        crawler = self.create_crawler(crawler_or_spidercls)
      File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 200, in create_crawler
        return self._create_crawler(crawler_or_spidercls)
      File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 205, in _create_crawler
        return Crawler(spidercls, self.settings)
      File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 55, in __init__
        self.extensions = ExtensionManager.from_crawler(self)
      File "/usr/local/lib/python3.6/site-packages/scrapy/middleware.py", line 53, in from_crawler
        return cls.from_settings(crawler.settings, crawler)
      File "/usr/local/lib/python3.6/site-packages/scrapy/middleware.py", line 35, in from_settings
        mw = create_instance(mwcls, settings, crawler)
      File "/usr/local/lib/python3.6/site-packages/scrapy/utils/misc.py", line 140, in create_instance
        return objcls.from_crawler(crawler, *args, **kwargs)
      File "/usr/local/lib/python3.6/site-packages/scrapy/extensions/feedexport.py", line 205, in from_crawler
        o = cls(crawler.settings)
      File "/usr/local/lib/python3.6/site-packages/scrapy/extensions/feedexport.py", line 189, in __init__
        self.exporters = self._load_components('FEED_EXPORTERS')
      File "/usr/local/lib/python3.6/site-packages/scrapy/extensions/feedexport.py", line 256, in _load_components
        d[k] = load_object(v)
      File "/usr/local/lib/python3.6/site-packages/scrapy/utils/misc.py", line 44, in load_object
        mod = import_module(module)
      File "/usr/local/lib/python3.6/importlib/__init__.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
      File "<frozen importlib._bootstrap>", line 994, in _gcd_import
      File "<frozen importlib._bootstrap>", line 971, in _find_and_load
      File "<frozen importlib._bootstrap>", line 941, in _find_and_load_unlocked
      File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
      File "<frozen importlib._bootstrap>", line 994, in _gcd_import
      File "<frozen importlib._bootstrap>", line 971, in _find_and_load
      File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
    ModuleNotFoundError: No module named 'scrapy.contrib'

What is the problem? (I am using Scrapy 1.6.) I disabled the user agent middleware because I read that it used scrapy.contrib, but that changed nothing.
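A note on where to look: the last application frame in the traceback is `feedexport.py` calling `self._load_components('FEED_EXPORTERS')`, so the dotted path that fails to import almost certainly comes from a `FEED_EXPORTERS` (or similar) entry in the project settings, not from the user agent middleware. `scrapy.contrib` was deprecated around Scrapy 1.0 and the backward-compatibility shims no longer exist in the Scrapy 1.6 image on Scrapy Cloud. The settings.py sketch below is illustrative, assuming a CSV exporter entry; the actual key and class in the project may differ:

```python
# settings.py -- hypothetical sketch of the fix; the real entry may differ.

# Old-style path that raises ModuleNotFoundError on Scrapy 1.6,
# because the scrapy.contrib package was removed:
# FEED_EXPORTERS = {
#     'csv': 'scrapy.contrib.exporter.CsvItemExporter',
# }

# Current path: the built-in exporter classes live in scrapy.exporters.
FEED_EXPORTERS = {
    'csv': 'scrapy.exporters.CsvItemExporter',
}
```

Searching the whole project (settings.py, pipelines, any copied snippets) for the string `scrapy.contrib` and updating each path to its modern location should clear the error; alternatively, pinning an older Scrapy stack for the Scrapy Cloud deployment would also avoid it, assuming the project otherwise runs on that version.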
