Connecting to a local application from a Flask app in a container
0 votes
/ 04 November 2019

I'm building a microservice system with environment variables for both a local and a production environment.

I'm having a problem with the application: when I run it locally inside a container, it fails to reach a local application that runs on my machine.

These are the connection settings I use when running the application locally:

  "dev": {
    "scrapper_config_service": "http://0.0.0.0:8000",
    "margin_saver_service": "http://0.0.0.0:5000"
  }

When I run the application in a local container, scrapper_config_service refuses the connection.
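As far as I understand, inside a container 0.0.0.0 and 127.0.0.1 refer to the container itself, not to the machine it runs on, so the "dev" settings above cannot reach a service on the host. A sketch of what a container-specific entry might look like instead, assuming Docker Desktop's host.docker.internal alias is available (the "dev_docker" key is a name I made up):

  "dev_docker": {
    "scrapper_config_service": "http://host.docker.internal:8000",
    "margin_saver_service": "http://host.docker.internal:5000"
  }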

This is the Dockerfile I use to run my application in a container:

FROM python:3.7
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
EXPOSE 8002
ENV PYTHONPATH="$PYTHONPATH:/app"
CMD python ./scrapper_service.py
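Note that EXPOSE only documents the port; it does not publish it and has no effect on outbound connections from the container. On Linux, one option I'm considering (a sketch, not a confirmed fix; "scrapper-image" is a placeholder tag) is to share the host's network namespace, so that 127.0.0.1 inside the container is the host itself and port publishing is unnecessary:

docker run --network=host scrapper-image

This is the exception log from inside the container: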

SCRAPPER: b'Traceback (most recent call last):'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 184, in crawl'
SCRAPPER: b'    return self._crawl(crawler, *args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 188, in _crawl'
SCRAPPER: b'    d = crawler.crawl(*args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 1613, in unwindGenerator'
SCRAPPER: b'    return _cancellableInlineCallbacks(gen)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 1529, in _cancellableInlineCallbacks'
SCRAPPER: b'    _inlineCallbacks(None, g, status)'
SCRAPPER: b'--- <exception caught here> ---'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks'
SCRAPPER: b'    result = g.send(result)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 85, in crawl'
SCRAPPER: b'    self.spider = self._create_spider(*args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 108, in _create_spider'
SCRAPPER: b'    return self.spidercls.from_crawler(self, *args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/spiders/__init__.py", line 50, in from_crawler'
SCRAPPER: b'    spider = cls(*args, **kwargs)'
SCRAPPER: b'  File "/app/scrapper/spiders/quote_spider.py", line 19, in __init__'
SCRAPPER: b'    resp = requests.get(url)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 75, in get'
SCRAPPER: b"    return request('get', url, params=params, **kwargs)"
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 60, in request'
SCRAPPER: b'    return session.request(method=method, url=url, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 533, in request'
SCRAPPER: b'    resp = self.send(prep, **send_kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 646, in send'
SCRAPPER: b'    r = adapter.send(request, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/adapters.py", line 516, in send'
SCRAPPER: b'    raise ConnectionError(e, request=request)'
SCRAPPER: b"requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8000): Max retries exceeded with url: /banks (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f9d5ae03950>: Failed to establish a new connection: [Errno 111] Connection refused'))"
SCRAPPER: b''
SCRAPPER: b'2019-11-03 21:20:44 [twisted] CRITICAL:'
SCRAPPER: b'Traceback (most recent call last):'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 157, in _new_conn'
SCRAPPER: b'    (self._dns_host, self.port), self.timeout, **extra_kw'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/util/connection.py", line 84, in create_connection'
SCRAPPER: b'    raise err'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/util/connection.py", line 74, in create_connection'
SCRAPPER: b'    sock.connect(sa)'
SCRAPPER: b'ConnectionRefusedError: [Errno 111] Connection refused'
SCRAPPER: b''
SCRAPPER: b'During handling of the above exception, another exception occurred:'
SCRAPPER: b''
SCRAPPER: b'Traceback (most recent call last):'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 672, in urlopen'
SCRAPPER: b'    chunked=chunked,'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 387, in _make_request'
SCRAPPER: b'    conn.request(method, url, **httplib_request_kw)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/http/client.py", line 1244, in request'
SCRAPPER: b'    self._send_request(method, url, body, headers, encode_chunked)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/http/client.py", line 1290, in _send_request'
SCRAPPER: b'    self.endheaders(body, encode_chunked=encode_chunked)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/http/client.py", line 1239, in endheaders'
SCRAPPER: b'    self._send_output(message_body, encode_chunked=encode_chunked)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/http/client.py", line 1026, in _send_output'
SCRAPPER: b'    self.send(msg)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/http/client.py", line 966, in send'
SCRAPPER: b'    self.connect()'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 184, in connect'
SCRAPPER: b'    conn = self._new_conn()'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 169, in _new_conn'
SCRAPPER: b'    self, "Failed to establish a new connection: %s" % e'
SCRAPPER: b'urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f9d5ae03950>: Failed to establish a new connection: [Errno 111] Connection refused'
SCRAPPER: b''
SCRAPPER: b'During handling of the above exception, another exception occurred:'
SCRAPPER: b''
SCRAPPER: b'Traceback (most recent call last):'
2019-11-03 21:20:44 [werkzeug] INFO: 172.17.0.1 - - [03/Nov/2019 21:20:44] "GET /scrapper HTTP/1.1" 200 -
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/adapters.py", line 449, in send'
SCRAPPER: b'    timeout=timeout'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 720, in urlopen'
SCRAPPER: b'    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/util/retry.py", line 436, in increment'
SCRAPPER: b'    raise MaxRetryError(_pool, url, error or ResponseError(cause))'
SCRAPPER: b"urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='0.0.0.0', port=8000): Max retries exceeded with url: /banks (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f9d5ae03950>: Failed to establish a new connection: [Errno 111] Connection refused'))"
SCRAPPER: b''
SCRAPPER: b'During handling of the above exception, another exception occurred:'
SCRAPPER: b''
SCRAPPER: b'Traceback (most recent call last):'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks'
SCRAPPER: b'    result = g.send(result)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 85, in crawl'
SCRAPPER: b'    self.spider = self._create_spider(*args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 108, in _create_spider'
SCRAPPER: b'    return self.spidercls.from_crawler(self, *args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/spiders/__init__.py", line 50, in from_crawler'
SCRAPPER: b'    spider = cls(*args, **kwargs)'
SCRAPPER: b'  File "/app/scrapper/spiders/quote_spider.py", line 19, in __init__'
SCRAPPER: b'    resp = requests.get(url)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 75, in get'
SCRAPPER: b"    return request('get', url, params=params, **kwargs)"
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 60, in request'
SCRAPPER: b'    return session.request(method=method, url=url, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 533, in request'
SCRAPPER: b'    resp = self.send(prep, **send_kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 646, in send'
SCRAPPER: b'    r = adapter.send(request, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/adapters.py", line 516, in send'
SCRAPPER: b'    raise ConnectionError(e, request=request)'
SCRAPPER: b"requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8000): Max retries exceeded with url: /banks (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f9d5ae03950>: Failed to establish a new connection: [Errno 111] Connection refused'))"
Starting new scrapping session at 2019-11-03 22:02:05.423748... 
Thread started!
SCRAPPER: b'2019-11-03 22:02:06 [scrapy.utils.log] INFO: Scrapy 1.8.0 started (bot: scrapper)'
SCRAPPER: b'2019-11-03 22:02:06 [scrapy.utils.log] INFO: Versions: lxml 4.4.1.0, libxml2 2.9.9, cssselect 1.1.0, parsel 1.5.2, w3lib 1.21.0, Twisted 19.7.0, Python 3.7.4 (default, Sep 12 2019, 15:40:15) - [GCC 8.3.0], pyOpenSSL 19.0.0 (OpenSSL 1.1.1d  10 Sep 2019), cryptography 2.8, Platform Linux-5.3.0-19-generic-x86_64-with-debian-10.1'
SCRAPPER: b"2019-11-03 22:02:06 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'scrapper', 'DOWNLOAD_DELAY': 2, 'NEWSPIDER_MODULE': 'scrapper.spiders', 'SPIDER_MODULES': ['scrapper.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36'}"
SCRAPPER: b'2019-11-03 22:02:06 [scrapy.extensions.telnet] INFO: Telnet Password: f6b9d46df8bc482d'
SCRAPPER: b'2019-11-03 22:02:06 [scrapy.middleware] INFO: Enabled extensions:'
SCRAPPER: b"['scrapy.extensions.corestats.CoreStats',"
SCRAPPER: b" 'scrapy.extensions.telnet.TelnetConsole',"
SCRAPPER: b" 'scrapy.extensions.memusage.MemoryUsage',"
SCRAPPER: b" 'scrapy.extensions.logstats.LogStats',"
SCRAPPER: b" 'spidermon.contrib.scrapy.extensions.Spidermon']"
SCRAPPER: b'2019-11-03 22:02:06 [urllib3.connectionpool] DEBUG: Starting new HTTP connection (1): 0.0.0.0:8000'
SCRAPPER: b'Unhandled error in Deferred:'
SCRAPPER: b'2019-11-03 22:02:06 [twisted] CRITICAL: Unhandled error in Deferred:'
SCRAPPER: b''
SCRAPPER: b'Traceback (most recent call last):'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 184, in crawl'
SCRAPPER: b'    return self._crawl(crawler, *args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 188, in _crawl'
SCRAPPER: b'    d = crawler.crawl(*args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 1613, in unwindGenerator'
SCRAPPER: b'    return _cancellableInlineCallbacks(gen)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 1529, in _cancellableInlineCallbacks'
SCRAPPER: b'    _inlineCallbacks(None, g, status)'
SCRAPPER: b'--- <exception caught here> ---'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks'
SCRAPPER: b'    result = g.send(result)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 85, in crawl'
SCRAPPER: b'    self.spider = self._create_spider(*args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/crawler.py", line 108, in _create_spider'
SCRAPPER: b'    return self.spidercls.from_crawler(self, *args, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/scrapy/spiders/__init__.py", line 50, in from_crawler'
SCRAPPER: b'    spider = cls(*args, **kwargs)'
SCRAPPER: b'  File "/app/scrapper/spiders/quote_spider.py", line 19, in __init__'
SCRAPPER: b'    resp = requests.get(url)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 75, in get'
SCRAPPER: b"    return request('get', url, params=params, **kwargs)"
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 60, in request'
SCRAPPER: b'    return session.request(method=method, url=url, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 533, in request'
SCRAPPER: b'    resp = self.send(prep, **send_kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 646, in send'
SCRAPPER: b'    r = adapter.send(request, **kwargs)'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/requests/adapters.py", line 516, in send'
SCRAPPER: b'    raise ConnectionError(e, request=request)'
SCRAPPER: b"requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8000): Max retries exceeded with url: /banks (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fccfa84e910>: Failed to establish a new connection: [Errno 111] Connection refused'))"
SCRAPPER: b''
SCRAPPER: b'2019-11-03 22:02:06 [twisted] CRITICAL:'
SCRAPPER: b'Traceback (most recent call last):'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 157, in _new_conn'
SCRAPPER: b'    (self._dns_host, self.port), self.timeout, **extra_kw'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/util/connection.py", line 84, in create_connection'
SCRAPPER: b'    raise err'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/util/connection.py", line 74, in create_connection'
SCRAPPER: b'    sock.connect(sa)'
SCRAPPER: b'ConnectionRefusedError: [Errno 111] Connection refused'
SCRAPPER: b''
SCRAPPER: b'During handling of the above exception, another exception occurred:'
SCRAPPER: b''
SCRAPPER: b'Traceback (most recent call last):'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 672, in urlopen'
SCRAPPER: b'    chunked=chunked,'
SCRAPPER: b'  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 387, in _make_request'
SCRAPPER: b'    conn.request(method, url, **httplib_request_kw)'

This exception is raised inside the container every time the app runs: the local Flask application, which listens on my local host on port 8000, cannot be found.

I googled the problem and tried to fix the connection from the local container by using the IP 0.0.0.0 instead of 127.0.0.1, but that does not seem to solve it.

However, my container works fine in the cloud, where it points to different IP addresses.
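Since the same image has to work on the host, in a local container, and in the cloud, one pattern I could switch to (a sketch under assumptions; SCRAPPER_CONFIG_URL is a variable name I made up, not part of the project) is to let an environment variable override the target URL:

import os
import requests

# Sketch: resolve the config service URL from the environment so the same
# image works everywhere. SCRAPPER_CONFIG_URL is an assumed variable name;
# the default matches running directly on the host.
base_url = os.environ.get("SCRAPPER_CONFIG_URL", "http://127.0.0.1:8000")
resp = requests.get(base_url + "/banks")

The variable could then be set per environment, e.g. docker run -e SCRAPPER_CONFIG_URL=http://host.docker.internal:8000 ... for a local container.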
