I can run Docker containers successfully on Ubuntu machines, but the same Docker setup fails on Macs. I have tried it on two Macs, and the error messages are identical.
```
spark-worker_1 | java.net.UnknownHostException: docker-desktop: docker-desktop: Name does not resolve
spark-worker_1 |     at java.net.InetAddress.getLocalHost(InetAddress.java:1506)
spark-worker_1 |     at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:946)
spark-worker_1 |     at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:939)
spark-worker_1 |     at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:939)
spark-worker_1 |     at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:1003)
spark-worker_1 |     at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:1003)
spark-worker_1 |     at scala.Option.getOrElse(Option.scala:121)
spark-worker_1 |     at org.apache.spark.util.Utils$.localHostName(Utils.scala:1003)
spark-worker_1 |     at org.apache.spark.deploy.worker.WorkerArguments.<init>(WorkerArguments.scala:31)
spark-worker_1 |     at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:778)
spark-worker_1 |     at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
spark-worker_1 | Caused by: java.net.UnknownHostException: docker-desktop: Name does not resolve
spark-worker_1 |     at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
spark-worker_1 |     at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:929)
spark-worker_1 |     at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1324)
spark-worker_1 |     at java.net.InetAddress.getLocalHost(InetAddress.java:1501)
spark-worker_1 |     ... 10 more
docker_spark-worker_1 exited with code 51
```
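For what it's worth, the failing call in the trace is Java's `InetAddress.getLocalHost()`, which resolves the container's own hostname (`docker-desktop` here) to an IP. The same check can be sketched in Python to probe resolution inside a container (`check_local_resolution` is a hypothetical helper name, not part of Spark):

```python
import socket

def check_local_resolution():
    """Mimic java.net.InetAddress.getLocalHost(): resolve this
    machine's own hostname to an IP address."""
    hostname = socket.gethostname()
    try:
        return hostname, socket.gethostbyname(hostname)
    except socket.gaierror:
        # Same situation as the UnknownHostException above: the
        # hostname has no entry in /etc/hosts and DNS can't resolve it.
        return hostname, None

name, addr = check_local_resolution()
print(f"{name} -> {addr}")
```

Running this inside the worker container (e.g. via `docker compose run`) should show whether the hostname resolves at all.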
Here is my `docker-compose.yml`:
```yaml
services:
  spark-master:
    build:
      context: ../../
      dockerfile: ./danalysis/docker/spark/Dockerfile
    image: spark:latest
    container_name: spark-master
    hostname: node-master
    ports:
      - "7077:7077"
    network_mode: host
    environment:
      - "SPARK_LOCAL_IP=node-master"
      - "SPARK_MASTER_PORT=7077"
      - "SPARK_MASTER_WEBUI_PORT=10080"
    command: "/start-master.sh"
    dns:
      - 192.168.1.1 # needed to reach a database instance outside the host the container runs on
  spark-worker:
    image: spark:latest
    environment:
      - "SPARK_MASTER=spark://node-master:7077"
      - "SPARK_WORKER_WEBUI_PORT=8080"
    command: "/start-worker.sh"
    ports:
      - 8080
    network_mode: host
    depends_on:
      - spark-master
    dns:
      - 192.168.1.1 # needed to reach a database instance outside the host the container runs on
```
**Edit:**
So I found a way to make it work by commenting out a couple of lines (shown below). Why do these two settings cause the problem?

Also, although the worker container now runs fine and connects to spark-master, it registers with some internal IP, 172.18.0.2, as you can see. That is not an address we normally see on our network; I think it is the IP of the Docker container, not the host.
![enter image description here](https://i.stack.imgur.com/jIAeo.png)
```yaml
    # network_mode: host
    depends_on:
      - spark-master
    # dns:
    #   - 192.168.1.1 # needed to reach a database instance outside the host the container runs on
```
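One workaround I am considering, but have not verified, assumes the custom `dns:` server simply cannot resolve the container's hostname: pin that hostname to localhost with `extra_hosts`, so `InetAddress.getLocalHost()` succeeds even with the `dns:` entries left enabled. A sketch of the fragment (the `docker-desktop` name is what the trace reports on my Macs):

```yaml
  spark-worker:
    image: spark:latest
    # Untested idea: map the container's own hostname to localhost so
    # the JVM's getLocalHost() lookup no longer depends on the custom DNS.
    extra_hosts:
      - "docker-desktop:127.0.0.1"
```

If this is the wrong approach, I would still like to understand why `network_mode: host` and `dns:` together break resolution only on the Mac machines.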