I am trying to set up a standalone Spark cluster on my Windows machine and run a Scala application (packaged as a jar) that simply reads a 100 MB CSV file and writes it to another location. I can start the master and two slaves with 8 cores and 6.9 GB of RAM each. However, when I try to run the application, the master launches executors, but the application itself never starts.
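For context, I started the master and the two workers roughly like this (a sketch: the master address matches the logs below, but the exact flags I used may have differed slightly):

```
:: master (run from %SPARK_HOME%\bin on Windows)
spark-class org.apache.spark.deploy.master.Master --host 192.168.8.102 --port 7077

:: each worker (run in two separate consoles to get two workers)
spark-class org.apache.spark.deploy.worker.Worker spark://192.168.8.102:7077
```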
Here are the master's logs:
2019-05-10 18:29:36 INFO Utils:54 - Successfully started service on port 6066.
2019-05-10 18:29:36 INFO StandaloneRestServer:54 - Started REST server for submitting applications on port 6066
2019-05-10 18:29:36 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@a767ba6{/metrics/master/json,null,AVAILABLE,@Spark}
2019-05-10 18:29:36 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a51c966{/metrics/applications/json,null,AVAILABLE,@Spark}
2019-05-10 18:29:36 INFO Master:54 - I have been elected leader! New state: ALIVE
2019-05-10 18:40:33 INFO Master:54 - Registering worker 192.168.8.102:50470 with 8 cores, 6.9 GB RAM
2019-05-10 18:40:42 INFO Master:54 - Registering worker 192.168.8.102:50523 with 8 cores, 6.9 GB RAM
2019-05-10 18:42:39 INFO Master:54 - Driver submitted org.apache.spark.deploy.worker.DriverWrapper
2019-05-10 18:42:39 INFO Master:54 - Launching driver driver-20190510184239-0000 on worker worker-20190510184041-192.168.8.102-50523
2019-05-10 18:42:50 WARN TransportChannelHandler:78 - Exception in connection from /192.168.8.102:50746
java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1106)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:343)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
2019-05-10 18:42:50 WARN TransportChannelHandler:78 - Exception in connection from /192.168.8.102:50791
java.io.IOException: An existing connection was forcibly closed by the remote host
    (same stack trace as above)
2019-05-10 18:42:50 INFO Master:54 - 192.168.8.102:50746 got disassociated, removing it.
2019-05-10 18:42:50 INFO Master:54 - 192.168.8.102:50791 got disassociated, removing it.
2019-05-10 18:42:50 INFO Master:54 - 192.168.8.102:50790 got disassociated, removing it.
2019-05-10 18:42:56 INFO Master:54 - Registering app spark session example
2019-05-10 18:42:56 INFO Master:54 - Registered app spark session example with ID app-20190510184256-0000
2019-05-10 18:42:56 INFO Master:54 - Launching executor app-20190510184256-0000/0 on worker worker-20190510184032-192.168.8.102-50470
2019-05-10 18:43:08 INFO Master:54 - Removing executor app-20190510184256-0000/0 because it is EXITED
2019-05-10 18:43:08 INFO Master:54 - Launching executor app-20190510184256-0000/1 on worker worker-20190510184032-192.168.8.102-50470
2019-05-10 18:43:20 INFO Master:54 - Removing executor app-20190510184256-0000/1 because it is EXITED
And here are the slave's logs:
2019-05-10 18:40:32 INFO WorkerWebUI:54 - Bound WorkerWebUI to 0.0.0.0, and started at http://ws-tujain.ivp.co.in:8081
2019-05-10 18:40:32 INFO Worker:54 - Connecting to master 192.168.8.102:7077...
2019-05-10 18:40:32 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2829958e{/metrics/json,null,AVAILABLE,@Spark}
2019-05-10 18:40:32 INFO TransportClientFactory:267 - Successfully created connection to /192.168.8.102:7077 after 81 ms (0 ms spent in bootstraps)
2019-05-10 18:40:33 INFO Worker:54 - Successfully registered with master spark://192.168.8.102:7077
2019-05-10 18:42:56 INFO Worker:54 - Asked to launch executor app-20190510184256-0000/0 for spark session example
2019-05-10 18:42:56 INFO SecurityManager:54 - Changing view acls to: tujain
2019-05-10 18:42:56 INFO SecurityManager:54 - Changing modify acls to: tujain
2019-05-10 18:42:56 INFO SecurityManager:54 - Changing view acls groups to:
2019-05-10 18:42:56 INFO SecurityManager:54 - Changing modify acls groups to:
2019-05-10 18:42:56 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tujain); groups with view permissions: Set(); users with modify permissions: Set(tujain); groups with modify permissions: Set()
2019-05-10 18:42:56 INFO ExecutorRunner:54 - Launch command: "C:\Program Files\Java\jdk1.8.0_131\bin\java" "-cp" "C:\Spark\bin\..\conf\;C:\Spark\jars\*" "-Xmx6144M" "-Dspark.driver.port=50875" "-Dspark.network.timeout=300s" "-Dspark.rpc.askTimeout=10s" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@ws-tujain.ivp.co.in:50875" "--executor-id" "0" "--hostname" "192.168.8.102" "--cores" "8" "--app-id" "app-20190510184256-0000" "--worker-url" "spark://Worker@192.168.8.102:50470"
2019-05-10 18:43:08 INFO Worker:54 - Executor app-20190510184256-0000/0 finished with state EXITED message Command exited with code 1 exitStatus 1
2019-05-10 18:43:08 INFO Worker:54 - Asked to launch executor app-20190510184256-0000/1 for spark session example
2019-05-10 18:43:08 INFO SecurityManager:54 - Changing view acls to: tujain
2019-05-10 18:43:08 INFO SecurityManager:54 - Changing modify acls to: tujain
2019-05-10 18:43:08 INFO SecurityManager:54 - Changing view acls groups to:
2019-05-10 18:43:08 INFO SecurityManager:54 - Changing modify acls groups to:
2019-05-10 18:43:08 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tujain); groups with view permissions: Set(); users with modify permissions: Set(tujain); groups with modify permissions: Set()
2019-05-10 18:43:08 INFO ExecutorRunner:54 - Launch command: "C:\Program Files\Java\jdk1.8.0_131\bin\java" "-cp" "C:\Spark\bin\..\conf\;C:\Spark\jars\*" "-Xmx6144M" "-Dspark.driver.port=50875" "-Dspark.network.timeout=300s" "-Dspark.rpc.askTimeout=10s" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@ws-tujain.ivp.co.in:50875" "--executor-id" "1" "--hostname" "192.168.8.102" "--cores" "8" "--app-id" "app-20190510184256-0000" "--worker-url" "spark://Worker@192.168.8.102:50470"
2019-05-10 18:43:20 INFO Worker:54 - Executor app-20190510184256-0000/1 finished with state EXITED message Command exited with code 1 exitStatus 1
2019-05-10 18:43:20 INFO Worker:54 - Asked to launch executor app-20190510184256-0000/2 for spark session example
2019-05-10 18:43:20 INFO SecurityManager:54 - Changing view acls to: tujain
2019-05-10 18:43:20 INFO SecurityManager:54 - Changing modify acls to: tujain
2019-05-10 18:43:20 INFO SecurityManager:54 - Changing view acls groups to:
2019-05-10 18:43:20 INFO SecurityManager:54 - Changing modify acls groups to:
Application code:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
  val conf = new SparkConf().setMaster("spark://192.168.8.102:7077")
    .set("spark.executor.memory", "6g")
    .set("spark.driver.memory", "6g")
    .set("spark.eventLog.enabled", "true")
    .set("spark.network.timeout", "300s")
  val spark = SparkSession.builder.config(conf = conf)
    .appName("spark session example").getOrCreate()
  val df = spark.read.format("csv")
    .load("C:\\work\\spark\\Dummy100MBFile.csv")
  df.show(10)
  df.write.format("com.databricks.spark.csv")
    .save("C:\\work\\spark\\output")
}
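I submitted the jar roughly as follows (the class and jar names here are placeholders, not my real ones; the DriverWrapper line in the master log is what makes me think it went through cluster deploy mode on the REST port 6066):

```
spark-submit --master spark://192.168.8.102:6066 ^
  --deploy-mode cluster ^
  --class com.example.SparkSessionExample ^
  C:\work\spark\spark-app.jar
```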
I expected the application to partition the data and write it out as multiple files (2 files to be exact, since there are 2 workers). Instead, the master keeps removing and relaunching executors.
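(As a side note on that expectation: as I understand it, the number of part files is driven by the DataFrame's partition count rather than the worker count, so forcing exactly 2 files would look something like this sketch, reusing `df` from the code above:)

```scala
// Repartition to 2 before writing; a single 100 MB CSV may be read as
// only one partition, so the default write could produce a single part
// file regardless of how many workers are in the cluster.
df.repartition(2)
  .write
  .format("csv")
  .save("C:\\work\\spark\\output")
```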
EDIT: after manually running the command below:

"C:\Program Files\Java\jdk1.8.0_131\bin\java" "-cp" "C:\Spark\bin\..\conf\;C:\Spark\jars\*" "-Xmx6144M" "-Dspark.driver.port=50875" "-Dspark.network.timeout=300s" "-Dspark.rpc.askTimeout=10s" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@ws-tujain.ivp.co.in:50875" "--executor-id" "0" "--hostname" "192.168.8.102" "--cores" "8" "--app-id" "app-20190510184256-0000" "--worker-url" "spark://Worker@192.168.8.102:50470"
Click here for the result of the manual attempt.