Apache Spark shuts down after successfully connecting to the master
0 votes
/ February 22, 2020

I am trying to connect to Spark from Java. Spark is running on CentOS 8 (8 cores, 16 GB RAM), reachable on port 8081, and I can browse the web UI on port 8080. I started Spark with the following commands:

./start-master.sh -h 0.0.0.0 -p 8081
./start-slave.sh spark://0.0.0.0:8081
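
One detail worth noting when reproducing this: started with -h 0.0.0.0, the master advertises spark://0.0.0.0:8081 as its URL, and a standalone master typically drops registration messages addressed to any other URL even when the TCP connection itself succeeds. A hedged variant that instead binds the master to the host's routable address (assuming 192.168.90.130 is the CentOS machine's address, as the client code below uses):

./start-master.sh -h 192.168.90.130 -p 8081
./start-slave.sh spark://192.168.90.130:8081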

I am using the following code to connect to Spark:

SparkConf sparkConf = new SparkConf();
sparkConf.setAppName("Test-App");
sparkConf.setMaster("spark://192.168.90.130:8081");
sparkConf.set("spark.testing.memory", "3147480000");
JavaSparkContext jsc = new JavaSparkContext(new SparkContext(sparkConf));
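
For completeness, here is a minimal self-contained sketch of the same connection with the driver's address made explicit (spark.driver.host is a standard Spark property; 192.168.90.200 is a purely hypothetical address for the Windows machine the driver runs on, visible in the temp paths of the log below):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ConnectionSketch {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf()
                .setAppName("Test-App")
                // Must match the master URL exactly as advertised at the top
                // of the master's web UI (port 8080 here).
                .setMaster("spark://192.168.90.130:8081")
                // Hypothetical address of the Windows machine running the
                // driver; the master and workers must be able to reach it back.
                .set("spark.driver.host", "192.168.90.200")
                .set("spark.testing.memory", "3147480000");
        // JavaSparkContext accepts a SparkConf directly; there is no need
        // to wrap a separately constructed SparkContext.
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);
        System.out.println("Connected, application id: " + jsc.sc().applicationId());
        jsc.stop();
    }
}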

The output looks like this:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/02/22 15:05:38 INFO SparkContext: Running Spark version 2.4.5
20/02/22 15:05:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/02/22 15:05:38 INFO SparkContext: Submitted application: Test-App
20/02/22 15:05:38 INFO SecurityManager: Changing view acls to: p.xxx
20/02/22 15:05:38 INFO SecurityManager: Changing modify acls to: p.xxx
20/02/22 15:05:38 INFO SecurityManager: Changing view acls groups to:
20/02/22 15:05:38 INFO SecurityManager: Changing modify acls groups to:
20/02/22 15:05:38 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(p.xxx); groups with view permissions: Set(); users  with modify 
permissions: Set(p.xxx); groups with modify permissions: Set()
20/02/22 15:05:39 INFO Utils: Successfully started service 'sparkDriver' on port 49877.
20/02/22 15:05:39 INFO SparkEnv: Registering MapOutputTracker
20/02/22 15:05:39 INFO SparkEnv: Registering BlockManagerMaster
20/02/22 15:05:39 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/02/22 15:05:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/02/22 15:05:39 INFO DiskBlockManager: Created local directory at C:\Users\p.xxx\AppData\Local\Temp\blockmgr-8cd94856-10e7-4657-895d-891fa87f69ab
20/02/22 15:05:39 INFO MemoryStore: MemoryStore started with capacity 1621.0 MB
20/02/22 15:05:39 INFO SparkEnv: Registering OutputCommitCoordinator
20/02/22 15:05:39 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/02/22 15:05:39 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://Aseman168-111.NoAvaran.com:4040
20/02/22 15:05:39 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.168.90.130:8081...
20/02/22 15:05:39 INFO TransportClientFactory: Successfully created connection to /192.168.90.130:8081 after 40 ms (0 ms spent in bootstraps)
20/02/22 15:05:59 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.168.90.130:8081...
20/02/22 15:06:19 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.168.90.130:8081...
20/02/22 15:06:39 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
20/02/22 15:06:39 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
20/02/22 15:06:39 INFO SparkUI: Stopped Spark web UI at http://Aseman168-111.NoAvaran.com:4040
20/02/22 15:06:39 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 49965.
20/02/22 15:06:39 INFO NettyBlockTransferService: Server created on Aseman168-111.NoAvaran.com:49965
20/02/22 15:06:39 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/02/22 15:06:39 INFO StandaloneSchedulerBackend: Shutting down all executors
20/02/22 15:06:39 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
20/02/22 15:06:39 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
20/02/22 15:06:39 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, Aseman168-111.NoAvaran.com, 49965, None)
20/02/22 15:06:39 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/02/22 15:06:39 INFO BlockManagerMasterEndpoint: Registering block manager Aseman168-111.NoAvaran.com:49965 with 1621.0 MB RAM, BlockManagerId(driver, Aseman168-111.NoAvaran.com, 49965, None)
20/02/22 15:06:39 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, Aseman168-111.NoAvaran.com, 49965, None)
20/02/22 15:06:39 INFO MemoryStore: MemoryStore cleared
20/02/22 15:06:39 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, Aseman168-111.NoAvaran.com, 49965, None)
20/02/22 15:06:39 INFO BlockManager: BlockManager stopped
20/02/22 15:06:39 INFO BlockManagerMaster: BlockManagerMaster stopped
20/02/22 15:06:39 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/02/22 15:06:40 INFO SparkContext: Successfully stopped SparkContext
20/02/22 15:06:40 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
        at com.test.spark.App.main(App.java:32)
20/02/22 15:06:40 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
        at com.test.spark.App.main(App.java:32)
20/02/22 15:06:40 INFO ShutdownHookManager: Shutdown hook called
20/02/22 15:06:40 INFO ShutdownHookManager: Deleting directory C:\Users\p.xxx\AppData\Local\Temp\spark-95ce7358-d4f3-4248-9803-6623a2cbf801

The short version of the output is that I get an error message saying:

20/02/22 15:06:40 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
        at com.test.spark.App.main(App.java:32)
20/02/22 15:06:40 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
        at com.test.spark.App.main(App.java:32)

What is this error, and how can I fix it?
