I tried to launch spark-shell, but an error occurred while initializing the Spark context.
hadoop@vm-10-155-208-86:~/spark/conf$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2018-06-06 14:01:42 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
2018-06-06 14:04:06 WARN YarnSchedulerBackend$YarnSchedulerEndpoint:66 - Attempted to request executors before the AM has registered!
2018-06-06 14:04:06 WARN MetricsSystem:66 - Stopping a MetricsSystem that is not running
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
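The error message alone does not say why the ApplicationMaster failed. My understanding is that the YARN application logs should contain the real cause; this is roughly what I would run to pull them (the application ID below is a placeholder, and it assumes YARN log aggregation is enabled):
hadoop@vm-10-155-208-86:~/spark/conf$ yarn application -list -appStates ALL    # find the ID of the failed application
hadoop@vm-10-155-208-86:~/spark/conf$ yarn logs -applicationId <applicationId>    # dump the ApplicationMaster/container logs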
Below is my spark-defaults.conf configuration file:
hadoop@vm-10-155-208-86:~/spark/conf$ more spark-defaults.conf
spark.master yarn
spark.driver.memory 512m
spark.eventLog.enabled true
spark.eventLog.dir hdfs://master:9000/spark-logs
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.fs.logDirectory hdfs://master:9000/spark-logs
spark.history.fs.update.interval 10s
spark.history.ui.port 18080
spark.yarn.jars hdfs://master:9000/user/spark/share/lib/*.jar
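In case it matters, here is a quick sanity check on the spark.yarn.jars path above (the path is copied verbatim from spark-defaults.conf); as far as I know, a missing or unreadable jars directory on HDFS is one common reason the ApplicationMaster fails to launch:
hadoop@vm-10-155-208-86:~/spark/conf$ hdfs dfs -ls hdfs://master:9000/user/spark/share/lib/ | head    # confirm the Spark jars are actually present and readable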
Any help would be appreciated.