Spark SQL on Windows 10

Asked 03 October 2019

I'm trying to use Structured Streaming with Apache Bahir and Spark, starting from the MQTTStreamWordCount example source:

import org.apache.spark.sql.SparkSession

val warehouseLocation = "file://C:/dev_env/scala/"
val sess = SparkSession
            .builder
            .appName("MQTTStreamWordCount")
            .master("local[4]")
            .config("spark.sql.warehouse.dir", warehouseLocation)
            .getOrCreate()

getOrCreate() throws this exception:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.internal.SessionState':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
    at MqttSQL$.main(MqttSQL.scala:19)
    at MqttSQL.main(MqttSQL.scala)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    ... 12 more
Caused by: java.lang.AbstractMethodError
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
    at org.apache.spark.sql.internal.SharedState.initializeLogIfNecessary(SharedState.scala:38)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
    at org.apache.spark.sql.internal.SharedState.log(SharedState.scala:38)
    at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
    at org.apache.spark.sql.internal.SharedState.logInfo(SharedState.scala:38)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:69)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    ... 17 more

I'm using Windows 10 with Scala 2.11.1 and Spark 2.4.4. What am I missing?

EDIT: It turns out that build.sbt was missing this line:

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4"
...
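
For reference, a minimal build.sbt sketch with all Spark artifacts pinned to one version; the project name, the Scala patch version and the Bahir connector coordinates below are assumptions for illustration, not taken from the original post:

name := "mqtt-stream-wordcount"   // hypothetical project name

scalaVersion := "2.11.12"         // Scala 2.11.x line, matching Spark 2.4.x builds

val sparkVersion = "2.4.4"

libraryDependencies ++= Seq(
  // spark-core and spark-sql must resolve to the same Spark version
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion,
  // Apache Bahir MQTT connector for Structured Streaming (version assumed here)
  "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.4.0"
)

The java.lang.AbstractMethodError raised from SharedState's Logging trait is the usual symptom of Spark jars from different versions ending up on the classpath together, so the main point is keeping spark-core, spark-sql and the connector aligned on a single sparkVersion.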