ClassNotFoundException when publishing messages to Kafka from a Spark application
0 votes
asked 28 January 2020

The Spark application seems to fail with the error below at the moment it tries to send messages to Kafka. The error does not happen every time, only very rarely. Is there a compatibility problem with the jars being passed?

py4j.protocol.Py4JJavaError: An error occurred while calling o26614.save.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 4 in stage 384.0 failed 4 times, most recent failure: Lost task 4.3 in stage 384.0 (TID 5456, ip-10-164-227-204.ec2.internal, executor 2): java.lang.ClassNotFoundException: org.apache.spark.sql.kafka010.KafkaWriter$$anonfun$write$1
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1867)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1750)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2041)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1572)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2286)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2210)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2068)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1572)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2286)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2210)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2068)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1572)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2286)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2210)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2068)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1572)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
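
For context, the write is a plain batch write to Kafka through the DataFrame API; below is a minimal sketch of the kind of call that ends in the save shown in the trace (the broker, topic, and data are placeholders, not the actual job):

    # Minimal sketch of the kind of batch write that ends in the failing .save();
    # broker, topic, and data below are placeholders, not the real job.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import to_json, struct

    spark = SparkSession.builder.appName("kafka-writer").getOrCreate()

    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "payload"])

    (df.select(to_json(struct("id", "payload")).alias("value"))
       .write
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
       .option("topic", "events")                           # placeholder topic
       .save())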

Cluster: AWS EMR

Jars: kafka-clients-0.10.0.1.jar, spark-sql-kafka-0-10_2.11-2.3.0.jar, spark-streaming-kafka-0-8_2.11-2.0.0.jar

spark-submit --version: 2.4.4

./kafka-topics.sh --version: 2.1.0
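
If it is a jar compatibility issue, one thing I am assuming might help (not a confirmed fix) is to let Spark resolve a Kafka connector built for the running version (2.4.4, Scala 2.11) instead of shipping the 2.3.0 jar by hand, roughly like this:

    # Sketch of pinning the Kafka connector to the running Spark version (2.4.4,
    # Scala 2.11) via spark.jars.packages instead of passing the 2.3.0 jar manually.
    # The Maven coordinate is an assumed candidate fix, not a verified resolution;
    # it has to be set before the SparkSession is created.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("kafka-writer")
             .config("spark.jars.packages",
                     "org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.4")
             .getOrCreate())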

Please let me know if any additional information is needed.
