I have a Scala Spark project that I am trying to run on my Mac.
When I start the project in sbt, I get the following output:
lpuggini-pro13:stackoverflow lpuggini$ sbt
[residual] arg = '-sbt-create'
[process_args] java_version = '13'
[sbt_options] declare -a sbt_options='()'
[addMemory] arg = '1024'
[addJava] arg = '-Xms1024m'
[addJava] arg = '-Xmx1024m'
[addJava] arg = '-Xss4M'
[addJava] arg = '-XX:ReservedCodeCacheSize=128m'
[copyRt] java9_rt = '/Users/lpuggini/.sbt/0.13/java9-rt-ext-adoptopenjdk_13/rt.jar'
[addJava] arg = '-Dscala.ext.dirs=/Users/lpuggini/.sbt/0.13/java9-rt-ext-adoptopenjdk_13'
# Executing command line:
java
-Dfile.encoding=UTF-8
-Xms1024m
-Xmx1024m
-Xss4M
-XX:ReservedCodeCacheSize=128m
-Dscala.ext.dirs=/Users/lpuggini/.sbt/0.13/java9-rt-ext-adoptopenjdk_13
-jar
/usr/local/Cellar/sbt/1.3.1/libexec/bin/sbt-launch.jar
[info] Loading project definition from /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/project
[info] Compiling 8 Scala sources to /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/project/target/scala-2.10/sbt-0.13/classes...
[warn] /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/project/CommonBuild.scala:3: trait Build in package sbt is deprecated: Use .sbt format instead
[warn] trait CommonBuild extends Build {
[warn] ^
[warn] one warning found
error: error while loading String, class file '/Library/Java/JavaVirtualMachines/adoptopenjdk-13.jdk/Contents/Home(java/lang/String.class)' is broken
(class java.lang.NullPointerException/null)
[info] Set current project to bigdata-stackoverflow (in build file:/Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/)
>
Note the error message: error: error while loading String, class file '/Library/Java/JavaVirtualMachines/adoptopenjdk-13.jdk/Contents/Home(java/lang/String.class)' is broken
I think this is what later causes the failure in the code below:
> console
[info] Compiling 2 Scala sources to /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/target/scala-2.11/classes...
[info] Starting scala interpreter...
[info]
Welcome to Scala 2.11.12 (OpenJDK 64-Bit Server VM, Java 13).
Type in expressions for evaluation. Or try :help.
scala> import org.apache.spark.SparkConf
import org.apache.spark.SparkConf
scala> import org.apache.spark.SparkContext
import org.apache.spark.SparkContext
scala> import org.apache.spark.SparkContext._
import org.apache.spark.SparkContext._
scala> import org.apache.spark.rdd.RDD
import org.apache.spark.rdd.RDD
scala> import annotation.tailrec
import annotation.tailrec
scala> import scala.reflect.ClassTag
import scala.reflect.ClassTag
scala> val conf: SparkConf = new SparkConf().setMaster("local").setAppName("StackOverflow")
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/lpuggini/Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/org/apache/spark/spark-unsafe_2.11/2.4.3/spark-unsafe_2.11-2.4.3.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@33f9678b
scala> val sc: SparkContext = new SparkContext(conf)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/09/22 15:56:28 INFO SparkContext: Running Spark version 2.4.3
19/09/22 15:56:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3720)
at java.base/java.lang.String.substring(String.java:1909)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:50)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:293)
... 42 elided
scala>
How can I fix this?
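
For reference, here is the code from the console session above collected into a single standalone snippet (just a sketch to make the repro easier to copy; the SparkApp object name is mine, the master and app name are taken verbatim from the transcript):

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

// Minimal standalone version of the console session above.
// On Java 13 the SparkContext constructor is the step that throws
// the StringIndexOutOfBoundsException for me.
object SparkApp {
  def main(args: Array[String]): Unit = {
    val conf: SparkConf = new SparkConf()
      .setMaster("local")
      .setAppName("StackOverflow")
    val sc: SparkContext = new SparkContext(conf) // fails here
    sc.stop()
  }
}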