I would like to build a jar with Spark 2.4.5. I defined my build.sbt
to include Spark 2.4.5 and Breeze as follows:
name := "trialLibrary"
version := "1.0"
scalaVersion := "2.11.12"
val sparkVersion = "2.4.5"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion,
"org.scalanlp" %% "breeze" % "1.0",
"org.scalanlp" %% "breeze-natives" % "1.0",
)
However, when I compile with sbt package
I get many library dependency conflicts; running evicted
in the sbt console prints:
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn] +- org.apache.spark:spark-core_2.11:2.4.5 (depends on 3.9.9.Final)
[warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final)
[warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over {1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9, 1.3.9}
[warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2)
[warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2)
[warn] +- org.apache.spark:spark-unsafe_2.11:2.4.5 (depends on 1.3.9)
[warn] +- org.apache.spark:spark-network-common_2.11:2.4.5 (depends on 1.3.9)
[warn] +- org.apache.spark:spark-core_2.11:2.4.5 (depends on 1.3.9)
[warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9)
[warn] * com.google.guava:guava:16.0.1 is selected over {11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 11.0.2, 12.0.1}
[warn] +- org.apache.curator:curator-framework:2.6.0 (depends on 16.0.1)
[warn] +- org.apache.curator:curator-recipes:2.6.0 (depends on 16.0.1)
[warn] +- org.apache.curator:curator-client:2.6.0 (depends on 16.0.1)
[warn] +- org.htrace:htrace-core:3.0.4 (depends on 12.0.1)
[warn] +- org.apache.hadoop:hadoop-yarn-server-nodemanager:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-server-common:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-common:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-client:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-api:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 11.0.2)
[info] Here are other dependency conflicts that were resolved:
[info] * log4j:log4j:1.2.17 is selected over 1.2.16
[info] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 1.2.17)
...
I have omitted most of the [info] messages
that are printed after the line "Here are other dependency conflicts that were resolved".
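For completeness, this is roughly the session that produced the report above (the exact prompt text may differ depending on the sbt version):

$ sbt
sbt:trialLibrary> package
sbt:trialLibrary> evicted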
I have also tried changing scalaVersion to 2.12.10, but without success.
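That attempt was nothing more than the one-line change below (reconstructed here; the rest of build.sbt stayed the same), so that the %% dependencies resolve to the _2.12 artifacts of Spark and Breeze instead:

scalaVersion := "2.12.10"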
I also tried replacing the Spark entries in build.sbt with
"org.apache.spark" % "spark-core_2.11" % "2.4.5",
"org.apache.spark" % "spark-sql_2.11" % "2.4.5"
but nothing changed.
Is there a way to fix this?