Packaging a JAR file with sbt for Spark
0 votes
07 October 2018

I am following the quick-start guide: https://spark.apache.org/docs/latest/quick-start.html. In the second-to-last code snippet I am supposed to use sbt to package my files into a .jar.

My build.sbt:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.2"
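
(Note the double %% here, as in the quick-start guide: it tells sbt to append the Scala binary version to the artifact name. As a sketch, with scalaVersion 2.11.8 these two lines resolve to the same artifact:)

// %% appends "_2.11" automatically; a single % requires the full artifact name
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.3.2"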

My SimpleApp.scala:

/* SimpleApp.scala */
import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/usr/local/spark/README.md" // Should be some file on your system
    val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
    val logData = spark.read.textFile(logFile).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    spark.stop()
  }
}

I put the files in the correct directories as instructed and run sbt package from /usr/local/spark/examples/, where my build.sbt is.

Then I get this very long error:

test@test-ThinkPad-X230:/usr/local/spark/examples$ sbt package
[info] Loading project definition from /usr/local/spark/examples/project
[info] Loading settings for project examples from build.sbt ...
[info] Set current project to Simple Project (in build file:/usr/local/spark/examples/)
[info] Updating ...
[info] downloading https://repo1.maven.org/maven2/org/apache/avro/avro/1.7.7/avro-1.7.7.jar ...
[info]  [SUCCESSFUL ] org.apache.avro#avro;1.7.7!avro.jar (322ms)
[info] Done updating.
[info] Compiling 188 Scala sources and 125 Java sources to /usr/local/spark/examples/target/scala-2.11/classes ...
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala:23:8: not found: object breeze
[error] import breeze.linalg.{DenseVector, Vector}
[error]        ^
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala:39:19: not found: type DenseVector
[error]     DataPoint(new DenseVector(nums.slice(1, D + 1)), nums(0))
[error]                   ^
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala:60:13: not found: value DenseVector
[error]     val w = DenseVector.fill(D) {2 * rand.nextDouble - 1}
[error]             ^
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala:65:22: not found: value DenseVector
[error]       val gradient = DenseVector.zeros[Double](D)
[error]                      ^
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalKMeans.scala:26:8: not found: object breeze
[error] import breeze.linalg.{squaredDistance, DenseVector, Vector}
[error]        ^
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalKMeans.scala:42:27: not found: type DenseVector
[error]   def generateData: Array[DenseVector[Double]] = {
[error]                           ^
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalKMeans.scala:43:32: not found: type DenseVector
[error]     def generatePoint(i: Int): DenseVector[Double] = {
[error]                                ^
[error] /usr/local/spark/examples/src/main/scala/org/apache/spark/examples/LocalKMeans.scala:44:7: not found: value DenseVector
[error]       DenseVector.fill(D) {rand.nextDouble * R}
[error]       ^

And it goes on and on like this. I can't tell what I did wrong.

1 Answer

0 votes
07 October 2018

Found the problem: build.sbt needed to be in /spark/ instead of /spark/examples. Running sbt package from /usr/local/spark/examples made sbt compile every example source under examples/src/main/scala alongside my own file, and those examples depend on libraries such as breeze that are not in my build.sbt, which is exactly what the errors above show.
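
For reference, the quick-start guide expects a self-contained application to live in a clean project directory containing only build.sbt and the source tree. A sketch of that layout and the package/submit steps, assuming Spark is installed under /usr/local/spark as in the question:

# Expected layout of the standalone project directory
$ find .
./build.sbt
./src/main/scala/SimpleApp.scala

# Package the application into a jar, then run it with spark-submit
$ sbt package
$ /usr/local/spark/bin/spark-submit \
    --class "SimpleApp" \
    --master local[4] \
    target/scala-2.11/simple-project_2.11-1.0.jar

The jar name follows from the build settings: name := "Simple Project" is normalized to simple-project, with the Scala binary version and version := "1.0" appended.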
