I am trying to build Spark from the master branch with ./build/sbt clean package
because I want to test something specific in the spark-avro submodule. However, when I launch ./bin/spark-shell
and try:
scala> import org.apache.spark.sql.avro._
I get: object avro is not a member of package org.apache.spark.sql
Am I missing a build option needed to test spark-avro? I couldn't find much about this in the docs.
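For reference, the full sequence I run from the root of the source checkout is roughly this (nothing unusual beyond what the shell output below shows):

./build/sbt clean package                    # build Spark from the source tree
./bin/spark-shell                            # launch the shell against that build
scala> import org.apache.spark.sql.avro._    # fails as shown below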
test@tests/spark$ ./bin/spark-shell
NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes ahead of assembly.
20/02/21 14:28:52 WARN Utils: Your hostname, pascals resolves to a loopback address: 127.0.1.1; using 192.168.0.11 instead (on interface enp0s31f6)
20/02/21 14:28:52 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/02/21 14:28:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.11:4040
Spark context available as 'sc' (master = local[*], app id = local-1582291738090).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-SNAPSHOT
      /_/
Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_242)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.apache.spark.sql.avro.SchemaConverters
<console>:23: error: object avro is not a member of package org.apache.spark.sql
import org.apache.spark.sql.avro.SchemaConverters
Any help is appreciated!