I am new to Spark and want to write a Spark application. I tried to create a SparkSession with the following Java code:
package Model;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class Get_Data_From_MySQL {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf()
                .setAppName("Load_Data_From_MySQL")
                .setMaster("local[*]");
        SparkSession spark = SparkSession
                .builder()
                .appName("Application Name")
                .config(sparkConf)
                .getOrCreate();
    }
}
I get this error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.internal.Logging.$init$(Lorg/apache/spark/internal/Logging;)V
    at org.apache.spark.sql.SparkSession$.<init>(SparkSession.scala:731)
    at org.apache.spark.sql.SparkSession$.<clinit>(SparkSession.scala)
    at org.apache.spark.sql.SparkSession.builder(SparkSession.scala)
    at Model.Get_Data_From_MySQL.main(Get_Data_From_MySQL.java:15)
My Spark version is 3.0.0-preview2.

My pom.xml file:
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.2.3</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.22</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.0.0-preview2</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>3.2.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>3.0.10</version>
    </dependency>
</dependencies>
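For context, once the session starts, what I ultimately want to do is read a table from MySQL over JDBC, roughly like the sketch below (the URL, table name, and credentials are placeholders, not my real settings):

// Sketch of the intended MySQL read inside main(); requires
// import org.apache.spark.sql.Dataset; and import org.apache.spark.sql.Row;
Dataset<Row> df = spark.read()
        .format("jdbc")
        .option("url", "jdbc:mysql://localhost:3306/test") // placeholder URL
        .option("dbtable", "some_table")                   // placeholder table
        .option("user", "root")                            // placeholder user
        .option("password", "secret")                      // placeholder password
        .load();
df.show();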