This is the command I picked up from here: https://github.com/SparklineData/spark-druid-olap/wiki/Generating-Denormalized-TPCH-Dataset
bin/spark-submit \
  --packages com.databricks:spark-csv_2.10:1.1.0,com.sparklinedata:spark-datetime:0.0.2,SparklineData:spark-druid-olap:0.1.0 \
  --class org.sparklinedata.tpch.TpchGenMain \
  /home/temp/tpch-spark-druid/tpchData/target/scala-2.10/tpchdata_2.10-0.0.1.jar \
  --baseDir /home/temp/tpch-spark-data --scale 1
I got this error:
==== spark-packages: tried
  http://dl.bintray.com/spark-packages/maven/com/github/SparklineData/spark-datetime/bf5693a575a1dea5b663e4e8b30a0ba94c21d62d/spark-datetime-bf5693a575a1dea5b663e4e8b30a0ba94c21d62d.pom
  -- artifact com.github.SparklineData#spark-datetime;bf5693a575a1dea5b663e4e8b30a0ba94c21d62d!spark-datetime.jar:
  http://dl.bintray.com/spark-packages/maven/com/github/SparklineData/spark-datetime/bf5693a575a1dea5b663e4e8b30a0ba94c21d62d/spark-datetime-bf5693a575a1dea5b663e4e8b30a0ba94c21d62d.jar
::::::::::::::::::::::::::::::::::::::::::::::
::          UNRESOLVED DEPENDENCIES         ::
::::::::::::::::::::::::::::::::::::::::::::::
:: com.sparklinedata#spark-datetime;0.0.3: not found
:: com.github.SparklineData#spark-datetime;bf5693a575a1dea5b663e4e8b30a0ba94c21d62d: not found
::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.sparklinedata#spark-datetime;0.0.3: not found, unresolved dependency: com.github.SparklineData#spark-datetime;bf5693a575a1dea5b663e4e8b30a0ba94c21d62d: not found]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala)
    at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:53)
    at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:364)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
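The failure seems to be only about resolving com.sparklinedata:spark-datetime (version 0.0.3 is pulled in even though I asked for 0.0.2). One workaround I am considering, but have not verified, is to drop spark-datetime from --packages and pass a locally built jar with --jars instead; the jar path below is just a placeholder for wherever that build would end up:

# Untested sketch, not from the wiki: supply spark-datetime as a local jar
# instead of resolving it from spark-packages (path below is hypothetical).
bin/spark-submit \
  --packages com.databricks:spark-csv_2.10:1.1.0,SparklineData:spark-druid-olap:0.1.0 \
  --jars /home/temp/spark-datetime/target/scala-2.10/spark-datetime_2.10-0.0.3.jar \
  --class org.sparklinedata.tpch.TpchGenMain \
  /home/temp/tpch-spark-druid/tpchData/target/scala-2.10/tpchdata_2.10-0.0.1.jar \
  --baseDir /home/temp/tpch-spark-data --scale 1

Is that the right way to handle this, or is there a way to make the --packages resolution itself work?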