When I use the YARN Java API to submit a Spark job to YARN, an error is thrown.
The master is yarn; the job runs fine with deploy-mode = cluster, but fails with deploy-mode = client.
The Java YARN API code that submits the Spark job is as follows:
public static String submitSpark(YarnSubmitConditions conditions) {
    List<String> args = Lists.newArrayList(
            "--jar", conditions.getApplicationJar(),
            "--class", conditions.getMainClass());
    if (conditions.getOtherArgs() != null && conditions.getOtherArgs().size() > 0) {
        for (String s : conditions.getOtherArgs()) {
            args.add("--arg");
            args.add(org.apache.commons.lang.StringUtils.join(new String[] { s }, ","));
        }
    }

    // identify that you will be using Spark as YARN mode
    System.setProperty("SPARK_YARN_MODE", "true");

    SparkConf sparkConf = new SparkConf();
    sparkConf.setSparkHome(conditions.getSparkHome());
    sparkConf.setMaster(conditions.getMaster());
    sparkConf.set("spark.submit.deployMode", conditions.getDeployMode());
    sparkConf.set("spark.driver.extraJavaOptions", "-XX:+TraceClassPaths");
    sparkConf.setAppName(conditions.getAppName());
    // --driver-memory
    sparkConf.set("spark.driver.memory", conditions.getDriverMemory());
    // --executor-memory
    sparkConf.set("spark.executor.memory", conditions.getExecutorMemory());
    // ... (other settings elided)

    ClientArguments cArgs = new ClientArguments(args.toArray(new String[args.size()]));
    org.apache.spark.deploy.yarn.Client client = new Client(cArgs, sparkConf);
    try {
        ApplicationId appId = client.submitApplication();
        // or client.run();
        return appId.toString();
    } catch (Exception e) {
        logger.error("Failed to submit Spark job", e);
        return null;
    } finally {
        if (client != null) {
            client.stop();
        }
    }
}
The error message is as follows:
19/01/14 16:01:51 ERROR ApplicationMaster: Failed to connect to driver at :0, retrying ...
19/01/14 16:01:51 ERROR ApplicationMaster: Failed to connect to driver at :0, retrying ...
19/01/14 16:01:51 ERROR ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Failed to connect to driver!
at org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:629)
at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:489)
at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:303)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$1.apply$mcV$sp(ApplicationMaster.scala:241)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$1.apply(ApplicationMaster.scala:241)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$1.apply(ApplicationMaster.scala:241)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:782)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:781)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:240)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:806)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:836)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
19/01/14 16:01:51 INFO ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: org.apache.spark.SparkException: Failed to connect to driver!)
19/01/14 16:01:51 INFO ShutdownHookManager: Shutdown hook called
From the error message above I found that spark.driver.host is empty and spark.driver.port is 0, so I modified the code and added this:
args.add("--arg");
args.add("192.168.0.141:20002");
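For comparison, the same driver address could be hinted through SparkConf instead of the --arg list. This is only a sketch of that alternative; the host and port values are the ones from my test above and are assumptions about where the driver JVM would be reachable:

```java
// Hypothetical alternative to the --arg approach above:
// tell the ApplicationMaster where to find the driver via configuration.
// Assumes the driver JVM is reachable at 192.168.0.141 on port 20002.
sparkConf.set("spark.driver.host", "192.168.0.141");
sparkConf.set("spark.driver.port", "20002");
```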
Then I ran the test code again with the following command:
java -cp ./sparkjars/*:./my-test.jar com.dx.streaming.yarn.TestSubmit
From the log below you can see that spark.driver.host is now set to "192.168.0.141" and spark.driver.port to "20002", but the error still occurs:
19/01/14 16:11:52 ERROR ApplicationMaster: Failed to connect to driver at 192.168.0.141:20002, retrying ...
19/01/14 16:11:52 ERROR ApplicationMaster: Failed to connect to driver at 192.168.0.141:20002, retrying ...
19/01/14 16:11:52 ERROR ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Failed to connect to driver!
at org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:629)
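Note that the stack trace goes through ExecutorLauncher, which in client deploy mode only launches executors and tries to connect back to a driver that is already running. A minimal sketch of what client mode normally assumes (the driver living in the submitting JVM, which binds spark.driver.host/port itself) might look like this; the class name and app name are placeholders, and it requires the Spark jars and a YARN configuration on the classpath:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ClientModeSketch {
    public static void main(String[] args) {
        // Sketch only: in client mode the driver runs in this JVM, so
        // spark.driver.host/port are bound when the SparkContext starts.
        SparkConf conf = new SparkConf()
                .setMaster("yarn")
                .set("spark.submit.deployMode", "client")
                .setAppName("client-mode-sketch");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... submit jobs through sc here ...
        sc.stop();
    }
}
```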